perl or bash question ["convert strings in a txt to html links"]

Jonesy gmane at jonz.net
Sat Feb 27 18:05:51 UTC 2010


On Sat, 27 Feb 2010 12:40:39 -0500, Alex Janssen wrote:
> Vadkan Jozsef wrote:
>> How can I do this in bash or perl? I have a txt file, e.g.:
>>
>> $cat file.txt
>> Hi, this is the content of the txt file, that contains links like this:
>> http://www.somewhere.it/, and it could contain: http://somewhere.com,
>> etc..
>> This is the second line, which doesn't contain links..
>> ..
>> This is the XYZ line, that contains a link: http://www.somewhere.net
>> $
>>
>> ...ok.. so how could I make a regexp for this?
>>
>> Turning:
>>
>> http://website.org
>> http://www.website.org
>>
>> to this:
>>
>> <a href=http://website.org>http://website.org</a>
>> <a href=http://www.website.org>http://www.website.org</a>
>>
>> The solution would be:
>>
>> sed 'SOMEMAGIC' file.txt > file.html
>> or
>> perl 'SOMEBIGMAGIC' file.txt > file.html
>>   
> grep -io "<a.*</a>" somefile.html
> That'll extract just the href code.

 ... _IF_ it's all on one line -- which it usually ain't.

And that's not what the OP wants.
He wants to href'ify URLs found in 'raw' text.
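
Something along these lines should get close (untested; the pattern is
deliberately naive -- it takes everything from http:// up to the next
space or comma as the URL, puts quotes around the href value, and won't
cope with trailing periods or other punctuation):

  sed 's|http://[^[:space:],]*|<a href="&">&</a>|g' file.txt > file.html

or the same idea in perl:

  perl -pe 's|(http://[^\s,]+)|<a href="$1">$1</a>|g' file.txt > file.html

In the sed version the & in the replacement stands for whatever the
pattern matched, so the URL ends up both as the href value and as the
link text.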

Jonesy
-- 
  Marvin L Jones    | jonz          | W3DHJ  | linux
   38.24N  104.55W  |  @ config.com | Jonesy |  OS/2
    * Killfiling google & XXXXbanter.com: jonz.net/ng.htm
