BaseSeed
John
dingo at coco2.arach.net.au
Wed Sep 8 17:56:13 CDT 2004
Matt Zimmerman wrote:
>On Thu, Sep 09, 2004 at 04:54:34AM +0800, John wrote:
>
>>How else do you do this?
>>summer at Dolphin:~$ time lynx -dump http://www.x.com/ | tail
>> 30. http://www.ebay.com/
>> 31. http://www.paypal.com/cgi-bin/webscr
>> 32. http://www.paypal.com/cgi-bin/webscr?cmd=p/gen/fdic-outside
>> 33. http://www.paypal.com/cgi-bin/webscr?cmd=p/gen/privacy-outside
>> 34. http://www.bbbonline.org/cks.asp?id=20111061155818568
>>
>>I regularly want a list of URLs for some reason, often to get a list of
>>files to download with wget or (sometimes) with curl.
>>
>
>You don't need a browser at all if you only want to extract URLs.
>
>wget -O- http://www.x.com/ | urlview
>
I installed it and tried it.
Yuck.
It doesn't look like an option to me.
I used to have a script in fetchmail's contrib directory that fetched the
latest fetchmail source and built an RPM. It used lynx to get the URLs and
some other (possibly fragile) code to identify the latest version etc.
I don't see urlview being useful. I am _not_ going to tamper with its
config file just to stop it launching whatever browser it insists on using,
I've not managed to make it use cat via BROWSER= (which, at best, would
be a Crude Hack), and piping doesn't work.
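For what it's worth, a plain grep/sed pipeline can pull URLs out of a page
with no browser and no urlview at all. A minimal sketch (the sample HTML is
inlined here so it runs offline; in practice the input would come from
something like `wget -O- http://www.x.com/`):

```shell
# Hedged sketch: extract href="..." URLs from HTML on stdin.
# The inline sample stands in for real page output.
html='<a href="http://www.ebay.com/">eBay</a>
<a href="http://www.paypal.com/cgi-bin/webscr">PayPal</a>'

# -E: extended regex, -o: print only the matching part,
# then sed strips the href="..." wrapper, leaving bare URLs.
printf '%s\n' "$html" \
  | grep -Eo 'href="[^"]+"' \
  | sed 's/^href="//; s/"$//'
```

The result is one URL per line, ready to feed to wget or curl. This only
catches double-quoted href attributes, so it is cruder than lynx -dump, but
it needs nothing beyond grep and sed.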