Site Spider
Donn
donn.ingle at gmail.com
Sun Jan 20 06:30:18 UTC 2008
There is a gui that does this. It has a name so abysmal that I can't recall
it...
I used this script once a few years ago to fetch a website.
It takes two parameters: url and level.
The level is how far down a chain of links it should go.
You could just replace the vars and run the command directly.
===
#!/bin/bash
# Try to make using wget easier than it bloody is.
url=$1
# Quote the test so an empty/missing argument doesn't break it,
# and use { ... } rather than ( ... ) so exit actually stops the
# script instead of only leaving a subshell.
if [ -z "$url" ]; then echo "Bad url" >&2; exit 1; fi
LEV=$2
if [ -z "$LEV" ]; then
    LEV="2"
fi
echo "running: wget --convert-links -r -l$LEV $url -o log"
wget --convert-links -r -l"$LEV" "$url" -o log
===
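If you want to see the command the script would run before letting it loose, you can build the same string by hand as a dry run (the URL and level below are just placeholders, nothing is fetched):

```shell
# Hypothetical values standing in for the script's $1 and $2.
url="http://example.com/"
lev=3

# This mirrors the echo line in the script: recursive fetch ($lev levels
# deep), links rewritten to work locally, wget's chatter sent to "log".
cmd="wget --convert-links -r -l$lev $url -o log"
echo "$cmd"
```

Once the printed command looks right, run it directly or pass the same two values to the script.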
man wget is the best plan really.
\d
--
Gee, what a terrific party. Later on we'll get some fluid and embalm each
other. -- Neil Simon
Fonty Python and other dev news at:
http://otherwiseingle.blogspot.com/
More information about the kubuntu-users mailing list