ubuntu-users Digest, Vol 23, Issue 243

Asif Lodhi asif.lodhi at gmail.com
Fri Jul 14 19:14:08 UTC 2006


Hi Dimitri and Henk,

On 7/14/06, ubuntu-users-request at lists.ubuntu.com
<ubuntu-users-request at lists.ubuntu.com> wrote:
> Message: 6
> From: "Henk Postma" <henkpm at gmail.com>
> Subject: Re: download whole web pages/sites to view off-line
> On 7/13/06, Dimitri Mallis <dimitri.mallis at gmail.com> wrote:
> > is there a program that can download whole webpages just like
> > "teleport pro" could, so that i can view the pages offline
> httrack seems a good choice, but [...]
> Have a look at plucker [...]

I have been using "wget" successfully for the same purpose for at
least two years now.  It pulls everything you tell it to from the
specified website.  It's easy if everything on the website falls
under one directory/folder.  If not, then you have to specify the
"-p" option (excluding quotes) for page requisites, i.e. the files
(images, stylesheets and so on) that a page needs but that lie
outside the directory tree.  You can additionally tell wget which
other directories to search for related files (-I) and which
directories _not_ to search (-X).  It comes built in with Fedora
Core 3 at least and, if I recall correctly, all previous versions
of Red Hat.  I am sure it is in Ubuntu as well.  If not, you can
use your favorite download-and-install tool to get it.
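
For instance, something along these lines should work (the URL and
directory names below are only placeholders):

  # Recursively mirror the site, grab page requisites (-p), convert
  # links for offline viewing (-k), and don't ascend to the parent
  # directory (-np):
  wget -r -np -p -k http://www.example.com/docs/

  # Limit the crawl to certain directories (-I) and skip others (-X):
  wget -r -I /docs,/images -X /ads http://www.example.com/

On Ubuntu, "sudo apt-get install wget" will fetch it if it is not
already installed.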

--
HTH,

Asif



