Graphical or automatic "keep-alive" program?
Steve Flynn
anothermindbomb at gmail.com
Wed Dec 3 11:39:11 UTC 2008
On Wed, Dec 3, 2008 at 11:20 AM, Knapp <magick.crow at gmail.com> wrote:
>> need something that will either automatically function as a keep-alive
>> program or else I need to change some network setting that will do the
>> same for her (timeout limits maybe?)
The simplest solution is a one-liner executed from cron every 5
minutes or so. You have a selection of options, depending on how much
data you want to suck down the connection to ensure it stays up.
A ping to a reliable machine (google.com, microsoft.com, bbc.co.uk)
will generate a small number of packets. It doesn't matter if the
server in question doesn't respond to pings, as your outgoing packets
will still generate data across the network.
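A minimal sketch of that one-liner as a script (the target host is just an example; any reliable machine will do):

```shell
#!/bin/sh
# One-shot keep-alive ping: -c 1 sends a single packet, -W 5 caps the
# wait for a reply in seconds (GNU ping; BSD ping differs here).
# The || true matters: even if the host never answers, the outgoing
# packets have already crossed the link, which is all we need.
ping -c 1 -W 5 google.com > /dev/null 2>&1 || true
```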
A traceroute to the same will generate more packets, but still no more
than a few handfuls (if one can measure an IP packet in handfuls!)
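The traceroute variant might look like this (flags are from the common Linux traceroute; the host is again just an example):

```shell
#!/bin/sh
# Keep-alive via traceroute: -m 10 stops probing after ten hops and
# -w 2 waits at most two seconds per probe, so the job can't dawdle.
# Output is discarded and failures ignored; the probe packets
# themselves are the traffic we're after.
traceroute -m 10 -w 2 google.com > /dev/null 2>&1 || true
```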
A wget of a page from a reliable, reasonably sized site will generate
more data again, and as you'll be slinging it straight into /dev/null
you don't need to be concerned about storing it anywhere... something
like the NY Times or the BBC's front page should be more than
sufficient.
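The wget version, sketched as a script (the URL is only an example front page):

```shell
#!/bin/sh
# Keep-alive via a page fetch: -q silences wget, -T 10 caps every
# timeout so the job can't hang, and -O /dev/null discards the page
# so nothing needs storing.
wget -q -T 10 -O /dev/null http://www.bbc.co.uk/ || true
```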
To implement any of these, simply pop the relevant command into a file,
make it executable, and add a crontab entry that runs it every 5 or 10
minutes. Shout if you want any assistance with the syntax, but the
crontab(5) man page should give you suitable examples to mess about
with.
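For example, assuming you saved one of the above as /home/user/keepalive.sh (a hypothetical path) and made it executable, the crontab entry (added via "crontab -e") might look like:

```
# m h dom mon dow  command -- run the keep-alive every 5 minutes
*/5 * * * * /home/user/keepalive.sh
```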
Any use?
--
Steve
When one person suffers from a delusion it is insanity. When many
people suffer from a delusion it is called religion.
09 F9 11 02 9D 74 E3 5B D8 41 56 C5 63 56 88 C0
More information about the ubuntu-users mailing list