[wellylug] wget for big files
Rob Collins
robcollins55 at aim.com
Wed Mar 2 14:08:51 NZDT 2011
On 2/03/2011 12:04 p.m., Jethro Carr wrote:
> On Wed, 2011-03-02 at 12:02 +1300, Rob Collins wrote:
>> Hi.
>>
>> I am planning on using wget -g to obtain the new openSUSE 11.4, to be
>> released to public mirrors in just over a week's time. What is the
>> general feeling on whether this is a good approach for downloading a
>> large 4.7GB file? In the past I often tried the Firefox DownThemAll
>> download manager, with broken/non-resumable connections as the result.
> hi Rob,
>
> wget is one of the most reliable ways to do a download; it works
> consistently, and if a download is interrupted (e.g. due to network
> issues) you can just resume with wget -c $url to continue the download.
>
> Firefox's download manager is horrible by comparison....
>
> regards,
> jethro
Thanks Mark & Jethro,
I guess my question was that while there seems to be a plethora of
download managers out there, there is a large variance in their
reliability. I wanted opinions on how reliably downloads can be resumed
the Linux way, because I really didn't want to waste data allowance
again on such a huge file download.
Thanks Jethro, your answer shed light on this for me; I can always rely
on you for a well thought out and informative response!
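
For the archives, the resume workflow I have in mind based on Jethro's
advice would look roughly like this (the mirror URL and ISO filename
below are just placeholders for whatever mirror I end up using):

  # start the download (substitute your preferred mirror)
  wget http://ftp.example.org/pub/opensuse/11.4/iso/openSUSE-11.4-DVD-x86_64.iso

  # if the connection drops, re-run with -c to continue from the
  # partially downloaded file instead of starting over
  wget -c http://ftp.example.org/pub/opensuse/11.4/iso/openSUSE-11.4-DVD-x86_64.iso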
Rob Collins