not curl?

Re: not curl?

Meh. Does it do anything substantially better? Or is it just different? I learned wget first, and have only had to interface with curl through its presence in PHP installations, where I can safely say it has only ever caused me grief.
Really, though, I should be using open-network-stream, but then I have to do by hand all the proxy-handling and redirect-handling stuff that wget does automatically.
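For illustration only (not from this thread), here is roughly what "by hand" means: a minimal sketch of a plain HTTP GET over open-network-stream. The function name, the hard-coded port 80, and the lack of any proxy or redirect handling are all assumptions of the sketch; the missing handling is exactly the work wget does automatically.

;; Sketch: fetch PATH from HOST over plain HTTP using open-network-stream.
;; No proxy support, no redirect following, no header parsing.
(defun raw-http-get (host path)
  "Fetch PATH from HOST over plain HTTP and return the raw response as a string."
  (let* ((buf (generate-new-buffer " *raw-http*"))
         (proc (open-network-stream "raw-http" buf host 80)))
    (process-send-string
     proc
     (concat "GET " path " HTTP/1.1\r\n"
             "Host: " host "\r\n"
             "Connection: close\r\n\r\n"))
    ;; Wait for the server to close the connection, then hand back the buffer.
    (while (eq (process-status proc) 'open)
      (accept-process-output proc 1))
    (with-current-buffer buf
      (prog1 (buffer-string)
        (kill-buffer buf)))))

;; Example: (raw-http-get "example.com" "/")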
Re: not curl?

"Does it do anything substantially better? Or is it just different?"

I learned wget first too, but the transition isn't that painful.

Here, someone suggests that, looking at the comparison chart between curl and snarf/wget/pavuk/fget/fetch, the answer may simply be that development stopped on the others and continued on curl:

http://curl.haxx.se/docs/comparison-table.html
Re: not curl?

Mostly, it appears the features I'm missing are either features I wasn't using or features I wouldn't look for in a command-line client. (HTTP ranges? Is there any realistic use for them other than continuing an interrupted download, which wget already does?) Since wget is no longer being developed, I guess it will fall foul of bitrot at some point, but hey, it works for me for now.
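As an aside on that one realistic use: resuming a partial download is just a GET with a Range header starting at the byte offset you already have on disk, which is what wget's -c option arranges for you. A hedged sketch in the same raw style as above (the helper name is made up):

;; Sketch: the request text that resumes a download of PATH on HOST,
;; given that BYTES-ALREADY-HAVE bytes are already on disk.
(defun resume-request (host path bytes-already-have)
  (concat "GET " path " HTTP/1.1\r\n"
          "Host: " host "\r\n"
          ;; Ask the server only for the bytes we do not have yet.
          (format "Range: bytes=%d-\r\n" bytes-already-have)
          "Connection: close\r\n\r\n"))

;; Example: (resume-request "example.com" "/big.iso" 1048576)
;; asks only for bytes 1048576 through the end of the file.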