Ok, since people truly often ask me about the differences between curl and Wget, it might be suitable to throw in my two cents here and state the main differences as I see them. Please consider my bias towards curl since after all, curl is my baby – but I have contributed code to Wget as well.
- Libcurl. curl features and is powered by libcurl – a cross-platform library with a stable API that can be used by each and everyone. This difference is major since it creates a completely different attitude towards how to do things internally. It is also somewhat harder to make a library than a “mere” command line tool.
- Pipes. curl works more in the traditional unix style: it sends more stuff to stdout and reads more from stdin, in an “everything is a pipe” manner.
- Return codes. curl returns a range of defined and documented return codes for various (error) situations.
- Single shot. curl is basically made to do single-shot transfers of data. It transfers just the URLs that the user specifies, and does not contain any recursive downloading logic or any sort of HTML parser.
- More protocols. curl supports FTP, FTPS, HTTP, HTTPS, SCP, SFTP, TFTP, TELNET, DICT, LDAP, LDAPS and FILE at the time of this writing. Wget supports HTTP, HTTPS and FTP.
- More portable. Ironically, curl builds and runs on many more platforms than Wget, in spite of the latter’s attempts to keep things conservative. For example: VMS, OS/400, TPF and other more “exotic” platforms that aren’t straight-forward unix clones.
- More SSL libraries and SSL support. curl can be built with one out of four different SSL/TLS libraries, and it offers more control and wider support for protocol details.
- More HTTP auth. curl (or rather libcurl) supports more HTTP authentication methods, especially when going through HTTP proxies.
- Wget is command line only. There’s no library or anything. Personally I’ve always disliked that the project doesn’t provide a proper man page, as they stand on the GNU side of this and consider “info” pages to be the superior way to document things like this. I strongly disagree.
- Recursive! Wget’s major strong side compared to curl is its ability to download recursively, or even just download everything that is referred to from a remote resource, be it an HTML page or an FTP directory listing.
- Older. Wget traces back to 1995, while curl can be tracked back no further than 1997.
- Less developer activity. While this can be debated, I consider three metrics here: mailing list activity, source code commit frequency and release frequency. Anyone following these two projects can see that the curl project has a much higher pace in all these areas, and it has indeed been so for several years.
- HTTP 1.0. Wget still does its HTTP operations using HTTP 1.0, and while that still works remarkably well and is hardly ever troublesome to end-users, it is still a fact. curl has done HTTP 1.1 since March 2001 (while still offering optional 1.0 requests).
- GPL. Wget is 100% GPL v2, and I believe it will go v3 with their next release. curl is MIT licensed.
- GNU. Wget is part of the GNU project and all copyrights are assigned to them etc. The curl project is entirely stand-alone and independent, with no parent organization at all.
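To illustrate the pipes and return-codes points, here is a minimal shell sketch. The exit codes shown (0, 6, 7, 22) are documented in the curl man page, but the helper function name and the URL are made up for this example:

```shell
#!/bin/sh
# Translate a curl exit code into a human-readable message.
# The codes shown (0, 6, 7, 22) are from the curl man page;
# the function name is just made up for this sketch.
describe_curl_rc() {
  case "$1" in
    0)  echo "transfer ok" ;;
    6)  echo "could not resolve host" ;;
    7)  echo "failed to connect to host" ;;
    22) echo "HTTP error returned (requires --fail)" ;;
    *)  echo "other curl error: $1" ;;
  esac
}

# Typical pipe-friendly use: the page goes to stdout, diagnostics to
# stderr, and the exit code tells a script what happened.
# example.com is just a placeholder URL.
curl --silent --fail http://example.com/ > /dev/null
describe_curl_rc $?
```

A script can branch on these codes to retry, log or give up, which is exactly the kind of scripting the documented return codes are meant to enable.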
This turned out to be a long post and it might in fact be useful to keep for the future, so I’m also posting this as a more permanent doc on my site at this URL: http://daniel.haxx.se/docs/curl-vs-wget.html. Any updates will be done there. Do let me know if you have further evident differences or if you disagree with me on details here!
Right now there’s darkness outside my window. It is in the middle of the night and this is my prime hacking time, when the rest of the family are all sound asleep.
The first three days of nursing Rex full time have been gentle, as he’s been his usual happy self and we’ve had some great weather allowing walks outdoors and visits to the nearby playgrounds etc. Also, Rex’s two naps per day (totaling around 3-4 hours) do allow for some personal time, so I manage to read my mails and even do the occasional commit during the days.
Things will get rougher when the days turn darker, colder and wetter. Or when Rex gets more cranky. But I’m optimistic.
Keeping up with IRC the way I can when I’m sitting in front of my computer all day at work isn’t really possible, and not commuting will keep me from the podcasts I used to listen to, but those are no biggies to overcome. I quite like not having to go anywhere particular in the morning, and thus not having to “travel” home again in the late afternoon.
Instead, I’ve fixed bugs, worked several patches into curl and c-ares, and even been able to submit blog posts at a decent pace.
Microsoft hasn’t given in yet, it seems, as they announced their updated Zunes yesterday. They’re available as 4 or 8GB flash versions and an 80GB hdd version; these are claimed to play more movie formats (like h.264 and MPEG-4) and actually seem capable of using the wifi for things like syncing music.
The Zune music is also said to go DRM-free… All in all, I’d say they really seem to be making an effort to be a serious iPod alternative.
Anyway, there hasn’t of course been any serious dissection of these new Zunes yet, but given how their earlier models were made, it seems unlikely that they will attract any larger crowds of eager hackers. They also seem to have applied a fair amount of cryptography, another Apple-like approach, so it is hard to put a replacement firmware on them.
The guys in the Zune Linux project really have no clue about what hacking these things requires, and their early chatter about deciding what logo to use and what “distro” to base their work on has just been a hilarious joke. I don’t expect this new set of models to change the situation in any significant way.
I’m not aware of any known skilled (Rockbox) hacker having a go at the Zune. The old Zune models are however quite similar (but not identical) hardware-wise to the Toshiba Gigabeat S models, for which there is a Rockbox port in the works (as I’ve mentioned before).
We’re slowly building a team and effort in the Rockbox project to make a port to the Cowon iAudio 7 player.
It’s a 60 gram 4/8/16 GB flash player with a 1.3″ 160×128 TFT LCD, FM tuner, Telechips TCC771 MCU and a bunch of chips familiar to us from other existing Rockbox ports.
TMM already bricked his first player…
Update: this entry does not allow comments anymore. Go to the Rockbox forums to continue!
There’s a bunch of eager hackers hanging out in the Rockbox forums, working on getting a Rockbox port for the Dell DJ going.
This player has a monochrome LCD at 160×104, features the dreaded TMS320 series MCU and comes with up to 20GB hard drive.
MrH mailed me a document describing his latest research on the PP5024 memory controller, and I figure we have reasons to believe that the other chips in the PP family might be similar. He did the work by running test programs and disassembling the Sansa firmware.
Of course, I keep the collected e200 details from MrH on my Sansa e200 page.
The other day, while browsing the endless stream of pointless articles about iPhone this and iPhone that, I stumbled over this slashdot article that mentioned the US Magnuson-Moss Warranty Act, which basically says that a company cannot void a warranty just because the user has tampered with its software, unless the company can prove that the alternative software is to blame for the failure.
Of course I’m not a lawyer, or even in the US, but it certainly seems like something that should apply to quite a few Rockbox users who have feared returning broken units to manufacturers with the Rockbox installation left intact. (Both Archos and iriver are known to have refused to service such players – but I guess neither of those cases actually involved US customers.)
It does however require that there is an existing written warranty in the first place.
And then I figure the struggle for a mere single human being to fight one of these companies, claiming that Rockbox isn’t to blame, could be more than a little intimidating and probably just won’t happen…
I have to admit that the little I’ve used SUSE Linux, I just disliked it, and the YaST thing is completely inferior to Debian’s system – largely because of its slowness.
However, I noticed they’ve worked a lot on improving boot speed, boasting a cut from 55 seconds down to 27, and I’m a bit jealous about that…
I mean, I reboot like once every two months, so I could save like 3 minutes of my life during a year. Not too shabby… 😛
Being an ordinary hacker person in an industrial country such as Sweden, I own lots of random technical devices that I either have and use in my home or carry around for my use and enjoyment. Most, if not all, of these provide a fair amount of features and bugs. Many of them are controlled by an internal microcontroller.
My dect phone, my gsm phone, my DVB-T boxes, my TVs, my music players, the “entertainment system” of my car, my DVD-players, my wifi-router, my printer, my digital camera, my GPS, my video camera and the like.
I seriously wish I had the docs and the source code for all of these, and thus the ability to change them to behave more like I want them to. I don’t believe I’m alone either. I wouldn’t even have to do most of these changes myself: we would have communities built up around basically all of these devices, with people from all over sharing their ideas and code to improve them. I would hack them all, if I could.
Of course, some of these devices aren’t at all possible to upgrade since they’re produced and sold without that ability and for those I’d have to accept this (and buy a different model the next time around), but a lot of these things can be reprogrammed at will already if we only knew how.
If only the manufacturers didn’t hate us.
Gary Maxwell enlightened us that his build (of a slightly older libcurl) is way below 50KB on an ARM7 architecture, while Dan Fandrich could squeeze the latest libcurl release to below 100KB on x86.
Of course these particular builds are fairly stripped down, with only HTTP support left, but they are built from unmodified sources. Full-fledged builds with all protocols will of course be significantly larger.
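For reference, such a stripped-down build is basically a matter of configure options. A sketch, using --disable/--without flags that exist in curl's configure (the exact set varies between versions, so check ./configure --help in your source tree):

```shell
# Build a minimal, HTTP-only libcurl from an unpacked curl source tree.
# The exact option set varies between curl versions; this is only a
# sketch of the approach, not the flags used for the builds above.
./configure --disable-ftp --disable-ldap --disable-telnet \
            --disable-dict --disable-file --disable-tftp \
            --without-ssl --without-libidn
make
```

Combined with compiler size optimizations and stripping the binary, this is how builds in the tens-of-kilobytes range become possible.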