Category Archives: cURL and libcurl

curl and/or libcurl related

curl and libcurl 7.19.0

With almost 40 described bug fixes, curl and libcurl 7.19.0 come flying with a range of new things, including the following:

  • curl_off_t gets its size/typedef somewhat differently than before. This may cause an ABI change for you. See lib/README.curl_off_t for a full explanation.
  • Added CURLINFO_PRIMARY_IP
  • Added CURLOPT_CRLFILE and CURLE_SSL_CRL_BADFILE
  • Added CURLOPT_ISSUERCERT and CURLE_SSL_ISSUER_ERROR
  • curl’s option parser for boolean options reworked
  • Added --remote-name-all
  • Now builds for the INTEGRITY operating system
  • Added CURLINFO_APPCONNECT_TIME
  • Added test selection by key word in runtests.pl
  • the curl tool’s -w option now supports the %{ssl_verify_result} variable
  • Added CURLOPT_ADDRESS_SCOPE and scope parsing of the URL according to RFC4007
  • Support --append on SFTP uploads (not with OpenSSH, though)
  • Added curlbuild.h and curlrules.h to the external library interface

We’ve worked hard to make this a really solid and fine release. I hope it shows.
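
Two of the new info values are easy to demonstrate. Below is a minimal sketch of my own (not from the release notes) that performs a transfer and then reads CURLINFO_PRIMARY_IP and CURLINFO_APPCONNECT_TIME; the URL is just a placeholder:

  #include <stdio.h>
  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      char *ip = NULL;
      double appconnect = 0.0;

      curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/");
      if(curl_easy_perform(curl) == CURLE_OK) {
        /* IP address of the most recent connection */
        curl_easy_getinfo(curl, CURLINFO_PRIMARY_IP, &ip);
        /* seconds from start until the SSL/SSH handshake completed */
        curl_easy_getinfo(curl, CURLINFO_APPCONNECT_TIME, &appconnect);
        printf("connected to %s, handshake done after %.3f seconds\n",
               ip ? ip : "(unknown)", appconnect);
      }
      curl_easy_cleanup(curl);
    }
    return 0;
  }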

Getting cacerts for your tools

As the primary curl author, I find the comments here interesting. The blog entry “Teaching wget About Root Certificates” is about how you can get cacerts for wget by downloading them from curl’s web site, and people quickly point out that getting cacerts from an untrusted third party is of course an ideal setup for an MITM “attack”.

Of course you can’t trust any files off an HTTP site, or off an HTTPS site without a “trusted” certificate, but thinking that the curl project would run one of those just to let random people load PEM files from our site seems a bit weird. That is why we also provide the scripts we do all this with, so that you can run them yourself with whatever input data you need, preferably something you trust. The more paranoid you are, the harder that gets, of course.

On Fedora, curl does come with CA certs (at least I’m told recent Fedoras do), and even if it doesn’t, you can point curl at whatever cacert bundle you like. Since most default installs of curl use OpenSSL, just like wget does, you could simply tell curl to use the same cacert your wget install uses.
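
With the command-line tool that is simply curl --cacert [file], and with libcurl it is the CURLOPT_CAINFO option. A minimal sketch; the bundle path below is an assumption and differs between distributions:

  #include <stdio.h>
  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      CURLcode res;
      curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/");
      /* point libcurl at the same PEM bundle wget uses; this path is an
         assumption and varies between distributions */
      curl_easy_setopt(curl, CURLOPT_CAINFO,
                       "/etc/ssl/certs/ca-certificates.crt");
      res = curl_easy_perform(curl);
      if(res != CURLE_OK)
        fprintf(stderr, "request failed: %s\n", curl_easy_strerror(res));
      curl_easy_cleanup(curl);
    }
    return 0;
  }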

This last part gets a little more complicated when one of the two is compiled with an SSL library that doesn’t easily support PEM (read: NSS), but in recent Fedora curl is built with NSS plus an additional patch that allows it to still read PEM files.

FTP vs HTTP, really!

Since I’m doing my share of both FTP and HTTP hacking in the curl project, I quite often see, and sometimes get, questions about what the actual differences between FTP and HTTP are: which one is the “best”, and isn’t it so that … is the faster one?

FTP vs HTTP is my attempt at a write-up covering most of the differences, aimed at users of the protocols and without going into overly technical detail. If you find flaws or have additional info you think should be included, please let me know!

The document includes comparisons between the protocols in these areas:

  • Age
  • Upload
  • ASCII/binary
  • Headers
  • Pipelining
  • FTP Command/Response
  • Two Connections
  • Active and Passive
  • Firewalls
  • Encrypted Control Connections
  • Authentications
  • Download
  • Ranges/resume
  • Persistent Connections
  • Chunked Encoding
  • Compression
  • FXP
  • IPv6
  • Name based virtual hosting
  • Proxy Support
  • Transfer Speed

With your help it could become a good resource to point curious minds to in the future…
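
To give one concrete taste of the ranges/resume area: libcurl already hides much of the protocol difference there, since the same CURLOPT_RANGE option works on both HTTP and FTP URLs. A small sketch with a placeholder URL:

  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      /* fetch only the first 500 bytes; libcurl turns this into an HTTP
         Range: header or the equivalent FTP commands, depending on the
         URL's protocol */
      curl_easy_setopt(curl, CURLOPT_URL, "ftp://example.com/file.txt");
      curl_easy_setopt(curl, CURLOPT_RANGE, "0-499");
      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }
    return 0;
  }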

The hack will still be useful

Okay, in my recent blog entry about Flash 10 using native libcurl I got a bit side-tracked and mentioned something about distros confusing libcurl’s soname 3 and 4. This caused some comments on that post and some further activity behind the curtains, so let me spell out exactly what I mean:

The ABI for libcurl did change between soname 3 and 4, but the change was in a rather subtle area (FTP third party transfers, sometimes known as FXP) which is rarely used. It certainly will not hurt the Adobe Flash system.

I’m not against “the hack” (or perhaps “a hack”, as there are several ways an ordinary system could provide work-arounds or fixes for this problem) per se; I am mainly trying to fight the belief or misconception that the ABI break doesn’t exist.

Since Adobe doesn’t want to provide an updated package that links against a modern libcurl and refuses to provide multiple packages, distros of course need to address this dilemma.

I just want everyone to know that 3 != 4, even if the risk that it’ll cause problems is very slim.

Update: it seems Adobe will change this behavior in their next release and then try to load either 3 or 4.

CA cert bundle from Firefox

It could be interesting to note that extracting all the cacerts from your local Firefox installation isn’t that tricky, if you just use some of the magic at hand in the NSS certutil tool.

Users of OpenSSL or GnuTLS based tools or libraries (such as libcurl) might be pleased to learn this.

curl users in general should of course be aware that we no longer ship any CA cert bundle with curl (as of curl 7.18.1), as it seems some ports haven’t yet noticed or adapted to this change.

Update: this script is now present as lib/firefox-db2pem.sh in the curl CVS repository.

Site deadness

When I got to work this morning I immediately noticed that one of the servers that host a lot of services for open source projects I tend to play around with (curl, Rockbox and more) had died. It responded to pings but didn’t allow my usual login via ssh. It also hosts this blog.

I called our sysadmin guy, who works next to the server, and he reported that the screen mentioned inode problems on an ext3 filesystem on sda1. Power-cycling did nothing good; the machine simply didn’t even see the hard drive…

In the meantime I changed our slave DNS for rockbox.org to point to a backup web server, just to make people aware of the situation.

Some 12 hours after the discovery of the situation, Linus Nielsen Feltzing had the system back up again, looking more or less identical to how it was yesterday. The backup procedure proved itself to work flawlessly: Linus inserted a new disk, partitioned it similarly to the previous one, restored the whole backup, fixed the boot loader (lilo) and wham, ignoring some minor additional fiddling, the server was up and running again.

Thanks Linus!

Flash 10 uses native libcurl

In Adobe’s Penguin.SWF blog, we can learn some details about the upcoming version 10 of the Adobe flash player for Linux:

They’ll rely on more libraries being present in the system rather than providing them all themselves in their own install. This apparently includes libcurl.

So if you get the RPM of the pre-release player, you’ll notice that it requires “libcurl.so.3”, which is the old SONAME for libcurl (libcurl 7.15.5 was the last release to use the number 3) and one that no up-to-date distribution should provide anymore. Since October 2006 we’ve shipped libcurl.so.4.

Apparently, this made the Fedora people first implement a work-around that re-introduces SONAME 3, built from the same source as SONAME 4, only to revert that decision a short while afterwards.

An interesting side note is how the Fedora people repeat over and over in those threads that libcurl with SONAME 3 and SONAME 4 use the same ABI, although that is not true (at least not by my definition of what an ABI is). The bump was not made by accident.

Update: it seems some blame this 3 == 4 thing on Debian.

Projects in need of your help

I’m involved in numerous projects, and a subset of them take a lot of my “copious” spare time. This has the unfortunate downside that a few other projects get left behind a bit, projects that really could use some more attention and improvements. Two of the most obvious examples are c-ares and libssh2. Coincidentally, both of these projects are also used by libcurl (although of course by others as well).

c-ares is a library that performs asynchronous DNS lookups. It is quite mature and functional already, as it is based on the ares project and has been proven in use for quite some time. Still, one or two issues appeared recently when the Debian project tried to provide the curl package built with c-ares…
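
For those who haven’t seen it, this is roughly what driving c-ares looks like: you start a lookup, then feed the channel’s file descriptors through select() until the callback fires. A minimal sketch, assuming a c-ares recent enough (1.5 and later) that the host callback takes a timeouts argument; the host name is just a placeholder:

  #include <stdio.h>
  #include <netdb.h>
  #include <sys/select.h>
  #include <sys/socket.h>
  #include <ares.h>

  /* called by c-ares when the lookup finishes (or fails) */
  static void lookup_done(void *arg, int status, int timeouts,
                          struct hostent *host)
  {
    (void)arg;
    (void)timeouts;
    if(status == ARES_SUCCESS)
      printf("resolved %s\n", host->h_name);
    else
      fprintf(stderr, "lookup failed: %s\n", ares_strerror(status));
  }

  int main(void)
  {
    ares_channel channel;
    if(ares_init(&channel) != ARES_SUCCESS)
      return 1;

    /* fire off the request; this call returns immediately */
    ares_gethostbyname(channel, "curl.haxx.se", AF_INET, lookup_done, NULL);

    /* drive the channel with select() until no queries remain */
    for(;;) {
      fd_set readers, writers;
      struct timeval tv, *tvp;
      int nfds;

      FD_ZERO(&readers);
      FD_ZERO(&writers);
      nfds = ares_fds(channel, &readers, &writers);
      if(nfds == 0)
        break; /* nothing pending anymore */
      tvp = ares_timeout(channel, NULL, &tv);
      select(nfds, &readers, &writers, NULL, tvp);
      ares_process(channel, &readers, &writers);
    }

    ares_destroy(channel);
    return 0;
  }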

libssh2 is a client library for talking SSH2 with servers. There are actually not very many SSH libraries out there, and in an evaluation I did a few years ago libssh2 was the best one around. libcurl uses libssh2 for SCP and SFTP transfers, and libssh2 does suffer from a few API flaws, a few bugs and, perhaps most noticeably, the fact that it is significantly slower than the OpenSSH tools in just about all transfer tests.
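
When you use libssh2 through libcurl, all of that is hidden behind the usual easy interface and an sftp:// URL is all it takes. A minimal sketch, with made-up host and credentials:

  #include <stdio.h>
  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      CURLcode res;
      /* hypothetical host, path and credentials, for illustration only */
      curl_easy_setopt(curl, CURLOPT_URL,
                       "sftp://example.com/home/user/file.txt");
      curl_easy_setopt(curl, CURLOPT_USERPWD, "user:secret");
      res = curl_easy_perform(curl); /* writes the file to stdout by default */
      if(res != CURLE_OK)
        fprintf(stderr, "transfer failed: %s\n", curl_easy_strerror(res));
      curl_easy_cleanup(curl);
    }
    return 0;
  }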

I’m still highly involved in both of these projects, but lack of time prevents me from participating as much as I’d like to.

Two fellow curl hackers

For many years I was really and truly the primary and almost sole developer of curl and libcurl. Sure, we’ve always had a steady stream of quality patches from contributors, but I was the single guy who cared for the whole picture and who took on the greater work needed to advance the project.

This is no longer the case. These days there are more people around who bite the really big bullets, who show that they know a lot about the internals and the protocols, and who have a feel and understanding for the general ideas and concepts of the project. I think they get too little attention, so I thought I’d shine the light on two of our bright hackers who really are true rocks in the community:

Daniel Fandrich first appeared on the curl-library list in April 2003. More than 1500 email posts later, he’s a knowledgeable, friendly and skilled contributor in just about all areas of curl and libcurl.

Yang Tse appeared on the curl-users list in September 2005 and has somewhat specialized in cleaning up dusty corners of the code: redoing things The Right Way, fixing compiler warnings and fixing up configure checks so that the code runs everywhere as it is supposed to.

These are two of our valuable committers. Ohloh.net counts 10 committers during the last 12 months, which puts us within the top 10% of all project teams on Ohloh!

But as I mentioned above, curl development is largely built upon patches provided by people who send in one or two patches and never appear in the project again. We have over 650 named contributors, and the list keeps growing at a steady pace.

You can be our next contributor or even committer. Just join us and help out!

curl feature freeze on August 10

In order to get the next curl release done, we’re entering feature freeze on Sunday August 10 (at 00:00 UTC to make it specific).

Starting then, we add no new features for two weeks; we only fix bugs. At the end of the two-week period, or possibly earlier if all looks well, we release 7.19.0.

The work on CURLOPT_POSTREDIR and the “Limiting IP addresses” feature can of course continue, but if we don’t have working patches for them very quickly, they’ll have to wait for the next release.