It could be interesting to note that extracting all the CA certs from your local Firefox installation isn’t that tricky, if you just use some of the magic that is at hand with the NSS certutil tool.
Users of OpenSSL- or GnuTLS-based tools and libraries (such as libcurl) might be pleased to learn this.
curl users in general should of course be aware that we no longer ship a CA cert bundle with curl (as of curl 7.18.1); it seems some ports haven’t yet updated to or discovered this change.
Update: this script is now present as lib/firefox-db2pem.sh in the curl CVS repository.
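The gist of it can be sketched in a few lines of shell. Note that this is an illustration only, not the real thing: the profile-directory glob and the nickname-parsing regex are my assumptions, and the script in the curl repository is the authoritative version.

```shell
#!/bin/sh
# Sketch: dump the built-in CA certs from a Firefox NSS cert database
# into a single PEM bundle, using the NSS certutil tool.
# The profile glob below is an assumption -- adjust to your install.
db=$(ls -1d "$HOME"/.mozilla/firefox/*default* 2>/dev/null | head -n 1)
out=${1:-ca-bundle.pem}

if [ -n "$db" ] && command -v certutil >/dev/null 2>&1; then
  : > "$out"
  # List cert nicknames (lines ending in trust attributes like "C,C,C"),
  # strip the trailing attributes, then export each cert in PEM form.
  certutil -L -h 'Builtin Object Token' -d "$db" |
    grep ' [CcGTPpu]*,[CcGTPpu]*,[CcGTPpu]* *$' |
    sed 's/ *[CcGTPpu]*,[CcGTPpu]*,[CcGTPpu]* *$//' |
    while read -r nickname; do
      certutil -L -d "$db" -n "$nickname" -a >> "$out"
    done
  echo "wrote $out"
else
  echo "no Firefox profile or no certutil found, skipping" >&2
fi
```

The resulting file can then be pointed at with curl’s –cacert option or with CURLOPT_CAINFO in libcurl.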
When I got to work this morning I immediately noticed that one of the servers that hosts a lot of services for open source projects I tend to play around with (curl, Rockbox and more) had died. It responded to pings but didn’t allow my usual login via ssh. It also hosts this blog.
I called our sysadmin guy who works next to the server, and he reported that the screen mentioned inode problems on an ext3 filesystem on sda1. Power-cycling the machine did no good; it simply didn’t even see the hard drive anymore…
I did change our slave DNS for rockbox.org and made it point to a backup web server in the meantime, just to make people aware of the situation.
Some 12 hours after the discovery of the situation, Linus Nielsen Feltzing had the system back up again, and it’s looking more or less identical to how it was yesterday. The backup procedure proved itself to be working flawlessly. Linus inserted a new disk, partitioned it similarly to the previous one, restored the whole backup, fixed the boot loader (lilo) and wham (ignoring some minor additional fiddling), the server was up and running again.
In Adobe’s Penguin.SWF blog, we can learn some details about the upcoming version 10 of the Adobe flash player for Linux:
They’ll rely on more libraries being present on the system rather than providing them all themselves in their own install. This apparently includes libcurl.
So if you get the RPM of the pre-release player, you’ll notice that it requires “libcurl.so.3”, the old SONAME for libcurl (libcurl 7.15.5 was the last release to use the number 3), which no up-to-date distribution should provide anymore. Since October 2006 we’ve shipped libcurl.so.4.
Apparently, this first made the Fedora people implement a work-around that re-introduces the SONAME 3 from the same source the SONAME 4 is made from, only to revert that decision a short while afterwards…
An interesting side note is how the Fedora people repeat over and over in those threads that libcurl with SONAME 3 and SONAME 4 use the same ABI, although that is not true (at least not by my definition of what an ABI is). The bump was not made by accident.
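For what it’s worth, checking which SONAME a given libcurl build actually carries is easy with objdump; here’s a quick sketch (the library path glob is my assumption, adjust it for your distribution):

```shell
# Print the SONAME recorded in the shared library's dynamic section.
# The path glob is an assumption; adjust it for your distribution.
lib=$(ls /usr/lib*/libcurl.so.[0-9]* /usr/lib/*/libcurl.so.[0-9]* 2>/dev/null | head -n 1)
if [ -n "$lib" ]; then
  objdump -p "$lib" | grep SONAME
else
  echo "no libcurl shared library found, skipping" >&2
fi
```

A libcurl built from 7.16.0 or later should report libcurl.so.4; seeing libcurl.so.3 means a pre-7.16.0 build or a distro-made compatibility library.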
Update: it seems some blame this 3 == 4 thing on Debian…
I’m involved in numerous projects, and a subset of them take a lot of my “copious” spare time. This has the unfortunate downside that a few other projects get left behind a bit. Projects that really could use some more attention and improvements. Two of the most obvious examples are c-ares and libssh2. Coincidentally, both of these projects are also used by libcurl (although of course by others too).
c-ares is a library that performs asynchronous DNS lookups. It is quite mature and functional already, as it is based on the ares project and has been proven in use for quite some time. One or two issues have appeared recently, when the Debian project tried to provide the curl package built with c-ares…
libssh2 is a client library for talking SSH2 with servers. There are actually not very many SSH libraries out there, and in an evaluation I did a few years ago libssh2 was the best one around. libcurl uses libssh2 for SCP and SFTP transfers, and libssh2 does suffer from a few API flaws, a few bugs and, perhaps most noticeably, it is significantly slower than the openssh tools in just about all transfer tests.
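To illustrate what libcurl users get out of libssh2: a curl built with it speaks SCP and SFTP straight from the command line. The host, user and paths in the sketch below are of course made up:

```shell
# Does this curl speak SSH? A curl built with libssh2 lists the
# scp and sftp protocols in its --version output.
curl --version | grep -iq sftp && echo "SSH-enabled curl" || echo "no SSH support"

# Example transfers (host, credentials and paths are made up,
# so these lines are shown commented out -- do not run as-is):
#   curl -u user:secret -O sftp://example.com/home/user/file.txt
#   curl -u user:secret -T localfile scp://example.com/home/user/localfile
```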
I’m still highly involved in both of these projects, but lack of time prevents me from participating as much as I’d like to.
For many years I was really and truly the primary and almost sole developer of curl and libcurl. Sure, we’ve always had a steady stream of quality patches from contributors, but I was the single guy who cared for the whole picture and who took on the greater work to advance the project.
This is no longer the case. These days there are more people around who bite the really big bullets and who show that they know a lot about the internals and the protocols, and who have a feel and understanding for the general ideas and concepts of the project. I think they get too little attention, so I thought I’d shine a light on two of our bright hackers who really are true rocks in the community:
Daniel Fandrich first appeared on the curl-library list in April 2003. More than 1500 email posts later, he’s a knowledgeable, friendly and skilled contributor in just about all areas of curl and libcurl.
Yang Tse appeared on the curl-users list in September 2005 and has somewhat specialized in cleaning up dusty corners of the code. Redoing things The Right Way, fixing compiler warnings and fixing up configure checks so that the code builds and runs everywhere as it is supposed to.
These are two of our valuable committers. Ohloh.net counts 10 committers during the last 12 months, which puts us within the top 10% of all project teams on Ohloh!
But as I mentioned above, the curl development is largely built upon patches provided by people who send in one or two patches and never appear in the project again. We have over 650 named contributors and the list keeps growing at a steady pace all the time.
You can be our next contributor or even committer. Just join us and help out!
In order to get the next curl release done, we’re entering feature freeze on Sunday August 10 (at 00:00 UTC to make it specific).
Starting then, we add no new features for two weeks and only fix bugs. At the end of the two-week period, or possibly earlier if all looks well, we release 7.19.0.
The work on CURLOPT_POSTREDIR and the “Limiting IP addresses” work can of course continue, but if we don’t have working patches for them very quickly, they’ll have to wait for the next release.
I previously mentioned on the libcurl mailing list that Mark Nottingham of the IETF HTTP Working Group has initiated work on putting together an overview of all (interesting) existing HTTP implementations.
Of course curl is included in the bunch, or rather libcurl, but I would also urge you all to step forward and provide further details on other implementations you have worked on or know of!
I just wanted to drop a note saying that the biggest explanation for the silence and slowness of my blog over the last couple of weeks has been my ongoing vacation, which will continue for another two weeks.
Of course things are still happening, but due to my lack of computer time right now I tend to prioritize actually working on those things rather than writing here about the stuff I do/read/fix.
The Rockbox Steering Board vote is over, results will be published soon (spoiler: I’m voted in as one of the members). There’s a curl release coming up for August and we still have a few outstanding issues to fix. More on these topics later!
I spoke to Anthony Bryan from the metalink project over Skype the other day, and the 16 minute recorded interview was recently posted so I thought I’d just announce my local copy of the 14MB file.
The topics should be of no surprise to readers of my blog: me, curl, Rockbox and metalink basically.
I like user feedback and comments from people in projects I participate in – even those that I run or maintain myself. I value bug reports and I think no project can evolve without a fair amount of external input.
But they can also be annoying, since when made in public places they tend to stick around. If they’re negative I can respond to them when they’re posted in forums where that is possible and where I care to, but sometimes they’re just blurted out in a way that I cannot respond to and cannot do anything about. And the review/comment/complaint will sit there, to be watched by the world, uncommented by me or anyone else who thinks otherwise.
Let me point out the recent example that made me write this particular rant: a user review of curl at ohloh.
I realize there’s nobody to blame and that this is the way of life and how things work and that everybody is entitled to publish their opinions and all that. It still doesn’t feel really good when you just don’t agree with them and they’re “against” one of your own babies.