c-ares and me

I’ve said this a few times on the c-ares mailing list, but I guess that just doesn’t reach very many people outside the very closest “family” so I decided I’d mention a blurb here. (I don’t think this reaches very many people either, but quite possibly at least a few others…)

Background

A couple of years ago, I wanted to introduce asynchronous name resolving to libcurl to better allow many simultaneous requests while remaining single-threaded. This venture began with me and Bjorn Reese starting the Denise project, which would do exactly this. We found no proper existing alternative with a suitable license, so we started our own.

Then someone mentioned that ares was almost exactly what Denise was meant to become, and it had a fine license. I immediately dropped the Denise idea and went with ares. Soon enough we found out that ares needed improvements and tweaks, and its original author didn’t seem interested in incorporating those into ares… so I created a fork named c-ares.

c-ares has since then been used by libcurl; it has been bug-fixed and improved by a bunch of skilled hackers and it works solidly and reliably. It has also been discovered and incorporated into a bunch of other software, including UnrealIRCd, BZFlag, the Hobbit network monitor, libevnet, Tor, gLite, aria2, sipsak, Second Life and more…
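
For anyone curious what this kind of single-threaded, asynchronous resolving looks like in practice, here is a minimal sketch against today’s c-ares API (the host name is just an example and error handling is trimmed):

#include <stdio.h>
#include <sys/select.h>
#include <sys/socket.h>
#include <netdb.h>
#include <ares.h>

/* called by c-ares when the lookup has completed (or failed) */
static void resolved(void *arg, int status, int timeouts, struct hostent *host)
{
  (void)arg; (void)timeouts;
  if(status == ARES_SUCCESS)
    printf("resolved %s\n", host->h_name);
  else
    fprintf(stderr, "lookup failed: %s\n", ares_strerror(status));
}

int main(void)
{
  ares_channel channel;
  if(ares_init(&channel) != ARES_SUCCESS)
    return 1;

  /* fire off the request - this returns immediately */
  ares_gethostbyname(channel, "curl.haxx.se", AF_INET, resolved, NULL);

  /* drive the resolver with select() until it has nothing left to do */
  for(;;) {
    fd_set readers, writers;
    struct timeval tv, *tvp;
    int nfds;

    FD_ZERO(&readers);
    FD_ZERO(&writers);
    nfds = ares_fds(channel, &readers, &writers);
    if(nfds == 0)
      break;                    /* all lookups done */
    tvp = ares_timeout(channel, NULL, &tv);
    select(nfds, &readers, &writers, NULL, tvp);
    ares_process(channel, &readers, &writers);
  }

  ares_destroy(channel);
  return 0;
}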

Today

I don’t normally work with any of my open source projects in my full-time job, so I need to distribute my spare time among the various projects. When my spare time gets limited, I need to cut down on the projects that I deem least interesting or perhaps least in need of attention (from me). Recently, it has been obvious that c-ares is one of those projects that I rarely have time and energy left for at the end of the day.

Caretaker

I have no plans to “jump ship” or abandon the project in any way, but I think it would be beneficial for the c-ares project if someone would step forward and, if not “take over” the project, at least join in and help share the burden: applying patches, reviewing source code, making design decisions, replying to mailing list questions and so on.

There’s no crisis, there’s no hurry, but the project won’t move forward very fast as the situation currently is.

Fresh CA Cert Bundle Anyone?

The popular CA extract service on the curl web site converts the Firefox CA certs into a PEM file suitable for use with curl, wget or anything else OpenSSL-based that likes PEM-formatted CA cert bundles.

The main script was fixed yesterday. It previously fetched a whole nightly source code snapshot just to get the “magic” certdata file to convert from, but I noticed that the nightly source snapshots stopped being updated a good while ago, so the PEM updates had stopped too!

Now the script fetches only the certdata file it actually needs and converts that, so it downloads a lot less data in vain and thus also runs much faster. The PEM files offered on that page are now up-to-date with the most recent Firefox.
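
Just to illustrate the conversion step itself (the real service uses a conversion script; the helper below is a rough sketch of mine, not the script’s code): given one certificate in DER form, as extracted from Mozilla’s certdata, OpenSSL can write the PEM version that curl and wget understand.

#include <stdio.h>
#include <openssl/x509.h>
#include <openssl/pem.h>

/* made-up helper: convert one DER-encoded certificate to PEM on 'out' */
int der_to_pem(const unsigned char *der, long derlen, FILE *out)
{
  const unsigned char *p = der;
  X509 *cert = d2i_X509(NULL, &p, derlen);  /* parse the DER blob */
  if(!cert)
    return 1;
  PEM_write_X509(out, cert);                /* write the base64 PEM block */
  X509_free(cert);
  return 0;
}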

Distributed Builds on Every Commit


I’m not sure everyone out there has yet realized what a cool build system we’ve created in the Rockbox project!

We’re using Subversion for source code version control. We have a master server that detects whenever there has been a commit; it then starts one thread for each known build server, where each thread connects to the remote server, asks it to update to a specific revision number and then asks it to build for a particular target.

When the build is done, the master copies the build log and in some cases also the final zip file (which is then offered for download to users). At the time of this writing, we have 67 different builds and an average of 15 build servers. The master adapts to the servers that respond and simply ignores the ones that don’t.

This has the cool outcome that roughly 5 – 7 minutes after a commit, there are zip files offered on the site with the absolutely latest code for 27 different players! There’s also a huge table presented on the site with the results from all builds so that warnings and build errors can be worked on.

Of course the master then goes back to check for commits again and the whole thing starts all over again.
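
To give a feel for how small the core of such a master can be, here is a rough sketch (not the actual Rockbox code; the server names, targets, revision and remote command line are all made up) of spawning one thread per build server and collecting its output:

#include <stdio.h>
#include <pthread.h>

/* hypothetical description of one build job - not the real Rockbox structs */
struct job {
  const char *server;   /* build server host name (made up) */
  const char *target;   /* target to build (made up) */
  long revision;        /* Subversion revision to build */
};

static void *run_build(void *arg)
{
  struct job *j = arg;
  char cmd[512];
  char line[1024];
  FILE *p;

  /* ask the remote server to update to the given revision and build;
     the exact remote command line is invented for this sketch */
  snprintf(cmd, sizeof(cmd),
           "ssh %s 'svn update -r %ld && make %s' 2>&1",
           j->server, j->revision, j->target);

  p = popen(cmd, "r");
  if(!p)
    return NULL;               /* server didn't respond - just ignore it */

  while(fgets(line, sizeof(line), p))
    fputs(line, stdout);       /* the real master stores this as the build log */

  pclose(p);
  return NULL;
}

int main(void)
{
  struct job jobs[] = {
    { "build1.example.com", "sansae200", 15000 },
    { "build2.example.com", "ipodvideo", 15000 },
  };
  pthread_t th[2];
  int i;

  for(i = 0; i < 2; i++)
    pthread_create(&th[i], NULL, run_build, &jobs[i]);
  for(i = 0; i < 2; i++)
    pthread_join(th[i], NULL);

  return 0;
}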

Just now, the build for the Olympus M:Robe 500 was modified to depend on a recent ARM tool chain patch so we need to get all build server admins to update their ARM compilers!

The build servers are of course “donated” to the cause by volunteers. It is a fairly easy way to help out the project, if you have sufficient bandwidth and a suitable machine. You can help too!

libssh2 0.18

I stole some time from my family this weekend and managed to put together and release a fresh libssh2 tarball, labeled 0.18.

I’m not really too fond of spending a lot of time with a sub-1.0 version number, but the recent releases have felt like just minor improvements over the previous ones, so bumping up to 1.0 has a certain mental brake all over it. I guess I need to just ignore the brake and take the plunge soon, as I believe it’ll make users more likely to actually start using the lib, and that will be good.

Users Get Paid While Developers Are Not

Many (if not most – at least if we count every single project we can find) open source projects are primarily developed by volunteers in their spare time.

The volunteers may be professional developers, students, chefs or plumbers. When they work on their particular pet projects, they do so on time that would otherwise be spent with family, with friends, sleeping, baking cakes, collecting stamps or similar.

However, when this team of volunteers (which usually is a very small team; in fact most projects start with just a single guy or perhaps two in the developer team) is successful in producing a project or a tool that finds a larger audience, something happens.

When more and more professionals out there start using the tool, when companies start to embed and integrate the product into their projects and rely on it for business and day-to-day routines, not only do the guys in the open source project get more patches, bug reports and quite possibly more volunteers joining the project; something else can happen as well:

Suddenly, the developer team may notice that most of the people who ask questions, have problems, report bugs and post patches are people who are getting paid while doing it! The project has gotten so popular that many companies use it and ordinary employees are set to use it as part of their day job. They get paid to use it, to fix it, to install it and to customize it.

The developer team – however – still consists of volunteering, spare-time hacking individuals who do this without any monetary compensation.

Of course, if a project grows wildly popular it may get bought by a company, or at least a developer or two may get hired to keep doing what they were already doing. But successful open source projects aren’t really that attractive for companies to buy, since the companies can instead just collaborate a bit on the side and use the software perfectly fine without having to buy it and without having to employ anyone from the project.

Please note that whatever parallels to existing projects I may or may not be part of that you can find or imagine here, I’m not whining and I’m not complaining! This is just me taking notice of what I believe is an interesting paradox happening to some open source projects.

Crashing Firefox Goes libcurl

I guess I haven’t been paying attention lately, but I stumbled over the Breakpad project, which incidentally is going to be used as the crash reporting tool for Firefox 3 (the original link to that is no longer working), and it uses libcurl (the original link to that quote is no longer working): “On Linux, libcurl is used for this function, as it is the closest thing to a standard HTTP library available on that platform.”

The wording implies that it uses something else on Mac OS X, but I’m not aware of any standard HTTP library on it. Am I missing something or are they going libcurl there too?

Also, I wonder if using different HTTP libraries on different platforms, instead of a single one, isn’t just begging for more problems than it solves. As far as I know, libcurl has a few upsides compared to WinInet, for example. Of course, I’m not the man to tell them how to do their stuff.
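
For reference, submitting a report over HTTP with libcurl’s easy interface is about this small; the URL and the field contents below are made up for illustration, not Breakpad’s actual ones:

#include <stdio.h>
#include <curl/curl.h>

int main(void)
{
  CURL *curl;
  CURLcode res;

  curl_global_init(CURL_GLOBAL_ALL);
  curl = curl_easy_init();
  if(curl) {
    /* example URL and POST body, invented for this sketch */
    curl_easy_setopt(curl, CURLOPT_URL, "https://crash.example.com/submit");
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS, "product=demo&version=1.0");
    res = curl_easy_perform(curl);
    if(res != CURLE_OK)
      fprintf(stderr, "upload failed: %s\n", curl_easy_strerror(res));
    curl_easy_cleanup(curl);
  }
  curl_global_cleanup();
  return 0;
}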

MapShare part II – not the last episode

Continued from Tomtom MapShare.

I bought a shiny new 2GB SD card (which btw made me realize how dirt cheap these things are nowadays) and inserted it into my Tomtom ONE, only to find out that backups made with the previous version of “Tomtom HOME” (their Windows-only PC-based management tool) weren’t recognized, so I had to put the old SD card back in, make a backup, swap back to the new card and restore the backup.

Then I could buy a version 7 Scandinavian map from Tomtom (40 Euros) and yes, the MapShare options are now available, and I also enabled the “correction” button on the main screen so I can tap it to make corrections as I go. Now I just need to find places to go where corrections are needed. A bit disappointing is the fact that I’ve selected the options to get other people’s corrections, even those that aren’t Tomtom-verified (as long as “multiple persons” made the correction), but the tool just says there are no corrections available for me!

Are there really no corrections done? I find that hard to believe, but I’ll give this the benefit of the doubt for a while. Has anyone reading this made a correction on the Tomtom-provided Scandinavian v7 map? I’m curious whether the corrections are tied to specific maps or just to positions, since I would guess that their “Western Europe” map has the same flaws and mistakes as the “Scandinavia” map does…

Data Sheet Leakage

Irony is part of life.

One of the “secretive” kind of manufacturers out there, which refuses to provide docs for their chips unless you sign an NDA and God-knows-what, requires a user name and a password on their web site before they hand out docs. It turned out they only protect themselves using JavaScript, so you can just read the HTML pages and the embedded JavaScript in them to figure out the exact URLs to use and wham, the data sheets are downloadable…

No, I won’t tell you the exact company or site (or even exactly when this was discovered or tested), since then they might discover this and fix it. I’ve tried this myself and it works fine, but I was not the one who figured it out.

Yeah, this is a moral dilemma: should we tell the manufacturer about their problem and thus close the door for users to get these docs? Or would that risk backfiring on the guy(s) who tell them? What would you do?

Food Calendar

Facing the everyday problem of deciding what to eat and planning and shopping food for the family, I took the familiar route: I wrote a web site and service for it, Matkalendern (the site is in Swedish!), to ease this boring work!

Using this site, you enter the recipes of the dishes you tend to eat, you assign meals to days and then you can get the site to produce a nice and handy shopping list of all the ingredients that the planned meals require.
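
The shopping list part is really just aggregation; here is a little made-up sketch (example data and types of my own, not the actual site code) of how the ingredients of the planned meals collapse into one list:

#include <stdio.h>
#include <string.h>

/* hypothetical ingredient record, summed per name into the shopping list */
struct ingredient {
  const char *name;
  double amount;      /* in whatever unit the recipe uses */
  const char *unit;
};

#define MAXLIST 100

int main(void)
{
  /* ingredients from all meals planned for the week (example data) */
  struct ingredient planned[] = {
    { "potatoes", 1.0, "kg" },
    { "cream",    2.0, "dl" },
    { "potatoes", 0.5, "kg" },
  };
  struct ingredient list[MAXLIST];
  int nlist = 0;
  size_t i;
  int j;

  for(i = 0; i < sizeof(planned)/sizeof(planned[0]); i++) {
    for(j = 0; j < nlist; j++) {
      if(!strcmp(list[j].name, planned[i].name)) {
        list[j].amount += planned[i].amount;   /* same item, add it up */
        break;
      }
    }
    if(j == nlist && nlist < MAXLIST)
      list[nlist++] = planned[i];              /* new item on the list */
  }

  for(j = 0; j < nlist; j++)
    printf("%s: %.1f %s\n", list[j].name, list[j].amount, list[j].unit);
  return 0;
}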

With multiple users, you can bookmark other users’ recipes to avoid having to enter them yourself. With top lists and statistics you can see which meals you plan to eat the most and the least, and so on. It actually works pretty neatly. We have handed out a set of user logins as well, but I don’t think very many people other than me and my wife actually use it…

It still has lots of room for improvements to make it even easier to plan and to make the grocery store shopping list easier and quicker to deal with, but it scratches my itch already and I’m improving it slowly over time. I started the development of this in early 2006 and we’ve been using it in my family for well over a year now.
