Recent and Current Hardware Problems

During the last week or so, we’ve experienced major problems on some of the main servers at work, and I happen to host a bunch of services on them. Thus, not only do I have problems accessing my regular mail, but the primary curl web site gets shaky too!

Unfortunately, this is the holiday season so most people who could fix these issues aren’t around, and waiting for a reboot of the boxes can take a long time. Fortunately, work is already in progress to replace the two main servers with two new ones on January 2nd, 2008, so things should at least settle down after that operation.

So, remember that you can always find a suitable curl web mirror at curlm.haxx.se that has most of the contents you’ll need. Some stuff is only provided on the main site, but all downloads, docs and more are distributed on the mirrors.

Aiming for 7.18.0 in January 2008

This info was also posted to the curl-library list today.

I previously thought of releasing 7.18.0 in December, but since there are still outstanding topics on the list and there’s no pressure from any serious bug fixes or anything, I decided we can just as well wait until January. I want January 13th to be the feature freeze day, after which no new features will be committed until the release, which could then hopefully be done by January 28th or so.

The live-updated TODO-RELEASE document will change over time, but it currently contains these items:

Is there anything we’ve forgotten that we should include in the next release? To get a feel for what the next release will look like, check out the RELEASE-NOTES in progress, or try out a daily snapshot!

10K Commits

ohloh.net counts my commits in 11 different open source projects over roughly the last 8 years. (I am a member of 17 projects on sourceforge, but the rest are old and/or dead projects.)

I’ve now truly surpassed 10,000 public commits, which works out to roughly 3.5 commits per day over the years! That number, together with the number of people giving me “kudos” on the site, currently rates me #55 out of 83,000, although my rating is slowly falling…

ohloh profile for Daniel Stenberg

Anyway, since I’m a fan of stats and numbers, I encourage you to register your own projects and contributions there!

Maybe we should form a 10K commit club? 😉

library for proxy detection

Only days after I wrote about pacparser, another and in many ways more complete approach to detecting which proxy to use for accessing various internet resources emerged: libproxy.

One of its main authors, Alex Panit, has already submitted a feature-request for libcurl to support this, but I’m not at all convinced that is a good idea. It seems the authors are submitting “please include support for this” requests to lots of similar and related projects, much like metalink did.

As usual, I value your input and feedback so please raise your voice and speak up!

So far this young project lacks docs on its API and install process, so I haven’t even been able to build it for a test drive yet…

Jun-Ichiro

Today I just casually noticed the NetBSD 4.0 release, but what really caught my interest was the dedication of the release. It is dedicated to Jun-Ichiro “itojun” Hagino, who apparently passed away in October this year, only 37 years old. See also this “memorial”.

Anyway, the reason I noticed is that I remember him and his contributions to the curl project back in 2001 and 2002, when he provided a set of very good and useful patches (for example this) that made curl take a huge step forward in its IPv6 support. It is of course with sadness that I hear about him again this way. I’m sure he will be missed in many camps.

curl icons

I got a question on the #curl IRC channel and I have to admit that the curl and libcurl icons are a bit hard to find on the curl web site, so I thought I’d paste them here too, to shine a bit more light on them.

If you use (lib)curl in your projects or on your site, please consider making this noticeable by mentioning it and possibly displaying one of the existing curl icons:

[Icons: several “powered by libcurl” badges in assorted styles, a small curl icon with and without border, a libcurl icon, and the cURL and libcurl logos]

… as you can see there are plenty to choose from! 😉

And if you are capable of drawing logos, bitmaps, icons or the like and feel you can do better than these, please step forward and bring on your work!

Parsing those Dreaded PACs

Companies all over have long found it suitable to use “PAC” (proxy auto-config) scripts to control browsers and tell them what proxy to use for which particular URL and more. This has been a major nuisance for people who use other (automated) means to go through the proxy than one of those huge GUI-based browsers.

For a long time, Ralph Mitchell’s take on this was the only half-decent way to get it working. Half-decent only because it was never packaged and provided in a convenient way to people; it was fully functional.

Now, Manu Garg seems to have stepped forward and introduced pacparser, which is a C library for parsing PAC files. Period. It looks (very) similar to Ralph’s approach, in that it incorporates Mozilla’s JavaScript engine SpiderMonkey, with nothing really special added. So it’s basically the same approach, but this time it is packaged, shipped and documented!

Announced on the curl-users mailing list today!
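
A PAC file is really just JavaScript defining a FindProxyForURL(url, host) function, which is what the library evaluates under the hood. Judging by the announcement, using it boils down to just a handful of calls. Here’s a minimal sketch of how I picture it; the exact function names and the local proxy.pac file are my assumptions, since I haven’t taken it for a spin myself:

#include <stdio.h>
#include <pacparser.h>

int main(void)
{
  char *proxy;

  /* fire up the embedded SpiderMonkey engine */
  if (!pacparser_init()) {
    fprintf(stderr, "pacparser_init failed\n");
    return 1;
  }

  /* load and compile the PAC script from a local file (assumed name) */
  if (!pacparser_parse_pac("proxy.pac")) {
    fprintf(stderr, "could not parse proxy.pac\n");
    pacparser_cleanup();
    return 1;
  }

  /* ask the script's FindProxyForURL() what to use for a given URL */
  proxy = pacparser_find_proxy("http://curl.haxx.se/", "curl.haxx.se");
  if (proxy)
    printf("use: %s\n", proxy); /* e.g. "PROXY proxy:8080" or "DIRECT" */

  pacparser_cleanup();
  return 0;
}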

The Bourne Nmap

Over at insecure.org we can read about nmap’s appearance in The Bourne Ultimatum (IMDB) movie, and they also show two screenshots, out of which I’ll show you one:

I couldn’t resist trying to resolve the host name in there, only to find that telservice.net is a Korean company/network (which kind of makes it less likely to host the address of the Guardian UK, supposedly the hacking target in the movie). Of course, the specific host name in this shot doesn’t resolve, and the IP address shown doesn’t belong to telservice.net… Wow, who could’ve guessed that? 😉

And yeah, I’m jealous. I want one of the projects I participate in to appear in movies too!

curl keeps connections alive

Just in the last few days we modified curl to enable the SO_KEEPALIVE option on the connections it creates. It basically means that curl will now detect connections that have gone idle for a certain amount of time, even if that time is around two hours by default, which is what most systems have it set to.

The main problem that caused us to finally enable this (you can still disable it by using --no-keep-alive) is when people do (long-lasting) FTP transfers through a NAT, firewall or router that detects and removes what it considers to be idle connections. An FTP transfer uses two connections, but the control connection, where the commands are sent, is completely quiet while the actual data transfer is in progress, so by the time the transfer is done the control connection has been nuked by the router/NAT. Of course curl survives this as well as possible, but it can’t do proper error-checking etc in this situation.

Funnily, there’s no really good fix for the FTP situation, since the two-hour SO_KEEPALIVE timeout will often be too long to help (although most modern systems allow you to change the timeout at a system or application level). The other “obvious” fix is to send a “NOOP” command on the control channel every once in a while during the transfer. But no, that doesn’t work well on most servers either, since servers often don’t listen on the control connection during the transfer, so all we’d get is curl sending commands that won’t be replied to until the end of the transfer, and that ends up causing problems.

Note: curl sets this option, but libcurl still doesn’t, so if you want your app to set it you can use the same CURLOPT_SOCKOPTFUNCTION callback that curl uses. This requires libcurl 7.16.0 or later.
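
For the libcurl case, a minimal sketch of such a callback could look like this. I’m assuming a POSIX socket API here, the URL is made up, and the TCP_KEEPIDLE part that shortens the idle time only exists on some systems (Linux, for one):

#include <curl/curl.h>
#include <sys/socket.h>
#include <netinet/tcp.h>

/* called by libcurl right after each socket has been created */
static int sockopt_callback(void *clientp, curl_socket_t curlfd,
                            curlsocktype purpose)
{
  int on = 1;
  (void)clientp;
  (void)purpose;

  /* enable keep-alive probing on idle connections */
  setsockopt(curlfd, SOL_SOCKET, SO_KEEPALIVE, &on, sizeof(on));

#ifdef TCP_KEEPIDLE
  {
    /* where supported, shrink the idle time before the first probe from
       the typical two-hour default down to five minutes */
    int idle = 300;
    setsockopt(curlfd, IPPROTO_TCP, TCP_KEEPIDLE, &idle, sizeof(idle));
  }
#endif
  return 0; /* zero tells libcurl to carry on */
}

int main(void)
{
  CURL *curl = curl_easy_init();
  if (curl) {
    /* a made-up long-lasting FTP transfer */
    curl_easy_setopt(curl, CURLOPT_URL, "ftp://example.com/big-file.iso");
    curl_easy_setopt(curl, CURLOPT_SOCKOPTFUNCTION, sockopt_callback);
    curl_easy_perform(curl);
    curl_easy_cleanup(curl);
  }
  return 0;
}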