Category Archives: cURL and libcurl

curl and/or libcurl related

To fscons and back in 16 hours

I took the X2000 train from Stockholm to Gothenburg at 08:10, so I didn't get to the conference venue until almost 11:30.

SELF

This meant I got to listen in on the end of Jonas Öberg's speech on SELF (a project by FSF Europe and others on e-learning and a lot of related matters). This wasn't really my cup of tea, but the other track had a MySQL talk and that isn't really my thing either, so I had to just pick one… 🙂 Nothing bad about the subject or Jonas really, it just shows where my interests don't lie.

Lunch

At lunch I got the opportunity to catch up with Squid-Henrik (Nordström) and talk about recent happenings in our projects. Squid is about to release v3 after several years, and I could report that curl has gained support for SSH-related protocols during the last year or so… I also exchanged a few words with Peter Stuge, who recently expressed interest in hacking on libssh2, and well, I have done some of that!

Qtopia

Knut Yrvin from Trolltech Norway did an excellent talk on Qtopia in the telecom business. The talk had been renamed from something involving the Greenphone, since they have since ditched that project. Anyway, he spoke of the upcoming possible opportunities for free software and open source in consumer electronics, particularly in smart phones, and then of course mostly related to Qt and Qtopia. He passed around a Greenphone to let us get a feel for it and fiddle a bit with it, and yes, it seemed like a nice phone – not a lot bigger or heavier than my current Sony Ericsson thing. It featured a nice “sliding UI”, even if I had serious trouble moving around in the system and couldn't really figure out the navigation concept! I suspect all it would take is a little more time and perhaps a manual or someone explaining it to me.

OpenMoko

In the subsequent talk, Ole Tange from the OpenMoko project also handed out “fiddle-versions” of their primary phone, the Neo1973, for us in the audience to touch and hold. The major downside, however, was that both devices were dead, so we couldn't see anything on them, just hold them and feel them… eh, yeah. They're a bit on the biggish side and quite a bit bigger than my current phone in all three dimensions! He spoke about the upcoming 2nd version of the phone that is supposed to become available in Q1 2008, and given that it will feature wifi, bluetooth, accelerometers, GPS, a 640×480 pixel touch screen, accelerated graphics and a mini-B USB plug, and run an entirely open and free Linux version with documented hardware, it is indeed thrilling. The Neo1973's size is not attractive, but its internals are. This is in fact a unit I will seriously consider buying/hacking when/if it becomes available for purchase.

Other details in his OpenMoko talk gave me the impression that the software is not yet very far advanced. For example, he first made a comparison to the OLPC system with hardware and software items side by side, listing both as GTK+-based UIs, but then he also thanked Trolltech for having ported their Greenphone Qtopia system to OpenMoko. On my direct question whether that wasn't a bit contradictory, since surely they must be focusing on ONE of these graphics/widget systems for their main development, he went on to rant about how OpenMoko “is a computer” that can “run anything”. I'm not sure, but it certainly gave me the impression that there just is no main development… Where is OpenMoko at right now, really? Anyone know? I guess I should spend some time researching that, and also investigate a bit on the “running Rockbox on OpenMoko” front…

curl

When the time came for my talk at 15:00, we first had to mess about a bit since the computer I was supposed to borrow to run my presentation on was suddenly gone (used for the other track's talk, I later learned), but thanks to other people I soon had a replacement and I got on with it.

I know the topic by heart of course, curl being my primary open source project for ten years, and I know every bit of it and its history and so on, but making a fine presentation based on that is an entirely different story. Also, giving it in English adds a layer of, well, not complexity perhaps, but it makes it all a bit more rough around the edges, since even though I know English pretty well, my vocabulary isn't the largest and I don't always find the right synonyms and phrasing when trying to explain something or argue my case.

Also, since I don't quite know my own presentation by heart, it isn't really the best possible performance I can give, but what the heck. I tried to present curl and libcurl, what they are and what they're good for, why people use them, how the development is done and why YOU should use them now and in the future. The guys at fscons got all talks on video, so I hope to be able to see myself on video soon and I'll try to learn from that for my next talk. And of course those of you who weren't present at fscons will get your chance to see my pale face and listen to my Swedish-accented stumbling English! 😉 Oh, and I had to rush the presentation a bit towards the end when my 45 minutes ran out a little faster than I had anticipated, or was it the questions that popped up? Questions are good, since they make me aware the audience is with me and interested.

I’m not sure if the topic of curl is somewhat boring, or if it felt too technical or what, but I think I had less than 50% of the audience listening. The other talk going on while I spoke was a lightning talk session with a bunch of people.

Here are the slides from my talk, in a 31-page, 500K PDF: http://daniel.haxx.se/curl-20071208.pdf

LinuxBIOS

With a slightly dry mouth after this, I recharged myself with a cup of coffee and some cinnamon rolls and walked over to see the next talk. Or rather series of talks, since this was a “lightning talks” session where five guys spoke quickly about various topics. They covered web development with perl, a weird ajax system called gaia that seemed to involve a lot of .NET, a web development system of some sort named makumba, and a quick mention of a 10 gigabit fully open source router. For me, the most interesting piece was Peter Stuge's brief talk about LinuxBIOS, what it is, what it does and so on. That's really a to-the-metal project and I like getting back down to earth and on to real stuff. Much of what he said and explained about difficulties with documentation from hardware vendors etc is just so familiar to me based on Rockbox experiences. To the great enjoyment of the audience, Peter's live demo of LinuxBIOS booting up failed spectacularly, and after numerous resets it finally booted up and started playing loud music – when the following speaker was already halfway through his router presentation!

Closing

I only got to hear the beginning of the closing talk held by Georg Greve from FSF Europe, as I had to leave after 20 minutes or so to catch the cab that took me back to the train station, and I was on my way back to Stockholm again on the 18:42 train…

Did I mention that I got a t-shirt? I planned to include a picture of the shirt here, but I took a shot with my mobile phone when I got home and its camera is just so extremely crappy in low-light situations (even with all the lights in the room turned on) that I won't torture you by including it. I'll have to make another attempt later or find a link to someone else who did…

In conclusion: even though I only did a quick visit and didn't get to see that many talks, I liked what I saw and I had fun. It sounded like the guys doing this are seriously planning on doing it again next year. I hope they do, and that I'll manage to go there again, hopefully to give another talk!

Tunneling with libcurl

As I wrote a while ago, companies using HTTP proxies make people feel a need to break out of them.

Bryan is a friend who recently found out that his company is switching to a different proxy, and apparently both corkscrew and proxytunnel have problems with this new piece. Since libcurl offers quite a lot of functionality to accomplish almost exactly this, a new project was born: curltunnel.

One immediate benefit of using libcurl is the support for multiple authentication methods, in fact more than any of the above-mentioned tools.
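
To give an idea of the approach, here is a minimal sketch of how a tool like curltunnel might ask libcurl to establish a CONNECT tunnel through an HTTP proxy and then hand the raw socket back to the application. The proxy address, credentials and target host are made-up placeholders, and the real curltunnel code certainly differs from this.

  /* Hedged sketch: establish a CONNECT tunnel through an HTTP proxy with
   * libcurl and retrieve the connected socket. Host names, port and
   * credentials below are made-up examples, not curltunnel's actual code. */
  #include <stdio.h>
  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl;
    long sockfd = -1;

    curl_global_init(CURL_GLOBAL_ALL);
    curl = curl_easy_init();
    if(!curl)
      return 1;

    /* the URL is only used for the target host:port of the tunnel */
    curl_easy_setopt(curl, CURLOPT_URL, "http://remote.example.com:443");

    /* go through the corporate proxy and ask it for a CONNECT tunnel */
    curl_easy_setopt(curl, CURLOPT_PROXY, "http://proxy.example.com:8080");
    curl_easy_setopt(curl, CURLOPT_HTTPPROXYTUNNEL, 1L);

    /* let libcurl pick whatever auth method the proxy supports */
    curl_easy_setopt(curl, CURLOPT_PROXYAUTH, CURLAUTH_ANY);
    curl_easy_setopt(curl, CURLOPT_PROXYUSERPWD, "user:secret");

    /* stop once the connection (and tunnel) is set up, speak no protocol */
    curl_easy_setopt(curl, CURLOPT_CONNECT_ONLY, 1L);

    if(curl_easy_perform(curl) == CURLE_OK) {
      /* the socket can now be used to shuffle e.g. ssh traffic */
      curl_easy_getinfo(curl, CURLINFO_LASTSOCKET, &sockfd);
      printf("tunnel up, socket %ld\n", sockfd);
    }

    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return 0;
  }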

However, it seems our first quick stab at making this tool (currently 278 lines of code) made it work for several common cases but… not for Bryan's new proxy.

The current theory is that the proxy actually checks for SSL traffic and only lets that through, and thus it prevents the ssh server banner from appearing when we try to tunnel through the proxy to a remote ssh server on port 443. If further testing proves this correct, we will of course have to add an SSL layer to the mix.

URL Encode POST Data

Several months ago I did a job down south in Sweden – a three-hour train ride (one way, and I went down and back the same day…) on the fastest train we have in this country. It gave me some time on the train to tinker with things, and I didn't feel like bothering with the “Internet On Train” thing they so fancifully offer these days. I'm not saying it's a bad idea, I just felt that my Linux laptop and I would perhaps have to spend too much time fighting it to get it up to really enjoy it. Instead I wrote up a patch for curl for a feature we discussed ages ago: letting the command line client send POST data that gets URL encoded automatically!

The idea is of course that when you write a simple shell script of some sort and want to automate POSTs to a web site, it is somewhat complicated to URL encode the strings before you pass them on to curl. curl could instead get an option that does it for you.
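
For comparison, this is roughly the dance a libcurl-based program does today when it wants the same thing: escape each value with curl_easy_escape() and glue the pieces together before posting. It's a minimal sketch with a made-up form field and URL, just to show what the new command line option would automate for shell scripts.

  /* Hedged sketch: URL encode a form value with curl_easy_escape() and POST
   * it. The field name, value and URL are made-up examples. */
  #include <stdio.h>
  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    char *encoded;
    char postdata[256];

    if(!curl)
      return 1;

    /* encode the value part so spaces, ampersands etc survive the trip */
    encoded = curl_easy_escape(curl, "name with spaces & oddities", 0);
    if(!encoded) {
      curl_easy_cleanup(curl);
      return 1;
    }
    snprintf(postdata, sizeof(postdata), "comment=%s", encoded);

    curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/submit");
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS, postdata);

    curl_easy_perform(curl);

    curl_free(encoded);
    curl_easy_cleanup(curl);
    return 0;
  }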

Fast forward to the present: I've dug up the old patch again, had a discussion about it on the mailing list, and what do you know! Today I've posted a patch that introduces --data-urlencode, and I'm very interested in feedback or suggestions on how to polish it further and then commit it.

Human Connections

For fun, I created two “groups” on LinkedIn for two of the open source projects I'm perhaps the most active in. I'm not quite sure what benefit and good use we'll get from them, but anyway, they're created, and if you feel in any way related to Rockbox or curl, here are the links you can use to send a join request:

Rockbox: http://www.linkedin.com/e/gis/42081/49AF807A7908

cURL: http://www.linkedin.com/e/gis/42082/362F5916AFF1

And a link to my public LinkedIn profile

Fresh CA Cert Bundle Anyone?

The popular CA extract service on the curl web site converts the Firefox CA certs into a PEM file suitable for use with curl, wget or anything else OpenSSL-based that likes PEM formatted CA cert bundles.

The main script was fixed yesterday: it previously downloaded a nightly source code snapshot to get the “magic” file to convert from, but I noticed that the nightly source snapshots stopped being updated a good while ago, so the updates had stopped!

Now the script fetches only the certdata file it actually needs and converts it, so it downloads a lot less data in vain and thus also runs much faster. The PEM files offered on that page are now up-to-date with the most recent Firefox.
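
If you grab one of those PEM bundles, pointing libcurl at it is close to a one-liner. Here is a minimal sketch, where the bundle path and URL are just example values:

  /* Hedged sketch: verify a server certificate against a downloaded CA cert
   * bundle in PEM format. The file path and URL are made-up examples. */
  #include <stdio.h>
  #include <curl/curl.h>

  int main(void)
  {
    CURLcode res;
    CURL *curl = curl_easy_init();
    if(!curl)
      return 1;

    curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/");

    /* point at the PEM bundle generated from the Firefox certdata file */
    curl_easy_setopt(curl, CURLOPT_CAINFO, "/path/to/cacert.pem");

    /* make sure peer verification is actually on (it is by default) */
    curl_easy_setopt(curl, CURLOPT_SSL_VERIFYPEER, 1L);

    res = curl_easy_perform(curl);
    if(res != CURLE_OK)
      fprintf(stderr, "transfer failed: %s\n", curl_easy_strerror(res));

    curl_easy_cleanup(curl);
    return 0;
  }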

Crashing Firefox Goes libcurl

I guess I haven't been paying attention lately, but I stumbled over the Breakpad project, which incidentally is going to be used as the crash reporting tool for Firefox 3 (the original link to that is no longer working), and it uses libcurl (the original link to that quote is no longer working): “On Linux, libcurl is used for this function, as it is the closest thing to a standard HTTP library available on that platform.”

The wording implies that it uses something else on Mac OS X, but I’m not aware of any standard HTTP library on it. Am I missing something or are they going libcurl there too?

Also, I wonder if using different HTTP libraries on different platforms instead of a single one isn’t just begging for more problems than what it solves? As far as I know, libcurl has a few upsides compared to wininet for example. Of course, I’m not the man to tell how they should do their stuff.

libcurl DNS resolve problems on Leopard

I found this article by Jungle Dave titled Leopard DNS Issues (and work-around), which explains how libcurl built with IPv6 support may cause trouble on Mac OS X 10.5 (Leopard).

According to him, that's because getaddrinfo() causes an SRV lookup to be made, and that lookup may be either slow or discarded completely and thus cause trouble.

This adds yet another problem to getaddrinfo() resolves, then, since we already have a problem with it when resolving round-robin DNS names: more or less every machine has a bad /etc/gai.conf setup that makes getaddrinfo() return a sorted list instead of the “random” order that DNS admins out in the wild would prefer users to get…
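
If the trouble really stems from IPv6-capable getaddrinfo() lookups, one thing an application could try (purely a hedged sketch on my part, not something I have verified on Leopard) is to tell libcurl to only resolve IPv4 addresses for the affected transfers:

  /* Hedged sketch: restrict libcurl to IPv4-only name resolution, which may
   * sidestep IPv6-related getaddrinfo() quirks. Untested as a Leopard fix;
   * the URL is a made-up example. */
  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(!curl)
      return 1;

    curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/");

    /* only ask for IPv4 addresses when resolving host names */
    curl_easy_setopt(curl, CURLOPT_IPRESOLVE, CURL_IPRESOLVE_V4);

    curl_easy_perform(curl);
    curl_easy_cleanup(curl);
    return 0;
  }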

aget compared to curl

As you should know, we maintain this curl comparison table on the curl web site, and it lists a set of free tools and how they compare against curl and each other in various aspects. If you want more features compared or other tools included, please tell us. Also, if you disagree with any of the facts stated there, just shout!

The other day I got an email asking me to add aget to the table, and since it is a free tool (original BSD licensed) with a similar purpose it would indeed fit.

So I downloaded aget 0.4 and had a go at it.

  1. The “stable” 0.4 version doesn't build out of the tarball. It makes wrong assumptions about “errno” and thus I had to manually poke at 3 source files to make it produce a working binary! While this error is claimed to be fixed in the “devel” version, that version fails to build due to compiler errors instead!
  2. My first test was to download aget's own home page with aget… and it failed. It claims the page is 0 bytes, doesn't download anything and outputs something about a bad seek of 0 bytes!
  3. This really turned me off, but then I thought I should report this back to the guys rather than just blog it… but there's no email address in the package that seems suitable, and when checking the site I found a reference to a mailing list, but trying to read the list's archive just redirects back to the main page! So blogging it is.
  4. aget 0.4 is from 2002 and the aget devel version is from June 2004. Development seems to have stopped.
  5. I decided aget isn't going to be added to the table by me at this time. It'll have to mature some more first (and given the age of the tarballs I doubt that'll happen…). I also read through the source code a bit and it really gives the impression of being a young project that hasn't yet had time to settle, since there are numerous suspicious assumptions and pieces of source code doing “funny” things.

curl and libcurl 7.17.1

7.17.1 – the 102nd release of curl – is out, with less than 5 months left until our ten year anniversary!

The previous release (7.17.0) included a few larger internal changes, and unfortunately that had the downside of bringing in a whole array of new bugs, which we have now spent almost two months polishing off.


Apart from the twenty or so bug fixes, a range of new things is introduced as well, including improved NSS support, --proxy-negotiate, --post301 (to make curl act more standards compliant on HTTP 301 responses) and --hostpubmd5.

libcurl hackers will appreciate CURLOPT_OPENSOCKETFUNCTION and CURLOPT_COPYPOSTFIELDS (the latter a complement to the existing CURLOPT_POSTFIELDS, which got broken in 7.17.0 if you posted binary data containing a zero byte).
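
For that binary-data case, the combination that should work (a minimal sketch with made-up data and URL, so treat it as illustrative rather than authoritative) is to set the post size first and then hand libcurl the buffer to copy:

  /* Hedged sketch: POST binary data that contains a zero byte by telling
   * libcurl its exact size and letting CURLOPT_COPYPOSTFIELDS copy it.
   * The data and URL are made-up examples. */
  #include <curl/curl.h>

  int main(void)
  {
    /* 8 bytes of binary data with an embedded zero byte */
    static const char blob[8] = { 'c', 'u', 'r', 'l', 0, 0x17, 0x17, 0x01 };

    CURL *curl = curl_easy_init();
    if(!curl)
      return 1;

    curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/upload");

    /* tell libcurl the size first, since the data cannot be strlen()ed... */
    curl_easy_setopt(curl, CURLOPT_POSTFIELDSIZE, (long)sizeof(blob));

    /* ...then let libcurl make its own copy of the buffer */
    curl_easy_setopt(curl, CURLOPT_COPYPOSTFIELDS, blob);

    curl_easy_perform(curl);
    curl_easy_cleanup(curl);
    return 0;
  }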

7.17.1 contains contributions by at least 16 different people (me not included).