Category Archives: cURL and libcurl

curl and/or libcurl related

curl feature freeze March 20 2008

It is yet again time to pause the add-new-features craze in order to settle down and fix a few more remaining bugs before we ship another curl and libcurl release at the beginning of April.


So from March 20 we hold back and only fix bugs for about two weeks, until we release curl and libcurl 7.18.1.

The only flaw currently mentioned in TODO-RELEASE to fix before this release is the claimed race condition in the win32 gethostbyname_thread code, but since the reporter no longer responds and we can’t reproduce the problem, it will most likely just get buried and forgotten.

Other problems currently mentioned on the mailing list are a POST problem with Digest authentication and read callbacks, and mysteriously bad progress callbacks for uploads, but none of them seems serious enough to hold the release for should they turn out to be hard to fix.

Yes, I picked the date on purpose as that is the magic date in this project. Especially this year.

Summer of code ideas and mentors!

To get good results from Google’s Summer of Code, we need a fair number of volunteering mentors and a good set of interesting projects to attract students.

If your interest is in the Rockbox project, add your project ideas or add yourself as a mentor on this wiki page.

If your interest is in the cURL project, read this page about the existing ideas, and provide new ones or put yourself forward as a mentor on the mailing list!

Organizations can apply to take part starting tomorrow, March 3 2008.

AOL UK uses curl in disguise

Information about this was mailed to me by a friend, but it is easily verified as I’ll describe below.

America Online in the UK (AOL UK) is using our cURL application (without including the license anywhere) as part of their automated broadband router configuration CD for their AOL UK customer base. The CD is provided to all AOL UK customers and the automated router configuration component using cURL has been included with it for a couple of years.

The software includes the cURL application renamed to “AOL_Broadband_Installer.EXE”. There is no license included or any mention of the license anywhere on the CD or in the installer, contrary to what’s required by http://curl.haxx.se/docs/copyright.html.

The md5sum for AOL_Broadband_Installer.EXE matches the win32 binary in the curl-7.15.3-win32-nossl.zip release package I personally built and offer for download…!

If you want to check it out yourself, the direct link I figured out to the installer is here and I found it on this page (download the “easy installer” for Netgear DG834G).

Update: see my reply below.

Google Summer of Code for cURL?

Google Summer of Code 2007 front print

As I was involved in GSoC 2007 within the Rockbox project, I floated the idea on the libcurl mailing list just yesterday that perhaps this is a good year for the cURL project to apply to become a mentoring organization, so that we can host students doing GSoC work.

If so, there is no point unless we can at least present a bunch of interesting projects to lure students to us, to have them work on improving (lib)curl and do stuff we might otherwise have a hard time getting done.

What would you like to see done that you think would make a good project for a student to work on during the summer of 2008?

New protocols? Fixing the last remaining blocking calls within libcurl? Fixing up or replacing language bindings? It’s not strictly a requirement that we come up with the best ideas, since students apply with their own suggestions anyway, but we can provide good suggestions and ideas that will attract students and make them choose to work on our project, should we be selected as a mentoring organization.


libcurl built with yassl

yassl is a comparatively small SSL/TLS library that libcurl can be built to use instead of one of the other SSL/TLS libraries libcurl supports (OpenSSL, GnuTLS, NSS and QSSL). However, yassl differs somewhat from the others in that it provides an OpenSSL-emulating API layer, so in libcurl we use pretty much exactly the same code for yassl as we do for OpenSSL.

yassl claimed libcurl support several years ago, and indeed libcurl builds fine with it and you can even do some basic SSL operations with it, but the emulation is just not good enough to let the curl test suite pass, so lots of stuff breaks when libcurl is built with yassl.

I did this same test 15 months ago and it had roughly the same problems then.

As far as I know, the other three main libraries are much more stable (the QSSL/QsoSSL library is only available on the AS/400 platform, so I don’t personally know much about how it performs), and they remain the libraries I recommend users select from if they want something proven to work.
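
If you are unsure which SSL/TLS backend a particular libcurl build actually ended up with, you can ask libcurl itself at run-time with curl_version_info(). A minimal sketch (the output formatting is just my own choice for the example):

    #include <stdio.h>
    #include <curl/curl.h>

    int main(void)
    {
      /* query run-time version info for this libcurl build */
      curl_version_info_data *info = curl_version_info(CURLVERSION_NOW);

      printf("libcurl %s\n", info->version);

      /* ssl_version names the SSL/TLS library libcurl was built with,
         or is NULL if the build has no SSL support at all */
      if(info->ssl_version)
        printf("SSL backend: %s\n", info->ssl_version);
      else
        printf("built without SSL support\n");

      return 0;
    }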

CA cert bundle or not

Since the dawn of time (at least it feels that long) we’ve included a copy of a ca cert bundle in the curl releases. That ca cert bundle originates from Netscape 4.72 and no cert has been added to it since the year 2000(!)

Instead, we’ve offered things like an easily downloadable version on our web site, and documented that getting a fresh bundle is what you often need to do.

Anyway, a recent bug report triggered us to discuss updating the bundle in the curl tarballs. We’ll just need to sort out the license situation first, but we’re slowly progressing there and I think we’re pretty much fine with how things stand right now.

However, the question is perhaps better put the other way: why should we bother to include a ca cert bundle in the first place? Most users will already have one in their system (since basically all SSL-based applications want one) and those that don’t can very easily get an updated one using our online server or a recent perl script added to the curl source tree.
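
On the application side, the only thing needed once a bundle is in place is telling libcurl where it is. A minimal sketch, assuming the bundle has been saved to a local file (the URL and the path below are made up for the example):

    #include <stdio.h>
    #include <curl/curl.h>

    int main(void)
    {
      CURLcode res;
      CURL *curl = curl_easy_init();

      if(curl) {
        /* the URL and the bundle path are placeholders for this example */
        curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/");
        curl_easy_setopt(curl, CURLOPT_CAINFO, "/path/to/ca-bundle.crt");

        /* verify the peer's certificate against that bundle
           (this is the default, set here only for clarity) */
        curl_easy_setopt(curl, CURLOPT_SSL_VERIFYPEER, 1L);

        res = curl_easy_perform(curl);
        if(res != CURLE_OK)
          fprintf(stderr, "transfer failed: %s\n", curl_easy_strerror(res));

        curl_easy_cleanup(curl);
      }
      return 0;
    }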

I hope I don’t have to tell you that I value all input I can get on this issue!

Make Them Pick Us

There is an endless series of open source and free software projects around. What makes companies and projects likely to choose to depend on and use one of the existing ones, rather than write their own or possibly buy a closed-source solution instead? I’ll try to go through a few of the things that might matter, and describe how curl and libcurl relate to them.

Proven Track Record

The project needs to have been around for a while, so that external people can see that development continues and that there is a continued interest in the project from developers and users; that bug reports are acknowledged and fixed, that it has been scrutinized for the most obvious security problems, and so on. The curl project started almost ten years ago, has done more than one hundred releases, and there is now more developer activity in the project than ever before.

Certified Goodness

Companies and associations that “certify” software can give you an outside view of a project’s quality.

The company named OpenLogic offers “certification” of open source software for companies to feel safer. I must admit I like seeing they’ve certified curl and libcurl. You can get their sales-pitch style description of their certification process here.

Of course I would also like to see curl reach rung 2 on the scan.coverity.com list, as it would mean a second source (independent of the first) also claiming that there’s a reasonable level of quality in the product.

If they did it, so can we

With a vast list of companies and products already using the project, newcomers can see that this or that company and project already depends on it, and that fact alone makes the project even more likely to be a solid and trustworthy choice.

Being the answer when the question comes

Being known is important. When someone asks for help and guidance about what possible solutions there are to a particular problem, you want a large portion of your target audience to know about your project and to say “oh for doing X you could try project Y”. I want people to think libcurl when asked a question about doing internet-related transfers, like HTTP or FTP.

This is of course a matter of marketing, and getting known to lots of people is hard for an open source project run by nothing but volunteers, with no particular company backing.

Being a fine project

Of course the prerequisite for all the points above is that the project is well maintained, the source is nicely written and there’s an open and thriving community…

Distros Going Their Own Way

Lemme take the opportunity to express my serious dislike of a particular habit in the open source world, frequently seen in various distros (and by distro I mean it in the wider sense, not limited to Linux distros):

They fix problems by patching code in projects they ship/offer, but they don’t discuss the problem upstream and they don’t send their patch upstream. In fact, in one particular case in a project near to me (make a guess!) I’ve even tried to contact the patch author(s) over the years, but they’ve never responded, so even though I know of their patch, I can’t get anyone to explain to me why they think they need it…

So hello hey you packagers working on distros! When you get a bug report that clearly is a problem with the particular tool/project, and that isn’t really a problem with your particular distro’s way of doing things, please please please forward it upstream, or at least involve the actual project team behind the tool in the discussions around the bug and possible solutions. And if you don’t do that, the very least you should do is to make sure the patches you write and apply are forwarded upstream to the project team.

How else are we gonna be able to improve the project if you absorb the bug reports and you keep fixes hidden? That’s not a very open source’ish attitude, methinks.

Recent example that triggered this post.

curl and libcurl 7.18.0


I’m happy to announce the 103rd curl release: curl and libcurl 7.18.0.

No fewer than 35 people besides myself contributed info, reports and/or code to make the release what it turned out to be. We’ve added a bunch of new features and solved well over 30 different bugs. This is the news:

Changes:

Bugfixes:

  • curl-config --features and --protocols show the correct output when built with NSS, and also when SCP, SFTP and libz are not available
  • free problem in the curl tool for users with empty home dir
  • curl.h version 7.17.1 problem when building C++ apps with MSVC
  • SFTP and SCP use persistent connections
  • segfault on bad URL
  • variable wrapping when using absolutely huge send buffer sizes
  • variable wrapping when using debug callback and the HTTP request wasn’t sent in one go
  • SSL connections with NSS done with the multi-interface
  • setting a share no longer activates cookies
  • Negotiate now works on auth and proxy simultaneously
  • support HTTP Digest nonces up to 1023 letters
  • resumed ftp upload no longer requires the read callback to return full buffers
  • no longer default-appends ;type= on FTP URLs thru proxies
  • SSL session id caching
  • POST with callback over proxy requiring NTLM or Digest (the callback-driven POST pattern is sketched after this list)
  • Expect: 100-continue flaw on re-used connection with POSTs
  • build fix for MSVC 9.0 (VS2008)
  • Windows curl builds failed to truncate files when retrying a download
  • SSL session ID cache memory leak
  • bad connection re-use check with environment variable-activated proxy use
  • --libcurl now generates a return statement as well
  • socklen_t is no longer used in the public includes
  • time zone offsets from -1400 to +1400 are now accepted by the date parser
  • allows more spaces in WWW/Proxy-Authenticate: headers
  • curl-config --libs skips /usr/lib64
  • range support for file:// transfers
  • libcurl hang with huge POST request and request-body read from callback
  • removed extra newlines from many error messages
  • improved pipelining
  • improved OOM handling for data url encoded HTTP POSTs when read from a file
  • test suite could pick wrong tool(s) if more than one existed in the PATH
  • curl_multi_fdset() failed to return socket while doing CONNECT over proxy
  • curl_multi_remove_handle() on a handle that is in use for a pipeline now breaks that pipeline
  • CURLOPT_COOKIELIST memory leaks
  • progress meter/callback during http proxy CONNECT requests
  • auth for http proxy when the proxy closes connection after first response
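
Several of the fixes above concern HTTP POSTs where the request body is handed to libcurl through a read callback rather than as a plain memory buffer. For reference, this is roughly what that pattern looks like with the easy interface; the URL and payload are made up for the example:

    #include <stdio.h>
    #include <string.h>
    #include <curl/curl.h>

    /* the data we want to POST, fed to libcurl piece by piece */
    struct upload {
      const char *data;
      size_t pos;
    };

    /* read callback: hand libcurl at most size*nmemb bytes per call,
       returning 0 when there is nothing left to send */
    static size_t read_cb(char *ptr, size_t size, size_t nmemb, void *userp)
    {
      struct upload *u = userp;
      size_t room = size * nmemb;
      size_t left = strlen(u->data) - u->pos;
      size_t copy = left < room ? left : room;

      memcpy(ptr, u->data + u->pos, copy);
      u->pos += copy;
      return copy;
    }

    int main(void)
    {
      struct upload u = { "name=daniel&project=curl", 0 };
      CURLcode res;
      CURL *curl = curl_easy_init();

      if(curl) {
        /* placeholder URL for the example */
        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/upload");
        curl_easy_setopt(curl, CURLOPT_POST, 1L);
        curl_easy_setopt(curl, CURLOPT_READFUNCTION, read_cb);
        curl_easy_setopt(curl, CURLOPT_READDATA, &u);
        curl_easy_setopt(curl, CURLOPT_POSTFIELDSIZE, (long)strlen(u.data));

        res = curl_easy_perform(curl);
        if(res != CURLE_OK)
          fprintf(stderr, "POST failed: %s\n", curl_easy_strerror(res));

        curl_easy_cleanup(curl);
      }
      return 0;
    }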