Tag Archives: cURL and libcurl

Make Them Pick Us

Given that there is an endless series of open source and free software projects around, what makes companies and projects likely to choose to depend on and use one of the existing ones, rather than write it themselves or possibly buy a closed-source solution instead? I’ll go through a few of the things that might matter, and describe how curl and libcurl relate to them.

Proven Track Record

The project needs to have been around for a while, so that outsiders can see that development continues, that there is ongoing interest in the project from developers and users, that bug reports are acknowledged and fixed, that it has been scrutinized for the most obvious security problems, etc. The curl project started almost ten years ago, has done more than one hundred releases, and there is now more developer activity in the project than ever before.

Certified Goodness

Companies and associations that “certify” software give outsiders an independent view of a project’s quality.

The company OpenLogic offers “certification” of open source software so that companies can feel safer using it. I must admit I like seeing that they’ve certified curl and libcurl. You can read their sales-pitch style description of their certification process here.

Of course I would also like to see curl reach rung 2 on the scan.coverity.com list, as it would mean a second source (independent of the first) also vouching for a reasonable level of quality in the product.

If they did it so can we

With a vast list of companies and products that already use the project, newcomers can see that others already depend on it, and that fact alone makes the project look like a more solid and trustworthy choice.

Being the answer when the question comes

Being known is important. When someone asks for help and guidance about what possible solutions there are to a particular problem, you want a large portion of your target audience to know about your project and to say “oh for doing X you could try project Y”. I want people to think libcurl when asked a question about doing internet-related transfers, like HTTP or FTP.

This is of course a matter of marketing, and getting known to lots of people is hard for an open source project run by nothing but volunteers, with no particular company backing.

Being a fine project

Of course the prerequisite to all points above is that the project is well maintained, the source is written in a nice manner and that there’s an open and prosperous community…

Distros Going Their Own Way

Lemme take the opportunity to express my serious dislike of a particular habit in the open source world, frequently seen performed by various distros (and by distro I mean it in the wider sense here, not limited to only Linux distros):

They fix problems by patching code in projects they ship/offer, but they don’t discuss the problem upstream and they don’t send their patch upstream. In fact, in one particular case in a project near to me (make a guess!), I’ve even tried to contact the patch author(s) over the years, but they’ve never responded, so even though I know of their patch, I can’t get anyone to explain to me why they think they need it…

So hello hey you packagers working on distros! When you get a bug report that clearly is a problem with the particular tool/project, and not really a problem with your particular distro’s way of doing things, please please please forward it upstream, or at least involve the actual project team behind the tool in the discussions around the bug and possible solutions. And if you don’t do that, the very least you should do is make sure the patches you create and apply are forwarded upstream to the project team.

How else are we gonna be able to improve the project if you absorb the bug reports and you keep fixes hidden? That’s not a very open source’ish attitude, methinks.

Recent example that triggered this post.

curl and libcurl 7.18.0


I’m happy to announce the 103rd curl release: curl and libcurl 7.18.0.

No fewer than 35 people besides myself contributed info, reports and/or code to make this release what it is. We’ve added a bunch of new features and we’ve fixed well over 30 different bugs. This is the news:

Changes:

Bugfixes:

  • curl-config --features and --protocols show the correct output when built with NSS, and also when SCP, SFTP and libz are not available
  • free problem in the curl tool for users with empty home dir
  • curl.h version 7.17.1 problem when building C++ apps with MSVC
  • SFTP and SCP use persistent connections
  • segfault on bad URL
  • variable wrapping when using absolutely huge send buffer sizes
  • variable wrapping when using debug callback and the HTTP request wasn’t sent in one go
  • SSL connections with NSS done with the multi-interface
  • setting a share no longer activates cookies
  • Negotiate now works on auth and proxy simultaneously
  • support HTTP Digest nonces up to 1023 letters
  • resumed ftp upload no longer requires the read callback to return full buffers
  • no longer default-appends ;type= on FTP URLs thru proxies
  • SSL session id caching
  • POST with callback over proxy requiring NTLM or Digest
  • Expect: 100-continue flaw on re-used connection with POSTs
  • build fix for MSVC 9.0 (VS2008)
  • Windows curl builds failed to truncate the output file when retrying a download
  • SSL session ID cache memory leak
  • bad connection re-use check with environment variable-activated proxy use
  • --libcurl now generates a return statement as well (see the sketch after this list)
  • socklen_t is no longer used in the public includes
  • time zone offsets from -1400 to +1400 are now accepted by the date parser
  • allows more spaces in WWW/Proxy-Authenticate: headers
  • curl-config --libs skips /usr/lib64
  • range support for file:// transfers
  • libcurl hang with huge POST request and request-body read from callback
  • removed extra newlines from many error messages
  • improved pipelining
  • improved OOM handling for data url encoded HTTP POSTs when read from a file
  • test suite could pick wrong tool(s) if more than one existed in the PATH
  • curl_multi_fdset() failed to return socket while doing CONNECT over proxy
  • curl_multi_remove_handle() on a handle that is in use for a pipeline now breaks that pipeline
  • CURLOPT_COOKIELIST memory leaks
  • progress meter/callback during http proxy CONNECT requests
  • auth for http proxy when the proxy closes connection after first response
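
For those not already familiar with the --libcurl option mentioned in the list above: it makes the curl command line tool write out a C source file that performs the same transfer through libcurl. Here’s a rough sketch of what such generated code looks like (simplified by me; the real output sets more options and carries a comment header):

    #include <curl/curl.h>

    /* roughly what "curl --libcurl example.c http://example.com/" writes out */
    int main(void)
    {
      CURLcode ret;
      CURL *hnd = curl_easy_init();

      curl_easy_setopt(hnd, CURLOPT_URL, "http://example.com/");
      ret = curl_easy_perform(hnd);   /* do the transfer */
      curl_easy_cleanup(hnd);

      return (int)ret;                /* the return statement that is now generated too */
    }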

curl with NSS and Fedora

Dave Jones blogged about his recent problems with curl on Fedora 8. It seems to be a problem somewhere in, or related to, the NSS library that Fedora links curl against for SSL/TLS these days.

What I find a bit annoying with this situation is that I’m using Debian unstable and dist-upgrading fairly frequently to stay on the bleeding edge, and yet I don’t have the equivalent NSS version Fedora has. What’s perhaps worse: I don’t even know how to get it and build my own local version! Is Fedora using their own patched version of this (a rhetorical question, as I’m quite sure they are)? Is it possible to get that version or patch, so that I can build it and test on my non-Fedora development machine(s)?

So, even though it really isn’t my problem or my issue to deal with, I couldn’t even try to reproduce his problem on my own!

Axis2/C going libcurl?

Apache’s Axis2/C project, said to be “the only complete SOAP engine”, is considering moving over to using libcurl for HTTP transport by default. At least Axis2/C developer Dinesh Premalal thinks they should; he lists multiple reasons in his blog, and I can of course do nothing but agree.

One reason he failed to mention is that we all (Axis2/C users and libcurl users alike) benefit from them switching to libcurl, since then we get a larger combined potential developer base, more eyes on the code and more testing done, and thus, in the end, a better transport library all around.

I’m slightly puzzled by Dinesh’s blog entry since this bug tracker entry submitted to Axis2/C mentions their failure to include curl’s copyright/license text in the distribution, which seems to imply that they already use (parts of) curl. Or?

curl 7.18.0 feature freeze

I just mailed the curl-library list about us entering feature freeze for the upcoming 7.18.0 release. The plan is to have two weeks of bug fixing and time for people to find bugs, before we release it to the public. Please get a daily snapshot and give it a spin!

Here are the changes that’ll be coming:

… and there are 26 bug fixes mentioned in the RELEASE-NOTES in progress so far!

curl on scan.coverity.com

On scan.coverity.com, the nice guys at Coverity run scans on open source projects to check for flaws in their source code. Their list currently includes 265 projects, and curl is one of them. I have only good words to say about their scanning: they found no fewer than 27 flaws in curl 7.16.1, and only one of them was a false positive. All the others were valid and true flaws that we could fix. I don’t think any of them was a serious security risk, but still: 26 bugs detected in one go.

On January 8th 2008, Coverity announced their “rung 2” for eleven projects that had zero flaws left in rung 1, and the rung 2 projects get an upgraded analysis. curl was also at zero flaws left, but it isn’t clear to me what else we could do to reach rung 2, or even how we can get them to do a follow-up scan on a newer release, since 7.16.1 is quite old by now and, with all the changes in the code over time, there’s always the risk that new nasty bugs have crept in… So we’re still at rung 1, with no recent release scanned.

Aiming for 7.18.0 in January 2008

This info was also posted to the curl-library list today.

I previously thought of releasing 7.18.0 in December, but since there are still outstanding topics on the list and since there’s no pressure from any serious bug fixes or anything, I decided we can just as well wait until January. I want January 13th to be the feature freeze day, after which no new features will be committed until the release, which hopefully could then be done by January 28th or so.

The live-updated TODO-RELEASE document will change over time, but it currently contains these items:

Is there anything we’ve forgotten that we should include in the next release? To get a feel for what the next release will look like, check out the RELEASE-NOTES in progress, or try out a daily snapshot!

Fresh CA Cert Bundle Anyone?

The popular CA extract service on the curl web site converts the Firefox CA certs into a PEM file suitable for use with curl, wget or anything else OpenSSL-based that likes PEM-formatted CA cert bundles.

The main script was fixed yesterday. It previously fetched a nightly source code snapshot to get the “magic” file to convert from, but I noticed that those nightly source snapshots stopped being updated a good while ago, so our updates had stopped too!

Now the script only fetches the actually needed certdata file and converts it, so it downloads a lot less data in vain and thus also runs much faster. The PEM files offered on that page are once again up-to-date with the most recent Firefox.
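
As a usage sketch (my own example, not from the site; I’m assuming the extracted bundle was saved as cacert.pem, so adjust the file name to whatever you actually downloaded), this is how a libcurl-using program would point an HTTPS transfer at such a bundle:

    #include <stdio.h>
    #include <curl/curl.h>

    int main(void)
    {
      CURLcode res;
      CURL *curl = curl_easy_init();

      if(curl) {
        curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/");
        /* verify the peer against the PEM bundle extracted from Firefox */
        curl_easy_setopt(curl, CURLOPT_CAINFO, "cacert.pem");

        res = curl_easy_perform(curl);
        if(res != CURLE_OK)
          fprintf(stderr, "transfer failed: %s\n", curl_easy_strerror(res));

        curl_easy_cleanup(curl);
      }
      return 0;
    }

The command line tool equivalent is simply: curl --cacert cacert.pem https://example.com/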

wget going libcurl?

Micah Cowan is the current maintainer of GNU Wget, and he recently posted a long mail to the wget mailing list titled “Thoughts on Wget 1.x, 2.0”.

Two fun quotes for the curious who don’t feel like reading the whole post:

1. On the subject of making wget deal with multiple simultaneous connections/requests: The obvious solution to that is to use c-ares, which does exactly that: handle DNS queries asynchronously. Actually, I didn’t know this until just now, but c-ares was split off from ares to meet the needs of the curl developers.

2. Following the first reasoning, they can indeed get away with even less work if they base that work on an existing solution: While I’ve talked about not reinventing the wheel, using existing packages to save us the trouble of having to maintain portable async code, higher-level buffered-IO and network comm code, etc, I’ve been neglecting one more package choice. There is, after all, already a Free Software package that goes beyond handling asynchronous network operations, to specifically handle asynchronous _web_ operations; I’m speaking, of course, of libcurl. There would seem to be some obvious motivation for simply using libcurl to handle all asynchronous web traffic, and wrapping it with the logic we need to handle retries, recursion, timestamping, traversing, selecting which files to download, etc. Besides async web code, of course, we’d also automatically get support for a number of various protocols (SFTP, for example) that have been requested in Wget.
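
To make that “wrapping” idea a bit more concrete, here is a minimal sketch of my own (not from Micah’s mail; the URL and the retry count of three are made up for illustration) of the kind of retry logic a libcurl-based wget could layer on top of the library:

    #include <stdio.h>
    #include <curl/curl.h>

    int main(void)
    {
      CURLcode res = CURLE_FAILED_INIT;
      int attempt;
      CURL *curl = curl_easy_init();

      if(!curl)
        return 1;

      curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/");

      /* the application, not libcurl, decides how and when to retry */
      for(attempt = 1; attempt <= 3; attempt++) {
        res = curl_easy_perform(curl);
        if(res == CURLE_OK)
          break;
        fprintf(stderr, "attempt %d failed: %s\n", attempt, curl_easy_strerror(res));
      }

      curl_easy_cleanup(curl);
      return (res == CURLE_OK) ? 0 : 1;
    }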

I am of course happy to see that the consideration exists, even if it goes no further than being expressed in a mail. I did float this idea to the wget people back in 2001, and even though we’re now more than six years down the road, the situation is even clearer: libcurl is a much more capable and proven transport layer solution, and it supports many more protocols than wget does.

Me biased? naaah… 🙂