curl up in Stockholm 2018

Welcome all curl hackers and fans to Stockholm 2018! We are hosting the curl up developers conference on April 14-15 2018 in my home city, in what now might really become an annual tradition. Remember Nuremberg 2017?

All details are collected and updated on the curl up 2018 wiki page:

curl-up-2018

curl up 2018 will again be two full days over a weekend, and we will make an effort to keep the attendance fee at a minimum.

Presentations by core curl contributors on how things work, what we’ve done lately, what we’re working on now, what we should work on in the future and blue-sky visions about what curl should become when it grows up. Everyone attending is encouraged to present something. All this, mixed with lots of discussions, Q&As and socializing.

This time, I hope you will also have the chance to arrive early and spend the Friday, or a part of it, working hands-on with other curl developers on actual programming/debugging of curl features or bugs. The curl-hacking-Friday? While not firmly organized yet, I’d like to see this become reality. Let me know if that’s something you’d be interested in participating in.

Who should come?

Anyone interested in curl, curl development or how to best use curl in applications or solutions. Here’s your chance to learn a lot and an excellent opportunity to influence the future of curl. And to get some short glimpses of springtime Stockholm when going back and forth to the curl up venue!

Sign up?

We will open up the “ticket booth” in January/February 2018, so just keep your eyes and ears open and you won’t miss it! The general idea is to keep the fee at a minimum, ideally zero. Currently the exact price depends on how we manage to cover the remaining costs with friendly sponsors.

Sponsor us!

We are looking for sponsors. If your company is interested and willing to help us out, in any capacity, please contact me! Remember that our project has no funds of its own and we have no particular company backing.

Where in Stockholm exactly?

We will spend the curl up 2018 weekend at Goto 10, which is near both the Skanstull and Gullmarsplan subway stations.

Action photo from curl up 2017 when Kamil Dudka spoke about Red Hat’s use of curl built with NSS.

Firefox Quantum

Next week, Mozilla will release Firefox 57. It is also referred to as Firefox Quantum, from the project name we’ve used for all the work that has been put into making this the most awesome Firefox release ever. This is underscored by the fact that I’ve been mailed release swag for the first time in my four years so far as a Mozilla employee.

Firefox 57 is the major milestone hundreds of engineers have worked really hard toward during the last year or so, and most of the efforts have been focused on performance, or perhaps perceived end-user snappiness. Early comments I’ve read and heard hint that the difference is indeed quite noticeable. I think every single Mozilla engineer (and most non-engineers as well) has contributed to at least some part of this, and of course many have done a lot. My personal contributions to 57 are not much to write home about, but are mostly a stream of minor things that combined at least move the notch forward.

[edited out some secrets I accidentally leaked here.] I’m a proud Mozillian and being part of a crowd that has put together something as grand as Firefox 57 is an honor and a privilege.

Releasing a product to hundreds of millions of end users across the world is interesting. People get accustomed to things, get emotional and don’t particularly like change very much. I’m sure Firefox 57 will also get a fair share of sour feedback and comments written in uppercase. That’s inevitable. But sometimes, in order to move forward and do good stuff, we have to make some tough decisions for the greater good that not everyone will agree with.

This is however not the end of anything. It is rather the beginning of a new Firefox. The work on future releases goes on; we will continue to improve the web experience for users all over the world. Firefox 58 will have even more goodies, and I know there is much more good stuff planned for the releases coming in 2018 too…

Onwards and upwards!

(Update: as I feared in this text, I got a lot of negativity, vitriol and criticism in the comments to this post. So much that I decided to close down comments for this entry and delete the worst ones.)

My night at the museum

Thursday October 19, 2017,

I arrived at the Technical Museum in Stockholm together with my two kids just a short while before 17:30. A fresh, cool and clear autumn evening. For this occasion I had purchased myself a brand new suit, as I hadn’t bought one in almost twenty years and it had been almost that long since I last wore one. I went for a slightly less conservative purple-colored shirt with the dark suit.

Apart from my kids, my wife was of course also present, and so were my brother Björn and my parents-in-law. Plus a few hundred other visitors, most of them of course unknown to me.

My eleven year old son truly appreciates this museum so we took the opportunity to quickly check out parts of the exhibitions while the pre-event mingling went on and drinks were served. Not too long though as we were soon asked to proceed to the restaurant part and take our assigned seats. I was seated at table #6.

The whole evening was not entirely “mine”, but as I am the winner of this year’s Polhem Prize it was set up to eventually lead to the handing over of the award to me. An evening for me. Lots of attention on me and references to my work throughout the evening, which otherwise had the theme of traffic safety (my guess is that’s partly due to last year’s prize winner, who was a lead person in the invention of seat belts in cars).

A three-course dinner, with some entertainment intermixed. At my table I sat next to some brilliant and interesting people and I had a great time and good conversations. Sitting across the table from His Majesty the King of Sweden was an unexpected and awesome honor.

Somewhere mid-way through the evening, a short movie was presented on the big screens. A (Swedish-language) movie with me trying to explain what curl is, what it does and why I’ve made it. I think the movie turned out really great and I think it helps explain curl to non-techies (including my own family). The movie is the result of a perhaps 40-minute interview/talk we did on camera and then a fair amount of skilled editing by the production company. (Available here.)

At around 21:30 I was called on stage. I received a gold medal from the king and shook his hand. I also received a diploma and a paper with the award committee’s motivation for me getting the prize. And a huge bouquet of lovely flowers. A bit more than what I could hold in my arms, really.

(me, and Carl XVI Gustaf, king of Sweden)

As the king graciously offered to hold my diploma and medal, I took the microphone and expressed a few words of thanks. I was and I still am genuinely and deeply moved by receiving this prize. I’m happy and proud. I said my piece in which I explicitly mentioned my family members by name: Anja, Agnes and Rex for bearing with me.

(me, H.M the king and Cecilia Schelin Seidegård)

Afterwards I received several compliments on my short speech, which made me even happier. Who would’ve thought that was even possible?

I posed for pictures, shook many hands, received many congratulations and even participated in a few selfies, until it was time for me and my family to escape into a taxi and go home.

What a night. In the cab home we scanned social media and marveled at pictures and mentions. I hadn’t checked my phone even once during the event so things had piled up a bit. It’s great to have so many friends and acquaintances who shared this award and moment with us!

I also experienced a strong “post award emptiness” sort of feeling. Okay, that was it. That was great. Now it’s over. Back to reality again. Back to fixing bugs and responding to emails.

Thank you everyone who contributed to this! In whatever capacity.

The Swedish motivation (shown in a picture above) goes like this, translated to English with Google and edited by me:

Motivation for the Polhem Prize 2017

Our modern era consists of more and more ones and zeroes. Each individual programming tool that instructs technical machines to do what we want has its own important function.

Everything that is connected needs to exchange information. Twenty years ago, Daniel Stenberg started working on what we now call cURL. Since then he has spent late evenings and weekends, doing unpaid work to refine his digital tool. It consists of open source code and allows you to retrieve data from home page URLs. The English letter c, see, makes it “see URL”.

In practice, its widespread use means that millions, up to billions of people, worldwide, every day benefit from cURL in their mobile phones, computers, cars and a lot more. The economic value created with this cannot be overestimated.

Daniel Stenberg initiated the project, keeps it together and leads the continuous development work on the tool. Completely voluntarily. In total, nearly 1400 individuals have contributed. It is solid engineering work and an expression of dedicated governance that has benefited many companies and the entire society. For this, Daniel Stenberg is awarded the Polhem Prize 2017.

Polhemspriset 2017

I’m awarded the Swedish Polhem Prize 2017. (Link to a Swedish-speaking site.)

The Polhem Prize (Polhemspriset in Swedish), is awarded “for a high-level technological innovation or an ingenious solution to a technical problem.” The Swedish innovation must be available and shown competitive on the open market.

This award has been handed out in the name of the scientist and inventor Christopher Polhem, sometimes called the father of Swedish engineering, since 1878. It is Sweden’s oldest and most prestigious award for technological innovation.

I first got the news on the afternoon of September 24th and I don’t think I exaggerate much if I say that I got a mild shock. Me? A prize? How did they even find me or figure out what I’ve done?

I get this award for having worked on curl for a very long time, and by doing this having provided an Internet infrastructure of significant value to the world. I’ve never sold it nor earned much commercial income from this hobby of mine, but my code now helps to power an almost unimaginable amount of devices, machines and other connected things in the world.

I’m not used to getting noticed or getting awards. I’m used to sitting by myself working on bugs, merging patches and responding to user emails. I don’t expect outsiders to notice what I do much and I always have a hard time explaining to friends and “mortals” what it is I actually do.

I accept this prize, not as a single inventor or brilliant mind of anything, but like the captain of a boat with a large and varying crew without whom I would never have reached this far. I’m excited that the nominee board found me and our merry project and that they were open-minded enough to see and realize the value and position of an open source project that is used literally everywhere. I feel deeply honored.

I’m fascinated the award nominee group found me and I think it is super cool that an open source project gets this attention and acknowledgement.

Apart from the honor, the prize comes in the form of a monetary part (250K SEK, about 31,000 USD) and a gold medal with Polhem’s image on it. See this blog post’s featured image. The official award ceremony will take place in a few days at the Technical Museum in Stockholm. I’m then supposed to get the medal handed to me by His Majesty Carl XVI Gustaf, the king of Sweden. An honor very few people get to experience. Especially very few open source hackers.

Thank you

While I have so many people to thank for having contributed to my (and curl’s) success, there are some that have been fundamental.

I’d like to specifically highlight my wife Anja and my kids Agnes and Rex, who are the ones I routinely steal time away from to instead spend on curl. They’re the ones I drift away from when I respond to issues on my phone or run off to the computer to “just respond to something quickly”. They’re the best.

I’d like to thank Björn, my brother, who chipped in half the amount of money for that first Commodore 64 we purchased back in 1985 and which was the first stepping stone to me being here.

I’d like to thank all my friends and teammates in the curl project without whom curl would’ve died in its infancy back in the 1990s. It is with honest communication, hard work and good will that good software is crafted. (Well, there might be some more components necessary too, but let’s keep it simple here.)

I’d like to thank everyone who ever said thanks to me for curl and told me that what I did or brought to the world actually made a difference or served a purpose. Positive feedback is what drives me. It is the fuel that keeps me going.

How will this award affect me and the curl project going forward?

I hope the award will strengthen my spine even more in knowing that we’re going down the right path here. Not necessarily with every single decision or choice we make, but the general direction: we do things open source, we do things together and we work long-term.

I hope the award puts a little more light and attention on the world of open source and how this development model can produce the most stellar and robust software components you can think of – without a “best before” stamp.

I would like the award to make one or two more people find and take a closer look at the curl project. To dive in and contribute, in one way or another. We always need more eyes and hands!

Further, I realize that this award might bring some additional eyes on me who will watch how I act and behave. I intend to keep trying to do the right thing and act properly in every situation and I know my friends and community will help me stand straight – no matter how the winds blow.

What will I do with the money?

I intend to take my family with me on an extended vacation trip to New Zealand!

Hopefully there will be some money left afterward, that I hope to at least in part spend on curl-related activities such as birthday cakes for the pending curl 20th birthday celebrations in spring 2018…

But really, how many use curl?

Virtually every smart phone has one or more curl installs. Most modern cars and television sets do as well. Probably just about all Linux servers on the Internet run it. Almost all PHP sites on the Internet do. Portable devices and internet-connected machines use it extensively. curl sends crash reports when your Chrome or Firefox browser fails. It is the underlying data transfer engine for countless systems, languages, programs, games and environments.

Every single human in the connected world uses something that runs curl every day. Probably more than once per day. Most have it installed in devices they carry around with them.

It is installed and runs in tens of billions of instances, as most people living modern, connected lives have numerous installations in their phones, in their web browsers, in their tablets, their cars, their TVs, their kitchen appliances etc.

Most humans, of course, don’t know this. They use devices and apps that just work and are fine with that. curl is just a little piece in the engines of those systems.

Testing curl

In order to ship a quality product – once every eight weeks – we need lots of testing. This is what we do to test curl and libcurl.

checksrc

We have a basic script that verifies that the source code adheres to our code standard. It doesn’t catch all possible mistakes, but usually it complains with enough details to help contributors write their code to match the style we already use. Consistent code style makes the code easier to read. Easier reading means fewer bugs and quicker debugging.

Doing this check with a script (that can be run automatically when building curl) makes it easier for everyone to ship properly formatted code.

We have not (yet) managed to convince clang-format or other tools to reformat code to correctly match our style, and we don’t feel like changing it just for the sake of such a tool. I consider this a decent work-around.

make test

The test suite that we bundle with the source code in the git repository has a large number of tests that test…

  • curl – it runs the command line tool against test servers for a large range of protocols and verifies the error code, the output, the protocol details and that there are no memory leaks
  • libcurl – we build many small test programs that use the libcurl API, run them against test servers and verify that they behave correctly and don’t leak memory etc (see the sketch after this list)
  • unit tests – we build small test programs that use libcurl internal functions that aren’t exposed in the API and verify that they behave correctly and generate the expected output
  • valgrind – all the tests above can be run with and without valgrind to better detect memory issues
  • “torture” – a special mode that first runs a test in full, counts the number of memory-related functions (malloc, strdup, fopen, etc) that were called, and then runs the test again that many times, each time making one of those memory-related functions fail – and makes sure that no memory is leaked and no crash occurs in any of those situations. It runs the test over and over until every memory-related function has been made to fail once.
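
To give a feel for what such a libcurl test program conceptually looks like, here is a minimal sketch. It is not taken from the test suite: the real tests talk to the bundled test servers on generated port numbers and are verified by the test harness, so the URL below is just a placeholder and the program simply prints the outcome.

/* a minimal sketch of a libcurl test program; link with -lcurl */
#include <stdio.h>
#include <curl/curl.h>

int main(void)
{
  CURL *curl;
  CURLcode res = CURLE_FAILED_INIT;

  curl_global_init(CURL_GLOBAL_DEFAULT);
  curl = curl_easy_init();
  if(curl) {
    /* hypothetical test server address; the real suite provides its own */
    curl_easy_setopt(curl, CURLOPT_URL, "http://127.0.0.1:8990/1");
    res = curl_easy_perform(curl);

    /* the harness verifies the return code, the received data and, when
       run under valgrind or in torture mode, that nothing leaks */
    printf("curl_easy_perform() returned %d\n", (int)res);

    curl_easy_cleanup(curl);
  }
  curl_global_cleanup();
  return (int)res;
}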

Right now, a single “make test” runs over 1100 test cases, varying a little depending on exactly which features are enabled in the build. Without valgrind, running those tests takes about 8 minutes on a reasonably fast machine, but still over 25 minutes with valgrind.

Then we of course want to run all tests with different build options…

CI

For every pull request and for every source code commit done, the curl source is built for Linux, macOS and Windows, with a large set of different build options and TLS libraries selected, and all the tests mentioned above are run for most of these build combinations. Running ‘checksrc’ on the pull requests is of course awesome, so that humans don’t have to remark on code style mistakes much. There are around 30 different builds done and verified for each commit.

If any CI build fails, the pull request on GitHub gets a red X to signal that something was not OK.

We also run test case coverage analyses in the CI so that we can quickly detect if we for some reason significantly decrease test coverage or similar.

We use Travis CI, AppVeyor and Coveralls.io for this.

Autobuilds

Independently of the CI builds, volunteers run machines that regularly update from git, build and run the entire test suite and then finally email the results back to a central server. These setups help us cover even more platforms, architectures and build combinations, just with a little longer turnaround time.

With millions of build combinations and support for virtually every operating system and CPU architecture under the sun, we have to accept that not everything can be fully tested. But since almost all code is shared for many platforms, we can still be reasonably sure about the code even for targets we don’t test regularly.

Static code analysis

We run the clang scan-build on the source code daily and we run Coverity scans on the code “regularly”, about once a week.

We always address defects detected by these analyzers immediately when notified.

Fuzzing

We’re happy to be part of Google’s OSS-Fuzz effort, which, with a little integration help from us, keeps hammering our code with fuzzed input to make sure we’re solid.

OSS-Fuzz has so far resulted in two security advisories for curl and a range of other bug fixes. It hasn’t been going on for very long, and based on the number of issues it has detected so far, I expect it to keep finding flaws – at least for a while more into the future.

Fuzzing is really the best way to hammer out bugs. When we’re down to zero detected static analyzer defects and thousands of test cases that all pass, the fuzzers can still continue to find holes in the net.
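
As an illustration of the technique (this is not the actual OSS-Fuzz harness for curl, which feeds fuzz data through simulated protocol responses), a libFuzzer-style fuzz target can be as small as this sketch: it hands attacker-controlled bytes to a libcurl parsing function and lets the fuzzer plus the sanitizers hunt for crashes and overreads.

/* minimal libFuzzer-style sketch; build with
   clang -fsanitize=fuzzer,address and link against libcurl */
#include <stdint.h>
#include <stdlib.h>
#include <string.h>
#include <curl/curl.h>

int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
{
  CURL *easy = curl_easy_init();
  if(easy) {
    /* make a NUL-terminated copy of the fuzz input */
    char *input = malloc(size + 1);
    if(input) {
      memcpy(input, data, size);
      input[size] = '\0';

      /* URL-decode the input; the fuzzer mutates 'data' to explore
         as many code paths as possible */
      int outlen = 0;
      char *output = curl_easy_unescape(easy, input, (int)size, &outlen);
      curl_free(output);
      free(input);
    }
    curl_easy_cleanup(easy);
  }
  return 0;
}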

External

Independently of what we test, there is a large amount of external testing going on for each curl release we do.

In a presentation by Google at curl up 2017, they mentioned their use of curl in “hundreds of applications” and how each curl release they adopt gets tested more than 400,000 times. We also know that a lot of other users have curl as a core component in their systems and test their installations extensively.

We have a large set of security-interested developers who run tests and fuzzers on curl on their own initiative.

(image from pixabay)

The life of a curl security bug

The report

Usually, security problems in the curl project come to us out of the blue. Someone has found a bug they suspect may have a security impact and they tell us about it on the curl-security@haxx.se email address. Mails sent to this address reach a private mailing list with the curl security team members as the only subscribers.

An important first step is that we respond to the sender, acknowledging the report. Often we also include a few follow-up questions at once. It is important to us to keep the original reporter in the loop and included in all subsequent discussions about this issue – unless they prefer to opt out.

If we find the issue ourselves, we act pretty much the same way.

In the most obvious and well-reported cases there is no room for doubt or hesitation about what the bug is and what its impact is, but very often the reports lead to discussions.

The assessment

Is it a bug in the first place? Is it perhaps even documented behavior, or just plain bad use?

If it is a bug, is this a security problem that can be abused or somehow put users in some sort of risk?

Most issues we get reported as security issues are also in the end treated as such, as we tend to err on the safe side.

The time plan

Unless the issue is critical, we prefer to schedule a fix and announcement of the issue in association with the pending next release, and as we do releases every 8 weeks like clockwork, that’s never very far away.

We communicate the suggested schedule with the reporter to make sure we agree. If a sooner release is preferred, we work out a schedule for an extra release. In the past we’ve done occasional faster security releases when the issue had already been made public, in order to shorten the time window during which users could be harmed by the problem.

We really, really do not want a problem to persist beyond the next release.

The fix

The curl security team and the reporter work on fixing the issue, ideally in part by the reporter verifying that they can’t reproduce it anymore, and we add a test case or two.

We keep the fix undisclosed for the time being. It is not committed to the public git repository but kept in a private branch. We usually put it on a private URL so that we can link to it when we ask for a CVE, see below.

All security issues should make us ask ourselves – what did we do wrong that made us not discover this sooner? And ideally we should introduce processes, tests and checks to make sure we detect other similar mistakes now and in the future.

Typically we only generate a single patch against git master and offer that as the final solution. In the curl project we don’t maintain multiple branches. Distros and vendors who ship older or even multiple curl versions backport the patch to their systems by themselves. Sometimes we get backported patches back to offer users as well, but those are exceptions to the rule.

The advisory

In parallel to working on the fix, we write up a “security advisory” about the problem. It is a detailed description of the problem, what impact it may have if triggered or abused and whether we know of any exploits of it.

What conditions need to be met for the bug to trigger, what version range is affected, what remedies can be applied as a work-around if the patch is not applied, etc.

We work out the advisory in cooperation with the reporter so that we get the description and the credits right.

The advisory also always contains a timeline that clearly describes when we got to know about the problem etc.

The CVE

Once we have an advisory and a patch, none of which needs to be their final versions, we can proceed and ask for a CVE. A CVE is a unique “ID” that is issued for security problems to make them easy to reference. CVE stands for Common Vulnerabilities and Exposures.

Depending on where in the release cycle we are, we might have to hold off at this point. For all bugs that aren’t proprietary-operating-system specific, we pre-notify and ask for a CVE on the distros@openwall mailing list. This mailing list prohibits an embargo longer than 14 days, so we cannot ask them for a CVE more than two weeks before our release.

The idea here is that the embargo time gives the distributions time and opportunity to prepare updates of their packages, so they can be pretty much in sync with our release and reduce the time window during which their users are at risk. Of course, not all operating system vendors manage to actually ship a curl update on two weeks’ notice, and at least one major commercial vendor regularly informs me that this is too short a time frame for them.

For flaws that don’t affect the free operating systems at all, we ask MITRE directly for CVEs.

The last 48 hours

When there is roughly 48 hours left until the coming release and security announcement, we merge the private security fix branch into master and push it. That immediately makes the fix public and those who are alert can then take advantage of this knowledge – potentially for malicious purposes. The security advisory itself is however not made public until release day.

We use these 48 hours to get the fix tested on more systems to verify that it doesn’t cause any major breakage. The weakest part of our security procedure is that the fix has been worked out in secret, so it has not had the chance to get widely built and tested; that is performed now.

The release

We upload the new release. We send out the release announcement email, update the web site and make the advisory for the issue public. We send out the security advisory alert on the proper email lists.

Bug Bounty?

Unfortunately we don’t have any bug bounties on our own in the curl project. We simply have no money for that. We actually don’t have money at all for anything.

HackerOne offers bounties for curl-related issues. If you have reported a critical issue you can request one from them after it has been fixed in curl.

 

Say hi to curl 7.56.0

Another curl version has been released into the world. curl 7.56.0 is available for download from the usual place. Here are some news I think are worthy to mention this time…

An FTP security issue

A mistake in the code that parses responses to the PWD command could make curl read beyond the end of a buffer. Max Dymond figured it out, and we’ve released a security advisory about it. This is our 69th security vulnerability counted from the beginning and the 8th reported in 2017.

Multiple SSL backends

Since basically forever you’ve been able to build curl with a selected SSL backend to make it get a different feature set or behave slightly differently – or use a different license or get a different footprint. curl supports eleven different TLS libraries!

Starting now, libcurl can be built to support more than one SSL backend! You specify all the SSL backends at build-time and then you can tell libcurl at run-time exactly which of the backends it should use.

The selection can only happen once per invocation so there’s no switching back and forth among them, but still. It also of course requires that you actually build curl with more than one TLS library, which you do by telling configure all the libs to use.

The first user of this feature that I’m aware of is Git for Windows, which can select between using the Schannel and OpenSSL backends.

curl_global_sslset() is the new libcurl call to do this with.
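
A minimal sketch of how an application might use it (assuming a libcurl built with, among others, the OpenSSL backend): call curl_global_sslset() before curl_global_init() and before any transfer is made, and inspect the returned list if the request cannot be honored.

#include <stdio.h>
#include <curl/curl.h>

int main(void)
{
  const curl_ssl_backend **avail;
  int i;

  /* request the OpenSSL backend by id; the list of available backends
     is returned if the request cannot be honored */
  CURLsslset rc = curl_global_sslset(CURLSSLBACKEND_OPENSSL, NULL, &avail);

  if(rc == CURLSSLSET_UNKNOWN_BACKEND) {
    printf("OpenSSL backend not available in this build, pick one of:\n");
    for(i = 0; avail[i]; i++)
      printf("  %s\n", avail[i]->name);
    return 1;
  }
  else if(rc != CURLSSLSET_OK) {
    printf("could not select TLS backend (%d)\n", (int)rc);
    return 1;
  }

  curl_global_init(CURL_GLOBAL_DEFAULT);
  /* ... perform transfers using the selected backend ... */
  curl_global_cleanup();
  return 0;
}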

This feature was brought by Johannes Schindelin.

New MIME API

The currently provided API for creating multipart formposts, curl_formadd, has always been considered a bit quirky and complicated to work with. Its extensive use of varargs is to blame for a significant part of that.

Now, we finally introduce a replacement API that accomplishes basically the same things, plus a few additional ones, and that is supposed to be easier to use and easier to wrap for bindings etc.

Introducing the mime API: curl_mime_init, curl_mime_addpart, curl_mime_name and more. See the postit2.c and multi-post.c examples for some easy-to-grasp code.
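
Conceptually, posting one plain field and one file with the new API looks something like this sketch (the URL, field names and file name are made-up placeholders, not taken from the bundled examples):

#include <curl/curl.h>

int main(void)
{
  curl_global_init(CURL_GLOBAL_DEFAULT);
  CURL *curl = curl_easy_init();
  if(curl) {
    curl_mime *mime = curl_mime_init(curl);

    /* an ordinary "name=daniel" form field */
    curl_mimepart *part = curl_mime_addpart(mime);
    curl_mime_name(part, "name");
    curl_mime_data(part, "daniel", CURL_ZERO_TERMINATED);

    /* a file upload part, read from disk */
    part = curl_mime_addpart(mime);
    curl_mime_name(part, "file");
    curl_mime_filedata(part, "picture.jpg");

    /* attach the mime post to the transfer and run it */
    curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/upload");
    curl_easy_setopt(curl, CURLOPT_MIMEPOST, mime);
    curl_easy_perform(curl);

    curl_mime_free(mime);
    curl_easy_cleanup(curl);
  }
  curl_global_cleanup();
  return 0;
}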

This work was done by Patrick Monnerat.

SSH compression

The SSH protocol allows clients and servers to negotiate the use of compression when communicating, and now curl can too. curl has the new --compressed-ssh option and libcurl has a new setopt called CURLOPT_SSH_COMPRESSION, using the familiar style.
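
In libcurl terms it is a single boolean option on the easy handle; a brief sketch (the SFTP URL is a placeholder):

#include <curl/curl.h>

int main(void)
{
  CURL *curl = curl_easy_init();
  if(curl) {
    curl_easy_setopt(curl, CURLOPT_URL, "sftp://example.com/file.txt");
    /* ask for SSH compression; this is a request, not an order –
       the server may still refuse to compress */
    curl_easy_setopt(curl, CURLOPT_SSH_COMPRESSION, 1L);
    curl_easy_perform(curl);
    curl_easy_cleanup(curl);
  }
  return 0;
}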

Feature worked on by Viktor Szakats.

SSLKEYLOGFILE

Peter Wu and Jay Satiro have worked on this feature that allows curl to store SSL session secrets in a file if this environment variable is set. This is normally the way you tell Chrome and Firefox to do this, and it is extremely helpful when you want to run Wireshark and analyze a TLS stream.

This is still disabled by default due to its early days. Enable it by defining ENABLE_SSLKEYLOGFILE when building libcurl and set the environment variable SSLKEYLOGFILE to a pathname that will receive the keys.

Numbers

This, the 169th curl release, contains 89 bug fixes done during the 51 days since the previous release.

47 contributors helped make this release, out of whom 18 are new.

254 commits were done since the previous release, by 26 authors.

The top-5 commit authors this release are:

  1. Daniel Stenberg (116)
  2. Johannes Schindelin (37)
  3. Patrick Monnerat (28)
  4. Jay Satiro (12)
  5. Dan Fandrich (10)

Thanks a lot everyone!

(picture from pixabay)

The backdoor threat

— “Have you ever detected anyone trying to add a backdoor to curl?”

— “Have you ever been pressured by an organization or a person to add suspicious code to curl that you wouldn’t otherwise accept?”

— “If a crime syndicate would kidnap your family to force you to comply, what backdoor would you be able to insert into curl that is the least likely to get detected?” (The less grim version of this question would instead offer huge amounts of money.)

I’ve been asked these questions and variations of them when I’ve stood up in front of audiences around the world and talked about curl and how it is one of the most widely used software components in the world, counting way over three billion instances.

Back door (noun)
— a feature or defect of a computer system that allows surreptitious unauthorized access to data.

So how is it?

No. I’ve never seen a deliberate attempt to add a flaw, a vulnerability or a backdoor into curl. I’ve seen bad patches and I’ve seen patches that brought bugs that years later were reported as security problems, but I did not spot any deliberate attempt to do bad in any of them. But if it were done with skill, I certainly wouldn’t have noticed it being deliberate, would I?

If I had cooperated in adding a backdoor or been threatened to, then I wouldn’t tell you anyway and I’d thus say no to questions about it.

How to be sure

There is only one way to be sure: review the code you download and intend to use. Or get it from a trusted source that did the review for you.

If you have a version you trust, you really only have to review the changes done since then.

Possibly there’s some degree of safety in numbers: as thousands of applications and systems use curl and libcurl, and at least some of them do reviews and extensive testing, one of them could discover mischievous activities if there are any and report them publicly.

Infected machines or owned users

The servers that host the curl releases could be targeted by attackers and the tarballs for download could be replaced by something that carries evil code. There’s no such thing as a fail-safe machine, especially not if someone really wants to and tries to target us. The safeguard there is the GPG signature with which I sign all official releases. No malicious user can (re-)produce them. They have to be made by me (since I package the curl releases). That comes back to trusting me again. There’s of course no safeguard against me being forced to sign evil code with a knife to my throat…

If one of the curl project members with git push rights were to get her account hacked and her SSH key password brute-forced, a very skilled hacker could possibly sneak in something, short-term. Although my hope is that since we review and comment on each other’s code to a very high degree, that would be really hard. And the hacked person herself would most likely react.

Downloading from somewhere

I think the highest risk scenario is when users download pre-built curl or libcurl binaries from various places on the internet that aren’t the official curl web site. How can you know for sure what you’re getting then, when you can’t review the code or the changes that were made? You just put your trust in a remote person or organization to do what’s right for you.

Trusting other organizations can be totally fine, such as when you download using a Linux distro’s package management system: you can expect a certain level of checks and vouching to have happened, and there will be digital signatures and more involved to minimize the risk of external malicious interference.

Pledging there’s no backdoor

Some people argue that projects could or should pledge for every release that there’s no deliberate backdoor planted so that if the day comes in the future when a three-letter secret organization forces us to insert a backdoor, the lack of such a pledge for the subsequent release would function as an alarm signal to people that something is wrong.

That takes us back to trusting a single person again. A truly evil adversary can of course force such a pledge to be uttered no matter what, even if that then probably is more mafia-level evilness and not mere three-letter-organization shadiness anymore.

I would be a bit stressed out to have to make that pledge every single release, as if I ever forgot or messed it up, it would lead to a lot of people getting up in arms, and how would such a mistake be fixed? It’s a little too irrevocable for me. And we do quite frequent releases, so the risk of mistakes is not insignificant.

Also, if I would pledge that, is that then a promise regarding all my code only, or is that meant to be a pledge for the entire code base as done by all committers? It doesn’t scale very well…

Additionally, I’m a Swede living in Sweden. The American organizations cannot legally force me to backdoor anything, and the Swedish versions of those secret organizations don’t have the legal rights to do so either (caveat: I’m not a lawyer). So, the real threat is not by legal means.

What backdoor would be likely?

It would be very hard to add code, unnoticed, that sends off data to somewhere else. Too much code that would be too obvious.

A backdoor similarly couldn’t really be made to split off data from the transfer pipe and store it locally for other systems to read, as that too would probably be too much code that is too different from the current code and would be detected instantly.

No, I’m convinced the most likely backdoor code in curl is a deliberate but hard-to-detect security vulnerability that lets the attacker exploit the program using libcurl/curl via some sort of specific usage pattern. So when triggered it can trick the program into sending off memory contents or perhaps overwriting the local stack or the heap. Quite possibly only one step out of several necessary for a successful attack, much like how a single-byte overwrite can lead to root access.

Any past security problems on purpose?

We’ve had almost 70 security vulnerabilities reported through the project’s almost twenty years of existence. Since most of them were triggered by mistakes in code I wrote myself, I can be certain that none of those problems were introduced on purpose. I can’t completely rule out that someone else’s patch that modified curl along the way, and by extension maybe made a vulnerability worse or easier to trigger, could have been made on purpose. But none of the security problems that were introduced by others have shown any sign of “deliberateness”. (Or they were written cleverly enough that I couldn’t see it!)

Maybe backdoors have been planted that we just haven’t discovered yet?

Discussion

Follow-up discussion/comments on hacker news.

curl author activity illustrated

The idea: at the time of each commit, check how many unique authors had a change committed within the previous 120, 90, 60, 30 and 7 days. Run the script on the curl git repository and then plot a graph of the data, ranging from 2010 until today. This covers just under 10,000 commits.


git-authors-active.pl is the little stand-alone script I wrote and used for this – it should work fine for any git repository. I then made the graph from that data using LibreOffice.

curl, open source and networking