Tag Archives: patches

More HTTP framing attempts

Previously, in my exciting series "improving the HTTP framing checks in Firefox", we learned that I landed a patch, got it backed out, struggled to improve the checks and finally landed the fixed version, only to eventually get that one backed out as well.

And now I've landed my third version. The amendment I did this time:

For HTTP content that is content-encoded and compressed, I learned that with deflate compression there is basically no good way for us to know if the content got prematurely cut off. Deflate streams lack the footer too often for it to make any sense to check for it. gzip streams, however, end with a footer, so it is easier to reliably detect when they are incomplete. (As was discovered before, the Content-Length: header is far too often not updated by the server, so it instead wrongly shows the uncompressed size.)

This (deflate vs gzip) knowledge is now used by the patch, meaning that deflate-compressed downloads can still be cut off without the browser noticing...
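To make the difference concrete, here is a minimal sketch (my own illustration, not the actual Firefox patch; the function name and buffer size are made up) of how a received gzip body can be checked for completeness with zlib. A complete gzip stream ends in a CRC32 + size trailer, so inflate() reports Z_STREAM_END once it has seen everything; if the input runs out first, the transfer was cut short. A raw deflate body offers no such reliable end marker.

/* Sketch: return 1 if 'body' holds a complete gzip stream, 0 if it was
   truncated (or invalid), -1 on init failure. */
#include <string.h>
#include <zlib.h>

static int gzip_body_complete(const unsigned char *body, size_t len)
{
  unsigned char scratch[16384];
  z_stream z;
  int rc;

  memset(&z, 0, sizeof(z));
  /* 16 + MAX_WBITS asks zlib to expect a gzip header and trailer */
  if(inflateInit2(&z, 16 + MAX_WBITS) != Z_OK)
    return -1;

  z.next_in = (unsigned char *)body;
  z.avail_in = (uInt)len;
  do {
    z.next_out = scratch;          /* decompressed data is thrown away */
    z.avail_out = sizeof(scratch);
    rc = inflate(&z, Z_NO_FLUSH);
  } while(rc == Z_OK && z.avail_in);

  inflateEnd(&z);
  /* Z_STREAM_END means the gzip trailer was seen: the body is complete */
  return (rc == Z_STREAM_END) ? 1 : 0;
}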

Will this version of the fix actually stick? I don't know. There's lots of bad voodoo out there in the HTTP world and I'm putting my finger right in the middle of some of it with this change. I'm pretty sure I've not written my last blog post on this topic just yet... If it sticks this time, it should show up in Firefox 39.

[Image: bolt cutter]

Apple’s modified CA cert handling and curl

I tweeted about finding a change in Apple's version of curl that I hadn't seen any public patch for. Apple otherwise hosts a whole slew of curl patches which they never discuss with us, but they do make them public so we can see what they did.

I was trying to help out a fellow curl user on IRC (we're in #curl on freenode, come see us) who was trying to understand some funny effects of running curl against an HTTPS site, and he showed me the output from a "curl -v" log. The verbose log was curiously different from mine (same curl version, built by myself on Linux). My conclusion was that something was different in the Apple version.

The user's log said:

* About to connect() to host.example.com port 443 (#0)
*   Trying 1.2.3.4... connected
* Connected to host.example.com (1.2.3.4) port 443 (#0)
* SSLv3, TLS handshake, Client hello (1):

... while my command against the same site said:

* About to connect() to host.example.com port 443 (#0)
*   Trying 1.2.3.4... connected
* Connected to host.example.com (1.2.3.4) port 443 (#0)
* successfully set certificate verify locations:
*   CAfile: /etc/ssl/certs/ca-certificates.crt
  CApath: none
* SSLv3, TLS handshake, Client hello (1):

(The "certificate verify locations" lines are the part my output showed that wasn't in the Mac version; the real host name and IP have been changed.)

It seems I was wrong, however.

The output above is only shown if libcurl sets the CA cert locations in OpenSSL, and it seems the Mac version doesn't. Somehow they get the CA certs loaded into libcurl differently.

So OK, maybe they didn't modify curl, but they certainly changed how curl uses CA certs, and they did it by modifying OpenSSL: their version of OpenSSL clearly now defaults to using their CA cert bundle. The end result for me is still the same, though: I have no idea how CA certs work with curl on a Mac, which leaves me in the unfortunate situation where I can't help fellow curl users when they have CA cert problems on a Mac.

It also leaves me very curious about what --cacert does exactly in the Mac version of curl.
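For reference, in a stock curl the --cacert option is simply the command line front end for libcurl's CURLOPT_CAINFO option, which hands the verify locations over to the TLS library. A minimal sketch of the libcurl equivalent (the URL and bundle path are only examples) would be:

#include <curl/curl.h>

int main(void)
{
  CURL *curl = curl_easy_init();
  if(curl) {
    curl_easy_setopt(curl, CURLOPT_URL, "https://www.example.com/");
    /* roughly what --cacert /etc/ssl/certs/ca-certificates.crt does */
    curl_easy_setopt(curl, CURLOPT_CAINFO,
                     "/etc/ssl/certs/ca-certificates.crt");
    curl_easy_perform(curl);
    curl_easy_cleanup(curl);
  }
  return 0;
}

What that option ends up meaning when OpenSSL itself has been taught to fall back on the system trust store is exactly the open question.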

OpenSSL is patched. Apparently it now works so that if the "normal" X.509 validation fails, and the TrustEvaluationAgent (TEA) is enabled, it will attempt to use the TEA to validate the certificate. The Apple source code to read through for this is x509_vfy_apple.c in their patched OpenSSL tree. It is also possible to skip the TEA verification in OpenSSL by setting an environment variable, so that we can still have curl on a Mac act "as default" with a command line like:

$ env OPENSSL_X509_TEA_DISABLE=1 curl https://www.example.com/

Finally: yes, curl is released under an MIT license. They're perfectly allowed to do whichever of these actions they want. I know this, and I chose the MIT license fully aware that any company can take the code, modify it and never return any changes. I'm not arguing against anyone's rights to do this with curl.

Thank you, friendly anonymous helper for helping me straighten out my findings!

Distros Going Their Own Way

Lemme take the opportunity to express my serious dislike of a particular habit in the open source world, frequently seen from various distros (and by distro I mean it in the wider sense, not limited to Linux distros):

They fix problems by patching code in projects they ship/offer, but they don't discuss the problem upstream and they don't submit their patch upstream. In fact, in one particular case in a project near to me (make a guess!), I've even tried to contact the patch author(s) over the years, but they've never responded, so even though I know of their patch, I can't get anyone to explain to me why they think they need it...

So hello hey you packagers working on distros! When you get a bug report that clearly is a problem in the particular tool/project, and isn't really a problem with your particular distro's way of doing things, please please please forward it upstream, or at least involve the actual project team behind the tool in the discussions around the bug and possible solutions. And if you don't do that, the very least you can do is make sure the patches you write and apply get forwarded upstream to the project team.

How else are we gonna be able to improve the project if you absorb the bug reports and you keep fixes hidden? That's not a very open source'ish attitude, methinks.

Recent example that triggered this post.