I sat down and talked curl, HTTP, HTTP/2, IETF, the web, Firefox and various internet subjects with Mattias Geniar on his podcast the syscast the other day.
On April 12 I had the pleasure of doing another talk in the Google Tech Talk series arranged in the Google Stockholm offices. I had given it the title "HTTP/2 is upon us, and here's what you need to know about it." in the invitation.
The room seated 70 people but we had the amazing number of over 300 people on the waiting list who unfortunately didn't manage to get a seat. To those, and to anyone else who cares, here's the video recording of the event.
If you've seen me talk about HTTP/2 before, you might notice that I've refreshed the material since then.
Here's an encouraging graph from our regular Coverity scans of the curl source code, showing that we've maintained a fairly low "defect density" over the last two years, staying way below the average density level.
Click the image to view it slightly larger.
Defect density is simply the number of found problems per 1,000 lines of code. As a little (and probably unfair) comparison, right now when curl is flat on 0, Firefox is at 0.47, c-ares at 0.12 and libssh2 at 0.21.
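To make the metric concrete, here's a small sketch of the arithmetic (the line counts are made-up illustration numbers, not actual project sizes):

```python
def defect_density(defects: int, lines_of_code: int) -> float:
    """Return the number of reported defects per 1,000 lines of code."""
    return defects / (lines_of_code / 1000)

# Hypothetical example: 12 open defects in a 100,000-line code base.
print(defect_density(12, 100_000))  # 0.12 defects per 1,000 lines
```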
Coverity is still the primary static code analyzer for C code that I'm aware of. None of the flaws Coverity picked up in curl during the last two years were detected by clang-analyzer for example.
When I asked the people around me in March 2015 to guess the expected HTTP/2 adoption by now, we as a group ended up with about 10%. OK, the question was vaguely phrased and what does it really mean? Let's take a look at some aspects of where we are now.
Perhaps the biggest flaw in the question was that it didn't specify HTTPS. All the browsers of today only implement HTTP/2 over HTTPS, so even if every HTTPS site in the world supported HTTP/2, that would still be far from covering all HTTP requests. Admittedly, browsers aren't the only HTTP clients...
During the fall of 2015, both nginx and Apache shipped release versions with HTTP/2 support. nginx made it slightly harder for people by forcing users to select either SPDY or HTTP/2 (which was a technical choice done by them, not really enforced by the protocols) and also still telling users that SPDY is the safer choice.
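For reference, that nginx choice looked something like this in the config (an illustrative snippet; the exact directives depend on your nginx version and build):

```nginx
server {
    # nginx made you pick one: "http2" or "spdy" on the listen
    # directive, not both at the same time.
    listen 443 ssl http2;   # or: listen 443 ssl spdy;
    server_name example.com;

    ssl_certificate     /path/to/cert.pem;
    ssl_certificate_key /path/to/key.pem;
}
```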
Let's Encrypt finally launching its public beta in early December also helps HTTP/2 by removing one of the most annoying HTTPS obstacles: the cost and manual administration of server certs.
Amount of Firefox responses
This is the easiest metric since Mozilla offers public access to the data. It is skewed since it is opt-in data and we know that certain kinds of users are less likely to enable this (if you're more privacy aware or if you're in an enterprise environment, for example). It also measures the share by volume of requests, so popular sites get more weight.
Firefox 43 counts no less than 22% of all HTTP responses as HTTP/2 (based on data from Dec 8 to Dec 16, 2015).
Out of all HTTP traffic Firefox 43 generates, about 63% is HTTPS, which means that almost 35% of all Firefox HTTPS requests are HTTP/2!
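The arithmetic behind that last number is straightforward: divide the HTTP/2 share of all responses by the HTTPS share of all responses.

```python
h2_share_of_all = 0.22     # HTTP/2 responses as a share of all HTTP responses
https_share_of_all = 0.63  # HTTPS responses as a share of all HTTP responses

# HTTP/2 only runs over HTTPS in browsers, so divide by the HTTPS share
h2_share_of_https = h2_share_of_all / https_share_of_all
print(f"{h2_share_of_https:.1%}")  # 34.9%, i.e. almost 35%
```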
Firefox 43 is also negotiating HTTP/2 four times as often as it ends up with SPDY.
Amount of browser traffic
One estimate of how large a share of browsers support HTTP/2 is the caniuse.com number: roughly 70% on a global level. Another metric is the one published by KeyCDN at the end of October 2015. When they enabled HTTP/2 by default for their HTTPS customers world wide, the share of users negotiating HTTP/2 turned out to be 51%. More than half!
Cloudflare however claims the share of browsers with support is a mere 26%. That's a really big difference and I personally don't buy their numbers, as they're way too negative and give some popular browsers a very small market share. For example: Chrome 41 - 49 at a mere 15% of the world market, really?
I think the key is rather that it all boils down to what you measure - as always.
Amount of the top-sites in the world
Netcraft bundles SPDY with HTTP/2 in their October report, but it says that "29% of SSL sites within the thousand most popular sites currently support SPDY or HTTP/2, while 8% of those within the top million sites do." (note the "of SSL sites" in there)
That's now slightly old data that came out almost exactly when Apache first released its HTTP/2 support in a public release, and nginx hadn't even had it for a full month yet.
Facebook eventually enabled HTTP/2 in November 2015.
Amount of "regular" sites
There's still no ideal service that scans a larger portion of the Internet to measure adoption level. The httparchive.org site is about to change to a Chrome-based spider (from IE) and once that goes live I hope that we will get better data.
W3Techs' report says 2.5% of web sites in early December - less than SPDY!
I like how isthewebhttp2yet.com looks so far and I've provided them with my personal opinions and feedback on what I think they should do to make that the preferred site for this sort of data.
Using the shodan search engine, we could see that in mid December 2015 there were about 115,000 servers on the Internet using HTTP/2. That's 20,000 (~24%) more than the isthewebhttp2yet site says. It doesn't really show percentages there, but it could be interpreted to say that slightly over 6% of HTTP/1.1 sites also support HTTP/2.
On Dec 3rd 2015, Cloudflare enabled HTTP/2 for all its customers and they claimed they doubled the number of HTTP/2 servers on the net in that single move. (The shodan numbers seem to disagree with that statement.)
Amount of system lib support
iOS 9 supports HTTP/2 in its native HTTP library. That's so far the leader in the system-libraries department. Does Mac OS X have something similar?
I had expected Windows' WinInet or other HTTP libs to be up there as well, but I can't find any details online about it. I hear the Android HTTP libs are not up to snuff either, but since okhttp is now part of Android to some extent, I guess proper HTTP/2 in Android is not too far away?
Amount of HTTP API support
I hear very little about HTTP API providers accepting HTTP/2 in addition or even instead of HTTP/1.1. My perception is that this is basically not happening at all yet.
If you're using a modern Chrome browser today against a Google service you're already (mostly) using QUIC instead of HTTP/2, thus you aren't really adding to the HTTP/2 client side numbers but you're also not adding to the HTTP/1.1 numbers.
QUIC and other QUIC-like (UDP-based with the entire stack in user space) protocols are destined to grow and get used even more as we go forward. I'm convinced of this.
Everyone was right! It is mostly a matter of what you meant and how to measure it.
Recall the words on the Chromium blog: "We plan to remove support for SPDY in early 2016". For Firefox we haven't said anything that absolute, but I doubt that Firefox will support SPDY for very long after Chrome drops it.
Using curl to perform an operation a user just managed to do with his or her browser is one of the more common requests and areas people ask for help with.
How do you get a curl command line to fetch a resource just like the browser would get it, nice and easy? Both Chrome and Firefox have provided this feature for quite some time already!
Load the site with Firefox's network tools open ("Web Developer->Network"). When you see the HTTP traffic, right-click the specific request you want to repeat and select "Copy as cURL" in the menu that appears, like the screenshot below shows. That generates a curl command line to your clipboard, and you can paste it into your favorite shell window. This feature is available by default in all Firefox installations.
When you pop up More tools->Developer tools in Chrome and select the Network tab, you see the HTTP traffic used to get the resources of the site. On the line of the specific resource you're interested in, right-click with the mouse and select "Copy as cURL" and it'll generate a command line for you in your clipboard. Paste that in a shell to get a curl command line that makes the transfer. This feature is available by default in all Chrome and Chromium installations.
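The generated command line replays the request with the same headers the browser sent. A hypothetical example of what such a pasted command can look like (URL and headers are made up; the exact set varies by browser, site and session):

```shell
curl 'https://example.com/images/logo.png' \
  -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:43.0) Gecko/20100101 Firefox/43.0' \
  -H 'Accept: image/png,image/*;q=0.8,*/*;q=0.5' \
  -H 'Referer: https://example.com/' \
  --compressed
```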
On Firefox, without using the devtools
If this is something you'd like to do more often, you'll probably find popping up the developer tools just to get the command line copied a bit inconvenient and cumbersome. Then cliget is the perfect add-on for you, as it gives you a new option in the right-click menu so you can get a command line generated really quickly, like this example when I right-click an image in Firefox:
So I'd love to see brotli supported as a Content-Encoding in curl too, and then we basically just have to write some conditional code to detect the brotli library, add the adaptation code for it and we should be in a good position. But...
There is (was) no brotli library!
It turns out the brotli team just writes their code to be linked with their own tools, without making a library or making it easy to install and use for third party applications.
We can't have it like that! I rolled up my imaginary sleeves (imaginary since my swag t-shirt doesn't really have sleeves) and I now offer libbrotli to the world. It is just a bunch of files and a build system that sucks in the brotli upstream repo as a submodule and then builds a decoder library (brotlidec) and an encoder library (brotlienc) out of it. So there's no code of our own here. Just building on top of the great stuff done by others.
It's not complicated. It's nothing fancy. But you can configure, make and make install two libraries, and I can now go on and write a curl adaptation for this library so that we can get brotli support done. Ideally, this (making a library) is something the brotli project will do on their own at some point, but until they do, I don't mind handling this.
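Building and installing is the usual autotools dance; something along these lines (commands sketched from memory, so check the libbrotli README for the authoritative steps):

```shell
# --recursive pulls in the brotli upstream repo as a submodule
git clone --recursive https://github.com/bagder/libbrotli
cd libbrotli
./autogen.sh
./configure
make
sudo make install   # installs brotlidec and brotlienc
```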
Kodsnack is a Swedish-speaking weekly podcast with a small team of web/app developers discussing their experiences and thoughts on and around software development.
I was invited to participate a week ago or so, and I had a great time. Not surprisingly, the topics at hand moved a lot around curl, Firefox and HTTP/2. The recorded episode has now gone live, today.
You can find kodsnack episode 120 here, and again, it is all Swedish.
Back in March 2015, I asked friends for a forecast on how much HTTP traffic would be HTTP/2 by the end of the year, and we arrived at about 10% as a group. Are we getting there? Remember that RFC 7540 was published on May 15th, so it is still less than 4 months old!
The HTTP/2 implementations page now lists almost 40 reasonably up-to-date implementations.
Since then, all browsers used by the vast majority of people have stated that they have or will soon have HTTP/2 support (Firefox, Chrome, Edge, Safari and Opera - including Firefox and Chrome on Android and Safari on iPhone). Even OS support is coming: iOS 9 support is arriving as we speak and the Windows HTTP library is getting HTTP/2 support. The adoption rate so far is not limited by the clients.
Unfortunately, the Wget summer of code project to add HTTP/2 support failed.
(I have high hopes for getting a HTTP/2 enabled curl into Debian soon as they've just packaged a new enough nghttp2 library. If things go well, this leads the way for other distros too.)
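Once you have a curl built against nghttp2, checking for and using HTTP/2 is simple (illustrative commands; the exact version banner differs per build, and the target server is just an example of one that speaks HTTP/2):

```shell
# The features line of the version output shows "HTTP2" when curl
# was built with nghttp2
curl -V

# Ask for HTTP/2 explicitly; curl negotiates it over TLS and falls
# back to HTTP/1.1 if the server doesn't speak it
curl --http2 -I https://nghttp2.org/
```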
Server-side we see Apache's mod_h2 module ship in a public release soon (possibly in a httpd version 2.4 series release), nginx has this alpha patch I've already mentioned and Apache Traffic Server (ATS) has already shipped h2 support for a while, and my friends tell me that 6.0 has fixed numerous initial bugs. IIS 10 for Windows 10 was released on July 29th 2015 and supports HTTP/2. H2O and nghttp2 have shipped HTTP/2 for a long time by now. I would say that the infrastructure offering is starting to look really good! Around the end of the year it'll look even better than today.
Of course we're still only seeing HTTP/2 deployed over HTTPS, so HTTP/2 cannot currently get more popular than HTTPS is, but there's also no real reason for a site using HTTPS today not to provide HTTP/2 in the near future. I think there's a real possibility that we go above 10% use already in 2015, and at least for browser traffic to HTTPS sites we should be able to expect that almost every single HTTPS site will go HTTP/2 during 2016.
The delayed start of letsencrypt has also delayed more and easier HTTPS adoption.
Still catching up
I'm waiting to see the intermediaries really catch up. I believe Varnish, Squid and HAProxy are all planning to support it to at least some extent, but I've not yet seen any of them release a version with HTTP/2 enabled.
I hear there's still not a good HTTP/2 story on Android and its stock HTTP library, although you can in fact run libcurl HTTP/2 enabled even there, and I believe there are other stand-alone libs for Android that support HTTP/2 too, like OkHttp for example.
The latest stable Firefox release right now is version 40. It counts 13% HTTP/2 responses among all HTTP responses. Counted as a share of the transactions going over HTTPS, the share is roughly 27%! (Since Firefox 40 counts 47% of the transactions as HTTPS.)
This share is certainly weighted toward the high volume sites of course, but there are also several very high volume sites that have not yet gone HTTP/2, like Facebook, Yahoo, Amazon, Wikipedia and more...
The IPv6 comparison
So we started today. I won't get into any live details or quotes from the day since it has all been informal and we've all agreed to not expose snippets from here without checking properly first. There will be a detailed report put together from this event afterwards.
The most critical piece of information is however that we must not walk on the red parts of the sidewalks here in Münster, as those are the bicycle lanes and the cyclists can be ruthless there.
We've had a bunch of presentations today with associated Q&A and follow-up discussions. Roy Fielding (HTTP spec pioneer) started out the series with a look at HTTP full of historic details and views from the past and where we are and what we've gone through over the years. Patrick McManus (of Firefox HTTP networking) took us through some of the quirks of what a modern day browser has to do to speak HTTP and topped it off with a quiz regarding Firefox metrics. Did you know 31% of all Firefox HTTP requests get fulfilled by the cache, or that 73% of all Firefox HTTP/2 connections are used more than once but only 7% of the HTTP/1 ones?
Poul-Henning Kamp (author of Varnish) brought his view on HTTP/2 from an intermediary's point of view with a slightly pessimistic view, not totally unlike what he's published before. Stefan Eissing (from Green Bytes) entertained us by talking about his work on writing mod_h2 for Apache Httpd (and how it might be included in the coming 2.4.x release) and we got to discuss a bit around timing measurements and its difficulties.
We rounded off the afternoon with a priority and dependency tree discussion topped off with a walk-through of numbers and slides from Kazuho Oku (author of H2O) on how dependency-trees really help and from Moto Ishizawa (from Yahoo! Japan) explaining Firefox's (Patrick's really) implementation of dependencies for HTTP/2.
We spent the evening having a 5-course (!) meal at a nice Italian restaurant while trading war stories about HTTP, networking and the web. Now it is close to midnight and it is time to reload and get ready for another busy day tomorrow.
I'll round off with a picture of where most of the important conversations were had today:
My series of weekly videos, for lack of a better name called daniel weekly, reached episode 35 today. I'm celebrating this fact by also adding an RSS feed for those of you who prefer to listen to me in an audio-only version.
As an avid podcast listener myself, I can certainly see how this will be a better fit to some. Most of these videos are just me talking anyway so losing the visual shouldn't be much of a problem.
A typical episode
I talk about what I work on in my open source projects, which means a lot of curl stuff and occasional stuff from my work on Firefox for Mozilla. I also tend to mention events I attend and HTTP/networking developments that I find interesting or that grab my attention. Lots of HTTP/2 talk for example. I only ever express my own personal opinions.
It is generally an extremely geeky and technical video series.
Every week I mention a (curl) "bug of the week" that allows me to joke or rant about the bug in question or just mention what it is about. In episode 31 I started my "command line options of the week" series in which I explain one or a few curl command line options with some amount of detail. There are over 170 options so the series is bound to continue for a while. I've explained ten options so far.
I've set a limit for myself and I make an effort to keep the episodes shorter than 20 minutes. I've not succeeded every time.
The 35 episodes have been viewed over 17,000 times in total. Episode two is the most watched individual one with almost 1,500 views.
Right now, my channel has 190 subscribers.
The top-3 countries that watch my videos: USA, Sweden and UK.
Share of viewers that are female: 3.7%