Category Archives: Work

Work stuff

tiny-curl

curl, or libcurl specifically, is probably the world's most popular and widely used HTTP client-side library, counting more than six billion installs.

curl is a rock solid and feature-packed library that supports a huge amount of protocols and capabilities that surpass most competitors. But this comes at a cost: it is not the smallest library you can find.

Within 100K

Instead of settling for being told that curl is "too big" for certain use cases, I set a goal for myself: make it possible to build a version of curl that can do HTTPS and fit in 100K (including the wolfSSL TLS library) on a typical 32 bit architecture.

As a comparison, the tiny-curl shared library built on x86-64 Linux is less than 25% of the size of the default library shipped by Debian.

FreeRTOS

But let's not stop there. Users with this kind of strict size requirements are rarely running a full Linux installation or a similar OS. If you are sensitive about storage down to the exact kilobyte, you usually run a more slimmed-down OS as well – so I decided that my initial tiny-curl effort should be done on FreeRTOS. That's a fairly popular and free RTOS for the more resource-constrained devices.

This port is still rough and I expect us to ship follow-up releases soon that improve the FreeRTOS port and ideally also add support for other popular RTOSes. Which RTOS would you like us to support that isn't already supported?

Offer the libcurl API for HTTPS on FreeRTOS, within 100 kilobytes.

Maintain API

I strongly believe that much of the power of having libcurl in your embedded devices comes from the libcurl API: the API you can use for libcurl on any platform, that has been around for a very long time and for which you can find numerous examples on the Internet and in libcurl's extensive documentation. Maintaining support for the API was of the highest priority.
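
To make that concrete, here is roughly what a minimal HTTPS GET looks like with the familiar libcurl easy API – the same kind of code an application would use against a full curl build. The URL is of course just a placeholder; by default the response body is written to stdout.

#include <stdio.h>
#include <curl/curl.h>

int main(void)
{
  CURL *curl;
  CURLcode res;

  /* global init once per program */
  curl_global_init(CURL_GLOBAL_DEFAULT);

  curl = curl_easy_init();
  if(curl) {
    /* placeholder URL - use your own server */
    curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/");

    /* perform the HTTPS GET */
    res = curl_easy_perform(curl);
    if(res != CURLE_OK)
      fprintf(stderr, "curl_easy_perform() failed: %s\n",
              curl_easy_strerror(res));

    curl_easy_cleanup(curl);
  }
  curl_global_cleanup();
  return 0;
}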

Patch it

My secondary goal was to keep the patches as clean as possible, so that we can upstream into the main curl source tree the changes that make sense and that aren't disturbing to the general code base, and so that the work we can't upstream can be rebased on top of the curl code base with as little friction as possible going forward.

Keep the HTTPS basics

I just want to do HTTPS GET

That’s the mantra here. My patch disables a lot of protocols and features:

  • No protocols except HTTP(S) are supported
  • HTTP/1 only
  • No cookie support
  • No date parsing
  • No alt-svc
  • No HTTP authentication
  • No DNS-over-HTTPS
  • No .netrc parsing
  • No HTTP multi-part formposts
  • No shuffled DNS support
  • No built-in progress meter

They are all disabled individually though, so it is still easy to re-enable one or more of them for specific builds.
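
As a rough illustration of what "disabled individually" means in practice: each feature can be compiled out with its own preprocessor switch so the code simply isn't part of the binary. This sketch uses CURL_DISABLE_COOKIES as one example of such a switch; the helper function is made up for the illustration.

#include <stdio.h>

/* Build with -DCURL_DISABLE_COOKIES to compile the cookie handling out
   entirely; build without it to keep the feature in the binary. */

#ifndef CURL_DISABLE_COOKIES
static void handle_cookie_header(const char *value)
{
  /* in a real build, full cookie parsing would live here */
  printf("would store cookie: %s\n", value);
}
#endif

int main(void)
{
  const char *hdr = "session=abc123";
#ifndef CURL_DISABLE_COOKIES
  handle_cookie_header(hdr);
#else
  (void)hdr; /* cookies compiled out: the header is simply ignored */
#endif
  return 0;
}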

Downloads and versions?

Tiny-curl 0.9 is the first shot at this and can be downloaded from wolfSSL. It is based on curl 7.64.1.

Most of the patches in tiny-curl are being upstreamed into curl in the #3844 pull request. I intend to upstream most, if not all, of the tiny-curl work over time.

License

The FreeRTOS port of tiny-curl is licensed GPLv3 and not MIT like the rest of curl. This is an experiment to see how we can do curl work like this in a sustainable way. If you want this under another license, we’re open for business over at wolfSSL!

Sometimes I speak

I view myself primarily as a software developer. Perhaps secondarily as someone who's somewhat knowledgeable in networking and participates in protocol development and discussions. I do not regularly proclaim myself to be a "speaker" or someone who's even very good at talking in front of people.

Time to wake up and face reality? I’m slowly starting to realize that I’m actually doing more presentations than ever before in my life and I’m enjoying it.

Since October 2015 I've done 53 talks and presentations in front of audiences – in ten countries. That's one presentation every 25 days on average. (The start date of this count is a little arbitrary, but it just happens that I started to keep a proper log then.) I've talked to huge audiences and to small ones. I've done presentations that were appreciated and I've done some that were less successful.

The room for the JAX keynote, May 2019, as seen from the stage, some 20 minutes before 700 persons sat down in the audience to hear my talk on HTTP/3.

My increased frequency in speaking engagements coincides with me starting to work full-time from home back in 2014. Going to places to speak is one way to get out of the house and see the “real world” a little bit and see what the real people are doing. And a chance to hang out with humans for a change. Besides, I only ever talk on topics that are dear to me and that I know intimately well so I rarely feel pressure when delivering them. 2014 – 2015 was also the time frame when HTTP/2 was being finalized and the general curiosity on that new protocol version helped me find opportunities back then.

Public speaking is like most other things: surprisingly enough, practice actually makes you better at it! I still have a lot to learn and improve, but speaking many times has for example made me better at figuring out roughly how much time I need to deliver a particular talk. It has taught me to "find myself" better when presenting and to be more relaxed and the real me – no need to put up a facade of some kind or pretend. People like seeing that there's a real person there.

I talked HTTP/2 at Techday by Init, in November 2016.

I’m not even getting that terribly nervous before my talks anymore. I used to really get a raised pulse for the first 45 talks or so, but by doing it over and over and over I think the practice has made me more secure and more relaxed in my attitude to the audience and the topics. I think it has made me a slightly better presenter and it certainly makes me enjoy it more.

I'm not "a good presenter". I can deliver a talk and I can do it with dignity, and I think the audience is satisfied with me in most cases, but by watching actually good presenters talk I realize that I still have a long journey ahead of me. Of course, part of the explanation is that, to connect with the beginning of this post, I'm a developer. I don't talk for a living and I actually very rarely practice my presentations much, because I don't feel I can spend that time.

The JAX keynote in May 2019 as seen from the audience. Photo by Bernd Ruecker.

Some of the things that are still difficult include:

The money issue. I actually am a developer and that’s what I do for a living. Taking time off the development to prepare a presentation, travel to a distant place, sacrifice my spare time for one or more days and communicating something interesting to an audience that demands and expects it to be both good and reasonably entertaining takes time away from that development. Getting travel and accommodation compensated is awesome but unfortunately not enough. I need to insist on getting paid for this. I frequently turn down speaking opportunities when they can’t pay me for my time.

Saying no. Oh my god do I have a hard time doing this. This year, I've been invited to so many different conferences and the invitations keep flying in. For every single received invitation, I get this warm and comfy feeling and I feel honored and humbled by the fact that someone actually wants me to come to their conference or gathering to talk. Then there's the calendar problem: I can't be in two places at once. I also can't plan events too close to each other in time, to avoid them holding up "real work" too much or becoming too much of a nuisance to my family. Sometimes there's also the financial dilemma: if I can't get compensation, it gets tricky for me to do it, no matter how good the conference seems to be and how noble the cause they're working for.

At SUE 2016 in the Netherlands.

Feedback. To determine which parts of the presentation should be improved for the next time I speak on the same or a similar topic, which parts should be removed and whether something should be expanded, figuring out what works and what doesn't is vital. For most talks I've done, there's been no formal way to provide or receive this feedback, and for the small percentage that had a formal feedback form or a scoring system or similar, making sense of a bunch of distributed grades (for example "your talk was graded 4.2 on a scale between 1 and 5") and random comments – either positive or negative – is really hard… I get the best feedback from close friends who dare to tell me the truth as it is.

Conforming to silly formats. Slightly different, but some places want me to send my slides in, either a long time before the event (I've had people ask me to provide them way over a week(!) in advance), or they dictate that the slides should be sent to them as Microsoft Powerpoint, PDF or some other silly format. I want to use my own preferred tools when designing presentations, as I need to be able to reuse the material for more and future presentations. Sure, I can convert to other formats, but that usually ruins the formatting and design. Then a lot of the time and sweat I put into making a fine and good-looking presentation is more or less discarded! Fortunately, most places let me plug in my laptop and everything is fine!

Upcoming talks?

As a little service to potential audience members and conference organizers, I’m listing all my upcoming speaking engagements on a dedicated page on my web site:

https://daniel.haxx.se/talks.html

I try to keep that page updated to reflect current reality. It also shows that some organizers are forward-planning waaaay in advance…

Here’s me talking about DNS-over-HTTPS at FOSDEM 2019. Photo by Steve Holme.

Invite someone like me to talk?

Here’s some advice on how to invite a speaker (like me) with style:

  1. Ask well in advance (more than 2-3 months preferably, probably not more than 9). When I agree to a talk, others who ask for talks in close proximity to that date will get declined. I get a surprisingly large number of invitations for events just a month or so into the future, and it rarely works for me to fit those into my calendar on that time frame.
  2. Do not assume for-free delivery. I think it is good form to address the price/charge situation, if not in the first contact email then at least in the following discussion. If you cannot pay, that's also useful information to provide early.
  3. If the time or duration of the talk you'd like is "unusual" (i.e. not 30-60 minutes), do spell that out early on.
  4. Surprisingly often I get invited to talk without a specified topic or title; the inviter then expects me to come up with one. Since you contacted me, you clearly had some kind of vision of what a talk by me would entail. It would make my life easier if that vision were conveyed, as it would certainly help me produce a talk subject that works!

Presenting HTTP/2 at the Velocity conference in New York, October 2015, together with Ragnar Lönn.

What I bring

To every presentation I do, I bring my laptop. It has HDMI and USB-C ports. I also carry an HDMI-to-VGA adapter for the few venues that still use the old "projector port". Places that need something other than those ports tend to have their own converters already, since they regularly deal with equipment that isn't fitted for their setup.

I always bring my own clicker (the “remote” with which I can advance to next slide). I never use the laser-pointer feature, but I like being able to move around on the stage and not have to stand close to the keyboard when I present.

Presentations

I never create my presentations with video or sound in them, and I don’t do presentations that need Internet access. All this to simplify and to reduce the risk of problems.

I work hard on limiting the amount of text on each slide, but I also acknowledge that if a slide set should have value after-the-fact there needs to be a certain amount. I’m a fan of revealing the text or graphics step-by-step on the slides to avoid having half the audience reading ahead on the slide and not listening.

I’ve settled on 16:9 ratio for all presentations. Luckily, the remaining 4:3 projectors are now scarce.

I always make and bring a backup of my presentations in PDF format so that basically “any” computer could display that in case of emergency. Like if my laptop dies. As mentioned above, PDF is not an ideal format, but as a backup it works.

I talked “web transport” in the Mozilla devroom at FOSDEM, February 2017 in front of this audience. Not a single empty seat…

What is the incentive for curl to release the library for free?

(This is a repost of the answer I posted on stackoverflow for this question. This answer immediately became my most upvoted answer ever on stackoverflow, with 516 upvotes during the 48 hours it was up before a moderator deleted it for unspecified reasons. It had then already been marked "on hold" for being "primarily opinion-based" and then locked but kept: "exists because it has historical significance". But apparently that wasn't good enough. I've saved a screenshot of the deletion. Debated on meta.stackoverflow.com. Status now: it was brought back but remains locked.)

I’m Daniel Stenberg.

I made curl

I founded the curl project back in 1998, I wrote the initial curl version and I created libcurl. I’ve written more than half of all the 24,000 commits done in the source code repository up to this point in time. I’m still the lead developer of the project. To a large extent, curl is my baby.

I shipped the first version of curl as open source since I wanted to “give back” to the open source world that had given me so much code already. I had used so much open source and I wanted to be as cool as the other open source authors.

Thanks to it being open source, literally thousands of people have been able to help us out over the years and have improved the products, the documentation, the web site and just about every other detail around the project. curl and libcurl would never have become the products that they are today were they not open source. The list of contributors now surpasses 1900 names and currently grows by a few hundred names per year.

Thanks to curl and libcurl being open source and liberally licensed, they were immediately adopted in numerous products and soon shipped by operating systems and Linux distributions everywhere thus getting a reach beyond imagination.

Thanks to them being "everywhere", available and liberally licensed, they got adopted and used everywhere and by everyone. It created a de facto transfer library standard.

At an estimated six billion installations worldwide, we can safely say that curl is the most widely used internet transfer library in the world. It simply would not have gotten there had it not been open source. curl runs in billions of mobile phones, a billion Windows 10 installations, half a billion games and several hundred million TVs – and more.

Should I have released it under a proprietary license instead and charged users for it? It never occurred to me, and it wouldn't have worked because I would never have managed to create this kind of stellar project on my own. And projects and companies wouldn't have used it.

Why do I still work on curl?

Now, why do I and my fellow curl developers still continue to develop curl and give it away for free to the world?

  1. I can’t speak for my fellow project team members. We all participate in this for our own reasons.
  2. I think it’s still the right thing to do. I’m proud of what we’ve accomplished and I truly want to make the world a better place and I think curl does its little part in this.
  3. There are still bugs to fix and features to add!
  4. curl is free but my time is not. I still have a job and someone still has to pay for my time every month so that I can put food on the table for my family. I charge customers and companies to help them with curl. You too can get my help for a fee, which then indirectly helps make sure that curl continues to evolve, remain free and stay the kick-ass product it is.
  5. curl was my spare time project for twenty years before I started working with it full time. I’ve had great jobs and worked on awesome projects. I’ve been in a position of luxury where I could continue to work on curl on my spare time and keep shipping a quality product for free. My work on curl has given me friends, boosted my career and taken me to places I would not have been at otherwise.
  6. I would not do it differently if I could go back and do it again.

Am I proud of what we’ve done?

Yes. So insanely much.

But I’m not satisfied with this and I’m not just leaning back, happy with what we’ve done. I keep working on curl every single day, to improve, to fix bugs, to add features and to make sure curl keeps being the number one file transfer solution for the world even going forward.

We make mistakes along the way. We make the wrong decisions and sometimes we implement things in crazy ways. But winning in the end and conquering the world is about patience and endurance, constantly going back to reconsider previous decisions and correct previous mistakes. To continuously iterate, polish off rough edges and gradually improve over time.

Never give in. Never stop. Fix bugs. Add features. Iterate. To the end of time.

For real?

Yeah. For real.

Do I ever get tired? Is it ever done?

Sure I get tired at times. Working on something every day for over twenty years isn't a paved downhill road. Sometimes there are obstacles. At times things are rough. Occasionally people are just as ugly and annoying as people can be.

But curl is my life's project and I have patience. I have thick skin and I don't give up easily. The tough times pass and most days are awesome. I get to hang out with awesome people, and knowing that my code helps drive the Internet revolution everywhere is an ego boost above the normal.

curl will never be “done” and so far I think work on curl is pretty much the most fun I can imagine. Yes, I still think so even after twenty years in the driver’s seat. And as long as I think it’s fun I intend to keep at it.

One year in still no visa

One year ago today. On the sunny Tuesday of April 17th 2018 I visited the US embassy in Stockholm, Sweden and applied for a visa. I'm still waiting for them to respond.

My days-since-my-visa-application counter page is still counting. Technically speaking, I had already applied earlier, but that was the day of the actual physical in-person interview that served as the last formal step in the application process. Most people then get their visa application confirmed within weeks.

Initially I emailed them a few times after that interview since the process took so long (little did I know back then), but now I haven’t done it for many months. Their last response assured me that they are “working on it”.

Lots of things have happened in my life since last April. I quit my job at Mozilla and started a new job at wolfSSL, again working for a US based company. One that I cannot go visit.

During this year I missed out on a Mozilla all-hands, I’ve been invited to the US several times to talk at conferences that I had to decline and a friend is getting married there this summer and I can’t go. And more.

Going forward I will miss more interesting meetings and speaking opportunities and I have many friends whom I cannot visit. This is a mild blocker to things I would want to do and it is an obstacle to my profession and career.

I guess I might get my rejection notice before my counter reaches two full years, based on stories I’ve heard from other people in similar situations such as mine. I don’t know yet what I’ll do when it eventually happens. I don’t think there are any rules that prevent me from reapplying, other than the fact that I need to pay more money and I can’t think of any particular reason why they would change their minds just by me trying again. I will probably give it a rest a while first.

I'm lucky and fortunate that people and organizations have adapted to my situation – a situation I of course share with many others, so it's not uniquely mine – and lots of meetings and events have been held outside of the US at least partially to accommodate me. I'm most grateful for this and I understand that at times it won't work and I then can't attend. These days most things are at least partly accessible via video streams etc, repairing the harm a little. (And yes, this is a first-world problem and I'm fortunate that I can still travel to most other parts of the world without much trouble.)

Finally: no, I still have no clue as to why they act like this and I don’t have any hope of ever finding out.


curl up 2019 is over

(I will update this blog post with more links to videos and PDFs to presentations as they get published, so come back later in case your favorite isn’t linked already.)

The third curl developers conference, curl up 2019, is now history. We gathered at the lovely Charles University in central Prague, where we sat down in an excellent class room. After the HTTP symposium on the Friday, we spent the weekend diving deeper into protocols and curl details.

I started off the Saturday with The state of the curl project (youtube): an overview of how we're doing right now in terms of stats, graphs and numbers from different aspects, then something about what we've done the last year and a quick look at what's not so good and what we could work on going forward.

James Fuller took the next session with his Newbie guide to contributing to libcurl presentation: things to consider and general best practices that could make your first steps into the project more likely to be pleasant!

Long-term curl hacker Dan Fandrich (also known as "Daniel two" out of the three Daniels we have among our top committers) followed up with Writing an effective curl test, where he detailed what different tests we have in curl, what they're for and a little about how to write such tests.

Sign seen at the curl up dinner reception Friday night

After that I was back behind the desk in the classroom that we used for this event and I talked The Deprecation of legacy crap (Youtube). How and why we are removing things, some things we are removing and will soon remove and finally a little explainer on our new concept and handling of “experimental” features.

Igor Chubin then explained his new project for us: curlator: a framework for console services (Youtube). It's a way and tooling that makes it easier to provide access to shell and console oriented services over the web, using curl.

Me again. Governance, money in the curl project and someone offering commercial support (Youtube) was a presentation about how we intend for the project to join a legal entity, SFC, and a little about the money we have, what to spend it on and how I feel it is good to keep the project separate from any commercial support ventures any of us might do!

While the list above might seem like more than enough, the day wasn't over. Christian Schmitz also did his presentation on Using SSL root certificate from Mac/Windows.

Our local hero organizer James Fuller then spoiled us completely when we got around to have dinner at a monastery with beer brewing monks and excellent food. Good food, good company and curl related dinner subjects. That’s almost heaven defined!

Sunday

Daylight saving time morning and you could tell. I’m sure it was not at all related to the beers from the night before…

James Fuller fired off the day by talking to us about Curlpipe (github), a DSL for building http execution pipelines.

The class room we used for the curl up presentations and discussions during Saturday and Sunday.

Robin Marx then put in the next gear and entertained us for another hour with a protocol deep dive titled HTTP/3 (QUIC): the details (slides). For me personally this was exactly what I needed, as Robin has clearly kept up with more details and specifics in the QUIC and HTTP/3 protocol specifications than I've managed, and his talk helped the rest of the room get at least a little bit more in sync with current development.

Jakub Nesetril and Lukáš Linhart from Apiary then talked us through what they’re doing and thinking around web based APIs and how they and their customers use curl: Real World curl usage at Apiary.

Then I was up again and I got to explain to my fellow curl hackers about HTTP/3 in curl. Internal architecture, 3rd party libs and APIs.

Jakub Klímek explained to us in very clear terms the current and existing problems in his talk IRIs and IDNs: Problems of non-ASCII countries. Some of the problems involve curl, and while most of them have their clear explanations, I think we have two lessons to learn from this: URLs are still as messy and undocumented as ever before, and we might have some issues to fix in this area in curl.

To bring my fellow curl hackers up to speed on the details of the new API introduced during the last year, I then made a presentation called The new URL API.

Clearly overdoing it for a single weekend, I then got the honors of doing the last presentation of curl up 2019, and for an audience that was about to die from exhaustion I talked Internals: a walk-through of the architecture and what libcurl does when doing a transfer.

Summary

I ended up doing seven presentations during this single weekend. Not all of them were stellar or delivered with elegance, but I hope they were still valuable to some. I did not steal anyone else's time slot, as I would gladly have given up time if we had had other speakers who wanted to say something. Let's aim for more non-Daniel talkers next time!

A weekend like this is such a boost for inspiration, for morale and for my ego. All the friendly faces with the encouraging and appreciating comments will keep me going for a long time after this.

Thank you to our awesome and lovely event sponsors – shown in the curl up logo below! Without you, this sort of happening would not happen.

curl up 2020

I will of course want to see another curl up next year. There are no plans yet and we don't know where to host it. I think it is valuable to move it around, but I think it is even more valuable that we have a friend on the ground in that particular city to help us out. Once this year's event has sunk in properly and a month or two has passed, the case for and organization of next year's conference will commence. Stay tuned, and if you want to help host us, do let me know!


commercial curl support!

If you want commercial support, ports of curl to other operating systems or just instant help to fix your curl related problems, we’re here to help. Get in touch now! This is the premiere. This has not been offered by me or anyone else before.

I’m not sure I need to say it, but I personally have authored almost 60% of all commits in the curl source code during my more than twenty years in the project. I started the project, I’ve designed its architecture etc. There is simply no one around with my curl experience and knowledge of curl internals. You can’t buy better curl expertise.

curl has become one of the world's most widely used software components and is the transfer engine doing a large chunk of all non-browser Internet transfers in the world today. curl has reached this level of success entirely without anyone offering commercial services around it. Still, not every company and product out there has a team of curl experts, and in this demanding day and age we know there are times when you'd rather hire the right team to help you out.

We are the curl experts that can help you and your team. Contact us for all and any support questions at support@wolfssl.com.

What about the curl project?

I’m heading into this new chapter of my life and the curl project with the full knowledge that this blurs the lines between my job and my spare time even more than before. But fear not!

The curl project is free and open and will remain independent of any commercial enterprise helping out customers. I realize that me offering to deal with curl problems and solve curl issues for companies and organizations for compensation creates new challenges and questions about where the boundaries go, if nothing else for me personally. I still think this is worth pursuing and I'm sure we can figure out and handle whatever minor issues this can lead to.

My friends, the community, the users and the harsh critics on twitter will all help me stay true and honest. I know this. This should end up a plus for the curl project in general as well as for me personally. More focus, more work and more money involved in curl related activities should improve the project.

It is with great joy and excitement I take on this new step.

I’m on team wolfSSL

Let me start by saying thank you to each and every one of you who sent me job offers or otherwise reached out with suggestions and interesting career moves. I received more than twenty different offers and almost every one of them was a truly good option that I could have said yes to and still landed a good job. What a luxury challenge to have to select something from that! Publicly announcing that I was leaving Mozilla turned out to be a great ego boost.

I took some time off to really reflect and contemplate on what I wanted from my next career step. What would the right next move be?

I love working on open source. Internet protocols, transfers and writing libraries in C are things I consider pure fun. Could I get all that and yet keep working from home, not sacrifice my wage and perhaps integrate working on curl better into my day to day job?

I talked to different companies. Very interesting companies too, where I have friends and people who like me and who really wanted to get me working for them, but in the end there was one offer with a setup that stood out. One offer that checked basically every box on my wish list.

wolfSSL

On February 5, 2019 I’m starting my new job at wolfSSL. My short and sweet period as unemployed is over and now it’s full steam ahead again! (Some members of my family have expressed that they haven’t really noticed any difference between me having a job and me not having a job as I spend all work days the same way nevertheless: in front of my computer.)

Starting now, we offer commercial curl support and various services for and around curl that companies and organizations previously really haven’t been able to get. Time I do not spend on curl related activities for paying customers I will spend on other networking libraries in the wolfSSL “portfolio”. I’m sure I will be able to keep busy.

I've met Larry at wolfSSL in person many times over the years, and every year at FOSDEM I've made certain to say hello to my wolfSSL friends in the booth they've had there for years. They're truly old-time friends.

wolfSSL is mostly a US-based company – I’m the only Swede on the team and the only one based in Sweden. My new colleagues all of course know just as well as you that I’m prevented from traveling to the US. All coming physical meetings with my work mates will happen in other countries.

commercial curl support!

We offer all sorts of commercial support for curl. I’ll post separately with more details around this.

I’m leaving Mozilla

It’s been five great years, but now it is time for me to move on and try something else.

During these five years I've met and interacted with a large number of awesome people at Mozilla, lots of new friends! I got the chance to work from home and yet work with a global team on a widely used product, all done with open source. I have worked on internet protocols during work-hours (in addition to my regular spare-time work with them) and it's been great! Heck, lots of the HTTP/2 development and the publication of that spec happened while I was employed by Mozilla and I fondly participated in that. I shall forever have this time ingrained in my memory as a very good period of my life.

I had already before I joined the Firefox development understood some of the challenges of making a browser in the modern era, but that understanding has now been properly enriched with lots of hands-on and code-digging in sometimes decades-old messy C++, a spaghetti armada of threads and the wild wild west of users on the Internet.

A very big thank you and a warm bye bye go to every one of my friends at Mozilla. I won't be far off and I'm sure I will have reasons to see many of you again.

My last day as officially employed by Mozilla is December 11 2018, but I plan to spend some of my remaining saved up vacation days before then so I’ll hand over most of my responsibilities way before.

The future is bright but unknown!

I don’t yet know what to do next.

I have some ideas and communications with friends and companies, but nothing is firmly decided yet. I will certainly entertain you with a totally separate post on this blog once I have that figured out! Don’t worry.

Will it affect curl or other open source I do?

I had worked on curl for a very long time already before joining Mozilla and I expect to keep doing curl and other open source things even going forward. I don’t think my choice of future employer should have to affect that negatively too much, except of course in periods.

With me leaving Mozilla, we're also losing Mozilla as a primary sponsor of the curl project, since that sponsorship consisted of them allowing me to spend some of my work days on curl, and that's now over.

Short-term at least, this move might increase my curl activities since I don’t have any new job yet and I need to fill my days with something…

What about toying with HTTP?

I was involved in the IETF HTTPbis working group for many years before I joined Mozilla (for over ten years now!) and I hope to be involved for many years still. I still have a lot of things I want to do with curl and to keep curl the champion of its class I need to stay on top of the game.

I will continue to follow and work with HTTP and other internet protocols very closely. After all curl remains the world’s most widely used HTTP client.

Can I enter the US now?

No. That’s unfortunately not related, and I’m not leaving Mozilla because of this problem and I unfortunately don’t expect my visa situation to change because of this change. My visa counter is now showing more than 214 days since I applied.

How to DoH-only with Firefox

Firefox supports DNS-over-HTTPS (aka DoH) since version 62.

You can instruct your Firefox to only use DoH and never fall back to the native resolver; the mode we call trr-only. Without any other ability to resolve host names, this is a little tricky, so this guide is here to help you. (This situation might improve in the future.)

In trr-only mode, nobody on your local network nor at your ISP can snoop on your name resolves. The SNI part of HTTPS connections is still clear text though, so eavesdroppers on the path can still figure out which hosts you connect to.

There’s a name in my URI

A primary problem for trr-only is that we usually want to use a host name in the URI for the DoH server (we typically need it to be a name so that we can verify the server's certificate against it), but we can't resolve that host name until DoH is set up to work. A catch-22.

There are currently two ways around this problem:

  1. Tell Firefox the IP address of the name that you use in the URI. We call it the “bootstrapAddress”. See further below.
  2. Use a DoH server that is provided on an IP-number URI. This is rather unusual. There’s for example one at 1.1.1.1.

Set up and use trr-only

There are three prefs to focus on (they’re all explained elsewhere):

network.trr.mode – set this to the number 3.

network.trr.uri – set this to the URI of the DoH server you want to use. This should be a server you trust and want to hand over your name resolves to. The Cloudflare one we’ve previously used in DoH tests with Firefox is https://mozilla.cloudflare-dns.com/dns-query.

network.trr.bootstrapAddress – when you use a host name in the URI for the network.trr.uri pref, you must set this pref to an IP address that the host name resolves to for you. It is important that you pick an IP address that the name you use actually would resolve to.

Example

Let’s pretend you want to go full trr-only and use a DoH server at https://example.com/dns. (it’s a pretend URI, it doesn’t work).

Figure out the bootstrapAddress with dig. Resolve the host name from the URI:

$ dig +short example.com
93.184.216.34

or if you prefer to be classy and use the IPv6 address (only do this if IPv6 is actually working for you)

$ dig -t AAAA +short example.com
2606:2800:220:1:248:1893:25c8:1946

dig might give you a whole list of addresses back, and then you can pick any one of them in the list. Only pick one address though.

Go to “about:config” and paste the copied IP address into the value field for network.trr.bootstrapAddress. Now TRR / DoH should be able to get going. When you can see web pages, you know it works!
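
If you prefer to persist these prefs in a file instead of editing them by hand in about:config, one option is a user.js file in your Firefox profile directory (the standard Firefox prefs mechanism). The URI and address below are the pretend values from the example above, so substitute your own:

// user.js - placed in the Firefox profile directory, read at every startup
// the values below are the pretend example values; use your own DoH server
user_pref("network.trr.mode", 3);
user_pref("network.trr.uri", "https://example.com/dns");
user_pref("network.trr.bootstrapAddress", "93.184.216.34");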

DoH-only means only DoH

If you happen to start Firefox behind a captive portal while in trr-only mode, the connections to the DoH server will fail and no name resolves can be performed.

In those situations, Firefox's captive portal detector would normally trigger and show you the login page etc, but when no names can be resolved and the captive portal can't respond with a fake answer to the name lookup and redirect you to the login, it won't get anywhere. It gets stuck. And currently, there's no good visual indication anywhere that this is what happens.

You simply can't get out of a captive portal with trr-only. You then probably have to temporarily switch mode, log in to the portal and switch the mode back to 3 again.

If you “unlock” the captive portal with another browser/system, Firefox’s regular retries while in trr-only will soon detect that and things should start working again.

administrative purgatory

 your case is still going through administrative processing and we don’t know when that process will be completed.

Last year I was denied entry to the US when I was about to travel to San Francisco. My employer's legal team and I never got answers as to why this happened, so I've personally tried to convince myself it was all because of some human screw-up. Because why would they suddenly block me? I've traveled to the US almost a dozen times over the years.

The fact that there was no reason or explanation given makes any theory as likely as the next. Whatever we think or guess might have happened can be true. Or not. We will probably never know. And I’ve been told a lot of different theories.

Denied again

In early April 2018 I applied for ESTA again to go to San Francisco in mid June for another Mozilla All Hands conference and… got denied. The craziness continues. This also ruled out some of the theories from last year that it was just some human error by the airline or similar…

As seen on the screenshot, this decision has no expiry date… While they don't provide any motivation for not accepting me, this result makes it perfectly clear that it wasn't just a mistake last year. It makes me view last year with different eyes.

Put in this situation, I activated plan B.

Plan B

I then applied for a "real" non-immigrant visa – even though it feels like having been denied ESTA probably puts me at a disadvantage there as well. Applying for this visa means filling in a 10-something-page "DS-160" form online, on a site that sometimes takes minutes just to display the next page of the form, where they ask for a lot of personal details. After finally having conquered that obstacle, I paid the 160 USD fee and scheduled an appointment to appear physically at the US embassy in Sweden.

I acquired an “extraction of the population register” (“personbevis” in Swedish) from the Swedish tax authorities – as required (including personal details of my parents and siblings), I got myself a new mugshot printed on photo paper and was lucky enough to find a date for an appointment not too far into the future.

Appointment

I spent the better part of a fine Tuesday morning in different waiting lines at my local US embassy, where I eventually was called up to a man at a counter behind a window. I was fingerprinted, handed over my papers and told the clerk, when asked, that I have no idea why I was denied ESTA, and no, I have not been on vacation in Iraq, Iran or Sudan. The clerk gave me the impression that's the sort of thing that is the common reason for not getting ESTA.

When I answered the interviewer’s question that I work for Mozilla, he responded “Aha, Firefox?” – which brightened up my moment a little.

Apparently the process is then supposed to take “several weeks” until I get to know anything more. I explained that I needed my passport in three weeks (for another trip) and he said he didn’t expect them to be done that quickly.  Therefore I got the passport back while they process my application and I’m expected to mail it to them when they ask for it.

The next form

When I got back home again, I got an email from “the visa unit” asking me to fill in another form (in the shape of a Word document). And what a form it is! It might be called “OMB 1405-0226” and has this fancy title:

“SUPPLEMENTAL QUESTIONS FOR VISA APPLICANTS”

Among other things it requires me to provide info about all trips abroad (with dates and durations) I've done over the last 15 years, what aliases I use on social media sites (hello mr US visa agent, how do you like this post so far?), every physical address I've lived at in the last 15 years, information about all my employers over the last 15 years and every email address I've used during the last 5 years.

It took me many hours digging through old calendars, archives and memories and asking around in order to fill this in properly. (“hey that company trip we did to Germany back in 2005, can you remember the dates?”) As a side-note: it turns out I’ve been in the US no less than nine times the last fifteen years. In total I managed to list sixty-five different trips abroad for this period.

How do I submit my filled-in form, with all these specific and very private details from my life for the last 15 years, back to “the visa unit”? By email. Good old insecure, easy to snoop on, email! At least I’m using my own mail server (and it is configured to prefer TLS for connections) but that’s a small comfort.

Is it worth it?

This is a very time and energy consuming process – I understand why it puts people off and simply makes them decide it's not worth it to go there. And of course I understand that I'm in a lucky position where I've not had to deal with this much in the past.

I have many friends and contacts in the US in both my personal and professional life. I would be sad if I couldn’t go there ever again. It would give me grief personally since it’ll limit where I can go on vacation and who out of my friends I can visit, but it will also limit my professional life as interesting Mozilla, Internet, open source and curl related events that I’d like to attend are frequently hosted there.

What’s happening?

So the weeks came and went and on May 29th,  six weeks after I was interviewed at the embassy, I checked the online service that allows me to check my application progress. It said “Case Created: April 17” and the following useful addition “Case Last Updated: April 17”.

Wat? Did something go fatally wrong here? I emailed the embassy to double-check. I got this single sentence response back:

Dear Sir,

You don't have to do anything, your case is still going through administrative processing and we don't know when that process will be completed.

In my life I’ve visited a whole series of countries for which I’ve been required to apply for a visa. None of them have ever taken more than a few weeks, including countries with complicated bureaucracy like India and China. What are they doing all this time?

At the time of this writing, more than 100 days have passed and I have still not heard back from them. I know this is unusually long and I have a strong suspicion it means they will deny me a visa, but for some reason they want to keep me unaware for a while longer.

No All Hands in the US

I clearly underestimated the time this required so I missed our meeting in SF this year again…

Mozilla has since announced that a number of the forthcoming All Hands conferences in the coming years will be held outside of the US. Unfortunately several of them are to be held in Canada, and there are indications that having been denied entry to the US means that Canada will deny me as well. But I have yet to test that!

Why they deny me?

To my knowledge, I've never broken a law, rule or regulation that would explain this. Some speculations that I and others can think of include…

  1. I’m the main author of curl, a tool that is used in a lot of security research and proof of concept exploits of security vulnerabilities
  2. I’m the main author of libcurl, a transfer library that is one of the world’s most widely used software components. It is subsequently also used extensively by malware and other offensive and undesired software.
  3. I use the name haxx.se domain for many of my sites and email address etc. haxx or hacking could be interpreted by some, not as “To program a computer in a clever, virtuosic, and wizardly manner” but as the act to “gain unauthorized access to data in a system or computer”.
  4. It’s been suggested that my presence at multiple conferences in the US over the years could’ve been a violation of the ESTA rules – but the rules explicitly allow this. I have not violated the ESTA rules.

Administrative Processing

It’s been 102 days now. I’m not optimistic.