
This is the story of my background. What I've done and how I ended up like this.

Daniel Stenberg

I was born and raised in Huddinge, a suburb south of Sweden's capital Stockholm. I have two brothers and two sisters. My parents have no computer background or interest. My dad was a construction worker for most of his life and my mom has mostly been into arts and handcrafts.

Before computers

At the age of seven, I started playing football (soccer as Americans would say) in a team. I enjoyed it thoroughly and kept playing in that team quite actively - even at a fairly decent level - until I was seventeen years old, when school and my computer interest eventually made me quit.

In school I was a fairly good student, but never one of the popular ones, and math was always one of my favorite subjects.

1985 - it begins

I discovered the joy of computers for the first time sometime in the early 80s, when my friend Kjell and I typed in BASIC listings that we eagerly read in some of the first C64 magazines, at his place. I have been hooked ever since. Kjell owned a C64 before me, so it was in his home I had my first experiences in the computer world. My younger brother Björn and I then saved up money for our own first computer, which we finally bought together in 1985, when I was 14 years old. A Commodore 64. A glorious and marvelous Commodore 64.

I was immediately fascinated by the concept of being able to control the computer and tell it what to do and how. I headed straight into programming and quickly learned BASIC and how to do simple stuff. Soon I realized that the cool stuff we saw other people do, and all the games and so on, were not made with BASIC. What did they use? Assembler.

The three of us (me, Kjell and Björn) dove wholeheartedly into the wonderful world of 6510 assembly. We started hacking demos because we liked watching demos and we wanted to make demos too. We figured out that all the cool demo making people were part of demo groups and had nicknames and so on, and we felt we too had to join that spirit and quickly founded our own C64 group, Confusing Solution (we could make fun of ourselves already back then).

This was when I started spending spare time on programming. Up to several hours per day. This is something that I have never stopped doing since...

Demo Scene

Due to a happy coincidence, Triad and Fairlight, two of the giant groups on the demo scene of the time, organized a copy-party in our school (Kvarnbergsskolan) in Huddinge in the late winter of 1987. We got sucked deeper and harder into the C64 demo and hacker spirit and community. During that meet-up with hundreds of other C64 geeks we met many like-minded people and released our first demo ever (actually, our first software release in any category - I was 17 years old by then). We released two more demos as Confusing Solution in early 1988. We spent more and more of our spare time coding C64 assembly.

Later, in the spring of 1988, we were invited to a small gathering by our friend Fonzi, who was then the leading person in the C64 group Super Swap Sweden (SSS). When asked, we decided to join their team. At that time, Super Swap Sweden was already a large and well-known group in Sweden that made both cracks (i.e. removed copy protection from games and spread pirated copies of them) and demos. We were swept away by the attention and did not hesitate to join this large group of friends. We went on and released more demos under the SSS flag, got better and learned more about the C64's undocumented corners, opcodes and circuits.

bagder

There was this established nickname tradition on the demo scene. I started out calling myself D$85 based on my initials and the year I got the C64, but I did not really like it (for one thing, it was impossible to pronounce) and when we joined SSS I switched.

I figured I would pick an animal and I went with bagder. I quickly realized I had actually misspelled the animal, but then it also dawned on me that my nickname would be better and more special this way, so I stuck with it. Like a dyslexic badger. People have frequently mistaken my nick for the properly spelled animal ever since.

I used the nickname during our demo years, and after that period I have kept using it for various other services and account names - like on GitHub, Twitter, etc.

Horizon

The three of us (me, Kjell and Björn) and a few other coders left SSS after a while and instead created Horizon together with a bunch of other demo-hacker friends from the Swedish scene (several came from the group Thundercats), and now we were definitely one of the leading demo groups in Sweden. We wanted a more tight-knit group that would focus on demos only - no cracking at all. We won a whole range of demo competitions in Sweden and Denmark during that period of a few years. We also organized some of the biggest nerd-meetings in northern Europe during the period, so-called copy-parties. We would gather more than 500 teenagers from all over northern Europe in a school over a weekend and spend it hacking on code, chatting, drinking Coca-Cola and then competing in a demo competition toward the end. (Such events would later on get called LAN-parties, but back in the late 80s and early 90s we had no LANs...)

Skyline Techniques

During these days I also wrote a music editor (a program you use to compose and create music) with a separate, optimized music player for use in productions. Under the name Skyline Techniques, Björn and Linus wrote several catchy SID tunes that we and others used in many demos of the time. We had a dream of being able to produce and sell music for commercial C64 productions, but that never materialized.

Games

We created at least two quite well-developed and functioning game embryos on the C64. One was a two-car split-screen driving game with a top view of the cars, and the other was a horizontal shoot-em-up with quite advanced multiplexed sprites that allowed for 20-30 simultaneously flying enemies to shoot down. Creating the games and getting them maybe 90% done was fun. Actually having the discipline to take them all the way was another matter, so we never completed any of them.

The C64 golden age faded away for us - it felt like we were done with that platform and its set of limitations. Several of us looked at making the jump over to the new emerging platform, the Amiga, and continuing the same activities there - as was common at the time - but the Amiga's almost unlimited conditions (compared to the C64), with lots of memory, a super-fast CPU with plenty of registers, a blitter (co-processor) and an audio chip, in many ways took away much of what we considered the charm of demo-hacking: the strict limits. We only released one demo on the Amiga as Horizon.

(There's a whole separate story about a different set of people who also called themselves Horizon on the Amiga and who also did demos, but this is not the place to tell that story.)

I did my mandatory military service basically throughout all of 1990, without a clear direction of what to program next.

No university for me

In early 1991 I applied for a course at the university while doing odd jobs as a substitute teacher during the spring. A friend at IBM contacted me and offered me a job, so I dropped the studying plans and figured I could always go back and do that later instead. Which I then never did.

Amiga

Instead of continuing with demos, Kjell and I started our ambitious project FrexxEd around 1991 - a customizable and programmable text editor for the Amiga. In that same year - when I was 20 years old and moved into my first apartment, shared with my brother - I made my professional debut in the IT industry by starting that job at IBM. I worked with RS/6000 machines and IBM's Unix flavor called AIX. This was my first introduction to Unix and C and wow, I was immediately hooked and fascinated by the Unix concepts. "Unix is the future!" I told my girlfriend at the time (she would later become the Mrs. Stenberg I'm married to today), who of course had no idea what I was talking about. I learned all this new stuff primarily through man pages. My actual work was probably called something like system installation and setup of RS/6000 machines that came to us to get customized and polished before they were sent out to customers.

IBM

At IBM, I learned that there was lots of free source code for programs available, and that there was a super cool editor called Emacs which you could do anything with. Much of the inspiration and many of the ideas for FrexxEd, which we continued to work on, came from my discoveries and lessons with Emacs at that job. Emacs existed on the Amiga too, but it did not really come into its own there, and we thought that we could do better in the (somewhat limited, compared to the big Unix machines of the time) Amiga environment.

FrexxEd

Basically the only thing I did software-wise on the Amiga was to write FrexxEd. I wrote a dedicated scripting language for it, the Frexx Programming Language (FPL). I made FPL really portable and it ran fine on several unixes as well as on AmigaOS, etc. Meanwhile, Björn (my brother, remember?) wrote up a BBS system under OS/2 that used FPL quite extensively. We ran our dual-line BBS The Holy Grail for several years into the 90s.

The name FrexxEd was just a playful word using two x's, which we enjoyed, and that habit has followed us later in life too. Basically the Swedish word fräck (translates to cheeky) Englishified with x's, and then Ed tacked on to the end of it like many text editors were named at that time. The fact that the name turned out similar to the Amiga scripting language ARexx was actually not intentional.

FrexxEd was shareware for a long time. We came from the C64 and Amiga background where FOSS was not a familiar concept and it barely existed within that culture - sadly enough, as it would have been a really good idea for that community too. Eventually I learned the true ways of life and released FPL fully open. In modern times, people who run one of the new AmigaOS versions have found a renewed interest in FrexxEd and have ported it over. It is exciting that it is still alive - containing code of ours that is more than 30 years old. The FrexxEd code still exists on GitHub.

Dancer

In 1993, I started working as a full-time C developer for real (at Frontec Railway Systems) and I programmed embedded devices that measured the temperature of railway wagons' axle bearings as they passed over the device and its infrared camera - and I came across and programmed on SunOS and DELL Unix as well. I discovered IRC and the fact that there were lots of people out there to talk to. I hung out a lot in #amiga on EFnet, before IRCnet existed. It soon led to me writing an IRC bot in my spare time with a friend (Bjorn Reese) from #amiga - a bot that could be scripted with FPL. We released the bot (Dancer) and FPL fully open source. It was not anything we considered much, really; there was never any other consideration. If we could stand on the shoulders of giants and use this large amount of good software, the least we could do was to also share our contribution with the world. That bot was written primarily for Unix systems (I believe SunOS on Sparc was the system we used) and was my first real application doing TCP/IP networking.

By now the Amiga had completely left my life, and I used my job's modem pool with dial-back to log on to my employer's various Unix machines to IRC and hack on bots in my spare time. I still spent a lot of time in #amiga and #amigaswe where I made lots of online friends.

Httpget

After the summer of 1996, I changed roles at work and started as a consultant within embedded systems. Frontec Tekniksystem was then the name of my new professional home. At my first assignment I improved a PPP implementation for Ericsson running on pSOS. I then moved on and implemented my own malloc replacement. That was the beginning of my years as an embedded systems consultant, almost always working at the customer's place, deep within their product teams.

One day, later in 1996, it struck me that of course it would be cool to have a service added to the bot where you could ask it for up-to-date exchange rates of currencies. Shopping and prices were often discussed in the channels, so why not offer something that could make the bot say what 100 SEK would equal in US dollars? OK, to make this happen I first needed a command-line tool to download currency rates from a web page at a regular interval.

I found a little tool online called httpget which was written by a Brazilian fellow named Rafael Sagula. It fit my purposes almost perfectly. It only required a few small fixes and patches first...

It is a curious coincidence that the first httpget release (0.1) was done on November 11 1996, which also happens to be the same day the first ever Wget release was done. Wget has been considered a curl alternative or substitute by many command line tool users.

Around this time I installed my first Linux systems at work, and we fired up our first public web servers and more. As I had experience from various other unixes from before, Linux was not particularly challenging to install but was still way more interesting due to its price and level of freedom.

I had more or less taken over as leader of the httpget project when I found another currency exchange site that hosted its data and offered it using Gopher, so I had to implement support for that protocol too. Then httpget was not a good name anymore, so I changed it to Urlget. Not long after that, I added FTP support as well, and then the step to adding FTP upload support was not big.

Haxx

In October 1997, my friend Linus and I registered our company, Haxx HB, to use when doing odd spare time jobs outside of our regular employments. Another playful name (hack in plural, hacks, but with two Xs instead of cks). Several years later we converted Haxx into a proper and real corporation; aktiebolag in Swedish.

Due to the restrictive Internet domain registration rules in Sweden in the 1990s we registered and used the domain haxx.nu until the year 2000 when the rules changed and we could register haxx.se.

Spare time hacking and full-time work

Already pretty early on in my adult life I established a system that would allow me to keep doing spare time software while still working full-time and spending time with my wife - and later on my kids - during the day. I realized they need more sleep than I do, so I simply stay up after they go to bed and get around two extra hours, totally alone, to work on whatever I want.

Two hours per day, every day, through decades adds up to a lot of time. Of course I also spend a little extra at times, and during vacations I do not spend as much.

curl

By the time the urlget tool got the ability to do uploads, the name had become misleading again, so the project was up for a name change one more time and curl was born. curl as in "see URL", or client for URLs. Gee, naming things is really hard.

I made the first curl release on March 20, 1998. curl version 4.0, as I kept the version numbering from the previous names.

My interest in the Dancer project slowly faded away and I gave most of my spare time programming focus to curl.

Of course, over time I also dipped into and participated in other projects. I spent a lot of time on hypermail - a program that converts mailboxes to HTML pages. I have written mail2sms to convert email to SMS (it was useful in the times before smartphones), and worked on Smash to send SMS messages to operators' modem receivers. I worked on Trio - a printf and string function library. I have contributed code to wget and am still somewhat involved in it. I was an early contributor and committer in the Subversion project. I write and maintain roffit - a tool for creating HTML pages from nroff files (man pages).

Licensing

curl had started out GPL licensed pretty much without thought, but after some thinking I decided the GPL approach was not exactly in line with my philosophy.

In 1998, when we released curl 4.9, we switched to the MPL license. It is a liberal license and was much more in line with what I really wanted for curl: have people send back code if they actually change the curl code, but otherwise let them do whatever they want.

However, MPL proved to be a really unwise choice when we later launched libcurl - curl as a library made for other programs to use. Because the MPL is considered GPL-incompatible, applications that were GPL licensed could not easily use libcurl because of this license collision. Therefore, in 2001 curl was relicensed again, this time to an MIT license. That license has stuck and I have not regretted the choice since.

Of course I realize that people can take our code, change it and ship it with their applications and become millionaires without us ever getting back any changes. But in reality this is not a problem because people do not want to maintain their own forks, their own custom versions of curl. By avoiding a copyleft license we have successfully seen numerous businesses use curl. Companies that otherwise would not have considered using curl.

Rockbox

In the year 2000 lots of things happened. Several of my friends and colleagues and I switched employer to Contactor AB, but I basically remained doing the same thing: embedded systems development as a consultant. I got married.

In that period I co-founded the Rockbox project (together with Björn and Linus) and I worked a lot within that project for many years. It was great fun and I met a lot of new friends through it, many of whom I still meet and chat with regularly. Rockbox is an mp3 player firmware replacement. We reverse engineered mp3 players and replaced the original firmware with our free version, which often was far better than the original in terms of functionality, features and battery life.

libcurl

Up to that point, curl was just a command line tool. You would invoke it from scripts or from a shell prompt. I of course suspected that there were programs and systems out there that could benefit from getting curl's powers into their applications, and that doing curl as a library would enable that. curl had always sort of been written with that mindset internally, but of course it needed some work to make a real and official API out of it.
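
To give a flavor of what "curl as a library" means in practice, here is a minimal sketch of a transfer using libcurl's easy interface as it exists today (the early libcurl API was similar in spirit). The URL is just a placeholder:

    #include <stdio.h>
    #include <curl/curl.h>

    int main(void)
    {
      curl_global_init(CURL_GLOBAL_DEFAULT);

      CURL *easy = curl_easy_init();
      if(easy) {
        /* example.com is only a placeholder URL for this sketch */
        curl_easy_setopt(easy, CURLOPT_URL, "https://example.com/");
        curl_easy_setopt(easy, CURLOPT_FOLLOWLOCATION, 1L);

        /* performs the transfer and writes the response body to stdout */
        CURLcode res = curl_easy_perform(easy);
        if(res != CURLE_OK)
          fprintf(stderr, "transfer failed: %s\n", curl_easy_strerror(res));

        curl_easy_cleanup(easy);
      }
      curl_global_cleanup();
      return 0;
    }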

On August 7, 2000, we released the first libcurl version: libcurl 7.1. It was immediately used and appreciated by early adopters, which gave me inspiration and energy to continue down that path.

Life 2.0

I continued to hack on curl in my spare time, and work as an embedded systems consultant during my days. In 2003, my wife and I bought a house in a southern suburb of Stockholm, and on September 26 our daughter Agnes was born. Life would never be the same again (as every parent knows).

c-ares

Name resolving for applications had always been done with a synchronous function call in the POSIX API, and this had been a concern for a while for me and a few friends, who at this time had been pondering starting up a project to work on the problem. One day, however, I stumbled over an existing library called ares that did almost exactly what we wanted. I quickly took it to heart and implemented support in curl to use this library to do asynchronous and non-blocking name resolves. Soon I learned that the maintainer of ares pretty much considered his work done on that code base and did not want to merge the changes I fed back and deemed necessary - for example support for building and working on Windows. I felt that I had no other option than to fork the project and adopt it myself to drive it forward. So I did, and c-ares was born.
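
As a rough illustration of what asynchronous resolving with c-ares looks like, here is a minimal sketch under a few assumptions: the host name is just an example, and the classic ares_gethostbyname call driven by a small select loop is only one of several ways to use the library:

    #include <stdio.h>
    #include <netdb.h>
    #include <sys/socket.h>
    #include <sys/select.h>
    #include <arpa/inet.h>
    #include <ares.h>

    /* called by c-ares when the lookup finishes (or fails) */
    static void resolved(void *arg, int status, int timeouts, struct hostent *host)
    {
      char ip[INET6_ADDRSTRLEN];
      if(status != ARES_SUCCESS) {
        fprintf(stderr, "lookup failed: %s\n", ares_strerror(status));
        return;
      }
      ares_inet_ntop(host->h_addrtype, host->h_addr_list[0], ip, sizeof(ip));
      printf("%s resolved to %s\n", host->h_name, ip);
    }

    int main(void)
    {
      ares_channel channel;
      ares_library_init(ARES_LIB_INIT_ALL);
      ares_init(&channel);

      /* starts the lookup; this call returns immediately without blocking */
      ares_gethostbyname(channel, "example.com", AF_INET, resolved, NULL);

      /* drive the channel until no queries remain pending */
      for(;;) {
        fd_set readers, writers;
        struct timeval tv, *tvp;
        FD_ZERO(&readers);
        FD_ZERO(&writers);
        int nfds = ares_fds(channel, &readers, &writers);
        if(!nfds)
          break;
        tvp = ares_timeout(channel, NULL, &tv);
        select(nfds, &readers, &writers, NULL, tvp);
        ares_process(channel, &readers, &writers);
      }

      ares_destroy(channel);
      ares_library_cleanup();
      return 0;
    }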

IIS funding

When my daughter was roughly a year old, I applied for funding from the Swedish foundation IIS (The Internet Foundation In Sweden) to get some focused development time on curl. I wanted to implement a new API and make libcurl better suited for really large numbers of parallel transfers. I was given a grant that I worked on during the spring of 2005, and the multi_socket API was born. Doing 10,000 simultaneous transfers in the same thread became possible. Working from home for a few months doing this was awesome.
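
As a small taste of how several transfers can share one thread, here is a minimal sketch using libcurl's plain multi interface (curl_multi_perform plus curl_multi_poll, as found in modern libcurl). The event-driven multi_socket API that came out of this grant builds on the same idea but adds socket and timer callbacks so that an external event library can do the waiting, which is what makes the really large transfer counts practical. The URLs are placeholders:

    #include <curl/curl.h>

    int main(void)
    {
      /* placeholder URLs, purely for illustration */
      const char *urls[] = { "https://example.com/", "https://example.org/" };
      CURL *easy[2];
      int i, running = 1;

      curl_global_init(CURL_GLOBAL_DEFAULT);
      CURLM *multi = curl_multi_init();

      for(i = 0; i < 2; i++) {
        easy[i] = curl_easy_init();
        curl_easy_setopt(easy[i], CURLOPT_URL, urls[i]);
        curl_multi_add_handle(multi, easy[i]);
      }

      /* both transfers progress concurrently in this single thread */
      while(running) {
        curl_multi_perform(multi, &running);
        if(running)
          curl_multi_poll(multi, NULL, 0, 1000, NULL); /* wait for activity */
      }

      for(i = 0; i < 2; i++) {
        curl_multi_remove_handle(multi, easy[i]);
        curl_easy_cleanup(easy[i]);
      }
      curl_multi_cleanup(multi);
      curl_global_cleanup();
      return 0;
    }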

Adobe funding

In 2006 my second child, Rex, was born, and he was still just a few months old when I was contracted by Adobe to work on implementing SFTP support for curl. Adobe wanted to use it in one of their products to complement FTP uploading. SFTP being based on the SSH protocol, we needed a proper library to handle the binary protocol-level parts so that I would not have to do the actual SSH bits within the curl project.

I loved getting the opportunity to once again work full time on curl for a few months.

libssh2

I looked around for options and at the time I found two feasible alternatives. Quite amusingly, they were named libssh and libssh2 (yes, the number two at the end is the only difference in naming). Unfortunately, neither of them offered a truly non-blocking API, and as my interest was to integrate and use this within libcurl, which already had a non-blocking API, that was an absolute requirement. So I asked both projects about it: basically how they looked at the prospect of (me) adding non-blocking support and what they thought about it. Both responded fairly quickly from what I recall. One in a fairly dismissive manner, suggesting I should use threads instead, and the other in a welcoming and interested fashion. Of course I went with the project that gave the better welcome. I immediately felt welcome and got to know Sara who ran the libssh2 project.

In cooperation with others in the libssh2 project, we implemented a non-blocking API, I made curl use it, and starting in November 2006 we could do SFTP and SCP transfers that way.
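
From an application's point of view, such a transfer looks roughly like the sketch below, assuming a libcurl built with an SSH backend; the host, user, password and path are made up for the example:

    #include <stdio.h>
    #include <curl/curl.h>

    int main(void)
    {
      curl_global_init(CURL_GLOBAL_DEFAULT);
      CURL *easy = curl_easy_init();
      if(easy) {
        /* host, user, password and path are all made up for this sketch */
        curl_easy_setopt(easy, CURLOPT_URL,
                         "sftp://example.com/home/user/file.txt");
        curl_easy_setopt(easy, CURLOPT_USERPWD, "user:secret");

        /* downloads the remote file and writes it to stdout */
        CURLcode res = curl_easy_perform(easy);
        if(res != CURLE_OK)
          fprintf(stderr, "SFTP transfer failed: %s\n", curl_easy_strerror(res));

        curl_easy_cleanup(easy);
      }
      curl_global_cleanup();
      return 0;
    }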

Sara, the lead of libssh2, changed jobs in 2006 and was as a consequence unable to continue maintaining the project, and pretty soon I took over as maintainer of libssh2.

blogging

The concept of writing articles and sometimes almost diary-like entries in a single place on the Internet, blogging, was created at some point in the latter half of the 1990s.

I published my first ever blog post on advogato.org in May 2000. My premiere blog post was, to no one's surprise, about my work on curl. I would then post frequent updates on that site for years to come.

On August 28, 2007 I moved my blogging over to my own site, daniel.haxx.se, which I had been hosting since 2000. I installed WordPress on my site and since then my blog has been self-hosted. Over the following seventeen years, I posted 1,448 blog posts there - and I have not stopped yet.

HTTPbis

I had been working with all these protocols up until now without knowing, and not really caring, exactly how protocols are made or how decisions about them are reached. But the more I worked with HTTP and all its intricate details, the more I became aware of differences between implementations and struggled to work with servers that obviously did not follow what was written in the RFCs. Until someone one day pointed out the HTTPbis working group to me.

HTTPbis was an IETF working group that had been started in 2007 with an effort to refresh the HTTP/1.1 spec. I joined the list and started to follow the development and discussions. I wrote my first post to the list in the spring of 2008.

IETF 75

After gradually having increased my participation in the HTTPbis group over the years, it was a lucky fluke that the 75th IETF meeting in the summer of 2009 happened to be organized in Stockholm, Sweden - my home town. Since curl and HTTP were primarily hobbies of mine, I had a hard time motivating the investment and travel budget of going to IETF meetings abroad. But this time the circus was coming to me, and I finally got to meet a lot of the mailing list participants in person for the first time. Friends. This made me even more interested in and motivated to work within HTTPbis going forward.

Developers sometimes ask me if the slowness and bureaucracy of standardization is not tedious. For me, working within the IETF is a matter of bringing technology and interoperability forward. To be involved and ensure that the specs get done right, taking the right things into consideration and not going overboard fiddling with things we should not. It is good for everyone to have a good IETF. I find the spirit and working methods to be similar to open source.

For example, we carried out work within the IETF to specify how cookies are actually used in HTTP. Cookies had been around for maybe 15 years already at the time, and the only spec that had actually been used was less than one hundred lines and totally useless. Attempts had been made over the years to correct it, and at least two new cookie RFCs were written that failed to get adopted. Finally, in 2009, we started a group within the IETF that worked to document how cookies actually work on the web. I felt that I, as an independent, non-browser-oriented cookie parser implementer of many years, could provide good feedback and a completely different point of view than most others who were participating - many of them coming from the browser world. I would like to think my few bits of contribution helped make RFC 6265 as good as it is. (Published in 2011.)

Haxx AB

Professionally, I had spent the last several years doing contract work where I basically had found the assignments on my own and sold myself without my employer's involvement. At the same time I felt that the company I worked for was not really going in the direction that I wanted to go in. I did not really get my money's worth there. At the end of August 2009 I quit my employment and instead became the first full-time employee of Haxx AB, our own firm.

In 2009, I was awarded the Nordic Free Software Award along with Simon Josefsson for my work in open source and free software up until that point.

Under our own name (Haxx) I continued to do embedded systems contracting, now being my own boss and of course having the ultimate freedom to decide what jobs to take and how to spend my time and money. I still did not get many curl related jobs beyond the occasional smaller hacks and minor improvements (and a series of smaller "I want to automate this using curl, can you do it for me please" tasks), so the protocol side remained a spare time occupation.

A few months after me, my brother Björn joined me as Haxx employee number two, and a year after that, Linus became employee number three. What a glorious development. Looking back, that switch was one of the best decisions I have ever made in my professional life.

Haxx was like a dream since forever, transformed into reality. A small number of close friends who are all experts in embedded systems and Linux. We worked as expert consultants and contractors for companies that built various embedded systems. Embedded systems today means a high degree of Linux and open source.

HTTP/2

The HTTPbis working group took it upon itself to work on an update to HTTP/1.1, which had been the major HTTP version for many years. It had started to show its age, and HTTP/2 took off from Google's SPDY. I participated in that work.

Mozilla

In the fall of 2013 I ended a two-year contracting job for Enea AB where I had worked fiercely to kick-start their embedded Linux distribution. I looked - and asked - around my wider circle of contacts and friends to see if anyone had an interesting opening for me to take on next.

Someone did. Patrick McManus worked in the networking team at Mozilla and asked if I would not be interested in doing my next gig for them. I was thrilled to get the opportunity, even if I then also had to do it as an employee and not as a contractor. It felt weird to give up that style of life, but for this chance I was willing to do a lot. I traveled to Mountain View in November 2013 and did seven different interviews in one rather long day...

I started at Mozilla in the first days of January 2014, in the networking team. HTTP, FTP, DNS, cookies, caching, sockets etc. All day at work. And then all night with curl. Mozilla even allowed me to spend a part of my work time on curl stuff. Mozilla had no office in Sweden, which meant I could work full time from home.

RFC 7540

In May 2015 the HTTP/2 RFC shipped. In relation to the introduction of this new protocol version, I did several talks and presentations for a lot of different audiences in multiple countries. I also wrote a document about it, called http2 explained, that I released freely and openly on the web. It turned out to be a huge success, and for the period I counted downloads, I saw more than 200,000 of them.

everything curl

I have always believed that a key component to reaching success for a project is to provide plentiful, accurate and up-to-date documentation. In September 2015 I started writing separate documentation for curl - something that would better describe how to do things with curl if, for example, you don't know where to start. The existing man pages are excellent resources when you know what you want to do and need to look up how to do it. everything curl was meant as an attempt at a more tutorial-style documentation for everything curl related.

As with everything, given enough time it might turn into something quite good. Nine years later the book consists of 114,000 words and when rendered as a PDF it has more than 550 pages.

Second best developer in Sweden

In 2016, the Swedish online publication Techworld held a contest for Sweden's best developer - and accepted nominations from the public. They had held this contest before, but this time I was nominated and ultimately awarded 2nd place.

QUIC

The QUIC working group was formed in IETF during late 2016 and I joined the mailing list and subscribed to the GitHub repository at once to keep track of and possibly participate in the development.

CDN

At times my blog posts would get large volumes of visitors, which caused us problems when the webserver was brought to its knees by the intense traffic and load. Since the same server hosted several other sites, it was a nuisance.

In May 2017, my personal blog as well as the curl website were switched over to be fronted by Fastly's CDN network. Suddenly almost all traffic was taken off my own server and the instabilities were gone.

US issues

In December 2016 I attended the week-long Mozilla all-hands meeting on Hawaii (which was also my 12th visit to the US through the years - yes, I have had reason to go back and actually carefully count the occasions). In June 2017 I was set to travel to San Francisco for another all-hands company meeting, when I was refused boarding of the flight due to unspecified "problems with my ESTA". (ESTA is the visa waiver program under which I as a Swede can travel to the US.)

As my employer at the time, Mozilla engaged some people on both the American as well as the Swedish side to try to figure out what was wrong and what we could do to correct the situation. Unfortunately, no one would offer any clues or information about why I was denied so that effort resulted in nothing.

In the spring of 2018, I reapplied for ESTA, got denied and then applied for a visa instead. I did the final steps for that when I visited the US embassy in Stockholm Sweden for an interview on April 17, 2018.

Polhem Prize

I was awarded the Polhem Prize in October 2017 for my almost 20 years of having run the curl project and its impact on the world. An amazing honor.

At the award ceremony, I was handed a gold medal from the hands of the Swedish king himself.

HTTP/3 explained

In November 2018 it was announced that the protocol previously called just HTTP over QUIC would officially become HTTP/3. I renamed the new document I was working on and soon I could reveal HTTP/3 Explained online. Free and open. Soon enough, volunteers joined in and started to offer and contribute translated versions.

Leaving Mozilla

In December 2018, I left my employment at Mozilla. I had spent almost five full years employed there and it was a great time with many awesome colleagues and friends. Working full-time from home on open source had been awesome, but it was time for me to do something else.

Why did I leave? Three reasons:

  1. I was bored with the messy Firefox C++ development and with getting more bug reports filed than we managed to close,
  2. my manager turned out to be a bully who worked hard to make my life miserable, and
  3. it was time to attempt to figure out how to work full-time on curl.

Life 3.0

A new home: wolfSSL

In February 2019 I joined wolfSSL to do commercial curl support and work on curl full-time. wolfSSL is an American company. I am the only Swede working for wolfSSL and I could continue to work from home. I love it.

At the time I started I was still not allowed entry to the US so I was not even able to visit my new colleagues.

wolfSSL already offered a set of existing open source libraries and commercial support on those, so adding curl and libcurl to that offer was a good match - and many customers use both wolfSSL and libcurl in products.

I had already known wolfSSL and its CEO Larry for many years.

Covid-19

I did a presentation in person at FOSDEM in early February 2020, just before we learned exactly how big of a deal this Covid-19 thing was. Up until then I had been doing a little more than one presentation per month, which served as an excellent way to spice up the working-from-home life.

Then came the era of doing presentations over video. So much harder. So much not the same thing. Not for me as a presenter, and certainly not for the audience.

curl.se

In early November 2020 I could finally get my hands on the curl.se domain. I had been trying to get a curl domain under any top level domain for a long time when this domain finally ended up mine.

In the early 2000s this domain was used for a curling website and seems to have been handed over between a few different curling teams, until it was purchased by someone who decided that running some kind of casino ads on it was a good idea. It remained like that for several years, during which I at one time tried to reach out to see if I could purchase it - but my offer was declined. Instead it was eventually abandoned and a friend of mine managed to snatch it and then gave it to me. Having good friends is awesome.

Visa

My silly US travel situation lasted until November 9, 2020, when I, after 937 days of waiting, finally received a visa in my passport. Of course, at this time the Covid-19 pandemic was still raging, so it was not the ideal time to travel anyway.

uncurled

In the spring of 2022, I decided to convert my then planned series of coming blog entries into an online book instead. The idea was to basically write down what I have learned from maintaining Open Source projects for several decades. To share lessons and insights I have gathered over the years - to produce something that maybe I would have been interested in when I was a newcomer.

I decided to call this "uncurled" since so much of my Open Source work and life has been done on and around curl.

trurl

On the last day of March 2023 I made the first commit in a new project that would shortly get named 'trurl'.

The idea for this came out of me having added, earlier that year, a few new output features to curl that would allow users to output parts of the URL they told curl to work with. It struck me that while it was cool to have curl do this, it is not at all curl's job to help users dissect URLs. Also, since we have learned several times in recent years about the dangers of mixing URL parsers, it dawned on me that it would make a lot of sense to offer a separate tool for "URL management" that uses the same URL parser as curl does. Conveniently enough, we had introduced a URL API in libcurl a couple of years earlier.
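
That URL API is what makes such a tool cheap to build on top of libcurl. A minimal sketch of using it to parse a URL and extract one component (the URL here is just an example) could look like this:

    #include <stdio.h>
    #include <curl/curl.h>

    int main(void)
    {
      CURLU *url = curl_url();
      char *host = NULL;

      /* the URL is just an example */
      curl_url_set(url, CURLUPART_URL,
                   "https://example.com:8080/path?search=curl", 0);

      /* ask the parser for one component of the URL */
      if(!curl_url_get(url, CURLUPART_HOST, &host, 0)) {
        printf("host: %s\n", host); /* prints "example.com" */
        curl_free(host);
      }

      curl_url_cleanup(url);
      return 0;
    }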

trurl is an additional tool managed by the curl project.

wcurl

MVP

Future

I never plan very far ahead.