One of the early IRC pioneers, Ari Lemmke, contacted me to set some facts straight on my History of IRC page that I didn’t previously have as bluntly spelled out as I do now… 🙂
Monthly Archives: September 2008
Not Based on Linux
Ok so the guys on the Linux Action Show podcast don’t really get a lot of bonus points from me lately. In the episode after their “we need to sell proprietary software” outburst, they slammed the Rockbox 3.0 release (roughly 23:40 into the episode, for those who want to fast-forward to it).
They started off the news about Rockbox 3.0 by claiming it is based on Linux (which it isn’t and never was), then mentioned that their first attempt to install it on their 3rd-gen iPod failed (a second attempt succeeded), whined a bit about the installer, and then complained again about the inability to install themes even though this is 3.0, yada yada yada.
All in all, it showed a pretty complete lack of appreciation for the hard work and endless hours that hundreds of people have put into Rockbox. Nothing in particular to hear or care about, just a bit annoying.
So THAT is the point of releases!
In the Rockbox project we’ve been using a rather sophisticated build system for many years, one that provides updated binary packages to the public after every single commit. We also provide daily-built zips, manuals, fonts and other extras directly off the Subversion server, fully automatically, every day.
I used to be in the camp that thought this was such a good system that it made ordinary version-numbered releases somewhat unnecessary, since everyone can easily get recent downloads whenever they want anyway. We also had a general problem getting a release done.
But as you all know by now, we shipped Rockbox 3.0 the other day. And man did it hit the news!
lifehacker.com, gizmodo.com, engadget.com, slashdot.org, golem.de, boingboing.net, reddit.com and others helped bring our web server to a crawl. In the four days following the release, we got roughly 160,000 more visits on our site than usual, five times the normal amount (200,000 visits compared to the “normal” 40,000).
Of course, as a pure open source project with no company or money involved anywhere, we don’t exactly need new users, but we do want more developers, and hopefully we reach a few new potential contributors now that we are known to a larger number of people.
So I’m now officially convinced: doing this release was a good thing!
They can’t do it so I won’t
I listened to a recent episode of the Linux Action Show podcast the other day (s9e4), and in that episode the hosts Bryan and Chris really lost touch with reality.
First they ranted about how “the Linux desktop” needs an ecosystem for proprietary closed-source applications. They claimed that we cannot make good quality software entirely open source, and that open source products and tools won’t be as good as proprietary ones. They apparently decided that the reason certain tools are lacking (a notable example these guys like to bring up: video editors) is that their creators don’t make them proprietary so that they can sell them.
Of course they had nothing to back up their claims but a few random guesses of their own.
Then, after that whole weird segment that seemed to come out of the blue, Bryan announced that he intends to improve the Linux desktop by starting to sell two proprietary tools to the world, to show that it can be done, yada yada.
I mean, this guy has never made any open source or free software contribution of significance. It’s not like he even tried to contribute and make a living off of something related. They decided that others have tried and failed, so he shall not.
The two tools he now sells are minor ones that will prove nothing about whether proprietary programs can survive on the Linux market. If he fails to sell enough to make a living, it just says nobody wanted his niche products badly enough (or that he asks too much money for them), and if he does make a decent living off the products, it is no proof that he couldn’t have made a business case for an open source version.
These are two guys who tend to praise Linux and open source in episode after episode. In my view, the open source world has proven over and over again that it is capable of producing just about anything to a quality that matches or surpasses that of the proprietary closed-source world. And these guys just happen to conclude that this concept doesn’t work at exactly the same time one of them decides it’s time to sell proprietary Linux software?
I say hypocrites.
We finally shipped it!
Corbet mocked us a bit about the very very long time since the previous Rockbox release, but we finally released Rockbox 3.0 and all is fine and dandy now.
gdgt #2 said Rockbox
Ryan and Peter, of Engadget and Gizmodo fame, are now making a new site and podcast series. The latter seems to have climbed the “charts” very rapidly and is apparently a top podcast in the tech sector on iTunes.
Anyway, in the second episode (about 20 minutes in) they made a very brief, non-explanatory reference to Rockbox, about wanting to install it on a SanDisk Sansa e280. They didn’t say much about it at all, but I simply enjoyed that it has reached that level of no-need-to-explain-what-it-is-when-mentioned.
Shared Dictionary Compression over HTTP
Wei-Hsin Lee of Google posted about their effort to create a dictionary-based compression scheme for HTTP. I find the idea rather interesting, and it’ll be fun to see what the actual browser and server vendors will say about this.
The idea is basically to use “cookie rules” (domain, path, port number, max-age etc.) to make sure a client gets a dictionary, after which the server can deliver responses that are diffs computed against the dictionary it previously delivered to that client. For repeated similar contents it should be able to achieve much better compression ratios than any existing HTTP compression in use.
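The win from a shared dictionary is easy to approximate with any DEFLATE implementation that supports preset dictionaries. This is not VCDIFF and not the actual SDCH wire format, just a minimal sketch (using Python’s zlib and a made-up dictionary) of why responses that repeat the same boilerplate shrink dramatically once both sides share it:

```python
import zlib

# Hypothetical shared dictionary: boilerplate that many responses on a
# site have in common. In SDCH the server would serve this once, scoped
# by cookie-like rules (domain, path, max-age, ...).
SHARED_DICT = (b'<html><head><title>Example</title></head><body>'
               b'<div class="header">Welcome to example.com</div>')

def compress(data, zdict=None):
    # A preset dictionary seeds the DEFLATE window, so content already
    # delivered to the client costs only a short back-reference.
    c = zlib.compressobj(zdict=zdict) if zdict else zlib.compressobj()
    return c.compress(data) + c.flush()

def decompress(blob, zdict=None):
    d = zlib.decompressobj(zdict=zdict) if zdict else zlib.decompressobj()
    return d.decompress(blob) + d.flush()

response = SHARED_DICT + b'<p>Only this part is new.</p></body></html>'

plain = compress(response)
with_dict = compress(response, SHARED_DICT)

assert decompress(with_dict, SHARED_DICT) == response
# The dictionary-seeded blob is noticeably smaller than the plain one.
print(len(plain), len(with_dict))
```

The same round trip fails (or rather, stalls waiting for a dictionary) if the receiver lacks the dictionary, which is why the negotiation rules matter as much as the diff algorithm itself.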
I figure it should be seen as a relative to the “Delta encoding in HTTP” idea, although the SDCH idea seems somewhat more generically applicable.
Since they seem to be using the VCDIFF algorithm for SDCH, the recent open-vcdiff announcement of course is interesting too.
A bad move. A really bad move.
So I wrote this little Perl script to perform a lot of repeated binary Rockbox builds. It builds something like 35 targets, zips them up and gives them proper names in a dedicated output directory. Perfect for things such as release builds.
Then I wrote a similar one to build manuals and offer them too. I then made the results available on the Rockbox 3.0RC (release candidate) page of mine.
Cool, me thinks, and since I’ll be away for a week starting Wednesday, I figured I should make the scripts available in case someone else wants to play with them and possibly make a release while I’m gone.
mv buildall.pl webdirectory/buildall.pl.txt
… thinking that I don’t want it to execute as a Perl script on the server, so I rename it to a .txt extension. But did this work? No. Did it cause total havoc? Yes.
First, Apache apparently still treats these files as Perl scripts (i.e. CGI scripts) on my server, even with the additional extension. I really, really didn’t expect this.
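The reason, as far as I understand Apache’s mod_mime: a file can have multiple extensions and each dot-separated segment is looked up independently, so “buildall.pl.txt” still matches whatever handler is registered for .pl. A sketch of the kind of config that would have forced plain-text delivery (assuming a typical AddHandler cgi-script setup; directive placement here is illustrative):

```apache
# mod_mime evaluates every dot-separated extension, so the ".pl" in
# "buildall.pl.txt" still triggers a handler registered like:
#   AddHandler cgi-script .pl
# Overriding the handler for anything ending in .txt serves it as a
# static file instead of executing it:
<FilesMatch "\.txt$">
    SetHandler default-handler
</FilesMatch>
```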
Then, my scripts run a command chain similar to “mkdir dir; cd dir; rm -rf *”. It works great when invoked in the correct directory. It works less fine when the web server invokes it because someone clicked on the file I had just made available to the world.
Recursive deletion of all files the web server user was allowed to erase.
Did I immediately suspect foul play and evil doings by outsiders? Yes. Did it take quite a while to restore the damages from backups? Yes. Did it feel painful to realize that I myself was to blame for this entire incident and not at all any outside or evil perpetrator? Yes yes yes.
But honestly, in the end I felt good that it wasn’t a security hole somewhere that caused it, since I would have hated spending all that time tracking it down and fixing it. And thanks to a very fine backup system, I had most of the site back up and running after roughly one hour of off-line time.
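The general lesson from that “mkdir dir; cd dir; rm -rf *” chain: never let a recursive delete depend on the current working directory. A minimal sketch (in Python, with a made-up directory name, not the actual build script) of the safer pattern of deleting one explicit path instead:

```python
import pathlib
import shutil

def reset_output_dir(path):
    """Recreate an output directory by deleting exactly that tree.

    Unlike "mkdir dir; cd dir; rm -rf *", this never depends on the
    current working directory: if the path is wrong we delete nothing
    (or fail loudly), instead of wiping whatever directory the caller
    happens to be sitting in.
    """
    out = pathlib.Path(path).resolve()
    if out.exists():
        shutil.rmtree(out)      # removes only this specific tree
    out.mkdir(parents=True)
    return out

# Hypothetical usage, mirroring a build script's output directory:
outdir = reset_output_dir("rockbox-builds")
(outdir / "rockbox-3.0-sample.zip").write_bytes(b"")
reset_output_dir("rockbox-builds")  # old contents gone, directory fresh
```

In shell the equivalent discipline is `rm -rf dir && mkdir dir` with an explicit path, or at the very least `cd dir && rm -rf *` so a failed cd aborts the chain.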
Security and Open Source
OWASP Sweden is arranging an event on October 6th in Stockholm, Sweden, to talk about security in the open source process.
I will be there doing a talk about security in open source projects, and in particular how we work with security in the curl project. If you can think of anything in particular you would like me to address or include, feel free to give me a clue before the event!
Your view on libcurl security
As I posted to the curl-library list, I’d be happy to get some feedback from libcurl-users on the security aspects of our project, and how you think we deal with security and how you deal with security in ways related to libcurl.