Category Archives: Security

A bad move. A really bad move.

So I wrote this little perl script to perform a lot of repeated binary Rockbox builds. It produces something like 35 different builds, zips them up and gives them proper names in a dedicated output directory. Perfect for things such as release builds.

Then I wrote a similar one to build the manuals and offer them too. I then made the results available on my Rockbox 3.0RC (release candidate) page.

Cool, methinks, and since I’ll be away for a week starting Wednesday, I figured I should make the scripts available in case someone else wants to play with them and possibly make a release while I’m gone.

I did

mv buildall.pl webdirectory/buildall.pl.txt

… thinking that I didn’t want it to try to execute as a perl script on the server, so I renamed it with a .txt extension. But did this work? No. Did it cause total havoc? Yes.

First, Apache apparently still treats these files as perl scripts (== CGI scripts) on my server, even though they got an additional extension: mod_mime considers every extension in a file name, so the .pl still maps the file to the CGI handler no matter what I append after it. I really really didn’t expect this.

Then, my scripts run a command chain similar to “mkdir dir; cd dir; rm -rf *”. It works great when invoked in the correct directory. It works far less well when the web server invokes it because someone clicked on the file I had just made available to the world.

Recursive deletion of all files the web server user was allowed to erase.
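In hindsight the failure mode is easy to sketch. Here’s a minimal illustration (not my actual script) of why that kind of chain is dangerous, and a version that refuses to run the rm unless the chdir actually succeeded:

  #!/usr/bin/perl
  use strict;
  use warnings;

  # Dangerous: in a single shell string a failed 'cd' does not stop the
  # chain, so 'rm -rf *' runs in whatever directory the process started
  # in -- such as the web root, when Apache runs the script as CGI.
  # system("mkdir dir; cd dir; rm -rf *");

  # Safer: verify each step and stop on failure.
  mkdir("dir") unless -d "dir";
  chdir("dir") or die "chdir failed: $!";  # dies instead of deleting elsewhere
  system("rm -rf ./*");                    # now guaranteed to run inside dir/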

Did I immediately suspect foul play and evil doings by outsiders? Yes. Did it take quite a while to restore the damage from backups? Yes. Did it feel painful to realize that I myself was to blame for the entire incident, and not any outside or evil perpetrator? Yes yes yes.

But honestly, in the end I felt good that it wasn’t a security hole somewhere that caused it, since I hate spending all that time tracking one down and fixing it. And thanks to a very fine backup system, I had most of the site back up and running after roughly one hour of downtime.


Security and Open Source

OWASP Sweden is arranging an event on October 6th in Stockholm, Sweden, to talk about security in the open source process.

I will be there doing a talk about security in open source projects, and in particular how we work with security in the curl project. If you can think of anything in particular you would like me to address or include, feel free to give me a clue before the event!

Getting cacerts for your tools

As the primary curl author, I find the comments here interesting. That blog entry, “Teaching wget About Root Certificates”, is about how you can get cacerts for wget by downloading them from curl’s web site, and people quickly point out how getting cacerts from an untrusted third party is of course an ideal setup for an MITM “attack”.

Of course you can’t trust any files off an HTTP site, or an HTTPS site without a “trusted” certificate, but thinking that the curl project would run one of those just to let random people load PEM files from our site seems a bit weird. Thus, we also provide the scripts we do all this with, so that you can run them yourself on whatever input data you need, preferably something you trust. The more paranoid you are, the harder that gets, of course.
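The script I have in mind here is the mk-ca-bundle.pl tool from the curl source tree, which downloads Mozilla’s certdata.txt and converts it into a PEM bundle. Running it yourself amounts to something like this (feed it input you trust if you don’t want to fetch over the network):

  # produce a ca-bundle.crt PEM file from Mozilla's certdata.txt
  ./mk-ca-bundle.pl ca-bundle.crt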

On Fedora, curl does come with ca certs (at least I’m told recent Fedoras do), and even if it doesn’t, you can point curl at whatever cacert file you like. Since most default installs of curl use OpenSSL, like wget does, you can tell curl to use the same cacert your wget install uses.
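For example (the bundle path here is just an illustrative Fedora-style one; use wherever your system actually keeps it):

  # point curl at an explicit CA bundle file
  curl --cacert /etc/pki/tls/certs/ca-bundle.crt https://example.com/

  # wget's equivalent option, using the very same file
  wget --ca-certificate=/etc/pki/tls/certs/ca-bundle.crt https://example.com/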

This last bit gets a little more complicated when one of the two is compiled with an SSL library that doesn’t easily support PEM (read: NSS), but in the case of curl in recent Fedora, they build it with NSS plus an additional patch that still allows it to read PEM files.

Will 2008 become 1984?

Next week in Sweden (June 18th), as reported lately in several places including slashdot, the Swedish parliament is supposed to vote on a pretty far-reaching law allowing FRA (a Swedish defence organization previously involved in radio surveillance etc) to wire-tap phone calls and computer traffic that cross the Swedish border. The majority in the parliament is for the law, while it seems most ordinary people are against it. The hope is now that a few members will vote against their parties, that they will have the guts to stand up and “do the right thing” instead of following the party line.

I won’t go into how silly, stupid and bad such a law is, but I’ll instead just show this great video to all Swedes:

(video snipped from here)

Stoppa FRA-lagen nu (“stop the FRA law now”)

This banner says (roughly translated by me) “On June 18th the government will take away your personal integrity. All internet traffic, all phone calls, all email and SMS traffic will be wire-tapped starting January 1st 2009. Big brother sees you! … and violates the Swedish Constitution.”

public suffixes list

I noticed the new site publicsuffix.org, set up by the Mozilla organization in an attempt to list the public suffixes of all TLDs in the world; basically, to know how to prevent sites from setting cookies that would span just about all sites under such a “public suffix”.
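To illustrate the idea, here’s a minimal sketch of the check such a list enables (not curl’s or any browser’s actual implementation, and with a toy three-entry list standing in for the real thing):

  #!/usr/bin/perl
  use strict;
  use warnings;

  # toy stand-in for the real publicsuffix.org list
  my %public_suffix = map { $_ => 1 } qw(com co.uk org);

  # reject a cookie whose Domain attribute is itself a public suffix,
  # since such a cookie would get sent to every site under that suffix
  sub cookie_domain_ok {
      my ($domain) = @_;
      $domain =~ s/^\.//;                  # ".co.uk" -> "co.uk"
      return !$public_suffix{lc $domain};
  }

  print cookie_domain_ok("example.co.uk") ? "ok\n" : "rejected\n";  # ok
  print cookie_domain_ok(".co.uk")        ? "ok\n" : "rejected\n";  # rejected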

While I can see what drives this effort, and since we have the same underlying problem in curl, I have sympathy for it. Still, I dread “having to” import and support this entire list in curl only to make it work more like the browsers in the cookie department. It also feels like a cat-and-mouse race where the list may never be complete anyway. It is doomed to lack entries, or in the worst case to list “public suffixes” that aren’t public suffixes anymore, and thus prevent sites under such a suffix from properly using cookies…

There’s no word on the site on whether IE or Opera etc are going to join this effort.

Update: there are several people expressing doubts about the virtues of this idea. Like Patrik Fältström on DNSOP.

Coverity’s open source bug report

The great guys at scan.coverity.com published their Open Source Report 2008, in which they detail findings from the source code they’ve monitored and how quality, bug density etc have changed over time since they started scanning over 250 popular open source projects. curl is one of the projects included.

Some highlights from the report:

  • curl is mentioned as one of the (few) projects that fixed all defects identified by Coverity
  • since they started, the average defect frequency has gone down from one defect per 3,333 lines of code to one per 4,000 lines (i.e. from 0.30 to 0.25 defects per thousand lines)
  • they find no support to back up the old belief that there’s a correlation between function length and bug count
  • the average function length is 66 lines

And the top-5 most frequently detected defects are:

  1. NULL Pointer Dereference 28%
  2. Resource Leak 26%
  3. Unintentional Ignored Expressions 10%
  4. Use Before Test (NULL) 8%
  5. Buffer Overrun (statically allocated) 6%

For all the details and more fun reading, see the full Open Source Report 2008 (1 MB PDF).

Taking down P2P botnets

Five German/French researchers wrote up this very interesting document (9-page PDF!) called “Measurements and Mitigation of Peer-to-Peer-based Botnets: A Case Study on StormWorm” about one of the biggest and most persistent botnets out in the wild: Storm. It is used for spam and DDoS attacks, has up to 40,000 daily peers, and the country hosting the largest number of bots is the USA.

Anyway, their account of how the botnet works, how it infects new clients, and how the researchers worked to infiltrate and disrupt its communication is a good read.

Bad guys reveal other bad guys

In Sweden we currently have an interesting situation where a hacking group called “Hackare utan gränser” (“Hackers Without Borders” would probably be the proper translation) hacked one of those auction sites where the lowest unique bid wins. The site in question is called bideazy, and according to the hacker group’s announcement (forum posting and following discussion entirely in Swedish), the site’s database is full of evidence that the bidding has not been run correctly, and it seems to show that the site and company owner has won a large share of all the “auctions”.

And they also made most of that data publicly available.

This raises many questions in my brain, including:

First, of course, there’s the evident debate over whether one crime (the hacking) can be justified to reveal another (the scam). But what I think is more important: aren’t auction sites, and especially the lowest-unique-bid kind, more or less designed to make it easy for the sites themselves to scam the users? It is very, very hard for someone on the outside to see whether things are done the right way and all the rules are followed. Heck, even a little tweak here and there would make a huge impact for the site but wouldn’t be visible to the public.

I also find it a bit funny that in this case they seem to have stored the scam data neatly and properly in their database, which the hackers then found, and I really can’t figure out why. If they wanted a database to show as a front if someone were to ask and accuse them of cheating, this wouldn’t be the one. And since they really seem to be cheaters, why would they need to store and keep track of all the cheats in a huge database?