Filling our pipes

At around 13:43 GMT on Friday the 5th of December 2008, the network that hosts a lot of services like this site, the curl site, the rockbox site, the c-ares site, CVS repositories, mailing lists, my own email and a set of other open source related stuff, became the target of a vicious and intense DDoS attack. The attack was in progress until about 17:00 GMT on Sunday the 7th. The target network is owned and run by CAG Contactor.

Tens of thousands of machines on the internet suddenly started trying to access a single host within the network. The IP address they targeted has in fact never been publicly used in the just under two years we’ve owned it, and it has never had any public services.

We have no clue whatsoever why someone would do this against us. We don’t have any particular services that anyone would gain anything by killing. We’re just very puzzled.

Our “ISP”, the guys we buy bandwidth and related services from, said the attack used up about 1 gigabit/sec worth of bandwidth, and with our “mere” 10 megabit/sec connection it was of course impossible to offer any services while it was going on.

It turns out our ISP made the biggest blunder, and it is the main cause of the length of this outage: we could immediately spot that the target was a single IP in our class C network. We asked them to block all traffic to this IP as far out as possible, to stop such packets from entering their network. And they did. For a short while there was silence and sense again.

For some reason that block “fell off” and our network got swamped again, and it then remained unusable for another 48 hours or so. We know this since our sysadmin guy investigated our firewall logs midday Sunday, and they all revealed that same target IP as destination. Since we only have a during-office-hours support deal with our network guys (as we’re just a consultancy with no services that really need 24-hour support), they simply didn’t care much about our problem but said they would deal with it Monday morning. So our sysadmin shut down our firewall to save our own network from logging overload and whatnot.

Given the explanations I’ve gotten over the phone (I have yet to see and analyze logs from this), it sounds like some sort of SYN flood, with connection attempts against many different TCP ports.
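
A flood like that is easy to spot in a packet capture: count incoming TCP segments that have SYN set but ACK cleared, aimed at the target address. Here is a minimal libpcap sketch of that idea – it is not our actual tooling, and the “eth0” interface and the 192.0.2.1 target address are just placeholders:

    #include <stdio.h>
    #include <pcap.h>

    static long syns;

    /* called for every packet that matches the filter below */
    static void count_syn(u_char *user, const struct pcap_pkthdr *hdr,
                          const u_char *bytes)
    {
      (void)user; (void)hdr; (void)bytes;
      if(++syns % 10000 == 0)
        printf("%ld connection attempts seen\n", syns);
    }

    int main(void)
    {
      char errbuf[PCAP_ERRBUF_SIZE];
      struct bpf_program prog;
      pcap_t *pc = pcap_open_live("eth0", 64, 0, 1000, errbuf);
      if(!pc) {
        fprintf(stderr, "pcap: %s\n", errbuf);
        return 1;
      }
      /* SYN set and ACK clear: pure connection attempts, the signature
         of a SYN flood no matter which destination ports it hits (the
         last argument, the netmask, only matters for "broadcast"
         filters so 0 will do here) */
      if(pcap_compile(pc, &prog,
                      "dst host 192.0.2.1 and "
                      "tcp[tcpflags] & (tcp-syn|tcp-ack) == tcp-syn",
                      1, 0) != 0 ||
         pcap_setfilter(pc, &prog) != 0) {
        fprintf(stderr, "filter: %s\n", pcap_geterr(pc));
        return 1;
      }
      pcap_loop(pc, -1, count_syn, NULL);
      pcap_close(pc);
      return 0;
    }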

4-5 hours after the firewall was shut down, the machines outside of our firewall (but still on our network) suddenly became accessible again. The attack had stopped. We have not seen any traces of it since. The firewall is still shut down though; the first person in the office Monday morning will switch it on again and then – hopefully – all services should be back to normal.

Open source in my day job

From people in the open source community, and especially from friends and fellow hackers in projects I am involved in, I sometimes get questions about how my open source participation affects my “real job”.

I work as a consultant during the day and I do contract development for hire. I’ve been a consultant since 1996, and during this time I’ve participated in more than 25 “full-time” projects for almost as many customers.

I contribute to numerous open source projects and I’ve done it for many years. I lead and maintain several open source projects. I’ve committed many thousands of times to public source code repositories.

Do the contributions make me more attractive to potential customers? Not particularly, is my rather sad experience. While some of my customers notice my track record (my CV does of course mention my most notable contributions), most of my day-job clients focus solely on other paid projects I’ve done and the exact technologies and products I worked with and created in the past. That may not be too strange: things I do and get paid for are demonstrably “good enough for someone to pay me for”, while the stuff I do for free in open source projects is… well, not paid for, and thus can’t be measured by that ruler.

Do I get new assignments thanks to my open source projects? Yes, I do, but they tend to be on the smallish side and not of the bigger kind my regular assignments at work are.

And the reality is of course also that the vast, vast, vaaast majority of all software consultants that people hire to do development have no public record of open source involvement, so it could simply be that this is so rare that customers have never had a reason to learn to use open source contributions as a “factor”.

Or maybe I’m just ignorant and haven’t figured out how my customers truly work.

Do I work with open source in my day job? Yes, almost exclusively. I’ve been working with Linux in various embedded systems for basically the last 8 years, and working with Linux systems pretty much implies a wide range of open source development tools as well.

4 ohloh improvements I’d like

I am a stats junkie so I like my stats in large amounts. But I like the stats to be right and as accurate as possible. When I look at what ohloh produces I like the concepts and ideas in general; I just think their implementation is lacking in a few vital areas that need improvement:

1. There are no dependencies or hierarchies between packages, so the “I use this” counters become worthless since people mark the end-user packages they use. Low-level support packages and libraries that are used indirectly don’t get many “use counts”.

2. Doing very few commits in a very widely used project with few authors gives you way more points than doing a busload of commits in something less used with many fellow contributors. This makes the top-list of people very skewed, as some of the top-64 people only ever did a few hundred commits. I doubt many mortals would consider someone who only ever did 300 commits to be a top community person. At the very moment I write this, the #1 ranked person has done 20 commits during 5 months…!

3. Too few version control systems are supported, leaving out huge chunks of the open source world. Bazaar, Mercurial and a few more are a bit too popular to be ignored without the results getting skewed.

4. I’d like to see the “number of users” of products shown as a percentage, since the total number of users they report includes all contributors to all projects. Out of the 140,000 users (which undoubtedly include a lot of duplicates), it would surprise me if more than 10,000 have actually registered which products they use. I’ve tried to find the exact number but I failed. So 3,000 users doesn’t mean 3,000 out of 140,000 but rather 3,000 out of 10,000…

My Firefox Add-ons

I simply need to have this list somewhere so that I can look up my own add-ons again when I’m running Firefox away from home!

Adblock Plus – since ads are too annoying these days

DownThemAll – because I like to be able to get whole batches of images or similar at times

Fission – just a silly eye-candy thing

Forecastfox – I like weather forecasts!

FoxClocks – helps me keep track of the time my friends around the world have at different moments.

It’s All Text – makes web based editing/posting a more pleasurable experience by allowing me to edit such contents with emacs!

Live HTTP Headers is a must when you want to figure out how to repeat your browser’s actions with a set of curl commands (see the example after this list).

Open in Browser allows me to open more stuff within the browser itself, even when the Content-Type is bad.

Right-Click-Link is great when you quickly want to browse to links you find in plain text sections.

Torbutton lets me quickly switch to anonymous browsing.

User Agent Switcher lets me trick stupid server-side scripts into believing I use a different browser or even operating system.
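
Speaking of Live HTTP Headers: once it has shown you the request headers your browser sent, you can hand them more or less verbatim to curl. A made-up example – the URL and the header values are of course just placeholders:

    curl --header "User-Agent: Mozilla/5.0 (X11; Linux) Firefox/3.0" \
         --header "Referer: http://example.com/start" \
         --header "Cookie: session=abc123" \
         http://example.com/page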

What great add-ons did I miss?

(Some nitpickers would say that I don’t run Firefox since I use Debian, where it is called Iceweasel. While that is entirely true, Iceweasel is still built from the Firefox source code, and the add-ons are in fact still Firefox add-ons even if they also run perfectly fine on Iceweasel.)

Fujifilm FinePix F100fd

Ok, I bought myself a Fujifilm FinePix F100fd camera the other day, as it fulfilled my requirements pretty well:

1. It’s compact, noticeably smaller than my previous Sony one.

2. While not a 3″ LCD, it features a 2.7″ one, which is a tiny bit larger than the 2.5″ on my previous camera.

3. Image Stabilizer. And in my test shots it seems to make a difference. I’ll admit I haven’t yet played a lot with it on and off, but especially when zooming it seems to do some good.

4. Good low-light images. Yes, it does deliver. I’ve so far seen it go down to ISO1600 on auto, and while those aren’t the best pictures, using flash is certainly not a good way to achieve great pics either (in general).

5. It accepts SDHC cards. I put a 4GB one in to start with, as it costs virtually nothing. My previous camera had 512MB, so this is still 8 times the size. Of course, my Sony was 5 megapixels and this one does 12, so it will produce larger image files.

Possibly I’ll try to make some comparison pictures with my old and my new cameras later on.

Snooping on government HTTPS

As was reported by some Swedish bloggers, and as I found out thanks to kryptoblog, it seems the members of the Swedish parliament all access the internet via an HTTP proxy. And not only that: they access HTTPS sites using the same proxy, and while a lot of the netizens of the world do this, the members of the Swedish parliament have an IT department that is more big-brotherish than most. They decided they “needed” to snoop on the network traffic even for HTTPS connections – and how do you accomplish this, you may ask?

Simple! The proxy terminates the SSL connection, fetches the remote HTTPS document itself, and generates at run-time a “faked” SSL cert for the peer, signed by a CA that the client trusts, which it then delivers to the client. This does require that the client has a CA cert installed locally that makes it trust certificates signed by the “faked” CA, but I figure the parliament’s IT department “helps” its users set this up.
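
For what it’s worth, a client can actually detect this kind of interception by checking who signed the certificate it was handed. Here is a little sketch using libcurl 7.19.1 or later built against OpenSSL (the CURLOPT_CERTINFO option) – the URL is a placeholder, and on an intercepted connection the Issuer lines would name the proxy’s in-house CA instead of the site’s real one:

    #include <stdio.h>
    #include <string.h>
    #include <curl/curl.h>

    int main(void)
    {
      CURL *curl;
      curl_global_init(CURL_GLOBAL_DEFAULT);
      curl = curl_easy_init();
      if(curl) {
        struct curl_certinfo *ci;
        curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/");
        curl_easy_setopt(curl, CURLOPT_NOBODY, 1L);   /* headers only */
        curl_easy_setopt(curl, CURLOPT_CERTINFO, 1L); /* collect cert data */
        if(curl_easy_perform(curl) == CURLE_OK &&
           curl_easy_getinfo(curl, CURLINFO_CERTINFO, &ci) == CURLE_OK) {
          int i;
          /* each certificate in the chain is a list of "name:value"
             strings; print the issuer of every one of them */
          for(i = 0; i < ci->num_of_certs; i++) {
            struct curl_slist *line;
            for(line = ci->certinfo[i]; line; line = line->next)
              if(!strncmp(line->data, "Issuer:", 7))
                printf("cert %d %s\n", i, line->data);
          }
        }
        curl_easy_cleanup(curl);
      }
      curl_global_cleanup();
      return 0;
    }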

Not only does this let every IT admin there snoop on user names, passwords and the rest, it also opens the door wide for man-in-the-middle attacks: I assume the users will be allowed to go to HTTPS sites using self-signed certificates, but since every cert they see is re-signed by the proxy, they won’t even be able to tell!

The motivation for this weird and intrusive idea seems to be that they want to scan the traffic for viruses and other malware.

If I were a member of the Swedish parliament I would be really upset: I would uninstall the custom CA and seriously consider accessing the internet over an ssh tunnel or similar. But somehow I doubt that many of them care, and the rest won’t be capable of taking counter-measures against this.

Solaris 10 ships libcurl

I stumbled over this document named “What’s New in the Solaris 10 10/08 Release” and it includes this funny little quote towards the end:

C-URL – The C-URL Wrappers Library

C-URL is a utility library that provides programmatic access to the most common Internet protocols such as, HTTP, FTP, TFTP, SFTP, and TELNET. C-URL is also extensively used in various applications.

The project is cURL, the tool is curl and the library is libcurl. There’s nothing named C-URL, and it isn’t any “wrappers library”… The list of protocols is also funny: it includes 6 protocols while a modern libcurl supports 13 different ones, and if you build libcurl to support SFTP you also get SCP (which the list doesn’t include), etc.
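
For the record, the easiest way to get that list right is to ask libcurl itself. A tiny program like this prints exactly which protocols the installed version supports:

    #include <stdio.h>
    #include <curl/curl.h>

    int main(void)
    {
      /* ask the running libcurl what it was built to support */
      curl_version_info_data *info = curl_version_info(CURLVERSION_NOW);
      const char * const *proto;
      printf("libcurl %s supports:", info->version);
      for(proto = info->protocols; *proto; proto++)
        printf(" %s", *proto);
      printf("\n");
      return 0;
    }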

It just looks so very sloppy to me. But hey, what do I know?