Category Archives: Technology

Really everything related to technology

IRC use is declining

I discovered IRC around 1993.

Back then, before EFnet split in two, the IRC channel I frequented was #amiga and we were a small bunch of people from all over the globe who got to know each other pretty well. In the 90s I participated in one of my first open source projects and we created the IRC bot we named Dancer. Dancer was a really talented “defence bot” back in the days of the “wild west” of IRC, when channel takeovers, flood attacks and nick collisions were widespread and frequent. Dancer helped us keep things calm. Later on, I was part of the team that created and set up the new IRC network called amiganet.

I’ve been using IRC on and off since those days in the early 1990s, and I still hang out in 5-6 channels on freenode every day.

IRC was launched to the world already in 1988, almost 23 years ago. I’ve been trying to document the basic history of IRC, and when I updated that page the other day with some usage numbers for freenode, I decided to look around the net for general numbers on IRC usage at large. I found that usage is decreasing everywhere, and has been for years. Without having done any research on it, I figure IRC users are either old farts like myself or at least very tech oriented and geeky. Younger, newer and less techy people use other means of communication.

IRC never “took off” among the general public. I find that most people prefer various IM systems (something I’ve never understood or adopted myself), and most “ordinary” humans I know don’t even know what IRC is. Possible explanations include the fact that the IRC protocol never got very good (there’s only that original spec from ’93), that there are a million completely separate IRC networks with no cross-network messaging, and that all IRC networks still today suffer from netsplits and other artifacts due to deficiencies in how the IRC servers talk to each other.

5-6 years ago, the four most popular networks each regularly had over 100,000 users, and Quakenet was well over 200,000. Last year, only Quakenet reached over 100,000. Basically all of them now have roughly half the numbers they had in 2004.

Graphs from irc.netsplit.de:

[Graph: IRC usage 2004]

[Graph: IRC usage 2010]

Email asking for my products

In my mini-series of strange mails I receive, here’s another one:

Subject: Product Request

Hello,
I am interested in purchasing some of your products, I will like to know
if youcan ship directly to SPAIN , I also want you to know my mode of
payment for this order is via Credit Card. Get back to me if you can ship
to that destination and also if you accept the payment type I indicated.
Kindly return this email with your price list of your products..

I assume I’ll never figure out what products he speaks of, or how on earth he ended up sending me this… I’ll admit I was tempted to make up some “interesting” products to offer.

Update: I was informed that this is probably “just” another online fraud attempt. How boring.

Haxxup – cheap remote backup

Pains and guilty consciences over lacking backup routines are widely common. I honestly don’t know anyone (and I mean it) who can say with a straight face that they have their (home, private) backups covered. We all know we should back up both locally and remotely: locally so that we can do fast recovery of the things we mistakenly remove or ruin, and remotely in case we get burgled or the house burns down.

The importance of private computer backups has only increased over time, as these days most of us also store vast amounts of family pictures and videos, things that in the old days were stored (and lost) separately.

A growing problem with remote backups is of course that we all have ridiculous amounts of data to back up. Getting a commercial remote backup deal for, say, 300GB (and growing) isn’t cheap. And we’re also very often at a loss when it comes to getting a solution that works on Linux.

At Haxx, we also recognized and suffered from these problems. We came up with a scheme to set up a distributed networked backup among ourselves! Getting large hard drives for local use is fairly cheap. We all have fairly good fixed-fee no-bandwidth-limit internet connections (although admittedly the uplink speeds are lacking for us typical ADSL users).

We decided that among the four of us, each gets an account on two of the others’ servers, to which we can upload our backups at our own pace and store whatever we want. We went with two places for everyone to decrease the risk even further, for example if you urgently need to get something back while one of us has a network problem (not completely unheard of) or something else.

My current total backup is about 100GB and I have a 1mbit uplink. If I use the entire bandwidth for this, other things get a little sluggish, so I’ve capped the rsync job at 90KB/sec… My first full run thus completed in roughly 13 days. Luckily I don’t add content at a very high pace, so the ordinary sync jobs from then on should be much smaller and should complete within hours. As long as I add less than ~3.5GB during a 24 hour period, the setup should be able to keep both remote places in sync.
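The arithmetic checks out: 90KB/sec amounts to roughly 7.8GB per day (90 × 86,400 seconds ≈ 7,776,000 KB), so a 100GB initial upload takes about 100/7.8 ≈ 13 days. Once the data goes to two remote places, that daily budget is split in half, leaving a little under 4GB per day per destination, hence the ~3.5GB margin. For what it’s worth, the cap itself is built into rsync: its --bwlimit option takes the limit in KB/sec, so something like “rsync -az --bwlimit=90 …” (the exact source and destination arguments of course depend on your setup) is all it takes.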

localhost hack on Windows

There's no place like 127.0.0.1

Readers of my blog and friends in general know that I’m not really a Windows guy. I never use it and I never develop things explicitly for Windows, but I do my best to make sure my portable code also builds and runs on Windows. This blog post is about a new detail that I’ve just learned and that I think I can help shed some light on, to help my fellow hackers.

The other day I was contacted by a libcurl user who noticed that on Windows, when he wanted to transfer data from the loopback device (where he had a service of his own) and accessed it using “localhost” in the URL passed to libcurl, he would spot a DNS request for the address of that host name, while regular Windows tools showed no such request. After some mails back and forth, the details became clear:

Windows has a default /etc/hosts equivalent (conveniently put at “c:\WINDOWS\system32\drivers\etc\hosts” instead), and that default hosts file used to have an entry for “localhost” in it, pointing to 127.0.0.1.

When Windows 7 was released, Microsoft had removed the localhost entry from that hosts file. According to sources on the net, it might be related to them supporting IPv6 for real, but it’s not at all clear what the connection between those two changes would be.

getaddrinfo() on Windows has since then (it is unclear exactly when this started) been made to know about the specific string “localhost”, and is documented to always return “all loopback addresses on the local computer” for it.

So a custom resolver such as c-ares, which doesn’t use Windows’ functions to resolve names but does it all by itself and has been made to look in the /etc/hosts file, now suddenly no longer finds “localhost” in a local file but ends up asking the DNS server for info about it… a situation that is far from ideal. Most DNS servers won’t have an entry for it, and others might simply provide the wrong address.

I think we’ll have to give in and provide this hack in c-ares as well, just the way Windows itself does.
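To illustrate the idea, here’s a minimal sketch in C of what such a special-case can look like. This is my own illustration and not c-ares’ actual code: the resolve() function and its printouts are made up, and a real resolver would of course build proper address structures for both families rather than print them.

#include <stdio.h>
#include <strings.h> /* strcasecmp (POSIX; _stricmp on Windows) */

/* hypothetical resolver entry point, NOT c-ares' real code */
static void resolve(const char *name)
{
  /* short-circuit "localhost" before any network traffic, the way
     Windows' getaddrinfo() is documented to behave */
  if(strcasecmp(name, "localhost") == 0) {
    printf("%s -> 127.0.0.1 (IPv4 loopback)\n", name);
    printf("%s -> ::1 (IPv6 loopback)\n", name);
    return;
  }

  /* otherwise fall back to the hosts file and then the configured
     DNS servers, as usual */
  printf("%s -> normal hosts-file/DNS lookup\n", name);
}

int main(void)
{
  resolve("localhost");   /* never hits the network */
  resolve("example.com"); /* resolved the ordinary way */
  return 0;
}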

Oh, and as a bonus there’s an additional hack mentioned in the getaddrinfo docs: on Windows Server 2003 and later, if the pNodeName parameter points to a string equal to “..localmachine”, all registered addresses on the local computer are returned.

Fosdem 2011: my libcurl talk on video

Kai Engert was good enough to capture all the talks in the security devroom at Fosdem 2011, and while I’m seeding the full torrent I’ve made my own talk available as a direct download from here:

Fosdem 2011: security-room at 14:15 by Daniel Stenberg

The file is about 107MB, 640×480 resolution and roughly 26 minutes of playing time, in WebM format.

News flash! Tech terms used almost correctly!

Ok, The Social Network isn’t a new movie by any means at this point, but I happened to see it the other day. I’ll leave aside the entire story and whatever facts it did or didn’t portray correctly.

But I did spot several basic technical terms in the beginning of the movie that struck me as used amazingly correctly! The movie character Mark actually used wget to download images (at about 10:05 into the movie), and as you can see in my first screenshot, the initial keystrokes we get to see on the command line actually resemble a correct wget command line. You can click on these images to get slightly larger versions of the pics. I’m sorry I couldn’t get any higher quality ones, but I figure the point is still the same!

[Screenshot: the wget command line in The Social Network]

After having invoked wget, he gets lots of pictures downloaded, as is explained, and what do you know, the screen output actually looks like it could’ve been wget that downloaded a couple of files:

[Screenshot: the wget output in The Social Network]

He also mentioned the terms ‘Apache’, ’emacs’ and ‘perl scripts’ in complete and correct sentences.

Where is the world heading?!

Update: Hrvoje Niksic, the author of wget, helped out with some additional observations:

The options looked right to me, something like -r -A.jpg …

I was wondering about the historical accuracy of the progress bar, but it checks out. The movie takes place about a year and a half after the release of Wget 1.8, which added the feature. The department that takes care of these things did a good job. 🙂

Cookies and Websockets and HTTP headers

So yesterday we held a little HTTP-related event in Stockholm, arranged by OWASP Sweden. We talked a bit about cookies, websockets and recent HTTP headers.

Below are all the slides from the presentations that I, Martin Holst Swende and John Wilander did. (The entire event was held in Swedish.)

Martin Holst Swende’s talk:

John Wilander’s slides from his talk are here:

Back in the printing game

[Picture: HP Officejet 8500A]

After my printer died, I immediately ordered a new one online, and not long afterwards I could pick it up from my local post office. As I use both the scanner and the printer features a lot, I went with another “all-in-one” model, and I chose an HP model (again) basically because I was happy with how my previous one worked (before its death). “HP Officejet Pro 8500 A910” seems to be the full name. And yeah, it really is as black as the picture here shows.

This model is less “photo-focused” than my previous one, but I never print my own photos so that’s no loss. What did annoy me, however, is that this model uses 4 ink cartridges instead of the 6 in my previous one, and of a completely different design, so I can’t even re-use the half-full ink containers from the corpse!

My new printer has some fancy features. One of them is that it can be given an email address, so that I can print on it by sending email to it. The address it gets is a really long one with lots of seemingly random letters, in the hp.com domain, and I can set up a whitelist of people (From: addresses) who are allowed to print on it via email.

It also has full internet access itself so it could fetch a firmware upgrade file and install that entirely on its own without the use of a computer. (Which made me wonder if they use libcurl, but I realize there’s no way for me to tell and of course there are many alternatives they might use.)

Driver-wise, it seems to be a completely different set for Windows (hopefully this one won’t uninstall itself), and on Linux I could install it fine for printing, but xsane just won’t find it for scanning. I intend to try the printer’s web service for scanning instead; hopefully that will be roughly equivalent for my limited use – I mostly scan documents, bills and invoices for my work.

NASed and RAID1ed

[Picture: Synology DS211j]

I finally got my act together and bought myself a Synology DS211j NAS with two 2TB drives. I’ll use it as a shared network disk at home and I intend to back up to it – as I also have a home office, it’ll feel better to have company related data backed up somewhat more safely too. My previous backup only copied data from one HDD to another within the same physical machine.

To make it slightly more resistant to disk and hardware errors, I’ve configured the disks in RAID1 (all data is stored on both disks simultaneously).

The GPLv2 license was provided printed on paper in the package.

HTTP security, websockets and more


Together with friends in OWASP, I’m happy to mention that we will do an event on January 31st on the topic “HTTP security, websockets and more”, where I’ll be one of the speakers. We start at 17:30. The exact location is not decided yet and will depend a bit on popularity, but it will be in Stockholm, Sweden.

The other two speakers at the event, apart from myself, are John Wilander and Martin Holst Swende. My part of the session will be about the WebSockets protocol, the upcoming cookie RFC and some bits about the ongoing HTTPbis work.

Sign up to attend; registration is only open for one week.

Omegapoint will sponsor with something to eat and drink, and we do plan to go out and grab a beer afterwards and continue the discussion.

See you!