Category Archives: Open Source

Open Source, Free Software, and similar

DFU mode on 2nd gen Nanos

Some clever hackers in the Rockbox community wrote up a tool to access the Meizu players’ DFU mode (while running Linux – which I already mentioned), and using this we can upload and run code on several Meizu targets. The code is uploaded to and executed from SDRAM only, which makes it a perfect way to test new code on these players.

The Meizu players have their SoC in common with Apple’s Nano 2nd gen and Shuffle 2nd gen.

There are indications that the Nanos have such a DFU mode as well, even though we don’t currently know of any way to trigger it at will. Possibly shorting the NAND chip, destroying the firmware or something similar might do it.

If you have such a broken Nano or Shuffle, please get in touch and we can do some poking around!

Obviously, there’s a DFU mode on the iPhone and iPod touch that can be triggered:

“Your phone must be off, but attached via USB to the PC. Then you hold the power and “home” buttons for 10 seconds. At the ten second mark, you release the power button, but keep the “home” button pressed for another 10 seconds. At the end of that process, the phone enters DFU mode (the only way to tell is that Windows will tell you a USB DFU device has connected).” (thanks to GodEater)

I’m convinced, though, that our limited DFU experiments will not be a lot of fun on those devices (yet).

It seems iPod Classics can also go into this mode.

For the iPod Nano 2nd gen:

“To access DFU mode, reset the iPod with MENU+SELECT, then press and hold BACK+PLAY. A picture of the dock connector should appear with the Apple support URL; according to lsusb, this is DFU mode…  it seems that you have to first trash the firmware before you can access it.” (thanks to LambdaCalculus37)

Since autumn 2009, Rockbox boots and runs on the iPod Nano 2nd generation!

Not so public file with GPL license header

Here’s a license dilemma for you:

Imagine company X hosting a tarball on their public web server. There’s no publicly available link to this tarball, but if you access the URL with your browser or download tool, you can download it with no restrictions from anywhere in the world.

The tarball contains GPL code. That is, the code in question has GPL license headers (in addition to Copyright (C) by Company X notices).

If you get your hands on said code, is it to be considered GPL and thus valid to be used by a GPL-compatible open source project?

Arguments against this include that the tarball, while accessible, may not actually have been meant for distribution, and thus the license header may not be the one ultimately intended for the code.

What if someone published the link on a totally unrelated site, said “get the code [here]” and linked to the code mentioned above? Wouldn’t that cause at least some people to get the code in good faith, and wouldn’t the GPL then apply?

(Any resemblance to a real-life scenario is purely coincidental. Names have been changed to protect the innocent.)

curl and libcurl 7.19.0

With almost 40 described bug fixes, curl and libcurl 7.19.0 come flying with a range of new things, including the following:

  • curl_off_t gets its size/typedef somewhat differently than before. This may cause an ABI change for you. See lib/README.curl_off_t for a full explanation.
  • Added CURLINFO_PRIMARY_IP
  • Added CURLOPT_CRLFILE and CURLE_SSL_CRL_BADFILE
  • Added CURLOPT_ISSUERCERT and CURLE_SSL_ISSUER_ERROR
  • curl’s option parser for boolean options reworked
  • Added --remote-name-all
  • Now builds for the INTEGRITY operating system
  • Added CURLINFO_APPCONNECT_TIME
  • Added test selection by key word in runtests.pl
  • the curl tool’s -w option supports the %{ssl_verify_result} variable
  • Added CURLOPT_ADDRESS_SCOPE and scope parsing of the URL according to RFC4007
  • Support --append on SFTP uploads (not with OpenSSH, though)
  • Added curlbuild.h and curlrules.h to the external library interface

We’ve worked really hard to make this a solid and fine release. I hope it shows.
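
To give a feel for a couple of the additions, here is a minimal libcurl sketch (the URL is just a placeholder) that does a transfer and then asks for the new CURLINFO_PRIMARY_IP and CURLINFO_APPCONNECT_TIME values:

    #include <stdio.h>
    #include <curl/curl.h>

    int main(void)
    {
      CURL *curl = curl_easy_init();
      if(curl) {
        CURLcode res;
        char *ip = NULL;
        double appconnect = 0.0;

        curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/");

        res = curl_easy_perform(curl);
        if(res == CURLE_OK) {
          /* IP address of the most recent connection (new in 7.19.0) */
          curl_easy_getinfo(curl, CURLINFO_PRIMARY_IP, &ip);

          /* time until the SSL/SSH handshake completed (new in 7.19.0) */
          curl_easy_getinfo(curl, CURLINFO_APPCONNECT_TIME, &appconnect);

          printf("connected to %s, handshake done after %.3f seconds\n",
                 ip, appconnect);
        }
        curl_easy_cleanup(curl);
      }
      return 0;
    }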

Getting cacerts for your tools

As the primary curl author, I find the comments here interesting. That blog entry, “Teaching wget About Root Certificates”, is about how you can get cacerts for wget by downloading them from curl’s web site, and people quickly point out that getting cacerts from an untrusted third party is of course an ideal situation for an MITM “attack”.

Of course you can’t trust any files off an HTTP site, or off an HTTPS site without a “trusted” certificate, but thinking that the curl project would run one of those just to let random people load PEM files from our site seems a bit weird. Thus, we also provide the scripts we do all this with, so that you can run them yourself with whatever input data you need, preferably something you trust. The more paranoid you are, the harder that gets, of course.

On Fedora, curl does come with CA certs (at least I’m told recent Fedoras do), and even if it doesn’t, you can point curl to use whatever cacert file you like. Since most default installs of curl use OpenSSL, just like wget does, you could tell curl to use the same cacert your wget install uses.

This last thing gets a little more complicated when one of the two is compiled with an SSL library that doesn’t easily support PEM (read: NSS), but in the case of curl in recent Fedora, they build it with NSS plus an additional patch that lets it still read PEM files.
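
Getting back to the point-curl-at-a-file part: for libcurl-using applications that is a single option. Here is a rough sketch of the idea; the URL and the bundle path are just examples, use whatever file you actually trust. The command line tool has the same thing in its --cacert option.

    #include <stdio.h>
    #include <curl/curl.h>

    int main(void)
    {
      CURL *curl = curl_easy_init();
      if(curl) {
        CURLcode res;

        curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/");

        /* use this PEM bundle to verify the peer - it could for instance be
           the very same file your wget install already uses */
        curl_easy_setopt(curl, CURLOPT_CAINFO, "/etc/ssl/certs/ca-bundle.crt");

        res = curl_easy_perform(curl);
        if(res != CURLE_OK)
          fprintf(stderr, "transfer failed: %s\n", curl_easy_strerror(res));

        curl_easy_cleanup(curl);
      }
      return 0;
    }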

c-ares 1.5.3

I’m happy to announce the release of c-ares 1.5.3. c-ares is an asynchronous name resolver and somewhat generic DNS library with a liberal MIT-style license.

The news this time includes:

  • fix adig sample application compilation failure on some systems
  • fix pkg-config reporting of private libraries needed for static linking
  • fallback to gettimeofday when monotonic clock is unavailable at run-time
  • ares_gethostbyname() fallback from AAAA to A records with CNAME present
  • allow --enable-largefile and --disable-largefile configurations
  • configure process no longer needs or checks the size of curl_off_t
  • library will now be built with _REENTRANT symbol defined if needed
  • Improved configure detection of number of arguments for getservbyport_r
  • Improved query-ID randomness
  • Validate that DNS response address matches the request address
  • fix acountry sample application compilation failure on some systems

I’m also happy to see that the development version of Wireshark is currently using c-ares.
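
For anyone who hasn’t tried the library yet, this is roughly what a single asynchronous lookup looks like, as a bare-bones sketch: the host name is just a placeholder and error handling is kept to a minimum.

    #include <stdio.h>
    #include <sys/select.h>
    #include <sys/socket.h>
    #include <netdb.h>
    #include <ares.h>

    /* called when the lookup completes (or fails) */
    static void host_cb(void *arg, int status, int timeouts,
                        struct hostent *host)
    {
      (void)arg;
      (void)timeouts;
      if(status != ARES_SUCCESS)
        fprintf(stderr, "lookup failed: %s\n", ares_strerror(status));
      else
        printf("resolved %s\n", host->h_name);
    }

    int main(void)
    {
      ares_channel channel;
      if(ares_init(&channel) != ARES_SUCCESS)
        return 1;

      /* fire off the query - this returns immediately */
      ares_gethostbyname(channel, "www.example.com", AF_INET, host_cb, NULL);

      /* drive the resolver until all outstanding queries are done */
      for(;;) {
        fd_set read_fds, write_fds;
        struct timeval tv, *tvp;
        int nfds;

        FD_ZERO(&read_fds);
        FD_ZERO(&write_fds);
        nfds = ares_fds(channel, &read_fds, &write_fds);
        if(nfds == 0)
          break;
        tvp = ares_timeout(channel, NULL, &tv);
        select(nfds, &read_fds, &write_fds, NULL, tvp);
        ares_process(channel, &read_fds, &write_fds);
      }

      ares_destroy(channel);
      return 0;
    }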

If you’re a graphics person, we’d appreciate some kind of logo/symbol thing for the project!

Good port day

Things happen in bursts. Development goes on and on for long periods without any noticeable big breakthroughs, and then all of a sudden a lot happens at once. And those days are the best days!

Rockbox now works somewhat on the iAudio 7, with a new patch posted today!

Rockbox now almost runs on the Creative Zen Vision:M, and the guys can at least now install the bootloader, and have it load and start, without having to rip out the hard drive and put it into a PC first!

We can now install new firmware on the M6 players when running Linux, thanks to new tools being developed!

FTP vs HTTP, really!

Since I’m doing my share of both FTP and HTTP hacking in the curl project, I quite often see, and sometimes get, questions about what the actual differences between FTP and HTTP are, which one is the “best”, and isn’t it so that … is the faster one?

FTP vs HTTP is my attempt at a write-up covering most of the differences that matter to users of the protocols, without going into overly technical detail. If you find flaws or have additional info you think should be included, please let me know!

The document includes comparisons between the protocols in these areas:

  • Age
  • Upload
  • ASCII/binary
  • Headers
  • Pipelining
  • FTP Command/Response
  • Two Connections
  • Active and Passive
  • Firewalls
  • Encrypted Control Connections
  • Authentications
  • Download
  • Ranges/resume
  • Persistent Connections
  • Chunked Encoding
  • Compression
  • FXP
  • IPv6
  • Name based virtual hosting
  • Proxy Support
  • Transfer Speed

With your help it could become a good resource to point curious minds to in the future…

The hack will still be useful

Okay, in my recent blog entry about Flash 10 using native libcurl, I got a bit side-tracked and mentioned something about distros confusing libcurl’s soname 3 and 4. This caused some comments on that post and some further activity behind the curtains, so let me spell out exactly what I mean:

The ABI for libcurl did change between soname 3 and 4, but the change was in a rather subtle area (FTP third party transfers, sometimes known as FXP) which is rarely used. It certainly will not hurt the Adobe Flash system.

I’m not against “the hack” (or perhaps “a hack”, as there are several ways an ordinary system could provide work-arounds or fixes for this problem) per se; I am mainly trying to fight the belief or misconception that the ABI break doesn’t exist.

Since Adobe doesn’t want to provide an updated package that links against a modern libcurl and refuses to provide multiple packages, distros of course need to address this dilemma.

I just want everyone to know that 3 != 4, even if the risk that it’ll cause problems is very slim.

Update: it seems Adobe will change this behavior in their next release and then try to load either 3 or 4.
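
To spell out what such a load-either-one approach could look like, here is a tiny hypothetical sketch (this is of course not Adobe’s actual code, just an illustration) that tries the newer soname first and falls back to the older one:

    #include <stdio.h>
    #include <dlfcn.h>

    int main(void)
    {
      /* try the newer soname first, then fall back to the older one;
         build with -ldl */
      void *lib = dlopen("libcurl.so.4", RTLD_NOW);
      if(!lib)
        lib = dlopen("libcurl.so.3", RTLD_NOW);
      if(!lib) {
        fprintf(stderr, "no libcurl found: %s\n", dlerror());
        return 1;
      }

      /* look up a symbol that exists in both ABI versions */
      const char *(*version)(void) =
        (const char *(*)(void))dlsym(lib, "curl_version");
      if(version)
        printf("loaded %s\n", version());

      dlclose(lib);
      return 0;
    }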

CA cert bundle from Firefox

It could be interesting to note that extracting all the cacerts from your local Firefox installation isn’t that tricky, if you just use some of the magic that’s at hand with the NSS certutil tool.

Users of OpenSSL or GnuTLS based tools or libraries (such as libcurl) might be pleased to learn this.

curl users in general should of course be aware that we no longer ship any ca-cert bundle with curl (as of curl 7.18.1), as it seems some ports haven’t yet adapted to or discovered this.

Update: this script is now present as lib/firefox-db2pem.sh in the curl CVS repository.