Distributed Builds on Every Commit


I’m not sure everyone out there has yet realized what a cool build system we’ve created in the Rockbox project!

We’re using Subversion for source code version control. We have a master server that detects whenever there has been a commit. It then starts one thread for each known build server; each thread connects to its remote server, asks it to update to a specific revision and then asks it to build for a particular target.

When the build is done, the master copies the build log and in some cases also the final zip file (which is then offered for download to users). At the time of this writing, we have 67 different builds and average around 15 build servers. The master adapts to the servers that respond and simply ignores the ones that don’t.
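To make the flow a bit more concrete, here is a minimal sketch of what one such build round could look like. The build_one callable and the list of (server, target) assignments are invented for the example and the real Rockbox scripts are not shown here; this only illustrates the thread-per-server idea and how non-responding servers get skipped.

```python
# Minimal sketch of one build round: one thread per (server, target) pair.
# "build_one" is a hypothetical callable that talks to one remote build server:
# it asks it to update to the given revision, build the target, and return
# (build log, zip bytes or None). It should use its own connection timeout.
from concurrent.futures import ThreadPoolExecutor, as_completed

def build_round(build_one, revision, assignments):
    """assignments: list of (server, target) pairs handed out by the master."""
    results = {}
    with ThreadPoolExecutor(max_workers=max(1, len(assignments))) as pool:
        futures = {pool.submit(build_one, server, revision, target): (server, target)
                   for server, target in assignments}
        for future in as_completed(futures):
            server, target = futures[future]
            try:
                # keep the log (and possibly the zip) that the server sent back
                results[target] = future.result()
            except Exception:
                # an unreachable or failing server is simply ignored this round
                pass
    return results
```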

This has the cool outcome that roughly 5–7 minutes after a commit, there are zip files offered on the site with the very latest code for 27 different players! There’s also a huge table presented on the site with the results from all builds, so that warnings and build errors can be worked on.
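The table itself is produced by the build master’s own scripts, but as a rough illustration, counting GCC-style warning and error lines in each collected log could look like this (the summarize function and its plain-text output are made up; the real site renders an HTML table):

```python
# Hypothetical summary of collected build logs: count GCC-style
# ": warning:" and ": error:" lines per target and print a small table.
def summarize(logs):
    """logs: mapping of target name -> build log text."""
    print(f"{'target':30} {'warnings':>8} {'errors':>6}")
    for target, log in sorted(logs.items()):
        lines = log.splitlines()
        warnings = sum(1 for line in lines if ": warning:" in line)
        errors = sum(1 for line in lines if ": error:" in line)
        print(f"{target:30} {warnings:8} {errors:6}")
```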

Of course the master then goes back to check for commits again and the whole thing starts all over again.
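A rough sketch of that polling loop, assuming the master simply asks the repository for its latest revision with svn info (a post-commit hook would work just as well; the repository URL and the 30-second interval below are placeholders, not the real setup):

```python
# Hypothetical commit-polling loop for the build master.
import re
import subprocess
import time

REPO_URL = "svn://svn.example.org/rockbox/trunk"   # placeholder, not the real URL

def latest_revision():
    # "svn info <url>" prints a "Revision: NNNN" line for the repository head
    out = subprocess.run(["svn", "info", REPO_URL],
                         capture_output=True, text=True, check=True).stdout
    return int(re.search(r"^Revision: (\d+)", out, re.MULTILINE).group(1))

def watch(last_seen=0):
    while True:
        rev = latest_revision()
        if rev > last_seen:
            last_seen = rev
            print(f"new commit, starting a build round for r{rev}")
            # here the master would hand out targets, e.g.:
            # results = build_round(build_one, rev, assignments)
        time.sleep(30)
```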

Just now, the build for the Olympus M:Robe 500 was modified to depend on a recent ARM toolchain patch, so we need to get all build server admins to update their ARM compilers!

The build servers are of course “donated” to the cause by volunteers. It is a fairly easy way to help out the project if you have sufficient bandwidth and a suitable machine. You can help too!

3 thoughts on “Distributed Builds on Every Commit”

  1. What is the size of the compilation workflow tools, and approximately how much available bandwidth (in monthly traffic) do I need to help the project?

  2. My current “rbclient” (the user that does the actual builds) on my build server uses 1.2GB of data in its home directory and the build toolchains (excluding the native gcc used for simulators) use about 180MB.

    A lot of that data is due to ccache being used (cache size is 869MB at this moment).

    I don’t know how much bandwidth this amounts to over time, but in general there isn’t very much data transferred other than the zip uploads to the build master. The SVN updates are usually fairly small.

    You also need to have a 2GHz CPU or so at minimum.
