A matter of energy

Over the past month I have been looking into measuring the power consumption of Firefox on desktop. It all started with a complaint about excessive background activity while idle, so I wrote a little tool to measure the energy consumption of FF while idling on a blank page. I wasn’t able, though, to find any statistically significant difference between the idle machine and the idle machine + the idle browser (on Win8, OSX & Ubuntu).
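
As an illustration only, the measurement boils down to something like the following minimal sketch, assuming a Linux box that exposes the Intel RAPL powercap interface under /sys/class/powercap; this is just the idea, not the actual tool:

```python
#!/usr/bin/env python3
"""Minimal sketch: read the CPU package energy counter over a fixed
window and derive the average power. Assumes Linux with the Intel RAPL
powercap sysfs interface."""
import time

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"              # package 0 energy, microjoules
MAX_RANGE = "/sys/class/powercap/intel-rapl:0/max_energy_range_uj"

def read_uj(path):
    with open(path) as f:
        return int(f.read())

def average_power(seconds=30):
    start = read_uj(RAPL)
    t0 = time.time()
    time.sleep(seconds)
    end = read_uj(RAPL)
    elapsed = time.time() - t0
    delta = end - start
    if delta < 0:                        # the counter wrapped around
        delta += read_uj(MAX_RANGE)
    return delta / 1e6 / elapsed         # joules per second, i.e. watts

if __name__ == "__main__":
    print(f"average package power: {average_power():.2f} W")
```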

A more traditional profiling session with DTrace did yield some wakeups, though, which I could trace back to a particular set of timers firing. Most of those timers have since been addressed and fixed in the current Nightly. I suspect the reason I couldn’t spot them in my power benchmark is that the profiling interval I used was simply too short (5-10 min). A more sensible test would use longer intervals and/or measure the time it takes for a battery to drain, for instance.
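
The battery-drain variant is equally simple to sketch; here I’m assuming OSX and that the charge percentage can be parsed out of the output of `pmset -g batt` (again, an illustration rather than an actual harness):

```python
#!/usr/bin/env python3
"""Sketch of the longer-interval alternative: poll the battery level
before and after a long idle period and report percentage points lost
per hour. Assumes OSX, a discharging battery and a 'NN%' token in the
output of `pmset -g batt`."""
import re
import subprocess
import time

def battery_percent():
    out = subprocess.check_output(["pmset", "-g", "batt"], text=True)
    return int(re.search(r"(\d+)%", out).group(1))

def drain_per_hour(hours=2):
    start = battery_percent()
    t0 = time.time()
    time.sleep(hours * 3600)             # idle for the whole interval
    elapsed_h = (time.time() - t0) / 3600
    return (start - battery_percent()) / elapsed_h

if __name__ == "__main__":
    print(f"drain: {drain_per_hour():.1f} percentage points per hour")
```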

The next step was to also have a look at the energy drain when idling on some popular websites. In order to reduce the variance of the measurements I had to disable as many background processes as possible, e.g. Spotlight on OSX. For the same reason I couldn’t use Selenium and had to rely on simple terminal commands to steer the browsers. The problem with Selenium is that each browser has its own driver implementation with a completely different power signature. It might be possible to remove that noise with some ML algorithm, but that would require a large enough training set for each distinct website and OS… A more realistic alternative would be to write a “simple” and lightweight windowing event generator to steer the different browsers.
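
Concretely, “steering the browsers from the terminal” means little more than the following sketch: launch the browser on a URL, wait past the warm-up period, run the sampling window, then kill it. The binary path and the timings are illustrative (OSX shown):

```python
#!/usr/bin/env python3
"""Sketch of driving a browser without Selenium: open a URL, let the
page settle, sample, then terminate the process."""
import subprocess
import time

FIREFOX = "/Applications/Firefox.app/Contents/MacOS/firefox"   # adjust per OS/browser

def idle_on(url, warmup=60, sample=30):
    proc = subprocess.Popen([FIREFOX, url])
    try:
        time.sleep(warmup)      # let the page reach a steady state
        # ... start the energy profiler here ...
        time.sleep(sample)      # 30 s sampling window
        # ... stop the energy profiler here ...
    finally:
        proc.terminate()
        proc.wait()

if __name__ == "__main__":
    idle_on("https://edition.cnn.com")
```
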
That being said, even without Selenium it’s possible to get some interesting data: before we can optimize the power drain of FF while scrolling on cnn.com, for instance, we must address the power drain of simply idling on it, since that drain is very likely to contribute to any power profile taken while scrolling anyway.

Keep in mind the following bits when interpreting the data:

  1. the latest release versions of FF, Chrome, IE & Safari as of this writing were used;
  2. the sampling interval amounts to only 30 seconds; even though it’s extremely short and might not catch background activities like sporadic timers, it’s obvious that if there is something very wrong with a page it will show up in the profile anyway;
  3. the error bars mark the 95% confidence intervals (see the sketch right after this list);
  4. the energy profiler starts measuring only about a minute after a page has been loaded; that’s needed to ensure the browser is in a steady state;
  5. the profiles were performed by idling on the homepages of the different sites, i.e. youtube.com wasn’t playing any video;
  6. for Facebook only, the browsers were logged into my profile;
  7. the data was collected on different machines at different times, i.e. the 3 plots are not comparable with each other;
  8. I am assuming that a particular homepage doesn’t change dramatically when collecting the profiles for the different browsers (more on this later);
  9. only the CPU specific power drainage is measured.
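
As promised in point 3, this is roughly how the 95% confidence intervals behind the error bars can be computed from the repeated power samples; the sample values below are made up and scipy is assumed to be available:

```python
"""Sketch: 95% confidence interval for the mean of repeated power
samples, using a Student-t quantile."""
from math import sqrt
from statistics import mean, stdev

from scipy import stats

def confidence_interval(samples, level=0.95):
    n = len(samples)
    m = mean(samples)
    sem = stdev(samples) / sqrt(n)                  # standard error of the mean
    t = stats.t.ppf((1 + level) / 2, df=n - 1)      # two-sided t quantile
    return m - t * sem, m + t * sem

watts = [4.1, 3.9, 4.3, 4.0, 4.2]                   # hypothetical 30 s samples
print(confidence_interval(watts))
```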

Let’s start with a plot performed on Windows 8:

[Figure: windows_power]

Firefox is performing quite well; only Facebook seems to have a particularly adverse effect on the energy profile.

Let’s have a look at the OSX plot:

[Figure: osx_power]

Here FF seems to perform particularly badly on Facebook and Yahoo, but it is in good company in the latter case.

And finally we have the profile for Ubuntu:

[Figure: ubuntu_power]

Once again Facebook is a clear outlier. As you can see, the variance is much lower here since on Linux it was easier to remove all unneeded background tasks.

The main pattern across those profiles seems to be that the OS vendors’ own browsers are slightly better at not draining energy while idle. Also, Facebook seems to affect FF negatively across all OSs, so it doesn’t look like a random fluctuation. It looks like I am going to have some fun this week tracking down the root of the problem 🙂

For the future, a distributed benchmark that runs on O*B identically configured machines, where O is the number of OSs and B the number of browsers, would be ideal. This would make the profiles comparable across OSs and also independent of homepage updates.