Week 3

Published on 25 January 2015

This is a post about weeknotes.


Implementing the new landing page idea from last week, plus a raft of other changes ahead of a more public release in the next few weeks. Although there's not that much changing, it took up a huge chunk of time. There were lots of discussions about whether people read explanatory text, and where the big green start button should go. From observing family members interacting with technology, I realise that they don't. At all. And so the whole business of pairing devices and linking accounts (which we do a lot of at work) starts to look completely useless unless it all becomes radically simpler.


Reading Enchanted Objects, where everyday objects get new powers through embedded computation, I'm struck by how the magic is probably preceded by a lot of frustration with pairing and getting onto the network. My Wemo connected plug, which controls a heater in my bedroom, went offline at the weekend, and I was quite frustrated to be greeted by a "sad cloud" image as I walked home in the freezing cold. It's a testament to Wemo and whatever cloud it uses that this hadn't happened before in a few months of using the system.

This is all related to Scott Jenson's write-up about home automation, IoT and CES.

TV on the radio

Our radio prototyping toolkit is good at audio, but what about video? An upcoming project will call for a prototype of future TV experiences. We've long talked about extending the system to support a video media player, and this week we got into the detail of how it currently works and how it could be extended to support a video player like VLC. It turns out it might not be that hard, since a lot of the work to swap out media players has already been done: we currently support MPD and Mopidy, mostly because Mopidy has a Spotify plugin (if you're a Spotify premium user).
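To give a flavour of why swapping players isn't that hard, here's a minimal sketch of the idea: each backend (MPD, Mopidy, a hypothetical VLC wrapper) sits behind the same small interface, so adding video support is one more implementation rather than a rewrite. The class and method names here are illustrative assumptions, not the toolkit's actual API.

```python
from abc import ABC, abstractmethod

class MediaPlayer(ABC):
    """The common surface the rest of the system talks to."""

    @abstractmethod
    def play(self, uri):
        ...

    @abstractmethod
    def stop(self):
        ...

class MpdPlayer(MediaPlayer):
    """Audio backend (stands in for the existing MPD support)."""

    def play(self, uri):
        return f"mpd: playing {uri}"

    def stop(self):
        return "mpd: stopped"

class VlcPlayer(MediaPlayer):
    """Video backend (stands in for a future VLC wrapper)."""

    def play(self, uri):
        return f"vlc: playing {uri}"

    def stop(self):
        return "vlc: stopped"

def make_player(backend):
    # Supporting a new player means adding one entry here.
    players = {"mpd": MpdPlayer, "vlc": VlcPlayer}
    return players[backend]()
```

With this shape, the prototype code calls `play()` and `stop()` and never needs to know which backend is underneath.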


Got to see a Huddle Lamp after a workshop with some people at the UCL Interaction Centre discussing possible areas for collaboration. Very exciting times. We talked about a radiodan core with pre-configured behaviour that can be plugged together in different ways, removing some of the barriers to entry for people wanting to create their own media prototypes.

Generative logos

Libby's Shuffle bot got a generative logo in the iPlayer colours, thanks to a Processing sketch Tristan made previously to mimic the new R&D website branding. It got me wondering whether we could generate logos for all our prototypes, but have each project's name hash one way to the same generative logo every time. A sort of fingerprint for each project.
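One way the fingerprint idea could work is to hash the project name and use the digest to seed the generative sketch's random number generator: the same name always produces the same logo, and the hash is one-way. This is just a sketch of the idea (the parameter names and palette are made up, and the real sketch would be in Processing).

```python
import hashlib
import random

def logo_params(project_name, palette=("pink", "green", "blue")):
    """Derive deterministic logo parameters from a project name.

    The SHA-256 digest of the name seeds a PRNG, so the same name
    always yields the same parameters, but the name can't be
    recovered from the logo.
    """
    digest = hashlib.sha256(project_name.encode("utf-8")).digest()
    rng = random.Random(digest)  # seed the PRNG from the hash
    return {
        "colour": rng.choice(palette),
        "shapes": rng.randint(3, 9),      # how many shapes to draw
        "rotation": rng.uniform(0, 360),  # overall rotation, degrees
    }

# Fingerprint property: same name in, same logo out.
print(logo_params("shufflebot") == logo_params("shufflebot"))  # True
```

A generative sketch that reads its colours, counts, and angles from these parameters would then render a stable, per-project logo without anyone designing one by hand.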

The walls have eyes

Meetings about it, feedback from the exhibition, and thinking about wall space. Something that works when it's on but also tells a story when it's off.