Category Archives: Research Studio

Starlight by Fieldguide

Jon, Tom and Mike from the studio have recently been busy with their Fieldguide project. Along with fourth Fieldguide co-founder Pete Thomas and his Uniform colleague Martin Skelly, they created StarLight, an interactive lighting installation using space data. StarLight is a concept first seeded by Jayne Wallace and James Thomas from the University of Northumbria.

StarLight is a collaboration between Fieldguide and the Swedish lighting manufacturer Wästberg. In 2009, NASA launched its Kepler space observatory to look at the light from far-off stars and interpret their flickering and pulsing in order to discover habitable planets. StarLight uses NASA data to allow people to replay the light that originated from stars light-years away, giving them a sense of connectedness to these stars and encouraging them to dream of far-off worlds. Wästberg works with some of the most renowned architects and designers, combining aesthetic sensibility with a Swedish engineering mentality. Its products have won over 40 awards for excellence, and it is a leading player in the future of lighting design.

The launch event is on Thursday 19th September from 6pm onwards at the Imperial College reception.

Images to follow…

Space Issue of Fieldguide at





Unlimited Space Agency and Bare Conductive at Greenman

Jon Spooner, Head of Human Spaceflight at the Unlimited Space Agency, approached us this year to see if we could do something for Einstein’s Garden at the Greenman festival.

Together we came up with a workshop that allowed attendees to connect paintings, created with the amazing Bare Conductive paint, to the International Space Station. The paint conducts electricity, which meant each painting could have a bunch of LEDs glued on that flashed when the circuit was switched on.

We created a giant space-communicating antenna that all the paintings were hung on. As the International Space Station flew over, the antenna became live and the paintings all started to light up!

This would not have been possible without the amazing support from Bare Conductive!
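The post doesn’t document how the antenna knew the station was overhead. A minimal sketch of the proximity check might look like the following, where the site coordinates and the 500 km threshold are illustrative assumptions, and a real build would poll a live ISS position feed:

```python
import math

def great_circle_km(lat1, lon1, lat2, lon2):
    """Haversine distance between two points on Earth, in km."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def iss_overhead(iss_lat, iss_lon, site_lat, site_lon, threshold_km=500.0):
    """True when the ISS ground track is within threshold_km of the site."""
    return great_circle_km(iss_lat, iss_lon, site_lat, site_lon) < threshold_km
```

A loop polling a position feed would call `iss_overhead` every few seconds and switch the antenna’s LED power on while it returns True.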


Team UNSA preparing for a new influx of agents.
Mission control. The screen shows the proximity of the International Space Station. The control unit up top has two key switches to override the system and allow for testing.
UNSA’s Head of Human Space Flight Jon Spooner looks worried just before we power up the antenna
Everything goes live and the LEDs all start flashing!
The cable made to connect paper to the antenna.


Unlimited Space Agency using Bare Conductive at Greenman 2013 from michael shorter on Vimeo.


Interactive Newsprint



Interactive Newsprint (INP) has just come to an end (April 2013). It was an 18-month EPSRC-funded project that involved academics from UCLan, Dundee and Surrey, and the project’s technical partners Novalia.

It was a co-design project: we worked with the people of Preston to explore the future of print and journalism through the medium of traditional print and an emerging technology, paper electronics. Back in November 2011 we began the process with a series of workshops in Preston, demonstrating where the technology currently was and using it as an opportunity to ask and discuss how paper electronics could work within Preston communities, and where improvements needed to be made to interactions and technology specification.



Below is an introduction video to Interactive Newsprint, including user feedback, from our lead partner at UCLan:

We had the pleasure of working with some great people from around Preston and a few further afield.  Below are a couple of examples of the projects we worked on:

We first showed the Lancashire Evening Post and Outsiders at the London Design Festival with Fieldguide. The LEP and Fieldguide went on to be announced as a highlight of the Festival by Blueprint magazine.


Artist – Garry Cook and his project ‘Outsiders’


The examples above are all connected to the internet. This affords some super-interesting possibilities and functions, two of which are updatable audio and analytics. Below is a video sketch (it may need to be viewed fullscreen and in HD) that demonstrates our early thinking around analytics:
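The video aside, the kind of analytics we were sketching can be illustrated in code. Here is a minimal sketch of what a web-connected page could report back; the page and region names are made up for illustration:

```python
from collections import Counter
from datetime import datetime, timezone

class PaperAnalytics:
    """Tallies touch events reported by a web-connected printed page."""

    def __init__(self, page_id):
        self.page_id = page_id
        self.touches = Counter()   # touches per printed region
        self.log = []              # (timestamp, region) pairs

    def record_touch(self, region):
        # Each touch on a printed button would fire one of these events.
        self.touches[region] += 1
        self.log.append((datetime.now(timezone.utc), region))

    def most_touched(self):
        """Region with the highest touch count, or None if untouched."""
        if not self.touches:
            return None
        return self.touches.most_common(1)[0][0]
```

For example, a community paper could learn that its audio button is played far more often than its headline is touched, which is exactly the sort of feedback print has never had before.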

Our most recent, and last, outing with Interactive Newsprint was to SXSW in Austin in March. We were invited to demonstrate web-connected paper at the Mozilla, Knight Foundation, Soundcloud and MIT Media Lab party, which went down really well. The following day, Paul Egglestone and I were joined on our panel ‘Pitchforks and Printed Electronics’ by Nick White of the Daily Dot and Garrett Goodman of Worldcrunch to discuss the future of print, journalism and paper electronics, and how they could affect everyone from community journalists through to global news organisations.




Much more information on Interactive Newsprint can be found at

Additionally, in the near future, our research, findings and impact will be written up by the partners in a number of papers.  These will be announced soon.

UNSA – Jon Spooner’s next steps to Space

During the International SpaceApps Challenge in Exeter the Product Research Studio focused on helping Jon Spooner, an aspiring astronaut from the Unlimited Space Agency, get a few steps closer to his adventure into space.

Last year at the SpaceApps Challenge the team made a mini version of Jon Spooner in the hope that he could go to space in an astronaut’s pocket. Unfortunately this was never realised. So, to help mini Jon get a bit closer to space, we decided to make him a rocket with an on-board computer to run some test flights monitoring his environment. Jon’s new rocket was 3D printed and filled up with mini Jon and a Texas Instruments CC2541 SensorTag.
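The post doesn’t say how the SensorTag readings were used, but one natural thing to do with a barometric pressure channel on a test flight is to estimate altitude with the international barometric formula. A sketch, where the sea-level default is the standard atmosphere rather than a measured value for the day:

```python
def pressure_to_altitude_m(pressure_hpa, sea_level_hpa=1013.25):
    """Estimate altitude in metres from barometric pressure.

    Uses the international barometric formula; 1013.25 hPa is the
    standard-atmosphere sea-level pressure, an assumption rather than
    a calibration for any particular flight day.
    """
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))
```

Logging a reading every second and converting it this way would give mini Jon a rough altitude profile for each test flight.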

More to follow…












Playing Paper

Playing Paper was created for the Paper exhibition put on by Analogue Social. The exhibition ran from the 9th April to the 12th May at the Lighthouse, Glasgow.

Nine designers were chosen to create a project using paper for this exhibition: Kate Colin, Kerr Vernon, Kerry McLaughlin, Lisa Catterson, Alan Moore, Jemima Dansey Wright, David Ross, Craig McIntosh and myself, Mike Shorter.

I decided to propose an evolution of The Invite. In creating The Invite I learned a lot about how to screen print with Bare Conductive’s carbon-based ink. I saw this project as an opportunity to deploy all the things I had learnt, from graphic limitations to ink dilution.

Playing Paper consisted of three artworks, with the aim of showing that paper electronics doesn’t have to be all circuit diagrams, but can be much more artistic. This was achieved by having the artworks display three different levels of circuit: one very technical and traditional, one with hand-drawn components, and one with instruments drawn. When plugged into the mixer, the three artworks all turned into musical instruments, all creating the same sounds. Three printed buttons at the bottom of the page gave off different drum sounds when touched, and a distance sensor printed at the top changed pitch the closer you got to the ink.
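The exact mapping from the printed distance sensor to pitch isn’t described in the post, but the idea is a simple one: scale the raw sensor reading onto a frequency range. A sketch, where the 10-bit reading range and the two-octave A3 to A5 pitch range are illustrative, not taken from the actual mixer:

```python
def reading_to_pitch_hz(reading, reading_min=0, reading_max=1023,
                        f_low=220.0, f_high=880.0):
    """Map a raw proximity reading onto a pitch range.

    The reading range matches a 10-bit ADC and the pitch range spans
    two octaves (A3 to A5); both are assumptions for illustration.
    """
    # Clamp so out-of-range readings don't produce wild pitches.
    reading = max(reading_min, min(reading_max, reading))
    fraction = (reading - reading_min) / (reading_max - reading_min)
    return f_low + fraction * (f_high - f_low)
```

With a mapping like this, moving a hand towards the printed sensor sweeps the ‘theremin’ smoothly up the pitch range instead of jumping between notes.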


To make the screen printing much less stressful, the Bare Conductive paint was diluted with water (roughly 4 parts paint to 1 part water). This made the paint less sticky so it flowed better on the screen, and also slowed the paint drying in the screen.


With this new diluted version of the paint the printing became much easier, and we were able to print over 50 artworks with about 100ml of Bare Conductive. Last time we printed with Bare Conductive we made the trace thickness 1pt in Illustrator, which made it far too easy to mess up a print if enough pressure wasn’t applied, let alone with the paint drying in the screen. This time the traces were all 2pt, which led to a 100% success rate!
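Wider traces aren’t just easier to print; they also conduct better. A uniform trace is length divided by width ‘squares’, and each square contributes one unit of sheet resistance, so doubling the width halves the end-to-end resistance. A sketch, where the 55 Ω/square default is an assumed ballpark for a carbon-based ink rather than a measured value for these prints:

```python
def trace_resistance_ohms(length_mm, width_mm, sheet_resistance=55.0):
    """End-to-end resistance of a printed trace via sheet resistance.

    squares = length / width; each square contributes sheet_resistance.
    The 55 ohm/square default is an illustrative assumption for a
    carbon-based conductive ink, not a measurement.
    """
    squares = length_mm / width_mm
    return squares * sheet_resistance

# A 100mm trace: going from a ~1pt (0.35mm) to a ~2pt (0.7mm) width
# halves the resistance as well as making the print more forgiving.
narrow = trace_resistance_ohms(100, 0.35)
wide = trace_resistance_ohms(100, 0.7)
```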


I decided to print out a bunch of the artworks in a couple of different colours. I was planning to explore how other colours would print on top of the Bare Conductive, but forgot due to the excitement of the printing going much better than expected! They look pretty good as a collection.


I wanted the sound output to be a bit more exciting than that of The Invite, so I decided to add an MP3 Trigger to the electronics inside the mixer. This allowed the three buttons along the bottom to control drum samples, and it also meant that the ‘theremin’ sound was a lot smoother. Playing Paper still used the bespoke bulldog clip that was used with The Invite. This method remains the best way I have seen to connect paper to other devices.


The preview of the show at the Lighthouse was absolutely packed! Playing Paper was in constant use, as the grubbiness of the prints illustrated: two hours of being constantly touched by excited fingers…

Below is a quick little video of Playing Paper at the Lighthouse on the preview night. After the show has finished I plan to upload a video with real-time sound…

Playing Paper from michael shorter on Vimeo.

Mike is Presenting at the Electric Bookshop Late Lab




Mike Shorter from the Product Design Research Studio will be presenting his thoughts on the future of paper tomorrow night. This event is part of the Edinburgh Science Festival and is being organised by the brilliant Electric Bookshop at Inspace.

There is a great line-up for this event; Mike will be talking alongside:

Ian Sansom, author of the amazing book Paper: An Elegy.

Alyson Fielding, an artist who hacks books, stories and Arduinos.

And finally Yvette Hawkins, a paper artist who makes wonderful artworks and sculptures out of paper.

Check it out here

With this great diverse collection of speakers there are going to be some great conversations!


Mini Mars Rover


The Mini Mars Rover was built for NASA’s International Space Apps Challenge by Mike, Tom and Ali. The Mini Mars Rover will move in the exact same pattern as his big brother on Mars, and will also display some other live data such as sound and images. He was built to roam around the home environment, allowing users to feel a connection to the Mars Rover as it explores alone. We want to take boring data and make it tangible and exciting.

The Mini Mars Rover is an internet-controlled robot. He comprises a Wild Thumper chassis, an Arduino and an Electric Imp.


The Mini Mars Rover began life with a new laser-cut acrylic body built onto the Wild Thumper 6WD chassis, along with some more suitable wheels.


The Mini Mars Rover has just come back from an eventful time at SXSW Interactive in Austin, TX. Not only was he found driving around the Space Meet Up event, he also featured on the Making Space Data Real on Earth panel. This panel was hosted by Ali Llewellyn from the NASA Open Government Initiative, David McGloin and Jon Rogers from the University of Dundee, and Jayne Wallace from Northumbria University.

Below are some videos of the rover in action….

On Mashable –

On the Global Post –


The Mini Mars Rover can be controlled from this url: 

Team NASA (including an astronaut) get behind the Mini Mars Rover

Here is some code for you…

Squirrel for the imp (adapted from an online source which I can no longer find…):



// remote control for rover
ledState <- 0;

function blink() {
    // Change state
    ledState = ledState ? 0 : 1;
    server.log("ledState val: " + ledState);
    // Reflect state to the pin
    // (the pin write was missing from the original listing)
}

// input class for LED control channel
class inputHTTP extends InputPort
{
    name = "power control"
    type = "number"

    function set(httpVal) {
        server.log("Received val: " + httpVal);

        if (httpVal == 1) {
            // (drive command missing from the original listing)
        }
        else if (httpVal == 2) {
            // (drive command missing from the original listing)
        }
        else if (httpVal == 3) {
            // (drive command missing from the original listing)
        }
        else if (httpVal == 4) {
            // (drive command missing from the original listing)
        }
    }
}

function watchdog() {
    // start watchdog write every 60 seconds
    imp.wakeup(60.0, watchdog);
}
watchdog();

// Configure pins as an open drain output with internal pull up
// (the pin configuration lines were missing from the original listing)

// Register with the server
imp.configure("Remote Control for Rover", [inputHTTP()], []);


Arduino code (thanks Chris Martin!)…



/*
  Reads four digital direction inputs (forward, left, right, back) and
  drives the rover accordingly, printing the inputs to the serial monitor.

  This example code is in the public domain.
*/

#include <Servo.h>

int pinf = 2;  // forward input
int pinl = 12; // left input
int pinr = 10; // right input
int pinb = 9;  // back input

#define LmotorA 3  // Left motor H bridge, input A
#define LmotorB 11 // Left motor H bridge, input B
#define RmotorA 5  // Right motor H bridge, input A
#define RmotorB 6  // Right motor H bridge, input B
#define v 255      // full motor speed

//Servo myservo;
//int led = 12;
int pos = 0;

int lls = 0;
int rls = 0;
int al = 0;

// the setup routine runs once when you press reset:
void setup() {
  // pinMode(led, OUTPUT);
  pinMode(pinf, INPUT);
  pinMode(pinl, INPUT);
  pinMode(pinr, INPUT);
  pinMode(pinb, INPUT);
  // initialize serial communication at 9600 bits per second:
  Serial.begin(9600);
  // this is different on the serial monitor, not sure if it is up or down
  // Serial.begin(14400);
}

// the loop routine runs over and over again forever:
void loop() {
  // read the four direction inputs:
  int sensorValue1 = digitalRead(pinf);
  int sensorValue2 = digitalRead(pinl);
  int sensorValue3 = digitalRead(pinr);
  int sensorValue4 = digitalRead(pinb);

  // print out the values you read:
  Serial.print(sensorValue1);
  Serial.print(" : ");
  Serial.print(sensorValue2);
  Serial.print(" : ");
  Serial.print(sensorValue3);
  Serial.print(" : ");
  Serial.println(sensorValue4);
  delay(25); // delay in between reads for stability

  if (sensorValue1 == 1) {
    // drive forward (the motor commands were missing from the original listing)
    delay(500);
    // myservo.write(10);
    // delay(500);
  }

  if (sensorValue2 == 1) {
    // turn left (the motor commands were missing from the original listing)
    // digitalWrite(led, HIGH);
    delay(100);
  }

  if (sensorValue4 == 1) {
    // reverse (the motor commands were missing from the original listing)
    // digitalWrite(led, HIGH);
    delay(100);
  }

  if (sensorValue3 == 1) {
    // turn right (the motor commands were missing from the original listing)
    // digitalWrite(led, HIGH);
    delay(500);
  }
}


Supported by New Media Scotland’s Alt-w Fund



SXSW 2013 – Make Space Data Physical

Making data physical means that more people can access it in more ways. Taking data off the screen and making it do things in the real world dramatically increases its potential impact. And when it comes to touching data that can never be touched, space and time have to be it. Things that are far away, and things that are lost in time, are two physical barriers we simply can’t cross, and we want to find a way to do it.

For me, one of the most dramatic pieces of data I have come across is the chunks of wall missing from the Victoria and Albert Museum on Exhibition Road in London. I used to walk past them every day and wondered why no one had filled them in, as the V&A is one of the most precious buildings we have. Then one day I saw the small plaque where these words are carved in stone:

The damage to these walls is the result of enemy bombing during the blitz of the Second World War 1939-1945 and is left as a memorial to the enduring values of this museum in a time of conflict.

I could literally touch the holes that shrapnel from the bombs had blown in the wall. I could touch the data. I held my breath, and for that moment I was there, in a far more real experience of the war than any I had previously had. The data from an event 70 years ago had touched me when I touched it. I had travelled through time.

So can we do this with space data? Can we build connections between people here on earth that reach across the vastness of space – to far-off stars – and across the vastness of time – to the very origins of the universe itself? We hope we can! Which is why, when Ali Llewellyn from Open NASA got in touch a year ago, we jumped at the chance to work with her and her team in making space data real here on earth.

We’ve been hacking together examples as demonstrators, or starting points, for new ways to connect people to space data. To give an example, if you put people’s emotions first, then loneliness rather than measured distance becomes a great way to connect people to space. Rather than thinking of the Mars Rover or Voyager as machines sending data over distance, you start to think of them as being out there all alone, on cold dark planets or at the far reaches of space. Forever alone. It is this starting point that can make data more human, make data a thing we want to love… Then we start to connect people to their loneliness… what would that mean? What could we design?

Our friend the jeweller Jayne Wallace is on our panel, and her take on this is about the way data we receive now was generated in the past, possibly billions of years ago, right back to the origins of the universe itself.

Our lives have a pace to them, and time is something we both crave more of and know has an ultimately finite quality for us. Our interactions with the digital are quickening our pace of life, altering not only the texture of days and years but also how we value, measure and perceive the passing of time. But there are things that are bigger than us, things older than we can imagine, things that give our atomized view of life, and the time we have, a very different perspective; we simply have to look up to start to engage with them. We want to explore what it would mean to use digital technologies and space data to subvert our relationship with time and bring fresh potential to the digital objects we live with and through. Through design we can use space data to create ways to experience now things that occurred before humankind existed. We can read by the light of a lamp connected to the live feed from a telescope and know that when it flickers a new planet has been discovered. And we can connect to the orbits and rhythms of planets through objects that gently respond to these different cycles, and be reminded that we are part of something much greater, much faster, much slower and much more fascinating than our atomized lives sometimes allow us to consider.

We’re also sharing our panel with someone who has more than a little knowledge and authority on the science behind all of this: David McGloin (who many of you will know as @dundeePhysics). His team of undergraduates, high school teachers and pupils has been exploring ways to connect to the dark side of the moon, for use in teaching how to conduct optical measurements of space objects.

“We know that space is one of the things that most inspires high school students to study subjects such as physics at university, but it’s clearly a challenge to get hands-on with practical work while still at school. Our project is an example of how we can use space data to try and make a more physical and immediate connection to the subject.”

Heading up our panel is Open NASA’s very own Ali Llewellyn. I asked her what excites her about making space data physical.

“From the time I was a child, I wanted to touch the stars. I wanted to walk on other planets with my own feet and fly a spaceship with my own hands. While I am not an astronaut, and there isn’t yet a human presence on Mars – making space data physical enables me to get closer than most humans can yet get to those opportunities. My work at NASA in open innovation and mass collaboration is dedicated to exactly this: enabling everyone on planet Earth to contribute directly and substantially to the exploration mission.”

I then asked her what she thought were the possibilities this approach presents.

“This approach inspires everyone with the wonder of exploration by making the data engaging and allowing it to inform a new context. (Who doesn’t want to drive Curiosity or touch the sun?) This approach democratizes exploration for all citizens – making what we are learning in space accessible to everyone on planet Earth. This approach encourages new approaches and opportunities to the challenges we face in improving life on our planet and taking our species off-planet. This approach extends the usefulness of space data. While the data often had one initial research purpose, we are ‘recycling’ it for other applications and uses, especially in new contexts.”

Finally, I asked her why getting our hands ‘in’ data is an amazing thing.

“In the time it took you to read this sentence, NASA gathered approximately 1.73 gigabytes of data from our nearly 100 currently active missions! We do this every hour, every day, every year – and the collection rate is growing exponentially. Handling, storing, and managing this data is a massive challenge. Our data is one of our most valuable assets, and its strategic importance in our research and science is huge. We are committed to making our data as accessible as possible, both for the benefit of our work and for the betterment of humankind through the innovation and creativity of the over seven billion other people on this planet who don’t work at NASA. What would become possible if everyone could not just access but remix and reuse the images, maps, metrics and lessons learned from this amazing trove of observation?”


Get Physical: Making Space Data Real On Earth

With Ali Llewellyn (Open NASA), David McGloin (University of Dundee), Jayne Wallace (University of Northumbria) and myself (Jon Rogers)

11am Monday March 12th in Omni Downtown, Lone Star

Hope to see you there!

Thank you to: New Media Scotland, Open NASA, RCUK, University of Dundee,  and Northumbria University












Electric Imping


Over Christmas I took a bit of a rest from writing my transfer report by playing with my new Electric Imp.

This wonderful little device, the size of an SD card, can be embedded into objects to make them internet-enabled. I managed to create a few Christmassy experiments over the festive period. The first experiment borrowed Brendan Dawes’ wonderful example to create a tweeting Christmas tree: every time the tree lights went on in the house, it sent out a tweet to let everyone know. This was done by sticking an LDR directly onto one of the tree lights to recognise when they were turned on. When the imp sees that the lights are on, it sends a message to a web service, which composes a message and tweets for you.
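The trigger logic is essentially edge detection on the LDR level: tweet once when the lights come on, not on every sample while they stay lit. A minimal sketch of that logic, where the normalised light levels and the 0.5 threshold are illustrative:

```python
def light_on_events(readings, threshold=0.5):
    """Return indices where the lights switch on.

    An event fires only on a rising edge through the threshold, which
    stops the imp tweeting on every sample while the lights stay lit.
    Readings are assumed normalised to 0..1; both the scale and the
    threshold are illustrative.
    """
    events = []
    was_on = False
    for i, level in enumerate(readings):
        is_on = level > threshold
        if is_on and not was_on:
            events.append(i)  # lights just came on: send one tweet
        was_on = is_on
    return events
```

On the imp itself the same idea would run sample by sample, remembering the previous state and messaging the tweeting service only on the transition.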



The second experiment was to make the Electric Imp work the other way round: instead of using the real world as an input and the digital world as an output, I wanted to control the real world using the digital world. I managed to hack some code together from online examples so that every time a website was refreshed, it activated a servo motor to spin a mini Christmas tree.

Electric Imp controlling a servo motor from michael shorter on Vimeo.

This code was then combined with a mains relay: now when the website was visited it turned the Christmas tree lights on, and then off when it was visited again.

Electric Imp controlling Christmas tree lights from michael shorter on Vimeo.


Conductive Ink Workshop at MakLab


Mike, Tom and Roy from the research studio held a conductive ink workshop at the wonderful MakLab in Glasgow. The aim of the workshop was to introduce conductive inks to a collection of Glasgow creatives and show them their potential. The workshop was attended by people from various backgrounds, from design to printmaking.

We went armed with some 555 timer circuits that allowed people to get stuck right into playing with the ink and interactions, without worrying too much about the technology side of things. The 555 timer circuits made it really easy for people to create basic noise-making devices.
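The reason the 555 works so well for this is that in its classic astable configuration it turns resistance and capacitance straight into an audible frequency: f = 1.44 / ((R1 + 2·R2) · C). Touching a printed ink trace changes the effective resistance, which changes the pitch. A sketch of the relationship, with component values that are illustrative rather than those of our workshop circuits:

```python
def astable_555_hz(r1_ohms, r2_ohms, c_farads):
    """Output frequency of a 555 timer wired in astable mode.

    Standard approximation from the 555 datasheet:
        f = 1.44 / ((R1 + 2*R2) * C)
    A conductive-ink trace (or a finger across it) in place of R2
    turns a change in resistance into a change in pitch.
    """
    return 1.44 / ((r1_ohms + 2.0 * r2_ohms) * c_farads)
```

For example, with R1 = 1 kΩ, R2 = 10 kΩ and C = 0.1 µF the circuit oscillates at roughly 686 Hz, comfortably in the audible range; raising R2 by touching a longer stretch of ink lowers the pitch.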

Because the technology for the workshop was pre-prepared, people could really concentrate on creating some great paper interactions. By the end of the night the ink had moved away from paper and onto other objects, like wooden blocks and even skin (much against the manufacturer’s recommendations).

Workshops like this are always rewarding, because not only do you get to meet a bunch of great new people, you also always come away with some new information, and this workshop was no exception. Sophie Dyer introduced us to a low-tech screen-printing technique that allowed us to rattle out multiple prints in less than an hour (including cleaning the screen!). The magic thing about this process is that you can create detailed prints without having to expose a screen. Instead, you use a vinyl cutter to create the mask and stick it on the underside of the screen. You then put a towel (or something soft) down on the table, tape your paper to the back of the screen and away you go!