Me, the AHRC and the IEEE Big Data 2013

IEEEBag

I trained as an electronic engineer. My PhD exploring digital neural network models of motion perception took me deep into technical detail and earned me that badge of Doctor of Philosophy in digital electronics. But you know this, right? You know that I radically changed the course of my interests and my professional life when I stepped stage left from the labs of Imperial College and walked into the studios of the Royal College of Art. Why did I do this? I did it because in the Computer Related Design studios at the RCA there was a connectivity to people. Playful technologies were used to tell research and exploration stories about the ways people could interact with digital tech. They explored stories that predicted the future, and they quite radically positioned, for me and much of the rest of the world, a completely different way to see the route to discovery. I felt re-tuned to a sense of purpose I hadn’t felt since I was hacking my BigTrak in the 1980s. And there was no looking back.

Until now.

In July this year the Arts and Humanities Research Council (AHRC) asked whether I would like to join them at the 2013 IEEE International Conference on Big Data in Santa Clara in October. A chance for me to return to my roots and to reflect on where the arts and humanities research community fits into the Big Data landscape. A chance for the AHRC to throw somebody new into this melting pot and uncover some insights.

As I registered and was given the ubiquitous conference bag, I wondered what insights I could harvest in a space where everyone from global governments to space research agencies, weather scientists, museum curators, big supermarkets and one man and his dog is on their way to mow this particularly large data meadow.

I think we need a bit of context here. So let’s start with: what is the IEEE? The Institute of Electrical and Electronics Engineers (pronounced Eye Triple E) traces its roots to 1884, when a collective of enlightened engineers came together to support professionals in their nascent field and to aid them in their efforts “to apply innovation for the betterment of humanity”. The core value and aim of the betterment of humanity rings true. It is not for the betterment of machines; it is for people. This value of humanity makes me feel good. It makes me feel that the IEEE are the right people to be tackling Big Data. Because isn’t it all about people at the end of the day? Does anyone ever wish they had spent more time with machines? This ideal, carried from the peak of Victorian society into the 21st century, makes me feel pretty good about the IEEE as a perfect partner for arts and humanities research. And with this in mind I enter the conference room…

Day one – First IEEE workshop on data visualisation.

What is Big Data? Good question – with a pretty clear party line broadcast from the keynote and throughout the 15 or more talks of the day. Big Data is a problem of a number of Vs:

Volume (how much data)
Velocity (how fast is it moving)
Variety (images, text, video, sensor)
Veracity (incomplete, rumours, dirty)

Now don’t get me wrong, but these don’t sound like definitions that lead to the betterment of humanity. These sound like problems for machines. I think we can find some more appropriate Vs to throw in there: how about values, validity, visibility and voices? What is it about problems just for machines that takes a powerful headline such as “Big Data” and pushes it right back into a meeting of computer scientists fine-tuning algorithms? There seems to be no sense of purpose, no grand challenge to solve. Maybe it’s me, but I wonder if anyone has asked the question why.

And this theme continued. I was disappointed (you might have guessed this) as the focus of every talk was on models for processing more big data at faster speeds. There was very little about the human side of big data. There was, to my surprise at this being the first IEEE workshop on visualisation of big data, nothing visual at all. We got close a couple of times, but the speakers quickly moved on as if embarrassed that a bit of beauty and clarity might somehow lessen the science. Is this a crisis of confidence? Almost as if the original aims of the IEEE had been lost? That’s not anyone’s fault; I think it is a global problem of technologists: people who are passionate (and the speakers were incredibly passionate) about machines are incredibly focussed on improving machines. Not on improving machines for the betterment of humanity. And I know I go on about this, but by making things visual, or even going further and making them tangible, we bring them into our human world. When this happens people react. People leap into action and want to know more.

And the arts world is no stranger to the realities of engaging people in the physical world. When the UK’s largest public sculpture, The Angel of the North by Antony Gormley, was first commissioned it was met with public outrage, with people voicing real anger at the prospect of money being spent on something landing in their neighbourhood. It cost around £1m and has stood the test of 15 years, with over 90,000 people seeing it every day. It is now a thing to be adored – a beacon of hope for the people of the north. There were even proposals to make an Angel of the South… However, the accountability of something being physical is something that we as data researchers need to be aware of. We know what happens when data is not made available (the MPs’ expenses scandal of 2009). And we should remember this when we present our stories to our community.

Insight
The IEEE community appears to have lost sight of people and the sensory world that people live in.

Opportunity
Can we bring together the IEEE community with arts researchers exploring big data in a visual way?

The final talk of the day was given by Klaus Mueller, who presented a case for visual feedback during the long (24-hour) data-processing periods of big datasets. He illustrated his algorithm with reference to a scientist in the field who required fast visual feedback on the data she was sampling while flying over the Arctic Circle in a research plane. The visual overview she obtained quickly enabled her to get a picture of the data’s stories and follow new leads while in the sky. The details are sketchy, but the use of a story is powerful. It tells us what the data was, why speed is important and how visual data is used in the field.

Insight
Stories of how people use big data are powerful mechanisms for understanding the role of big data in our society.

Opportunity
There is an opportunity for the AHRC to use its research base to harvest these stories and present them to the wider world.

DAY TWO – DATA AND SOCIETY

On the second day we were met with a positive keynote speech on the value of the intersection of technology and people. The speaker then dismissed people and focussed for an hour on technologies: how databases (mostly SQL) can be managed faster, how they can deal with greater amounts of data and how their team is doing it. Once again, people seem to have been left out of the equation.

Talking over lunch, we decided that this conference was exactly the thing we needed to spur us as arts and humanities researchers to define what the grand challenges are for big data, how they will impact on human lives, and how to live up to the very clear mission of the original IEEE collective of 1884 – the betterment of humanity. This is something I want to return to throughout this week. I’m not sure what these challenges are; I just know that there’s something incredibly powerful that we can bring to this space, something that could define the future of interdisciplinary research. (Note to self: turn down the bold-statement-producing effects and think more clearly.)

Tomorrow’s talks in the workshop on Big Data and the Humanities look promising. Maybe here I’ll get back in touch with humanity, and maybe, just maybe, we can start to debate the value and meaning of big data in our lives.

More tomorrow.

Jon

Hacking in front of an audience – Met Office at the V&A

Justin_VandA

 

WHAT DID WE DO?

You know from reading this blog that we have been running and attending collaborative making events, or hacks, for quite some time now. We’ve put someone into space, we’ve been at Unbox Festival in India, we’ve run open news hacks with Mozilla and we’ve told you all about how much we love making data physical at SXSW. We’ve hacked with conductive ink, with trousers and under canvas. All have been amazing, have led to incredible new things and introduced us to amazing new people. But all our events have been behind closed doors. The public have remained where they are – in public – while we’ve been locked in a room or atrium. So when Irini Papadimitriou and Michael Saunby wanted to hold a public hack-jam we jumped at the chance.

The hack has been well documented by Michael Saunby at the Met Office and by our new friends at the Centre for Sustainable Fashion.

We were hacking with our friends Justin Marshall and Ollie Hatfield, who worked on the winning design, which harvested museums as accidental data sources for climate change. Their idea was that the V&A holds images – on fabrics, objects and prints – of flora and fauna specific to the climate of their time. They posed the question of how these might look very different after a 4°C rise in temperature.

WHAT DID WE LEARN?

Hacking in public is hard

It’s really hard. You are just about to reach the end of debugging a particularly nasty servo motor problem, about to test what has taken up the best part of the last hour, when “What’s that do?” asks a 7-year-old. ‘That’ being a 3D printer in the middle of a two-hour print job for a completely revolutionary new approach to dress fastenings. The 7-year-old is joined by 29 of his friends and you have to stop and explain everything about a 3D printer (yes, of course you mention That Gun, because if you don’t they will). And just when their parents have moved them on to the next group and you’re about to crack the final line of the bug… “What does that do?” asks a bright-eyed 9-year-old girl with her slightly harassed-looking dad…

You have to know your story

You’re going to get asked what you are doing. A lot. You need to work on that story. The public are interested and want to find out more but they don’t want to hear about your 1500 lines of code and the trouble you’re having with stack overflow.  They’re on a fun day out and you’re there to entertain. So think of great things to tell people. Connecting this to popular news stories relating to your tech and ideas worked well for us (the 3D printed gun story is always a winner).

Come prepared with interesting demos

Bring some working examples. Our amazing hacker friend James Thomas wears his mini Starlight brooch to explain how star data can be modelled in light. When he tells people he’s wearing a data feed from a far-off star, they stop and listen to what he has to say.

james_wearable_data_closeup

 

Have someone greeting and steering the public

You really need a front-of-house person. They need to be able to tell the bigger story of the event – what you’re doing and why. They need to be able to do this in less than 15 seconds and know who to take visitors to in the hack to continue the story. Find out what each visitor is interested in and connect them to the amazing hackers in the room.

Hacking in public is accountable

We were hacking for climate change. We had a lot of people talking about climate change. The fantastic thing was that we were sharing the room with the world’s leading scientists. When someone from the public attempted to deny climate change, we were able to point them to the scientists with hard data up their sleeves. Don’t mess with the Met Office – they KNOW their data. But there’s a bigger picture here. It’s a picture of being able to justify what you’re doing. To talk about your idea, take criticism and adapt what you’re doing based on the conversations you are having with the people visiting. If you can respond to 15,000 people asking you why and how – and you adapt to what they’re saying – you’re going to have a pretty robust idea that stands up to future development. You’ve done the market research and public-impact work during development. I’m not sure there are many development processes that can say that.

So what happens next? We’d like to explore this new way of hacking. Maybe it’s not new – maybe you have done this and want to shout ‘hey, we did that first’ or ‘we did that before you were born’… please do, we would love to hear your stories. Our feeling is that this is something really quite new and that it could change how hack-events are run in the future. So… have your flu shot, get trained in public liaison, get a safety cage for your soldering iron and be prepared to find, play, make and talk. Maybe we’ll see you in the public gallery of the Houses of Parliament, collaboratively finding new ways to be more accountable and more democratic. You up for that?

Starlight by Fieldguide

Jon, Tom and Mike from the studio have recently been busy with their Fieldguide project. Along with fourth Fieldguide co-founder Pete Thomas and his Uniform colleague Martin Skelly, they created StarLight, an interactive lighting installation using space data. StarLight is a concept first seeded by Jayne Wallace and James Thomas from the University of Northumbria.

StarLight is a collaboration between Fieldguide and the Swedish lighting manufacturer Wästberg. In 2009, NASA launched the Kepler space observatory to look at the light from far-off stars and interpret their flickering and pulsing in order to discover habitable planets. StarLight uses NASA data to allow people to replay the light that originated from stars light-years away, giving people a sense of connectedness to these stars and encouraging them to dream of far-off worlds. Wästberg works with the most renowned architects and designers, combining aesthetic sensibility with a Swedish engineering mentality. Their products have won over 40 awards for excellence and they are a leading player in the future of lighting design.
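
To give a flavour of how simple the replay end of this can be, here is a minimal Arduino-style sketch of the idea. It is not the production StarLight electronics: it assumes the Kepler light curve has already been downsampled offline into brightness values between 0 and 255, and the pin and playback rate are illustrative.

================================================

// Minimal sketch: replay a pre-processed Kepler light curve as lamp brightness.
// ASSUMPTION: the flux[] values are placeholders; in practice they would be
// produced offline by normalising a star's flux readings to the 0-255 range.

const int LAMP_PIN = 9; // PWM-capable pin driving the lamp's LED

// A short, made-up excerpt of a normalised light curve (0 = dark, 255 = full):
// note the dip in the middle, the signature of a planet transiting its star.
const byte flux[] = {200, 202, 199, 201, 150, 120, 148, 198, 203, 200};
const int numSamples = sizeof(flux) / sizeof(flux[0]);

void setup() {
  pinMode(LAMP_PIN, OUTPUT);
}

void loop() {
  // Step through the light curve so a viewer watches the star's flicker.
  for (int i = 0; i < numSamples; i++) {
    analogWrite(LAMP_PIN, flux[i]);
    delay(500); // playback rate: one sample every half second
  }
}

================================================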

The launch event is on Thursday 19th September from 6pm onwards at the Imperial College reception.

Images to follow…

Space Issue of Fieldguide at findplaymake.com
starlight1

 

starlight2

Unlimited Space Agency and Bare Conductive at Greenman


Jon Spooner, Head of Human Spaceflight from the Unlimited Space Agency approached us this year to see if we could do something for Einstein’s Garden at the Greenman festival.

Together we came up with a workshop that let attendees connect paintings created with the amazing Bare Conductive paint to the International Space Station. The paint conducts electricity, which meant each painting could have a bunch of LEDs glued on that flash when powered.

We created a giant space-communicating antenna that all the paintings were hung on. As the International Space Station flew over, the antenna went live and the paintings all started to light up!
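
We won’t reproduce the whole build here, but the flyover logic is easy to sketch. The version below is an illustration rather than the code we ran on the day: it assumes a networked device streams the ISS’s position to an Arduino over serial as “lat,lon” lines, and the site coordinates, range and pin numbers are all placeholder values. It also includes the two key switches on the mission control unit that let us override the system for testing.

================================================

// Sketch of the flyover logic: go live when the ISS is within range of the
// festival site, or when both test-override key switches are turned.
// ASSUMPTIONS: position arrives over serial as "lat,lon" lines from a
// networked device; coordinates, range and pin numbers are illustrative.

const int ANTENNA_PIN = 8;     // powers the antenna and the paintings' LEDs
const int KEY1_PIN = 2;        // the two mission-control override key switches
const int KEY2_PIN = 3;
const float SITE_LAT = 51.95;  // rough festival latitude (placeholder)
const float SITE_LON = -3.42;  // rough festival longitude (placeholder)
const float RANGE_KM = 1000.0; // how close counts as "flying over"

// Great-circle distance between two points in km (haversine formula).
float distanceKm(float lat1, float lon1, float lat2, float lon2) {
  const float R = 6371.0; // Earth radius, km
  float dLat = radians(lat2 - lat1);
  float dLon = radians(lon2 - lon1);
  float a = sin(dLat / 2) * sin(dLat / 2) +
            cos(radians(lat1)) * cos(radians(lat2)) *
            sin(dLon / 2) * sin(dLon / 2);
  return 2.0 * R * atan2(sqrt(a), sqrt(1.0 - a));
}

bool issNear = false;

void setup() {
  pinMode(ANTENNA_PIN, OUTPUT);
  pinMode(KEY1_PIN, INPUT);
  pinMode(KEY2_PIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  if (Serial.available()) {
    // Parse one "lat,lon" position report from the networked device.
    float issLat = Serial.parseFloat();
    float issLon = Serial.parseFloat();
    issNear = distanceKm(SITE_LAT, SITE_LON, issLat, issLon) < RANGE_KM;
  }
  // Both keys turned = test mode, regardless of where the station is.
  bool testOverride = digitalRead(KEY1_PIN) && digitalRead(KEY2_PIN);
  digitalWrite(ANTENNA_PIN, (issNear || testOverride) ? HIGH : LOW);
}

================================================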

This would not have been possible without the amazing support of Bare Conductive!

header

blog1
Team UNSA preparing for a new influx of agents.
blog2
Mission control. The screen shows the proximity of the International Space Station. The control unit up top has two key switches to override the system and allow for testing.
blog3
UNSA’s Head of Human Spaceflight Jon Spooner looks worried just before we power up the antenna.
blog5
Everything goes live and the LEDs all start flashing!
blog6
The cable made to connect paper to the antenna.

 

Unlimited Space Agency using Bare Conductive at Greenman 2013 from michael shorter on Vimeo.

 

Interactive Newsprint

title_v2

 

Interactive Newsprint (INP) has just come to an end (April 2013). It was an 18-month EPSRC-funded project that involved academics from UCLan, Dundee and Surrey, and the project’s technical partner Novalia.

It was a co-design project: we worked with the people of Preston to explore the future of print and journalism through the media of traditional print and an emerging technology, paper electronics. Back in November 2011 we began the process with a series of workshops in Preston, demonstrating where the technology currently was and using it as an opportunity to ask and discuss how paper electronics could work within Preston communities and where improvements needed to be made to interactions and to the technology specification.

inpworkshop1

inpworkshop3

Below is an introduction video to Interactive Newsprint, including user feedback, from our lead partner at UCLan:


We had the pleasure of working with some great people from around Preston and a few further afield.  Below are a couple of examples of the projects we worked on:

We first showed the Lancashire Evening Post and Outsiders at the London Design Festival with Fieldguide. The LEP and Fieldguide went on to be announced as a highlight of the festival by Blueprint magazine.

inpfg1

Artist – Garry Cook and his project ‘Outsiders’

inpfg2

The examples above are all connected to the internet. This affords some super-interesting possibilities and functions, two of which are updatable audio and analytics. Below is a video sketch (may need to be fullscreen and HD) that demonstrates our early thinking around analytics:
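
While the video tells the full story, the analytics idea itself is simple enough to sketch in code. The sketch below is an illustration of the principle rather than the project’s actual electronics (Novalia’s printed hardware handled this for real): a printed region of the page is read as a capacitive touch pad, and each touch is reported over serial as an event for a connected device to upload. The pins, threshold and story label are all assumptions.

================================================

// Illustration only: count touches on a printed region of a page and emit an
// analytics event per touch for a connected device to upload.
// ASSUMPTIONS: uses the Arduino CapacitiveSensor library; the pins, the
// threshold and the "outsiders" story label are placeholders.

#include <CapacitiveSensor.h>

CapacitiveSensor pad(4, 2);       // send pin 4, receive pin 2 via the printed trace
const long TOUCH_THRESHOLD = 500; // tune per print run and trace length

long touchCount = 0;
bool wasTouched = false;

void setup() {
  Serial.begin(9600);
}

void loop() {
  bool touched = pad.capacitiveSensor(30) > TOUCH_THRESHOLD;
  if (touched && !wasTouched) {
    // Rising edge = one new touch: report a simple analytics event.
    touchCount++;
    Serial.print("{\"story\":\"outsiders\",\"touches\":");
    Serial.print(touchCount);
    Serial.println("}");
  }
  wasTouched = touched;
  delay(20); // debounce / sampling interval
}

================================================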

Our most recent, and last, outing with Interactive Newsprint was to SXSW in Austin in March. We were invited to demonstrate web-connected paper at the Mozilla, Knight Foundation, SoundCloud and MIT Media Lab party, which went down really well. The following day, Paul Egglestone and I were joined on our panel ‘Pitchforks and Printed Electronics’ by Nick White of the Daily Dot and Garrett Goodman of Worldcrunch to discuss the future of print, journalism and paper electronics, and how they could affect everything from community journalists through to global news organisations.

inpsx1

inpsx3

inpsx2

Much more information on Interactive Newsprint can be found at http://interactivenewsprint.org/

Additionally, in the near future, our research, findings and impact will be written up by the partners in a number of papers.  These will be announced soon.

UNSA – Jon Spooner’s next steps to Space

During the International Space Apps Challenge in Exeter the Product Research Studio focused on helping Jon Spooner, an aspiring astronaut from the Unlimited Space Agency, get a few steps closer to his adventure into space.

Last year at the Space Apps Challenge the team made a mini version of Jon Spooner in the hope he could go up to space in an astronaut’s pocket. Unfortunately this was never realised. So to help mini-Jon get a bit closer to space we decided to make him a rocket with an on-board computer to run some test flights monitoring his environment. Jon’s new rocket was 3D printed and filled up with mini-Jon and a Texas Instruments CC2541 SensorTag.

More to follow…
UNSA11

UNSA9

UNSA6

UNSA5

UNSA4

UNSA3

UNSA14

UNSA15

UNSA1

Playing Paper

pp1
Playing Paper was created for the Paper exhibition put on by Analogue Social. The exhibition ran from 9th April to 12th May at the Lighthouse, Glasgow.

Nine designers were chosen to create a project using paper for the exhibition: Kate Colin, Kerr Vernon, Kerry McLaughlin, Lisa Catterson, Alan Moore, Jemima Dansey Wright, David Ross, Craig McIntosh and myself, Mike Shorter.

I decided to propose an evolution of The Invite. In creating The Invite I learned a lot about how to screen print with Bare Conductive’s carbon-based ink. I saw this project as an opportunity to deploy all the things I had learnt, from graphic limitations to ink dilution.

Playing Paper consisted of three artworks, with the aim of showing that paper electronics doesn’t have to be all circuit diagrams but can be much more artistic. This was achieved by having the artworks display three different levels of circuit: one very technical and traditional, one with hand-drawn components and the other with instruments drawn. When plugged into the mixer, the three artworks all turned into musical instruments, all creating the same sounds. Three printed buttons at the bottom of the page gave off different drum sounds when touched, and a distance sensor printed at the top changed pitch the closer you got to the ink.

pp2

To make the screen printing much less stressful, the Bare Conductive paint was diluted with water (roughly 4 parts paint to 1 part water). This made the paint less sticky, so it flowed better on the screen, and it also slowed the paint’s drying in the screen.

pp3

With this new diluted version of the paint the printing became much easier and we were able to print over 50 artworks with about 100ml of Bare Conductive. Last time we printed with Bare Conductive we made the trace thickness 1pt in Illustrator, which made it far too easy to mess up a print if enough pressure wasn’t applied – let alone with the paint drying in the screen. This time the traces were all 2pt, which led to a 100% success rate!

pp6

I decided to print out a bunch of the artworks in a couple of different colours. I was planning on exploring how other colours would print on top of the Bare Conductive but forgot due to the excitement of the printing going much better than expected! They look pretty good as a collection.

pp4

I wanted the sound output to be a bit more exciting than that of The Invite, so I decided to add an MP3 Trigger to the electronics inside the mixer. This allowed the three buttons along the bottom to control drum samples, and it also meant that the ‘theremin’ sound was a lot smoother. Playing Paper still used the bespoke bulldog clip that was used with The Invite. This method remains the best way I have seen to connect paper to other devices.
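
For the curious, here is a rough sketch of the logic inside the mixer. It is not the exact code from the show: the pins, touch thresholds and note range are assumptions, the pitch here comes from the Arduino tone() function rather than the MP3 Trigger, and it presumes the three drum samples sit on the MP3 Trigger’s first three (active-low) trigger inputs.

================================================

// Sketch of the Playing Paper mixer: three printed buttons fire drum samples
// on an MP3 Trigger, and a printed 'distance sensor' bends a theremin pitch.
// ASSUMPTIONS: pins, thresholds and the note range are illustrative.

#include <CapacitiveSensor.h>

CapacitiveSensor drum1(4, 2);   // the three printed buttons, read through
CapacitiveSensor drum2(4, 3);   // the bespoke bulldog clip
CapacitiveSensor drum3(4, 5);
CapacitiveSensor distPad(4, 6); // larger printed pad acting as the distance sensor

const int TRIG_PINS[] = {7, 8, 9}; // wired to the MP3 Trigger's trigger inputs
const int SPEAKER_PIN = 10;
const long TOUCH_THRESHOLD = 500;

void setup() {
  for (int i = 0; i < 3; i++) {
    pinMode(TRIG_PINS[i], OUTPUT);
    digitalWrite(TRIG_PINS[i], HIGH); // the trigger inputs are active low
  }
}

// Pulse one MP3 Trigger input low to play the corresponding drum sample.
void fireDrum(int i) {
  digitalWrite(TRIG_PINS[i], LOW);
  delay(50);
  digitalWrite(TRIG_PINS[i], HIGH);
}

void loop() {
  if (drum1.capacitiveSensor(30) > TOUCH_THRESHOLD) fireDrum(0);
  if (drum2.capacitiveSensor(30) > TOUCH_THRESHOLD) fireDrum(1);
  if (drum3.capacitiveSensor(30) > TOUCH_THRESHOLD) fireDrum(2);

  // The nearer a hand gets to the printed sensor, the higher the reading;
  // map that onto an audible pitch.
  long nearness = distPad.capacitiveSensor(30);
  if (nearness > 100) {
    tone(SPEAKER_PIN, map(constrain(nearness, 100, 5000), 100, 5000, 200, 1500));
  } else {
    noTone(SPEAKER_PIN);
  }
}

================================================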

pp5

The preview of the show at the Lighthouse was absolutely packed! Playing Paper was in constant use, as the grubbiness of the prints illustrates – two hours of being constantly touched by excited fingers…

Below is a quick little video of Playing Paper at the Lighthouse on the preview night. After the show has finished I plan to upload a video with real-time sound…

Playing Paper from michael shorter on Vimeo.

Mike is Presenting at the Electric Bookshop Late Lab


electric-logo_yellow

 

Mike Shorter from the Product Design Research Studio will be presenting his thoughts on the future of paper tomorrow night. This event is part of the Edinburgh Science Festival and is being organised by the brilliant Electric Bookshop at Inspace.

There is a great line-up for this event. Mike will be talking alongside:

Ian Sansom, author of the amazing book Paper: An Elegy.

Alyson Fielding, an artist who hacks books, stories and Arduinos.

And finally Yvette Hawkins, a paper artist who makes wonderful artworks and sculptures out of paper.

Check it out here

With this diverse collection of speakers there are going to be some great conversations!

 

Mini Mars Rover

rover1

The Mini Mars Rover was built for NASA’s International Space Apps Challenge by Mike, Tom and Ali. It will move in exactly the same pattern as its big brother on Mars, and it will also display other live data, such as sound and images. It was built to roam around the home, letting users feel a connection to the Mars Rover as it explores alone. We want to take boring data and make it tangible and exciting.

The Mini Mars Rover is an internet-controlled robot. He comprises a Wild Thumper chassis, an Arduino and an Electric Imp.

rover2

The Mini Mars Rover began life with a new laser-cut acrylic body built onto the Wild Thumper 6WD chassis and some more suitable wheels.

rover3

The Mini Mars Rover has just come back from an eventful time at SXSW Interactive in Austin, TX. Not only was he found driving around the Space Meet Up event, he also featured on the Making Space Data Real on Earth panel, hosted by Ali Llewellyn from the NASA Open Government Initiative, David McGloin and Jon Rogers from the University of Dundee, and Jayne Wallace from Northumbria University.

Below are some videos of the rover in action….

On Mashable – http://mashable.com/2013/03/10/sxswi-day-3/

On the Global Post – http://www.globalpost.com/dispatch/news/regions/americas/united-states/130311/sxsw-interactive-video

rover4

The Mini Mars Rover can be controlled from this URL:

socialdigital.dundee.ac.uk/~ali/php/rover/ 

rover5
Team NASA (including an astronaut) get behind the Mini Mars Rover

Here’s some code for you…

Squirrel for the imp (adapted from an online source which I can no longer find…):

 

================================================

server.show("");

// Remote control for the rover: each value received over HTTP pulses one of
// four output pins, which the Arduino reads as movement commands.
ledState <- 0;

// Helper left over from development (not called below): toggles pin 9.
function blink()
{
    // Change state
    ledState = ledState ? 0 : 1;
    server.log("ledState val: " + ledState);
    // Reflect state to the pin
    hardware.pin9.write(ledState);
}

// Input class for the control channel
class inputHTTP extends InputPort
{
    name = "power control"
    type = "number"

    function set(httpVal)
    {
        server.log("Received val: " + httpVal);

        if (httpVal == 1) {
            hardware.pin9.write(1);
            imp.sleep(0.1);
            hardware.pin9.write(0);
        }
        else if (httpVal == 2) {
            hardware.pin8.write(1);
            imp.sleep(0.1);
            hardware.pin8.write(0);
        }
        else if (httpVal == 3) {
            hardware.pin2.write(1);
            imp.sleep(0.1);
            hardware.pin2.write(0);
        }
        else if (httpVal == 4) {
            hardware.pin1.write(1);
            imp.sleep(0.1);
            hardware.pin1.write(0);
        }
    }
}

// Optional heartbeat: log every 60 seconds so we can see the imp is alive.
function watchdog() {
    imp.wakeup(60, watchdog);
    server.log("watchdog alive");
}

// start the watchdog
//watchdog();

// Configure pins as open-drain outputs with internal pull-ups
hardware.pin9.configure(DIGITAL_OUT_OD_PULLUP);
hardware.pin8.configure(DIGITAL_OUT_OD_PULLUP);
hardware.pin2.configure(DIGITAL_OUT_OD_PULLUP);
hardware.pin1.configure(DIGITAL_OUT_OD_PULLUP);

// Register with the server
imp.configure("Remote Control for Rover", [inputHTTP()], []);

================================================

Arduino code (thanks Chris Martin!)…

================================================

 

/*
  Mini Mars Rover drive code.
  Reads four digital command pins (pulsed by the Electric Imp) and drives the
  left and right motor H bridges on the Wild Thumper chassis accordingly.
*/

int pinf = 2;  // "forward" command pin
int pinl = 12; // "left" command pin
int pinr = 10; // "right" command pin
int pinb = 9;  // "back" command pin

#define LmotorA 3  // Left motor H bridge, input A
#define LmotorB 11 // Left motor H bridge, input B
#define RmotorA 5  // Right motor H bridge, input A
#define RmotorB 6  // Right motor H bridge, input B

// the setup routine runs once when you press reset:
void setup() {
  pinMode(pinf, INPUT);
  pinMode(pinl, INPUT);
  pinMode(pinr, INPUT);
  pinMode(pinb, INPUT);

  digitalWrite(pinf, LOW);
  digitalWrite(pinl, LOW);
  digitalWrite(pinr, LOW);
  digitalWrite(pinb, LOW);

  // initialize serial communication at 9600 bits per second:
  Serial.begin(9600);
}

// Stop both motors.
void stopMotors() {
  analogWrite(RmotorA, 0);
  analogWrite(RmotorB, 0);
  analogWrite(LmotorA, 0);
  analogWrite(LmotorB, 0);
}

// Drive the H bridges with the given PWM values for ms milliseconds, then stop.
void pulseMotors(int rA, int rB, int lA, int lB, int ms) {
  analogWrite(RmotorA, rA);
  analogWrite(RmotorB, rB);
  analogWrite(LmotorA, lA);
  analogWrite(LmotorB, lB);
  delay(ms);
  stopMotors();
}

// the loop routine runs over and over again forever:
void loop() {
  // read the four command pins:
  int sensorValue1 = digitalRead(pinf);
  int sensorValue2 = digitalRead(pinl);
  int sensorValue3 = digitalRead(pinr);
  int sensorValue4 = digitalRead(pinb);

  // print out the values you read:
  Serial.print(sensorValue1);
  Serial.print(" : ");
  Serial.print(sensorValue2);
  Serial.print(" : ");
  Serial.print(sensorValue3);
  Serial.print(" : ");
  Serial.println(sensorValue4);
  delay(25); // delay in between reads for stability

  if (sensorValue1 == 1) {
    pulseMotors(0, 120, 0, 120, 500); // pattern for the pinf command
  } else {
    stopMotors();
  }

  if (sensorValue2 == 1) {
    pulseMotors(0, 250, 250, 0, 100); // pattern for the pinl command (spin)
  } else {
    stopMotors();
  }

  if (sensorValue4 == 1) {
    pulseMotors(250, 0, 0, 250, 100); // pattern for the pinb command (spin the other way)
  } else {
    stopMotors();
  }

  if (sensorValue3 == 1) {
    pulseMotors(120, 0, 120, 0, 500); // pattern for the pinr command
  } else {
    stopMotors();
  }
}

================================================

 

Supported by New Media Scotland’s Alt-w Fund

Alt-w Square k

 

SXSW 2013 – Make Space Data Physical

Making data physical means that more people can access it in more ways. Taking data from the screen and making it do things in the real world dramatically increases its potential impact. And as far as touching data that can never be touched goes, space and time have to be it. Things that are far away and things that are lost in time are two physical barriers we simply can’t cross – and we want to find a way to do just that.

For me, one of the most dramatic pieces of data I have come across is the chunks of wall missing from the Victoria and Albert Museum on Exhibition Road in London. I used to walk past them every day and wondered why no one had filled them in – the V&A being one of the most precious buildings we have. Then one day I saw the small plaque where these words are carved in stone:

The damage to these walls is the result of enemy bombing during the blitz of the Second World War 1939-1945 and is left as a memorial to the enduring values of this museum in a time of conflict.

I could literally touch the holes that shrapnel from the bombs had blown in the wall. I could touch the data. I held my breath, and for that moment I was there – a far more real experience of the war than any I had previously had. The data from an event 70 years ago had touched me when I touched it. I had travelled through time.

So can we do this with space data? Can we build connections between people here on earth that reach across the vastness of space – to far-off stars – and across the vastness of time – to the very origins of the universe itself? We hope we can! Which is why, when Ali Llewellyn from Open NASA got in touch a year ago, we jumped at the chance to work with her and her team on making space data real here on earth.

We’ve been hacking together examples as demonstrators, or starting points, of new ways to connect people to space data. To give an example: if you put people’s emotions first, then loneliness, rather than measured distance, is a great way to connect people to space. Rather than thinking of the Mars Rover or Voyager as machines sending data over distance, you start to think that they are out there all alone, on cold dark planets or at the far reaches of space. Forever alone. It is this starting point that can start to make data more human. To make data a thing we want to love… Then we start to connect people to their loneliness… What would this mean? What could we design?

Our friend the jeweller Jayne Wallace is on our panel, and her take on this is about the way data we receive now was generated in the past, possibly billions of years ago – right back at the origins of the universe itself.

Our lives have a pace to them and time is both something we crave more of yet know has an ultimately finite quality for us. Our interactions with the digital are quickening our pace of life and altering not only the texture of days and years but also how we value, measure and perceive the passing of time. But there are things that are bigger than us, things older than we can imagine, things that give our atomized view of life and the time we have a very different perspective and we simply have to look up to start to engage them. We want to explore what it would mean to use digital technologies and space data to subvert our relationship with time and bring fresh potential to the digital objects we live with and through. Through design we can use space data to create ways to experience now things that occurred before humankind existed, we can read by the light of a lamp connected to the live feed from a telescope and know that when it flickers a new planet has been discovered and we can connect to the orbits and rhythms of planets through objects that gently respond to these different cycles and be reminded that we are part of something much greater, much faster, much slower and much more fascinating than our atomized lives sometimes allow us to consider.

We’re also sharing our panel with someone who has more than a little knowledge and authority on the science behind all of this – David McGloin (whom many of you will know as @dundeePhysics). His team of undergraduates, high school teachers and pupils has been exploring ways to connect to the dark side of the moon, for use in teaching ways of conducting optical measurements of space objects.

“We know that space is one of the things that most inspires high school students to study subjects such as physics at university, but it’s clearly a challenge to get hands-on practical work while still at school. Our project is an example of how we can use space data to try and make a more physical and immediate connection to the subject.”

Heading up our panel is Open NASA’s very own Ali Llewellyn. I asked her what excites her about making space data physical.

“From the time I was a child, I wanted to touch the stars. I wanted to walk on other planets with my own feet and fly a spaceship with my own hands. While I am not an astronaut, and there isn’t yet a human presence on Mars – making space data physical enables me to get closer than most humans can yet get to those opportunities. My work at NASA in open innovation and mass collaboration is dedicated to exactly this: enabling everyone on planet Earth to contribute directly and substantially to the exploration mission.”

I then asked her what possibilities she thought this approach presents.

“This approach inspires everyone with the wonder of exploration by making the data engaging and allowing it to inform a new context. (Who doesn’t want to drive Curiosity or touch the sun?) This approach democratizes exploration for all citizens – making what we are learning in space accessible to everyone on planet Earth. This approach encourages new approaches and opportunities to the challenges we face in improving life on our planet and taking our species off-planet. This approach extends the usefulness of space data. While the data often had one initial research purpose, we are “recycling” it for other applications and uses, especially in new contexts.”

And why getting our hands ‘in’ data is an amazing thing.

“In the time it took you to read this sentence, NASA gathered approximately 1.73 gigabytes of data from our nearly 100 currently active missions! We do this every hour, every day, every year – and the collection rate is growing exponentially. Handling, storing, and managing this data is a massive challenge. Our data is one of our most valuable assets, and its strategic importance in our research and science is huge. We are committed to making our data as accessible as possible, both for the benefit of our work and for the betterment of humankind through the innovation and creativity of the over seven billion other people on this planet who don’t work at NASA. What would become possible if everyone could not just access but remix and reuse the images, maps, metrics and lessons learned from this amazing trove of observation?”

 

Get Physical: Making Space Data Real On Earth

With Ali Llewellyn (Open NASA), David McGloin (University of Dundee), Jayne Wallace (University of Northumbria) and myself (Jon Rogers)

11am Monday March 12th in Omni Downtown, Lone Star

http://schedule.sxsw.com/2013/events/event_IAP5183

Hope to see you there!

Thank you to: New Media Scotland, Open NASA, RCUK, University of Dundee, and Northumbria University