Monthly Archives: November 2011

In Interview: Alan Winfield

Professor Alan Winfield

Alan Winfield is Professor of Electronic Engineering and Director of the Science Communication Unit at the University of the West of England, Bristol. Alan will be on hand to discuss the cultural relevance and impact of swarm robotics at Robotville.

How did you become involved in robotic research?

Like many things in life, there was a lot of luck involved. Although I have always been fascinated by robots, I didn’t actively study them until I came to Bristol 20 years ago. I was lucky then because, firstly, I had a chance to set up a new research group and, secondly, I met two other people who were also interested in robots. Together we started the robotics lab. We were also lucky because we managed to win the money (grants) to do robot research projects – without the funding the lab would have been very short-lived.

Then, during the last 20 years my interest in robotics has changed, so that now I’m much more interested in basic scientific questions – what is intelligence, how do animals evolve, how does culture emerge, and so on – and I use robots to try to answer (in a small way) those questions. So what makes me want to study robots now is a deep interest in some of the big questions of life.

What are swarm robots and what is it about them that fascinates you?

A robot swarm is a collection of relatively simple robots that interact with each other and the environment in ways that are inspired by the behaviour of social insects. Complex group behaviours, like flocking, foraging for food, or nest building can emerge from the micro-interactions of lots of individuals.

I’m fascinated by swarm robotics for two reasons. Firstly, future real-world applications for tasks as wide ranging as robotic agriculture, waste processing and recycling, search and rescue, or planetary exploration and colonisation are likely to use swarms of robots. And secondly, by building swarms of robots we can start to understand how the processes of emergence and self-organisation work, and how to engineer systems using these mechanisms.

Do you see a direct relationship between the swarm behaviour of robots and the herd-like relationships of humans?

Yes, social insects (and the robots inspired by them) are not the only animals that show swarm intelligence. The flocking of birds, shoaling of fish and herding of mammals are all examples of the same kind of group behaviour. Humans are complicated, of course, but some aspects of human crowd behaviour are almost certainly swarm-like.

What are the ultimate goals and objectives attributed to swarm robotics?

The ultimate goal of swarm robotics is to be able to engineer safe and reliable robotic swarms for real-world applications. As I mentioned in the answer above, there are many challenging real-world (and off-world) tasks that would benefit from a swarm robotics approach – basically, any task that is distributed in physical space, where it would be better to have multiple robots than just one.

Reaching this goal requires the solution of a number of difficult technical problems. One is how to design the behaviours of the individual robots so that when you put all the robots together in their working environment, they actually self-organise to complete the required task. The second challenge is how to design a human-swarm interface – in other words how would a human operator control and monitor a large swarm of robots. The third challenge is how to prove that the swarm of robots will always do the right thing, and never the wrong thing! And the final challenge is getting this all working in real-world applications so that people become confident in swarm robotics technology.

Our view of robots is shaped by books and film – do you think this is helpful or misleading?

Good question! I think robots in science fiction are both helpful and misleading. Helpful because many roboticists, myself included, were inspired by science fiction, and also because SF provides us with some great examples of ‘thought experiments’ for what future robots might perhaps be like – think of the movie AI, or Data from Star Trek. (Of course there are some terrible examples as well!)

But robots in science fiction are misleading too. They have created an expectation of what robots are (or should be) like that means that many people are disappointed by real-world robots. This is a great shame because real-world robots are – in many ways – much more exciting than the fantasy robots in the movies. And the misleading impression of robots from SF makes being a roboticist harder, because we sometimes have to explain why robotics has ‘failed’ – which of course it hasn’t!

In Interview: Peter McOwan

Professor Peter McOwan

Meet roboticist Peter McOwan, Professor of Computer Science in the School of Electronic Engineering and Computer Science at Queen Mary, University of London. At Robotville Peter will be showcasing software he has built that helps robots understand our emotions.

How did you become involved in robotic research?

My interest started in artificial intelligence: understanding the brain using maths, then seeing how I could use that maths in a robot to help give it human-like abilities. There’s still a long way to go!

Your software teaches robots to understand emotions, but how long until a robot shows preference, or “falls in love” with someone?

Ah well, ‘what is love’? Love shows in the way you feel and act; it’s how you take in information about the object of your desire and react to them. Arguably, then, love is really just a special type of brain information processing. Deep inside our brains, when we fall in love, the nerve cells change the way they connect and signal to each other. Perhaps in the future we will understand how this happens and perhaps be able to build a machine that can perform this complex information processing task. At present our robot can read your expressions and it can be programmed to respond to them, smiling back at you when you smile at it (awww, bless). The robot can even ‘know’ who you are and pull an especially sweet-looking face when you look and smile its way, but would that be love? Not quite yet, I expect.

Would that ever lead them to reprioritise or even change their programming?

If falling in love changes the way a human acts, the way their brain cells interact, then mimicking this in a computer could cause the same type of effect. But robots are basically complicated mechanical and electronic tools. Would it be useful for your vacuum cleaner to recognise you? Possibly yes, so it can’t be stolen. But would it be useful for your vacuum cleaner to have a crush on you? Probably not.

In 2008, Nokia developed an anthropomimetic robot with an ‘imagination’. What further developments have there been in this field and are we any closer, or have we achieved a robot with a stream of consciousness? How does your emotion software sit within this field? Submitted by Luke

Our system detects the changes in human faces when we make expressions. Often we make expressions to signal our emotions to other humans; it’s a kind of useful social signalling code. Our robots then take the facial expression and react to it according to a set of rules we have programmed. At present it’s as simple as that: that’s all our robots need to do to be useful, ‘slightly socially aware’ tools.
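The rule-based response Peter describes can be pictured as a simple lookup from a detected expression label to a programmed behaviour. The sketch below is purely illustrative – the labels and behaviour names are hypothetical, not those used by the actual Queen Mary software:

```python
# A toy illustration of rule-based expression response: a vision system
# labels a facial expression, and a fixed rule table maps that label to
# a robot behaviour. All names here are hypothetical examples.
RESPONSE_RULES = {
    "smile": "smile_back",
    "frown": "look_concerned",
    "surprise": "raise_eyebrows",
    "neutral": "idle",
}

def respond_to_expression(expression: str) -> str:
    """Return the programmed behaviour for a detected expression.

    Unrecognised expressions fall back to an idle behaviour, so the
    robot always has something sensible to do.
    """
    return RESPONSE_RULES.get(expression, "idle")
```

The hard part, of course, is the computer-vision step that produces the label in the first place; the response table itself really can be this simple.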

Of course looking at the outside of the robot it’s easy for humans to believe there is a lot more going on, our brains love to process faces and social signals and build stories, making things seem more ‘human’ than they are. We all assume that there is a stream of consciousness going on in others when we observe them, and that imagination is in there too playing a part. All of this comes from the electrochemical signals swirling in our brains, and no one really understands how that all works yet, so it would be difficult to build a robot to mimic it directly.

Of course, we can build a computer program that takes input patterns and looks for similar patterns in stored or previously learned patterns, or just creates new random patterns to output, and call it ‘imagination’; in some way it is doing a bit of what our imagination does. These sorts of experiments are useful because they let us test our understanding, and perhaps over time, as new facts about the brain emerge, we can refine these to make them more human. But it’s a long road ahead.

Apple’s recent iPhone update includes what some are claiming to be the first consumer robot. How relevant is this and do you think this idea of robots in our pockets will have a wider impact on robotic research and developments?

Robots = tools: if it’s useful for us to have robots in our pockets, or on our phones, even if they have limited abilities, they can be developed. What we can cram onto the processors in phones today is limited, but as the technology improves we will be able to do more. In the end, though, what will drive it is the need for that tool to help us do something useful.

What are the ongoing cultural implications of your latest research and how do you think they will affect everybody’s day to day lives?

The view of robots differs in different parts of the world. In Japan, for example, where robots are big business, they are seen as a positive thing. In the West there are more mixed feelings, perhaps coloured by the frequent portrayal of robots as sinister baddies in movies and TV shows. As part of our research project LIREC we are looking at people’s concerns, hopes and fears for the technology that we are developing, and that’s important. This way we can design robots so that they become beneficial technological aids to improve human life – tools to make things better.

In Interview: Nick Hawes

Nick Hawes

Meet roboticist Nick Hawes, a lecturer in the School of Computer Science at the University of Birmingham. He is bringing his robot, Dora the Explorer, to our Robotville festival.

How did you become involved in robotic research?

When I was at school I didn’t really know what I wanted to do with my life, so I chose to study Artificial Intelligence (at the University of Birmingham) because I didn’t really believe that it was a real subject. I quickly fell in love with the idea of building intelligent systems, so stayed in Birmingham to do a PhD in AI for video games. It was only a few years later when I joined an ambitious project that wanted to use some of the ideas from my PhD to help build a cognitive robot that I moved away from virtual worlds and built things that ran in the real world. From then on I was hooked on robotic research.

How did you become involved in using robots for exploration?

We were interested in studying how robots could automatically extend their own knowledge about their worlds. We already had a robot that could build maps of its environment, but it had to be directed by a human using a joystick. Therefore it seemed like a natural extension to see if we could get the robot to take over from the human and guide the exploration itself.

What can Dora do?

The Dora robot can explore space to build up a map. It can choose where to explore based on how likely it is that a particular direction of exploration will produce interesting results (like previously unknown rooms). Dora can also search for objects, either in order to tell a human where they are, or to work out what kind of rooms are in the building (e.g. if it sees a kettle it will realise it is in a kitchen). Dora can also interact with humans in a limited way, by asking questions about the location of objects.
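Choosing where to explore, as Nick describes, is often done with frontier-based exploration: the robot scores the boundaries between mapped and unmapped space by expected information gain against travel cost. The sketch below illustrates that general idea only – the scoring weights are made up for the example and are not Dora’s actual values:

```python
# A sketch of frontier-based exploration: score each candidate
# "frontier" (an edge between mapped and unmapped space) by expected
# information gain minus travel cost, and head for the best one.
# The weights are illustrative, not taken from the Dora system.
import math

def choose_frontier(robot_pos, frontiers, gain_weight=1.0, cost_weight=0.5):
    """Pick the frontier with the best gain-versus-cost trade-off.

    frontiers: list of (x, y, expected_unknown_cells) tuples, where
    expected_unknown_cells estimates how much new map the robot would
    see from that frontier.
    Returns the (x, y) of the chosen frontier, or None if the map is
    fully explored.
    """
    best, best_score = None, -math.inf
    for x, y, expected_cells in frontiers:
        travel_cost = math.hypot(x - robot_pos[0], y - robot_pos[1])
        score = gain_weight * expected_cells - cost_weight * travel_cost
        if score > best_score:
            best, best_score = (x, y), score
    return best
```

With a scoring rule like this, a distant doorway that promises a whole unknown room can beat a nearby frontier that would only reveal a cupboard.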

Dora the explorer robot

Do you think robots like Dora will become a household necessity?

I’m not sure they will become a necessity (unless we all become unable to care for ourselves!), but they will certainly become a luxury item in the future. It will be possible to order them from an online electronics store one day and have them arrive and start work the next. The big problem will then be training your robot to understand the layout of your home and the routines of your household. This is where we hope our work on exploration and curiosity will make people’s lives easier, as the robot will be motivated to learn things for itself, rather than wait to be taught everything.

Are the costs of transporting robots too prohibitive to send one to Mars? Submitted by Steve

The answer has to be no, as NASA has already sent two robots to Mars: Spirit and Opportunity. This has cost in excess of $900 million! One interesting question is not the cost of transport, but the difficulties of controlling robots once they get to Mars. The delay of a radio signal between Earth and Mars can be over half an hour, so this precludes any direct control by a human (just imagine playing a game where every command you send takes 30 minutes to have any effect!). Therefore these robots must be equipped with forms of Artificial Intelligence to allow them to safely make decisions for themselves with only limited human intervention.
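The delay Nick mentions is easy to check: radio travels at the speed of light, so the one-way delay is just the Earth–Mars distance divided by c. A quick sketch, using approximate closest and farthest separations:

```python
# Earth-Mars radio delay: a signal travels at the speed of light, so
# delay = distance / c. The distances below are approximate figures for
# the closest and farthest Earth-Mars separations.
SPEED_OF_LIGHT_KM_S = 299_792.458  # speed of light in km/s

def one_way_delay_minutes(distance_km: float) -> float:
    """One-way signal delay in minutes for a given distance in km."""
    return distance_km / SPEED_OF_LIGHT_KM_S / 60.0

# Closest approach (~55 million km): about 3 minutes one way.
# Farthest separation (~400 million km): about 22 minutes one way,
# so a command and its acknowledgement can take over 40 minutes
# round trip - hence the need for on-board autonomy.
```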

Does the software that is tasked with being curious continuously increase its capacity to explore and make connections? Is it getting more creative? Submitted by Charlie and Jake

I would say that it doesn’t really increase its capacity to explore (except beyond software that isn’t curious at all), but it does have a limitless appetite for exploration. Unless you artificially restrict its mobility, Dora will just keep trying to explore until its batteries run out or it gets stuck somewhere. A few years ago we were demonstrating Dora in a corridor that led to some toilets. Someone had left the door to the women’s toilets open, and before we knew it Dora had wandered off in there. Unfortunately, as our research team was all male at the time, we had to wait for someone female to come along to retrieve Dora!

Our view of robots is shaped by books and film – do you think this is helpful or misleading?

I think it’s mostly a good thing, as fiction (books, films and, increasingly, video games) motivates and inspires a large number of people to work in science and engineering to produce exciting and useful technologies such as robots. The main downside is that the reality of robotics research is a long way from the images portrayed in fiction, so we risk disappointing people when we are unable to provide them with the robots that they imagine are possible. However, as you’ll see at Robotville, the robots that exist now are all really exciting in their own ways, and will only improve as more time and effort is dedicated to them.

Apple’s recent iPhone update includes what some are claiming to be the first consumer robot. How relevant is this and do you think this idea of robots in our pockets will have a wider impact on robotic research and developments?

Whilst there have been consumer robots before Siri (including vacuum cleaners, lawn mowers and toys), the importance of Siri is that it demonstrates that it is possible to produce software that consumers can interact with about a range of tasks in a natural fashion (even if the results are not always perfect). Perhaps the biggest impact of the development of computers in our pockets will be the expectation that all our devices should share information to make our lives easier. For example my phone should know when I’m nearly home so that it can tell a future version of Dora to put the kettle on.

What are the ongoing cultural implications of your latest research?

Robots will change the way we live and work. The change will start from limited, well-defined tasks (such as robots that clean the floor and cars that park themselves) but will gradually become more noticeable across the whole of society. Some of the larger cultural questions we will ultimately need to face include whether we are happy to let robots care for our sick and elderly, and who is responsible when a robot that can learn and make decisions for itself causes some kind of problem.

Question some of the roboticists

Next week over 20 robots will be arriving at the Museum for the Robotville Festival. Over the next few days we will be posting interviews with some of the roboticists to find out more about their research projects and the cultural implications of these latest developments in robotics.

Nao robot from Aldebaran Robotics

This is your chance to submit any questions you would like them to answer. Have a read of their biographies below and submit your questions via the comments.

Alan Winfield:

Alan Winfield is Professor of Electronic Engineering and Director of the Science Communication Unit at the University of the West of England, Bristol. He conducts research in swarm robotics in the Bristol Robotics Laboratory.

A robot swarm is a collection of relatively simple robots that interact with each other and with their environment in ways inspired by the behaviour of social insects. Even though the individual robot behaviours are simple, we see fascinating group behaviours, such as flocking, emerge.

At Robotville Alan Winfield will demonstrate a swarm of miniature two-wheeled mobile robots called e-pucks. Within the swarm, one group of robots is programmed with three simple rules which allow them to flock as a group. Another group artificially ‘evolves’ its movement behaviours by both simulating and sharing solutions with one another as the robots move around.
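Three-rule flocking of this kind is usually some variant of Craig Reynolds’ classic ‘boids’ rules: separation, alignment and cohesion, each computed only from nearby neighbours. The sketch below shows how such rules might look in simulation; it is an illustration of the general idea, not the controller actually running on the e-pucks, and the rule weights are arbitrary:

```python
# A minimal flocking sketch in the spirit of Reynolds' boids rules:
# cohesion, alignment and separation, computed from local neighbours
# only. Weights and radii are arbitrary illustrative values.
import math

def flock_step(positions, velocities, radius=5.0, dt=0.1):
    """Advance every robot one time step using the three local rules."""
    new_pos, new_vel = [], []
    for i, (px, py) in enumerate(positions):
        # Neighbours within sensing radius (excluding self).
        nbrs = [j for j, (qx, qy) in enumerate(positions)
                if j != i and math.hypot(qx - px, qy - py) < radius]
        vx, vy = velocities[i]
        if nbrs:
            # Cohesion: steer towards the neighbours' centre of mass.
            cx = sum(positions[j][0] for j in nbrs) / len(nbrs)
            cy = sum(positions[j][1] for j in nbrs) / len(nbrs)
            vx += 0.01 * (cx - px)
            vy += 0.01 * (cy - py)
            # Alignment: nudge velocity towards the neighbours' average.
            ax = sum(velocities[j][0] for j in nbrs) / len(nbrs)
            ay = sum(velocities[j][1] for j in nbrs) / len(nbrs)
            vx += 0.05 * (ax - vx)
            vy += 0.05 * (ay - vy)
            # Separation: push away from any neighbour that is too close.
            for j in nbrs:
                dx, dy = px - positions[j][0], py - positions[j][1]
                d = math.hypot(dx, dy)
                if 0 < d < 1.0:
                    vx += 0.1 * dx / d
                    vy += 0.1 * dy / d
        new_vel.append((vx, vy))
        new_pos.append((px + vx * dt, py + vy * dt))
    return new_pos, new_vel
```

No robot is told to flock; the group behaviour emerges from each robot repeatedly applying these purely local rules, which is exactly the kind of self-organisation the swarm demonstrations are about.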

Nick Hawes:

Dr Nick Hawes is a lecturer in Intelligent Robotics at the University of Birmingham and will be bringing Dora the Explorer to Robotville. Dora is a mobile robot with a sense of curiosity and a drive to explore the world. Given an incomplete tour of an indoor environment, Dora is driven by internal motivations to probe the gaps in her spatial knowledge.

Dora will actively explore regions of space which she hasn’t previously visited but which she expects will lead her to further unexplored space. She will also attempt to determine the categories of rooms through active visual search for functionally important objects, and through ontology-driven inference on the results of this search.

Peter McOwan:

Peter McOwan is currently a Professor of Computer Science in the School of Electronic Engineering and Computer Science at Queen Mary, University of London. His research interests are in visual perception and mathematical models for visual processing.

Peter will be showcasing software he has built that helps robots understand our emotions. This software will allow us to programme robots to understand how we feel, enabling them to respond to our various moods.

Please submit your questions in the comments below!


Contemporary arts programme

Not everyone knows that we have an ongoing contemporary arts programme, so we thought we would give you a bit of an update about what we have going on in the Museum art-wise.

Our contemporary art programme is now in its 15th year and is going from strength to strength. We exhibit art projects that explore artists’ perspectives on the past, present and future of science and technology and offer ways of thinking about the impact of science within wider cultural contexts.

Our latest offerings include a temporary display of the large-scale photograph ‘In the House of My Father’ by seminal Black British artist Donald Rodney in the Who Am I? gallery, a gallery about human identity.

Our ever-popular Cockroach Tour by maverick Danish artists’ collective Superflex is still in action. ‘Listening Post’ by Ben Rubin and Mark Hansen is an extraordinary ‘portrait of online chat’ played out over a ‘curtain’ of 251 vacuum fluorescent screens, which show live chat fragments in real time.

Coming up, we have the UK premiere of works by innovative art production company Electroboutique (Alexei Shulgin and Aristarkh Chernyshev). Their gorgeously designed broadcast and interactive ‘art products’ encourage participation and new forms of ‘Crititainment’ (entertaining critiques) through what the artists call ‘Creative Consumption’.


Electroboutique work with the languages of pop culture, media and art histories, real-time data processing and custom electronics, framed by a tongue-in-cheek appropriation of corporate marketing speak. ‘Electroboutique pop-up at the Science Museum’ opens on 23 November and runs until 14 February 2012.

Over the next few weeks we will be publishing an interview with Alexei Shulgin as well as running a live Q and A on Twitter.

Over the next year our prize-winning Writer in Residence Mick Jackson will be keeping us up to date with his discoveries in the Museum and next March we are delighted to be hosting a solo show of new works by British artist Suzanne Treister, details of which are to follow shortly.

Post by Hannah Redler, Head of Arts Projects

Robots are taking over the Museum

Robots are taking over the Museum! From Thursday 1 until Sunday 4 December, 23 robots will be setting up home in the Museum as part of our four-day event – Robotville.

Come and meet Concept, a robot created to study how people react to a robotic face and help us understand why we’re drawn to lifelike technology. Watch his expression change as he learns from you and copies your movements.

Or test out Dora the Explorer’s skills. She could be the solution to those misplaced house keys! You can see her in action below.

Many of the robots on display have just come out of European research labs and will be on show to the British public for the first time. Check them out on our website.

The exhibition is divided into six zones, which include areas for domestic robots, swarming robots, humanoid robots and more. And if that wasn’t enough robot geekery for you, the roboticists will also be there to demonstrate their work and answer your questions.

The four-day event explores the cultural impact robots have on our day-to-day lives. The idea of robots in the home is familiar through science fiction, but the reality has yet to truly materialise. Until now…

To find out more about the event, visit our website and keep an eye on our blog. We will be posting interviews with some of the roboticists in the run-up to the event, and we will be asking you to send over any questions you may want them to answer.

What do you use our Hidden Heroes for?

Hidden Heroes – a celebration of everyday things – opened at the Museum last week, and it’s certainly opened our eyes to just how much we use these items, especially every time we open the drawer for a new pen or pop another Post-it note on someone’s desk.

A selection of Hidden Heroes

This realisation also got us talking about the objects’ other uses, their re-appropriation if you like.

Most people thought that they never used paperclips because they work on a computer most of the time. But when we asked them how they put their new SIM card into the iPhones they were holding, the answer was pretty much a unanimous ‘paperclip’.

Post-it notes are a staple element of any to-do list in our office, and we even realised that at home, if we ever need to restart the wireless router, a ballpoint pen is always the instrument of choice!

After discovering these other uses we started looking at some of the more creative uses for these Hidden Heroes.

The paperclip has become an inspiration for artists and designers alike. Tim Sterling makes beautiful, intricate sculptures out of paperclips.

A cube made from Paperclips

And we discovered this great Post-it art coming out of a group of offices in Paris.

Ghostbuster logo made from sticky notes

And striking abstract works of art by collaging used tea bags from Armén Rotch:

A collage made from recycled Teabags

This theme of appropriation is explored further in the exhibition. We have a print of David Mach’s giant coat hanger sculpture alongside our wire coat hangers and a portrait of Mike Tyson made completely out of scotch tape.

Mike Tyson Portrait made from Scotch tape

Photo Source: Advertisement for ‘Tesa Ultra Strong’ parcel tape created by Mark Khaisman, 2008 – © Jung von Matt/Neckar für tesa SE

The more we talk to people the more anecdotes and insights we gather. Which is where you come in…

Henry Ford said that ‘every object tells a story’, and we want to hear your stories. Tell us your genius uses for these everyday objects in the comments below, or on Twitter with the hashtag #HiddenHeroes.

Or if you want to get really creative, why not upload your own artwork made out of these everyday objects to our Facebook page?

Giant cockroach drama character

Albert Einstein, Isaac Newton and the world’s first pregnant man – these are just a few of the characters brought to life by actors inside the Museum.

Today I met a real-life giant cockroach (the actor’s name is Guy) who is kind enough to give humans a tour of the Science Museum from this critter’s perspective. Here’s what Professor John Cockroach had to say:

“Ah nice to meet another friendly cockroach face! Quick, join in. Don’t forget to put on your weekend ‘best’. Going among the humans, we’re all to be on our best cockroach behaviour.

Now, you’re probably wondering why we cockroaches are gathering here?

Well, as cockroaches, we forget – don’t we? – we haven’t had to evolve for, well, millions of years. We are, after all, pretty much the same as we were when there were dinosaurs. Humans, bless ‘em, clearly still have a long way to go. But they’re trying, obviously, so we mustn’t judge them too harshly. That would be very uncockroach of us, wouldn’t it?

Human beings are so fascinating. D’you know, they change their own environment to solve their problems rather than evolve? Yes, they have a strange habit of constructing things and when they’ve finished they sometimes put those things in a room and call it a museum. How strange!

So, being a cockroach professor of humanology, ahem, I lead a group of you cockroaches, on a quick scuttle around, as it were, for about 30 minutes.

We’ll see how humans seem absolutely obsessed with something they call time. Many of them can’t even eat, sleep, or indeed leave the house without first ‘checking the time’.

We’ll see plenty of examples of their machines too, how they try to save time, kill time, and how they love to burn things in order to go faster and faster.

Just a quick reminder, though, to any of our cockroach visitors. Please don’t expect to feed the humans. Sorry, but they’re very fussy eaters and have quite strict feeding times.”

Cockroaches studying the Apollo 10 capsule

If dressing up as a giant cockroach and participating in a unique tour sounds like fun, sign up for a Cockroach Tour of the Science Museum.

A big thanks to Guy for his contribution to this post!

Museum store or movie set?

Hollywood glamour isn’t the first thing that springs to mind when you think of a museum store but Blythe House, the Science Museum’s small object store, is the red hot destination for filmmakers right now.

Blythe House

Earlier this year the blockbuster spy thriller Tinker Tailor Soldier Spy, set in 1970s London, was filmed there. Featuring a preeminent cast with the likes of Gary Oldman, Colin Firth and Benedict Cumberbatch, Blythe was suddenly host to a crew of over 70 people and a 140 ft crane.

Rumour has it that our Blythe House colleagues were nearly ushered out of the cinema for cheering at every Blythe shot…

On a rather different note, check out the latest Jessie J video, which was entirely shot at Blythe House. Sadly, we can reveal the luxury basement boudoir isn’t a normal fixture at Blythe.

Filming is nothing new to Blythe House. Classic espionage and detective shows Minder and The New Avengers were shot there in the late 1970s and early 1980s.

It’s no surprise that Blythe has captured so many directors’ imaginations. It’s a wonderfully atmospheric place with long corridors, towering staircases and rickety lift shafts. It certainly captivated animators the Quay Brothers, who made the short film The Phantom Museum there. Pretty creepy stuff.

Our collections have also had their moment in the spotlight. If you’ve seen the vampire slayer movie Van Helsing then you might have glimpsed the Omniskop. The production crew created an exact replica of this awesome looking x-ray machine for the slayer’s laboratory.

Post written by Katie Maggs, Curator of Medicine

Hidden Heroes

Next Wednesday we’ve got an exciting new exhibition opening. Hidden Heroes explores the way in which 36 of our unsung design classics came into existence.

The exhibition, curated by Vitra, is being shown in the UK for the first time and brings to life objects that are so familiar that we might forget – or never even consider – the stories that led to their existence.

A selection of Hidden Heroes

Where would fashion be without the snap fastener, the zipper or the hook & loop fastener? Innovative little solutions such as these have helped create masterpieces we drool over and spend millions of pounds a year on.

When was the last time you used a rawl plug? Or a condom? Or picked up a six-pack of beers by the plastic carrier? Every day we use seemingly unimportant little inventions that help make our world go round. But have we ever really considered where they came from?

Find out how Napoleon brought about the invention of the tin can, which museum inadvertently commissioned the rawl plug, or how a descending aeroplane inspired the design of bubblewrap.

Hidden Heroes opens on Wednesday 9 November; tickets are £6 and you can book them online now.

Finally, keep an eye on our blog and Twitter feed over the next few weeks. Not only are we celebrating the original stories and ideas that led to these objects, but we also want to hear your stories and how these objects impact your day-to-day life – maybe in their conventional manner, or perhaps in some brand new way!

Don’t forget to sign up to our newsletter to stay up to date on the latest events and exhibitions happening at the Museum this winter.