Category Archives: Computing

30 years on: the rise of the Macintosh computer

Chloe Vince, a volunteer working on the upcoming Information Age gallery, celebrates 30 years of the Apple Macintosh computer.

‘Hello, I am Macintosh,’ said the robotic voice. ‘It sure is great to get out of that bag!’

The robotic voice in question belonged to a computer called Macintosh, which was first launched and demonstrated by Steve Jobs 30 years ago, on 24 January 1984, in front of an expectant audience of 3,000 people. The Macintosh (there’s one from our collection pictured below) bears little resemblance to the computers we use today. A beige upright case housed a 9-inch black-and-white screen, with a built-in handle to make it easier to transport its 7.5 kg weight.

The Apple Macintosh, launched 24 January 1984 (Source: Science Museum / SSPL)

The Apple Macintosh was not Apple’s first computer, but it was the first to be commercially successful. In comparison to previous models it was considered relatively affordable at £1,840, and it sold 70,000 units in its first five months. Prior to this, the Apple Lisa, targeted more at business users, was far less affordable at £6,000, with the result that only around 6,500 were sold worldwide.

The Apple Lisa was the precursor to the Apple Macintosh, but did not share its commercial success (Source: Science Museum / SSPL)

But it was not just the price that made this computer so popular. Firstly, it came complete with a mouse, which may seem an obvious companion to a desktop computer today, but before this most computers relied entirely on the keyboard. In addition there was a 3.5” floppy disk drive, which could hold 25% more data than the disks used previously.

The biggest improvement came with the graphical user interface (GUI), which used square pixels instead of rectangular ones, making the graphics much clearer and sharper. It also included icons of real-life items, such as a ‘documents’ image and a ‘trash’ image, instead of the abstract text commands used previously.

These developments made the Macintosh an ideal personal computer for people who had no previous experience of computing, or as the advertising famously stated, ‘introducing Macintosh… for the rest of us.’

Do you own, or have any memories of, the original 1984 Macintosh? Can you remember a time before the Macintosh existed, or have you grown up around these computers?

Discover more about the history of communication technologies in our new Information Age gallery, opening in Autumn 2014.

The pride and passion of Mr Babbage

Cate Watson, Content Developer, takes a look at the pride and passion of Charles Babbage.

Designing the Difference and Analytical engines was a monumental task, demanding dedication and extreme attention to detail. Both engines were made up of thousands of parts that required near-identical manufacturing – pushing Victorian technology to its limits. And Babbage was determined to make the machines operate without any possibility of error.

Gearwheel cut-outs for Babbage’s Difference Engine No 1, 1824-1832. Credit: Science Museum / SSPL

Babbage was very certain his engines would work. His passion for his machines kept him going despite numerous setbacks such as losing funding and the lack of acclaim or understanding of his inventions. Babbage continued designing engines until he died, absolutely sure that one day his work would be appreciated.

Babbage’s Difference Engine No 1, 1824-1832. Credit: Science Museum / SSPL

And he was right. Nearly 150 years after Babbage’s death, our modern technological society can fully appreciate his genius in inventing the Analytical engine – a machine that embodies all the major principles of our computers – and the potential it had to change society.

Babbage passionately believed in his inventions and the importance of science. This uncompromising certainty made him highly critical of those who didn’t live up to his high standards. He published a scornful, sarcastic attack against the unscientific practices of the Royal Society. It was so shocking that Babbage’s friend John Herschel told him he would have given him a ‘good slap in the face’ for writing it if he had been within reach.

Babbage’s Analytical Engine, 1834-1871. Credit: Science Museum / SSPL

Babbage acted according to his scientific principles and succeeded in alienating the Royal Society – which had previously persuaded the Government to fund the Difference Engine. Babbage tried demanding more money from the Prime Minister, failed and lost all hope of further support.

Babbage’s uncompromising personality contributed to his failure to build his machines. Yet it was his unswerving dedication to science that made him continue to work beyond hope of realisation and produce the engine plans you can see on show in the Science Museum’s Computing gallery.

The Clock of the Long Now

The Science Museum’s curator of time, David Rooney, reflects on the ‘Clock of the Long Now’, a prototype of which is on show in the museum’s Making the Modern World gallery. David will be talking about clocks, speed and slowness at this month’s Science Museum Lates.

‘Civilization is revving itself into a pathologically short attention span. The trend might be coming from the acceleration of technology, the short-horizon perspective of market-driven economics, the next-election perspective of democracies, or the distractions of personal multitasking. All are on the increase’. This analysis of society at the end of the twentieth century was written in 1998 by Stewart Brand (born 1938), writer, inventor and founder of the Whole Earth Catalog.

Brand, together with computer designer Danny Hillis (born 1956) and other prominent fin de siècle thinkers, had become increasingly concerned that the year 2000 had come to be seen as a temporal mental barrier to the future. Brand explained: ‘Some sort of balancing corrective to the short-sightedness is needed—some mechanism or myth that encourages the long view and the taking of long-term responsibility, where “the long term” is measured at least in centuries’.

Hillis’s proposal was to build ‘both a mechanism and a myth’, a monumental-scale mechanical clock capable of telling time for 10,000 years—if it was maintained properly. Such a clock would prompt conversations about ‘deep time’, perhaps becoming a public icon for time in the same way that photographs of earth from space taken by the Apollo 8 crew in December 1968 have become icons for a fragile planet in boundless space. (It was partly due to Brand’s agitation that NASA released earlier satellite-based photographs of earth to the public in 1966.)

Earthrise, a photograph of the Earth taken by astronaut William Anders during the 1968 Apollo 8 mission. Credit: NASA / SSPL

In 1996, Brand and Hillis formed a board of like-minded friends, calling themselves ‘The Long Now Foundation’. The organization’s title sprang from a suggestion by musician and composer Brian Eno that ‘The Long Now’ could be seen as an important extension of human temporal horizons.

In this scheme, ‘now’ was seen as the present moment plus or minus a day, and ‘nowadays’ extended the time horizon to a decade or so forward and backward. The ‘long now’, however, would dramatically extend this ‘time envelope’. Because settled farming began in about 8000 BCE, the futurist Peter Schwartz proposed that the ‘long now’ should mean the present day plus or minus 10,000 years—‘about as long as the history of human technology’, explained Hillis.

The design principles established for the clock laid down strict parameters for its construction. With occasional maintenance, it was thought that the clock should reasonably be expected to display the correct time for 10,000 years. It was designed to be maintainable with Bronze Age technology. The plan was also that it should be possible to determine the operational principles of the clock by close inspection, to improve the clock over time and to build working models of the clock from table-top to monumental size using the same design.

Clock of the Long Now. Credit: Rolfe Horn, courtesy of the Long Now Foundation

In 1997, a small team of expert engineers, mechanics and designers based in San Francisco, led by Alexander Rose, set about constructing a prototype of the Clock of the Long Now, as the project became known. Driven by the power of two falling weights, which are wound every few days, the torsional (twisting) pendulum beats twice per minute, transmitting its time through an oversized watch-escapement mechanism to the heart of the clock, a mechanical computer.

This computer, conceptually linked to the machines of nineteenth-century polymath Charles Babbage, operates once every hour, updating timekeeping elements within the dial display, including the position of the sun, the lunar phase and the locally-visible star field. The slowest-moving part of this display indicates the precession of the equinoxes.
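
For a rough sense of those contrasting tempos, the figures above can be turned into a back-of-the-envelope calculation. This is just a sketch in Python using the rates quoted in the text, plus a standard astronomical value (roughly 25,800 years) for the precession period, which is our addition rather than part of the clock’s specification:

```python
# Tempos of the Clock of the Long Now, from the figures quoted above:
# the pendulum beats twice a minute; the mechanical computer advances
# the dial once an hour. The precession period is a standard
# astronomical figure, not from the article.
beats_per_minute = 2
beats_per_update = beats_per_minute * 60        # pendulum beats per hourly update
beats_per_10k_years = beats_per_update * 24 * 365.25 * 10_000
print(f"{beats_per_10k_years:.2e} pendulum beats over 10,000 years")  # ~1.05e10

precession_years = 25_800
print(f"{10_000 / precession_years:.2f} of a precession cycle in the clock's lifetime")  # ~0.39
```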

Clock face of the Clock of the Long Now. Credit: Rolfe Horn, courtesy of the Long Now Foundation

Danny Hillis, designer of some of the world’s fastest supercomputers in the 1980s, said in the 1990s that he wished to ‘atone for his sins’ of speeding up the world by designing the world’s slowest computer for the Clock of the Long Now.

This range of tempos reflects the Foundation’s idea of ‘layers of time’ in human existence. The fastest-changing layer is fashion and art; a little slower is commerce. Infrastructure and governance take still longer to change. Cultures change very slowly, with nature reflecting the slowest tempo of all. ‘The fast layers innovate; the slow layers stabilize’, explained Brand. The Foundation believes that an understanding of the opportunities and threats embodied in these layers of temporal change is crucial in correcting humankind’s apparent short-sightedness.

These ambitions and ideals were expressed eloquently in the finished prototype clock, which first ticked in San Francisco moments before the end of New Year’s Eve 1999. It was then moved to London, where the Clock of the Long Now had been selected as the final exhibit in the Science Museum’s Making the Modern World gallery, opened by Her Majesty The Queen in 2000.

A prototype of the Clock of the Long Now, on display at the Science Museum

Meanwhile, the Foundation continued to build further prototypes, refining the design of the clock’s several constituent subassemblies in preparation for the construction (now underway) of a 10,000-year clock inside a mountain in western Texas, near the town of Van Horn. The Foundation hopes to build several ‘millennial clocks’ over the course of time, and a site for another has been purchased atop a mountain in eastern Nevada, adjacent to Great Basin National Park.

By its nature, the clock is both a conclusion—of a long process of human thinking, making and acting—and a starting point, for a long future, the contents of which are uncertain, the opportunities of which are infinite. Stewart Brand observed, ‘This present moment used to be the unimaginable future’.

As a symbol for the past, present and future of human ingenuity, the Clock of the Long Now is a fitting device to represent the modern world and all of its milestones. As Danny Hillis has said, ‘Time is a ride—and you are on it’.

David Rooney (@rooneyvision)

Beyond the mouse – the future of computer interfaces

Chloe Vince, volunteer on the Information Age project, takes a look at the humble computer mouse, Douglas Engelbart’s best-known contribution to modern computing.

Since its invention in 1963, the computer mouse has become an iconic image of personal computing. It was designed and developed by the visionary engineer Douglas Engelbart, who recently passed away, on 4 July 2013, at the age of 88.

This early version of the computer mouse bears very little resemblance to those we use today – it began as simply a wooden shell encasing a circuit board, attached to two wheels which allowed movement across a surface. It was the wire extending from the wooden shell to the computer that gave it its resemblance to its namesake – christening it a ‘mouse.’

A replica of the first ever computer mouse, designed by Douglas Engelbart, invented in 1963 and patented in 1970 (Source: SRI International)

Whilst the function of the mouse has remained the same since this initial model, the design has become much more streamlined. In 1972 computer engineer Bill English replaced the wheels with a ball, allowing the mouse to move in any direction. However, this design soon encountered problems, as dirt accumulated on the ball and restricted its movement; as a result, in 1981 the mouse underwent another redesign.

It was then that engineers at the technology company Xerox developed the first optical mouse, which worked by using focused beams of light to detect the movement of the mouse relative to the surface it was on. In successive years, the combination of reduced equipment costs and progress in optical technology gave us the optical computer mice that are widely used today.
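
The principle is easier to see with a toy example. An optical sensor photographs the surface many times a second and finds the shift that best aligns consecutive snapshots; that shift is the mouse’s motion. The Python sketch below is purely illustrative (random ‘surface’ data, brute-force search) and is not Xerox’s actual design:

```python
import numpy as np

def estimate_shift(prev, curr, max_shift=3):
    """Find the (dx, dy) shift that best maps frame `prev` onto frame `curr`."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(prev, dy, axis=0), dx, axis=1)
            err = np.sum((shifted - curr) ** 2)   # sum of squared differences
            if err < best_err:
                best, best_err = (dx, dy), err
    return best

# Toy surface texture, then the same texture moved 1 pixel right and 2 down.
rng = np.random.default_rng(0)
frame = rng.random((16, 16))
moved = np.roll(np.roll(frame, 2, axis=0), 1, axis=1)
print(estimate_shift(frame, moved))   # -> (1, 2)
```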

The computer mouse used with the Apple G4 computer. Source: Science Museum / SSPL

While computer mice have retained their popularity with desktop and laptop computers, more intuitive interface technologies have become favoured on tablet computers and smart phones.

In the early 1990s, the stylus pen began to be used widely, particularly with smart phones and message pads. Shortly afterwards, the stylus fell out of favour and multi-touch screens became the most popular means of interacting with these devices. These screens can detect two or more points of contact on an interface, so users can rotate, pinch and zoom in on graphics – something you may be used to doing on your mobile phone.
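
The geometry behind those gestures is surprisingly simple: track two touch points, and the ratio of the distances between them gives the zoom factor, while the change in the angle of the line joining them gives the rotation. Here is a minimal sketch – the coordinates are made up, and this is not any particular touch-screen API:

```python
import math

def pinch_gesture(p1_old, p2_old, p1_new, p2_new):
    """Scale and rotation implied by two touch points moving."""
    dist_old = math.dist(p1_old, p2_old)
    dist_new = math.dist(p1_new, p2_new)
    ang_old = math.atan2(p2_old[1] - p1_old[1], p2_old[0] - p1_old[0])
    ang_new = math.atan2(p2_new[1] - p1_new[1], p2_new[0] - p1_new[0])
    scale = dist_new / dist_old                 # > 1 means zoom in
    rotation = math.degrees(ang_new - ang_old)  # anticlockwise, in degrees
    return scale, rotation

# Two fingers move apart and twist slightly:
print(pinch_gesture((100, 100), (200, 100), (80, 100), (220, 120)))
# -> (about 1.41, about 8.1 degrees)
```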

Apple Newton Message Pad, part of the Science Museum’s collection, used a stylus for the user to interact with the screen. (Source: Science Museum / SSPL)

This technology is so effortless to use it is difficult to think of how this interaction can become any easier – but what if you didn’t have to do anything at all? What if all you had to do was think about what you wanted your computer to do?

Computer tablets and smart phones used today mostly use a combination of multi-touch screens and voice recognition software. (Source: Flickr user ‘Exacq’, under a Creative Commons licence)

This month, scientists at the University of Washington published findings showing that patients who had a thin layer of electrodes placed in their brain were able to move a cursor on a computer screen just by thinking about it. Although still at an early stage, this technology has the potential to let users communicate with computers using only their thoughts to control the commands on the screen.

While the idea of computers interpreting our thoughts may seem a daunting prospect to most, patients suffering from severe forms of paralysis could find this research a lifeline, allowing them to communicate with people via computers for the first time.

At the moment it is unknown whether this technology will be taken further commercially. Do you think it has the potential to be used at home or work to improve our lives? Or do you think this could take our relationship with computer technology too far?

A replica of Engelbart’s mouse prototype will be on display in the Science Museum’s new Information Age gallery, opening in September 2014.

Hidden Histories of Information

Tilly Blyth, Keeper of Technologies and Engineering, writes about the hidden histories of information. Information Age, a new £15.6m communication gallery, will reveal how our lives have been transformed by communication innovations over the last 200 years.

Our new gallery on information and communications technologies, Information Age, will open in Autumn 2014. It will look at the development of our information networks, from the growth of the worldwide electric telegraph network in the 19th century, to the influence of mobile phones on our lives today.

Artist’s impression of the GPS satellite model

One of the challenges of exhibiting the complex, and mostly intangible, world of information in a museum context is how you bring together the technology with the people involved and the information shared. The history of information is not just a neat history of devices. The telegraph instruments, radio and televisions, computers and mobile phones all reflect the material culture of information, but the history and future of information is much more complex.

One approach for dealing with this complexity is to look at how users, as well as innovators, have developed information and communications networks. Through personal stories we can connect visitors to the lived experience of technological change and reveal the significance of these networks to our ancestors’ lives.

As part of this approach we are conducting some new oral histories. We have recorded Gulf War veterans discussing their experience in 1991 of navigating around the desert both with and without GPS. We have talked to the original engineers who set up Britain’s first commercial mobile phone networks for Vodafone and Cellnet in 1985. We will be talking to those who created and used the world’s first computer for commercial applications, the Lyons Electronic Office (LEO 1), in 1951. We have also interviewed some of the women who worked at the last manual telephone exchange in Greater London, the Enfield Exchange in North London.

Women operators at the Enfield telephone exchange, October 1960.

A lovely example of one account is this interview with Jean Singleton, a telephone operator who worked at a few different telephone exchanges, including Enfield when it was still a manual exchange. Jean left school at 15, when she started working for the GPO. Here she describes what made a good telephone operator.

We hope that detailed personal accounts like these will enthuse our audiences, reveal histories that are often not formally documented and show how centuries of ‘new’ information and communication devices have changed people’s lives.

Science Museum enters the Information Age

Charlotte Connelly is a Content Developer for Information Age, a new communications technology gallery opening in September 2014.

Last night the Science Museum announced exciting details about a new £16m communications gallery, Information Age, which will open in September 2014.

Artist’s impression of the Cable Network, exploring the electric telegraph. Image credit: Science Museum / Universal Design Studio

The gallery will be a celebration of information and communication technologies. We’re already working on cutting edge interactive displays and participatory experiences that will reveal the stories behind how our lives have been transformed by communication innovations over the last 200 years.

Hundreds of unique objects from the Science Museum’s collections will go on display, many of which have never been seen before. They will include 2LO, the BBC’s first radio transmitter; the BESM-6, the only Russian supercomputer in a museum collection in the West; and a full-size communications satellite.

Laying the first transatlantic telegraph cable in 1858 proved to be a tricky challenge to overcome. (Source: Science Museum / SSPL)

In Information Age we tell some of the dramatic stories behind the growth of the worldwide telegraph network in the 19th century and the influence of mobile phones on our lives today. Visitors can uncover stories about the birth of British broadcasting and learn about pioneering achievements in the development of the telephone. The role of satellites in global communications and the birth of the World Wide Web will also be explored in the new gallery.

Not only are we working hard behind the scenes of the Museum, we’ve also been working with lots of other organisations to develop the gallery. For our mobile phone display, we have a great selection of objects collected in Cameroon – look out for a blog post all about that coming soon! We’ve been working with Cameroonian communities in both Cameroon and the UK to decide how these stories are displayed.

We’ve also interviewed women who worked on the manual telephone exchange at Enfield in North London. Their stories have been selected by young women from the same area to be included in the gallery.

Our Curator of Communication, John Liffen, looking at a section of the Enfield exchange when it was installed in the Enfield Museum (Source: Hilary Geoghegan)

Watch this space to discover more about Information Age as the team will be writing regular blog posts about their work on the gallery to keep you up to date. Add your comments below to tell us what you would like to find out about.

The multiple lives of Alan Turing

February is Lesbian, Gay, Bisexual and Trans History Month, and this year the focus is on mathematics, science and engineering. Here, David Rooney, curator of the Science Museum’s award-winning Codebreaker exhibition, discusses mathematician Alan Turing’s contributions to science and society.

Alan Turing’s life had many facets. He is perhaps most widely known today for his wartime codebreaking exploits at Bletchley Park, where he devised processes and technologies to crack German ‘Enigma’ messages on an industrial scale. The intelligence uncovered at Bletchley was central to Britain’s war effort and may have shortened the conflict by up to two years. Winston Churchill described the site’s cryptanalysts as his ‘golden geese that never cackled’.

A Portrait of Alan Turing from the National Physical Laboratory archive

Turing’s first major contribution to science had been a paper written in 1936, when he was just 24, on an abstruse theoretical problem in the philosophy of mathematics. ‘On computable numbers, with an application to the Entscheidungsproblem’ attacked German mathematician David Hilbert’s so-called ‘decision problem’, which sought a formal underpinning of mathematics. Turing’s paper was a philosophical bombshell, showing that no such universal decision procedure could exist.

This work brought Turing to the attention of a small group of mathematicians and philosophers, but it was its theoretical description of a ‘universal computing machine’, capable of carrying out any computable task, which was later seen as the conceptual basis of today’s stored-program computers. For Turing, his 1936 universal machines were simply thought experiments, but for others they signalled the future of computing. Turing himself wrote one of the first practical designs for a stored-program computer, later realised as the ‘Pilot ACE’, on display in the exhibition.
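
To give a flavour of what Turing described – a tape of symbols, a read/write head and a finite table of rules – here is a toy simulator in Python. The rule table, which simply flips every bit on the tape, is our own illustrative example rather than one from Turing’s paper:

```python
def run_turing_machine(tape, rules, state="start"):
    """Run a one-tape Turing machine until it reaches the 'halt' state."""
    cells, head = dict(enumerate(tape)), 0
    while state != "halt":
        symbol = cells.get(head, " ")              # unwritten cells read as blank
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

rules = {
    ("start", "0"): ("1", "R", "start"),  # flip 0 to 1, move right
    ("start", "1"): ("0", "R", "start"),  # flip 1 to 0, move right
    ("start", " "): (" ", "R", "halt"),   # past the end of the input: stop
}
print(run_turing_machine("10110", rules))  # -> "01001" plus a trailing blank
```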

The first demonstration of the Pilot ACE at NPL, December 1950.

Alongside his work in cryptanalysis and computing, Turing is also widely remembered for his work on machine intelligence after he left wartime Bletchley Park. The ‘Turing test’, sketched out in his seminal 1950 paper ‘Computing machinery and intelligence’, has become a popular trope in artificial intelligence. It was Turing’s response to a philosophical stumbling block. First he asked, ‘Can machines think?’ He then proposed that this, itself, could never be known. Instead, if a machine could appear to be intelligent in a guessing game, then it could be assumed to be intelligent.

The relationship between thought and matter was a common theme throughout Turing’s life. As a teenager at Sherborne School, Dorset, he became strongly attracted to a fellow student, Christopher Morcom, who was a year older. Morcom was, if anything, even brighter than Turing, and more devoted to mathematics and science. The pair became close friends, although Turing’s love of Morcom was unrequited.

Meeting Morcom was a watershed in Turing’s life, acting as an emotional catalyst that converted the previously ill-focused, undisciplined but undoubtedly clever boy into a young man constantly attempting to improve himself. Morcom died, aged 18, from tuberculosis, and the rest of Turing’s life seemed to be an attempt to keep Morcom alive and make him proud.

If Morcom’s friendship and death were material in Turing’s intellectual development, they can also be seen as a focus for the complex ideas about intelligence and the mind that Turing developed towards the end of his own life. Writing to Morcom’s mother soon after her bereavement, Turing said, ‘when the body dies the “mechanism” of the body holding the spirit is gone and the spirit finds a new body’. Even in his 1950 paper on machine intelligence, Turing showed great interest in paranormal phenomena such as telepathy and psychokinesis, which were at the fringes of scientific respectability even then.

Turing’s science remained resolutely outside the mainstream. Having broken codes for the nation and conceived new paradigms in mathematics, computing and intelligence, he produced final work so avant-garde that it was virtually abandoned after his death in 1954, only to be picked up again relatively recently. Morphogenesis – the development of pattern and form in living things – occupied his thoughts for the last four years of his life as he ran computer simulations of the mathematics and chemistry of life itself.

The intercept control room in Hut 6 at Bletchley Park, Buckinghamshire, the British forces’ intelligence centre during WWII. Image credit: Science and Society Picture Library

At Cambridge University, where he studied in the 1930s, and at wartime Bletchley Park, Turing’s homosexuality was relatively tolerated. But in post-war Britain a new morality was rapidly emerging. Britain’s future rested on repopulating the country with young men to replace the millions slaughtered at war. Homosexual people – men and women – were increasingly characterised as deviant and harmful to the fitness of the race, and their presence in society became a matter of national concern.

The Cold War intensified these concerns, as gay people were assumed to be at risk of blackmail, endangering the security of the nation. Turing held some of the nation’s most secret knowledge in his head.

Alan Turing and colleagues working on the Ferranti Mark I Computer in 1950. Image credit: Science and Society Picture Library

In 1952, following an unlawful sexual relationship, Turing was tried and convicted of ‘gross indecency’ under the anti-homosexuality legislation of the day. He was stripped of his security clearance and his post-war consultancy to Bletchley Park’s successor, the Government Communications Headquarters (GCHQ), ended. He was offered a choice of imprisonment or a one-year course of hormone treatment to suppress his libido, and he took the latter. It was chemical castration.

Turing appeared to recover well from the sentence after its effects subsided, but by then he was under police surveillance and it is likely that his actions had become of grave concern to the security services. On 7 June 1954 he ingested a large amount of cyanide solution at his home in Wilmslow, Cheshire, and was found dead the next day by his housekeeper. The coroner recorded a verdict of suicide, opining that Turing’s ‘mind had become unbalanced’. Turing did not leave a suicide note, and the full circumstances of his death remain a mystery.

For further information visit Codebreaker: Alan Turing’s Life and Legacy at the Science Museum, which runs until summer 2013.

Collecting synthetic biology – an iGEM of an idea

Collecting stuff is generally the bit I like most about my job. That’s probably why I’ve got a bit overexcited about the new acquisitions we’ve made related to synthetic biology – from none other than Tom Knight, widely described as the “father” of the discipline.

Synthetic biology is research that combines biology and engineering. Sounds like genetic engineering by another name? Well yes, but it goes much further. It looks to create new biological functions not found in nature, designing them according to engineering principles.  Some see the field as the ultimate achievement of knowledge, citing the engineer-mantra of American physicist Richard Feynman, “What I cannot create, I do not understand”.

Biofilm made by the UT Austin / UCSF team for the 2004 Synthetic Biology competition. From drugs to biofuels, the potential applications are huge. (Image: WikiCommons)

Now, like a lot of biotech, synthetic biology isn’t particularly easy to collect or represent through objects – it’s the biology that’s interesting, and most of the ‘stuff’ used in research is entirely indistinguishable from other biological equipment, e.g. micropipettes and microwells.

What we’ve acquired are a number of iGEM (International Genetically Engineered Machine competition) kits – hardware consisting of standardised biological components known as BioBricks™. Students competing in iGEM are sent these kits to engineer new applications. Check out some of the former winners’ projects: Arsenic Biodetector, Bactoblood, E. Chromi.

Biological lego – parts that have particular functions and can be readily assembled. The kits document a fascinating ten year period in the discipline of synthetic biology – starting from this basic aliquot kit sent out when iGEM first launched c.2002. (Image: Science Museum)

The origin of these objects, and of the idea for BioBricks™, is rather curious. They didn’t emerge from biology, but from computer science. Tom Knight was a senior researcher at MIT’s Computer Science and Artificial Intelligence Laboratory who became interested in the potential of using biochemistry to overcome the impending limitations of computer transistors.

Knight Lab: Tom set up a biology lab in his computer science department and began to explore whether simple biological systems could be built from standard, interchangeable parts and operated in living cells. That led to setting up iGEM.

From aliquots to paper based DNA to microwells – the kits show the technological change and sheer complexity of distributing biological components to teams competing around the globe.

In 2008 the kits trialled paper-embedded DNA via these folders, but it didn’t quite work out. The kits do, however, represent an important ethic – that of open-sourcing in science. Students collaborate and contribute to adding new biological parts. (Image: Science Museum)

Suggestions for other synthetic biology stuff we could collect gratefully received!

Women of substance

Continuing our Women’s History Month theme, today we’re celebrating International Women’s Day. As the theme for 2011 is ‘equal access to education, training and science and technology’, it seems like a good day to celebrate Kathleen Lonsdale, who in 1945 became one of the first two women to be elected a Fellow of the Royal Society, along with microbiologist Marjory Stephenson (only 285 years after the men).

Kathleen Lonsdale in 1957 (Science Museum).

Lonsdale was a pioneer in the field of X-ray crystallography, in which scientists fire X-rays at crystals and study how they are scattered. This enables them to infer how atoms are arranged inside the crystal.
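
The key relationship here is Bragg’s law, nλ = 2d sin θ: X-rays of wavelength λ reflecting from planes of atoms a distance d apart reinforce one another only at particular angles θ, so measuring an angle reveals the spacing. A quick worked example in Python – the numbers are typical textbook values, not taken from Lonsdale’s own work:

```python
import math

def plane_spacing(wavelength_nm, theta_deg, order=1):
    """Solve Bragg's law n*lambda = 2*d*sin(theta) for the plane spacing d."""
    return order * wavelength_nm / (2 * math.sin(math.radians(theta_deg)))

# Copper K-alpha X-rays (wavelength ~0.154 nm) diffracted at 22.5 degrees:
print(plane_spacing(0.154, 22.5))   # ~0.20 nm between atomic planes
```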

Lonsdale’s models of the structure of ice, 1955 (Science Museum)

In the early days, it was an arduous process. Capturing X-rays on film could result in burns to the fingers. Calculating the atomic layout from the X-ray patterns had to be done manually, involving hours of slogging. Things got somewhat easier with the advent of scientific computing. The Pegasus computer on display in our Computing gallery (the world’s oldest working electronic computer) was used by Lonsdale’s group at University College London.

Pegasus speeded up crystallographers’ calculations (Science Museum).

Lonsdale faced the additional challenge of being a woman in a man’s world, and for a time struggled to combine scientific work with raising a family. Her mentor William Henry Bragg arranged for a grant to help support her at home so that she could carry out her world-class research. Lonsdale said that to succeed as a woman scientist one must be a first-class organiser, work twice the usual hours, and learn to concentrate in any available moment of time.

In the early-to-mid 20th century, the field of X-ray crystallography was unusual in having a number of high-profile women scientists, including Lonsdale, Helen Megaw, Rosalind Franklin, whose X-ray photograph of DNA was infamously used by Crick and Watson in determining the double helix structure, and Nobel prizewinner Dorothy Hodgkin. Hodgkin always resisted being singled out as a ‘woman scientist’, but cannot have been impressed with the Daily Mail’s headline announcing her award: ‘Oxford housewife wins Nobel Prize’.

Dorothy Hodgkin in the 1940s (NMeM / Daily Herald Archive / Science & Society)

Things are easier for women in the sciences today, but a 2010 report suggests that, in the UK at least, the picture’s still not so rosy – despite an increase in the number of women studying science, technology and medicine, women still make up only 12% of the workforce in those fields. And women are noticeably absent as the famous faces of science. There’s still some way to go before the likes of Lonsdale become the norm rather than the inspirational exceptions.

Batteries not included

What’s the one gadget you couldn’t live without? Your mobile phone, PDA, music player, game console – or all those things combined in a sleek smartphone?

No matter which device you choose, the one thing that all these gadgets couldn’t exist without is their rechargeable battery – the beating heart of the modern world.

The first rechargeable battery was the lead-acid battery, invented in 1859 by Gaston Planté, but it was the nickel-cadmium battery, invented in 1899 by Waldemar Jungner, that really paved the way for the future of mobile technology.

The very early mobile phones used nickel-cadmium batteries, but the batteries were so enormous they had to be stored in the boot of a car. As demand increased, improvements were made, and soon you could carry your battery around with you in a handy carry case.

Vodafone transportable mobile phone, 1985. (Science Museum / Science & Society)

By 1983 the first stand-alone mobile phone, the Motorola DynaTAC 8000X (Dynamic Adaptive Total Area Coverage), had been developed using a nickel-cadmium battery. By 1989 phones could even fit in your pocket – though it might have to be quite a large pocket.

Motorola MicroTAC cellular telephone, 1993. (Science Museum / Science & Society)

Today the battery that powers the phone in your pocket and the laptop on your desk is probably a lithium battery, most likely a lithium-ion one.

Introduced commercially in the early 1990s, these batteries have the best energy-to-weight ratio of the three, meaning they last longer but weigh less, and they have enabled mobile phones to become smaller and smarter.
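
To put rough numbers on that claim, here is a back-of-the-envelope comparison using widely quoted approximate specific energies for the three chemistries mentioned in this post – ballpark figures, not measurements of any particular cell:

```python
# Approximate specific energy (energy stored per unit mass), in Wh/kg.
typical_wh_per_kg = {
    "lead-acid (1859)":      35,   # roughly 30-40 Wh/kg
    "nickel-cadmium (1899)": 50,   # roughly 40-60 Wh/kg
    "lithium-ion (1990s)":  180,   # roughly 100-250 Wh/kg
}

for chemistry, wh_per_kg in typical_wh_per_kg.items():
    # Battery mass needed to store 10 Wh - roughly one smartphone charge.
    grams = 10 / wh_per_kg * 1000
    print(f"{chemistry}: about {grams:.0f} g per 10 Wh")
```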

Sony Ericsson T68i mobile phone, 2002. (Science Museum / Science & Society)

The i-Unit concept car in our Plasticity exhibition shows that in the future lithium batteries could be used to power even more aspects of our mobile lives.

Toyota i-Unit concept car, 2005 (Science Museum website)