The arts and the sciences: How collaboration could lead to a winner for the Longitude Prize

A shorter version of this blog was originally posted on the Longitude Prize Blog on the 23rd January 2015.

Artwork by C-Lab. Photo: Sas Schilten

Since the Enlightenment there has been increasing specialisation in the respective roles of scientists and artists, which has fostered an image of the sciences and the arts as separate, or even diametrically opposed, pursuits. At the same time, recent sociological, historical and artistic re-analysis has revealed a shared underlying impetus in both fields: to augment human knowledge and to extend our experience of the world. At its simplest, this can be seen in the common desire within both fields to take pleasure in understanding something new and communicating it to others. And in many cases interdisciplinary collaborations have been a driving force behind the arts radically influencing the outcomes of science for the better.

Two cultures

Many people still conceptualise ‘science’ as purely empirical, embracing the objective in the quest for the production of truths, while the ‘arts’ are considered an expression of the ‘metaphysical’, focusing on personal reflection on the human experience. Scientist and novelist C.P. Snow brought this dichotomy to public attention in his 1959 lecture at Cambridge University, entitled ‘The Two Cultures and the Scientific Revolution’. The lecture was a landmark in the history of the relationship, and sparked a debate that continues in contemporary popular discourse. Snow asserted that two distinct cultures existed within the educated elite: that of scientists and that of literary and artistic intellectuals, the two parties experiencing “mutual incomprehension”. He described the relationship as an epistemological dichotomy, with different values placed on the existential moment and the human condition. He also generalised that non-scientists saw scientists as “shallowly optimistic, unaware of man’s condition” and scientists saw intellectuals as “totally lacking in foresight…in a deep sense anti-intellectual, anxious to restrict both art and thought to the existential moment”.

Snow’s argument reflected contemporary cultural ideas, and though it may appear outdated, it is interesting to consider whether a cultural divide between ‘sciences’ and ‘arts’ persists. In her book ‘Art and Science’, Ede argues that science is still characterised by a reductionist philosophy, while art remains attached to the world of superstition that certain sections of science see as encouraging prejudice and exploitation in society. Ede explores the similarities and differences between the two fields, as well as considering the ambiguities of language and epistemology. Her work alludes to the same cultural dichotomy between the existential and metaphysical arts and the empirical and objective sciences.

From art and science to ‘sciart’

A major theme within the ‘sciart’ movement has been to bridge this perceived divide between the ‘two cultures’ of science and art. However, the field of ‘sciart’ is not new; the concept preceded the ‘two cultures’ lecture, the term having been coined by Bern Porter in his ‘Sciart Manifesto’ of 1950. Notably, from the mid-1990s the increasing popularity of sciart was reflected in growing encouragement for collaborations between scientists and artists through provision of funding from major institutions, including the Wellcome Trust and the Sciart consortium.

This increase in interdisciplinary practice between science and art leads us to question whether the separation between the epistemological traditions of the arts and sciences is still prevalent today. Some texts describe art and science coming together under such ‘metaphysical’ pretences as extending our experience of the world or allowing deeper moral discussion. Such themes are superficially interesting; however, they fail to evoke questions about the practical nature of science and art collaboration. The practices of art and science uphold different conventions, and the publicly shared understanding of the ‘roles’ of the artist and the scientist is often skewed. In some cases the way in which collaboration occurs has been shaped by the conventions of science, for example the need to gain funding through science institutions, and in others the conventions associated with the ‘free expression’ of the arts have influenced science.

The boundaries between science and art are becoming increasingly blurred. The artworks from C-Lab use synthetic biology as their medium. From genetically engineering bacteria to glow when they feel stressed, to creating bacteria capable of degrading material dye, C-Lab’s creations use the laboratory as their canvas. In this sense artists are adhering to the conventions of science, using lab techniques in order to produce artistic creations.

For innovators seeking to produce novel products for science, it is very interesting to consider how the ‘arts’ can disrupt or bring new insight to conventional science innovation processes. A good example comes from scientists at the European Bioinformatics Institute in Cambridge collaborating with a science design company called Science Practice. Designers at Science Practice have developed new ways for scientists to visualise protein sequence bundles. Individual sequences or groups of protein sequences can be highlighted in different colours and easily compared on the same visualisation. The technique keeps individual sequences as intact, distinct lines rather than separating and summarising the individual building blocks, revealing patterns in sequential blocks that would be obscured by statistical methods. This provides a novel way of presenting and analysing data: the graphic designers were able to help scientists ‘see’ their data in a new light.
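As an illustration of the general idea (a minimal sketch, not Science Practice’s actual software), each aligned sequence can be drawn as one unbroken line by mapping every residue to a numeric value. The Kyte–Doolittle hydropathy scale is used here purely for illustration; any per-residue property would do:

```python
# Sketch: draw each protein sequence as one intact line so that
# position-to-position patterns stay visible, rather than summarising
# each alignment column statistically.

# Kyte-Doolittle hydropathy values, one per amino acid (illustrative choice).
HYDROPATHY = {
    "A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
    "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
    "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
    "Y": -1.3, "V": 4.2,
}

def sequence_track(seq):
    """Convert an aligned sequence into one y-value per position."""
    return [HYDROPATHY[aa] for aa in seq]

def plot_bundle(sequences, colours):
    """Plot every sequence as a single semi-transparent line;
    groups of sequences can share a colour for easy comparison."""
    import matplotlib.pyplot as plt
    for seq, colour in zip(sequences, colours):
        plt.plot(range(len(seq)), sequence_track(seq), color=colour, alpha=0.5)
    plt.xlabel("Alignment position")
    plt.ylabel("Hydropathy")
    plt.show()
```

Because each sequence remains a single polyline, a run of shared residues shows up as lines travelling together, which is exactly the kind of sequential pattern a column-wise summary would hide.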

New approaches, new perspectives

Sometimes the influence of art on science can be more subtle, but just as important; new perspectives can be brought about through interdisciplinary collaboration. A malaria scientist obsessed with merozoites worked with a photographic artist through funding from the Wellcome Trust. Together they went to the clinics in East Africa that provided the blood samples for malarial research. The scientist became the photographer’s assistant, holding tripods rather than syringes, and seeing the community for the first time through eyes free to look more broadly at the issue of malaria. This allowed him to change the way he worked for the better. This collaborative influence is more difficult to quantify, but it is clear that art and science interactions can have a variety of effects, from changing personal perspectives to influencing how data is perceived.

For a successful application to the Longitude Prize, it is important that teams embrace interdisciplinarity and realise the potential that science and art collaborations have to strengthen ideas. Truly novel innovation often happens in unexpected ways. The winner of the Longitude Prize will have a novel diagnostic with the potential to be game-changing in the healthcare and surveillance sector, and an artistic perspective has the potential to make a difference.


Lateral thinking on diagnostics can help tackle antibiotic resistance


(Originally published on Health Service Journal Blog, 18th Nov 2014, available here)

New diagnostic technologies can help improve patient outcomes, support changes to our habits and improve surveillance systems to tackle antibiotic resistance, writes Anna Williams

The rising use of antibiotics was linked to the growing prevalence of antimicrobial resistance in a Public Health England report, published last month. To most health professionals this is not groundbreaking news. Nevertheless, our understanding of the mechanisms by which resistance develops pales in comparison to the magnitude of the problem it presents.

Thankfully, this year there has been much vocal activity about antimicrobial resistance, such as the publicity surrounding the Longitude Prize. Projects such as these should make us all re-examine our own relationship with antibiotics and look at what can be done to help slow resistance. There is no “magic bullet” for this problem, but more can be done to help from both a technical and a social perspective.

Today marks the launch of the Longitude Prize, so it is fitting that we should open the competition for entries on the same occasion as European countries come together for Antibiotics Awareness Day. There is no better time to spread the message about prudent use of antibiotics and reflect on the global problem of antibiotic resistance.

Focus on the future

The Longitude Prize 2014 is a competition with a £10m fund to tackle antibiotic resistance.

I work as a researcher in the team developing the prize at Nesta. We are asking people from across the globe to come up with a point of care diagnostic to detect and understand infections to enable the prescription of the right antibiotics at the right time.

We hope entrants produce new and innovative diagnostics that will ultimately help to slow the rate of antimicrobial resistance and enable us to safeguard antibiotics for the future.

Tools and tests are one way to help us slow resistance by providing healthcare professionals with information to help make accurate clinical decisions. They can help definitively rule in or out bacterial infection, directly informing the decision to prescribe antibiotics.

However, there remains a technical challenge in developing a diagnostic that is rapid and sufficiently accurate to inform antibiotic prescriptions at the point of care, as well as being cost effective. The diagnostics the Longitude Prize seeks to develop may also go one step further by providing all of the necessary information to identify an effective antibiotic or combination of antibiotics.
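To make “sufficiently accurate” concrete, the standard sensitivity/specificity arithmetic shows why point-of-care accuracy matters so much: once the prevalence of bacterial infection among presenting patients is taken into account, even a good test leaves a meaningful error rate. The figures below are illustrative assumptions, not Longitude Prize criteria:

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Standard Bayes calculation: how trustworthy is a single test result?"""
    tp = sensitivity * prevalence                # true positives
    fp = (1 - specificity) * (1 - prevalence)    # false positives
    fn = (1 - sensitivity) * prevalence          # false negatives
    tn = specificity * (1 - prevalence)          # true negatives
    ppv = tp / (tp + fp)   # P(bacterial | test positive)
    npv = tn / (tn + fn)   # P(not bacterial | test negative)
    return ppv, npv

# Illustrative scenario: a 95%-sensitive, 95%-specific test, where 20%
# of presenting patients actually have a bacterial infection.
ppv, npv = predictive_values(0.95, 0.95, 0.20)
# ppv is roughly 0.83: about 1 in 6 positive results would still be wrong.
```

This is why the prize asks for a test that is both rapid and highly accurate: at realistic prevalences, modest specificity translates into many unnecessary antibiotic prescriptions.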

This would allow clinicians to use a targeted, narrow spectrum of antibiotics that perhaps they wouldn’t normally use on the first occasion, thereby minimising the use of broad spectrum antibiotics. There are some exciting diagnostics out there already – for example, Spectromics is in the patent application stage for its urinary tract infection diagnostic. The test monitors phenotypic change that occurs between a urine sample and different candidate antibiotics to provide guidance on the most effective treatment. Up to 70 per cent of urinary tract infections are resistant, so Spectromics and other innovators in the field have the potential to inform accurate diagnosis and treatment, thereby improving care for a large proportion of patients.

Swelling numbers

Inflammation can be an indicator for bacterial infection and there are several tests that have been developed to detect biomarkers for inflammation. There are a variety of blood based tests for procalcitonin levels that are able to detect sepsis in patients. A recent systematic review of these tests showed that they may be effective in informing the initiation and termination of antibiotic treatment for respiratory infection, reducing overall exposure to antibiotics. This is encouraging news.

However, the relative benefits of inflammatory tests such as procalcitonin and C-reactive protein are widely debated, so there is a need for further research into their use in primary care settings. There has been innovation in the range of tests for bacterial infection, but they generally are unable to fully demonstrate cost effectiveness, accuracy and usability in comparison with the best available alternatives. The Longitude Prize is pushing innovators to address such inadequacies and calling for a more ambitious test than anything currently available.

Surveillance tactics

Better surveillance techniques to collate data on bacterial resistance will help us better understand its spread and underlying mechanisms. The more we understand the situation facing us, the better we can deal with the issue head on. Surveillance is needed in all environments where resistance can occur and diagnostics can act as a tool for collecting information.

At a national level, hospitals are best equipped to carry out surveillance. However, more and more care is being delivered in the community by multiple organisations providing more than one service, and this may lead to difficulty in implementing standards for surveillance. Explicit infection control policies need to be developed by each organisation responsible for the care of individuals, specific to the settings of care. This should include all care settings, from hospitals to homes, community transport and day centres.

The standardised collection of data through diagnostic devices presents a simple way to increase surveillance in these settings. The 2011 chief medical officer’s annual report asserts that NHS England is well placed to collect, collate, analyse and disseminate information from surveillance. The UK needs to develop methods to ensure consistency and standardisation of data collection, but we also need to understand how this data will be used, so we collect the most useful information.

The European Centre for Disease Prevention and Control sets a good model for best practice in data collection, evaluation and dissemination. Such models should be considered when action is taken to federate and connect databases from health and social care in the UK.

The chief medical officer has also called for interoperability standards for health information systems, so emerging surveillance technologies can easily be integrated. This interoperability will also provide vital flexibility when actions are taken to globally integrate surveillance efforts.

Human nature

Human behaviour plays a big role in tackling resistance. The British public demonstrated their support for tackling the issue of antibiotic resistance by voting that this should be the focus of the Longitude Prize. In a new survey – published today for the prize – 78 per cent of respondents expressed concern about the issue of antibiotic resistance. However, it is interesting that public concern may not always translate into responsible behaviour, because nearly a quarter of respondents admitted to not completing a course of antibiotics prescribed to them.

Our own actions can contribute to resistance. Whether a doctor has prescribed antibiotics when not necessary or a patient has decided not to finish a course, responsibility rests with the individual. Every time we expose bacteria unnecessarily to antibiotics, we create an environment that is favourable to the development of resistance. In another survey for the Longitude Prize, 28 per cent of British GPs said they prescribe antibiotics “several times a week”, even when they’re not sure they’re medically necessary.

Nearly all (90 per cent) say they feel pressure from patients to prescribe: 70 per cent do so because they’re not sure whether the patient has a viral or bacterial infection, and 24 per cent say it’s because they lack easy to use diagnostic tools. New diagnostic technologies can help improve patient outcomes, support changes to our habits and improve surveillance systems to tackle antibiotic resistance.

As well as technology to combat resistance there must be local, national and global action to innovate within healthcare systems to make effective use of existing tools and resources to tackle the problem. We hope diagnostics will play a key role in making this journey easier.

Anna Williams is a Researcher at Nesta

Widening scope to tackle antimicrobial resistance

Originally published in the National Health Executive July/Aug 2014 – available online. 

In June the public selected antibiotic resistance to be the focus of the Longitude Prize 2014. This £10m prize fund is calling for a new and innovative point of care toolkit that will help combat the growing problem of antibiotic resistance. Anna Williams, researcher on the project, explains.

Antibiotics have revolutionised medicine, adding an average of 20 years to each person’s lifetime. Penicillin alone has saved tens of millions of lives since its discovery 86 years ago but the efficacy of these life saving drugs is threatened by the evolution of resistant bacteria. This resistance is caused in part by human behaviour and prescription practices. The G8 Science Ministers’ meeting in June 2013 identified antimicrobial resistance as “humanity’s most pressing concern, transcending national boundaries and posing significant threats to societies and ecosystems”.

Taking action: Widening scope for stewardship

The effects of antimicrobial resistance on the NHS can already be seen in efforts to tackle Methicillin-resistant Staphylococcus aureus (MRSA) and C. difficile. These resistant healthcare-associated infections caused thousands of deaths at the peak of the outbreak; the highest recorded number of deaths in the UK caused by C. difficile was 8,324 in 2007.

Through better control of antibiotic prescribing, handwashing and hygiene protocols and consistent, meticulous, intravenous central line care, outbreaks of MRSA have reduced by 84.7% between 2003 and 2011 and outbreaks of C. difficile by 53% between 2008 and 2011.

There is a large evidence base for the methods used to tackle MRSA and C. difficile; success has come in part through mandatory surveillance and target-setting. However, our current infection control policies might not be best designed to tackle infections other than MRSA and C. difficile. With outbreaks of gram-negative healthcare-associated infections on the rise, it cannot be assumed that the same methods of infection control and surveillance will be successful for these new types of resistant infection.

Gram-negative bacteria have unique features that make them harder for antibiotics to target, and they are also developing multi-drug resistance. In healthcare settings they give rise to infections including pneumonia, wound or surgical site infections and meningitis, and they now account for the majority of bloodstream infections.

Although continued monitoring is vital, perhaps we should widen the focus to infections other than MRSA and C. difficile. There are two main courses of action to be taken: prevention and control of infection, and the development of novel antibiotics. However, the number of antibiotics in the development pipeline is low, and with the growing incidence of multi-drug resistant organisms, stewardship of existing antibiotics is of increasing importance.

Courses for action

Clinical Commissioning Groups can impose financial penalties upon a Foundation Trust that fails to reach targets for the reduction of MRSA and C. difficile infections. However, these penalties have been criticised because infection trajectories are difficult to predict, bringing into question the creation of specific targets. Financial penalties could be even less effective for gram-negative infections, for which less evidence for predicting trajectories is available.

‘Stewardship’ initiatives aim to reduce the unnecessary prescribing of all types of antibiotics. The ‘Target Toolkit’ and ‘Start Smart, Then Focus’ campaigns give guidance on antibiotic prescriptions in primary and secondary care respectively. However, because gram-negative bacteria respond to very few classes of antibiotics, we require more than guidance on prescription practices.

Surveillance is needed in all environments where resistance can occur. As care is increasingly being delivered in the community by multiple organisations, this may lead to difficulty in implementing standards. Therefore, we require action to develop explicit infection control policies that are specific to the settings of care.

The Chief Medical Officer commented that the NHS is well equipped to collect, collate, analyse and disseminate information from surveillance. The Department of Health has stated that the UK needs to develop methods to ensure consistency and standardisation of data collection and this could be helped by opening up access to statistical information about infection for modelling. This work must be underpinned by common interoperability standards for health information systems so that emerging surveillance technologies can be integrated across national borders.

Point of care test kits

There is a role for innovation in diagnostics for infections to improve infection control and surveillance in the future. Current lab culture based techniques for the diagnosis of infection are slow, and require specific resources and expertise. The development of cheap, accurate and rapid point of care diagnostics will allow for the more targeted use of antibiotics. This in turn will lead to a reduction in the use of broad-spectrum antibiotics, curtailing the opportunity for resistance to occur. The impact on global levels of resistance could be far reaching if the test were versatile and cheap enough to be used in primary and secondary care settings. Such test kits will also provide an opportunity to collect data on infection from a much greater range of care settings across the world.

The more patients that get the right antibiotic prescription the first time, the longer we can preserve the action of our existing antibiotics. This is why the Longitude Prize is offering a reward fund of £10 million to incentivise innovation in point of care diagnostics for infection. Over the next five years, Nesta, with the support of the Technology Strategy Board, will be accepting and judging submissions from innovators who are working to produce a rapid, accurate test that has the potential to identify bacterial strains and profile possible resistance to antibiotics.

Novel innovation, both in diagnostics and in the production of antibacterial agents, has to be supported by surveillance and data collection. Then we can develop an effective evidence base for techniques to control antimicrobial resistance with co-ordination at a local and national level. The global nature of resistance means that we must also find new ways to work internationally, share data and set standards for interoperability in order to slow this growing threat.

Anna Williams, researcher, Longitude Prize 2014

Longitude Prize: Incentivising MedTech to deliver solutions


Originally published on MedTech Views Blog 19th June 2014, available here.

The £10 million Longitude Prize forms another route to funding vital medical and healthcare research, from antimicrobial resistance to paralysis and dementia.


MedTech is transforming the world in which we live; we are healthier and more able than ever before. However, there are still a number of fundamental challenges that we face both locally and globally. The Longitude Prize 2014, with its £10 million prize fund, is seeking solutions to some of these fundamental scientific challenges. Antibiotic resistance, paralysis and dementia are on the shortlist of issues that could reap a £10m research windfall. The MedTech community has a vital and integral role to play, harnessing its power to innovate and accelerate towards the winning solutions!

The British public is being asked to cast the deciding vote to choose which challenge the Longitude Prize 2014 should focus on. Three of the six shortlisted challenges have implications for medicine and healthcare. This not only reflects the global importance of improving healthcare, but also highlights the importance of MedTech as a discipline. From dementia to antimicrobial resistance and paralysis, these challenges are diverse and require lateral thinking, which we so often see in the entrepreneurial MedTech community. Nesta, with the support of the Technology Strategy Board, has developed the prize, and the Longitude Committee identified these six challenges with careful consideration, ensuring a science prize would be an appropriate mechanism to stimulate innovation and collaboration that might otherwise not occur.


The preservation of antibiotics and the production of novel alternatives is vital to our future survival. Many existing antimicrobials are becoming less effective as bacterial colonies develop resistance to treatment, while the inappropriate use and misuse of these medicines is accelerating the number of reported cases of resistance globally. The pipeline for the development of new antibiotics is at an all-time low, and initiatives to implement behavioural and education programmes are in their infancy. Most policy proposals to tackle antimicrobial resistance put forward two main points for action: action to conserve the antimicrobials that we already have, and action to accelerate solutions in diagnosis and drug development.

For the first time, antimicrobial resistance topped the agenda at the G8 meeting of science ministers last year. While solutions have been proposed to incentivise and accelerate drug development with initiatives such as Advanced Market Agreements (AMAs) and Product Development Partnerships (PDPs), challenge prizes still have an important role to play. The Chief Medical Officer for England, Dame Sally Davies, has highlighted the need for a range of incentives to address the currently stagnant pipeline: primarily she proposes the use of PDPs, AMAs and science prizes, as well as changes in patent agreements to extend the patent period from twenty years to, say, twenty-five years.

Antibiotics underpin all modern medicine. It is vital that health professionals can make increasingly accurate prescriptions, reducing the number of broad-spectrum antibiotics used. There are several interesting MedTech and biomedical research groups already working in this area. For example, RAPP-ID are working on point-of-care test platforms for infectious diseases. But as Rangarajan Sampath mentioned in his blog yesterday, current culture based techniques are often inaccurate for bacterial diagnosis and we require innovation to improve patient care. The Longitude Prize for antibiotics will incentivise this much-needed innovation: competitors will be asked to develop a cheap, rapid and extremely accurate point of care test that can enable practitioners to diagnose a bacterial infection in a variety of healthcare settings.


Paralysis can be devastating, and affects people with a range of medical conditions from stroke to nerve damage. Although some types of paralysis can be improved through intense rehabilitation, there is no effective treatment to restore the function of the nervous system. To the outside world, this limited mobility is the main symptom of paralysis. In reality, numerous secondary conditions dramatically affect the day-to-day life of those with paralysis. These often include loss of normal bladder, bowel and sexual function, low blood pressure, the formation of blood clots, pneumonia, neuropathic pain, spasticity and muscle spasms.

Given the multiple causes of paralysis, for example stroke, spinal cord injury and multiple sclerosis, a total cure for paralysis is a distant goal for medical science. In the meantime, there is an extraordinary opportunity to develop incremental solutions that restore freedom to those who have been paralysed. Therefore the paralysis prize seeks solutions that could restore movement to individuals with any form of paralysis, in a light, easy-to-use form. The innovation must also address the secondary symptoms of paralysis.

The beauty of this prize is that there is scope for new forms of collaboration within the MedTech industry. For example, neuroprosthetics is an area of research that has seen significant progress in the past decade; some treatments focus on replacement strategies, recording the electrical signals of neurons in the brain and translating them into the movement of devices such as robotic arms. Regenerative medicine has also made a lot of progress towards finding a cure for paralysis, but solutions are still at an early stage. Developments in robotics, bioengineering and artificial intelligence have led to innovative technological solutions that offer support to people with paralysis. Assistive devices such as the powered exoskeleton REX are an amazing feat of engineering; however, all of the current fields of research need further refinement and could benefit from future collaboration in order to win the prize.


Picture: REX in action


It is estimated that 135 million people worldwide will have dementia by 2050.

In order to solve the problems posed by dementia, we need a cure, condition-altering treatment or a robust preventative intervention. New treatments, for instance anti-tau drugs, are currently being developed and could potentially improve cognitive functioning, but it could be many years before they are approved for clinical use, if at all. As with all chronic conditions, care plays a critical role in the management of dementia. This care will usually take the form of emotional, cognitive, and physical support from paid carers, but also from close family members, and friends.

Studies suggest that telecare systems and home automation have great potential to reduce the cost of chronic conditions where management is key. The largest barrier to success of the systems that currently exist is that they are not well suited to the nature of dementia; for instance, they often require interacting with new, unfamiliar devices or changing established patterns of behaviour in order to acquire meaningful information, things which many dementia sufferers would find difficult. However, there is a range of novel technologies in development, from simple location devices and fall sensors to more complex ambient sensor systems such as the Ambient Kitchen developed by Newcastle University. Although stand-alone technologies exist, the prize for dementia will focus on the development of assistive technologies that deliver an exceptional level of care, rewarding innovation that provides an integrated home system of ambient technologies supporting people with dementia to live independently in their own homes for longer.
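As a toy illustration of the ‘ambient’ idea, the sketch below shows how passive sensor data might yield meaningful information without requiring the person to interact with any device: simply flagging unusually long gaps between motion-sensor events. The four-hour threshold and the single merged event stream are assumptions for illustration, not a description of any real system:

```python
from datetime import datetime, timedelta

def inactivity_alerts(events, max_gap=timedelta(hours=4)):
    """Given a time-ordered list of sensor-event timestamps, return
    (start, end) pairs where no sensor fired for longer than max_gap.

    The person never has to press anything: the absence of routine
    activity is itself the signal."""
    gaps = []
    for earlier, later in zip(events, events[1:]):
        if later - earlier > max_gap:
            gaps.append((earlier, later))
    return gaps

# Example: motion events at 08:00 and 09:00, then silence until 15:00.
events = [
    datetime(2014, 1, 1, 8, 0),
    datetime(2014, 1, 1, 9, 0),
    datetime(2014, 1, 1, 15, 0),
]
alerts = inactivity_alerts(events)  # the 09:00-15:00 gap exceeds 4 hours
```

A real system would learn each person’s routine rather than use a fixed threshold, but the design principle is the same: acquire meaningful information from behaviour that is already happening.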

Many of these issues are already being worked on by experts in a diverse range of science and technology disciplines related to medicine and healthcare. Three other non-medical challenge areas are also available to vote on: water, food and flight. Prizes open up new opportunities, and we are throwing down the gauntlet to the MedTech community to provide new innovation. In Autumn 2014, we open the challenge to innovators across the world to solve the public’s chosen problem.

Find out more and vote for your chosen challenge here:

Watch the BBC Horizon documentary about the Longitude Prize here:

Twitter: @Longitude_prize


Antimicrobial resistance: Should a technological fix be considered?


The Nesta project I’m currently working on is looking to identify challenges we face as a society. Under the theme of health, we have been discussing various challenges that might be solved in the future through science and technological intervention or development.

The Health panel is led by Dame Sally Davies, Chief Medical Officer for England, who has publicly stated her concern that over-prescription of antibiotics is causing antimicrobial resistance. At a G8 meeting on health in June Davies said, “If we don’t take action, then we may all be back in an almost 19th Century environment where infections kill us as a result of routine operations.”

After discussions with our expert panel last month, it became clear that a possible future ‘fix’ to this problem might be a cheap (no more than 10p per unit) test that distinguishes viral from bacterial infections.  This would reduce the number of viral infections mistreated with antibiotics. Such a testing device might even be able to tell which type of bacteria are present, reducing the probability of prescribing the wrong form of antibiotic.


Patients also misuse antibiotics, often failing to complete a full course or taking old prescriptions. A test kit will improve doctors’ prescription accuracy, but it will not make patients more responsible about finishing a course of antibiotics. Nevertheless, I want to explore how the technological fix might be part of the solution. For the most effective intervention we must see the problem of misuse by patients and the problem of mis-prescription by doctors as parts of a bigger, socio-technical problem that can be tackled holistically.

STS scholars often take a negative view of the ‘technological fix’. Rosner suggested that the term has come to be associated with a quick, cheap fix using inappropriate technology. For others, however, the technological fix was everything. Weinberg, one of the first to discuss the technological fix in the 1960s, used the example of the intrauterine device (IUD) as a means of preventing unwanted pregnancy because it avoided the need to change people’s behaviour; it “provides a promising technological path to the achievement of birth control without having first to solve the infinitely more difficult problem of strongly motivating people to have fewer children”. This intervention embodies the cause-effect relationship connecting problem to solution, which Sarewitz and Nelson argue is one of the three rules that determine the appropriateness of a technological fix for a social problem.

The trouble with the arguments above is that they assert an all-or-nothing approach to the technological fix. Even Burke, a critic who held that conceiving of social problems in engineering terms would only make them worse, concludes that the technological fix is either appropriate or inappropriate for the task in hand.

I propose that the technological fix is appropriate for some social problems, but I argue that the all-or-nothing approach set out by Burke and Weinberg is flawed. A technological fix can happen alongside a social, political or economic intervention to achieve the best outcome: the ‘techno-social fix’.

According to Sarewitz and Nelson, the key to effective innovation policy is to distinguish problems that are amenable to technological fixes from those that are not. Using Sarewitz and Nelson’s three rules, we can see how part of the antimicrobial resistance problem could be solved through the ‘fix’ of the test kit. The first rule states that the ‘technology must embody the cause-effect relationship connecting problem to solution’. The effectiveness of a test for bacteria is independent of the person administering it; the test kit, or ‘artefact’, encapsulates the essence of what needs to be done. The second rule states that ‘the effects of the technological fix must be assessable using relatively unambiguous criteria’. Test kits are trusted tools in other forms of biological analysis, such as HIV testing or blood-sugar monitoring, and universal agreement on such criteria leads to ‘operational cooperation’. The third rule is that ‘research will solve a social problem when it focuses on improving a standardized technical core’. The standardised core in this case is the antibiotics themselves, providing the foundations from which improved practice can be implemented through an intervention such as test kits.

According to the three rules above, the bacteria test kit is clearly a ‘technological fix’ for the problem of misdiagnosis causing antimicrobial resistance, but it does not address the problem of misuse by patients. I would assert that the technological fix is still a vital part of the solution here. From a holistic perspective, the technology would have to be implemented alongside a concerted campaign of education and behaviour change for patients. A technological fix on its own forms only part of a bigger whole; in this case, technological and social interventions need to act together to produce change. This runs counter to many of the all-or-nothing arguments about the technological fix discussed above.

The Crystal Exhibition: Determining the future?

The building

The Crystal is a shining example of a futures exhibition, a modern equivalent of the Great Exhibition of 1851. The brand-new building looks much like a crystal from the outside as you approach via the new Emirates Air Line (effectively a ski lift over the Thames). Both architecturally and visually impressive, the building practises what it preaches:

“The Crystal is an all-electric building that uses solar power and a ground source heat pump to generate its own energy. That means no fossil fuels are burnt within the building. The Crystal showcases state-of-the-art technologies to make buildings more efficient and also profiles Siemens’ Environmental Portfolio.”

The exhibit is impressive: set over two floors, a mixture of interactive and immersive displays helps the visitor explore the future. By showcasing global trends and challenges, it presents possible future scenarios and solutions to existing problems in areas such as the environment, water systems, electricity, building materials and future cities.

On entering the exhibition I was impressed by its scale and design, yet the content left much to be desired. Here I will discuss two problems with the exhibit. First, this educational display presents a technologically deterministic view of science in the future, and perpetuating such a view could be dangerous. Second, it presents a picture of future science as truly dynamic, where citizens have choice and community is key, yet it fails to communicate the contingency of scientific development or its social construction. It omits to explain the process of reaching this future scenario; instead we simply jump into a perfect, technologically reliant future, one which also happens to directly promote Siemens’ new products.

Siemens future tube train

Technological determinism

It is difficult to predict the future trajectory of science and technology and the consequent impacts it might have on society. However, it is dangerous to present science and technology from a purely solutionist perspective. The exhibition’s centrepiece is a cinema room where the audience is presented with the future cities of the world:

“4.07pm New York, building facades trap CO2 and produce methanol for use as fuel. Renewable energy, efficient buildings and clean transport create the cleanest air since the industrial revolution.”

The presentation of technology here takes a technologically deterministic point of view, in which human actions are regulated by and reliant upon technology. Broadly speaking, technological determinism has been defined as having two parts, which MacKenzie and Wajcman conceptualise as follows: first, technological development happens in isolation from society, with new ideas arising from internal logic rather than outside influence; second, technological change causes or determines social change. Feenberg captures similar notions in his categorisation of “unilinear progress” and “determination by the base”. Similar ideas can be seen in Siemens’ presentation of the future. The people in the future city “do not just consume electricity, they also generate and store it. Many buildings harvest and store their own energy. Vehicles can use the surplus energy to charge their batteries. The smart grid directs drivers to buy energy off peak when the cost is low and sell it back to the grid at peak times. The city makes energy miners and energy traders of us all.”


The city ‘makes miners and traders’ of us all. Not only is this hugely deterministic; the message is also imbued with the idea that technological progress has caused social progress, as technology has optimised human action.

This expression of technology as leading social progress in the future should be used carefully, though it is a model that matches many people’s experience of technology. As Sally Wyatt puts it: “For most of us, most of the time, the technologies we use are mysterious in origin and design. We have no idea of whence they came and possibly even less idea how they actually work”.

Presenting a future where we rely on technology to make the best choices is unsettling, because of a growing dilemma: an increasing reliance on technology coupled with a growing worry about how to control it. In my work capacity, I recently asked a public focus group about the future challenges the world faces. The top problem they highlighted was the speed of technological advancement and the concern that society is failing to keep abreast of its use and control. If Siemens is presenting technology as a future ‘fix’ to the world’s problems, it might be more advantageous to present some history of the future, highlighting the co-construction of these highly developed cities. To facilitate this, Siemens should envision a society that plays a role in technological development rather than being merely a passive user of technology.

Co-construction, if not social construction

This brings me to my second point: Siemens presents the future use of technology as inclusive, strengthening democracy and communities. “Journeys across the city take people and packages from one mode of transport to another via mega hubs. Technology allows us to live efficient lives and this lifestyle allows people to join together, individuals are important but community is key.”

The Siemens future sounds close to perfect. The slick human-computer integrated systems of the ‘mega hub’ and ‘smart grid’ suggest that these technologies might have developed in a co-constructed way, yet the exhibit presents technology as separate from society. “Many people work from home, switching between business and leisure, the real and the virtual, and this fluid lifestyle allows neighbours to join together”. Feenberg argued a similar point, that technology fundamentally changes the meaning of our world: “you are what you do, but also, you are what you use”. However, if society had adapted to such an integrated and flawless use of technology, Feenberg would argue that feedback was necessary: “Technological knowledge is incomplete without the input of knowledge from experience that corrects its oversight and simplification”. Hofstadter makes a similar point about a tangled hierarchy between the public and the experts, conceptualised as a ‘strange loop’. The Siemens exhibit simply fails to map out the politics of its future technology: the possible development of technology through the interplay of technical mediations and social groups. The exhibition therefore lacks an element of real plausibility that might make it more exciting to an adult audience.


It is clear that Siemens is driving its own agenda. In this ideal future, ‘consuming’ and ‘selling’ electricity, as well as ‘paying’ for goods and services, are still common concepts; we may assume a Siemens future would be one dominated by capitalism. As Feenberg suggested, technological development under capitalist enterprise has a very narrow goal, profit, and it uniquely permits the freedom to pursue that goal without regard for consequences. Feenberg is correct; Siemens has not presented any possible consequences of its future technology, nor explained the role that society might have played in shaping it.



The Crystal exhibition is open Tuesday to Thursday from 10am to 5pm and Friday to Sunday from 10am to 7pm.


One Siemens Brothers Way
Royal Victoria Docks
London E16 1GB


The Golden Ticket


The shiny envelope contained a cheque: a multimillion-pound prize to reward science and technology innovation. It’s the science equivalent of Willy Wonka’s Golden Ticket. The premise of prizes seems simple: offer a monetary reward to incentivise science or technology innovation. Yet the lavish nature of past science prizes has been criticised; some have suggested that prizes feed the egos of billionaire funders more than they advance science or produce role models.

Longitude 2014 is Nesta’s first multimillion-pound science and technology prize to solve a major societal problem. As we get swept along in the hype around science and technology prizes, should we look at the small print before we say ‘yes’ to the golden ticket and the invitation to the chocolate factory?

The situation is complex. First we need to establish that there are different types of prize. We can simply split prizes into two categories; there are more complex ways of categorising them, and the people who design prizes would probably growl at us for this. Anyway, onwards…

The first and probably most famous type is the recognition prize, best known through the Nobel Prize: a reward for outstanding achievements in science, technology and the humanities, made after the innovation has taken place. For some these prizes raise the profile of science; for others they provide valuable tokenism, an effective way of rewarding certain scientists and reflecting society’s appreciation for science. In some respects this type of prize allows science to stick to its Mertonian norms. One central facet of Merton’s argument was that scientists should behave disinterestedly, with no emotional or financial attachments to their work. Despite the glamorous ceremonies, these prizes are good at preserving this view of the scientist, where the people ‘doing’ science are on a quest to solve a problem. Their view is unclouded by personal opinion or monetary gain; they do it for the love of science. Recognition prizes reward achievements without making scientists look unprincipled.

The second category, the inducement prize, is a different kettle of fish. These prizes seek to capture existing and new interest in pursuit of a goal set out by the prize designers. It’s a bit like bounty hunting: catch this piece of science through your data, hand it to us safely, and in exchange we will give you a monetary reward.

Inducement prizes

Inducement prizes can generally be split into two domains. The first are those that pursue ‘moonshot’ thinking, a phrase used by Google’s Solve For <x>. These prizes look to accelerate solutions through science and technology that appear so far off that it is worth having a prize to hurry things along; a good example is the Virgin Earth Challenge. The second are prizes that address a particular societal problem, in areas of science that may be less desirable or profitable, where scientists would otherwise have little incentive to find solutions.

The prize design

Alexandra Hall, Senior Director of the Google Lunar XPrize, told me: “The most effective part of the process for us is ‘visioneering’, a process whereby we get lots of people in the room, from scientists to designers and musicians.” Visioneering brings together people from the creative and scientific fields to scope possible future challenges that science and technology innovation could address, and this shapes the prize design, which is vital to achieving eventual successful innovations.

Visualising the future is an important step in the prize design process, especially when the scope of your prize may look 10-20 years ahead. A prize must be achievable, but not too easily achievable given current scientific knowledge. The Archon XPrize is a good example of how easy it is to misjudge the scale of a scientific challenge. Announced in 2006, it offered $10 million to the first team that could rapidly and accurately sequence 100 human genomes at a cost of $10,000 or less per genome. The pace of research then accelerated so much that a genome can now be sequenced for less than $5,000, and the Archon Genomic XPrize was recently cancelled.

This raises questions about the value of inducement prizes to science: how do we accurately predict the future and set realistic goals? Do prizes stifle innovation through strict criteria? After all, a lot of good science has happened by accident, from synthetic dye and the inkjet printer to penicillin and vulcanised rubber. Should we rule out inducement prizes because they won’t let us advance further than our imagination will allow?

It’s easy to design a challenge with a set metric or a specific achievement by which it can be judged; the Lunar XPrize, for example, requires teams to land a spacecraft on the moon and subsequently send back two “mooncasts” to Earth. The question remains of how we design a prize to address a significant societal problem. Indeed, academics such as Nelson have asked more generally why science has found it so difficult to solve major societal issues when we have successfully put men on the moon.

Here, ideas of innovation and the way knowledge is produced and utilised come into question. The concept of an ‘innovation system’, describing the organisation of institutions and policies that direct technological change, has been used to better frame how science and technology come into being.

In his 2011 paper, Nelson goes on to suggest that reorienting innovation systems is a positive step towards allowing science and technology to address complex societal needs, but he highlights that a fundamental barrier to innovation is limited knowledge. This problem can be seen in prize design: the objective is often to incentivise innovation, yet the judging criteria can be restrictive and exclude certain groups from entering.

Nesta’s multimillion-pound Longitude 2014 prize seeks to reward innovation that solves particular societal problems. The Prize is currently under development, and it is interesting to observe the intricacies that Nelson mentions playing out.

Science could be seen as socially constructed, a process of interaction, sharing and validation; setting specific scientific metrics to win prizes, however, adheres to the empiricist view that science is the pursuit of solid evidence. Jasanoff argued that science is a process of co-construction of knowledge, and Nelson highlights that barriers to solving societal problems grow from limited knowledge: belief replaces evidence, and there are divergent beliefs about the effects of various proposed solutions. Nelson uses the example of drug addiction and drug-related crime, where a lack of knowledge about solutions results in value-based conflicts within the scientific community. This remains true today, and I see challenges arising in the design of the Longitude Prize that follow the same pattern: there is disagreement in the scientific community about how we might tackle certain future societal problems.

It is therefore imperative that the Longitude Prize allows for unforeseen ‘moonshot’ thinking, and this must happen in the prize design phase. ‘Reorientation’ of the innovation system with respect to inducement prizes would begin with flexibility in setting the winning criteria. If we broaden our horizons, the answer to the problem of drug addiction is as likely to come from a drug company as from an artist or a designer.