Thursday, June 21, 2018

GUEST BLOG POST: Decolonize Science – Time To End Another Imperial Era — Rohan Deb Roy

Not all international collaborations are equal. US Army Africa/Flickr, CC BY
Sir Ronald Ross had just returned from an expedition to Sierra Leone. The British doctor had been leading efforts to tackle the malaria that so often killed English colonists in the country, and in December 1899 he gave a lecture to the Liverpool Chamber of Commerce about his experience. In the words of a contemporary report, he argued that “in the coming century, the success of imperialism will depend largely upon success with the microscope”.

Ross, who won the Nobel Prize for Medicine for his malaria research, would later deny he was talking specifically about his own work. But his point neatly summarized how the efforts of British scientists were intertwined with their country’s attempt to conquer a quarter of the world.

Ross was very much a child of empire, born in India and later working there as a surgeon in the imperial army. So when he used a microscope to identify how a dreaded tropical disease was transmitted, he would have realized that his discovery promised to safeguard the health of British troops and officials in the tropics. In turn, this would enable Britain to expand and consolidate its colonial rule.

Ross’s words also suggest how science was used to argue imperialism was morally justified because it reflected British goodwill towards colonized people. It implied that scientific insights could be redeployed to promote superior health, hygiene and sanitation among colonial subjects. Empire was seen as a benevolent, selfless project. As Ross’s fellow Nobel laureate Rudyard Kipling described it, it was the “white man’s burden” to introduce modernity and civilized governance in the colonies.

But science at this time was more than just a practical or ideological tool when it came to empire. Since its birth around the same time as Europeans began conquering other parts of the world, modern Western science was inextricably entangled with colonialism, especially British imperialism. And the legacy of that colonialism still pervades science today.

As a result, recent years have seen an increasing number of calls to “decolonize science”, even going so far as to advocate scrapping the practice and findings of modern science altogether. Tackling the lingering influence of colonialism in science is much needed. But there are also dangers that the more extreme attempts to do so could play into the hands of religious fundamentalists and ultra-nationalists. We must find a way to remove the inequalities promoted by modern science while making sure its huge potential benefits work for everyone, instead of letting it become a tool for oppression.

The gracious gift of science
When a slave in an early 18th-century Jamaican plantation was found with a supposedly poisonous plant, his European overlords showed him no mercy. Suspected of conspiring to cause disorder on the plantation, he was treated with typical harshness and hanged to death. The historical records don’t even mention his name. His execution might also have been forgotten forever if it weren’t for the scientific enquiry that followed. Europeans on the plantation became curious about the plant and, building on the slave’s “accidental finding”, they eventually concluded it wasn’t poisonous at all.

Instead it became known as a cure for worms, warts, ringworm, freckles and cold swellings, with the name Apocynum erectum. As the historian Pratik Chakrabarti argues in a recent book, this incident serves as a neat example of how, under European political and commercial domination, gathering knowledge about nature could take place simultaneously with exploitation.

For imperialists and their modern apologists, science and medicine were among the gracious gifts from the European empires to the colonial world. What’s more, the 19th-century imperial ideologues saw the scientific successes of the West as a way to allege that non-Europeans were intellectually inferior and so deserved and needed to be colonized.

In the incredibly influential 1835 memo “Minute on Indian Education”, British politician Thomas Macaulay denounced Indian languages partially because they lacked scientific words. He suggested that languages such as Sanskrit and Arabic were “barren of useful knowledge”, “fruitful of monstrous superstitions” and contained “false history, false astronomy, false medicine”.

Such opinions weren’t confined to colonial officials and imperial ideologues and were often shared by various representatives of the scientific profession. The prominent Victorian scientist Sir Francis Galton argued that “the average intellectual standard of the negro race is some two grades below our own (the Anglo Saxon)”. Even Charles Darwin implied that “savage races” such as “the negro or the Australian” were closer to gorillas than were white Caucasians.

Yet 19th-century British science was itself built upon a global repertoire of wisdom, information, and living and material specimens collected from various corners of the colonial world. Extracting raw materials from colonial mines and plantations went hand in hand with extracting scientific information and specimens from colonized people.

Imperial collections
Leading public scientific institutions in imperial Britain, such as the Royal Botanic Gardens at Kew and the British Museum, as well as ethnographic displays of “exotic” humans, relied on a global network of colonial collectors and go-betweens. By 1857, the East India Company’s London zoological museum boasted insect specimens from across the colonial world, including from Ceylon, India, Java and Nepal.

The British Museum and the Natural History Museum were founded using the personal collection of the doctor and naturalist Sir Hans Sloane. To gather these thousands of specimens, Sloane had worked intimately with the East India, South Sea and Royal African companies, which did a great deal to help establish the British Empire.

The scientists who used this evidence were rarely sedentary geniuses working in laboratories insulated from imperial politics and economics. The likes of Charles Darwin on the Beagle and botanist Sir Joseph Banks on the Endeavour literally rode on the voyages of British exploration and conquest that enabled imperialism.

Other scientific careers were directly driven by imperial achievements and needs. Early anthropological work in British India, such as Sir Herbert Hope Risley’s Tribes and Castes of Bengal, published in 1891, drew upon massive administrative classifications of the colonized population.

Map-making operations including the work of the Great Trigonometrical Survey in South Asia came from the need to cross colonial landscapes for trade and military campaigns. The geological surveys commissioned around the world by Sir Roderick Murchison were linked with intelligence gathering on minerals and local politics.

Efforts to curb epidemic diseases such as plague, smallpox and cholera led to attempts to discipline the routines, diets and movements of colonial subjects. This opened up a political process that the historian David Arnold has termed the “colonization of the body”. By controlling people as well as countries, the authorities turned medicine into a weapon with which to secure imperial rule.

New technologies were also put to use expanding and consolidating the empire. Photographs were used for creating physical and racial stereotypes of different groups of colonized people. Steamboats were crucial in the colonial exploration of Africa in the mid-19th century. Aircraft enabled the British to surveil and then bomb rebellions in 20th-century Iraq. The innovation of wireless radio in the 1890s was shaped by Britain’s need for discreet, long-distance communication during the South African war.

In these ways and more, Europe’s leaps in science and technology during this period both drove and were driven by its political and economic domination of the rest of the world. Modern science was effectively built on a system that exploited millions of people. At the same time it helped justify and sustain that exploitation, in ways that hugely influenced how Europeans saw other races and countries. What’s more, colonial legacies continue to shape trends in science today.

Sir Hans Sloane’s imperial collection started the British Museum. Paul Hudson/Wikipedia, CC BY
Modern colonial science
Since the formal end of colonialism, we have become better at recognizing how scientific expertise has come from many different countries and ethnicities. Yet former imperial nations still appear almost self-evidently superior to most of the once-colonized countries when it comes to scientific study. The empires may have virtually disappeared, but the cultural biases and disadvantages they imposed have not.

You just have to look at the statistics on the way research is carried out globally to see how the scientific hierarchy created by colonialism continues. The annual rankings of universities are published mostly by the Western world and tend to favour its own institutions. Academic journals across the different branches of science are mostly dominated by the US and western Europe.

It is unlikely that anyone who wishes to be taken seriously today would explain this data in terms of innate intellectual superiority determined by race. The blatant scientific racism of the 19th century has now given way to the notion that excellence in science and technology is a euphemism for significant funding, infrastructure and economic development.

Because of this, most of Asia, Africa and the Caribbean is seen either as playing catch-up with the developed world or as dependent on its scientific expertise and financial aid. Some academics have identified these trends as evidence of the persisting “intellectual domination of the West” and labelled them a form of “neo-colonialism”.

Various well-meaning efforts to bridge this gap have struggled to go beyond the legacies of colonialism. For example, scientific collaboration between countries can be a fruitful way of sharing skills and knowledge, and learning from the intellectual insights of one another. But when an economically weaker part of the world collaborates almost exclusively with very strong scientific partners, it can take the form of dependence, if not subordination.

A 2009 study showed that about 80% of Central Africa’s research papers were produced with collaborators based outside the region. With the exception of Rwanda, each of the African countries principally collaborated with its former colonizer. As a result, these dominant collaborators shaped scientific work in the region. They prioritized research on immediate local health-related issues, particularly infectious and tropical diseases, rather than encouraging local scientists to also pursue the fuller range of topics pursued in the West.

In the case of Cameroon, local scientists most commonly collected data and carried out fieldwork, while foreign collaborators shouldered much of the analytical work. This echoed a 2003 study of international collaborations in at least 48 developing countries, which suggested local scientists too often carried out “fieldwork in their own country for the foreign researchers”.

In the same study, 60% to 70% of the scientists based in developed countries did not acknowledge their collaborators in poorer countries as co-authors in their papers. This is despite the fact they later claimed in the survey that the papers were the result of close collaborations.

Mistrust and resistance
International health charities, which are dominated by Western countries, have faced similar issues. After the formal end of colonial rule, global health workers long appeared to represent a superior scientific culture in an alien environment. Unsurprisingly, interactions between these skilled and dedicated foreign personnel and the local population have often been characterized by mistrust.

For example, during the smallpox eradication campaigns of the 1970s and the polio campaign of the past two decades, the World Health Organization’s representatives found it quite challenging to mobilize willing participants and volunteers in the interiors of South Asia. On occasion they even met resistance on religious grounds from local people. But their stringent responses, which included the close surveillance of villages, cash incentives for identifying concealed cases and house-to-house searches, added to this climate of mutual suspicion. These experiences of mistrust are reminiscent of those created by strict colonial policies of plague control.

Western pharmaceutical firms also play a role by carrying out questionable clinical trials in the developing world where, as journalist Sonia Shah puts it, “ethical oversight is minimal and desperate patients abound”. This raises moral questions about whether multinational corporations misuse the economic weaknesses of once-colonized countries in the interests of scientific and medical research.

The colonial image of science as a domain of the white man even continues to shape contemporary scientific practice in developed countries. People from ethnic minorities are underrepresented in science and engineering jobs and more likely to face discrimination and other barriers to career progress.

To finally leave behind the baggage of colonialism, scientific collaborations need to become more symmetrical and founded on greater degrees of mutual respect. We need to decolonize science by recognizing the true achievements and potential of scientists from outside the Western world. Yet while this structural change is necessary, the path to decolonization has dangers of its own.

A March for Science protester in Melbourne. www.wikimedia.com, CC BY-SA
Science must fall?
In October 2016, a YouTube video of students discussing the decolonization of science went surprisingly viral. The clip, which has been watched more than 1m times, shows a student from the University of Cape Town arguing that science as a whole should be scrapped and started again in a way that accommodates non-Western perspectives and experiences. The student’s point that science cannot explain so-called black magic earned the argument much derision and mockery. But you only have to look at the racist and ignorant comments left beneath the video to see why the topic is so in need of discussion.

Inspired by the recent “Rhodes Must Fall” campaign against the university legacy of the imperialist Cecil Rhodes, the Cape Town students became associated with the phrase “science must fall”. While it may be an interesting provocation, this slogan isn’t helpful at a time when government policies in a range of countries including the US, UK and India are already threatening to impose major limits on science research funding.

More alarmingly, the phrase also runs the risk of being used by religious fundamentalists and cynical politicians in their arguments against established scientific theories such as climate change. This is a time when the integrity of experts is under fire and science is the target of political manoeuvring. So polemically rejecting the subject altogether only plays into the hands of those who have no interest in decolonization.

Alongside its imperial history, science has also inspired many people in the former colonial world to demonstrate remarkable courage, critical thinking and dissent in the face of established beliefs and conservative traditions. These include the iconic Indian anti-caste activist Rohith Vemula and the murdered atheist authors Narendra Dabholkar and Avijit Roy. Demanding that “science must fall” fails to do justice to this legacy.

The call to decolonize science, as in the case of other disciplines such as literature, can encourage us to rethink the dominant image that scientific knowledge is the work of white men. But this much-needed critique of the scientific canon carries the other danger of inspiring alternative national narratives in post-colonial countries.

For example, some Indian nationalists, including the country’s current prime minister, Narendra Modi, have emphasized the scientific glories of an ancient Hindu civilization. They argue that plastic surgery, genetic science, aeroplanes and stem cell technology were in vogue in India thousands of years ago. These claims are not just a problem because they are factually inaccurate. Misusing science to stoke a sense of nationalist pride can easily feed into jingoism.

Meanwhile, various forms of modern science and their potential benefits have been rejected as unpatriotic. In 2016, a senior Indian government official even went so far as to claim that “doctors prescribing non-Ayurvedic medicines are anti-national”.

The path to decolonization
Attempts to decolonize science need to contest jingoistic claims of cultural superiority, whether they come from European imperial ideologues or the current representatives of post-colonial governments. This is where new trends in the history of science can be helpful.

For example, instead of the parochial understanding of science as the work of lone geniuses, we could insist on a more cosmopolitan model. This would recognize how different networks of people have often worked together in scientific projects and the cultural exchanges that helped them – even if those exchanges were unequal and exploitative.

But if scientists and historians are serious about “decolonizing science” in this way, they need to do much more to present the culturally diverse and global origins of science to a wider, non-specialist audience. For example, we need to make sure this decolonized story of the development of science makes its way into schools.

Students should also be taught how empires affected the development of science and how scientific knowledge was reinforced, used and sometimes resisted by colonized people. We should encourage budding scientists to question whether science has done enough to dispel modern prejudices based on concepts of race, gender, class and nationality.

Ronald Ross at his lab in Calcutta, 1898. Wellcome Collection, CC BY
Decolonizing science will also involve encouraging Western institutions that hold imperial scientific collections to reflect more on the violent political contexts of war and colonization in which these items were acquired. An obvious step forward would be to discuss repatriating scientific specimens to former colonies, as botanists working on plants originally from Angola but held primarily in Europe have done. If repatriation isn’t possible, then co-ownership or priority access for academics from post-colonial countries should at least be considered.

This is also an opportunity for the broader scientific community to critically reflect on its own profession. Doing so will inspire scientists to think more about the political contexts that have kept their work going and about how changing them could benefit the scientific profession around the world. It should spark conversations between the sciences and other disciplines about their shared colonial past and how to address the issues it creates.

Unravelling the legacies of colonial science will take time. But the field needs strengthening at a time when some of the most influential countries in the world have adopted a lukewarm attitude towards scientific values and findings. Decolonization promises to make science more appealing by integrating its findings more firmly with questions of justice, ethics and democracy. Perhaps, in the coming century, success with the microscope will depend on success in tackling the lingering effects of imperialism.

Schools need to teach the non-Western history of science. 

Originally published on THE CONVERSATION

Saturday, June 16, 2018

GUEST BLOG POST: How Clever People Help Societies Work Together Better — Eugenio Proto, Aldo Rustichini & Andis Sofianos

via shutterstock.com 
What drives people to cooperate with each other? And what characteristics lead a person to do something that will both benefit them, and those around them? Our new research suggests that the answer is intelligence: it is the primary condition for a socially cohesive and cooperative society.

In the past, some economists have suggested that consideration of others and generally pro-social attitudes are what motivate people towards more generous and cooperative behaviours which help sustain a cohesive society. Others have suggested that adhering to good norms and respecting institutions push us towards more socially useful behaviours.

But another possibility is that insightful self-interest guides us to become effectively good citizens – and that cooperation arises in society if people are smart enough to foresee the social consequences of their actions, including the consequences for others.

The prisoner’s dilemma
Our study, which took place in behavioural labs in the US and UK with 792 participants, was designed to test these three different suggestions for why people cooperate with each other. In it, we used games whose rules assign a reward to two players depending on their decisions.

One of these games was the prisoner’s dilemma game. The easiest way to describe the game is using the original example of two criminals who have been arrested. They are interrogated in separate rooms with no means of communicating with each other. Each prisoner is given the opportunity either to betray the other by testifying that the other committed the crime – an uncooperative choice – or to cooperate with the other by remaining silent.

If both prisoners betray each other, they each serve two years in prison – the uncooperative outcome. If one betrays the other and the other remains silent, the first will be set free and the other will serve three years in prison – and vice versa. If both remain silent, they will only serve one year in prison – the cooperative outcome.

This is a standard example of a game analysed in game theory that shows why two completely rational individuals might not cooperate, even if it appears that it is in their best interest to do so. It is also a good example of a non-zero-sum game – where the cooperative behaviour is mutually beneficial. In general, it depicts a situation reflecting the properties of the interactions we all experience most frequently in society.
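
To make the structure of the dilemma concrete, here is a minimal sketch in Python – our own illustration, not code from the study – that encodes the prison sentences described above and checks that betrayal is each prisoner’s best reply whatever the other does, even though mutual silence leaves both better off.

```python
# Minimal illustration (not the study's code): the one-shot prisoner's dilemma
# described above, with payoffs expressed as years in prison (lower is better).
# Keys are (my choice, partner's choice); values are (my years, partner's years).
SENTENCES = {
    ("cooperate", "cooperate"): (1, 1),  # both remain silent
    ("cooperate", "defect"):    (3, 0),  # I stay silent, the other betrays me
    ("defect",    "cooperate"): (0, 3),  # I betray, the other stays silent
    ("defect",    "defect"):    (2, 2),  # both betray
}

def best_reply(partner_choice):
    """The choice that minimises my own prison time against a fixed partner choice."""
    return min(("cooperate", "defect"),
               key=lambda mine: SENTENCES[(mine, partner_choice)][0])

# Whatever the partner does, betraying is individually better...
assert best_reply("cooperate") == "defect"
assert best_reply("defect") == "defect"
# ...yet mutual cooperation (1, 1) beats mutual defection (2, 2) for both players.
```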

As usual in experimental economics, we had participants play this game with monetary payoffs – instead of imprisonment. We matched two subjects in the same session anonymously and let them play the same game repeatedly for an indefinite number of rounds. After that, we re-matched them with a different partner and the game started again. This went on for 45 minutes. Each player learns by adjusting their decisions based on how others in the same room have played in the past.
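
The sketch below – again an illustration of ours, with made-up monetary payoffs and a simple “noisy tit-for-tat” rule standing in for real participants – shows the shape of this protocol: indefinitely repeated rounds within a match, repeated re-matching across many matches, and a cooperation rate computed as the share of cooperative choices.

```python
# Illustrative sketch of the repeated-game protocol, not the experiment's software.
# The payoff numbers and the "noisy tit-for-tat" behaviour are assumptions made
# for this example; the study's actual stakes and matching rules may differ.
import random

PAYOFF = {("C", "C"): 48, ("C", "D"): 12,   # (my move, partner's move) -> my payoff
          ("D", "C"): 50, ("D", "D"): 25}   # hypothetical monetary units
DELTA = 0.75    # probability that the current match continues for another round
NOISE = 0.10    # probability that a player defects regardless of the rule

def noisy_tit_for_tat(partner_last):
    """Copy the partner's previous move, but occasionally defect anyway."""
    return "D" if random.random() < NOISE else partner_last

def play_match():
    """One indefinitely repeated match between two players; returns the move history."""
    a_last, b_last = "C", "C"               # both players start by cooperating
    history = []
    while True:
        a, b = noisy_tit_for_tat(b_last), noisy_tit_for_tat(a_last)
        history.append((a, b))
        a_last, b_last = a, b
        if random.random() > DELTA:         # indefinite horizon: stop at random
            return history

random.seed(1)
rounds = [r for _ in range(500) for r in play_match()]   # 500 re-matched pairs
coop_rate = sum(m == "C" for a, b in rounds for m in (a, b)) / (2 * len(rounds))
avg_payoff = sum(PAYOFF[(a, b)] + PAYOFF[(b, a)] for a, b in rounds) / (2 * len(rounds))
print(f"cooperation rate: {coop_rate:.2f}, average payoff per round: {avg_payoff:.1f}")
```

The indefinite horizon is the important design choice here: when players do not know which round will be the last, conditional cooperation can be sustained in a way it cannot in a one-shot game.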

Intelligence sparks cooperation
We then created two “cities”, or groups of subjects, sorted by characteristics based on cognitive and personality traits that we had measured two days earlier, by asking the participants to fill in a standard questionnaire. One such characteristic was a measure of pro-social attitudes, namely the personality trait of agreeableness. Another characteristic was a measure of adherence to norms, specifically the personality trait of conscientiousness. The third characteristic was that of intelligence.

We then analysed the frequency of cooperative choices they made in the prisoner’s dilemma game – so the number of times they chose the less selfish option. From this we calculated what we called the cooperation rate.

Overall, we found that the higher a person’s intelligence, the more cooperative they became as they continued playing the prisoner’s dilemma game. So while intelligent individuals are not inherently more cooperative, they have the ability to process information faster and to learn from it. We didn’t see such stark differences for the other two groups – those that scored highly in agreeableness and conscientiousness.

Cooperation rate across intelligence groups. Author provided
Helping each other
It’s possible that smarter people might use their cognitive advantage to exploit others. So in further analysis, we created combined “cities”, grouping together people who are similar across all characteristics in the personality test and have similar levels of intelligence. We observed something quite different.

Cooperation rate across combined groups. Author provided
As the graph above shows, the smarter individuals – the blue line – within these combined groups helped to teach the less smart ones – the red line – and led them to increase their cooperation rate by the end of the experiment. This was beneficial for all involved: on average, everyone was better off in terms of earnings. Taken together, these results show how even having a few intelligent people present in a group or workplace can benefit others.

Other recent research has looked at how education from early childhood can help develop cognitive ability; our results indicate that such interventions could benefit not only each individual, but society as a whole.

Originally published on THE CONVERSATION

Saturday, June 02, 2018

NEWS POST: Incredible Nuclear Battery That Lasts For 100 YEARS Could Help Power Everything From Pacemakers To A Mission To Mars

A nuclear-powered battery that lasts for a hundred years and packs ten times the punch of a traditional chemical cell has been unveiled by Russian scientists. The prototype (pictured) could lead to battery technology useful for long haul space flights
- Radioactive battery produces ten times the power of a traditional chemical cell
- It consists of a semiconductor made out of diamond and a radioactive chemical
- The technology could lead to permanent pacemakers that never need changing
- NASA is looking to develop compact nuclear-powered batteries to power small devices like sensors on long-haul space flights

A nuclear-powered battery that lasts for 100 years and packs ten times the power of a traditional cell has been unveiled by Russian scientists. The prototype consists of a semiconductor made from diamond, known as a Schottky diode, and a radioactive chemical that fuels it.

The technology could be used to power everything from permanent pacemakers that never need changing, to manned missions to Mars.

Scientists at Russia's Technological Institute for Superhard and Novel Carbon Materials in Moscow insist the technology is safe for everyday use. The battery is powered by beta radiation – electrons and positrons – which is not dangerous to keep inside the body because it is unlikely to be absorbed by our cells.

Professor Vladimir Blank, director of the research, said: 'The results so far are already quite remarkable and can be applied in medicine and space technology.'

Nuclear-powered batteries have been around for a century, but are usually too large to be of any practical use. The Russian cell uses a new structure to make it much more compact, meaning it packs 3,300 milliwatt-hours of energy per gram – ten times more than commercially available chemical cell batteries. The device uses the isotope nickel-63, which decays and fires out high-speed electrons known as beta particles into the diamond semiconductor, generating electricity.

The battery can continue producing power for a century – roughly the half-life of nickel-63, the time it takes for the isotope's activity to fall to half its initial level. In experiments the device achieved a power density of ten microwatts per cubic centimetre (166 microwatts per cubic inch) – enough for a modern artificial pacemaker. Most state-of-the-art cardiac pacemakers are over ten cubic centimetres (0.6 cubic inches) in size and require about ten microwatts of power. That means the new battery could be used to power these devices without any significant changes to their design and size.
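
As a rough worked example – our own back-of-the-envelope sketch using the figures quoted above and a nickel-63 half-life of roughly 100 years – the output of such a cell declines exponentially as the isotope decays:

```python
# Back-of-the-envelope sketch using figures quoted in the article: an initial
# output of about 10 microwatts per cubic centimetre and a nickel-63 half-life
# of roughly 100 years. Real device behaviour will differ from this idealisation.
HALF_LIFE_YEARS = 100.0   # approximate half-life of nickel-63
P0_UW_PER_CM3 = 10.0      # reported initial output, microwatts per cubic centimetre

def power_after(years, p0=P0_UW_PER_CM3, half_life=HALF_LIFE_YEARS):
    """Exponential decay: remaining output = p0 * 0.5 ** (years / half_life)."""
    return p0 * 0.5 ** (years / half_life)

for t in (0, 25, 50, 100):
    print(f"after {t:3d} years: {power_after(t):.1f} microwatts per cubic centimetre")
# prints 10.0, 8.4, 7.1 and 5.0 microwatts per cubic centimetre respectively
```

On this arithmetic the cell would still deliver about half its initial output after a full century, which suggests a pacemaker drawing close to ten microwatts would need a battery sized with some margin for the decline.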

Pacemakers that have batteries which need not be replaced or serviced would improve the quality of life of patients, said the researchers.
The prototype consists of a semiconductor made out of diamond, known as a Schottky diode (dark green), and a radioactive chemical that fuels it. It is powered by the isotope nickel-63 (pink) which fires high-speed electrons into nickel foil, generating electricity 
NASA – which is planning to land manned missions on the red planet within 20 years – would also greatly benefit from the compact nuclear batteries. Space agencies planning long trips will need to develop small power sources that don't need replacing in order to save on cargo space. NASA is already developing a large 'Kilopower' nuclear reactor that could power colonies on Mars for decades.

But there is also a demand for smaller nuclear batteries to power external sensors and memory chips with integrated power supply systems for spacecraft. The prototype battery has a stack of 200 diamond converters interleaved with layers of the radioactive material and nickel foil. The amount of power depends on the thickness of the foil and the converters themselves. Both affect how many beta particles are absorbed.

The researchers believe they could increase the battery's power by up to a factor of three by enriching the nickel-63 or boosting voltage with improvements to the diamond converters.

Professor Blank said: 'We are planning to do more. The higher the power density of the device, the more applications it will have. 'We have decent capabilities for high-quality diamond synthesis, so we are planning to utilize the unique properties of this material for creating new radiation-proof electronic components and designing novel electronic and optical devices.'

The researchers did not specify if they planned to make the device in bulk or market it to space agencies and medical firms. They said there is also an alternative radioisotope for use in nuclear batteries. Diamond converters could be made using radioactive carbon-14, which has an extremely long half-life of 5,700 years.

Batteries based on radioactive energy sources were first suggested in the 1960s during a boom in nuclear power research. The idea was ultimately shelved because of public safety concerns.

Originally published on DAILY MAIL UK