Saturday, November 30, 2013

Babies know when you're faking

Infants can detect unjustified emotional reactions as early as 18 months, Concordia University researchers prove

Montreal, 16 October 2013 — If you're happy and you know it, clap your hands! That's easy enough for children to figure out because the emotion matches the movement. But when feelings and reactions don't align, can kids tell there's something wrong? New research from Concordia University proves that they can — as early as 18 months.

In a study recently published in Infancy: The Official Journal of the International Society on Infant Studies, psychology researchers Sabrina Chiarella and Diane Poulin-Dubois demonstrate that infants can detect whether a person's emotions are justifiable given a particular context. They prove that babies understand how the meaning of an experience is directly linked to the expressions that follow.

The implications are significant, especially for caregivers. "Our research shows that babies cannot be fooled into believing something that causes pain results in pleasure. Adults often try to shield infants from distress by putting on a happy face following a negative experience. But babies know the truth: as early as 18 months, they can implicitly understand which emotions go with which events," says psychology professor Poulin-Dubois.

To perform the research, Poulin-Dubois and PhD candidate Chiarella recruited 92 infants at 15 and 18 months of age. In a lab setting, the babies watched as an actor went through several scenarios in which emotional reactions either matched or contradicted pantomimed experiences (for more, see the related video). In one scenario, the researcher showed a mismatched emotion by looking sad when presented with a desired toy. In another, she expressed a matching emotion by reacting in pain when pretending to hurt her finger.

At 15 months, the infants showed no significant difference in their reactions to these events, responding with empathetic facial expressions to all sad faces alike. This indicates that the ability to link a facial expression to the emotional experience that precedes it has yet to develop at that stage.

At 18 months, however, the infants clearly detected when facial expressions did not match the experience. They spent more time looking at the researcher's face and checked back more frequently with the caregiver in the room with them so that they could gauge the reaction of a trusted source. They also showed empathy toward the person only when her sad face was justified; that is, only when the researcher was sad or in pain when she was supposed to be.

Chiarella explains that the indiscriminate show of concern to sad faces in the younger infants is an adaptive behaviour. "The ability to detect sadness and then react immediately has an evolutionary implication. However, to function effectively in the social world, children need to develop the ability to understand others' behaviours by inferring what is going on internally for those around them."

The researchers are currently examining whether exposure to an emotionally unreliable individual affects infants' willingness to help or learn from that person.

EurekAlert. 2013. “Babies know when you're faking”. EurekAlert. Posted: October 16, 2013. Available online:

Friday, November 29, 2013

Mysterious ancient human crossed Wallace's Line

Scientists have proposed that the most recently discovered ancient human relatives -- the Denisovans -- somehow managed to cross one of the world's most prominent marine barriers in Indonesia, and later interbred with modern humans moving through the area on the way to Australia and New Guinea.

Three years ago the genetic analysis of a little finger bone from Denisova cave in the Altai Mountains in northern Asia led to a complete genome sequence of a new line of the human family tree -- the Denisovans. Since then, genetic evidence pointing to their hybridisation with modern human populations has been detected, but only in Indigenous populations in Australia, New Guinea and surrounding areas. In contrast, Denisovan DNA appears to be absent or at very low levels in current populations on mainland Asia, even though this is where the fossil was found.

Published today in a Science opinion article, scientists Professor Alan Cooper of the University of Adelaide in Australia and Professor Chris Stringer of the Natural History Museum in the UK say that this pattern can be explained if the Denisovans had succeeded in crossing the famous Wallace's Line, one of the world's biggest biogeographic barriers which is formed by a powerful marine current along the east coast of Borneo. Wallace's Line marks the division between European and Asian mammals to the west from marsupial-dominated Australasia to the east.

"In mainland Asia, neither ancient human specimens, nor geographically isolated modern Indigenous populations have Denisovan DNA of any note, indicating that there has never been a genetic signal of Denisovan interbreeding in the area," says Professor Cooper, Director of the University of Adelaide's Australian Centre for Ancient DNA. "The only place where such a genetic signal exists appears to be in areas east of Wallace's Line and that is where we think interbreeding took place -- even though it means that the Denisovans must have somehow made that marine crossing."

"The recent discovery of another enigmatic ancient human species Homo floresiensis, the so-called Hobbits, in Flores, Indonesia, confirms that the diversity of archaic human relatives in this area was much higher than we'd thought," says Professor Stringer, Research Leader in Human Origins, Natural History Museum, in London. "The morphology of the Hobbits shows they are different from the Denisovans, meaning we now have at least two, and potentially more, unexpected groups in the area.

"The conclusions we've drawn are very important for our knowledge of early human evolution and culture. Knowing that the Denisovans spread beyond this significant sea barrier opens up all sorts of questions about the behaviours and capabilities of this group, and how far they could have spread."

"The key questions now are where and when the ancestors of current humans, who were on their way to colonise New Guinea and Australia around 50,000 years ago, met and interacted with the Denisovans," says Professor Cooper.

"Intriguingly, the genetic data suggest that male Denisovans interbred with modern human females, indicating the potential nature of the interactions as small numbers of modern humans first crossed Wallace's Line and entered Denisovan territory."

EurekAlert. 2013. “Mysterious ancient human crossed Wallace's Line”. EurekAlert. Posted: October 17, 2013. Available online:

Thursday, November 28, 2013

Wari, Predecessors of the Inca, Used Restraint to Reshape Human Landscape

The Wari, a complex civilization that preceded the Inca empire in pre-Columbian America, didn't rule solely by pillage, plunder and iron-fisted bureaucracy, a Dartmouth study finds. Instead, they started out by creating loosely administered colonies to expand trade, provide land for settlers and tap natural resources across much of the central Andes.

The results, which appear in the Journal of Anthropological Archaeology, shed new light on how early states evolved into empires in the region that became the Inca imperial heartland.

The study is the first large-scale look at the settlement patterns and power of the Wari civilization, which flourished from about AD 600-1000 in the Andean highlands, well before the Inca empire's 15th century rise. Relatively little is known about the Wari -- there are no historical documents and archaeologists are still debating their power and statecraft. Many scholars think the Wari established strong centralized control -- economic, political, cultural and military -- like their Inca successors, to govern the majority of the far-flung populations living across the central Andes. But the Dartmouth study suggests that while the Wari had significant administrative power, they did not successfully transition most colonies into directly ruled provinces.

"The identification of limited Wari state power encourages a focus on colonization practices rather than an interpretation of strong provincial rule," says Professor Alan Covey, the study's lead author. "A 'colonization first' interpretation of early Wari expansion encourages the reconsideration of motivations for expansion, shifting from military conquest and economic exploitation of subject populations to issues such as demographic relief and strategic expansion of trade routes or natural resource access."

The results are based on a systematic inventory of archaeological surveys covering nearly 1,000 square miles and GIS analysis of more than 3,000 archaeological sites in and around Peru's Cusco Valley. The data indicate Wari power did not emanate continuously outward from Pikillacta, a key administrative center whose construction required a huge investment. Instead, the locations of Wari ceramics indicate an influence that was more uneven, indirect and limited, even at the height of Wari power, than traditional interpretations based on excavations at Wari sites suggest.

Science Daily. 2013. “Wari, Predecessors of the Inca, Used Restraint to Reshape Human Landscape”. Science Daily. Posted: October 16, 2013. Available online:

Wednesday, November 27, 2013

Columbus May Not Have Been First to America

An investigation worthy of a Dan Brown novel has shed new light on the voyages of John Cabot, the Italian navigator and explorer, revealing that he may have had knowledge of European expeditions to the "New World" that predated Christopher Columbus's 1492 voyage.

Although commonly credited with "discovering" America, Christopher Columbus would not reach the mainland of the New World until 1498, when he sailed to South America.

Farther north, Cabot became the first European since Leif Ericson and the Vikings to land on North American soil when he made three voyages for England's Henry VII between the summers of 1496 and 1498. The second of these expeditions, carried out in 1497, resulted in the European discovery of North America -- at Newfoundland.

Now a brief entry in a yellowed accounting ledger has revealed an unexpected European dimension to Cabot's discovery: In April 1496, the Italian-born explorer received financial backing from an Italian bank -- the Bardi banking house in London.

The notation -- found through some serious sleuthing of the works of Alwyn Ruddock, a deceased, secretive historian -- would also suggest that Europeans may have discovered the New World decades before both Cabot and Columbus set sail.

Found in a private Florentine archive, the document records that a payment of 50 nobles sterling was made to "Giovanni Chabotte" (John Cabot) of Venice so that he could undertake expeditions "to go and find the new land." "This brief entry opens a whole new chapter in Cabot scholarship. It shows that the Bristol voyages were part of a wider network of Italian-supported exploratory enterprises," historian Francesco Guidi-Bruscoli, of the University of Florence, told Discovery News.

Guidi-Bruscoli, who detailed his finding in the scholarly journal Historical Research, noted that the short entry referred to "the new land" ("il nuovo paese" in the original Italian) and not to "a new land" ("un nuovo paese").

"The use of the definite article ('il', 'the') rather than the indefinite ('un', 'a') is indeed puzzling," Guidi-Bruscoli said.

The phrasing might imply that the money was given to Cabot so that he could find a land whose existence was already known. The Bardi, far from being disinterested patrons, would have had a sound economic reason to finance what would have been an almost certain discovery.

Since Cabot's royal patent only applied to lands "unknown to Christians," it seems unlikely that "the new land" referred to here was that which Columbus had found four years earlier.

As such, the note may revive claims that Bristol merchants had discovered North America at an earlier time.

"Unfortunately, we only have clues. While the entry implies that the Bardi believed in a prior discovery, we can't assume this had occurred," Guidi-Bruscoli said.

The speculation receives some support, however, from a letter written in the winter of 1497/8 by an English merchant named John Day to the "Lord Grand Admiral", almost certainly Christopher Columbus.

Discovered in the 1950s, the letter discussed Cabot's recently completed 1497 voyage to Newfoundland, adding it was "considered certain" that men from Bristol had already "found and discovered in the past" the said land, "as your lordship well knows."

Even more compelling evidence appeared to have existed in the archives investigated by the late historian Alwyn Ruddock, a leading expert on the Bristol discovery voyages.

According to University of Bristol historian Evan Jones, Ruddock made finds that "promised to revolutionize our understanding of Europe's engagement with North America in the three decades after 1492."

She claimed, for instance, to have found proof in Italian and Spanish sources that Bristol merchants reached the New World sometime before 1470, and that Cabot didn't die on the 1498 expedition as widely believed, but returned to England in 1500.

"She had made some extraordinary finds, but she ordered in her will the destruction of all her research following her death," said Jones, who founded the Cabot Project research initiative.

That was done in 2005, when the fiercely secretive Ruddock died aged 89. Her unpublished work -- 78 bags of notes, letters, photographs, and microfilms -- ended up in a shredder.

Another of Ruddock's claims was that Cabot was financed by an Italian bank. Following an invitation to visit the deceased historian's house in 2010, Jones and his co-researcher, Margaret Condon, discovered the source of her information -- in the form of a sticky label on an old shoe cupboard: "The Bardi firm of London" (an Italian bank).

"The Bardi firm of London -- that was all we needed to work out the identity of the Italian banking house that Ruddock kept secret for almost half a century," Jones said.

Jones and Condon contacted Guidi-Bruscoli in Florence, who was then able to locate a brief entry in the private archive of the Guicciardini family.

"Without Ruddock's sticky label, finding that small entry would have been a rather difficult task," Guidi-Bruscoli admitted.

Meanwhile, Jones and his associates continue their investigation into Ruddock's secret findings.

"I have an enormous respect for Alwyn Ruddock as a scholar. But I can't respect her decision to destroy all her work. She did what is the antithesis of everything that historical research is about -- she sought to destroy all her findings. I can't and don't accept that," Jones said.

Lorenzi, Rossella. 2013. “Columbus May Not Have Been First to America”. Discovery News. Posted: October 14, 2013. Available online:

Tuesday, November 26, 2013

Why Turkey Lifted Its Ban on the Islamic Headscarf

Turkish women who want to wear the hijab – the traditional Islamic headscarf covering the head and hair, but not the face – to civil service jobs and government offices will be able to do so now that the Turkish government has relaxed its decades-long restriction on wearing the headscarf in state institutions.

The new rules, which don't apply to workers in the military or judiciary, come into effect immediately and were put into place to address concerns that the restrictions on hijab were discouraging women from conservative backgrounds from seeking government jobs or higher education.

"A dark time eventually comes to an end," Turkish Prime Minister Recep Tayyip Erdogan said in a speech to the parliament. "Headscarf-wearing women are full members of the republic, as well as those who do not wear it."

Ataturk's Fashion Police

Turkey's restrictions on wearing overtly religious-oriented attire are rooted in the founding of the modern, secular Turkish state, when the republic's founding father, Mustafa Kemal Ataturk, introduced a series of clothing regulations designed to keep religious symbolism out of the civil service. The regulations were part of a sweeping series of reforms that altered virtually every aspect of Turkish life—from the civil code to the alphabet to education to social integration of the sexes.

The Western dress code at that time, though, was aimed at men. The fez—the short, conical, red-felt cap that had been in vogue in Turkey since the Ottoman Sultan Mahmud II made it part of the official national attire in 1826—was banished. Ataturk himself famously adopted a Panama hat to accent his Western-style gray linen suit, shirt, and tie when he toured the country in the summer of 1925 to sell his new ideas to a deeply conservative population. That autumn, the Hat Law of 1925 was passed, making European-style men's headwear de rigueur and punishing fez-wearers with lengthy sentences of imprisonment at hard labor, and even a few hangings.

Curiously enough, Ataturk left women's attire alone. In granting women the freedom to decide for themselves whether to cover their heads, he more or less assumed they would eventually give up the headscarf as the new, secular Turkish identity took hold. Many did.

Fall From Favor

By the 1970s, though, and particularly after Turkey's military coup in 1980, discouraging headscarves had taken on the force of law. The headscarf was banned in government offices, hospitals, universities, and schools. By the 1980s, these lengths of cloth had taken on hot political connotations.

Critics worry that Turkey's relaxation of the headscarf ban will blur the line between religion and the state and could herald a stealthy march toward an Islamist state. When the repeal was announced this week, Turkey's opposition party declared it "a serious blow to the secular republic."

Others see it as a long-overdue reform. "The lifting of the ban on headscarves ends a disgraceful human rights abuse that took away futures of generations of women in Turkey," says U.K.-based Turkish academic and commentator Ziya Meral. "Yet this is likely to create tensions, particularly in western Turkey, once women wearing headscarves start appearing in workplaces and becoming more visible in certain sectors.

"The challenge that lies before Turkey is not whether or not Turkey is becoming more religious," he emphasizes, "but whether or not Turkey will finally move on from a rigid, state-controlled public space into a pluralistic society that can accommodate different ethnicities and beliefs."

Europe's Hijab Restrictions

Turkey's lifting of its ban on the hijab comes at a time when a number of countries are debating or imposing restrictions on traditional Muslim head coverings – particularly full-face veils such as the burqa and niqab, which are already banned in France and Belgium. Italy has banned full-face coverings under counterterrorism laws since the 1970s. The Dutch government has also drafted legislation banning the burqa. Some German states forbid it, as did many cities in Spain until the Spanish high court declared the bans unconstitutional earlier this year. Canada prohibits the wearing of veils during citizenship ceremonies, while British politicians are discussing restrictions on headscarves and veils in schools and in courts.

In a celebrated case in London last month, a burqa-wearing woman was ordered to raise her veil while giving evidence on the grounds that having a witness conceal her face while testifying was inconsistent with the principles of British justice. She was permitted to keep her veil lowered during the rest of the proceedings.

Europe and the West aren't the only regions grappling with these questions. In Morocco, veils and headscarves are discouraged, and Tunisia only recently relaxed its ban on wearing them. Syria banned the full-face veil for university students in 2010 – but President Bashar al-Assad rescinded the ban the following year when he sought to appease religious conservatives as the country slid into civil war.

Arguments for banning or restricting the traditional headwear range from security at airports to concerns about divisiveness and perceived polarization of society to preserving the religious neutrality of the state.

A Woman's Perspective

Much of the negativity about headscarves and veils comes from a lack of understanding about what they mean and why women choose to wear them, says Shalina Litt, a popular Muslim radio presenter in Birmingham, England, who lectures and blogs about women's rights and Islamic issues and wears the niqab herself. "For me," she says, "it is an expression of faith, and modesty is a part of that. At the same time, I live in the real world. When I go to an airport and it is time to show my ID, I lift my veil—whether it is to a man or a woman—and just get on with it. That's life. Those security rules are in place to protect us all, and there is nothing in the teaching of Islam that says we shouldn't go along with those rules."

Wearing the veil can be surprisingly empowering, says Litt. In recalling how she adopted the niqab gradually over time, moving from loose-fitting clothing to a headscarf to occasionally wearing the niqab to becoming a full-time wearer as her relationship with her faith evolved, she spoke of the first time she sat down to talk with a man while wearing the veil: "I thought: Wow! This is liberating. He is having to listen to my words, not judge me by my clothes or my face, but paying attention purely to what I have to say."

Smith, Roff. 2013. “Why Turkey Lifted Its Ban on the Islamic Headscarf”. National Geographic News. Posted: October 11, 2013. Available online:

Monday, November 25, 2013

A Bridge Between Western Science and Eastern Faith

Quantum theory tells us that the world is a product of an infinite number of random events. Buddhism teaches us that nothing happens without a cause, trapping the universe in an unending karmic cycle.

Reconciling the two might seem as challenging as trying to explain the Higgs boson to a kindergarten class. But if someone has to do it, it might as well be the team of scholars, translators and six Tibetan monks clad in maroon robes who can be spied wandering among the magnolias at Emory University here.

They were joined this week by the Dalai Lama, the spiritual leader of the Tibetan people, who decided seven years ago that it was time to merge the hard science of the laboratory with the soft science of the meditative mind.

The leaders at Emory, who already had created formal relationships with Tibetan students there, agreed, and a unique partnership was formed.

For the monks, some of the challenges have been mundane, like learning to like pizza and trying to understand Lord Dooley, the university’s skeleton mascot.

For the team of professors involved in the project, the Emory-Tibet Science Initiative, there are the larger issues, like how to develop methods to quantify the power of meditation in a way the scientific world might actually accept.

But for the Dalai Lama, an energetic 78-year-old who rises at 3:30 every morning for four hours of meditation, his pet project is kind of a no-brainer.

Buddhist teaching offers education about the mind, he said in an interview after lunch Thursday at the home of James W. Wagner, the university president.

“It is quite rich material about what I call the inner world,” he said. “Modern science is very highly developed in matters concerning the material world. These two things separately are not complete. Together, the external and the internal worlds are complete.”

The first batch of six monks, who arrived on campus in 2010, have gone back to India, where much of the Tibetan exile community lives, and started teaching. Dozens of monks and nuns have taken lectures from Emory professors who traveled to Dharamsala, India, to instruct them, and 15 English-Tibetan science textbooks have been developed for monastic students.

The university pays about $700,000 a year for the program, which includes tuition for the monks, who then go back and teach science in the monasteries.

It has not been a smooth road. It took until last year for Buddhist leaders to accept science education as a mandatory part of monastic education. It was the first major change in 600 years.

But as anyone who has tried to carry out an idea from the boss knows, the real work is in the details.

Many of the toughest battles have come down to seemingly simple but vexing issues of lexicon. How does one create new words for concepts like photosynthesis and clones, which have no equivalent in the Tibetan language or culture? How does one begin to name thousands of molecules and chemical compounds? And what of words like process, which have several levels of meaning for Tibetans?

So far, 2,500 new scientific terms have been added to the Tibetan language.

“Much of our work is to make new phrases novel enough so students won’t take them with literal meaning,” said Tsondue Samphel, who leads the team of translators.

Still, some concepts are quite easy to translate.

“We understand impermanence of things as simply existing through our traditions,” said Jampa Khechok, 34, one of the new monks on campus. “We are now challenged to understand the nature of impermanence through the study of how fast particles decay.”

Learning has gone both ways. Professors here find themselves contemplating the science of the heart and mind in new ways. A student presenting a report on the cardiovascular system described the physiological reaction his own cardiovascular system might have if he were told the Tibetan people were free.

Debate is a constant, said Alexander Escobar of Emory, who has gone to India to teach biology. Monks have wanted to know, for example, how he could be so sure that seawater once covered the Himalayas. (The answer? Fossils.)

Western scholars have had to look at their work with a new lens, too, contemplating matters like the nature and origins of consciousness.

One result has been the development of something called cognitively based compassion training, a secular meditation program shown to improve empathy.

The partnership has had other, more practical applications.

Linda Hutton, a social worker, has a longstanding clinical practice treating sexually abused children and families in Greenville, S.C. She drove to Atlanta this week to attend a private luncheon with the Dalai Lama, who was making his sixth visit to Emory.

She teaches her young victims and their families to practice mindfulness and how to use meditation and breathing to cope with trauma.

“I draw from a lot of medical research,” she said, “but what I have found here transcends that.”

Severson, Kim. 2013. “A Bridge Between Western Science and Eastern Faith”. The New York Times. Posted: October 11, 2013. Available online:

Sunday, November 24, 2013

Vampire Cannibals: Real Ghouls Haunt Papua New Guinea

Papua New Guinea, a large island nation north of Australia, boasts a fast-growing economy and a rich natural resource base of gold, copper, oil and agricultural products. But deep within the British Commonwealth country's rugged mountains and tropical rain forests, some dark practices still occur.

On Wednesday (Oct. 9), the father of a 3-year-old girl allegedly took his daughter into a wooded area and bit into her neck, eating the flesh and sucking her blood, the Papua New Guinea Post-Courier reports. Two boys reportedly witnessed the event and reported it to local officials, who quickly arrested the man.

"He was just laughing at the boys and continued eating the flesh and sucking the blood," local councilor John Kenny told the Post-Courier. As gruesome as the incident was, it's not an isolated event, according to numerous reports from Papua New Guinea (PNG). The relatively unexplored country is home to millions of people who live in isolated rural villages and maintain traditional practices that, by many reports, sometimes include cannibalism.

Last year, PNG officials arrested 29 people for killing seven people accused of sorcery and cannibalizing their brains and genitals. In February, the family of a 6-year-old boy who had recently died accused a 20-year-old mother of witchcraft.

The woman was stripped, bound, tortured with a hot iron, doused with gasoline and burned to death on a pile of trash in broad daylight in front of hundreds of onlookers, The Associated Press reported. Officials condemned the brutal killing, but made no arrests.

Cult leader slaughtered

In March, Steven "Black Jesus" Tari — a convicted rapist and leader of a cult group dedicated to rape, sacrificial killings and cannibalism — escaped from prison and returned to his cult, which has an estimated 6,000 members.

But last month, Tari met his end after reportedly killing a teenage girl: He was castrated, butchered and thrown into a shallow pit by a group of vigilantes, the Independent reported. "Tari is dead, and this cult worship dies with him," said police investigator Ray Ban, as quoted by the Independent. "If I hear of any more cult worship here, I will return with my men."

Other PNG officials expressed similar disdain. "It is reprehensible that women, the old and the weak in our society should be targeted for alleged sorcery or wrongs that they actually have nothing to do with," PNG Prime Minister Peter O'Neill told the Associated Press.

Government cracks down

In response to vigilante violence and other sorcery-related crimes, PNG has repealed its 1971 Sorcery Act, which criminalized "evil sorcery," known locally as sanguma. The country has also reinstituted the death penalty for anyone found guilty of murdering a suspected witch; the United Nations, Amnesty International and other groups condemned this reinstatement.

"These are very tough penalties, but they reflect the seriousness of the nature of the crimes and the demand by the community for Parliament to act," Daniel Korimbao, a spokesman for O'Neill, said in a statement, as reported by The New York Times.

Lallanilla, Marc. 2013. “Vampire Cannibals: Real Ghouls Haunt Papua New Guinea”. Live Science. Posted: October 11, 2013. Available online:

Saturday, November 23, 2013

Genes Predispose Some People to Focus On the Negative

A new study by a University of British Columbia researcher finds that some people are genetically predisposed to see the world darkly.

The study, published in Psychological Science, finds that a previously known gene variant can cause individuals to perceive emotional events -- especially negative ones -- more vividly than others.

"This is the first study to find that this genetic variation can significantly affect how people see and experience the world," says Prof. Rebecca Todd of UBC's Dept. of Psychology. "The findings suggest people experience emotional aspects of the world partly through gene-coloured glasses -- and that biological variations at the genetic level can play a significant role in individual differences in perception."

The gene in question is the ADRA2b deletion variant, which influences the hormone and neurotransmitter norepinephrine. The variant was previously found to play a role in the formation of emotional memories; the new study shows that it also plays a role in real-time perception.

The study's 200 participants were shown positive, negative and neutral words in rapid succession. Participants with the ADRA2b gene variant were more likely to perceive negative words than others, while both groups perceived positive words better than neutral words to an equal degree.

"These individuals may be more likely to pick out angry faces in a crowd of people," says Todd. "Outdoors, they might notice potential hazards -- places you could slip, loose rocks that might fall -- instead of seeing the natural beauty."

The findings shed new light on ways in which genetics -- combined with other factors such as education, culture, and moods -- can affect individual differences in emotional perception and human subjectivity, the researchers say.


Further research is planned to explore this phenomenon across ethnic groups. While more than half of Caucasians are believed to have the ADRA2b gene variant, statistics suggest it is significantly less prevalent in other ethnicities. For example, a recent study found that only 10 per cent of Rwandans had the ADRA2b gene variant.

The study was co-led by UBC Prof. Rebecca Todd (as a PhD student at the University of Toronto) and Adam Anderson (Cornell University). DNA samples and genotyping were provided by Daniel Mueller (Toronto's Centre for Addiction and Mental Health).

Science Daily. 2013. “Genes Predispose Some People to Focus On the Negative”. Science Daily. Posted: October 10, 2013. Available online:

Friday, November 22, 2013

Bones to shape stones

One of the great mysteries of prehistory is to understand the process of behavioural evolution in hominid groups. The main area of research has understandably concentrated on lithic technologies, but other types of artefacts can help illuminate this process.

Recent research led by Ruth Blasco of the Gibraltar Museum examined two bone retouchers (for retouching stone tools) dated to between the second half of the Middle Pleistocene and the beginning of the Upper Pleistocene – between 100,000 and 350,000 years BP. One originates from Bolomor Cave (Spain) and the other from Qesem Cave (Israel).

The emergence of these tools is found in the latest phases of the Acheulean and their use seems to coincide with widespread and emerging cultural complexes at both ends of the Mediterranean Sea: the pre-Mousterian of Western Europe and the Acheulo Yabrudian of the Levant. Collectively viewed, these retouchers represent significant behavioural changes seemingly taking place between 400 and 300 thousand years ago in two entirely separate parts of the world.

A fresh bone

Previous analyses of retouchers had provided only contradictory results regarding potential selection criteria, but experimental work indicated that an important factor was the relative freshness of the bone. It was also noted that the tool maker would have to remove the periosteum (a membrane that covers the outer surface of all bones), as it would otherwise interfere with the flaking process. These attributes are evident on the two bones examined. Although it is difficult to evaluate their precise state at the time of use, the retoucher from Bolomor seems to correspond to a fresh, defatted bone whose elasticity was still intact, while the retoucher from Qesem Cave displays a slight increase in pits and scores, consistent with a semi-fresh bone.


The bone retouchers from the two caves represent some of the earliest examples known to date and both possess typical morphological and functional characteristics of such tools, despite the distance between sites. The researchers can only surmise that the similar technological advancement indicates convergent developments.

Both of these cultural groups were innovators, creating a new series of behaviours, including the use of fire on a regular basis, the roasting of meat (suggested by the presence of large amounts of burnt bones), and lithic recycling.

Using bone tools to alter other raw materials, and thus connecting different materials in new ways, is not merely a technological innovation but an innovative human behaviour. Incorporating bones from hunted, defleshed and consumed animals into the lithic production process brings together two basic elements of prehistoric life – stone tool making and animal hunting and consumption.

Past Horizons. 2013. “Bones to shape stones”. Past Horizons. Posted: October 16, 2013. Available online:

Thursday, November 21, 2013

Pre-Incan Culture Expanded Through Trade, Not Conquest

Although Christopher Columbus is associated with discovering America, the 15th-century explorer actually first set foot upon modern-day Haiti and the Dominican Republic. But people had been inhabiting both North and South America for thousands of years before Columbus sailed the ocean blue.

Historians commonly believe that humans first crossed to the Americas from Asia 12,000 years ago. But a new exhibit in Brazil features artifacts dating back as far as 30,000 years ago, 18,000 years earlier than previously believed.

One hundred items, including cave paintings and ceramic art depicting animals, hunting expeditions and even sex scenes of the early Americans, are on display in Brasilia, Brazil's capital.

The artifacts were found at the Serra da Capivara national park in Brazil’s northeastern Piaui state, which used to be a popular site for the hunter-gatherer civilization that created the artwork.

"To date, these are the oldest traces of human existence in the Americas," Franco-Brazilian archaeologist Niede Guidon, who has headed a mission to carry out large-scale excavation of Piaui's interior since the 1970s, told the AFP. "It's difficult to think there exists a site anywhere with a higher concentration of cave art."

In addition to the artwork, Guidon said charcoal remains of structured fires found at the site are among other traces of the Serra dwellers.

Some archaeologists disagree with Guidon, arguing that a few burnt flakes are not evidence of man-made fire hearths but rather the remains of a natural stone formation.

However, Guidon contends the primitive civilization’s cave art provides enough evidence of early human activity.

"When it [cave art] began in Europe and Africa, it did here too," she said.

The paintings date back an estimated 29,000 years.

Fox News. 2013. “Cave art depicting early Americans’ sex lives suggests people inhabited Americas 18,000 years earlier than believed”. Fox News. Posted: October 14, 2013. Available online:

Wednesday, November 20, 2013

Hospital Sells Body Parts to Witch Doctors, Accusers Say

The second-largest hospital in the Southern African country of Swaziland may be operating a black market in human body parts used in magic spells, according to claims made by a reverend and others.

The organ trade at Raleigh Fitkin Memorial Hospital in the city of Manzini has been described as "an open secret" by critics such as Rev. Grace Masilela. Accusers say people come to the hospital from neighboring South Africa to buy bones, hearts, brains and other organs.

Whether or not this particular claim is true, such a practice is not uncommon in the area. There, traditional healers or witch doctors often grind up body parts and combine them with roots, herbs, seawater, animal parts and other ingredients to prepare potions and spells for their clients. Sometimes clients eat the mixture or rub it on the skin or into open wounds. In the region, the practice of using body parts for magical ritual or benefit is called "muti," a Zulu word meaning "medicine."

Magical beliefs

Belief in magic is widespread throughout Sub-Saharan Africa, according to a 2010 Gallup poll in which over half of the respondents reported personally believing in witchcraft and magic.

Gérard Labuschagne, of the South African Police Service's Investigative Psychology Unit, has investigated dozens of muti murders. Writing in the Journal of Investigative Psychology and Offender Profiling in January 2004, Labuschagne explains the underlying belief system of muti: "In traditional African beliefs, it is assumed that there is only a certain amount of luck in society. Each individual receives a portion of that luck. It is therefore believed that if another person is successful, then they have obtained an extra portion of luck via devious means, usually with the intervention of the supernatural.

"Setbacks or calamities, such as drought or illness, are signs that the natural and social order have been disturbed. One means of obtaining this extra portion of luck or restoring the natural order is through the use of strong muti. It is with this strong muti that muti murders are often associated. Muti made from human body parts is considered to be exceptionally powerful."

Muti murders differ from ritual or sacrificial killings in that the goal is not necessarily to kill the victim (though that often happens due to shock and blood loss), but instead to obtain body parts.

Just as different ingredients in a recipe are used for different purposes, certain body parts are used for particular goals. For example, eyes may be stolen and used in a magic ritual to help restore a client's failing eyesight, whereas severed hands are used to assure business success, and genitals are believed to attract luck.

In some cases, criminals have been arrested during robberies with muti in their pockets, having been told by a healer that the medicine from such body parts would make the thieves invisible to police — or even bulletproof.

Fresh body parts needed

Body parts from live victims are said to be the most powerful, though organs taken from the dead are sometimes used, as is claimed to have happened in Swaziland. Labuschagne notes, "There seems to be an increase in grave robbing, where the body parts removed are similar to those used in muti. Also, theft or sale of body parts from hospitals and mortuaries has occurred. It is uncertain whether or not the traditional healer would be able to tell if a body part is removed pre- or post-mortem."

Stealing organs from the dead for use in magic spells is a ghastly crime, but at least the unwilling donors are deceased. Just as often, murderers working on behalf of witch doctors attack and kill innocent people for their body parts. Muti murders are particularly brutal, with knives, machetes or even glass shards used to cut and hack off limbs, breasts and other body parts from their victims, including children.

In East Africa, at least 50 albinos were murdered for their body parts in 2009, according to the Red Cross. An albino's arms, fingers, genitals, ears and blood are highly prized for their especially powerful magic, according to believers.

Muti murders have occurred throughout South Africa, and especially in rural areas. Reliable figures on the number of muti murders in the country are elusive, because police do not track those murders separately from other homicides. Even so, estimates range from a few dozen to a few hundred murders per year.

Science fiction fans may recall that muti was featured in the hit South African film "District 9," in which a local warlord tries to steal the hero's body parts, believing the stolen limbs would give him magical powers.

Most Africans, and most traditional healers there, reject muti murder and don't engage in the practice. Still, the belief that body parts can aid in magic rituals has been a part of African culture for centuries, and it will likely remain so.

Radford, Benjamin. 2013. “Hospital Sells Body Parts to Witch Doctors, Accusers Say”. Live Science. Posted: October 11, 2013. Available online:

Tuesday, November 19, 2013

New archaeoastronomical alignments found at Machu Picchu

A joint Peruvian-Polish team have examined a previously unexcavated building in the well-preserved Inca retreat of Machu Picchu and found that the structure is astronomically aligned, according to Prof. Mariusz Ziółkowski, Head of the Pre-Columbian Research Centre at the University of Warsaw.

The team used 3D laser scanners to fully model and survey the building, named “El Mirador” (the vantage point), so as to get precise locations and alignments.

“Despite the difficult terrain we managed to perform 3D laser scans, which we then used to prepare a precise model of this amazing complex,” said Prof. Ziółkowski. Results of preliminary analysis indicate that it was a device probably used by a small group of Inca priest-astronomers for precise observations of the position of celestial bodies on the horizon, against the distinctive Yanantin mountain peaks.

The Inca were well-known as astronomers who took careful note of the movements of the heavens in order to plan their agricultural and religious calendars.

Archaeoastronomical significance

The Polish researchers, who have been working at Machu Picchu since 2008, have been focusing on the site’s archaeoastronomical significance. They presented their findings at the International Conference of the Societe Europeenne pour l’Astronomie dans la Culture in Athens in September 2013.

El Mirador was constructed of well-made blocks of stone and was identified in an inaccessible part of the National Park of Machu Picchu by the park director, anthropologist Fernando Astete Victoria, during prospection and inventory work conducted on the slopes of Mount Huayna Picchu. He then invited the Polish team to work with the Peruvian team to investigate the site further with the latest technology, revealing a new alignment pattern unlike the south- or west-oriented solstice alignments of other Inca ceremonial complexes.

Previous research by the Polish team had demonstrated that Intimachay at Machu Picchu was an astronomical observatory far more complex and precise than had previously been realised.

Past Horizons. 2013. “New archaeoastronomical alignments found at Machu Picchu”. Past Horizons. Posted: October 8, 2013. Available online:

Monday, November 18, 2013

New information is discovered about the ancestry of Ashkenazi Jews

Professor Martin Richards, of the Archaeogenetics Research Group at the University of Huddersfield, has published a paper uncovering new information about how Ashkenazi Jewish men moved into Europe from the Middle East, and their marriage practices with European women.

The origins of Ashkenazi Jews – that is, Jews with recent ancestry in central and Eastern Europe – are a long-standing controversy. It is usually assumed that their ancestors migrated into Europe from Palestine in the first century AD, after the destruction of the Second Temple by the Romans, with some intermarriage with Europeans later on. But some have argued that they have a mainly European ancestry, and arose by conversion to Judaism of indigenous Europeans, especially in Italy. Others have even argued that they were largely assimilated in the North Caucasus during the time of the Khazar Empire, whose rulers turned to Judaism around the tenth century AD.

Archaeogenetics can help to resolve this dispute. Y-chromosome studies have shown that the male line of descent does indeed seem to trace back to the Middle East. But the female line, which can be illuminated by studies of mitochondrial DNA, has until now proved more difficult to interpret. This is especially intriguing because Judaism has been inherited maternally for about 2,000 years.

We have settled this issue by looking at large numbers of whole mitochondrial genomes – sequencing the full 16,568 bases of the molecule – in many people from across Europe, the Caucasus and the Middle East. We have found that, in the vast majority of cases, Ashkenazi lineages are most closely related to southern and western European lineages – and that these lineages have been present in Europe for many thousands of years.

This means that, even though Jewish men may indeed have migrated into Europe from Palestine around 2000 years ago, they brought few or no wives with them. They seem to have married with European women, firstly along the Mediterranean, especially in Italy, and later (but probably to a lesser extent) in western and central Europe. This suggests that, in the early years of the Diaspora, Judaism took in many converts from amongst the European population, but they were mainly recruited from amongst women. Thus, on the female line of descent, the Ashkenazim primarily trace their ancestry neither to Palestine nor to Khazaria, but to southern and western Europe.

EurekAlert. 2013. “New information is discovered about the ancestry of Ashkenazi Jews”. EurekAlert. Posted: October 8, 2013. Available online:

Sunday, November 17, 2013

Reading the runestones of Denmark

The Danish National Museum Runestone Project sheds light on the numerous rune stones found across the whole country through a series of good-quality images and text on Wikipedia. Now the project is also available in a new smartphone app on ancient monuments from the Cultural Agency of Denmark, which can be downloaded free.

Fascinating stories

There are approximately 260 known rune stones in Denmark, which provide vital clues to kings, kinship and the role and power of women during the period in which they were erected. The earliest stones date from the 700s CE and the latest from the 1000s CE.

The project database (which is currently only in Danish) contains fascinating stories, such as that of Ragnhild, who set up the Glavendrup-stone, and the beautiful Fjenneslev-stone near Ringsted. You can also read about the Skjern-stone’s possible connection to Harald Bluetooth.

Encyclopaedia Wikipedia

The National Museum Runestone Project, with photographer Roberto Fortuna and runologist Lisbeth Imer, went around Denmark in 2011 and 2012 to photograph and describe nearly 100 of the country’s stones. The texts, along with detailed and stunning images, can be seen on Encyclopaedia Wikipedia.

Smartphone app.

The smartphone app gives users the possibility to see Denmark’s various monuments on a map and hear a series of animated stories about selected monuments – including three rune stones: Brooks Stone 2 north of the highway, the Glavendrup-stone on Funen and the Helnaes-stone (currently in storage at the National Museum).

Additionally, you can also find both new and old pictures of and information about the different rune stones and runic alphabet to ensure you are well prepared for your runestone travels.

Past Horizons. 2013. “Reading the runestones of Denmark”. Past Horizons. Posted: October 8, 2013. Available online:

Saturday, November 16, 2013

Words are stupid, words are fun

As words fall in and out of fashion, new ones enter the language. But some, such as autonaut, chassimover and pupamotor, failed to reach the assembly line

English is a marvellous mashup of words. A few Celtic placenames. A stock of Old English words (day and night, black and white, food and drink, life and death, beer). More than twice as many words adopted from Norman French (marriage, parliament). Sometimes competing words from both: motherhood (Old English) and maternity (Norman French). Words of Greek derivation, like octopus. Words of Latin derivation, such as campus and ultimatum. Words from all over the place: Welsh (corgi), Irish (brogues), Arabic (algebra), German (hamster), Chinese (typhoon), Japanese (tycoon), American Indian (tobacco), Hawaiian (ukulele), and many more.

Wherever they come from, words fall in and out of fashion. Within living memory gay has changed meaning completely, while bad and wicked changed, then changed back. Yesterday's slang is respectable today. In the 1950s and 60s, words that angered people who write to newspapers included job (the writer thought it vulgar, and preferred employment), breakdown ("horrible jargon"), and layby ("a combination of verb and preposition of rather obscure meaning"). The Manchester Guardian stylebook of 1950 banned such "slang" phrases as bank on, face up to, give away, sack (for "dismiss") and many others.

The expression "foregone conclusion" once meant an experience previously undergone, rather than making a decision without listening to the arguments. Many words we use today have a different meaning from 20, never mind 50, 100 or 200 years ago. Nice once meant silly (silly meant happy or blessed), then subtle, then pleasant. You could be sad with food and drink – it meant full to the brim, and was related to sated, satisfied and saturated. It then came to mean solid, so a reliable person could be called sad; in time, solid, heavy and dull came to mean sad in one of our modern uses. In recent years it subtly acquired an additional meaning, as in "how sad is that?"

Cicero invented the word qualitas because he felt Latin was inadequate to express a Greek philosophical concept. Now that's what I call nerdy. About 1,700 words are first recorded in Shakespeare (which does not necessarily mean he invented them), including barefaced, fancy-free, laughable and submerged. Milton is credited with beleaguered, impassive, jubilant and sensuous and the expressions "trip the light fantastic" and "all ears". Jung invented the word synchronicity as well as ambivalent, extrovert and introvert, while Freud came up with the word psychoanalysis, derived from psyche, the Greek word for butterfly and the name of the Greek goddess of the soul.

Technology is a continual source of new words. The man who developed the wireless technology Bluetooth in 1996 was reading a historical novel about Harald Bluetooth, a 10th-century King of Denmark, at the time and appropriated his name. Spam, in the sense of unwanted emails, was named after the 1970 Monty Python cafe sketch in which Spam, in the sense of unwanted canned meat, was compulsory in every dish. Sometimes new words catch on, sometimes they don't, but you can always bet that someone, somewhere will object to them. I recall readers complaining about the Guardian's use of the new word blog (an abbreviation of another new word, weblog) but within a very short time it had become established. In the early 1960s, the AA sought suggestions from the public for a new word to describe drivers: submissions included autocarist, autonaut, chassimover, motorman, wheelist, and the bizarre acronym pupamotor ("person using power-assisted means of travelling on roads"). The idea was dropped. Whoever came up with laser ("light amplification by stimulated emission of radiation") in 1960 was more successful.

The writer AP Herbert devised a scoring system for new words, which would be given marks out of 10 on each of four criteria: is it readily understood, is it to be admired, is it sound etymologically, and is it actually required? The pass mark was 50% and television, for example, just scraped through (scoring respectively 10, 0, 0, and 10). One of my favourite recent words is bouncebackability, a neat alternative to "the ability to bounce back" attributed to the former football manager Iain Dowie. I fear it would fail the test.
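Herbert's scheme is simple arithmetic, so it can be sketched in a few lines of code. This is a minimal illustration based only on the description above; the function name and return format are my own:

```python
def herbert_score(understood, admired, sound_etymology, required):
    """Score a new word on AP Herbert's four criteria, each marked
    out of 10. The pass mark is 50% of the 40 available marks."""
    marks = (understood, admired, sound_etymology, required)
    percent = sum(marks) / 40 * 100
    return percent, percent >= 50

# "television" reportedly scored 10, 0, 0 and 10 - just scraping through
percent, passed = herbert_score(10, 0, 0, 10)
print(percent, passed)  # 50.0 True
```

On this scheme a word can fail three criteria outright and still pass, provided the remaining marks are perfect, which is exactly how television squeaked by.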

This is an edited extract from For Who the Bell Tolls: One Man's Quest for Grammatical Perfection, by David Marsh, published by Guardian Faber. To order a copy for £8.99 (RRP £12.99) visit or call 0330 333 6846.

Marsh, David. 2013. “Words are stupid, words are fun”. The Guardian. Posted: October 7, 2013. Available online:

Friday, November 15, 2013

Civilization Is Defined by 'the Others'

What does it mean to be a civilized person? A civilized nation? How are these notions changing over time? And from one country to another? In the recently concluded project Civility, Virtue and Emotions in Europe and Asia, researchers from several different countries and disciplines have studied these questions. One of the initiators is Professor Helge Jordheim, Academic Director for the inter-faculty research programme KULTRANS.

Jordheim and his colleagues have studied what was considered to be civilized behaviour in Europe and Asia in the late 19th and early 20th centuries.

"Western identity and mores were formed by the encounter with non-Western cultures," Jordheim states.

The period studied by the researchers was one characterized by imperialism. In light of this, the relationship between "the West and the rest" is particularly interesting, Jordheim claims.

"In Western Europe, the prevailing notion was 'civilization, that's us.' Even in Asia, the idea that standards were defined by the West tended to prevail. Implicitly, the objective was: how can we catch up with the West?"

A boost in self-confidence

At the same time, there was a clear perception in Asia about not just mimicking the West, Jordheim emphasizes. The Asian countries were concerned with "finding their own path."

"A challenge for the entire project has consisted in avoiding the pitfall of thinking that all influence emanated from Western Europe. It's not as simple as that. For example, we can see that there was a widespread exchange of ideas between the Ottoman Empire and the Arabic and Persian cultures, which also had an impact on the Urdu-speaking population of India. Thus, the influence appears to be far less homogenous than we have previously assumed," Jordheim says.

He believes that the Russo-Japanese War in the early 20th century was a key event for the Asian civilizing process.

"This was the first time that Asia defeated the West. It resulted in a real boost in self-confidence, and had an impact on the kinds of ideas that were nurtured," Jordheim says.

Similarly, the researchers have been interested in how the civilizing influence to some extent also ran in the opposite direction -- from East to West.

The written word shapes our thoughts

Jordheim and his colleagues have mainly studied different types of texts from the countries included in the project.

"We have looked at a lot of self-help literature, such as 'how to become a better person' and literature on 'etiquette.' We have also studied political documents that present ideas of how the nation should be formed. In addition, we have studied texts from encyclopaedias, which help explain concepts."

Jordheim points out that such texts help mould the views of the population in a particular manner.

"They help inculcate and foster certain emotions, while suppressing others. To render a population more civilized, changes must occur at the level of the individual," he says.

Scandinavia: A natural paradox

Jordheim's own research has focused on the concept of civilization in Scandinavia -- a region which is rarely included when processes of civilization are being studied.

"Scandinavia stands apart because civilization is relatively unimportant as a notion. Here, the concept of dannelse (formation) is used to refer to the same," Jordheim states.

Scandinavia is different also in other respects, mainly due to the population's relationship to nature.

"The entire idea of civilization involves abandoning nature and the natural state. This is problematic in Scandinavia, and especially in Norway, since so much of our identity is associated with nature," he says.

Jordheim argues that much of what is traditionally regarded as a development in the right direction in other countries will not be perceived in the same way in Scandinavia.

For example, migration to the cities will not necessarily be regarded as a sign of progress. The Scandinavian discourse fosters ideas and notions of nature as the ideal -- not civilization as such. Viewed thus, the idea of civilization is paradoxical, he says.

Fertile ground for Social Darwinism

Jordheim believes that this notion of nature is also reflected in the kinds of emotions that are regarded as "civilized" in Scandinavia.

"In many other countries, the civilizing process entails that emotions must be curbed. This is not necessarily so in Scandinavia. Emotions that are presumed to be natural, such as courage, anger and maternal instincts, are also regarded as desirable," he says.

In his research, Jordheim has been concerned with how these notions of civilization, nature and emotions helped Social Darwinism gain a firm foothold in Scandinavia, especially in Norway.

"Ideas that were explained on the basis of nature had already gained widespread acceptance. With Social Darwinism, "civilized" ideas could be integrated while nature maintained its position," Jordheim claims.

Private and global

Jordheim says that what makes the project particularly interesting is its wide scope -- from the private realm, to global matters.

"On the one hand, this is about how you behave within your own home. How do you behave towards your wife and children? What is the ideal? At the same time, this is about the nation and the world order.

"It is important for all countries to appear civilized. How civilized a country is considered to be determines its position in the 'global pecking order,'" he says.

Science Daily. 2013. “Civilization Is Defined by 'the Others'”. Science Daily. Posted: October 3, 2013. Available online:

Thursday, November 14, 2013

Racism Is Dying

At least when it comes to geography.

Geographer Joni Seager taught me well. I signed up for the seminar "Gender, Place, and Culture," not knowing what to expect. Dr. Seager outlined the research agenda. Concerning geographic analysis, did gender matter? Her fundamental question wasn't rhetorical. It was open-ended. If we introduced a gender variable, did our understanding of human geography change significantly? I approach issues of race the same way. If we introduce a race variable, does our understanding of human geography change significantly?

Spoiler alert, it does. But like gender, the race variable isn't always the most important. That the race variable wouldn't be the most important is blasphemy. Well, sometimes gender trumps race. Sometimes, neither race nor gender are the best predictors.

For migration, gender beats race. Gender beats class. Gender beats any other variable championed. Yet gender and migration get short shrift. For the most part, migration is migration. You go where you know. Race taking a back seat to a river in Cleveland:

The hall at Mahall’s 20 Lanes was stuffed. Young men of the hip-hop generation ceded chairs at the bar to women their grandmothers’ age. We’d come to see Mariama Whyte show off her Broadway vocal chops to her hometown fans. After Whyte topped off the night singing with South African actors and fellow cast mates from The Lion King, I stood and added my cheers to the ovations. I’d had the time of my life — in Lakewood.

Clevelanders of a certain age will raise an eyebrow at that sentence, because Lakewood is on the West Side. When it comes to African-Americans, the West Side has been the wrong side of the city’s great divide.

When I came to Cleveland in 1993, I learned I had to make a choice. I’d covered Lakewood for The Plain Dealer and thought the houses were cute, the folks nice and the library excellent. I wanted to move there from Richmond Heights. But my East Side friends were adamant: They wouldn’t cross the river to see me. They claimed it was too far and that the police harassed black men. So I moved to South Euclid.

I joked that the Cuyahoga River might as well be the Red Sea, because getting Clevelanders to cross it took an act of God.

This wasn’t just a black thing. The Cuyahoga River split a city and suburbs fractured by racial and ethnic enclaves. On the East Side, Italians were in Little Italy. On the West Side, they were off Fulton Avenue and clustered near West 65th Street and Detroit Avenue. Although there were Irish on the East Side, the West Side parish endured around St. Colman on West 65th. Puerto Ricans were definitely on the West Side, although migrants from the island first settled around Hough and Lexington avenues.

Emphasis added. Those Rust Belt city divides don't fall neatly along racial lines. OK, they do if you completely ignore ethnic enclaves. Parochial geography rules.

Race does a poor job of explaining Lakewood. Yet race holds our understanding of cities hostage. We're stuck fighting the fallout from the MLK riots of the late 1960s. Trumping race, and even gender, is economic globalization. African Americans are gentrifying the West Side of Cleveland.

Russell, Jim. 2013. “Racism Is Dying”. Pacific Standard. Posted: October 3, 2013. Available online:

Wednesday, November 13, 2013

Jared Diamond's provocative insights

The provocative insights of the bestselling author have won him a huge following — and the disdain of some academics

As Washington has been rocked with political drama in recent weeks, investors, pundits and voters have watched with nervous horror and scorn. Over in the more rarefied environment of the University of California, Los Angeles, however, Jared Diamond, 76, professor and guru of environmental geography, is monitoring these peculiar power rituals with a more scholarly — and sweeping — perspective.

The reason? In the next couple of years, Diamond plans to write a blockbuster analysis of how modern civilisations “manage” the process of change and crisis. Some of this will focus on countries such as Japan and Britain; Diamond believes, for example, that Britain’s response to the loss of its empire after the Second World War was striking.

“I went to Britain myself in the late 1950s and it was a time when Britain was in slowly developing crisis, with Suez, race riots, the scrapping of the last battleships,” he observes, sitting in a London hotel and looking like the stereotype of the romantic, intellectual adventurer: greying beard, tweedy jacket and a tie seemingly decorated with armadillos. “[Then] we would never have guessed that Britain would have dealt with problems of becoming a heterogeneous society as peacefully as it has. Today no one talks about the empire on which the sun never sets — Britain has a new identity.”

But Diamond’s analysis of “conflict” will also cover the less peaceful world of Washington as he tries to assess whether modern America can handle the process of change — or not. “It would be premature to say what I make of [US politics] but what is clear is that the polarisation and intolerance in America today is greater than at any point in my lifetime,” he states, adding rhetorical questions with professorial earnestness.

“Why? Is it because of the electronic media? Or because people are not seeing each other face to face? Or is it due to the separation of politicians? Previously Democrats and Republicans would socialise at the weekends because they were trapped in Washington but now they go away because air travel is so good, which changes things.”

Whatever explanation Diamond eventually settles on as the cause of Washington’s woes, it is likely to be provocative — and blend the physical and cultural aspects of our lives in a dizzying bricolage. These, after all, have been the traits that have made him wildly popular in recent years, with a string of bestselling popular science-meets-history-meets-ecology books.

Born in Boston to a Bessarabian Jewish family, Diamond toiled in relative obscurity in the first few decades of his career as a physiologist at Cambridge University and UCLA. “For decades I was the world’s expert on the gall bladder,” he explains matter-of-factly, without any hint of modesty. “The gall bladder is a simple organ that absorbs salt and water — and that means you can study it with a minimum of equipment, which I like.”

But even as he obsessively observed gall bladders, Diamond developed a second passion: birds. In his twenties he started to visit Papua New Guinea and used the material gathered to write academic papers in the field of ornithology. That led him into yet more — seemingly unlikely — areas of intellectual inquiry such as environmental geography, followed by physical and cultural anthropology (or the study of human evolution and culture).

“My study of New Guinea was initially motivated by birds but you cannot do anything there without dealing with local people,” he explains. “And once you have spent time dealing with local people, you realise that humans are similar all around the world in some respects — but different in others.”

This led Diamond to produce his first bestseller, “Guns, Germs and Steel”, which endeavoured to explain, by highlighting differences in ecology, why peoples of Eurasian origin displaced Native American and other indigenous cultures.

It was a controversial thesis. But it turned him into something of a cult hero: 16 years after the book, when I tell friends that I am interviewing Diamond, one remarks that “‘Guns, Germs and Steel’ changed how I thought.” In 2002, Diamond abandoned gall bladders, ending his career in physiology, to devote himself to writing. In 2005 he published another sweeping analysis, “Collapse”, which explained why some societies fail and others flourish. Then last year he published “The World Until Yesterday”, which describes how humans live in societies which are not “WEIRD”, or “Western, Educated, Industrialised, Rich and Democratic”.

This is a fun, lively read that sets out to illustrate two simple points: humans can live their lives in numerous, different ways; and the WEIRD approach is not always best. On the contrary, America and Europe could sometimes improve their own cultures and lives by looking at how other, more traditional cultures live.

And, to illustrate this, Diamond presents the reader with a host of colourful stories in places ranging from Papua New Guinea to the Amazon and African deserts, focusing on issues such as diet, child-rearing and dispute-resolution. Some of these anecdotes are mundane; others are colourful (there are lengthy tales about widow-strangling and infanticide). But the account weaves together a powerful tapestry that — if nothing else — forces us to recognise that it could be perfectly possible for the western world to change how we raise our children, sort out divorce, wage wars or guard our health — if we choose to widen our gaze.

“Many people I encounter have changed how they bring up their children as a result of reading my book,” Diamond says. “In UCLA one of my colleagues is an American whose wife is from Finland. They were debating whether to speak Finnish to the newborn baby alongside English but then they read my chapter on the benefits of multilingualism.”

More specifically, “The World Until Yesterday” explains that in many non-Western societies, in places such as Papua New Guinea, it is considered entirely normal for people to speak several languages. And this not only helps groups to maintain more social ties but also has a tangible benefit that could be of use in the WEIRD world: scientific research shows that if you use your brain to speak several languages you are less likely to suffer from dementia.

“When my friends read that, the mother decided she would only speak Finnish to the baby, but he speaks only English.”

It is provocative stuff — having read Diamond’s book I was left reflecting on many aspects of child care myself. And as somebody who once studied for a PhD in cultural anthropology, I am thrilled that Diamond’s work has highlighted a point that is central to the discipline: namely that studying “other” cultures is not just valuable in terms of understanding how the wider world works — but also because it helps to “flip the lens” and garner fresh perspective on our own lives.

As I chat with Diamond, however, I am also aware that a certain irony hangs over his work. In some senses his tomes are a powerful advertisement for anthropology — he cites no fewer than 39 anthropology studies in his latest book, which now appears on some undergraduate courses.

“Whenever I give a public lecture I get people coming up and saying that they went into anthropology because of my books,” he cheerfully declares. And yet if you talk to anthropologists about Diamond, some are scathing if not hostile. So much so that Survival International — an advocacy group for tribal people — issued a statement criticising “The World Until Yesterday”.

And groups such as the American Anthropological Association have staged critical debates on his work (“which they didn’t even ask me to attend myself”, Diamond says with some chagrin).

Some of this criticism seems to reflect academic jealousy or snobbery; there are complaints, for example, that Diamond doesn’t use many footnotes and only cites a narrow pool of ethnographic texts (many based on Papua New Guinea). But some anthropologists are angry that the examples of “traditional” culture are luridly presented, and that Diamond tends to assume that non-WEIRD societies live like WEIRD cultures did millennia ago. There has also been criticism of the suggestion in his earlier books that culture is driven by environmental and ecological factors; to cultural anthropologists this sounds unpleasantly determinist.

Diamond bristles when I cite those criticisms and insists that the vast majority of anthropologists have welcomed his book. “Whenever I hear the word environmental ‘determinism’, I know you will get a poor quality of reasoning,” he declares, explaining that people have misunderstood him: he does not view culture as something merely determined by material factors. Instead, he is fascinated by the variations of culture within a single ecosystem, citing the colourful example of widow-strangling in Papua New Guinea.

“There is one group that does that after a man dies — it’s considered entirely normal. But other groups do not do that and they live in identical environments,” he observes. “Or look at Europe, and think about dogs, horses and frogs. The French eat horses and frogs but the British eat neither and Germans don’t eat frogs — yet all have frogs, horses and dogs. I am not aware of any environmental explanation for that difference or for the fact we don’t eat dogs.”

That means, he concludes, that both environment and culture need to be analysed: if you want to understand political polarisation in Washington, in other words, look at Tea Party ideology and the advent of cheap airline flights. But anthropologists are sometimes ill-equipped to take this common-sense approach, due to infighting.

“Anthropology is a controversial discipline,” he points out. “Many anthropologists don’t like [cross-cultural] comparisons and syntheses — they dislike me coming in from a field outside anthropology and writing about this since I am performing territorial trespass!”

The key issue, it seems, is that Diamond has had the audacity to break the boundary taboo. “Silo-busting is exceptional in academia — one is expected to specialise. There is a lot of turf warfare,” he notes, explaining that when he first started studying ornithology he kept this secret from his colleagues in the medical department.

“Luckily my [academic] papers about birds were published in journals which no gall bladder physiologists ever read. But when my review committee eventually found out about what I was doing, they voted against my promotion. In academia, working in multiple fields is not a benefit but a penalty.” So much so that he now advises young academics to “make sure you get tenure before you start publishing in a second field”.

“In academia people talk about interdisciplinary thinking and run courses and programmes — but Lord help you if you try to make an interdisciplinary career, unless you are already so high that there is nothing they can do to you.”

These days, Diamond has found a solution to this, of sorts: although he was offered professorships in both the UCLA departments of anthropology and geography, he chose the latter. However, UCLA is sufficiently broad-based in its teaching approach — and broad-minded — that he is able to keep hopping between disciplines. And, as he knows, it is this bricolage approach which not only makes his books so readable but also enables him to develop his insights.

“There are numerous different disciplines which study human societies — all departments have different expertise and views about what is appropriate to do and what not to do,” he explains, pointing out that he first learnt about the value of comparing different populations that live in similar niches by studying birds. “It is only by comparison that you see what the other options are — and what your society is not doing.”

So would he prefer to live among the “non-WEIRD” people, to benefit from their wisdom, rather than have to cope with the bizarre rituals of American universities or meetings with journalists?

For the first time in almost an hour, Diamond cracks a wide smile and shakes his head. “My life is in California, I like it there. I like going to Papua New Guinea, too, and I have gone so many times, but just for short periods,” he admits. It is an answer that might provoke yet more questions from anthropologists. But it also captures the spirit of his books: we might not want to entirely abandon our western ways but we know that by peering outside our lives — however briefly in a book — we can better understand our own peculiarities, and sometimes even improve on them.

Maybe it is time to pack Congress off to Papua New Guinea for a short break. Or failing that, to read Diamond’s book — starting with what can be learnt from “traditional” forms of resolving tribal warfare.

Tett, Gillian. 2013. “Jared Diamond's provocative insights”. Gulf News. Posted: October 24, 2013. Available online:

Tuesday, November 12, 2013

Native Tribes' Traditional Knowledge Can Help US Adapt to Climate Change

New England's Native tribes, whose sustainable ways of farming, forestry, hunting and land and water management were devastated by European colonists four centuries ago, can help modern America adapt to climate change.

That's the conclusion of more than 50 researchers at Dartmouth and elsewhere in a special issue of the journal Climatic Change. It is the first time a peer-reviewed journal has focused exclusively on climate change's impacts on U.S. tribes and how they are responding to the changing environments. Dartmouth also will host an Indigenous Peoples Climate Change Working Group meeting Nov. 4-5.

The special issue, which includes 13 articles, concludes that tribes' traditional ecological knowledge can play a key role in developing scientific solutions to adapt to the impacts. "The partnerships between tribal peoples and their non-tribal research allies give us a model for responsible and respectful international collaboration," the authors say.

Dartmouth assistant professors Nicholas Reo and Angela Parker, whose article is titled "Re-thinking colonialism to prepare for the impacts of rapid environmental change," said New England settlers created a cascade of environmental and human changes that spread across North America, including human diseases, invasive species, deforestation and overharvest.

The researchers identified social and ecological tipping points and feedback loops that amplify and mitigate environmental change. For example, prior to the arrival of Europeans, old growth deciduous forests were rich with animal and plant resources and covered more than 80 percent of New England. Native peoples helped to sustain this bountiful biodiversity for centuries through their land practices.

"But when indigenous communities were decimated by disease and eventually alienated from their known environments, land tenure innovations based on deep, local ecological knowledge, disappeared," the researchers say. "Colonists, and their extractive systems aimed at key animal and plant species, became the new shapers of cultural landscapes. Rapid ecological degradation subsequently ensued, and New Englanders created a difficult project of stewarding a far less resilient landscape without help from indigenous land managers who would have known best how to enact ecological restoration measures."

Today's tribal members who work with natural resources, such as fishermen, farmers and land managers, can play key roles in devising local and regional strategies to adapt to climate change, the researchers say.

Science Daily. 2013. “Native Tribes' Traditional Knowledge Can Help US Adapt to Climate Change”. Science Daily. Posted: October 3, 2013. Available online:

Monday, November 11, 2013

Extreme rituals enhance social cohesion

Physically and mentally exhausting rituals promote the spirit of community in society, according to a new study. This is the first time that this hypothesis has been confirmed experimentally.

We humans are weird creatures. In the Philippines, people crucify themselves in extremely painful ways in an annual ritual. In Spain and in Greece, people walk on burning charcoal and risk getting burned while the audience spurs them on.

A popular explanation of this phenomenon is that we humans form bonds and find a shared identity through collective rituals. This explanation has now been confirmed experimentally for the first time by a group of researchers headed by Dmitri Xygalatas of the Interacting Minds Centre (IMC) at Aarhus University, Denmark.

The study, published in the journal Psychological Science, was conducted on the island of Mauritius in the Indian Ocean. It concludes that the locals who watch or participate in the annual Kavadi ritual subsequently experience a stronger sense of attachment to society.

Kavadi is a painful ritual in which the participants are pierced with spears and large needles, after which they have to carry heavy bamboo structures called Kavadi for several hours before climbing a mountain to offer these structures to the temple of Murugan as a sign of their devotion.

The Mauritian population is highly varied, but the ritual is Hindu.

The study found that the ritual participants and the audience are more willing to donate money to the temple than those who only participate in low-ordeal communal prayer.

“It appears that the more people sacrifice themselves physically and mentally, the greater the effect it has on the willingness of the participants and the audience to donate money to the community,” says co-author Professor Andreas Roepstorff, who also works at the IMC in Aarhus.

Pain and empathy make people donate money

After the ritual, the research team gave a group of ritual participants and a group of audience members a bag of money equivalent to two days’ wages for a Mauritian worker. They also gave money to a third group, which had not taken part in the painful ritual; instead, they took part in a Hindu communal prayer.

The three groups, totalling 86 persons, were then given the option of donating a self-chosen amount anonymously to the Murugan temple.

“We observed that the participants from the painful Kavadi ritual were more willing to donate money than those who took part in the communal prayer. But it was actually the Kavadi audience that donated the most out of the three groups,” says Roepstorff.

“This indicates that the audience was affected by the ritual and felt empathy with the participants, which made them want to support the temple.”

Rituals act as social glue

The three groups were also handed a questionnaire designed to reveal how the people regarded themselves in relation to being part of the Mauritian community.

“We noted an interesting shift in identity and self-image. The painful rituals drew the participants and the audience towards a stronger sense of common identity and belonging, and away from their original stance,” says Roepstorff.

Religion is not the driving force

He believes it is in our nature to perform these kinds of physical and social rituals, and that the religious aspect is not the driving force.

“Religion also consists, to a great extent, of participating in some coordinated social activities. We look at it as a social practice, not as something where ‘God said we should do things this way and that way’,” he says.

“It’s important for us humans to share experiences with one another, and preferably with people we can relate to. This is a characteristic of the human mind and part of our evolution, but it does not appear to be something we have in common with monkeys.”

Christensen, Bo. 2013. “Extreme rituals enhance social cohesion”. Science Nordic. Posted: October 2, 2013. Available online:

Sunday, November 10, 2013

Hawaiian Mythology Digs Deep into Volcanic Past

Hawaii's vibrant mythology is populated by savage, emotional gods. But behind the fantasy could lie clues to the catastrophic volcanic events that scientists now believe inspired those tales.

Ten centuries ago, the small group of Polynesian sailors who first glimpsed the Hawaiian Islands must have sensed the miraculous; a thousand miles from home, the Pacific had thrown them a lifeline. What they saw when they landed, though, confirmed the supernatural: On this lone outpost, in an unending ocean, the ground itself was alive.

The settlers had no written language, so we can only guess at the events that inspired early legends of a god who devoured forests. But some sights seem to have sparked such awe in the islanders that they left an invisible mark. Recently, the rich oral history of the native Hawaiians has begun to receive scientific attention. It seems that, preserved in the ancient tales of volcano gods, there could be something very real — relics of the two most incredible eruptions the Big Island has witnessed since humans first floated ashore.

In 1779, Captain Cook became the first outsider to meet — and be killed by — the inhabitants of what he called the "Sandwich Islands." Some forty years later, another Englishman — William Ellis, a missionary — spoke to them in their own tongue. (No hatchets this time.)

Instead, the islanders showed him their volcano — the immense, lava-scarred pit of Mount Kilauea — and told Ellis stories from a mythology revolving around the goddess Pele, whom they portrayed as jealous, volatile and eruptive.

Scientists aren't used to wading through poetic metaphor, but when Don Swanson, a former director of the scientific observatory which overlooks Kilauea, read Ellis' accounts, he saw more than just superstition — he saw a record.

His volcanologist's eye was drawn to one legend in particular. Pele had fallen in love. Steaming in her pit atop Kilauea, she demanded that her sister, Hi'iaka, fetch the object of her affections from his island home in the North. His name was Lohi'au, and he doesn't come out of this well. Hi'iaka agreed, on one condition: that her sister keep her fires away from a grove of flowering trees that she valued above all else.

Hi'iaka excelled in her task — first bringing Lohi'au back to life, and then back to Kilauea. But she had taken too long. Pele's temper flared (nobody said volcanoes were reasonable), and Hi'iaka returned to find her treasured forest ablaze. But her sister wasn't done. The goddess then proceeded to murder Lohi'au, and cast his body into the depths of her volcano. In response, grief-stricken, Hi'iaka began to dig. Frantically. Rocks were flying out of the crater. She delved so deep, she was warned that if she didn't stop, she would hit water, and put out Pele's fire.

Burning forests. Spitting craters. People should write what they know, I suppose — even if oral tradition takes the place of writing.

It doesn't take a huge leap to imagine, as Swanson did, that the story of Hi'iaka's burning forest might contain echoes of an ancient lava flow. But why would something as dull as a lava flow (of all things!) have diffused into myth? They're regular episodes above a volcanic hotspot, after all. Perhaps, though, there had been one worth remembering.

In the 1980s, a team of geologists stumbled across a flow that had been emitted from an extinct vent on Kilauea's east flank, sometime in the 15th century. It was huge — the lava had reached the sea, more than 25 miles (40 kilometers) away. But its length wasn't the only thing that caught Swanson's eye. Using carbon-14 analysis, he pinpointed the exact year that the flow had begun — 1410. Almost unbelievably, the end date came not years but decades later, in 1470. This single, gigantic stream of basalt had persisted for three generations. It would have changed the landscape forever. Enough, perhaps, to etch itself into legend.

Incredibly, though, the final act of this mythical quarrel could be hiding something even bigger. Hi'iaka's furious digging, Swanson realized, might be describing the single biggest volcanic upheaval at Hawaii since humans arrived: It was the perfect metaphor for a caldera collapse — the catastrophic slumping which turns a "traditional" volcanic crater into a huge, disfigured scar.

At the time of the megaflow, Kilauea had a relatively small summit crater. By the time Cook landed, however, it had morphed into a cauldron: 2 miles (3 km) wide and 400 feet (122 meters) deep.

Today, scientists can confidently say that the caldera formed due to the drainage of magma-filled chasms beneath the volcano. But if you're a fifteenth-century Hawaiian, and all you know is that the Earth itself is sinking around you in a chorus of explosions, then a god digging isn't a bad guess.

It's an inspired piece of detective work; but also a fascinating insight into how myths begin. Swanson's respectful treatment of the story of Pele allowed him to see it for what it partly was: a theory. Formed by regular people striving to explain the incredible — a best guess at a time when the accessible Earth ended at the surface. Anything below, like the unknown void above the stars, was given over to the gods.

Wylie, Robin. 2013. “Hawaiian Mythology Digs Deep into Volcanic Past”. Live Science. Posted: October 2, 2013. Available online:

Saturday, November 9, 2013

Gender and the Body Language of Power

Expansive body postures like those associated with masculinity increase people’s sense of powerfulness and entitlement.

Philosopher Sandra Lee Bartky once observed that being feminine often means using one’s body to portray powerlessness. Consider: A feminine person keeps her body small and contained; she makes sure that it doesn’t take up too much space or impose itself. She walks and sits in tightly packaged ways. She doesn’t cover the breadth of the sidewalk or expand herself beyond the chair she occupies.

Likewise, burping and farting, raising one’s voice in an argument, and even laughing loudly are considered distinctly unfeminine. A feminine person doesn’t use her body to forcefully interact with the world, she lets others do for her when possible. "Massiveness, power, or abundance in a woman’s body is met with distaste," Bartky wrote.

Stunningly, when you think about it, these features of feminine body comportment are, in fact, not uniquely feminine, but associated with deference more generally. Bartky again:

In groups of men, those with higher status typically assume looser and more relaxed postures; the boss lounges comfortably behind the desk while the applicant sits tense and rigid on the edge of his seat. Higher-status individuals may touch their subordinates more than they themselves get touched; they initiate more eye contact and are smiled at by their inferiors more than they are observed to smile in return. What is announced in the comportment of superiors is confidence and ease….
Acting feminine, then, overlaps with performances of submissiveness. Both men and women use their bodies in more feminine ways when they're interacting with a superior, whether it be their boss, their commander, a police officer, or their professor.

New evidence suggests that this is not pure theory. Psychologist Andy Yap and his colleagues tested whether “expansive body postures” like the ones associated with masculinity increase people’s sense of powerfulness and entitlement. They did. In laboratory experiments, people who were prompted to take up more space were more likely to steal, cheat, and violate traffic laws in a simulation. A sense of powerfulness, reported by the subjects, mediated the effect (a robust finding that others have documented as well).

In a real world test of the theory, they found that large automobiles with greater internal space were more likely than small ones to be illegally parked in New York City.

Research, then, has shown that expansive body postures that take up room instill a psychological sense of power and entitlement. The fact that this behavior is gendered may go some way toward explaining the persistence of gender inequality and, more pointedly, some men’s belief that they have earned their unearned privileges.

Wade, Lisa. 2013. “Gender and the Body Language of Power”. Pacific Standard. Posted: October 2, 2013. Available online:

Friday, November 8, 2013

Transgendered males seen as an asset to some ancestral societies

Ethnographers say ancestral communities with transgendered males rarely frowned upon homosexual behavior

Transgendered androphilic males were accepted in traditional hunter-gatherer cultures because they were an extra set of hands to support their families. In turn, by investing in and supporting their kin, these males ensured that their familial line – and therefore also their own genetic make-up – passed on to future generations despite their not having children of their own. This is according to an ethnographic study led by Doug VanderLaan of the Centre for Addiction and Mental Health in Canada, published in Springer's journal Human Nature. The study reports that this "kin selection" is still at play in pro-transgender societies today.

"Androphilia" refers to a predominant sexual attraction towards adult males, and takes on one of two possible gender roles depending on the cultural context: sex-gender congruent male androphilia (the typical male gender role) or transgendered androphilia (a gender role markedly similar to that of females in a given culture). Typically one of these variations is dominant within a society. For example, sex-gender congruency is more common in Western cultures, whereas the transgendered form is more typical of non-Western cultures, such as that of the Polynesian island nation of Samoa.

The researchers also wanted to test predictions that enhanced kin-directed altruism is prominent in societies in which transgendered male androphilia is predominant. To answer this question, VanderLaan and his colleagues compared the sociocultural environment of contemporary transgendered societies with ancestral small-group hunter-gatherers. Ancestral group size, sociopolitical systems, religious beliefs and patterns of residency were analyzed in 146 non-transgendered societies and 46 transgendered societies. The analysis utilized ethnographic information about well-described nonindustrial societies from the Standard Cross-Cultural Sample.

VanderLaan and his colleagues found that transgendered male androphilia is an ancestral phenomenon typically found in communities with certain ancestral sociocultural conditions, such as "bilateral descent." This term refers to societies in which the families of both one's father and mother are equally important for emotional, social, spiritual and political support, as well as the transfer of property or wealth. Also, the acceptance and tolerance of same-sex behavior evolved within a suitable, accepting environment in which discrimination against transgendered males was rare. Importantly, kin selection might have played a vital part in maintaining genes for male androphilia in these societies. For example, it continues to be a driving force in contemporary Samoan fa'afafine transgender communities.

Unless transgendered androphilic males are accepted by their families, their opportunities to invest in kin are likely limited. What was true of our ancestors still holds true: a society's specific social organization and its general acceptance of transgenderism and homosexuality are important even today. When supported by society, transgendered males invest their time and energy in their kin in turn.

EurekAlert. 2013. “Transgendered males seen as an asset to some ancestral societies”. EurekAlert. Posted: October 2, 2013. Available online:

Thursday, November 7, 2013

Can Intelligence Really Be Measured?

Every year, the MacArthur Foundation bestows large financial grants on a group of people who are doing exceptionally creative or important work.

MacArthur fellowships are often called “genius grants,” and grant-winners tend to be unusually motivated, passionate and forward thinking. But are they geniuses? The annual conversation that ensues raises questions about what it means to be intelligent and whether that’s something that can be cultivated, measured or even defined.

Despite decades of research into how different brains work, experts said, there are no easy answers. Scientists now know that there are multiple types of intelligence. There's a strong genetic component to certain aspects of intelligence. And scores on intelligence tests are tightly linked to school performance, future income level, health and more.

But IQ scores are far from the only factor that determines how well people do in life. Also, conversations about innate differences in intelligence continue to make people uneasy, probably because there is a long history of racism, classism, sexism and even religious discrimination tied up in discussions about who is smarter than whom.

“The field is just fraught with controversy after controversy,” said Randall Engle, a psychologist at the Georgia Institute of Technology in Atlanta. “There are group differences all over the place in intelligence measures and that just adds to the controversy. It’s hard for the field to come to grips with what’s understandable about this in the midst of all this craziness.”

Researchers have been interested in understanding the nature of intelligence since at least the 1800s, but early studies were hampered by complications.

Part of the problem was that intelligence tests were designed before anyone had come up with a specific definition of what they were trying to measure, Engle said. What’s more, British scientist Sir Francis Galton, who was the first to use statistics to test whether intelligence could be inherited, was also a eugenicist, and beliefs that good traits were inborn led to forced sterilizations and other terrible outcomes.

In the early 1900s, French psychologist Alfred Binet developed a test to identify children who might need extra help in school, and his work was incorporated into the Stanford-Binet Intelligence Scales, which originally focused on verbal skills. That and other modern IQ tests have changed over the years as new research changes our understanding of what intelligence is.

Generally, Engle said, different IQ tests correlate well with each other and scores tend to be linked to real-world outcomes. Compared to people who score lower on the tests, for example, people with the highest IQs file more patents, publish more academic papers, and earn higher incomes.

But scoring well on an IQ test doesn’t guarantee success, nor does a relatively average or lower score condemn someone to a life of misery.

That’s because having a high IQ is like owning a car with a big engine, said David Lubinski, psychologist and co-director of the Study of Mathematically Precocious Youth at Vanderbilt University in Nashville, Tenn.

“If there’s no gas in your car you’re not going to go anywhere. If road conditions are bad, you’re not going to go anywhere,” he said. In the case of intelligence, you need good health, hard work and motivation to take advantage of inherent brainpower.

Another complication is that intelligence comes in many forms. One category is crystallized intelligence, Engle said, which measures how much knowledge a person has acquired and is highly correlated with education level.

On the flip side is fluid intelligence, which is the ability to reason and solve new problems. Studies of twins show that fluid intelligence is largely genetic. Identical twins are much more similar to each other on measures of fluid intelligence than fraternal twins are.

But, according to recent research, that is true only for people with high socioeconomic status, at least in the United States, where access to education varies by zip code. In other words, genes only kick in to influence IQ when people are already getting a relatively good education.

As scientists learn more about the components of intelligence, they are developing new ways to assess nuances in the way people think.

Traditional IQ tests have long focused on math and verbal skills, Lubinski said. Now, though, it’s becoming clear that the ability to think spatially and rotate shapes in the mind’s eye is essential for pilots, orthopedic surgeons, architects and other occupations. Some newer tests evaluate those visualization skills.

Even as testing gets more refined, debates continue about whether IQ tests should be used at all. The military has used intelligence testing as a way to place people in the toughest posts, and experts said that evaluations can be useful in professional and educational settings, too — as long as they’re used responsibly and sensitively.

For kids who are struggling in school, for instance, IQ tests can help determine whether they’re so intelligent that they’re bored or if they have cognitive deficits that require special attention. Determining mental strengths and weaknesses can also help teachers tailor education and steer students towards jobs that best fit their skills.

On the other hand, experts said, it wouldn’t be a good idea to test every child’s IQ and announce the number because that would unnecessarily and unfairly shape expectations.

“It’s like every other tool,” Lubinski said. “It can cause harm, and it can be of great service. There are examples of both.”

As for MacArthur fellows, IQ tests have nothing to do with who wins and “genius” is probably not the right word, Engle said.

“I know two MacArthur award winners,” he said. “They’re pretty normal people who have done some interesting things.”

Sohn, Emily. 2013. “Can Intelligence Really Be Measured?”. Discovery News. Posted: October 1, 2013. Available online: