Let’s start with learning from the learners. In all of my courses, I attempt to challenge my early undergraduate (Freshman and Sophomore) students to clarify and make connections among concepts that my scientific colleagues and I don’t entirely agree on. Or even pay attention to.
I look at it this way. As many of my faculty colleagues in diverse institutions would vouch for, it is basically impossible for any one of us scientists to steer general academic conversations–within any given discipline or specialization–toward concept-focused issues of what we don’t understand, have failed to define, or have thoroughly muddled. This is partly because, as Thomas Kuhn (1996) famously discussed, scientific research communities tend toward what are tacitly accepted as normal paradigms. No matter how compelling a lone dissenter’s argument may be, it’s unlikely that a broader research community will immediately see a shared interest in un-learning, re-learning, and creatively reorganizing and redefining key concepts, observation and analysis methods, questions, and all of the jargon and time spent reading and debating that goes with it.
In any case, for several years now, I have confronted my students in an Emory University 200-level core course in the Neuroscience and Behavioral Biology major with a major problem in studying the evolutionary foundations of human behavior:
Human cultures incorporating concepts of “moralizing high gods” tend to exist in relatively harsh environments.
Of course, the authors of the study (Botero et al., 2014) cautiously point out that the relevant data consist of statistical patterns, and that the societies under study are constituted by human beings. There have to be multiple interacting factors at play, Botero and colleagues state. I mean, we’re talking about a rather complex pattern of cultural development: whether “moralizing high gods” concepts and representations have already reached widespread historical adoption and intergenerational persistence within a community.
Yet, the authors are clear about their theoretical perspective on the data:
In general, our findings are consistent with the notion that a shared belief in moralizing high gods can improve a group’s ability to deal with environmental duress and may therefore be ecologically adaptive … (Botero et al. 2014: p. 3)
From an interdisciplinary perspective, a wide range of experimental, historical, and ethnographic evidence supports a sound theoretical argument. This is a darn plausible hypothesis.
And it’s certainly nice to document–with a robust pattern of scientific observation–support for the “environment influences religion” hypothesis.
This result will certainly not please everyone. Botero et al.’s (2014) study effectively pulls on a salient contemporary cultural tension between our own conflicting beliefs about agency and freedom versus genetic or environmental inevitability.
But that cultural tension is not really salient for a comprehensive and consistent understanding of how religious beliefs persist or change in a culturally constituted environmental context. How can that really be? The main result seems to be a pretty clear point for the environmental determinism side.
What remains implicit in the study’s theoretical framework–but what warrants the authors’ point that many interlinked factors have to be involved in shaping cross-cultural variation in the religious structuration of moral commitment–is that there are multiple, historically dependent ways that large-scale societies can hold themselves together over many generations, in very challenging ecological conditions.
Basically, “moralizing high gods” are just one of many cultural possibilities for religious systems of moral commitment … even in harsh, unpredictable, low-biomass-production environments.
It’s just that, recently, many cultures from around the world have been documented to have concepts of “moralizing high gods” … AND these concepts tend to associate–statistically a bit more often than not–with such difficult environmental settings.
“Ust’-Ishim” has now joined the ranks of human fossil find sites that biological anthropology students will be conscientiously writing down on index cards, as they prep to memorize spelling and key associated facts, like geographic location and geological age. It thus joins the likes of Hadar and Dmanisi.
OK. As with those other sites, “Ust’-Ishim” itself should not really be the purpose of any memorization exercise. (Although you can really get into spacing out and repeating “Ust’-Ishim,” as it rolls ticklingly off the tongue … and as your concentration on studying fades … Now, rested and focused from your meditation, back to the important point.)
What is it that’s important to remember here? The site provides a kind of mnemonic tag for learning something new about our part in the natural world. “Ust’-Ishim” is now a heavy-stock, clearly printed and embossed tag for these two facts:
Anatomically modern humans spread out of Africa and–over many generations of recurring, slight, but significant population growth, over the time period 60,000-30,000 years ago (very roughly)–they decisively made an outsized contribution to the ancestry of all later populations outside of Africa, thus shaping worldwide patterns of genetic diversity, right up to the present day.
But anatomically modern humans (AMH) did not spread into a Eurasian landmass devoid of people. When AMH population growth and expansion into Eurasia began to reach a (quite low but non-zero) escape velocity around 60,000 years ago, at least some AMH groups–with what was then relatively recent African ancestry–mixed and interbred at substantial rates with indigenous Neandertal and other Eurasian populations.
So, the basic recap: major AMH population expansion across Eurasia after 60,000 years ago, and admixture between AMH and indigenous “archaic” human groups.
To be sure, there’s plenty of controversy and debate about why AMH groups successfully and rapidly grew and spread out of Africa when they did … and then persistently pushed the geographic frontiers of the genus Homo, especially northward into arctic latitudes, southeastward into Australia, and eastward into the Americas. Moreover, there’s plenty of disagreement and uncertainty about when the initial, significant pattern of interbreeding occurred, how long it persisted, and–perhaps most significantly–why it was not followed by further interbreeding, but rather population competition and archaic human extinction.
The French economist–and newly announced Nobel laureate–Jean Tirole has illuminated the possibilities and limits of financial regulation of very large, powerful firms.
The first and last Nobel prizes to be announced each year–Medicine/Physiology and Economics–are the ones most immediately relevant for biocultural anthropological research. (And to the extent that human behavioral decisions and aggregate behavior patterns are shaped and constrained by human biological life-history patterns, I would argue that the kind of pioneering behavioral and life-history work carried out by anthropologists Kristen Hawkes, Kim Hill, Hillard Kaplan and others would merit consideration in either prize category. In any case …) I have already posted about the medicine/physiology award-winning research, which is on a complex but important topic: dynamic proprioceptive embodied feedback with neural mapping of the immediate spatial environment.
How does an animal body achieve a sense of where it is and where it’s going? Today’s announcement of the Nobel Prize in Physiology or Medicine recognizes foundational scientific brain research in proprioception. This psychological term may not be a familiar one, but perception (a more familiar “-ception” word) wouldn’t be possible without proprioception, which is the central nervous system’s monitoring of the body’s relationship–in part and in whole–to its surroundings.