
Wednesday, June 19, 2013

Are Humans Getting Dumber (or is it Just Journalists and Sociobiologists?)


Human intelligence is on the decline, according to the Huffington Post, which reported on a recent study suggesting that Westerners have lost 14 IQ points since the Victorian era. Study co-author Dr. Jan te Nijenhuis, professor of work and organizational psychology at the University of Amsterdam, argues that because women of higher intelligence tend to have fewer children than women of lower intelligence, intelligence is being selected out of the affluent populations of the West.
The Kallikak family, promoted by eugenicist Henry Goddard as proof of the heritability of idiocy
There are probably many on the Left who would like to jump on these findings as an explanation for everything from the high numbers of Americans who believe in creationism or deny climate change to the continuing popularity of the Republican Party among people devastated by its economic policies.

The problem is that this study and all others pointing to a causal relationship between birthrates and IQ are seriously flawed. Indeed, even the claim that IQs are declining is suspect.

Alfred Binet
Let’s start with the fact that the IQ test, developed by Alfred Binet in France, wasn’t even created until 1905, four years after Victoria’s death, making it exceedingly difficult, if not impossible, to make a valid comparison of Victorian and modern Westerners’ IQs. Furthermore, the original test emphasized memorization, vocabulary and questions about appropriate behavior, none of which has much to do with intelligence. Even modern versions of the test contain some of these types of questions. IQ tests also tend to have a class bias, as well as cultural and linguistic biases (e.g., questions about appropriate behavior depend on one’s cultural background).

To address this problem, te Nijenhuis used proxies for intelligence, comparing a variety of different tests for which data do exist going back to 1884. However, he chose a very weak proxy, reaction time, which he presumed tracks intelligence because it reflects a person's mental processing speed. Yet it is not necessarily true that a person with a quick visual reaction time also has quick mental processing for math, puzzles or other types of problem solving.

Another problem with the research is that it lacked valid controls, calling into question the validity of the comparisons. Supposedly the Victorian and modern experiments used a similar test for reaction times, but they used different instruments for measuring the results. Thus, the average late 19th-century reaction time of 194 milliseconds might actually have been much closer to, or even slower than, the average 2004 reaction time of 275 milliseconds had researchers used the same equipment and methods.
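The point about mismatched instruments can be made concrete with a small simulation. This is only a sketch with invented offsets, not a model of the actual Victorian or modern apparatus, but it shows how two identical populations can appear about 80 milliseconds apart when each era's instrument carries a different systematic bias:

```python
import random

random.seed(1)

def sample_times(true_mean_ms, n, sd_ms=40.0):
    """Draw n simulated reaction times (ms) from a normal distribution."""
    return [random.gauss(true_mean_ms, sd_ms) for _ in range(n)]

def mean(xs):
    return sum(xs) / len(xs)

# Suppose both eras had the SAME true mean reaction time of 250 ms.
victorian_true = sample_times(250, 500)
modern_true = sample_times(250, 500)

# Hypothetical instrument biases (illustrative numbers only): an older
# apparatus that under-records by 55 ms versus a modern one that
# over-records by 25 ms.
victorian_recorded = [t - 55 for t in victorian_true]
modern_recorded = [t + 25 for t in modern_true]

print(f"Victorian recorded mean: {mean(victorian_recorded):.0f} ms")
print(f"Modern recorded mean:    {mean(modern_recorded):.0f} ms")
# The ~80 ms gap here is produced entirely by measurement bias,
# not by any difference between the two populations.
```

Without calibrating the two instruments against each other, there is no way to tell how much of the reported 194 ms vs. 275 ms gap is real and how much is measurement artifact.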
 
1920s pseudoscientific image trying to connect brain types to criminality
Bad Science in Service to Race and Class Prejudice
The researchers also used data collected by Francis Galton (Darwin’s cousin), who coined the term eugenics. Eugenics included the idea that poor people were poor due to their inferior intelligence, presumed to stem from their “bad” genes, while the affluent were wealthy due to their good genes. This pseudoscience was used to justify government interventions promoting or limiting birthrates among different races and social classes; forced sterilizations in many countries, including the U.S.; infanticide; and genocide, as practiced by the Nazis. Thus, Galton brought a significant bias to his research, specifically an experimenter’s bias (i.e., observing what you expect, rather than what actually occurs). In Galton’s case, he would have expected white and affluent people (who also had smaller families) to be smarter, and could have inadvertently designed tests that gave him these results.
 
Image from Wikipedia, based on Galton's Ideas
Te Nijenhuis’s research suffers from some of these same problems, particularly the presumption that intelligence is essentially a heritable trait (i.e., passed through the DNA), a presumption still shared by a large number of scientists as well as the lay public, despite a lack of credible data to support this idea (more on this below). However, his race and class prejudices also come out in his belief that “high-IQ people are more productive and more creative,” and in his nostalgia for the flourishing of creativity and brilliance of the Victorian era. He uses the term dysgenics in his work, a term often associated with the eugenics movement thanks to the work of Richard Lynn, who argued in his 1996 book Dysgenics: Genetic Deterioration in Modern Populations that human genetic health was declining because criminals have higher birthrates than the rest of the population (there is no evidence they have higher birthrates, and it is unlikely that criminality is heritable), leading many, including Lynn himself, to renew calls for eugenic policies.
 
Many states had similar sterilization laws, resulting in 10,000s of forced and voluntary sterilizations in the 20th century
Genetics is Not Destiny
While large numbers of scientists and the lay public believe that intelligence is highly heritable, there is no conclusive evidence for this. Indeed, estimates of the heritability of IQ range from as low as 40% to as high as 90%, suggesting that intelligence is at least partly, and possibly quite significantly, influenced by factors other than DNA. Part of the reason there is so much controversy over the degree to which intelligence is heritable is that no genes for intelligence have been positively identified (though recent research has located positions on certain chromosomes where some genes related to intelligence might be located).

Intelligence and IQ, like most phenotypes (traits), are influenced not only by DNA, but by environmental influences and sometimes even by random events that occur during development. ABO blood type, for example, is 100% heritable, meaning that it is determined entirely by the DNA inherited from the parents and no environmental factors influence it. Human height is around 94% heritable. However, even a heritability as high as 94% is not sufficient to presume a simple cause-and-effect relationship between DNA and a particular phenotype. A person with tall parents could easily wind up short without access to a diet rich in protein and calcium. Indeed, most stereotypically short ethnic groups come from regions of the world with high levels of malnutrition, where protein and calcium are relatively scarce. Similarly, average human heights in Western Europe and the U.S. have increased 4 inches over the past 150 years, according to Scientific American, most likely because of improvements in childhood nutrition over that period.
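A toy simulation (with invented numbers) shows how a trait can be highly heritable within each generation while its population average still shifts for purely environmental reasons, exactly as with height and nutrition:

```python
import random

random.seed(7)

def mean(xs):
    return sum(xs) / len(xs)

def corr(xs, ys):
    """Pearson correlation coefficient."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def simulate_cohort(env_boost, n=2000):
    """Children inherit their parents' genetic height variation plus a
    purely environmental boost (e.g., nutrition) shared by the cohort."""
    parents = [random.gauss(67, 2.8) for _ in range(n)]
    children = [p + env_boost + random.gauss(0, 0.9) for p in parents]
    return parents, children

p_old, c_old = simulate_cohort(env_boost=0)  # 19th-century nutrition
p_new, c_new = simulate_cohort(env_boost=4)  # modern nutrition (+4 in)

print(f"parent-child correlation, old cohort: {corr(p_old, c_old):.2f}")
print(f"parent-child correlation, new cohort: {corr(p_new, c_new):.2f}")
print(f"mean gain between cohorts: {mean(c_new) - mean(c_old):.1f} in")
# Parent-child correlation is very high within BOTH cohorts, yet the
# entire 4-inch gain between cohorts comes from the environment.
```

This is the trap in reasoning from heritability to destiny: heritability describes variation within a population under its current conditions, and says nothing about what happens when the conditions change.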

There are many environmental factors that influence learning, memory, and even reaction time. Memory and reaction time, for example, can be improved with certain exercises and practice. Exposure to high levels of stress can impair memory and learning due to overexposure to the stress hormone cortisol (see here, here and here). How parents communicate with infants and children can influence the size and depth of their vocabularies (see here and here), which can influence how they comprehend phenomena and their ability to solve problems. Malnutrition and hunger can lead to cognitive impairment (see here, here and here).

Another problem with te Nijenhuis’s findings is that low-IQ parents, while they may have larger families, do not necessarily produce low-IQ children (Rodgers, J. L., Cleveland, H. H., van den Oord, E., & Rowe, D. C., “Resolving the debate over birth order, family size, and intelligence,” American Psychologist, Vol. 55(6), Jun 2000, 599-612).

Confusing Correlation With Causation
Despite the fact that te Nijenhuis is a scientist, he apparently has difficulty distinguishing between correlation and causation. There is considerable evidence that populations with higher IQs have lower birthrates. Thus, IQ and birthrate are negatively correlated (i.e., as one goes up, the other declines). However, this is not evidence that one causes the other. Rather, both could be products of one or more other causes, or the correlation could simply be a coincidence.

Social class also correlates with both birthrate and intelligence. Wealthier women tend to have fewer babies. There are several logical explanations for this: affluent women often delay motherhood to pursue college and career, while poor women may have children earlier and more often because children can help with farm work and provide care in old age.

Wealthier people, in general, also tend to have higher IQs. However, this may have far more to do with environmental and social factors (e.g., access to better nutrition and healthcare, better quality schools, being read to more often as babies and toddlers, less stress, greater access to enriching extracurricular activities like travel abroad, summer school and camps) than with genetics. Indeed, two studies done in Texas and Minnesota seem to support this. According to the studies, the correlations in intelligence between mothers and their biological children were not only quite low (0.20 and 0.34, respectively), but not much different from the correlations between mothers and their adopted children (0.22 and 0.29, respectively), suggesting that social and environmental factors likely had a greater influence on children’s intelligence than the genetics of their mothers (Richard Lewontin, Not In Our Genes). In other words, intelligent people may very well be intelligent more as a consequence of their social class privileges than their parents’ genes.
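A short simulation makes the confounding argument concrete. In the model below (all coefficients invented for illustration), social class independently nudges measured IQ up and birthrate down; IQ and birthrate never act on each other, yet they come out negatively correlated:

```python
import random

random.seed(3)

def mean(xs):
    return sum(xs) / len(xs)

def corr(xs, ys):
    """Pearson correlation coefficient."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Social class as a lurking variable: higher class raises measured IQ
# (schooling, nutrition) and lowers birthrate (delayed motherhood).
# There is NO causal arrow between IQ and birthrate in this model.
n = 5000
social_class = [random.gauss(0, 1) for _ in range(n)]
iq = [100 + 6 * c + random.gauss(0, 12) for c in social_class]
births = [2.5 - 0.5 * c + random.gauss(0, 0.8) for c in social_class]

print(f"corr(IQ, birthrate): {corr(iq, births):.2f}")
# A clear negative correlation emerges with no direct causation,
# exactly the pattern a shared cause produces.
```

An observed correlation between IQ and birthrate is therefore equally consistent with the story told here, where class drives both, as with te Nijenhuis's selectionist story.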

Friday, October 14, 2011

Are Boys and Girls Really That Different (Part II)?

In honor of the one-year anniversary of Modern School, I'm reposting some of my favorite articles from the past year. The following is a follow-up to the quiz in Are Girls and Boys Really All That Different?, which I recommend reading first (reposted yesterday).

Same Upbringing, Different Outcomes?

Kids in Gender Specific Costumes, by EpSos.de
Several friends have told me that boys and girls are inherently different from each other. Their evidence was that their own boys and girls came out so differently despite being raised in the same environment and in the same manner.

Even when the same parents raise two children in the “same” house and in the “same” manner, there are always slight differences in both. The furniture, decorations and toys change over time. Parents may be more stressed, confused, or overwhelmed early on, thus influencing how they interact with their first child. Grandparents, siblings, aunts, uncles, cousins, kids at daycare, television, videos, billboards, all broadcast gender images that babies start to pick up almost immediately.

My three-year-old son recently explained to me that certain colors were for boys, and others were for girls, despite our best efforts to raise him in a gender neutral manner. Even if no one actually told him that certain colors were for girls or for boys, he certainly could have developed this hypothesis himself from observing people on television, at day care, and in the community. When I asked him what would happen if a child wore the wrong color, he very astutely answered that their feelings might get hurt.

Biological Gender Differences Are Actually Quite Small
Many people believe that girls and boys are innately or genetically different. They accept the sociobiological assumption that biology or DNA is destiny, that traits as complex as gender could have an entirely (or primarily) genetic basis.

This simply is not true.

Numerous factors influence a child’s development, and the role of genetics is actually quite small. There is really only one significant genetic difference between girls and boys (the presence or absence of a Y chromosome), and even this can be ambiguous. For example, a person born with XY chromosomes, but with a damaged SRY gene, will develop female genitalia and grow into a woman.

XY Females, from Wiki Commons
The SRY gene is what confers maleness, turning on the appropriate genes at the appropriate times during development to tell the body to start producing male hormones and developing male traits like testicles and facial hair. Lack of an SRY gene is what produces femaleness: everyone starts out female by default unless an SRY gene turns on, triggering male hormone production and the development of testicles and a penis. In addition to SRY mutations on the Y chromosome, there are over 400 known androgen receptor mutations on the X chromosome that can cause XY individuals to grow into females due to insensitivity to male sex hormones.
 

There are numerous other genetic conditions that can result in ambiguous sexual differentiation. For example, someone who is born XXY has a condition known as Klinefelter’s syndrome and will usually have male genitalia, but may have some feminine secondary characteristics.
Klinefelter's Male from Wiki Commons

Even environmental factors, like pesticides and cosmetics, can influence sexual differentiation. Gynecomastia (breast development in males) has been linked to tea tree and lavender oils. There are also numerous anti-androgenic pesticides and industrial chemicals (e.g., phthalates) that may cause gynecomastia, or alter the onset and progression of puberty.

So What Makes Girls and Boys Different?
Genetic factors (XX vs XY) contribute to primary sexual differentiation (development of testes or ovaries) as well as secondary sexual differentiation (development of body hair, voice, breasts and hip size) and play an important role in gender. Sexual differentiation is what most people think about when considering whether a person is male or female.

However, gender is more than what you have between your legs. Gender is the combined effects of sexual differentiation, behavior, social expectations and self-identity. The most obvious illustration of this is transgendered people who feel that they were born into the wrong body. Their gender does not match their biology.

While there is some variability in sexual differentiation (as discussed above), the non-biological influences on gender have considerably more variability. Many of our expectations and assumptions about gender are socially constructed. For example, in our society, it is unusual for men to wear gowns or to walk platonically hand in hand, yet in Morocco these behaviors are both common. Likewise, there is no rule that girls should play only with dolls and boys only with footballs, yet this is how most children are raised in our society, thus reinforcing gender stereotypes and the impression that gender is biologically predetermined.

The Baby X Experiment
The original Baby X experiment was conducted by Dr. Phyllis Katz et al. at the City University of New York (CUNY) in 1975, with a single baby girl dressed in a yellow jumpsuit. There were three toys in the room: a football, a doll, and a teething ring. She was introduced to some adults as a girl. Others were told she was a boy. Some were not given any clues about her actual gender.

The experiment was repeated in 1980 using infants of both genders. Both studies had similar results. When told that the baby was a girl, adults tended to give the baby a doll to play with. When told the baby was a boy, they were more likely to give the baby a football to play with. When the baby’s gender was not specified, the adults tried to guess, using stereotypes like “She is friendly, and female infants smile more,” or “she is a girl because girls are more satisfied and accepting.”

Gender and Sociobiology
The popularity of sociobiological thinking is understandable with all the hype about the Human Genome Project and shows like CSI, which have made DNA and genetics seem sexy and hot. Sociobiology has also been intellectually legitimized and self-promoted by prominent and well-respected scientists like E.O. Wilson and Richard Dawkins. However, it is also a scientifically flawed oversimplification of social phenomena that has been abused throughout history by eugenicists, Nazis and other racists and supremacists.

The basic idea goes like this: all phenotypes (traits) are caused by proteins, which are synthesized according to genetic instructions. This is a deterministic reading of the Central Dogma of Molecular Biology (the one-way flow of information from DNA to RNA to protein). Over the past ten years, mounting evidence has shown this view to be a gross overgeneralization. As educators, we know that the phenotype of academic success is not based solely on a child’s intelligence, but also on socioeconomic factors that can influence cognitive and social development, health, resilience and motivation. Even intelligence is not based entirely on genetics. No gene for intelligence has been identified. Malnutrition, exposure to certain drugs and chemicals in utero, and pollution can all play a role in cognitive development.

Despite the protestations of the sociobiologists, children’s color preferences in clothing are influenced by social factors, and cannot be reduced to purely biological or genetic causes. This is true for many of their likes and dislikes, communication patterns, what they want to be when they grow up, and numerous other behaviors that typify maleness and femaleness.

Many parents agonize over whether to allow their little boy to experiment with feminine clothing and cooking, or their little girl to be tough and play sports, out of fear that they will be bullied (or because of their own homophobia). However, a more fundamental issue is whether or not we teach our children to be resilient, self-confident, and accepting of others, especially those who do not fit our stereotypes.

Saturday, April 9, 2011

Intelligent Design is For ‘Fraidy Cats


Scaredy Cats (Image by Paparutzi)
Some scientists just love to look for a biological explanation for everything. A recent blog post, Death, Science And Intelligent Design, by Jonathan Parkinson, looks at why people are so wedded to Intelligent Design (ID) despite the overwhelming lack of evidence for it.

Parkinson writes about a recent study in PLoS ONE arguing that ID's popularity is, in some cases, partly due to people's anxiety about their own mortality. Here is the experimental design: 122 undergraduate students were asked to think about and then write about either their own death or a painful visit to the dentist (the control group). Then they read a 174-word passage by evolutionary biologist (and anti-religion activist) Richard Dawkins summarizing the evidence for evolution, followed by a passage by Michael Behe, also 174 words long, summarizing the arguments in favor of ID. Students then rated the authors on a 9-point scale and ranked their own religious beliefs on a 10-point scale. The researchers repeated the experiment with several other groups, including 832 randomly selected Americans.


The results were intriguing. In four of the groups, students who were asked to imagine their own deaths had a statistically significant higher appreciation for Behe's arguments and ID compared to the control group, even after controlling for religiosity. However, for the one group of natural science students, appreciation for Behe/ID declined after imagining their own deaths.

Okay, now let’s discuss the problems with this study. First, the sample size was small for most of the groups studied, and the effects, while statistically significant, were also pretty small for some of the groups. Choosing a painful dental experience as the control treatment doesn’t make a whole lot of sense either; why not have a control group simply read the passages without writing any essay at all? The order in which they read the articles may also have created a bias: perhaps if they had read Dawkins last, they would have been more predisposed to his ideas. There are also a variety of variables that were not controlled (e.g., socioeconomic status, ethnicity, health status) that might have influenced the subjects’ belief in ID or their receptiveness to it. And lastly, Dawkins was probably not the best author to have them read, considering that he is antagonistic to religiosity.
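A quick Monte Carlo sketch illustrates why small samples and small effects make such results fragile. The effect size below is invented, not taken from the study; the point is only that the very same true effect flips between "significant" and "not significant" depending largely on sample size:

```python
import random

random.seed(11)

def mean(xs):
    return sum(xs) / len(xs)

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    ma, mb = mean(a), mean(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / ((va / len(a) + vb / len(b)) ** 0.5)

def run_experiment(n_per_group):
    """One simulated study: death-primed subjects rate the pro-ID author
    0.4 points higher on a 9-point scale (a hypothetical effect size)."""
    control = [random.gauss(5.0, 2.0) for _ in range(n_per_group)]
    primed = [random.gauss(5.4, 2.0) for _ in range(n_per_group)]
    return abs(welch_t(primed, control)) > 2.0  # rough 5% cutoff

results = {}
for n in (15, 60, 240):
    results[n] = sum(run_experiment(n) for _ in range(1000))
    print(f"n={n:3d} per group: 'significant' in {results[n] / 10:.0f}% of runs")
```

With small groups, most replications of a real but modest effect fail to reach significance, and the handful that do tend to overestimate its size, which is one more reason to treat the study's group-by-group results with caution.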


The authors conclude that support for ID is fueled by "existential anxiety," and that ID offers its adherents a sense of meaning and purpose while evolution does not. (Life science students ostensibly have found purpose and meaning in their search for rational explanations of natural phenomena.) This brings up another problem with the study: why should existential anxiety over death draw one toward ID, but not anxiety over pain, especially when death puts an end to pain, whereas dental pain could continue long after the experience, including pain in the pocketbook and the loss of the ability to enjoy one's meals? Of course, this may be too rational an analysis; the anxiety is probably more about people's lack of experience with death and their fear of the unknown.


Parkinson finds the study’s conclusions plausible, but insufficient, arguing that there are likely two additional factors that influence belief in ID. First, many people believe that the theory of evolution is incompatible with religious belief. Thus, if they are forced to choose between the two, the majority will choose the religion. Parkinson’s other factor is related to the limited imagination of humans and our tendency to use metaphors to understand complex phenomena. There aren’t really any good metaphors for evolution, nor is it easy to understand it based on everyday experiences, whereas ID is based on the anthropomorphic metaphor that life is too complex to have arisen spontaneously and must have been coordinated by an intelligent being.

While Parkinson may be correct, neither of his hypotheses really explains the results of the PLOS study. Why would thinking about death make some people (but not life scientists) more predisposed to ID than thinking about pain or dentistry? It is important to consider other possible explanations for these results. For example, perhaps life science students already have a predisposition against ID and perhaps they also have less existential anxiety about their own mortality. Perhaps they are just less fearful, in general, or have different coping mechanisms for dealing with their fears. Also, Parkinson’s last factor, that evolution is just plain difficult to understand, may be exacerbated by existential fear, at least for those who are susceptible to existential fear and who don’t already have a good grasp of evolution. There is also the question of whether existential anxiety predisposes people to religion in general, and not just religious explanations for the origins of life.
