The Dating Advantage Theory
A new study argues that the same genes that increase schizophrenia risk also make you a more attractive dating and mating partner. In particular, those genes, when expressed in the general population at levels insufficient to lead to clinically diagnosed schizophrenia, may enhance courtship-related traits such as "creativity, emotional sensitivity, and expressiveness."
(The study is Del Giudice M (2010) Reduced Fertility in Patients' Families Is Consistent with the Sexual Selection Model of Schizophrenia and Schizotypy. PLoS ONE 5(12): e16040. doi:10.1371/journal.pone.0016040).
As the article explains, one theory is that:
Schizotypy [i.e. genes that predispose a person to schizophrenia] . . . act[s] as an “amplifier” of individual differences in genetic fitness and condition: high-schizotypy individuals would be more likely to achieve outstanding mating success when they have high genetic fitness and/or grow up in good environmental conditions, but also more likely to develop schizophrenia (and to suffer reduced mating success) when they have low genetic fitness and/or grow up in poor conditions. Thus, schizotypy itself can be sexually selected, possibly more strongly so in populations characterized by high levels of mating competition and short-term mating patterns, where displays of good genetic quality are more critical to successful courtship [than traits associated with success in a long term committed couple relationship].
In other words, schizophrenia-related genes may be particularly adaptive in societies with "love marriage" as opposed to arranged marriage, because traits that predispose you to schizophrenia also make you someone who is fun to date. The fact that schizophrenia generally manifests clearly only towards the end of puberty or in early young adulthood also suggests that full blown schizophrenia may often be invisible until after marriage in societies that, unlike modern American society, do not favor historically late ages at first marriage. In other words, if a typical couple gets engaged in their late teens, the most negative aspects of schizotypy are more likely to be invisible.
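To make the "amplifier" idea concrete, here is a minimal numerical sketch in Python. All parameter values are illustrative assumptions of my own, not figures from the paper: it treats schizotypy as multiplying the effect of underlying condition on courtship displays, and assumes a "winner take most" short-term mating market in which only the top displays pay off.

import random

random.seed(1)

def courtship_display(condition, schizotypy, k=2.0):
    # Toy "amplifier": schizotypy multiplies the effect of underlying
    # genetic fitness/condition on the courtship display. k (illustrative)
    # is the amplification factor for high-schizotypy individuals.
    return condition * (1.0 + (k - 1.0) * schizotypy)

# 100,000 low-schizotypy (0) and 100,000 high-schizotypy (1) individuals,
# each with condition drawn from the same standard normal distribution.
people = [(random.gauss(0, 1), s) for s in (0, 1) for _ in range(100_000)]
displays = sorted(((courtship_display(c, s), s) for c, s in people), reverse=True)

# Under intense short-term mating competition, suppose only the top decile
# of displays achieves outstanding mating success.
top = displays[: len(displays) // 10]
print(f"High-schizotypy share of top decile: {sum(s for _, s in top) / len(top):.2f}")
# Prints roughly 0.85: amplification over-represents high-schizotypy
# individuals among the biggest winners, even though both groups have the
# same average display quality.

The same amplification that over-represents high-schizotypy individuals at the top of the mating market also over-represents them at the bottom, which matches the paper's dual prediction of outstanding mating success for some and schizophrenia with reduced mating success for others.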
This analysis generalizes a similar evolutionary explanation for the persistence of bipolar disorder, which presents evolutionary puzzles similar to schizophrenia's and is also very common in individuals with extraordinary creative output, a trait that is considered fitness enhancing in a society, at least at modest frequencies. There is some anecdotal evidence, for example in Logicomix, the graphic novel history of modern mathematics, that there may be a similar association between schizophrenia and extraordinary output in mathematics and logic.
The Undercounted Descendants Theory
The study also hypothesizes that schizophrenics, particularly men, may be more likely to have unacknowledged children outside of marriage, artificially suppressing their measured fertility rates: "This is not an unlikely possibility, given that schizotypal personality traits specifically predict increased engagement in short-term mating and casual sex." This hypothesis, which is powerful in traditional societies, may have less force in societies where contraception is widely used and effective in these kinds of relationships.
The greater fertility impact in men than in women, however, may also be explained by the fact that schizophrenia is more likely to manifest before men have had children than before women have had children, because men have an earlier onset on average and later peak reproductive years than women.
Limits Of The Methodology
The study is primarily an exposition of a mathematical model showing that the proposed explanation would be sufficient to explain the survival of genes associated with schizophrenia, given mating advantage effects within a plausible range, without actually measuring whether those advantages really exist. But the argument has considerable force despite its modest empirical grounding, because it is hard to find another explanation for the observed facts. If the model proposed in this paper is wrong, one has to find some other model of how a strongly hereditary trait that is strongly selected against in many respects has remained stable in population frequency.
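As a rough illustration of the kind of balance the paper's model formalizes (this is a deliberately simplified haploid sketch with made-up numbers, not the paper's actual model), one can ask how large a mating bonus for unaffected carriers would need to be to offset the fitness cost of clinical cases:

def carrier_fitness(b, d, s):
    # Mean fitness of carriers of a hypothetical schizotypy allele:
    # a fraction d become clinically schizophrenic (fitness 1 - s),
    # while the rest enjoy a courtship bonus (fitness 1 + b).
    return (1 - d) * (1 + b) + d * (1 - s)

def next_frequency(p, b, d, s):
    # One generation of haploid selection on the carrier allele.
    w = carrier_fitness(b, d, s)
    return p * w / (p * w + (1 - p))

# Illustrative assumptions: 7% of carriers become clinical (d) and clinical
# cases suffer a 50% fitness deficit (s). The allele frequency is stationary
# when carrier fitness equals 1, i.e. when b = d*s/(1-d), about 0.038 here:
# a mating bonus under 4% for unaffected carriers, a plausible magnitude.
d, s = 0.07, 0.50
b = d * s / (1 - d)
p = 0.10
for _ in range(50):
    p = next_frequency(p, b, d, s)
print(f"Carrier frequency after 50 generations: {p:.4f}")  # stays ~0.1000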
Persistent, fitness-reducing genetic traits very frequently have an aspect that is fitness enhancing, or was at some time in the past. The sickle cell anemia gene, which also confers malaria resistance, is the most commonly cited example of this kind of trait, but many strongly hereditary fitness-reducing conditions share that feature. A more recent, and at this point largely speculative, example (details behind a paywall) is the hypothesis that a key breast cancer risk gene (BRCA1) may also enhance intelligence, and that the cancer risks and cognitive benefits may both relate to the production of connective tissue proteins. The problem with this model in the case of schizophrenia, however, is that almost all of the genetic conditions of this type known so far have simple, single gene Mendelian inheritance patterns, which is not the pattern behind the strongly hereditary component of schizophrenia. Extensive research directed at the genetics of schizophrenia has pretty much ruled out such a simple cause, and it is much harder to develop a "some mutations good, many mutations bad" scenario in a more complicated genetic model. I am not aware of any hereditary condition that has been definitively established as having that kind of phenotype structure.
Breast cancer risk genes, like schizophrenia risk genes, are also notable for manifesting relatively late in life, which reduces their selective impact. "Schizophrenia occurs equally in males and females, although typically appears earlier in men—the peak ages of onset are 20–28 years for males and 26–32 years for females. Onset in childhood is much rarer, as is onset in middle- or old age."
Keep in mind that life expectancy at birth in the Roman Empire (and in much of the period before and after it) was twenty to thirty years, and that historically people had children young, so conditions whose negative fitness effects first appear in the later teens, early twenties, or later have had only slight selective effects until the last dozen or so generations in the West, and for half a dozen generations or so in much of the rest of the world. So both breast cancer risk genes and schizophrenia risk genes may have had far less of an impact on evolutionary fitness for 99%+ of their existence in the human genome.
Probably the weakest piece of the model is the significance it attaches to environmental conditions in determining whether schizophrenia will manifest, an influence that empirically seems to be quite small.
A Mutation Model
Perhaps the best alternative explanation to those suggested in the study would be a model similar to that advanced for autism, which sees the condition as the product of many different kinds of common mutations that all have the same phenotypic effect (perhaps because they all disrupt the same delicate neurological system). In the autism case, the evidence seems to suggest that a majority of all cases may be first generation mutations attributable to advanced parental age; many kinds of mental retardation are also due primarily to "first generation" mutations. Thus, there may be enough "first generation" schizophrenics to make up for a deficit in schizophrenic children of schizophrenics. The main pieces of evidence supporting this as part of the explanation are that schizophrenia, like autism, is associated with advanced paternal age, probably due to the increased frequency of new mutations in the children of older fathers, and that schizophrenia, bipolar disorder, and autism all seem to be associated more closely with the total number of atypical mutations (or subtypes of atypical mutations) in a person's genome than with any single mutation that individually accounts for a large share of cases.
Schizophrenia, bipolar disorder and autism may all be the "fevers" of mental health. In other words, they may be common observable symptoms because a multitude of underlying causes all disrupt the delicate system that causes each symptom (schizophrenia, bipolar disorder, autism and fever respectively) to manifest.
As long as mutation rates among parents in the general population are high enough to make up for the fertility deficits of schizophrenics (deficits that may be smaller than reported, given unacknowledged children of schizophrenic men, whose measured fertility rates are lower than those of schizophrenic women), schizophrenia can persist in the population at a steady prevalence despite being selected against.
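That steady-state logic is the classic mutation-selection balance, and the arithmetic is simple enough to sketch. The numbers below are illustrative assumptions, not estimates from the literature:

def equilibrium_prevalence(mu, s):
    # At mutation-selection balance, the inflow of new "first generation"
    # cases per generation (mu, the probability that a child of unaffected
    # parents is affected via new mutations) equals the outflow removed by
    # selection (s, the net fertility deficit, times prevalence P):
    #   mu = s * P  =>  P = mu / s
    return mu / s

# If new mutations made ~0.25% of each birth cohort schizophrenic, and the
# net fertility deficit were ~50% (less than the reported deficit, if some
# children of schizophrenic men go unacknowledged), prevalence would hold
# steady at ~0.5%, roughly the low end of commonly cited figures.
print(f"{equilibrium_prevalence(0.0025, 0.50):.3%}")  # 0.500%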
Mutual Compatibility Of The Theories, And Avenues For Further Investigation
Of course, all three of these theories are compatible with each other. The larger the proportion of "first generation" schizophrenics (something that should be possible to estimate with reasonable accuracy from the strength of the paternal age-schizophrenia relationship), and the larger the number of unacknowledged children that men with schizophrenia are having (which might best be estimated by combining the rate of unacknowledged children in the general population, for which there is better empirical data, with the surveyed extramarital sex frequency of schizophrenic men relative to that of a control group), the less pressure there is for the courtship trait advantages conferred by subclinical schizotypy to be large, and for subclinical schizotypy to be common in the general population (a frequency that can be estimated from the share of diagnosed schizophrenics with a family history of schizophrenia, and that is, given the literature, probably quite low but non-zero), in order to explain the condition's evolutionary persistence.
Similarly, if it takes multiple mutations to create symptomatic schizophrenia, and a significant share of schizotypy genes are "first generation," it is quite plausible that there is a significant rate of subclinical schizotypy in the general population, which produces descendants who inherit a combination of schizotypy genes that produces clinical schizophrenia. One way to test this hypothesis would be to look not merely at advanced parental age as a risk factor, but also to examine advanced male grandparental age as a risk factor. A mutation accumulation model with a threshold number of mutations required for an individual to be symptomatic would be supported, and could be quantified at an order of magnitude level, to the extent that advanced male grandparental age was a factor independent of advanced parental age. A study of advanced male grandparental age, of course, has the virtue of being something that can be cheaply, unobtrusively, and accurately determined from participants in existing studies.
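A minimal sketch of how that test might look, assuming a hypothetical per-subject dataset (the file name and column names below are invented for illustration):

import pandas as pd
import statsmodels.formula.api as smf

# One row per subject: case status (0/1), father's age at the subject's
# birth, and paternal grandfather's age at the father's birth.
df = pd.read_csv("schizophrenia_cohort.csv")  # hypothetical dataset

# If mutations accumulate across generations toward a symptom threshold,
# grandpaternal age should predict case status even after controlling for
# paternal age; a null grandpaternal coefficient would count against the
# threshold model.
model = smf.logit("case ~ paternal_age + paternal_grandfather_age", data=df).fit()
print(model.summary())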
Neurodiversity Implications
The "courtship advantage" model proposed in this study for schitzotypy genes, like the studies showing creativity advantages for sufferers of bipolar disorders, and analysis showing benefits in modern society from traits associated with the autism spectrum disorder all point to the importance of a neurodiversity model for thinking about human hereditiary cognitive differences.
If human hereditary cognitive differences are classified as flaws or diseases on one hand, and as enhancements on the other, there is a strong incentive for society and individual decision-makers to use eugenic approaches and gene therapy to wipe out the flaws from the human genome. But, if we recognize that some of the flaws are inseparable from cognitive traits that have value at some frequency (even if it is a low frequency) in our society, then efforts to wipe out genetic "flaws" in the population in the interest of improving mankind should be viewed with far greater skepticism.
6 comments:
I've read twice recently in your blog that life expectancy among Romans was 20-30 years. I think you are being misled by high child mortality and maybe also high slave mortality (probably the average life expectancy of slaves was about 20 years).
I am quite sure of having read elsewhere that these figures are highly distorted by elevated child mortality and that people who reached adolescence could expect to live almost as long as we do (maybe 60 years on average).
As for schizophrenia, I am persuaded that, regardless of genetic susceptibility factors, it is being dramatically increased by the modern medicalization of childbirth, especially anti-natural practices such as forcing women to give birth in most unnatural positions (advantageous for the surgeon), the abuse of drugs and cesareans in childbirth and, very especially, the separation of mother and child for hours after birth, all of which is very traumatic for children and may severely compromise the natural development of the mind.
My source for the life expectancy of Romans is the link in my first recent post on that subject, which finds something close to a consensus in the literature on that point, drawing on multiple lines of evidence that all reach the same conclusion. In addition to very high infant mortality, TB and general communicable illness deaths associated with malnutrition were important killers of young adults. Romans who survived their twenties often did live to near modern ages, as I noted in the first post on the subject. Most societies that are less medically modern, and particularly those with high rates of infant mortality, also have very high rates of maternal mortality in childbirth. A typical pre-modern society has an excess of older men relative to older women in times of peace (warfare can, of course, shift the balance the other way). Colonial America, for example, had far more widowers than widows (exactly the reverse of modern age/gender balances). The upper crust in Rome did live longer (as the source I referenced and quoted in the post notes), but even the most affluent Roman communities had life expectancies of only about 40 years. I've previously cited in a post the example of George Washington's extended family as an example of just how common early deaths were, even among the affluent, in societies that lack modern medical science.
The onset of schizophrenia in mothers often does come close in time to child bearing, as I have noted in a previous post, but the characteristic timing of schizophrenia onset in women (the most plausible neurobiological explanation of which is unrelated to child bearing) makes this more likely to be a largely coincidental factor, and the consistency of schizophrenia rates across places with very different childbearing practices, and over time periods when childbearing practices have changed a great deal, undermines this hypothesis. The extremely high level of hereditary influence on schizophrenia incidence also contradicts child birthing practices as an important factor for either mothers or their children. Schizophrenia rates are profoundly lower in people with no family history of schizophrenia than they are in people who do have that family history, even controlling for birthing method differences. A trend towards parenting by older fathers provides a more empirically supported basis for any uptick in schizophrenia rates that has been observed.
I'm not talking about mothers but about children. The traumas of medicalized birth may well cause epigenetic modifications which may favor schizophrenia and other mental disorders later in life.
I do not mean to disregard genetic factors, which do exist, but generally schizophrenia sets in in early adulthood. It's not an old-age disease like Parkinson's or most cancers, which have a low fitness cost: it is a highly disabling illness that especially affects young adults (all the people I know who have it developed it in their teens or twenties). So your claim that it may provide any reproductive advantage seems most odd, unless it is in their relatives with unexpressed genes.
I'd rather tend to see it as a genetically influenced disorder with low prevalence in normal conditions (i.e. Paleolithic or even Neolithic ones), aggravated by modern maternity practices, which trigger the disease in people who would otherwise have been "normal" (more or less).
Thanks for the clarification on Romans, I was not sure where I had read the youth mortality explanation.
All of these theories are valid, but the missing hole in all of this is Vitamin B3. Also known as Niacin. A deficiency is the cause of schizophrenia, bipolar disorder, depression, and anxiety. Our ancestors ate plenty of fish, which contain high amounts of niacin, so that they did not have this deficiency. I developed schizophrenia at 19 years old and cured it by supplementing with niacin every day for a month.
Sorry, Alex, but what you say doesn't make much sense. Niacin is not a vitamin found only in fish; its main dietary source may be liver, though there are a variety of sources, including peanut butter and common meats like chicken or beef - fish may be one source but not a necessary one. Where fish may be important is for vitamin D, especially among people with darker skin colors living at high latitudes: vitamin D is either synthesized in the skin by exposure to sunlight or obtained from fish or pharmaceutical supplements. Vitamin D is the most critical cause of visible evolutionary differentiation in humans, and that's because it is fundamental in early brain development (fetal and infancy), as well as affecting bone formation. And that's precisely because fish was not always available as part of the diet (otherwise depigmentation would not have been as necessary). I'd strongly recommend vit. D supplements (or a fish-rich diet) for mothers and children (and to a lesser extent other adults) of dark pigmentation living far away from the tropics. Its developmental deficiency can cause mild to severe mental problems, which AFAIK are not reversible.
I don't mean to question the veracity of what you say but the logic implicit in it. Vitamin B3 deficiency is well known medically and causes pellagra, which affects the skin and the brain but causes dementia rather than schizophrenia. Anyhow, in a meat-rich diet vitamin B3 should be plentiful, maybe even excessive: you get 5-6 mg per 100 g of beef, or slightly more from chicken, and the recommended dose is around 14-18 mg (= three small meat rations), depending on age and gender. Cereals, legumes, nuts and even green vegetables like broccoli also provide niacin in variable doses.
Alex you still have schizophrenia, taking vitamin b-3 for a month did not cure it lol... that was a delusional comment. i do them all the time