As the year of record unemployment and a stagnant economy comes to a close, it is fifty degrees colder here in Denver than it is in Buffalo, New York. In Colorado Springs, fireworks had to be cancelled because it was 25 degrees below on Pikes Peak with 70 mile an hour winds. People lost homes to foreclosures and fires. Low interest rates are irrelevant because no one seems to be lending. The crazy people have taken the reins in Colorado's House of Representatives and the United States House of Representatives.
Maybe we'll do better in 2011.
31 December 2010
Does Avoiding Alcohol Encourage Polygamy?
[I]f you are in a polygynous relationship in the developed world you are probably either of Mormon Fundamentalist or Muslim faith, both of which forbid the consumption of alcohol . . . [and] pre-industrial societies with polygyny as the dominant marriage institution consume less alcohol than those with monogamy.
From here (citing Squicciarini, Mara and Jo Swinnen, “Women or Wine? Monogamy and Alcohol.” AAWE Working Paper No. 75).
One theory is that industrialization favors both alcohol consumption and monogamy, but early LDS society seems a poor fit to this model, and both widespread use of alcohol and the demise of polygyny far pre-date industrialization.
Polygyny had virtually vanished in the West by the time of the late Roman empire. Alcohol consumption was rampant and exalted in the West by the time that the Mycenaeans took over in Greece, if not a millennium or more earlier. The Byzantines had polygyny at some level pretty much until their empire fell, as did the Ottomans that followed.
Judaism appears to have been polygynous at least through the early Iron Age (ca. 1000 BCE), but had probably ceased to be by the dawn of the Rabbinic period, ca. 70 CE, and it has accepted alcohol consumption to some extent, although not necessarily to excess, for all of its history.
Perhaps the more relevant point is that a prohibition on alcohol consumption is a measure of how powerful a grip a religion has on its participants, and hence, how much capacity it has also to influence marital practices of its members.
Falling Housing Prices and Failing Banks
Many pundits think that national housing prices haven't hit bottom yet. Estimates range from a further drop of 5% to a further drop of 30% in 2011, at which point they will have hit bottom.
I'm really not all that interested in national housing price numbers, because housing prices are the quintessentially locally determined economic indicator. National housing price trends are simply a noise-filled prediction about the trend in the largest individual housing markets. But they tell you little about trends in markets that mostly move at the metropolitan area and neighborhood level, and to a lesser extent at the level of a state or a region of a few adjacent states.
Places that have experienced housing bubbles without corresponding housing price collapses are most at risk. This makes me quite nervous about housing markets in places like the Northwest, and a little nervous about housing prices in parts of California and the Northeast where the bust may not yet have fully run its course, but not very concerned about places like Florida or Michigan, where the declines seem to have run their course already, or places like Denver and the American heartland, where there wasn't much of a housing bubble in the first place.
The case for all housing bubble collapses being local is supported by bank failure rates by state during the financial crisis. Commercial banks rely on mortgages for their core lending income, and commercial banks that are in trouble are disproportionately in states like California, Oregon, Washington, Florida, Arizona and Nevada that experienced housing bubbles. Georgia is an outlier by this measure, probably because many Georgia banks were exposed to risks in Florida real estate. (Colorado's bank failure rate has been roughly equal to the national average, but its share of troubled banks is 10th in the nation, below the markets with major housing price collapses but worse than most of the nation.)
Also, while bank failures are useful in assessing regional trends, I should once again praise the FDIC for limiting harm in this sector. While bank failures have shown a strong regional trend, the total numbers have been small everywhere. Just 4% of the nation's commercial banks have failed since the financial crisis (2008-2010), and their share of banking assets has been much, much smaller. Only six states (CA, FL, GA, NV, OR, WA) have seen more than 10% of their banks fail in this time period (again, the percentage of banking assets has been smaller), and none has seen more than 15% of its banks (with a much smaller percentage of banking assets) fail. Prior to the creation of the FDIC, deep recessions routinely caused half of all banks to fail, and more in the places that were hardest hit.
By comparison, none of the major investment banks in the United States, which are not FDIC insured, and almost none of the major mortgage finance companies in the United States, which similarly were not subject to FDIC-like regulation, survived the financial crisis as independent companies. Almost all either failed, or were acquired by more staid or foreign financial institutions, or reorganized as commercial banks regulated by the FDIC.
Navy Refuses To Choose, Buys Both LCS
The two winning LCS designs
Why Choose When You Can Have Both?
Rather than choosing a winner in a high-stakes contest between two competing Littoral Combat Ship designs for its next twenty ship purchases in the newly created ship class, the U.S. Navy has decided to buy ten of each. One winning design is a radical (for the Navy) trimaran design (the LCS-2 Independence class of the Austal consortium led by defense contractor General Dynamics), while the other is a more conventional naval ship design (the LCS-1 Freedom class of the Lockheed team).
The first of the ten new ships in each class will cost $430 million each. The next nine ships over five years will cost $789 million each. No, that makes no "economies of scale" sense, but this is the military, so one can't expect such things. The original budget was $220 million per ship, and the ships are both behind schedule. This is not cheap or timely, but no other ship class that the U.S. Navy is purchasing has a per unit cost of less than $1 billion per ship, and the least expensive buys involve 1980s destroyer designs with new electronics and selected weapons systems upgrades that fill mission niches that have been essentially unchanged since the late 1930s.
The split buy is supposed to save $2.9 billion over buying twenty ships of the same class, on the theory that ongoing competition between suppliers, with the potential to cease purchasing ships from a supplier who charges too much, will limit cost overruns. With relatively few new ship designs on the drawing board, a failure to win this competition would have threatened the continued existence of either contracting team as a ship builder.
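The arithmetic of the split buy can be sketched from the per-ship figures quoted above. This is back-of-the-envelope math using only the post's own numbers; the $2.9 billion savings claim is the Navy's, and its derivation is not reproduced here.

```python
# Illustrative arithmetic from the per-ship figures quoted above.
FIRST_SHIP_COST = 430e6   # first ship of each class
FOLLOW_ON_COST = 789e6    # each of the next nine ships, per class
ORIGINAL_TARGET = 220e6   # the original per-ship budget

def class_cost(first=FIRST_SHIP_COST, follow_on=FOLLOW_ON_COST, n_follow=9):
    """Total cost of a ten-ship buy of one LCS class."""
    return first + n_follow * follow_on

split_buy = 2 * class_cost()    # ten Freedoms plus ten Independences
target = 20 * ORIGINAL_TARGET   # what twenty ships were budgeted to cost

print(f"Split buy total:   ${split_buy / 1e9:.2f} billion")   # ≈ $15.06 billion
print(f"Original budget:   ${target / 1e9:.2f} billion")      # $4.40 billion
print(f"Overrun vs budget: ${(split_buy - target) / 1e9:.2f} billion")
```

Even before asking whether the split buy saves money relative to a single-class buy, the program as a whole runs well over three times its original budget.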
There are pros and cons to each design. The Independence team, for example, has touted greater fuel efficiency. But, ultimately, the Navy wasn't interested in analyzing these issues.
A New Class Of Ships
The ships are similar in size to frigates (the Freedom is 3,000 tons, the Independence is a couple hundred tons smaller), a ship class that has been phased out of the U.S. fleet. They are capable of achieving sprint speeds a little short of twice as fast as any existing ship class in the U.S. Navy, with shallow drafts allowing them to serve in shallower waters. They have smaller crews than comparably sized frigates (about 75, including 35 crew for a module, v. 200+ for past frigates) due mostly to increased automation, and they are supposed to be relatively inexpensive per ship so that they can be purchased in larger numbers and be "expendable" to some extent in strategic calculations.
The U.S. Navy is much more oriented towards "blue water" missions than any other navy in the world. In less charitable terms, the U.S. Navy is basically designed to fight World War II with a few Cold War inspired tweaks. But this hasn't been where much of the sea warfare action has been in the past half-century. While almost all other warships in the U.S. Navy in recent times (cruisers, destroyers, frigates and a good share of attack submarines) have primarily served as escorts for aircraft carriers or Marine helicopter carriers, these ships are intended to be used more autonomously.
They are designed so that they can be configured to fill multiple missions including anti-submarine warfare, anti-mine warfare, interdiction of shipping, anti-piracy, anti-small missile boat missions, serving as a mothership for a variety of drone classes (air, ship and submarine), humanitarian relief missions, and support of small units of ground troops with helicopters and artillery. Each has a helipad, some basic armaments, and room for mission modules and their crews. All of the mission modules, however, seem to be having technical difficulties of one kind or another at the moment that prevent them from being operational. Until the modules are perfected, the ships are simply glorified fast transport ships.
In order to fill new missions, the new Littoral Combat Ship class gives up the large complement of vertical launch cruise missiles, the torpedoes, the AEGIS comprehensive threat identification sensor and targeting system, the heavier duty hulls, and the more robust anti-missile and anti-aircraft capabilities associated with the Phalanx anti-missile guns and 3" to 5" naval guns found on other U.S. Navy ships, in favor of a 2" main gun plus heavy machine guns, grenade launchers, helicopters and a small anti-air missile system. Lockheed Martin is marketing an AEGIS equipped version for national missile defense missions to Gulf States, however.
In other words, the LCS is something of a navalized version of a large Coast Guard cutter.
It is likely that the first ships of the class will be outfitted for anti-mine warfare, because delays in LCS procurement have left the Navy with a critical shortage of ships capable of carrying out that dangerous but essential role, and for boarding party missions that don't require much in the way of specialized mission modules.
Criticism
The LCS has had a variety of criticisms directed at it. Like all major procurement programs, it is drastically over budget, behind schedule, and suffering serious technical difficulties, despite not being a particularly technologically ambitious project.
More fundamentally, the LCS is a ship designed by committee, and the committee can't even decide which version it really wants. As a jack of all trades, it may be a master of none. And, while modularity may be of use in providing flexible alternatives to buying new ships over the course of the next decade or two, few observers believe that real time module changes will make military sense, so the flexibility may provide little short term benefit.
The LCS is a compromise between those who wanted a larger, more capable craft, and those who wanted a smaller, faster, more agile craft, in the 300 to 600 ton range, with a more focused mission and smaller crew that would be suitable even for racing up foreign rivers.
Those who wanted a more capable craft have aired concerns about its lack of defensive capabilities. Smaller crews mean fewer resources available for damage control in a fight. Thinner hulls make the risk of serious damage from relatively minor hits greater. Even with its faster top speed, its large size puts it at a disadvantage against militarized speed boats in speed and maneuverability. None of its standard weapons are of much use at all against a submarine or full sized warship, or at ranges beyond line of sight, making it vulnerable to opponents with longer range weaponry.
Those who wanted a smaller ship also fear that the LCS will still be too expensive per ship and not sufficiently expendable to be a cost effective means of "fly swatting" individually small, but collectively troublesome, threats. It is still very expensive to spend $750,000,000 per ship to deal with $750,000 armed speed boats or $7,500,000 missile boats. A design concept for the LCS that insisted on each ship being able to support its own crew onboard for several weeks also prevented it from packing in all of the capabilities that could fit on an all-business military boat that did not have to offer even the spartan comforts of a cruise ship. Opposition forces deploying from coastal home bases in littoral waters don't face the same constraints.
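The cost asymmetry behind that "fly swatting" worry is stark even as rough arithmetic, using the round numbers quoted above:

```python
# Rough cost ratios from the round numbers in the post: one LCS versus the
# small craft it is meant to counter.
LCS_COST = 750_000_000
SPEED_BOAT_COST = 750_000
MISSILE_BOAT_COST = 7_500_000

# An opponent can field on the order of a thousand armed speed boats, or a
# hundred missile boats, for the price of a single LCS.
print(LCS_COST // SPEED_BOAT_COST)    # 1000
print(LCS_COST // MISSILE_BOAT_COST)  # 100
```

At a thousand-to-one exchange rate, losing even one LCS to a swarm of cheap boats is a ruinous trade.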
It isn't obvious that the LCS compromise struck is well suited to taking on swarms of small craft, even with modules, despite the fact that this is one of the key missions used to justify the new class of ships after this vulnerability was discovered in military exercises. Its heavy machine guns and grenade launchers may be too light to reliably defeat, at a reasonable range, the kind of light attack boats common in many world navies. It has no surface to surface missiles as standard equipment, and its 2" gun may be inadequate to take on a large number of militarized small craft over a wide area at once at a safe range. The LCS seems better suited to interdicting smuggling and piracy than it does to providing a screen defense against small craft as part of a larger fleet.
Is It A Good Idea?
My own view is that the LCS is a welcome addition to the U.S. Navy. It fills a variety of holes in military capabilities in the existing fleet and is the first in a new generation of more highly automated ships. It has also been handled in a way that promotes creative military thinking rather than squashing it.
Time will tell if the U.S. Navy really needs more than twenty of them in the current designs, or if there are other more urgent niches to be filled by more purpose built small warships that the LCS doesn't fill well. But the fact that the LCS won't be ideal for all of the missions that have been considered for it doesn't mean that there aren't twenty ships worth of missions, like shallow water anti-submarine warfare, anti-piracy, interdiction of shipping, and anti-mine warfare, that it can fill more efficiently than current resources can.
Its speed is something of a frill. An armed helicopter is five to ten times as fast as any surface craft, and fighter aircraft and cruise missiles are faster still.
The truth that the Navy is reluctant to admit, given that it has a force more heavily concentrated in large surface combatants than any other navy in the world, is that large surface combatants are exceedingly vulnerable to any sophisticated opponent, and constitute an unacceptable loss when they suffer even a single disabling defeat (whether or not the ship actually sinks).
Exercise after exercise, and the small number of real life naval military encounters we have seen in the last half century, have revealed that surface combatants are almost always outmatched by submarines and sophisticated heavily armed aircraft in naval warfare between warships. Sophisticated and powerful anti-ship missiles in the hands of the Russians, the Chinese and, probably soon enough, other countries that acquire the technology are very likely to be successful against U.S. ships at least some of the time, even if they can usually be defeated by defensive weapons. In modern warfare, defending a large, slow target that can't duck or jump, that can't hide effectively from radar and satellites, and that is counting on knocking out bullets with bullets to survive unscathed is not a winning proposition against any sophisticated opponent willing to devote enough resources to destroy a particular ship.
Realistically, the only strategy with which the U.S. Navy can hope to prevail in naval warfare against a sophisticated opponent is to strike naval opponents with aircraft from distant launch points, long range missiles and submarines before friendly forces are close enough to be fired upon by their opponents. If seamen can see the enemy with their eyes, the battle has probably already been lost.
But the Navy generally, and the Littoral Combat Ship in particular, is useful, because the vast majority of the countries of the world do not have sophisticated naval forces, or theoretically viable anti-ship forces. Only half a dozen or so nations (Russia, the People's Republic of China, Iran, North Korea, Pakistan, and South Africa) are both plausible military opponents and sophisticated enough to be a real naval threat.
Most of the potentially hostile countries of the world have navies that consist of no more than a few frigates and a number of militarized boats with modestly powerful line of sight missile capabilities, and most do not have even a single attack submarine. In these cases, naval power can cut off a country's access to the rest of the world, can provide an unassailable base of operations for U.S. forces on land, and can serve as a potent symbol of U.S. willingness to engage a diplomatic situation with military force if necessary. The LCS is well suited to providing backup to forces of Marines deployed to Third World hot spots. It is a ship that makes more sense to deploy in the Red Sea to control Somali pirates, or near the Ivory Coast during a disputed succession fight, than it does to make a part of the defense of Taiwan against an attack from the People's Republic of China, or even to defend shipping and U.S. ships against a concerted small boat swarm attack in the Persian Gulf directed by Iran.
The Littoral Combat Ship is also particularly useful against disorganized, non-governmental operations of isolated terrorists, pirates and smugglers at sea, and allows the U.S. to cover much more territory than a strategy based on large naval ships operating together in carrier groups ever could.
We may need a new design for a ship to defend shipping and U.S. fleets against swarms of small missile boats. But if the LCS can plug most of the other holes in the U.S. fleet's capabilities (another hole it won't plug is missile defense), then it has served a useful purpose.
The LCS is also consistent with a combined military and diplomatic strategy that attempts to reduce, through conventional naval arms treaties rather than increased U.S. Navy fleet capabilities, the cost of U.S. efforts to maintain a large conventional navy to deal with a small number of sophisticated potential opponents, most of whom must be handled with special care in any case because they also have nuclear weapons. But any such treaty would probably not restrict ships in the class of the LCS.
Colorado Dems Looking For New State Chair
Pat Waak, the incumbent chair of the Colorado Democratic Party, has decided not to run for a fourth term of office, leaving the race for the post open, with no clear front runner or even announced candidates.
The state chair is generally the most frequently quoted spokesperson for Democrats in the state who does not hold an elected public office, giving the holder of the post a bully pulpit, and also has considerable informal influence over the recruitment of candidates for open partisan seats in the state. The state chair is also charged with making sure that the caucus process leading up to and including the state convention, and the coordinated campaign Get Out The Vote process, have proper leadership, and that everyone else in the organization is doing their jobs in compliance with state law. Finally, the chair has an important say in the party's litigation strategy regarding election law related issues.
In years such as the four years to come (at least), when there is a Democratic Governor, however, the party chair's symbolic role can be reduced, because the Governor can assume the focal point role in the party.
The reality of modern campaign finance and the structure of the party is that the state chair has very little in the way of discretionary funds to spend, immense and costly state law mandates, and little influence over who is actually selected to be a nominee in races where there is more than one serious candidate within the party.
While the state chair position is higher profile than the host of thankless state and county party officer and committee positions, it involves a great deal of work, travel and people skills, while imparting only a modest amount of influence that comes with threefold more criticism. But, at its best, a party chair can guide the party in the right direction.
Make It Official
The Tea Party faction is the reason that the Republican party made as much progress as it did in the 2010 midterm elections at both the state and federal levels. Their candidates are putting their stamp on the GOP at the legislative party level.
It would be an opportune time for Republicans to officially change their party's name. After all, this is no longer the party of Lincoln. The strongholds of the modern Republican party are overwhelmingly places that shunned the GOP for a century because it was the party of the North in the Civil War and Reconstruction. They are the direct political descendants of the Dixiecrats and the segregationists. A significant share of Republicans still think secession is a good idea, and many would like to repeal all or most of the Civil Rights Act. The Republican party was invented as a party in favor of a strong federal government, yet the modern Republican party is the party of states' rights. Lincoln and other early Republicans were in favor of constitutionally questionable taxes.
Why should Republicans continue to carry all that historical baggage that causes them such cognitive dissonance? Why not admit that there is nothing "Grand Old" about their party and that they have nothing whatsoever in common with the pro-tax, anti-states rights, socially liberal, culturally Yankee party that the Republicans were for their first couple of generations? Why not cast all that aside and replace it with the mantle of the Tea Party?
This would leave the people who put them in power, who happen to care a lot about symbolism, tickled pink. And rebranding every once in a while is often a good thing for just about any organization. It would also be more honest, and, crazy as it seems, a little honesty in politics, even once in a while, is appreciated by just about everyone. So, make it official. For 2011, the major political party that is not the Democratic party in the United States should cast aside the "Republican" and "GOP" names and become the "Tea Party".
Then, maybe, the Democratic party could follow suit and rename itself, honestly and in a way that reclaims a label that many in the base would like to rehabilitate, the "Liberal Party."
30 December 2010
Color and Language
What could be more fundamental than primary colors? You learn them in pre-school. Almost everyone on Earth (apart from the colorblind) has three different kinds of cone cells which analyze photon wavelength (it actually takes three or four photons for a cell in your eye to register) and communicate that color to your brain.
But, for all practical purposes, photon wavelength is a continuous property. To communicate, we must assign words to ranges of photon wavelength. And, it turns out, there is not a universal way to translate the photon frequencies that our eyes see into words. One of the more notable examples is that not all languages have separate primary color words for blue and green.
Neither Basque, nor the Celtic and Scandinavian languages, made that distinction in their archaic texts, although it exists now, and some of those languages divide the blue-green part of the spectrum into primary colors at a different boundary. Russian, Italian and Hebrew all have three primary colors in the part of the color spectrum that English allocates to blue and green. Many African languages, and archaic forms of East Asian languages, have a single primary color word that includes both blue and green.
The process of distinguishing more kinds of colors linguistically does tend to follow a common pattern in the great majority of languages, however. Languages tend to distinguish blue from green before they distinguish red from pink, for example. Also, it appears that languages tend to gravitate towards making more, rather than fewer, primary color distinctions over time, and that finer color distinctions seem to be a frequently borrowed language feature.
The examples above are not random. One point that they illustrate is the fact that there are a wide variety of different patterns of color naming even between relatively closely related language families in the European branch of the Indo-European languages. Two of the most closely related Indo-European language families, for example, are the Celtic languages and the Italic languages, which share many isoglosses (i.e. they share the same linguistic feature in areas where related languages can be grouped into two or more categories based on their linguistic features). Yet, Celtic languages and Italic languages, which probably started to diverge from each other linguistically in the Iron Age (i.e. after 1200 BCE), have different words for describing colors.
This is notable because commonly used, simple words of the kind one learns in pre-school tend to be the least prone to random linguistic drift or evolution. The less often a word is used, the more likely it is to evolve over time. In the Indo-European languages, for example, the root words for core simple vocabulary tend to be much more likely to match proto-Indo-European terms than advanced words that are used less frequently. Yet, primary color words are very common and used frequently.
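The blue/green point can be made concrete with a toy sketch: the visible spectrum is continuous, but each language partitions it into discrete basic color terms, and the cut points differ. The wavelength boundaries below are rough illustrative values, and the single-term "grue" language is a hypothetical composite, not data from any particular language.

```python
# Toy illustration: two different partitions of the visible spectrum into
# basic color terms. Boundaries in nanometers are rough, illustrative values.

ENGLISH_TERMS = [      # (upper wavelength bound in nm, term)
    (450, "violet"),
    (495, "blue"),
    (570, "green"),
    (590, "yellow"),
    (620, "orange"),
    (750, "red"),
]

GRUE_TERMS = [         # a hypothetical language with one blue/green term
    (450, "violet"),
    (570, "grue"),     # a single basic term spans what English splits in two
    (590, "yellow"),
    (620, "orange"),
    (750, "red"),
]

def color_term(wavelength_nm, partition):
    """Map a continuous wavelength onto a discrete color term."""
    for upper_bound, term in partition:
        if wavelength_nm <= upper_bound:
            return term
    return "out of visible range"

# 480 nm and 520 nm are different colors in English, the same in "grue":
print(color_term(480, ENGLISH_TERMS), color_term(520, ENGLISH_TERMS))  # blue green
print(color_term(480, GRUE_TERMS), color_term(520, GRUE_TERMS))        # grue grue
```

The physics (the wavelengths) is identical in both cases; only the vocabulary that carves it up differs, which is exactly the blue/green phenomenon described above.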
But, for all practical purposes, photon wave length is a continuous property. To communicate we must assign words to ranges of photon wave length. And, it turns out, there is not a universal way to translate the photon frequencies that our eyes see into words. One of the more notable examples is that not all languages have separate primary color words for blue and green.
Neither Basque, nor the Celtic and Scandinavian languages, made that distinction in their archaic texts, although it exists in them now, and some of those languages divide the colors at the blue-green boundary into primary colors in a different way. Russian, Italian and Hebrew all have three primary colors in the part of the color spectrum that English allocates to blue and green. Many African languages, and archaic forms of East Asian languages, use a single primary color word that places blue and green in the same color category.
The process of distinguishing more kinds of colors linguistically does tend to follow a common pattern in the great majority of languages, however. Languages tend to distinguish blue from green before they distinguish red from pink, for example. Also, it appears that languages tend to gravitate towards making more rather than fewer primary color distinctions over time, and that finer color distinction seems to be a frequently linguistically borrowed language feature.
The examples above are not random. One point that they illustrate is that there are a wide variety of different patterns of color naming even between relatively closely related language families in the European branch of the Indo-European languages. Two of the most closely related Indo-European language families, for example, are the Celtic languages and the Italic languages, which share many isoglosses (i.e. shared linguistic features whose distribution divides related languages into groups). Yet, the Celtic languages and the Italic languages, which probably started to diverge from each other linguistically in the Iron Age (i.e. after 1200 BCE), have different words for describing colors.
This is notable because commonly used, simple words of the kind one learns in pre-school tend to be the least prone to random linguistic drift or evolution. The less often a word is used, the more likely it is to evolve over time. In the Indo-European languages, for example, the root words for core simple vocabulary tend to be much more likely to match proto-Indo-European terms than advanced words that are used less frequently. Yet, primary color words are very common and used frequently.
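The core idea in this post - that a continuous wavelength spectrum gets carved into discrete color words differently by different languages - can be sketched in a few lines of Python. The boundaries below are rough illustrative values, not linguistic or colorimetric data, and the "grue" lexicon is a hypothetical stand-in for a language that does not split blue from green:

```python
# Sketch: two languages partition the same continuous wavelength
# spectrum (in nanometers) into primary color words differently.
# Boundaries are rough illustrative values, not measured data.
ENGLISH = [(380, "violet"), (450, "blue"), (495, "green"),
           (570, "yellow"), (590, "orange"), (620, "red")]

# A hypothetical language that uses one word ("grue") for blue + green.
GRUE_LANG = [(380, "violet"), (450, "grue"),
             (570, "yellow"), (590, "orange"), (620, "red")]

def color_word(wavelength_nm, lexicon):
    """Return the color word whose band contains the wavelength."""
    word = lexicon[0][1]
    for lower_bound, name in lexicon:
        if wavelength_nm >= lower_bound:
            word = name
    return word

# The same photon gets different words in different lexicons.
print(color_word(480, ENGLISH))    # blue
print(color_word(480, GRUE_LANG))  # grue
print(color_word(520, ENGLISH))    # green
print(color_word(520, GRUE_LANG))  # grue
```

The point of the sketch is only that the underlying physical quantity is identical in both cases; what differs is where each lexicon draws its category boundaries.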
29 December 2010
Strikes At All Time Low In 2009, Close in 2010
I previously posted about the scarcity of strikes in the American labor movement through 2008. But, I hadn't seen the 2009 and 2010 figures then.
In 2009, there were just 5 work stoppages involving 1,000 or more workers, and they involved just 13,000 workers, compared to 15 stoppages involving 72,000 workers in 2008. The 2008 figures were an all-time low for as long as anyone has kept accurate figures. The 2009 figures are far lower - a third as many major strikes involving less than a fifth as many workers.
There have been 8 work stoppages involving 1,000 or more workers, involving 43,000 workers, through November in 2010 (half in the public sector and half in the private sector), which is more than in 2009, but is on track to be less than in 2008 and all prior years on record. No strikes involving 1,000 or more workers were in progress in the United States as December began, and I haven't noticed any in the news, although I could have missed one or two and there are still a couple of days left in the year.
I suspect that you would have to go back to before 1900 to find a year with that few work stoppages, even in absolute numbers, and one would probably have to go back well into the 19th century to find a level of work stoppages that low relative to the size of the labor force.
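As a quick arithmetic check, the year-over-year comparisons above, using only the figures quoted in this post, work out as follows:

```python
# Arithmetic check of the work stoppage comparisons, using the
# figures quoted in this post (stoppages involving 1,000+ workers).
stoppages = {2008: (15, 72_000), 2009: (5, 13_000), 2010: (8, 43_000)}

strikes_08, workers_08 = stoppages[2008]
strikes_09, workers_09 = stoppages[2009]
strikes_10, workers_10 = stoppages[2010]

print(strikes_09 / strikes_08)   # ~0.33: a third as many major strikes
print(workers_09 / workers_08)   # ~0.18: less than a fifth as many workers

# 2010's count runs only through November; annualizing over 11 months
# still leaves it below 2008's record-low 15 stoppages.
print(strikes_10 * 12 / 11)      # ~8.7
```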
28 December 2010
Number of Police Deaths Typical In 2010
Last year, 2009, was a year with a fifty-year low in the number of police killed in the line of duty, 117. Alas, 2009 was a fluke. In 2010, there were 160 deaths of police in the line of duty.
The number of police deaths has topped 160 five times since 2000, including 240 in 2001. The annual toll routinely topped 200 in the 1970s and before that in the 1920s.
What were the causes of the deaths in the line of duty?
Fifty-nine federal, state and local officers were killed by gunfire in 2010, a 20 percent jump from last year's figures, when 49 were killed. The total does not include the death of a Georgia State Patrol trooper shot in the neck Monday night in Atlanta as he tried to make a traffic stop. And 73 officers died in traffic incidents, a rise from the 51 killed in 2009, according to the data.
There were 28 other deaths in the line of duty in 2010 compared to 17 in 2009.
The numbers need to be considered relative to the total number of law enforcement officers.
There are as of 2006, 683,396 full time state, city, university and college, metropolitan and non-metropolitan county, and other law enforcement officers in the United States. There are approx. 120,000 full time law enforcement personnel working for the federal government adding up to a total number of 800,000 law enforcement personnel in the U.S.
Thus, over the course of a career a little less than 1% of police officers die in the line of duty, with a little less than half of those deaths consisting of traffic accidents.
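A back-of-the-envelope version of that calculation, using the 2010 death toll and the total officer count cited above (the 40-year career length is my own illustrative assumption, not a figure from any source):

```python
# Rough career-risk arithmetic for U.S. law enforcement officers.
annual_deaths = 160     # line-of-duty deaths in 2010 (figure above)
officers = 800_000      # total U.S. law enforcement personnel (figure above)
career_years = 40       # assumed career length, for illustration only

rate_per_100k = annual_deaths * 100_000 / officers
career_risk = annual_deaths * career_years / officers

print(rate_per_100k)    # 20.0 deaths per 100,000 officers per year
print(career_risk)      # 0.008, i.e. a little less than 1% over a career
```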
Those deaths are tragic, of course, and in the case of homicides, reprehensible and culpable. But, police officers are killed in the line of duty, and kill others in the line of duty (the annual number of justified killings by law enforcement is usually in the very low hundreds and is outnumbered by criminal homicides by roughly 100 to 1), far less often (thankfully) than a typical person's intuition based on fictional and media accounts would suggest. Something on the order of 97% of law enforcement officers, over an entire career, never kill anyone and are not killed in the line of duty. Most law enforcement officers never fire their firearms in anger in their entire careers.
Law enforcement is still a dangerous job (and in some times and places, such as contemporary Northern Mexico, Iraq and Afghanistan, it is very dangerous) and certainly requires officers to go into dangerous situations. The rate of occupational deaths in farming, the private industry with the highest rate of occupational deaths, is 4.5 per 100,000 per year, compared to about 20 per 100,000 for law enforcement. But, it is not nearly as dangerous as most people suppose it to be.
27 December 2010
Quick Science Hits
* The brain structure called the amygdala is closely associated with instinctual fear. But, that isn't all it does. A large amygdala is associated with "a rich and varied social life" among humans and closely related primates. It has been linked to the social deficits in autism and regulates emotions in social situations in coordination with other parts of the brain. It may play a key part in interpreting facial expressions and monitoring personal space. Atypical amygdala function has been associated with: borderline personality disorder, psychopathy, more severe social phobia, bipolar disorder, autism and schizophrenia. The role of the amygdala in interpreting facial expressions, particularly fearful ones, and in fear generally, appears to be implicated in many of these mental conditions, and the left amygdala seems to be implicated more often than the right amygdala.
* "[A] single DNA change that blocks a gene known as HTR2B was predictive of highly impulsive behavior. HTR2B encodes one type of serotonin receptor in the brain. . . . "we found that the genetic variant alone was insufficient to cause people to act in such ways . . . Carriers of the HTR2B variant who had committed impulsive crimes were male, and all had become violent only while drunk from alcohol, which itself leads to behavioral disinhibition." From here via here.
* Eating fried fish is linked to strokes.
* Out of Africa migrations may have been via a wet Sahara rather than the Nile River Valley or Red Sea:
Analysis of the zoogeography of the Sahara shows that more animals crossed via this route than used the Nile corridor. Furthermore, many of these species are aquatic. This dispersal was possible because during the Holocene humid period the region contained a series of linked lakes, rivers, and inland deltas comprising a large interlinked waterway, channeling water and animals into and across the Sahara, thus facilitating these dispersals. This system was last active in the early Holocene when many species appear to have occupied the entire Sahara. However, species that require deep water did not reach northern regions because of weak hydrological connections. Human dispersals were influenced by this distribution; Nilo-Saharan speakers hunting aquatic fauna with barbed bone points occupied the southern Sahara, while people hunting Savannah fauna with the bow and arrow spread southward. The dating of lacustrine sediments show that the “green Sahara” also existed during the last interglacial (∼125 ka) and provided green corridors that could have formed dispersal routes at a likely time for the migration of modern humans out of Africa.
Highlights From An Interview Of Governor Ritter
DavidThi808 interviewed Colorado's outgoing Democratic Governor, Bill Ritter, for Colorado Pols and got some interesting tidbits. Here are some highlights.
Governor Ritter's official reason for not running again is the need to balance work and family, which is the default reason that every politician and political appointee gives for a decision to bow out of politics.
But, I think that I am hardly in the minority in seeing his frayed relationship with unions, a key constituency of the Democratic Party, as the real political reason for not running again. And, the truth of the matter is that he could have delivered more to unions simply by refraining from vetoing pro-union legislation that was passed on his watch. With Democrats in control of the General Assembly, he didn't veto many pieces of legislation, and the most controversial vetoes were of pro-union measures, at least one of which he had promised union supporters on the campaign trail that he would support, accompanied by unconvincing veto messages about the political process. But for those vetoes, it is my opinion that he would have been running for re-election in 2010 with broad based support from the Democratic party.
The interview didn't discuss juvenile justice or pardons, so we have no more insight on how Governor Ritter will address those issues in his final days in office.
[Remember Me For Improving Education, Expanding Access To Healthcare, Funding Transportation and the New Energy Economy]
[I] asked him what is his proudest accomplishment. He started off saying that they made the quality of life better under very difficult circumstances. . . . He talked about the package of education policy bills that have been passed and how that is of dramatic importance for the future of our state, especially to address the drop-out rate and achievement gap. He next discussed healthcare policy, calling out in particular the healthcare availability act. He completed his list with sustainable transportation funding (FASTER). He then switched gears and discussed what he thinks the history books will say. He thinks history will remember his administration for changing the energy culture in this state. . . .
[He Needed To Be More Aware Of Union-Mgt Issues And Promise Less]
Next I told Governor Ritter he gets a time machine, but gets to go back 4 years for 10 seconds to tell Governor-elect Ritter one thing. What would it be? He immediately answered that he would tell himself to pay more attention to the relationship between labor and the business community. He then said he would go back 5 years, before the campaign started, and stop himself from over-promising where he was then not able to deliver. He later said that this was his biggest regret.
[The Republicans Put Politics Ahead Of The Public Good]
I then asked the Governor what was the biggest surprise over the last 4 years. He said it was how difficult it was to reach across the aisle to find common ground. He thinks a large part of that was that it was such a giant shock to the Republican party to lose so much ground since 2004 that they decided to focus on harming him politically as much as they could for electoral advantage rather than focusing on what is best for the state. . . .
[We Need To Invest In Higher Ed]
I asked what is the big issue Colorado will face in 20 years (assuming we are a green energy center and education is better). Governor Ritter replied that it is the notion "that we can get more for less money" (he's right - people who say that are lying!). The notion that we can get more services, more jobs, and at the same time shrink government. He said that yes, we need to always be fiscally prudent, but there are a number of things that would be better for the state that would cost money.
He went on to say that higher education is a good place to start. We are underfunding higher education and we cannot continue to underfund it without losing an edge. We're 5th in the country for jobs that require a college degree. Yet our most rapidly growing segment of the population is Latino/Latina and we're doing a lousy job providing them education. That we need to fund the programs that get people through K-12 ready for college, get them in to college, and get them to successfully graduate from college.
He went on to say "if we haven't figured this out 20 years from now, we'll be in real trouble." He says the people of this state have to figure out what they really want going forward. And they have to understand the impact higher ed has on the quality of life, economic development, etc.
I asked if the root problem is that a significant chunk of the populace doesn't care about the benefits higher ed brings, or if it's that people think they can keep taxes low and should be able to get the services they want. He replied both. First that people don't know, or that they haven't made the case to the people, about how key higher ed is to the future of this state.
Governor Ritter then said that an equal problem is the cynicism people have for the government. They look at the federal government, where the deficit spending and the debt to GDP ratio are worrisome. And that reflects on to the state government. And with that comes people's lack of trust in the government to do these things, and do them well. . . .
[Don't Legalize Drugs]
I asked him about the money we spend on prisons and should we treat drugs as a mental health issue instead of a criminal issue. Governor Ritter first talked about how he started the state's first drug court. But he then said we cannot legalize drugs. He then went on to say that 75% of violent crimes are committed because people are intoxicated. He then continued saying we have to continue to educate kids about the problems that come with drugs, we have to spend money on treatment, and you have to address those people who won't obey the law. But you cannot legalize it because if you do then drug use will become normative.
Mixed Preferences and Federalism
Inoljt at Daily Kos has made a nice post showing the regional breakdown of Colorado's Presidential vote on maps for the last five Presidential elections.
The really notable thing about it is that it underscores the extent to which Colorado politics at the state level is an eternal struggle between liberal forces in Boulder, Denver and Pueblo and conservative forces in Colorado Springs and Grand Junction, with a consistent and complex pattern of minor forces swaying the conflict elsewhere in the state.
Colorado is a poster child for the limits of federalism as a means of reconciling competing political ideologies. In Colorado, there is no way to neatly divide the state into a part that wants liberal policies and one that wants conservative ones. Secession and division are an impossible answer for reconciling competing political forces because the geographic mix of ideologies is too muddled to be partitioned with regard to genuinely state and federal level decisions. Concentrations of people with opposite political views live too close to each other to have different rules for each.
There is also no dominant political party in the state. No Presidential candidate in the last five elections has won with even 54% of the vote, and in 1992, the winner got less than 41% of the vote. We cast our votes for two Democrats and three Republicans in five elections, the most even divide possible, and the margin of the winner in the closest election a Republican won was just 1.4 percentage points. Ross Perot, who ran as a center-right candidate between the Republican and the Democrat in 1992 and 1996, received strong support here.
Our Congressional delegation has gone from red to blue and back again.
We are a state that is deeply entrenched in unavoidable political division and have no way to resolve it but to tussle it out as the electorate narrowly favors one party or the other, and to reach compromises. Neither party has any realistic hope of achieving a permanent majority here.
One of the easiest-to-use citizen initiative processes in the nation also means that there is always room for the wildcard of political actors who would have no hope of achieving their ends in the legislative process to bring their issues to the front of the public agenda.
We have intraparty conflicts, of course, but the center stage in Colorado is interparty conflict, as the political center in Colorado is very close to the political center of the nation as a whole, but without the comparatively simple geographical divides that we see nationally. The United States could rethink the decision it made in 1861, and neatly divide itself into one much more conservative country and one much more liberal one. Colorado doesn't have that option.
From the point of view of political theory, this makes Colorado a great place to look at the issue of how parties interact to win power in the face of a fickle, evenly divided public and very different visions of the ideal government and its legal regime. Colorado is a place where the formal political process really has to resolve lots of issues upon which there is no natural consensus. Since the process often works, it could even be called a good model for how to go about doing it.
The Next Bell Curve
New discoveries in genetics are forcing us to revisit issues of nationalism, race, cultural differences, eugenics, national genetic fitness and more that surged in the 19th century with tragic political consequences, based on weak science.
The Intellectual Repercussions of the Bell Curve
The most prominent recent echo of this old set of debates, although more subtle versions of them persist, particularly in conservative circles and in the hotbed of identity politics on the left, is the book "The Bell Curve," which argued that IQ is largely genetic, that IQs differ between nations and racial groups, and that as a result, all men are not created equal.
IQ is reasonably well defined and studied in the psychological literature, and it is real. But, its place in the nature-nurture debate is unsettled. Just as the variation in height within middle class people in the developed world has a very strong genetic component, despite the fact that we have very solid evidence that differences in height from time period to time period and from society to society have a strong environmental component, there is fairly good reason to think that population differences in IQ have a strong environmental component even though individual differences in IQ within a population in a healthy environment are mostly genetic. "The Bell Curve" spurred a strong academic rebuttal about the limitations of the folk concept of race as an intellectual construct and about the limitations of what we know about the hereditary sources of IQ in understanding the bigger picture. Liberal academia has since devoted considerable effort to making clear the cultural dimensions of race and ethnicity, the circumstances in which there is an environmental role in IQ (e.g. the Flynn Effect, the large role of environmental factors in determining IQ in deprived environments, and the role of breastfeeding in IQ development), and the incompleteness of IQ as a measure of fitness in modern society.
But, the kind of inquiry pursued first by 19th century racists and nationalists, and later in the Bell Curve, will recur, and with more rigor than its predecessors.
IQ measures a "phenotype" (i.e. an outward symptom, whatever its cause), while increasingly, the rise of inexpensive genomic testing will make it possible to measure genotypes (i.e. specific genetic patterns) in large populations. And, when measuring genotypes directly, one can describe genotype prevalence in populations without resorting to 19th century folk race classifications in circumstances where they aren't justified by genetic population clustering that isn't arbitrary or socially defined.
The Relatively Benign Issue Of Physical Traits
Some genotypes are very pertinent to fitness in modern society.
The physical differences that these genotypes code aren't all that explosive politically and socially, and indeed, may be widely useful.
For example, if we can identify the genotypes that account for most variation in height, we can tell whether an individual's stature development is atypical for a person with that genotype, and thus indicates a possible health problem, rather than comparing the individual to a population average, which muddies the diagnostic signal.
Pinning down genotypes for lactose intolerance can allow a university or prison food service program to distinguish between people who genuinely need a special diet for health reasons and those who simply don't like their choices for less weighty reasons.
Discovering a genotype associated with ability to metabolize alcohol may allow for fairer DUI prosecutions and help people to better know how much liquor they can handle safely in social situations.
Widespread genotyping will make insurability deeply problematic for some people with physical health issues, but health insurance reform has finally brought the United States into the era where the market is regulated in a way that does not tie insurance rates closely to actual risk of physical illness and makes health care universal, so this may not prove to be the real battleground of genetic privacy in the future.
The Explosive Issue of Mental Traits
But, the explosive traits will be the genotypes associated with personality, character, mental health and intelligence.
There is uncertainty here as well. Genotypes do not translate directly into phenotypes. Some interact with unknown genetic or epigenetic components. Some interact with environmental triggers. But, there will be inevitable oversimplification, and there are going to be genotypes that are simply less adaptive in almost all ordinary circumstances in modern society that matter socio-economically, even if there are niche circumstances when they confer benefits.
There are, and will be more, genotypes associated with crime risk, genotypes associated with bad economic decision making, genotypes associated with taking unsafe risks, genotypes associated with being paranoid or trusting, anxious or lazy, sharp or dull, and genotypes associated with marital strife.
Employers, spouses, clubs and educational institutions are all going to want this scientifically and impartially determined information about individuals and are going to want to act on it and will sometimes want to give it far more weight than it actually deserves because alternative measures are more expensive to make and more subjective.
"Bad" genotypes are inevitably going to be more frequent in some nationalities or ethnicities than others. And, the "bad" genotypes that are common in a given nationality or ethnicity are going to become associated with that nationality or ethnicity with scientifically unassailable certainty, even though individuals will, of course, vary. Debates about the meaning of neurodiversity aren't going to be limited to autistic young men in obscure Internet forums anymore. Stereotypes will be applied to groups without regard to individual differences and will be used to support invidious discrimination and doctrines of national and ethnic superiority.
The philosophical underminings of political and legal equality, and of the equal dignity of every human being morally is going to have to be divorced from any concept of ability that can be inferred from genotypes.
We are going to have to decide as a society how much of this is legitimate to consider, and how much is not and when. If the history of past efforts to deal with similar issues is any guide, efforts to maintain privacy will largely be overcome by "need to know" considerations. You may be able to keep your genome secret from your neighbor, but your casualty insurer, your employer and your prospective spouse will probably all manage to secure your consent to disclose it.
There will be strong temptations to employ eugenic methods and gene therapies to extinguish "bad" genotypes in the population. Individuals with "bad" genotypes may not get the benefit of the doubt in a wide variety of life changing decisions from educational choices, to criminal justice sentencing decisions, to employment and more.
Perspective
In some ways, genotyping is simply a refinement of the process of knowing an individual and knowing that individual's family that humans have used to judge each other for millenium, which a lower margin of error that requires less effort to apply. People have always been judged by their character, sexual selection and other forms of selection have always been at work based on those traits, and stereotyping has always had an element of population genetics at work in it when employed accurately.
The Founders who wrote the Declaration of Independence didn't think that "all men were created equal" in the sense of the kind of abilities relevant to track coaches, orchestra directors, college admissions officers and hiring officers, than people today do. They were thinking of equality under the law in contrast to the system that afforded aristocrats special treatment by virtue of hereditary or arbitrary royally granted privileges that prevailed in England, their colonial ruler.
The anti-hierachical doctrines of modern democratic political theory was a rationalist Englightenment effort to replace hereditary privilege and the rule of absolute monarchs, with meritocracy and the rule of law.
One can imagine a society that, burned by the insideous wrongs of racism and nationalism for the last four hundred years, eschews stereotyping based on ethnicity and nationality that represents only average trends in favor of meritocracy in which genotyping plays one part, no larger than it deserves to be, in sorting the fit from those less able to excel in modern society, and adjusts expectations for each accordingly.
If there is a genotype that makes someone prone to airplane accidents, we really don't want that person to be a pilot, if this may be unfair to the individuals who apply for the post by not giving them a chance to prove themselves.
Nobody was very morally concerns when the Jedis in Star Wars determined Anakin Skywalker's aptitude to become a Jedi knight with a blood test, and the movie's depiction of the multi-ethnic group of people so selected reflects the belief held by many Americans that truly fair and scientific, culturally neutral meritocratic testing would produce more diversity and room for social advancement than the status quo.
Fear of this development is well founded. But, not because it is inherently wrong to make decisions based upon genotypes. Instead, it is because we don't trust politicians and demagogues in the maelstrom of politics and social conflict to use that information rationally, scientifically, and in the manner in which the inferrences that can be drawn from them actually follow in a reliable way. Also, we fear this kind of development because we doubt the moral generosity of those who are fit to take responsibility for those in society who need help, when the two can be so clearly distinguished.
Elitism can lead to undemocratic authoritarianism which can lead to abuse of authority, and the more sacrosanct the legitimacy of that authority is, the more secure it is and the more freedom it has to abuse that authority. Claims to legitimacy based upon genetic fitness may be the next divine right of kings. Our meritocratic society, in a world where we know that many of the factors that go into merit have a strong hereditary component, already has begun to approximate that model.
But, politics is about both power and choice. Meritocracy helps leaders analyze the choices, but doesn't tell you who those choices should help or harm. Who benefits is a question of power, and the fear is that the ability to make choices will become too deeply entwined with the question of who decides who benefits from those choices. We need meritocrat leaders, but we need those leaders to look out for more than their own personal interests.
The Intellectual Repercussions of the Bell Curve
The most prominent recent echo of this old set of debates, although subtler versions of it persist, particularly in conservative circles and in the identity politics of the left, is the book "The Bell Curve," which argued that IQ is largely genetic, that IQs differ between nations and racial groups, and that, as a result, all men are not created equal.
IQ is reasonably well defined and well studied in the psychological literature, and it is real. But its place in the nature-nurture debate is unsettled. Just as variation in height among middle class people in the developed world has a very strong genetic component, despite very solid evidence that differences in height from one time period or society to another have a strong environmental component, there is fairly good reason to think that population differences in IQ have a strong environmental component, even though individual differences in IQ within a population in a healthy environment are mostly genetic. "The Bell Curve" spurred a strong academic rebuttal about the limitations of the folk concept of race as an intellectual construct, and about the limits of what we know about the hereditary sources of IQ, in understanding the bigger picture. Liberal academia has since devoted considerable effort to making clear the cultural dimensions of race and ethnicity, the circumstances in which environment shapes IQ (e.g., the Flynn Effect, the large role of environmental factors in determining IQ in deprived environments, and the role of breastfeeding in IQ development), and the incompleteness of IQ as a measure of fitness in modern society.
But, the kind of inquiry pursued first by 19th century racists and nationalists, and later in "The Bell Curve," will recur, and with more rigor than its predecessors.
IQ measures a "phenotype" (i.e., an outward trait, whatever its cause), while the rise of inexpensive genomic testing will increasingly make it possible to measure genotypes (i.e., specific genetic patterns) in large populations. And, when measuring genotypes directly, one can describe genotype prevalence in populations without resorting to 19th century folk race classifications in circumstances where they aren't supported by genetic population clustering that is neither arbitrary nor socially defined.
The Relatively Benign Issue Of Physical Traits
Some genotypes are very pertinent to fitness in modern society.
The physical differences that these genotypes code aren't all that explosive politically and socially, and indeed, may be widely useful.
For example, if we can identify the genotypes that account for most variation in height, we can tell whether an individual's stature development is atypical for a person with that genotype, and thus indicates a possible health problem, rather than comparing the individual to the population average, which muddies the diagnostic signal.
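To make the idea concrete: if a set of height-associated variants were known, a crude genotype-based prediction could replace the population average as the diagnostic baseline. A minimal sketch, in which the variant names, effect sizes, and tolerance are all invented for illustration:

```python
# Hypothetical genotype-based height baseline. All numbers are made up;
# real polygenic prediction uses thousands of variants and careful models.
BASELINE_CM = 170.0
EFFECTS = {"snp1": 2.0, "snp2": -1.5, "snp3": 3.0}  # cm per copy of each allele

def predicted_height(genotype):
    """genotype maps variant name -> copies of the effect allele (0, 1, or 2)."""
    return BASELINE_CM + sum(EFFECTS[snp] * copies for snp, copies in genotype.items())

def flag_atypical(observed_cm, genotype, tolerance_cm=8.0):
    """Flag growth as atypical relative to the genotype-specific expectation."""
    return abs(observed_cm - predicted_height(genotype)) > tolerance_cm

# A short person whose genotype predicts short stature is not flagged...
short_genotype = {"snp1": 0, "snp2": 2, "snp3": 0}   # predicted: 167 cm
print(flag_atypical(162.0, short_genotype))           # False: within tolerance
# ...but the same height with a "tall" genotype warrants a closer look.
tall_genotype = {"snp1": 2, "snp2": 0, "snp3": 2}     # predicted: 180 cm
print(flag_atypical(162.0, tall_genotype))            # True
```

The same observed height is unremarkable for one genotype and a diagnostic signal for another, which is exactly the gain over a one-size-fits-all average.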
Pinning down the genotypes for lactose intolerance could allow a university or prison food service program to distinguish people who genuinely need a special diet for health reasons from those who simply don't like their choices for less weighty reasons.
Discovering a genotype associated with the ability to metabolize alcohol may allow for fairer DUI prosecutions and help people better know how much liquor they can handle safely in social situations.
Widespread genotyping will make insurability deeply problematic for some people with physical health issues. But health insurance reform has finally brought the United States into an era in which the market is regulated in a way that does not tie insurance rates closely to actual risk of physical illness and makes health care universal, so this may not prove to be the real battleground of genetic privacy in the future.
The Explosive Issue of Mental Traits
But, the explosive traits will be the genotypes associated with personality, character, mental health and intelligence.
There is uncertainty here as well. Genotypes do not translate directly into phenotypes. Some interact with unknown genetic or epigenetic components. Some interact with environmental triggers. But, there will be inevitable oversimplification, and there are going to be genotypes that are simply less adaptive, in socio-economically important ways, in almost all ordinary circumstances of modern society, even if there are niche circumstances in which they confer benefits.
There are, and will be more, genotypes associated with crime risk, with bad economic decision making, with taking unsafe risks, with being paranoid or trusting, anxious or lazy, sharp or dull, and with marital strife.
Employers, spouses, clubs and educational institutions are all going to want this scientifically and impartially determined information about individuals, are going to want to act on it, and will sometimes want to give it far more weight than it actually deserves, because alternative measures are more expensive to obtain and more subjective.
"Bad" genotypes are inevitably going to be more frequent in some nationalities or ethnicities than in others. And those "bad" genotypes are going to become associated with those nationalities or ethnicities with scientifically unassailable certainty, even though individuals will, of course, vary. Debates about the meaning of neurodiversity aren't going to be limited to autistic young men in obscure Internet forums anymore. Stereotypes will be applied to groups without regard to individual differences and will be used to support invidious discrimination and doctrines of national and ethnic superiority.
The philosophical underpinnings of political and legal equality, and of the equal moral dignity of every human being, are going to have to be divorced from any concept of ability that can be inferred from genotypes.
We are going to have to decide as a society how much of this it is legitimate to consider, how much is not, and when. If the history of past efforts to deal with similar issues is any guide, efforts to maintain privacy will largely be overcome by "need to know" considerations. You may be able to keep your genome secret from your neighbor, but your casualty insurer, your employer and your prospective spouse will probably all manage to secure your consent to disclose it.
There will be strong temptations to employ eugenic methods and gene therapies to extinguish "bad" genotypes in the population. Individuals with "bad" genotypes may not get the benefit of the doubt in a wide variety of life-changing decisions, from educational choices, to criminal justice sentencing decisions, to employment and more.
Perspective
In some ways, genotyping is simply a refinement of the process of knowing an individual, and knowing that individual's family, that humans have used to judge each other for millennia, with a lower margin of error and less effort required to apply it. People have always been judged by their character; sexual selection and other forms of selection have always been at work based on those traits; and stereotyping, when employed accurately, has always had an element of population genetics at work in it.
The Founders who wrote the Declaration of Independence didn't think that "all men are created equal" in the sense of the kinds of abilities relevant to track coaches, orchestra directors, college admissions officers and hiring officers, any more than people today do. They were thinking of equality under the law, in contrast to the system that prevailed in England, their colonial ruler, which afforded aristocrats special treatment by virtue of hereditary or arbitrary royally granted privileges.
The anti-hierarchical doctrines of modern democratic political theory were a rationalist Enlightenment effort to replace hereditary privilege and the rule of absolute monarchs with meritocracy and the rule of law.
One can imagine a society that, burned by the insidious wrongs of racism and nationalism over the last four hundred years, eschews stereotyping based on ethnicity and nationality, which captures only average trends, in favor of a meritocracy in which genotyping plays one part, no larger than it deserves, in sorting the fit from those less able to excel in modern society, and adjusts expectations for each accordingly.
If there is a genotype that makes someone prone to airplane accidents, we really don't want that person to be a pilot, even if this may be unfair to the individuals who apply for the post by not giving them a chance to prove themselves.
Nobody was very morally concerned when the Jedi in Star Wars determined Anakin Skywalker's aptitude to become a Jedi knight with a blood test, and the movie's depiction of the multi-ethnic group of people so selected reflects the belief, held by many Americans, that truly fair, scientific, culturally neutral meritocratic testing would produce more diversity and more room for social advancement than the status quo.
Fear of this development is well founded. But, not because it is inherently wrong to make decisions based upon genotypes. Instead, it is because we don't trust politicians and demagogues in the maelstrom of politics and social conflict to use that information rationally, scientifically, and only for the inferences that can reliably be drawn from it. Also, we fear this kind of development because we doubt the moral generosity of those deemed fit to take responsibility for those in society who need help, once the two can be so clearly distinguished.
Elitism can lead to undemocratic authoritarianism, which can lead to abuse of authority; and the more sacrosanct the legitimacy of that authority, the more secure it is and the more freedom it has to abuse its position. Claims to legitimacy based upon genetic fitness may be the next divine right of kings. Our meritocratic society, in a world where we know that many of the factors that go into merit have a strong hereditary component, has already begun to approximate that model.
But, politics is about both power and choice. Meritocracy helps leaders analyze the choices, but doesn't tell you whom those choices should help or harm. Who benefits is a question of power, and the fear is that the ability to make choices will become too deeply entwined with the power to decide who benefits from them. We need meritocratic leaders, but we need those leaders to look out for more than their own personal interests.
The New Mild Multi-Regionalism
Razib Khan at Gene Expression has dubbed this "The Year of the Other human." This flows from two blockbuster discoveries.
Eurasians Have Neanderthal Ancestors
The first, based on a comparison of a complete Neanderthal genome recovered from ancient remains to modern human genomes, suggests that non-Africans have an average of 1% to 4% Neanderthal admixture, distributed more or less evenly across the non-African world, relative to Africans. Multiple mtDNA samples from Neanderthals show that no modern human has Neanderthal mtDNA, and there are strong, although less direct, reasons to suspect that there is no non-recombining Neanderthal Y-DNA in modern humans either. As I've mentioned previously, this is surprising, as modern humans outside the Neanderthal range have the same level of Neanderthal admixture as those within it, suggesting that the admixture that survives in today's modern humans at detectable levels derives largely from admixture with a Eurasian founder population (or, alternatively, from population structure in Africa in which the source population for Eurasians was more similar genetically to Neanderthals and has since largely vanished from the genetically well sampled populations of Africa).
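The admixture signal behind estimates of this kind came largely from allele-sharing statistics of the ABBA-BABA type (the D-statistic used in the 2010 Neanderthal genome work). A toy sketch of the counting logic, with invented site patterns rather than real data:

```python
# Toy D-statistic (ABBA-BABA test). Each site records the alleles of
# (African, non-African, Neanderthal, chimp outgroup) as 'A' (ancestral,
# matching the chimp) or 'B' (derived). With no gene flow, ABBA and BABA
# patterns should be equally common; an excess of ABBA (the non-African
# sharing the derived allele with the Neanderthal) signals admixture.
# These site patterns are invented purely to illustrate the arithmetic.
sites = ["ABBA", "ABBA", "ABBA", "BABA", "AABB", "ABBA", "BABA", "AAAB"]

abba = sites.count("ABBA")
baba = sites.count("BABA")
d = (abba - baba) / (abba + baba)

print(abba, baba, round(d, 2))  # 4 2 0.33: positive D suggests gene flow
```

Real analyses apply this count across millions of genomic sites with careful error modeling; the point here is only that the inference rests on an asymmetry in shared derived alleles.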
Papuans Have Denisovian Ancestors
The second, based on a comparison of a complete genome recovered from 49,000 year old remains in the Denisova cave in southern Siberia (a population dubbed the Denisovians for now, in order to maintain agnosticism as to their species assignment within the genus Homo) to modern humans, suggests that about 5% of the DNA in Melanesian populations (i.e., New Guinea), relative to Africans, is derived from the Denisovians, and that there are smaller levels of Denisovian ancestry in other modern human populations. The two samples of mtDNA from the Denisovians are unlike both Neanderthal and modern human mtDNA, and a Denisovian contribution to modern human non-recombining Y-DNA is unlikely for the same reasons it is unlikely in the case of Neanderthals. The autosomal DNA of the Denisovians shows them to be more similar to Neanderthals than to modern humans, although still very different from both.
Since Melanesians are also non-Africans, it appears that as much as 7.25% of Melanesian autosomal DNA may be derived from archaic humans (i.e., Neanderthals and Denisovians combined), about one-fourteenth of their ancestry, which is more than the share of a grandparent of a grandparent (although it isn't so recent or so direct, and instead likely reflects a small number of admixture events with small founder populations, possibly over an extended period of co-existence of archaic Homo populations with modern humans).
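The back-of-the-envelope comparison is easy to check. The 7.25% combined figure is the source's; the genealogical fractions follow from each generation halving a single ancestor's expected contribution:

```python
# Check the "one-fourteenth" and "grandparent of a grandparent" comparisons.
combined = 0.0725  # "as much as 7.25%" combined Neanderthal + Denisovian share

print(round(1 / combined, 1))  # 13.8, i.e. roughly one-fourteenth of ancestry

# A single ancestor n generations back contributes (1/2)**n on average.
great_grandparent = 0.5 ** 3        # 12.5%
great_great_grandparent = 0.5 ** 4  # a grandparent of a grandparent: 6.25%

# 7.25% sits between the two, so it exceeds one great-great-grandparent's share.
assert great_great_grandparent < combined < great_grandparent
```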
The immense geographic distance from New Guinea, where the Denisovian genetic traces persist, to southern Siberia, where the Denisovian DNA was found, also suggests that the Denisovian range may once have stretched as far as Southeast Asia, with admixture with proto-Melanesians taking place perhaps in Sahul.
The Denisovian remains are found in association with what seem to be Mousterian artifacts typical of Neanderthal sites, rather than the earlier, simpler artifacts typically associated with Homo Erectus sites.
Early Warnings
Two other recent discoveries, while not from 2010, add to the stew.
First, there is Homo floresiensis (colloquially called Hobbits), a cluster of apparently archaic human skeletal remains from about 20,000 years ago (although the dating and taxonomy are disputed), whose discovery on the island of Flores was announced in 2004, and who may have co-existed with modern humans and even survived long enough to be mentioned in modern human folklore from the island. Some scholars have argued that they were simply modern humans with some sort of disorder, but the better supported view seems to be that they were a population of archaic humans, perhaps a relict population of Homo Erectus, or, if they were distinct from Homo Erectus, perhaps the Denisovians.
Also, while no one disputes that the Cro-Magnons in Europe were anatomically modern humans descended from the same Out of Africa founder population as all modern humans living today, there is increasing evidence that the Cro-Magnons were racially very distinct from subsequent Europeans -- as visibly and genetically different from the Neolithic farmers as Australian Aborigines are from the people of Europe today.
The extent to which the people of Europe today trace their ancestry to Cro-Magnons is disputed. There has been a small influx of Asian ancestry from Siberia that can be detected in the genetics of Uralic speaking populations such as the Finns and the Estonians. There has been a substantial population influx into Europe from the Near East and by West Eurasian peoples from Central Asia and/or the Caucasus in the last 20,000 years, although the mix of ancestry varies from region to region in Europe. But opinion differs on the Cro-Magnon contribution (although all serious commentators agree that it might vary from one part of Europe to another). Some argue for a negligible contribution, while others argue for as much as an 80% contribution in some parts of Europe.
The Not So Old Consensus Worldview
Two years ago, the overwhelming consensus among scholars of modern human evolution, mostly anthropologists and geneticists, was that there was no, or virtually no, archaic Homo ancestry in modern humans. If Neanderthals or Homo Erectus had sex with modern humans, or had children with them, their offspring did not survive to the present.
Instead, the consensus view held that all modern humans descend from an African mitochondrial Eve and Y-DNA Adam, and that virtually all non-Africans descend from a founder population that was a subset of all Africans, with two separate non-African mtDNA lineages tracing their roots to the most basal African mtDNA lineage, L3, and all non-African Y-DNA lineages tracing their roots to two African rooted Y-DNA lineages, CF and DE. The remaining mtDNA L lineages, and the Y-DNA A and B lineages, in contrast, are seen as native to Africa and as never having been a part of the Out of Africa founder population (although some members of African lineages have since left the continent, mostly to adjacent parts of Europe and the Near East, or in migrations documented or strongly suggested to have occurred in the historic era).
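The lineage structure in that consensus picture can be sketched as a small tree. A minimal parent-pointer sketch, deliberately simplified to just the haplogroups named above ("Y-Adam" is a stand-in label for the root, and real phylogenies interpose additional nodes):

```python
# Simplified Y-DNA haplogroup tree per the consensus view sketched above:
# A and B are African-only lineages, while CF and DE are the two lineages
# from which all non-African Y-DNA descends. Intermediate nodes are omitted.
PARENT = {
    "A": "Y-Adam", "B": "Y-Adam",
    "CF": "Y-Adam", "DE": "Y-Adam",
    "C": "CF", "F": "CF",
    "D": "DE", "E": "DE",
}

def lineage(haplogroup):
    """Walk parent pointers from a haplogroup back to the root."""
    chain = [haplogroup]
    while chain[-1] in PARENT:
        chain.append(PARENT[chain[-1]])
    return chain

print(lineage("D"))  # ['D', 'DE', 'Y-Adam']
print(lineage("A"))  # ['A', 'Y-Adam']
```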
A currently small minority view imagines that a common ancestor of Y-DNA lineages CF and DE was Eurasian and that Y-DNA lineage E, which is now predominant in sub-Saharan Africa and virtually absent elsewhere except in areas immediately adjacent to Africa or in places that saw historic era migration, represents a back migration to Africa. In my view, the evidence more strongly favors a scenario in which the main Out of Africa population derives from Y-DNA lineage CF, while Y-DNA lineage D, found in West Africa, the Andaman Islands, Tibet, Japan, and Siberia, was an early and separate wave of Out of Africa migration paired with some of the early mtDNA M lineages.
The timing of the Out of Africa migration or migrations is a matter of ongoing controversy. At least some of the Out of Africa migrations had to have taken place by 50,000 years ago, based on the archaeologically documented presence of anatomically modern humans (and ancient mtDNA of modern Eurasian lineages) in Europe, Papua New Guinea and Australia starting around 50,000 years ago. The oldest arguably modern human traces outside Africa go back as far as perhaps 120,000 years: there are anatomically modern human remains that old in the Levant, although it appears that this modern human presence outside of Africa was interrupted for tens of thousands of years before being re-established, and there are anatomically modern human remains in India that appear to be about 75,000 years old. There are also a very small number of Homo remains (whose taxonomy and dating are disputed) in China as old as 100,000 years or so. The multi-regionalist claim that modern humans evolved in multiple places corresponding to different modern races was essentially routed, with all serious scholars agreeing that modern humans evolved in Africa perhaps 200,000 years ago, or perhaps a bit earlier, from archaic Homo types most closely related to the Neanderthals, who split off from their common ancestor with modern humans perhaps 500,000 years ago.
New fossil finds could always push back the oldest Out of Africa date, of course, but no one really expects to find modern humans outside Africa from more than 200,000 years ago, predating the earliest modern human remains in Africa.
The Neanderthal record in Europe, to the exclusion of modern humans, is fairly continuous until around 30,000 years ago, when they went extinct after tens of thousands of years of co-existence (interrupted at some points) with modern humans in Europe and the Near East. Up through around 200,000 years ago, there is also a fossil record of archaic humans in Asia, who tended to be lumped together as Homo Erectus, although not without splitters who urged finer distinctions.
In the old worldview, all of the archaic Homo populations of Eurasia were dead ends that left no genetic trace in modern humans anywhere, and their differences from modern humans were emphasized.
Even many of the early anatomically modern human populations have been reduced to insignificance, leaving only faint genetic traces, by subsequent migrating populations that were more successful and either wiped out their predecessors or overwhelmed them from a population genetics perspective.
The New Worldview
The ancient DNA evidence from the Neanderthals and the Denisovians has shaken up the consensus. Archaic Homo types have made their way into the modern human family tree, after the Out of Africa migration and the evolution of anatomically modern humans, as a significant but minor part of the total genome of part, but not all, of the human race.
While the vast majority of our ancestry is shared by all modern humans, a little of the ancestry of some modern humans is archaic. Not all modern humans share a common set of ancestors until one looks back as far as the common ancestor of Neanderthals and modern humans (or perhaps even further). The line between species, subspecies and race is beginning to look more arbitrary, and the fact that there were hybrids of archaic and modern humans forces us to rethink who archaic humans were in relation to us.
There are a lot of open questions.
The mutation clock evidence from the Denisovian autosomal DNA on the timeline of the events that gave rise to their admixture with modern humans makes no sense, and is also a poor fit to the mtDNA mutation clock evidence.
It isn't clear which Homo remains and Paleolithic industries yet found, if any outside of Denisova, should be associated with the Denisovians. Were they another race of Neanderthals? Homo Erectus? A hybrid Homo Erectus and Neanderthal population? A hybrid modern human and Neanderthal population? A hybrid Homo Erectus and modern human population? Were they Homo floresiensis? Were they something else entirely?
Are there other cases of archaic admixture yet to be discovered? How many archaic Homo types were there? Were there stable hybrid populations that were distinct from the pure types?
Where did the admixtures with archaic Homo take place? Was it bidirectional? How did it happen?
Why do we have autosomal traces of archaic Homo but not mtDNA and Y-DNA traces? What does that imply about the extent to which archaic Homo were separate species from modern humans?
In what ways does our archaic Homo ancestry make those of us who have it different? Does it include adaptive traits? Does it include non-adaptive traits? Is it irrelevant except to the extent that it is ancestry identifying?
Should we reinterpret transitional period fossils to reflect our new understanding of the mix of Homo types that were present in the Paleolithic?
Were there archaic types with whom modern humans, after their initial evolution, admixed in Africa as well, and if so, are there some African lineages that show a stronger affinity to those archaic Homo types than others? Which archaic Homo types admixed with modern humans?
And, the old questions haven't gone away either. How many major demographic waves of human migration were there in pre-history, and when and where did they happen? Which waves admixed with archaic humans and which did not? From which demographic waves do today's modern human populations derive? Why did each of the archaic human populations cease to exist as a distinct population? When did the last pureblooded archaic human die? What did we take culturally from archaic humans, and how? How did archaic humans differ from modern humans? Were those differences the cause of modern human survival, or did modern humans triumph largely as a matter of luck?
When did the various language families spoken today, or known to have died out, come into being, and how did they spread and grow distinct? What is the oldest knowable linguistic stratum in pre-history that we can say anything about, and what do we know about what it was like and when and where it prevailed? When was language shift due to population replacement, and when was it due to cultural transition? How did the cultural transitions happen? How closely are modern political, linguistic and ethnic divides linked to particular genetically distinct populations? How long does the cultural legacy of pre-historic events influence people in the present? Do our genetic roots matter? To what extent are modern humans the product of genetic and/or epigenetic adaptations to material circumstances that no longer exist? How fast are we evolving now, and in what respects, and why?
Eurasians Have Neanderthal Ancestors
The first, based on a comparison of a total Neanderthal genome recovered from ancient remains to modern humans suggests that non-Africans have an average of 1% to 4% Neanderthal admixture distributed more or less evenly across the non-African world, relative to Africans. Multiple mtDNA samples from Neanderthals show that no modern human has any Neanderthal mtDNA, and there are strong, although less direct, reasons to suspect that there is likely no non-recombining Neanderthal Y-DNA in modern humans. As I've mentioned previously, this is surprising, as modern humans outside the Neanderthal range have the same level of Neanderthal admixture as those within the Nenaderthal range, suggesting that the admixture that survives in today's modern humans at detectable levels derives largely from admixture with a Eurasian founder population (or alternatively, from population structure in Africa in which the source population for Eurasians was more similar genetically to Neanderthals and has since largely vanished from genetically well sampled populations of Africa).
Papuans Have Denisovian Ancestors
The second, based on comparison of a total genome recovered from remains in the Denisova cave in South Siberia from 49,000 years ago (a population dubbed the Denisovians for now, in order to maintain agnosticism as to their species association within the Homo genus) to modern humans suggests that about 5% of the DNA in Melanesian populations (i.e. New Guinea) relative to Africans are derived from the Denisovians, and that there are smaller levels of Denisovian ancestry in other modern human populations. The two samples of mtDNA from the Denisovians are unlike both Neanderthal and modern human mtDNA, and a Densivoian contribution to modern human non-recombining Y-DNA is unlikely for the same reasons that it is unlikely in the case of Neanderthals. The autosomal DNA of the Denisovians show them to be more similar to Neanderthals than to modern humans, although still very different from both.
Since Melanesians are also non-Africans, it appears that as much as 7.25% of Melanesian autosomal DNA may be derived from archiac humans (i.e. Neanderthals and Denisovians combined), about one-fourteenth of their ancestry, which is more than one grandparent of a grandparent (although it isn't so recent or so direct, and instead likely reflects a small nubmer of admixture events with small founder populations, possibly over an extended period of time of co-existence of archaic Homo populations with modern humans).
The immense geographic distance from New Guinea, where Denisovian genetic traces are found, to Southern Siberia where the Denisovian DNA was found, also suggests that the Denisovian range may have once stretched as far as Southeast Asia, with admixture with proto-Melanesians taking place perhaps in Sahul.
The Denisovian remains are found in association with what seems like Mousterian artifacts typical of Neanderthal sites, rather than the earlier, more simple artifacts typically associated with Homo Erectus sites.
Early Warnings
Two other recent discoveries, while not from 2010, add to the stew.
First, there is Homo floresiensis (colloquially called Hobbits), a cluster of apparently archaic human skeletal remains from about 20,000 years ago (although the dating and taxonomy are disputed) whose discovery on the island of Flores was announced in 2004, that may have co-existed with modern humans and even survived to be mentioned in modern human folklore from the island. Some scholars have argued that they were simply modern humans with some sort of disorder, but the better supported view seems to be that they were a population of archaic humans, perhaps a relict population of Homo Erectus or, if they were distinct from Homo Erectus, of the Denisovians.
Also, while no one disputes that the Cro-Magnons in Europe were anatomically modern humans descended from the same Out of Africa founder population as all modern humans living today, there is increasing evidence that Cro-Magnons were racially very distinct from subsequent Europeans -- as visibly and genetically different from Neolithic farmers as Australian Aborigines are from the people of Europe today.
The extent to which the people of Europe today trace their ancestry to Cro-Magnons is disputed. There has been a little bit of influx of Asian ancestry from Siberia that can be detected in the genetics of Uralic language speaking populations such as the Finns and the Estonians. There has been a substantial population influx to Europe from the Near East and by West Eurasian peoples from Central Asia and/or the Caucasus in the last 20,000 years, although the mix of ancestry varies from region to region in Europe. But, opinion differs on a Cro-Magnon contribution (although all serious commentators agree that it might vary from one part of Europe to another). Some argue for a negligible contribution, while others argue for as much as an 80% contribution in some parts of Europe.
The Not So Old Consensus Worldview
Two years ago, the overwhelming consensus among scholars of the evolution of modern humans, mostly anthropologists and geneticists, had been that there was no, or virtually no, archaic Homo ancestry in modern humans. If Neanderthals or Homo Erectus had sex with modern humans and had children with them, their offspring did not survive to the present.
Instead, the consensus view held, all modern humans descend from an African Mitochondrial Eve and Y-DNA Adam, and virtually all non-Africans descend from a founder population that was a subset of all Africans, with two separate non-African mtDNA lineages tracing their roots to the most basal mtDNA L3 lineage in Africa, and all non-African Y-DNA lineages tracing their roots to two African rooted Y-DNA lineages, CF and DE. The remaining mtDNA L lineages, and the Y-DNA A and B lineages, in contrast, are seen as native to Africa and as never having been a part of the Out of Africa founder population (although some African lineage members have since left the continent, mostly to adjacent parts of Europe and the Near East, or in migrations documented or strongly suggested to be from the historic era).
A currently small minority view imagines that a common ancestor of Y-DNA lineages CF and DE was Eurasian and that Y-DNA lineage E, which is now predominant in sub-Saharan Africa and virtually absent elsewhere except in areas immediately adjacent to Africa or in places where there was historic era migration, was a back migration to Africa. In my view, the evidence more strongly favors a scenario in which the main Out of Africa population derives from Y-DNA lineage CF, and Y-DNA lineage D, found in West Africa, the Andaman Islands, Tibet, Japan, and Siberia, was an early and separate wave of Out of Africa migration paired with some of the early mtDNA M lineages.
The timing of the Out of Africa migration or migrations is a matter of ongoing controversy. At least some of the Out of Africa migrations had to have taken place by 50,000 years ago, based on the presence of archaeologically documented anatomically modern humans (and ancient mtDNA of modern Eurasian lineages) in Europe, Papua New Guinea and Australia starting around 50,000 years ago. The oldest evidence of arguably modern human traces outside Africa goes back as far as perhaps 120,000 years ago. There are anatomically modern human remains that old in the Levant, although it appears that this modern human presence outside of Africa was interrupted for tens of thousands of years before being re-established, and there are anatomically modern human remains in India that appear to be from about 75,000 years ago. There are a very small number of Homo remains (whose taxonomy and dating are disputed) in China as old as 100,000 years ago or so. The multi-regionalist claim that modern humans evolved in multiple places, reflecting different modern races, was essentially routed, with all serious scholars agreeing that modern humans evolved in Africa perhaps 200,000 years ago, or perhaps a bit earlier, from archaic Homo types most closely related to Neanderthals, who split off from their common ancestor with modern humans perhaps 500,000 years ago.
New fossil finds could always push back the oldest Out of Africa date, of course, but no one really expects to find modern humans outside Africa more than 200,000 years ago before the earliest modern human remains in Africa.
The Neanderthal record, to the exclusion of modern humans, is fairly continuous in Europe until around 30,000 years ago when they went extinct after tens of thousands of years of co-existence with modern humans in Europe and the Near East (interrupted at some points). Up through around 200,000 years ago, there is also a fossil record of archaic humans in Asia, which tended to be lumped together as Homo Erectus, although not without splitters who urged finer distinctions.
In the old worldview, all of the archaic Homo populations of Eurasia were dead ends that left no genetic trace in modern humans anywhere, and their differences from modern humans were emphasized.
Even many early anatomically modern human populations have been reduced to insignificance, leaving only faint genetic traces, by subsequent migrating populations that were more successful and either wiped out their predecessors or overwhelmed them from a population genetic perspective.
The New Worldview
The ancient DNA evidence from Neanderthals and the Denisovians has shaken up the consensus. Archaic Homo types have made their way into the post-Out of Africa, post-evolution of anatomically modern humans, modern human family tree as a significant but minor part of the total genome of some, but not all, of the human race.
While the vast majority of our ancestry is shared by all modern humans, a little of the ancestry of some modern humans is archaic. Not all modern humans share a common set of ancestors until one looks back as far as the common ancestor of Neanderthals and modern humans (or perhaps even further). The line between species, subspecies and race is beginning to look more arbitrary, and the fact that there were hybrid archaic and modern humans forces us to rethink who archaic humans were in relation to us.
There are a lot of open questions.
The mutation clock evidence from the Denisovian autosomal DNA on the timeline of the events that gave rise to their admixture with modern humans makes no sense, and is also a poor fit to the mtDNA mutation clock evidence.
It isn't clear which Homo remains and Paleolithic industries, if any yet found outside of Denisova Cave, should be associated with the Denisovians. Were they another race of Neanderthals? Homo Erectus? A hybrid Homo Erectus and Neanderthal population? A hybrid modern human and Neanderthal population? A hybrid Homo Erectus and modern human population? Were they Homo floresiensis? Were they something else entirely?
Are there other cases of archaic admixture yet to be discovered? How many archaic Homo types were there? Were there stable hybrid populations that were distinct from the pure types?
Where did the admixtures with archaic Homo take place? Was it bidirectional? How did it happen?
Why do we have autosomal traces of archaic Homo but not mtDNA and Y-DNA traces? What does that imply about the extent to which archaic Homo were separate species from modern humans?
In what ways does our archaic Homo ancestry make those of us who have it different? Does it include adaptive traits? Does it include non-adaptive traits? Is it irrelevant except to the extent that it is ancestry identifying?
Should we reinterpret transitional period fossils to reflect our new understanding of the mix of Homo types that were present in the Paleolithic?
Were there archaic types with whom modern humans after their initial evolution admixed in Africa as well, and if so, are there some African lineages that show a stronger affinity to those archaic Homo types than others? Which archaic Homo types admixed with modern humans?
And, the old questions haven't gone away either. How many major demographic waves of human migration were there in pre-history, and when and where did they happen? Which waves admixed with archaic humans and which did not? From which demographic waves do today's modern human populations derive? Why did each of the archaic human populations cease to exist as distinct populations? When did the last pureblooded archaic human die? What did we take culturally from archaic humans and how? How did archaic humans differ from modern humans? Were those differences the cause of modern human survival, or did modern humans triumph largely as a matter of luck?
When did the various language families spoken today or known to have died come into being, and how did they spread and grow distinct? What is the oldest knowable linguistic stratum in pre-history that we can say anything about, and what do we know about what it was like and when and where it prevailed? When was language shift due to population replacement and when was it due to cultural transition? How did the cultural transitions happen? How closely are modern political, linguistic and ethnic divides linked to particular genetically distinct populations? How long does the cultural legacy of pre-historic events influence people in the present? Do our genetic roots matter? To what extent are modern humans the product of genetic and/or epigenetic adaptations to material circumstances that no longer exist? How fast are we evolving now, and in what respects and why?
The Drivers And Costs Of CEO Pay and Inequality
I've written before about the pay of superstars in sports and entertainment, and the portion of a New York Times article discussing that point covers little new ground. The shorter version is that their performances reach a very large number of people.
Entity Size And Profits Drive Corporate Pay
The article's analysis of high corporate pay, however, bears closer attention. It attributes rising pay for executives in the "real economy" to the increasing scale of big business, and rising pay in the finance sector to profits made possible by deregulation.
In 1977, an elite chief executive working at one of America’s top 100 companies earned about 50 times the wage of its average worker. Three decades later, the nation’s best-paid C.E.O.’s made about 1,100 times the pay of a worker on the production line. . . . in the 1970s found that executives in the top 10 percent made about twice as much as those in the middle of the pack. By the early 2000s, the top suits made more than four times the pay of the executives in the middle. . . . Two economists at New York University, Xavier Gabaix and Augustin Landier, published a study in 2006 estimating that the sixfold rise in the pay of chief executives in the United States over the last quarter century or so was attributable entirely to the sixfold rise in the market size of large American companies. . . .
In 2007 . . . . financial companies accounted for a full third of the profits of the nation’s private sector. Wall Street bonuses hit a record $32.9 billion, or $177,000 a worker. . . . Financiers had a great time in the early decades of the 20th century: from 1909 to the mid-1930s, they typically made about 50 percent to 60 percent more than workers in other industries. But the stock market collapse of 1929 and the Great Depression changed all that. In 1934, corporate profits in the financial sector shrank to $236 million, one-eighth what they were five years earlier. Wages followed. From 1950 through about 1980, bankers and insurers made only 10 percent more than workers outside of finance, on average. . . .
By 2005, the share of workers in the finance industry with a college education exceeded that of other industries by nearly 20 percentage points. By 2006, pay in the financial sector was again 70 percent higher than wages elsewhere in the private sector. A third of the 2009 Princeton graduates who got jobs after graduation went into finance; 6.3 percent took jobs in government.
The authors attribute the ebb and flow of financial industry pay largely to government regulation, which tightened during the Great Depression and on through about 1959, and then was relaxed in the late 1980s and 1990s.
A critical point to keep in mind is that the increased pay of top executives in big business or finance has a great deal to do with ability to pay and very little to do with actual competence.
Big businesses and financial firms offer an amount based on the scale of their enterprises and the amount of profit their firm creates, in the hope of attracting the best talent. But, while there is considerable evidence to indicate that better pay does lure better rank and file professionals to an industry or enterprise, there is very little evidence to indicate that big businesses are able to accurately discern which applicants for exorbitantly paid top jobs are actually the best ones, and there is considerable evidence to indicate that big businesses do a mediocre job of removing from their posts top executives who have failed to live up to the expectations upon which they were hired.
The Diminishing Marginal Returns Of Winner Take All Compensation
Put another way, if big businesses had hired someone from the same pool as the person actually hired who was willing to work for half as much compensation, there is very little to indicate that the business would be run any less well. One can find very competent managers willing to work for $1,000,000 a year, and the marginal benefit accrued by offering $10,000,000 a year instead is not at all obvious.
This concern is particularly great in the case of successor CEOs. One can argue that founders of firms have earned their great wealth by creating the firms that generate it. But, this point is much harder to make when a successor is appointed to manage the wealth created by his or her predecessors.
The divide between the rich and the megarich is driven by the concentration of economic activity into a smaller number of firms that is largely a product of economy of scale incentives in the economy, not by the fact that there is an immense divide in talent between the rich and the megarich.
Indeed, the concentration of rewards at the very top can actually create counterproductive incentives:
If only a very lucky few can aspire to a big reward, most workers are likely to conclude that it is not worth the effort to try. The odds aren’t on their side. Inequality has been found to turn people off. A recent experiment conducted with workers at the University of California found that those who earned less than the typical wage for their pay unit and occupation became measurably less satisfied with their jobs, and more likely to look for another one if they found out the pay of their peers. Other experiments have found that winner-take-all games tend to elicit much less player effort — and more cheating — than those in which rewards are distributed more smoothly according to performance.
Counterproductive winner take all effects are the private sector equivalent of the Laffer Curve, the notion that at some very high marginal tax rate (far in excess of anything found in the U.S. or even European economies), increasing taxes reduces tax revenues because it creates a disincentive to work. There is a point at which increasing compensation to top performers actually decreases the productivity of the economic unit as a whole.
The Drivers and Economic Impacts Of Income Inequality
The article also notes the trend towards exceptionally high income inequality in the American economy, and its tenuous relationship to economic growth (emphasis added):
Since 1980, the country’s gross domestic product per person has increased about 69 percent, even as the share of income accruing to the richest 1 percent of the population jumped to 36 percent from 22 percent. But the economy grew even faster — 83 percent per capita — from 1951 to 1980, when inequality declined when measured as the share of national income going to the very top of the population.
One study concluded that each percentage-point increase in the share of national income channeled to the top 10 percent of Americans since 1960 led to an increase of 0.12 percentage points in the annual rate of economic growth — hardly an enormous boost. . . . Since 1980, the weekly wage of the average worker on the factory floor has increased little more than 3 percent, after inflation. . . . According to the Organization for Economic Cooperation and Development, the average earnings of the richest 10 percent of Americans are 16 times those for the 10 percent at the bottom of the pile. That compares with a multiple of 8 in Britain and 5 in Sweden. . . . There is a 42 percent chance that the son of an American man in the bottom fifth of the income distribution will be stuck in the same economic slot. The equivalent odds for a British man are 30 percent, and 25 percent for a Swede.
Keep in mind that the 3% real increase in wages for factory workers since 1980 has not been 3% per year, but 3% total. On an annualized basis, that is an increase of about 0.1% per year -- for example, about $40.00 per year for someone making $40,000 a year.
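The annualization can be checked in a few lines (a sketch; it assumes the 3% total gain covers the thirty years from 1980 to roughly 2010, the period discussed in the article):

```python
# Annualize a 3% total real wage gain accrued over 30 years (1980-2010),
# using compound growth: (1 + total) ** (1 / years) - 1.
total_gain = 0.03
years = 30

annual_rate = (1 + total_gain) ** (1 / years) - 1
print(f"annualized rate: {annual_rate:.4%}")  # 0.0986%, i.e. about 0.1%/year

# For someone earning $40,000, that is roughly a $40 raise in the first year:
salary = 40_000
print(f"first-year raise: ${salary * annual_rate:.2f}")  # $39.43
```

Simple division (3% / 30 years = 0.1% per year) gives nearly the same answer at rates this small, which is why the rounder figure in the text is a fair approximation.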
What the international comparisons cited by the New York Times do not reveal is the root cause of the differences in income inequality between the United States and continental Europe. It turns out that the bulk of the difference is due to government policy. The less well off are better off in Europe because the net balance of taxes paid versus transfer payments received there is better than it is in the United States.
The pre-tax, pre-transfer payment earnings of the less well off are just as stagnant in Europe as they are in the United States, but Europeans have sweetened the pot for the average person to share in the wealth their economies have generated, while the Americans have not.
This analysis tends to suggest that the poor in America have a comparatively wretched and insecure existence not because they are less able to contribute economically, but because our political system does not see everyone as being in the same boat to the same degree. It is plausible to hypothesize that this, in turn, has a lot to do with the fact that the less well off are far more likely to participate politically in Europe than in the United States. In Europe, the share of the voting-age population that votes is 50% to 100% greater than in the United States. Non-voters are ignored politically, and the correspondence between those who don't vote (the poor and uneducated, particularly minorities, and children) and those who receive meager governmental support by international standards, is probably not coincidental.
Notably, forcing the beneficiaries of economic growth to share their gains with the rest of the nation has not, as conservative economic intuition would suggest, had any significant effect on the rate of economic growth. In countries with mixed economies where one's earnings still significantly impact one's socio-economic well being, even significant taxation to fund transfer payments doesn't distort the economy very much, because ensuring that there are real economic incentives in earnings for economically productive conduct turns out to be much more important than the intensity of those economic incentives. The ability to capture much of your contribution to economic growth is very nearly as motivating as the ability to capture almost all of your contribution to economic growth.
26 December 2010
How Reversible Is Progress?
Sometimes in history, mankind fails. Empires fall. Hard won knowledge is lost. Niceties of civilization disappear. Languages die. Prejudices tamed resurface. Barbarism defeats civilization. Centers of progress stagnate and ossify. Wars, pointless or not, kill millions when wiser statesmen could have secured better outcomes.
We are a fortunate people. We live in a time of technological virtuosity, global communication, remarkable tolerance, greater scientific knowledge than any generation to come before us, awareness of many of the threats to our way of life that loom ahead, economic prosperity viewed in the long term, and a point in time when many countries are liberal democracies, many countries that were once brutal tyrannies like the Soviet Union and early Communist China have changed course, and where truly abysmal countries, like North Korea, are small and isolated and perhaps even on the brink of collapse.
We are, in other words, at a point when we have a great deal to lose.
The risks are not abstract. Our era of toleration has brought out into the open the identities of people who would have been persecuted or killed in all prior eras.
Widespread gay marriage and the end of "Don't Ask, Don't Tell" which has already been matched in much of the Western world, means that hosts of gays and lesbians who would have lived secret lives in the past are now out and public. This openness has brought with it acceptance, toleration, and access to legal recognition with all of its blessings. But, openness comes with the price of constant vigilance. Once out of the closet, a population that will always be in the minority must rely forever after on the good will of a liberal democratic society to protect it.
Gays aren't the only ones taking a risk. Many people have come out publicly as atheists, and if religion becomes established, as there has often been democratic pressure on governments to do, they risk persecution as heretics, apostates and blasphemers.
Pagans are in much the same shoes. The Denver Post on Christmas Day informed us that 45 people in Haiti had been murdered based on accusations of witchcraft connected with its recent cholera outbreak.
Anti-Islamic sentiment has swept much of Europe, and lurks ready to turn ugly in America.
It would take a greater calamity to set back human knowledge, science and technology, but would it really take an event so unthinkable? As glorious as a digitized world may seem, the collapse of the Internet remains far more possible than the sack of every decent university and public library on earth. But, if the libraries no longer have independent copies of the knowledge upon which we rely, and if the technologies we use to access our data cease to be supported for any of myriad reasons before the data the technologies make us privy to is converted to new media, how much could we lose?
It is hard to believe that we could ever see a setback as dire as the one that followed the collapse of the Roman Empire. The college textbooks lurking in the basements of hundreds of thousands of middle class families could reconstruct science, if not to the standards of 2010, at least to the state of the art of the 1980s. It might take a long time to rebuild the particle accelerators necessary to prove the existence of quarks, weak force carrier bosons and neutrinos and their properties from first principles, and to return to the point where we could answer the remaining open questions of physics. But, a backpack full of graduate school textbooks (or even a binder full of relevant wikipedia printouts) would make the task far less arduous than it was the first time. Not many people know how to run an iron foundry, but again, if just one decent university library of the thousands in the United States were to survive, we could learn to do so much more efficiently.
We could easily lose many of the primary documents that back up the story of history. But, there are enough histories out there that it is hard to believe that the broad outlines and even references to many of the key primary sources from which they can be reconstructed, will ever again be lost.
I'd sleep a little better if someone were consciously working out a way to ensure that this knowledge survived longer -- printing out a massive library on bronze plates in a dozen different languages, maybe -- but even a truly global apocalyptic event would have to be extremely thorough to set back the state of knowledge too far.
It might take a few decades to reconstruct the infrastructure necessary to manufacture decent computers, or mass produce precision stainless steel appliances, or to generate advanced medicines with high levels of purity at reasonable costs. But, there are quite a few copies of the key information available in the kind of technical detail found in patent applications, for example, out there.
More intangibly, how deeply can the modern worldview be suppressed? There have certainly been episodes of history that make one worry. Who in late tsarist Russia, on the verge of Western style liberal democracy, could have foreseen Stalinist massacres? Who in the collapsing Chinese empire could have foreseen the Cultural Revolution? The example of Korea is particularly troubling because it was so recent. Dean Koh of Yale Law School recounts cities of people with eyes as listless as those of Orwell's 1984 in the here and now in North Korea, which has been divided from the South for less than sixty years. The return of Albanian society to a medieval state from a similarly brief period of isolation likewise leaves one with qualms.
Yet, Cambodia, while hardly a model of national success, seems to have returned to some semblence of normalcy even after Pol Pot's massacre of the entire bourgouise class in his society. Albania is being reintegrated into Europe. The death of Soviet style communism was traumatic for a generation, but its successor regime seems to be emerging and tentatively is not a total nightmare.
If progress really is inevitable in the long run, despite dire and painful setbacks, then perhaps economic development in places that have not yet experienced its benefits, may also be more or less inevitable. Surely, results achieved from scratch in one place can be secured elsewhere, where they need only be copied, somewhat more easily than they came about in the first place. Or, can they?
Could it be that centuries of ruthless capital punishment that prevailed across Europe was necessary to build a more tame society? Could it be that modern nation states are not really stable unless they arise gradually and naturally from smaller principalities together with all the attendent wars and nationalist strife, unless they arise from a uniform orderly process of colonization? Could it be that liberal democracy has functioned well enough for the last couple of centuries (and far less in much of the world) because it has been sustained by economic growth that has actually only been a coincidental fellow traveler with it?
Historically, mankind has managed to make progress, even as one empire or another has collapsed, by passing the torch of knowledge and tolerance and innovation from one place to another. But, in an increasingly global society, we are increasingly placing all of our eggs in one basket. The collapse of China, or a subcontinental nuclear war between Pakistan and India, or the stagnation of American society, or economic collapse in the European Union due to mismanaged public finance, is no longer the kind of event that the rest of the world can ignore. A major collapse in one part of the world drags the rest of the world down with it to some extent.
In World Wars I and II, Switzerland managed to stay neutral and above the fray. In the next global calamity, there may be no Switzerlands.
The political-economies of the world evolve in very path dependent ways. When something works better than what came before it, it sweeps the "known world" at the time. The invention of farming obliterated the cultures that proceeded it for as far around of empires could spread, even though the extent to which the actual people who followed the old cultures is disputed. Rome expanded as far as the terrain would permit. Every country in the world has a basically modern military, even if its economy carries on with centuries old means of production. All but a handful of legal systems in the world are derived from those of Revolutionary France, the Kaiser's Germany, 17th century England, or 19th century Spain, and the European Civil Law systems had more similarities than differences due to their common Roman origins. There are a few not particularly promising outliers that are more home grown - China, Iran and Saudi Arabia, for example, some of which have a few followers, as do various dictatorships. But, what none of the four or five main political systems that predominate in our big blue marble are up to the next challenge our world needs to overcome.
I'm basically an optimist. I think that humanity will come up with a way to maintain modernity without oil and natural gas. I think that humanity will come up with a way to cope with global warming, although I have no confidence that we will have the will to keep it from running its course. I don't think that a global nuclear war will cause our species to go extinct. I am inclined to think that our ability to detect and deal with diseases have reached a point where we can address any new disease before it does irrepairable damage to the species. I don't think that the bulk of our knowledge scientific and otherwise will be lost.
I worry more that we may reach a point where we have a global government, initially a good one, that stagnates and stifles us. Perhaps we need one to prevent global environmental threats and nuclear war from doing us in. But, we aren't anywhere close to that point now. We may be in a single global system, but it is not a united one.
But, I am not a believer in the notion that we are at "the end of history." I fully expect to see rapid change for centuries to come. The social sciences have a moving target to chase, culture constantly evolves, and the scientific and technical advances we are experiencing now will take decades or even a century to run their course.
Yes, there surely must come a time when our constant barrage of new scientific discoveries is the physical sciences slows down and eventually reaches a trickle.
We are close to that point in physics and earth sciences and inorganic chemistry and mathematics. We are coming to that point in civil engineering, chemical engineering, materials science, mechanical engineering, electrical engineering and a great many other practical fields. Computers haven't been perfected to their theoretical limits, but their capabilities are continuing to grow so fast that we will surely come very close, very soon.
There is little in mathematics today that the leaning mathematicians of 1910 could not grasp with a year or so of additional study that would make heavy use of what they already knew. General Relativity has received a lot of experimental confirmation since the theory was published by Einstein in 1915, and several different mathematically equivalent expressions, but it remains state of the art physics 95 years later. Many blanks have been filled into the Periodic Table of the Elements since Dmitri Ivanovich Mendeleev published it in more or less its current form in 1869, and we understand in much more detail why the periodic table has the properties that it does, but it remains the point of departure for modern chemistry (and he also is the man responsible for the fact that liquor is sold primarily at 80 proof strength, a cultural impact still felt today).
The standard model of particle physics, which reached something close to its current form in 1981, and has been refined in a few minor respects since then, isn't quite so musty and will probably be refined further before it reaches its final form, but its resiliance in the face of three decades of intense efforts to find deviations and refinements suggest that whatever beyond the standard model physics are still out there to be discovered may have rather modest practical impact. The Higgs boson which the Large Hadron Collider will either discover or disprove in the next couple of years, was predicted in 1964.
We haven't discovered every species of life on Earth, but for many taxonomy categories, we are very close to having done so, and the tool of genetics has imparted considerable heft to those classifications. Most of the new discoveries have been of new species of existing taxonomic categories; we may well have discovered ever kingdom, phylum, class and order of living thing in existence already, and almost every species of vertebrates, certainly the pace of new discoveries of higher level taxons and new species of large animals and plants is slowing. There are probably undiscovered beetle species out there, and some new deep sea organisms or microbiological organism that have yet to be discovered, but our sense of the extent and shape of the biosphere and its evolutionary origins is already quite refined.
Biotechnology is still hurling forward. It will take time to truly master the genome and the epigenome, to really have a firm grasp of how the brain works, and to master the manipulation of it all. But, we are getting much closer.
The blue print in the form of the complete genome is available on computers already for a host of organisms including many people, even if we don't yet know how to read it fluently. We've identified the functions many key working parts. We know a great deal about the overall structure. We've identified a large share of the genome that is common to every human being alive today, and can even trace which parts changed from our common ancestor with chimpanzees and from our Neanderthal predecessors.
We have working models of the structure of the human brain functionally and involving its major subsystems, of the role of the major components of our brain chemistry, of the connection between the brain and the rest of our body, of the way that neurons bind themselves into networks, of the process by which memory works, of the extent to which knowledge is hardwired or learned, of the connections between our senses and our experience of them. We have some sense of how gender and age influences the brain. We have some sense of the variety of mental conditions that can arise, their usual course, and their causes. We may understand the brain far less well than we understand bacterial infections, the heart, or the liver, but for the most part, it seems as if our work is cut out for us. We don't know everything, but we have a pretty good sense of what we don't know and have to discover.
One time pipe dreams, like cures for many kinds of cancers, cures for nervous system disorders, cures for autoimmune diseases, workable treatments for many kinds of mental illnesses, treatments for hereditary blindness and more, while not yet attained, seem attainable. Science has brought us an understanding of many diseases that is exquisitely detailed. Even when we can't cure a disease yet, we often know with excruciating specificity how it works. The implications of those developments for our society may be profound.
Even if we don't learn all that science can offer us, we may reach a point where what we don't know is largely esoteric and is only of intellectual curiousity, with little practical impact, sooner. While history may not be at an end, I wouldn't be surprised if the pace of progress in the physical sciences and engineering in 2110 is much slower than it is now. By 2210, it will probably have reached a crawl indeed, simply because we will already know so much at that point. Perhaps at that point a certain amount of political stagnantion will be acceptable, because the world will stop changing so fast and our leaders can be enlightened by a large corpus of established scientific knowledge.
We are a fortunate people. We live in a time of technological virtuosity, global communication, remarkable tolerance, greater scientific knowledge than any generation that came before us, awareness of many of the threats to our way of life that loom ahead, economic prosperity viewed in the long term, and a point in time when many countries are liberal democracies, when many countries that were once brutal tyrannies, like the Soviet Union and early Communist China, have changed course, and when truly abysmal countries, like North Korea, are small and isolated and perhaps even on the brink of collapse.
We are, in other words, at a point when we have a great deal to lose.
The risks are not abstract. Our era of toleration has brought out into the open the identities of people who would have been persecuted or killed in all prior eras.
Widespread gay marriage and the end of "Don't Ask, Don't Tell," developments that have already been matched in much of the Western world, mean that hosts of gays and lesbians who would have lived secret lives in the past are now out and public. This openness has brought with it acceptance, toleration, and access to legal recognition with all of its blessings. But, openness comes with the price of constant vigilance. Once out of the closet, a population that will always be in the minority must rely forever after on the good will of a liberal democratic society to protect it.
Gays aren't the only ones taking a risk. Many people have come out publicly as atheists, and if religion becomes established, as democratic pressure has often pushed governments to do, they risk persecution as heretics and apostates and blasphemers.
Pagans are in much the same position. The Denver Post on Christmas Day informed us that 45 people in Haiti had been murdered based on accusations of witchcraft connected with its recent cholera outbreak.
Anti-Islamic sentiment has swept much of Europe, and lurks ready to turn ugly in America.
It would take a greater calamity to set back human knowledge, science and technology, but would it really take an event so unthinkable? As glorious as a digitized world may seem, the collapse of the Internet remains far more possible than the sack of every decent university and public library on earth. But, if the libraries no longer have independent copies of the knowledge upon which we rely, and if the technologies we use to access our data cease to be supported, for any of a myriad of reasons, before the data those technologies make us privy to is converted to new media, how much could we lose?
It is hard to believe that we could ever see a setback as dire as the one that followed the collapse of the Roman Empire. The college textbooks lurking in the basements of hundreds of thousands of middle class families could reconstruct science, if not to the standards of 2010, at least to the state of the art of the 1980s. It might take a long time to rebuild the particle accelerators necessary to prove the existence of quarks, weak force carrier bosons and neutrinos, and their properties, from first principles, and to return to the point where we could answer the remaining open questions of physics. But, a backpack full of graduate school textbooks (or even a binder full of relevant Wikipedia printouts) would make the task far less arduous than it was the first time. Not many people know how to run an iron foundry, but again, if just one decent university library of the thousands in the United States were to survive, we could learn to do so much more efficiently.
We could easily lose many of the primary documents that back up the story of history. But, there are enough histories out there that it is hard to believe that the broad outlines, and even references to many of the key primary sources from which those histories could be reconstructed, will ever again be lost.
I'd sleep a little better if someone were consciously working out a way to ensure that this knowledge survived longer -- printing out a massive library on bronze plates in a dozen different languages, maybe -- but even a truly global apocalyptic event would have to be extremely thorough to set back the state of knowledge too far.
It might take a few decades to reconstruct the infrastructure necessary to manufacture decent computers, or to mass produce precision stainless steel appliances, or to generate advanced medicines with high levels of purity at reasonable costs. But, there are quite a few copies of the key information out there, available in the kind of technical detail found in patent applications, for example.
More intangibly, how deeply can the modern worldview be suppressed? There have certainly been episodes of history that make one worry. Who in late tsarist Russia, on the verge of Western style liberal democracy, could have foreseen Stalinist massacres? Who in the collapsing Chinese empire could have foreseen the Cultural Revolution? The example of Korea is particularly troubling because it was so recent. Dean Koh of the Harvard Law School recounts cities of people with eyes as listless as those of Orwell's 1984, in the here and now, in North Korea, which has been divided from the South for less than sixty years. The return of Albanian society to a medieval state after a similarly brief period of isolation likewise leaves one with qualms.
Yet, Cambodia, while hardly a model of national success, seems to have returned to some semblance of normalcy even after Pol Pot's massacre of the entire bourgeois class in his society. Albania is being reintegrated into Europe. The death of Soviet style communism was traumatic for a generation, but its successor regime seems to be emerging and, tentatively, is not a total nightmare.
If progress really is inevitable in the long run, despite dire and painful setbacks, then perhaps economic development in places that have not yet experienced its benefits may also be more or less inevitable. Surely, results achieved from scratch in one place can be secured elsewhere, where they need only be copied, somewhat more easily than they came about in the first place. Or, can they?
Could it be that the centuries of ruthless capital punishment that prevailed across Europe were necessary to build a more tame society? Could it be that modern nation states are not really stable unless they arise gradually and naturally from smaller principalities, together with all the attendant wars and nationalist strife, rather than from a uniform, orderly process of colonization? Could it be that liberal democracy has functioned well enough for the last couple of centuries (and for far less time in much of the world) because it has been sustained by economic growth that has actually only been a coincidental fellow traveler with it?
Historically, mankind has managed to make progress, even as one empire or another has collapsed, by passing the torch of knowledge and tolerance and innovation from one place to another. But, in an increasingly global society, we are increasingly placing all of our eggs in one basket. The collapse of China, or a subcontinental nuclear war between Pakistan and India, or the stagnation of American society, or economic collapse in the European Union due to mismanaged public finance, is no longer the kind of event that the rest of the world can ignore. A major collapse in one part of the world drags the rest of the world down with it to some extent.
In World Wars I and II, Switzerland managed to stay neutral and above the fray. In the next global calamity, there may be no Switzerlands.
The political-economies of the world evolve in very path dependent ways. When something works better than what came before it, it sweeps the "known world" of its time. The invention of farming obliterated the cultures that preceded it for as far as empires could spread, even though the extent to which the actual people who followed the old cultures were displaced is disputed. Rome expanded as far as the terrain would permit. Every country in the world has a basically modern military, even if its economy carries on with centuries old means of production. All but a handful of legal systems in the world are derived from those of Revolutionary France, the Kaiser's Germany, 17th century England, or 19th century Spain, and the European civil law systems have more similarities than differences due to their common Roman origins. There are a few not particularly promising outliers that are more home grown - China, Iran and Saudi Arabia, for example - some of which have a few followers, as do various dictatorships. But, what if none of the four or five main political systems that predominate on our big blue marble are up to the next challenge our world needs to overcome?
I'm basically an optimist. I think that humanity will come up with a way to maintain modernity without oil and natural gas. I think that humanity will come up with a way to cope with global warming, although I have no confidence that we will have the will to keep it from running its course. I don't think that a global nuclear war will cause our species to go extinct. I am inclined to think that our ability to detect and deal with diseases has reached a point where we can address any new disease before it does irreparable damage to the species. I don't think that the bulk of our knowledge, scientific and otherwise, will be lost.
I worry more that we may reach a point where we have a global government, initially a good one, that stagnates and stifles us. Perhaps we need one to prevent global environmental threats and nuclear war from doing us in. But, we aren't anywhere close to that point now. We may be in a single global system, but it is not a united one.
But, I am not a believer in the notion that we are at "the end of history." I fully expect to see rapid change for centuries to come. The social sciences have a moving target to chase, culture constantly evolves, and the scientific and technical advances we are experiencing now will take decades or even a century to run their course.
Yes, there surely must come a time when our constant barrage of new scientific discoveries in the physical sciences slows down and eventually dwindles to a trickle.
We are close to that point in physics and earth sciences and inorganic chemistry and mathematics. We are coming to that point in civil engineering, chemical engineering, materials science, mechanical engineering, electrical engineering and a great many other practical fields. Computers haven't been perfected to their theoretical limits, but their capabilities are continuing to grow so fast that we will surely come very close, very soon.
There is little in mathematics today that the leading mathematicians of 1910 could not grasp with a year or so of additional study that would make heavy use of what they already knew. General Relativity has received a lot of experimental confirmation since the theory was published by Einstein in 1915, and has been given several different mathematically equivalent formulations, but it remains state of the art physics 95 years later. Many blanks have been filled in on the Periodic Table of the Elements since Dmitri Ivanovich Mendeleev published it in more or less its current form in 1869, and we understand in much more detail why the periodic table has the properties that it does, but it remains the point of departure for modern chemistry (and Mendeleev is also the man responsible for the fact that liquor is sold primarily at 80 proof strength, a cultural impact still felt today).
The standard model of particle physics, which reached something close to its current form in 1981 and has been refined in a few minor respects since then, isn't quite so musty, and will probably be refined further before it reaches its final form. But its resilience in the face of three decades of intense efforts to find deviations and refinements suggests that whatever beyond-the-standard-model physics is still out there to be discovered may have rather modest practical impact. The Higgs boson, which the Large Hadron Collider will either discover or rule out in the next couple of years, was predicted in 1964.
We haven't discovered every species of life on Earth, but for many taxonomic categories, we are very close to having done so, and the tool of genetics has imparted considerable heft to those classifications. Most of the new discoveries have been of new species within existing taxonomic categories; we may well have discovered every kingdom, phylum, class and order of living thing in existence already, and almost every species of vertebrate. Certainly, the pace of new discoveries of higher level taxons, and of new species of large animals and plants, is slowing. There are probably undiscovered beetle species out there, and some deep sea organisms or microbiological organisms that have yet to be discovered, but our sense of the extent and shape of the biosphere and its evolutionary origins is already quite refined.
Biotechnology is still hurtling forward. It will take time to truly master the genome and the epigenome, to really have a firm grasp of how the brain works, and to master the manipulation of it all. But, we are getting much closer.
The blueprint in the form of the complete genome is available on computers already for a host of organisms, including many people, even if we don't yet know how to read it fluently. We've identified the functions of many key working parts. We know a great deal about the overall structure. We've identified a large share of the genome that is common to every human being alive today, and can even trace which parts changed from our common ancestor with chimpanzees and from our Neanderthal predecessors.
We have working models of the structure of the human brain, functionally and in terms of its major subsystems, of the role of the major components of our brain chemistry, of the connection between the brain and the rest of our body, of the way that neurons bind themselves into networks, of the process by which memory works, of the extent to which knowledge is hardwired or learned, and of the connections between our senses and our experience of them. We have some sense of how gender and age influence the brain. We have some sense of the variety of mental conditions that can arise, their usual course, and their causes. We may understand the brain far less well than we understand bacterial infections, the heart, or the liver, but for the most part, it seems as if our work is cut out for us. We don't know everything, but we have a pretty good sense of what we don't know and have yet to discover.
One-time pipe dreams, like cures for many kinds of cancers, cures for nervous system disorders, cures for autoimmune diseases, workable treatments for many kinds of mental illnesses, treatments for hereditary blindness, and more, while not yet attained, seem attainable. Science has brought us an understanding of many diseases that is exquisitely detailed. Even when we can't cure a disease yet, we often know with excruciating specificity how it works. The implications of those developments for our society may be profound.
Even if we don't learn all that science can offer us, we may sooner reach a point where what we don't know is largely esoteric, of only intellectual curiosity, with little practical impact. While history may not be at an end, I wouldn't be surprised if the pace of progress in the physical sciences and engineering in 2110 is much slower than it is now. By 2210, it will probably have slowed to a crawl indeed, simply because we will already know so much at that point. Perhaps at that point a certain amount of political stagnation will be acceptable, because the world will stop changing so fast and our leaders can be enlightened by a large corpus of established scientific knowledge.
A Less Christian Nation
A comparison of the results of the American Religious Identification Survey from 2008 with the results in 1990 (limited to the continental U.S.) shows that the nation has gone from being 86% self-identified as Christian in 1990 to 76% in 2008, a drop of ten percentage points.
In some places, particularly in New England and the Northwest, the percentage identifying as Christian has grown particularly low as of 2008 (1990 figures in parentheses):
Vermont 55% (84%) -29
Massachusetts 61% (83%) -22
New Hampshire 62% (85%) -23
Nevada 64% (80%) -16
Washington 64% (79%) -15
Oregon 66% (77%) -12
Wyoming 67% (84%) -17
Maine 69% (85%) -16
Idaho 69% (84%) -15
Montana 70% (86%) -16
Rhode Island 74% (88%) -14
Connecticut 74% (86%) -12
Every state in New England and the Northwest was more than three-quarters Christian, and in all but a couple of cases more than four-fifths Christian, in 1990. In 2008, that had dropped to less than three-quarters in every one of those states, and to less than two-thirds in many of them. Vermont, the least Christian state in the nation, is just 55% Christian.
The Sharp Slide in Roman Catholic Identification
A significant part of the change in New England is attributable to the demise of white Catholicism. Consider the percentage of Catholics in each of the New England states (1990 figures in parentheses):
Massachusetts 34% (54%) -20
Connecticut 38% (50%) -12
Vermont 26% (37%) -11
New Hampshire 32% (41%) -9
Rhode Island 46% (62%) -16
Maine 22% (31%) -9
In 1990, half the adults in three New England states identified as Catholics. Now, there is no U.S. state where a majority of adults identify as Catholic. Every New England state has seen a drop of at least a fifth in the population share that is Catholic, a drop approaching or exceeding 30% in several states.
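The "drop of at least a fifth" is a relative decline (the share of the 1990 Catholic base that was lost), which is a different measure from the raw percentage-point differences listed in the table above. A minimal sketch of the arithmetic, using the state figures quoted above:

```python
# Relative vs. percentage-point drops in Catholic identification,
# using the New England figures quoted above as (1990 %, 2008 %).
catholic_share = {
    "Massachusetts": (54, 34),
    "Connecticut": (50, 38),
    "Vermont": (37, 26),
    "New Hampshire": (41, 32),
    "Rhode Island": (62, 46),
    "Maine": (31, 22),
}

for state, (pct_1990, pct_2008) in catholic_share.items():
    point_drop = pct_1990 - pct_2008        # drop in percentage points
    relative_drop = point_drop / pct_1990   # share of the 1990 Catholic base lost
    print(f"{state}: -{point_drop} points, {relative_drop:.0%} of the 1990 share")

# Every state lost at least a fifth of its 1990 Catholic share;
# Massachusetts lost more than a third (20/54 = 37%).
assert all((p90 - p08) / p90 >= 0.20 for p90, p08 in catholic_share.values())
```

The distinction matters: Connecticut's 12-point drop is a 24% relative decline, while Vermont's smaller 11-point drop is a nearly 30% relative decline from its lower 1990 base.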
Among non-Hispanic whites, 27% identified as Catholic in 1990, while 21% did in 2008, a drop of more than a fifth in share of the total non-Hispanic white population. Among non-Hispanic blacks, 9% identified as Catholic in 1990, while 6% did in 2008, a drop of a third in share of the total non-Hispanic black population. Among Hispanics, 66% identified as Catholic in 1990 compared to 59% in 2008, a drop of about ten percent in overall share. Among Asian-Americans, the percentage identifying as Catholic was 27% in 1990 compared to 17% in 2008, a drop of more than a third in overall share.
In every racial and ethnic category, the decline in the percentage of the population identifying as Roman Catholic has been very close to the rise in the percentage of the population identifying as non-religious. Obviously, the reality is more complicated. Many people who formerly identified as Roman Catholics now identify with some other branch of Christianity or as "Christian" generally. Many people who identified as Christians of some non-Catholic faith now consider themselves to be non-religious.
A surge in the percentage of the population that is Hispanic, a population that is much more likely to be Roman Catholic than the United States as a whole, has concealed slippage in Roman Catholicism among all ethnic groups in the United States.
The percentage of people identifying with some non-Catholic form of Christianity has been more stable for non-Hispanic Whites, for Blacks and for Hispanics, but there has been a dramatic drop in the share of Asian-Americans who identify with any form of Christianity, mostly due to growth in the share of the Asian-American population that identifies as Muslim or affiliated with an Eastern religion, in significant part due to immigration of non-Christians from South Asia and East Asia.