There Is Lots Of Matter That Is Hard To See
The rate at which galaxies spin is not a good fit for what the laws of physics would predict if the only matter in those galaxies were observable stars combined with unobservable ordinary matter in proportions similar to those seen in the Milky Way.
It turns out that you can estimate the total mass of a galaxy from its rate of spin (and from gravitational lensing effects), and that stars of a particular brightness have quite predictable masses, because a star's size is the dominant factor in how brightly it burns, and stars are actually quite simple phenomena that are easy to model from first principles using the laws of physics. So it is easy, albeit tedious, to count the stars you can see in a galaxy with a good telescope, sort them into mass classes, add up the total, adjust it for the percentage of matter in the Milky Way (for which the most accurate observations are possible, because it is so much closer) that would not be observable from the distance at which we view a distant galaxy, and then compare that estimate of the ordinary matter in the observed galaxy to the mass implied by the independent approach, from the rotation rate and gravitational lensing. Using those methods, astronomers have determined that "dark matter constitutes 80% of the matter in the universe, while ordinary matter makes up only 20%."
The methods used to make that estimate should be pretty accurate when it comes to estimating the total amount of mass in the universe, because rotation curves and gravitational lensing allow for indirect observation of lots of mass that might otherwise be undetectable. But, since the dark matter figure is derived from the total less the estimated mass of ordinary matter, if we underestimated the amount of ordinary matter in the universe, then our estimate of the amount of dark matter in the universe will be too high.
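To make that bookkeeping concrete, here is a minimal sketch of the calculation, using round, illustrative numbers rather than anyone's actual measurements. The total mass implied by a flat rotation curve is roughly v²r/G, and the dark matter fraction is simply whatever is left over once the estimated ordinary matter is subtracted.

```python
# Minimal sketch of the bookkeeping described above, with purely
# illustrative round numbers for a Milky-Way-like galaxy.

G = 6.674e-11              # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30           # solar mass, kg
KPC = 3.086e19             # one kiloparsec, m

v_rotation = 220e3         # circular speed where the rotation curve is flat, m/s
radius = 50 * KPC          # radius out to which the curve stays flat, m
m_ordinary = 9e10 * M_SUN  # stellar census plus gas and dust correction, kg

# Total (dynamical) mass implied by the spin rate: M ~ v^2 * r / G.
m_dynamical = v_rotation**2 * radius / G
dark_fraction = 1 - m_ordinary / m_dynamical

print(f"dynamical mass: {m_dynamical / M_SUN:.1e} solar masses")  # ~5.6e11
print(f"inferred dark fraction: {dark_fraction:.0%}")             # ~84%
# If m_ordinary was underestimated (say, uncounted red dwarfs), the
# inferred dark fraction comes out too high -- which is the point above.
```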
Old Estimates Were Based On Woefully Inaccurate Data For Elliptical Galaxies
The trouble is that the amount of hard-to-observe ordinary matter in the Milky Way, a spiral galaxy, turns out to be a poor estimate of the amount of hard-to-observe matter in elliptical galaxies, which "account for one-third of the stellar mass in the universe." In elliptical galaxies, small stars called red dwarfs, which are hard to observe and about a third the size of the sun, are far more common than they are in the Milky Way.
How far off were estimates of the number of stars in the universe based on the Milky Way? A lot. New, careful observations of eight nearby elliptical galaxies reveal that there are three times as many stars in the universe as we used to think. Red dwarfs are about ten times as common in elliptical galaxies as they are in spiral galaxies like ours.
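To see how a ten-fold increase in red dwarfs in one class of galaxies can roughly triple the total star count, here is a toy calculation. Every number in it is a made-up placeholder, not a figure from the paper; it only illustrates the bookkeeping.

```python
# Toy bookkeeping: how a 10x red dwarf boost in elliptical galaxies scales
# the universe-wide star count. Every number here is a placeholder.

dwarfs_per_other_spiral = 3          # placeholder: red dwarfs per other star in spirals
dwarfs_per_other_elliptical_old = 3  # old assumption: ellipticals look like spirals
dwarfs_per_other_elliptical_new = 30 # new observations: roughly ten times as many

# Placeholder split of the non-dwarf stars between the two galaxy types.
other_stars_spiral = 2.0
other_stars_elliptical = 1.0

def total_stars(dwarfs_per_other_elliptical):
    spirals = other_stars_spiral * (1 + dwarfs_per_other_spiral)
    ellipticals = other_stars_elliptical * (1 + dwarfs_per_other_elliptical)
    return spirals + ellipticals

old = total_stars(dwarfs_per_other_elliptical_old)
new = total_stars(dwarfs_per_other_elliptical_new)
print(f"star count multiplier: {new / old:.1f}x")  # about 3x with these placeholders
```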
Implications For Cosmology
It isn't entirely clear to me from the report that I have read whether the implication is that there is considerably more ordinary matter in the universe relative to dark matter than had previously been presumed, or whether, instead, it simply means that the amount of ordinary matter is the same and that the share of it that comes from small stars relative to the share that comes from large stars was grossly incorrect.
Either way, this has a profound effect on cosmology models.
If the amount of dark matter in the universe has been greatly overestimated, then dark matter candidates like massive neutrinos and interstellar hydrogen-helium gas, which had seemed insufficient to explain a very meaningful share of the dark matter we thought we had, may start to look more attractive, and the target mass for any undiscovered kind of particle that produces dark matter effects may be smaller than previously anticipated, which reduces the power of the laboratory experiments that would be necessary to detect those particles. The authors of the paper imply, but don't definitively state, that this is the case.
If the implication is simply that the proportion of stars of different types was wrong, then "the early history of the cosmos may need a rewrite, perhaps doubling previous estimates of the total mass of stars in many of the universe’s first, massive galaxies. If so, those early galaxies would have forged stars at a much more prodigious rate." This could have a great impact on concepts about how the universe arose, like "inflation," that help explain the disconnect between a naive model, in which the rate of expansion of the universe from the Big Bang is constant, and the larger actual size of the universe that we observe. It could also force a rethink of the distribution of mass in the very early universe. Some physicists looking at other data, in the observed cosmic background radiation, are already starting to do some of this rethinking, based on observations that could suggest that the universe experiences cycles of expansion and collapse over and over again, rather than the Big Bang representing a beginning of space-time itself. That, too, may turn out to be the case.
Why Care?
Data that help us understand the nature of dark matter and the early history of the cosmos are discoveries with implications beyond the realm of astronomy. Theoretical physicists use information from astronomers about dark matter and the history of the cosmos to distinguish between theories about fundamental physical laws, going beyond the Standard Model of Particle Physics and General Relativity, that could be true, and those that are ruled out by empirical evidence. For example, if your theory predicts an undiscovered form of dark matter particle that should account for 90% of all of the matter in the universe, then you know that your theory is wrong. Similarly, if your theory predicts that it would have taken ten billion years for the first stars to emerge from the Big Bang, you again know that your theory is wrong.
This sounds mushy. The number of theories a person can create is seemingly infinite, and slicing off a smaller chunk of an infinite number of theories to consider doesn't leave you much better off than you were when you started. And, to some extent, that's true. There is even a term in string theory for all the different possible versions of that very specific kind of theory that could exist, called "string vacua." But it is less mushy than it seems, because those aren't the only constraints on theory building.
In any theory, you must first propose rules that are consistent with all laws of physics that are observable in laboratories and at scales involving the solar system or smaller. A century of quantum physics experiments, detailed observations from outer space within the solar system, and the like have made those boundaries quite confining.
For example, if you have a theory that predicts that there are no charge-parity symmetry violations in the universe, or that quantum tunnelling effects don't occur, or that neutrinos don't have mass, or that any of the 37 observed particles in the Standard Model doesn't exist or has properties that differ, beyond the margin of error, from the charges, masses, coupling constants and weak force interaction matrices that are observed in real life, then you know that your theory is wrong.
Essentially, the only places where there is room to tweak what we know from the laboratory are gravitation-like or space-time structure effects that are too faint to be observed at sub-galactic distances (for example, by adding a new term to an existing equation with only modest impact), and new particles that require too much energy to create, are too ephemeral, or interact too weakly with other particles at laboratory scales (where gravity is effectively impossible to measure) to have been detected in experiments to date. In other words, one pretty much has to come up with a theory that is equivalent to the Standard Model and general relativity with some extra tweaks that are observable only in highly specific circumstances and that don't lead to impossible inconsistencies. For example, hypothesized dark matter particles called WIMPs, if they exist, should have a mass of about 8 GeV +/- 1 GeV, given the data to date and the models of how they should act, which should be observable at the Large Hadron Collider within the next decade if they exist.
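As a generic illustration of what such a gravity-side tweak can look like, here is a MOND-style toy (named plainly; it is not anything proposed in the paper discussed here): an extra effect that only matters when gravitational accelerations fall below a tiny threshold, so it is utterly negligible in the solar system and the laboratory but substantial at the edges of galaxies.

```python
import math

# MOND-style toy, for illustration only: Newtonian gravity plus a tweak
# that kicks in when accelerations fall below a tiny threshold A0, using
# the "simple" interpolating function mu(x) = x / (1 + x).

G = 6.674e-11      # m^3 kg^-1 s^-2
M_SUN = 1.989e30   # kg
AU = 1.496e11      # m
KPC = 3.086e19     # m
A0 = 1.2e-10       # m/s^2, the hypothetical threshold acceleration

def newtonian(mass, r):
    return G * mass / r**2

def tweaked(mass, r):
    # Solves a * mu(a / A0) = a_newton for the observed acceleration a.
    a_n = newtonian(mass, r)
    return 0.5 * (a_n + math.sqrt(a_n**2 + 4 * a_n * A0))

# At Earth's orbit the tweak shifts the prediction by about 2 parts in 10^8.
print(f"solar system shift: {tweaked(M_SUN, AU) / newtonian(M_SUN, AU) - 1:.1e}")

# 30 kpc out from a 10^11 solar-mass galaxy it boosts the pull by roughly 3x.
m_gal, r_gal = 1e11 * M_SUN, 30 * KPC
print(f"galactic outskirts boost: {tweaked(m_gal, r_gal) / newtonian(m_gal, r_gal):.1f}x")
```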
We also know, to a considerable extent, what the discrepancies look like between what we observe, mostly in astronomical observations from which we infer dark matter and dark energy effects, and what our current theories, applied naively, predict. In other words, we know to a great extent how the tweaks should change the existing theories.
As a result, from a practical perspective, theories need to be variations on one of a dozen or so approaches out there, most of which can be lumped into half a dozen or so kinds of variations on string theory, each of which has been described in theoretical terms with some specificity, or half a dozen or so non-string theory approaches.
In any case, since the cosmology work done in the last few decades relied heavily on efforts to develop mathematical models of the Big Bang that would produce a universe that looks like ours, and the target of a universe that looks like ours turns out to have been significantly flawed based on the new observations of the number of red dwarfs that are present in elliptical galaxies, the cosmological models that produce the best fit to the old data are surely materially wrong, and some other models that had appeared to be ruled out by the data may now look more attractive.
Also, since the physical laws that apply to the dynamics of the universe are better understood the closer you get to conditions like the ones that exist today, most of the differences between the theories involve what happens in the very earliest stage of the Big Bang, where it is hardest to determine the laws of physics in laboratory conditions or by viewing similar situations today in telescopes. For a given distribution of matter a few hundred thousand years after the Big Bang (it happened about 14 billion years ago, give or take half a billion years), all of the models will predict that the universe evolved from there in almost the same way, apart from modest differences involving the way in which they model "dark energy," a concept that is largely observable only in the rate at which the universe is expanding. But slight tweaks in initial conditions, or in the laws of particle physics in extremely high energy situations, can greatly alter how the models play out, in ways that are not ruled out empirically unless the end result of a fourteen-billion-year simulation looks much different from the real universe in some important respect not attributable simply to random chance aspects of the model.
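A minimal sketch of that last point, using the standard flat-universe Friedmann equation with round, illustrative parameter values (not fitted ones): the dark energy term shows up almost entirely in the expansion history, summarized here by the age of the universe each version implies.

```python
import math

# Flat-universe Friedmann equation, H(a) = H0 * sqrt(Om / a^3 + OL),
# integrated to get the age of the universe with and without dark energy.
# Parameter values are round illustrative numbers, not fitted ones.

H0 = 70.0                       # Hubble constant, km/s/Mpc
KM_PER_MPC = 3.086e19           # kilometers in one megaparsec
H0_PER_SEC = H0 / KM_PER_MPC    # Hubble constant in 1/s
GYR = 3.156e16                  # seconds in a billion years

def age_gyr(omega_matter, omega_lambda, steps=100_000):
    """Age = integral of da / (a * H(a)) from a ~ 0 to a = 1 (midpoint rule)."""
    total, da = 0.0, 1.0 / steps
    for i in range(steps):
        a = (i + 0.5) * da
        h = H0_PER_SEC * math.sqrt(omega_matter / a**3 + omega_lambda)
        total += da / (a * h)
    return total / GYR

print(f"with dark energy (0.3, 0.7): {age_gyr(0.3, 0.7):.1f} Gyr")  # ~13.5 Gyr
print(f"matter only      (1.0, 0.0): {age_gyr(1.0, 0.0):.1f} Gyr")  # ~9.3 Gyr
```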
Better understood fundamental laws of physics, if we could find them, might tell us something we wouldn't otherwise have known about how to build things or make things work, exploiting physical laws or particles that we hadn't previously known existed. They could also simply make existing engineering calculations more accurate in cases where measurement error obscures results that could be obtained if you could use mathematical equations to calculate a result from first principles with fewer, more accurately known, empirically determined constants (e.g., the speed of light, the strength of the electrical force, the mass of a particular kind of quark). The current theories are very accurate in many situations, but they reach results based on dozens of empirically measured quantities, when the intuition of almost every scientist and educated layman who has looked at the issue is that a large number of these could be derived, by means as yet unknown, from a much smaller number of empirically measured physical constants (perhaps half a dozen or fewer) related by formulas that are not yet known, despite several decades of trying by hundreds, if not thousands, of the smartest people in the world working together on the problem.
UPDATED based on comments by Karl on December 6, 2010.
Authors do say this would reduce the estimates of dark matter. But they hedge it, I think, because there are enough complications to deal with in proposing an "initial mass function" that describes something that might have happened 10 billion years in the galaxies' past. Last line of the paper: "The bottom-heavy IMF advocated here may also require a relatively low fraction of dark matter within the central regions of nearby massive galaxies"
Understatement is an art in physics.
The reason the IMF gives a fraction is because in the model the total mass of each galaxy is estimated from gravitational lensing effects (Ref. 22 of the paper.)
I should have said the authors avoid saying it would mean less dark matter. But that's the implication.
lots-of-dark-matter-actually-just-dim
Ooh. The Republican Party.
Thanks Karl. You've answered exactly one of my key questions about this paper.
Given its immense implications, and seemingly very solid empirical basis, I'm surprised that it hasn't received more fanfare.
Footnote: The standard estimate of regular matter and dark matter is 4.6% and 23% respectively. If the amount of regular matter is tripled and the amount of combined dark and regular matter remains the same, this leaves a result of 13.8% regular matter and 13.8% dark matter, a surprising 50-50 correspondence that might fit a scenario in which half of the matter in the universe is ordinary and half of it is made of invisible SUSY particles, for example.
Alternately, it might support a scenario in which half of the mass in the universe comes from quark matter and half comes from lepton matter, with the leptonic portion composed predominantly of neutrinos. This result could flow, perhaps, from the democratic nature of the W particle, which does not favor quark over lepton decays given sufficient energy.
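For what it's worth, here is the footnote's arithmetic spelled out; the 4.6% and 23% starting figures are the standard budget quoted there, and the tripling is the hypothetical suggested by the new red dwarf counts.

```python
# The footnote's arithmetic: triple the ordinary matter while holding the
# combined matter budget fixed, and see what is left for dark matter.

ordinary = 4.6                    # percent of the total, standard figure
dark = 23.0                       # percent of the total, standard figure
total_matter = ordinary + dark    # 27.6 percent

new_ordinary = 3 * ordinary               # 13.8 percent
new_dark = total_matter - new_ordinary    # 13.8 percent

print(f"{new_ordinary:.1f}% ordinary, {new_dark:.1f}% dark")  # the 50-50 split
```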