
Friday 31 March 2023


Creationism in Crisis

The Fallacy of the Fine-Tuned Universe

Great Mysteries of Physics 2: is the universe fine-tuned for life?

It won't be long in any debate forum with Creationists, or even with theists who are not Bible literalists but still like to think their magic invisible friend created the Universe with them in mind, before someone claims the Universe is fine-tuned for life. The claim is that there are about 30 constants, all of which have to have exactly the values they have for intelligent life to exist.

There are, of course, many problems with this claim, most of which the person making it will be completely unaware of, having read only what they want to believe and nothing that might contradict their 'faith'.

The main ones are:
  • In effect, they are arguing that their putative creator god can only create life within very narrow parameters, yet an omnipotent god who allegedly created the 'rules' should be capable of creating anything it wants within any given set of parameters. So, is the creator constrained by natural laws and incapable of working outside its limitations? If so, who or what set those limitations?

    The fine-tuning argument is actually an argument against the existence of an omnipotent creator god, not for one.
  • Discussions about the existence of intelligent life can only be conducted in a Universe in which intelligent life exists; therefore, the fact that the debate is taking place means the Universe must be capable of giving rise to intelligent life. Trying to work out the probability of something happening that has already happened is statistical nonsense. The probability is 1 (certainty).
  • Assuming those constants could have a range of possible values (an assumption for which no evidence is ever presented), in order to calculate the probability of a constant having the value it has in this Universe, we would need to examine a large sample of universes. That, of course, is impossible, so, for all we know, the probability of any constant having its current value may be 1, i.e. it might not be capable of having any other value. The claim that its current value is improbable is merely an assumption - a claim made without evidence, which can be dismissed without evidence.
  • The vast majority of this Universe is highly hostile to life as we know it. Even in this planetary system, life can only exist on one planet, and intelligent (human) life can only survive, without special equipment, on a fraction of the surface of this planet and within a few thousand feet of that surface. So, far from being fine-tuned for life, almost all of the Universe is fine-tuned to make life impossible.
  • Earth is not particularly well designed for human life (which is what Creationists mean by 'intelligent life'). It is tectonically active, which means it is subject to frequent natural disasters such as earthquakes, volcanic eruptions and tsunamis; humans cannot survive for long without special equipment in the oceans, on the tops of mountains, in deserts or at the poles. Earth is subject to occasional cosmic disasters such as meteorite strikes, and it orbits a sun which will one day destroy it by turning into a red giant, so intelligent life such as that on Earth will only exist for a fraction of the time the Universe will exist.
  • There are very many more black holes in the Universe than there are humans, so it would be more logical to argue that the Universe is fine-tuned for making black holes.
In the following article, reprinted from The Conversation under a Creative Commons licence, Miriam Frankel, podcast host, interviews Professor Fred Adams, Professor of Physics, University of Michigan, USA, and Professor Paul Davies, Professor of Physics, Arizona State University, USA, on the subject of fine-tuning. The article includes a podcast of the interview. The article has been reformatted for stylistic consistency. The original may be read here.



Great Mysteries of Physics 2: is the universe fine‑tuned for life?

Our universe is just right for structure such as galaxies, planets and life to form.

Credit: NASA/James Webb Telescope

Miriam Frankel, The Conversation
Imagine a universe with extremely strong gravity. Stars would be able to form from very little material. They would be smaller than in our universe and live for a much shorter amount of time. But could life evolve there? It took human life billions of years to evolve on Earth under the pleasantly warm rays from the Sun after all.

Now imagine a universe with extremely weak gravity. Its matter would struggle to clump together to form stars, planets and – ultimately – living beings. It seems we are pretty lucky to have gravity that is just right for life in our universe.

This isn’t just the case for gravity. The values of many forces and particles in the universe, represented by some 30 so-called fundamental constants, all seem to line up perfectly to enable the evolution of intelligent life. But there’s no theory explaining what values the constants should have – we just have to measure them and plug their numbers into our equations to accurately describe the cosmos.

So why do the fundamental constants take the values they do? This is a question that physicists have been battling over for decades. It is also the topic of the second episode of our new podcast series, Great Mysteries of Physics – hosted by me, Miriam Frankel, science editor at The Conversation, and supported by FQxI, the Foundational Questions Institute.

“We don’t know whether some of those constants are linked deep down. If we had a deeper theory, we’d find that they’re not actually independent of each other,” explains Paul Davies, a theoretical physicist at Arizona State University. “But we don’t have that theory at the moment, we’ve just got all these numbers.”

Some physicists aren’t too bothered by the seemingly fine-tuned cosmos. Others have found comfort in the multiverse theory. If our universe is just one of many, some would, statistically speaking, end up looking just like ours. In such a universe, says Davies, “beings will pop up and marvel at the fact that they live in a universe that looks like it’s rigged in favour of their existence, but actually we’re just winners in a cosmic lottery.”

But many physicists, including Davies, are holding out for a more fundamental theory of nature which can explain exactly what values the constants should have in the first place. “I usually say two cheers for the multiverse, cause I think it’s better than just saying God did it,” he argues, adding that to get to three cheers you need a more complete theory.

That said, in the absence of a deeper theory, it is hard to estimate exactly how fine-tuned our universe is. Fred Adams, a physicist at the University of Michigan, has done a lot of research to try to find out, and he has discovered that the mass of a quark called the down quark (quarks are elementary particles which make up the atomic nucleus, for example) can only change by a factor of seven before rendering the universe, as we know it, lifeless.

But how fine tuned is that? “If you want to tune a radio, you have to know the frequency of the signal to 1% – and 1% is much more tuned than a factor of seven,” explains Adams. “So it’s much harder to tune a radio than to tune a universe”. Intriguingly, his work has also shown it is possible to get universes that are more life-friendly than ours. “You can make a more logical universe that produces more structure, potentially produces more habitable environments, and I guess by implication supports life better,” he explains.

There are experiments which could help settle the fine-tuning debate. For example, some projects are trying to find out whether the constants we see around us really are constant – perhaps they vary ever so slightly over time or space. And if that were the case, it would be a blow to those who believe the cosmos is fine-tuned.

You can also listen to Great Mysteries of Physics via any of the apps listed above, our RSS feed, or find out how else to listen here. You can also read a transcript of the episode here.
Miriam Frankel, Podcast host, The Conversation

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Published by The Conversation.
Open access. (CC BY 4.0)
Victor J. Stenger
To complete this blogpost, I'll quote the physicist Victor J. Stenger, from his book The New Atheism: Taking a Stand for Science and Reason. This summarises Stenger's much more detailed rebuttal of the fine-tuning fallacy in his book The Fallacy of Fine-Tuning:
Scientific Arguments
A list of thirty-four parameters that seem to be fine-tuned has been assembled by Rich Deem on the God and Science Web site [complete with the 'donate' button which seems to be traditional on Creationist Web sites]. His main reference is physicist and Christian apologist Hugh Ross and his popular book The Creator and the Cosmos, first published in 1993. Ross is the founder of Reasons to Believe, which is a self-described "international and interdenominational science-faith think tank providing powerful new reasons from science to believe in Jesus Christ." A long list of claimed "design evidences" can be found on its Web site.

Several of Deem's and Ross's constants, such as the speed of light in a vacuum, c, Newton's constant of gravity, G, and Planck's constant, h, are just arbitrary numbers that are determined simply by the unit system you are using. They can be set equal to any number you want, except zero, with no impact on the physics. So no fine-tuning can possibly be involved, just as the number pi is not fine-tuned.

Deem does not actually mention h explicitly, but it comes in when he talks about the "magnitude of the Heisenberg uncertainty principle" being fine-tuned so that oxygen transport to the cells is just right. The magnitude of the uncertainty principle is simply Planck's constant, h (technically, h divided by 4π). We can safely assume that life evolved in such a way that the energy transport was just right, adjusting to the physical parameters such as they are.

I will focus on the five parameters that have the most significance because, if interpreted correctly, they pretty much rule out almost any conceivable kind of life without fine-tuning. Copying, with minor modifications, the table from Deem:

Table 4.1 Five parameters that seem to be the most highly fine-tuned for life to exist
Parameter                                    Max. deviation
Ratio of electrons to protons                1 part in 10^17
Ratio of electromagnetic force to gravity    1 part in 10^40
Expansion rate of the universe               1 part in 10^55
Mass density of the Universe*                1 part in 10^59
Cosmological constant                        1 part in 10^120
*Deem says mass here but, based on his further discussion, I infer he means density.


Deem does not give any references to scientific papers showing calculations for the "maximum deviations" listed in the table. However, I will admit that the features a universe would have for slightly different values of these parameters, all other parameters remaining the same, would render our form of life impossible. Indeed, this extends to any form of life even remotely like ours, that is, one that is based on a lengthy process, chemical or otherwise, by which complex matter evolved from simpler matter. This is the main reason these parameters are significant. As we will see below, in all the other examples people give, some form of life is still possible, just not our form specifically.

Let me discuss each in turn. Note that the arguments all apply to our universe where we assume that none of the laws of physics are different. All that is different is the value of some of the numbers we put into those laws, so we are not making any assumption about other universes.

Ratio of Electrons to Protons
The claim is that if the ratio of the numbers of electrons to protons were larger, electromagnetism would dominate over gravity and galaxies would not form. If smaller, gravity would dominate and chemical bonding would not occur. This assumes the ratio is some arbitrary constant. In fact, the number of electrons equals exactly the number of protons for a very simple reason: the universe is electrically neutral, so the two, having opposite charges, must balance.

Here is a clear but slightly technical explanation for how this all came about in the early universe from a book by astronomer Peter Schneider:
Before pair annihilation [the time when most electrons annihilated with antielectrons, or positrons, producing photons] there were about as many electrons and positrons as there were photons. After annihilation nearly all electrons were converted into photons - but not entirely, because there was a very small excess of electrons over positrons to compensate for the positive electric charge density of the protons. Therefore the number density of electrons that survive the pair annihilation is exactly the same as the number density of protons, for the Universe to remain electrically neutral.

So, no fine-tuning happened here. The ratio is determined by conservation of charge, a fundamental law of physics.

Ratio of Electromagnetic Force to Gravity
The source of the huge difference in strengths between the electromagnetic and gravitational forces of a proton and an electron, N1 = 10^39, has been a long-standing problem in physics, first mentioned by Hermann Weyl in 1919. If they were anywhere near each other, stars would collapse long before they could provide either the materials needed for life on planets or the billions of years of stable energy needed for life to evolve. Why is N1 = 10^39? You would expect a natural number to be on the order of magnitude of 1.

But note that N1 is not some universal measure of the relative strengths of the electric and gravitational forces. It's just the force ratio for a proton and an electron. The proton is not even an elementary particle. It is made of quarks. For two electrons the ratio is 10^42. If we have two unit-charged particles, each with a mass of 1.85 x 10^-9 kilograms, the two forces would be equal!
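For anyone who wants to check Stenger's arithmetic here, the ratio he calls N1 is just Coulomb's law divided by Newton's law of gravity for the chosen pair of particles; the separation cancels, leaving ke²/(Gm₁m₂). The following short Python sketch (my own illustration, using standard CODATA constants, not Stenger's code) reproduces the three numbers quoted above:

import math

k = 8.9875517923e9    # Coulomb constant, N m^2 C^-2
G = 6.67430e-11       # Newtonian gravitational constant, N m^2 kg^-2
e = 1.602176634e-19   # elementary charge, C
m_p = 1.67262192e-27  # proton mass, kg
m_e = 9.1093837e-31   # electron mass, kg

def force_ratio(m1, m2):
    """Coulomb force / gravitational force for two unit charges; r^2 cancels."""
    return (k * e**2) / (G * m1 * m2)

print(f"proton-electron  : {force_ratio(m_p, m_e):.2e}")   # ~2 x 10^39
print(f"electron-electron: {force_ratio(m_e, m_e):.2e}")   # ~4 x 10^42

# Mass at which the two forces balance for a pair of unit-charged particles:
print(f"mass for equal forces: {math.sqrt(k * e**2 / G):.2e} kg")  # ~1.9 x 10^-9 kg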

So the large value of N1 is simply an artifact of the use of small masses in making the comparison. The force ratio is hardly "fine-tuned to 39 orders of magnitude," as you will read in the theist literature. The reason N1 is so large is that elementary particle masses are so small. According to our current understanding of the nature of mass, elementary particles all have zero bare mass and pick up a small "effective mass" by interacting with the background Higgs field that pervades the universe. That is, their masses are "naturally" very small.

If particle masses were not so small, we would not have a long-lived, stable universe and wouldn't be here to talk about it. This is an expression of the anthropic principle. But note that I am not invoking it; I am explaining it.

Several physicists, including myself, have done computer simulations where we generate universes by varying all the relevant parameters. In my case I randomly varied the electromagnetic force strength, the mass of the proton, and the mass of the electron by ten orders of magnitude around their existing values in our universe. The gravitational force strength was fixed. That is, I allowed the ratio of forces to vary from 10^34 to 10^44! These are a long way from 10^39. I found that over half of the universes generated had stars with lifetimes of at least ten billion years, long enough for life of some kind to evolve.
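Stenger's published simulations apply detailed stellar-physics criteria that the passage above only summarises, so nothing below should be read as his actual code. Purely to illustrate the kind of parameter scan he describes, here is a toy Python sketch that varies the electromagnetic strength and the two particle masses log-uniformly (my assumption of ±5 decades each, with gravity held fixed) and shows how widely the force ratio then ranges; deciding which of those universes host long-lived stars would need the stellar modelling he reports, which is not attempted here:

import math
import random

G = 6.67430e-11                       # gravitational constant (held fixed)
HBAR_C = 3.16152677e-26               # hbar * c, in J m
ALPHA_0 = 1.0 / 137.036               # observed fine-structure constant
M_P0, M_E0 = 1.6726e-27, 9.1094e-31   # observed proton and electron masses, kg

def sample_universe(rng, decades=5.0):
    """One toy 'universe': each parameter varied log-uniformly by +/- decades."""
    jitter = lambda x: x * 10 ** rng.uniform(-decades, decades)
    return jitter(ALPHA_0), jitter(M_P0), jitter(M_E0)

rng = random.Random(42)
log_ratios = []
for _ in range(100_000):
    alpha, m_p, m_e = sample_universe(rng)
    n1 = alpha * HBAR_C / (G * m_p * m_e)    # electromagnetic/gravity force ratio
    log_ratios.append(math.log10(n1))

print(f"log10(N1) ranges from about {min(log_ratios):.0f} to {max(log_ratios):.0f}")
print("our universe's value, ~39, is not a special point in that range")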

In a more recent study I have also varied the strength of the strong nuclear force and placed further limitations on the characteristics of the generated universes. For example, I ensure that atoms are much bigger than nuclei and have much lower binding energy. I have demanded that planetary days be at least ten hours long and stars be far more massive than planets. I find that 20 percent of the universes have these properties.

My study was rather simple. More-advanced studies, which reach the same basic conclusion, have been made by Anthony Aguirre, Roni Harnik, Graham D. Kribs, Gilad Perez, and Fred C. Adams.

Expansion Rate of the Universe
The fine-tuning of the expansion rate of the universe is one of the most frequent examples given by theologians and philosophers. Deem says that if it were slightly larger, no galaxies would form; if it were smaller, the universe would collapse.

This has an easy answer. If the universe appeared from an earlier state of zero energy, then energy conservation would require the exact expansion rate that is observed. That is the rate determined precisely by the fact that the potential energy of gravity is exactly balanced by the kinetic energy of matter.

Let me try to explain this in detail so that, once again, it is clear that I am merely stating a simple fact of physics. Suppose we wish to send a rocket from Earth to far outside the solar system. If we fire the rocket at exactly 11.2 kilometers per second, what is called the escape velocity for Earth, its kinetic energy will exactly equal the negative of its gravitational potential energy, so the total energy will be zero. As the rocket moves away from Earth, the rocket gradually slows down. Its kinetic energy decreases, as does the magnitude of its potential energy, the total energy remaining constant at zero because of energy conservation. Eventually when the rocket is very far from Earth and the potential energy approaches zero, its speed relative to Earth also approaches zero.

If we fired the rocket at just under escape velocity, the rocket would slow to a stop sooner and eventually turn around to return to Earth. If we fired it at a slightly higher speed, the rocket would keep moving away and never stop.
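Stenger's rocket analogy is easy to check numerically: at exactly the escape velocity, the kinetic energy and the (negative) gravitational potential energy sum to zero. A minimal Python sketch, using standard values for Earth's mass and radius (my own illustration):

import math

G = 6.67430e-11      # gravitational constant, N m^2 kg^-2
M_EARTH = 5.972e24   # kg
R_EARTH = 6.371e6    # m

# Zero total energy: (1/2) m v^2 - G M m / R = 0  =>  v = sqrt(2 G M / R)
v_escape = math.sqrt(2 * G * M_EARTH / R_EARTH)
print(f"escape velocity ~ {v_escape / 1000:.1f} km/s")   # ~11.2 km/s

# For, say, a 1000 kg rocket launched at exactly v_escape, total energy is zero:
m = 1000.0
kinetic = 0.5 * m * v_escape**2
potential = -G * M_EARTH * m / R_EARTH
print(f"KE + PE = {kinetic + potential:.3e} J")          # ~0, to rounding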

In the case of the big bang, the bodies in the universe are all receding from one another at such a rate that they will eventually come to rest at a vast distance. That rate of expansion is very precisely set by the fact that the total energy of the system was zero at the very beginning, and energy is conserved.

So, instead of being an argument for God, the fact that the rate of expansion of the universe is exactly what we expect from an initial state of zero energy is a good argument against a creator. Once again, we have no fine-tuning because the parameter in question is determined by a conservation principle, in this case conservation of energy.

Mass Density of the Universe
The claim is that if the mass density of the universe were slightly larger, the overproduction of deuterium (heavy hydrogen) in the big bang would cause stars to burn too rapidly for life on planets to form. If it were smaller, insufficient helium from the big bang would result in a shortage of the heavy elements needed for life.

The answer is the same as the previous case. The mass density of the universe is precisely determined by the fact that the universe starts out with zero total energy.

Cosmological Constant
This is considered one of the major unanswered problems in physics. The cosmological constant is a term that arises in Einstein's general theory of relativity. It is basically equivalent to the energy density of empty space that results from any curvature of space. It can be positive or negative. If positive it produces a repulsive gravitational force that accelerates the expansion of the universe.

For most of the twentieth century it was assumed that the cosmological constant was identically zero, although no known law of physics specified this. At least no astronomical observations indicated otherwise. Then, in 1998, two independent groups studying supernovas in distant galaxies discovered, to their great surprise since they were looking for the opposite, that the expansion of the universe was accelerating. This result was soon confirmed by other observations, including those made with the Hubble Space Telescope.

The component of the universe responsible for the acceleration was dubbed dark energy. It constitutes 73 percent of the total mass of the universe. (Recall the equivalence of mass and energy given by E = mc².) The natural assumption is to attribute the acceleration to the cosmological constant, and the data, so far, seem to support that interpretation.

Theorists had earlier attempted to calculate the cosmological constant from basic quantum physics. The result they obtained was 120 orders of magnitude larger than the maximum value obtained from astronomical observations.
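The "120 orders of magnitude" figure comes from comparing the naive quantum-field-theory estimate of the vacuum energy density, which is set by the Planck scale, with the dark-energy density actually observed. A rough back-of-the-envelope Python sketch (my own; depending on the assumptions, the quoted mismatch varies between about 10^120 and 10^123):

import math

G = 6.67430e-11               # m^3 kg^-1 s^-2
HBAR = 1.054571817e-34        # J s
C = 2.99792458e8              # m/s
H0 = 70 * 1000 / 3.0857e22    # Hubble constant, ~70 km/s/Mpc, in s^-1

# Naive quantum estimate: roughly one Planck mass per cubic Planck length.
m_planck = math.sqrt(HBAR * C / G)       # ~2.2e-8 kg
l_planck = math.sqrt(HBAR * G / C**3)    # ~1.6e-35 m
rho_planck = m_planck / l_planck**3      # ~5e96 kg/m^3

# Observed: dark energy is roughly 70% of the critical density.
rho_crit = 3 * H0**2 / (8 * math.pi * G)   # ~9e-27 kg/m^3
rho_dark = 0.7 * rho_crit

print(f"mismatch ~ 10^{math.log10(rho_planck / rho_dark):.0f}")   # ~10^123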

Now this is indeed a problem. But it certainly does not imply that the cosmological constant has been fine-tuned by 120 orders of magnitude. What it implies is that physicists have made a stupid, dumb-ass, wrong calculation that has to be the worst calculation in physics history.

Clearly the cosmological constant is small, possibly even zero. This can happen in any number of ways. If the early universe possessed, as many propose, a property called supersymmetry, then the cosmological constant would have been exactly zero at that time. It can be shown that if negative energy states, already present in the calculation for the cosmological constant, are not simply ignored but counted in the energy balance, then the cosmological constant will also be identically zero.

Other sources of cosmic acceleration have been proposed, such as a field of neutral material particles pervading the universe that has been dubbed quintessence. This field would have to have a negative pressure, but if it is sufficiently negative it will be gravitationally repulsive.

In short, the five greatest fine-tuning proposals show no fine-tuning at all. The five parameters considered by most theologians and scientists to provide the best evidence for design can all be plausibly explained. Three just follow from conservation principles, which argue against rather than for any miraculous creation of the universe. They can be turned around and made into arguments against rather than for God.

Answers can be given for all the other parameters on Deem's list. Not a single one rules out some kind of life when the analysis allows other parameters to vary.

The same cannot be said about other compilations of parameters that appear at first glance to be fine-tuned. Even some of the most respected scientists have made the mistake of declaring a parameter "fine-tuned" by only asking what happens when it is varied while all other parameters remain the same. A glaring example is provided by Sir Martin Rees, the Astronomer Royal of the United Kingdom. In his popular book Just Six Numbers, Rees claims that a quantity he calls "nuclear efficiency," ε, defined as the fraction by which the mass of a helium nucleus falls short of the mass of two protons and two neutrons, is fine-tuned to ε = 0.007. If ε = 0.006, deuterium would be unstable because the nuclear force would not be strong enough to keep the electrical repulsion between protons from blowing it apart. If ε = 0.008, the nuclear force would be strong enough to bind two protons together directly and there would be no need for neutrons to provide additional attraction. In that case there would be no deuterium or any other nuclei containing neutrons.

But this all assumes a fixed strength of the electromagnetic force, which is given by the dimensionless parameter α, which has a value of 1/137 in the existing universe. In the first case, for any value of α less than 1/160 the deuteron will be stable because the electrical repulsion will be too weak to split it apart. In the second case, for any value of α greater than 1/120 the electrical repulsion will be too great for protons to bind together without the help of neutrons, so deuterons and other neutron-rich nuclei will exist. So stable nuclei are possible for a wide range of the two parameters ε and α, and neither is fine-tuned for life.

Let me mention one parameter where the answer to the claim of fine-tuning is ridiculously simple. The masses of neutrinos are supposedly fine-tuned since their gravitational effects would be too big or too small if they were different and this would adversely affect the formation of stars and galaxies. But that assumes that the number of neutrinos in the universe is fixed. It is not. It is determined by their masses. If heavier, there would be fewer. If lighter, there would be more. Whatever the masses, the gravitational effects of neutrinos would be the same.

In short, there is no scientific basis for the claim that the universe is fine-tuned for life. Indeed, the whole notion makes no sense. Why would an omnipotent god design a universe in which his most precious creation, humanity, lives on the knife-edge of extinction? This god made a vast universe that is mostly empty space and then confined humankind to a tiny speck of a planet, where it is destined for extinction long before the universe becomes inert. He could have made it possible for us to live anywhere. He also could have made it possible to live in any conceivable universe, with any values for its parameters. Instead of being an argument for the existence of god, the apparent fine-tuning of the constants of physics argues against any design in the cosmos.

Victor J. Stenger. The New Atheism: Taking a Stand for Science and Reason. (Kindle Location 985-1078). Kindle Edition.

But perhaps the best argument against the fine-tuning fallacy is that it is just another example of the god-of-the-gaps fallacy and the argument from ignorant incredulity. Just because science hasn't explained something doesn't mean science can't explain it. That is the thinking of a toddler, and it characterises the thinking of Creationists.
