These two galaxies are named NGC 4490 and NGC 4485, and they’re located about 24 million light-years away in the constellation Canes Venatici (The Hunting Dogs). Aside from the Milky Way’s own dwarf companions (the Large and Small Magellanic Clouds), this is the closest known interacting dwarf-dwarf system where astronomers have directly observed both a gas bridge and resolved stellar populations. Together NGC 4490 and NGC 4485 form the system Arp 269, which is featured in the Atlas of Peculiar Galaxies. At such a close distance (and with Webb’s impressive ability to peer through dusty cosmic clouds) these galaxies allow astronomers to witness up close the kinds of galaxy interactions that were common billions of years ago.
Just a gentle reminder, if any were needed, that we can tell the Bible is wrong by comparing its descriptions with what we can observe. To take a silly-simple example that even a creationist should be able to understand: supposing I told you that the Bible had 7 chapters in three sections, the Old, Middle and New Testaments, and was just 50 pages long, you could simply look in the Bible and see that I was wrong. It would be no use me trying to claim that I was right really because my statement was an allegory or a metaphor, because you could see that it was neither; it was simply wrong, unequivocally and irrefutably so.
Well, it's the same with the description of the Universe in the Bible. We can look at the Universe now, using technology the Bronze Age authors of the Bible could never dream of, and see that it is nothing like the description in the Bible.
So, just as my whimsical description of the Bible was not even close, we can see that the Bronze Age authors of the Bible were not even close either. The difference, of course, is that while my mistakes were deliberate, theirs were the result of ignorance.
So, let's see again how the Bible describes a small, flat universe consisting of a single planet with a sun and moon hanging over it, and the whole covered by a dome to keep the water out.
And God said, Let there be a firmament in the midst of the waters, and let it divide the waters from the waters. And God made the firmament, and divided the waters which were under the firmament from the waters which were above the firmament: and it was so. And God called the firmament Heaven. And the evening and the morning were the second day. And God said, Let the waters under the heaven be gathered together unto one place, and let the dry land appear: and it was so. And God called the dry land Earth; and the gathering together of the waters called he Seas: and God saw that it was good. (Genesis 1.6-10)
And God made two great lights; the greater light to rule the day, and the lesser light to rule the night: he made the stars also. And God set them in the firmament of the heaven to give light upon the earth, And to rule over the day and over the night, and to divide the light from the darkness: and God saw that it was good. (Genesis 1.16-18)
How the Bible's authors saw the Universe.
And now let's look at what some tiny fragments of the Universe are really like, as shown by the James Webb Space Telescope and published by the European Space Agency (ESA):
Palaeontologists at the Federal University of São Paulo, Brazil, have analysed the DNA recovered from two ancient humans and discovered that they were both carriers of the Human Papillomavirus HPV16, a virus implicated in several cancers. They have presented their evidence, ahead of peer-reviewed publication, on the preprint server bioRxiv.
The interesting thing from the point of view of virology is that this discovery sheds considerable light on when HPV entered the human virome and began co-evolving with us, one theory being that we acquired it from Neanderthals. From the point of view of creationists, however, the news could scarcely be worse.
The first sample, obtained from the famous 'Ötzi the Iceman', the 5,300-year-old mummified body recovered from a glacier on the Italian-Austrian border, is probably not too much of a problem for creationists, as it just about falls within the timeline of the Bible mythology, apart from the little problem that it dates from before the time they believe there was a general reset of Earth's biosphere in a genocidal flood, which would have destroyed the glacier and everything in it, so Ötzi should not have been there.
But the second is a massive problem, since it was recovered from the leg of a man, Ust'-Ishim man, found in western Siberia and dated to 45,000 years BP, way before creationists believe Earth existed, and tens of thousands of years before the mythical 'Fall', prior to which creationists believe viruses didn't exist. This specimen provided the oldest complete human genome so far recovered, and its DNA contains the unmistakable genome of HPV16. Creationist mythology just keeps getting further and further from reality as exposed by science using real-world evidence.
Traditionally, creationists claim Earth is 6,000–10,000 years old and was created perfect in every way, with no deaths or diseases, so no viruses, parasites or pathogens, bodies that always functioned perfectly and genomes that never failed to replicate perfectly. Then, along came 'sin' which, by some mysterious process, was able to thwart the omnipotent creator god's perfect plan and create viruses and other pathogens and make perfect physiology begin to malfunction and genomes to fail to replicate perfectly, causing variations and genetic weaknesses, etc.
Why a reputedly omnipotent creator failed to anticipate the effects of 'sin' and make its creation robust enough to resist them is never explained, although, apparently, it provided immune systems in preparation for something that, despite being omniscient, and even claiming to have created 'evil' (Isaiah 45:7), it then failed to anticipate. But, as though those myths weren't already too ridiculous for any adult with even a basic education to believe, creationists have to continually think of ways to ignore the evidence and continue holding plainly absurd beliefs, under the child-like delusion that their ability to do so is a sign of strength.
The paper itself sets out to address a long-standing question in human virology: how long oncogenic human papillomaviruses have been associated with our species, and whether their origins lie in relatively recent cultural changes or deep evolutionary history.
It may come as a surprise to some that scientists at the Center of Applied Space Technology and Microgravity (ZARM) at the University of Bremen, Germany, together with colleagues from the Transylvanian University of Brașov, Romania, have proposed a new theory of gravity, which they recently published in the Journal of Cosmology and Astroparticle Physics.
Even flat-earthers do not attempt to deny gravity. No one who does not require medication doubts that gravity is the force that prevents us from floating off into space and causes objects to fall when dropped. No one seriously believes they can step off a tall building and come to no harm because gravity is “just a theory”.
And yet, despite its obvious and universal effects, gravity remains incompletely understood. Unlike evolution—which can be directly observed and whose underlying mechanisms have been well established for decades—we still lack a complete explanation of how gravity works. Newton described it mathematically as an attractive force between masses, proportional to those masses and obeying an inverse-square law, but he did not explain why masses attract one another. Einstein later recast gravity as the curvature of spacetime caused by mass and energy, with objects following the shortest paths through that curved geometry. However, gravity has never been successfully reconciled with quantum mechanics and appears to be a phenomenon that belongs to the macroscopic domain of relativity. Although most physicists assume that relativity and quantum mechanics must ultimately be unified, a quantum theory of gravity remains elusive.
In other words, as with evolution, we know that gravity is real, and we understand its effects extremely well. But unlike evolution—where we possess a comprehensive and coherent explanatory framework—we currently have only incomplete and sometimes conflicting theories for gravity’s underlying cause.
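For reference, the two classical descriptions mentioned above can be written compactly; these are the standard textbook forms, not anything specific to the new paper discussed here:

```latex
% Newton: attractive force between two masses, inverse-square in their separation r
F = G\,\frac{m_1 m_2}{r^2}

% Einstein: spacetime curvature (left-hand side) sourced by mass-energy (right-hand side)
G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^4}\,T_{\mu\nu}
```

Both describe what gravity does with extraordinary accuracy; neither tells us why it does it, which is precisely the gap the new work attempts to address.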
Despite this, creationists never dispute the theory of gravity on the grounds of these gaps. The reason is obvious: their sacred collection of Bronze Age myths makes no claims about gravity at all. Its authors took gravity for granted, seeing no need to explain it, and therefore left no theological foothold for modern denial. There are no angels holding planets in orbit or magical forces suspending objects in mid-air, and so gravity is quietly accepted.
Illustrations of Ordovician jawless vertebrates. On the left is a Promissum conodont, ranging from 5 to 50 cm in length; conodonts are named after their unusual, cone-like tooth fossils and are hypothesized to be ancestors of modern lampreys and hagfishes. On the right is a pair of Sacabambaspis, around 35 cm in length, which had distinct, forward-facing eyes and an armored head. Very few conodont species survived the Late Ordovician Extinction Event, and no fossils of animals like Sacabambaspis from after the event have been discovered.
A recent paper published in Science Advances by Wahei Hagiwara and Professor Lauren Sallan of the Macroevolution Unit at the Okinawa Institute of Science and Technology, Japan, closes a long-standing gap in our understanding of the early radiation of vertebrates into jawed and jawless fishes following the Late Ordovician mass extinction, around 445–443 million years ago. Their analysis shows that this radiation arose from a small number of fortunate survivors clinging on in ecological refugia. From those few lineages, of course, all modern marine and terrestrial vertebrates ultimately evolved.
This study neatly dismantles one of creationism’s favourite rhetorical fallbacks: the claim that Earth was deliberately “fine-tuned” to support complex life, and ultimately humans. The evolutionary pattern revealed here—near-annihilation followed by recovery from a few scattered refugia—is not the signature of foresight or optimisation, but of contingency and survival against the odds. Life does not flourish because conditions are perfectly arranged for it; rather, whatever happens to survive is forced to adapt to whatever conditions remain. The history of vertebrates, like that of life more generally, is therefore not one of careful planning, but of repeated catastrophe followed by opportunistic evolutionary radiation.
Creationists are notable for clinging to demonstrably false beliefs in the face of overwhelming evidence, childishly mistaking stubbornness for intellectual strength, rather like a spoilt toddler refusing to accept that they have just lost a game of Snap!. Alongside the patently absurd claim that Earth is only 6,000–10,000 years old sits the almost equally untenable belief that the planet was created exactly as it is, perfectly suited for human life. This notion is maintained despite abundant evidence for repeated mass extinctions driven by cosmic impacts, large-scale geological processes such as plate tectonics and associated seismic activity, major reorganisations of ocean circulation, and delicately balanced biogeochemical feedback systems involving oxygenation and carbon cycling that periodically spiral out of control, triggering catastrophic climate change.
What the evidence actually reveals is not a cosy, well-regulated world resembling some tranquil small town in Kansas, but a planet that is frequently so hostile to life that much of it is wiped out entirely. Most species go extinct, leaving only a handful of survivors to inherit the aftermath and radiate into new forms adapted to altered conditions—until they too are eliminated by some future catastrophe. The conclusion is unavoidable: Earth is not fine-tuned for human life, or for life in general. Instead, today’s species are the fortunate descendants of a few lucky survivors, shaped by natural selection to fit available ecological niches as neatly as a hand fits a glove.
An international research team led by the University of Vienna and the University of Tartu (Estonia) — in collaboration with the University of Cambridge and University College London — has recovered ancient genomes of human betaherpesvirus 6A and 6B (HHV-6A/B), viruses that entered the human genome and then co-evolved with humans over the last 2,000 years. Their study, published in Science Advances a few days ago, confirms that these viruses have been evolving with, and within, humans since at least the Iron Age.
A common creationist claim, rooted in ignorance of what genetic information actually is, is that new genetic information cannot be created. Claims that new genetic information cannot arise because it would violate the laws of thermodynamics rely on a fundamental category error. The second law applies to closed systems, whereas every biological system on Earth is emphatically open, continuously exchanging energy and matter with its environment. Local decreases in entropy are not only permitted but expected in open systems supplied with external energy — which, on Earth, is overwhelmingly provided by the Sun. Crystals grow, snowflakes form, embryos develop, and genomes increase in length and complexity without violating any physical law.
Moreover, information is not a conserved physical quantity like energy. Shannon information theory concerns the statistical properties of signals in communication channels; it says nothing about biological meaning, function, or heredity. Treating genetic information as though it were interchangeable with thermodynamic entropy is simply a misuse of terminology. When genomes gain duplicated genes, viral insertions, or transferred sequences, no atoms are created, no laws are broken, and no special pleading is required — just chemistry operating under well-understood physical principles.
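To make the Shannon point concrete, here is a minimal Python sketch of my own (a purely illustrative toy with a made-up sequence, not anything from the study discussed here): duplicating a stretch of sequence and letting one copy mutate changes the measurable information content without any physical law being involved.

```python
import math
from collections import Counter

def shannon_entropy_bits(seq: str) -> float:
    """Shannon entropy per symbol, in bits, of a nucleotide string."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

genome = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGA"   # made-up toy 'gene'

duplicated = genome + genome            # a simple gene/segment duplication
diverged = duplicated[:-3] + "TTT"      # the new copy then acquires point mutations

for label, seq in [("original", genome), ("duplicated", duplicated), ("diverged", diverged)]:
    h = shannon_entropy_bits(seq)
    print(f"{label:>10}: length={len(seq):3d}  entropy={h:.3f} bits/symbol  total={len(seq) * h:.1f} bits")
```

The total Shannon content roughly doubles on duplication and shifts again as the copies diverge. The point is not that this quantity equals biological 'information', only that nothing in thermodynamics forbids it from increasing.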
The creationist claim is flatly contradicted by straightforward observations of several well-understood mechanisms by which new genetic information can enter a species’ genome. These include gene duplication or whole-genome duplication, horizontal gene transfer from one species to another (particularly common in parasite–host relationships), and — as demonstrated by the research discussed here — the insertion of viral DNA into the genome. This last process gives rise to endogenous viral elements that are ubiquitous in biology and which precisely match evolutionary trees established independently from multiple other lines of evidence.
Endogenous viral insertions are especially devastating for the creationist concept of immutable “created kinds”. Viral DNA does not insert itself independently at exactly the same genomic locations in unrelated lineages by chance. When identical viral sequences are found embedded at the same chromosomal positions in different populations — and when their accumulated mutations form nested hierarchical patterns — they provide a precise historical record of shared ancestry.
The HHV-6A/B insertions documented in this study behave exactly as evolutionary theory predicts: they enter the genome at a particular point in time, are inherited by descendants, accumulate mutations at measurable rates, and track human population history. There is no coherent creationist explanation for why a designer would place broken, mutating viral sequences into genomes in patterns that perfectly mirror evolutionary trees derived independently from anatomy, archaeology, and population genetics.
If humans were created as a distinct “kind”, there is no reason for their genomes to contain time-stamped viral relics tracing population divergence over millennia. But if humans evolved — and if viruses have co-evolved with us — this is precisely the pattern we expect to find. The data fit evolution effortlessly, while creationism is left inventing ad hoc excuses to deny what the genome itself records.
There is, of course, no let-up in the steady stream of bad news for creationists to ignore in 2026, and today is no exception. This time the problem comes from archaeology and concerns events taking place toward the end of the very long span of Earth’s history that preceded creationism’s so-called *Creation Week*. The news is that the diversification of domestic dogs, descended from domesticated wolves, had already begun at least 11,000 years ago — long before anything resembling the modern concept of dog “breeds”.
The evidence is presented in a paper published in Science by a team led by palaeontologists from the University of Exeter and France’s Centre National de la Recherche Scientifique (CNRS). The researchers analysed 643 modern and archaeological canid skulls—including recognised breeds, village dogs, and wolves—spanning the last 50,000 years. In both geographical scope and time depth, it is the largest and most comprehensive study of its kind to date.
Using a technique known as geometric morphometrics, the team demonstrated that by the Mesolithic and Neolithic periods dogs already displayed a striking range of shapes and sizes. This diversity almost certainly reflects their varied roles in early human societies, from hunting and herding to guarding and companionship, rather than anything resembling systematic modern breeding.
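For readers curious about what geometric morphometrics actually involves, the core workflow is to record landmark coordinates on each skull, remove differences of position, scale and rotation (Procrustes superimposition), and then summarise the remaining shape variation, typically with principal component analysis. The Python sketch below illustrates that general workflow on randomly generated landmarks; it is my own toy illustration and bears no relation to the authors' data or code.

```python
import numpy as np

def procrustes_align(ref: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Align 'target' landmarks (k x d) to 'ref': centre, scale to unit size,
    then find the optimal rotation via SVD (orthogonal Procrustes)."""
    ref_c = ref - ref.mean(axis=0)
    tgt_c = target - target.mean(axis=0)
    ref_c = ref_c / np.linalg.norm(ref_c)
    tgt_c = tgt_c / np.linalg.norm(tgt_c)
    u, _, vt = np.linalg.svd(tgt_c.T @ ref_c)
    return tgt_c @ (u @ vt)

rng = np.random.default_rng(0)
k, d, n = 12, 2, 30                            # 12 landmarks in 2D on 30 imaginary skulls
mean_shape = rng.normal(size=(k, d))
skulls = [mean_shape + rng.normal(scale=0.05, size=(k, d)) for _ in range(n)]

# Superimpose every skull on the mean shape, then flatten to one row per specimen
aligned = np.array([procrustes_align(mean_shape, s).ravel() for s in skulls])

# PCA via SVD: the leading axes capture the dominant patterns of shape variation
centred = aligned - aligned.mean(axis=0)
_, sv, _ = np.linalg.svd(centred, full_matrices=False)
explained = sv**2 / np.sum(sv**2)
print("Proportion of shape variance on the first three PCs:", np.round(explained[:3], 3))
```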
All of this directly contradicts the claim in Genesis that animals were created fully formed for mankind’s exclusive use by an omnipotent and omniscient creator. Had that been the case, dogs would not require modification to make them fit for different purposes, nor would the archaeological record preserve clear evidence of their gradual evolutionary divergence from an ancestral wolf population. Instead, the evidence shows — unambiguously — that modern dogs are the product of an evolutionary process in which human-mediated selection played a central role, carried out by people who themselves existed long before the biblical timeline allows.
A brief communication, published last November in the American Journal of Biological Anthropology, may, if creationists never read past the title (as usual), have produced a frisson of excitement in those circles. It questioned the taxonomic status of one of the most complete fossil skeletons of an early ancestral hominin, Australopithecus prometheus, popularly known as “Little Foot”.
However, reading even a little further would have turned that excitement into disappointment — assuming, of course, that they understood what they were reading. The authors were not questioning whether the fossil was ancestral at all, but whether it had been assigned to the correct position in the hominin family tree, or whether it should instead be recognised as a distinct ancestral hominin species. In other words, this was a discussion about how many transitional species there are, not whether transitional species exist at all.
The only crumb of comfort available to creationists is the familiar claim that this demonstrates how science “keeps changing its mind”, something they take as evidence that science is fundamentally unreliable—presumably including even those parts they routinely misrepresent as supporting their beliefs.
For anyone who understands the scientific method, and the importance of treating all knowledge as provisional and contingent on the best available evidence, this paper represents the principle functioning exactly as it should. Far from being a weakness, this willingness to revise conclusions in the light of new information is what makes science self-correcting and progressively more accurate over time.
The authors of the paper — a team led by La Trobe University adjunct Dr Jesse Martin—carried out a new analysis of the “Little Foot” fossils and concluded that the specimen was probably placed in the wrong taxon when first described on the basis that it does not share the same “unique suite of primitive and derived features” as Australopithecus africanus. Since that initial assessment, additional fossils of A. prometheus have been discovered, and it has become clear that “Little Foot” also differs from those specimens. At the same time, it remains sufficiently distinct from A. africanus that reassignment to that species is not justified. In short, it possesses its own unique combination of primitive and derived traits and should therefore be recognised as a separate species.
Naturally, there is no real comfort here for creationists. The phrase “suite of primitive and derived features” is simply palaeontological shorthand for evidence of descent with modification—what Darwin referred to as transitional forms. It follows that the researchers involved have no doubt whatsoever that the species under discussion evolved from earlier ancestors, and there is no hint that they believe it was spontaneously created, without ancestry, by magic.
Fossilized elephant dentine (scale: 1.5 mm across), with rock seen in the lower right and dentine in the upper left. The white dentine is intact collagen.
The bad news for creationism continues unabated. Scientists led by Professor Timothy G. Bromage of the Department of Molecular Pathobiology at New York University College of Dentistry have developed a technique that opens an entirely new window onto the deep past. By analysing metabolites preserved in fossilised bones, the researchers are able to extract detailed biological and environmental information from animals that lived between 1.3 and 3 million years ago.
The team have published their findings in Nature, describing a method that pushes palaeobiology well beyond traditional morphology-based reconstruction.
The significance of this technique lies in its ability to reconstruct ancient environments with remarkable precision. From the chemical signatures locked within fossil bone, researchers can infer temperature, soil conditions, rainfall patterns, vegetation, and even the presence of parasites. The resulting picture is one of ecosystems changing over time, with animals adapting in step with shifting environments — exactly what evolutionary theory predicts, and wholly incompatible with the childish notion of magical creation a few thousand years ago or a recent biological reset caused by a genocidal flood.
This article struck a chord with me — not primarily because it refutes creationism, although it certainly does that by presenting evidence that simply should not exist if the biblical flood genocide story contained even a kernel of truth. Such evidence ought either to have been swept away entirely or buried beneath a thick layer of flood-deposited silt containing a chaotic jumble of animal and plant fossils from unrelated landmasses. It was neither.
What resonated more personally, however, is that I have just published a novel in which a clan of Neolithic hunter-gatherers forms a close association with wolves, with the animals playing a central role in both their hunting strategies and their folklore. In the novel, The Way of the Wolf: A Stone Age Epic — the second volume in the Ice Age Tales series — Almora is raised alongside a wolf cub that becomes her inseparable guide and protector. This relationship gives rise to several versions of a mythologised hunt in which the wolf, Sharma, saves the day and defends the hunters. Together with her Neanderthal partner, Tanu, Almora later leads a group of exiles who encounter a clan already familiar with these legends, and who have begun adopting abandoned wolf cubs and raising them as part of the community.
It is fiction, of course — but a deliberately realistic depiction of how wolves could have been domesticated through mutual benefit, cooperation, and prolonged social contact with humans.
The article itself concerns the discovery by researchers at the Francis Crick Institute, Stockholm University, the University of Aberdeen, and the University of East Anglia of wolf remains on a remote Baltic island that could only have been transported there by boat. Isotopic analysis shows that these wolves consumed the same food as the humans, and skeletal pathology in one individual indicates long-term care. The findings are reported in a research paper published in Proceedings of the National Academy of Sciences (PNAS).
Researchers from the University of New South Wales (UNSW), Sydney, Australia, have identified DNA switches that help control how astrocytes work. These are brain cells that support neurons and are known to play a role in Alzheimer’s disease. They have just published their findings in Nature Neuroscience.
Firstly, there is the embarrassment that the cause of Alzheimer’s is indistinguishable from Michael J. Behe’s favourite ‘proof’ of intelligent design — irreducible complexity — in that all the elements must be present for Alzheimer’s to occur.
Secondly, there is the discovery by the Australian team of which switches 'turn on' which genes affecting the astrocytes implicated in Alzheimer's. These switches are embedded in the 98% of the human genome that is non-coding, or so-called 'junk' DNA. Since they can be separated from the genes they regulate by thousands of base pairs, it has been notoriously difficult to identify which switches control which genes. Now, using CRISPR, the team have identified around 150 of these regulatory elements.
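The logic behind such CRISPR-based screens is simple, even if the experiments themselves are not: silence a candidate regulatory element, measure how the expression of nearby genes changes, and call a link when the change is consistent and substantial. The toy Python sketch below captures only that logic; the numbers and the crude fold-change threshold are entirely hypothetical and are not taken from the UNSW study, which used far more rigorous statistics.

```python
import statistics

# Hypothetical expression readings (arbitrary units) for one candidate gene:
control = [102, 98, 105, 101, 99, 103]     # cells receiving non-targeting guides
perturbed = [61, 58, 64, 60, 63, 59]       # cells where the candidate element is silenced

def element_regulates_gene(control, perturbed, min_fold_change=1.5):
    """Crude call: the element is linked to the gene if silencing it
    shifts mean expression by at least the given fold change."""
    fold = statistics.mean(control) / statistics.mean(perturbed)
    return fold >= min_fold_change, fold

linked, fold = element_regulates_gene(control, perturbed)
print(f"Fold change on silencing the element: {fold:.2f} -> link called: {linked}")
```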
The existence of this non-coding DNA has long been an embarrassment for creationists, who have been unable to explain why an intelligent designer would produce so much DNA that does not contain the roughly 20,000 genes that actually code for proteins. Why such prolific waste, adding massively to the risk of errors that can result in cancer?
The creationist response has been to conflate the terms ‘non-coding’ and ‘non-functional’, and then proclaim this ‘functional DNA’ as intelligently designed — reducing, but by no means eliminating, the amount of ‘junk’ they still have to explain away. Of course, ‘non-coding’ does not mean ‘not transcribed’, only that the RNA does not code for a functional protein. However, this non-coding but functional DNA does play a role in gene expression, in that the resulting RNA can act as controls or ‘switches’ that turn genes on and off.
So, creationists — having triumphantly waved ‘functional, non-coding DNA’ as evidence for intelligent design after all — are now presented with the fact that it is part of the ‘irreducible’ cause of Alzheimer’s, and probably the cause of many other diseases with a genetic basis.
Two researchers at McGill University, Montréal, Québec, Canada, have uncovered evidence of an ecosystem teeming with giant marine predators some 130 million years ago. The largest of these predators could, quite literally, have eaten something the size of a modern orca as little more than a snack. This will make depressing reading for creationists, not only because it all happened deep in the long pre-“Creation Week” history of life on Earth, but because the evolutionary arms races that led to these giants are precisely what the theory of evolution by natural selection predicts.
It doesn’t get any easier for creationists. Just because it’s Christmas week doesn’t mean the awkward facts are going to go away, or that scientists are going to stop uncovering more of them. No matter what they post on social media; no matter how loudly they shout; or how fervently they gather on Sundays to collectively drown out their doubts, Santa is not going to deliver evidence that the Bronze Age creation myths in the Bible contain even a grain of historical truth. The problem is that truth remains true whether a creationist believes it or not, and regardless of whether their parents believed it. No amount of looking the other way or pretending the facts aren’t there will ever change that.
The palaeontologists reached their conclusions by reconstructing an ecosystem network for all known animal fossils from the Paja Formation in central Colombia. They used body sizes, feeding adaptations, and comparisons with modern animals, and then validated the results against one of the most detailed present-day marine ecosystem networks available: the living Caribbean ecosystem. The Paja ecosystem thrived with plesiosaurs, ichthyosaurs, and abundant invertebrates, giving rise to one of the most intricate marine food webs known. This complexity emerged as sea levels rose and Earth's climate warmed during the Mesozoic era, including the Cretaceous, triggering an explosion of marine biodiversity.
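The general approach, building a who-eats-whom network from body sizes and feeding traits and then comparing its structure with a well-characterised modern web, can be illustrated with a deliberately crude Python sketch. The taxa, sizes and feeding rule below are invented for illustration and have nothing to do with the authors' actual reconstruction.

```python
# Toy taxa: (name, body length in metres, feeding guild) - invented values
taxa = [
    ("giant pliosaur",   10.0,  "macropredator"),
    ("ichthyosaur",       5.0,  "predator"),
    ("plesiosaur",        4.0,  "predator"),
    ("large fish",        1.0,  "predator"),
    ("ammonite",          0.3,  "invertebrate"),
    ("small crustacean",  0.05, "invertebrate"),
]

def can_eat(pred, prey):
    """Crude rule: predators eat anything sufficiently smaller than themselves."""
    p_name, p_size, p_guild = pred
    q_name, q_size, _ = prey
    return p_guild in ("predator", "macropredator") and q_size < p_size * 0.6 and p_name != q_name

links = [(p[0], q[0]) for p in taxa for q in taxa if can_eat(p, q)]
S, L = len(taxa), len(links)
connectance = L / S**2      # a standard summary statistic used to compare food webs
print(f"{L} feeding links among {S} taxa; connectance = {connectance:.2f}")
```

A reconstructed ancient web and a well-sampled modern one can then be compared on summary statistics like this, which is the spirit of the validation step described above.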
Photo montage of five major elements of DAN5 fossil cranium
Credit: Dr. Michael Rogers
Map showing potential migration routes of the human ancestor, Homo erectus, in Africa, Europe and Asia during the early Pleistocene. Key fossils of Homo erectus and the earlier Homo habilis species are shown, including the new face reconstruction of the DAN5 fossil from Gona, Ethiopia, dated to 1.5 million years ago.
Credit: Dr. Karen L. Baab. Scans provided by National Museum of Ethiopia, National Museums of Kenya and Georgian National Museum.
Palaeontologists at the College of Graduate Studies, Glendale Campus of Midwestern University in Arizona, have reconstructed the head and face of an early Homo erectus specimen, DAN5, from Gona in the Afar region of Ethiopia on the Horn of Africa. In doing so, they have uncovered several unexpected features that should trouble any creationist who understands their significance. The research has just been published open access in Nature Communications.
Creationism requires its adherents to imagine that there are no intermediate fossils showing a transition from the common Homo/Pan ancestor to modern Homo sapiens, who they claim were created as a single couple just a few thousand years ago with a flawless genome designed by an omniscient, omnipotent creator. The descendants of such a couple would, of course, show no genetic variation, because both the perfect genome and its replication machinery would operate flawlessly. No gene variants could ever arise.
The reality, however, is very different. Not only are there vast numbers of fossils documenting a continuum from the common Homo/Pan ancestor of around six million years ago, but there is also so much variation among them that it has become increasingly difficult to force them into a simple, linear sequence. Instead, human evolution is beginning to resemble a tangled bush rather than a neat progression.
The newly reconstructed face of the Ethiopian Homo erectus is no exception. It displays a mosaic of more primitive facial traits alongside features characteristic of the H. erectus populations believed to have spread out of Africa in the first of several waves of hominin migration into Eurasia. The most plausible explanation is that the Ethiopian population descended from an earlier expansion within Africa, became isolated in the Afar region, and retained its primitive characteristics while other populations continued to evolve towards the more derived Eurasian form.
The broader picture that has emerged in recent years—particularly since it became clear that H. sapiens, Neanderthals, and Denisovans formed an interbreeding complex that contributed to modern non-African humans—is one of repeated expansion into new environments, evolution in isolation, and subsequent genetic remixing as populations came back into contact. DAN5 represents just one of these populations, which appears to have evolved in isolation for some 300,000 years.
Not only is this timescale utterly incompatible with the idea of the special creation of H. sapiens 6,000–10,000 years ago, but the sheer existence of this degree of variation is also irreconcilable with the notion of a flawless, designed human genome. Even allowing for old-earth creationist claims that a biblical “day” may represent an elastic number of millions of years, the problem remains: a highly variable genome must still be explained as the product of perfect design. A flawless genome created by an omniscient, omnipotent creator should, moreover, have been robust enough to withstand interference following “the Fall” — an event such a creator would necessarily have foreseen, particularly if it also created the conditions for that fall and the other creative agency involved (Isaiah 45:7).
As usual, creationists seem to prefer the conclusion that their supposed intelligent creator was incompetent—either unaware of the future, indifferent to it, or powerless to prevent it—rather than accept the far more parsimonious explanation: that modern Homo sapiens are the product of a long, complex evolutionary history from more primitive beginnings, in which no divine intervention is required.
Origins of Homo erectus
Homo erectus
Homo erectus appears in the fossil record around 1.9–2.0 million years ago, emerging from earlier African Homo populations, most likely derived from Homo habilis–like ancestors. Many researchers distinguish early African forms as Homo ergaster, reserving H. erectus sensu stricto for later Asian populations, although this is a taxonomic preference rather than a settled fact.
Key features of early H. erectus include:
A substantial increase in brain size (typically 600–900 cm³ initially, later exceeding 1,000 cm³)
A long, low cranial vault with pronounced brow ridges
A modern human–like body plan, with long legs and shorter arms
Clear association with Acheulean stone tools and likely habitual fire use (by ~1 million years ago)
Crucially, H. erectus was the first hominin to disperse widely beyond Africa, reaching:
The Caucasus (Dmanisi) by ~1.8 Ma
Southeast Asia (Java) by ~1.6 Ma
China (Zhoukoudian) by ~0.8–0.7 Ma
This makes H. erectus not a single, static species, but a long-lived, geographically structured lineage.
Homo erectus as a population complex
Rather than a uniform species, H. erectus is best understood as a metapopulation:
African populations
Western Eurasian populations
East and Southeast Asian populations
These groups experienced repeated range expansions, isolation, local adaptation, and partial gene flow, producing the mosaic anatomy seen in fossils such as DAN5.
This population structure is critical for understanding later human evolution.
Relationship to later Homo species
Neanderthal (H. neanderthalensis)
From H. erectus to H. heidelbergensis
By around 700–600 thousand years ago, some H. erectus-derived populations—probably in Africa—had evolved into forms often grouped as Homo heidelbergensis (or H. rhodesiensis for African material).
These hominins had:
Larger brains (1,100–1,300 cm³)
Reduced facial prognathism
Continued Acheulean and early Middle Stone Age technologies
They represent a transitional grade, not a sharp speciation event.
Divergence of Neanderthals, Denisovans, and modern humans
Genetic and fossil evidence indicates the following broad pattern:
~550–600 ka: A heidelbergensis-like population splits
African branch → modern Homo sapiens
Eurasian branch → Neanderthals and Denisovans
Neanderthals
Evolved primarily in western Eurasia
Adapted to cold climates
Distinctive cranial morphology
Contributed ~1–2% of DNA to all non-African modern humans
Denisovans
Known mostly from genetic data, with sparse fossils (Denisova Cave)
Closely related to Neanderthals but genetically distinct
Contributed genes to Melanesians, Aboriginal Australians, and parts of East and Southeast Asia, including variants affecting altitude adaptation (e.g. EPAS1)
Modern Homo sapiens
Emerged in Africa by ~300 ka
Retained genetic continuity with earlier African populations
Dispersed out of Africa multiple times, beginning ~70–60 ka
Interbred repeatedly with Neanderthals and Denisovans
The key point: no clean branching tree
Human evolution is reticulate, not linear:
Species boundaries were porous
Gene flow occurred repeatedly
Populations diverged, adapted, re-merged, and diverged again
Homo erectus is not a side branch that “went extinct”, but a foundational grade from which multiple later lineages emerged. DAN5 fits neatly into this framework: a locally isolated erectus population retaining ancestral traits while others continued evolving elsewhere.
Why this matters
This picture:
Explains mosaic anatomy in fossils
Accounts for genetic admixture in living humans
Makes sense of long timescales and geographic diversity
Is incompatible with any model of recent, perfect, single-pair creation
Instead, it shows that our species is the outcome of millions of years of population dynamics, not a single moment of design.
A New Fossil Face Sheds Light on Early Migrations of Ancient Human Ancestor
A 1.5-million-year-old fossil from Gona, Ethiopia reveals new details about the first hominin species to disperse from Africa.
Summary: Virtual reassembly of teeth and fossil bone fragments reveals a beautifully preserved face of a 1.5-million-year-old human ancestor—the first complete Early Pleistocene hominin cranium from the Horn of Africa. This fossil, from Gona, Ethiopia, hints at a surprisingly archaic face in the earliest human ancestors to migrate out of Africa.
A team of international scientists, led by Dr. Karen Baab, a paleoanthropologist at the College of Graduate Studies, Glendale Campus of Midwestern University in Arizona, produced a virtual reconstruction of the face of early Homo erectus. The 1.5 to 1.6 million-year-old fossil, called DAN5, was found at the site of Gona, in the Afar region of Ethiopia. This surprisingly archaic face yields new insights into the first species to spread across Africa and Eurasia. The team’s findings are being published in Nature Communications.
We already knew that the DAN5 fossil had a small brain, but this new reconstruction shows that the face is also more primitive than classic African Homo erectus of the same antiquity. One explanation is that the Gona population retained the anatomy of the population that originally migrated out of Africa approximately 300,000 years earlier.
Dr. Karen L. Baab, lead author
Department of Anatomy
Midwestern University
Glendale, AZ, USA.
Gona, Ethiopia
The Gona Paleoanthropological Research Project in the Afar of Ethiopia is co-directed by Dr. Sileshi Semaw (Centro Nacional de Investigación sobre la Evolución Humana, Spain) and Dr. Michael Rogers (Southern Connecticut State University). Gona has yielded hominin fossils more than 6.3 million years old, and stone tools spanning the last 2.6 million years of human evolution. The newly presented hominin reconstruction includes a fossil braincase (previously described in 2020) and smaller fragments of the face belonging to a single individual, called DAN5, dated to between 1.6 and 1.5 million years ago. The face fragments (and teeth) have now been reassembled using virtual techniques to generate the most complete skull of a fossil human from the Horn of Africa in this time period. The DAN5 fossil is assigned to Homo erectus, a long-lived species found throughout Africa, Asia, and Europe after approximately 1.8 million years ago.
How did the scientists reconstruct the DAN5 fossil?
The researchers used high-resolution micro-CT scans of the four major fragments of the face, which were recovered during the 2000 fieldwork at Gona. 3D models of the fragments were generated from the CT scans. The face fragments were then re-pieced together on a computer screen, and the teeth were fit into the upper jaw where possible. The final step was “attaching” the face to the braincase to produce a mostly complete cranium. This reconstruction took about a year and went through several iterations before arriving at the final version.
Dr. Baab, who was responsible for the reconstruction, described this as “a very complicated 3D puzzle, and one where you do not know the exact outcome in advance. Fortunately, we do know how faces fit together in general, so we were not starting from scratch.”
What did scientists conclude?
This new study shows that the Gona population 1.5 million years ago had a mix of typical Homo erectus characters concentrated in its braincase, but more ancestral features of the face and teeth normally only seen in earlier species. For example, the bridge of the nose is quite flat, and the molars are large. Scientists determined this by comparing the size and shape of the DAN5 face and teeth with other fossils of the same geological age, as well as older and younger ones. A similar combination of traits was documented previously in Eurasia, but this is the first fossil to show this combination of traits inside Africa, challenging the idea that Homo erectus evolved outside of the continent.
“I'll never forget the shock I felt when Dr. Baab first showed me the reconstructed face and jaw. The oldest fossils belonging to Homo erectus are from Africa, and the new fossil reconstruction shows that transitional fossils also existed there, so it makes sense that this species emerged on the African continent,” says Dr. Kaifu. “But the DAN5 fossil postdates the initial exit from Africa, so other interpretations are possible.”
Dr. Yousuke Kaifu, co-author
The University Museum
The University of Tokyo
Bunkyo-ku, Tokyo, Japan.
This newly reconstructed cranium further emphasizes the anatomical diversity seen in early members of our genus, which is only likely to increase with future discoveries.
Dr. Michael J. Rogers, co-author.
Department of Anthropology
Southern Connecticut State University
New Haven, CT, USA.
It is remarkable that the DAN5 Homo erectus was making both simple Oldowan stone tools and early Acheulian handaxes, among the earliest evidence for the two stone tool traditions to be found directly associated with a hominin fossil.
Dr. Sileshi Semaw, co-author
Centro Nacional de Investigación sobre la Evolución Humana (CENIEH)
Burgos, Spain.
Future Research
The researchers are hoping to compare this fossil to the earliest human fossils from Europe, including fossils assigned to Homo erectus but also a distinct species, Homo antecessor, both dated to approximately one million years ago.
Comparing DAN5 to these fossils will not only deepen our understanding of facial variability within Homo erectus but also shed light on how the species adapted and evolved.
Dr. Sarah E. Freidline, co-author
Department of Anthropology
University of Central Florida
Orlando, FL, USA.
There is also potential to test alternative evolutionary scenarios, such as genetic admixture between two species, as seen in later human evolution among Neanderthals, modern humans and “Denisovans.” For example, maybe DAN5 represents the result of admixture between classic African Homo erectus and the earlier Homo habilis species.
We’re going to need several more fossils dated between one to two million years ago to sort this out.
Abstract
The African Early Pleistocene is a time of evolutionary change and techno-behavioral innovation in human prehistory that sees the advent of our own genus, Homo, from earlier australopithecine ancestors by 2.8-2.3 million years ago. This was followed by the origin and dispersal of Homo erectus sensu lato across Africa and Eurasia between ~ 2.0 and 1.1 Ma and the emergence of both large-brained (e.g., Bodo, Kabwe) and small-brained (e.g., H. naledi) lineages in the Middle Pleistocene of Africa. Here we present a newly reconstructed face of the DAN5/P1 cranium from Gona, Ethiopia (1.6-1.5 Ma) that, in conjunction with the cranial vault, is a mostly complete Early Pleistocene Homo cranium from the Horn of Africa. Morphometric analyses demonstrate a combination of H. erectus-like cranial traits and basal Homo-like facial and dental features combined with a small brain size in DAN5/P1. The presence of such a morphological mosaic contemporaneous with or postdating the emergence of the indisputable H. erectus craniodental complex around 1.6 Ma implies an intricate evolutionary transition from early Homo to H. erectus. This finding also supports a long persistence of small-brained, plesiomorphic Homo group(s) alongside other Homo groups that experienced continued encephalization through the Early to Middle Pleistocene of Africa.
Introduction
The oldest fossils assigned to our genus are ~2.8 million years old (Myr) from Ethiopia and signal a long history of Homo evolution in the Rift Valley [1,2,3]. There is evidence of multiple Homo lineages in Africa by 2.0–1.9 million years ago (Ma) and an archaeological and paleontological record of expansion to more temperate habitats in the Caucasus and Asia between 2.0 and 1.8 Ma [4] (Fig. 1). The last appearance datum for the more archaic Homo habilis species (or “1813 group”) is ~1.67 (OH 13) or ~1.44 Ma, if KNM-ER 42703 is correctly attributed to H. habilis [5], which is uncertain [6]. The archetypal early African Homo erectus fossils from Kenya (i.e., KNM-ER 3733, 3883; and the adolescent KNM-WT 15000) already present a suite of traits that distinguish them from early Homo taxa by 1.6–1.5 Ma, including larger brains and bodies, smaller postcanine dentition, more pronounced cranial superstructures (e.g., projecting and tall brow ridges), a relatively wide midface and nasal aperture, deep palate, and projecting nasal bridge [1,6,7,8,9,10,11]. The only evidence for H. erectus sensu lato in Africa before 1.8 Ma are fragmentary or juvenile fossils [12,13,14], while fossils expressing both ancestral H. habilis and more derived H. erectus s.l. morphological traits are only known from Dmanisi, Georgia at 1.77 Ma [15,16]. Thus, H. erectus emerged from basal Homo between 2.0 and 1.6 million years ago, but when, where (Africa or Eurasia), and how it occurred remain unclear. An expanded fossil record also documents significant variation in endocranial [17,18], craniofacial [6,8], and dentognathic [19,20] morphology throughout the Early Pleistocene, which extends to the Middle Pleistocene with the addition of small-brained Homo lineages to the human tree.
Fig. 1: Early Homo and Homo erectus timeline between 2.0 and 1.0 Ma and map of key sites in Africa and southern Eurasia.
The solid bars of the timeline indicate well-established first and last appearance data; the horizontal stripes indicate possible extensions of the time range based on fragmentary or juvenile fossils. Diagonal lines signal earlier archaeological presence in those regions. The question mark indicates a possible date of <1.49 Ma for the Mojokerto, Indonesia site cf. [22,23,24,25]. The horizontal gray bar represents the time range associated with DAN5/P1. Colors on the map indicate presence of fossils matching taxa or geographic groups of H. erectus as indicated in the timeline. Surface renderings of the best-preserved regional representatives of archaic or small-brained Homo fossils (beginning at top and continuing clockwise): D2700, KNM-ER 1813, KNM-ER 1470, KNM-ER 3733, SK 847, OH 24, KNM-WT 15000, and DAN5/P1. All surface renderings visualized at FOV 0° (parallel). Map was generated in the “rnaturalearth” package [68] for R.
The initial announcement of DAN5/P1 assigned it to H. erectus on the basis of derived neurocranial traits [21]. Subsequent analyses of neurocranial shape and endocranial morphology confirmed affinity with H. erectus but also noted similarities to early (pre-erectus) Homo fossils such as KNM-ER 1813 [17,18]. Only limited information about the partial maxilla and dentition was presented in the original description [21]. Yet, facial and dental traits are increasingly important in early Homo systematics, given overlap in brain size among closely related hominins [6,8,22]. The DAN5/P1 fossil is a rare opportunity to evaluate neurocranial, facial, and dental anatomy in a single Early Pleistocene Homo fossil and thus has significant implications for this discussion.
Here we present a new cranial reconstruction of the 1.6–1.5 Myr DAN5/P1 fossil from Gona, Ethiopia. This study demonstrates that the small-brained adult DAN5/P1 fossil (598 cm³ [21]) presents a previously undocumented combination of early Homo and H. erectus features in an African fossil.
Taken together, the evidence leaves little room for the idea that Homo erectus was a dead-end curiosity, neatly replaced by something entirely new. Instead, it represents a long-lived, widely dispersed, and internally diverse population complex that provided the evolutionary substrate from which later human lineages emerged. Its descendants were not produced by sudden leaps or special creation events, but by the ordinary, observable processes of population divergence, isolation, and adaptation acting over deep time.
Modern Homo sapiens, Neanderthals, and Denisovans did not arise as separate “kinds”, nor did they follow clean, branching paths. They represent regional outcomes of this erectus-derived heritage, shaped by geography, climate, and repeated episodes of contact and interbreeding. The genetic legacy of those interactions is still present in living humans today, providing independent confirmation of what the fossil record has long been indicating.
What emerges is not a ladder of progress but a dynamic, reticulated history: populations spreading, fragmenting, evolving in isolation, and reconnecting again. Fossils such as DAN5 are not anomalies to be explained away; they are exactly what we should expect from evolution operating on structured populations across continents and hundreds of thousands of years.
For creationism, this is deeply inconvenient. For evolutionary biology, it is precisely the kind of rich, internally consistent picture that arises when multiple independent lines of evidence converge on the same conclusion: humanity is the product of a long, complex evolutionary history, not a recent act of design.
In another major embarrassment for those creationists who understand it, researchers at the Gladstone Institutes and Stanford University have developed a method for linking the genome of a cell to diseases caused by specific gene variants. They have recently published their findings, open access, in Nature.
Creationists insist that the human genome was intelligently designed, with every outcome the result of “complex specified information” which, according to Discovery Institute Fellow William A. Dembski, constitutes definitive evidence of intelligent design. If this were true, it would follow that genes which cause disease were intelligently designed to cause those diseases.
The difficulty deepens for creationists when one considers that many diseases involve multiple genes, sometimes hundreds or even thousands, all of which must possess the “correct” variants for the disease to emerge. In other words, some diseases not only depend on Dembski’s “complex specified genetic information”, but also conform to Michael J. Behe’s proposed hallmark of intelligent design: irreducible complexity.
Unless creationists invoke an additional creator—one over whom their reputedly omnipotent and omniscient god has no control—their supposedly intelligent designer must have deliberately created these gene variants to produce the suffering they cause.
By contrast, the evolutionary explanation requires no such mental gymnastics. The existence of genetic variants is exactly what evolutionary theory predicts, and provided such variants remain rare within a population, there is little selective pressure to remove them. A genome produced by an omniscient, perfect designer, however, would contain no such variants: the original design would be flawless, as would the mechanisms responsible for replicating it. The very existence of gene variants is therefore evidence against intelligent design.
The technique developed by the research team is sensitive enough to examine the entire genome and determine which genes influence which cell types. This makes it possible to identify which genes contribute to particular diseases. In cases where a single gene is involved, this can be relatively straightforward, but where many genes are implicated, it can be extremely difficult to disentangle their individual effects—precisely the problem this new technique helps to overcome.
Having recently watched a grey squirrel carefully plot a route through a line of trees, I was struck by the sophistication of its behaviour. It was not simply moving at random. It clearly knew where it wanted to go and was able to take into account such factors as how much slender branches would bend under its weight, how wide a gap it could safely jump, and—perhaps most importantly—exactly where it was within its own mental map of the environment. It is difficult to see how such behaviour could be possible in a creature that was not conscious and, to some degree, self-aware.
In animal psychology, there is now little doubt that many vertebrates possess some level of self-awareness and therefore consciousness. The remaining debate has centred not on whether consciousness exists in non-human animals, but on how it arose. The fact that consciousness is found across a wide range of vertebrates, and even in molluscs such as cephalopods, suggests either that it originated in a remote common ancestor or that it evolved independently multiple times through convergence. Either way, this strongly points to an evolutionary origin.
According to two papers published in a special edition of the journal Philosophical Transactions of the Royal Society B, by working groups led by Professors Albert Newen and Onur Güntürkün at Ruhr University Bochum in Germany, consciousness can indeed be explained as the outcome of an evolutionary process, with each step conferring a selective advantage. Moreover, consciousness only makes sense as an evolved biological function. The two open-access papers can be found here and here.
This work is bound to provoke another bout of denialism among creationists, for whom consciousness remains one of the standard “impossible to explain without supernatural intelligence” fallback arguments. As with abiogenesis and the Big Bang, the reasoning typically amounts to: “Science hasn’t explained it and I don’t understand how it could, therefore God did it.” This false dichotomy conveniently removes any obligation to provide evidence in support of the supernatural claim. Creationists also like to flatter themselves that consciousness is a uniquely human trait and thus evidence of special creation. In scientific terms, however, this does not even rise to the level of a hypothesis: it proposes no mechanism, makes no testable predictions, and is unfalsifiable by design. It is, in essence, wishful thinking rooted in the belief that the Universe is obliged to conform to personal expectations.
By contrast, the Ruhr University team have identified three distinct levels of consciousness and demonstrated the evolutionary advantage of each, drawing on detailed studies of birds that show parallel forms of consciousness to those seen in humans. These levels are:
Basic arousal — such as the perception of pain, which signals that harm is occurring and that corrective action is required.
General alertness — awareness of the broader environment, allowing threats and opportunities to be recognised and responded to appropriately.
Reflexive (self-)consciousness — the ability to place oneself within an environment, learn from past experience, anticipate future outcomes, and formulate an action plan; in other words, to construct a narrative with oneself as a participant.
If we take creationist claims about the human body at face value – that we are the special design of an omniscient, omnipotent creator god – we would have to conclude that this putative god equipped us for life in small, dispersed bands of hunter-gatherers, entirely free from the pressures of modern urban existence. That is the inescapable implication of new work by Daniel P. Longman of the School of Sport, Exercise and Health Sciences, Loughborough University, UK, and Colin N. Shaw of the Department of Evolutionary Anthropology, University of Zürich, Switzerland.
In their study, recently published in Biological Reviews, they argue that human evolutionary fitness has deteriorated markedly over the past 300 years, beginning with the Industrial Revolution. They attribute this to the escalating stresses of urban life, which are increasingly linked to counter-survival problems such as declining fertility rates and the rising prevalence of chronic inflammatory conditions, including autoimmune diseases. They also highlight impaired cognitive function in urban settings, with chronic stress playing a central role in many of these conditions.
As they note, our stress responses were shaped in environments where predators such as lions posed intermittent but existential threats. A sudden burst of adrenaline and cortisol – the classic fight-or-flight reaction – made the difference between survival and being eaten. Today, however, we summon exactly the same physiological response to traffic noise, difficult conversations with colleagues or family, and that irritatingly arrogant but ignorant creationist on the Internet. Where a lion encounter would once have been an occasional shock, we now experience the physiological equivalent of facing several lions a day.
For creationists, this poses an awkward problem. An omniscient designer should have foreseen humanity’s future circumstances and endowed us with a physiology robust enough to cope with them. Evolution, by contrast, cannot predict even the next generation, let alone the demands of life tens or hundreds of millennia later. It optimised our ancestors for survival on open African landscapes, not for navigating congested cities, chronic noise, 24-hour information streams, and the relentless stimuli of modern technology. This helps explain why our inherited design is increasingly mismatched to our environment, and why evolution cannot adjust us quickly enough to keep pace.
My own family history illustrates this accelerating mismatch. My grandparents grew up in rural Oxfordshire, before the arrival of the motor car, electricity, modern sanitation, or powered heating. Their lives were essentially unchanged from those of their parents and grandparents. My parents, by contrast, had electricity, piped water, proper sanitation, and radio; later a motor car, a television, and eventually a telephone. Now we have smartphones, laptops, air travel, satnavs, and city centres jammed with traffic. We spend hours each day staring at screens, communicating instantly across the world. My grandparents’ lives would have been recognisable to their great-grandparents, but mine would be unrecognisable to them – such has been the accelerating pace of technological change. No evolutionary process could possibly adapt a species to that speed of environmental transformation.
We are, in effect, experiencing stress levels akin to those of ancestors living among a pride of lions, not merely encountering one on rare occasions. And crucially, we have little or no time to recover before the next ‘lion’ appears.
Research led by the University of Bristol and published in the journal Nature a few days ago suggests that the transition from simple prokaryote cells to complex eukaryote cells began almost 2.9 billion years ago – nearly a billion years earlier than some previous estimates. Prokaryotes — bacteria and archaea — had been the dominant, indeed the only, life forms for the preceding 1.1 billion years, having arisen around 4 billion years ago, about 300 million years after Earth coalesced.
Creationists commonly forget that for the first billion or more years of life on Earth, it consisted solely of single-celled prokaryotes — bacteria and archaea. They routinely post nonsense on social media about the supposed impossibility of a complex cell spontaneously assembling from ‘non-living’ atoms — something no serious evolutionary biologist has ever proposed as an explanation for the origin of eukaryote cells.
There is now little doubt among biologists that complex eukaryote cells arose through endosymbiotic relationships between archaea and bacteria, which may have begun as parasitic or predator–prey interactions before evolving into symbioses as the endpoint of evolutionary arms races. The only questions concern when exactly eukaryote cells first began to emerge, and what triggered their evolution.
The team collected sequence data from hundreds of species and, combined with fossil evidence, reconstructed a time-resolved tree of life. They then used this framework to resolve the timing of historical events across hundreds of gene families, focusing on those that distinguish prokaryotes from eukaryotes.
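The arithmetic at the heart of molecular dating is simple, even though the Bayesian machinery used in studies like this is not: genetic distance accumulates at a roughly clock-like rate, so a fossil-calibrated rate converts distance into time. Here is a back-of-the-envelope Python sketch with invented numbers, chosen only so that the answer lands near the timescale under discussion; it is not the Bristol team's analysis.

```python
def divergence_time_myr(substitutions_per_site: float, rate_per_site_per_myr: float) -> float:
    """Strict-clock estimate: distance between two lineages accumulates along
    both branches since their split, so t = d / (2 * r), in millions of years."""
    return substitutions_per_site / (2.0 * rate_per_site_per_myr)

d = 0.58      # hypothetical pairwise distance (substitutions per site) for a gene family
r = 1.0e-4    # hypothetical calibrated rate (substitutions per site per million years)
print(f"Estimated divergence: about {divergence_time_myr(d, r):,.0f} million years ago")
```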
One surprising finding was that mitochondria were late to the party, arising only as atmospheric oxygen levels increased for the first time — linking early evolutionary biology to Earth’s geochemical history.