An Introduction to Cognitive Archaeology

Cognitive archaeology studies human cognitive evolution by applying cognitive-science theories and concepts to archaeological remains of the prehistoric past. After reviewing the basic epistemological stance of cognitive archaeology, this article illustrates this interdisciplinary endeavor through an examination of two of the most important transitions in hominin cognitive evolution—the appearance of Homo erectus about 2 million years ago, and the recent enhancement of working-memory capacity within the past 200,000 years. Although intentionally created stone tools date to about 3.3 million years ago, Homo erectus produced a bifacial, symmetrical handaxe whose design then persisted for nearly the next 2 million years. An enhancement in working-memory capacity may have been responsible for the relative explosion of culture within the past 50,000 years, which included personal ornamentation, highly ritualized burials, bow-and-arrow technology, depictive cave art, and artistic figurines.

Cognitive archaeology (aka evolutionary cognitive archaeology) is most often defined as an approach to studying human cognitive evolution that applies theories and concepts developed in the cognitive sciences to archaeological remains of the prehistoric past. It is based on the premise that the material traces of past activities can be used as clues to the minds that organized those activities. It is an interdisciplinary endeavor, drawing on the data provided by Paleolithic archaeology (Paleolithic literally means "old stone age"), from the first stone tools about 3.3 million years ago to those made about 12,000 years ago, and on interpretive concepts provided by the cognitive sciences, psychology, and other disciplines.
But how can one construct valid arguments about cognition in the past? After all, the actors died long ago and cannot be participants in well-designed experiments. Instead, cognitive archaeology relies on forms of reasoning used in the historical sciences, such as geology and paleontology. It is primarily observational, reasoning about past processes from patterns visible in the present. Experimentation plays a role in testing hypotheses about past processes but is not the primary basis of an archaeological argument. An archaeological argument about cognition relies on a series of linked inferences (Botha, 2010; Wynn, 2009; see Fig. 1).
Archaeologists recover artifacts and patterns of artifacts (a) and from them reconstruct the activities responsible (c). The key piece of reasoning is the inferential link (b). How is this done? Archaeologists base this inference on their knowledge of technical systems in the present and the past. They often confirm inferences through experimentation, by duplicating the production and use of the artifact (e.g., Stout, Schick, & Toth, 2009; Wadley, 2010). Moving from reconstructed activities (c) to the concepts and knowledge behind them (e) requires a second inference (d), again based primarily on knowledge of modern technical systems. In this step, ethnographic information about non-modern systems is also important and, for very early remains, knowledge of non-human primates. A final inference (f) that relies on concepts developed in the cognitive sciences is necessary to get from (e) to (g). It requires that features of the reconstructed knowledge system be linked explicitly to elements of an established cognitive model. Herein lies the challenge of cognitive archaeology, for only with carefully constructed arguments can the assessment achieve modest levels of reliability. But it is not a fool's errand; this form of reasoning is the only way that science can access the minds of prehistoric actors, and thus, despite the challenge, it is a necessary component of any attempt to document human cognitive evolution.
Given the interdisciplinary nature of cognitive archaeology, it is a challenge for scholars to follow all of the relevant scholarly developments. However, collaboration eases the load and also provides more nuanced understandings of the peculiarities of the disciplines involved. We began working together in 2000, Wynn the archaeologist and Coolidge the neuropsychologist, and the following summarizes our thoughts on two of the most salient transitions in hominin cognitive evolution (hominin refers to modern and extinct humans and their known or suspected ancestors).

The First Major Leap in the Evolution of Human Cognition: Two Million Years Ago
In the second edition of our book, The Rise of Homo sapiens: The Evolution of Modern Thinking (Coolidge & Wynn, in press), we propose there were at least two major leaps in the evolution of cognition in hominins. The first major cognitive leap was the evolution of Homo erectus out of an earlier, smaller-brained hominin about 2 million years ago. The earlier australopithecines (comprising many species of bipedal hominins at that time) had brain sizes in the range of modern chimpanzees (around 400 cc), but some made and used stone tools. Their smallish bodies and relatively long arms indicate that they traveled on the ground while foraging but probably slept in nests in trees, like modern chimps. They had varied diets that probably included some meat, but they seemed to be very apelike in terms of brain size, brain shape, and behaviors.
No significant increase in relative brain size occurred in any of the various hominin groups until about 2.5 million years ago, with the advent of hominins assigned to the genus Homo; one tool-making hominin, Homo habilis, had a brain size around 650 cc, which was about 50% greater than that of the australopithecines. However, the body proportions of the habilines remained similar to those of the australopithecines, suggesting they probably still slept in trees. The stone tools of the australopithecines and habilines were relatively simple: mostly sharp stone flakes struck from a pebble core (see Fig. 2). However, nothing known about this earliest technology would have selected for the increase in brain size. These tools were used to butcher animal carcasses obtained through scavenging, and this access to higher-quality nutrition arguably powered the increase (for greater detail on the relationship of meat to the evolution of cognition, see DeLouize). The first apparent evolutionary development in cognition well beyond the ape range occurred with the advent of Homo erectus (upright man) about 2 million years ago, along with a lithic (stone) technology known as Acheulean (see Fig. 3).
The hallmark of this technology is the handaxe, which Homo erectus made by trimming around the margins of a large flake to produce a sinuous cutting edge. In doing so, they also imposed a bilateral symmetry on the tool. Its manufacture clearly required spatial cognitive abilities (the active coordination of dorsal and ventral information from the primary visual cortex) and a hierarchical organization of action that also relied on mechanisms of cognitive control, neither of which is evident in the stone tools of earlier hominins (Hecht et al., 2014; Wynn, 2002). Homo erectus was not just a variant on the standard ape; it was something altogether different in its morphology, behavior, and cognition. A number of evolutionary firsts were associated with Homo erectus: dispersal out of the tropics and into cooler habitats; modern body size and locomotion, including running (Lieberman, Bramble, Raichlen, & Shea, 2009); an increase in female body size and reduced sexual dimorphism (i.e., difference in body size between males and females); and an increase in relative brain size well beyond the ape range, to about 950 cc (modern brains = 1,350 cc). Antón, Potts, and Aiello (2014) have argued that the pivotal feature of Homo erectus's success was the ability to adjust to dynamically fluctuating environmental conditions. These authors did not specify the cognitive abilities involved, but they must have been significant. The most direct clues to these cognitive developments come not from anatomy but from the archaeological record. While many people are aware of the Out-of-Africa hypothesis, according to which all extant humans are genetically related to an effective (reproducing) population of about 2,000 humans who lived in Africa about 80,000 years ago, it is clear that Homo erectus came out of Africa many times over a million or more years. In our 2006 article, we addressed what we thought to be a neglected but critical component of the dramatic increase in the cognitive abilities of Homo erectus, as evidenced by its behavior and stone-tool technology: a full transition to terrestrial sleep (Coolidge & Wynn, 2006). Our hypothesis, in which we proposed that a single integrated sleeping period on the ground (as opposed to in a nest in a tree) would allow for better sleep and a greater percentage of REM sleep, which in turn would allow for not only the consolidation of declarative and procedural memories but also their enhancement, is consistent with a recent study (Samson & Nunn, 2015).

Fig. 1. Artifacts (a) are linked to the technologies that produced them (c), those technologies to the concepts and knowledge from which they derived (e), and that knowledge to its cognitive prerequisites (g) through a series of inferences (b, d, and f).

Fig. 2. The first known stone tools, a pebble core and sharp flakes, dating to about 3.3 million years ago. These were the first tools that were retained and reused, rather than discarded after use; as such, they represent a distinct behavioral change with cognitive implications. Photo credit: Chip Clark; copyright 2010 by the Smithsonian Institution's Human Origins Program. Reprinted with permission.

Fig. 3. Three Acheulean stone handaxes from Boxgrove, England, dating to about 500,000 years ago. These tools were shaped to have bilateral symmetry, indicating specific changes in cognitive abilities for visuospatial integration and more complex hierarchical procedures. Photo credit: Thomas Wynn.
Further, we proposed that the longer REM periods would provide for the rehearsal and priming of likely encounters and threats during the day by dreaming about them at night. Additionally, we noted the wealth of anecdotal evidence for inventions, ideas, music, and art that have been attributed by their creators to dreams, such as Elias Howe's sewing machine, Dmitri Mendeleev's periodic table, Srinivasa Ramanujan's mathematical theorems, Giuseppe Tartini's violin sonata in G minor, and so on. Thus, the contents of extended REM periods would also be potentially fruitful to a waking mind (see Coolidge & Wynn, 2006, for greater detail and empirical evidence on the relationship between sleep and cognition; see also Coolidge & Wynn, in press). Further, substantial research within the past decade has fully supported our 2006 contention that slow-wave sleep, REM sleep, and other sleep stages, as well as perhaps a single integrated sleep period, not only consolidate some types of declarative and procedural memories but may enhance them as well (e.g., Rasch & Born, 2015).
As noted previously (Antón et al., 2014), one key feature of Homo erectus's success may have been the ability to survive in disrupted and fluctuating environments. There is also anthropological evidence that the home territories of Homo erectus expanded tenfold or greater (up to 260 square kilometers, or about 100 square miles). Given current empirical research that supports the strong correlation between working-memory capacity and fluid intelligence, or the ability to solve novel problems (e.g., Shelton, Elliott, Hill, Calamia, & Gouvier, 2009), greater fluid intelligence might help explain Homo erectus's ability to expand its home territory and repeatedly leave Africa, as an enhanced ability to solve novel problems would have served Homo erectus well in the face of territorial expansion and extreme environmental vagaries.

The Second Major Leap in Human Cognition: Two Hundred Thousand Years Ago
In 2001, in our first collaboration, we addressed the evolution of modern executive reasoning (Coolidge & Wynn, 2001). We employed an established cognitive theory, that of executive functions, to understand changes in the archaeological record. We reasoned that as a result of the high polygenic heritability of executive functions (e.g., Engelhardt, Briley, Mann, Harden, and Tucker-Drob, 2015, found 100% heritability for a common executive-functions factor across four individual heritable domains; Friedman et al., 2008, found 99% heritability for a common executive-functions factor across three individual heritable domains), a recent genetic or epigenetic event in the lineage of Homo sapiens might have enhanced executive functions (e.g., sequential reasoning, inhibition, organization, planning) beyond what was then the hominin standard. The production and use of hafted projectile points dating to about 100,000 years ago appeared to us as evidence for sequential reasoning and memory; in other words, a complex linkage of technological steps. We viewed the archaeological appearance of bow-and-arrow technology about 66,000 years ago as evidence for complex sequential reasoning (Coolidge, Haidle, Lombard, & Wynn, 2016). The practice of agriculture, requiring planting, cultivation, culling, and storage (beginning about 12,000 years ago), appeared to us as an excellent example of a task requiring inhibition, as it demanded delaying the immediate gratification of eating seeds in order to plant them and harvest them over varying lengths of time. There is also archaeological evidence for hunting reindeer by interception about 17,000 years ago, and desert traps that remotely captured game appeared about 10,000 to 7,000 years ago. Both of these hunting behaviors required sequential reasoning and the inhibition of prepotent impulses.
As evidence of successful organization and planning, we endorsed what had already been noted by others: the colonization of the Sahul (comprising the regions now known as New Guinea, Australia, and Tasmania) about 60,000 years ago. Klein (2000) and others have noted that such a journey, given that those lands could not be seen from the travelers' point of departure, was a marker of modern behavior and probably modern language, although none of these authors interpreted the cognitive prerequisites in terms of executive functions. We noted that the watercraft themselves were evidence of a multistep technology that required sequential memory and reasoning, as well as highly sophisticated organization and planning.

Another Cognitive Model for Archaeology: Working Memory
In 2001, we began publishing our hypothesis that developments in executive functioning occurred relatively late in human cognitive evolution (Coolidge & Wynn, 2001). In 2005, we recognized that Baddeley's multicomponent model of working memory incorporated many of the classical executive functions from the neuropsychological literature into the central-executive component of his model (Coolidge & Wynn, 2005); the central executive directs two major subsystems, phonological storage and a visuospatial sketchpad. Baddeley's fourth component, the episodic buffer, was hypothesized to serve as temporary memory storage for the central executive and was thought to integrate information from the two subsystems (see Baddeley, 2001, 2007, or Baddeley, 2012, listed in the Recommended Reading selections, for much greater detail about his model). As there was a preponderance of evidence for the genetic heritability of working-memory capacity and its components (as cited previously), the working-memory model fit nicely with our suspicions that some genetic event occurred at some point, perhaps between 200,000 and 100,000 years ago, that enhanced working-memory abilities, giving Homo sapiens essentially modern thinking. We labeled the result of that event enhanced working memory (Coolidge & Wynn, 2005; Wynn & Coolidge, 2010). Although we have heretofore remained vague about the specific nature of the enhancement, we have speculated that one candidate might be an expansion of phonological storage, given Baddeley's work that viewed such storage as a potential bottleneck for language acquisition and comprehension (Baddeley, Gathercole, & Papagno, 1998). Further, if Chomsky and his colleagues (Fitch, Hauser, & Chomsky, 2005) are correct in their supposition that the hallmark of modern language is recursion, expanded phonological storage may have aided recursive thinking.
Regardless of the nature of the enhancement of working memory, the archaeological record suggests that an important development in human cognition emerged sometime after 100,000 years ago (although the genetic event that enhanced working memory might have occurred up to about 100,000 years earlier). Archaeologists have long recognized that the appearance of personal ornamentation, depictive cave art, ritualized burials, bow-and-arrow technology, and enigmatic figurines, like the Hohlenstein-Stadel Lion-man (see Fig. 4; Wynn, Coolidge, & Bright, 2009), began more recently than 100,000 or so years ago and might represent a dramatic cognitive change, whether culturally or biologically driven. Some scholars (e.g., Klein, 2000; Mithen, 1996) proposed that there was a cultural explosion around 50,000 years ago due to some genetic mutation.
In proposing our alternative explanation, based on recently enhanced working memory, we took a closer look at some of these cultural developments (e.g., change in material technologies over time) and argued that they required an increase in working-memory capacity over that required for earlier cultural developments. Compound adhesives are just one example of the kinds of evidence we used. Archaeologist Lyn Wadley (2010) has used experimental archaeology to duplicate the manufacturing procedures necessary for the adhesives recovered on stone tools from the South African Sibudu Cave site (dating to about 70,000 years ago). The artisans at Sibudu produced a compound adhesive made of acacia gum, powdered ocher (a mineral pigment), and small quantities of beeswax. They used the adhesive to bind stone barbs to projectile shafts, so it had to be strong but not brittle: The functional parameters were quite narrow. The Sibudu chemists (and that is not a misuse of the term) had to monitor the heat of the adhesive mixture (over an open fire) and the quality of the compound through its viscosity and color changes, all the while keeping in mind the sizes and shapes of the available shafts and stone barbs. A rote recipe would not work because of the large number of variables injected into the procedure by inconsistencies in the natural ingredients; the artisans had to actively monitor the changes (e.g., Wadley, Hodgskiss, & Grant, 2009). The complexity of this bit of stone-age chemistry rivals anything in the kit of modern hunters and gatherers, and it exceeded anything evident from earlier time periods. Moreover, it is clearly an executive-function task. Other scholars have attributed such changes to a cultural ratcheting effect that did not rely on biological change.
Regardless, we have found that the application of formal cognitive models provides a more grounded approach to documenting the evolution of hominin cognition. Baddeley's working-memory model stands out in particular, as it has substantially more empirical support than any of its cognitive rivals in archaeology. However, other cognitive models do have potential. For example, Malafouris (2013) has presented a strong argument based in embodied cognition for the way in which material culture scaffolds minds to become more than just brains. Further, advances in paleoneurology also underpin and support cognitive archaeology. Paleoneurologist Emiliano Bruner and his colleagues (Bruner & Iriki, 2015; Bruner, Preuss, Chen, & Rilling, 2016) have shown that the brain of Homo sapiens expanded in the parietal lobes compared to more archaic hominins, especially in the area of the precuneus, which is known for multiple higher-level cognitive functions, in particular the ability to envision future scenarios (in conjunction with prefrontal cortices, cingulate cortices, and other regions of the brain; e.g., Addis, Wong, & Schacter, 2007; Cavanna & Trimble, 2006). Currently, contributions from the aforementioned disciplines, such as paleoneurology, genetics, psychology, sleep science, and the cognitive sciences, as well as advanced methodological techniques such as fMRI and other neurophysiological measures, are making evolutionary cognitive archaeology a vibrant and provocative field, with every new archaeological and anthropological discovery potentially becoming important in the understanding of the evolution of modern thinking and modern symbolic culture.