Fair game: exploring the dynamics, perception and environmental impact of ‘surplus’ wild foods in England 10kya–present

ABSTRACT This paper brings together zooarchaeological data from Neolithic to post-medieval sites in England to explore the plasticity of cultural attitudes to the consumption of wild animals. It shows how, through time, game has been considered variously as ‘tabooed’ and ‘edible’, each categorization having implications for patterns of biodiversity and wildlife management. The essential point is that deeper-time studies can reveal how human perceptions of ‘surplus foods’ have the potential both to create and to remedy problems of environmental sustainability and food security. Perhaps more significantly, this paper argues that understanding the bio-cultural past of edible wild animal species has the potential to transform human attitudes to game in the present. This is important at a time when food security and the production of surplus are pressing national and global concerns.


Introduction
Today, food security and sustainable over-production are prime concerns for governments worldwide as they struggle with the economics and environmental impact of providing sufficient sustenance to feed a growing human population, estimated to reach >9 billion by 2050 (Hoffman and Cawthorn 2013, 764). In order to meet global demands, attention has focused on increasing the productivity of a narrow range of domestic animals (e.g. cattle, sheep, pigs, chickens) and crops (e.g. barley, wheat, maize, rice), with species transported from one country to another, being farmed intensively in regions far beyond their native range. However, it is becoming apparent that a key to the provision of food surplus may be for humans to tap resources that are already plentiful but, due to cultural taboos, are considered 'inedible' (Cawthorn and Hoffman 2014; Farouk et al. 2015; Gruber 2016; Meyer-Rochow, Megu and Chakravorty 2015).
The same case for broadening horizons could equally be made for archaeological research concerning food surplus. In keeping with studies of modern production regimes, archaeological analyses have also centred on the traditional domesticates, in particular plant species, with emphasis placed on how their spread and intensive management enabled the emergence of social complexity and change (Bogaard et al. 2009; Chesson and Goodale 2014; Kuijt and Finlayson 2009; Smith 2006). Less attention has been paid to species that contribute little to human diet in the West today (e.g. Fritz 2007). For instance, the horse, so vital to subsistence strategies in many areas of the world, past and present, is seldom considered within archaeological discussions of food because, as the 2013 'horse-meat scandal' demonstrates, it is not considered 'edible' across much of Europe (Poole 2013). The same is true for many wild animal species: they are given short shrift in archaeological discussions of food even though in the past they made a significant contribution to human diet and were often translocated and managed alongside animals that became the true 'domesticates' (e.g. Vigne, Daujat and Monchot 2015; Carden et al. 2012; Carden 2012; Stanton, Mulville and Bruford 2016).
In some respects, wild resources are the ultimate surplus food. In societies that have plentiful access to agricultural produce, wild resources often represent luxury goods: it is for this reason that the management, procurement, distribution and consumption of wild animals are often correlated with social hierarchy and power-structure complexity, all themes that this volume seeks to address (Allen 2014; Barringer 2001; Cartmill 1993; Griffin 2007; Hamilakis 2003; Sykes 2014a). In other cultures where agricultural production is less reliable, wild plants and animals represent 'risk buffers', low-input resources used to increase human nutrition in times of need (Brashares et al. 2004; Grant 1981; O'Shea 1989). Given the potential food shortage that is currently facing humanity, perhaps it is time to consider carefully whether, in some contexts, wild resources might represent important sources of food for the future. This is not to suggest that the human population should return wholesale to subsistence hunting, as this has certainly had a detrimental impact on global wildlife populations; for instance, the bush-meat trade has been shown to have a negative effect on biodiversity (Brashares et al. 2004; Jenkins et al. 2011). Indeed, there is a growing literature showing how zooarchaeological data can highlight issues of over-exploitation and be used to inform conservation policy (e.g. Dietl et al. 2015; Emery 2007; Wolverton and Lyman 2012; Valenzuela et al. 2016). However, zooarchaeological data can also provide information concerning the mechanisms of sustainable harvesting (e.g. Etnier 2007), and this is particularly applicable to modern situations where wild animals are abundant and even considered to be pests (Cawthorn and Hoffman 2014).
In England, for instance, there are large and rapidly increasing populations of wild animals (Postnote 325 2009). These have been highlighted as a threat to the production of agricultural surplus and to food security because they cause millions of pounds of crop damage annually (MacMillan and Phillip 2008). In this case, it might be argued that game and venison should contribute more to the human diet (Riminton 2013). This is especially so since some of the most problematic species are anthropogenic imports that, in the past, played a significant role in feeding human populations. Indeed, in the medieval period considerable efforts were made to establish and manage these non-native animals in order to generate a surfeit of game that was used as a social currency, being incorporated into gift exchange across the social spectrum (Sykes 2007a; Birrell 2006). Today, however, government documents highlight a 'prevailing negative attitude towards game meat amongst the general public' (Postnote 325 2009). That such an opposing cultural stance can have developed within a few hundred years reveals the plasticity of attitudes to food in general and to wild resources in particular. This paper sets out to explore the bio-cultural dynamics responsible for these shifts. It does this first by reviewing broad-brush patterns of wild animal exploitation in England from the Neolithic to post-medieval period. Against this background, the evidence for medieval England is examined in more detail, with particular emphasis on the exploitation of the two native deer species, the red deer (Cervus elaphus) and the roe deer (Capreolus capreolus), alongside the evidence for three introduced species: the fallow deer (Dama dama), brown hare (Lepus europaeus) and, to a lesser extent, the rabbit (Oryctolagus cuniculus).
The intention of this paper is to demonstrate that deep-time investigations can reveal how human attitudes to 'surplus', and the strategies put in place to generate over-production, have the potential to remedy, but also to create, problems of environmental sustainability and food security. More significantly, I will argue that understanding patterns of wild resource management in the past can have implications for attitudes to 'edibility' and surplus production in the present day.

Methods
In order to examine broad shifts in patterns of wild animal exploitation, zooarchaeological data from 815 English animal bone assemblages dating from the Neolithic to post-medieval periods were synthesized. In addition to the species mentioned above, data pertaining to the aurochs (Bos primigenius) and wild boar (Sus scrofa) were included in the synthesis. These data derived from the work of Sykes (2007b), Hambleton (2008), Allen (2010), Poole (2010) and Serjeantson (2011), all of which used broadly comparable methods (e.g. all are based on number of identified specimens and include only assemblages with >100 identifiable specimens). The exception is that shed antlers, which do not reflect exploitation for food, were included in Serjeantson's (2011) and Hambleton's (2008) datasets but excluded from those of Sykes (2007b), Allen (2010) and Poole (2010); thus wild animal exploitation is artificially inflated for the Neolithic to Iron Age periods compared with the Roman and medieval data. The data for the medieval period (219 assemblages) are shown in more detail in Table 1, which specifies the site type and number of assemblages used to calculate the changing representations of red deer, roe deer, fallow deer and hare between the ninth and eighteenth centuries AD (Figure 2(a-d)).

10,000 years of wild foods in England
The diachronic variation in the frequency of wild animals represented in English zooarchaeological assemblages is summarized in Figure 1. It can be seen that, in contrast to the Mesolithic period, when assemblages are composed almost entirely of wild animals, those from Neolithic sites in England contain less than 5 per cent wild animal remains (Serjeantson 2011; Schulting 2013). Furthermore, where wild animal remains have been recovered on Neolithic sites, they consist primarily of shed antlers from red deer and so cannot be seen as evidence for venison consumption (Serjeantson 2011).
It is unlikely that Neolithic people were ambivalent about wild animals; on the contrary, close relationships must have existed to motivate the many translocations of wild ungulate species that occurred during the period. Recent zooarchaeological and genetic studies make clear that it was in the Neolithic that red deer and wild boar were first introduced to Ireland (Carden 2012) and several Scottish Isles saw the importation of red deer (Mulville 2010; Stanton, Mulville and Bruford 2016). Some of these islands may have functioned as self-sustaining game reserves, the animals being left to roam and breed without fear of natural predation so that they could be hunted on occasions of human visitation. Certainly, some assemblages from the Scottish Isles show heavy exploitation of red deer (Mulville 2010). These Scottish data bring into relief the dearth of evidence for wild animal exploitation in England, strengthening suggestions that a cultural taboo over the consumption of all wild resources, whether mammals, birds or fish, may have existed during the Neolithic (Richards and Schulting 2006; Pollard 2006; Serjeantson 2011; Sykes 2014b). The apparent lack of wild animal exploitation in England seemingly endured through the Bronze Age and Iron Age, for, although rates of representation are artificially inflated by the inclusion of shed antlers (Figure 1), they continue to fall steadily. As in the Neolithic period, however, wild animals were translocated, the brown hare seemingly being introduced to Britain at some point during the Bronze Age or Iron Age (Sykes 2014b, 88-90). The brown hare does not appear, initially, to have been considered 'edible' since Caesar's account of the Britons stated that 'hare...they think it unlawful to eat' (V.12, trans. Handford 1982, 111).
In the absence of human exploitation, it must be assumed that populations of wild animals were kept in balance by predation from the wolves (Canis lupus), bears (Ursus arctos) and lynx (Lynx lynx) that still inhabited England at this time (Hammon 2010;Hetherington 2010;Pluskowski 2010). The situation began to change, however, with the rise and expansion of the Roman Empire, which brought with it new cultural attitudes to the natural world.
Across the Roman Empire, the exploitation of wild animals became part of elite culture (e.g. Anderson 1985; Dunbabin 2004). Although the zooarchaeological representation of wild animals does not appear to increase dramatically on Roman sites in England (Figure 1), this is partly an artefact created by the exclusion of shed antlers from the dataset for this period. Those who have studied the faunal evidence in detail have demonstrated a clear Iron Age to Roman increase in the utilization of wild resources, particularly in large villas and military centres (Allen 2014). Populations of brown hares were cultivated in enclosures known as leporaria, with newly imported fallow deer herds being maintained in larger parks called vivaria (Sykes 2010). The hunting culture transported by the Romans was not to last, however: following the withdrawal of the Roman Empire from Britain, cultural attitudes to wild animals appear to have reverted to the pre-Roman situation, with the re-emergence of a taboo over their consumption (Sykes 2014b). Genetic data suggest that any established fallow deer populations quickly became extinct (Sykes et al. forthcoming), and brown hare populations presumably escaped their leporaria to become feral in the wider landscape.
After approximately four centuries of limited exploitation, the concept that wild animals might represent 'food' gradually returned in the medieval period, starting in about the ninth century AD and reaching an apogee between the twelfth and sixteenth centuries (Figure 1). Before examining this period in detail, it is interesting to reflect upon the changing attitudes to the consumption of wild animals. At the most basic level, trends in exploitation can be seen as charting attitudes to the natural world, which in turn are related to levels of trade as well as settlement and social complexity (Sykes 2014b). For instance, it is noticeable that the two periods in which wild animals were regularly consumed, the Roman and the medieval (ninth to fifteenth centuries), were also those characterized by internationalism, imperialism, urbanism and social hierarchy, combined with belief systems that put humans at the top of the 'chain of being', thus entitling them to dominate nature. These factors put in place the mechanisms for obtaining and distributing exotic species but also for supporting the non-productive populations who wished to express their elite identity through the production, distribution and consumption of game. In England, the links between wild animal exploitation and elite status are particularly evident in the context of the feudal structure of the medieval period, to which we now turn.
The medieval legacy

Figure 2 shows the changing representation of roe deer, red deer, fallow deer and hare through time and according to site type. Altogether it clearly demonstrates that the emergence of wild animal exploitation was driven by the social elite, with high-status sites consistently indicating a higher representation of hare and deer bones than is seen at other site types. Initially, the two native deer species were the focus of hunting: roe deer were the principal quarry between the ninth and eleventh centuries, with red deer becoming the more heavily exploited species between the eleventh and twelfth centuries. It seems possible that over-hunting during this period was responsible for the population declines that are indicated not only by the zooarchaeological record (Figure 2(a, b)) but also, in the case of roe deer, by genetic studies (Baker 2011). Attempts to establish more accessible sources of venison may have been the motivation for the reintroduction of fallow deer, which reappear in English zooarchaeological assemblages during the eleventh century (Figure 2(c)). Genetic analysis of ancient and modern fallow deer indicates that, following their post-Roman extirpation, populations were imported to England from the eastern Mediterranean (Sykes et al. in prep.). The evidence points to contact with the Byzantine Empire, where the elite passion for hunting and the emparkment of wild ungulates persisted unbroken from the Roman period (Ševčenko 2002); this practice was most likely introduced to England around the time of the Norman Conquest (Sykes 2007b).
Certainly, the reintroduction of the fallow deer appears to have brought with it a renewed interest in the production, distribution and consumption of game. Zooarchaeological and historical analyses have shown the centrality of venison in the creation of community and maintenance of order, whereby carcasses or cuts of meat were gifted horizontally between elite households as social currency or distributed vertically through the hierarchy as symbols of largesse (Birrell 2006; Sykes 2007a). So important was this gift exchange to the social economy that large numbers of deer parks were created, at great expense, to enable the production of surplus venison: it is estimated that by 1300 AD more than 3,000 had been established, covering about 2 per cent of the total area of countryside (Rackham 1986, 123; Sykes et al. 2016). Within these parks, warrens were also set up for keeping hares and rabbits, the latter having been introduced to England in the late twelfth century as a source of food (Sykes and Curl 2010). Efforts by the elite to preserve game were not limited to the transformation of the English landscape and its biodiversity (with the introduction of new species) but also involved the eradication of top predators that competed for these sought-after resources: by the end of the medieval period England's bear, wolf and lynx populations had all become locally extirpated (Hammon 2010; Hetherington 2010; Pluskowski 2010). Figure 2(c, d) provides an indication of how successful these measures were, with both fallow deer and hare frequencies reaching unprecedented levels in fourteenth-sixteenth-century assemblages.
Ironically, the efforts that the elite went to in order to maintain a ready supply of game can be cited as the very factors that led to the demise of game consumption within English culture. With what was essentially the farming of deer, hares and rabbits, game became easier to acquire as it percolated into the urban black market and even became a staple of peasant feasts (Birrell 1992, 1996; Sykes 2007a). Figure 2(c, d) shows evidence of increased representation of fallow deer and hare on lower-status rural and/or urban sites in the later medieval periods. As soon as game was no longer the preserve of the elite, it lost its status associations and the aristocracy began to scale down their parks and warrens, reducing their consumption of game, as can be seen particularly in Figure 2(c). Of course, once the elite lost interest in game, the lower social echelons soon did likewise and the levels of exploitation gradually declined across the board. This overarching cultural shift away from game is highlighted in Figure 1, which shows a substantial drop in the frequency of wild animal representation in assemblages dating between the sixteenth and nineteenth centuries, a downward trend in exploitation that has continued in England to the present day (Sykes and Putman 2014; Riminton 2013).

Surplus for the present?
The last 500 years have seen perhaps the greatest change in cultural attitudes to the natural world, arguably a more dramatic change than has been witnessed in the entirety of human history (Thomas 1983). Across Europe, the rapid growth of internationalization, urbanization and intensive farming has seen human populations expand and become increasingly divorced from traditional rural practices and concerns (Sykes and Putman 2014). Whereas the archaeological record would suggest that, as in the Roman and medieval periods, these conditions are conducive to fostering elite hunting cultures, the collapse of feudalism altered both perceptions and expressions of social identity. Rather than desiring to 'dominate' nature, modern English attitudes to the natural world show similarities to those of the Neolithic and Iron Age, with the general public seeing wild animals as sacred icons of the wilderness and, therefore, above human exploitation.
The paradox is that, today, in the absence of top predators (both human and non-human), England's populations of deer, hare and rabbit are expanding, burgeoning to a density that is believed to be higher than at any point in the past; they now represent environmental and agricultural problems in many areas (MacMillan and Phillip 2008). There is a growing need for people to manage these species and, in much the same way that Meyer-Rochow, Megu and Chakravorty (2015) and Gruber (2016) have argued that rats, being both abundant and pests of agricultural surplus, represent a logical source of protein, there is a strong case that game animals ought to contribute more to the human food chain in England (Riminton 2013). Similar calls are being made for other regions of the world, particularly where wild animals were translocated by past generations and now represent ecological problems (Nugent and Fraser 1993; Cawthorn and Hoffman 2014). In New Zealand, for instance, non-native deer that were released in the 1800s were traditionally considered a conservation problem until they were recently 'rebranded' as a food resource. New Zealand venison is now exported globally and is more likely to be present on English supermarket shelves than locally sourced English venison (Riminton 2013; Nugent and Fraser 1993).
It is bizarre that New Zealand venison is being imported to Britain, while locally obtained venison is, in turn, exported to mainland Europe (Riminton 2013). There would seem to be a need to recalibrate British perceptions of wild foods, and one way this might be achieved is to present to the public the deep-time evidence and the biodiversity issues unwittingly created by the actions of the medieval elite and the landed estate deer collections of the nineteenth century (MacMillan and Phillip 2008; Riminton 2013; Sykes and Putman 2014). Another way forward may be to actively promote the democratization of game, as its enduring elite associations serve only to strengthen public opposition to wild animal consumption (Riminton 2013). In other European countries, such as Norway and Sweden, the exploitation and consumption of wild animals is socially widespread and undertaken on a more egalitarian basis (Apollonio, Andersen, and Putman 2010). To a large extent, this situation reflects the socio-political history of Scandinavia, and it would be difficult to superimpose a similar approach in England, where current attitudes are ingrained and equally linked to the country's socio-political past. As such, the outlook for wild mammal exploitation and management in England is concerning. However, studies of the past provide some comfort in this regard since the zooarchaeological record demonstrates that attitudes to the management and consumption of wild animals can change, have changed repeatedly in the past, and will certainly continue to do so.

Conclusions
Cultural attitudes towards foodstuffs are dynamic, with perceptions of edibility fluctuating through time, as is the case for wild animals. Although the 'received wisdom' is that ancient civilizations gained large quantities of their subsistence through hunting, the zooarchaeological evidence suggests that this was not the case in England. In fact, based on the English evidence, humans appear to have spent the majority of the last 10,000 years deliberately avoiding the consumption of wild animals. There is still a cultural tendency towards avoidance; however, today, the parameters have changed: the humans of the Roman and medieval periods left a legacy of exotic animal establishments (hares, fallow deer and rabbits) and extinctions (bears, lynx, wolves) that now require intervention.
Currently, non-native animals (particularly the fallow deer and rabbit) are discussed within the environmental literature as economic problems and, as such, are labelled as 'invasive species' (e.g. MacMillan and Phillip 2008). Nevertheless, there is scope to rebrand these 'problems' as beneficial surplus foodstuffs and, as was achieved in New Zealand (Cawthorn and Hoffman 2014), see them incorporated back into the human food-chain (Riminton 2013). To achieve this, cultural attitudes need to be recalibrated. Zooarchaeology has the potential to help make the cultural case for the benefits of this 'fair game' by highlighting both the bio-cultural history of the wild animals involved and the paradox of the current taboo concerning their consumption. At the same time, the zooarchaeological record contains warnings about the human capacity to over-exploit 'surplus' resources. For instance, the fallow deer introduced during the Roman period were quickly brought to extinction, and there is growing evidence that over-hunting during the medieval period extirpated English red deer and roe deer populations (e.g. Baker 2011; Sykes and Putman 2014). These past examples need to be kept in mind, as there is every possibility that the same could happen again if management strategies go unchecked.
The approach presented here, interrogating zooarchaeological data and bringing the results to bear on modern-day issues, requires the evidence to be considered within its unique bio-cultural context. It would not be possible for the results of this case study to be applied, say, to East Africa, where hunting for bush meat as food surplus is devastating biodiversity (Brashares et al. 2004; Jenkins et al. 2011). Nor are they applicable even to Scotland, where the country's socio-cultural development has evolved a very different attitude to wild resources, with its own unique set of problems. Nevertheless, while the results of this paper are not transferable, the approach is. And it has the potential to enable zooarchaeologists not only to generate insights about past cultures and their attitudes to the natural world but also to effect change in the present.