The obliteration of life: depersonalization and disembodiment in the terabyte era

Post-genomics allegedly allows us to become the “managers” of our own health. And yet, human individuality seems to dissolve into massive data streams. What is the fate of the human subject in the terabyte age? The Human Genome Project already resulted in personalizing and depersonalizing trends, exemplified by two types of genomes: the anonymized Human Reference Genome versus the personal genomes of genomics celebrities. This ambiguity is radicalized by post-genomics. Life becomes “obliterated”: dissolved into letters and symbols (e.g. the nucleotide alphabet), but this obliteration is complemented by re-personalizing trends. As a case study, I will analyze the Snyderome, involving a prominent geneticist who closely monitored “everything” with the help of precision diagnostics, resulting in a comprehensive (“high coverage”) omics portrait that is highly personal and highly impersonal at the same time, captured in massive data sets and setting the stage for a digital panopticon: a molecularized “conscience”, the superego of the terabyte age.


Introduction
The Human Genome Project (HGP, launched in 1990) resulted in an "initial" (IHGSC 2001) and a "finished" composite sequence (IHGSC 2004). The latter is known as the Human Reference Genome (HRG) and is periodically updated. But as genome sequencing became "normal science", the focus of attention shifted to various post-genomics fields, notably personalized genomics and epigenomics. On the one hand, post-genomics allegedly opens up new practices of the Self, enabling individuals to become health "entrepreneurs" (Harvey 2009). On the other hand, human individuality seems to dissolve into massive data streams. What is the fate of the human subject in the terabyte (or even petabyte) age? For authors such as Fukuyama (2002), the idea that the HGP was about to reveal the "factor X" (i.e. the uniquely human genetic endowment that supposedly would explain our exceptionality, as not only highly intelligent and creative, but also chronically discontent and unhappy beings) became a matter of concern. How would bioengineers use this new sway over human existence? The HGP seemed to pave the way for the leap into post-humanity. Yet, the "factor X" (the unique human genetic "essence") was never found. At face value, there is nothing remarkable or special about the human genome (Zwart 2009). Rather, as a quasi-artistic "portrait", the typical output of DNA sequencing machines (displayed in the above figure) conveys a sense of hyper-modernistic anonymization. In the context of high-throughput sequencing, our "essence" or core identity, instead of being secured at last, evaporates in massive data sets.
This sense of anonymization is conveyed not only by the outcomes, but also by the methodology of genomics research, which is conducted by large-scale, multinational consortia, involving thousands of researchers and relying on automation to increase precision, standardization and speed. Genomics publications may list hundreds of "authors" (in alphabetic order), while the bulk of the work is carried out by high-tech machinery. Thus, both at the "subject-pole" (the researchers) and at the "object-pole" (the sequenced individuals) of the knowledge-production process, human individuality seems to dissolve and disappear from view.
At the same time, paradoxically perhaps, the HGP (and related research endeavors) provided a stage for (was given a face by) a select number of highly visible genomics celebrities, such as James Watson, Francis Collins, Craig Venter, John Sulston and George Church. Besides their involvement (as science managers) in the production of highly technical and multi-authored genomics papers (in accordance with the standardized genre formats of scientific discourse), these celebrities also published highly personal autobiographical accounts in book form: imaginative personal assessments of the meaning of genomics for society and the future of human existence (Zwart 2008a), explaining how genomics and post-genomics will revolutionize the world in general and human healthcare in particular. Examples of such books (sometimes written with the help of professional science authors) are The Language of God (Collins 2006) and The Language of Life (2011) by Francis Collins; Regenesis (2012) by George Church; The Common Thread (2002) by John Sulston (with Georgina Ferry); A Passion for DNA (2000) by James Watson; and A Life Decoded (2007) and Life at the Speed of Light (2013) by Craig Venter. Some of the subtitles, such as "DNA and the Revolution in Personalised Medicine" (Collins 2011) or "How Synthetic Biology will Reinvent Nature and Ourselves" (Church and Regis 2012), are quite telling. Furthermore, many of these celebrities have sequenced and published their own personal genomes (Church, Venter and Watson), while Sulston collaborated with artist Marc Quinn to produce a highly personal genomic bio-art portrait (in 2001).
In other words, the human genome endeavor has resulted in two types of genomes. On the one hand the anonymous HRG, a composite sequence (a Mischperson, in Freudian terms) without identity or "face", disconnected from any actual genome sequence belonging to a traceable, identifiable human being. By describing humans in terms of anonymizing strands of letters (the nucleotide alphabet composed of A, C, G and T), personal identity seems to be (literally) "obliterated". On the other hand, by way of "compensation" as it were, HGP celebrities such as Church, Venter and Watson published their individual genomes. The most notable trendsetter is no doubt Craig Venter, who, in his autobiography (2007), explicitly uses genes encountered in his personal sequence as explanatory factors to come to terms with biographical events. These genomes represent an emphatically personal, even egocentric dimension of the human genomics endeavor, for instance by drawing attention (wittingly or unwittingly) to some particular genes associated with specific health problems or personality traits, such as Alzheimer's in the case of Watson; stress-tolerance, thrill-seeking and colon cancer in the case of Venter; and narcolepsy in the case of Church. Thus, at the "subject-pole" of the knowledge-production process, genomics research is both anonymized (through large-scale research consortia and multiple-author publications) and personalized (by research managers publishing personal genomes and genomics memoirs).
Both trends (depersonalization counterpointed by re-personalization), although apparently moving in opposite directions, actually belong together as the front side and the reverse side of high-tech biomolecular innovation. They allow us to see the HGP and its aftermath as an updated enactment of the archetypal Icarus story. Initially, Icarus represents the position of the (overpromising, assumptive) genomics celebrity: eager to take risks, willing to use emerging technologies for unconventional purposes and aiming to reach unprecedented public visibility and "height". Eventually, however, the anonymous (invisible, depersonalized) subject of the genomics era is bereft of its substance, reduced to a mere sequence and drowned in data (in digital litter).
This same effect becomes noticeable when (personalized) genomics enters the everyday life-world. On the one hand, human individuals are encouraged to use high-throughput sequencing data to become the "managers" of their personal health condition (Harvey 2009), relying on high-resolution health transparency, provided by next-generation sequencing and epigenomics information. By combining "static" data (on individualized genome sequences) with "dynamic" data (on the impact of lifestyle, diet and other characteristics, framed as changeable by individuals), personalized medicine summons us to invest in self-optimization (Prainsack 2015). At the same time, in the era of high-pace sequencing technologies, human individuality and subjectivity seem to dissolve in depersonalizing data streams. In other words, genomics and post-genomics, notably the combination of personalized genomics and epigenomics (as complementary endeavors), seem to invoke both self-centeredness and a drastic decentralization of the Self.

Literation as the final stage of symbolization
The tendency toward depersonalization is not an exclusive feature of genomics, but constitutes a basic drive in modern science as such. Psychoanalytic epistemologist Bachelard (1938/1947) already argued that in order for the scientific knowledge process to unfold, scientists must break free from the "immediate" (visible, tangible, fleshy, messy) objects encountered in the prescientific life-world of everyday experience. The basic objective of research technologies is to replace this familiar, mundane, phenomenological world with a new type of experiential space, an artificial environment: a laboratory setting, where new types of objects (laboratory artifacts, unable to exist in the outside world) can be modified and analyzed under controlled circumstances, undisturbed by intrusions, complications and noise. The lab topology incorporates mechanisms of defense, as only small samples of reality are allowed to enter. The typical laboratory object is a minimal, reduced, dismantled object, no longer viable on its own: a target of choice for the scientific cupido sciendi (the researcher's "will to know").
But the scientific subject-position, too, is transformed and decentered ("emptied" as it were) by automation and laboratory equipment (Bachelard 1938/1947; Serres 1972). At the subject-pole, the epistemological rupture notably entails a cleansing of the sociocultural heritage of expectations and associations concerning "nature", "humanity", "embodiment", etc., so that a purified, optimized, reliable, depersonalized and highly functional "subject" remains, dwelling in laboratories, smoothly interacting with (and increasingly replaceable by) machines. As Bachelard (1938/1947) argues, this epistemological rupture separates scientific styles of knowledge production from "prescientific" (intuitive, embodied) modes of thinking and perceiving. Notably, science should cleanse itself from the archetypal images and ideas that remain influential in the realm of public imagination. As a self-proclaimed psychoanalyst of science, Bachelard devoted the bulk of his fascinating oeuvre to tracing the seductiveness and tenacity of these archetypal "obstacles", this psychic "depth", retarding the exponential growth of scientific productivity.
This line of thinking was taken up by Jacques Lacan. Science, he argues, entails a radical modification, both of the "object-position" and of the "subject-position" of science (Lacan 1966). The subject of modern science originally emerged as the Cartesian cogito, with its emphatic rejection (expulsion) of traditional, prescientific knowledge forms as questionable: a subject freed from mundane and religious preconceptions, eventually resulting in a punctual, disembodied subject without substance or "depth" (728). A similar process unfolds at the object-pole, however. The history of the contemporary life sciences is one of stepwise replacement of fleshy, messy, tangible entities (the living being as a visible "Gestalt") by carefully documented and quantified objects known as model organisms (such as T1 bacteriophages, Caenorhabditis elegans, Drosophila melanogaster and the nude, immunodeficient mouse). But even these organisms are systematically "consumed" and "used up" in the course of a process bent on producing texts (Latour and Woolgar 1979). Moreover, even these laboratory creatures function as mere vehicles (still too messy, fleshy, etc., in the end) of more drastically disembodied bio-objects, such as genes. For Delbrück, Luria and other pioneers of bacteriophage research, bacterial viruses became research objects of choice because they basically counted as "naked genes" (Watson 1968/1996). 1 But now, this process has taken a further decisive step in the sense that the gene itself (as the ultimate object of molecular genetics) has begun to evaporate. The term gene (coined in 1909) is becoming a pure signifier, operating as a purely discursive phenomenon, covering up a disconcerting gap, an entity bereft of its substance, a set of symbols associated with a precarious concept that is quickly losing its ontological consistency.
In the aftermath of the HGP, the vicissitudes of this core signifier reflect the emptying of the object-function: see, for instance, the current debate concerning the "demise" of the gene (Moss 2003; Barnes and Dupré 2008). Meanwhile, model organisms (as typical objects of life sciences research) are relentlessly "symbolized": they are dissolving into digitalized data sets, storable, retrievable and processable with the help of standardized collaborative cyber-infrastructures (Leonelli and Ankeny 2012).
From a Lacanian perspective, all this is highly typical, rather than exceptional, for contemporary science. The cupido sciendi (the "Will to Know") of the scientific endeavor amounts to a process of "symbolization": that is, the representation (and eventual obfuscation) of nature with the help of a limited set of algebraic and alphabetic symbols. The lived body becomes overridden by textual elements, and the living being as a Gestalt gives way to a system of signifiers, more easily manageable than real organisms, so that research can be conducted in silico (with the help of computers and databases) rather than in vivo (where life remains messy and recalcitrant). The representation of living entities with the help of alphabetic symbols (A, C, G and T, for instance) exemplifies a process of symbolization which, rather than being curiosity-driven, is driven by desire, by a "will to power". Eventually, it culminates in the "literation" or even "obliteration" of life, so as to drastically increase biotechnology's sway over nature, notably on the level of the "elementary particles of life" (genes, nucleotides, amino acids and the like). Life no longer seems impenetrable (Miller 2001) as the body as a coherent whole (Gestalt) gives way to the molecularized body, allowing bioengineers to operate upon the living with the help of the "algorithms of life". Thus, human life and health will become more manageable in the near future, preparing the way for "algorithmic" governance as the final stage of biopower (Rouvroy 2014).

Symbolization and the onset of biological "literacy"
The symbolization of the living, culminating in the HRG, is already quite noticeable in the work of Gregor Mendel (rediscovered in the spring of 1900), who speculated about "elements" or "factors" (either dominant or recessive) as the ultimate determinants of phenotypical features, represented with the help of alphabetic symbols (Aa, Bb, Cc and so on), where uppercase stands for "dominant" and lowercase for "recessive" (Mendel 1866/1913). Mendel already enacted a symbolization or "literation" of the living, stressing the importance of biological "literacy". Everything pertaining to the garden pea as a living organism was filtered out. For Mendel, Pisum sativum was basically a research tool for studying what geneticists came to call "genes", conceived as autonomous units, passed on to future generations independently of biological categories such as species or sex. In his inaugural lecture, Foucault (1971) presented Mendel (who had published his results in 1866) as a "monster" (37): a voice that spoke too soon; but in the twentieth century his approach quickly gained momentum and genetics became the core discipline of the new biology.
A key aspect of Mendel's "untimely" epistemic "mutation" consisted in the use of letters (Aa, Bb, Cc, etc.) to refer to "elements" that could be either present or absent, and either dominant or recessive. In ancient Greek, the term stoicheia (στοιχεῖα, "elements") referred both to elementary building blocks (of reality or of knowledge) and to the letters of the alphabet, and Mendel used the term "elements" in a similar way. Indeed, he aimed to see through the living organism (the visible Gestalt) in order to read the symbols, the stoicheia, the "letters" within: the genotype in the literal sense of "type".
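Mendel's notation can in fact be enacted quite literally in code. The following sketch is a didactic illustration of mine (the function names are hypothetical, not anything Mendel wrote): it enumerates a monohybrid Aa × Aa cross and recovers the classic 3:1 ratio of dominant to recessive phenotypes from the letters alone.

```python
from itertools import product
from collections import Counter

def cross(parent1, parent2):
    """Enumerate offspring genotypes of a monohybrid cross.

    Each parent is a two-letter string such as 'Aa'; uppercase marks
    the dominant, lowercase the recessive "element" (Mendel's factor).
    """
    # Every gamete of one parent combines with every gamete of the other.
    offspring = [''.join(sorted(pair)) for pair in product(parent1, parent2)]
    return Counter(offspring)

def phenotype(genotype):
    # The dominant trait shows whenever at least one uppercase letter is present.
    return 'dominant' if any(c.isupper() for c in genotype) else 'recessive'

genotypes = cross('Aa', 'Aa')
print(genotypes)  # Counter({'Aa': 2, 'AA': 1, 'aa': 1})
phenotypes = Counter(phenotype(g) for g in genotypes.elements())
print(phenotypes)  # Counter({'dominant': 3, 'recessive': 1})
```

The point of the sketch is precisely the one made above: nothing about the pea as a living organism enters the computation; only the letters do.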
Rather than being precocious, Mendel's literation effort was actually the resurgence of an idea of long standing: that lógos (i.e. words, language and letters) constitutes the intelligible principle pervading nature. Moreover, lógos can be encountered on both sides of the knowledge relationship, for a human being is a reading animal, gifted with language (lógos) and therefore able to discern the lógos of nature: the letters, the stoicheia of life: that which renders the apparently chaotic Real intelligible. Nature and life are incarnations of a primordial text: the language of the Other (in the beginning was the Word, ἐν ἀρχῇ ἦν ὁ λόγος, as the Gospel phrases it), and the basic objective of research is to make this text, this language of the Other, readable. Thus, although it may be impossible to know what Mendel was consciously thinking (Hartl and Orel 1992), his efforts do reflect a recognizable epistemological gesture (Zwart 2008b).
In trying to decipher the letters, moreover, Mendel was not completely at odds with his Zeitgeist. His contemporary Francis Galton was thinking along similar lines (Müller-Wille and Rheinberger 2012, 133). For him, life likewise consisted of a collection of letters, and to read them, biologists had to improve their literacy, so as to achieve a higher level of resolution. 2 Mendel and Galton shared the idea that the stoicheia of life were representable by means of letters, so that the idea of life as a "text" or "code" was born, gaining momentum in the century to come, when symbolizing life with the help of signifiers such as GCA, GCC, GCG and GCT became the dominant trend. This also explains the abundant use of terms adopted from linguistics (such as annotation, duplication, translation, transcription, etc.). Speaking of Zeitgeist, it is certainly no coincidence that the histories of modern genetics and modern linguistics (as key research fields of the twentieth century) coincide in time (Zwart 2013).
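The linguistic framing is quite literal: a sequence is a string over a four-letter alphabet, and "translation" is a dictionary lookup. A minimal sketch (the table fragment below is mine and covers only a handful of the 64 codons of the standard genetic code; the four synonymous signifiers mentioned above all "translate" into alanine):

```python
# A fragment of the genetic "dictionary". GCA, GCC, GCG and GCT are
# synonyms: all four signifiers stand for the amino acid alanine (Ala).
CODON_TABLE = {
    'GCA': 'Ala', 'GCC': 'Ala', 'GCG': 'Ala', 'GCT': 'Ala',
    'ATG': 'Met', 'TGG': 'Trp',  # a few further entries, for illustration only
}

def translate(dna):
    """Read a DNA string three letters at a time and "translate" each
    nucleotide triplet (codon) into an amino acid signifier."""
    usable = len(dna) - len(dna) % 3  # ignore a trailing incomplete triplet
    codons = [dna[i:i + 3] for i in range(0, usable, 3)]
    return [CODON_TABLE.get(codon, '???') for codon in codons]

print(translate('GCAGCCGCGGCT'))  # ['Ala', 'Ala', 'Ala', 'Ala']
```

That several distinct signifiers map onto one signified is itself a linguistic phenomenon: the genetic code, like a natural language, is redundant.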
Mendel was not the only researcher who used letters for "elements". Also in 1900 (the year of Mendel's rediscovery), Karl Landsteiner discovered the blood types, determined by the presence or absence of basic constituents, namely antigens. This likewise resulted in the introduction of a small alphabet of symbols (A, B, AB and O). During the first decades of the twentieth century, similar developments occurred in other branches of research. Quantum physics produced its own alphabet of elementary particles (e⁻, p⁺, π⁺, π⁰, μ, etc.) and similar "alphabets" emerged elsewhere, such as the alphabet of amino acids (Ala, Arg, Asn, Asp, etc.).
This literal understanding of life became a key motif in the new science of life. The American geneticist of Japanese descent Susumu Ohno (1928–2000) acquired everlasting fame by coining the (now discredited) term "junk DNA" to emphasize that our genome apparently contains large chunks of unreadable script. In other words, a significant portion of the "letters" on the human genome was actually "litter", that is, letter-like debris, the remainders of evolutionary dramas of the distant past. Moreover, the most basic principle of life, he argued (in nature as well as in laboratories), is plagiarism. According to Ohno (1987, 1988), evolution largely depends upon plagiarizing a relatively small set of innovations which emerged relatively early in the history of life. What molecular geneticists are doing in their laboratories basically comes down to plagiarizing molecular innovations developed by microorganisms long ago. The concept of plagiarism reinforces the "literal" understanding of life, while at the same time confirming the idea of researchers as subjects "without depth", processing, copying and annotating the stoicheia of life in a systematic, automated fashion, driven by the desire to acquire and propagate a particular kind of literacy. This concurs, I would argue, with the basic message of the sizable mural (on wooden panels, almost one hundred square meters in size) painted by Pablo Picasso in 1957–1958 (during the heyday of molecular biology) for the UNESCO headquarters in Paris, which came to be known as The Fall of Icarus. 3 There is a stark contrast between the fleshy, bathing, living, healthy bystanders here portrayed and Icarus himself: a "minimal" human being, stripped down to his bare essentials, so that only the (genomic) skeleton remains, about to drown (in a sea of data). Indifferent (eye-less) bystanders silently witness how Icarus is about to disappear from view, while his body seems transformed into an X-ray picture.
Picasso himself persistently refused to comment on the "meaning" of his artwork, but others connected it with the nuclear bombs that put an end to World War II (as UNESCO was established in response to the prospect of global nuclear devastation). 4 Genome research began with the work of pioneers such as Hermann Joseph Muller (1890–1967), who studied mutations caused by X-ray radiation in Drosophila, and interest in human genomics was likewise fueled by the genetic damage inflicted by nuclear radiation, which explains why the US Department of Energy (DOE) played such a decisive role (as a funding agency) in the human genome endeavor (Kay 2000). From this perspective, the Fall of Icarus becomes the portrayal of a human being (a research subject) whose "flesh" is obliterated so that the (fragile) genotype is exposed. And it is no coincidence that Picasso painted his mural when, in the aftermath of the discovery of the structure of DNA in 1953, the human "essence" was being disclosed with the help of (fleshless, disembodied) alphabetic symbols. Picasso's X-ray figure exposes humanity's genetic "essence" (DNA). Nuclear bombs and the discovery of DNA, as landmark "achievements" of twentieth-century science, brought about by elementary particle physics and molecular genetics, respectively, converged in disclosing the letters (stoicheia) of matter, energy and life, thereby obliterating the living, which explains why physicists (Delbrück, Schrödinger, Wilkins, Crick, etc.) played such a decisive role in the postwar molecular biology revolution.
The process of symbolization or "literation" implies that living beings as visible, tangible entities are substituted by laboratory artifacts and subsequently stripped down to their bare essentials, captured by sequences of signifiers, and eventually by the barcodes of life, storable and modifiable by means of information and communications technology (Thacker 2005). This process of ob-literation has de-carnated life, making it computational and fleshless, building on a technological way of thinking (represented by Daedalus the technítēs, the archetypal engineer). Yet, much like the fisherman, the shepherd and the peasant depicted in Landscape with the Fall of Icarus by Brueghel in the 1560s, the bystanders (i.e. the speaking masses), immersed in the realities of their daily lives, so far remain indifferent to Icarus's fall. For the time being, the impact of the latter's "fall" (the reduction of human beings and living entities to sequences and barcodes) is hardly noticeable. Notwithstanding the spectacular rhetoric of (post-)genomics, individuals are not reinventing themselves en masse with the help of their personal sequence.

The imperative of data production
For Lacan, symbolization is the basic impetus of modern science, as we have seen. In his Seminars, the first of which actually coincided with the discovery of the structure of DNA in 1953 (Lacan 1953–1954/1975), the symbolization of life by molecular genetics is discussed on various occasions. Life, Lacan argues, is described in terms of nucleotides and genes, that is, in terms of combinations of signifiers, referring to basic elements that can be either present or absent. Molecular biology uncovers life as a "typographical" realm (1957–1958/1998, 147). And Lacan underscores how the notion of "information" has permeated contemporary scientific discourse "with the speed of lightning" (1972–1973/1975, 21/22). Life is conceived in terms of a series of lines and dots, of 0s and 1s, in which all sorts of typos may occur. This notably applies, he argues, to the molecular "information" of genes as strands of DNA, wrapped around each other, from where messages are recorded and distributed: a linguistic phenomenon, basically. Thus, the reproduction of life is ultimately determined by something which in itself is neither living nor non-living: a molecular program, known as the "codon", situated on the chromosomes (Lacan 1971–1972/2011) and characterized by repetitiveness: the tendency of life to continuously reproduce itself. A similar tendency can be discerned at the subject-pole of the knowledge relationship. Molecular researchers themselves are reduced to functional, replaceable subjects, driven by a basic imperative, voiced by the Other (i.e. funding agencies, university boards and other authoritative agencies): continue to produce more knowledge! (Lacan 1969–1970/1991). The researchers involved are not literally told to do so. Rather, this imperative functions as an unconscious injunction, fueling contemporary science as such (121).
Now that we have unraveled the secrets of molecular structures, it seems impossible to put the brakes on the torrent of signifiers and symbolic combinations produced by molecular research. Once this process has been unleashed, it no longer seems an option not to obey its basic Commandment: go on, produce more data (121), also because this process is expected to provide us with an "inconceivable" power over life.
Both the number of genomics papers and the number of genomes (of humans and other species) that are sequenced are growing exponentially, resulting in massive amounts of letters, the bulk of which must be regarded as symbolic "litter". And now that we have entered the tera- or even petabyte era of "precision medicine", 5 the amounts of data produced are reaching staggering levels. Genomics celebrity Craig Venter, for instance, recently launched a new initiative named Human Longevity Inc. (HLI) 6 committed to building the world's largest genotype/phenotype database by sequencing 40,000 genomes per year. The goal is to have 1,000,000 human genomes sequenced by 2020. His initiative rivals the official, publicly funded precision medicine program recently announced by Barack Obama, with Francis Collins (once again, but now as director of NIH) acting as the President's ghost writer. Precision medicine aims to integrate Big Data coming from various clinical, molecular, multi-omics, environmental and behavioral sources, allowing scientists to deepen their understanding of the biological basis of health and disease. The goal is to develop prevention and treatment strategies that take individual variability into account with the help of large-scale biological databases and computational tools for analyzing enormous data sets (Collins and Varmus 2015). In other words, we have entered the post-genomics terabyte world.
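The scale involved can be made concrete with some back-of-the-envelope arithmetic. Assuming, purely for illustration, roughly 100 gigabytes of raw sequence data per whole human genome (an assumption of mine; actual figures vary considerably with sequencing coverage and file format), the stated targets translate into petabytes:

```python
# Back-of-the-envelope estimate of raw data volume for large sequencing programs.
# The 100 GB/genome figure is an illustrative assumption, not a quoted source value.
GB_PER_GENOME = 100

def raw_volume_pb(n_genomes, gb_per_genome=GB_PER_GENOME):
    """Raw data volume in petabytes (using decimal units: 1 PB = 1,000,000 GB)."""
    return n_genomes * gb_per_genome / 1_000_000

print(raw_volume_pb(40_000))     # HLI's stated annual target: 4.0 PB per year
print(raw_volume_pb(1_000_000))  # the goal of one million genomes: 100.0 PB
```

Under this (deliberately rough) assumption, a single initiative's annual output already sits in the petabyte range, which is precisely why the text speaks of a tera- or even petabyte era.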

The terabyte era: from the HRG to the kenotic Self
The newness of these new initiatives resides in a shift of focus from the (anonymous) HRG of "traditional" genomics toward the personal genomes of next-generation sequencing, unique for every human being: a multiplication of the standard (impersonal) sequence ("our" genome) into multiple (personalized) genomes ("my" genome; cf. Venter 2010). But as genome sequences are much too similar to explain diversity, personalized genomics is complemented by other post-genomics endeavors, notably epigenomics: the study of changes in gene expression caused by chemical modification of DNA (via processes such as DNA methylation and histone modification). Genomes are marked with chemical tags responsible for specialization, while organismal plasticity can be explained as molecular responsivity to the environment (the "environome"). Genes are turned on or off to affect interactions between DNA and the cellular protein-making machinery. These contextually induced alterations of gene expression are semi-stable and may be transmitted across generations. For this reason, epigenomics aspires to a broadening of the epistemic horizon, bringing the social and material environment (notably, early-life adversities) into molecular research (Niewöhner 2011, 2015). Through epigenetics, the genome acquires a "life span" (Lappé and Landecker 2015).
But again, this "broadening" of the horizon paradoxically coincides with a "narrowing" of that same horizon as well. Although the focus on genes seems to be opened up to take history and environment into account (so that epigenetic research practices purport to be more "mundane"), the social and natural environment is at the same time thoroughly molecularized. Only a thoroughly symbolized (molecularized) "world" can be taken into account. The environment is reduced to the "environome" or "exposome", the environmental equivalent of the genome (Miller and Jones 2014): that part of external reality which can be sequenced by next-generation sequencing machines. In other words, the environment is made readable as a series of molecular messages. In a similar way, history (adverse life events and stress-evoking challenges) is reduced to traceable molecular evidence (to informational sediment) accumulating within cells.
Taken together, this combination of personalized genomics and epigenomics constitutes the core of life sciences research in the terabyte age, giving rise to an "explosion of new data" (Nature 2010). Big projects within the epigenomics "archipelago" (Meloni 2015a) such as ROADMAP, launched by IHEC (the International Human Epigenome Consortium), are presented as the life sciences equivalent of the Large Hadron Collider (LHC) of the European Organization for Nuclear Research, producing one gigabyte of data a second. But precisely for that reason, epigenomics provides "an excellent theoretical spyglass through which to see the changing thought-style (and possibly ethos) of the biosciences in this early twenty-first century" (Meloni 2015b, 125). Notably, epigenomics aims to move beyond the mechanisms of defense, the epistemic filters of mainstream genomics. More than a century after the inauguration of Mendelian genetics, epigenomics gives the floor to "heretic" and "disgraced" lines of thinking in biology, for which Meloni uses a "psychoanalytic metaphor . . . the return of the repressed" (2015a, 118), revolving around the idea that "historical and psychological traumas" do leave their marks, their litter as it were, on the genome (120). The genome is sensitive to environmental noise.
Epigenomics is expected not only to elucidate normal cell functioning, but also to highlight epigenetic changes that contribute to various recalcitrant diseases such as cancer. Often, such changes respond to environmental factors ("nurture") to which bodies are exposed, so that human health is captured by the equation: Health = Genome + Exposome. The goal of large-scale epigenomics projects such as ROADMAP, ENCODE and BLUEPRINT is to produce hundreds of reference epigenomes, freely available in the public domain for researchers worldwide. Thus, in a recent publication (95 authors) by the Roadmap Epigenomics Consortium (2015), a leap is made from 1 HRG in 2004 to 111 human reference epigenomes now. Rapid release of raw sequence data is of key importance in this process.
BLUEPRINT is the European version of this trend, funded by the EU. The acronym seems symptomatic. Whereas the notion of the genome as a blueprint (abundantly used during the 1990s, the heyday of the HGP) had fallen into disrepute (as life proved much too complex to be explained on the basis of the genome alone), the objective of this new endeavor is to sequence this "other" (Stelmach and Nerlich 2015), higher-resolution version of a blueprint, instructing molecules how to create specific cells. The new powerful combination of next-generation sequencing and epigenomics is expected to succeed where traditional genomics projects fell short (Waggoner and Uller 2015), providing a comprehensive, high-density portrayal of the molecular algorithms of life (Miller 2001).
These initiatives result in insatiable data hunger. In order to achieve epigenomics goals, horrendous amounts of data (procured from human individuals) are needed. And we are all expected to become data donors. The Human Genome Organisation (HUGO) regards "the willingness to share information" as a praiseworthy contribution to society (2007), representing a transformation in health perspective from Me to We (Dickenson 2013). According to the EUROBAROMETER, a significant majority of the European public sees the sharing and disclosure of personal information as a necessary part of modern life. Big data are not something to accept or reject: we are all in it together already. Wittingly or unwittingly, we are implicated in the current terabyte data deluge. Collectively, we create 2.5 quintillion bytes of data daily, so that 90% of the data in the world today have been created in the last two years or so. And it is against this backdrop of "Big Dataism" that personalized genomics and epigenomics evolve.
But once again, the paradox unfolds. On the one hand, epigenomics purports to bring the biomedical body closer to the "lived body", dwelling in a sociocultural life-world. The biomedical concept of the body is allegedly familiarized and personalized, as if molecular biology is finally able to acknowledge a basic phenomenological insight, namely that, rather than being locked up in a disembodied subject-position (as Cartesian egos), we are open to the world, as mundane, intentional beings (as beings in the world), so that the separation of the human world into a "biological" and a "social" domain is finally undone (Meloni 2015b). At the same time, however, this "world" is drastically molecularized. As Niewöhner (2011) phrases it, epigenomics amounts to a "molecularization of biography and milieu". The sociocultural world is only taken into account to the extent that it can be subjected to a process of literation, allowing us to make its molecular lógos readable via automated reading machines.
Thus, on the one hand, a new concept of the body seems to emerge in biomolecular discourse, namely the embedded, biosocial body, open to our sociocultural surroundings, even on the molecular level (Moss 2002), a world-openness which represents a significant rupture with mainstream twentieth-century biology, especially genetics (Meloni 2015b, 139). Genomes are studied "in context", thereby allegedly bridging the nature–nurture gap (Lock 2013) and overcoming the "gene-centric" model of life (Barnes and Dupré 2008; Pickersgill et al. 2013). Both societal influences and bodily plasticity are taken into account, focusing on the molecular marks of sociocultural imprinting and exposure: the molecular language via which the environment "talks to us" (Carey 2012). And yet, biography and milieu are only acknowledged insofar as they can be made readable by the technologies of molecular literacy. Scientists frame their questions in molecular terms, and the world becomes noticeable only in this high-tech format of biomolecular stimuli and responses. In other words, the re-familiarized body dwells in a molecular ambiance, obfuscating the inexorable, unreadable, recalcitrant Real (only noticeable as intrusions or stains, as complications or derailments, in the folds and margins of the ongoing obliteralization process).
Thus, the re-familiarized body at the same time entails a dissolution of individuality in large-scale databases and electronic networks. In the post-genomics era, vast databases of biological information are developed, producing standards of normalcy as points of orientation for personalized Selves. These repositories of genomics, epigenomics and other -omics data (such as methylation data) are massively produced, exemplifying a tendency toward anonymization. Individuality becomes dissolved in large-scale data networks. Like Icarus, human individuality is drowned in a sea of data. We have become aggregates of data: data without bodies. Life is obliterated by "radical" technologies for producing and circulating vast amounts of information at an astonishing pace.
The term Big Data does not refer to a tangible entity or object, but rather to a "hyper-object" (Morton 2013), something which no longer seems in need of mediation or objectification, bypassing language even, generating seamless bits of code (Rouvroy 2014). In other words, human life is becoming kenotic. 7 We are emptying ourselves, dissolving into clouds of bits and bytes, and filled up again with data, as the new discourse of the Other, with its (rigidly anonymized) normativity, framed in terms of normalcy levels, that is, biomedical standard expectations, adapted to an individual's age, sex, ethnicity, etc. (Prainsack 2015). Previous instances of "personalism" and "personization" (Chadwick 2011), previous ideas and concepts pertaining to personhood and identity, are obfuscated or even erased to make room for these new sets of data-based, quantified indicators and operationalizations of me-ness. The "Snyderome case" may serve as a case study, a perfect exemplification of this turn: a meticulously documented condensation, in which all the key trends discussed so far converge under a single heading.
iPOP: the Snyderome case
In 2012, Michael Snyder and his team (the Department of Genetics at Stanford University) published the "iPOP" of a single individual, a 54-year-old male volunteer, whom they had closely monitored over the course of 14 months (Chen et al. 2012). 8 This longitudinal case study resulted in a comprehensive omics portrait ("extremely high coverage"), combining "deep sequencing" (of the genotype) with more than three billion measurements of molecules (i.e. the phenotype). Although the research subject was a "healthy individual", the project at the same time amounted to a case study in the sense of a Krankengeschichte, as two minor viral infections, together with (unexpected) evidence of the subject's propensity for diabetes, constituted the clinical highlights of the story.
Soon, it turned out that the "male volunteer" of this N = 1 experiment (surrounded by qualified personnel and costly equipment) was none other than Michael Snyder himself, the department chair now acting as his own research subject of choice, turning his body into an omics laboratory. The experiment resulted in what has been referred to as the Snyderome 9 or even the Narciss-ome (Dennis 2012). 10 Snyder himself made it known that he plans to remain a study subject for life, 11 adding new sources of information as the process unfolds, including data procured from body samples such as breath, urine, feces ("stool microbiome"), saliva, etc.; in other words, bodily materials released via bodily apertures known in psychoanalysis as "erogenous zones".
When Michael Snyder presented an update of his in-depth N = 1 experiment during the annual HUGO meeting in Kuala Lumpur (15 March 2015), his lively report reminded me of the famous psychoanalytical rule of free association (the "X-ray tomography of the human mind", Grünbaum 2002), summoning patients to say anything that comes to mind, however trivial or unpleasant. As Freud himself phrased it: "We instruct the patient to . . . report to us whatever internal observations he is able to make [taking care not to] exclude any of them, whether on the ground that it is too disagreeable or too indiscreet to say, or that it is too unimportant or irrelevant" (1917/1940, 297). In the era of personalized medicine, this seems once again the rule: record, report, accumulate and analyze anything: data of any kind must be included. Especially waste products (bodily "litter") may contain highly valuable information about what is going on under the surface. Ostensibly healthy individuals become (potential) patients. Thus, the Snyderome project exemplifies the principle of "literation": life in general, but notably life's refuse (life's "litter"), is transformed into letters (litera or littera in Latin). And as sequencing machinery produces streams of data, vast amounts of symbolic sediments are deposited along the way, junk data as it were: the "littoral" dimension of the literation process. 12 Freud himself employed a similar strategy in his Traumdeutung, one might argue, likewise the result of an N = 1 self-monitoring project, as he basically reported and analyzed his own dream material (disregarded as a waste product of mental life by normal science at that time): a self-analysis turning "litter" into texts. An individual's omics data provide a window into the biological "unconscious" as it were (Zwart 2013): producing steady streams of signifiers, associated with ongoing metabolic processes, erupting every now and then as (physically noticeable) symptoms ("spike events").
Thus, personalized genomics has led to a rehabilitation of the individual case study, which also constituted the empirical basis of psychoanalysis.
The Cell article on the Snyderome cited above ends with the hope that more case studies will follow, so that large databases can be compiled with complete time-dynamic profiles for growing numbers of individuals (Chen et al. 2012, 1305). According to Snyder and his colleagues, the idea that a medical examination can be based on a mere handful of conventional measurements must be discarded as "primitive". Why measure five or ten items when you can measure 40,000? In other words, the Snyderome project exemplifies a culmination point, building on several previous paradigm shifts in medicine: from the intuitive "clinical gaze" of the general practitioner (Foucault 1963), via evidence-based medicine of recent decades, up to the era of personalized, high-resolution precision medicine known as "Me medicine" (Dickenson 2013). What used to be diffuse and opaque is now quickly becoming articulate and discrete. Thus, Snyder presents himself as a representative of an avant-garde, and his case study as a window into our personalized, kenotic future. All other "sources of the Self" (Taylor 1989), all the cultural depths and horizons of selfhood, are emptied out or obfuscated, and subsequently refilled, by these data-rich technologies, paving the way for a data-driven Self.
And indeed, the Snyderome project not only serves as a personalized medicine experiment, but also as a test-bed for the latest omics technologies. For besides being the chair of one of the world's most prominent genetics departments, Michael Snyder is also involved in high-tech omics companies, as founder and consultant for Personalis, member of the scientific advisory board of GenapSys and consultant for Illumina, for instance. 13 In other words, kenotic me-ness is not first and foremost a conceptual or rhetorical shift (a new vocabulary or ideology so to speak), but primarily something which pertains to the technological base of post-genomics knowledge production, so that the kenotic Self must be regarded as a power-effect of the new type of scientific information thus produced.
The Snyder case is not without precedents of course. As a more journalistic forerunner, the Experimental Man project should be mentioned, 14 a multiple omics endeavor initiated by science author Duncan (2009), in collaboration with the University of California, Berkeley, and various companies. Besides Duncan's personal genome sequence, its products include an algorithm for a personalized prediction of the risk of having a heart attack, based on multiple measurements (lipids, triglycerides, cholesterol, heart computerized tomography scan, ultrasound imaging of carotid arteries, etc.).
Genealogically speaking, the project builds on a longer tradition of self-quantification, as a subgenre of the practices of the Self, much older even than the Freudian couch. As a first historical precedent, the auto-experiment performed by Sanctorius in the seventeenth century can be mentioned (Zwart 2000; Smith 2007). Sanctorius, professor at Padua from 1611 to 1624, was obsessed with weighing personal body fluctuations. For 30 years, and with the help of a special, self-constructed chair on which he lived, Sanctorius not only meticulously weighed himself, but also everything he ate and drank, as well as his urine and feces, comparing the weight of food intake to that of waste products. By doing so, he systematically quantified himself. Snyder's experiment can be seen as a radicalized, terabyte version of Sanctorius' longitudinal N = 1 self-monitoring endeavor, resulting in a terabyte "Me", a radically externalized Self, produced with the help of high-precision self-diagnostics.
Snyder's idea is that, via high-resolution self-monitoring, human individuals will become the proactive managers of their own health condition (rather than hypochondriacs). Longitudinal multi-omics analysis will allow "us" to take medicine into our own hands, with doctors acting as mere advisors (with whom we will communicate via websites and portals) rather than as "dictators". Individuals will not only monitor huge amounts of body molecules in a detailed manner, but will also heavily wire themselves, so as to register pulse, heartbeat, stress (transpiration) and numerous other indicators continuously. Thus, the focus of attention is displaced from the weight scale to a plethora of high-tech gadgets. Measurements of thousands of factors can be integrated through devices such as iPhones and compared with big data references, available 24/7 at open-source repositories (vast science clouds), so that self-diagnostics can be translated into everyday options (diet, exercise, etc.). It is expected that especially the etiology of mystery symptoms (such as unexplained fatigue) can thus be elucidated.
But once again, the paradox emerges. Rather than opening up practices of the Self, allowing individuals to refashion their own lives, the Big Data repositories which provide reference data (i.e. standards for normality) can easily become a ubiquitous electronic panopticon: a molecularized version of the super-ego, the "voice of conscience" of the terabyte age. On a daily basis, computer "monitors" will be telling individuals that they had better change their lives (Sloterdijk 2009) in order to optimize somatic functioning, to live up to health and normalcy standards, and/or to postpone the impacts of unhealthy lifestyles and ageing.
Concluding remarks: from Big Data to kenotic life
Processes of biomedical symbolization and literation seem to be heading for an Omega point, as omics data claim to make human life fully transparent with the help of data-rich characterizations of individuals, at various stages of health and disease (Prainsack 2015), providing a comprehensive, high-density portrayal of a person's health status, combining static (e.g. gene sequence) with dynamic (lifestyle, responses to environmental challenges, etc.) data, resulting in the "end of medical history" as it were. In the near future, patients may even pay for their healthcare (freely offered to them) with their data: the new currency of the upcoming petabyte age.
This concurs with an analysis by Cetina (2005) who notices a shift of focus in contemporary culture toward "life as such". Enlightenment ideals are "emptied out" (76), she argues, dissolved into networks and automated electronic information structures (77), while individuals are called upon to engineer their own fate, so that the focus of attention now is on molecular and visceral dimensions, rather than on things such as Bildung or even IQ (78). Biology is not destiny, but changeable, open to optimization: a constructive project, fueled by technoscientific promises.
And yet, paradoxically, this cleansing of life produces its own messiness, its own sediment of pointless ("junk") data. New omics tools give rise to a data deluge of often meaningless and incidental bits of data, eavesdropping on the bagatelle of everyday existence, shed by humans on a daily basis and stored and analyzed in digital media. Thus, on the one hand, human life becomes kenotic. We are emptied, externalized, uploaded and then filled up again with data. What was once propagated by deconstructionist philosophy is now happening in real life: the human subject dissolves into the symbolic order of electronic networks. But the exponential growth curve of data production inevitably becomes a key symptom (in the Lacanian sense of "sinthome", Lacan 1975–1976/2005) of contemporary life sciences research itself. For indeed, we become drowned in data, and an obsessive jouissance seems involved in this erupting data spate, fueled by a frantic drive to capture the real through symbolization, amounting not only to an emptying or ob-litteration of life (dissolving individuality into pure data), but also to new forms of messiness, to massive amounts of digital litter, as the recurrence of the inexorable Real.