Before the Anthropocene: Part 2

In Part 2 of Before the Anthropocene, Scott W. Schwartz thinks about the social, political, and economic conditions that led to the desire to measure temperature. Through an interrogation of the archaeology of knowledge production, Schwartz argues that the technologies that allow us to quantify and detect the Anthropocene developed as part of a broader historical moment that led to the emergence of a "New Normal." 

Temperature and Control: Measuring the Anthropocene into Existence

Prior to the 17th century, temperature didn’t exist. Only the privileged counted the number of years they’d endured or numerically valued their wealth. The quantification of reality has a political and economic history that is encoded in the technologies that observe and produce this reality. How did the notion of reality as quantifiable become naturalized? Why did metrical concepts like temperature emerge when they did? These questions can be interrogated through the archaeology of knowledge production. Archaeology here is not a stratigraphic metaphor implying layers of knowledge; rather, it denotes an investigation of material (as opposed to discursive) culture. (Lucas 2004) Products of knowledge like temperature or gravity have material strata that become obscured through normalization. Excavating the buried materiality of concepts, specifically by studying the material culture of measurement, challenges the practice of extracting meaning from matter.

Different populations employ different epistemologies, varying in what information they deem worthy of observing, measuring, and codifying into knowledge. The focus here is on the socio-material negotiations that gave rise to today’s dominant epistemology of insidious growth, which emphasizes extraction and accumulation to such an extent that human geologic impact merits classification as its own epoch. A perhaps irreconcilable irony I wish to unveil is that the very advancements in quantitative measurement that allow us to detect and codify the Anthropocene are directly responsible for the planetary tumult that the epoch forebodes.

Thermostatic

Object-oriented ontology posits that every object has a bottomless reserve of withdrawn attributes. (Harman 2010) Of a tree’s endless properties (height, color, smell, velocity, angularity, etc.), how do certain attributes become privileged over others as ‘worth knowing’? Measurement is the socio-political act of valuing and eliciting attributes from an object through the design of an observational apparatus – a social, not scientific, process.

Fig. 1. Endless attributes.

Karen Barad (2007) asserts that isolated phenomena cannot be measured without simultaneously being co-constituted by the measuring apparatus. Rather than simple observation and recording of properties, Barad casts measurement as generative: observational devices actually create the properties they’re designed to detect. An object’s attributes are not inalienable; they emerge from the interaction of observer and observed.

Measurement isn’t about truth. Any number of apparatuses can be devised that output true information. The system of temperature says I’m 37°C, but the relevance of this truth is socially dependent. A device could observe that I speak 4,000 words a day; the truth of this observation doesn’t give it meaning. Motivations for deciding what to observe are of greater consequence in producing knowledge than the results of measurements. Measurement doesn’t observe attributes that exist ontologically outside the measuring apparatus; it generates attributes through the process of discernment.

Examining the material design, construction, and mechanics of an apparatus like the thermometer offers a useful vantage for exploring how, why, to what ends, and in whose interest quantified thermal knowledge is valuable. The mercury thermometer creates a space where four materials (glass, mercury, heat, and numbers) interact to manifest temperature. Fluctuations in the intensity of [1] heat induce the [2] mercury to expand or contract within a [3] glass tube inscribed with [4] numbers. From this material confluence the discursive concept temperature emerges. The shape of the glass, force of heat, pliability of mercury, and cardinality of numeracy combine to create something called temperature.

A key design imperative of temperature is the ability to quantify the observation. Early thermometric devices such as the thermoscope lacked numbers; their measurements were relative, not quantitative. Temperature is not intensive heat, but rather a quantified representation of intensive heat. The thermometer actually generates an extensive proxy property: the length the mercury expands is the observation. Thermometers don’t just measure temperature; they create it by translating a relative intensity into a quantifiable extension.
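
The translation can be made concrete with a minimal sketch, assuming an idealized linear expansion model; the coefficient, column geometry, and function names below are illustrative, not the specifications of any actual instrument.

```python
# Idealized mercury thermometer: an intensive fluctuation (heat) is
# rendered as an extensive, countable one (column length in mm).

MERCURY_EXPANSION = 1.8e-4  # approximate expansion coefficient per °C (assumed)

def column_length_mm(heat_c: float, base_length_mm: float = 100.0) -> float:
    """Translate intensive heat into an extensive length of mercury."""
    # The numbers inscribed on the glass read this length back as 'temperature'.
    return base_length_mm * (1 + MERCURY_EXPANSION * heat_c)

print(column_length_mm(0.0))   # 100.0 mm, marked '0°C'
print(column_length_mm(37.0))  # ~100.67 mm, marked '37°C'
```

The device’s observation is the length; ‘temperature’ is the number assigned to that length.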

Fig. 2. Thermoscope design.

Extensive properties (length, weight) are additive and extractable – within ten meters there are nine discrete meters. Intensive properties (heat, density) describe non-extractable internal compositions – there are not nine discrete units of Celsius within 10°C. A log isn’t heated by adding five degrees Celsius. Changing the log’s temperature demands changing its surrounding conditions (placing it near a fire). Heat doesn’t vacillate through the extraction or addition of cardinal units.
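
The asymmetry can be put as a toy computation, assuming two bodies of identical material so that heat capacity cancels; the figures are invented for illustration.

```python
# Joining two logs: the extensive property (mass) sums,
# while the intensive property (temperature) mixes.

def combine(mass_a: float, temp_a: float, mass_b: float, temp_b: float):
    total_mass = mass_a + mass_b  # extensive: additive
    # Intensive: a mass-weighted mixture, assuming identical specific heat.
    mixed_temp = (mass_a * temp_a + mass_b * temp_b) / total_mass
    return total_mass, mixed_temp

print(combine(2.0, 10.0, 2.0, 10.0))  # (4.0, 10.0): kilograms double, Celsius doesn't
```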

The principles of thermometry (that some materials expand when heated) had been known since antiquity. It was social valuation that necessitated the design of an instrument to translate this phenomenon into an extensive quantity. Further investigation into the material culture of knowledge sheds light on this valuation process.

Precautionary Measures

In 1348 the Black Death completed its sweep of Europe, beginning a nearly 500-year phase of recurring plague. A concurrent period of ecological instability (often referred to as the Little Ice Age) brought cooler weather, increased storminess, increased glaciation, and more drastic annual variation in climate to the Northern Hemisphere. This destabilizing period incubated an epistemological shift emphasizing knowledge production over knowledge management. (Harrison 2015) The production of predictive knowledge promised greater control over populations and environments undergoing fluctuation than the management of fixed ecclesiastic truths; prior epistemology was not oriented toward the directional growth of knowledge. This shift reached ascendancy with the political revolutions of 1848, which entrenched commercial interests over ecclesiastic ones. (Hobsbawm 1962) In response to destabilized medical and ecological conditions, pre-Anthropocene populations began favoring knowledge better equipped to trace causal trajectories. Probabilistic futurity became more valued than eternal mechanics.

The legacy of this transformation is exponential growth in the accumulation of people, wealth, and just about everything else since 1848 (Steffen et al. 2007) – an utterly mutinous rate of growth with no precedent in 200,000 years. Why did this exponential curve erupt when it did? It could have happened earlier, or not at all; we were just as smart 2,000 or 30,000 years ago. This infocratic overthrust was not teleologically inevitable. Its material origin lies in the ability to increasingly quantify phenomena, which offered the illusion of control through predict-ability. Heat, health, and wealth are not inherently quantifiable. Instruments are designed to transform them into quantities.

Fig. 3. Quantitative Machines.

The implementation of quarantine as a response to plague epitomizes this shift. Accounts of plague in Dubrovnik assert that “the city’s doctors did not feel qualified to fight [plague], so they preferred to leave. In their own words, plague made it impossible to practice medicine.” (Blažina-Tomić & Blažina 2015) Doctors considered plague a catastrophe like an earthquake, not a medical concern. The causal ruptures induced by plague reflect the germination of the new epistemology. The diffusion of cause and effect inspired responsibility to be programmed into quantitative observational instruments like quarantine. Quarantine is an algorithmic space designed to extract and isolate individuals for a programmable duration. It can be reprogrammed to operate on different objects (symptomatic individuals, family members, travelers), but whatever the parameters, upon running the program the reality of health is deferred to the algorithmic responsibility of quarantine, with its two outputs: dead or alive.
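
Read as pseudocode, this description might be sketched as follows; the class, its parameters, and the thirty-day duration are my own schematic rendering of the essay’s claim, not a reconstruction of Dubrovnik’s actual procedure.

```python
# Quarantine as a reprogrammable observational instrument:
# parameters select its objects and duration; the output is binary.

from dataclasses import dataclass

@dataclass
class Quarantine:
    target: str         # e.g. 'symptomatic', 'family members', 'travelers'
    duration_days: int  # the programmable isolation period

    def run(self, survives_isolation: bool) -> str:
        # The reality of health is deferred to two outputs.
        return "alive" if survives_isolation else "dead"

program = Quarantine(target="travelers", duration_days=30)
print(program.run(survives_isolation=True))  # -> 'alive'
```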

Reality was transformed from an experience into an output.

Interested Parties

The notion of reality as constituting information that serves as a probabilistically reliable basis for prediction (reality as output) is reflected in contemporaneous economics. The Medici banking empire was built upon financial innovations that employed the same algorithmic reality as quarantine.

When the Medici bank was incorporated in 1397, charging and accumulating interest was outlawed in Christendom; collecting more than the principal on a loan was a legally codified, punishable offense in Florence. The Medici banks hacked this ban on usury through the innovative use of negotiable instruments such as bills of exchange. These instruments act as insurance against currency devaluations: buying a bill of exchange before traveling abroad ensures you’ll be able to exchange dollars for pounds at today’s exchange rate. If the dollar has dipped in value upon your arrival in London, you won’t lose out. If, however, the dollar has strengthened against the pound, you don’t get the extra; the bank profits off this fluctuation. Most Medici profits came from moving these instruments around their various branches in Europe. (Goldthwaite 1987)
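
The instrument’s asymmetry can be rendered as a stylized payoff sketch; the modern currencies and figures are invented for illustration, not drawn from Medici accounting.

```python
# Bill of exchange as insurance: the holder locks in today's rate,
# while the bank absorbs dips and keeps gains.

def settle(amount_usd: float, rate_at_purchase: float, rate_on_arrival: float):
    guaranteed_gbp = amount_usd * rate_at_purchase  # pounds promised by the bill
    spot_gbp = amount_usd * rate_on_arrival         # pounds the dollars fetch on arrival
    bank_delta = spot_gbp - guaranteed_gbp          # bank's gain (or absorbed loss)
    return guaranteed_gbp, bank_delta

print(settle(1000, 0.80, 0.75))  # (800.0, -50.0): dollar dipped, holder protected
print(settle(1000, 0.80, 0.85))  # (800.0, 50.0): dollar strengthened, bank profits
```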

This practice naturalized temporal flux as generative of value, popularizing the idea that wealth is kinetic – if wealth isn’t growing, it’s diminishing. The negotiable instrument observes wealth in units of its future value – the basis of modern capitalism. Hypothetical futures became highly lucrative, reifying predictive capacity as the primary attribute of knowledge. The character of interest had been well known for millennia prior, and one suspects it was banned precisely because its dangers were well appreciated.

Populations certainly prepared and predicted prior to the 15th century. Divination and portents have long been valued, but destabilizing plagues and climate change rendered these efforts probabilistically inadequate when the future itself became a source of profit. Quantifying apparatuses like the thermometer reduce information to data from which patterns and trends can be more easily identified, and causal trajectories more accurately projected. Quantifiable reality was engineered because banking technology made growth the driving economic principle. Growth requires a future, so reality became the information that best programs that future.

Fig. 4. Non-directional growth.

The heat-mercury-glass apparatus that creates temperature is no better at observing relative warmth than a heat-skin-nervous system apparatus, but temperature generates numerically repeatable and probabilistically programmable information. Quantification makes vacillations in heat more predictable. Even if it’s wrong, algorithmic reality is by design predictable, and because it offers the illusion of control, predictability is the foremost concern of modern knowledge production.

Accelerating Reality 

Temperature didn’t exist a thousand years ago, but a device called the accelerator mass spectrometer has been designed to elicit quantitative thermal knowledge from 4,000-year-old oxygen atoms. This device discerns the relative frequency of ¹⁶O to ¹⁸O isotopes as an extensive proxy for archaic atmospheric heat. This retrograde observation offers climatologists a longer record for identifying trends in thermal vacillation. These trends allow knowledge curators to cut out and classify geologic phases of the planet. The methods of observation used to parse the Pleistocene from the Holocene are the same ones that created the programmable reality of quarantine or temperature. This categorical knowledge is produced through the cultivation of an extensive reality that can be extracted and accumulated.
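
As a gloss on how such a ratio becomes a number, here is the standard delta notation in miniature; the sample and standard ratios below are invented, and converting delta values into archaic heat depends on calibrations not shown here.

```python
# Delta-18O: an isotope ratio expressed in parts per thousand
# against a reference standard – an extensive proxy read as heat.

def delta_18O(ratio_sample: float, ratio_standard: float) -> float:
    """Return delta-18O in per mil from 18O/16O ratios."""
    return (ratio_sample / ratio_standard - 1.0) * 1000.0

print(delta_18O(0.0020064, 0.0020052))  # ~0.6 per mil
```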

Growth is programmed into this extensive reality, or rather extensive and quantitative observations have been privileged as ‘real’ because they foster the deterministic delusion that wealth can grow perpetually. This insatiable growth is the engine driving the looming cataclysms of the Anthropocene. Predictive observational devices were designed to program growth into the future. Now that these same devices advise us to halt our growth, those most concerned with growing wealth have begun to doubt the reality produced by these instruments, affirming that quantifiable information was only categorized as reality because it was profitable. 

It is the incipient idea that reality is an output that enables the belief in perpetual exponential growth. This idea is encoded in the material design of observational devices such as the thermometer. It is dubious whether knowledge produced from this ontological vantage can pacify the uncontrollable alteration of planetary ecologies. Predicting our way out of disaster seems unlikely, given that framing the world as determinate and predictable is what induced Anthropocenic conditions.

Fig. 5. Framing the future.

Enlightenment technovations were paid for – probabilistic reality was financed by an epistemology of extractive and accumulative growth. As Nick Land writes, “capitalism is an artificially intelligent space from the future constructing itself from its enemy’s resources.” (2011) We are those enemies, and the Anthropocene is that capitalist space.

References

Barad, Karen. 2007. Meeting the universe halfway: Quantum physics and the entanglement of matter and meaning. Durham: Duke University Press. 

Blažina-Tomić, Zlata & Vesna Blažina. 2015. Expelling the plague: The health office and the implementation of quarantine in Dubrovnik, 1377-1533. Montreal: McGill-Queen's University Press.  

Goldthwaite, Richard A. 1987. The Medici bank and the world of Florentine capitalism. Past & Present 114: 3-31.

Harman, Graham. 2010. Towards speculative realism: Essays and lectures. Winchester, UK: Zero Books.

Harrison, Peter. 2015. The territories of science and religion. Chicago: University of Chicago Press.

Hobsbawm, Eric. 1962. The age of revolution: 1789-1848. New York: Vintage Books.

Land, Nick. 2011. Fanged noumena: Collected writings 1987-2007. New York: Sequence Press.

Lucas, Gavin. 2004. Modern disturbances: On the ambiguities of archaeology. Modernism/Modernity. 11(1): 109-20.

Peirce, Charles S. 1998. The essential Peirce: Selected philosophical writings. Bloomington: Indiana University Press.

Steffen, Will, Paul J. Crutzen, and John R. McNeill. 2007. The Anthropocene: Are humans now overwhelming the great forces of nature? AMBIO: A Journal of the Human Environment 36(8): 614-21.

---

Scott W. Schwartz is currently completing a Ph.D. in Archaeology at the CUNY Graduate Center. His research centers on the materiality of knowledge production. Scott has conducted fieldwork on Neolithic sites in the Orkney Islands, Scotland (2011, 2015) and Medieval Iceland (2012, 2013, 2014). Scott is a regular collaborator with artists in New York, with projects frequently appearing in exhibitions and galleries, some of which can be seen here: askschwartz.com