Post-truth and other truths

In the aftermath of the American elections, discussions about knowledge and truth have been framed by concepts of post-truth and alternative facts. Even among those of us who consider ourselves experts in the processes of truth-making and knowledge creation, this framing has stirred things up, reactivating old fault lines in the STS community (see the Sismondo vs Collins exchanges in Social Studies of Science).

There are many other, very insightful, discussions. For me, Helen Verran framed it best of all, in her examination of post-truth governance, which she recently summarized in a presentation at the ETHOS Lab, IT University in Copenhagen.

Verran characterizes post-truth governance as one of three imaginaries of truth: it is based on an epistemology of market values, rather than on ontological truth. In the post-truth imaginary, what matters, the knowledge that is sought out and produced and evaluated, is knowledge that will inform on the opportunities for return on investment.

This framing helps to understand the place of instruments and techniques of calculation and their dominance in global institutions that direct capital flow–whether for the alleviation of poverty, the energy transition or the survival of refugees. Think of the World Bank’s decision-making processes, think of the UN environmental agency’s evaluations, think of the procedures of EU funding for research. What is calculable can be valued–in all senses of the word–and if it can be valued, it is the kind of knowledge that can be funded. In other words, it is the kind of truth worth investing in.

For Verran, it is opposition to this line of reasoning that characterises many of the populist politicians who denounce the logic of calculation. When calculation takes on overly abstract forms, when the measures become too far removed from lived experience, when metrics are pursued within hermetic systems, a sense of alienation follows. What counts seems irrelevant to what matters.

These are the circumstances in which people appear to be ‘tired of experts’, to pick up on a particular polemic that arose during Brexit campaigns. Think for example of rebellions against technocratic approaches to visas and the appeals for pardons for long-time residents of the Netherlands: their rich, meaningful lives lived here “don’t count” in the visa and asylum procedures; the relationships and identities shaped by the many years spent in a community don’t seem to matter.

Imaginaries of Truth

To return to Verran, and her useful framework: she puts forth that there are currently three co-existing imaginaries of truth, each with their own knowledge-making technologies, political embedding and institutional supports:

  • correspondence theory of truth: abstract objects are derived from physical objects (this is where Collins et al are most comfortable)
  • coherence theory of truth: there is no indexing; the focus is on coherence within a system. The IT world is an application of this, insofar as coherence with a system forms the basis for evaluating programming rules (Sismondo tends to this view)
  • grounded theory of truth: empirical facts reveal themselves; truth is an event

She invites us to consider that we, as critical scholars, can translate between these imaginaries, but that ‘something happens’ when we do and that we should pay attention to this. What is lost, and what is gained, in translation?

This framework and its implications are well articulated in Verran’s lecture and well worth a watch, especially given its focus on the prevalence of the coherence theory of truth in the epistemology of IT–fitting, for a talk given at ITU. It is also worth noting that these distinctions are eminently relevant in the current context, but not so new: Haraway set out a grounded epistemology in her manifesto some decades ago.

What triggers me most in this lecture is the suggestion that translation may be not only a necessary step, but also a productive one. Verran warns us of the need for skills to translate and invites us to pay attention to what might happen when we do this translation. If we are to understand such translations, and if we are to see truth as an event unfolding on the ground, then we must also have expertise to apprehend this. This expertise lies not mainly in techniques of calculation but in the arts of listening.

A video recording of Helen Verran’s talk can be found on the Ethos Lab website.



Big Data and Energy: crucial intersection

Data seems to be everywhere in contemporary society, and energy is no exception. Precisely because of this ubiquity, it is important to consider how data is created and used, and how it circulates, so that we can understand the implications for the energy transition, whether in business, private or public life. The intersection of energy and big data must also be taken into account if we are to achieve a sustainable future.

The course I’ll be teaching during my upcoming visit to the Department of Science and Technology Studies at the University of Vienna will equip students to reflect and act on issues around big data, and will draw on conceptual and empirical work in the area of energy and sustainability. Big data is a phenomenon that affects all kinds of domains, from energy to banking to sports to neuroscience, and most of the theories and concepts discussed in the course will be of use in understanding big data (and other data-intensive innovations) in other domains as well.

Big data has a history going back several decades and has been shaped by tools and institutions, with the result that ‘big data’ has its own structures, biases and tendencies. It is therefore crucial to analyse how big data approaches are a specific way of creating knowledge about energy and how this knowledge is used. In particular, we will trace how new forms of measurement yield data that are then combined with particular kinds of statistics and database logics, and how an informational turn is affecting the technologies and infrastructures in the area of energy.

The topics to be addressed in the lectures are:

  • Key concepts in transition theory and big data
  • Impasses in the Energy Transition
  • Data/Facta (Data is made, not found)
  • Hope and Hype around Smart Energy
  • Interface with Energy Data
  • Big Data, Energy, Households and the Self

The course will be held in March 2018 and I’ll be reporting on this blog about our learning.

When Big Data and Energy Meet

In 2018, one of the activities I’ll be pursuing as visiting professor at the Department of Science and Technology studies (blog) at the University of Vienna is giving a course on energy and big data. This will be the occasion to explore with students of the Masters programme in Science and Technology Studies how energy and big data intersect and how this intersection matters for the energy transition. The course will run in March 2018.

Besides the traditional lecture form, we will have some practical sessions where we engage with the material culture of energy. A visit to the ASCR Demo Center at the Seestadt Technology Center is planned.


After Oil

One of the outcomes of the course will build on the work of the Petrocultures collective, published in After Oil. This small book provides a toolbox to analyze narratives of transition. As such, it is a valuable introduction to discourse analysis. Furthermore, the political dimension of transition narratives is foregrounded in this approach. After Oil will provide the theoretical and methodological basis for group work and student presentations. Students will choose a particular narrative to explore, in the form of their choice.


Szeman, I., & Petrocultures Research Group (2016). After Oil. Edmonton, Alberta: Petrocultures Research Group.

Eating Problem

I am experiencing an eating problem. Not an eating disorder, not a food issue, but a problem with eating. It’s been developing these past few months and is now getting to be troublesome. There seems to be a growing disconnection between eating habits, hunger, exercise and diet. This is a multi-dimensional problem, but if I had to state it as a tension or dilemma, it would be something like:

‘Should I eat according to what I expend in a given day, or should I eat according to a baseline diet?’

How did it come to this? It’s a story of arts of listening and techniques of calculation, like others on this blog.


About three years ago, after noting that I was starting to gain weight just by looking at food (hello perimenopause!), I started logging my calorie intake using an app (MyFitnessPal). A few weeks of logging revealed (!) that 1200 calories a day was the baseline intake to maintain my weight. A run or swim meant I could throw in the occasional beer or chocolate chip cookie, or indulge in that match made in heaven: a piece of chocolate and a well-paired whisky. So far so good.

But since the end of November 2016, when I joined the local triathlon club GVAV Triathlon, I’ve increased the number of training sessions in my week, and these sessions have a much more intensive character. For example: a 1.5 hour training on the track in the Stadspark means 535 calories expended, not counting the bike ride there and back. This represents a whopping 45% of my regular calorie intake. So it’s no longer a question of enjoying an additional snack.

So how to deal with this? Should I compensate intake according to effort and have an additional three-course meal on days with intensive training? This path involves putting quite a bit of thought and attention into eating, to keep the extra intake nutritious and well-timed. And frankly, I’m already spending a lot of time on this triathlon business. Or should I keep to a more stable, consistent baseline and simply increase my daily intake, so that it all comes out in the wash and intake and output balance out over the course of the week? This approach takes less effort and is easier to transform into a new eating pattern, with simply more calories in my diet. But it then carries the risks of piling on the weight if for some reason I’m training less that week, and of having energy dips because of large efforts on some days and not others.
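Since this blog is all about techniques of calculation, here is a quick back-of-the-envelope sketch of the dilemma in Python. Only the 1200 kcal baseline and the 535 kcal session come from my logging; the weekly training pattern is invented for illustration.

```python
# Comparing the two eating strategies over a hypothetical week with
# three intensive training sessions (~535 kcal each, like my track sessions).
BASELINE = 1200  # kcal/day to maintain weight, per my logging

extra_burned = [535, 0, 535, 0, 535, 0, 0]  # invented week

# Strategy 1: compensate intake day by day
per_day = [BASELINE + e for e in extra_burned]

# Strategy 2: one flat daily intake so the week balances out
flat = BASELINE + sum(extra_burned) / len(extra_burned)
flat_week = [flat] * 7

# Both strategies deliver the same weekly total...
assert abs(sum(per_day) - sum(flat_week)) < 1e-6

# ...but the flat strategy runs a deficit of roughly 300 kcal on
# training days and a surplus of roughly 230 kcal on rest days:
gaps = [round(f - p) for f, p in zip(flat_week, per_day)]
print(gaps)
```

Which is exactly the dilemma: the weekly books balance either way, but the daily energy dips (and surpluses) do not disappear in the flat version.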

So what is the best approach? I’ll be looking for sources these coming weeks, and perhaps even asking around. Though feeling like the ubernewb at GVAV, I’m not too keen to exhibit my ignorance there.

Measuring People up and Database Logic

Average Joe and Plain Jane are two North American ways of expressing in everyday speech the typical, normal, just plain, average person. In Weten Vraagt Meer Dan Meten, the concluding chapter points to the tendency in professional and expert contexts to compare people via measurements, with a particular emphasis on averages.

The various contributors to this book brilliantly signal a key social and epistemic issue, as stated by the punchy title. The current situation could also be labelled the “metrics paradigm”, a way of knowing in which the average has indeed been a ubiquitous statistical description of population measurements. Within this paradigm, our knowledge of populations is taking on a new vernacular, that of the probabilistic.

To contrast these related but distinct statistical descriptions, the example of composite images of pathological features is especially useful. Specifically, in the article Voxels in the Brain, I compared the average photographs of Francis Galton with the probabilistic atlases of the ICBM/Human Brain Project.

Average Images

The photographic plates on which the images of different representatives of a group were imprinted were intended as a summation of instances, produced through mechanical objectivity. The individual contributions to the image were meant to enforce essence through repetition, while erasing out (or averaging out) the ad hoc variations that were accidental to the essential features. The controlled exposure of the plates (distance to the subject, time of exposure, presumably also lighting) ensured that each case was objectively and precisely added to the summary image. In this process, the images blended; individual instances were ‘lost’, distilled, merged in the service of producing the whole.

In the probabilistic atlases, there is also a calculative logic that partakes of what became the “metrics paradigm”, of which Galton was a key proponent. Collective measurements produced according to standardised practices are the way to gain insights about normality and abnormality. But the mediation of this knowledge and the apparatus used to distil data into meaningful outcomes differ in significant ways from Galton’s approach.

Understanding these differences is relevant for thinking through how we treat individuals and how we distinguish the collective (populations, pathological groups) and, increasingly importantly, how we position individuals with regards to groups.

Probabilistic Images

In the average photographs, the individual measurements are pictures, and these are physically aggregated into an accumulation of collective photonic traces on the photographic plate. In the brain atlases, the individual brain scans are digital measurements, organised in a space of voxels. This space is standardised, to allow the collation of measurements from different individual brain scans. The values of individual voxels can be expressed as collections of measurements—an average brain, for example, as an end product, close to the average photographs of Galton.

(Paul M. Thompson, Michael S. Mega, and Arthur W. Toga, 2000)

The mediation of digital scans is different from that of optical photography, however, and the suites of technologies associated with digital scans support a different approach to the manipulation of individual cases in relation to group outcomes.

These possibilities form a ‘database logic’ (De Rijcke and Beaulieu 2014). In digital atlases, there is a different relation to the various constitutive elements. Since the individual measurements are not aggregated physically, as in Galton’s photographs, each measurement remains available, retrievable. The individual case is not merged into the whole in producing an average (or other statistical description). This means that the individual can be retrieved from the collective, and that this relationship to the collective can also be calculated and expressed. This is where variation from the norm becomes exquisitely manipulable in a digital context–in the sense that you can do all kinds of things with it.

The relationships between the different measurements can be expressed probabilistically, making variation within a group apprehensible. An important difference is that variation can be explored in a plastic way—much more so than with the optical photographs that are produced as the outcome of a uni-directional flow and where the individual cases become fixed in the collective representation. The database enables a back-and-forth between the individual and the collective. Specific structures in the brain can be quantitatively shown to vary more or less; individual brains can be shown to deviate more or less from the norm. Probabilities, rather than averages, have become the dominant normative descriptor.
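The contrast between the two logics can be sketched in a few lines of code. The arrays below are toy random data standing in for scans; nothing here reflects real atlas procedures, only the structural point that cases remain retrievable.

```python
import numpy as np

rng = np.random.default_rng(42)
# A standardised 'voxel space': ten toy scans of 4x4 voxels each.
scans = rng.normal(loc=100, scale=15, size=(10, 4, 4))

# Galton-style summary: individual cases merge into one average image.
average_image = scans.mean(axis=0)

# Database logic: each individual case remains retrievable, intact...
case = scans[3]

# ...and its relation to the collective can itself be calculated,
# e.g. as a per-voxel deviation from the group norm:
voxel_sd = scans.std(axis=0)
z_map = (case - average_image) / voxel_sd
```

In the photographic plate, only `average_image` would survive; in the database, `case` and `z_map` are equally available, which is what makes the back-and-forth between individual and collective possible.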

Personalised Population?

While the probabilistic brain atlas is a rather specific case (discussed at more length in De Rijcke and Beaulieu 2014), I’ve been noting that this logic is active in many other spheres. It points to a particular kind of “personalisation of population data” that I think is shaping how we think about individuals and populations. For example, even comparisons to ‘averages’ are increasingly personalised. I can compare my running pace or the amount of sleep I get to a personalised population of women who are the same age, weight, and height as me. In these comparisons to the norm, the individual as starting point is emphasised, and the relation to a population is personalised.
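A minimal sketch of what such a personalised comparison amounts to, with all figures invented: the reference group is first filtered down to people ‘like me’, and only then is my position relative to the norm computed.

```python
# Invented reference data: pace in minutes per km.
population = [
    {"age": 45, "weight": 62, "pace": 5.8},
    {"age": 46, "weight": 63, "pace": 6.1},
    {"age": 45, "weight": 61, "pace": 5.5},
    {"age": 30, "weight": 70, "pace": 4.9},  # excluded below: different profile
]
me = {"age": 45, "weight": 62, "pace": 5.6}

# Personalise the population: keep only people with a similar profile.
cohort = [p for p in population
          if abs(p["age"] - me["age"]) <= 2 and abs(p["weight"] - me["weight"]) <= 2]

# My relation to the personalised norm:
slower_than_me = sum(p["pace"] > me["pace"] for p in cohort)
print(f"Faster than {slower_than_me / len(cohort):.0%} of my cohort")
```

The norm itself shifts with the filter: change `me`, and a different population is summoned for the comparison.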

Is this paradoxical–a personalised population? This approach to data is becoming so mundane that the tension inherent in this new social object runs the risk of going unexamined.

The exquisite personalisation that digital data enables is increasingly present in all kinds of products and services, so that knowledge of the individual in relation to a population is becoming extremely widespread. While this type of personalised data has exploded in the last 5-8 years, as we gather and share more and more data through wearables and the growing presence of sensors in our environment, both public and private, it seems to me to be deeply resonant with a modernist project. The personalisation is effected through the accumulation of factors—the more factors for comparison, the better the profile… better in the sense of more personal.

The individual as the sum of its factors: that is the crux of constituting the modern subject. Are we back to Galton’s dream?

Yes and no. One of the main differences is the context in which the subject is created and who is doing the measuring. The relevant factors are not decreed, standardised and implemented by the state, nor are they elaborated in relation to the ambitions of better government of society and to one’s activities, rights or ambitions as a citizen. The current factorisation of individuals, the necessary condition to this personalisation of population data, takes place in a corporate context where better marketing is the ultimate aim and in relation to one’s activities, rights and ambitions as a consumer. If this is to be the shape of our knowledge of populations, I see all kinds of ramifications regarding the accountability and legitimacy of this knowledge.

Unravelling the paradox of the personalised population constitutes a larger project than can be addressed here, but these lines of thought and labels can be a fruitful framing.

Have you heard of Energy Humanities?

My review of a number of recent scholarly contributions on energy and sustainability has just appeared in De Nederlandse Boekengids (November 2017). While diverse and wide-ranging, the many insightful analyses contained in these publications demonstrate the power of imagination in overcoming the current energy impasse. The various authors also apply incisive critical thought to fossil culture, and to the very idea of the climate crisis and of the Anthropocene.

The following works are reviewed:

  • Sheena Wilson, Adam Carlson & Imre Szeman (Eds) Petrocultures: Oil, Politics, Culture, McGill-Queen’s University Press 2017.
  • Imre Szeman & Dominic Boyer (Eds) Energy Humanities: An Anthology, Johns Hopkins University Press 2017.
  • Derk Loorbach, Vanesa Castán Broto, Lars Coenen & Niki Frantzeskaki (Eds), Urban Sustainability Transitions, Routledge 2017.
  • Petrocultures Research Group, After Oil, Petrocultures Research Group 2016.
  • Anna Lowenhaupt Tsing, The Mushroom at the End of the World: On the Possibility of Life in Capitalist Ruins, Princeton University Press 2015; 2017.


Margaret Atwood and Energy

In a piece taken up in Energy Humanities: An Anthology, the grande dame of Canadian letters Margaret Atwood wonders about the power of literature to fight climate change:

Could cli-fi be a way of educating young people about the dangers that face them, and helping them to think through the problems and divine solutions? Or will it just become part of the ‘entertainment business’?

Atwood’s contribution was noted by Jelmer Mommers in De Correspondent when it first appeared. This new genre has also been noted in the Netherlands, where it was the subject of an essay review in the Nederlandse Boekengids. As a researcher in the field of energy, I’m obviously primed, and when browsing in the bookshop on a day off, energy titles do jump out at me.


My top 5

1. De een na laatste dood van het meisje Capone, Isabel Hoving

2. Het tegenovergestelde van een mens, Lieke Marsman

3. The End We Start From, Megan Hunter

4. Gold Fame Citrus, Claire Vaye Watkins

5. The Carbon Diaries, Saci Lloyd





What will bring about the energy transition?

At our meet-up on Friday, we watched and debated VPRO’s documentary De Doorbraak van Duurzaam from the Backlight series. The focus of the documentary is the point we have reached with regards to the financial and technological status of renewable energy: we’ve hit the moment when producing energy from renewable sources is cheaper than producing it from fossil fuels. Therefore, the various interviewees argue, we’ve come to a tipping point, a breakthrough moment. While they stop short of crying out Hallelujah, the language used is jubilant: now that technological efficiency has hit the necessary level, the green breakthrough is inevitable and we are heading the way of renewables. The bottom line, if you’ll pardon the pun, is that the calculative logic of the market creates an irresistible force that will bring about the energy transition: it would be financially too stupid to do otherwise!

Yet, next to this dominant narrative, there are whispers of other dynamics emerging through the cracks. For example on El Hierro, one of the Canary Islands, we hear of the importance of sustainability and of the creation of opportunities for the local community. And it’s precisely the relative importance and consequences of what might drive the energy transition that were at the core of the discussion after the viewing, moderated by Jaap de Wilde (University of Groningen). To briefly summarize a large number of diverse and lively interactions, the energy transition can be the result of:

  • “push” by market forces (it’s the logical thing to do, financially)
  • “pull” by political will (the Chinese, able to put forth long-term and top-down plans)
  • an “imperative” to avoid ecological catastrophe (we have no choice if we want to survive)
  • “growth” of community (there is a range of benefits in creating a new local energy system)

Each of these potential motors of the energy transition results in different configurations of what an energy system based on renewables might look like and on who its beneficiaries could be–with very stark contrasts between the different scenarios. In the discussion, even the ecological advantage of using renewable energy was not seen as a given–there are unsustainable ways of deploying solar panels, batteries and smart grids. As such, the inevitability of the economically-driven transition was considered more than debatable.

To delve deeper into the contrasting drivers

For a useful handle on different scenarios driving the energy transition, I recommend looking at After Oil, especially the chapter ‘Energy Impasse and Political Actors’.


Behind the Event

Each year, the summer school holds an event that is open to the public, with the aims of giving something back to the city that hosts the school, of having an opportunity to connect to our summer school alumni, and to create interaction between the specialists-in-training attending the summer school and members of the public. Such bridges between expert knowledge and collective concerns are a crucial weapon against fact-free politics and a useful way of making knowledge relevant.

With regards to the event itself, this evening was a successful collaboration between the Energy Academy, Tegenlicht and its dedicated representatives, and the Groningen Energy Summer School. We are grateful to the many people who contributed to the meet up, with particular kudos to Tris van der Wal for making this meet-up happen.


I won’t ‘like’ your post: let’s move beyond potshots in public discourse on science

A troublesome ‘list’ has been circulating on Facebook lately, variously taking on the shape of a meme, a coffee mug or a t-shirt. It’s meant as a defense of science in the face of post-truth Trumpianisms and of recent waves of media attention to anti-science activists and extreme deniers.

Here is the pic in question, in one of its many forms:

Posted by many esteemed friends and colleagues these past couple of weeks, it is an image that I struggle with each time I come across it. This repeated and deep discomfort lies both in my inability to endorse it (knowing why others are posting it, and agreeing with them that there is a fight to be fought) and in the dynamics it creates (its frame pushes us into the wrong fight).

This is an attempt to explain why I can’t ‘like’ this post.

How we’re talking about science

In very broad brushstrokes, there is currently a growing tension between two poles. One end is characterized by the dominance of technocratic knowledge: highly abstract knowledge based on calculations and a whole metrics instrumentarium, where data is closely entwined with modelling and simulation. This is the kind of knowledge that is embraced by major global institutions–from the World Bank to the WHO–and enabled by state-supported bureaucracies. It is how we know about world economies, the refugee crisis, global warming, epidemics and many other crucial issues. This kind of knowledge is held by the ‘elite experts’ we are supposedly tired of–to paraphrase some of the recent commentary on Brexit and Trump’s election. The other pole of this tension is the populist appeal to common sense and to the evidence of one’s eyes–imagine Trump speaking on an icy day and stating that we just have to look outside to see that global warming isn’t such an issue. This (and much worse) happens and gets broadly tweeted, reported and broadcast.

I wish this tension were a caricature, but it’s not: this is the repeated and dominant framing of discussions about science in the mainstream media.

(At this point, I should state that my own intellectual and professional investments as a science and technology studies scholar have been to explore the professional production of knowledge, to show the diversity of kinds of knowledge and the important variations in what experts are telling us, how they come to their conclusions, what counts as evidence and how these claims are validated. So in no way am I dismissing the importance of expertise; on the contrary, a proper characterization of this kind of knowledge and of its metrics is a crucial matter that can feed the necessary measures I describe below.)

So what’s wrong with this post?

There are basically two problems with the ‘list’:

First, it contributes to polarisation and simply isn’t going to help the place of science in public discourse. If anything, it’s making it worse. This list is as much of a potshot as Trump’s ‘look out the window’ claims. It is just an appeal to common sense, perhaps one with a ‘rationalistic’ or ‘scientoid’ flavour to it. Furthermore, the ‘controversies’ the list refers to are highly diverse in their scope and nature, and in their social contexts. This list tars all kinds of objections by a diversity of groups with the same brush.

Second, I can’t help but yearn for sub-clauses in these statements. For example: Vaccines work, yes, most of the time for most people if they are properly manufactured, stored and administered, and only if we collectively embrace them, and while they do bring some risks in a very small number of cases, this risk should be weighed with the risk of not using vaccines.

The earth is not flat and we’ve known this in various parts of the world at different times, based on different kinds of research and evidence.

Chem trails have not been widely measured and neither have their purported effects at population level been documented.

And so on…

The way to #StandUpForScience: grounded knowledge

I want to bring in these sub-clauses because they take us out of the potshot dynamic. Most importantly, they are a first step in getting some motion going. It’s essential for science and for public discourse to have a dynamic between the abstract and the particular, between expert and collective knowledge. Rather than potshots, we need circulation. I don’t long for a perfect, modernist cycle of interaction, but for modest moves between positions. We need a framing that leans towards interaction, translation, and conversation as an essential first step to grounded knowledge, to a science that matters and to a better public discourse on knowledge.

My recent reflections on the place of knowledge in politics and public debate have been inspired by the writings and lectures of Helen Verran, Paul Edwards, Anna Tsing and Kalpana Shankar, and by the recent exchange between Sismondo and Collins et al, starting in February 2017 in Social Studies of Science.

Learning to interface at the Groningen Energy Summer School 2017

In one week, 25 PhD students from all over the world will gather at the University of Groningen for ten intensive days of learning on the topic of Global Energy Transition from Local Perspectives.

The programme for this summer school is diverse, from lectures to excursions. Central to this programme is an active involvement for participants and organisers: presentations, discussions, feedback are all part of the deal!

Speakers and participants to the summer school have already co-created a shared and annotated bibliography on ‘YIMBY’ (yes in my backyard) that can be consulted at the Zotero website.

Not only does this interactive approach create a deeper link between participants, it also helps them develop skills that are essential to the energy transition.

These skills will help them

  • work across disciplinary boundaries and communicate their own expertise effectively
  • understand how social, economic, technical and cultural aspects of energy are entwined
  • engage a variety of stakeholders in complex shifts towards new energy systems
  • achieve the assemblages that are needed to connect local solutions and global issues

More on this next week; for now, I’m enjoying delving into the materials participants have submitted: two dozen delicious dissertation chapters-in-progress or papers reporting on their ongoing PhD work.

From Peru to India, from electric vehicles to atomic energy, from participation to obstruction, from citizens to corporate incumbents… Material that will enrich my understanding of the energy transition and stretch the scope of my knowledge.