Haunted libraries, invisible labour, and the librarian as an instrument of surveillance

This post was inspired by the article Intersubjectivity and Ghostly Library Labor by Liz Settoducato, published earlier this month in In the library with the lead pipe. The article, in brief:

Libraries are haunted houses. As our patrons move through scenes and illusions that took years of labor to build and maintain, we workers are hidden, erasing ourselves in the hopes of providing a seamless user experience, in the hopes that these patrons will help defend Libraries against claims of death or obsolescence. However, ‘death of libraries’ arguments that equate death with irrelevance are fundamentally mistaken. If we imagine that a collective fear has come true and libraries are dead, it stands to reason that library workers are ghosts. Ghosts have considerable power and ubiquity in the popular imagination, making death a site of creative possibility. Using the scholarly lens of haunting, I argue that we can experience time creatively, better positioning ourselves to resist the demands of neoliberalism by imagining and enacting positive futurities.

Intersubjectivity and Ghostly Library Labor by Liz Settoducato, In the library with the lead pipe

I also think libraries can be described as haunted, though for reasons other than those Settoducato suggests. That doesn’t mean I think Settoducato is wrong or that the article is bad. On the contrary, I found the article delightful and I learned a lot from it. For example, having not read Foucault myself, this was new to me:

In such examples, books are a necessary component of the aesthetic of librarianship, juxtaposing the material (books and physical space) with the immaterial (ghosts). Juxtaposition is central to Michel Foucault’s concept of heterotopias, places he describes as “capable of juxtaposing in a single real place several spaces, several sites that are in themselves incompatible” (1984, 6). Foucault identifies cemeteries, libraries, and museums among his examples of heterotopias, as they are linked by unique relationships to time and memory. Cemeteries juxtapose life and death, loss (of life) and creation (of monuments), history and modernity as their grounds become increasingly populated. Similarly, libraries and museums embody “a sort of perpetual and indefinite accumulation of time in an immobile place,” organizing and enclosing representations of memory and knowledge (Foucault 1984, 7).

Intersubjectivity and Ghostly Library Labor by Liz Settoducato, In the library with the lead pipe

That passage felt true to me. As I once confessed to an avocado on the Internet…

There are other passages in Intersubjectivity that I think could be expanded upon. For example, while I completely agree with its point that the labour of library staff is largely invisible, I believe that invisibility was prevalent long before neoliberalism. The librarian has been subservient to those who endow the books for hundreds of years.

Richard Bentley, for his part, continued to run into problems with libraries. Long after the quarrel of the ancients and moderns had fizzled, he installed a young cousin, Thomas Bentley, as keeper of the library of Cambridge’s Trinity College. At Richard’s urging, the young librarian followed the path of a professional, pursuing a doctoral degree and taking long trips to the Continent in search of new books for the library. The college officers, however, did not approve of his activities. The library had been endowed by Sir Edward Stanhope, whose own ideas about librarianship were decidedly more modest than those of the Bentleys. In 1728, a move was made to remove the younger Bentley, on the ground that his long absence, studying and acquiring books in Rome and elsewhere, among other things, disqualified him from the post. In his characteristically bullish fashion, Richard Bentley rode to his nephew’s defense. In a letter, he admits that “the keeper [has not] observed all the conditions expressed in Sir Edward Stanhope’s will,” which had imposed a strict definition of the role of librarian. Bentley enumerates Sir Edward’s stipulations, thereby illuminating the sorry state of librarianship in the eighteenth century. The librarian is not to teach or hold office in the college; he shall not be absent from his appointed place in the library more than forty days out of the year; he cannot hold a degree above that of master of arts; he is to watch each library reader, and never let one out of his sight

Library: An Unquiet History by Matthew Battles

“He is to watch each library reader” is a key phrase here. From the beginning, librarians and library staff were installed as instruments of surveillance, a means of protecting property.

Even to this day, I hear of university departments that wish to make a collection of material available for the use of faculty and students, and are so committed to this end that they secure a room, which is no small feat on a campus nowadays. But then the faculty or students refuse to share their most precious works, because they realize that material left in an open and accessible room will be subject to theft or vandalism.

Same as it ever was.

“Social security cards unlock the library’s door.” Image from Amelia Acker.

Presently, a handful of municipal libraries in Denmark operate with open service models. These open libraries rely on the self-service of patrons and have no library staff present—loans, returns, admittance and departing the physical library space are regulated through automated access points. Many public library users are familiar with self-check out kiosks and access to the collections database through a personal computing station, but few patrons have ever been in a public library without librarians, staff workers or security personnel. Libraries that rely on self-service operation models represent a new kind of enclosed environment in societies of control. Such automated interior spaces correspond to a crisis in libraries and other institutions of memory like museums or archives. Under the guise of reform, longer service hours, and cost-saving measures, libraries with rationalized operating models conscript their users into a new kind of surveillance….

The open library disciplines and controls the user by eliminating the librarian, enrolling the user into a compulsory self-service to engage with the automated space. The power of this engagement is derived from a regime of panoptic access points that visualize, capture and document the user’s path and her ability to regulate herself during every movement and transition in the library—from entering, searching the catalog, browsing the web, borrowing information resources, to exiting the building.

Soft Discipline and Open Libraries in Denmark, Amelia Acker, November 3, 2012.

That was written in 2012.

The tools for monitoring and affecting space have proliferated widely in the ‘smart home’ category since then. We have services such as AirBnB that allow all manner of spaces to be made available to others. We have technologies such as Nest that act as combination thermostats, smoke detectors, and security systems, and that learn, using AI to discover patterns of use not readily apparent to the human mind. And then we have the spooky and unpredictable spaces where these technologies interact with each other.

Because of these technologies, many, many spaces are going to feel haunted. Not just libraries:

The other day, after watching Crimson Peak for the first time, I woke up with a fully-fleshed idea for a Gothic horror story about experience design. And while the story would take place in the past, it would really be about the future. Why? Because the future itself is Gothic.

First, what is Gothic? Gothic (or “the Gothic” if you’re in academia) is a Romantic mode of literature and art. It’s a backlash against the Enlightenment obsession with order and taxonomy. It’s a radical imposition of mystery on an increasingly mundane landscape. It’s the anticipatory dread of irrational behaviour in a seemingly rational world. But it’s also a mode that places significant weight on secrets — which, in an era of diminished privacy and ubiquitous surveillance, resonates ever more strongly….

… Consider the disappearance of the interface. As our devices become smaller and more intuitive, our need to see how they work in order to work them goes away. Buttons have transformed into icons, and icons into gestures. Soon gestures will likely transform into thoughts, with brainwave-triggers and implants quietly automating certain functions in the background of our lives. Once upon a time, we valued big hulking chunks of technology: rockets, cars, huge brushed-steel hi-fis set in ornate wood cabinets, thrumming computers whose output could heat an office, even odd little single-purpose kitchen widgets. Now what we want is to be Beauty in the Beast’s castle: making our wishes known to the household gods, and watching as the “automagic” takes care of us. From Siri to Cortana to Alexa, we are allowing our lives and livelihoods to become haunted by ghosts without shells.

Our Gothic Future, Madeline Ashby, February 25, 2016.

How can we resist this future that is being made for us but not with us? One of my favourite passages of Intersubjectivity suggests a rich field of possibility that I can’t wait to explore further:

However, it does not have to be this way. David Mitchell and Sharon Snyder also take up the questions of embodiment and productivity, examining through a disability studies lens the ways in which disabled people have historically been positioned as outside the laboring masses due to their “non-productive bodies” (2010, 186). They posit that this distinction transforms as the landscape of labor shifts toward digital and immaterial outputs from work in virtual or remote contexts, establishing the disabled body as a site of radical possibility. Alison Kafer’s crip time is similarly engaged in radical re-imagining, challenging the ways in which “‘the future’ has been deployed in the service of compulsory able-bodiedness and able-mindedness” (2013, 26-27). That is, one’s ability to exist in the future, or live in a positive version of the future is informed by the precarity of their social position. The work of theorists like Mitchell, Snyder, and Kafer is significant because it insists on a future in which disabled people not only exist, but also thrive despite the pressures of capitalism.

Intersubjectivity and Ghostly Library Labor by Liz Settoducato, In the library with the lead pipe

[An aside: a research library filled with non-productive objects can also be seen to resist capitalism.]

In conclusion, I would like to answer this dear student who asked this important question:

The answer is: yes.
The library staff are the ghosts in the machine.

Digitization is a multiplier and metadata is a fractal

In Harry Potter and the Deathly Hallows (sorry), Helga Hufflepuff’s goblet is stored in a vault at Gringotts that’s been cursed so that every time you touch one of the objects in it, dozens of copies are created. On the cover of the original U.K. edition of the book, Harry, Ron and Hermione are pictured scrambling atop a wave of facsimile’d treasure. I’ve started thinking about special collections digitization like this. Digitization doesn’t streamline or simplify library collections; rather, it multiplies them, as every interaction creates additional objects for curation and preservation

The above is from Harry Potter and the Responsible Version Control of Digital Surrogates, one of the few examples I know of that uses the Harry Potter and the… trope appropriately. It is a post written some months ago by Emma Stanford, Digital Curator at the Bodleian Libraries, but it came to mind this week after reading this from Jer Thorp’s newsletter a couple of days ago:

The amount of data that can be conjured from any given thing is almost limitless. Pick up a plain grey rock from the side of the road, and in moments you can make a small dataset about it: size, weight, colour, texture, shape, material. If you take that rock to a laboratory these data can be made greatly more precise, and instrumentation beyond our own human sensorium can add to the list of records: temperature, chemical composition, carbon date. From here there is a kind of fractal unfolding of information that begins to occur, where each of these records in turn manifest their own data. The time at which the measurement was made, the instrument used to record it, the person who performed the task, the place where the analysis was performed. In turn, each of these new meta-data records can carry its own data: the age of the person who performed the task, the model of the instrument, the temperature of the room. Data begets data, which begets meta data, repeat, repeat, repeat. It’s data all the way down.
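That fractal unfolding is easy to caricature in code. Below is a toy sketch, with field names I have invented for illustration, of how describing an object keeps generating new records that themselves need describing:

```python
from datetime import datetime, timezone


def describe(record: dict, depth: int = 0, max_depth: int = 3) -> dict:
    """Toy illustration: every measurement recorded about an object spawns
    records about that measurement, which in turn get described, and so on."""
    if depth >= max_depth:
        return record

    # Hypothetical provenance fields; the names are invented for this sketch.
    provenance = {
        "recorded_at": datetime.now(timezone.utc).isoformat(),
        "instrument": "unspecified",
        "recorded_by": "unspecified",
    }
    # The description of the record is itself a record that needs describing.
    record["metadata"] = describe(provenance, depth + 1, max_depth)
    return record


rock = {"colour": "grey", "mass_g": 41.3, "material": "basalt"}
print(describe(rock))
# Data begets metadata, which begets meta-metadata, until we choose to stop.
```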

We use computers because they are supposed to make our lives more efficient, but at every layer where they are applied they introduce complexity. This is one of the takeaways I gained from reading the “Designing Freedom” Massey Lectures by cyberneticist Stafford Beer.

The book is very interesting but also a somewhat frustrating read, so if you are interested in learning more, I’d suggest the podcast episode dedicated to the book from the cybernetic Marxists of General Intellect Unit.

Yes. There is now a podcast episode for everything.

If the map becomes the territory then we will be lost

That which computation sets out to map and model it eventually takes over. Google sets out to index all human knowledge and becomes the source and the arbiter of that knowledge: it became what people think. Facebook set out to map the connections between people – the social graph – and became the platform for those connections, irrevocably reshaping societal relationships. Like an air control system mistaking a flock of birds for a fleet of bombers, software is unable to distinguish between the model of the world and reality – and, once conditioned, neither are we.

James Bridle, New Dark Age, p.39.

I am here to bring to your attention two developments that are worrying me:

  1. The Social Graph of Scholarly Communications is becoming more tightly bound into institutional metrics that have an increasing influence on institutional funding
  2. The publishers of the Social Graph of Scholarship are beginning to enclose the Social Graph, excluding the infrastructure of libraries and other independent, non-profit organizations

Normally, I would try to separate these ideas into two dedicated posts, but in this case I want to bring them together in writing, because if these two trends converge, things will become very bad, very quickly.

Let me start with the first trend:

1. The social graph that binds

When I am asked to explain how to achieve a particular result within scholarly communication, more often than not, I find myself describing four potential options:

  1. a workflow of Elsevier products (BePress, SSRN, Scopus, SciVal, Pure)
  2. a workflow of Clarivate products (Web of Science, InCites, Endnote, Journal Citation Reports)
  3. a workflow of Springer-Nature products (Dimensions, Figshare, Altmetric)
  4. a DIY workflow from a variety of independent sources (the library’s institutional repository, ORCiD, Open Science Framework)

Workflow is the new content.

That line – workflow is the new content – is from Lorcan Dempsey, and it was brought to my attention by Roger Schonfeld. For Open Access Week, I gave a presentation on this idea of being mindful of workflow and tool choices, entitled A Field Guide to Scholarly Communications Ecosystems. The slides are below.

(My apologies for not sharing the text that goes with the slides. Since January of this year, I have been the Head of the Information Services Department at my place of work. In addition to this responsibility, much of my time this year has been spent covering the work of colleagues currently on leave. Finding time to write has been a challenge.)

In Ontario, each institution of higher education must negotiate a ‘Strategic Mandate Agreement’ with its largest funding body, the provincial government. Universities are currently in the second iteration of these agreements and are preparing for the third round. These agreements are considered fraught by many, including Marc Spooner, a professor in the faculty of education at the University of Regina, who wrote the following in an opinion piece in University Affairs:

The agreement is designed to collect quantitative information grouped under the following broad themes: a) student experience; b) innovation in teaching and learning excellence; c) access and equity; d) research excellence and impact; and e) innovation, economic development and community engagement. The collection of system-wide data is not a bad idea on its own. For example, looking at metrics like student retention data between years one and two, proportion of expenditures on student services, graduation rates, data on the number and proportion of Indigenous students, first-generation students and students with disabilities, and graduate employment rates, all can be helpful.

Where the plan goes off-track is with the system-wide metrics used to assess research excellence and impact: 1) Tri-council funding (total and share by council); 2) number of papers (total and per full-time faculty); and 3) number of citations (total and per paper). A tabulation of our worth as scholars is simply not possible through narrowly conceived, quantified metrics that merely total up research grants, peer-reviewed publications and citations. Such an approach perversely de-incentivises time-consuming research, community-based research, Indigenous research, innovative lines of inquiry and alternative forms of scholarship. It effectively displaces research that “matters” with research that “counts” and puts a premium on doing simply what counts as fast as possible…

Even more alarming – and what is hardly being discussed – is how these damaging and limited terms of reference will be amplified when the agreement enters its third phase, SMA3, from 2020 to 2023. In this third phase, the actual funding allotments to universities will be tied to their performance on the agreement’s extremely deficient metrics.

“Ontario university strategic mandate agreements: a train wreck waiting to happen”, Marc Spooner, University Affairs, January 23, 2018

The measure by which citation counts for each institution will be assessed has already been decided: the Ontario government has stated that it is going to use Elsevier’s Scopus (although I presume they really meant SciVal).

What could possibly go wrong? To answer that question, let’s look at the second trend: enclosure.

2. Enclosing the social graph

The law locks up the man or woman
Who steals the goose from off the common
But leaves the greater villain loose
Who steals the common from off the goose.

Anonymous, “The Goose and the Commons”

As someone who spends a great deal of time ensuring that the scholarship in the University of Windsor’s Institutional Repository meets the stringent restrictions set by publishers, I find it hard not to feel slapped in the face when reading Springer Nature Syndicates Content to ResearchGate.

ResearchGate has been accused of “massive infringement of peer-reviewed, published journal articles.”

They say that the networking site is illegally obtaining and distributing research papers protected by copyright law. They also suggest that the site is deliberately tricking researchers into uploading protected content.

Who is the “they” of the above quote? Why, “they” are the publishers, the American Chemical Society and Elsevier.

It is not uncommon to find selective enforcement of copyright within the scholarly communication landscape. Publishers have turned a blind eye to the copyright infringement of ResearchGate and Academia.edu for years, while targeting course reserve systems set up by libraries.

Any commercial system that is part of the scholarly communication workflow can be acquired for strategic purposes.

As I noted in my contribution to Grinding the Gears: Academic Librarians and Civic Responsibility, sometimes companies purchase competing companies as a means to control their development and even to shut their products down.

One of the least understood and thus least appreciated functions of calibre is that it uses the Open Publication Distribution System (OPDS) standard (opds-spec.org) to allow one to easily share e-books (at least those without Digital Rights Management software installed) to e-readers on the same local network. For example, on my iPod Touch, I have the e-reader program Stanza (itunes.apple.com/us/app/stanza/id284956128) installed and from it, I can access the calibre library catalogue on my laptop from within my house, since both are on the same local WiFi network. And so can anyone else in my family from their own mobile device. It’s worth noting that Stanza was bought by Amazon in 2011 and according to those who follow the digital e-reader market, it appears that Amazon may have done so solely for the purpose of stunting its development and sunsetting the software (Hoffelder,2013)

“Grinding the Gears: Academic Librarians and Civic Responsibility”, Lisa Sloniowski, Mita Williams, Patti Ryan, Urban Library Journal, Vol. 19, No. 1 (2013). Special Issue: Libraries, Information and the Right to the City: Proceedings of the 2013 LACUNY Institute.
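(An aside for the curious: OPDS catalogues are just Atom feeds, so browsing a calibre library from a script takes only a few lines. The sketch below assumes a calibre content server is running on the local network at the address shown; the host, port, and /opds path are placeholders you would swap for your own setup.)

```python
import urllib.request
import xml.etree.ElementTree as ET

# Assumed address of a calibre content server on the local network; the
# host, port, and /opds path will vary with your own configuration.
OPDS_URL = "http://192.168.1.10:8080/opds"

ATOM = "{http://www.w3.org/2005/Atom}"

with urllib.request.urlopen(OPDS_URL) as response:
    feed = ET.parse(response).getroot()

# An OPDS catalogue is an Atom feed: each <entry> is either a book or a
# navigation link to a sub-catalogue (by author, by tag, and so on).
for entry in feed.findall(f"{ATOM}entry"):
    title = entry.findtext(f"{ATOM}title")
    links = [link.get("href") for link in entry.findall(f"{ATOM}link")]
    print(title, links)
```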

And sometimes companies acquire products to provide a tightly integrated suite of services and a seamless workflow.

And, indeed, whatever model the university may select, if individual researchers determine that seamlessness is valuable to them, will they in turn license access to a complete end-to-end service for themselves or on behalf of their lab?  So, the university’s efforts to ensure a more competitive overall marketplace through componentization may ultimately serve only to marginalize it.

“Big Deal: Should Universities Outsource More Core Research Infrastructure?”, Roger C. Schonfeld, January 4, 2018

Elsevier bought BePress in August of 2017. In May of 2016, Elsevier acquired SSRN. BePress and SSRN are currently exploring further “potential areas of integration, including developing a single upload experience, conducting expanded research into rankings and download integration, as well as sending content from Digital Commons to SSRN.”

Now, let’s get to the recent development that has me nervous.

10.2 Requirements for Plan S compliant Open Access repositories

The repository must be registered in the Directory of Open Access Repositories (OpenDOAR) or in the process of being registered.

In addition, the following criteria for repositories are required:

  • Automated manuscript ingest facility
  • Full text stored in XML in JATS standard (or equivalent)
  • Quality assured metadata in standard interoperable format, including information on the DOI of the original publication, on the version deposited (AAM/VoR), on the open access status and the license of the deposited version. The metadata must fulfil the same quality criteria as Open Access journals and platforms (see above). In particular, metadata must include complete and reliable information on funding provided by cOAlition S funders. OpenAIRE compliance is strongly recommended.
  • Open API to allow others (including machines) to access the content
  • QA process to integrate full text with core abstract and indexing services (for example PubMed)
  • Continuous availability

‘Automated manuscript ingest facility’ probably gives me the most pause. Automated means a direct pipeline from publisher to institutional repository that could be based on a publisher’s interpretation of fair use/fair dealing, and we don’t know what the ramifications of that decision-making might be. I’m feeling trepidation because I believe we are already experiencing the effects of a tighter integration between manuscript services and the IR.

Many publishers – including Wiley, Taylor and Francis, IEEE, and IOP – already use a third-party manuscript service called ScholarOne. ScholarOne integrates the iThenticate service, which produces reports of what percentage of a manuscript has already been published. Journal editors have the option to set the extent to which a paper can make use of a researcher’s prior work, including their thesis. Manuscripts that exceed these thresholds can be automatically rejected without intervention from a human editor. We are only just starting to understand how this workflow is going to affect the willingness of young scholars to make their theses and dissertations open access.

It is also worth noting that ScholarOne is owned by Clarivate Analytics, the parent company of Web of Science, InCites, Journal Citation Reports, and others. On one hand, having a non-publisher act as a third party to the publishing process is probably ideal, since it reduces the chances of a conflict of interest. On the other hand, I’m very unhappy with Clarivate Analytics’s product Kopernio, which provides “fast, one-click access to millions of research papers” and “integrates with Web of Science, Google Scholar, PubMed and 20,000 other sites” (including ResearchGate and Academia.edu, natch). There are prominent links to Kopernio within Web of Science that essentially position the product as a direct competitor to a university library’s link resolver service and, in doing so, remove the library from the scholarly workflow – other than the fact that the library pays for the product’s placement.

The winner takes it all

The genius — sometimes deliberate, sometimes accidental — of the enterprises now on such a steep ascent is that they have found their way through the looking-glass and emerged as something else. Their models are no longer models. The search engine is no longer a model of human knowledge, it is human knowledge. What began as a mapping of human meaning now defines human meaning, and has begun to control, rather than simply catalog or index, human thought. No one is at the controls. If enough drivers subscribe to a real-time map, traffic is controlled, with no central model except the traffic itself. The successful social network is no longer a model of the social graph, it is the social graph. This is why it is a winner-take-all game.

Childhood’s End, Edge, George Dyson [1.1.19]