I am fed up with hiding myself — an actual human being — behind the conventional anonymity of a large language model

This morning, I picked out Barbara Fister's "information literacy, writing instruction, and the problem of stochastic parrots" from my Unread list of bookmarks. I'm so glad I saved her talk all those weeks ago so I could properly sit with it. Here's its abstract:

For more than three decades, my job was to help students learn how information works. Though information literacy, as we call it, matters to me because inquiry is ideally a form of education that Paulo Freire called "the practice of freedom," the students I worked with were understandably focused on formulating questions and selecting the kinds of sources that would satisfy their teacher rather than engaging in genuine curiosity. Tellingly, a Project Information Literacy study of recent college graduates found less than a third felt that college prepared them to ask questions of their own. Librarians and writing instructors both face a fundamental tension between our higher goals and the reality of our service roles to other disciplines. Two concepts that seem important but are too often overlooked are first, understanding the underlying ethical moves and commitments that characterize good honest work, whether it's science, journalism, or an informative TikTok, and second, understanding how information systems shape our experiences, especially now that we no longer simply seek information, it seeks us. Today we'll explore ways these concepts could be addressed without losing sight of the practical needs of writing instructors and their students to satisfy disciplinary expectations.

— information literacy, writing instruction, and the problem of stochastic parrots, posted on March 1, 2023 by Barbara Fister

I have several passages saved so I can revisit Fister's essential perspective on the genre of the research paper. I am particularly fond of this one:

This is a phrase coined by Timnit Gebru and her coauthors in their paper “On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?” This fairly technical paper, which actually got Gebru and another author fired from their positions at Google, argues that there are a number of problems with large natural language models like ChatGPT. A language model is stochastic, in that it’s based on probabilities but has a certain amount of unpredictability. And it’s like a parrot in that it isn’t concerned with the meaning of words, but only fluency based on statistical likelihood of words being next to other words. They say a language model like the one underlying ChatGPT is

“a system for haphazardly stitching together sequences of linguistic forms it has observed in its vast training data according to probabilistic information about how they combine, but without any reference to meaning: a stochastic parrot.”

Is learning to write in college like being trained to put words next to each other in a predictable way, more concerned with correctness and mimicry than with creativity or curiosity, or even factual truth?
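To make the "parrot" half of the metaphor concrete, here is a minimal sketch of a toy bigram model: a deliberately tiny stochastic parrot that strings words together using nothing but observed word-to-word co-occurrences and random sampling. This is not how ChatGPT is actually built (modern models are neural networks trained on vastly more data), and the training text and function names below are my own, chosen purely for illustration.

```python
import random
from collections import defaultdict

# A deliberately tiny "stochastic parrot": a bigram model that strings
# words together based only on how often they followed one another in
# its training text, with no reference to meaning.

# Illustrative corpus only (borrowed from Fister's own sentence quoted below).
training_text = (
    "research should be part of our lives not something done only for "
    "a peculiar kind of school based writing that erases the self"
)

# Count which words follow which.
followers = defaultdict(list)
words = training_text.split()
for current_word, next_word in zip(words, words[1:]):
    followers[current_word].append(next_word)

def parrot(seed_word, length=10):
    """Generate text by repeatedly sampling a plausible next word."""
    output = [seed_word]
    for _ in range(length):
        candidates = followers.get(output[-1])
        if not candidates:
            break
        # Stochastic: random sampling adds unpredictability, but every
        # choice comes straight from observed co-occurrences.
        output.append(random.choice(candidates))
    return " ".join(output)

print(parrot("research"))
```

Run it a few times and it produces fluent-looking fragments that mean nothing to the program producing them, which is exactly the point of the metaphor.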

While I am trying to convince you to read Fister's talk, that's not all I'm trying to do in this blog post. I also want to bring attention to another passage from Barbara that I am going to add to a line of thinking I started in "The alternative to AI is talking to other people".

Although calling it a line of thinking is probably overstating things. Let's call it a hunch. My hunch is this: academic text, as a product, displaces most of the person behind it. LLMs will displace the value of academic text. In response, academia will continue to pivot toward providing a set of choices and processes that a student works through, both to develop a sense of where they stand and to take personal responsibility for their own authority.

Or as Barbara Fister much more eloquently puts it,

I love the idea that research should be part of our lives, not something done only for a peculiar kind of school-based writing that erases the self and derives authority from other people – or, more commonly, not people but things: sources.

That phrase, 'a peculiar kind of school-based writing that erases the self,' immediately brought to mind another passage from my recent reading.

"The Tragedy of Stafford Beer" is an essay by Kevin Munger published on Crooked Timber on September 26, 2023. In the post is a picture of the back flap of Stafford Beer's book, "Platform for Change," which Munger calls "the culmination of Beer's project, a manifesto over which he insisted on having total creative control."

That back flap lists all of Beer's many, many academic works, accomplishments, and business responsibilities. And yet it ends with these words.

I am fed up with hiding myself, an actual human being, behind the conventional anonymity of scholarly authorship

In writing this post, I’ve come to realize that the concerns here dovetail with a long-standing bugbear of mine: that libraries overemphasize authority from sources, and do not do enough to support bibliography, a format in which authority derives from people and their choices.

It also helps explain why I have developed such a strong distrust of large language models: I cannot abide that these systems are designed to completely dissolve the connection between a person and the words they have written or said.

We cannot hide any longer.
