The Provenance of Facts

Brian Feldman has a newsletter called BNet and on May 30th, he published an insightful and whimsical take on facts and Wikipedia called mysteries of the scatman.

The essay is an excellent reminder that if a fact without proper provenance makes its way into Wikipedia and is then published in a reputable source, it is nearly impossible to remove said fact from Wikipedia.

Both the Scatman John and “Maps” issues, however, point to a looming vulnerability in the system. What happens when facts added early on in Wikipedia’s life remain, and take on a life of their own? Neither of these supposed truths outlined above can be traced to any source outside of Wikipedia, and yet, because they initially appeared on Wikipedia and have been repeated elsewhere, they are now, for all intents and purposes, accepted as truth on Wikipedia. It’s twisty.

mysteries of the scatman

This is not a problem unique to Wikipedia. Last year I addressed a similar issue in an Information Literacy class for fourth-year Political Science students, when I encouraged students to follow the citation pathways of the data that they plan to cite. I warned them not to fall for academic urban legends:

Spinach is not an exceptional nutritional source of iron. The leafy green has iron, yes, but not much more than you’d find in other green vegetables. And the plant contains oxalic acid, which inhibits iron absorption.

Why, then, do so many people believe spinach boasts such high iron levels? Scholars committed to unmasking spinach’s myths have long offered a story of academic sloppiness. German chemists in the 1930s misplaced a decimal point, the story goes. They thus overestimated the plant’s iron content tenfold.

But this story, it turns out, is apocryphal. It’s another myth, perpetuated by academic sloppiness of another kind. The German scientists never existed. Nor did the decimal point error occur. At least, we have no evidence of either. Because, you see, although academics often see themselves as debunkers, in skewering one myth they may fall victim to another.

In his article “Academic Urban Legends,” Ole Bjorn Rekdal, an associate professor of health and social sciences at Bergen University College in Norway, narrates the story of these twinned myths. His piece, published in the journal Social Studies of Science, argues that through chains of sloppy citations, “academic urban legends” are born. Following a line of lazily or fraudulently employed references, Rekdal shows how rumor can become acknowledged scientific truth, and how falsehood can become common knowledge.

“Academic Urban Legends”, Charlie Tyson, Inside Higher Ed, August 6, 2014

I’m in the process of working on an H5P learning object dedicated to how to calculate one’s h-index and yet, I’m conflicted about doing so. Using citations as a measure of an academic’s value is problematic for many reasons beyond the occasional academic urban legend:

To weed out academic urban legends, Rekdal says editors “should crack down violently on every kind of abuse of academic citations, such as ornamental but meaningless citations to the classics, or exchanges in citation clubs where the members pump up each other’s impact factors and h-indexes.”

Yet even Rekdal – who debunks the debunkers – says his citation record isn’t flawless.

“I have to admit that I published an article two decades ago where I included an academically completely meaningless reference (without page numbers of course) to a paper written by a woman I was extremely in love with,” he said. “I am still a little ashamed of what I did. But on the other hand, the author of that paper has now been my wife for more than 20 years.”

“Academic Urban Legends”, Charlie Tyson, Inside Higher Ed, August 6, 2014

The future of Big Te(a)ch

Last week, my place of work announced that the university campus was going to be primarily online for the upcoming fall semester. From my understanding, the qualifier primarily is being used because some professional programs have compulsory in-person components, such as clinical nursing.

Replicating the hands-on or lab components of classes is a particular challenge in the present moment. How do you replace what an anatomy class might mean to a medical student? When you are training students to do work in a chemistry lab, what do you do when you no longer have a lab to work in?

I have taken my fair share of lab courses and, to be honest, I recall many of them as stressful. I always felt the pressure of being on the clock, having to finish a series of steps towards an outcome that was unclear to me. Young me would have preferred the option of watching a lab instructor with a GoPro strapped to their forehead, going through the experiment on my behalf.

But watching another person complete a jigsaw puzzle is not the same as doing the jigsaw puzzle yourself.

How can we create rich, online or at-home experiences with choice and agency?

One answer is The Future of Big Tech.

It’s not the future you think I mean. I’m referring to the 10-minute game The Future of Big Tech, which is available on a pay-as-you-can basis from Coney’s Pop-Up Playhouse [from the menu, click on: 2+ Players > The Future of Big Tech]

Coney is a UK-based interactive theatre group whose work I’ve been casually following for some years now. I’ve only recently started exploring their online options. This past weekend, I played The Future of Big Tech with my kids and I really appreciated the opportunity to have a conversation afterwards about what the experience meant to them.

screenshot from The Future of Big Tech

I’m being vague here because I really don’t want to spoil the experience as it is one that you really should try. But if you are feeling apprehensive about putting on your headphones and diving in, I will tell you a bit of what you can expect.

Once you choose your character, you will hear a short description of who you are and how you live in a particular future. You will pick up a phone call and during the call, you will be given choices to make. There are no loud or sudden disturbing noises during the call and the game ends in under ten minutes.

The voice acting is very good. I’m adding it as evidence in my ‘augmented experiences are better than virtual ones’ file.

I’m so impressed by how much this game achieves in such a short time. I also appreciate that the designers recognized that dividing the experience in two creates an easy entry into conversation afterwards, as each participant will want to ask the other for their side of the story.

It truly belongs on a syllabus.

It’s time to cut the CRAAP

I do not have a good understanding of what academic librarians are currently teaching students about evaluating information they find on the Internet. Rather than read the literature, I searched for the word CRAAP in my custom Google Search Engine for Ontario Academic Libraries. I found that many libraries – including my own place of work – advocate the CRAAP checklist approach to evaluating information found online.

I have never been particularly enthusiastic about the CRAAP checklist approach to evaluating information, and I know that I’m not the only librarian who feels this way. But until recently, if you had asked me what I would suggest as an alternative, I would have struggled to articulate what to replace it with.

As my last series of posts can attest, I have been recently creating creative-commons licensed learning objects with H5P through eCampus Ontario. I am doing so because in these unprecedented times much of the teaching on the university campus has transitioned to asynchronous online learning and as such, I believe that my teaching should transition as well.

This week, I made this short presentation introducing the reader to two methods that I think should replace the use of the CRAAP checklist.

This presentation introduces the reader to the COR (Civic Online Reasoning) Curriculum and the SIFT Method. Each consists of a short series of steps to help the reader separate fact from fiction on the Internet, and both methods are built from the strategies employed by professional fact-checkers.

Mike Caulfield, who created and advocates for the SIFT method, has explained why the CRAAP checklist is insufficient in these two interviews that are best read in full: “Getting Beyond the CRAAP Test: A Conversation with Mike Caulfield” and “Truth Is in the Network” from Project Information Literacy.

I also found his post, “A Short History of CRAAP”, particularly enlightening. My jaw dropped a bit at this particular connection:

So when the web came into being, library staff, tasked with teaching students web literacy, began to teach students how to use collection development criteria they had learned in library science programs. The first example of this I know of is Tate & Alexander’s 1996 paper which outlines a lesson plan using the “traditional evaluation criteria of accuracy, authority, objectivity, currency, and coverage.” ….

… So let’s keep that in mind as we consider what to do in the future: contrary to public belief we did teach students online information literacy. It’s just that we taught them methodologies that were developed to decide whether to purchase reference sets for libraries

A Short History of CRAAP

Perhaps this is the reason why librarians have such a hard time letting go of this particular approach.

The Librarian as DJ

On Saturday night I had a Zoom call with a friend of mine from high school. My friend prefaced our chat with a warning that she was going to keep the conversation short because video calls are exhausting. I heartily agreed. During the call, my daughter and her son graced our screens and excitedly shared the game-spaces in Roblox where they go to play and hang out with their friends.

This difference between exhaustion and joy struck me. I didn’t think it was because of any particular characteristic of our respective generations, but I couldn’t entirely place why the reactions were so very different. Then on Sunday morning, during the time I dedicate to the longreads collected from a week’s worth of tweets and newsletters, I found an answer that made a lot of sense to me.

That essay was Home Screens by Drew Austin, from the web publication Real Life. After I finished, I promptly took to Twitter to recommend that everyone read it. Here’s a passage from it, dedicated to Zoom:

Pure economic exchanges can relocate to screen interactions with a minimal loss of fidelity, but encounters meant to be less instrumental are proving harder to sustain without the texture of physical space. Most of the apps we use for interaction simply unbundle an informational component from the scene of social contact. This was sufficient under ordinary circumstances, when messaging and video conferencing apps merely complemented in-person exchanges. But now those tools leave users wanting more, failing to substitute the richness and depth that interaction in physical space could otherwise provide.

Consider, for example, the video-conferencing platform Zoom. During the quarantine’s first few weeks, it emerged as a flexible (albeit insecure) tool for conducting interactions that could no longer happen face to face, rapidly expanding beyond its established domain of business meetings to accommodate gatherings ranging from happy hours to dinner parties to dates. But rather than providing support for adjacent activities, as an app like Slack does for office work, Zoom replaces those activities altogether. In other words, users experience Zoom more as a stultified form of virtual reality than an augmented one, because it feels as though there is very little off-screen reality available to augment right now.

Drew Austin, Home Screens, Real Life, April 27th, 2020

I’m writing about this essay on this blog rather than my more technology focused outlet, because I want to start exploring this understanding that there is something fundamentally different between ‘virtual libraries’ and ‘augmented libraries’.

In Home Screens, Austin draws on one of my favourite written works from last year:

In How to Do Nothing, Jenny Odell makes an eloquent case for the importance of place as a site of non-transactional human relations. As an example, she describes how, for many, public transportation is “the last non-transactional space in which we are regularly thrown together with a diverse set of strangers, all of whom have different destinations for different reasons.” She goes on to summarize Louis Althusser’s contention that true societies can emerge only within spatial constraints, where individuals live in bounded proximity without the ability to easily disperse. In such settings, individuals have no choice but to encounter one another repeatedly and establish durable connections based upon a firmer foundation than the exchange value those relationships promise. This represents a quite different logic than that of an app that enables hiring random (and often unseen) strangers to perform tasks for us at a social distance.

Another non-transactional space in which residents are regularly thrown together with a diverse set of strangers, all of whom have different ‘destinations’ for different reasons, is the library.

I don’t mean to spoil the concluding passages of Joanne McNeil’s Lurking: How a Person Became a User, but I can’t ignore the connection that I see here:

Librarians are what the internet is aching for — people on task to care about the past, with respect to the past and also to what it shall bequeath to the future. There needs to be rituals in place online to treat people — users — with dignity, both for the living and the dead. For to speak of the humanity of internet users is to recognize the impermanence, the mortality of that humanity.

Everyone is welcome in a library just for being. A person in a library is a person: homeless or not, hurting or not. My dream for the internet, as a final form, is a civic and independent body, where all people are welcomed and respected, guided by principles of justice, rights, and human dignity. For this, users would express care in return, with a sense of purpose and responsibility to the digital spaces organized with these values. With the internet routing through a planet that is the origin of more than a hundred billion lives, such a project means information in abundance. Segmenting and clustering users and history into communities, rather than mass-purpose platforms, would be an integral component to this ideal internet in its cycles of maintenance and renewal.

Joanne McNeil, Lurking: How a Person Became a User, 2020.

I have added the above to my file of technology writers who suggest that the future of the internet needs librarians and that the public library should become an ISP.

I haven’t been a public librarian in over twenty years now, so I am going to limit the following thoughts on augmented vs. virtual library space to an academic library context.

First, let us consider that more students come to the library to study rather than to actively engage with library-provided materials, print or otherwise. Does this suggest that the academic library has a responsibility to provide online study space for students?

I think the answer might be yes — but not yet. I don’t believe academic libraries can provide online spaces for students to share what they’re learning in the current state of teaching and assessment in the University. Let us remember that it wasn’t that long ago when Canadian Universities equated creating virtual study spaces on Facebook with cheating. Let us be aware that not enough has changed since 2009, and that some faculty are quite happy to hold students under constant surveillance rather than risk a drop in ‘academic rigour’.

The most notable experiment in creating the library as a virtual space was the Libraries of Second Life. In the current moment, the virtual spaces where people gather are within Minecraft, Fortnite, Roblox, and Animal Crossing.

But let’s not think of virtual spaces. Let’s not try to reimagine the library through an Oculus Rift. Let’s try to start with something a little more manageable. Something a little more humane.

Let’s take a psychogeographic approach, as suggested by my colleague Devon Mordell:

With our ability to roam the physical environment necessarily compromised, our platforms – Netflix, Instagram, Twitter, Spotify, etc. etc. – have taken on an even greater significance as the sites of our work and leisure. But how do we inhabit them in psychogeographic terms, as virtual spaces that shape our behaviours and emotions? Is it possible to find alternative paths to the passive consumption modalities that a data-driven culture industry expects of us? Can we amble through our platforms in ways unforeseen by their designers? And understand their infrastructures better through our experiments and investigations?

Ergo, a psychogeographical approach to platform studies as a means to engage with these infrastructures in novel ways (please note: I am not a licensed psychogeographer).

Tactics for resisting platform passivity, Devon Mordell, 2020

It delights me to no end that Devon published the above as I was writing the draft of this post because I also want to speculate that perhaps we should investigate sound as a platform (please note: I am a licensed psychogeographer).

What if every library offered its own version of lofi hip hop radio – beats to relax/study to? [explainer]

What if your librarian was also your DJ? What could a campus-scaled call-in show sound like? Would you feel less alone if the DJ gave you a shout-out?

Personality Testing using H5P

We don’t all play games the same way. One useful way to categorize players by their style of play is Bartle Types, named after Richard Bartle, who formed the characterizations by observing participants playing MUDs:

So, labelling the four player types abstracted, we get: achievers, explorers, socialisers and killers. An easy way to remember these is to consider suits in a conventional pack of cards: achievers are Diamonds (they’re always seeking treasure); explorers are Spades (they dig around for information); socialisers are Hearts (they empathise with other players); killers are Clubs (they hit people with them).

“Hearts, clubs, diamonds, spades: players who suit MUDs”, Richard Bartle

I’m not aware of any system that categorizes university students by their behaviours and motivations and this, I think, is for the best. It is useful to remember that there will always be a percentage of students who are in the process of trying to create or discover their own motivations and personal identity, especially when we are working with young people.

I’ve been engaging in this line of thinking because I noticed that the H5P Framework offers a format called ‘Personality Quiz’ and I’ve been trying to imagine how it can be used in an educational context.

I don’t think I have properly articulated why I’ve become so intrigued by H5P. I only became aware of the H5P Framework once it was made available through eCampus Ontario to Ontario university and college educators some months ago. Once I learned that the HTML5-based framework allowed for accessible and lightweight learning objects (slide presentations, quizzes, simple puzzles) that could be embedded in systems such as Blackboard, WordPress, and Drupal, I invested the time to learn more. I’m particularly impressed that most of the items I’ve found in both Laurier’s and eCampus Ontario’s catalogues have been licensed openly to encourage re-use.

Here’s a presentation I created using the framework earlier this week:

For the last three weeks, I’ve been spending some time playing around with the formats of H5P, as previous posts in the ludo series can attest.

From the exploration I’ve done this week, I believe that by labeling this H5P format a ‘Personality Quiz’, we might be overlooking that the same mechanic could weigh imperfect answers and present the most compelling choice after a series of questions. This might prove a more efficient means of guiding a user to a particular answer than presenting a large number of binary choices in which every answer is a single end-node of a decision tree.
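To make that weighting idea concrete, here is a minimal sketch in Python. The answers, database names, and weight values are all my own hypothetical examples (H5P configures this declaratively rather than in code); the point is only that each answer can nudge several candidate outcomes at once instead of terminating a branch:

```python
from collections import Counter

# Hypothetical answers and databases; each answer adds weight to
# several candidate outcomes rather than mapping to a single end-node.
ANSWER_WEIGHTS = {
    "I need peer-reviewed articles": {"Scopus": 2, "Web of Science": 2},
    "My topic is about policy": {"Canadian Public Policy Collection": 2, "Scopus": 1},
    "I just need background reading": {"Omni": 2},
}

def recommend(answers):
    """Sum the weights contributed by every answer and return the
    candidate databases ranked from most to least compelling."""
    scores = Counter()
    for answer in answers:
        scores.update(ANSWER_WEIGHTS.get(answer, {}))
    return [name for name, _ in scores.most_common()]

# Two "imperfect" answers still converge on a single best suggestion:
recommend(["I need peer-reviewed articles", "My topic is about policy"])
# → a ranked list with "Scopus" first (score 2 + 1 = 3)
```

Because every answer contributes partial evidence, the quiz can tolerate ambiguous responses and still surface a sensible first recommendation, which a pure decision tree cannot.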

For example, in a library context we present the user with a list of different databases they might want to use when they perform research. In my place of work, each librarian chooses a subset of options and lists them on a single page, often separated into groups. Here’s the list of resources that I’ve put together as the liaison librarian for UWindsor’s School of Environmental Science:

I try not to overwhelm the reader with too many choices and have opted to group the options from most specific to more general and then added some options for tools from related disciplines at the bottom of the page. This layout requires the reader to review the entire page of options before making a choice.

But what if I also presented the choices via a quiz to make this decision-making more palatable?

Screen capture of behind the scenes…

This is clearly not going to be as fun as a Buzzfeed Quiz, but it may be a better means to convey to the reader why there are so many available options and that some options may be better for some specific purposes than others. If you have any advice or experiences with this quiz – good or bad – please leave a comment and let me know.

I’m still interested in exploring further and trying other tools of Interactive Fiction to help a reader or researcher navigate through their research journey. I am working on an idea that I hope to showcase in the near future. I’m not going to tell you exactly what I’m hoping to achieve but I will let you know that it is going to be themed around the idea of the manifesto.

Make good choices

I’ve decided to assign all the game-related posts here the label of ludo. The word ludo is likely related to the words luden and lusory, which you can learn more about in my first games-related post in this series.

Ludo is the name the Danes give to the game that Canadians would call Sorry… unless those Canadians were the Dream Warriors, in which case they would call it Luuuuuudi… lulululu luuuuudi…

Speaking of previous posts, remember when I told you that The Game of the Goose was one of the world’s first board games? Evidently, a thousand years before its known existence, Indians were playing something very similar called Pachisi.

Now, Ludo and The Game of the Goose are not very good examples of games that can be used for educational purposes. They are not even particularly interesting games. The only people who like to play these Candyland-like racing games are small children. My theory is that they like them only because they haven’t yet figured out that no amount of magical thinking will change the outcome of the game once it starts. The cards or the dice determine who wins and who loses, and there is no way for a parent to (legally) intervene if it becomes apparent that said small child is not emotionally able to handle losing.

(This is why we buy Hoot Owl Hoot! for our friends’ small children instead of Candyland).

Sid Meier, the (Sarnia-born!) game designer of Sid Meier’s Pirates and Civilization, is thought to have said “a good game is a series of interesting decisions.” Now, I don’t believe Meier thinks interesting decisions are the only characteristic that defines a game, but I do agree with him: games with meaningful and interesting decisions lend themselves to becoming good games.

“It’s easier to look at it as what is not an interesting decision,” says the legendary creator of Civilization. If a player always chooses the first from among a set of three choices, it’s probably not an interesting choice; nor is a random selection…

… Interesting decisions are persistent and affect the game for a certain amount of time, as long as the player has enough information to make the decision – when early choices can ruin the game experience down the road, developers need to present them in a fashion appropriate to that…

Systems that allow for branching can be quite powerful. They form the fundamental mechanic behind simple quizzes: if you choose the right answer, you are awarded a point and can move to the next question in the series; and if you choose incorrectly, you are given an opportunity to try again, or to proceed to the next scenario but just without a score.
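The scoring rule described above can be sketched in a few lines of Python. The questions, and the two-attempt limit, are my own assumptions for illustration, not the exact behaviour of any particular quiz tool:

```python
# Branching-quiz sketch: a correct answer earns a point and advances;
# an incorrect answer allows one retry, after which the quiz proceeds
# to the next question without awarding the point.
QUESTIONS = [
    {"prompt": "Which site hosts the article?", "answer": "Wikipedia"},
    {"prompt": "Does the claim cite a source?", "answer": "No"},
]

def run_quiz(responses):
    """`responses` maps each prompt to the list of attempts a reader made."""
    score = 0
    for question in QUESTIONS:
        attempts = responses.get(question["prompt"], [])
        # At most two tries count; any correct try among them earns the point.
        if question["answer"] in attempts[:2]:
            score += 1
    return score

run_quiz({"Which site hosts the article?": ["Google", "Wikipedia"],
          "Does the claim cite a source?": ["No"]})  # → 2
```

The branching is all in that one conditional: pass and you carry a point forward, fail twice and you simply continue down the same path without one.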

Quizzes – built from branching scenarios – can be a useful and powerful means of helping the reader recall what they have just read, as well as providing formative assessment of their current understanding.

Branching scenarios can also provide enough mechanics to power text dungeon crawlers. Here’s a simple prototype I made using eCampus Ontario’s H5P Toolkit:

Branching systems also make possible the genre of games/writing called IF or Interactive Fiction. I have played several IF games and I believe they exist as this wonderful space in and between game and story and they expand the way we think about and appreciate text in games.

I know that there have been a number of librarians who have created library-related educational experiences using Twine, one of the most accessible IF systems. I hope to join their ranks shortly. It’s time to level up.

I just need to choose to do so.

The Untitled Game of the Goose

No, not that Goose game

I am an aspiring game designer in the same way that so many people are aspiring writers. I have been meaning to design games for several years now but despite my good intentions, I haven’t found a way to sit down and do the work.

I have done some self-reflection on why I fail to start particular projects. My conclusion is that I am prone to analysis paralysis. For example, despite wanting to grow a vegetable garden for many years, it was only when I signed up with a local gardening subscription program that prescribes all the major decisions for me that I was able to move from intention to action.

To overcome this inertia around game design, I am going to use what has almost always worked for me in the past: I am going to write a series of blog posts on games and try to develop simple prototypes to showcase in these posts.

My intention is not to develop one, singular and epic, amazing game but to create many tiny games that will hopefully improve in their design over the course of their creation. Years ago, I remember listening to a game designer once advise, “Your first 99 games are going to be terrible so you are best to get them over with as soon as possible.” While I didn’t make use of this advice, I kept these words tucked away for future use. I love that this game designer’s simple directive is a distillation of the oft-told Ceramics Assignment story.

My goal is not to become a full-time game designer.  What I am hoping for is a better understanding of how I can create well-designed games for play and learning at home, in the classroom, and in the library. 

But where to start?

Well, let’s try the beginning.

What’s the first image that comes to mind from the words, board game?

My children’s mega-game: Candyland + Chutes & Ladders + Snakes & Ladders + Busytown

Chances are, you thought of something that looked like one of the boards above.

Candyland, Sorry, and Snakes and Ladders are all ‘racing games’ that are variations of ‘The Game of the Goose‘ – a game that has been around since the end of the 15th century.

Game of the Goose Board

A version of the game was given as a gift by Francesco I de’ Medici of Florence to King Philip II of Spain sometime between 1574 and 1587. In June 1597 John Wolfe enters the game in the Stationers’ Register, as “the newe and most pleasant game of the goose”.

Game of the Goose from Wikipedia

I have played many, many, many Game of the Goose variations with my children when they were young. Yes, these games were boring for me, but my children loved playing them. As we played together, I felt a not-insignificant amount of pride and joy watching my children slowly master the adding and subtracting of numbers. While I will always recommend playing such games with small children over making them do worksheets (*shudder*), there is no getting around the fact that these games are boring to grown-ups.

And you might be thinking: how could such a boring game have been so well-played for so much of history?

The answer to that question is the same answer to why adults (used to) gather together and play the children’s game of BINGO: gambling

I found that historical tidbit as well as other lovely historical insights into this game from this episode of the podcast, Ludology.

Traditionally, Game of the Goose is played with two dice, although versions can be designed to be played by flipping a coin, as this game board that I found in Villains Beastro illustrates:

Over the course of my life, I have made quite a few ‘Game of the Goose’ board games. When I was in my elementary school years, many of my independent study projects were a combination of a racing game board and Trivial Pursuit, which was all the rage at the time.

Years ago, I imagined the ‘presentation deck’ as a potential game board. I couldn’t figure out how to turn my own slides into a game, but I did create a game board using Prezi and movement cards from our Candyland set and I pitted half my talk’s audience against each other for prizes.

I will share with you the last Game of the Goose Game I made. I made it some weeks ago when I was suffering from a particularly bad bout of paralysis and thought I would try to solve it with a little game theory. I took a bunch of index cards and cut them up. On each strip, I wrote down something I had been meaning to do – both pleasant (play the video game version of Walden) and less pleasant (spend 25 minutes cleaning the garage) and arranged them on a board. When I found myself spending more time weighing what I should do next than actually doing something, I rolled a die and let fate dictate my next action.

Keeping in the spirit of my self-imposed challenge, I spent part of this morning trying to make a Game of the Goose prototype using the H5P Toolkit.

It has proven to be a little bit of a challenge because the framework lacks the ability to easily import a dice mechanic into a presentation format.
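For anyone curious what such a mechanic amounts to, here is a toy coin-flip racer in Python. The board length and move values are arbitrary assumptions of mine, and this is ordinary code rather than anything H5P can ingest directly; it only illustrates the randomness a presentation format would need to fake:

```python
import random

BOARD_SIZE = 16  # squares between start and finish; an arbitrary choice

def play(rng=random.Random()):
    """Flip a coin each turn: heads moves 1 square, tails moves 2.
    Returns the number of turns needed to reach the final square."""
    position, turns = 0, 0
    while position < BOARD_SIZE:
        move = rng.choice([1, 2])  # the coin flip
        position = min(position + move, BOARD_SIZE)
        turns += 1
    return turns

play()  # somewhere between 8 and 16 turns for a 16-square board
```

In a branching presentation format, each of those flips has to be pre-baked as a fork in the slide sequence, which is exactly why a proper dice mechanic is hard to import.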

While I’m happy that I was able to come up with a movement mechanic for this one-person game, I don’t believe the end-product is a particularly fun experience. Still, here it is for your enjoyment:

Like so many other aspects of our lives, working on the game has felt like three steps forward and two steps back.

But I am trying not to forget that this still means progress.

And most importantly, it means learning.

Choose your quarantine character

Choose your quarantine character:

I have decided that the character that I am going to embody for this quarantine is an advocate of games.

You may notice that I did not use the word gamer to describe this role. I do not call myself a gamer because to do so would suggest that the world is made up of gamers and non-gamers. I don’t believe that such a division exists. I believe we are all Homo ludens.

Homo Ludens is a book originally published in Dutch in 1938 by Dutch historian and cultural theorist Johan Huizinga. It discusses the importance of the play element of culture and society. Huizinga suggests that play is primary to and a necessary (though not sufficient) condition of the generation of culture. The Latin word ludens is the present active participle of the verb ludere, which itself is cognate with the noun ludus. Ludus has no direct equivalent in English, as it simultaneously refers to sport, play, school, and practice.

Homo Ludens from Wikipedia, April 10, 2020

Let’s read that last line again: … Ludus has no direct equivalent in English, as it simultaneously refers to sport, play, and … school??

When I first read the above, the inclusion of school in that list threw me. But then it reminded me of something that I read some time ago from Marshall McLuhan’s work, The City As Classroom (1977):

The book begins with this:

I would like to bring your attention to this directive: “Look up the root meaning of the word ‘school’ (schola < Greek σχολή).”

The root word of schola is… leisure.


I used to be adamant that one could not make a proper educational game to be played in a school or library setting, because such games would be mandatory. In this, I abided by Bernard Suits’s definition of a game:

Playing a game is a voluntary attempt to overcome unnecessary obstacles

Lusory attitude from Wikipedia, April 10, 2020

As James P. Carse in Finite and Infinite Games puts it, “Whoever must play cannot play”.

I have done a lot of reading about games since I took this particular stance. And in that time, I have become much more open to the educational potential of not only games but toys, puzzles, and above all, play.

“We should keep in mind that the Greek word for education, ‘paideia’, is rooted in the words for child and play: ‘pais’ and ‘paidia’…” Huh. How about that.

As I try to best embody the character of an advocate of games, I hope to share what I have learned as I work and play from home.

I think it would be most apt to start by sharing the slides and notes of this talk by Sebastian Deterding, called Paideia as Paidia.

Just a heads up – there is a lot of Greek philosophy in this presentation.

But there is also this:

The Library of the Living and the Library of the Dead

I am in the process of re-organizing my Google Drive and in doing so, I stumbled upon a bit of writing from 2013 that would have been a perfect addition to the post Haunted libraries, invisible labour, and the librarian as an instrument of surveillance which I wrote earlier this year:

When I was a child, the walls of books in the adult section of our modest public library always filled me with unease and even dread. So many books that I would never read. So many books that, I suspected – even then – were never read. I was under the impression that all the books were so old that the authors must all be dead. Unlike my refuge – the children’s section of the library, partitioned by a glass door set in a glass wall – this section of the library was dark and largely silent. The books were ghosts.

I am imagining a library that is made up of two distinct sections. These sections may be on separate floors. They may be in separate buildings. But these sections must be separated and distinct.

One of these sections would be ‘The Library of the Living’. It would be comprised of works by authors who still walked on the earth, somewhere, among us. The other section would be ‘The Library of the Dead’.

When an author passes from the earthly realm, a librarian takes their work from the Library of the Living and brings it, silently, to the Library of the Dead.

And at the end of this text was this:

“We don’t have much time, you know. We need to find the others. We need to find mentors. We need to be mentors. We don’t have much time.”

Considering dark deposit

I have a slight feeling of dread.

In the inbox of the email address associated with MPOW’s institutional repository are more than a dozen notifications that a faculty member has deposited their research work for inclusion. I should be happy about this. I should be delighted that a liaison librarian spoke highly enough of the importance of the institutional repository at a faculty departmental meeting and inspired a researcher to fill in a multitude of forms so their work can be made freely available to readers.

But I don’t feel good about this, because a cursory look at the journals this faculty member has published in suggests that we can include none of the material in our IR due to restrictive publisher terms.

This is not a post about the larger challenges of Open Access in the current scholarly landscape. This post is a consideration of a change of practice regarding IR deposit, partly inspired by the article, Opening Up Open Access Institutional Repositories to Demonstrate Value: Two Universities’ Pilots on Including Metadata-Only Records.

Institutional repository managers are continuously looking for new ways to demonstrate the value of their repositories. One way to do this is to create a more inclusive repository that provides reliable information about the research output produced by faculty affiliated with the institution.

Bjork, K., Cummings-Sauls, R., & Otto, R. (2019). Opening Up Open Access Institutional Repositories to Demonstrate Value: Two Universities’ Pilots on Including Metadata-Only Records. Journal of Librarianship and Scholarly Communication, 7(1).

I read the Opening Up… article with interest because a couple of years ago, when I was the liaison librarian for biology, I ran an informal pilot in which I tried to capture the corpus of the biology department. During this time, for articles whose publishers did not allow deposit of the publisher PDF, and whose authors were not interested in depositing a manuscript version, I published metadata-only records instead.

But part way through this pilot, I abandoned the practice, for a number of reasons. One reason was that the addition of their work to the Institutional Repository did not seem to prompt faculty to start depositing their research of their own volition. This was not surprising, as BePress does not allow for the integration of author profiles directly into its platform (one must purchase a separate product for author profiles and the ability to generate RSS feeds at the author level). So I was not particularly disappointed with this result. While administrators are increasingly interested in demonstrating research outputs at the department and institutional level, faculty, generally speaking, remain more invested in subject-based repositories.

But during this trial I uncovered a more troubling reason to think that uploading citations might be problematic. I came to understand that most metadata harvesting standards – such as the OAI-PMH protocol and the OpenAIRE guidelines – provide no means of differentiating between metadata-only records and full-text records. Our library system harvests our IR and assumes that every item in the IR has a full-text object associated with it. Other services that harvest our IR do the same. Visitors to the IR expect to find full text.
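To make that harvesting gap concrete, here is a minimal sketch of what a harvester actually receives from an oai_dc response. The record, author, and URL below are invented for illustration; the point is that the Dublin Core elements a harvester sees contain nothing that distinguishes a metadata-only record from one with an attached PDF.

```python
# Parse a hand-written OAI-PMH record (identifiers here are hypothetical)
# and list the Dublin Core fields a downstream harvester actually sees.
import xml.etree.ElementTree as ET

OAI_DC_RECORD = """<record xmlns="http://www.openarchives.org/OAI/2.0/">
  <metadata>
    <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
               xmlns:dc="http://purl.org/dc/elements/1.1/">
      <dc:title>A Metadata-Only Deposit</dc:title>
      <dc:creator>Example Author</dc:creator>
      <dc:identifier>https://ir.example.edu/biology/123</dc:identifier>
    </oai_dc:dc>
  </metadata>
</record>"""

DC = "{http://purl.org/dc/elements/1.1/}"

def harvested_fields(xml_text):
    """Return the sorted Dublin Core element names present in a record."""
    root = ET.fromstring(xml_text)
    return sorted({el.tag.replace(DC, "dc:") for el in root.iter()
                   if el.tag.startswith(DC)})

# Title, creator, identifier -- but no element that says whether a
# full-text object sits behind dc:identifier.
print(harvested_fields(OAI_DC_RECORD))
```

Whether the record above is a citation-only stub or a full deposit, the harvested metadata looks identical, which is exactly why my library system (and Google Scholar) had no way to tell the difference.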

But the reason that made me stop the experiment pretty much immediately was reading this little bit of hearsay on Twitter:

Google and Google Scholar are responsible for the vast majority of our IR’s traffic and use. In many disciplines, Green OA articles easily make up less than 25% of total faculty output. To publish citations when the full text of a pre-print manuscript is not made available to the librarian is ultimately going to test whether Google Scholar really does have a full-text threshold. And then what do we do when we find our work suddenly gone from search results?

Yet, the motivation to try to capture the whole of a faculty’s work still remains. An institutional repository should be a reflection of all the research and creative work of the institution that hosts it.

If an IR is not able to do this work, an institution is more likely to invest in a CRIS – a Current Research Information System – to represent the research outputs of the organization.

Remember when I wrote this in my post from March of this year?

When I am asked to explain how to achieve a particular result within scholarly communication, more often than not, I find myself describing four potential options:

– a workflow of Elsevier products (BePress, SSRN, Scopus, SciVal, Pure)

– a workflow of Clarivate products (Web of Science, InCites, Endnote, Journal Citation Reports)

– a workflow of Springer-Nature products (Dimensions, Figshare, Altmetrics)

– a DIY workflow from a variety of independent sources (the library’s institutional repository, ORCiD, Open Science Framework)

If the map becomes the territory then we will be lost

The marketplace for CRIS products is no different.

But investing in two separate products – a CRIS to capture the citations of a faculty’s research and creative output, and an IR to capture the full text of the same – still seems a shame. Rather than spend a large sum of money on the quick win of a CRIS, we should invest those funds in an IR that can support data re-use, institutionally.

(What is the open version of the CRIS? To be honest, I don’t know this space very well. From what I know at the moment, I would suggest it might be the institutional repository + ORCiD and/or VIVO.)

I am imagining a scenario in which every article-level work that a faculty member of an institution has produced is captured in the institutional repository. Articles that are not allowed to be made open access are embargoed until they are in the public domain.

But to be honest, I’m a little spooked because I don’t see many other institutions engaging in this practice. Dark deposit does exist in the literature, but it largely appears in the early years of the conversations around scholarly communications practice. The most widely cited article on the topic (from my reading, not from a proper literature review) is this 2011 article called The importance of dark deposit from Stuart Shieber. His blog is licensed as CC-BY, so I’m going to take advantage of this generosity and re-print the seven reasons why dark is better than missing:

  1. Posterity: Repositories have a role in providing access to scholarly articles of course. But an important part of the purpose of a repository is to collect the research output of the institution as broadly as possible. Consider the mission of a university archives, well described in this Harvard statement: “The Harvard University Archives (HUA) supports the University’s dual mission of education and research by striving to preserve and provide access to Harvard’s historical records; to gather an accurate, authentic, and complete record of the life of the University; and to promote the highest standards of management for Harvard’s current records.” Although the role of the university archives and the repository are different, that part about “gather[ing] an accurate, authentic, and complete record of the life of the University” reflects this role of the repository as well. Since at any given time some of the articles that make up that output will not be distributable, the broadest collection requires some portion of the collection to be dark.
  2. Change: The rights situation for any given article can change over time — especially over long time scales, librarian time scales — and having materials in the repository dark allows them to be distributed if and when the rights situation allows. An obvious case is articles under a publisher embargo. In that case, the date of the change is known, and repository software can typically handle the distributability change automatically. There are also changes that are more difficult to predict. For instance, if a publisher changes its distribution policies, or releases backfiles as part of a corporate change, this might allow distribution where not previously allowed. Having the materials dark means that the institution can take advantage of such changes in the rights situation without having to hunt down the articles at that (perhaps much) later date.
  3. Preservation: Dark materials can still be preserved. Preservation of digital objects is by and large an unknown prospect, but one thing we know is that the more venues and methods available for preservation, the more likely the materials will be preserved. Repositories provide yet another venue for preservation of their contents, including the dark part.
  4. Discoverability: Although the articles themselves can’t be distributed, their contents can be indexed to allow for the items in the repository to be more easily and accurately located. Articles deposited dark can be found based on searches that hit not only the title and abstract but the full text of the article. And it can be technologically possible to pass on this indexing power to other services indexing the repository, such as search engines.
  5. Messaging: When repositories allow both open and dark materials, the message to faculty and researchers can be made very simple: Always deposit. Everything can go in; the distribution decision can be made separately. If authors have to worry about rights when making the decision whether to deposit in the first place, the cognitive load may well lead them to just not deposit. Since the hardest part about running a successful repository is getting a hold of the articles themselves, anything that lowers that load is a good thing. This point has been made forcefully by Stevan Harnad. It is much easier to get faculty in the habit of depositing everything than in the habit of depositing articles subject to the exigencies of their rights situations.
  6. Availability: There are times when an author has distribution rights only to unavailable versions of an article. For instance, an author may have rights to distribute the author’s final manuscript, but not the publisher’s version. Or an art historian may not have cleared rights for online distribution of the figures in an article and may not be willing to distribute a redacted version of the article without the figures. The ability to deposit dark enables depositing in these cases too. The publisher’s version or unredacted version can be deposited dark.
  7. Education: Every time an author deposits an article dark is a learning moment reminding the author that distribution is important and distribution limitations are problematic.
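Several of these reasons – especially Change and Availability – reduce to a simple state model: the file is acquired and kept from day one, and distributability is a function of rights and dates rather than of whether the file exists. A toy sketch of that model, with class and field names of my own invention rather than any repository platform’s actual API:

```python
# A toy model of a dark-deposit record; names and dates are hypothetical.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Deposit:
    title: str
    has_fulltext: bool
    embargo_until: Optional[date] = None  # None = no known lift date

    def distributable(self, today: date) -> bool:
        """Dark until rights allow distribution; the file is kept regardless."""
        if not self.has_fulltext:
            return False
        if self.embargo_until is None:
            return False  # dark indefinitely, pending a rights change
        return today >= self.embargo_until

# Deposited dark under a one-year publisher embargo (made-up dates).
paper = Deposit("Some Article", has_fulltext=True,
                embargo_until=date(2021, 4, 10))

print(paper.distributable(date(2020, 5, 1)))   # False: still dark
print(paper.distributable(date(2021, 4, 10)))  # True: embargo has lapsed
```

The point of the sketch is reason 2 above: when a publisher’s terms change or an embargo lapses, flipping a record from dark to open is a metadata edit, not a hunt for a file you never collected.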

There is an additional reason for pursuing a change of practice to dark deposit that I believe is very significant:

There are at least six types of university OA policy. Here we organize them by their methods for avoiding copyright troubles…

3. The policy seeks no rights at all, but requires deposit in the repository. If the institution already has permission to make the work OA, then it makes it OA from the moment of deposit. Otherwise the deposit will be “dark” (non-OA) (See p. 24) until the institution can obtain permission to make it OA. During the period of dark deposit, at least the metadata will be OA.

Good Practices For University Open-Access Policies, Stuart Shieber and Peter Suber, 2013

At least the metadata will be OA is a very good reason to do dark deposit. It might be reason enough. I share much of Ryan Regier’s enthusiasm for Open Citations, which he explains in his post, The longer Elsevier refuses to make their citations open, the clearer it becomes that their high profit model makes them anti-open.

Having a more complete picture of how much an article has been cited by other articles is an immediate clear benefit of Open Citations. Right now you can get a piece of that via the above tools I’ve listed and, maybe, a piece is all you need. If you’ve got an article that’s been cited 100s of times, likely you aren’t going to look through each of those citing articles. However, if you’ve got an article or a work that has only been cited a handful of times, likely you will be much more aware of what those citing articles are saying about your article and how they are using your information.

Ryan Regier, The longer Elsevier refuses to make their citations open, the clearer it becomes that their high profit model makes them anti-open

Regier takes Elsevier to task because Elsevier is one of the few major publishers that still refuse to make their citations OA.

I4OC requests that all scholarly publishers make references openly available by providing access to the reference lists they submit to Crossref. At present, most of the large publishers—including the American Physical Society, Cambridge University Press, PLOS, SAGE, Springer Nature, and Wiley—have opened their reference lists. As a result, half of the references deposited in Crossref are now freely available. We urge all publishers who have not yet opened their reference lists to do so now. This includes the American Chemical Society, Elsevier, IEEE, and Wolters Kluwer Health. By far the largest number of closed references can be found in journals published by Elsevier: of the approximately half a billion closed references stored in Crossref, 65% are from Elsevier journals. Opening these references would place the proportion of open references at nearly 83%.

Open citations: A letter from the scientometric community to scholarly publishers

There would be so much value unleashed if we could release the citations to our faculty’s research as open access.

Open Citations could lead to new ways of exploring and understanding the scholarly ecosystem. Some of these potential tools were explored by Aaron Tay in his post, More about open citations — Citation Gecko, Citation extraction from PDF & LOC-DB.

Furthermore, releasing citations as OA would enable them to be added to platforms such as Wikidata and visualized using the Scholia tool.

So that’s where I’m at.

I want to change the practice at MPOW to include all published faculty research, scholarship, and creative work in the Institutional Repository. If we are unable to publish a work as open access in our IR, we will include it as an embargoed, dark deposit until it is confidently in the public domain. I want the Institutional Repository to live up to its name and hold all the published work of the institution.

Is this a good idea, or no? Are there pitfalls that I have not foreseen? Is my reasoning shaky? Please let me know.