Quarantine Day 1

2024.03.20

Notes from the Homeland: Quarantine Day 1

While we may never know when COVID-19 first appeared, we can definitely date the moment here in the homeland when people realized that maybe they should take it seriously. It was the day the state closed K-12 schools for the month. It was also the day that the local university decided to cancel classes for two days and then re-open as an online-only institution. That was the day the toilet paper really began to fly (off the shelves).

It was a day like any other day for me. I drove the girl to way-too-early-in-the-morning track practice, came home, had a cup of coffee, prepared for class, and went to campus. In class, we discussed our contingency plan, and even managed to squeeze in a bit of discussion about the assigned reading.

As class ended, one of my students who is an RA (a residential assistant in a university dorm) announced that he had just gotten word that the university was in fact going online. Okay, we decided, good thing we had a plan. Everyone filed out. I went upstairs and attended a webinar on alternative ways to approach grading papers. It was just me, a grad student, and the faculty member who organized it, and we had to huddle around a laptop — because the room’s equipment was, of course, not working — but we enjoyed ourselves and the physical intimacy made it feel less like a webinar and more like a conversation.

Afterwards I headed home, where I heard that the governor had announced that the state was closing all public schools until the middle of next month. Oh, I thought. Now things are going to get goofy.

I decided that the best thing I could do was grab our standing household grocery list, add a few items for a long-ish weekend, and head to the closest grocery store to get a shop in before all the parents picking up kids from school, knowing they wouldn’t be going back for a month, decided they needed to stock up for the apocalypse.

Too late.

When I walked into the store, I didn’t really worry that the cart I grabbed was the last one: this particular store isn’t necessarily the most organized, and they are often running low on carts. And it wasn’t that crowded as I worked my way through the produce. But by the time I cleared the meat section and was heading to the back corner of the store to pick up milk and eggs, it became clear something was weird: there was a line of carts.

As I crossed the middle aisle that runs the length of the store, I saw that the line of carts ran from the back to the front. As I continued on my way to the back corner of the store, I was following the line of carts. As I turned the corner to go forward again to the bread aisle, I was following the line of carts. The line of carts was wrapping itself around the store.

And the line wasn’t moving, only growing longer.

I looked at the handful of items in my cart, and I turned to the store employee who had his phone out to photograph the line. I apologized as I told him that I was abandoning my cart.

“No problem,” he said. “I’ll push it back into the cold walk-in.”

“Thank you.”

“You know we open at five in the morning?”

“I’ll see you then.”

And I left and came home and stayed home until the sun went down.

Of Portals and Platforms

2024.02.28

A few months ago I mused elsewhere online that civilization would end in portals. That observation came after a period of travel in which I had to wade not only through my own organization’s portal but also through another organization’s. I had also applied for a few jobs and submitted recommendations for students and colleagues, and I was portaled out. Some organizations were renting their portal from the same vendor, and so it would both recognize you and not recognize you. Somewhere in Dante’s inferno, there are portals. I am sure of it.

It’s not clear to me how much value portals bring to any organization beyond giving management the appearance of “having done something.” This kind of check-box-ism is, of course, the first and last resort for the kinds of managers who can slow a good organization, trip a decent organization, and who appear to flock to bad organizations, which, being bad organizations, cannot distinguish between good and bad management.

So far as I can tell, when it comes to portals, the logic of such management appears to be “the more the better.” And hapless employees are then forced to sign onto and off a variety of portals just to get the basics done. One portal for travel management. Another for travel reimbursement. Another for health records. Another by the insurer. Another by the hospital or medical provider. Yet another portal for performance evaluation.

None of these portals talk to each other, and so the chief task of the employee or patient appears to be to enter the same data over and over and over again all while juggling multiple login identities and a variety of password parameters — this site requires symbols; this site rejects symbols — and captchas — because who hasn’t fulfilled their lifetime quota of clicking on tiles that contain fire hydrants?

And none of this even touches on the portals-as-platforms to which we subscribe, which create similar amounts of drudgery for us. Take, for example, a recent interaction with ResearchGate, which emailed me the following:

[Screenshot-2024-03-06-01]

And when I clicked on the link in the email, it took me to this page:

[Screenshot-2024-03-06-02]

There is no reason, absolutely no reason, that that information could not have been in the original email. And if it had been, I would have appreciated the lack of friction ResearchGate offered me. Instead, I clicked, and had to log in, only to learn this rather small fact.

From some manager’s point of view, they have created engagement. From my point of view, they’ve taken something decent and good and portalized it.

Fall 2024 Courses

2024.02.28

Here are the courses I am teaching this fall. The 432 is a regular feature now in our folklore course offerings and I have taught it with a focus on legends and information cascades for a few years now. The 370 is a new course, and I have to say I am looking forward to teaching a course with a non-folklore focus. I don’t get to do this often, and I am excited to see what students bring to the table.

ENGL 370. Interactive Fiction & Narrative Games. Branching narratives, interactive fiction, text adventures, CYOA all describe a form of entertainment—be it literary, performed in a group, or in a video game—in which a reader is given choices and their choices determine the nature and outcome of the story. This course explores the history of narrative games, from collaborative storytelling in oral cultures to robust open-world games to cinematic universes in which multiple storylines exist (and sometimes interact). Course inputs include reading, viewing, and playing. Course outputs include analytical explorations of forms and mechanisms and the development of fictions of your own.

ENGL 432. American Folklore. The subtitle of this course is “Legends, Conspiracy Theories, Cryptids, Oh My!” This course seeks to explore the world in which all of us are already immersed, an online sea of information and misinformation. What are the impulses behind these flows, and what are their diverse functions? From the moment that humans became capable of re-presenting reality, we were engaged in various forms of fiction. Some forms are obviously meant for entertainment, like tales and jokes, and other forms are meant to inform and guide us, like myths and histories. In between are the stories we tell, the information we pass along, and the arguments we make in which we conjecture about the nature of reality. Individuals interested in this course should be aware that there is as much darkness as light in what we consider and should be prepared to handle topics objectively.

A Star to Steer By

2024.02.28

This semester I am teaching a class on Project Management in Humanities Scholarship. I have seen enough graduate students stumble when shifting from the managed research environment of course papers to the unmanaged research environment of the thesis or dissertation that I thought it would be useful to try out some of the things we know about how best to manage projects in general, as well as to offer what I have learned along the way. The admixture of “experts agree this works” and “this works for me” opens up, I hope, a space in which participants can find themselves with a menu of options from which they feel free to choose and try. Keep doing what works. Stop doing what doesn’t.

We are a month into our journey together and almost everyone has finally acceded to the course’s mantra that doing something is better than doing nothing (because the feeling of having gotten anything done can be harnessed to build momentum to get something more important done), but there are a few participants who are still frozen at the entry door to the workshop where each of us, artisan-like, is banging away on something or other.

All of them have interesting ideas, but some are struggling with focus. I think this is where the social sciences enjoy an advantage. They have an entire discourse, which is thus woven into their courses and their everyday work lives, focused on having a research question. What that conventionally means is that you start with a theory (or model) of how something works; you develop a hypothesis about how that theory applies to your data (or some data you have yet to collect because science); and then you get your results in which your hypothesis was accurate to a greater or lesser degree.

Two things here: the sciences have the null hypothesis, which means they are (at least theoretically) open to failure.[1] The sciences also have degrees of accuracy. Wouldn’t it be nice if we could say things like “this largely explains that” or “this offers a limited explanation of that” in the humanities? Humanities scholars would feel less stuck because they would be less anxious about “getting it right.” We all deserve the right to be wrong, to fail, and we also deserve the right to be sorta right and/or mostly wrong. Science and scholarship are meant to be collaborative frameworks in which each of us nudges understanding just that wee bit further. (We’re all comfortable with the idea that human understanding of, well, anything will never be complete, right? The fun part is the not knowing part.)
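Since the argument leans on this statistical vocabulary, here is a minimal sketch, in Python, of what a null hypothesis and a “degree of accuracy” look like in practice. Everything in it is invented for illustration (the two groups, the sentence-length numbers, the comparison itself); it is not drawn from the course or from any real corpus, and it assumes only that SciPy is installed.

```python
# A minimal, hypothetical example: do two (made-up) sets of texts differ in
# average sentence length? The null hypothesis is that they do not.
from scipy import stats

# Invented average sentence lengths (in words) for two small groups of texts.
legends = [14.2, 16.8, 15.1, 13.9, 17.3, 16.0]
news_reports = [19.4, 21.1, 18.7, 20.3, 22.0, 19.9]

# A two-sample t-test asks how surprising the observed difference would be
# if the null hypothesis were true.
t_stat, p_value = stats.ttest_ind(legends, news_reports)

# The p-value is not a yes/no verdict; it is a degree of evidence against the
# null hypothesis, which is roughly the "largely explains" / "offers a limited
# explanation" vocabulary the paragraph above wishes the humanities had.
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```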

The null hypothesis works very clearly when you are working within a deductive framework, but it is less clear when you are working in an inductive fashion. Inductive research usually involves starting with some data that you find interesting, perhaps in ways you can’t articulate, and your “research question” really amounts to “why do I find this interesting?”, which you then have to translate/transform into “why should someone else find this interesting?” Henry Glassie once explained this as the difference between having a theory and needing data to prove it, refine it, or extend it, and having data and needing to explain it.

There is also a middle ground which might be called the iterative method, wherein you cycle between a theory or model, collecting data, and analyzing that data. Each moment in the cycle helps to refine the others: spending time with the data gives you insight into its patterns (behaviors, trends) which leads you to look into research that explores those patterns, trends, behaviors. Those theories or models then let you see new patterns in your texts that you had not seen before, or, perhaps, make you realize that, given your interest in this pattern, maybe you need different texts (data) to explore that idea.

I see a lot of scholars, junior and senior, stuck in the middle of this iterative method without realizing it, not knowing which moment to engage first. What should they read … first? (I have seen the panic in their faces.) What I tell participants in this workshop is that it doesn’t matter. They can start anywhere, but, and this is important, *start*. No one cares whether you start by reading a novel (and taking notes) or reading an essay in *PMLA* (and taking notes). 99% of managing a project as an independent researcher is just doing something and not letting yourself feel like you don’t know where to start. Just start.

Will the outcome be the project they initially imagined? Probably not. But let’s be honest: that perfect project they initially imagined lived entirely in their heads—as it does for all of us. It was untroubled by anything like work. (That’s what makes it ideal!) It was not complicated by having to determine where the outcome might be published, who might be interested, or to what domain it might contribute. It was also unavailable to anyone else, inaccessible to anyone else, and probably incomprehensible to anyone else. As messy and subpar as the things we do in the hours we have may seem in comparison to that initial dream, they are at least accessible to others, who will probably find them interesting and/or useful.

To be clear, I usually press workshop participants and students to start with data collection/compilation (and not with a theory). Mostly that’s because I am a folklorist (and sometime data scientist) and I feel at my most driven when a real-world phenomenon demands that I understand it. To a lesser extent, as comfortable as I am with my own theoretical background, I find the current explosion in all kinds of theories a bit overwhelming. I prefer to let the data tell me what I need to go learn, else I might end up going down the rabbit hole of great explanations and never get anything done!


[1] The sciences are currently undergoing a pretty severe reconsideration of the “right to be wrong.” With the cuts in funding to so many universities — because, hey, the boomers got their almost free ride and shouldn’t have to pay for you — the American academy has shrunk, creating greater competition for the jobs that remain, which has meant that scientists often feel like they can’t fail. Failure must be an option when it comes to science, and scholarship. When it isn’t, we end up with data that has been, perhaps purposefully or perhaps unconsciously, misconstrued because the results need to be X.

Test File

2024.02.06

This is for the text analytics class: here is the file you are looking for.

Raising a World Builder

2024.02.04

A recent comment I made on the current state of education in the humanities on LinkedIn drew a fair amount of attention. I’m not linking to that comment here as it was of a moment, but there are some things I have observed based both on being a parent of a particular kind of thinker as well as documenting similar kinds of thinkers out in the world. I call them world builders here, but they might also be called immersive thinkers.

Origins

In the car one morning on the way to her school I commented to my daughter that the rain had made driving a bit more difficult than usual and that I would have to make sure to keep two hands on the wheel. It was, for me in that moment, simply a metonym for paying attention, and, I confess, a way of letting my daughter know that her dad may not be paying as close attention to our conversation as we both often enjoyed. Over the years of a morning commute that got her to school and me to work, we had enjoyed a wide variety of conversations, which sometimes ran sufficiently wild, especially at her end, that I had to remind her, as a way of reminding myself, that driving was the higher priority.

A little too often my reminders came out more as chides, which I always regretted. As was often (thankfully) the case, my daughter performed some conversational judo on it by responding, “What if you had three hands?” Her first thought was that I could drive and wave to drivers nearby, but quickly she spun the idea out into a variety of possibilities before settling down into playing a variety of instruments with three hands: there was a three-handed piano piece, then a three-handed guitar melody, and then a three-handed trumpet call. The sounds grew wilder and weirder, and her laughter built from giggles to squeals.

Her first move displayed the power of divergent thinking, something which has been explored quite a bit over the past few decades in creativity studies, but her next move was to dwell in a particular domain, to immerse herself in a world, and to play with the possibilities there. For the time being, I would like to call that immersive thinking. It is surely related to that kind of thinking that we sometimes call rich mode or right brain thinking in a way that I want to spend more time thinking about — and to which I am open to suggestions![*]

World-building was, and is, like a reflex action for my daughter. From the time she could speak, she spun out stories. She usually enacted the stories, dramatizing them with props and costuming if she was a character or animating a wide variety of objects, some of them more obviously meant for such use and others not. I can’t, for example, count the number of times objects at restaurant tables came to life and led complex social lives when adult conversation became uninteresting to her. My wife and I saw utensils be sisters, salt and pepper shakers be parents, and a tented napkin become a home.

It was, and is, an amazing thing to watch, but as many creative individuals know, such an ability does not come without its penalties. While her school labeled her a “deep creative,” it seemed largely a way of admitting they were unable to come up with a plan for how to make a space within which she could learn and grow to suit her own abilities and interests. Don’t get me wrong: she did well (enough) in school, but that’s largely because we worked hard at home to help her adapt to the regimen at school. And so she got high marks, but those marks were also regularly accompanied by comments from well-meaning and really nice teachers that she “did not pay attention” as well as she should, that she was “daydreamy,” or that “sometimes she just phones it in.”

One could perhaps fault the teachers, but I rarely find individuals are the problem in these circumstances. More often a system is at work. In this case, I think it’s fair to blame a larger educational ideology that has come to rely upon standardized tests as one of its central metrics. In a moment that resembles the classical economics parables about unintended consequences, what so many of us face, as parents amid the paroxysms of our children or ourselves, is an entire educational system which many believe is headed precisely in the wrong direction for what look like reasonable, well, reasons.

Indeed, an entire cluster of industries has arisen around the wobbling of the educational infrastructure in our country. The technorati favor two flavors that are not necessarily mutually exclusive. The first flavor is that articulated by Ken Robinson, who argues that our schools are stuck in the industrial age, anxiously trying to turn out uniform widgets in a moment when standardization couldn’t be less useful – the assumption being that things are changing more quickly and less predictably than ever. I don’t subscribe fully to this latter notion, but it’s not hard to see that the current context for businesses favors only a few large incumbents with stability, but employment with those incumbents, as two decades of layoffs and jobs moving from one part of the world to another have proved, is not stable. In other words, institutions have stability, but only individuals at the top of those institutions get to enjoy the fruits of that stability.

Outside of those narrow mountaintop retreats, there’s a whole host of changes taking place as industries transform in the face of an amazing amount of computing power. My own industry, higher education, is facing such a transition, but think about even the way manufacturing is changing as building components becomes less about removing metal by mill and lathe work or stamping and cutting and more about “printing” them by building up a part molecule by molecule. Suddenly, economies of scale matter less and sheer imagination matters more. (Well, you’ll still need quite a bit of capital to have such a “printer” at your disposal, but that’s a return to a history we have seen already – i.e., the original printing press!)

What to do with our little geek, our world builder?

Here’s the short of it: our daughter was a geek. She had all the classic geek traits: she preferred to be fully immersed in a problem or project or world, and she oscillated between wanting external affirmation for her accomplishments and not caring what others think. Most geeks I know are like this. Many of them truly believe they don’t need anyone’s approval, and for a few of them that may very well be true. I also know, speaking as a geek (I think) myself, that, yes, sometimes a nod from someone you respect is not only all you need, but it is something you really want.

A lot of curricula which have high geek probabilities have switched to more project-oriented pedagogies. We are seeing more of it in engineering, and it has always been a prominent part of architecture. But what to do with our geeks, our world builders, in other domains? How do we re-rig systems at least to allow them to think the way they think?

An example from her experience:

For a time, our daughter was in the school choir. Every year the choir put on a musical. One year it was Charlie and the Chocolate Factory; another it was The Wizard of Oz. Every year students auditioned for a role in the play. Now, how do you suppose those auditions took place? Did it come after watching the film version or reading all or parts of the book? Did it come after listening to some of the story’s most famous passages and songs? That is, did it allow an immersive thinker an opportunity to do what they do best: get inside a world and look around, elaborate it, play with it? No, the auditions were songs from someplace else, handed out the week or so before the auditions. Students were told to practice the songs, do their best, and decisions would get made.

Now, that approach works if a student is procedurally driven and understands the necessity of, or already desires, adult approval. It doesn’t work at all for the student who needs to live and breathe inside a thing, to get a sense of it, to find their excitement there.

Fundamentally, this comes down to the difference between having teachers at the center of a curriculum and having students at the center. As a teacher myself, I know I can’t be all things to all students, and in a post to follow, I want to think more about how education might be made better for more kinds of learners than it currently is. In fact, I worry about one recent trend in particular: the rise of the master teacher and what that means for learning differences — here, learning differences are meant much more broadly than they are in the education industry.

Telling Stories with Data

2024.01.07

In Fall 2023 I led a course on digital storytelling. In preparing for the course, I wanted to see what others were doing, and so I searched for course listings, tracked down syllabi, and compared assignments and foci. It was fascinating to see the range of things being done. One thing that I did not fully expect was how often a search for “digital storytelling” washed me up on data science beaches. The graphic below tells the story, but I also want to collect more links to see what I can learn. (See the list below.)

Storytelling among the Four Pillars of Data Science

Copyright and AI

2024.01.05

Some time last year, comments were requested on the matter of AI and copyright. I submitted the following.

I am writing for myself, but as a folklorist I am also writing with profound respect, and sadness, for our national tradition of enabling private profit at the cost of the public commonwealth. Like the pharmaceutical industry raiding traditions around the world in order to develop better, perhaps life-changing, medicines, we have allowed the large language models behind most of the more prominent AI platforms to harvest the knowledge of a lot of individuals without the individuals themselves receiving any acknowledgment, compensation, or share in the profit. Whether we call it “folk” or “mass,” we disenfranchise those who actually produce the materials from which we derive products.

We cannot fall back on user agreements which, in order for the basics of the web to work, had individuals consent to broad grants of copyright. We must acknowledge that most users posted texts, images, and other media assets to various platforms and sites in the interest of creating and maintaining various communities. That they were willing to be sold to advertisers, because that is the basis for American media production, should not in any way suggest that their materials, and thus the people themselves to some degree, can simply be given to AI platforms. At least the social media platforms gave them something of value in exchange. AI platforms are already monetized, seeking rent for creating an abstraction of a city built of neighborhoods built by others.

We cannot know what the eventual outcome of the development of these AI platforms will be, and I don’t think referencing the hype or the fear-mongering does any good here. What we can know is that a system’s integrity must be clear and checked throughout the process. Right now, we can say for certain that these systems were built without integrity when it comes to their data acquisition. If we do not figure this out, if we do not create useful guidelines for clarity and integrity, then we are somewhat dooming these systems to have further negative impacts.

Comment Tracking Number: lm9-e4zx-p2oy

A Statistical Ouroboros

2024.01.04

I’m preparing to teach text analytics, the first time such a course has been offered at my university. I came across this great moment in John Scalzi’s Redshirts where statistical analysis is mentioned, but I can’t find a way to include it in the syllabus:

“So what you’re saying is all this is impossible,” Dahl said.

Jenkins shook his head. “Nothing’s impossible,” he said. “But some things are pretty damned unlikely. This is one of them.”

“How unlikely?” Dahl asked.

“In all my research there’s only one spaceship I’ve found that has even remotely the same sort of statistical patterns for away missions,” Jenkins said. He rummaged through the graphic elements again, and then threw one onto the screen. They all stared at it.

Duvall frowned. “I don’t recognize this ship,” she said. “And I thought I knew every type of ship we had. Is this a Dub U ship?”

“Not exactly,” Jenkins said. “It’s from the United Federation of Planets.”

Duvall blinked and focused her attention back at Jenkins. “Who are they?” she asked.

“They don’t exist,” Jenkins said, and pointed back at the ship. “And neither does this. This is the starship Enterprise. It’s fictional. It was on a science fictional drama series. And so are we.”

How to be bored

2023.12.31

In a response to a video by Parker Settecase on the utility of boredom and of capitalizing on it by using a notebook, theorangecatmom noted:

I’m not that smart, but my Dad taught me to use my brain to entertain myself when bored pre-cellphones. I still make myself practice it when I’m in a waiting room or sometimes on my breaks at work. As a kid, he taught me to count things, find patterns in the stuff around me, play mental math games, stuff like that. As an adult, when I’m surrounded by people, I sometimes just listen or watch what they’re doing and think about why they might be doing it. I call it practicing being bored and it blows people’s minds that I do it on purpose.

To see all posts, see the archive.
