On Deception

In my current work, I was recently asked to explain how deception works in the current technological / social media environment. Below is my thinking so far.

1

First, I would suggest that when it comes to information, it is better to imagine it as a constant flow that circulates in and through individuals and groups. The same bit of information may pass through one individual or group and find no reception, either dying there or getting passed along in a truncated or perfunctory fashion, but even that repetition can be dangerous: we have seen thousands of instances of information passed along in one group as "silly" being taken seriously by another group. The prime example of this is "Birds Aren't Real," which began entirely as an off-hand joke, grew into a collaborative internet fiction, and was then taken up by what amount to "true believers."

For the record, this dynamic has long been present in vernacular communities. Folklorists were examining it after the Second World War, and some of the initial reports were published in the 1970s: the first use of "fake news" to describe the current phenomenon came from two folklorists in 1975. They noted then that the same information could pass through various groups with different valences and would either die or thrive depending upon a group's receptivity.

As I discuss in the social information systems module, what the various internet "platforms" have done (and this includes social media and games as well as a variety of websites: Facebook, remember, is in many ways nothing more than a giant website) is to harness extant human-information dynamics for the purpose of commodifying their human users. Information becomes, for them, simply a matter of holding users. The tools at their disposal are of two kinds. First, there is more than fifty years of research into human psychology, both pure research aimed at understanding human nature and applied research into how to use basic human programming, in terms of essential cognitive functions as well as common cultural functions, to capture, hold, and harness human attention in order to drive sales. Look no further than the constant re-arranging of grocery stores, which takes advantage of research both into human way-finding in general and into how people navigate stores in particular. A second example would be the incredible refinement of casinos and gaming machines.

The second tool at their disposal is the large-scale experimental structure of their own platforms. With so many people present, the ability to run A/B testing at scale, and automatically, is simply unimaginable to most of us. And yet it happens every time you visit almost any site that carries DoubleClick ads, let alone a site like Amazon. There is simply too much at stake for most corporations not to be constantly testing ways to capture, hold, and monetize your attention.
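To make the mechanics concrete, here is a minimal sketch of how such at-scale experimentation is typically wired up. All names here are hypothetical and not drawn from any platform's actual code: the core idea is that each user is deterministically hashed into a variant, so the same person always sees the same arm and behavior can be compared across arms without anyone storing an explicit assignment table.

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically assign a user to an experiment arm.

    Hashing the (experiment, user) pair means assignment is stable
    across visits and independent across experiments.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Over many users, the arms come out roughly balanced, which is what
# lets the platform compare engagement between the two experiences.
counts = {"A": 0, "B": 0}
for i in range(10_000):
    counts[assign_variant(f"user{i}", "headline_test")] += 1
```

The deterministic hash is the design choice that makes the testing "automatic": no coordination is needed between servers, and every page view silently contributes another data point.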

So far as I can tell, our adversaries are not yet at the same level of operations as the corporations involved, but that conclusion rests only on OSINT and my own reading. I have not been briefed into anything more. I base it on the fact that at least the Russians still appear to be in "see what sticks" mode, with humans as the primary creators of content. (Interestingly, the burnout rate for Russian trolls is about the same as that for Facebook moderators.)

If there is going to be automation of information operations, I would suggest that the Chinese will likely get there first. First, they have the infrastructure, in both people and computing power (along with the ability to create more computing power through their own system of fabs), and second, they are already sitting on an unbelievable amount of data, since a number of sites and services chose the cheaper services of Chinese providers over AWS, Azure, CloudFlare, and the like. Some sites and services even maintained their data "in the clear," allowing easy access as it transited between users and a server located in Shanghai. (One social media site stored the data unencrypted on such servers.)

On the matter of deception itself, it would appear that there is sufficient anxiety among a number of groups that information enacting or articulating that anxiety is reason enough to receive it and transmit it onward. (You can think of information flows as being like leaves riding the top of a creek: in some cases leaves collect to the side because there happens to be a small eddy there.) Folklorists, among others, have long documented the ways that legends and rumors, as well as a host of other forms (e.g., memes), are forms of projection that individuals use within various groups both to create and maintain the groups themselves. (Remember, all human relationships are largely informational in nature.)

2

Deception as a term is problematic in the current moment. The construction of social reality is a participatory, dynamic process in which information flows through individuals and the groups they populate and becomes not simply the foundation for their reality but the reality itself. In other words, reality is information-based. (Some might argue, and quite accurately, that economies are the true foundation, but decades of economic research have revealed the central role of information in any economic function.)

Part of the way economics works, whether in a totalitarian or a free market environment, is that those with more resources (power and money) have greater access to information and also greater capabilities to transmit information. That is, they can cast wider and deeper nets to collect information and they can also broadcast more widely and have the resources to do so for longer. This was true before the internet, and it is just as true now.

What the internet revolutionaries imagined was that it would make it possible for those at the margins of power to exchange information for personal development and that this self-enrichment would slowly make its way up the networks of individuals into communities and then into larger and larger economies such that life for all would slowly become richer, more informed, and, the thinking went, more democratic as more and more individuals connected with each other.

In any given community, there are those with more access to resources and those without, those who are more central to a group and those who are not. How communities allow people to live happily at the margins is one test of a community's resilience. In many traditional communities, some of the central spaces are actually given over to those who might otherwise be at the margins: in some Native American communities, those who were uncomfortable with themselves, often because they were homosexual or transgender, were considered to have two souls and thus greater access to the spiritual realm. In many traditional European communities, marginalized individuals often became a group's healer or its historian or storyteller. This model was so resilient that as European societies grew larger and larger, they maintained distinct places of honor for scholars and artists.

There are other kinds of misfits within any given community as well: those with misanthropic or violent tendencies. Almost every community has had to deal with such individuals and found a way to channel or at least blunt their impact. In one south Louisiana community with which I am familiar, every girl of a certain age knew to avoid a particular man during communal events like festivals and parades, especially when he had had a bit to drink. This lore passed among young women, and it appears to have made it possible for the community to function without any other intervention. (Please note that I am not saying this is an ideal, or even decent, solution to a social problem; merely that it was the one this community pursued. I have brushed up against similar situations in other research I have done; how much this kind of information makes up women's culture in some communities is something that has been examined elsewhere. For now, from at least one point of view, we have caused a great number of individuals within our communities to preoccupy themselves with information that, had the offenders been dealt with otherwise, would have freed up that information space for other things.)

In short, many communities had ways to isolate troublesome individuals, and that meant that the information they sought to transmit had nowhere to go. Those familiar with life BI (before the internet) will remember family reunions or other kinds of gatherings where the belligerent uncle (or aunt, but usually an uncle) or town elder would try to gather an audience, but whose pronouncements often fell on deaf ears or no ears. Well, the belligerent uncles found the internet, and have over the past decade proceeded to network and build a collaborative, if also often highly divergent, account of how wrong the status quo is.

It is so easy to do that even ordinary people with ordinary ideas, but with grievances about very particular things, have joined in. Two things have amplified this dynamic: first, networks connect, and so ordinary people with particular concerns find themselves connected with cranks with universal grievances. Second, bad actors who once had to content themselves with trying to infiltrate social groups can now simply post information into these networks without ever leaving St. Petersburg or wherever else they might lurk. It's that easy.

Deception suggests agency, usually an external agent seeking to divert or corrupt someone away from the truth, or at least from a commonly held belief that many regard as true. The same goes for disinformation: we want to separate misinformation from disinformation because ... why? Because misinformation is incorrect or untrue information passed along accidentally or without intent to do harm, whereas disinformation is intentional.

I would like to suggest an alternative grammatical scheme. I challenge you to think of a fact or some other reportable bit of information and place it within the following syntax: "I would like to inform you that ... " Now take that same bit of information, negate it, and place it within the following syntax: "I would like to misinform you ... " Also try "I would like to disinform you ... " Both are, in fact, true statements, are they not? And yet the awkwardness of the latter two is obvious as the words tumble out of your mouth.

While there might be mathematical theories of information, information itself is not mathematical. Negating a negative does not make it a positive. All information is a positive: individuals use bits of information to construct their realities. Information we regard as untrue or incorrect is just as useful to some individuals in their construction of reality as true or correct information. What we now find troubling is that many individuals appear actually to prefer untrue or incorrect information, because it is often easier to digest or creates a situation in which they are the heroes, or victims, of the moment. Folklorists had some sense of this BI, but we are now struggling to articulate how the dynamic has changed owing to the scale of online networks as well as their algorithmic nature, most of them being in fact commercial properties in which the attention of individuals is what is sold.

Defense Contexts

There are a lot more sources than these, but these are the ones I consistently consider in order to keep up with the (textual) context within which the Army operates.

Serious Security Publications

  • War on the Rocks bills itself as the “producer of essays and podcasts by experts and/or with deep experience in foreign policy and national security issues.”
  • Its sister publication, the Texas National Security Review, is more scholarly in nature.
  • The Strategy Bridge, like WotR, is a journal, a podcast, and a series of programs.

Defense Sector News Aggregators

Duck, Duck, Goose

This past summer my daughter came with me on one of my stays on post. We did not get to spend as much time together as I might have liked, but we still managed to cook up some crazy ideas, one of which involved the geese on post not being all that they appeared to be. The result was the following:

Duck, Duck, Goose

by

Lily Wu-Laudun and John Laudun

2021

EXT. NIGHT. The screen is filled with white light that dims and focuses into a pair of headlights as a car makes its way past the camera. A hundred yards past the camera, its brake lights brighten as it stops. A figure only just silhouetted by the headlights pops out of the passenger side door, opens the rear passenger door, and lifts something out.

CUT to a pair of hands gently loosing a Canada goose into the water. The goose swims away. The camera follows and fades to black as the goose disappears into the distance. A car door is heard closing and the car itself is heard, but not seen, speeding off.

EXT. DAY (MORNING). A group of legs and shoes walks past sleeping geese that lie only six to ten feet closer to the camera. Voices are heard discussing various matters. Several groups go by and the geese hardly flutter. Some of those groups of legs are camouflaged and tucked into khaki boots. We are on a military installation.

The camera has been slowly zooming out during this time, taking in more and more of the scene in which groups of civilians and soldiers walk along a handful of sidewalks that wend their way along a grove of trees next to a lake. The geese are scattered in small clumps among the trees and as the shot widens, a few begin to move about.

Most of the geese make their way to the water, but one goose waddles its way toward one of the nearby buildings. People continue to flow by, but it's clear that the flow is slowing as people settle inside the buildings to work.

We see some go into a nearby building, again talking animatedly. We follow a small group of two men and a woman (two civilians and one soldier; the mix doesn't matter) through the open doorway and down a hall into an office. As they move into and through the building, we follow their conversation as well:

CAROL: It’s the damndest thing, Bob. We can’t figure out how they got that information.

BOB: And you’re sure you tracked all our possible emission points?

CAROL: Yes, Bob. We ran down all possible emitters. We even tracked ground lines. Everything. We got nothing.

BOB: Hmmm.

JERRY: I’m going to ask the dumb question: you’re sure it was hardware and not personnel?

CAROL: The fidelity of their capture was too high. It’s word for word. A person would have made a mistake or changed a word. It’s just how the brain works.

JERRY: Unless you had someone with an eidetic audio memory…

CAROL: Unless we had a number of someones with eidetic audio memories…

JERRY: It could happen.

CAROL: It could happen. (This second echo reveals she does not find this a useful conversational thread.)

BOB: So known hardware configurations. (It’s clear he’s trying to move the conversation back to something productive.)

The conversation fades somewhat as the camera cuts from the series of medium shots that have captured the speakers to a group shot with a window behind the three overlooking the trees and the lake. We see most of the geese in the distance, still sitting in the grass, but one goose is rather close and preternaturally still.

CUT to a CLOSE-UP of the goose. Its head moves very slowly, and as it does so, we catch a glimpse of a laser coming out of one of its “eyes.” A shot from behind its turned head reveals it is pointed at the room, and as we cut to a tight shot of the laser point hitting the glass (and the glass vibrating) we hear the voices within as their conversation carries on.

VOICES outside and nearby snap us back to our goose, which shakes its head, to the sound of servos, and then proceeds to dip down and pull at the grass, as geese do, while another group of legs walks past.

As the camera fades to black, we hear the voices of that group talking and the rustling of the goose’s feet in the grass with the slightest hint of a servo working.

THE END