Just, yes. I like that it’s colored blue, because I’ve been shouting this until I’m blue in the face.
Open access, for anyone who still doesn’t understand why the academic publishing industry needs reform.
October 26, 2012
I have no idea how many people read this blog who don’t also read Play the Past, but it probably makes sense to tease my Epic Life series a bit, since it’s directly on topic for what Anke and I are trying to accomplish here, as well.
In Epic Life I mean to take the formulations I made in that last post, about where a rules-of-the-text reading can get us, toward an understanding of how describing this great chain of practomime in relation to our lives in culture, as the job of the humanities, may allow us to measure at least with qualitative precision the kind of learning outcomes (call it παιδεία [paideia] if you want) humanistic study can effect: critical thinking, contextually sensitive analysis of cultural heritage materials, historically sensitive analysis of contemporary cultural performance. If my notion of text as ruleset has traction, then performances not only in the Iliad, in the Academy, and in Skyrim, but also on Facebook and Twitter, are on the one hand legible both as performances of rulesets and as, themselves, iterated rulesets, and on the other hand susceptible of development in both capacities. That development proceeds through analysis that is at the same time performance within the ruleset of a discipline, itself an iterated ruleset of the study of the humanities that reaches back through Plato to the Homeric bards themselves, in such passages as the “Embassy to Achilles,” where the bards analyze the warrior code in the voice of Achilles just as Plato will later analyze the work of the bards in the voice of Socrates.
Everyone knows it’s tough these days to keep up with all the technical innovation – there’s always a new code, a new program, a new cloud. As someone attending her first THATCamp (yup, newbie), I am humbled and fascinated by all these possibilities (the kid-in-a-candy-store simile comes to mind). I do try to keep up with digital arts, though (media aesthetics again, Roger), and the Playground Digital Arts festival always presents some intriguing work. Do check out older works, too, such as probe, a piece that challenges your notion of cinematic experience, as Boris Debackere explains:
“probe is a metaphor of cinema: cinema as a space shuttle, or a probe you enter and you are completely separated from the external world. Suddenly, the huge screen you have in front of you disappears and becomes a sort of window to travel in time and space. And you sit in this vehicle, but the whole trip takes place in your brain: you concentrate with your mind on the screen that becomes invisible, and everything that is projected on the screen puts your mind and imagination into action. The setting of the film is in your brain, not on the screen: it is part of the same dynamics of cinema, which stimulate your brain, the mechanism of your perceptive abilities.”
Are we soon going to perform as our own projection screens? Or have we been there all along?
After several conversations this week on the human-computer interface that is gaining greater currency as Digital (fill-in-field-as-necessary), Wired Mag’s musings feel… slightly 90s. The end of theory. Really? It makes me recall that cartoon we so revered as grad students, with the mice and the cereal box, when postmodernism was all the rage.
Breakfast as text! Foucault flakes. But I am digressing. And maybe 2012 isn’t 2008: declaring the end of theory, accompanied by the end of methodology, whether scientific or otherwise, is rather circular reasoning. For now, I, at least, am still grappling with context and variation, not models or theory or methods. Big data to detect correlations and define patterns, perhaps. But what kind of data? Let’s take this excerpt from the article:
“Google’s founding philosophy is that we don’t know why this page is better than that one: If the statistics of incoming links say it is, that’s good enough. No semantic or causal analysis is required. That’s why Google can translate languages without actually “knowing” them (given equal corpus data, Google can translate Klingon into Farsi as easily as it can translate French into German). And why it can match ads to content without any knowledge or assumptions about the ads or the content.”
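The “statistics of incoming links” the quote leans on is, at bottom, the PageRank idea: a page’s importance is computed purely from the link graph, with no semantic analysis at all. Here is a toy sketch of that iteration – the four-page link graph is my own invention for illustration, and this simplified power iteration is of course not Google’s actual system:

```python
# Toy PageRank: rank pages purely from the statistics of incoming links.
# The link graph below is invented for this example.
links = {
    "A": ["B", "C"],  # page A links to pages B and C
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
pages = sorted(links)
damping = 0.85  # standard damping factor from the PageRank literature

# Start with a uniform rank distribution.
rank = {p: 1.0 / len(pages) for p in pages}

# Power iteration: each page shares its rank among the pages it links to.
for _ in range(50):
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outgoing in links.items():
        share = damping * rank[page] / len(outgoing)
        for target in outgoing:
            new_rank[target] += share
    rank = new_rank

# C has the most incoming links, so it ends up ranked highest.
best = max(rank, key=rank.get)
print(best)  # -> C
```

Nothing in the computation “knows” what any page is about – which is exactly the philosophy the article is celebrating and Anke is questioning.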
One may feel flummoxed by the announcement that “no semantic or causal analysis is required.” I pursued the fun factor, however, and fed some Nietzsche into Google Translate. For some semantic and syntactic clarity (yes, it does exist) I chose his “Why I Am So Clever” from Ecce Homo (just a bit of postmodern irony), translated the German into English, and then copy-and-pasted an English translation to be re-translated into German. The results were hilarious – truly, even Nietzsche would have been amused. Especially when “pangs of conscience” was re-translated as certain anatomical elements that point to the posterior.
Granted, this is a cheap shot – and it may not have much to do with the science mentioned in the article. Patterns and correlations do not fill in for context and variation, though. Dare I mention the “human element”? If “science can advance even without coherent models, unified theories, or really any mechanistic explanation at all,” as claimed in the article, why do it? Or am I over-analyzing here…
The case here is way overstated, but I think the fundamental point is sound, and breathtaking. (You’re going to give me significant push-back here, Anke, I’m sure!)
“All models are wrong, but some are useful.”
So proclaimed statistician George Box 30 years ago, and he was right. But what choice did we have? Only models, from cosmological equations to theories of human behavior, seemed to be able to consistently, if imperfectly, explain the world around us. Until now. Today companies like Google, which have grown up in an era of massively abundant data, don’t have to settle for wrong models. Indeed, they don’t have to settle for models at all.
One thing I already see changing is that there’s a much shorter distance even in the humanities between data and analysis, a distance that used to be covered by theory. Or maybe I’m out to lunch. In my classroom, though, my students and I spend much less time hypothesizing and much more time analyzing. “The End of Theory”? Certainly not. The transformation of the role of theory, perhaps.
The suggested correspondence between magicians and ideology is intriguing – and makes me cringe. Who doesn’t remember those moments between blissful childhood and cynical adolescence when all of a sudden you realized that the magical performance in front of you, complete with the choices that came with the magician’s attention, had turned into a farce and that, all along, you had been led by the nose? And when the broad smiles (feeling special, good choices!) changed to a sneer? Maybe I’m all alone with this; but how can one be oblivious to the manipulations of the game, especially once the initial fun factor has lost its charm? Channeling Borges here, somewhat…
Maybe I just read Thomas Mann’s Mario and the Magician (1929) one time too many. Maybe I read The Hunger Games one time too many. Maybe I equate games with just too many rules one has to follow and thus wants to break. What would make games really magical is if they could undermine their own construction and gameness – if games could be changed at the core. Haymitch Abernathy’s discovery of the force field, and his use of it as a bouncing device, comes to mind. What keeps a gamer from feeling like a trapped mouse, the eager subject of the game makers’ disciplining layers of tests and competitions? When s/he can outwit the game maker – or is that just part of everyone’s ludic impulse?
Play is fun, games are fun, magic is fun. Especially if the game itself is part of the ludic enterprise – that’s magic. For those interested in ideological game making, how about Europa Universalis III…
I’m absorbed these days with questions of how ludic artifacts can use their manifest interactivity (the thing that in my view differentiates them from non-ludic artifacts like traditional film and novel) to do new things. The examples are starting to pile up: BioShock, “No Russian” in Call of Duty: Modern Warfare 2, and the controversial ending of Mass Effect are the examples that stand out for me among mainstream games thus far, but it definitely seems like games that actually use their interactivity in thematic ways are growing more frequent.
As a quirky, but I think very important side note, I found this piece by Alex Stone about the role of choice in the arsenal of a magician’s tricks very revelatory:
Another familiar force is known as Magician’s Choice, the equivoqué. The idea is to set up multiple paths to the same endpoint. In the simplest version, you deal two cards down on the table and ask the spectator to remove one of them. If your volunteer removes the card you want to force, you say “Ok, that’ll be yours.” If, however, the spectator removes the other card, you eliminate it, saying “Great, we’ll remove that one.” (Here you’re exploiting the ambiguity in the meaning of the word remove.) Either way the spectator winds up with the same card.
Two things spring to mind: “ludic” films like The Usual Suspects (along with ludic epics like, say, the Odyssey), and, more importantly to me at least, every narrative digital game ever that convinced you that you were making real choices. The great frontier of making this illusion of choice manifest to the player is the one I’m excited to think we’re facing, and the incredible variety of forces (of this illusionist type) one finds in traditional works of art – from the different ways Shakespearean comedies end with a marriage to the construction of the Forum of Augustus as a traditional Roman family home – gives me hope that the makers of games will be able to do a very great range of thematic things with it.
I’m suggesting that ideology can be viewed as a magician’s force, and that the connection may mean that narrative games have access to the machinery of ideology in a way they’re only beginning to explore. Worth pondering, maybe.