Posts Tagged ‘augmented reality’

Materiality Matters: Confronting Digital Dualism with a Theory of Co-Affordances


When someone sends a text to my phone, are they any less responsible for their comments than if they had said the same thing face-to-face? If someone says a photo I post is “unflattering” or “unprofessional,” do I feel like this says something about me as a person? Why has it become so common to equate the unauthorized use of someone else’s Facebook account with the violation experienced during rape that the term “fraping”* has come into popular usage? These are some of the questions that I think digital dualism prevents us from answering satisfactorily. In framing an alternative to dualist thinking, I argue that it is important to account for people’s changing sense of self (hinted at in the examples above). To do this, we must examine the material conditions of subjectivity, or, put simply, how what we are affects who we are.

Interestingly, my argument that we ought to take serious account of people’s changing sense of self closely aligns me with Nick Carr’s recent counter to the digital dualism critique. In fact, in re-reading his post, I realize that he is arguing for almost exactly the kind of theoretical framework that I am working to develop. Continue reading

The Myth of Cyberspace

“The Myth of Cyberspace” appears in The New Inquiry Magazine, No. 3: Arguing the Web.

In the early 1980s, when personal computing first became a reality, the faces of glowing terminals had an almost magical aura, transubstantiating arcane passages of 1s and 0s into sensory experience. In fact, the seemingly impenetrable complexity of what was unfolding behind the screen created a sense of mystery and wonderment. We were in awe of the hackers who could unlock the code and conjure various illusions from it; they were modern magicians who seemed to travel between two worlds: reality and cyberspace. One day, we imagined, these sages of cyberspace would leave their bodies behind and fully immerse themselves in the secret world behind the screen. Such images manifested themselves through the decades in films like Tron, Hackers, and The Matrix and in the fiction of the cyberpunk genre. When the public internet first emerged, images of cyberspace were already deeply embedded in our collective imagination; these images have become the primary lens through which we view and evaluate our online activity. For this reason, tracing the genealogy of the cyberspace concept reveals much about present cultural assumptions regarding our relationship with information technology.

The term “cyberspace” was coined by author William Gibson. In Neuromancer, he imagines it as

a graphic representation of data abstracted from the banks of every computer in the human system … Lines of light ranged in the nonspace of the mind, clusters and constellations of data.

A “nonspace,” meaning that cyberspace lacks the physicality that “space” conventionally implies. Gibson’s cyberspace is an imaginary setting where information takes on some of the properties of matter. Yet, cyberspace is transcendent; it requires leaving behind the body and the physical world that contains it. When hackers “jack in,” they are no longer conscious of the physical world. The hacker trades a physical body and environment for one constructed of digital information. It is important to note that, as the cyberpunk genre evolved, it increasingly wrestled with forms of consciousness that blended sensory inputs from physical and digital sources. Nevertheless, cyberspace, as an ideal type, involves total separation of physical and digital. Continue reading

There is No “Cyberspace”

This is a repost from Cyborgology. Comment there or @pjrey.

The words and ideas we use to make sense of the Web owe as much to science fiction (particularly, the cyberpunk genre) as they do to the work of technicians or to rigorous scientific inquiry. This is by no means a bad thing; the most powerful of such literary works call upon our collective imagination and use it to direct society to prepare for major transformations looming on the horizon. William Gibson’s (1984) Neuromancer was, no doubt, one such work. Neuromancer features the exploits of a “console cowboy” (i.e., a computer hacker) named Case, who travels across a dystopian world serving a mysterious employer. The work is notable for popularizing the term “cyberspace,” which Gibson coined a couple of years earlier in a short story called “Burning Chrome.”

In Neuromancer, Gibson described cyberspace as a “consensual hallucination” and, more specifically: “A graphic representation of data abstracted from the banks of every computer in the human system. […] Lines of light ranged in the nonspace of the mind, clusters and constellations of data.” Rather than just staring into a computer screen, hackers “jack in,” directly interfacing with these visual representations of data in their minds. The images described here are reminiscent of those portrayed in movies such as Tron (1982), Hackers (1995), and, to a lesser extent, The Matrix (1999). Continue reading

Ambient Documentation: To Be is to See and To See is to Be

This post was co-authored with Nathan Jurgenson.

We begin with the assumption that social media expands the opportunity to capture/document/record ourselves and others and therefore has developed in us a sort of “documentary vision” whereby we increasingly experience the world as a potential social media document. How might my current experience look as a photograph, tweet, or status update? Here, we would like to expand on this idea by considering what objective reality produces this type of subjective experience. Indeed, we are increasingly breathing an atmosphere of ambient documentation that is more and more likely to capture our thoughts and behaviors.

As this blog often points out, we are increasingly living our lives at the intersection of atoms and bits. Identities, friendships, conversations and a whole range of experience form an augmented reality where each is simultaneously shaped by physical presence and digital information. Information traveling on the backs of bits moves quickly and easily; anchor it to atoms and it is relatively slow and costly. In an augmented reality, information flows back and forth across physicality and digitality, deftly evading spatial and temporal obstacles that otherwise accompany physical presence.

When Egyptians dramatically occupied the physical space of Tahrir Square this past January… Continue reading

Equipment: Why You Can’t Convince a Cyborg She’s a Cyborg

Everybody knows the story: Computers—which, a half century ago, were expensive, room-hogging behemoths—have developed into a broad range of portable devices that we now rely on constantly throughout the day. Futurist Ray Kurzweil famously observed:

progress in information technology is exponential, not linear. My cell phone is a billion times more powerful per dollar than the computer we all shared when I was an undergrad at MIT. And we will do it again in 25 years. What used to take up a building now fits in my pocket, and what now fits in my pocket will fit inside a blood cell in 25 years.

Beyond advances in miniaturization and processing, computers have become more versatile and, most importantly, more accessible. In the early days of computing, mainframes were owned and controlled by various public and private institutions (e.g., the US Census Bureau drove the development of punch card readers from the 1890s onward). When universities began to develop and house mainframes, users had to submit proposals to justify their access to the machine. They were given a short period in which to complete their task, and then the machine was turned over to the next person. In short, computers were scarce, so access was limited. Continue reading

US Cyber Command & Augmented Warfare

Nope. It’s not a reference to some long-forgotten 80s movie. On June 23, 2009, former Secretary of Defense Robert Gates signed a memorandum creating US Cyber Command, a separate sub-command unit of U.S. Strategic Command (STRATCOM) headed by a four-star general (currently, Gen. Keith B. Alexander). And, despite all its digital dualist rhetoric (exemplified by the rampant use of terms like “cyberspace” and “cyber-attackers”), Cyber Command should be viewed as a major step toward the augmentation of warfare. With the launch of Cyber Command, the US has quietly moved toward developing new first-strike capacities that may, ultimately, prove more strategically important than even the nation’s nuclear arsenal.

While most media coverage has tended to focus on Cyber Command’s defensive postures (e.g., protecting classified data, securing the power grid, etc.), Cyber Command is also developing offensive capabilities to target and cripple other nations’ communication, transportation, and utility grids. This demonstrates that, in the augmented warfare of the future, an effective assault on atoms will also require a simultaneous assault on bits.

Cyber Command’s capacities, however, are far from fully developed. A recent report by the Government Accountability Office concluded that Cyber Command “has not fully defined long-term mission requirements and desired capabilities to guide the services’ efforts to recruit, train and provide forces with appropriate skill sets.”


Watch CBS sensationalize cyber-warfare and commit the digital dualist fallacy of comparing cyberspace to land and sea.

The Cyborgology of “Blade Runner”

“We’re not computers, […] we’re physical,” explains Blade Runner’s chief antagonist, a replicant named Roy Batty. In this moment of dialogue, Blade Runner engages a frequent theme of the Cyborgology blog—the implosion of atoms and bits, which we term “augmented reality.” In this statement, Roy unpacks the assumption that digitality and physicality are mutually exclusive while simultaneously transcending the boundary between the two. Put simply, Roy is contending that computers cease to be mere computers when they become embodied. In contrast to the familiar theme of cyborganic trans-humanism, Roy is articulating (and embodying) the obverse theory: trans-digitalism.

This Copernican turn—de-centering humans’ role in our understanding of the universe—is, undoubtedly, one of the great contributions of the cyberpunk genre (and science fiction, more broadly). Quite provocatively, it points to the possibility of a sociology, or even anthropology, in which humans are no longer the direct object of inquiry. The question here shifts from how we are shaped by and interact with our tools to how technology itself becomes an actor (or even an agent!) in a particular social milieu. Continue reading
