Chapter 1: Cyborg and extended self

When we pick up our cell phones mindlessly and hit autodial to contact our friend across town or across the planet, we take for granted that our bodies and minds have internalized a technological interface that is now seamless with the desire to speak to this friend. This makes the cell phone an extension of self, almost like a hand or foot. You simply reach for your techno-augmentation, mindlessly hit autodial, and talk.

It is wonderfully miraculous that we can electronically extend our sense of hearing and voice around the planet at the speed of modulated light waves, and of course, this means we can have agency and create action at a distance through our telecommunications infrastructure. Think about how e-commerce has changed the brick-and-mortar store: we can now order products over the web or by phone. The brick-and-mortar store still exists, but now it is “virtually” next door (for anyone with web access).

We often take this technology for granted when we approach an automatic glass door and feel confident that the door will open instantly, but this sensor-actuator feedback system is just one of countless microcontrollers sensing and responding to human needs and desires. While the automatic door-opening machine maps where we are in relation to the door, our own ability to cognitively remap ourselves into our machines is an equally remarkable feat of human-machine interaction, feedback, and symbiosis.
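The sense-decide-actuate loop behind such a door can be sketched in a few lines. This is a hypothetical illustration of the feedback logic, not the firmware of any real door controller; the state names, timings, and `door_state` function are all assumptions for the sake of the example.

```python
# Hypothetical sketch of an automatic door's sense-decide-actuate loop.
# All names and timings are illustrative, not from any real product.

def door_state(motion_detected: bool, hold_open_ticks: int, timeout: int = 3) -> tuple:
    """Return the door command and the updated hold-open countdown."""
    if motion_detected:
        return "OPEN", timeout              # someone is near: open and reset the timer
    if hold_open_ticks > 0:
        return "OPEN", hold_open_ticks - 1  # keep the door open while the timer runs down
    return "CLOSED", 0                      # no one around: close

# Simulate a person approaching the sensor, then walking away.
ticks = 0
log = []
for motion in [False, True, True, False, False, False, False]:
    cmd, ticks = door_state(motion, ticks)
    log.append(cmd)
# The door opens on motion and stays open for a grace period before closing.
```

The controller never "knows" a person is there; it only closes a loop between a motion reading and a motor command, which is precisely the kind of sensing-and-regulating the passage describes.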

Those of us who drive cars, ride bicycles, and use keyboards know this feeling of oneness with machines. Imagine flying a 747 jet and cognitively mapping your body into its 910,000 pounds of machinery and circuitry, as if your neural circuitry flowed through the electromechanical and robotic elements of that jet. Imagine the wheels of the plane’s landing gear as your toes touching down on the runway. These experiences redefine the notion of the corporeal in our new techno-augmented world.

The ability to move and control mega-machines is one important development in interfacing computer control to robotic actuators, but another, perhaps more profound and longer-lasting development will be our ability to connect very small machines to these same computers and control systems. The science of MEMS, or micro-electro-mechanical systems, will be the next revolution, following the agricultural, industrial, and informational revolutions.

 

Dust mite on a nano-etched machine. Image by Jay Jakubczak, Sandia National Laboratories.

MEMS use microfabrication technology to integrate sensors, actuators, mechanical elements, and electronics on a silicon substrate (the same technology used to make microchips).

The artists Laura Beloff, Erich Berger, and Martin Pichlmair have used a micromachine called an accelerometer, together with a microcontroller, to create the project Seven Mile Boots (2003–04), which senses the acceleration of the wearer and translates it into audio signals from the web, heard by the wearer. This allows humans walking in the red boots to access cyberspace. With Seven Mile Boots, the magical footwear known from folk tales enables its owner to travel seven miles with one step.

With little effort, the user can move a short distance in physical space but cross great distances in virtual space. The work uses an accelerometer (a MEMS device) and a microcontroller networked to an iPAQ handheld computer running Linux, which communicates with a server that accesses the web.
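The core of this sensing can be sketched simply: an accelerometer at rest reads roughly one gravity, and a footfall shows up as a spike away from that baseline. The threshold, sample values, and `is_step` function below are illustrative assumptions, not the artists' actual code.

```python
import math

# Hypothetical sketch of turning MEMS accelerometer readings into "step"
# events, in the spirit of Seven Mile Boots. Threshold and samples are
# invented for illustration; they are not from the artwork's firmware.

GRAVITY = 9.81        # m/s^2: the magnitude an accelerometer reads at rest
STEP_THRESHOLD = 2.0  # extra acceleration (m/s^2) we treat as a footfall

def is_step(ax: float, ay: float, az: float) -> bool:
    """A step registers when the acceleration magnitude departs from rest
    by more than the threshold."""
    magnitude = math.sqrt(ax**2 + ay**2 + az**2)
    return abs(magnitude - GRAVITY) > STEP_THRESHOLD

# Three simulated readings: standing still, a footfall spike, still again.
samples = [(0.1, 0.0, 9.8), (3.5, 1.0, 14.0), (0.0, 0.2, 9.7)]
steps = [is_step(*s) for s in samples]
# Only the middle sample crosses the threshold and would trigger audio.
```

In the boots, each detected step could then prompt the networked computer to fetch a chat line and speak it from the boot tips.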

 

Seven Mile Boots by Laura Beloff, Erich Berger, and Martin Pichlmair. 2003/2004.

 

The web chats are amplified and broadcast from the tips of the boots as spoken words, so your walk is always accompanied. Wherever you walk with these boots, your physical and virtual selves merge.

Our extensions of self into the virtual, the physical, the very large and the very small, mediated by computer processors, are generally connected to our bodies through the now ubiquitous mouse and keyboards, with the computer screen for visual feedback.

Some current generations of mice and joysticks have the added benefit of force feedback, or haptics, which gives the user a sense of interacting with “the real”.

 

Haptics is the science of using computers coupled with sensors and motors to give humans the sense that they can touch and feel something.

 

Haptics and force feedback promise to change how we interact with computers. Actuators and motors add mechanical force and vibration to mice and joysticks, making the user’s experience seem more real. Haptics is now quite common in gaming: in a flight simulator, a throttle can be made to feel like a real aircraft control, so that pulling back on it produces the counter-push a real airplane’s throttle would give. Or consider the common and mundane machine-gun fire that shakes the controllers of first-person shooter games.
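The counter-push described above is commonly modeled as a virtual spring: the farther the user displaces the control from center, the harder the motor pushes back. The stiffness constant and `counter_force` function here are illustrative assumptions, not values from any particular device.

```python
# Minimal sketch of the spring model behind a force-feedback throttle:
# the motor commands a restoring force proportional to displacement,
# mimicking a real control's resistance. STIFFNESS is a made-up value.

STIFFNESS = 4.0  # motor force units per unit of displacement (hypothetical)

def counter_force(displacement: float) -> float:
    """Hooke's-law-style restoring force commanded to the haptic motor."""
    return -STIFFNESS * displacement

# Pulling back (positive displacement) yields a push forward (negative force):
forces = [counter_force(d) for d in (0.0, 0.25, 0.5)]
```

A real haptic loop would recompute this force hundreds of times per second, layering effects like vibration (the machine-gun rumble) on top of the spring term.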

While this creates the illusion of reality by coupling what is on screen with what is happening to the joystick, the communication is still limited and often unimaginative. Will the next generation of mouse or joystick be able to sense heart rate, brainwaves, and skin moisture as indications of human reaction, and might this affect the direction the computer software decides to move? Could it affect the themes that software and game inventors decide to pursue? There are many rich possibilities here for multi-directional human/machine interaction that moves beyond the standard first-person shooter games, which have become all too formulaic.