Woolford: Reckless Eyes
"...as I see, it is necessary that the vision be doubled with
a complementary vision or with another vision: myself seen from
without, such as another would see me, installed in the midst of..."
Maurice Merleau-Ponty, "The Visible and the Invisible"
The "Reckless Eyes" video project
used a combination of mobile wireless networks, handheld computers,
video streaming, and 2.4 GHz video to allow people to see themselves
through others' eyes and the eyes of their environment. The project
focused on how people gaze and are gazed upon in an urban environment.
Unfortunately, the technologies used in the project proved very
unstable. Development took far longer than expected and ate a great
deal of money and energy. Several weeks of tests culminated in a
working experiment in early April, but during the final presentation
on April 7th, two cameras died and the entire network collapsed.
When the system worked, it generated interesting and unexpected results.
The wireless cameras picked up visually interesting interference
from the environment, and the buffering of the streams created
lags, allowing participants in the same location to see each
other's gaze offset in time.
"Reckless eyeballing" is a concept familiar to most Americans. It is
most infamous for its use by pre-civil-rights courts to punish
and imprison black men for looking at white women. However, it is
still used for any form of aggressive or inappropriate gaze, especially
when a person without power dares to look a person with power directly
in the eye. American prisoners are frequently thrown into solitary
confinement for staring their guards in the eye, and prostitutes
are in danger any time they look a pimp in the eye.
The concept of
gaze has taken a radical shift over the past 50 years. With the
advent of the CCD, surveillance cameras have spread numerous gazes
across our cityscapes. The gaze of technology is different than
the gaze of biology. The technical gaze can hold, record, and re-present
the images before it. Most importantly, the technical gaze can be
shared - either through re-presentation of the recorded images,
or through live transmission.
Steve Mann, the grandfather of wearable computers, wore a wireless
camera and receiver for almost every waking minute of his life (he
took them off only to swim, shower, and sleep). Both the camera and
display were connected to the Internet so visitors to Steve's www
site could see what he was gazing upon, and if a visitor sent him
an email, it would pop up in the display before his eyes. Steve
tells jokes about visitors to his website telling him to say hello
to people they recognized or chastising him for "ogling cleavage".
Steve's experiment with a shared technical gaze explored the concept of sharing a live
gaze with others - of allowing others to view your world through
your eyes. "Reckless Eyes" was the first step in an attempt
to show people themselves as viewed by others and as viewed by the
environment itself. It is highly influenced by Maurice Merleau-Ponty's
writings in which he frequently refers to the seeing and the seen.
Merleau-Ponty fights the Cartesian model of vision, where individuals
see the world from an external vantage point or "God's eye"
view, and stresses that we as human beings are not disembodied eyes
looking down upon the world. We are embodied. Therefore, in order
to see, we must be in our bodies, in the world. If we are embodied
in a shared world, we can also be seen by those we see. Merleau-Ponty
identifies the fundamental "reversibility" in vision:
the observer is both subject and object, the seeing and the seen.
Merleau-Ponty's views on, erm, viewing put Steve Mann's experiment in a different
light. Steve attempts to give others a god's eye view of the world.
Or rather, he attempts to make himself into a god by showing others
the world through his eyes. Steve is a benevolent god, though. He
allows his followers to speak to his eyes. Other would-be gods do
not want to share their vision. In his writings on the Panopticon,
Jeremy Bentham attempts to break this reversible vision through
architectural strategies. He describes the first methods of mediated
sight where the body of an observer is obscured behind a technology
through which he can look out. However, when the observed look back,
they see only the mediating technology, not the observer. Bentham
tries to break the bi-directional gaze.
Technologies for the mediation of gaze have made radical leaps since Bentham's time.
The advent of the CCD and the surveillance camera has spread numerous
gazes across our cityscapes. A CCD's gaze can place the observer
further from the observed than Bentham ever dreamed. However, the
technical mediator (or camera) must remain in the proximity of the
observed. Therefore, if the observer wishes to move away from their
gaze mediator, they need some form of communication. In other words,
a distant observer can view through a remote camera, but the signal
from the camera needs to be transmitted to the observer. Traditionally,
this is done with a direct wire between the camera and a monitor.
However in recent years, it has become easier and more cost effective
to connect the camera to the recorder via wireless radio transmitters.
These wireless transmissions open a new concept in reckless eyeballing. Anybody with a proper
receiver can pick up the transmission between the camera and the
recorder. However, according to current Dutch Law, this is illegal,
even if the camera is pointed into public space and the signal is
intercepted in the same space. In other words, it is illegal for
a person to look at how they are looked at. It is illegal for them
to gaze into another person's technically enhanced eyes.
"Reckless Eyes" allows people to view through each other's eyes. It specifically
seeks to show people themselves as viewed by others. It looks at
manners in which concepts of sight and gaze are altered when they
are technically enhanced, when sight can be passed beyond the individual.
Phase 1: Cross-eyed
The full "Reckless Eyes" project was too ambitious to be built within the schedule
and budgetary limitations of the Playing Fields project. We decided
to use Playing Fields to focus on a single aspect of the overall
project. The first phase of the project was called "Cross-eyed"
and dealt with viewing oneself through another's eyes, or, more
specifically, viewing yourself through a camera strapped to another
person's head. Phase 1 was intended
to be a technical phase working out what we mistakenly thought would
be a simple process of transmitting images from both people's cameras
to a central location, digitizing the signals, and sending them
to a streaming server. In addition to the head-mounted camera, each
participant was given a handheld computer or PDA. The PDAs used
a wireless network (802.11b WiFi) to connect to the streaming server
and watch the video stream. This seemed like a simple process, but
it took nearly 4 months to get our first mobile stream working.
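The intended relay can be sketched in outline. The following is a purely hypothetical illustration (the class and function names are mine, not part of the software actually built), assuming the chain described above: analog 2.4 GHz video received centrally, digitized, pushed to a streaming server, and watched by PDA clients over WiFi.

```python
# Hypothetical sketch of the Cross-eyed relay chain; all names are
# illustrative, not the software actually used in the project.

class Digitizer:
    """Stands in for the capture hardware that digitized the analog
    2.4 GHz video signal picked up by the receivers."""
    def encode(self, frame):
        return f"encoded({frame})"

class StreamingServer:
    """Stands in for the streaming server that the PDA clients
    connected to over the 802.11b WiFi network."""
    def __init__(self):
        self.stream = []
    def publish(self, packet):
        self.stream.append(packet)

def relay(analog_frames, digitizer, server):
    # Receive each analog frame, digitize it, and push it to the server.
    for frame in analog_frames:
        server.publish(digitizer.encode(frame))

server = StreamingServer()
relay(["frame_1", "frame_2"], Digitizer(), server)
print(server.stream)  # → ['encoded(frame_1)', 'encoded(frame_2)']
```

In practice every hop in this chain (camera, transmitter, receiver, digitizer, server, WiFi, client) turned out to be a separate point of failure.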
Some of the problems we had to address were:
- Camera transmissions
and ranges: We chose to work in the 2.4 GHz band because it
is the most open band for video transmission and we knew of several
cameras transmitting at 2.4 GHz in the area around the Waag. However,
a licence is needed to use a 2.4 GHz transmitter more powerful than
10 mW. A 10 mW transmitter has a maximum range of approximately 100 m. We
spoke with licenced radio operators who said we could obtain 1 W transmitters
with a 1 km range, but these are powerful microwave transmitters and
are dangerous to place near the human body. They also require large
power sources and were outside our budget. We ended up using two
"CCD Finger Camera"s from Conrad GmbH, and two 10 mW "Airlink
Wireless" 2.4 GHz transmitters from Maplin UK. The 2.4 GHz
video receivers were TranW "GigaAir TTA-10R"s, also from Maplin.
- Choice of PDA
media player and streaming server: The choice of PDA, client, and
server are all tightly interconnected. It took several months of
research to decide which system to use.
o We wanted to
use open-source systems as much as possible so we originally explored
Apple's Darwin streaming server with Sharp's Linux based PDA called
the Zaurus SL5500. At the time, there were several MP4 players available
for the Zaurus, but none of them could connect to an RTSP stream.
After searching the net and testing several players, we decided
to write our own Java-based player for the Zaurus. Unfortunately,
the Sharp Zaurus, and most other PDAs use an obscure version of
Java called "Personal Java". Sun has already phased out
this specification and made it an extension of their Java2 Micro
Edition (J2ME) standard. The Java Virtual Machine (the program which
runs Java on the PDA) is a version called Jeode. Interestingly,
Jeode is the same VM used for most of the PocketPC PDAs, so a Java-based
player that ran on the Zaurus should run on the
PocketPC as well. We were able to write a Java player to connect
to a stream using the "Java Media Framework", but the
player would not run on the Zaurus. We finally discovered some stripped-down
streaming video classes intended for PDAs and mobile phones called
"J2ME Mobile Media", but never tested them on the Zaurus.
o The most common
PDAs on the market run Microsoft's PocketPC 2002 (formerly called
WindowsCE) or PalmOS. Of the PalmOS devices, Sony's $800 Clié
was the only one for which we could find a WiFi adapter. HP, Toshiba,
and Siemens all released PDAs with built-in WiFi while we were working
on the project, but they all cost around €750.00 each - which
put them out of our budget. There were numerous PocketPC PDAs on
the market for around €250. Unfortunately, none of the inexpensive
PDAs from Compaq, HP, Toshiba, or Viewsonic had a compact flash
slot or WiFi network adapter. However, we learned that Compaq sells
'expansion sleeves' for its iPaqs. These sleeves add extra batteries,
Bluetooth, CF, and/or PCMCIA card expansion.
o We planned to
purchase two Dell Axims, but managed to get the Brighton, UK-based
games testing and localization company "Babel Media"
to loan us an old Compaq iPaq 3760 and an iPaq 3850. We found two
CF expansion sleeves and CF WiFi cards for these in London and finally
had working mobile media players.
o The iPaqs run
Microsoft PocketPC 2000 so they have a customized version of Windows
Media Player bundled into the OS. Montevideo didn't have a Windows
2000 or XP Pro server and none of us wanted to work with Microsoft
software, so we decided to look for an alternative to Windows Media
Player. RealNetworks had recently changed the name of their streaming client from "Real One"
to "Helix". Under the Helix name, they made portions of
the software open source. There was no version of the new Helix
Player for PocketPC. But the "Mobile Real One" player
works with most PocketPC based PDAs. Mobile Real One Player played
files created with the new Helix Producer, but Helix Producer Basic
(the free version) could not create a stream which could be played
on the PDAs. We didn't want to buy an expensive commercial licence
for the full version of Helix Producer, so after a fair deal of
research and experimentation, we learned the Mobile Real One Player
only played Real Media 8 streams. We dug out a two-year-old copy
of Real Producer 8.5 Basic, and were finally able to stream to the
PDAs.
- WiFi network ranges: One of the central themes of the project
is bringing the gaze into an urban space. This meant we needed a
network which extended into an urban environment. The Society for
Old and New Media (Waag Society) is the Internet-2 backbone for
Amsterdam. Montevideo's streaming servers are housed at the Waag.
The Waag also runs one of the only open WiFi networks in Amsterdam.
It seemed a natural decision to use the Waag's network for the project.
However, the Waag is a less than ideal building for hosting a
network. Because it is a historical landmark, the WiFi antenna
is not allowed to reach above the building's roof. The roof itself
is made of nickel, causing the WiFi signal to bounce erratically.
The network coverage around the building has mysterious strong and
weak pockets because of the shape of the building and the roof.
In the end, we had to purchase our own wireless access point (D-Link
DWL900-AP+ WAP) and place it in one of the windows of the Waag.
We tried to use the D-Link WAP as a repeater or bridge to extend
the range of the Waag's network. Eventually, we realized that the
D-Link WAP could not act as repeater for the Waag's network, so
we stopped trying to use the Waag network and used our own.
- CF WiFi Cards:
We originally used 2 Buffalo 128b CF cards with the iPaqs. These
cards worked, but had very limited range. After we purchased our
own WAP, we realized one of the Buffalo cards had been damaged during
our experiments. We needed to replace the Buffalo card quickly but
the only CF WiFi card we could find was a Netgear MA701. The Netgear
card turned out to be a much better card, with at least 25% more
range and far more helpful drivers and software, including a ping utility.
- WiFi interference:
We used 2.4 GHz transmitters and receivers because we wanted stray
external video signals to interfere with our transmitters. Unfortunately,
WiFi networks also operate in the 2.4 GHz range and created a fair
deal of interference visible as white horizontal lines scrolling
through the video feeds. We placed the video receivers and the wireless
access point as far away from each other as possible to minimize
the interference, but we could not eliminate it altogether.
- Delicate Hardware:
From the very beginning of the project, we had problems with hardware
breakage. We lost one transmitter and one of the WiFi cards during
early development. On the day of the final presentation, one more
camera died. As the audience entered the presentation room, the
backup wireless camera and transmitter died as well. We assume the
first transmitter was damaged because it was not properly encased
and moved too violently when it was carried. We solved this by wrapping
the transmitters in layers of foam rubber. The CF WiFi card probably
broke through similar rough handling. We do not know why the two
cameras died, but it may have had to do with the fact that we did
not have any form of power filtering between the cameras and their
power supplies.
- Web browser
windows: Once we had two live streams running, we tried to watch
the two streams side-by-side. This is not a problem with Quicktime,
but we noticed the RealNetworks Helix Player would not let us display
2 windows simultaneously. We suspect we can find a way around this
limitation, but ran out of time looking for a solution.
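As an aside, the transmitter ranges quoted above (10 mW for roughly 100 m, 1 W for roughly 1 km) are consistent with an idealized free-space model, in which received power falls off with the square of distance and usable range therefore scales with the square root of transmit power. A quick sanity check under that assumption (real urban ranges, with walls and interference, are shorter):

```python
import math

def scaled_range(base_range_m, base_power_mw, new_power_mw):
    # Idealized free-space model: received power falls off with the
    # square of distance, so range scales with sqrt(transmit power).
    return base_range_m * math.sqrt(new_power_mw / base_power_mw)

# Scale a 10 mW transmitter with ~100 m range up to 1 W (1000 mW):
print(scaled_range(100, 10, 1000))  # → 1000.0 m, i.e. the quoted 1 km
```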
While it can be
debated whether the project was successful or not, it did yield
a single fully-functional test. The aesthetic of this
stream was far different than I expected. In my original concept,
I forgot about the buffering that occurs in video streams. Once
we had live streams running we realized the streams were offset
in time. We also noticed the signal strength and CPU speed of the
PDA affected the amount of buffering of the stream and the amount
of time-lag between transmission and reception. We knew from the
beginning that the signal from the wireless cameras would fluctuate.
In fact, we used 2.4 GHz specifically in the hopes of finding interesting
interference. However, the interference in the final streams was again
far different than expected. In addition to the image jumping and
rolling, the streams would freeze, suddenly give bursts of several
seconds of smooth, sharp images, then break up
again. Instead of the look and feel of a television broadcast, the
stream had a rough, broken, feel to it. I found this aesthetically
interesting, but the project ended just as it became interesting.
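The time-offset effect reduces to simple arithmetic: each client delays playback by the length of its buffer, so two PDAs watching live streams show two different moments of "now". A minimal sketch, with purely illustrative buffer lengths (signal strength and PDA CPU speed both changed how much a client actually buffered):

```python
def playback_instant(wall_clock_s, buffer_s):
    # A streaming client delays playback by its buffer length, so at a
    # given wall-clock time it shows video captured buffer_s earlier.
    return wall_clock_s - buffer_s

# Hypothetical buffer lengths for two clients at the same moment:
fast_pda = playback_instant(60.0, 4.0)   # strong signal, faster CPU
slow_pda = playback_instant(60.0, 11.0)  # weak signal, slower CPU
print(fast_pda - slow_pda)  # → 7.0 seconds between the two gazes
```

Two participants standing side by side would therefore see each other's gaze from several seconds in the past, each by a different amount.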
Unfortunately the project suffered immensely from lack of proper
funding. All of the wireless problems could have been solved by
purchasing a couple of PDAs with built-in WiFi support and purchasing
2 D-Link WAPs. We lost a great deal of time testing different WiFi
cards and trying to find solutions which could work together. We
could have developed and tested the system with our own network,
then taken the network to the Waag. The issue of video transmission
is a bit more difficult. With more money, we could have purchased
more powerful transmitters and more rugged cameras. Ideally, we
would use 900 MHz video transmission to avoid the interference from
the WiFi network. Another alternative would be to use portable computers
instead of PDAs and digitize the video without any wireless transmission.
However, both of these solutions would take away the possibility
of wireless cameras in the area encroaching into our signal. Cleaning
up the signal would sterilize the content.
Interview with Kirk Woolford
Kirk Woolford is an independent designer, photographer, and programmer.
He has worked both as technical and creative director for numerous
online entertainment and education systems. In his own work, Kirk
explores the use of technology to abstract the human body, its
movements, and its senses. He was a research fellow at the Academy of
Media Arts, Cologne from 1993-1995 and has won awards from ISEA, Ars
Electronica, the Arts Council of England, the Ministry of Science and
Education of North Rhine-Westphalia, and others.