Text 7905, 469 lines
Written 2005-10-26 14:29:34 by Glenn Meadows (1:379/45)
Subject: Lifehackers Article
============================
From: "Glenn Meadows" <gmeadow@comcast.net>
This is a fascinating article that definitely hit home for me as I struggle to
prioritize and multitask in our interrupt-driven, distraction-filled
lifestyles. Among the more interesting findings: most people spend only 11
minutes on any given project before being interrupted, and once the
interruption has ceased it takes about 25 minutes to return to that task. I
also found the section on proficiency with larger or multiple computer
monitors interesting. I highly recommend taking a few minutes to give this a
read - it's worth it.
-------------------------
http://www.nytimes.com/2005/10/16/magazine/16guru.html
October 16, 2005
Meet the Life Hackers
By CLIVE THOMPSON
In 2000, Gloria Mark was hired as a professor at the University of California
at Irvine. Until then, she had been working as a researcher, living a life of
comparative peace. She would spend her days in her lab, enjoying the sense of
serene focus that comes from immersing yourself for hours at a time in a single
project. But when her faculty job began, that all ended. Mark would arrive at
her desk in the morning, full of energy and ready to tackle her to-do list -
only to suffer an endless stream of interruptions. No sooner had she started
one task than a colleague would e-mail her with an urgent request; when she
went to work on that, the phone would ring. At the end of the day, she had been
so constantly distracted that she would have accomplished only a fraction of
what she set out to do. "Madness," she thought. "I'm trying to do 30 things at
once."
Lots of people complain that office multitasking drives them nuts. But Mark is
a scientist of "human-computer interactions" who studies how high-tech devices
affect our behavior, so she was able to do more than complain: she set out to
measure precisely how nuts we've all become. Beginning in 2004, she persuaded
two West Coast high-tech firms to let her study their cubicle dwellers as they
surfed the chaos of modern office life. One of her grad students, Victor
Gonzalez, sat looking over the shoulders of various employees all day long, for
a total of more than 1,000 hours. He noted how many times the employees were
interrupted and how long each employee was able to work on any individual task.
When Mark crunched the data, a picture of 21st-century office work emerged that
was, she says, "far worse than I could ever have imagined." Each employee spent
only 11 minutes on any given project before being interrupted and whisked off
to do something else. What's more, each 11-minute project was itself fragmented
into even shorter three-minute tasks, like answering e-mail messages, reading a
Web page or working on a spreadsheet. And each time a worker was distracted
from a task, it would take, on average, 25 minutes to return to that task. To
perform an office job today, it seems, your attention must skip like a stone
across water all day long, touching down only periodically.
Yet while interruptions are annoying, Mark's study also revealed their flip
side: they are often crucial to office work. Sure, the high-tech workers
grumbled and moaned about disruptions, and they all claimed that they preferred
to work in long, luxurious stretches. But they grudgingly admitted that many of
their daily distractions were essential to their jobs. When someone forwards
you an urgent e-mail message, it's often something you really do need to see;
if a cellphone call breaks through while you're desperately trying to solve a
problem, it might be the call that saves your hide. In the language of computer
sociology, our jobs today are "interrupt driven." Distractions are not just a
plague on our work - sometimes they are our work. To be cut off from other
workers is to be cut off from everything.
For a small cadre of computer engineers and academics, this realization has
begun to raise an enticing possibility: perhaps we can find an ideal middle
ground. If high-tech work distractions are inevitable, then maybe we can
re-engineer them so we receive all of their benefits but few of their
downsides. Is there such a thing as a perfect interruption?
Mary Czerwinski first confronted this question while working, oddly enough, in
outer space. She is one of the world's leading experts in interruption science,
and she was hired in 1989 by Lockheed to help NASA design the information
systems for the International Space Station. NASA had a problem: how do you
deliver an interruption to a busy astronaut? On the space station, astronauts
must attend to dozens of experiments while also monitoring the station's
warning systems for potentially fatal mechanical errors. NASA wanted to ensure
that its warnings were perfectly tuned to the human attention span: if a
warning was too distracting, it could throw off the astronauts and cause them
to mess up million-dollar experiments. But if the warnings were too subtle and
unobtrusive, they might go unnoticed, which would be even worse. The NASA
engineers needed something that would split the difference.
Czerwinski noticed that all the information the astronauts received came to
them as plain text and numbers. She began experimenting with different types of
interruptions and found that it was the style of delivery that was crucial. Hit
an astronaut with a textual interruption, and he was likely to ignore it,
because it would simply fade into the text-filled screens he was already
staring at. Blast a horn and he would definitely notice it - but at the cost of
jangling his nerves. Czerwinski proposed a third way: a visual graphic, like a
pentagram whose sides changed color based on the type of problem at hand, a
solution different enough from the screens of text to break through the
clutter.
The science of interruptions began more than 100 years ago, with the emergence
of telegraph operators - the first high-stress, time-sensitive
information-technology jobs. Psychologists discovered that if someone spoke to
a telegraph operator while he was keying a message, the operator was more
likely to make errors; his cognition was scrambled by mentally "switching
channels." Later, psychologists determined that whenever workers needed to
focus on a job that required the monitoring of data, presentation was
all-important. Using this knowledge, cockpits for fighter pilots were
meticulously planned so that each dial and meter could be read at a glance.
Still, such issues seemed remote from the lives of everyday workers - even
information workers - simply because everyday work did not require parsing
screenfuls of information. In the 90's, this began to change, and change
quickly. As they became ubiquitous in the workplace, computers, which had until
then been little more than glorified word-processors and calculators, began to
experience a rapid increase in speed and power. "Multitasking" was born;
instead of simply working on one program for hours at a time, a computer user
could work on several different ones simultaneously. Corporations seized on
this as a way to squeeze more productivity out of each worker, and technology
companies like Microsoft obliged them by transforming the computer into a hub
for every conceivable office task, and laying on the available information with
a trowel. The Internet accelerated this trend even further, since it turned the
computer from a sealed box into our primary tool for communication. As a
result, office denizens now stare at computer screens of mind-boggling
complexity, as they juggle messages, text documents, PowerPoint presentations,
spreadsheets and Web browsers all at once. In the modern office we are all
fighter pilots.
Information is no longer a scarce resource - attention is. David Rose, a
Cambridge, Mass.-based expert on computer interfaces, likes to point out that
20 years ago, an office worker had only two types of communication technology:
a phone, which required an instant answer, and postal mail, which took days.
"Now we have dozens of possibilities between those poles," Rose says. How fast
are you supposed to reply to an e-mail message? Or an instant message?
Computer-based interruptions fall into a sort of Heisenbergian uncertainty
trap: it is difficult to know whether an e-mail message is worth interrupting
your work for unless you open and read it - at which point you have, of course,
interrupted yourself. Our software tools were essentially designed to compete
with one another for our attention, like needy toddlers.
The upshot is something that Linda Stone, a software executive who has worked
for both Apple and Microsoft, calls "continuous partial attention": we are so
busy keeping tabs on everything that we never focus on anything. This can
actually be a positive feeling, inasmuch as the constant pinging makes us feel
needed and desired. The reason many interruptions seem impossible to ignore is
that they are about relationships - someone, or something, is calling out to
us. It is why we have such complex emotions about the chaos of the modern
office, feeling alternately drained by its demands and exhilarated when we
successfully surf the flood.
"It makes us feel alive," Stone says. "It's what makes us feel important. We
just want to connect, connect, connect. But what happens when you take that to
the extreme? You get overconnected." Sanity lies on the path down the center -
if only there was some way to find it.
It is this middle path that Czerwinski and her generation of computer
scientists are now trying to divine. When I first met her in the corridors of
Microsoft, she struck me as a strange person to be studying the art of
focusing, because she seemed almost attention-deficit disordered herself: a
44-year-old with a pageboy haircut and the electric body language of a
teenager. "I'm such a spaz," she said, as we went bounding down the hallways to
the cafeteria for a "bio-break." When she ushered me into her office, it was a
perfect Exhibit A of the go-go computer-driven life: she had not one but three
enormous computer screens, festooned with perhaps 30 open windows - a bunch of
e-mail messages, several instant messages and dozens of Web pages. Czerwinski
says she regards 20 solid minutes of uninterrupted work as a major triumph;
often she'll stay in her office for hours after work, crunching data, since
that's the only time her outside distractions wane.
In 1997, Microsoft recruited Czerwinski to join Microsoft Research Labs, a
special division of the firm where she and other eggheads would be allowed to
conduct basic research into how computers affect human behavior. Czerwinski
discovered that the computer industry was still strangely ignorant of how
people really used their computers. Microsoft had sold tens of millions of
copies of its software but had never closely studied its users' rhythms of work
and interruption. How long did they linger on a single document? What
interrupted them while they were working, and why?
To figure this out, she took a handful of volunteers and installed software on
their computers that would virtually shadow them all day long, recording every
mouse click. She discovered that computer users were as restless as
hummingbirds. On average, they juggled eight different windows at the same time
- a few e-mail messages, maybe a Web page or two and a PowerPoint document.
More astonishing, they would spend barely 20 seconds looking at one window
before flipping to another.
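As an aside for the technically curious: the logging software Czerwinski used
is not published, but the dwell-time arithmetic behind the 20-second finding is
simple. Here is a minimal sketch in Python, assuming a hypothetical log of
(timestamp, window title) pairs recorded at each focus change; the names and
sample data are invented for illustration, not taken from her study.

    from datetime import datetime

    # Hypothetical focus log: one entry each time the active window changes.
    focus_log = [
        ("2005-10-16 09:00:00", "Inbox - Outlook"),
        ("2005-10-16 09:00:18", "quarterly.xls - Excel"),
        ("2005-10-16 09:00:41", "Inbox - Outlook"),
        ("2005-10-16 09:01:02", "nytimes.com - Browser"),
    ]

    def dwell_times(log):
        """Return (window, seconds) pairs: how long each window held focus."""
        times = [datetime.strptime(t, "%Y-%m-%d %H:%M:%S") for t, _ in log]
        return [(log[i][1], (times[i + 1] - times[i]).total_seconds())
                for i in range(len(log) - 1)]

    spans = dwell_times(focus_log)
    print(sum(s for _, s in spans) / len(spans))  # roughly 20 seconds a window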
Why the constant shifting? In part it was because of the basic way that today's
computers are laid out. A computer screen offers very little visual real
estate. It is like working at a desk so small that you can look at only a
single sheet of paper at a time. A Microsoft Word document can cover almost an
entire screen. Once you begin multitasking, a computer desktop very quickly
becomes buried in detritus.
This is part of the reason that, when someone is interrupted, it takes 25
minutes to cycle back to the original task. Once their work becomes buried
beneath a screenful of interruptions, office workers appear to literally forget
what task they were originally pursuing. We do not like to think we are this
flighty: we might expect that if we are, say, busily filling out some forms and
are suddenly distracted by a phone call, we would quickly return to finish the
job. But we don't. Researchers find that 40 percent of the time, workers wander
off in a new direction when an interruption ends, distracted by the
technological equivalent of shiny objects. The central danger of interruptions,
Czerwinski realized, is not really the interruption at all. It is the havoc
they wreak with our short-term memory: What the heck was I just doing?
When Gloria Mark and Mary Czerwinski, working separately, looked at the desks
of the people they were studying, they each noticed the same thing: Post-it
notes. Workers would scrawl hieroglyphic reminders of the tasks they were
supposed to be working on ("Test PB patch DAN's PC - Waiting for AL," was one
that Mark found). Then they would place them directly in their fields of
vision, often in a halo around the edge of their computer screens. The Post-it
notes were, in essence, a jury-rigged memory device, intended to rescue users
from those moments of mental wandering.
For Mark and Czerwinski, these piecemeal efforts at coping pointed to ways that
our high-tech tools could be engineered to be less distracting. When Czerwinski
walked around the Microsoft campus, she noticed that many people had attached
two or three monitors to their computers. They placed their applications on
different screens - the e-mail far off on the right side, a Web browser on the
left and their main work project right in the middle - so that each application
was "glanceable." When the ding on their e-mail program went off, they could
quickly peek over at their in-boxes to see what had arrived.
The workers swore that this arrangement made them feel calmer. But did more
screen area actually help with cognition? To find out, Czerwinski's team
conducted another experiment. The researchers took 15 volunteers, sat each one
in front of a regular-size 15-inch monitor and had them complete a variety of
tasks designed to challenge their powers of concentration - like a Web search,
some cutting and pasting and memorizing a seven-digit phone number. Then the
volunteers repeated these same tasks, this time using a computer with a massive
42-inch screen, as big as a plasma TV.
The results? On the bigger screen, people completed the tasks at least 10
percent more quickly - and some as much as 44 percent more quickly. They were
also more likely to remember the seven-digit number, which showed that the
multitasking was clearly less taxing on their brains. Some of the volunteers
were so enthralled with the huge screen that they begged to take it home. In
two decades of research, Czerwinski had never seen a single tweak to a computer
system so significantly improve a user's productivity. The clearer your screen,
she found, the calmer your mind. So her group began devising tools that
maximized screen space by grouping documents and programs together - making it
possible to easily spy them out of the corner of your eye, ensuring that you
would never forget them in the fog of your interruptions. Another experiment
created a tiny round window that floats on one side of the screen; moving dots
represent information you need to monitor, like the size of your in-box or an
approaching meeting. It looks precisely like the radar screen in a military
cockpit.
In late 2003, the technology writer Danny O'Brien decided he was fed up with
not getting enough done at work. So he sat down and made a list of 70 of the
most "sickeningly overprolific" people he knew, most of whom were software
engineers of one kind or another. O'Brien wrote a questionnaire asking them to
explain how, precisely, they managed such awesome output. Over the next few
weeks they e-mailed their replies, and one night O'Brien sat down at his
dining-room table to look for clues. He was hoping that the self-described
geeks all shared some common tricks.
He was correct. But their suggestions were surprisingly low-tech. None of them
used complex technology to manage their to-do lists: no Palm Pilots, no
day-planner software. Instead, they all preferred to find one extremely simple
application and shove their entire lives into it. Some of O'Brien's
correspondents said they opened up a single document in a word-processing
program and used it as an extra brain, dumping in everything they needed to
remember - addresses, to-do lists, birthdays - and then just searched through
that file when they needed a piece of information. Others used e-mail - mailing
themselves a reminder of every task, reasoning that their in-boxes were the one
thing they were certain to look at all day long.
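The "extra brain" trick needs almost no machinery, which is the point. A
sketch of the whole pattern in Python - the file name and notes here are made
up, and any text editor's search box does the same job:

    BRAIN = "everything.txt"  # one flat file for the entire to-do universe

    def remember(note):
        """Dump anything worth keeping; no folders, no categories."""
        with open(BRAIN, "a", encoding="utf-8") as f:
            f.write(note + "\n")

    def recall(keyword):
        """Find it again later with a plain substring search."""
        with open(BRAIN, encoding="utf-8") as f:
            return [line.rstrip() for line in f
                    if keyword.lower() in line.lower()]

    remember("Dan's birthday: March 3")
    remember("todo: test PB patch on Dan's PC")
    print(recall("dan"))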
In essence, the geeks were approaching their frazzled high-tech lives as
engineering problems - and they were not waiting for solutions to emerge from
on high, from Microsoft or computer firms. Instead they ginned up a multitude
of small-bore fixes to reduce the complexities of life, one at a time, in a
rather Martha Stewart-esque fashion.
Many of O'Brien's correspondents, it turned out, were also devotees of "Getting
Things Done," a system developed by David Allen, a personal-productivity guru
who consults with Fortune 500 corporations and whose seminars fill Silicon
Valley auditoriums with anxious worker bees. At the core of Allen's system is
the very concept of memory that Mark and Czerwinski hit upon: unless the task
you're doing is visible right in front of you, you will half-forget about it
when you get distracted, and it will nag at you from your subconscious. Thus,
as soon as you are interrupted, Allen says, you need either to quickly deal
with the interruption or - if it's going to take longer than two minutes - to
faithfully add the new task to your constantly updated to-do list. Once the
interruption is over, you immediately check your to-do list and go back to
whatever is at the top.
"David Allen essentially offers a program that you can run like software in
your head and follow automatically," O'Brien explains. "If this happens, then
do this. You behave like a robot, which of course really appeals to geeks."
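Taken literally, the rule really is a few lines of code. A sketch in Python of
the dispatch loop O'Brien is describing - the two-minute threshold is Allen's,
while the task names are invented:

    from collections import deque

    todo = deque(["write quarterly report"])  # top item = current focus

    def on_interruption(task, est_minutes):
        """Allen's rule: quick tasks now, everything else onto the list."""
        if est_minutes <= 2:
            print("doing now:", task)
        else:
            todo.append(task)            # deferred, not forgotten
        print("back to:", todo[0])       # resume whatever is on top

    on_interruption("reply to one-line e-mail", 1)
    on_interruption("review 30-page spec", 45)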
O'Brien summed up his research in a speech called "Life Hacks," which he
delivered in February 2004 at the O'Reilly Emerging Technology Conference. Five
hundred conference-goers tried to cram into his session, desperate for tips on
managing info chaos. When O'Brien repeated the talk the next year, it was
mobbed again. By the summer of 2005, the "life hacks" meme had turned into a
full-fledged grass-roots movement. Dozens of "life hacking" Web sites now
exist, where followers of the movement trade suggestions on how to reduce
chaos. The ideas are often quite clever: O'Brien wrote for himself a program
that, whenever he's surfing the Web, pops up a message every 10 minutes
demanding to know whether he's procrastinating. It turns out that a certain
amount of life-hacking is simply cultivating a monklike ability to say no.
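The article doesn't reproduce O'Brien's pop-up program, but a bare-bones
terminal analogue on the same ten-minute cadence might look like this in
Python:

    import time

    INTERVAL = 10 * 60  # ten minutes, per O'Brien's description

    while True:
        time.sleep(INTERVAL)
        answer = input("Are you procrastinating right now? [y/n] ")
        if answer.strip().lower() == "y":
            print("Close the browser. Back to the top of the list.")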
"In fairness, I think we bring some of this on ourselves," says Merlin Mann,
the founder of the popular life-hacking site 43folders.com. "We'd rather die
than be bored for a few minutes, so we just surround ourselves with
distractions. We've got 20,000 digital photos instead of 10 we treasure. We
have more TV Tivo'd than we'll ever see." In the last year, Mann has embarked
on a 12-step-like triage: he canceled his Netflix account, trimmed his
instant-messaging "buddy list" so only close friends can contact him and set
his e-mail program to bother him only once an hour. ("Unless you're working in
a Korean missile silo, you don't need to check e-mail every two minutes," he
argues.)
Mann's most famous hack emerged when he decided to ditch his Palm Pilot and
embrace a much simpler organizing style. He bought a deck of 3-by-5-inch index
cards, clipped them together with a binder clip and dubbed it "The Hipster
P.D.A." - an ultra-low-fi organizer, running on the oldest memory technology
around: paper.
In the 1920's, the Russian scientist Bluma Zeigarnik performed an experiment
that illustrated an intriguing aspect of interruptions. She had several test
subjects work on jigsaw puzzles, then interrupted them at various points. She
found that the ones least likely to complete the task were those who had been
disrupted at the beginning. Because they hadn't had time to become mentally
invested in the task, they had trouble recovering from the distraction. In
contrast, those who were interrupted toward the end of the task were more
likely to stay on track.
Gloria Mark compares this to the way that people work when they are
"co-located" - sitting next to each other in cubicles - versus how they work
when they are "distributed," each working from different locations and
interacting online. She discovered that people in open-cubicle offices suffer
more interruptions than those who work remotely. But they have better
interruptions, because their co-workers have a social sense of what they are
doing. When you work next to other people, they can sense whether you're deeply
immersed, panicking or relatively free and ready to talk - and they interrupt
you accordingly.
So why don't computers work this way? Instead of pinging us with e-mail and
instant messages the second they arrive, our machines could store them up - to
be delivered only at an optimum moment, when our brains are mostly relaxed.
One afternoon I drove across the Microsoft campus to visit a man who is trying
to achieve precisely that: a computer that can read your mind. His name is Eric
Horvitz, and he is one of Czerwinski's closest colleagues in the lab. For the
last eight years, he has been building networks equipped with artificial
intelligence (A.I.) that carefully observe a computer user's behavior and then
try to predict that sweet spot - the moment when the user will be mentally
free and ready to be interrupted.
Horvitz booted the system up to show me how it works. He pointed to a series of
bubbles on his screen, each representing one way the machine observes Horvitz's
behavior. For example, it measures how long he's been typing or reading e-mail
messages; it notices how long he spends in one program before shifting to
another. Even more creepily, Horvitz told me, the A.I. program will - a little
like HAL from "2001: A Space Odyssey" - eavesdrop on him with a microphone and
spy on him using a Webcam, to try and determine how busy he is, and whether he
has company in his office. Sure enough, at one point I peeked into the corner
of Horvitz's computer screen and there was a little red indicator glowing.
"It's listening to us," Horvitz said with a grin. "The microphone's on."
It is no simple matter for a computer to recognize a user's "busy state," as it
turns out, because everyone is busy in his own way. One programmer who works
for Horvitz is busiest when he's silent and typing for extended periods, since
that means he's furiously coding. But for a manager or executive, sitting
quietly might actually be an indication of time being wasted; managers are more
likely to be busy when they are talking or if PowerPoint is running.
In the early days of training Horvitz's A.I., you must clarify when you're most
and least interruptible, so the machine can begin to pick up your personal
patterns. But after a few days, the fun begins - because the machine takes over
and, using what you've taught it, tries to predict your future behavior.
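Horvitz's real models are far richer than anything printable here, but the
label-then-predict shape he describes can be caricatured in a few lines of
Python. Everything below - the two features, the sample numbers, the
nearest-centroid rule - is an invented stand-in, not his system:

    # Features per labeled moment: (keystrokes/min, window switches/min)
    labeled = {
        "busy": [(220, 1), (180, 0), (240, 2)],  # silent, sustained typing
        "free": [(10, 6), (0, 8), (25, 5)],      # idle, or flitting around
    }

    def centroid(points):
        n = len(points)
        return tuple(sum(p[i] for p in points) / n for i in range(2))

    centroids = {label: centroid(pts) for label, pts in labeled.items()}

    def predict(moment):
        """Nearest labeled centroid wins - a toy stand-in for the real A.I."""
        dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
        return min(centroids, key=lambda lab: dist(moment, centroids[lab]))

    print(predict((200, 1)))  # "busy": hold the interruptions
    print(predict((5, 7)))    # "free": a fine time to ping

Note that the labels only make sense per user: here "busy" looks like silent
typing, which fits Horvitz's programmer but would be exactly backward for the
managers he mentions.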
Horvitz clicked an onscreen icon for "Paul," an employee working on a laptop in
a meeting room down the hall. A little chart popped up. Paul, the A.I. program
reported, was currently in between tasks - but it predicted that he would begin
checking his e-mail within five minutes. Thus, Horvitz explained, right now
would be a great time to e-mail him; you'd be likely to get a quick reply. If
you wanted to pay him a visit, the program also predicted that - based on his
previous patterns - Paul would be back in his office in 30 minutes.
With these sorts of artificial smarts, computer designers could re-engineer our
e-mail programs, our messaging and even our phones so that each tool would work
like a personal butler - tiptoeing around us when things are hectic and barging
in only when our crises have passed. Horvitz's early prototypes offer an
impressive glimpse of what's possible. An e-mail program he produced seven
years ago, code-named Priorities, analyzes the content of your incoming e-mail
messages and ranks them based on the urgency of the message and your
relationship with the sender, then weighs that against how busy you are.
Superurgent mail is delivered right away; everything else waits in a queue
until you're no longer busy. When Czerwinski first tried the program, it gave
her as much as three hours of solid work time before nagging her with a
message. The software also determined, to the surprise of at least one
Microsoft employee, that e-mail missives from Bill Gates were not necessarily
urgent, since Gates tends to write long, discursive notes for employees to
meditate on.
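From that description one can sketch the triage logic, though the actual
scoring inside Priorities is not public; the weights and threshold below are
invented, including the wink at the Gates anecdote:

    import heapq

    held = []  # deferred mail, most important first

    SENDER_WEIGHT = {"your_manager": 1.0, "teammate": 0.7, "billg": 0.3}

    def on_new_mail(sender, urgency, busyness):
        """Values in [0, 1]: deliver now only if the score beats busyness."""
        score = urgency * SENDER_WEIGHT.get(sender, 0.5)
        if score > busyness:
            print("delivering now: mail from", sender)
        else:
            heapq.heappush(held, (-score, sender))  # wait until you're free

    def on_idle():
        """No longer busy: drain the queue in priority order."""
        while held:
            _, sender = heapq.heappop(held)
            print("now showing held mail from", sender)

    on_new_mail("teammate", 0.9, busyness=0.8)      # held for later
    on_new_mail("your_manager", 0.9, busyness=0.8)  # delivered at once
    on_idle()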
This raises a possibility both amusing and disturbing: perhaps if we gave
artificial brains more control over our schedules, interruptions would actually
decline - because A.I. doesn't panic. We humans are Pavlovian; even though we
know we're just pumping ourselves full of stress, we can't help frantically
checking our e-mail the instant the bell goes ding. But a machine can resist
that temptation, because it thinks in statistics. It knows that only an
extremely rare message is so important that we must read it right now.
So will Microsoft bring these calming technologies to our real-world computers?
"Could Microsoft do it?" asks David Gelernter, a Yale professor and longtime
critic of today's computers. "Yeah. But I don't know if they're motivated by
the lust for simplicity that you'd need. They're more interested in piling more
and more toys on you."
The near-term answer to the question will come when Vista, Microsoft's new
operating system, is released in the fall of 2006. Though Czerwinski and
Horvitz are reluctant to speculate on which of their innovations will be
included in the new system, Horvitz said that the system will "likely"
incorporate some way of detecting how busy you are. But he admitted that "a
bunch of features may not be shipping with Vista." He says he believes that
Microsoft will eventually tame the interruption-driven workplace, even if it
takes a while. "I have viewed the task as a 'moon mission' that I believe that
Microsoft can pull off," he says.
By a sizable margin, life hackers are devotees not of Microsoft but of Apple,
the company's only real rival in the creation of operating systems - and a
company that has often seemed to intuit the need for software that reduces the
complexity of the desktop. When Apple launched its latest operating system,
Tiger, earlier this year, it introduced a feature called Dashboard - a
collection of glanceable programs, each of which performs one simple function,
like displaying the weather. Tiger also includes a single-key tool that zooms
all open windows into a bingo-card-like grid, uncovering any "lost" ones. A
superpowered search application speeds up the laborious task of hunting down a
missing file. Microsoft is now playing catch-up; Vista promises many of the
same tweaks, although it will most likely add a few new ones as well,
including, possibly, a 3-D mode for seeing all the windows you have open.
Apple's computers have long been designed specifically to soothe the confusions
of the technologically ignorant. For years, that meant producing computer
systems that seemed simpler than the ones Microsoft produced, but were less
powerful. When computers moved relatively slowly and the Internet was little
used, raw productivity - shoving the most data at the user - mattered most, and
Microsoft triumphed in the marketplace. But for many users, simplicity now
trumps power. Linda Stone, the software executive who has worked alongside the
C.E.O.'s of both Microsoft and Apple, argues that we have shifted eras in
computing. Now that multitasking is driving us crazy, we treasure technologies
that protect us. We love Google not because it brings us the entire Web but
because it filters it out, bringing us the one page we really need. In our new
age of overload, the winner is the technology that can hold the world at bay.
Yet the truth is that even Apple might not be up to the task of building the
ultimately serene computer. After all, even the geekiest life hackers find they
need to trick out their Apples with duct-tape-like solutions; and even that
sometimes isn't enough. Some experts argue that the basic design of the
computer needs to change: so long as computers deliver information primarily
through a monitor, they have an inherent bottleneck - forcing us to squeeze the
ocean of our lives through a thin straw. David Rose, the Cambridge designer,
suspects that computers need to break away from the screen, delivering
information through glanceable sources in the world around us, the way wall
clocks tell us the time in an instant. For computers to become truly less
interruptive, they might have to cease looking like computers. Until then,
those Post-it notes on our monitors are probably here to stay.
Clive Thompson is a contributing writer for the magazine.
--
Glenn M.
--- BBBS/NT v4.01 Flag-5
* Origin: Barktopia BBS Site http://HarborWebs.com:8081 (1:379/45)