Friday, March 22, 2013

Samsung Galaxy S4: can eye-tracking work with games?

News reports on Thursday's Galaxy
S4 announcement have been
dominated by a key feature: eye-
tracking. Is this the beginning of a
new era in game control?
Samsung Galaxy S4: is eye-tracking
more than a marketing gimmick?
Keith Stuart
A fascinating concept to some, an
unworkable gimmick to others, the
eye-tracking capabilities of the new
Samsung Galaxy S4 have certainly
garnered a lot of news coverage.
The system, which uses the built-in
camera to view and interpret eye
movements, has been
demonstrated doing two things:
pausing a movie when the user
looks away (Smart Pause), and
scrolling the screen content when
the user tilts the device and, erm, looks at it
(Smart Scroll). But do these simple
features suggest that we're about to
enter an era of eye-controlled
games?
Well, it depends. I've not yet been
able to find out from Samsung
whether the technology will be
made available to developers –
that's obviously going to be
important. The other question is
how sensitive it will be. The current
S4 implementations are rather
binary: the user is either looking at
the screen or they're not. This
might offer some functionality to
game designers: it could augment a
standard control system (perhaps
as a hands-free pause system) or it
may be fine for very simple one-
input titles, like endless runners.
But for anything more complicated,
the system will need to be able to
accurately and speedily track eye
movement across the screen.
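To make that concrete, here is a minimal sketch of how a purely binary "looking or not looking" signal might be wired into a simple game loop, for instance as a hands-free pause in an endless runner. The is_user_looking() call is a stand-in; Samsung has not published a developer API for the S4's eye-tracking, so every name below is an assumption made for illustration.

    # A minimal sketch, assuming a hypothetical is_user_looking() call;
    # the gaze source here is simulated, not a real Samsung API.

    import random

    def is_user_looking():
        """Stand-in for a platform call reporting whether the front camera
        currently sees a pair of eyes aimed at the screen."""
        return random.random() > 0.2  # simulated signal for illustration

    class EndlessRunner:
        def __init__(self):
            self.paused = False
            self.distance = 0.0

        def update(self, dt):
            # The only gaze input used is the binary looking / not-looking
            # state, matching what the S4 demos have shown so far.
            self.paused = not is_user_looking()
            if not self.paused:
                self.distance += 10.0 * dt  # advance the runner

    game = EndlessRunner()
    for _ in range(600):            # ten seconds of 60 fps frames
        game.update(dt=1 / 60)
    print(f"ran {game.distance:.1f} metres, paused: {game.paused}")

Even a signal this crude is enough for the pause case; anything finer-grained than that needs the positional tracking discussed above.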
Nevertheless, game developers are
keen to know more about
Samsung's tech. "What would be
interesting is if Samsung puts out
APIs for an engine like Unity or
Cocos2D so that we could retrofit
this into games," says Ben
Trewhella of Opposable Games.
"Much like motion and gesture
tracking, these new features bring
in a lot of innovation but it can be
hard for a developer if they're not
made seamless to integrate. You
can spend 20-30 percent of your
development budget on trying to
implant a new interface feature,
when it needs to be two or three
percent of your time."
Opposable is a small studio that
specialises in multiplayer 'second
screen' gaming – i.e. interactive
experiences that use both a shared
TV display and a tablet device for
each participant. They're currently
working on trading card and
Advance Wars-style tactical titles,
but also see the possibilities for
impulsive multiplayer experiences.
"Games that allow someone to just
jump in and join a game are great -
with eye tracking there's the
potential to do a lot of very
inclusive games where you may just
walk past a screen, it recognises
you're there, and suddenly you're
in the game - it would mean that
you could get four to eight player
games very quickly on one screen -
that's very interesting."
Harvey Elliot was a producer at
Electronic Arts but is now MD of
cross-platform game technology
company Marmalade. He too sees
the potential of eye-tracking – and
the requirements for a sensitive
system. "The possibilities will
depend on how precise it can be in
tracking position across the whole
screen," he says. "There are clearly
opportunities for games to evolve
using this technology, perhaps
delegating certain functions like
reloading a gun in an FPS, steering
with a flick of the eyes in a racing
game or camera control in a 3D
adventure.
"For younger players simple games
like 'peekaboo' with characters, or
reading eye movement to create
feedback could be really rewarding.
Perhaps more valuable is outside of
gameplay - by tracking user line of
sight in real time we can make
games more reactive to what the
user is focusing on – and by
relaying that information back to
the development studios it would
help inform future design
decisions."
Vitally too, the arrival of
affordable, pervasive eye-tracking
solutions could be great news for
gamers with disabilities. "I've been
predicting that gaze-aware systems
would go mainstream for ten years,
and I'm glad it's started to
happen," says Dr Mick Donegan,
CEO of the charity SpecialEffect, which
modifies gaming peripherals for
disabled players. SpecialEffect has
developed its own PC app, Alt
Controller, which maps keyboard
controls to different areas of the
screen so that they can be read by
eye-tracking systems. In this way,
it's possible to play titles like
racing sim Dirt 3 with eye
movement alone. The cameras
supported, though, are specialist
products that cost upwards of
£3000.
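As a rough illustration of the idea behind a tool like Alt Controller, the sketch below divides the screen into named regions and turns a gaze point into a key press. The region layout, key bindings and helper functions are invented for the example; this is not SpecialEffect's code, just the mapping concept.

    # Rough sketch: divide the screen into named regions and map the
    # player's gaze point to keyboard input. All names and values here are
    # hypothetical; a real tool would read gaze from a tracker and inject
    # key events at the OS level.

    REGIONS = {
        "accelerate":  (0.25, 0.00, 0.75, 0.33),  # top-centre strip
        "steer_left":  (0.00, 0.33, 0.25, 1.00),  # left edge of the screen
        "steer_right": (0.75, 0.33, 1.00, 1.00),  # right edge of the screen
    }
    KEYS = {"accelerate": "W", "steer_left": "A", "steer_right": "D"}

    def region_for(gaze_x, gaze_y):
        """Return the named region containing a normalised (0-1) gaze point."""
        for name, (x0, y0, x1, y1) in REGIONS.items():
            if x0 <= gaze_x <= x1 and y0 <= gaze_y <= y1:
                return name
        return None

    def on_gaze_sample(gaze_x, gaze_y, send_key):
        """Translate one gaze sample into the keyboard input a game expects."""
        region = region_for(gaze_x, gaze_y)
        if region:
            send_key(KEYS[region])

    # Looking at the left-hand edge of the screen steers the car left.
    on_gaze_sample(0.10, 0.60, send_key=print)  # prints 'A'

The appeal of this approach is that the game itself needs no modification: it still sees ordinary key presses, which is how eye movement alone can drive an unmodified title like Dirt 3.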
"From what I've seen of the
Samsung system, I'm not sure how
accurate it will be," he says. "The
features I've seen rely on fairly
large movement of the eyes,
whereas to play a game,
particularly on a device as small as
a Samsung, it will need quite a high
level of accuracy. But it's a very
encouraging direction for things to
move in, whether that's just to
enhance enjoyment of games or to
allow people with severe disabilities
to use the technology."
And obviously, Samsung isn't alone
in exploring the consumer
possibilities of eye tracking.
Specialist technology company
Tobii, which usually supplies its
gaze interaction and eye control
products to the research and medical
fields, is working with Fujitsu on an
eye-tracking tablet. It is also
preparing to launch its first
consumer product, the Tobii Rex,
which adheres to your monitor and
allows you to use eye-tracking to
control any compatible application
– developers just need to integrate
some dedicated code into their
software. At last week's CeBIT
exhibition, the company worked
with Intel specifically to showcase
the gaming applications.
Elsewhere, Donegan points to Eye
Tribe, a company set up by a group
of PhD students from the University
of Copenhagen. After securing a
million euros in crowdfunding, the
group is now working on a low-
cost eye-tracking controller for
mobile devices, using just your
inbuilt camera and no additional
technology.
Whatever Samsung intends for its
own use of eye tracking in the
Galaxy S4, this is another step
toward mainstream alternatives to
physical controllers. From Kinect to Google
Glass, the concepts of intuitive,
highly accessible input are
evolving. Gamers, of course,
talk about how they'll always want
joypads because of the precision of
control they offer: the huge decline in
Kinect support has shown that
neither developers nor gamers were
really impressed with the accuracy
of the system.
But as Elliot says, if we view this as
augmentation rather than
replacement of existing interfaces,
it gets interesting. Donegan talks
about 'gaze awareness' – i.e. titles
that simply know where you're
looking – rather than 'gaze control',
where your eyes become a
controller. For example, in a first-
person shooter, an enemy hiding
behind an object may look out at
the player, but then quickly duck
back into cover when you glance
at them. This has all sorts of
creepy implications for survival
horror titles.
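A toy example of the difference: in a gaze-aware game the engine never takes orders from your eyes, it simply checks where they are resting and lets the world react. The sketch below, with an invented Enemy class and made-up coordinates, shows an enemy that only peeks out while the player's gaze is elsewhere.

    # Gaze awareness rather than gaze control: the game reads where the
    # player is looking and reacts, but the eyes issue no commands.
    # The Enemy class and all coordinates are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class Enemy:
        x: float               # normalised screen position of the hiding spot
        y: float
        radius: float          # how near the gaze must be to count as "seen"
        in_cover: bool = True

        def react_to_gaze(self, gaze_x, gaze_y):
            seen = (self.x - gaze_x) ** 2 + (self.y - gaze_y) ** 2 <= self.radius ** 2
            # Peek out while the player looks elsewhere; duck back the
            # moment their gaze lands nearby.
            self.in_cover = seen

    enemy = Enemy(x=0.8, y=0.4, radius=0.15)
    enemy.react_to_gaze(0.2, 0.5)    # gaze on the far side of the screen
    print(enemy.in_cover)            # False: the enemy risks a peek
    enemy.react_to_gaze(0.82, 0.42)  # gaze lands on the hiding spot
    print(enemy.in_cover)            # True: back behind cover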
Today, it might just be about
looking away from the screen to
pause Temple Run; tomorrow it
could be an augmented-reality
ghost-hunting game, using eye-
tracking to place spooks just in
your peripheral vision. You have
been warned.
