Gabe Newell on Brain-Computer Interfaces - briefly
(hx) 11:09 PM CET - Jan 25, 2021
Gabe Newell talks to 1 News, telling the New Zealand outlet about Valve's
interest in how brain-computer interfaces (BCIs) could revolutionize video
games.
Newell admitted some of the ideas may seem incredible, and said some of
the discussions he's having around BCIs are "indistinguishable from
science fiction" - but according to him, game developers would be
making a mistake by not investigating BCIs in the near future.
To help them do that, Newell said Valve is currently working on an
open-source BCI software project, allowing developers to begin
interpreting the signals read from people's brains using hardware
like modified VR (virtual reality) headsets.
"We're working on an open source project so that everybody can have
high-resolution [brain signal] read technologies built into headsets,
in a bunch of different modalities," Newell said.
Valve has been working with OpenBCI headsets.
In November 2020, OpenBCI unveiled a headset design called Galea, built to
work alongside VR headsets like Valve's Index.
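For a rough sense of what reading those signals looks like in practice, the sketch below uses BrainFlow, the open-source SDK that OpenBCI hardware ships with. It is an illustration only, not necessarily the Valve project Newell mentions (the interview doesn't name it), and it streams from BrainFlow's built-in synthetic board so it runs without any headset attached; a Galea or other real board would need its own board ID and connection parameters.

    import time
    from brainflow.board_shim import BoardShim, BrainFlowInputParams, BoardIds

    params = BrainFlowInputParams()
    # The synthetic board generates fake signals; a real OpenBCI board would need
    # its own BoardIds value plus connection details (serial port, IP address, etc.).
    board = BoardShim(BoardIds.SYNTHETIC_BOARD.value, params)

    board.prepare_session()
    board.start_stream()
    time.sleep(5)                  # let a few seconds of data accumulate
    data = board.get_board_data()  # 2D array: one row per channel, one column per sample
    board.stop_stream()
    board.release_session()

    eeg_channels = BoardShim.get_eeg_channels(BoardIds.SYNTHETIC_BOARD.value)
    print(f"Captured {data.shape[1]} samples across {len(eeg_channels)} EEG channels")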
"If you're a software developer in 2022 who doesn't have one of these
in your test lab, you're making a silly mistake," Newell said.
"Software developers for interactive experience[s] - you'll be
absolutely using one of these modified VR head straps to be doing that
routinely - simply because there's too much useful data."
That data will generally consist of readings from the player's body and
brain, which can be used to tell if the player is excited, surprised,
sad, bored, amused and afraid, among other emotions.
The readings can be used by developers to improve immersion and
personalise what happens during games - like turning up the difficulty
a bit if the system realises the player is getting bored.
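As a purely illustrative sketch of that idea - none of these names come from Valve, and estimate_engagement stands in for whatever classifier a developer would actually train on the headset's data - an adaptive-difficulty loop might look like this:

    def estimate_engagement(signal_window):
        """Stand-in for a real model: map a window of biosignal samples to a 0-1 score.
        Here, higher signal variance is crudely treated as higher arousal/engagement."""
        mean = sum(signal_window) / len(signal_window)
        variance = sum((x - mean) ** 2 for x in signal_window) / len(signal_window)
        return max(0.0, min(1.0, variance))

    def adjust_difficulty(difficulty, engagement):
        """Nudge difficulty up when the player seems bored, down when overwhelmed."""
        if engagement < 0.3:   # low engagement: the player may be getting bored
            return min(difficulty + 0.1, 1.0)
        if engagement > 0.9:   # very high arousal: ease off a little
            return max(difficulty - 0.1, 0.0)
        return difficulty

    # Example: a flat signal (low variance) reads as boredom, so difficulty rises slightly.
    difficulty = adjust_difficulty(0.5, estimate_engagement([0.01, 0.02, 0.01, 0.02]))
    print(difficulty)  # 0.6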
Aside from just reading people's brain signals, Newell also discussed
the near-future reality of being able to write signals to people's
minds - to change how they're feeling or deliver better-than-real
visuals in games.
He said BCIs will lead to gaming experiences far better than a player
could get through their "meat peripherals" - as in, their eyes and ears.
"You're used to experiencing the world through eyes," Newell said, "but
eyes were created by this low-cost bidder that didn't care about
failure rates and RMAs, and if it got broken there was no way to repair
anything effectively, which totally makes sense from an evolutionary
perspective, but is not at all reflective of consumer preferences.
"So the visual experience, the visual fidelity we'll be able to create
- the real world will stop being the metric that we apply to the best
possible visual fidelity.
"The real world will seem flat, colourless, blurry compared to the
experiences you'll be able to create in people's brains.
"Where it gets weird is when who you are becomes editable through a
BCI," Newell said.
At the moment, people accept that their feelings are simply how they feel -
but Newell says BCIs will soon allow those feelings to be edited
digitally, which could be as easy as using an app.
"One of the early applications I expect we'll see is improved sleep -
sleep will become an app that you run where you say, 'Oh, I need this
much sleep, I need this much REM,'" he said.
Another benefit could be the reduction or total removal of unwanted
feelings or conditions from the brain, for therapeutic reasons.