I don't know about you, but every time I see something like a "non-invasive brain computer interface," my cringe factor goes up a bit.
But that, apparently, is where we're headed as companies look toward the next phase of digital engagement, with Snap today announcing that it's acquired Paris-based neurotech company NextMind to help fuel its long-term augmented reality research efforts.
As explained by Snap:
"NextMind has developed non-invasive brain-computer interface tech, which enables much easier hands-free control of electronic devices (computers and AR/VR wearables and headsets, for example) by monitoring neural activity to infer your intent when using an interface. Think of it as a virtual button you can activate simply by focusing on it."
[Instinctively rubs at the side of my head]
Snap does note that NextMind's tech 'does not "read" thoughts or send any signals toward the brain'. So there is that. But still, the notion that your brain will eventually, subconsciously, control your digital experience, and that big tech companies will have a literal direct line into your head, does seem somewhat concerning.
Meta's also building toward this, having worked on brain-to-screen interaction since at least 2017, though it abandoned its full digital mind-reading project last year in favor of wrist-based devices 'powered by electromyography'.
That could be the key to navigating more complex AR and VR spaces in a more natural, intuitive way, and it's also why Meta bought CTRL-Labs back in 2019, to further develop this element.
As Andrew Bosworth, Meta's head of AR/VR, explained at the time:
"The vision for this work is a wristband that lets people control their devices as a natural extension of movement. Here's how that might work: you have neurons in your spinal cord that send electrical signals to your hand muscles, telling them to move in specific ways, such as to click a mouse or press a button. The wristband will decode those signals and translate them into a digital signal your device can understand, empowering you with control over your digital life."
NextMind is doing much the same thing, but it tracks those same types of signals directly from your visual cortex, via electrode sensors on the head.
Yeah, I don't know, I'm just not 100 percent ready to accept that yet.
But it could be the future. With no keyboards in the virtual world, and tools now capable of detecting your brain impulses to facilitate such interaction, well, it makes good sense.
Just a little creepy. The body has a natural instinct to protect the brain at all costs, so the tech platforms will have a hard sell ahead in convincing people to let them in.
And then, of course, there's the question of what happens over time: if we no longer have to stand up and move around in order to make things happen, do we become even more like the people drifting along in their hover chairs on the spaceship in WALL-E?
Of course, these are considerations to be factored in over time; we need to start with core functionality first, before moving on to implications.

I think? Is that how it's supposed to work?
In any event, it increasingly feels like we're all going to be letting technology companies into our brains, just a little bit, as we move toward the next stage of digital controls and interactive processes.
Welcome to the future.