With the release of the Oculus Rift in March 2016, the age of virtual reality (VR) truly began. VR tech had been generating buzz since the 1990s, but the Rift was the first high-end VR system to reach the consumer market, and early reviews confirmed that it delivered the kind of experience users had been hoping for.

Virtual reality was finally real.

Research into VR exploded in this new era, and experts soon found innovative ways to make virtual experiences more immersive…more real. Today, VR technologies have moved beyond just sight and sound. We’ve developed technologies that let users touch virtual objects, feel changes in wind and temperature, and even taste food in VR.

However, despite all this progress, no one would mistake a virtual environment for the real world. The technology simply isn’t advanced enough, and as long as we rely solely on traditional headsets and other wearables, it never will be.

Before we can create a world that is truly indistinguishable from the real one, we will need to leave the age of virtual reality behind and enter a new era — the era of neuroreality.

Neuroreality refers to a reality that is driven by technologies that interface directly with the human brain. While traditional VR depends on the user physically reacting to external stimuli (for example, swinging a controller to wield a virtual sword on a screen), a neuroreality system connects directly to the user’s biology through a brain-computer interface (BCI).

Notably, this technology isn’t some far-off sci-fi vision. It’s very real.
