Project Glass, the latest sci-fi concept to come out of Google's X Lab, has gotten a lot of attention online in the past 24 hours thanks to a clever demo video that shows a user donning a pair of augmented-reality eyeglasses which project a heads-up display of video chats, location check-ins, and appointment reminders.

Reactions to the product design have ranged from skeptical to enthusiastic, but I was curious about the psychological and visual-cognitive aspects of the user experience. What would these "digital overlays" actually look and feel like? Would they really be as sharp and legible as the ones shown in the video? (I don't know about you, but I can't focus sharply on anything less than an inch away from my eyeball, which is where the eyeglasses' tiny screen would be dangling.) Would they obstruct my vision and make me motion-sick? How would my brain make perceptual and physical sense of the graphics: where would I "look," exactly, in order to "watch" the tiny picture-in-picture video chat shown at the conclusion of the clip?

I asked Mark Changizi, an evolutionary neurobiologist and author of The Vision Revolution, to answer some of these questions in an audio commentary track on the video, which you can watch above.
