Introduction to React VR, Part 1
Shay Keinan
Last updated on Nov 9, 2017

Virtual reality is being used in many industries. Besides games, it is used in fields like medicine, education, and movies. Because of its ability to completely immerse you in a scene, the possibilities are endless.

Before diving into code I want to talk to you about virtual reality concepts, things that we, as developers, must know if we want to build a virtual reality application.

So what is virtual reality in a nutshell? Visual virtual reality is made up of two things: stereoscopic imaging and movement tracking.

Let's look at these two images. Are they identical?

They seem identical, but if you look closely you can see a difference.

Stereoscopic images are based on how the human brain works: we take two images that show the same content but from slightly different points of view. The offset between the images corresponds to the distance between our eyes; this distance is called the inter-pupillary distance, or IPD for short. In this way we simulate the way we see the world naturally, which gives us the perception of 3D depth.
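To make this concrete, here is a rough sketch of stereoscopic rendering with Three.js, the engine React VR uses under the hood: one main camera is split into a left-eye and a right-eye camera separated by the IPD, and each eye is rendered to its own half of the screen. The function name and setup here are illustrative, not React VR internals.

const camera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight, 0.1, 100);
const stereo = new THREE.StereoCamera();
stereo.eyeSep = 0.064; // inter-pupillary distance (IPD) in meters, roughly 64 mm on average

function renderStereo(renderer, scene) {
  camera.updateMatrixWorld();
  stereo.update(camera); // derives stereo.cameraL and stereo.cameraR from the main camera

  const size = renderer.getSize();

  // Left eye: render the scene into the left half of the canvas
  renderer.setScissorTest(true);
  renderer.setScissor(0, 0, size.width / 2, size.height);
  renderer.setViewport(0, 0, size.width / 2, size.height);
  renderer.render(scene, stereo.cameraL);

  // Right eye: same scene, slightly offset camera, right half of the canvas
  renderer.setScissor(size.width / 2, 0, size.width / 2, size.height);
  renderer.setViewport(size.width / 2, 0, size.width / 2, size.height);
  renderer.render(scene, stereo.cameraR);

  renderer.setScissorTest(false);
}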

Headset lenses are an integral part of the virtual reality experience. Why do we use them? Because they position the images on the screen at the exact distance they need to be to get the desired effect. VR lenses are thick, so they cause distortion: the square that you see on the left looks caved in when viewed through the lenses, and the outcome looks something like what we see on the right.

To compensate for this, we feed the lenses images that are rounded outward. On the left you can see the compensated image before the lenses, and on the right is the final desired image.
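To illustrate the idea only (this is not a real lens profile), the sketch below applies the kind of radial pre-distortion that pushes points outward so the lenses' inward distortion cancels it; the k1 and k2 coefficients are made-up values for the example.

// x, y are coordinates relative to the lens center, roughly in the range [-1, 1]
function preDistort(x, y, k1 = 0.22, k2 = 0.24) {
  const r2 = x * x + y * y;                  // squared distance from the center
  const scale = 1 + k1 * r2 + k2 * r2 * r2;  // grows toward the edges
  return { x: x * scale, y: y * scale };     // points near the edge move outward the most
}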

To sum up stereoscopic imaging: by showing each eye a slightly different image, through special lenses, we get the effect of depth.

Besides stereoscopic imaging, the second thing we need to complete the illusion of virtual space is tracking the movement of our body. All VR devices track head movement, so we can look around. Some devices, the more expensive ones, also track body movement, so we can move around. Of course, the more tracking sensors you have, the better the illusion of reality.
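As a simplified illustration of head tracking, the sketch below listens to the browser's deviceorientation event and copies the phone's rotation onto a Three.js camera (assumed to be available globally as THREE). Real VR runtimes read pose data from the WebVR API and work with quaternions, so treat this only as the idea, not the actual implementation.

const camera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight, 0.1, 100);

window.addEventListener('deviceorientation', (event) => {
  // alpha/beta/gamma describe how the device is rotated around its z/x/y axes, in degrees.
  // The exact mapping to camera rotation also depends on screen orientation and Euler order;
  // this rough version is only enough to let you "look around" by moving the phone.
  camera.rotation.set(
    THREE.Math.degToRad(event.beta || 0),
    THREE.Math.degToRad(event.alpha || 0),
    THREE.Math.degToRad(-(event.gamma || 0))
  );
});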

In April this year, Facebook announced the launch of React VR, a new JavaScript framework based on Three.js and React Native.

React VR allows developers to build virtual reality experiences with the help of JavaScript. As the name implies, React VR uses the same concepts as Facebook’s existing React framework. Just like with React for standard web apps, VR developers can now use the same declarative model to write apps for their 360-degree experiences.
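To give a taste of that declarative model, here is a minimal component along the lines of the starter project that react-vr-cli generates (chess-world.jpg is the panorama asset that ships with that starter):

import React from 'react';
import { AppRegistry, asset, Pano, Text, View } from 'react-vr';

class WelcomeToVR extends React.Component {
  render() {
    // A 360-degree photo as the background, with a text panel floating 3 meters in front of the viewer
    return (
      <View>
        <Pano source={asset('chess-world.jpg')} />
        <Text style={{ transform: [{ translate: [0, 0, -3] }] }}>
          hello world
        </Text>
      </View>
    );
  }
}

AppRegistry.registerComponent('WelcomeToVR', () => WelcomeToVR);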

Just like in animation, VR apps need to render at 60 frames per second, and React Native has solved many of the issues that usually make this hard to do in a JavaScript application. It's important to know that React VR is based on React Native and Three.js: most of the components that we use are React Native components, and the 3D rendering engine is Three.js.

Three.js is a cross-browser JavaScript library used to create and display animated 3D graphics.
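For comparison, this is roughly what working with Three.js directly looks like: an imperative scene, camera, and render loop that React VR's declarative components hide from you. The sizes and rotation speed here are arbitrary.

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
camera.position.z = 3;

const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

const cube = new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshNormalMaterial()
);
scene.add(cube);

function animate() {
  requestAnimationFrame(animate); // aim for the display's refresh rate, typically 60 fps
  cube.rotation.y += 0.01;
  renderer.render(scene, camera);
}
animate();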

In the next part, we are going to cover React VR's basic components and start writing an application from scratch. If you can't wait and want to move on, the entire tutorial is available here:
