Virtual Reality (VR)
Google VR
Google provides developers with two virtual reality platforms:
- Google Cardboard (mobile VR platform): works with almost any smartphone on Android or iOS
- Google Daydream (low-latency, immersive and interactive mobile VR): comes with hardware and software built for VR
Both platforms can be targeted with the Google VR SDK from three development environments: Android, iOS and Unity.
- Android: Android Studio 1.0 or higher lets you include the Google VR SDK and build apps that display VR, compatible with both Cardboard and Daydream.
- iOS: Xcode 7.1 or higher and CocoaPods 1.0.0 let you include the Google VR SDK and build apps that display VR, compatible with Cardboard only.
- Unity: Unity 5.2.1 or later lets you include the Google VR SDK and build apps that display VR on Android 4.4 or higher and iOS 7 or higher, compatible with both Cardboard and Daydream.
Google VR SDK
- Includes libraries, API documentation, developer samples and design guidelines
- The Google VR NDK (Native Development Kit) for Android provides a C/C++ API for developers writing native code.
- Developers familiar with OpenGL can quickly start creating VR applications using the Google VR SDK.
- Some of the VR development tasks are
- Lens Distortion Correction
- Distortion is a deviation from rectilinear projection, the projection in which straight lines in a scene remain straight in the image; it is a form of optical aberration.
- Although distortions can be irregular or follow many patterns, the most commonly encountered ones are radially symmetric, arising from the symmetry of the photographic lens. Radial distortions are usually classified as barrel, pincushion and mustache distortions, with barrel distortion being the most typical.
- In short, the images are distorted by both lens effects and spherical perspective effects, and these distortions can largely be corrected by applying suitable algorithmic transformations to the digital images (a sketch of the radial model follows below).
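To make the radial model concrete, here is a minimal sketch in Java of how a radially symmetric (barrel/pincushion) distortion maps a normalized image point. The k1/k2 coefficients are illustrative values, not taken from any headset; in practice the Google VR SDK applies the real lens-distortion correction for you as a post-processing pass.

```java
/**
 * Minimal sketch of a radially symmetric lens distortion applied to a point
 * given in normalized coordinates relative to the lens center.
 * K1/K2 are hypothetical example coefficients.
 */
public final class RadialDistortion {

    // Positive coefficients bulge the image outward (barrel);
    // negative coefficients pinch it inward (pincushion).
    private static final double K1 = 0.22;
    private static final double K2 = 0.26;

    /** Applies the radial polynomial to a normalized point (x, y). */
    static double[] distort(double x, double y) {
        double r2 = x * x + y * y;                     // squared radius from the optical axis
        double factor = 1.0 + K1 * r2 + K2 * r2 * r2;  // radially symmetric polynomial
        return new double[] {x * factor, y * factor};
    }

    public static void main(String[] args) {
        double[] p = distort(0.5, 0.3);
        System.out.printf("distorted point: (%.4f, %.4f)%n", p[0], p[1]);
    }
}
```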
- Spatial Audio
- The creation of a 3D audio experience over headphones, also called 3D stereo sound or simply 3D audio; its main applications are augmented and virtual reality.
- HRTFs (head-related transfer functions) describe the directive patterns of the human ears: a pair of filters describing how sound arriving from a given direction reaches the left and the right ear. HRTFs are functions of the direction, elevation, distance and frequency of the sound (a short sketch of driving the SDK’s spatial audio engine follows below).
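Here is a minimal sketch of spatializing a single sound source, based on the GvrAudioEngine class from the Google VR Android SDK (the same engine the Treasure Hunt sample uses). The asset path is a placeholder and the exact method names reflect the SDK at the time of writing, so they may differ between versions.

```java
// Sketch only: spatializing one looping sound source with GvrAudioEngine.
import android.os.Bundle;
import com.google.vr.sdk.audio.GvrAudioEngine;
import com.google.vr.sdk.base.GvrActivity;

public class SpatialAudioSketch extends GvrActivity {

    private static final String SOUND_FILE = "sounds/object_sound.wav"; // placeholder asset
    private GvrAudioEngine audioEngine;
    private volatile int sourceId = GvrAudioEngine.INVALID_ID;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // BINAURAL_HIGH_QUALITY applies HRTF-based filtering per ear.
        audioEngine = new GvrAudioEngine(this, GvrAudioEngine.RenderingMode.BINAURAL_HIGH_QUALITY);

        // Loading and decoding is slow, so do it off the UI thread.
        new Thread(() -> {
            audioEngine.preloadSoundFile(SOUND_FILE);
            sourceId = audioEngine.createSoundObject(SOUND_FILE);
            // Place the source 2 m in front of the user and loop it.
            audioEngine.setSoundObjectPosition(sourceId, 0f, 0f, -2f);
            audioEngine.playSound(sourceId, true /* looping */);
        }).start();
    }

    /** Call once per frame with the current head orientation quaternion. */
    void updateAudio(float qx, float qy, float qz, float qw) {
        audioEngine.setHeadRotation(qx, qy, qz, qw);
        audioEngine.update();
    }
}
```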
- Head Tracking
- Slaving the imagery to head motion: head-mounted displays are used with tracking sensors that record changes of angle and orientation. When that data is available to the system computer, it can generate the appropriate computer-generated imagery (CGI) for the angle-of-look at that particular time. This allows the user to “look around” a VR environment simply by moving the head, without a separate controller to change the angle of the imagery. In radio-based (wireless) systems, the wearer may move about within the tracking limits of the system (a per-frame sketch follows below).
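Below is a minimal sketch of reading the head orientation every frame, modelled on the onNewFrame callback of GvrView.StereoRenderer in the Google VR Android SDK. The other renderer callbacks are omitted, so treat it as an illustration rather than a complete renderer.

```java
// Sketch only: combining the SDK's head rotation with a stationary camera.
import android.opengl.Matrix;
import com.google.vr.sdk.base.HeadTransform;

public class HeadTrackingSketch {

    private final float[] headView = new float[16]; // view matrix for the head
    private final float[] camera = new float[16];   // fixed camera at the origin
    private final float[] view = new float[16];     // camera transformed by head rotation

    /** Mirrors what GvrView.StereoRenderer#onNewFrame does with the HeadTransform. */
    public void onNewFrame(HeadTransform headTransform) {
        // The SDK writes the current head rotation into a 4x4 column-major matrix.
        headTransform.getHeadView(headView, 0);

        // Combine it with a stationary camera looking down -Z.
        Matrix.setLookAtM(camera, 0, 0f, 0f, 0f, 0f, 0f, -1f, 0f, 1f, 0f);
        Matrix.multiplyMM(view, 0, headView, 0, camera, 0);
        // "view" now reflects where the user is looking and can drive the scene.
    }
}
```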
- 3D Calibration
- A single eye cannot perceive depth on its own; it is the placement of the left and right eyes (roughly two and a half inches apart), each seeing a separate image, that our brain processes and perceives as depth. A 3D display emulates the same concept by showing each eye the same scene from two slightly different perspectives, which in essence tricks the brain into thinking it is seeing a real image (see the sketch below).
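The sketch below shows how a stereo display can derive two perspectives from one head-centred view: offset each eye by half the interpupillary distance. The 0.064 m value (roughly the two-and-a-half-inch eye separation mentioned above) is an assumed average; the SDK calibrates this per headset.

```java
// Sketch only: deriving per-eye view matrices from a head-centred view.
import android.opengl.Matrix;

public class StereoCalibrationSketch {

    private static final float IPD_METERS = 0.064f; // assumed average separation between the eyes

    /** Returns the view matrix for one eye, offset sideways from the head-centred view. */
    static float[] eyeView(float[] headCentredView, boolean leftEye) {
        // The left eye sits at -IPD/2 on the head's local X axis, so its
        // eye-from-head transform translates by +IPD/2 (and vice versa).
        float half = IPD_METERS / 2f;
        float[] eyeFromHead = new float[16];
        Matrix.setIdentityM(eyeFromHead, 0);
        Matrix.translateM(eyeFromHead, 0, leftEye ? half : -half, 0f, 0f);

        float[] eyeViewMatrix = new float[16];
        Matrix.multiplyMM(eyeViewMatrix, 0, eyeFromHead, 0, headCentredView, 0);
        return eyeViewMatrix;
    }
}
```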
- Side-by-Side Rendering
- Also called binocular rendering; it relies on stereoscopy, a technique for creating or enhancing the illusion of depth in an image by means of stereopsis for binocular vision (a per-eye draw sketch follows below).
- Traditional stereoscopic photography consists of creating a 3D illusion starting from a pair of 2D images, a stereogram. The easiest way to enhance depth perception in the brain is to provide the eyes of the viewer with two different images, representing two perspectives of the same object, with a minor deviation equal or nearly equal to the perspectives that both eyes naturally receive in binocular vision.
- To avoid eyestrain and distortion, each of the two 2D images should be presented to the viewer so that any object at infinite distance is perceived by the eye as being straight ahead, the viewer’s eyes being neither crossed nor diverging. When the picture contains no object at infinite distance, such as a horizon or a cloud, the pictures should be spaced correspondingly closer together.
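A minimal sketch of the per-eye draw pass behind side-by-side rendering, modelled on the onDrawEye callback of GvrView.StereoRenderer in the Google VR Android SDK. The SDK calls it twice per frame, once per eye; drawScene() is a hypothetical placeholder for whatever geometry the app actually renders.

```java
// Sketch only: one draw pass per eye, each with its own view and projection.
import android.opengl.GLES20;
import android.opengl.Matrix;
import com.google.vr.sdk.base.Eye;

public class SideBySideSketch {

    private final float[] camera = new float[16];
    private final float[] view = new float[16];
    private final float[] viewProjection = new float[16];

    /** Called twice per frame by the SDK: once for the left eye, once for the right. */
    public void onDrawEye(Eye eye) {
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);

        // Each Eye carries its own offset from the head plus a lens-matched projection.
        Matrix.setLookAtM(camera, 0, 0f, 0f, 0f, 0f, 0f, -1f, 0f, 1f, 0f);
        Matrix.multiplyMM(view, 0, eye.getEyeView(), 0, camera, 0);
        float[] projection = eye.getPerspective(0.1f, 100f); // near/far clipping planes
        Matrix.multiplyMM(viewProjection, 0, projection, 0, view, 0);

        drawScene(viewProjection); // placeholder: issue the app's draw calls here
    }

    private void drawScene(float[] viewProjection) {
        // Hypothetical: bind shaders, upload viewProjection, draw geometry.
    }
}
```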
- Stereo Geometry Configuration
- The ability to infer the 3D structure and distance of a scene from two or more images taken from different viewpoints. The two central problems are correspondence (finding which points in the left and right images depict the same scene point) and reconstruction (recovering the 3D locations of the observed objects from those matches and the camera geometry).
- Related terminology includes epipolar geometry, stereo vision and canonical stereo configurations (a depth-from-disparity sketch for the canonical case follows below).
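For the canonical (rectified) stereo configuration, reconstruction boils down to depth = focal length × baseline ÷ disparity once a correspondence has been found. The focal length and baseline numbers below are illustrative, not values from any real camera pair.

```java
// Sketch only: depth from disparity in a canonical stereo configuration.
public final class StereoDepthSketch {

    /**
     * @param focalLengthPx  focal length expressed in pixels
     * @param baselineMeters distance between the two camera centres
     * @param disparityPx    xLeft - xRight for the matched point (must be > 0)
     */
    static double depthFromDisparity(double focalLengthPx, double baselineMeters, double disparityPx) {
        if (disparityPx <= 0) {
            throw new IllegalArgumentException("disparity must be positive (point not at infinity)");
        }
        return focalLengthPx * baselineMeters / disparityPx;
    }

    public static void main(String[] args) {
        // Example: f = 700 px, baseline = 6.4 cm, disparity = 14 px  ->  depth = 3.2 m
        System.out.println(depthFromDisparity(700, 0.064, 14) + " m");
    }
}
```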
- User Input Event Handling
- The user can interact with the VR world by pressing a button (see the trigger-handling sketch below).
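A minimal sketch of reacting to the trigger press, using the onCardboardTrigger() callback that GvrActivity exposes in the Google VR Android SDK (the Treasure Hunt sample uses the same hook). What the app does on the press, here just a log line, is entirely up to you.

```java
// Sketch only: handling the Cardboard trigger / screen tap.
import android.util.Log;
import com.google.vr.sdk.base.GvrActivity;

public class TriggerInputSketch extends GvrActivity {

    private static final String TAG = "TriggerInputSketch";

    /** Invoked by the SDK whenever the user pulls the Cardboard trigger or taps the screen. */
    @Override
    public void onCardboardTrigger() {
        Log.i(TAG, "Trigger pressed");
        // Typical sample behaviour: test whether the gaze ray hits an object and react to it.
    }
}
```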
Getting Comfortable by Running Some Simple Sample Apps in Android Studio
Prerequisites
- Android Studio 1.0 or higher
- Android SDK Version 23 or higher
- Android NDK Version r12b
- Gradle 23.0.1 or higher
- A physical Android device (or devices) running Android 4.4 (KitKat) or higher
The sample Google VR application demonstrates the following features:
- Binocular Rendering (A split-screen view for each eye in VR)
- Spatial Audio (Sound seems to come from specific areas of the VR world)
- Head Movement tracking (The VR world view updates as the user moves their head)
- Trigger Input (The user can interact with the VR world by pressing a button)
Instructions – Sample App – Treasure Hunt
- Sample Code link: Check this Link
- Clone or download the project and import it into Android Studio as an existing project
- Configure Android Studio with the SDK as well as the NDK
- In-depth explanation – Check this link
- The output of the application looks like this (see below)
Instructions – Sample App – Controller Paint
- Apart from the above requirements, this sample app is a typical Daydream project: it requires an additional controller, which here is a second Android device, and the two devices must be properly paired over Bluetooth.
- First, set up the Daydream development kit.
- Second, pair the two phones; finally, wear headphones to experience the spatial audio.
- Reference: Check this link
- The final outcome looks like this (See below)
Conclusion
This blog may read like a dry technical description, but the idea behind it is simpler:
- to explain the hardware and software that is involved in virtual reality,
- to give an idea of what goes on inside VR and what each of those pieces actually means,
- to show how to get into virtual reality with Google’s platforms, whether you choose the cost-efficient route or the more expensive one, and
- to say that getting into virtual reality is not easy, but it is not that tough either.