During our most recent Friday hack day, Stuart Bowman and Justin Couchot, two of our amazing mobile interns, dove into virtual reality with Google Cardboard. It was awesome to see how much they were able to accomplish in 8 hours with no prior experience building Cardboard apps! Enjoy!
Last Friday Justin Couchot and I had the opportunity to work with the Google Cardboard SDK. Here’s what we picked up in our 8-hour hack day.
Google provides two SDK options to get you up and running with Cardboard: a Java project and library for use in Android Studio, and a Unity framework package for use with a new or existing game project. We opted for the Unity framework for its ease of setup and installation with the engine tools we planned on using. The Cardboard framework is added to the project as a custom Unity package (similar to any other asset store package) and includes the stereoscopic camera prefab with a gyroscope/accelerometer movement script, as well as an example scene to get you started. Additionally, projects built in Unity using Google's Cardboard framework can be cross-compiled for use on iOS with minimal to no changes.
Having developed in Unity in the past, I can tell you that working with the camera is not usually an area of enjoyment, but in the case of Cardboard, camera placement is quite literally everything. Unlike conventional PC or console games, where the player can easily move a first- or third-person camera, Cardboard has very few input methods aside from tilting and turning your head. Because of this, we carefully considered the position and context of the camera with respect to the rest of the scene. Google's provided stereoscopic camera prefab worked flawlessly and handled pitch-roll-yaw head tracking very smoothly.
Our 8-hour coding bash was a ton of fun. We worked together to build a 3D audio visualization tool, similar to the iTunes audio visualizer, but placed in a VR environment. Using Unity's built-in audio spectrum analysis, our project spins a series of propellers at speeds relative to the amplitude of the frequency band each propeller represents. With the Unity engine providing all necessary audio processing at runtime, the entire project was written with only 45 lines of C# in the main script, not including the Cardboard SDK scripts.
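To give a sense of how little code this takes, here is a minimal sketch of that idea. It is not our actual hack-day script; the class, field, and parameter names are illustrative, and it assumes an AudioSource and an array of propeller Transforms wired up in the Unity inspector. The only real API it leans on is Unity's AudioSource.GetSpectrumData, which fills an array of frequency-bin amplitudes from the currently playing audio.

```csharp
using UnityEngine;

// Hypothetical sketch of the propeller visualizer described above.
// Assumes the AudioSource and propeller Transforms are assigned
// in the Unity inspector.
public class PropellerVisualizer : MonoBehaviour
{
    public AudioSource audioSource;   // plays the track being visualized
    public Transform[] propellers;    // one propeller per frequency band
    public float maxSpinSpeed = 720f; // degrees per second at full amplitude

    private float[] spectrum = new float[256]; // size must be a power of two

    void Update()
    {
        // Unity samples the playing audio into frequency bins at runtime.
        audioSource.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);

        // Split the bins evenly across the propellers and spin each one
        // in proportion to the summed amplitude of its band.
        int binsPerPropeller = spectrum.Length / propellers.Length;
        for (int i = 0; i < propellers.Length; i++)
        {
            float amplitude = 0f;
            for (int j = 0; j < binsPerPropeller; j++)
            {
                amplitude += spectrum[i * binsPerPropeller + j];
            }
            propellers[i].Rotate(Vector3.forward,
                amplitude * maxSpinSpeed * Time.deltaTime);
        }
    }
}
```

Because GetSpectrumData runs every frame with no extra setup, all of the "audio processing" really is just reading that array and mapping amplitudes onto rotation speeds.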
Additionally, Cardboard allows for full pitch-roll-yaw camera rotation right out of the box, another feature not found in a conventional first-person shooter. As a developer or game artist, you must be more conscious of the environment around the player, who will likely spend plenty of time exploring the world behind, above, and below them. It seems as though the Cardboard platform encourages this with applications that drive users to naturally explore as much of the world as possible.
The trickiest part of developing a VR application is creating a compelling, creative environment for your audience to explore. Rather than simulating a keyboard-and-mouse first-person camera, you must treat a VR project as a 3D stage that offers a much richer experience. Modifying our Unity project to support the Cardboard SDK was trivial, and testing and deploying to a device was a snap. Overall, my first experience working with Cardboard was quite pleasant. It allowed us to quickly and easily build a VR environment and taught us the building blocks for future personal and professional VR projects.
References:
http://docs.unity3d.com/ScriptReference/AudioSource.GetSpectrumData.html
http://answers.unity3d.com/questions/157940/getoutputdata-and-getspectrumdata-they-represent-t.html
http://answers.unity3d.com/questions/382595/getspectrumdata-differance-between-pc-and-android.html
Stay up to date with the latest mobile app dev news, technologies and events with Atomic Robot.