[SOUND] Almost all Android devices have built-in inertial sensors, which can sense the orientation and rotation of the device. We can use these sensors to alter the view of 3D virtual objects and create virtual reality effects. In this lesson, I'm going to show you how to use sensor inputs to change the view of 3D objects.

Let's get started by looking at what hardware sensors are available and how they work. Almost all Android devices are embedded with accelerometers and magnetometers, and most recent models have gyroscopes as well. Apart from these motion and position sensors, some Android devices have proximity, ambient light, humidity, pressure, and temperature sensors. The latest phones are even embedded with heart rate sensors. Can you think of apps that you use which rely on these sensors?

Accelerometers measure acceleration in three directions. Magnetometers, or geomagnetic sensors, measure the Earth's magnetic field. Gyroscopes measure rotational velocity. From the magnetometer, accelerometer, and gyroscope, different sensing measurements can be derived, such as gravity, step count, linear acceleration, rotation, orientation, and so on. More details on the sensor types can be found on the Android developer's website. For building a virtual reality application, we mainly use the rotation vector.

The Android emulator provides virtual sensors that allow you to simulate different rotations, motions, or sensor readings. To change the reading of the proximity sensor, for instance, click on the advanced sensors tab and adjust the value accordingly. When you manipulate the device orientation, you can see the corresponding accelerometer, gyroscope, and magnetometer readings. You can also switch the orientation of the device by clicking on the corresponding device rotation button.

For developing a virtual reality application, we need to measure the azimuth or yaw angle, which is the rotation around the z-axis; the roll angle, which is the rotation around the y-axis; and the pitch angle, which is the rotation around the x-axis. As you know, Android apps can be displayed in portrait or landscape mode, and the relative rotation axes differ between the two. As such, we need to align the axes according to the orientation of the device. I'll go through this in more detail later.

Let's modify our program to find out which sensors are available on our device. First, we modify the MainActivity class to implement the SensorEventListener interface for handling sensor events. Then we define a SensorManager to manage the sensor interface, an mRotationSensor to sense the device rotation, and a WindowManager to read the device orientation. In the onCreate function, I then set the WindowManager and the SensorManager by calling the functions getWindowManager and getSystemService respectively. I then call the function getSensorList with parameter Sensor.TYPE_ALL and create a for loop to list all the available sensors. I've also created the mRotationSensor by calling the function getDefaultSensor with parameter Sensor.TYPE_ROTATION_VECTOR. We use the rotation vector sensor to capture the rotation angles of our device. In addition, the SensorEventListener interface requires our class to handle the accuracy-changed and sensor-changed events, and we also handle the activity's resume and pause events, so we need to define these callback functions.
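As a reference while you follow along, here is a minimal sketch of how that setup might look. It assumes a plain Activity, a "SENSORS" log tag, and the SENSOR_DELAY_GAME sampling rate, none of which are specified in the lesson; your project's MainActivity will also contain the setContentView call and the MyView wiring from the earlier lessons, which are omitted here.

    import android.app.Activity;
    import android.content.Context;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.os.Bundle;
    import android.util.Log;
    import android.view.WindowManager;
    import java.util.List;

    public class MainActivity extends Activity implements SensorEventListener {
        private SensorManager mSensorManager;
        private Sensor mRotationSensor;
        private WindowManager mWindowManager;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            // Get the window manager (for the display rotation) and the sensor service.
            mWindowManager = getWindowManager();
            mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);

            // List every sensor the device exposes, so we can inspect them in logcat.
            List<Sensor> deviceSensors = mSensorManager.getSensorList(Sensor.TYPE_ALL);
            for (Sensor sensor : deviceSensors) {
                Log.d("SENSORS", sensor.getName());
            }

            // The rotation vector sensor gives us the device orientation for the VR view.
            mRotationSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
        }

        @Override
        protected void onResume() {
            super.onResume();
            // Start receiving rotation vector updates while the activity is visible.
            mSensorManager.registerListener(this, mRotationSensor, SensorManager.SENSOR_DELAY_GAME);
        }

        @Override
        protected void onPause() {
            super.onPause();
            // Stop the updates when the activity leaves the foreground.
            mSensorManager.unregisterListener(this);
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) {
            // Called when the sensor's accuracy setting changes; nothing to do here.
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            // Called for every new reading; the rotation handling is added later.
        }
    }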
The onAccuracyChanged function will be called when the accuracy setting of the sensor changes. The onResume function is called when the activity resumes, which is where the sensor is activated: I call the registerListener function of the mSensorManager to handle the events from the mRotationSensor. The onPause function is called when the activity is paused, so I call the mSensorManager's unregisterListener function to stop receiving updates of the sensor readings. The onSensorChanged function is called whenever a new reading is received.

>> So let's see how it works. In the MainActivity class I added implements SensorEventListener, and then added the mSensorManager, mRotationSensor, and mWindowManager as variables. Then, in the onCreate function, I added the code to create the mSensorManager, mWindowManager, and mRotationSensor. I also added the variable deviceSensors as a list of sensors, and created a for loop to list out all the sensors. I also need to create the functions onAccuracyChanged, onResume, onPause, and onSensorChanged. When I run the program and choose the emulator as my device, and search for the sensors in the logcat, you see the list of sensors available on the emulator. I run the program again, but this time I choose to run on my Samsung device and set the logcat to show the messages from my Samsung device. You'll see that my Samsung device has a different set of sensors compared to the emulator: it has an accelerometer, gyroscope, and magnetometer, and also other newer sensors such as the heart rate sensor.

To get the device rotation for a virtual reality app, I need to modify the onSensorChanged function to handle the events sent by the mRotationSensor and, whenever a new reading is received, call the updateOrientation function. Then I define the updateOrientation function. First, we need to get the rotation matrix from the rotation vector received from the sensor. We can do that by calling the function getRotationMatrixFromVector. We then need to map the axes according to the device's orientation in order to find the actual rotation angles. To do that, I check the orientation of the device by calling the mWindowManager's getRotation function, and then map the device-relative x and y axes to the actual world axes. For instance, when the device is rotated 90 degrees, the z-axis should become the x-axis and the x-axis should become the new y-axis. Then I remap the rotation matrix to the new coordinate system using the remapCoordinateSystem function and store the result in adjustedRotationMatrix. I call the getOrientation function to get the orientation from the adjustedRotationMatrix. The orientation array will then hold the yaw, pitch, and roll angles in radians. To rotate a 3D virtual object, I have to convert these angles from radians into degrees. The updated yaw, pitch, and roll angles are then sent to the MyView object to rotate our virtual object. If you open the MyView object, you can see that the sensorRotate function basically calls the renderer to rotate the object around the x, y, and z-axes, just like the touch control example I showed you in the previous module. Let's look at coding this in the example program. Why don't you try and code along with me?

>> First, I need to add the registerListener call in the onResume function, and then modify the onSensorChanged function to handle the mRotationSensor events and call the updateOrientation function whenever a new sensor reading is received.
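Here is a minimal sketch of how onSensorChanged and updateOrientation might look inside MainActivity. The field name mMyView and the sensorRotate(yaw, pitch, roll) signature are assumptions based on how MyView is described in the lesson; the per-rotation axis mapping shown in the switch statement is one common choice, and android.view.Surface needs to be imported.

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Only handle events coming from the rotation vector sensor.
        if (event.sensor == mRotationSensor) {
            updateOrientation(event.values);
        }
    }

    private void updateOrientation(float[] rotationVector) {
        // Convert the rotation vector into a 3x3 rotation matrix.
        float[] rotationMatrix = new float[9];
        SensorManager.getRotationMatrixFromVector(rotationMatrix, rotationVector);

        // Remap the device axes to world axes according to the current display rotation.
        int worldAxisX;
        int worldAxisY;
        switch (mWindowManager.getDefaultDisplay().getRotation()) {
            case Surface.ROTATION_90:
                worldAxisX = SensorManager.AXIS_Z;
                worldAxisY = SensorManager.AXIS_MINUS_X;
                break;
            case Surface.ROTATION_180:
                worldAxisX = SensorManager.AXIS_MINUS_X;
                worldAxisY = SensorManager.AXIS_MINUS_Z;
                break;
            case Surface.ROTATION_270:
                worldAxisX = SensorManager.AXIS_MINUS_Z;
                worldAxisY = SensorManager.AXIS_X;
                break;
            case Surface.ROTATION_0:
            default:
                worldAxisX = SensorManager.AXIS_X;
                worldAxisY = SensorManager.AXIS_Z;
                break;
        }
        float[] adjustedRotationMatrix = new float[9];
        SensorManager.remapCoordinateSystem(rotationMatrix, worldAxisX, worldAxisY, adjustedRotationMatrix);

        // Extract yaw (azimuth), pitch, and roll in radians, then convert to degrees.
        float[] orientation = new float[3];
        SensorManager.getOrientation(adjustedRotationMatrix, orientation);
        float yaw = (float) Math.toDegrees(orientation[0]);
        float pitch = (float) Math.toDegrees(orientation[1]);
        float roll = (float) Math.toDegrees(orientation[2]);

        // Forward the angles to the view so the renderer can rotate the 3D object
        // (mMyView and sensorRotate are assumed names from the earlier lessons).
        mMyView.sensorRotate(yaw, pitch, roll);
    }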
And I define the updateOrientation function. First, I need to get the rotation matrix from the rotation vector, then remap the rotation matrix according to the orientation of the device, and then get the orientation from the adjusted rotation matrix. That gives me the yaw, pitch, and roll angles, and then I can run the program. When I run the program, you will see the image of the Earth, and I will need to use the virtual sensors in order to rotate the device. As you can see, when I rotate the virtual phone using the virtual sensors on the emulator, the 3D Earth, the 3D virtual object we created, rotates accordingly. [MUSIC]