Lesson 137. Sensors. Acceleration, orientation.


In this lesson:

– read data from sensors

In this lesson I will try to summarize what I have learned about sensors: how to get data from them and what you can do with that data.

Android supports several types of sensors. At the time of writing, the documentation lists 13 such types. In this lesson we will look at the light, acceleration, gravity, and magnetic field sensors.

Getting sensor data is easy. We ask the system for a Sensor object and attach a listener to it; an array of data arrives in the listener method.

List of sensors. Light sensor.

The first application will show us a list of available sensors and data from the light sensor.

Let’s create a project:

Project name: P1371_Sensors
Build Target: Android 2.3.3
Application name: Sensors
Package name: ru.startandroid.develop.p1371sensors
Create Activity: MainActivity

In strings.xml, add the strings:

List
Light

screen main.xml:

Something like this will do (the original markup; the id, the onClick method names and the string references must match the Activity code and strings.xml):

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical">

    <Button
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:onClick="onClickSensList"
        android:text="@string/list" />

    <Button
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:onClick="onClickSensLight"
        android:text="@string/light" />

    <TextView
        android:id="@+id/tvText"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content" />

</LinearLayout>
There are buttons to get the list of sensors and the light sensor data, and a TextView to display the results.

MainActivity.java:

package ru.startandroid.develop.p1371sensors;

import java.util.List;

import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.view.View;
import android.widget.TextView;

public class MainActivity extends Activity {

  TextView tvText;
  SensorManager sensorManager;
  List<Sensor> sensors;
  Sensor sensorLight;
  
  @Override
  protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.main);
    tvText = (TextView) findViewById(R.id.tvText);
    sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
    sensors = sensorManager.getSensorList(Sensor.TYPE_ALL);
    sensorLight = sensorManager.getDefaultSensor(Sensor.TYPE_LIGHT);
  }

  public void onClickSensList(View v) {
    
    sensorManager.unregisterListener(listenerLight, sensorLight);
    StringBuilder sb = new StringBuilder();
    
    for (Sensor sensor : sensors) {
      sb.append("name = ").append(sensor.getName())
      .append(", type = ").append(sensor.getType())
      .append("\nvendor = ").append(sensor.getVendor())
      .append(", version = ").append(sensor.getVersion())
      .append("\nmax = ").append(sensor.getMaximumRange())
      .append(", resolution = ").append(sensor.getResolution())
      .append("\n--------------------------------------\n");
    }
    tvText.setText(sb);
  }
  
  public void onClickSensLight(View v) {
    sensorManager.registerListener(listenerLight, sensorLight, 
        SensorManager.SENSOR_DELAY_NORMAL);
  }
  
  @Override
  protected void onPause() {
    super.onPause();
    sensorManager.unregisterListener(listenerLight, sensorLight);
  }
  
  SensorEventListener listenerLight = new SensorEventListener() {

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
      tvText.setText(String.valueOf(event.values[0]));
    }
  };
  
}

In onCreate we get the SensorManager. We request the complete list of sensors by calling the getSensorList method and passing it the sensor type TYPE_ALL.

To get a specific sensor (Sensor) we call the getDefaultSensor method, passing TYPE_LIGHT to receive the light sensor. Be careful here: if the device has no such sensor, the method returns null.

In onClickSensList we unsubscribe from the light sensor; more on that a little later.

Next, we walk through the list of sensors and display information about each one on screen:
getName – name
getType – type
getVendor – creator
getVersion – version
getMaximumRange – the maximum value that the sensor can return
getResolution – as far as I understand, this is the minimum step that the value can change

In the onClickSensLight method we use registerListener to hang the listenerLight listener on the previously obtained sensorLight sensor. The third parameter of the method is the rate of data delivery, i.e. how often new data should arrive from the sensor. There are four rates, from slowest to fastest: SENSOR_DELAY_NORMAL, SENSOR_DELAY_UI, SENSOR_DELAY_GAME, SENSOR_DELAY_FASTEST.

That said, the documentation notes that the system may ignore this value and deliver data at whatever rate it finds convenient. And starting with API Level 9, you can pass your own delay in microseconds instead of a rate constant. Don't confuse it with milliseconds.

In onPause we unsubscribe from the light sensor. As always, it is recommended to unsubscribe as soon as you no longer need the data, so as not to drain the battery.

listenerLight – the listener; it implements the SensorEventListener interface and has two methods:

onAccuracyChanged – called when the accuracy of the sensor data changes, and also at the start of data acquisition. It gives us the Sensor object and the accuracy level:

SENSOR_STATUS_ACCURACY_HIGH – The highest possible accuracy
SENSOR_STATUS_ACCURACY_MEDIUM – Medium accuracy, calibration could improve result
SENSOR_STATUS_ACCURACY_LOW – Low accuracy, calibration required
SENSOR_STATUS_UNRELIABLE – Sensor data cannot be trusted at all. Either calibration is required or the data cannot be read.

onSensorChanged – this is where we get the sensor data from the SensorEvent object.

Save everything and launch the application. Press List and we get the list.

On my device it looks like this:

The screen shows that the device has several sensors of the same type. If instead of TYPE_ALL we pass some specific sensor type to the getSensorList method, we get a list of sensors of that type only.

Now press Light. The app shows the current illumination value. Try changing the light level near the device; the value should change.

In my dark room it shows 0. If you take a flashlight and bring it toward the light sensor from a distance, it shows, in sequence: 10, 100, 1000, 10000 and finally 30000. Yet the sensor list screen shows that its maximum value = 3000, and the step (if I understood the resolution parameter correctly) = 1. I don't know why this information disagrees with reality.

acceleration

Next, let's consider the motion sensors. For this we need to understand that the device has three axes in space. The official documentation has a picture illustrating them.

That is, if you hold the device in front of you, the X axis goes from left to right, the Y axis from bottom to top, and the Z axis points out of the screen toward you. The acceleration sensor returns an array of three values, one per axis.
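As a quick sanity check of this axes model: at rest the accelerometer measures only gravity, so the length of the (x, y, z) vector stays near 9.8 m/s² no matter how the device is tilted. A small standalone sketch (plain Java, no Android needed; the sample readings are idealized):

```java
// At rest the accelerometer measures only gravity, so the vector length
// of (x, y, z) is about 9.8 regardless of the device's tilt.
public class Magnitude {
    static float magnitude(float[] v) {
        return (float) Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
    }

    public static void main(String[] args) {
        float[] flatOnTable = {0f, 0f, 9.8f};   // Z axis vertical
        float[] heldUpright = {0f, 9.8f, 0f};   // Y axis vertical
        System.out.println(magnitude(flatOnTable)); // 9.8
        System.out.println(magnitude(heldUpright)); // 9.8
    }
}
```

A magnitude noticeably above or below 9.8 means the device is actually accelerating, not just tilted.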

Let’s create a project:

Project name: P1372_Acceleration
Build Target: Android 2.3.3
Application name: Acceleration
Package name: ru.startandroid.develop.p1372acceleration
Create Activity: MainActivity

screen main.xml:

The markup here is simple - just a TextView for the output, something like this:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical">

    <TextView
        android:id="@+id/tvText"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content" />

</LinearLayout>

MainActivity.java:

package ru.startandroid.develop.p1372acceleration;

import java.util.Timer;
import java.util.TimerTask;

import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.widget.TextView;

public class MainActivity extends Activity {

  TextView tvText;
  SensorManager sensorManager;
  Sensor sensorAccel;
  Sensor sensorLinAccel;
  Sensor sensorGravity;

  StringBuilder sb = new StringBuilder();

  Timer timer;

  @Override
  protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.main);
    tvText = (TextView) findViewById(R.id.tvText);
    sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
    sensorAccel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
    sensorLinAccel = sensorManager
        .getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION);
    sensorGravity = sensorManager.getDefaultSensor(Sensor.TYPE_GRAVITY);

  }

  @Override
  protected void onResume() {
    super.onResume();
    sensorManager.registerListener(listener, sensorAccel,
        SensorManager.SENSOR_DELAY_NORMAL);
    sensorManager.registerListener(listener, sensorLinAccel,
        SensorManager.SENSOR_DELAY_NORMAL);
    sensorManager.registerListener(listener, sensorGravity,
        SensorManager.SENSOR_DELAY_NORMAL);

    timer = new Timer();
    TimerTask task = new TimerTask() {
      @Override
      public void run() {
        runOnUiThread(new Runnable() {
          @Override
          public void run() {
            showInfo();
          }
        });
      }
    };
    timer.schedule(task, 0, 400);
  }

  @Override
  protected void onPause() {
    super.onPause();
    sensorManager.unregisterListener(listener);
    timer.cancel();
  }

  String format(float values[]) {
    return String.format("%1$.1f\t\t%2$.1f\t\t%3$.1f", values[0], values[1],
        values[2]);
  }

  void showInfo() {
    sb.setLength(0);
    sb.append("Accelerometer: " + format(valuesAccel))
        .append("\n\nAccel motion: " + format(valuesAccelMotion))
        .append("\nAccel gravity : " + format(valuesAccelGravity))
        .append("\n\nLin accel : " + format(valuesLinAccel))
        .append("\nGravity : " + format(valuesGravity));
    tvText.setText(sb);
  }

  float[] valuesAccel = new float[3];
  float[] valuesAccelMotion = new float[3];
  float[] valuesAccelGravity = new float[3];
  float[] valuesLinAccel = new float[3];
  float[] valuesGravity = new float[3];

  SensorEventListener listener = new SensorEventListener() {

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
      switch (event.sensor.getType()) {
      case Sensor.TYPE_ACCELEROMETER:
        for (int i = 0; i < 3; i++) {
          valuesAccel[i] = event.values[i];
          valuesAccelGravity[i] = (float) (0.1 * event.values[i] + 0.9 * valuesAccelGravity[i]);
          valuesAccelMotion[i] = event.values[i]
              - valuesAccelGravity[i];
        }
        break;
      case Sensor.TYPE_LINEAR_ACCELERATION:
        for (int i = 0; i < 3; i++) {
          valuesLinAccel[i] = event.values[i];
        }
        break;
      case Sensor.TYPE_GRAVITY:
        for (int i = 0; i < 3; i++) {
          valuesGravity[i] = event.values[i];
        }
        break;
      }

    }

  };

}

In onCreate we get three sensors:

TYPE_ACCELEROMETER - acceleration, including gravity (the familiar 9.8 m/s² from physics)

TYPE_LINEAR_ACCELERATION - pure acceleration, without gravity

TYPE_GRAVITY - gravity

In onResume we register one listener on all three sensors, and start a timer that will display the data in the TextView every 400 ms.

In onPause we unsubscribe from all sensors by calling unregisterListener without specifying a particular sensor, and cancel the timer.

The format method simply formats a float value to one decimal place.
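For illustration, here is the same helper as a standalone snippet with the tab escapes written out. Locale.US is my addition (the lesson's version uses the default locale, which may print a comma as the decimal separator):

```java
import java.util.Locale;

// Pads three floats to one decimal place, separated by double tabs.
// Locale.US pins the decimal separator to a dot.
public class FormatDemo {
    static String format(float[] values) {
        return String.format(Locale.US, "%1$.1f\t\t%2$.1f\t\t%3$.1f",
                values[0], values[1], values[2]);
    }

    public static void main(String[] args) {
        System.out.println(format(new float[]{0.04f, -0.12f, 9.81f}));
        // 0.0		-0.1		9.8
    }
}
```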

showInfo outputs the data to the TextView. The data lives in five arrays.

In the listener's onSensorChanged method we determine the sensor type and write the data into the corresponding arrays:

valuesAccel - acceleration sensor data (including gravity)

valuesAccelMotion and valuesAccelGravity - the valuesAccel data, separated by a computational filter into pure acceleration (without gravity) and gravity.

valuesLinAccel - data from the acceleration sensor without gravity

valuesGravity - gravity sensor data

That is, we get acceleration data (valuesAccel) from the TYPE_ACCELEROMETER sensor and then split it ourselves with a computational filter into pure acceleration and gravity. But you can skip the bother and use the TYPE_LINEAR_ACCELERATION and TYPE_GRAVITY sensors instead, which should give roughly the same result.
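The split above is just an exponential low-pass filter. A standalone sketch of the idea (plain Java, same 0.9/0.1 coefficients as in the listing):

```java
// Exponential low-pass filter: gravity follows the slow component of the
// accelerometer signal; subtracting it leaves the quick "motion" part.
public class GravityFilter {
    static final float ALPHA = 0.9f; // same smoothing factor as in the lesson

    float[] gravity = new float[3];
    float[] motion = new float[3];

    void onReading(float[] values) {
        for (int i = 0; i < 3; i++) {
            gravity[i] = ALPHA * gravity[i] + (1 - ALPHA) * values[i];
            motion[i] = values[i] - gravity[i];
        }
    }

    public static void main(String[] args) {
        GravityFilter f = new GravityFilter();
        // Feed a constant reading: gravity converges to it, motion to zero.
        for (int n = 0; n < 200; n++) f.onReading(new float[]{0f, 0f, 9.8f});
        System.out.printf("gravity z = %.2f, motion z = %.2f%n",
                f.gravity[2], f.motion[2]); // gravity z = 9.80, motion z = 0.00
    }
}
```

Raising ALPHA makes the gravity estimate smoother but slower to react; lowering it lets more real motion leak into the gravity channel.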

By the way, note how I read the data into my arrays: I copy the values element by element rather than keep the array. Why not a simple assignment like valuesAccel = event.values? Because that way you can occasionally get corrupted data when reading several sensors. The system appears to keep a pool of event objects, so that the garbage collector does not have to cope with a wild number of new objects per unit of time. If you only keep a reference to such an object, then by the time you get around to processing it, the system may have put it back into circulation and written new values into it, possibly even from another sensor. So it is better to copy the values than to keep the reference.
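The aliasing problem is easy to reproduce in plain Java (a toy model, not the real SensorEvent):

```java
import java.util.Arrays;

// If the system reuses one array for successive events, keeping a reference
// instead of copying means your "old" data silently changes underneath you.
public class CopyVsReference {
    public static void main(String[] args) {
        float[] shared = {1f, 2f, 3f};   // stands in for event.values

        float[] byReference = shared;    // just an alias
        float[] byCopy = shared.clone(); // an independent snapshot

        // The system delivers the next event into the same array:
        shared[0] = 99f;

        System.out.println(Arrays.toString(byReference)); // [99.0, 2.0, 3.0]
        System.out.println(Arrays.toString(byCopy));      // [1.0, 2.0, 3.0]
    }
}
```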

Let's start the program and lay the device flat on the table, screen up.

Let's walk through what is displayed on the screen.

Accelerometer: acceleration data plus gravity. We see that the third axis (Z), which points vertically upward while the device lies flat, shows an acceleration approximately equal to gravity. That is, even at rest the sensor reports not pure acceleration but acceleration plus gravity, which is not always what you want.

We used a filter to separate the acceleration from gravity.

Accel motion: the pure acceleration computed from the acceleration-with-gravity data. All zeros here, because the device is lying still.

Accel gravity: gravity computed from the accelerometer data. The first two axes are zero, because they run parallel to the ground and there is no gravity component along them. But it shows up on the third axis, which runs vertically. Simply put, the planet does not pull us left, right, forward or backward; it pulls us down. Therefore the 9.8 acceleration lands on whichever axis is vertical.

Lin accel: data from the pure (gravity-free) acceleration sensor. All zeros, because the device is at rest. These values should roughly match what we computed in Accel motion.

Gravity: gravity sensor data. The third axis shows that it is vertical, since gravity on it is near its maximum. These values should match what we computed in Accel gravity.

You can move the device around in this position, accelerating it in different directions, and watch the axis values change. It is not very illustrative, though. If you plotted these values on a graph, their changes over time would certainly be easier to see.

Now I take the device in my hands and lift it in front of me to eye level, screen facing me - exactly as in the axes picture earlier in the text.

Let's see how the data has changed.

Accelerometer: it is clear that the second axis (Y) is now almost vertical, with gravity at 9.8 on it. The X and Z axes are close to zero. They are not exactly zero because I am not holding the device perfectly straight, and slight tilts put some gravity on those axes too.

So we can draw this conclusion: the closer an axis value is to 9.8, the more vertical that axis is in space; the closer to zero, the more horizontal.

Accel motion and Lin accel show us pure acceleration. It is close to zero, because I am trying not to jerk the device.

Accel gravity and Gravity show that almost all of gravity falls along the second axis, which means that axis is vertical.

Try tilting the device in different directions and watch how the gravity values change. As an axis moves from horizontal to vertical, the sensor value along it changes from 0 to 9.8.

orientation

Now let's try combining the acceleration sensor data with magnetic field sensor data. Processed together, these two datasets give us the tilt angles of the device: three angles, one per axis.

Let's create a project:

Project name: P1373_Orientation
Build Target: Android 2.3.3
Application name: Orientation
Package name: ru.startandroid.develop.p1373orientation
Create Activity: MainActivity

screen main.xml:

As in the previous project, just a TextView for the output, something like this:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical">

    <TextView
        android:id="@+id/tvText"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content" />

</LinearLayout>

MainActivity.java:

package ru.startandroid.develop.p1373orientation;

import java.util.Timer;
import java.util.TimerTask;

import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.view.Display;
import android.view.Surface;
import android.view.WindowManager;
import android.widget.TextView;

public class MainActivity extends Activity {

  TextView tvText;
  SensorManager sensorManager;
  Sensor sensorAccel;
  Sensor sensorMagnet;
  
  StringBuilder sb = new StringBuilder();
  
  Timer timer;
  
  int rotation;
  
  @Override
  protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.main);
    tvText = (TextView) findViewById(R.id.tvText);
    sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
    sensorAccel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
    sensorMagnet = sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
  }
  
  @Override
  protected void onResume() {
    super.onResume();
    sensorManager.registerListener(listener, sensorAccel, SensorManager.SENSOR_DELAY_NORMAL);
    sensorManager.registerListener(listener, sensorMagnet, SensorManager.SENSOR_DELAY_NORMAL);
    
    timer = new Timer();
    TimerTask task = new TimerTask() {
      @Override
      public void run() {
        runOnUiThread(new Runnable() {
          @Override
          public void run() {
            getDeviceOrientation();
            getActualDeviceOrientation();
            showInfo();
          }
        });
      }
    };
    timer.schedule(task, 0, 400);
    
    WindowManager windowManager = ((WindowManager) getSystemService(Context.WINDOW_SERVICE));
    Display display = windowManager.getDefaultDisplay();
    rotation = display.getRotation();

  }

  @Override
  protected void onPause() {
    super.onPause();
    sensorManager.unregisterListener(listener);
    timer.cancel();
  }
  
  String format(float values[]) {
    return String.format("%1$.1f\t\t%2$.1f\t\t%3$.1f", values[0], values[1], values[2]);
  }
  
  void showInfo() {
    sb.setLength(0);
    sb.append("Orientation : " + format(valuesResult))
    .append("\nOrientation 2: " + format(valuesResult2))
    ;
    tvText.setText(sb);
  }
  
  float[] r = new float[9];
  
  void getDeviceOrientation() {
    SensorManager.getRotationMatrix(r, null, valuesAccel, valuesMagnet);
    SensorManager.getOrientation(r, valuesResult);

    valuesResult[0] = (float) Math.toDegrees(valuesResult[0]);
    valuesResult[1] = (float) Math.toDegrees(valuesResult[1]);
    valuesResult[2] = (float) Math.toDegrees(valuesResult[2]);
    return;
  }
  
  float[] inR = new float[9];
  float[] outR = new float[9];
  
  void getActualDeviceOrientation() {
    SensorManager.getRotationMatrix(inR, null, valuesAccel, valuesMagnet);
    int x_axis = SensorManager.AXIS_X;
    int y_axis = SensorManager.AXIS_Y;
    switch (rotation) {
    case Surface.ROTATION_0:
      break;
    case Surface.ROTATION_90:
      x_axis = SensorManager.AXIS_Y;
      y_axis = SensorManager.AXIS_MINUS_X;
      break;
    case Surface.ROTATION_180:
      y_axis = SensorManager.AXIS_MINUS_Y;
      break;
    case Surface.ROTATION_270:
      x_axis = SensorManager.AXIS_MINUS_Y;
      y_axis = SensorManager.AXIS_X;
      break;
    default:
      break;
    }
    SensorManager.remapCoordinateSystem(inR, x_axis, y_axis, outR);
    SensorManager.getOrientation(outR, valuesResult2);
    valuesResult2[0] = (float) Math.toDegrees(valuesResult2[0]); 
    valuesResult2[1] = (float) Math.toDegrees(valuesResult2[1]); 
    valuesResult2[2] = (float) Math.toDegrees(valuesResult2[2]); 
    return;
  }  
  
  float[] valuesAccel = new float[3];
  float[] valuesMagnet = new float[3];
  float[] valuesResult = new float[3];
  float[] valuesResult2 = new float[3];
  
  
  SensorEventListener listener = new SensorEventListener() {

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
      switch (event.sensor.getType()) {
      case Sensor.TYPE_ACCELEROMETER:
        for (int i=0; i < 3; i++){
          valuesAccel[i] = event.values[i];
        }        
        break;
      case Sensor.TYPE_MAGNETIC_FIELD:
        for (int i=0; i < 3; i++){
          valuesMagnet[i] = event.values[i];
        }  
        break;
      }
    }
  };
 
}

In onCreate we get the acceleration (TYPE_ACCELEROMETER) and magnetic field (TYPE_MAGNETIC_FIELD) sensors.

In onResume we attach the listener and start a timer that every 400 ms will determine the orientation of the device in space and display this info on screen. In the rotation variable we store the current screen rotation; we will need it to determine the device orientation correctly.

In onPause we detach the listener and cancel the timer.

The format method simply formats a float value to one decimal place.

showInfo displays the array data in the TextView. But first that data must be computed; that is what the next two methods do.

The getDeviceOrientation method determines the current orientation of the device in space without taking the screen rotation into account. First we call getRotationMatrix, which takes the acceleration and magnetic field data and builds a rotation matrix from them in the variable r. Then getOrientation extracts from this matrix an array of rotation values (in radians) for the three axes. It remains to convert radians to degrees with Math.toDegrees, and we have a ready array of device angles.
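For the curious, getOrientation essentially pulls the three angles out of the rotation matrix with atan2/asin. The formulas below are my own sketch of that math, shown for illustration only - in real code use SensorManager.getOrientation:

```java
// Extracts {azimuth, pitch, roll} in degrees from a 3x3 rotation matrix
// (row-major, as filled by getRotationMatrix). A sketch of the idea, not
// the platform implementation.
public class OrientationFromMatrix {
    static float[] anglesDeg(float[] r) {
        float azimuth = (float) Math.atan2(r[1], r[4]);
        float pitch   = (float) Math.asin(-r[7]);
        float roll    = (float) Math.atan2(-r[6], r[8]);
        return new float[] {
            (float) Math.toDegrees(azimuth),
            (float) Math.toDegrees(pitch),
            (float) Math.toDegrees(roll),
        };
    }

    public static void main(String[] args) {
        // Identity matrix = device flat on the table, top edge pointing north:
        // all three angles come out as zero.
        float[] identity = {1, 0, 0, 0, 1, 0, 0, 0, 1};
        float[] a = anglesDeg(identity);
        System.out.printf("%f %f %f%n", a[0], a[1], a[2]);
    }
}
```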

The getActualDeviceOrientation method is similar to getDeviceOrientation, but it also accounts for the screen rotation. For this we additionally call remapCoordinateSystem, which recomputes the matrix for us. Through the x_axis and y_axis variables we tell this method how the axes swapped places when the screen rotated.
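The idea behind the remapping can be illustrated at the level of a single vector (a hypothetical helper, not the real remapCoordinateSystem): for ROTATION_90 the activity's "X" axis is the sensor's Y axis, and its "Y" axis is the sensor's negative X axis.

```java
// For a 90-degree screen rotation: new X = device Y, new Y = -device X.
// Toy illustration of what AXIS_Y / AXIS_MINUS_X mean in the listing.
public class RemapDemo {
    static float[] remapRotation90(float[] v) {
        return new float[] { v[1], -v[0], v[2] };
    }

    public static void main(String[] args) {
        float[] deviceFrame = {1f, 2f, 3f};
        float[] screenFrame = remapRotation90(deviceFrame);
        System.out.println(screenFrame[0] + " " + screenFrame[1] + " "
                + screenFrame[2]); // 2.0 -1.0 3.0
    }
}
```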

The listener receives the acceleration and magnetic field data and writes it into the valuesAccel and valuesMagnet arrays.

Run the program and put the device on a flat surface.

Orientation: orientation data ignoring the device's screen rotation.
Orientation 2: orientation data taking the screen rotation into account. It equals the Orientation data when the screen is in its default rotation.

Here, unlike with acceleration, the axes come in a slightly different order. The first number is the angle around the Z axis. With the device horizontal, it shows the deviation from north - that is, a compass. Rotate the device horizontally until the first number is close to zero; the device is now pointing due north.
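Note that this azimuth comes out in the range -180..180 degrees, while a compass display usually wants 0..360. A small normalization helper (my own, not part of the API):

```java
// Maps an azimuth in degrees (any range) onto a 0..360 compass heading.
public class Compass {
    static float toCompassHeading(float azimuthDeg) {
        return (azimuthDeg % 360 + 360) % 360;
    }

    public static void main(String[] args) {
        System.out.println(toCompassHeading(-90f)); // 270.0 (west)
        System.out.println(toCompassHeading(45f));  // 45.0 (north-east)
    }
}
```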

The second number is the angle around the X axis. That is, if you pierced the device with a needle from left to right and rotated it around the needle, this second number would change. We won't pierce anything, of course. Just take the top edge of the device (the one away from you) and lift it up, as if you wanted a better look at the screen, leaving the bottom edge on the table. You will see the second number change. When the device stands vertically on its bottom edge, the value should be -90, a right angle. Also try lifting the bottom edge while leaving the top edge on the table: the angle will go up to 90.

The third number is the angle around the Y axis, analogous to the X axis. If you put the device on the table and start lifting its right edge, leaving the left edge on the table (as if turning a page), the third number will change; it shows the angle around the Y axis. Also try raising the left edge while leaving the right one on the table.

This gives us a complete picture of the device's position in space.

Try changing the screen orientation and locking it in the settings, then test the tilts again. You will see that the Orientation data is relative to the default screen orientation, while Orientation 2 is relative to the current one.

Orientation data can also be obtained without any of this work via the TYPE_ORIENTATION sensor, but it has been deprecated since API Level 8.

the rest

Some more information about sensors.

Sensors can be hardware (real) or virtual. Real sensors measure values directly; virtual ones take real sensors' values and compute their own from them. In our examples we used the real acceleration sensor (TYPE_ACCELEROMETER) and computed pure acceleration and gravity from it. The pure acceleration (TYPE_LINEAR_ACCELERATION) and gravity (TYPE_GRAVITY) sensors we used are virtual: they compute their results much as we did. Presumably their computation is more sophisticated than ours and gives more realistic results.

The gyroscope sensor (TYPE_GYROSCOPE) reports the rotation speed around the axes, in radians per second.
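For example, integrating that angular speed over time gives a rotation angle. A toy sketch with a fixed time step (real code would use the event timestamps instead of a constant dt):

```java
// The gyroscope reports angular speed in rad/s; summing omega * dt over
// time approximates the rotation angle.
public class GyroIntegration {
    public static void main(String[] args) {
        float angleRad = 0f;
        float dt = 0.02f;                  // 20 ms between readings (assumed)
        float omega = (float) Math.PI;     // half a turn per second
        for (int i = 0; i < 50; i++) {     // 50 * 20 ms = 1 second
            angleRad += omega * dt;
        }
        System.out.printf("%.1f degrees%n", Math.toDegrees(angleRad));
        // about 180 degrees after one second
    }
}
```

In practice gyroscope integration drifts, which is why orientation is usually obtained by fusing it with the accelerometer and magnetometer, as the lesson does.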

A sensor also has a power characteristic - its current consumption in mA. The lower it is, the better for the battery, of course.

If your application absolutely requires a sensor and will not work without it, you can declare this in the manifest with the uses-feature tag. Google Play will then refuse to install the app on a device that lacks that sensor.

An example for the acceleration sensor:

<uses-feature
    android:name="android.hardware.sensor.accelerometer"
    android:required="true" />

The feature names for other sensors can be found in the documentation, in the Sensors section.

In the next lesson:

- we receive location data



