
What do smartphone sensors sense all day?

Ian Chen and Jim Steele, Sensor Platforms Inc. - December 18, 2012

As noted in our October article in EDN, each of the leading mobile operating systems offers its own approach to how apps work with sensors when the device is in standby mode. Android allows any developer to install a background process that monitors sensor data. Windows 8 requires all sensors to be powered down. And iOS takes the middle road, allowing communications, streaming media, navigation, and any app developed by Apple to run during standby.
 
A fair question to ask, then, is what capabilities Apple might be preserving for its own designers, and what Windows 8 users might miss.

Data Collection
To answer that question, we decided to collect some real-world examples and find out what the sensors would see if they were active while the phone is in standby. Sensor Platforms created an Android app that continuously logs inertial motion sensor samples. Along with each sample, we also collected all the information we could find on a user's smartphone that might give us additional data on that user and his situation or environment. The additional data includes location and other sensor and system information (see Table 1 below).
Table 1. Smartphone logging scheme


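As a rough illustration of such a logging scheme, the sketch below shows one plausible per-sample record. The field names and types are our assumptions for illustration, not the actual Table 1 schema.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class SensorLogRecord:
    """One logged sample. Fields are illustrative assumptions, not the real schema."""
    timestamp_ms: int                           # sample time
    accel: Tuple[float, float, float]           # accelerometer, m/s^2
    mag: Tuple[float, float, float]             # magnetometer, microtesla
    gyro: Optional[Tuple[float, float, float]]  # rad/s; None when disabled to save power
    location: Optional[Tuple[float, float]]     # (lat, lon) fix, when available
    screen_on: bool                             # is the user interacting with the phone?
    ambient_light_lux: Optional[float]          # ambient light level
```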
Our first implementation sampled GPS every second, but we quickly realized that taking GPS fixes that frequently drains the battery in only three hours. So we had to exclude active GPS from this data collection.

We then distributed our app to colleagues, family members and friends who graciously allowed the app to keep running in the background while they went about their normal activities. The app then collected their data and uploaded it to our server periodically over the Internet.

We soon found that we had to stop taking gyroscope data for some subjects, because their phones could not last the whole day on a single charge with the gyroscope always active.

Then we analyzed the collected data and reviewed the findings.

Identifying Periods of Activity
The motivation behind our experiment is to determine how, using information available on a smartphone, we can figure out what the user is doing. The first thing we need to know is when the user is doing something "noteworthy." When a user is sitting down, it is noteworthy when he gets up and walks away. When a user is walking, it is noteworthy when he stops and sits down or gets into a car. Of course, it is also noteworthy when he is using his phone. Recording, tracking, and interpreting such events is necessary if the user is to get the full, cumulative benefit of all the sensors in his mobile device; that is, it allows the device to serve the user without the user having to spend time programming the device.

The sensors and system information we recorded give us some of the answers. For example, a change in location reported by the phone informs us that the user has moved, and the mobile operating system tells us when the user is interacting with his phone. However, this picture is incomplete. Frequent location updates drain the phone battery, and long gaps between updates can miss many activities. Operating systems may know which apps the user is running, but they cannot tell if the user is reading email at his desk or while driving his car. Other sensors can provide additional clues. For example, a change in ambient light could mean that the user moved from indoors to outdoors. But it also could mean the user took the phone out of his pocket.

Inertial motion sensors can provide a lot of information. After all, most activity that is noteworthy involves motion. And if the user has his smartphone with him, the sensors will record that motion. However, sensors capture more than just the user's motion. They also capture, among other things, gravity, environmental vibrations, magnetic interference, and movement of the vehicle in which the user is riding. The challenge is to figure out what reflects the user's activities and what is just noise.

To better understand what inertial motion sensors are telling us, we first look at the power spectral densities (PSDs) for the accelerometer, magnetometer, and gyroscope taken over a subject's waking day. A PSD reports the energy content the sensors recorded at each frequency. When a user is sitting still with the phone in his pocket, the measured energy is concentrated near zero frequency; when a user is moving or traveling in a vehicle, the sensor data contains higher-frequency content. Therefore, comparing these PSDs against the PSD of a completely stationary device can help us understand the characteristics of a user's movement throughout his day.
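As a rough sketch of how such a PSD can be computed offline from a 50 Hz accelerometer log (here using SciPy's Welch estimator; the article does not describe the authors' actual tooling):

```python
import numpy as np
from scipy.signal import welch

FS = 50.0  # sampling rate used in this study, Hz

def accel_psd(ax, ay, az):
    """PSD of the accelerometer magnitude over a recording.

    Using the magnitude removes dependence on phone orientation; subtracting
    the mean drops the DC component contributed by gravity.
    """
    mag = np.sqrt(np.asarray(ax)**2 + np.asarray(ay)**2 + np.asarray(az)**2)
    freqs, psd = welch(mag - mag.mean(), fs=FS, nperseg=512)
    return freqs, psd
```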

Figure 1(a) is a typical PSD for the accelerometer and is representative of the PSDs we have collected from all users. It can be viewed as three regimes: below 5 Hz, 5 to 10 Hz, and above 10 Hz. Below 5 Hz, the energy is highest and there are multiple frequency peaks. We see this because voluntary human hand motions rarely exceed 5 Hz; indeed, try moving your hand back and forth five times per second and you will tire quickly. The frequency peaks occur because much of a user's motion is repetitive. For example, when a person is walking, he is repeatedly taking steps and his arms are swinging at his sides.


Figure 1a. Accelerometer Power Spectral Density

Between 5 and 10 Hz, we see another band of energy. We can attribute this to involuntary muscle resonance and other background vibrations, like those from the motor of a vehicle.

Whether it is held in a hand, carried in a pocket, or resting on a dashboard, the phone is always cushioned when it is in motion. That explains why the power spectrum above 10 Hz shows the characteristic of a low-pass filter. It is worth noting that the energy above 20 Hz drops to the noise floor, which validates that sampling at 50 Hz captured all meaningful acceleration data.
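A sketch of how these three regimes could be quantified, by integrating the PSD from the previous sketch over each band (band edges taken from the discussion above):

```python
import numpy as np

def band_energies(freqs, psd):
    """Total energy in each of the three regimes discussed above."""
    df = freqs[1] - freqs[0]  # PSD bin width, Hz
    bands = {
        "voluntary motion (<5 Hz)": (0.0, 5.0),
        "resonance/vibration (5-10 Hz)": (5.0, 10.0),
        "cushioned (>10 Hz)": (10.0, 25.0),
    }
    energies = {}
    for name, (lo, hi) in bands.items():
        sel = (freqs >= lo) & (freqs < hi)
        energies[name] = psd[sel].sum() * df  # approximate integral over the band
    return energies
```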

Figure 1(b) is the PSD from the magnetometer of the same subject. The PSD is misleading where it shows spectral content increasing above 17 Hz. We chose to sample the sensors at 50 Hz so as not to overload the application processor. However, the resulting Nyquist frequency of 25 Hz is clearly insufficient to capture magnetic interference like that coming from AC-powered appliances, car engines, and alternators. Consequently, our data suffers from aliasing.

Figure 1b. Magnetometer Power Spectral Density
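A small numerical illustration of the aliasing effect described above; the 60 Hz interference frequency is our example, chosen to mimic AC mains, as the article does not identify the exact sources:

```python
import numpy as np

FS = 50.0        # sampling rate -> Nyquist frequency of 25 Hz
F_NOISE = 60.0   # illustrative interference frequency (e.g. AC mains), Hz

t = np.arange(0, 2.0, 1.0 / FS)            # two seconds sampled at 50 Hz
sampled = np.sin(2 * np.pi * F_NOISE * t)  # what the magnetometer log would contain

# A 60 Hz tone sampled at 50 Hz is sample-for-sample identical to a 10 Hz tone:
print(np.allclose(sampled, np.sin(2 * np.pi * 10.0 * t)))   # True
alias = abs(F_NOISE - round(F_NOISE / FS) * FS)
print(f"{F_NOISE:.0f} Hz interference shows up at {alias:.0f} Hz in the log")
```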

Looking at the PSD of the magnetometer alone could also be misleading. Imagine a user standing idly on a subway platform. The magnetometer on his smartphone senses magnetic field changes from the trains going by, from the field generated by the third rail, and even from elevators and vending machines operating around him. But none of these magnetic measurements could confirm that the user is merely standing in place.

For this article, we shall focus on changes in magnetic field characteristics that could be used to infer changes in the user's surroundings. Magnetic field measurements that differ significantly from the Earth's field may contain noteworthy information.
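A minimal sketch of such a test, assuming the Earth's field magnitude lies roughly between 25 and 65 microtesla; the range and margin here are our assumptions, not the thresholds used in the study:

```python
import numpy as np

EARTH_FIELD_UT = (25.0, 65.0)  # typical range of Earth-field magnitude, microtesla

def magnetic_anomaly(mx, my, mz, margin_ut=10.0):
    """Flag samples whose field magnitude is far from any plausible Earth field."""
    mag = np.sqrt(np.asarray(mx)**2 + np.asarray(my)**2 + np.asarray(mz)**2)
    lo, hi = EARTH_FIELD_UT
    return (mag < lo - margin_ut) | (mag > hi + margin_ut)
```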


Correlating Activities with Sensor and System Information
To illustrate how we analyzed the question this article's title poses, we focus on three indicators, each of which suggests that the user may be in the middle of a noteworthy activity:
  1. Energy content of the accelerometer signal;
  2. Anomalous magnetic fields that are significantly different from the Earth's normal field; and
  3. When the screen of the smartphone is turned on.
We found that the accelerometer reports worthwhile activity for about 17 percent of a waking day. The user is in some magnetic anomaly 56 percent of the time. But the screen of the phone is turned on only 6 percent of the time. When the screen is off, the phone is in standby mode. So we can safely conclude that if sensors are turned off when the phone is in standby, we realize only 6 percent of the sensors' potential utility, and we miss most of the opportunities to better understand the user by monitoring his activities.
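Given one boolean flag per logged sample for each indicator, these coverage figures reduce to simple duty-cycle computations, sketched below; the variable names are ours:

```python
import numpy as np

def coverage(flags):
    """Fraction of the logged waking day for which an indicator is asserted."""
    return float(np.mean(np.asarray(flags, dtype=bool)))

# With accel_active, mag_anomaly, and screen_on as per-sample boolean arrays:
#   coverage(accel_active)                            -> ~0.17 in our data
#   coverage(mag_anomaly)                             -> ~0.56
#   coverage(screen_on)                               -> ~0.06
#   coverage(accel_active | mag_anomaly | screen_on)  -> union of all three
```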

Conclusion and Further Actions
The Venn diagrams (Figure 2) show that sensors, location services, and the operating system on a smartphone can work together to capture and understand user activities, which is a key step in creating smarter devices that are context aware. For example, when a context-aware smartphone determines its user has left his car and is walking away, it can remember the location automatically without being prompted by the user. If the user later forgets where he left his car, the context-aware phone can remind him.


Figure 2. Venn diagram showing coverage of user activities over a waking day (not to scale).

Computing PSDs in real time on mobile devices would severely tax battery life. The data collected from these experiments allowed Sensor Platforms to create more efficient algorithms that perform these analyses with far less computational complexity.
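One plausible low-cost substitute, shown below as our illustration rather than Sensor Platforms' actual algorithm, replaces the PSD with a sliding-window variance of the accelerometer magnitude, which needs only a few operations per sample:

```python
import numpy as np

def motion_flags(accel_mag, fs=50.0, window_s=1.0, threshold=0.05):
    """Cheap activity flag: sliding variance of |accel|, no FFTs required.

    Cumulative sums give the windowed mean and mean-of-squares in O(1)
    per sample; the threshold is an illustrative tuning parameter.
    """
    n = int(window_s * fs)
    x = np.asarray(accel_mag, dtype=float)
    c1 = np.cumsum(np.insert(x, 0, 0.0))      # running sum
    c2 = np.cumsum(np.insert(x * x, 0, 0.0))  # running sum of squares
    mean = (c1[n:] - c1[:-n]) / n
    var = (c2[n:] - c2[:-n]) / n - mean**2
    return var > threshold
```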

Using the low-power sensors, the accelerometer and magnetometer, to identify when the user is doing something noteworthy also allows us to judiciously turn on more power-hungry sensors to gather more information. For example, the smartphone camera could provide valuable information about the user's surroundings.

We discovered that, when the accelerometer and magnetometer data suggest the user is active, at least one of the cameras is within 30 degrees of the horizon with ambient light sufficient to capture an image about 13 percent of the time.
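The camera-orientation part of that test needs only the accelerometer: when the phone is quasi-static, the accelerometer reading is effectively the gravity vector, and the elevation of the camera's optical axis follows from it. A sketch, assuming Android axis conventions (+z out of the screen, so the rear camera looks along -z):

```python
import numpy as np

def rear_camera_near_horizon(ax, ay, az, max_tilt_deg=30.0):
    """True if the rear camera's optical axis is within max_tilt_deg of horizontal.

    Assumes Android axis conventions (+z out of the screen, rear camera along -z)
    and a quasi-static phone, so (ax, ay, az) is the reaction to gravity.
    """
    a = np.array([ax, ay, az], dtype=float)
    up = a / np.linalg.norm(a)                 # unit vector pointing skyward
    elevation = np.degrees(np.arcsin(-up[2]))  # rear camera axis is (0, 0, -1)
    return abs(elevation) <= max_tilt_deg
```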

This and other similar findings suggest that it may be possible to leverage all 18 sensors in a smartphone in a hierarchy in which only the lowest-powered sensors are continuously active in context-aware devices. When these sensors identify noteworthy activities, higher-power sensors and systems are then brought in to improve the accuracy of context detection.
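A toy sketch of such a hierarchy follows; the sensor groupings and power classes are illustrative, not a list of the 18 sensors referenced above:

```python
from dataclasses import dataclass, field

@dataclass
class SensorHierarchy:
    """Toy duty-cycling policy: low-power sensors gate the power-hungry ones."""
    always_on: tuple = ("accelerometer", "magnetometer")  # microamp-class
    on_demand: tuple = ("gyroscope", "gps", "camera")     # milliamp-class
    active: set = field(default_factory=set)

    def update(self, noteworthy: bool) -> set:
        """Enable expensive sensors only while something noteworthy is happening."""
        self.active = set(self.always_on)
        if noteworthy:
            self.active |= set(self.on_demand)
        return self.active
```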

Lastly, we must recognize that people do not always have their smartphones with them. To provide a more comprehensive profile of their users, for the users' benefit, context-aware smartphones will have to collaborate with other body-worn sensors and with "soft sensors" that fuse several measurements to infer user activities from electronic data.

About the authors
Ian Chen, Executive Vice President at Sensor Platforms, has 25 years of experience in high technology, with a track record of identifying emerging market inflections early and addressing them with leading-edge products in such varied fields as timing, networking, microprocessors, and cellular handset SoCs.

Jim Steele, Ph.D., VP of Engineering, has a track record of leading successful development of complex algorithms, software, and system engineering for cellular systems, RF communications, and memory products. He co-authored The Android Developer's Cookbook: Building Applications with the Android SDK and has given many invited lectures on using sensors with Android. Prior to joining Sensor Platforms, he held senior management positions at Spansion, Polaris Wireless, and ArrayComm, as well as research positions in theoretical and particle physics at the Massachusetts Institute of Technology and Ohio State University. Jim holds a Ph.D. in theoretical physics from the State University of New York at Stony Brook.
