
Using Gravity to Estimate Accelerometer Orientation

David Mizell

Cray, Inc.
Work performed while the author was at Intel Research Seattle.

Abstract

Several wearable computing or ubiquitous computing research projects have detected and distinguished user motion activities by attaching accelerometers in known positions and orientations on the user's body. This paper observes that the orientation constraint can probably be relaxed. An estimate of the constant gravity vector can be obtained by averaging accelerometer samples. This gravity vector estimate in turn enables estimation of the vertical component and the magnitude of the horizontal component of the user's motion, independently of how the three-axis accelerometer system is oriented.

1. Introduction

Most previous work on detecting or measuring user motion activities using body-worn accelerometers [1-7] attached the accelerometers to the body in a known position and orientation relative to the body. The research issue addressed in this work is the following: in the absence of information about how a device containing a three-axis accelerometer is being carried by the user, can we still make reliable inferences about the user's activities or actions?

2. Orientation-independent acceleration information

Our hardware configuration employed two Analog Devices, Inc. ADXL202 evaluation boards hot-glued together at a 90-degree angle to provide three orthogonal acceleration axes. We needed all three because of our assumption that the orientation of the device is not known. Acceleration measurements on each of the three axes were sampled at 100 Hz and collected on an iPAQ, which experimental subjects could carry with them.

Our approach for obtaining orientation-independent acceleration information makes use of the fact that MEMS accelerometers measure gravitational ("static") acceleration as well as the ("dynamic") accelerations caused by the wearer's motion. The pull of gravity downward along some accelerometer axis manifests itself in the accelerometer output as an acceleration in the opposite direction along that same axis.

Fig. 1. Relevant coordinate systems

There are two relevant coordinate systems, as shown in Figure 1. The three-axis accelerometer configuration is in some arbitrary orientation on the wearer's body. The three accelerometer axes are denoted in the figure as x, y, and z. Ideally, we would like to know acceleration information in terms of a coordinate system oriented to the user and his forward motion. In the figure, these axes are denoted v (for vertical), f (for the direction of horizontal forward motion), and s, a horizontal axis (usually of less interest) orthogonal to the direction of motion.

The algorithm works as follows: for a chosen sampling interval, typically a few seconds, obtain an estimate of the gravity component on each axis by averaging all the readings in the interval on that axis. That is, we are estimating the vertical acceleration vector v corresponding to gravity as v = (vx, vy, vz), where vx, vy, and vz are averages of all the measurements on those respective axes for the sampling interval.
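This per-axis averaging step can be sketched in a few lines. The NumPy-based helper below is illustrative rather than taken from the paper; the function name `estimate_gravity` is mine, and it assumes one interval's samples arrive as an N-by-3 array of (x, y, z) readings.

```python
import numpy as np

def estimate_gravity(samples):
    """Estimate the gravity vector v = (vx, vy, vz) for one sampling
    interval by averaging every reading on each accelerometer axis.

    samples: (N, 3) array of raw (x, y, z) readings over a few seconds.
    """
    samples = np.asarray(samples, dtype=float)
    return samples.mean(axis=0)  # per-axis mean over the interval
```

Over a few seconds of typical motion the dynamic accelerations roughly cancel, so the per-axis means are dominated by the constant gravity component.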

Proceedings of the Seventh IEEE International Symposium on Wearable Computers (ISWC'03) 1530-0811/03 $17.00 © 2003 IEEE

Let a = (ax, ay, az) be the vector made up of the three acceleration measurements taken at a given point in the sampling interval. We assume for the sake of simplicity that the three measurements are taken simultaneously. We set d = (ax - vx, ay - vy, az - vz) to represent the dynamic component of a, that caused by the user's motion rather than gravity. Then, using vector dot products, we can compute the projection p of d upon the vertical axis v as

p = ((d · v) / (v · v)) v.

In other words, p is the vertical component of the dynamic acceleration vector d. Next, since a 3D vector is the sum of its vertical and horizontal components, we can compute the horizontal component of the dynamic acceleration by vector subtraction, as h = d - p.

However, as opposed to the vertical case, we don't know the orientation of h relative to f, the horizontal axis we'd like to have it projected upon. Furthermore, this orientation appears impossible to detect: there is no dominating static acceleration as there is in the vertical case. Accordingly, we simply compute the magnitude of the horizontal component of the dynamic accelerations, concluding that that is the best we can expect to do.
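Putting the projection and subtraction together, a single sample reduces to a signed vertical value and a horizontal magnitude. The sketch below assumes a gravity estimate v obtained from the averaging step; the function name `decompose` and the NumPy usage are mine, not the paper's.

```python
import numpy as np

def decompose(a, v):
    """Reduce one raw sample a to orientation-independent values.

    a: raw acceleration sample (ax, ay, az).
    v: gravity estimate for the interval, from per-axis averaging.
    Returns (signed vertical component, horizontal magnitude).
    """
    a = np.asarray(a, dtype=float)
    v = np.asarray(v, dtype=float)
    d = a - v                                    # dynamic component
    p = (np.dot(d, v) / np.dot(v, v)) * v        # projection of d onto v
    h = d - p                                    # horizontal component
    vertical = np.dot(d, v) / np.linalg.norm(v)  # signed length of p along v
    return vertical, np.linalg.norm(h)           # only |h| is recoverable
```

For example, with v = (0, 0, 10) and a = (3, 4, 12), the dynamic component is d = (3, 4, 2), the projection is p = (0, 0, 2), and the horizontal component h = (3, 4, 0) has magnitude 5.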

The result of the algorithm performed across a sampling interval is a pair of waveforms: estimates of the vertical components and the magnitude of the horizontal components of the dynamic accelerations, each of which is independent of the orientation of the mobile device containing the accelerometers.
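For a whole sampling interval, the averaging, projection, and subtraction steps can be folded into one vectorized function that returns the two waveforms. This is a minimal sketch under the same assumptions as above; the name `acceleration_waveforms` is illustrative.

```python
import numpy as np

def acceleration_waveforms(samples):
    """Compute the two orientation-independent waveforms for an interval.

    samples: (N, 3) raw accelerometer readings.
    Returns (vertical, horizontal): two length-N arrays holding the signed
    vertical components and the horizontal magnitudes of the dynamic
    accelerations.
    """
    samples = np.asarray(samples, dtype=float)
    v = samples.mean(axis=0)              # gravity estimate (per-axis means)
    d = samples - v                       # dynamic components, one row per sample
    coeff = d @ v / (v @ v)               # projection coefficients of d onto v
    p = np.outer(coeff, v)                # vertical components as vectors
    h = d - p                             # horizontal components
    vertical = coeff * np.linalg.norm(v)  # signed vertical magnitudes
    return vertical, np.linalg.norm(h, axis=1)
```

Note that the same samples serve both to estimate gravity and to supply the dynamic components, which is why the interval must be long enough for the motion accelerations to average out.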

3. Conclusions

Our results indicate that a three-axis MEMS accelerometer system might be useful in detecting and distinguishing several user motion activities, such as walking, running, climbing or descending stairs, or riding in a vehicle, in spite of the fact that the position and orientation of the device are not known. We conjecture that the vertical acceleration component is sufficient information for most such activity detection.

Acknowledgements

The author wishes to thank Jonathan Lester of the University of Washington for the hardware implementations for this work, Mike Perkowitz of Intel Research Seattle for his implementation of the iPAQ data capture software, and both of them for their help with the experiments. Saurav Chatterjee of the University of Washington was instrumental in implementing early prototype hardware. Kurt Partridge and Waylon Brunette of UW provided valuable debugging help. A special thanks goes to Prof. W. Dan Curtis of Central Washington University for vastly clarifying the author's mathematics.

References

[1] Aminian, K., et al., "Motion Analysis in Clinical Practice Using Ambulatory Accelerometry," in Lecture Notes in Artificial Intelligence 1537, Modeling and Motion Capture Techniques for Virtual Environments, 1998.

[2] Farringdon, J., et al., "Wearable Sensor Badge & Sensor Jacket for Context Awareness," in Proceedings, Third International Symposium on Wearable Computers, San Francisco, CA, IEEE Computer Society, 1999, ISBN 0-7695-0428-0, pp. 107-113.

[3] Lee, C.-Y. and Lee, J.-J., "Estimation of Walking Behavior Using Accelerometers in Gait Rehabilitation," International Journal of Human-Friendly Welfare Robotic Systems, June 2002, Vol. 3, No. 2, ISSN 1598-3250, pp. 32-35.

[4] Lee, S. and Mase, K., "Activity and Location Recognition Using Wearable Sensors," IEEE Pervasive Computing, July-September 2002, Vol. 1, No. 3, pp. 24-32.

[5] Morris, J., "Accelerometry - A Technique for the Measurement of Human Body Movements," in Journal of Biomechanics, Vol. 7, 1974, pp. 157-159.

[6] Randell, C., and Muller, H., "Context Awareness by Analysing Accelerometer Data," in Proceedings, Fourth International Symposium on Wearable Computers, Atlanta, GA, IEEE Computer Society, 2000, ISBN 0-7695-0795-6, pp. 175-176.

[7] Van Laerhoven, K., and Cakmakci, O., "What Shall We Teach Our Pants?," in Proceedings, Fourth International Symposium on Wearable Computers, Atlanta, GA, IEEE Computer Society, 2000, ISBN 0-7695-0795-6, pp. 77-83.
