TY - GEN
T1 - Satellites in our pockets
T2 - 10th International Conference on Mobile Systems, Applications, and Services, MobiSys'12
AU - Manweiler, Justin Gregory
AU - Jain, Puneet
AU - Roy Choudhury, Romit
PY - 2012
Y1 - 2012
AB - This paper attempts to solve the following problem: can a distant object be localized by looking at it through a smartphone? As an example use case, while driving on a highway entering New York, we want to look at one of the skyscrapers through the smartphone camera and compute its GPS location. While the problem would have been far more difficult five years ago, the growing number of sensors on smartphones, combined with advances in computer vision, has opened up important opportunities. We harness these opportunities through a system called Object Positioning System (OPS) that achieves reasonable localization accuracy. Our core technique uses computer vision to create an approximate 3D structure of the object and camera, and applies mobile phone sensors to scale and rotate the structure to its absolute configuration. Then, by solving (nonlinear) optimizations on the residual (scaling and rotation) error, we ultimately estimate the object's GPS position. We have developed OPS on Android Nexus S phones and experimented with localizing 50 objects on the Duke University campus. We believe that OPS shows promising results, enabling a variety of applications. Our ongoing work is focused on coping with large GPS errors, which prove to be the prime limitation of the current prototype.
KW - augmented reality
KW - localization
KW - structure from motion
UR - http://www.scopus.com/inward/record.url?scp=84864373093&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84864373093&partnerID=8YFLogxK
U2 - 10.1145/2307636.2307656
DO - 10.1145/2307636.2307656
M3 - Conference contribution
AN - SCOPUS:84864373093
SN - 9781450313018
T3 - MobiSys'12 - Proceedings of the 10th International Conference on Mobile Systems, Applications, and Services
SP - 211
EP - 224
BT - MobiSys'12 - Proceedings of the 10th International Conference on Mobile Systems, Applications, and Services
Y2 - 25 June 2012 through 29 June 2012
ER -