ME Seminar Series: Computational Vision and Control

Wednesday, April 18, 2012

11:30 am - 1:00 pm
Teer 115

This presentation addresses novel methods for real-time processing of 3D sensing system measurements, namely (X, Y, Z, R, G, B) point clouds capturing the geometry and texture of an unknown scene. These sensor systems and their embedded computational implementations can be carried aboard a UAV, robot, or spacecraft, with overlapping imagery redundantly measuring the scene at video rates. We discuss basic algorithms along with experimental results in which simultaneous localization and mapping (SLAM) technologies are demonstrated for space proximity operations and other applications. A fundamental new result is summarized that enables near-real-time fusion of overlapping point clouds: a judicious coordinate choice allows a heretofore nonlinear algorithm for point cloud correspondence and data fusion to be solved rigorously, without the need for linearization. A statistical method is also discussed to characterize local accuracy and reject scene-dependent spurious features. Videos of laboratory experiments demonstrating end-to-end closure will be presented.
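The abstract does not give the algorithmic details of the speaker's coordinate choice, but as a rough illustration of linearization-free fusion of corresponding point clouds, the sketch below uses the standard closed-form SVD-based (Kabsch/Horn-style) solution for the rigid transform between matched point sets, followed by a simple residual-based statistical gate for rejecting spurious correspondences. The function names, the MAD-based scale estimate, and the 3-sigma threshold are illustrative assumptions, not the method presented in the talk.

import numpy as np

def rigid_fit(P, Q):
    """Closed-form rigid transform (R, t) minimizing ||R p + t - q|| over
    matched Nx3 point sets, via the SVD-based Kabsch solution (no linearization)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)            # centroids
    H = (P - cP).T @ (Q - cQ)                          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # reflection guard
    R = Vt.T @ D @ U.T
    t = cQ - R @ cP
    return R, t

def fuse_with_rejection(P, Q, sigma_gate=3.0, iters=3):
    """Fit, then reject correspondences whose residual exceeds sigma_gate times
    a robust (MAD-based) residual scale, and refit. Illustrative gating only."""
    keep = np.ones(len(P), dtype=bool)
    for _ in range(iters):
        R, t = rigid_fit(P[keep], Q[keep])
        r = np.linalg.norm((P @ R.T + t) - Q, axis=1)  # per-point residuals
        med = np.median(r[keep])
        scale = 1.4826 * np.median(np.abs(r[keep] - med)) + 1e-12
        keep = r < sigma_gate * scale
    return R, t, keep

if __name__ == "__main__":
    # Synthetic example: rotated, translated, noisy copy of a cloud with outliers.
    rng = np.random.default_rng(0)
    P = rng.uniform(-1, 1, size=(500, 3))
    th = 0.3
    R_true = np.array([[np.cos(th), -np.sin(th), 0],
                       [np.sin(th),  np.cos(th), 0],
                       [0,           0,          1]])
    Q = P @ R_true.T + np.array([0.5, -0.2, 1.0]) + 0.005 * rng.normal(size=P.shape)
    Q[:20] += rng.uniform(-1, 1, size=(20, 3))         # spurious features
    R, t, keep = fuse_with_rejection(P, Q)
    print("rotation error:", np.linalg.norm(R - R_true), "inliers kept:", keep.sum())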

Contact

Thompson, Michele
660-5321
mthomp@duke.edu