Algorithm Development


Sensor Fusion

SDI has completed an SBIR Phase II program to quantify the performance benefits of fusing multi-spectral image data provided by three nearly co-aligned sensors (ladar, IR, and color camera) for Automatic Target Recognition (ATR) applications. This effort was sponsored by the U.S. Army Aviation and Missile Research, Development and Engineering Center (AMRDEC) of the Army Aviation and Missile Command. The program addressed the target segmentation and target Classifier and IDentifier (CID) stages of the ATR algorithm. A feature-based evolutionary algorithm (EA) approach was employed to train the CID stages. The EA approach represented an innovative structure for improving ATR performance and streamlining computations associated with more computationally intensive model-based techniques.

Data collected from the THOR sensor suite during a series of captive flight tests at Redstone Arsenal, AL in calendar years 2005 and 2008 defined the database for the Phase II program. The THOR sensor suite includes: (1) a Lockheed NetFires ladar seeker, (2) a BAE LTC550 IR camera, and (3) an Elmo MN43H CCD color camera. The sensor suite was integrated with the Stabilized Airborne Electro-optical Instrumentation Platform (SEAIP), which was then mounted to a UH-1 helicopter. The three sensors were oriented on the SEAIP to provide roughly the same lines-of-sight to the imaged scene.

During the captive flight tests, data was collected for diverse target arrays. Overall, data from over 60 tests were used to catalogue multi-spectral imagery for the sensor fusion effort. The majority of these tests included multiple targets, thereby providing a target-rich database of ladar, IR, and color imagery. Ground-truth files were generated for each image in the database. The ground-truth data include the target class, target type, row/column target location within the image, estimated target pose angle, and ancillary qualitative information.
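The feature-based EA approach described above can be illustrated with a minimal sketch: a population of candidate classifier weight vectors is scored on labeled feature vectors, the fittest survive, and mutated copies replace the rest. This is a toy genetic algorithm on synthetic data, not SDI's actual EA program; all dimensions, data, and parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy labeled feature vectors standing in for the training-file features.
X = rng.normal(size=(200, 8))
true_w = rng.normal(size=8)
y = (X @ true_w > 0).astype(int)  # two illustrative target classes

def fitness(w):
    """Classification accuracy of a linear decision rule with weights w."""
    return float(np.mean((X @ w > 0).astype(int) == y))

# Evolve a population of weight vectors: elitist selection plus mutation.
pop = rng.normal(size=(50, 8))
for generation in range(100):
    scores = np.array([fitness(w) for w in pop])
    elite = pop[np.argsort(scores)[-10:]]            # keep the 10 fittest
    children = elite[rng.integers(0, 10, size=40)]   # resample parents
    children = children + rng.normal(scale=0.1, size=children.shape)
    pop = np.vstack([elite, children])

best = max(pop, key=fitness)
print(round(fitness(best), 2))
```

In the actual program the fitness function would score CID identification performance on the extracted multi-spectral features rather than a synthetic linear rule.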
The database was separated into an equal number of training and test files to train and test the target CIDs. Fusion of the multi-spectral data was enacted at the feature level. Feature fusion uses independent feature vectors from the three sensors prior to making an identification decision. The feature-fusion approach involves concatenating the feature vectors extracted from the three images (one vector per sensor) into a single feature vector. Twenty-seven features were extracted from segmented ladar target chips, and a set of 20 identical features were extracted from segmented IR and color target chips.

Feature vectors extracted from the training set images were written to feature training files that were processed by SDI’s evolutionary algorithm program, e, to train the target CIDs. To test the CIDs’ ATR performance on the test files, the trained CIDs were incorporated into the MultiSensor Evolutionary ATR (MSEATR, pronounced “em seater”) program. Developed by SDI, MSEATR is an interactive tool that integrates all the stages of a complete ATR system, including image preprocessing, target segmentation, target pose estimation, feature extraction, and target identification (see the MSEATR screen capture below).
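The feature-level fusion step above reduces to concatenating the three per-sensor feature vectors into one vector before classification. The sketch below shows this, using the feature counts stated in the text (27 ladar features, 20 each for IR and color); the function and variable names are illustrative, not from the program.

```python
import numpy as np

# Feature counts per segmented target chip, as stated in the text.
LADAR_FEATURES = 27
IR_FEATURES = 20
COLOR_FEATURES = 20

def fuse_features(ladar_vec, ir_vec, color_vec):
    """Feature-level fusion: concatenate the per-sensor feature
    vectors into a single vector prior to the identification decision."""
    assert len(ladar_vec) == LADAR_FEATURES
    assert len(ir_vec) == IR_FEATURES
    assert len(color_vec) == COLOR_FEATURES
    return np.concatenate([ladar_vec, ir_vec, color_vec])

# Placeholder vectors in place of real extracted features.
fused = fuse_features(np.zeros(27), np.ones(20), np.ones(20))
print(fused.shape)  # (67,)
```

The fused 67-element vector is what a trained CID would then classify, in contrast to decision-level fusion, where each sensor's classifier votes independently.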

[Image: sensor_fusion (MSEATR screen capture)]
