This work employs optical spatial filtering in transmission to increase the contrast of objects imaged with an Event-Based Sensor (EBS), improving object detectability. The EBS is an asynchronous imaging sensor with integrated change detection capabilities, able to detect spatio-temporal changes in scene brightness as they occur. This change detection capability is implemented in electronics, which, while providing advantages such as low power and low latency, limits the sensor's ability to detect low-contrast objects. To address this shortcoming, the EBS may be augmented with coherent optical high-pass spatial-frequency filtering (HPF) to improve its low-contrast sensitivity. In this work, we present optical HPF to improve detection performance in EBS imaging systems. Experimental measurements demonstrate that objects containing features with contrasts as low as 3.53% are discernible, enabling object detection with triple the sensitivity of the standalone EBS.
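The contrast boost described above can be illustrated with a simple digital analogue. The sketch below is not the authors' coherent optical setup; it is an assumed numerical stand-in that subtracts a local-mean (low-pass) estimate from a synthetic low-contrast feature and shows that the Michelson contrast of the residual is much higher than that of the raw signal. All signal parameters (feature amplitude, kernel width, pedestal) are illustrative choices.

```python
import numpy as np

def michelson_contrast(region):
    """Michelson contrast: (Imax - Imin) / (Imax + Imin)."""
    return (region.max() - region.min()) / (region.max() + region.min())

# Synthetic 1-D scene: a weak Gaussian bump (~3.5% above background)
# on a uniform bright background.
x = np.linspace(-1.0, 1.0, 256)
signal = 1.0 + 0.036 * np.exp(-(x / 0.05) ** 2)

core = slice(40, 216)  # central region, away from convolution edge effects
print(f"raw contrast: {michelson_contrast(signal[core]):.3f}")

# Digital analogue of high-pass spatial filtering: remove the local mean
# (a moving-average low-pass), keeping only fine detail. A small DC
# pedestal is re-added so the Michelson ratio stays well defined.
kernel = np.ones(31) / 31
lowpass = np.convolve(signal, kernel, mode="same")
highpass = signal - lowpass + 0.05
print(f"filtered contrast: {michelson_contrast(highpass[core]):.3f}")
```

The raw feature's Michelson contrast is under 2%, while the high-pass residual's contrast is an order of magnitude larger, which is the mechanism by which attenuating the low-spatial-frequency background raises feature contrast at the sensor.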
In this work, we describe a test setup and experiments for evaluating event-based sensor (EBS) imaging systems in detecting small, moving objects against complex backgrounds. First, the design decisions and components of the setup are described, followed by how the setup supports statistical performance analysis. The system description is followed by tests of different system configurations, investigating several algorithms, including motion estimation and noise filtering, against a reference system. We measure their effect on performance through Receiver Operating Characteristic (ROC) analysis across many scene variations, and provide the results as a set of plots coupled with discussion and analysis.
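The ROC methodology mentioned above can be sketched in a few lines. The detector scores below are synthetic placeholders (Gaussian score distributions under the no-object and object-present hypotheses), not measurements from the described setup; the sketch only shows how probability of false alarm, probability of detection, and area under the curve (AUC) are derived from score samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical detector scores: background-only trials (H0) and trials
# containing a small moving object (H1). Means/variances are illustrative.
scores_bg = rng.normal(0.0, 1.0, 1000)
scores_obj = rng.normal(1.5, 1.0, 1000)

def roc_curve(neg, pos, thresholds):
    """Probability of false alarm (Pfa) and detection (Pd) per threshold."""
    pfa = np.array([(neg >= t).mean() for t in thresholds])
    pd = np.array([(pos >= t).mean() for t in thresholds])
    return pfa, pd

pfa, pd = roc_curve(scores_bg, scores_obj, np.linspace(-4.0, 6.0, 101))

# Area under the ROC curve via the Mann-Whitney rank estimator: the
# probability that a random object score exceeds a random background score.
auc = (scores_obj[:, None] > scores_bg[None, :]).mean()
print(f"AUC = {auc:.3f}")
```

Sweeping the threshold traces out the (Pfa, Pd) operating points; comparing AUC (or Pd at a fixed Pfa) across system configurations and scene variations is the comparison performed in the paper.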
Event-Based Sensors (EBSs) are passive electro-optical (EO) imaging sensors whose read-out hardware outputs data only when and where temporal changes in scene brightness are detected. In the case of a static background and platform, this hardware ideally implements background clutter cancellation, leaving only moving-object data to be read out. This data reduction leads to a bandwidth reduction, which is equivalent to increasing spatio-temporal resolution. This advantage can be exploited in multiple ways, using trade-offs between spatial and temporal resolution, and between spatial resolution and field-of-view. In this paper, we introduce the EBS concept and our previous experiments and analysis. We discuss important EBS properties, followed by a discussion of applications where the EBS could provide significant benefit over conventional frame-based EO sensors. Finally, we present a method for analyzing EBS technology for specific applications (i.e., determining its performance relative to conventional technology). This approach abstracts both EBS and conventional imaging technology, providing a way to determine the value of EBSs over conventional imaging and facilitating future EBS application development.
This work focuses on a methodology to improve spatio-temporal performance in optical imaging systems. We investigate the potential of event-based sensor (EBS) technology to reach spatio-temporal performance limits beyond those of imaging systems employing conventional focal-plane arrays. Specifically, we investigate EBS performance under object/scene/platform motion, where its spatio-temporal performance degrades. We propose a hardware motion compensation sub-system and experimentally demonstrate that the performance of a moving EBS can be recovered through effective reduction of platform motion. This demonstration confirms that the EBS can deliver significantly improved spatio-temporal performance while under motion.
The Dynamic Vision Sensor (DVS) is an imaging sensor that processes the incident irradiance image and outputs temporal log-irradiance changes in the image, such as those generated by moving target(s) and/or a moving sensor platform. From a static platform, this enables the DVS to cancel out background clutter and greatly decrease the sensor bandwidth required to track temporal changes in a scene. However, the sensor bandwidth advantage is lost when imaging a scene from a moving platform, because platform motion causes optical flow in the background. Imaging from a moving platform has been utilized in many recently reported applications of this sensor. However, this approach inherently outputs background clutter generated by optical flow, and as such has limited spatio-temporal resolution and limited utility for target tracking applications. In this work we present a new approach to moving target tracking with the DVS. Essentially, we propose modifying the incident image to cancel out optical flow due to platform motion, thereby removing background clutter and recovering the bandwidth performance advantage of the DVS. We propose that such improved performance can be accomplished by integrating a hardware tracking and stabilization subsystem with the DVS. Representative simulation scenarios are used to quantify the performance of the proposed approach to clutter cancellation and improved sensor bandwidth.
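The bandwidth argument above can be made concrete with a toy event-generation model. The sketch below is an assumed simplification, not the paper's simulation: a DVS pixel fires an event when the log-irradiance change between two instants exceeds a contrast threshold (a value around 10-20% is typical; 15% is assumed here). Comparing a frame pair with a 2-pixel global background shift (uncompensated platform motion) against a stabilized pair shows how optical flow floods the output with background events.

```python
import numpy as np

rng = np.random.default_rng(1)

# Static textured background (random reflectances) with one bright target.
H, W = 64, 64
background = rng.uniform(0.2, 1.0, (H, W))

def frame(shift, target_col):
    """Scene at one instant: background shifted by platform motion,
    plus a 4x4 bright target at the given column."""
    img = np.roll(background, shift, axis=1)
    img[30:34, target_col:target_col + 4] = 2.0
    return img

theta = 0.15  # assumed log-intensity contrast threshold

def event_count(prev, curr):
    """Pixels whose log-irradiance change exceeds the DVS threshold."""
    return int((np.abs(np.log(curr) - np.log(prev)) > theta).sum())

# Uncompensated platform: background shifts 2 px/frame -> optical-flow clutter.
moving = event_count(frame(0, 10), frame(2, 14))
# Stabilized platform: background static, only the target moves.
stabilized = event_count(frame(0, 10), frame(0, 14))
print(moving, stabilized)
```

In the stabilized case only the 32 pixels the target leaves and enters fire events, while the uncompensated case fires events at most of the 4096 background pixels, which is the clutter and bandwidth penalty the proposed tracking-and-stabilization subsystem is intended to remove.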