Falcon ODIN is a technology demonstration science payload being designed for delivery to the International Space Station (ISS) in late 2024. Falcon ODIN contains two event-based cameras (EBCs) and two traditional framing cameras, along with mirrors mounted on azimuth-elevation rotation stages that allow the EBCs' field of regard to be steered. We discuss the mission design and objectives for Falcon ODIN, along with ground-based testing of all four cameras.
Event-based vision sensor (EVS) technology has expanded the CMOS image sensor design space with low-SWaP sensors offering high-dynamic-range operation and the ability, under certain conditions, to efficiently capture scene information at a temporal resolution beyond that achievable by a typical framing sensor operating near a 1 kHz frame rate. Fundamental differences between EVS and framing sensors necessitate the development of new characterization techniques and sensor models to evaluate hardware performance and the camera architecture trade space. Laboratory characterization techniques reported previously include noise level as a function of static-scene light level (background activity), contrast responses referred to as S-curves, refractory period characterization using the mean minimum interspike interval, and a novel approach to pixel bandwidth measurement using a static scene. Here we present pre-launch characterization results for the two Falcon ODIN (Optical Defense and Intelligence through Neuromorphics) event-based cameras (EBCs) scheduled for launch to the International Space Station (ISS). Falcon ODIN is a follow-on experiment to Falcon Neuro, previously installed and operated onboard the ISS. Our characterization of the two ODIN EBCs includes high-dynamic-range background activity, contrast response S-curves, and low-light cutoff measurements. Separately, we report an evaluation of the IMX636 sensor's get_illumination functionality, which gives an auxiliary measurement of on-chip illuminance (irradiance) and can provide high-dynamic-range sensing of sky brightness (background light level).
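Two of the characterization metrics named above, background activity and the mean minimum interspike interval, can be sketched in a few lines. This is an illustrative model only, not the authors' code; the function names and arguments are hypothetical:

```python
import numpy as np

def background_activity_rate(event_timestamps, n_pixels, duration_s):
    """Background activity sketch: under a static scene any recorded
    events are noise, so the mean per-pixel event rate at a given
    light level characterizes the sensor's noise floor."""
    return len(event_timestamps) / (n_pixels * duration_s)  # events/pixel/s

def mean_min_interspike_interval(timestamps_by_pixel):
    """Refractory-period sketch: for each pixel, take the minimum
    interval between consecutive events; the mean of these minima is
    bounded below by the pixel's refractory time."""
    minima = [np.diff(np.sort(ts)).min()
              for ts in timestamps_by_pixel if len(ts) > 1]
    return float(np.mean(minima))

# Hypothetical usage: 200 noise events over 2 s on a 100-pixel region
rate = background_activity_rate([0.0] * 200, n_pixels=100, duration_s=2.0)
isi = mean_min_interspike_interval([[0.0, 0.5, 0.6], [0.0, 0.2]])
```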
KEYWORDS: Cameras, Sensors, Optical engineering, Field programmable gate arrays, Data acquisition, Space operations, Linear filtering, Imaging systems, Physics, Staring arrays
We report on the Falcon Neuro event-based sensor (EBS) instrument, which is designed to acquire data from lightning and sprite phenomena and is currently operating on the International Space Station. The instrument consists of two independent, identical EBS cameras pointing in two fixed directions: toward the nominal forward direction of flight and toward the nominal nadir direction. The payload employs stock DAVIS 240C focal plane arrays along with custom-built control and readout electronics to remotely interface with the cameras. To predict the sensor's ability to effectively record sprites and lightning, we explore temporal response characteristics of the DAVIS 240C and use lab measurements along with reported limitations to model the expected response to a characteristic sprite illumination time series. These simulations indicate that, with appropriate camera settings, the instrument will be capable of capturing these transient luminous events when they occur. Finally, we include initial results from the instrument, representing the first reported EBS recordings successfully collected aboard a space-based platform and demonstrating proof of concept that a neuromorphic camera is capable of operating in the space environment.
We introduce the use of an event-based image sensor as a versatile device for characterisation of turbulence and for wavefront sensing at unusually high rates. Rather than integrating for a specified time period as a traditional sensor does, this type of sensor outputs an event whenever the change in the intensity field integrates to a threshold level. The result is a time sequence of activity tagged with spatial location and time. This information can be signal processed in novel ways to reconstruct high-speed imagery or, more importantly, to directly provide fundamental descriptors of the turbulence field related to the phase structure function, possibly determining the power spectrum parameters of Cn2, Fried's coherence length r0, and the anisoplanatic angle, along with the changes in these over ensemble-average timescales.
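The threshold-integration principle described above can be sketched for a single pixel. This is a minimal illustrative model, not the instrument's firmware; the contrast threshold value and function name are assumptions:

```python
import numpy as np

def events_from_intensity(times, intensity, theta=0.2):
    """Sketch of event generation at one pixel: an event fires whenever
    the log-intensity has changed by the contrast threshold theta since
    the last event (ON for an increase, OFF for a decrease), rather
    than integrating over a fixed exposure time."""
    events = []                       # (timestamp, polarity) pairs
    ref = np.log(intensity[0])        # log-intensity at the last event
    for t, i in zip(times[1:], intensity[1:]):
        logi = np.log(i)
        while logi - ref >= theta:    # emit ON events until caught up
            ref += theta
            events.append((t, +1))
        while ref - logi >= theta:    # emit OFF events
            ref -= theta
            events.append((t, -1))
    return events

# A steadily brightening pixel produces a burst of ON events whose
# count depends on the log-intensity change, not the exposure time:
ts = np.linspace(0.0, 1.0, 100)
sig = 1.0 + 9.0 * ts                  # intensity ramps from 1 to 10
evs = events_from_intensity(ts, sig)
```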
A lenslet array of the kind used in Shack-Hartmann sensors can present the optical field to the event sensor, which may then locate each centroid either at the point where accumulated light first reaches a threshold or at the point of highest temporal rate of events in the region of interest behind the lenslet. Configured this way, the Shack-Hartmann sensor is not limited to one star per lenslet, can respond to changes in stationarity, and can facilitate investigations of chaos in the time series of aberrations.
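The thresholded-accumulation centroiding idea above can be illustrated with a short sketch. This is a hedged toy implementation, not the authors' pipeline; the grid size, subaperture width, threshold, and event tuple layout are all assumptions:

```python
import numpy as np

def spot_positions(events, lenslet_grid=(4, 4), sub=32, thresh=50):
    """Accumulate event counts per pixel within each lenslet's
    subaperture; once a subaperture's count reaches the threshold,
    report the intensity-weighted centroid of its accumulated events."""
    ny, nx = lenslet_grid
    counts = np.zeros((ny * sub, nx * sub))
    centroids = {}
    for x, y, t, p in events:                  # (x, y, timestamp, polarity)
        counts[y, x] += 1
        j, i = y // sub, x // sub              # subaperture containing this pixel
        patch = counts[j*sub:(j+1)*sub, i*sub:(i+1)*sub]
        if (j, i) not in centroids and patch.sum() >= thresh:
            ys, xs = np.nonzero(patch)
            w = patch[ys, xs]
            centroids[(j, i)] = (np.average(xs, weights=w) + i*sub,
                                 np.average(ys, weights=w) + j*sub)
    return centroids

# Synthetic spot: 60 events concentrated at pixel (10, 12) trigger a
# centroid report for the first subaperture.
evs = [(10, 12, k * 1e-4, 1) for k in range(60)]
cents = spot_positions(evs)
```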
We explore these ideas using data collected on-sky, both with the light field incident directly onto the sensor and with the sensor behind a lenslet array as a streaming Shack-Hartmann sensor. We present on-sky data from two event-based sensors (ATIS and DAVIS 240C) alongside a traditional integrating video camera, acquired both with and without an image intensifier.