In this invited paper, a simple and efficient matrix formalism is presented for computing aberrations in plane-parallel freeform mirror systems. The approach is flexible and can be easily generalized to arbitrary aberration orders and/or to systems with different symmetries. As an illustration, we derive analytical expressions for all 2nd and 3rd order image and pupil aberrations in plane-parallel confocal N-mirror systems. Some design examples are also presented and discussed.
Off-axis systems with freeform mirrors are a design approach of increasing importance for space instruments. While the off-axis reflective approach allows simple, versatile and obstruction-free designs, the use of freeform surfaces makes it possible to achieve better optical performance and/or compactness. In recent years, many such instruments have been designed and manufactured at TNO. Important examples are the Tropomi (Sentinel-5 Precursor) and TSBOA (Sentinel-5) telescopes and several pushbroom spectrometers of the Spectrolite family. Despite the recognized potential of these systems, there is to our knowledge no available theory that describes and predicts the aberrations of plane-symmetrical (off-axis) systems with freeform mirrors. In this context, an effort was started four years ago at TNO to develop an approach that can systematically describe and explore off-axis freeform mirror systems with arbitrary geometry. Since then, the theory has proven very useful for scanning the solution space for better starting designs, for choosing folding geometries, and for correcting low-order aberrations in an early design phase. To circumvent difficulties linked to a wavefront formalism, generalized ray-tracing equations were derived that include aberration terms up to 3rd order in the X/Y object and pupil coordinates. These equations were recently published in two papers for the case of pure mirror systems [1,2]. The theory was also expanded to describe flat reflective gratings, opening the way to a complete description of reflective freeform spectrometers. In the present paper, after a high-level introduction to the aberration theory for freeform mirror systems, we report some of its outcomes that have practical relevance for space instruments. In particular:
- Two-mirror telescope designs for slit spectrometers (thus having a 1D field along the slit) that are inherently corrected for spatial smile will be presented. For these designs the slit projection in object space is exactly straight.
- A new family of mirror spectrometers will be introduced that uses flat gratings and no collimator. In a collimator-less spectrometer, the aberrations induced by the grating under diverging light are corrected with freeform mirrors. The presented designs are entirely calculated from theory, with only the higher-order freeform terms optimized. A simpler architecture than traditional designs is obtained, with fewer optical surfaces.
- Finally, we present a systematic classification of distortions in imaging slit spectrometers. The difference between distortions originating in the collimator, at the grating, or in the imager is clarified and described mathematically. We discuss aberration-induced non-linear dispersion, as well as distortions of the keystone and smile families. The proposed classification also applies to catadioptric and refractive systems, the only requirement being a plane of symmetry.
In this paper, we discuss the setup of a confocal nanoscope in reflection using a super-oscillatory lens (SOL), which offers a sub-diffraction-limited focal spot with an ultra-short depth of focus (≈100 nm). A defocus tolerance of 20 nm over a 10 μm travel range is thus necessary, translating to an alignment requirement of the stage, with respect to the optical axis, below 2 mrad. We discuss an iterative procedure to fine-tune the stage movement to achieve this requirement. We also demonstrate the necessity of the alignment by imaging a 5 μm long 1D array of rectangular Au nanostructures with a periodicity of 500 nm.
Digital imaging has been steadily improving over the past decades, and we are moving towards wide use of multi- and hyperspectral cameras. A key component of such imaging systems is the color filter array, which defines the spectrum of light detected by each camera pixel. Hence, it is essential to develop a variable, robust and scalable way of controlling the transmission of light. Nanostructured surfaces, also known as metasurfaces, offer a promising solution, as their transmission spectra can be controlled by shaping the wavelength-dependent scattering properties of their constituent elements. Here we present metasurfaces based on silicon nanodisks, which provide filter functions with transmission amplitudes reaching 70-90% and are well suited for RGB and CMY color filter arrays, the initial stage towards the further development of hyperspectral filters. We suggest and discuss possible ways to expand the color gamut and improve the color values of such optical filters.
With the introduction of the NXE:3400B scanner, ASML has brought EUV to high-volume manufacturing for sub-10 nm node lithography, and work has already started on a successor high-NA system with NA = 0.55. For both systems, node resolution will go down faster than NA increases, resulting in decreasing k1 factors and tightening aberration requirements. A crucial component for measuring and controlling aberrations in situ is a diffuser that appropriately fills the full pupil of the projection optics.
This paper presents several new diffuser concepts, both reflective and transmissive, with their respective key performance metrics for both NA = 0.33 and NA = 0.55 EUV projection optics. These concepts can be used for measuring wavefront quality from dedicated fiducial plates, or for measuring directly from the imaging reticle. The latter would enable combining reticle alignment with lens aberration control without a throughput penalty.
It will be shown that these diffuser concepts provide a solution for in-situ aberration control for 5 nm nodes and below.
Due to its potential for high-resolution and three-dimensional imaging, soft x-ray ptychography has received interest for nanometrology applications. We have analyzed the measurement time per unit area when using soft x-ray ptychography for various nanometrology applications, including mask inspection and wafer inspection, and are thus able to predict order-of-magnitude throughput figures. Here we show that for a typical measurement system, using a typical sampling strategy, and when aiming for 10-15 nm resolution, a wafer-based topography (2.5D) measurement is expected to take approximately 4 minutes per μm², and a full three-dimensional measurement roughly 6 hours per μm². Due to their much higher reflectivity, EUV masks can be measured considerably faster; a measurement speed of 0.1 seconds per μm² is expected. However, such speeds do not allow for full wafer or mask inspection at industrially relevant throughput.
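The quoted figures follow from simple per-area arithmetic. The sketch below (Python) shows how such estimates scale with measured area; the per-area rates are taken from the abstract, while the 1 mm² example area is an illustrative assumption:

```python
# Back-of-envelope throughput estimates for soft x-ray ptychography
# nanometrology. The per-area rates come from the abstract; the example
# area is an assumption chosen for illustration.

rates = {
    "wafer 2.5D topography": 4 * 60,    # seconds per um^2 (~4 minutes)
    "wafer full 3D":         6 * 3600,  # seconds per um^2 (~6 hours)
    "EUV mask":              0.1,       # seconds per um^2
}

def area_time_hours(rate_s_per_um2: float, area_um2: float) -> float:
    """Total measurement time in hours for a given area."""
    return rate_s_per_um2 * area_um2 / 3600.0

# Example: measuring 1 mm^2 (= 1e6 um^2) of an EUV mask
mask_hours = area_time_hours(rates["EUV mask"], 1e6)  # about 28 hours
```

Even at the fast mask rate, a single square millimetre takes on the order of a day, which illustrates the abstract's conclusion about full-mask inspection throughput.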
In recent years, much interest has grown around the concept of optical surfaces employing high-contrast dielectric resonators. However, a systematic approach to the design of these optical surfaces under particular requirements has never been proposed. In this contribution, we describe such an approach applied to the robust design of an array of microlenses with a numerical aperture of NA = 0.19, a field of view of FOV = ±60 mrad, and a bandwidth of 20 nm. Typically, dielectric resonators are engineered to form nearly fully transmissive surfaces with locally tunable phase. However, considering the multiple wavelengths and angles under which the lenses may work, it is difficult to obtain uniform transmission characteristics for all the dielectric resonators employed. The design strategy proposed here uses a particle swarm optimization routine to find the resonator distribution best able to meet the requirements, taking into account the amplitude and phase dispersion characteristics of the resonator surfaces. The optimization also accounts for the effects of possible manufacturing inaccuracies, such as variations in resonator radii, allowing a robust design of the structure within the given manufacturing tolerances. Different designs, operating at 405 nm and 635 nm, are presented and their performance is discussed.
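As a rough illustration of the kind of routine described, a minimal particle swarm optimization sketch is given below. The merit function is a toy stand-in: the target radius profile and the Gaussian radius errors are invented for illustration, whereas the real merit would score a resonator distribution against the target phase and amplitude response, averaged over the manufacturing-tolerance perturbations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed set of simulated radius errors (manufacturing inaccuracy),
# generated once so the merit function is deterministic.
N_RES = 10                                   # number of resonators (toy)
perturb = rng.normal(0.0, 0.01, (8, N_RES))  # 8 tolerance scenarios
target = np.linspace(0.8, 1.2, N_RES)        # toy target radius profile

def merit(radii):
    """Toy robust merit: mean squared deviation from the target profile,
    averaged over the simulated radius-error scenarios."""
    return np.mean((radii[None, :] + perturb - target[None, :]) ** 2)

def pso(merit, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Standard global-best particle swarm optimization."""
    pos = rng.uniform(0.5, 1.5, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([merit(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([merit(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, merit(gbest)

best, best_val = pso(merit, dim=N_RES)
```

Averaging the merit over the perturbation scenarios is what makes the resulting optimum robust: a distribution that only performs well at the nominal radii is penalized.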
Light scattering is a well-known detection method which is applied in many different scientific and technology domains, including atmospheric physics, environmental control, and biology. It allows contactless and remote detection of sub-micron-size particles. However, methods for detecting a single fast-moving particle smaller than 100 nm are lacking.
In the present work we report a preliminary design study of an inline large-area detector for nanoparticles larger than 50 nm moving at velocities up to 100 m/s. The detector design is based on light scattering using commercially available components.
The presented design takes into account all the challenges connected with the inline implementation of the scattering technique: the need for a large field of view to cover a volume with a footprint of 100 mm × 100 mm, the necessity to sense nanoparticles transported at high velocity, and the requirement of a large capture rate with a false-detection rate as low as one false positive per week. The impact of these stringent requirements on the expected sensitivity and performance of the device is analyzed by means of a dedicated performance model.
Designing a novel optical system is a nested iterative process. The optimization loop, from a starting point to the final system, is already mostly automated. However, this loop is part of a wider loop which is not: it starts with an optical specification and ends with a manufacturability assessment. When designing a new spectrometer with emphasis on weight and cost, numerous iterations between the optical and mechanical designers are inevitable. The optical designer must then be able to reliably produce optical designs based on new input gained from multidisciplinary studies. This paper presents a procedure that can automatically generate new starting points based on any kind of input or new constraint that might arise. These starting points can then be handed over to a generic optimization routine, making the design task extremely efficient. The optical designer's job is then not to design optical systems, but to meta-design a procedure that produces optical systems, paving the way for system-level optimization. We present this procedure and its application to the design of TROPOLITE, a lightweight pushbroom imaging spectrometer.
One of the big advantages of polymer optics is the possibility of integrating several functions into one component. These functions can range from mechanical reference datums all the way to microchannels in "lab-on-chip" type applications. In this paper an overview of several design and manufacturing principles for such integrated components will be given. Furthermore, the next steps in the design-build-test cycle will be discussed as well: mold manufacturing, molding and, finally, metrology.
In recent years, compact CMOS imaging cameras have grown into high-volume applications such as mobile phones, PDAs, etc. In order to ensure consistent quality of the camera lenses, MTF is used as a figure of merit. MTF is a polychromatic, objective test of imaging lens quality that includes diffraction effects, system aberrations and surface defects. The drawback of MTF testing is that proper measurement of the lens MTF is quite cumbersome and time consuming. In the current investigation we designed, produced and tested a new semi-automated MTF setup that is able to measure the polychromatic lens-system MTF at 6 or more field points at best focus in less than 6 seconds. The computed MTF is a real diffraction MTF derived from a line spread function (not merely a contrast measurement). This enables lens manufacturers to perform 100% MTF testing even in high-volume applications. Using statistical tools to analyze the data also makes it possible to find even small systematic errors in production, such as shift or tilt of lenses and lens elements. Using this as feedback, the quality of the product can be increased. The system is very compact and can easily be placed in an assembly line. Besides the design and test of the MTF setup, correlation experiments between several testers have been carried out. A correlation of better than 6 percentage points for all tested systems at all fields has been achieved.
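The relation between a measured line spread function and the diffraction MTF can be sketched as follows: the MTF is the modulus of the Fourier transform of the LSF, normalized to unity at zero spatial frequency. The Gaussian LSF and the sampling parameters below are illustrative assumptions, not measured data:

```python
import numpy as np

# MTF from a line spread function (LSF): MTF(f) = |FT(LSF)| / |FT(LSF)|(0).
# The Gaussian LSF here is a stand-in for real measured data.
dx = 0.5e-3                      # assumed sample spacing in mm
x = np.arange(-256, 256) * dx    # 512 samples across the LSF
sigma = 2e-3                     # assumed LSF width (mm)
lsf = np.exp(-x**2 / (2 * sigma**2))

mtf = np.abs(np.fft.rfft(lsf))         # modulus drops the (centering) phase
mtf /= mtf[0]                          # normalize: MTF at zero frequency = 1
freqs = np.fft.rfftfreq(x.size, d=dx)  # spatial frequencies in cycles/mm
```

Because the LSF is real and non-negative, the MTF obtained this way is guaranteed to peak at zero frequency, and diffraction, aberrations and surface defects all enter through the measured LSF itself.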
Injection-molded optics are frequently applied in many high-volume applications; bar code scanners, CD/DVD systems and CMOS cameras are a few examples. In all of these applications, cost-effective and fast design cycles are essential. At Philips High Tech Plastics we developed a design system that touches on all the different aspects of system design. Starting with traditional lens design (sequential ray tracing) and tolerancing, we transfer the initial design into mechanical solid modeling. During mechanical modeling, tolerances, injection-molding design rules and the integration of mechanical features, reference marks, etc. are incorporated as well. Here the full advantage of injection molding can be utilized. After the opto-mechanical modeling, the system is ported back to non-sequential ray tracing for ghost- and stray-light analysis. Finally, extended tolerancing is performed in order to arrive at a robust high-volume product. If necessary, all or several steps in this design process are repeated in order to arrive at the final design. As an additional requirement, the metrology possibilities for the design are checked at an early stage. This integral approach to optical design, combining optical modeling (sequential and non-sequential) with mechanical solid modeling, is presented using some recent examples.
Characterization of optical appearance by measurement of the hemispherical scattering distribution using a concave projection screen and a camera is investigated. Secondary intensities caused by repeated internal screen reflections can be measured separately and compensated for. The concept is coupled to functional properties of product surfaces, and we use it in an industrial environment. Only slightly less accurate than a photogoniometer, the hardware is much cheaper, contains no moving parts and is up to 1000 times faster.
Scanning Deflectometry is a powerful method to measure optical figure quality of various optical components and systems in a simple way. This principle uses detection of slope deviations rather than optical path length variations. As an example, the design of a basic deflectometer for testing flat mirrors is presented.
A well-known advantage of injection-molded plastic optical components is the possibility of integrating an optical function and a mechanical mount. The optical part can be positioned accurately with respect to well-defined references on the mechanical mount. The optical function does not have to be restricted to one optical surface; in principle any combination of lenses, mirrors and/or beam splitters is possible. The metrology of these combined optical functions is often not trivial. Commercially available measuring equipment generally has difficulties when the different optical functions are tightly toleranced with respect to each other and when less common types of optical surfaces are involved. In this paper three examples of multi-function optical components are presented. One of these examples, a double mirror, is elaborated in detail in terms of metrology. The orientation of both mirrors with respect to the mechanical references is tightly toleranced, as is the orientation of the mirrors with respect to each other. The shape of one of the mirrors is so accurate that the reflected wavefront is diffraction limited; the other mirror is an off-axis paraboloid. The specially developed measurement tool, based on the autocollimator principle, the obtainable measurement accuracies and the calibration procedure will be described. The product accuracies realized with injection molding of this component in mass production will also be presented.
Light scattering measurements are important tools for characterizing optical surfaces and can basically be divided into two main groups: total scatter (TS) measurements and angle-resolved scattering (ARS). Since TS measurements are fairly straightforward and widely used, international standardization has produced a draft standard, ISO/DIS 13696. ARS is a more complex method and not as common as TS measurements. However, ARS data in the form of the bi-directional reflectance distribution function (BRDF) can be used to predict stray light in laser and industrial optical systems. Because of the increasing importance of this topic, the EC is sponsoring a project on 'Standard procedures for stray light specification, measurement and testing - SLIOS'. Two of the activities within the project are: performing a round-robin experiment measuring BRDF data at 5 different sites, including some complementary techniques; and compiling an open-access database of BRDF data measured according to procedures agreed upon between the SLIOS partners and proposed for the 'Standard Procedures'. Results of these two activities will be presented.
In many consumer and professional applications, plastic lenses are attractive because of cost, weight and the possibility of aspherical shapes. However, they suffer from a big disadvantage: a large focal shift as a function of temperature. For bar code scanners in particular, focal shifts due to temperature changes have a huge impact on the function of the scanner. One possibility for improving the temperature behavior of such lenses is to turn them into so-called hybrids: a combination of a refractive and a diffractive surface. This way a temperature compensation can be achieved that reaches the level of glass lenses. In this paper, design and manufacturing considerations for such a lens will be given, including the proper material choice and the mechanical design. The lens is temperature compensated over a range from -230 to +60 degrees C. Operating at 650 nm with a focal length of 4 mm makes it extra difficult to produce such a lens with sufficient image quality and diffraction efficiency. Results from the design will be compared with measured values from an injection-molded sample of the designed lens. Quality parameters such as wavefront quality, focal shift with temperature and diffraction efficiency will be given.
A design study for a compact 3D scanner, called Coplan, is presented. The Coplan is intended for high-speed, in-line coplanarity and shape measurement of electronic components such as ball grid arrays and surface-mount devices. The scanner should have a scan length of at least 2 inches and a resolution of 5 micrometers in all 3 dimensions. First, an analysis of two different scan schemes is made: a so-called pre-objective scheme using an F-theta scan lens, and a post-objective scheme using a so-called banana field flattener, consisting of a convex cylindrical hyperbolic mirror and a concave cylindrical parabolic mirror. Secondly, an analysis of the height-resolution requirements for triangulation and confocal depth sensing has been made. It is concluded that both methods of depth sensing require a synchronous scheme with a 50-60 degree detection angle in the cross-scan direction. It is shown that a post-objective scheme consisting of a banana mirror system combined with triangulation height detection offers the best solution for the optical requirements.
Common interferogram analysis techniques such as phase stepping or fringe analysis suffer from several drawbacks regarding the measurement procedure (three or more interferograms, phase unwrapping, etc.). We developed a new analysis algorithm which reduces these limitations to a minimum. The new algorithm can be applied to different types of interferograms, such as Fizeau, Michelson and shearing. The wave aberration is evaluated from a single interferogram (apart from its sign) by applying optimisation methods originally developed for lens design. Describing the wave aberration by a polynomial expansion, the coefficients of this expansion are used as variables which are adjusted iteratively so that the corresponding computed interferogram approximates the real interferogram. The procedure will be demonstrated for several examples of practical importance.
Interferometric tests are widely used to test high-precision optical systems. This kind of testing is about the only way to assure the desired accuracy and performance of those systems. In this paper a new lateral shearing interferometer (LSI) will be presented, originally designed to test infinity-corrected microscope objectives.