The semiconductor industry has relied largely on the mean+3sigma overlay dispositioning metric for over 20 years. As technology shrink progresses, this metric no longer represents the actual overlay condition on the wafer accurately. Continued use of the traditional metric leads to non-optimal rework decisions and potential yield loss in high-volume manufacturing (HVM). We propose an alternative overlay dispositioning metric that reduces rework without compromising accuracy or sampling. The proposed metric is called ‘number-of-dies-within-spec’. It is obtained by first evaluating the overlay model on a dense grid and then comparing the grid values against the spec limits. Based on that, each die can be evaluated, and the dies meeting the spec are counted to obtain the wafer key performance indicator (KPI) “number-of-dies-within-spec”. This paper shows the rework gain for two layers when using our proposed metric instead of the traditional mean+3sigma dispositioning.
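The dispositioning flow described above can be sketched as follows. This is a minimal illustrative implementation, not the authors' production code: the function name, the pass criterion (every grid point inside a die must be within spec), and the toy grid layout are all assumptions made for clarity.

```python
import numpy as np

def dies_within_spec(model_dx, model_dy, die_map, spec_nm):
    """Count dies whose modeled overlay stays within spec everywhere.

    model_dx, model_dy : overlay model evaluated on a dense grid (nm)
    die_map            : integer die index per grid point (-1 = outside wafer)
    spec_nm            : overlay spec limit (nm)
    """
    ovl = np.hypot(model_dx, model_dy)  # per-point overlay magnitude
    n_ok = 0
    for die in np.unique(die_map[die_map >= 0]):
        # assumed criterion: a die passes only if all its grid points are in spec
        if ovl[die_map == die].max() <= spec_nm:
            n_ok += 1
    return n_ok

# toy example: 4 dies on a 4x4 grid, 3 nm spec; one die has a 4 nm excursion
dx = np.array([[1, 1, 2, 2],
               [1, 1, 2, 2],
               [1, 4, 1, 1],
               [1, 4, 1, 1]], dtype=float)
dy = np.zeros_like(dx)
dies = np.array([[0, 0, 1, 1],
                 [0, 0, 1, 1],
                 [2, 2, 3, 3],
                 [2, 2, 3, 3]])
print(dies_within_spec(dx, dy, dies, spec_nm=3.0))  # → 3
```

The wafer is then dispositioned by comparing this count against a rework threshold, rather than reworking whenever mean+3sigma exceeds the spec.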
Improving on-product overlay is one of the key challenges when shrinking technology nodes in semiconductor manufacturing. Using information from non-lithography process steps can unlock overlay improvement potential.1 The challenge is to find intra-wafer signatures by measuring planar distortion. Several previous applications showed that using exposure tool wafer alignment data can improve overlay performance.2 With smart placement of alignment mark pairs in the X and Y directions, it is possible to determine intra-wafer distortion wafer by wafer. Both the measured and modeled results are applied directly as a feed-forward correction to enable wafer-level control. In this paper, this capability is evaluated in a feasibility study.
In leading-edge lithography, field-by-field corrections, also known as corrections per exposure, are well established. Many manufacturers use a combination of the traditional higher order wafer and intra-field polynomial corrections, combined with linear field-by-field corrections. However, non-linear wafer deformations are usually strongest at the wafer edge. Therefore, specific high order field-by-field corrections are the ultimate correction method to mitigate the effects of these non-linear wafer deformations. Determining the appropriate amount of high order field-by-field correction is, however, not trivial. At the wafer edge, exposure fields are often incomplete, so these fields contain fewer overlay marks and have a less regular mark distribution than fields that lie completely on the wafer. Therefore, even with dense measurements, it is challenging to model these fields with a high order model without overcorrecting. On the other hand, dense measurements demand substantial metrology capacity, so they can typically only be performed at low frequency. Alternatively, smart field-by-field modeling algorithms are available to compute higher order effects based on reduced sampling plans. In this paper, we study different algorithmic approaches to optimize modeling algorithms for both dense and sparse (reduced) sampling plans. We compare the impact of varying the frequency of dense sampling against the performance of different modeling algorithms on sparse sampling.
A standalone alignment technology was developed as a fundamental solution to improve on-product overlay (OPO). It enables high performance alignment measurements and delivers state-of-the-art feed-forward corrections to the exposure scanner. Dense alignment sampling and high-order field distortion correction are effective for scanner fingerprint matching and for heat-related field distortions. Modeling and sampling optimization software is a powerful tool for dense sampling and high-order overlay correction with minimal throughput loss. We performed an overlay experiment using the standalone alignment technology coupled with modeling and sampling optimization software, which demonstrates on-product overlay improvement potential for next-generation manufacturing accuracy and productivity challenges.
In leading edge lithography, overlay is usually controlled by feedback based on measurements on overlay targets, which are located between the dies. These measurements are done directly after developing the wafer. However, it is well-known that the measurement on the overlay marks does not always represent the actual device overlay correctly. This can be due to different factors, including mask writing errors, target-to-device differences and non-litho processing effects, for instance by the etch process.1
In order to verify these differences, overlay measurements are regularly done after the final etch process. These post-etch overlay measurements can be performed on the same overlay targets used in the post-litho overlay measurement, or on other targets. Alternatively, they can be in-device measurements using electron beam measurement tools (for instance CD-SEM). The difference between the standard post-litho measurement and the post-etch measurement is then calculated; the result is known as the litho-etch overlay bias.
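For clarity, the bias calculation amounts to a per-site subtraction. The sketch below is illustrative only: the function name and the sign convention (post-etch minus post-litho) are assumptions, as the abstract does not specify them.

```python
import numpy as np

def litho_etch_bias(post_litho_nm, post_etch_nm):
    """Per-site litho-etch overlay bias (assumed convention: etch - litho).

    Both inputs are overlay values (nm) measured at the same sites,
    e.g. one value per overlay target per wafer.
    """
    return np.asarray(post_etch_nm) - np.asarray(post_litho_nm)

# toy example: overlay in nm at three measurement sites
bias = litho_etch_bias([1.0, -0.5, 2.0], [1.5, 0.0, 1.0])
print(bias)
```

A systematic, stable bias of this kind is what would be fed into the run-to-run loop discussed next.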
This study focuses on the feasibility of post-etch overlay measurement run-to-run (R2R) feedback instead of post-lithography R2R feedback correction. It is known that the post-litho processes have strong non-linear influences on the in-device overlay signature and, hence, on the final overlay budget. A post-etch based R2R correction is able to mitigate such influences.2
This paper addresses several questions and challenges related to post-etch overlay measurement with respect to R2R feedback control. The behavior of the overlay targets in the scribe-line is compared to the overlay behavior of device structures. The influence of different measurement methodologies (optical image-based overlay vs. electron microscope overlay measurement) is evaluated, with scribe-line standard overlay targets measured by electron microscopy. In addition, the influence of the intra-field location of the targets on device-to-target shifts is evaluated.
There are different approaches to alignment sampling optimization. In order to determine which approach is optimal, OPAL run-to-run simulations1 must be executed using the results of the different sampling optimizations. This means there is a two-step approach: first, an iterative sampling optimization algorithm produces an optimal overlay model; then, a run-to-run simulation verifies the impact on overlay performance.
In this study, we investigate the behavior of four different approaches to alignment sampling optimization on four different layers and analyze which approach is most suitable for each layer.
It was proven that higher order intra-field alignment data modeling and correction has the potential to improve overlay performance by correcting reticle heating and lens heating effects both intra-wafer and wafer-to-wafer.1 However, challenges were also identified that needed further investigation. Because the alignment measurement is done on a coordinate system with absolute positions, the modeled iHOPC values can be high. A suitable method needs to be developed to distinguish between tool-to-tool offsets, process influence and layer-to-layer tool stack effects. In this paper we take the next step and evaluate the overlay improvement potential of using intra-field alignment data in an overlay feed-forward simulation. An overlay run-to-run simulation is then performed to estimate the optimization potential. To simulate higher order intra-field overlay, dense alignment data is needed. To optimize the number of measured marks without losing relevant information, an intra-field alignment mark sampling optimization is done to find the best compromise between throughput and overlay accuracy.
Before each wafer exposure, the photolithography scanner’s alignment system measures alignment marks to correct for placement errors and wafer deformation. To minimize throughput impact, the number of alignment measurements is limited. Usually, the wafer alignment does not correct for intra-field effects. However, after calibration of lens and reticle heating, residual heating effects remain. A set of wafers is exposed with special reticles containing many alignment marks, enabling intra-field alignment. Reticles with a dense alignment layout have been used, each with a different defined intra-field bias. In addition, overlay simulations are performed with dedicated higher order intra-field overlay models to compensate for wafer-to-wafer and across-wafer heating.
Advanced processing methods like multiple patterning necessitate improved monitoring of intra-layer uniformity and balancing for overlay and CD. To achieve these requirements without major throughput impact, a new advanced measurement mark is introduced. Based on an optical measurement, this mark delivers CD and overlay results for a specified layer at once. In experiments conducted in the front-end-of-line (FEOL) process area, a mark selection is done and the measurement capability of this mark design is verified. The gathered results are used to determine lithography-to-etch biases and intra-wafer signatures for CD and overlay. Furthermore, possible use cases such as dose correction recipe creation and process signature monitoring are discussed.
Monitoring the long-term performance of projection optics in lithographic exposure systems is becoming increasingly important, especially at 193 nm wavelength. Various effects influence the quality and long-term stability of a projection lens system. Using the well-known and established blazed phase-grating method, it is possible to identify lens degradation before it becomes a significant detractor in a manufacturing process. A two-beam interferometer formed by a blazed grating reticle is used to measure the aberration values. This works on all DUV tools, and therefore allows a comparison of tools from different suppliers. The test can be run after regular preventive maintenance or as a daily monitor check, in order to track lens aberration over time. By storing the results, it is easy to generate a tool-specific database. In this paper, we show aberration data over time and the possibility to increase tool performance and stability.