In this paper, we expand the eyebox of a lens-less holographic near-eye display (NED) using a passive eyebox-replication technique that combines a spatial light modulator (SLM) with a holographic optical element (HOE). In holographic NEDs, the space-bandwidth product (SBP) of the SLM determines the exit-pupil dimensions and the corresponding eyebox size. The base eyebox is replicated in the horizontal direction by using the horizontal high-order diffraction terms of the SLM under spherical-wave illumination together with a multiplexed HOE combiner. The HOE combiner serves as a see-through reflective screen for the projected holographic virtual image and is fabricated under a recording condition with two divergent spherical waves. When a digital blazed grating and a digital lens phase are added to the computed phase hologram sent to the SLM, two spatially separated horizontal high-order diffraction terms with identical intensity and information can be used for eyebox expansion. The field of view (FOV) is not sacrificed when the eyebox is expanded: spherical divergent-wave illumination removes the tradeoff between FOV and eyebox size. Astigmatic distortion introduced during HOE fabrication is counterbalanced by pre-correcting the target image in the computer-generated hologram computation. The experimental results show that the proposed prototype is a simple and effective way to achieve distortion-free reconstruction of the 3D virtual image and eyebox extension in a lens-less holographic NED.
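The following is a minimal sketch, not the authors' code, of the step described above: adding a digital blazed grating and a digital lens phase to a computed phase hologram before it is sent to the phase-only SLM, so that high-order diffraction terms are laterally shifted for eyebox replication. The wavelength, pixel pitch, grating period, and focal length are illustrative assumptions.

```python
import numpy as np

wavelength = 532e-9          # illumination wavelength [m] (assumed)
pitch = 8e-6                 # SLM pixel pitch [m] (assumed)
f_lens = 0.15                # digital lens focal length [m] (assumed)
grating_period = 10 * pitch  # blazed grating period [m] (assumed)

ny, nx = 1080, 1920          # SLM resolution (assumed)
x = (np.arange(nx) - nx / 2) * pitch
y = (np.arange(ny) - ny / 2) * pitch
X, Y = np.meshgrid(x, y)

phi_cgh = np.random.uniform(0, 2 * np.pi, (ny, nx))        # stand-in for the computed hologram phase
phi_grating = 2 * np.pi * X / grating_period                # digital blazed grating (horizontal carrier)
phi_lens = -np.pi * (X**2 + Y**2) / (wavelength * f_lens)   # digital lens phase (converging lens)

# The phase-only SLM displays the wrapped sum of the three terms.
phi_slm = np.mod(phi_cgh + phi_grating + phi_lens, 2 * np.pi)
```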
KEYWORDS: Printing, Holography, RGB color model, Digital holography, 3D printing, 3D modeling, Holographic materials, 3D image processing, Computer generated holography, Holograms
In this paper, simplified digital content generation using single-shot depth estimation for a full-color holographic printing system is proposed. First, the digital content generation is fully analyzed before the holographic printing hardware is run, so that a high-quality three-dimensional (3D) scene is provided without degrading the information of the original 3D object. Here, a single-shot depth estimation method is applied, and the 3D information is acquired from the estimated high-quality depth data and a given single 2D image. Then the array of sub-holograms (hogels) is generated directly by a fully analyzed computation that accounts for chromatic aberration in full-color printing. Finally, the generated hogels are recorded into the holographic material sequentially via effective time-controlled exposure under synchronized control with three electrical shutters for the RGB laser-beam illuminations, so that a full-color 3D reconstruction is obtained. Numerical simulation and optical reconstructions are implemented successfully.
We propose an advanced holographic see-through display system with 3D/2D switchable modes based on a liquid-crystalline lens array and a one-shot learning model. The liquid-crystalline lens array switches its role, acting either as a lens array or as plain glass, according to the state of the electrical polarizer. When the electrical polarizer is switched on, the camera captures the image of a real-world object, and the one-shot learning model estimates the depth data from the captured image. The 3D model is then regenerated from both the color and depth images; the elemental image array is generated and displayed on the microdisplay while the liquid-crystalline lens array reconstructs it as a 3D image. On the other hand, when the electrical polarizer is switched off, the camera captures the image of the real-world object, which is displayed directly on the microdisplay, while the liquid-crystalline lens array simply transmits it to the holographic combiner. The experimental results confirm that the proposed system is an advantageous way to implement a 3D/2D switchable holographic see-through system.
In this paper, color optimization of a full-color holographic stereogram printing system using a single SLM based on iterative exposure is proposed. First, an array of sub-holograms (hogels) is generated efficiently using fast computer-generated integral imaging together with a fully analyzed phase modulation for the red, green, and blue (RGB) channels of each hogel. Then, the generated hogels are recorded into the holographic material sequentially, where the SLM displays the R, G, and B channels of a single hogel via effective exposure under synchronized control with three electrical shutters for the RGB laser illumination, so that the optimized color reproduction is obtained. Numerical simulation and optical reconstructions are implemented.
This report proposes a three-dimensional/two-dimensional switchable augmented-reality display system using a liquid-crystalline lens array and an electrical polarizer. A depth camera connected to the proposed augmented-reality display system acquires the three-dimensional or two-dimensional information of the real objects. Here, the dual-function liquid-crystalline lens array switches its function according to the polarization direction of the electrical polarizer. The overall procedure of the proposed system is as follows: the depth camera captures the depth/color image or only the color image according to the state of the polarizer, and the three-dimensional or two-dimensional images are displayed separately on the augmented-reality display system. This allows the three-dimensional and two-dimensional modes to be switched automatically. In the two-dimensional mode, the captured color image of the real object is displayed directly. In the three-dimensional mode, the elemental image array is generated from the depth and color images and reconstructed as a three-dimensional image by the liquid-crystalline microlens array of the proposed augmented-reality display system. Although the proposed system cannot achieve real-time display in the three-dimensional mode, the direction-inversed computation method generates the elemental image arrays of the real object in a reasonably short time.
In this paper, a full-color holographic stereogram (HS) printing system based on effective digital content generation using the inverse-directed propagation (IDP) algorithm is proposed. The digital content is generated efficiently by fast computation based on the IDP algorithm, together with an optimized phase modulation of each hogel for the red, green, and blue (RGB) channels of the computer-generated hologram (CGH). Parallel computing is applied to provide high-resolution hologram data, exploiting the fact that the hogels are mutually independent. Finally, the generated hogels are recorded sequentially into the holographic material as a volume hologram via a fully automated hogel-printing setup using a single spatial light modulator (SLM) to obtain a full-color HS. Numerical simulation and optical reconstructions demonstrate that the content generation in the proposed IDP-based full-color HS printing system is simple and efficient without degrading the image quality of the holograms.
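Below is a hedged sketch of the parallel per-hogel computation that exploits the independence of the hogels. The encoding shown (an FFT of each hogel's directional view with a random phase) is a generic holographic-stereogram scheme used here only for illustration; it is not the paper's IDP algorithm, and the sizes and names are assumptions.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

HOGEL_SIZE = 256  # pixels per hogel side (assumed)

def encode_hogel(view):
    """Turn one directional-view patch into a phase-only hogel pattern."""
    field = np.sqrt(view) * np.exp(1j * np.random.uniform(0, 2 * np.pi, view.shape))
    hologram = np.fft.fftshift(np.fft.fft2(field))
    return np.angle(hologram)          # phase-only data for the SLM

def encode_all(views):
    """views: list of (HOGEL_SIZE, HOGEL_SIZE) arrays, one per hogel.
    Hogels are independent, so they are encoded in parallel processes."""
    with ProcessPoolExecutor() as pool:
        return list(pool.map(encode_hogel, views))

if __name__ == "__main__":
    views = [np.random.rand(HOGEL_SIZE, HOGEL_SIZE) for _ in range(16)]
    hogels = encode_all(views)
```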
An improved holographic waveguide-type two-dimensional/three-dimensional (2D/3D) convertible augmented-reality (AR) display system using a liquid-crystalline polymer microlens array (LCP-MA) with an electro-switching polarizer is proposed. The LCP-MA has properties such as a small focal ratio, high fill factor, low driving voltage, and fast switching speed, and it utilizes a well-aligned reactive mesogen on the imprinted reverse shape of the lens together with a polarization-switching layer. In the holographic waveguide, two holographic optical element (HOE) films are located at the input and output parts of the waveguide. These two HOEs function as a mirror and a magnifier, respectively, reflecting the light beams transmitted through the waveguide to the observer's eye as the reconstructed images. The proposed system retains the light weight and thin form factor of holographic AR displays, and the observer can see the 2D/3D convertible images, selected by the direction of the electro-switching polarizer, together with the real-world scene at the same time. In the experiment, it was successfully verified that the real-world scene and the reconstructed 2D/3D images were observed simultaneously.
A holographic stereogram printing system is a valuable method for outputting natural-view holographic three-dimensional (3D) images. Here, the 3D information of the object, such as parallax and depth, is encoded into the elemental holograms, i.e., hogels, and recorded onto the holographic material via laser illumination in the holographic printing process. However, owing to the low resolution of the hogels, the quality of the printed image is reduced. Therefore, in this paper, we propose a fully automatic high-resolution light field image acquisition system for real objects using a one-directional moving camera array and a smart motor-driven stage. The proposed system includes interconnected multiple cameras in a one-dimensional configuration, a multi-functional smart motor and controller, and computer-based integration between the cameras and the smart motor. After the user inputs the main parameters, such as the number of perspectives and the distance/rotation between neighboring perspectives, the multiple cameras automatically capture high-resolution perspectives of the real object by shifting and rotating on the smart motor-driven stage, and the captured images are utilized for the hogel generation of the holographic stereogram printing system. Finally, a natural-view holographic three-dimensional visualization of the real object is output on the holographic material through the holographic stereogram printing system. The proposed method was verified through an optical experiment, and the results confirm that the proposed one-dimensional moving camera array-based light field system is an effective way to acquire light field images for holographic stereogram printing.
Multiview display is a popular method for delivering three-dimensional (3D) images by generating perspective directional views. However, it has limitations such as low resolution, lack of motion parallax, and a narrow viewing angle. In this paper, we propose a method to implement a multi-view display system that provides a high-resolution 3D image. The setup is composed of a stereoscopic 3D display panel and a head-tracking camera. The directional view images of a 3D object are captured by a camera array and shown on the stereoscopic 3D display. A user interface is designed to control the hardware. An Intel RealSense SR300 camera is used to track the observer's viewing angle. The images are captured rotationally with a movable camera array over a 30-degree span, giving 71 views in the horizontal direction and 3 views in the vertical direction. The directional view information is displayed according to the observer's viewing direction as well as the head position, so the observer perceives a high-resolution 3D image with smooth motion parallax. Most importantly, the proposed system interactively displays the exact view direction corresponding to the user's viewing angle, which feels more natural to the observer.
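The sketch below illustrates one way the tracked head direction could be mapped to the pre-captured directional views (71 horizontal by 3 vertical views over a 30-degree horizontal span, as described above). The tracking call itself is omitted, and the vertical span is an assumed value for illustration.

```python
import numpy as np

H_VIEWS, V_VIEWS = 71, 3
H_SPAN_DEG = 30.0            # horizontal capture span (from the abstract)
V_SPAN_DEG = 10.0            # vertical capture span (assumed for illustration)

def select_view(yaw_deg, pitch_deg):
    """Return (horizontal, vertical) view indices for a tracked head angle."""
    h = np.clip((yaw_deg + H_SPAN_DEG / 2) / H_SPAN_DEG, 0, 1)
    v = np.clip((pitch_deg + V_SPAN_DEG / 2) / V_SPAN_DEG, 0, 1)
    return int(round(h * (H_VIEWS - 1))), int(round(v * (V_VIEWS - 1)))

print(select_view(0.0, 0.0))   # central head position -> view (35, 1)
```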
We propose a full-color three-dimensional holographic waveguide-type augmented-reality display system based on integral imaging using a holographic optical element mirror array (HOE-MA). As in a conventional holographic waveguide, two holographic optical elements are utilized as in- and out-couplers located at the input and output parts of the waveguide. The main roles of these films are to reflect the light beams coming from the microdisplay into the waveguide and to transmit the three-dimensional image reconstructed by the HOE-MA while reflecting it to the observer's eye. In the experiment, the augmented-reality feature was successfully verified: the real-world scene and the reconstructed virtual three-dimensional image were observed simultaneously.
We propose an effective method of digital content generation for a holographic printer using the integral imaging technique. In order to print a three-dimensional (3D) holographic visualization of a given object, the printed hologram, consisting of an array of sub-holograms (hogels), should be generated before the hardware system of the holographic printer is run. The digital content generation has three main parts. The first part is the acquisition of the 3D point-cloud object, and the second part is the encoding of the directional information extracted from the 3D object. In the third part, the hogel array is generated by direction-inversed computer-generated integral imaging plus phase modulation to improve the content generation; each hogel is displayed on a reflective phase-only spatial light modulator (SLM) and then recorded onto the holographic material one by one in sequence while a motorized X-Y translation stage shifts the material. Thus, the full-parallax holographic stereogram (HS) is printed on the holographic material, and the 3D visualization of the object is successfully observed. Numerical simulation and optical reconstruction verify the effective computation and the image quality, respectively.
In this paper, we propose an effective method for enhancing the resolution of the reconstructed image in a mobile three-dimensional (3D) integral imaging display system. A mobile 3D integral imaging display system is a valuable way to acquire the 3D information of real objects and display realistic 3D visualizations of them on a mobile display. Here, the 3D color and depth information are acquired by a 3D scanner, and the elemental image array (EIA) is generated virtually from the acquired 3D information. However, the resolution of the EIA is quite low because of the low resolution of the acquired depth information, and this limits the resolution of the final reconstructed image. In order to enhance the resolution of the reconstructed images, the EIA resolution should be improved by increasing the number of elemental images, because the resolution of the reconstructed image depends on the number of elemental images. For comfortable observation, the interpolation process should be iterated two or three times; however, if it is iterated more than twice, the reconstructed image is damaged and the quality is degraded considerably. To improve the resolution of the reconstructed images while maintaining image quality, we apply a convolutional super-resolution algorithm instead of the interpolation process. Finally, 3D visualizations with higher resolution and fine quality are displayed on the mobile display.
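A hedged sketch of the two upscaling routes discussed above: repeated bicubic interpolation versus a single learned super-resolution pass. The dnn_superres module comes from opencv-contrib and requires a pretrained model file (here an ESPCN x2 model, assumed to be available locally); this is an illustrative substitute, not the authors' exact network, and the file names are placeholders.

```python
import cv2

eia = cv2.imread("elemental_image_array.png")   # low-resolution EIA (assumed file)

# Route 1: interpolation iterated twice (x4 total), which can degrade image quality.
interp = eia
for _ in range(2):
    interp = cv2.resize(interp, None, fx=2, fy=2, interpolation=cv2.INTER_CUBIC)

# Route 2: convolutional super-resolution in place of the interpolation.
sr = cv2.dnn_superres.DnnSuperResImpl_create()
sr.readModel("ESPCN_x2.pb")                     # pretrained model file (assumed path)
sr.setModel("espcn", 2)
upscaled = sr.upsample(sr.upsample(eia))        # two x2 passes for a x4 result
```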
KEYWORDS: 3D image reconstruction, 3D image processing, Integral imaging, 3D displays, 3D acquisition, 3D modeling, 3D scanning, Cameras, Image quality, Mobile devices
In this paper, we focus on improving the reconstructed image quality of a mobile three-dimensional display using computer-generated integral imaging. A three-dimensional scanning method is applied instead of capturing a depth image in the acquisition step, so much more accurate three-dimensional view information (parallax and depth) can be acquired than with the previous mobile three-dimensional integral imaging display, and the proposed system can reconstruct clearer three-dimensional visualizations of real-world objects. Here, the user operates the three-dimensional scanner to acquire the three-dimensional parallax and depth information of the real-world object. Then, the entire acquired data set is organized, the virtual three-dimensional model is generated from it, and the elemental image array (EIA) is generated from the virtual three-dimensional model. Additionally, in order to enhance the resolution of the elemental image array, an intermediate-view elemental image generation method is applied. Here, five intermediate-view elemental images are generated among every four neighboring original elemental images according to the pixel information, so the resolution of the generated elemental image array is enhanced to almost four times the original. When the three-dimensional visualizations of real objects are reconstructed from the elemental image array with enhanced resolution, the quality is considerably improved compared with the previous mobile three-dimensional imaging system. The proposed method is verified by a real experiment.
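The following is a minimal sketch of intermediate-view elemental image generation. The paper derives five intermediate views among every four neighbors from the pixel information; here, purely for illustration, intermediate images are formed by averaging adjacent elemental images, which roughly quadruples the number of elemental images in the EIA.

```python
import numpy as np

def expand_eia(eia):
    """eia: array of shape (rows, cols, h, w) holding the original elemental images."""
    rows, cols, h, w = eia.shape
    out = np.zeros((2 * rows - 1, 2 * cols - 1, h, w), dtype=eia.dtype)
    out[::2, ::2] = eia                                        # keep the originals
    out[1::2, ::2] = (eia[:-1, :] + eia[1:, :]) / 2            # vertical intermediates
    out[::2, 1::2] = (eia[:, :-1] + eia[:, 1:]) / 2            # horizontal intermediates
    out[1::2, 1::2] = (eia[:-1, :-1] + eia[1:, 1:]) / 2        # diagonal intermediates
    return out

eia = np.random.rand(10, 10, 64, 64)      # stand-in for a 10 x 10 EIA
print(expand_eia(eia).shape)              # (19, 19, 64, 64)
```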
KEYWORDS: 3D modeling, 3D image reconstruction, Cameras, 3D image processing, Integral imaging, Data modeling, Clouds, Imaging systems, Image quality, 3D displays
An integral imaging system using a polygon model for a real object is proposed. After the depth and color data of the real object are acquired by a depth camera, the grid of the polygon model is converted from the initially reconstructed point-cloud model. The elemental image array is generated from the polygon model and directly reconstructed. The polygon model eliminates the failed picking areas between the points of a point-cloud model, so the quality of the reconstructed 3-D image is significantly improved. The theory is verified experimentally, and higher-quality images are obtained.
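Below is a hedged sketch of converting a captured point cloud into a polygon (triangle) mesh. Open3D's Poisson surface reconstruction is used as an illustrative meshing step; the paper's own gridding method may differ, and the file names are placeholders.

```python
import open3d as o3d

pcd = o3d.io.read_point_cloud("depth_camera_capture.ply")   # point cloud from the depth camera
pcd.estimate_normals()                                       # normals are required for Poisson meshing

# Poisson surface reconstruction closes the gaps between points that cause failed picking areas.
mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=9)
mesh = mesh.filter_smooth_simple(number_of_iterations=2)     # light smoothing of the polygon grid
o3d.io.write_triangle_mesh("polygon_model.ply", mesh)
```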
KEYWORDS: Clouds, 3D image processing, Cameras, 3D displays, Adaptive optics, 3D modeling, Image quality, Digital micromirror devices, Image resolution, Mirrors
A novel 360-degree integral-floating display based on a real object is proposed. The general procedure of the display system is similar to that of conventional 360-degree integral-floating displays. Unlike previously presented 360-degree displays, the proposed system displays a 3D image generated from a real object in a 360-degree viewing zone. In order to display the real object over 360 degrees, multiple depth cameras are utilized to acquire the depth information around the object. Then, 3D point-cloud representations of the real object are reconstructed from the acquired depth information. Using a special point-cloud registration method, the multiple virtual 3D point clouds captured by the individual depth cameras are combined into a single synthetic 3D point-cloud model, and the elemental image arrays are generated for the newly synthesized 3D point-cloud model according to the angular step of the given anamorphic optic system. The theory has been verified experimentally, and the results show that the proposed 360-degree integral-floating display is an excellent way to display real objects in a 360-degree viewing zone.
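The sketch below shows the merging step in its simplest form: each depth camera's point cloud is transformed with a known extrinsic pose and the results are stacked into one synthetic model. The registration algorithm that estimates those poses is described in the paper and is not reproduced here; the poses and point counts below are illustrative.

```python
import numpy as np

def to_world(points, R, t):
    """points: (N, 3) in camera coordinates; R: (3, 3) rotation; t: (3,) translation."""
    return points @ R.T + t

def merge_clouds(captures):
    """captures: list of (points, R, t) tuples, one per depth camera."""
    return np.vstack([to_world(p, R, t) for p, R, t in captures])

# Illustrative use with two cameras facing the object from opposite sides.
cam1 = (np.random.rand(1000, 3), np.eye(3), np.zeros(3))
R180 = np.diag([-1.0, 1.0, -1.0])                 # 180-degree rotation about the y axis
cam2 = (np.random.rand(1000, 3), R180, np.array([0.0, 0.0, 2.0]))
model = merge_clouds([cam1, cam2])
```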
A viewing-angle-enhanced integral imaging (II) system using multi-directional projections and an elemental image (EI) resizing method is proposed. In this method, each elemental lens of the micro-lens array collects the multi-directional illuminations of multiple EI sets and produces multiple point light sources (PLSs) at different positions in the focal plane, and the positions of the PLSs can be controlled by the projection angles. The viewing zone consists of multiple diverging ray bundles and is wider than in the conventional method, owing to the multi-directional projections of multiple EI sets, whereas a conventional system produces the viewing zone from only a single set of EI projection. Hence the viewing angle of the reconstructed image is enhanced.
KEYWORDS: Integral imaging, Displays, LCDs, Cameras, Parallel processing, Parallel computing, Image processing, 3D image processing, 3D image reconstruction, 3D displays
A depth camera is used to capture the depth and color data of real-world objects. As integral imaging display systems are broadly used, the elemental image array for the captured data needs to be generated and displayed on a liquid crystal display. We propose a real-time integral imaging display system that uses image processing to simplify the optical arrangement and graphics processing unit (GPU) parallel processing to reduce the computation time. The proposed system generates elemental images at a rate of more than 30 fps with a resolution of 1204×1204 pixels, where the size of each display-panel pixel is 0.1245 mm, using an array of 30×30 lenses, where each lens is 5×5 mm.
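A hedged CPU sketch of the core mapping (the paper performs it with GPU parallel processing): each captured 3D point is projected through every lens center onto the display plane to fill the elemental image array. Pitch, gap, and resolution values here are illustrative, not the experimental parameters, and all points are assumed to lie at positive depth in front of the lens array.

```python
import numpy as np

LENSES = 30            # 30 x 30 lens array
EI_RES = 40            # pixels per elemental image side (assumed)
LENS_PITCH = 5.0       # lens pitch [mm]
GAP = 6.0              # lens-to-display gap [mm] (assumed)

def generate_eia(points, colors):
    """points: (N, 3) in mm relative to the lens-array center, z > 0; colors: (N, 3)."""
    eia = np.zeros((LENSES * EI_RES, LENSES * EI_RES, 3))
    lens_centers = (np.arange(LENSES) - LENSES / 2 + 0.5) * LENS_PITCH
    for iy, cy in enumerate(lens_centers):
        for ix, cx in enumerate(lens_centers):
            # Project every point through this lens center onto the display plane.
            scale = -GAP / points[:, 2]
            u = (points[:, 0] - cx) * scale
            v = (points[:, 1] - cy) * scale
            px = np.round(u / LENS_PITCH * EI_RES + EI_RES / 2).astype(int)
            py = np.round(v / LENS_PITCH * EI_RES + EI_RES / 2).astype(int)
            ok = (px >= 0) & (px < EI_RES) & (py >= 0) & (py < EI_RES)
            eia[iy * EI_RES + py[ok], ix * EI_RES + px[ok]] = colors[ok]
    return eia

pts = np.random.rand(5000, 3) * [150, 150, 100] + [-75, -75, 50]   # synthetic object points [mm]
eia = generate_eia(pts, np.random.rand(5000, 3))
```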
KEYWORDS: 3D image processing, 3D displays, Mirrors, Integral imaging, Projection systems, Fresnel lenses, Digital micromirror devices, 3D vision, Diffusers, Image resolution
We propose a full-parallax integral imaging display with a 360-degree horizontal viewing angle. Two-dimensional (2D) elemental images are projected by a high-speed DMD projector and integrated into a three-dimensional (3D) image by a lens array. The anamorphic optic system tailors the horizontal and vertical viewing angles of the integrated 3D images in order to obtain a high angular ray density in the horizontal direction and a large viewing angle in the vertical direction. Finally, the mirror screen, which rotates in synchronization with the DMD projector, presents the integrated 3D images in the desired directions. Full-parallax 3D images with a 360-degree horizontal viewing angle and both monocular and binocular depth cues can be achieved by the proposed method.
We implemented dense light field microscopy using an infinity-corrected optical system. In the infinity-corrected optical system, the three-dimensional specimen located around the focal plane of the objective is imaged at the intermediate plane by the combination of the objective and the tube lens. This intermediate image is imaged again by the micro-lens array and captured by the CCD, providing the light field information. We analyzed the geometrical structure of the dense light field microscope for the infinity-corrected optical system, and from the analysis we defined the characteristics of and relationships among the components. Based on this result, we reconstructed various orthographic view images of the specimen from the captured light field and also generated depth-slice images using the computational integral imaging reconstruction principle.
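The following is a minimal sketch of the computational integral imaging reconstruction used to form depth-slice images: each elemental (sub-aperture) image is shifted in proportion to the candidate depth and the shifted copies are averaged, so features at that depth add up in focus. Array sizes and the shift-per-depth factor are illustrative assumptions.

```python
import numpy as np

def depth_slice(eia, shift):
    """eia: (rows, cols, h, w) light field; shift: per-lens pixel shift for this depth plane."""
    rows, cols, h, w = eia.shape
    acc = np.zeros((h, w))
    for r in range(rows):
        for c in range(cols):
            dy = int(round((r - rows / 2) * shift))
            dx = int(round((c - cols / 2) * shift))
            acc += np.roll(eia[r, c], (dy, dx), axis=(0, 1))   # shift-and-sum
    return acc / (rows * cols)

lf = np.random.rand(11, 11, 128, 128)                          # captured light field (stand-in)
slices = [depth_slice(lf, s) for s in np.linspace(-2, 2, 9)]   # one image per depth plane
```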
We propose a fast-access data storage system using a double-side hologram reconstruction scheme that can simultaneously read out holograms using both the forward and phase-conjugate reference beams. In this system, digital pages are recorded in the usual manner but are reconstructed on two CCD cameras by the double-side hologram reconstruction. As a result, we achieved a readout speed two times faster than the conventional readout, and the estimated raw bit error rates (BERs) were 3.09×10⁻¹⁸ (left image) and 4.76×10⁻¹⁶ (right image), respectively.
KEYWORDS: 3D image reconstruction, Digital holography, Holograms, Holography, Phase conjugation, Data storage, Digital imaging, Stereoscopic cameras, Image storage, Digital recording
We propose a double-side dual hologram reconstruction scheme that can simultaneously read out holograms using both the forward and phase-conjugate reference beams, and demonstrate stereo image recording and playback in a holographic memory system. The system is composed of a stereoscopic camera that obtains stereo image pairs, a holographic data storage unit in which the stereo images are recorded in the usual manner but read out by the double-side dual reconstruction, and a stereo monitor that uses polarized-light techniques. As a result, stereo digital pages reconstructed by the proposed double-side dual reconstruction method are obtained with very low crosstalk noise, and the estimated raw bit error rates (BERs) of the retrieved holograms were approximately 3.59×10⁻⁴ (left image) and 5.0×10⁻⁴ (right image), respectively.
A parallel stereoscopic camera has a linear relationship between vergence and focus control. We introduce an automatic control method for a stereoscopic camera system that uses this relationship between the vergence and focus of a parallel stereoscopic camera. The automatic control method uses disparity compensation of the image pair acquired from the stereoscopic camera. For faster extraction of disparity information, a binocular disparity estimation method based on a one-dimensional cepstral filter algorithm is investigated. The suggested system greatly reduces the extraction time and error, offering responsive control and greater real-time realism for acquiring high-quality stereoscopic images.
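A hedged sketch of one-dimensional cepstrum-based disparity estimation: the left and right scanlines are concatenated, and the echo created by the horizontal shift between them appears as a peak in the cepstrum at an offset equal to the disparity. This only illustrates the idea; the paper's filter design is not reproduced, and a positive disparity is assumed in the search.

```python
import numpy as np

def cepstral_disparity(left_row, right_row, max_disp=64):
    """left_row, right_row: 1-D intensity scanlines of equal length; returns the disparity."""
    signal = np.concatenate([left_row, right_row])
    spectrum = np.abs(np.fft.fft(signal)) + 1e-12          # avoid log(0)
    cepstrum = np.abs(np.fft.ifft(np.log(spectrum)))
    n = len(left_row)
    # The echo peak sits at quefrency n + disparity; search that window.
    window = cepstrum[n : n + max_disp]
    return int(np.argmax(window))

# Synthetic check: the right scanline is the left one shifted by 7 pixels.
left = np.random.rand(512)
right = np.roll(left, 7)
print(cepstral_disparity(left, right))   # expected: 7
```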
A new beam steering scheme using computer-generated holograms (CGHs) is proposed. Steering devices to control the reference and object waves are necessary in various holographic multiplexing methods. The CGH-based beam steering device can simultaneously process a coarse-address function, which steers the beam up or down to select a slice, and a fine-address function, which addresses a particular holographic page within the chosen layer. The experimental results show that the beam steering can be implemented easily and is a powerful way to generate an electrically addressed reference wave in a digital holographic memory system.
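Below is a hedged sketch of composing a CGH phase that carries both addressing functions described above: a coarse vertical grating term that steers the beam up or down to select a layer, and a fine horizontal grating term that addresses a page within that layer. Wavelength, pixel pitch, and angle steps are illustrative values, not the experimental ones.

```python
import numpy as np

wavelength = 532e-9          # [m] (assumed)
pitch = 8e-6                 # SLM pixel pitch [m] (assumed)
ny, nx = 1080, 1920          # SLM resolution (assumed)
x = (np.arange(nx) - nx / 2) * pitch
y = (np.arange(ny) - ny / 2) * pitch
X, Y = np.meshgrid(x, y)

def steering_phase(coarse_deg, fine_deg):
    """Phase pattern deflecting the beam by coarse_deg vertically and fine_deg horizontally."""
    ky = 2 * np.pi * np.sin(np.radians(coarse_deg)) / wavelength
    kx = 2 * np.pi * np.sin(np.radians(fine_deg)) / wavelength
    return np.mod(ky * Y + kx * X, 2 * np.pi)

# Example: select layer 3 (coarse step 0.5 deg) and page 12 (fine step 0.02 deg), both assumed.
cgh = steering_phase(3 * 0.5, 12 * 0.02)
```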
A parallel stereoscopic camera has a linear relationship between vergence and focus control. We introduce an automatic control method for a stereoscopic camera system that uses this relationship between the vergence and focus of a parallel stereoscopic camera. The automatic control method uses disparity compensation of the image pair acquired from the stereoscopic camera. For faster extraction of disparity information, a binocular disparity estimation method based on a one-dimensional cepstral filter algorithm is investigated. The suggested system substantially reduces the control time and error ratio, making it possible to achieve natural and clear images.