
Progress in calibration and 3D measurement of imaging systems based on the ray model

1. Background

Machine vision can be called the "eye" of artificial intelligence, and the calibration of the imaging system is one of the key links in machine vision processing: its calibration accuracy and stability directly affect the performance of the whole system. In traditional machine vision and camera-based measurement and calibration, the pinhole perspective model still suffers from two problems: high-order lens distortion cannot be fully characterized, and the model is not applicable to many kinds of complex, special imaging systems.

The ray-based model assumes that, with the imaging system in focus, each pixel corresponds to a virtual chief ray in space. Calibration and imaging characterization are achieved by determining the parameters of the ray equation corresponding to every pixel, which avoids structural analysis and modeling of complex imaging systems. Based on this ray model, the research group has developed a variety of special fringe structured light 3D measurement methods and systems. Experiments show that the ray model supports high-precision measurement with many kinds of complex imaging systems; it is an effective model for calibrating non-pinhole perspective imaging systems and can serve as a complement to the perspective model.

2. Ray model

Baker et al. first proposed a ray model that can characterize an arbitrary imaging system [1]. The image is treated as a discrete set of pixels, and a group of virtual photosensitive elements ("ray elements") represents the complete geometric, radiometric, and optical characteristics between each pixel and the virtual ray in space associated with it, as shown in Figure 1.

Calibrating the ray model therefore amounts to determining the ray equation corresponding to every pixel, without rigorous analysis and construction of the complex optical imaging model of the system. The approach is portable and general, and to a certain extent it also avoids the measurement error introduced by polynomial approximations of lens distortion, providing a new idea for characterizing imaging systems that do not follow the pinhole perspective projection model.

Figure 1 Schematic diagram of the ray model of an imaging system
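To make the calibration idea concrete, the following minimal sketch (an illustrative example, not code from the cited work) fits the ray of a single pixel to the 3D points that this pixel observes when a calibration target is placed at several known depths; NumPy's SVD provides the least-squares line.

```python
import numpy as np

def fit_pixel_ray(points):
    """Fit a 3D ray (point + unit direction) to the calibration points
    associated with one pixel, via least-squares line fitting (SVD).

    points: (N, 3) array of 3D coordinates seen by this pixel when a
            calibration target is placed at N different depths.
    """
    points = np.asarray(points, dtype=float)
    origin = points.mean(axis=0)               # a point on the ray
    _, _, vt = np.linalg.svd(points - origin)  # principal direction
    direction = vt[0] / np.linalg.norm(vt[0])  # unit direction vector
    return origin, direction

# Example: noisy samples of the line P(t) = (1, 2, 0) + t * (0, 0, 1)
t = np.linspace(0, 100, 11)
pts = np.c_[np.full_like(t, 1.0), np.full_like(t, 2.0), t]
pts += np.random.normal(scale=0.01, size=pts.shape)
origin, direction = fit_pixel_ray(pts)
print(origin, direction)
```

Repeating such a fit for every pixel yields the per-pixel ray table that the calibration described above produces.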

3. Three-dimensional measurement of fringe structured light based on the ray model

In fringe structured light projection 3D measurement, the ray model can, on the one hand, serve as the scheme for 3D reconstruction: it can characterize imaging and projection devices with large-distortion lenses, light field cameras, DMD projectors, MEMS projectors, and other special structures, enabling new ray-model-based fringe structured light 3D measurement methods and systems. On the other hand, the advantages of the ray model in structured light measurement can be further explored: it is very effective at overcoming the nonlinear response of the projector and camera and at improving 3D reconstruction accuracy under large-distortion lens imaging.

3.1 Ray model and three-dimensional measurement of a Scheimpflug small-field-of-view telecentric structured light measurement system

The research group has developed a small-field-of-view telecentric structured light measurement system that adopts a Scheimpflug configuration to ensure overlapping depth-of-field coverage, as shown in Figure 2. Considering that the telecentric lens performs parallel orthographic projection and that the Scheimpflug tilt makes the distortion model non-centrosymmetric, a non-parametric generalized calibration method based on the ray model was proposed [2]. In this system, the imaging processes of the camera and the projector are both represented by the ray model, the correspondence between their pixels and rays in space is calibrated, and the coordinates of the ray intersections are computed to realize 3D reconstruction. Figure 3 shows a photograph of the system and the 3D measurement result of a small local area of a nickel coin, with a measurement accuracy of 2 μm.

Figure 2 Scheimpflug small-field-of-view telecentric structured light measurement system

Figure 3 Photograph of the measurement system and 3D measurement result of a local area of a nickel coin
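As an illustration of the ray-intersection step described above, the following generic sketch (not the group's implementation) reconstructs a 3D point as the midpoint of the common perpendicular between a calibrated camera ray and a calibrated projector ray.

```python
import numpy as np

def intersect_rays(o1, d1, o2, d2):
    """Midpoint of the common perpendicular of two (possibly skew) rays.

    Each ray is given by an origin o and a unit direction d; the returned
    point is the least-squares 'intersection' used for 3D reconstruction.
    """
    o1, d1, o2, d2 = map(np.asarray, (o1, d1, o2, d2))
    b = o2 - o1
    # Normal equations for minimizing |(o1 + t1*d1) - (o2 + t2*d2)|^2
    A = np.array([[d1 @ d1, -d1 @ d2],
                  [d1 @ d2, -d2 @ d2]])
    t1, t2 = np.linalg.solve(A, np.array([b @ d1, b @ d2]))
    p1 = o1 + t1 * d1   # closest point on ray 1
    p2 = o2 + t2 * d2   # closest point on ray 2
    return 0.5 * (p1 + p2)

# Two rays meeting near (0, 0, 10)
print(intersect_rays([0, -5, 0], [0, 0.447, 0.894],
                     [0,  5, 0], [0, -0.447, 0.894]))
```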

3.2 Ray model calibration of a light field camera and active light field three-dimensional measurement

The research group has developed a light field 3D measurement method and system based on active fringe structured light illumination. By placing a microlens array in front of the sensor plane, a light field camera can record light intensity and direction simultaneously. However, because of complex factors such as microlens machining errors, distortion and aberrations, and assembly errors, complete characterization and precise calibration of a light field camera is difficult.

The research group proposed a ray model to characterize the light field imaging process [3]: the interior of the light field camera is regarded as a black box, and the parameters of the object-space ray equation L corresponding to each pixel m are established directly, as shown in Figure 4. By calibrating the mapping between all light field rays and the phase of the projected fringes, high-precision 3D measurement of the object is realized. Exploiting the multi-view recording property of the light field, a data screening mechanism based on fringe modulation was constructed to achieve high-dynamic-range 3D measurement of the scene. As shown in Figure 5, a black panel and reflective metal can be reconstructed at the same time.

Figure 4 Light field imaging model

Figure 5 High-dynamic-range 3D measurement with the active light field
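The sketch below shows, under standard phase-shifting assumptions (N equally spaced shifts of 2π/N), how wrapped phase and fringe modulation can be computed per pixel or per light field sub-view; the modulation value is what a screening mechanism of the kind described above would threshold. It is a generic example, not the published implementation.

```python
import numpy as np

def phase_and_modulation(images):
    """Wrapped phase and fringe modulation from N-step phase-shifted images.

    images: (N, H, W) stack with phase shifts 2*pi*k/N, k = 0..N-1.
    Samples with low modulation (e.g. saturated or dark sub-views) can be
    discarded before the ray-phase mapping.
    """
    images = np.asarray(images, dtype=float)
    n = images.shape[0]
    k = np.arange(n).reshape(-1, 1, 1)
    s = np.sum(images * np.sin(2 * np.pi * k / n), axis=0)
    c = np.sum(images * np.cos(2 * np.pi * k / n), axis=0)
    phase = -np.arctan2(s, c)                    # wrapped phase in (-pi, pi]
    modulation = 2.0 / n * np.sqrt(s**2 + c**2)  # fringe modulation B
    return phase, modulation

# Screening idea: mask = modulation > threshold  (threshold chosen empirically)
```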

3.3 Ray model calibration and three-dimensional measurement of a DMD projector and a biaxial MEMS laser scanning projector

Projectors based on micro-electro-mechanical system (MEMS) laser scanning are used in fringe projection measurement systems for their miniaturization and large depth of field, as shown in Figure 6(a). However, because such a projector forms its patterns by biaxial MEMS scanning of a laser spot rather than by lens imaging, the perspective projection model characterizes it with some error. In addition, for DMD and other projectors that do rely on lens imaging, a large-aperture design also degrades the characterization accuracy of the pinhole perspective projection model.

To address this, the research group used the ray model to characterize the projector [4] and proposed a calibration method for fringe projection 3D measurement systems based on a projector ray model. The method identifies and tracks projector rays from the orthogonal phases of the biaxial MEMS projection and realizes 3D reconstruction by triangulating the projector rays with the rays constructed from the camera. It was further found that, owing to the phase consistency along a projection ray, the ray model can significantly suppress the measurement error caused by the nonlinear response of the system. Figure 6(b) shows the 3D reconstruction results of a plaster sculpture obtained with the perspective projection model and with the ray model under three-step phase shifting (without additional correction of the nonlinear response); the ray model is essentially immune to the nonlinear response.

Figure 6 Principle of biaxial MEMS laser scanning projection and 3D reconstruction results of a plaster sculpture (three-step phase shift; perspective projection model on the left, ray model on the right)
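As a hypothetical sketch of the ray identification step, assume a calibrated lookup table of projector rays indexed by the projector pattern coordinates (the table layout, function name, and nearest-neighbour lookup below are illustrative assumptions, not the published method); the orthogonal absolute phases measured at a camera pixel then address the corresponding projector ray.

```python
import numpy as np

def projector_ray_from_phases(phase_h, phase_v, periods_h, periods_v,
                              ray_origin, ray_dir):
    """Look up the calibrated projector ray addressed by a camera pixel.

    phase_h, phase_v : absolute phases measured at the camera pixel from the
                       horizontal and vertical fringe sequences (radians).
    periods_h/v      : number of fringe periods across the projector pattern.
    ray_origin, ray_dir : (H, W, 3) tables of calibrated projector ray origins
                          and unit directions (hypothetical layout).
    """
    h, w, _ = ray_origin.shape
    # Convert absolute phase to fractional projector coordinates, then take the
    # nearest calibrated ray (interpolation between rays could be used instead).
    u = phase_h / (2 * np.pi * periods_h) * (w - 1)
    v = phase_v / (2 * np.pi * periods_v) * (h - 1)
    iu = int(np.clip(round(u), 0, w - 1))
    iv = int(np.clip(round(v), 0, h - 1))
    return ray_origin[iv, iu], ray_dir[iv, iu]
```

The 3D point then follows by intersecting this projector ray with the camera ray of the same pixel, for example with the `intersect_rays` sketch shown in Section 3.1.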

3.4 Ray model calibration and three-dimensional measurement of a uniaxial MEMS laser scanning projector

The uniaxial MEMS projector extends laser point scanning to area scanning, which greatly increases the projection rate and makes dynamic measurement possible. Because of its lensless structure, the pinhole model is not applicable, and its unidirectional projection cannot provide orthogonal phase feature points. To address these problems, the research group proposed a system calibration method based on an isophase plane model [5], derived a new mapping function between the phase value and the 3D coordinates of the intersection of the camera back-projection ray with these isophase planes, and realized fast 3D reconstruction.
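A minimal sketch of the geometric core of this idea, assuming each isophase plane has already been recovered from calibration as a plane equation n·P = c (the exact form of the calibrated phase-to-plane mapping is an assumption here): the 3D point is the intersection of the camera back-projection ray with that plane.

```python
import numpy as np

def intersect_ray_with_isophase_plane(o, d, n, c):
    """Intersect a camera back-projection ray with an isophase plane.

    Ray:   P(t) = o + t * d
    Plane: n . P = c   (n and c would come from the calibrated phase-to-plane
                        mapping; here they are plain inputs for illustration)
    """
    o, d, n = map(np.asarray, (o, d, n))
    t = (c - n @ o) / (n @ d)   # assumes the ray is not parallel to the plane
    return o + t * d

# Example: ray from the origin along +z, plane z = 500
print(intersect_ray_with_isophase_plane([0, 0, 0], [0, 0, 1], [0, 0, 1], 500.0))
```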

Figure 7 shows a monocular measurement system built with a high-speed camera and a reconstructed dynamic scene. The projection and acquisition rate is 1000 frames/s; using four-step phase shifting with Gray-code phase unwrapping, the 3D reconstruction rate reaches 90 frames/s. To support higher-rate measurement applications, the monocular setup can later be extended to binocular or multi-camera systems, and methods such as single-frame phase demodulation and epipolar-constrained phase unwrapping can be used to reduce the number of projected patterns and increase the 3D measurement rate.

Figure 7 3D measurement system and dynamic reconstruction scene
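For completeness, a minimal sketch of the temporal phase unwrapping mentioned above, assuming the wrapped phase comes from the N-step formula shown earlier and that the Gray-code period boundaries are aligned with the 2π phase jumps (in practice a complementary Gray-code or boundary-correction strategy is usually added).

```python
import numpy as np

def unwrap_with_gray_code(wrapped_phase, fringe_order):
    """Absolute phase from wrapped phase plus the decoded Gray-code order.

    wrapped_phase: wrapped phase, e.g. from four-step phase shifting.
    fringe_order : integer fringe period index decoded from the Gray-code
                   patterns (assumed aligned with the phase jumps).
    """
    return np.asarray(wrapped_phase) + 2 * np.pi * np.asarray(fringe_order)
```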

3.5 Ray model calibration and three-dimensional measurement of large-distortion lens imaging

To address the problem that traditional low-order polynomials cannot completely represent a large-distortion lens, the research group uses the ray model to characterize imaging through a large-distortion lens camera and proposes a ray-to-fringe-phase-mapping 3D reconstruction method that is completely independent of the camera and projector intrinsic parameters (on which the perspective model depends). By directly calibrating the reciprocal-polynomial mapping coefficients between camera rays and fringe phase, the tedious and time-consuming corresponding-point search and ray interpolation operations are avoided.
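The following sketch shows one possible low-order rational (reciprocal-style) mapping fitted per camera ray from phase to depth; the specific polynomial form and coefficients used in the group's method may differ, so this is only an illustration of calibrating a direct phase-to-coordinate mapping.

```python
import numpy as np

def fit_phase_to_depth(phases, depths):
    """Fit a rational phase-to-depth mapping for one camera ray:

        z = (a0 + a1 * phi) / (1 + b1 * phi)

    Linearized as  z = a0 + a1*phi - b1*phi*z  and solved by least squares.
    """
    phi = np.asarray(phases, dtype=float)
    z = np.asarray(depths, dtype=float)
    A = np.column_stack([np.ones_like(phi), phi, -phi * z])
    (a0, a1, b1), *_ = np.linalg.lstsq(A, z, rcond=None)
    return a0, a1, b1

def apply_mapping(coeffs, phi):
    a0, a1, b1 = coeffs
    return (a0 + a1 * phi) / (1.0 + b1 * phi)

# Example with synthetic calibration samples following the assumed form
phi = np.linspace(0, 50, 20)
z = (10 + 2 * phi) / (1 + 0.01 * phi)
coeffs = fit_phase_to_depth(phi, z)
print(apply_mapping(coeffs, 25.0))   # close to (10 + 50) / 1.25 = 48.0
```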

Figure 8 shows the ray calibration results and the 3D measurement results of a standard sphere obtained with a 4 mm wide-angle lens. Because of the large distortion of the wide-angle lens, the reconstruction quality of the ray model is clearly better than that of the perspective model.

Figure 8 Fitting error distribution of the wide-angle lens ray calibration and 3D measurement data of a standard sphere: (a) perspective projection model, (b) ray mapping model

4. Summary

The ray model realizes calibration and imaging characterization by determining the parameters of the ray equation corresponding to every pixel. It thus avoids structural analysis and modeling of complex imaging (or projection) systems, solves the calibration and reconstruction problems of special fringe projection 3D measurement systems, and performs well in suppressing nonlinear phase error and improving the accuracy of fringe projection 3D measurement. In the future development of structured light 3D measurement, the methods and applications of ray-model-based 3D measurement can be further expanded to improve measurement accuracy, efficiency, and versatility and to solve measurement problems in various special and complex application scenarios.

References

[1] Baker S, Nayar S K. A theory of catadioptric image formation[C]//Sixth International Conference on Computer Vision (IEEE Cat. No.98CH36271), January 7, 1998, Bombay, India. New York: IEEE Press, 1998:35-42.

[2] Yin Y K, Wang M, Gao B Z, et al. Fringe projection 3D microscopy with the general imaging model[J]. Optics Express, 2015, 23(5):6846-6857.

[3] Cai Z W, Liu X L, Peng X, et al. Ray calibration and phase mapping for structured-light-field 3D reconstruction[J]. Optics Express, 2018, 26(6):7598-7613.

[4] Yang Y, Miao Y P, Cai Z W, et al. A novel projector ray-model for 3D measurement in fringe projection profilometry[J]. Optics and Lasers in Engineering, 2022, 149:106818.

[5] Miao Y P, Yang Y, Hou Q Y, et al. High-efficiency 3D reconstruction with a uniaxial MEMS-based fringe projection profilometry[J]. Optics Express, 2021, 29(21):34243-34257.

Introduction to the research group:

Authors: Liu Xiaoli, Yang Yang, Yu Jing, Miao Yupei, Zhang Xiaojie, Peng Xiang, Yu Qifeng; Shenzhen Key Laboratory of Intelligent Optical Measurement and Perception, School of Physics and Optoelectronic Engineering, Shenzhen University.

Led by Academician Yu Qifeng, the Institute of Intelligent Optical Measurement and Image Research of Shenzhen University mainly focuses on large-scale structural deformation and large-range motion measurement, extraordinary optical measurement and intelligent image analysis, computational imaging and three-dimensional measurement, and multi-sensor fusion perception and control.