
Development of a Software Based Calibration System for Automobile Assembly System Oriented AR


Park, H.S. ; Park, J.W.


1. Introduction

With the development of virtual reality technology, many computer-science businesses and manufacturers are investigating augmented reality as a new human-machine interface. However, recent augmented reality research has concentrated on the accuracy of image matching and object-tracking techniques, and its fields of application have been limited to mobile applications (Bimber et al., 2008; Daniel et al., 2007).

Research on augmented reality applications in automotive assembly systems is being carried out by only a few automotive manufacturing companies (Ong et al., 2004). In current practice, however, system layout and process simulation are realized with virtual reality technology rather than augmented reality (Gunter et al., 2005; Kim et al., 2000). Conventional 3D virtual simulators cannot animate realistic behaviour such as the dynamic bending displacement of virtual robot arms in a manufacturing system (Fig. 1). For this reason, many researchers are studying calibration technology for virtual objects in 3D simulation systems (Kimn et al., 2010; Wolfgang, 2006).

[FIGURE 1 OMITTED]

To implement an automotive assembly system using existing augmented reality technology, new and precise calibration is required after setup, because effects such as the deflection of the virtual robot arm and of the tool itself, caused by the heavy load of the tools installed on the robot, are not reflected. Since the robot is the most frequently used piece of equipment in automotive assembly operations, this problem is a major obstacle to applying augmented reality techniques to the automobile assembly system. Moreover, in AR technology the coordinate accuracy of the marker depends on the distortion of the camera lens, and lens calibration is performed to compensate for this distortion (Gruen et al., 2001). While programming robot positions, robot teaching operators use the zoom-in/zoom-out function of high-definition cameras and change the camera locations to obtain images for superimposing the scenes. For these reasons, the initially calibrated parameters cannot be used in a changed environment, such as a changed camera location or zoom state. To solve these problems, this research introduces a software-based calibration method that applies augmented reality effectively to the automobile assembly system, covering both the deflection of the tool itself under heavy load and the camera lens calibration. To this end, the camera lens calibration module and the direct compensation module for the virtual object displacement were designed and implemented. Furthermore, the developed automobile-assembly-oriented AR system was verified by a practical test.

2. Existing Augmented Reality System

2.1 Analysis of Existing Augmented Reality System

The existing augmented reality system consists of four core modules, as shown in Fig. 2: a video interface for handling video and virtual objects, a tracking module, a rendering module, and a measurement module.

[FIGURE 2 OMITTED]

The marker information continuously tracked by the camera passes through image-processing steps, through which the augmented reality system becomes aware of the marker's location (Gerald et al., 2006; Jun et al., 2000). In addition, by matching against the information stored in the marker database, such as the size and pattern of the marker, the system creates a coordinate system in which to place virtual objects. Virtual objects are created and removed by the rendering module based on the generated coordinate system. The relationships between markers are defined through the UI. Furthermore, coordinate transformations based on three translations, three rotations, and three scalings position the virtual objects. The interference measurement module detects interference between 3D objects and measures distances, helping users obtain an effective view.
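
To make this pipeline concrete, here is a minimal sketch of the tracking-to-rendering handoff. It uses OpenCV's ArUco contrib module (pre-4.7 API) as an illustrative stand-in for the ARToolKit-style tracker described in the text; the camera intrinsics are placeholder values that would normally come from the lens calibration module (Section 3.1).

```python
import cv2
import numpy as np

# Placeholder intrinsics; in the paper these come from lens calibration.
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)  # assume already-undistorted input here

# ArUco used as a stand-in marker dictionary (pre-4.7 OpenCV contrib API).
dictionary = cv2.aruco.Dictionary_get(cv2.aruco.DICT_4X4_50)
params = cv2.aruco.DetectorParameters_create()

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary, parameters=params)
    if ids is not None:
        # Marker side length (40 mm here) must match the size stored
        # in the marker database, as described above.
        rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
            corners, 0.04, camera_matrix, dist_coeffs)
        # rvecs/tvecs define the marker coordinate system in which the
        # rendering module places virtual objects.
        for rvec, tvec in zip(rvecs, tvecs):
            cv2.drawFrameAxes(frame, camera_matrix, dist_coeffs, rvec, tvec, 0.02)
    cv2.imshow("AR tracking", frame)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break
cap.release()
```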

Clipping planes can be generated to check collisions between virtual objects and to expose their inner regions. A generated clipping plane has no visual representation of its own, which makes it difficult to recognize its location in the virtual space. To solve this problem, a grid-plane virtual object is created and matched with the clipping plane, as shown in Fig. 3. The manufacturing field also has poor lighting conditions, so the optical marker cannot be recognized easily. To compensate for this handicap, the threshold value is controlled through a panel; the threshold control binarizes the image using a selectable pixel value between 0 and 255 (Fig. 4).
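
The threshold control described above corresponds to a standard binary threshold. A minimal sketch with OpenCV (the file names are placeholders):

```python
import cv2

# Grayscale frame from the camera; the threshold value (0-255) is the
# quantity the panel described above exposes to the user.
gray = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
threshold = 100  # raised or lowered to cope with poor shop-floor lighting
_, binary = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
cv2.imwrite("binary.png", binary)
```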

[FIGURE 3 OMITTED]

[FIGURE 4 OMITTED]

2.2 Deriving the Problems of the Existing System

The existing augmented reality system reduces modeling time by matching virtual objects to the real surrounding environment, and it has a cost advantage because it requires little preparatory work. Despite these advantages, however, the current augmented reality system requires precise re-calibration after setup. In other words, the utilization rate of the existing augmented reality system drops for fine work because of the deflection of heavy tools installed on the robot arm and the deflection caused by the weight of the robot arm itself.

Fig. 5 shows the existing cockpit-module assembly system used to evaluate the current manufacturing-oriented AR system. In Fig. 5, [??] and [??] mark the assembly points between the cockpit module and the automobile body. Tab. 1 lists the robot teaching points for assembling the cockpit module in the real assembly system without AR technology and the teaching points obtained by using AR, together with the coordinate errors between the two. The test bed was constructed as a one-twelfth-scale model of the conventional cockpit-module assembly station.

[FIGURE 5 OMITTED]

[FIGURE 6 OMITTED]

The ambitious goal of AR is to create the sensation that virtual objects are present in the real world. To achieve this effect, software combines VR elements with the real world. Obviously, AR is most effective when the virtual elements are added in real time, so AR commonly involves augmenting a real-time digital video image with 2D or 3D objects. AR technology can remarkably reduce modeling work, because it uses the real environment to design and plan manufacturing systems. Nevertheless, (1) the existing AR system requires additional work to compensate for image distortion, and re-calibration is required after zooming in or out; it also (2) does not consider the deflection of heavy tools installed on the robot arm, (3) does not consider the deflection of robot joints caused by heavy tools, and (4) does not consider the dynamic deflection due to robot movement and stress distribution. For these reasons, the utilization of the existing AR system remains low (Fig. 6). In this paper, we focus on problems (1) and (2) to verify the introduced method and strategy as an early stage of the research.

3. Lens and Deflection Calibration for the Manufacturing-Oriented AR System

3.1 Camera Lens Calibration

Without calibration, the image transmitted from the camera to the AR system is distorted by the curved shape of the lens. Such a distorted image can still be used to track marker coordinates in non-precision applications such as education or entertainment (Lee et al., 2008; Rhee et al., 2007). Because the automobile assembly system needs high accuracy and precision, however, the distorted image must be calibrated. In particular, the accuracy of object tracking depends on the camera zoom function: if the user changes the zoom state of the camera, the calibration work has to be re-performed to maintain the accuracy and precision of tracking in the AR system.

In this research, the developed system notifies the user when the zoom state of the camera changes. The user then re-positions the chessboard that was used for the initial calibration, and the calibration is re-performed. The re-calibration captures 20 frames within one second, and the computed parameters are written to the existing calibration parameter data file and sent to the tracking module. Fig. 7 shows the work procedure of the lens calibration module.
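
This re-calibration procedure maps naturally onto OpenCV's chessboard calibration. The sketch below captures 20 chessboard views and recomputes the parameters; the 17 x 11 inner-corner count is derived from the 18 x 12 squares given in Section 4.2.2, while the device index and termination criteria are assumptions.

```python
import cv2
import numpy as np

# Inner-corner grid of the chessboard: an 18 x 12 pattern of 1 cm
# squares has 17 x 11 inner corners.
pattern = (17, 11)
square = 10.0  # square size in mm

# 3D coordinates of the corners in the chessboard's own plane (z = 0).
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_points, img_points = [], []
cap = cv2.VideoCapture(0)
while len(img_points) < 20:  # 20 measurement frames, as in the paper
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001))
        obj_points.append(objp)
        img_points.append(corners)
cap.release()

# The new intrinsics and distortion coefficients replace the previous
# parameter file and are handed to the tracking module.
rms, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("RMS re-projection error:", rms)
```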

[FIGURE 7 OMITTED]

3.2 Direct Calibration based on Virtual Object

After the calibration work for the distorted image is finished, virtual objects can be placed at the exact position in the marker coordinate system. The deflection module then calculates the displacement of the heavy tools installed on the robot arm. The whole work procedure, including the deflection of the robot joints caused by heavy tools, is shown in Fig. 8. This paper focuses on reducing the visual errors caused by the deflection of the virtual object on the display, in order to check the efficiency of the direct calibration method.

[FIGURE 8 OMITTED]

[FIGURE 9 OMITTED]

Fig. 9 shows the internal structure of the deflection calibration module that is combined with the augmented reality system. After obtaining the position data of the robot, the deflection calibration module analyzes information about the virtual objects, such as the marker number on the robot arm and the coordinate system.

Just like the measurement module, it identifies the coordinate system recognized by the system and selects an analytical model based on the user's selection. After the finite element analysis method is selected, the stiffness matrix [K] is used to solve the resulting linear system of equations. The derived results are transmitted to the rendering module, and this information is displayed on the screen together with the analysis result, such as bending. Euler beam element analysis is used in the deflection calibration module (Eq. 1).

$\{F\} = \dfrac{EI}{L^{3}}\,[K]\,\{d\}$ (1)

F: force matrix, E: modulus of elasticity, I: second moment of area, L: element length, K: stiffness matrix, d: deflection matrix
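
As an illustration of how Eq. 1 is assembled and solved, here is a minimal sketch using the standard Euler-Bernoulli beam element stiffness matrix. The three-element cantilever discretization, boundary conditions, and tip load are assumptions made for illustration (the paper does not state its exact mesh), so the output is not expected to reproduce Tab. 2.

```python
import numpy as np

def beam_stiffness(EI, l):
    """Euler-Bernoulli beam element stiffness, DOFs [v_i, th_i, v_j, th_j]."""
    return (EI / l**3) * np.array([
        [ 12.0,   6*l,   -12.0,  6*l   ],
        [ 6*l,    4*l*l, -6*l,   2*l*l ],
        [-12.0,  -6*l,    12.0, -6*l   ],
        [ 6*l,    2*l*l, -6*l,   4*l*l ]])

# Illustrative cantilever: E = 2 GPa, 50 x 50 mm square section (as in
# Section 5), split into 3 elements of 200 mm (our assumption).
E = 2000.0              # N/mm^2
I = 50 * 50**3 / 12.0   # mm^4
l = 200.0               # element length, mm
n_el = 3
n_dof = 2 * (n_el + 1)

# Assemble the global stiffness matrix [K].
K = np.zeros((n_dof, n_dof))
for e in range(n_el):
    i = 2 * e
    K[i:i+4, i:i+4] += beam_stiffness(E * I, l)

# 100 N downward at the free-end node.
F = np.zeros(n_dof)
F[-2] = -100.0

# Clamp node 1 (v = 0, theta = 0) and solve {F} = [K]{d} for the rest.
free = np.arange(2, n_dof)
d = np.zeros(n_dof)
d[free] = np.linalg.solve(K[np.ix_(free, free)], F[free])

deflections = d[0::2]   # vertical displacement delta_n at each node
print(deflections)
```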

From the above matrices, the deflection calibration module calculates the displacement at each node. The calculated displacements and related information are then transferred to the rendering module so that the rendered virtual object reflects them. The user can perform the robot teaching work through the displayed superimposed images. In this paper, the cockpit module and the gripper of the robot are modeled as a beam, and the computed displacements are used to compensate the positions of the assembly points. Fig. 10 and Eq. 2 show the displacement and deflection angle of each node. Each angle is used to compensate the assembly points of the cockpit module. In Fig. 10, if an assembly point is located in the area of element 2, $\theta_2$ can be used as the angle for compensation.

[FIGURE 10 OMITTED]

$\theta_n = \dfrac{\delta_{n+1} - \delta_n}{l}, \quad n = 1, 2, 3, \ldots, \quad \delta_1 = 0$ (2)

In Fig. 10, if the coordinate of assembly point 1 is $(z_1, y_1)$, the compensated coordinate of assembly point 1 caused by the change of deflection angle becomes $(z_1 - \Delta z_1,\; y_1 - \Delta y_1)$. Because $z_1$ is the distance from the origin along the z axis and equals the radius $r$, $\Delta z_1$ can be replaced by $(z_1 - z_1 \cos\theta_2)$ and $\Delta y_1$ by $z_1 \sin\theta_2$. Therefore, the compensated coordinate of assembly point 1 is given by Eq. 3, and the other assembly points can be compensated similarly.

$(z_1',\; y_1') = (z_1 \cos\theta_2,\; y_1 - z_1 \sin\theta_2)$ (3)
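
A small sketch of the compensation step in Eqs. 2 and 3: the deflection angle of element 2 is formed from consecutive nodal deflections (here taken from the -100 N column of Tab. 2), and an assembly point is rotated accordingly. The point coordinates themselves are hypothetical.

```python
import math

def compensate(z, y, theta):
    """Apply Eq. 3: z' = z*cos(theta), y' = y - z*sin(theta)."""
    return z * math.cos(theta), y - z * math.sin(theta)

# Nodal deflections delta_1..delta_4 from the AR system at -100 N (Tab. 2).
delta = [0.0, -0.2764, -0.9676, -1.8662]   # mm
l = 200.0                                   # element length, mm (assumed)

# Eq. 2 with n = 2: theta_2 = (delta_3 - delta_2) / l.
theta2 = (delta[2] - delta[1]) / l          # rad (small-angle)

# Hypothetical assembly point (z_1, y_1) lying in element 2.
print(compensate(350.0, 120.0, theta2))
```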

4. Implementation of Calibration Modules and Integration

4.1 Integrated Architecture of AR System

[FIGURE 11 OMITTED]

Fig. 11 shows the integrated architecture connecting the existing AR system with the lens calibration module and the deflection calibration module.

The precision and accuracy of target tracking depend on the image resolution of the camera device. Low-resolution USB web cameras are generally used for experiments in many laboratories, but an ordinary manufacturing system requires tasks with high precision and accuracy. For this reason, the video interface module was extended to convert DV images to BGR24 images. The video interface module checks the list of connected video devices and obtains real-time video from the selected device: it enumerates the connected devices, numbers them in connection order, and receives images from the device whose list number matches the user's selection. Because each camera responded sensitively to changes in environmental conditions, the experiments were carried out more than twenty times, and the camera was calibrated with the average of the measured values after discarding outliers. A digital video camcorder connected to the computer through the IEEE 1394 interface was used; however, OpenCV does not support the digital video image type, so in this research a high-resolution buffer is obtained by using Microsoft DirectShow technology.
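
As a simplified sketch of the device enumeration and selection just described, the following uses OpenCV's capture API in place of the paper's DirectShow path; the maximum probe index is an assumption.

```python
import cv2

def list_devices(max_index=8):
    """Probe device indices in connection order, as the video interface
    module does when it enumerates connected cameras."""
    available = []
    for idx in range(max_index):
        cap = cv2.VideoCapture(idx)
        if cap.isOpened():
            available.append(idx)
        cap.release()
    return available

devices = list_devices()
print("connected devices:", devices)
selected = devices[0]            # the user-preferred device number
cap = cv2.VideoCapture(selected)
ok, frame = cap.read()           # frames arrive as 8-bit BGR (BGR24)
cap.release()
```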

Non-calibrated images transmitted from the video interface module are used when tracking the location of the marker within the tracking module. The tracking module detects the marker within the binary image, generates 3D coordinates, and transmits them to the rendering module, which then matches the real-time images with the 3D virtual objects. After a 3D virtual environment is selected, the marker location information obtained from the tracking module is synchronized with the coordinate information of the 3D virtual environment; the location of the 3D virtual object is then adjusted, and the real-time images and 3D virtual objects are matched. In the rendering module, the superimposition of scenes is performed using the obtained image data, the tracked positions, and the virtual objects. The hardware specification is a prime concern for drawing the scenes into the viewing rectangle of the user interface: the graphics device, CPU, and memory must all be of high specification to prevent jitter and lag in the superimposed scenes. Using VRML (Virtual Reality Modeling Language) object files also decreases the software overhead and mitigates the jitter and lag. In addition, the object control panel can load text files containing the coordinate information of the virtual objects based on rearranged coordinate matrices, and a data structure based on a two-dimensional array was designed for rendering each object on its marker. For each entry, the data structure holds the marker number, a visible/invisible flag, the object number, and the translation, rotation, and scale of each object.
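
A minimal sketch of this per-marker rendering table; the record and field names are hypothetical:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ObjectEntry:
    """One record of the two-dimensional rendering table described above."""
    marker_no: int
    visible: bool
    object_no: int
    translation: Tuple[float, float, float]  # (x, y, z)
    rotation: Tuple[float, float, float]     # (rx, ry, rz)
    scale: Tuple[float, float, float]        # (sx, sy, sz)

# The table itself: one list of object entries per marker (a 2D array).
render_table: List[List[ObjectEntry]] = [
    [ObjectEntry(0, True, 1, (0, 0, 0), (0, 0, 0), (1, 1, 1))],
]
```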

The measurement module checks the interference between 3D virtual objects and measures distances. For interference verification, it distinguishes the coordinate systems recognized within the system and measures the relative distances between points of the objects. Measured distances between objects are delivered to the user by message, and a message is also sent to the user when interference occurs.

The lens calibration module compensates the distorted images, and the calibrated images are transferred to the tracking module. The deflection calibration module calculates the deflection of the virtual object using the virtual object's information, and the calibrated deflection data is also transferred to the tracking module. The work procedures of these two modules are described in Sections 3.1 and 3.2.
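
The handoff from the lens calibration module to the tracking module can be sketched as follows, assuming the parameters were saved with OpenCV's FileStorage (the file name and key names are our assumptions):

```python
import cv2

# Load the parameters written by the lens calibration module.
fs = cv2.FileStorage("calibration.xml", cv2.FILE_STORAGE_READ)
camera_matrix = fs.getNode("camera_matrix").mat()
dist_coeffs = fs.getNode("dist_coeffs").mat()
fs.release()

# Undistort one frame before it reaches the tracking module.
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
cap.release()
if ok:
    undistorted = cv2.undistort(frame, camera_matrix, dist_coeffs)
    cv2.imwrite("undistorted.png", undistorted)
```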

4.2 Implementation of the Integrated AR System

4.2.1 User Interface of Developed AR System

Fig. 12 shows the UI of the augmented reality system into which the calibration modules are integrated. It is divided into seven parts: mainframe, matching image screen, camera control panel, marker setting panel, 3D virtual object control panel, tracking start/stop, and program status message panel. In addition, it includes dialog boxes for camera device selection, camera device properties, 3D virtual object loading, calibration information loading, and element property input.

[FIGURE 12 OMITTED]

1) The mainframe is the basic framework of the program. Its initial window size is 1024x675, and it can be configured to 1600x1200, 1024x768, 800x600, 640x480, or full screen.

2) The matching image screen is the area that displays the image acquired from the selected camera and draws the loaded 3D virtual objects matched onto the 2D image.

3) The camera control panel comprises camera selection and start/stop, activation and deactivation of the background screen, attribute information of the camera in use, and loading of the camera calibration data file. When the camera button event occurs, the camera selection dialog pops up, and when the camera calibration button event occurs, the calibration file loading dialog pops up.

4) The marker setting panel comprises input of the total number of individual markers in use, input of each marker's individual number, input of the marker size (unit: mm), and adjustment of the tracking level through the binary image threshold value.

5) The 3D virtual object control panel is implemented to create clipping planes and to change object locations for 3D object loading and indirect measurement. When the object load button event occurs, the object load dialog for an individual marker pops up, and when the board-marker object load button event occurs, the object load dialog for a board marker pops up.

6) Start/stop tracking is a button with which the user can start and stop location tracking.

7) The program status message panel records the entire history of the user's operations in the program, and the user can check the current program status through this panel.

8) The element property input dialog allows the user to input the material properties and shape of an object in order to generate its mesh.

9) The lens calibration dialog provides pattern selection for lens calibration, performs the calibration, and saves the results.

[FIGURE 13 OMITTED]

4.2.2 Camera Lens Calibration Module

Fig. 13 shows the implemented camera lens calibration module. The user connects the camera to the computer, selects the camera to use ([??]), and sets the chessboard information such as the number of squares, the size of each square, the pattern size, and the number of measurements ([??]). In this paper, the size of each square is set to 1 cm x 1 cm, the number of squares along the width is 18, the number along the height is 12, and the number of measurements is 20. After the required information is entered, the calibration is started ([??]), and the calibrated results are saved as an .xml or .dat file ([??]).
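
Saving the results as an .xml file, as in the final step, might look like the following with OpenCV's FileStorage; the key names are assumptions, and the .dat variant is not shown.

```python
import cv2
import numpy as np

camera_matrix = np.eye(3)   # placeholder intrinsics from calibrateCamera
dist_coeffs = np.zeros(5)   # placeholder distortion coefficients

# Write the calibration results to an OpenCV-readable .xml file.
fs = cv2.FileStorage("calibration.xml", cv2.FILE_STORAGE_WRITE)
fs.write("camera_matrix", camera_matrix)
fs.write("dist_coeffs", dist_coeffs)
fs.release()
```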

4.2.3 Deflection Calibration Module

Fig. 14 shows the implemented deflection calibration module. The deflection calibration dialog box is composed of the material property input part ([??]), the cross-section selection part for the virtual object ([??]), the dimension input part ([??]), and the force input part ([??]). The deflection results are displayed on the render scene display panel ([??]).

[FIGURE 14 OMITTED]

5. Verification of Developed System

To verify the deflection shown in the real environment through augmented reality, the results were compared with those of a commercial FEM tool (Abaqus). The material properties are E = 2 GPa and v = 0.3, and a square beam of L x b x h = 600 x 50 x 50 mm was selected. The cantilever is fixed at its left end, and concentrated downward loads of 100 N to 1000 N were applied at the free end (Fig. 15). To verify the accuracy of the developed system, a beam deflection analysis was performed in Abaqus under the same conditions. The comparison results are shown in Tab. 2; the error between the result of the developed system and that of the commercial tool is about 1 mm.

[FIGURE 15 OMITTED]

[FIGURE 16 OMITTED]

The operator generates the robot operation program with two cameras: one camera is set at the side of the test bed and the other behind it. Auxiliary tools were modelled to avoid collisions with the peripheral units and to support accurate assembly work. A centre datum line was generated to match the assembly part of the automobile body with the cockpit module, and the approach path was set and modelled to avoid collision between the robot and the automobile body. Using these auxiliary tools, the operator can generate the robot operation program precisely (Fig. 16).

The results of the developed system are listed in Tab. 3; the tolerance of the robot program was improved from 4 mm to 1 mm.

6. Conclusion

This paper introduced a software-based calibration method for applying augmented reality effectively to the automobile assembly system. The camera lens calibration module and the direct compensation module for the virtual object displacement were designed and implemented, and the developed automobile-assembly-oriented AR system was verified by a practical test. As a result, an operation program for an assembly system was generated using the developed AR system, and the deflection of heavy tools installed on the robot arm can be visualized. At this early stage of the research, the tolerance of the robot program was improved from 4 mm to 1 mm.

Based on this research, we plan to address other direct calibration problems, such as the dynamic deflection due to the robot's movement and the deflection of the robot joints. If the deflection of the robot joints caused by heavy tools and the dynamic deflection due to robot movement and stress distribution are included, the accuracy of the developed AR system for the automobile assembly system can be improved further. Moreover, if the modules for dynamic deflection and joint deflection are implemented, the measurement systems used for off-line robot teaching might become unnecessary.

DOI: 10.2507/daaam.scibook.2012.44

7. Acknowledgements

This research was supported by MKE (Ministry of Knowledge Economy), Korea, under the Industrial Source Technology Development Programs supervised by KEIT (Korea Evaluation Institute of Industrial Technology).

8. References

Bimber, O. & Raskar, R. (2008). Spatial Augmented Reality--Merging Real and Virtual Worlds, A K Peters, pp.1-12

Daniel, W. & Dieter, S. (2007). ARToolKitPlus for Pose Tracking on Mobile Devices, Computer Vision Winter Workshop, St. Lambrecht, Austria, pp. 1-8

Gerald, S. & Axel, P. (2006). Robust Pose Estimation from a Planar Target, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 28, No. 12, pp. 2024-2030

Gruen, A. & Huang, T. S. (2001). Calibration and Orientation of Cameras in Computer Vision, Springer-Verlag, Berlin Heidelberg

Gunter, W. & Emmerich, S. (2005). Digital Planning Validation in Automotive Industry, Computers in Industry, Vol. 56, pp. 393-405

Jun, R. & Yuji, A. (2000). CyberCode: Designing Augmented Reality Environments with Visual Tags, Proceedings of Designing Augmented Reality Environments, pp.1-10

Kim, S. C. & Choi, K. H. (2000). Development of Flexible Manufacturing System Using Virtual Manufacturing Paradigm, International Journal of Precision Engineering and Manufacturing, Vol. 1, No. 1, pp. 84-90

Kimn, S. J. & Dey, A. K. (2010). AR interfacing with prototype 3D applications based on user-centered interactivity, Computer-Aided Design, Vol.42, No.5, pp. 373-386

Lee, K. H.; Lee, J. M.; Kim, D. G.; Han, Y. S. & Lee, J. J. (2008). Development Technology of Vision Based Augmented Reality for the Maintenance of Products, Transactions of the Society of CAD/CAM Engineers in Korea, Vol. 13, No. 4, pp. 265-272

Ong, S. K. & Nee, A. Y. C. (2004). Virtual and Augmented Reality Applications in Manufacturing, Springer-Verlag London Limited

Rhee, G. W.; Seo, D. W. & Lee, J. Y. (2007). Ubiquitous Car Maintenance Services Using Augmented Reality and Context Awareness, Transactions of the Society of CAD/CAM Engineers in Korea, Vol.12, No.3, pp. 171-181

Wolfgang, K. (2006). Digital Factory--Integration of Simulation Enhance the Product and Production Process toward Operative Control and Optimization, International Journal of Simulation, Vol. 7, No. 7, pp. 27-39

Authors' data: Prof. Dr.-Ing. Park, H[ong] S[eok]; M.S. Park, J[in] W[oo], * University of Ulsan, Daehak-ro 93, Nam-gu, Ulsan, South Korea, [email protected], [email protected]

This Publication has to be referred as: Park, H[ong] S[eok] & Park, J[in] W[oo] (2012). Development of a Software Based Calibration System for Automobile Assembly System Oriented AR, Chapter 44 in DAAAM International Scientific Book 2012, pp. 527-544, B. Katalinic (Ed.), Published by DAAAM International, ISBN 978-3-901509-86-5, ISSN 1726-9687, Vienna, Austria
Tab. 1. The teaching point errors of the existing AR system

Assembly   Axis   Existing robot         Robot teaching point   Error
point             teaching point (mm)    by using AR (mm)       (mm)

[??]       X      +433.0                 +436.0                 -3
           Y      -424.0                 -421.0                 -3
           Z      -76.3                  -64.3                  -12

[??]       X      +433.0                 +436.0                 -3
           Y      +424.0                 +427.0                 -3
           Z      -76.3                  -64.3                  -12

Tab. 2. Deflection comparison between the integrated AR system and the commercial FEM tool

Force (N)   Node   Direct compensation      Commercial FEM (Abaqus)
                   applied AR system        δ_y (mm)
                   δ_y (mm)

-100        1      0                        -100.E-3
            2      -0.2764                  -0.1705
            3      -0.9676                  -0.8753
            4      -1.8662                  -1.7954

-200        1      0                        -200.E-3
            2      -0.5529                  -0.4439
            3      -1.9353                  -1.7348
            4      -3.7324                  -3.1224

-300        1      0                        -300.E-3
            2      -0.8294                  -0.7146
            3      -2.9030                  -2.8013
            4      -5.5987                  -5.0334

-400        1      0                        -400.E-3
            2      -1.1059                  -1.0031
            3      -3.8707                  -3.6454
            4      -7.4649                  -6.9867

-500        1      0                        -500.E-3
            2      -1.3824                  -1.2913
            3      -4.8384                  -4.5564
            4      -9.3312                  -8.8441

-1000       1      0                        -1.E-3
            2      -2.7648                  -2.6554
            3      -9.6768                  -9.4785
            4      -18.662                  -17.127

Tab. 3. Improvement of the assembly point by using the integrated AR system

Assembly   Axis   Existing robot        Robot teaching point by     Improved
point             teaching point (mm)   using AR including          error (mm)
                                        calibration modules (mm)

[??]       X      +433.0                +435.0                      -2
           Y      -424.0                -422.0                      -2
           Z      -76.3                 -67.3                       -9

[??]       X      +433.0                +435.0                      -2
           Y      +424.0                +425.0                      -1
           Z      -76.3                 -68.3                       -8