Virtual and augmented reality used to simulate a mechanical device.
Baritz, Mihaela; Cotoros, Diana; Moraru, Ovidiu et al.
Abstract: This paper presents the activities of our team in studying and establishing a configuration for technical applications of mechanism motion modeling using virtual and augmented reality. A constraint-based methodology for reconstructing 3D motion from image observations is also presented. For this purpose we used augmented reality to model and analyze the motion of a mechanism in educational and technical applications.
Key words: virtual and augmented reality, mechanism, simulation
1. INTRODUCTION
Virtual Reality (VR) refers to a technology that is capable of
shifting a subject into a different environment without physically
moving him/her.
To this end, the inputs to the subject's sensory organs are manipulated in such a way that the perceived environment is associated with the desired Virtual Environment (VE) and not with the physical one. A computer model based on the physical description of the VE controls the manipulation process. Consequently, the technology is able to create almost arbitrary perceived environments. (Kaufmann et al., 2000)
Practical applications of virtual reality typically lie in the simulation of engineering processes, product design and skill rehearsal. Users enter a virtual environment in order to learn something new about the real situation to which the simulation corresponds, or to improve or learn a skill set.
Four technologies are crucial for Virtual Reality (VR):
* the visual (and aural and haptic) displays that immerse the user in the virtual world and block out contradictory sensory impressions from the real world;
* the graphics rendering system that generates, at 20 to 30 frames per second, the ever-changing images;
* the tracking system that continually reports the position and orientation of the user's head and limbs;
* the database construction and maintenance system for building and maintaining detailed and realistic models of the virtual world.
Four other auxiliary technologies are also important, but not nearly as crucial:
* synthesized sound, displayed to the ears, including directional
sound and simulated sound fields;
* display of synthesized forces and other haptic sensations to the
kinesthetic senses;
* devices, such as tracked gloves with pushbuttons, by which the
user specifies interactions with virtual objects;
* interaction techniques that substitute for the real interactions
possible with the physical world.
Motion synthesis of mechanisms relies on the designer's ability to specify desired locations of an object and to visualize the relative motion of the resultant mechanism.
Traditionally, mechanism design has concentrated on synthesis of
planar motion mechanisms.
Planar mechanism synthesis involves two-dimensional (2D) display and interaction, which are well suited to the traditional human-computer interface (HCI) of a computer monitor, keyboard and mouse. However, designing spatial mechanisms requires the designer to visualize and interact with the mechanism in three dimensions, which is difficult using the traditional HCI. (Sander van Dijk, 2004)
VR technology provides a three-dimensional environment in which to
interact with digital models. Thus, this research focuses on the use of
VR for the design of spatial mechanisms. Models using a traditional HCI are not drawn at real size and cannot be manipulated in a natural way. VR allows the user to view models at real size and to interact with them using a position sensor that tracks head motion and a wand or instrumented glove, which is also equipped with a position sensor.
The head position and orientation are used to compute the viewing
perspective for the computer display. This is in contrast to the
traditional HCI where the user manipulates a desktop mouse and types on
a keyboard to interact with digital models.
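As an illustration of how the tracked head pose drives the rendering, the following minimal C++ sketch builds a 4x4 view (world-to-eye) matrix from a head position and orientation. It is not tied to any particular tracker API; the yaw-pitch-roll convention and the matrix layout are assumptions made only for this example.

#include <array>
#include <cmath>
#include <cstdio>

using Mat4 = std::array<std::array<double, 4>, 4>;

// Build a world-to-eye (view) matrix from a tracked head pose.
// Assumption: the orientation is given as yaw (about Y), pitch (about X)
// and roll (about Z), in radians.
Mat4 viewFromHeadPose(double px, double py, double pz,
                      double yaw, double pitch, double roll)
{
    const double cy = std::cos(yaw),   sy = std::sin(yaw);
    const double cp = std::cos(pitch), sp = std::sin(pitch);
    const double cr = std::cos(roll),  sr = std::sin(roll);

    // Head rotation in world coordinates, R = Ry(yaw) * Rx(pitch) * Rz(roll).
    const double R[3][3] = {
        { cy*cr + sy*sp*sr,  -cy*sr + sy*sp*cr,  sy*cp },
        { cp*sr,              cp*cr,             -sp   },
        { -sy*cr + cy*sp*sr,  sy*sr + cy*sp*cr,  cy*cp }
    };

    // The view matrix is the inverse of the head pose:
    // V = [ R^T  -R^T * p ; 0 0 0 1 ].
    Mat4 V{};
    for (int i = 0; i < 3; ++i) {
        for (int j = 0; j < 3; ++j) V[i][j] = R[j][i];
        V[i][3] = -(R[0][i]*px + R[1][i]*py + R[2][i]*pz);
    }
    V[3][3] = 1.0;
    return V;
}

int main() {
    // Example: head 1.7 m above the floor, turned 30 degrees to the left.
    const double kPi = 3.14159265358979323846;
    const Mat4 V = viewFromHeadPose(0.0, 1.7, 0.0, kPi / 6.0, 0.0, 0.0);
    for (const auto& row : V)
        std::printf("%8.3f %8.3f %8.3f %8.3f\n", row[0], row[1], row[2], row[3]);
    return 0;
}

The resulting matrix is recomputed every frame from the tracker data and used as the viewing transform of the computer display.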
2. EXPERIMENTAL ASPECTS
Motion modeling is an attractive method for recreating real systems, for establishing the position of objects, and for building virtual spaces used to study and simulate different technical or medical applications.
In this application we used a set of optical sensors, a video camera and a marker. To establish a practical configuration for the modeling activity we also used the TriVisio system, with stereoscopic capability, for Augmented Reality (www.TriVisio.com). With this system, and using an acquisition board (Pinnacle) for stereoscopic images, we developed a technical application for simulating the motion of articulated bars in a mechanism.
To use this device for the AR function, the signal from the video camera must be sent to a computer over a USB 2.0 cable. The stereoscopic video signal to be superimposed, received from the computer, can be redirected to the main unit through two VGA cables. The resolution of the computer display must be 800x600 at 60 Hz, and the two VGA cables can be connected in several ways: the most practical solution is a computer with a video board that has two VGA outputs; the second is a computer that uses two video boards in an SLI or CrossFire configuration.
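The computer-side reception of this USB video signal can be sketched as follows. The fragment uses the video capture functions of the classic ARToolKit C library (as in its sample programs); the empty configuration string is an assumption that selects the default camera on most drivers.

#include <cstdio>
#include <AR/ar.h>
#include <AR/video.h>   // classic ARToolKit video capture API

int main() {
    // The configuration string is driver dependent; an empty string usually
    // selects the default USB camera.
    char vconf[] = "";
    if (arVideoOpen(vconf) < 0) {
        std::fprintf(stderr, "Could not open the video source.\n");
        return 1;
    }

    int xsize = 0, ysize = 0;
    arVideoInqSize(&xsize, &ysize);
    std::printf("Capturing at %dx%d\n", xsize, ysize);

    arVideoCapStart();
    for (int frame = 0; frame < 100; ++frame) {
        ARUint8* image = arVideoGetImage();   // NULL until a new frame is ready
        if (image == nullptr) continue;
        // ... hand the frame to the AR pipeline (marker detection, overlay) ...
        arVideoCapNext();                     // release the frame buffer
    }
    arVideoCapStop();
    arVideoClose();
    return 0;
}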
[FIGURE 1 OMITTED]
[FIGURE 2 OMITTED]
[FIGURE 3 OMITTED]
To adjust the convergence of the small video cameras of the head-mounted device (HMD), it is necessary to rotate them correctly around their own axes or to turn a small wheel on the front of the device. The optimal position for using the two displays can be obtained by moving the optical systems of the video cameras to the right or to the left.
For modeling and simulating mechanical motion we use the ARToolKit software library, which allows researchers to develop Augmented Reality applications more easily.
The most important and difficult part of developing Augmented Reality applications is the very precise calculation of the viewpoint, so that the virtual images are aligned with the real world and with the objects in the environment.
ARToolKit uses computer vision techniques to compute the viewpoint of the human visual system in real time, aligning the virtual objects with the real ones; it computes the position and orientation of the video camera relative to the marker, allowing the user to superimpose virtual objects on these markers; and it offers rapid development of AR applications on different platforms such as SGI IRIX, PC Linux and PC Windows.
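A condensed sketch of this pipeline, following the classic ARToolKit 2.x C API used in the library's sample programs, is given below. The file names, detection threshold and marker width are placeholder values, and exact function signatures may differ slightly between ARToolKit versions.

#include <AR/ar.h>
#include <AR/param.h>

// Placeholder values; the actual files and marker size depend on the setup.
static char kCameraParamFile[] = "Data/camera_para.dat";
static char kPatternFile[]     = "Data/patt.hiro";
static const double kMarkerWidthMm = 80.0;
static const int    kThreshold     = 100;

// One-time setup: load the camera intrinsic parameters, scale them to the
// capture size and load the marker pattern; returns the pattern id.
int setupAR(int xsize, int ysize) {
    ARParam wparam, cparam;
    arParamLoad(kCameraParamFile, 1, &wparam);
    arParamChangeSize(&wparam, xsize, ysize, &cparam);
    arInitCparam(&cparam);
    return arLoadPatt(kPatternFile);
}

// Estimate the camera pose relative to the marker in one captured frame.
// On success, 'trans' holds the 3x4 camera-to-marker transformation that
// is used to superimpose the virtual object on the marker.
bool poseFromFrame(ARUint8* image, int pattId, double trans[3][4]) {
    ARMarkerInfo* markerInfo = nullptr;
    int markerNum = 0;

    // Detect all square markers visible in the frame.
    if (arDetectMarker(image, kThreshold, &markerInfo, &markerNum) < 0)
        return false;

    // Keep the highest-confidence detection of our pattern.
    int best = -1;
    for (int i = 0; i < markerNum; ++i)
        if (markerInfo[i].id == pattId &&
            (best < 0 || markerInfo[i].cf > markerInfo[best].cf))
            best = i;
    if (best < 0) return false;

    // Compute the transformation between the video camera and the marker.
    double center[2] = {0.0, 0.0};
    arGetTransMat(&markerInfo[best], center, kMarkerWidthMm, trans);
    return true;
}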
To obtain good stereoscopic vision it is necessary to configure the optical see-through system and then to measure the correct distance between the user's eyes and the distance to the mini TFT displays. All of this is done through an auto-calibration method in which the user holds the marker in his hand, looks at it with each eye and uses the image of a virtual target presented on the display of the see-through device. To maintain the stereoscopic capabilities, a new re-calibration must be performed for each user of the Augmented Reality application. For a good calibration, the marker must be placed at a distance of 80 cm from the user's eyes and then held very stably in a vertical position. In the standard ARToolKit method the video camera must also be calibrated against a real target pattern of 6x4 points spaced 40 mm apart. After 5-10 images have been captured, the calibration is stopped and the system computes the optical distortion parameters of the video camera.
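As an illustration of this distortion-estimation step (not the actual ARToolKit calibration utility), the sketch below builds the 6x4 reference grid with 40 mm spacing and passes several views of detected pattern points to OpenCV's cv::calibrateCamera; the detection of the points in the images is assumed to be done elsewhere.

#include <opencv2/calib3d.hpp>
#include <opencv2/core.hpp>
#include <vector>

// Estimate the camera matrix and lens distortion from several views of the
// 6x4 calibration pattern with 40 mm spacing described above.
// detectedPoints[i] must hold the 24 pattern points found in image i.
void calibrateFromPattern(const std::vector<std::vector<cv::Point2f>>& detectedPoints,
                          cv::Size imageSize,          // e.g. {800, 600}
                          cv::Mat& cameraMatrix,
                          cv::Mat& distCoeffs)
{
    const int cols = 6, rows = 4;
    const float spacingMm = 40.0f;

    // Reference (object) coordinates of the 6x4 grid, in millimetres, z = 0.
    std::vector<cv::Point3f> grid;
    for (int r = 0; r < rows; ++r)
        for (int c = 0; c < cols; ++c)
            grid.emplace_back(c * spacingMm, r * spacingMm, 0.0f);

    // One copy of the reference grid per captured view (the text uses 5-10).
    std::vector<std::vector<cv::Point3f>> objectPoints(detectedPoints.size(), grid);

    std::vector<cv::Mat> rvecs, tvecs;   // per-view pattern poses (not used here)
    cv::calibrateCamera(objectPoints, detectedPoints, imageSize,
                        cameraMatrix, distCoeffs, rvecs, tvecs);
}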
3. AR APPLICATION INTO A TESTING STAND
The application superimposes a virtual mechanism onto a video stream recorded with the TriVisio system. This virtual mechanism is an articulated-bar system; it is commanded through a UDP port and, in order to obtain a continuous and variable signal, we use the command line from the LabVIEW software.
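A minimal sketch of the receiving end of such a UDP command channel is given below. The port number and the message format (a single joint angle sent as ASCII text from LabVIEW) are assumptions, since the protocol details are not specified here; the example uses POSIX sockets.

#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>
#include <cstdlib>

// Listens for joint-angle commands (sent, for example, from LabVIEW) and
// would forward them to the virtual articulated-bar mechanism.
int main() {
    const int kPort = 6000;                 // assumed port number
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0) { std::perror("socket"); return 1; }

    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(kPort);
    if (bind(sock, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) < 0) {
        std::perror("bind");
        return 1;
    }

    char buf[64];
    for (;;) {
        ssize_t n = recvfrom(sock, buf, sizeof(buf) - 1, 0, nullptr, nullptr);
        if (n <= 0) continue;
        buf[n] = '\0';
        double angleDeg = std::atof(buf);   // assumed ASCII angle in degrees
        std::printf("New joint angle: %.2f deg\n", angleDeg);
        // ... update the pose of the virtual bars here ...
    }
    close(sock);
    return 0;
}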
This application uses a mechanical stand with two motors. The signal is sent to the stand and then to ARToolKit, and the virtual elements receiving the same signal perform the same movements as the mobile elements of the stand. If the calibration and stereoscopic parameters are not established correctly, the articulated-bar system will start its virtual motion from a wrong position (not in the center of the marker) and will not work properly (it will not follow the movement of the user's head), Fig. 4.a).
For the HMD implementation it was necessary to integrate the virtual elements into the system and to constrain them to keep a fixed position while the video camera moves. For this purpose a stereo video signal was also implemented for the TriVisio system; this signal was developed with the help of C++ and the NVIDIA drivers. The virtual element must be sent to each display of the system with a different viewing angle. There are also problems with the lighting used during the experiments, because the calibration of ARToolKit was performed under a certain illumination; if this changes, the test must be restarted, repeating the calibration under constant illumination.
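One simple way to obtain the two viewing angles is to shift the camera-to-marker transform sideways by half of the interpupillary distance for each eye, as sketched below for the 3x4 transform returned by arGetTransMat; the distance value and the sign convention are assumptions of this example, not details given in the text.

#include <cstring>

// Derive left/right eye transforms from the camera-to-marker transform
// (3x4, as computed by arGetTransMat), so that each display of the HMD
// shows the virtual element under a slightly different viewing angle.
// Assumption: the camera x-axis points to the right and units are millimetres.
void makeStereoTransforms(const double camTrans[3][4],
                          double ipdMm,            // interpupillary distance
                          double leftTrans[3][4],
                          double rightTrans[3][4])
{
    std::memcpy(leftTrans,  camTrans, sizeof(double) * 12);
    std::memcpy(rightTrans, camTrans, sizeof(double) * 12);

    // Shifting an eye by +/- ipd/2 along the camera x-axis is equivalent to
    // shifting the marker by the opposite amount in that eye's transform.
    leftTrans[0][3]  += ipdMm * 0.5;
    rightTrans[0][3] -= ipdMm * 0.5;
}

Each of the two transforms is then used to render the virtual element for its own display of the HMD.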
The result of this experiment is presented in Fig. 4.b), where we can observe that, after the implementation, the virtual element (shown in red) remains in the center of the marker, the marker remains in the center of the camera image and the marker is recognized. The rendering operation was performed with the help of a video capture board.
[FIGURE 4 OMITTED]
4. CONCLUSIONS
With this system we developed a structure for studying and simulating the motion of the articulated bars under different conditions (speed, position, time period). For this it was necessary to implement a new AR system with the TriVisio component and special computation routines in the ARToolKit software. This system will be useful for understanding and modifying existing mechanical structures or for designing future ones.
5. REFERENCES
Sander van Dijk (2004). Movement Classification, Artificial Intelligence, 25 October 2004;
Kaufmann, H.; Schmalstieg, D. & Wagner, M. (2000). Construct3D: A Virtual Reality Application for Mathematics and Geometry Education, Education and Information Technologies, Vol. 5, No. 4, (December 2000), pp. 263-276;
Monzani, J. S. (2002). An Architecture for Behavioral Animation of Virtual Humans, École Polytechnique Fédérale de Lausanne, Switzerland;
www.TriVisio.com, Accessed: 2007-07-09