
SIVP: A Toolkit for Integrating Visualization into Virtual Environment

Xiaohui LIANG, Yinghui CHE, Xiaoxiao WU
Key Laboratory of Virtual Reality Technology of Ministry of Education, School of Computer Sciences, Beihang University, Beijing, P.R. China
{lxh,cyh,wuxx}@vrlab.buaa.edu.cn

Abstract

Non-visible data modeling and representation is one of the key topics in virtual reality, since it greatly affects the user's perception. In this paper we present how to visualize non-visible data in a virtual environment and introduce a toolkit called the Synthetic Information Visualization Platform (SIVP). The toolkit comprises two main processes, and the main components of SIVP are described. Finally, we demonstrate some applications built on SIVP.

1. Introduction

Virtual reality is a new type of human-computer interface. Typically, constructing a virtual reality system involves two main processes: modeling and rendering. The modeling process is in charge of describing three kinds of important properties of an object: geometric properties, physical properties, and behavioral properties. The rendering process then uses this modeling information to present the virtual environment. To satisfy the 3I properties (immersion, interaction, and imagination), visual, haptic, and aural channels can be adopted. Users can interact with the virtual environment and experience a feeling of realism. Much research has been done to make the virtual environment more realistic and interactive [1, 2].

Visualization is a method of computing. It transforms the symbolic into the geometric, enabling researchers to observe their simulations and computations. Visualization offers a method for seeing the unseen. It enriches the process of scientific discovery and fosters profound and unexpected insights. In many fields it is already revolutionizing the way scientists do science [3]. Based on the kind of abstract data involved, visualization can be divided into three categories: visualization in scientific computing (ViSC), information visualization, and knowledge visualization [4]. In general, ViSC visualizes data that have spatial relationships, while information visualization visualizes data that have no spatial relationships. Knowledge visualization is a newer category that covers methods for visualizing the processes and relationships of complex knowledge. Good introductions to visualization can be found in [5-8].

In practical applications, users want the virtual environment not only to provide a way to interact with visible data, but also to offer a method to visualize non-visible data. In this way, users can observe the data and its effects intuitively, which greatly improves the imagination property of the VR system. For example, if measured atmosphere data can be visualized, we can observe the temperature at different heights and the wind field in the virtual environment. Furthermore, we can also see how the atmosphere affects objects in the virtual environment.

Although there are already many toolkits for modeling and rendering in the VR field and many specialized visualization tools, few tools can both visualize non-visible data and render the virtual environment. We focus on how to integrate non-visible data into the virtual environment and present a toolkit called the Synthetic Information Visualization Platform (SIVP). The objective of SIVP is to let the user intuitively observe non-visible data in addition to geometric data. SIVP allows the user to input traditional visible data such as terrain data, geometric models of objects, and texture data. At the same time, non-visible data can also be represented, such as atmosphere data: temperature, pressure, and wind fields. Users can also interact with the virtual environment freely.

The rest of this paper is organized as follows. Section 2 presents the workflow and main components of SIVP. Section 3 gives some results based on SIVP, and Section 4 concludes the paper.


2. SIVP and its main components

The Synthetic Information Visualization Platform (SIVP) is developed to integrate data visualization into a synthetic natural environment. Through SIVP, users can observe the non-visible data they are interested in within the virtual environment, improving the imagination property of the VR system.

SIVP adopts object-oriented design and programming methods to support different data types. To achieve high performance, we provide some predefined primitives in SIVP.

2.1. Basic definitions

Before giving a detailed description of SIVP, we first introduce some definitions used in this paper.

Definition 1: Object. An object is a thing in the real world to be modeled in the virtual environment.

Definition 2: Visual primitive. A visual primitive is the basic element handled by the rendering process. It includes properties such as shape, color, texture, and transparency. In our toolkit, the shape of a visual primitive can be a triangle mesh, an image, regular volume data, or a particle system.

Definition 3: Data type. A data type is an identifier that indicates the type of a visual primitive. In our implementation, we define four types: POLYGON, IMAGE, VOLUME, and PARTICLE.

Definition 4: Influence type. An influence type is an identifier that selects the symbol used to represent the influence of an object. In our method, we define four types: BALL, CUBE, CYLINDER, and CONE.

Definition 5: Behavior data. Behavior data is the user-assigned behavior of an object. Although the objective of our work is to build a virtual environment the user can interact with, users may still want to define behavior data, such as a route, so that some objects behave automatically in certain cases.

Definition 6: Node. A node is the basic element to be rendered. It is represented as a seven-element tuple:

Node = <data type, visual primitive, position, orientation, influence type, behavior data, rendering method>.

Because different data types require different rendering methods, each node must carry the proper rendering method for rendering itself.

Definition 7: Scene. A scene is a graph of nodes. It includes both the nodes and the relationships between nodes.
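To make the definitions above concrete, the following is a minimal sketch of how a node and a scene might be represented. The class and field names are hypothetical; they only mirror the seven-element tuple of Definition 6 and the node graph of Definition 7, not SIVP's actual implementation.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Any, Callable, Dict, List, Tuple

class DataType(Enum):          # Definition 3: type of the visual primitive
    POLYGON = 0
    IMAGE = 1
    VOLUME = 2
    PARTICLE = 3

class InfluenceType(Enum):     # Definition 4: symbol used to show an object's influence
    BALL = 0
    CUBE = 1
    CYLINDER = 2
    CONE = 3

@dataclass
class Node:
    """Definition 6: the seven-element tuple that the rendering process consumes."""
    data_type: DataType
    visual_primitive: Any                    # triangle mesh, image, volume, or particle set
    position: Tuple[float, float, float]
    orientation: Tuple[float, float, float]
    influence_type: InfluenceType
    behavior_data: Any                       # e.g. a route for automatic behavior (Definition 5)
    rendering_method: Callable[["Node"], None]

@dataclass
class Scene:
    """Definition 7: a graph of nodes plus the relationships between them."""
    nodes: List[Node] = field(default_factory=list)
    relations: Dict[int, List[int]] = field(default_factory=dict)  # node index -> related node indices
```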

2.2. Workflow of SIVP

Figure 1 demonstrates the workflow of SIVP.

[Figure 1 here. Boxes in the figure: Terrain Data, Actor Data, User Data, and Behavior Data feed the Modeling Environment (Visualization Mapping, Transfer Method Definition, Animation Control, User Input), which produces Triangle Mesh, Image, and Symbol primitives grouped into Nodes; the Rendering Environment (Scene Organization and Management, Interaction Management, Rendering Process, Rendering Method Library) renders the nodes for the User.]

Figure 1. Workflow of constructing a VR system.

SIVP includes two main environments. One is the modeling environment, which transfers objects into visual primitives. The other is the rendering environment, which organizes and manages the scene and renders the different kinds of nodes. Of the two, the modeling environment is the more important because it implements the transfer functions.

At present, the input data include four types:

1) Terrain data. We model the terrain in SIVP using measured terrain data. The terrain data include a DEM (Digital Elevation Model) giving the height of sampled points and a DLG (Digital Line Graph) describing features on the terrain. The terrain data can be transferred to triangle meshes and textures to model the terrain surface.

2) Actor data. An actor is an object controlled by the user in the virtual environment. This kind of data can be transferred to a triangle mesh using commercial software such as MultiGen Creator or 3ds Max. To make an object easy to control, we define a multi-articulation skeleton, so the user can control different parts of the object at run time.

3) User data. User data refers to the data the user is interested in. It may include scalar fields, vector fields, and even some notions in the user's mind. For example, the user may be interested in atmosphere data such as clouds, air pressure, air temperature, and wind velocity. Such data can easily be described as regular scalar or vector fields and transferred to visual primitives. Furthermore, the user may also add the influence of an object to the user data.

4) Behavior data. The user can define behavior data for an object, such as a route.



2.3. The modeling environment

The task of the modeling environment is to transfer the input data to visual primitives. The user can interact with the transfer process to define the data of interest.

In the modeling environment, the difficulty lies in how to define the transfer functions. To solve this problem, we provide some built-in transfer methods so that the user can implement the transfer process easily. At present, our transfer functions include four types:

1) Normal map for the terrain. Terrain is an important part of a virtual environment. DEM, DLG, and aerial images are the common data formats. Usually, DEM data are transferred to triangle meshes in the virtual environment. However, here we use the normal mapping technique in order to support large-scale terrain data. A normal map function is provided; it generates a normal map from the DEM data and produces a 3D-like rendering result. The normal at every point is computed using equation (1):

\[
\mathrm{normal} = \frac{\bigl(H_g - H_a,\; H_g - H_r,\; 1\bigr)}{\sqrt{(H_g - H_a)^2 + (H_g - H_r)^2 + 1}} \tag{1}
\]

where H_g is the height at the point (x, y), H_a is the height at the point (x, y+1), and H_r is the height at the point (x+1, y).

The normal map generated through equation (1) and the corresponding rendering result are shown in Figure 2.

Figure 2. Normal map (left) and rendering result (right).
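As a rough illustration of this transfer function, the sketch below computes per-texel normals from a DEM height grid following equation (1) as reconstructed above; the function name and the NumPy-based layout are assumptions for illustration, not SIVP's API.

```python
import numpy as np

def dem_to_normal_map(height: np.ndarray) -> np.ndarray:
    """Build an (H-1, W-1, 3) normal map from a 2D DEM height grid, per equation (1)."""
    h_g = height[:-1, :-1]          # height at (x, y)
    h_a = height[1:, :-1]           # height at (x, y+1), the next row
    h_r = height[:-1, 1:]           # height at (x+1, y), the next column
    nx = h_g - h_a
    ny = h_g - h_r
    nz = np.ones_like(h_g)
    length = np.sqrt(nx * nx + ny * ny + nz * nz)
    normal = np.stack([nx, ny, nz], axis=-1) / length[..., None]
    # Pack [-1, 1] components into [0, 255] RGB, the usual normal-map encoding.
    return ((normal * 0.5 + 0.5) * 255).astype(np.uint8)
```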

2) Symbols for the influence of an object. In some cases, the user wants to know not only the state of an actor, such as its position and orientation, but also the influence of the actor and the interactions between actors. In SIVP, we use symbols to represent the influence of an actor. These symbols can be added to a node by the user, and their shapes can be changed at run time.

Figure 3 illustrates the symbols in SIVP.

Figure 3. Symbols for representing the influence of an object.
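The snippet below sketches how an influence symbol might be attached to a node and reshaped or resized at run time; attach_influence, update_influence, and the influence_radius field are hypothetical names used only to illustrate the idea.

```python
# A minimal sketch, assuming the Node/InfluenceType classes sketched in Section 2.1.
def attach_influence(node, influence_type, radius):
    """Attach a symbol (ball, cube, cylinder, or cone) visualizing the node's influence."""
    node.influence_type = influence_type
    node.influence_radius = radius        # assumed extra field for the symbol's size

def update_influence(node, new_type=None, new_radius=None):
    """Change the symbol's shape or size at run time, e.g. when an actor's influence grows."""
    if new_type is not None:
        node.influence_type = new_type
    if new_radius is not None:
        node.influence_radius = new_radius
```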

3) Vector arrows and lines for vector fields. We provide two ways to transfer vector field data to visual primitives. One way is to use an arrow to describe the direction and magnitude of a vector: the direction of the arrow gives the direction of the vector, and the length of the arrow gives its magnitude. Let V = (x, y); then length = sqrt(x^2 + y^2). The other way is to use a line to describe a vector. Here we use an improved DDA method to generate the vector line.
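The sketch below shows the two mappings in simplified form: an arrow glyph whose length equals the vector magnitude, and a basic DDA line rasterizer standing in for the improved DDA method mentioned above (the improvement itself is not reproduced here).

```python
import math

def vector_to_arrow(origin, v, head_scale=0.2):
    """Map a 2D vector to an arrow glyph: a shaft plus two short head segments."""
    ox, oy = origin
    x, y = v
    length = math.hypot(x, y)                  # magnitude of V = (x, y)
    if length == 0.0:
        return []
    tip = (ox + x, oy + y)
    ang = math.atan2(y, x)
    head = []
    # Two head segments rotated +/- 150 degrees back from the shaft direction.
    for da in (math.radians(150), math.radians(-150)):
        hx = tip[0] + head_scale * length * math.cos(ang + da)
        hy = tip[1] + head_scale * length * math.sin(ang + da)
        head.append((tip, (hx, hy)))
    return [((ox, oy), tip)] + head            # list of line segments to draw

def dda_line(x0, y0, x1, y1):
    """Basic DDA rasterization of the segment (x0, y0)-(x1, y1) into pixel coordinates."""
    steps = int(max(abs(x1 - x0), abs(y1 - y0)))
    if steps == 0:
        return [(round(x0), round(y0))]
    dx, dy = (x1 - x0) / steps, (y1 - y0) / steps
    return [(round(x0 + i * dx), round(y0 + i * dy)) for i in range(steps + 1)]
```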

4) Surface extraction from a scalar field. We use the classical marching cubes method to transfer a scalar data field to surfaces. To generate a proper isosurface, the user assigns the color, transparency, and iso-value of the surface. Storing the isosurface consumes a lot of memory, so we use a layered vertex pool technique to reduce the memory requirement.
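A key detail of such a vertex pool is that isosurface vertices generated on shared cell edges are stored once and referenced by index rather than duplicated. The sketch below illustrates that idea with an edge-keyed pool; it is a simplified stand-in for, not a reproduction of, the layered structure used in SIVP.

```python
class VertexPool:
    """Deduplicate isosurface vertices by the cell edge they were created on."""
    def __init__(self):
        self.vertices = []        # unique vertex positions
        self._index = {}          # edge key -> index into self.vertices

    def add(self, edge_key, position):
        """edge_key identifies a cell edge (e.g. a pair of grid point ids); position is the interpolated vertex."""
        if edge_key not in self._index:
            self._index[edge_key] = len(self.vertices)
            self.vertices.append(position)
        return self._index[edge_key]

# During marching cubes, each triangle stores three indices instead of three vertex copies:
# pool = VertexPool()
# tri = (pool.add((p0, p1), v01), pool.add((p1, p2), v12), pool.add((p2, p0), v20))
```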

2.4. The rendering environment

The task of the rendering environment is to organize and manage the nodes of the scene and then render the scene correctly. Because the modeling environment has already transferred objects to visual primitives such as polygons and images, the rendering process is relatively simple. Furthermore, we use a linear table to organize the nodes in the virtual environment, so the organization and management task is also easy.
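Because every node carries its own rendering method (Definition 6), the rendering process can be a simple pass over the linear table of nodes. The loop below is a minimal sketch of that dispatch; the draw_* names are placeholders, not SIVP's rendering method library.

```python
def render_scene(scene):
    """Walk the linear table of nodes and let each node's rendering method draw it."""
    for node in scene.nodes:
        node.rendering_method(node)   # e.g. draw_polygon, draw_image, draw_volume, draw_particles

# Example wiring, assuming the Node/Scene sketch from Section 2.1 and a placeholder draw_polygon:
# mesh_node = Node(DataType.POLYGON, mesh, (0, 0, 0), (0, 0, 0),
#                  InfluenceType.BALL, None, draw_polygon)
# render_scene(Scene(nodes=[mesh_node]))
```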

To render special effects in the virtual environment, we add a special effect library to the rendering environment. At present, the library supports realistic cloud rendering based on a particle system. Figure 4 shows clouds generated using Perlin noise.

Figure 4. Different types of clouds generated by our method.
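As a rough sketch of noise-driven cloud generation, the code below builds a 2D cloud density field from several octaves of interpolated lattice noise (value noise, used here as a simple stand-in for the Perlin gradient noise mentioned above); such a field can drive the opacity of cloud particles. All parameters and thresholds are illustrative only.

```python
import numpy as np

def value_noise(size, grid, rng):
    """Bilinearly interpolated lattice noise on a size x size image with grid x grid cells."""
    lattice = rng.random((grid + 1, grid + 1))
    xs = np.linspace(0.0, grid, size, endpoint=False)
    i0 = xs.astype(int)
    t = xs - i0
    rows = lattice[:, i0] * (1 - t) + lattice[:, i0 + 1] * t              # interpolate along x
    return rows[i0, :] * (1 - t)[:, None] + rows[i0 + 1, :] * t[:, None]  # then along y

def cloud_density(size=256, octaves=4, seed=0):
    """Sum several octaves of value noise and threshold it into cloud-like patches."""
    rng = np.random.default_rng(seed)
    total = np.zeros((size, size))
    amplitude, frequency = 1.0, 4
    for _ in range(octaves):
        total += amplitude * value_noise(size, frequency, rng)
        amplitude *= 0.5
        frequency *= 2
    total /= total.max()
    return np.clip((total - 0.4) / 0.6, 0.0, 1.0)   # 0 = clear sky, 1 = dense cloud
```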


3. Some works based on SIVP

In this section, we present two applications based on SIVP. The first is a strategy situation visualization system. The requirement of this system is to visualize the influence of strategic objects in a virtual environment. The environment includes measured terrain data, base objects and their influences, equipment objects and their influences, and routes. In this work, we also use the billboard method to depict some complex equipment objects.

Figure 5. A strategy situation visualization system.
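The billboard method replaces a complex equipment model with a textured quad that always faces the viewer. The sketch below computes such a camera-facing orientation; it is a generic cylindrical billboard (rotation about the world up axis), given only for illustration and not taken from SIVP's implementation.

```python
import math

def billboard_yaw(object_pos, camera_pos):
    """Yaw angle (radians, about the world Y axis) that turns a quad toward the camera."""
    dx = camera_pos[0] - object_pos[0]
    dz = camera_pos[2] - object_pos[2]
    return math.atan2(dx, dz)

# A quad modeled facing +Z is rotated by billboard_yaw(...) each frame so its
# textured front always points at the viewer, approximating the full 3D model.
```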

The second is an atmosphere visualization system. The requirement of this system is to visualize the effects of the atmosphere in a virtual environment, including temperature, pressure, wind direction, and so on. The input data include measured terrain data, atmosphere data, equipment objects and their influences, and routes. In this work, we also use the particle system to create a realistic cloud effect. Run-time results of the system are shown in Figure 6.

Figure 6. An atmosphere visualization system.

4. Conclusion and future works

In this paper, we presented a toolkit called the Synthetic Information Visualization Platform (SIVP). The objective of SIVP is to integrate data visualization with the virtual environment, visualizing non-visible data in a virtual environment to improve the imagination property of the VR system. This paper introduced the workflow of SIVP and its main components: the modeling environment and the rendering environment. The paper also presented some applications based on SIVP.

There is still much work to be done to make the virtual environment more realistic. For example, to render large-scale terrain data, an out-of-core method should be added to the environment. Hardware-accelerated volume rendering methods should also be investigated and added to the special effect library to support direct volume rendering.

References

[1] Möller, T. and Haines, E. Real-Time Rendering. A.K. Peters, Ltd., 2002.
[2] Dong Shihai. Progress and challenge of human-computer interaction. Journal of Computer-Aided Design & Computer Graphics, 16(1), 1-13, 2004.
[3] McCormick, B.H., DeFanti, T.A. and Brown, M.D. Visualization in scientific computing. Computer Graphics, 21(6), 1987.
[4] Shedroff, N. Information interaction design: a unified theory of design. In R. Jacobson (ed.), Information Design, MIT Press, Massachusetts, 1999.
[5] Elvins, T.T. A survey of algorithms for volume visualization. ACM Computer Graphics, August 1992.
[6] Yagel, R. Volume viewing algorithms: survey. International Spring School on Visualization, 2000.
[7] Meissner, M., Huang, J., Bartz, D., Mueller, K. and Crawfis, R. A practical evaluation of popular volume rendering algorithms. ACM Symposium on Volume Visualization 2000, Salt Lake City, New York: ACM Press, 81-90, 2000.
[8] Engel, K. and Ertl, T. Interactive high-quality volume rendering with flexible consumer graphics hardware. EG2002 State of the Art Report, 2002.
