aixCAVE - Virtual Production Intelligence goes Virtual Reality

  Cluster of Excellence Integrative Production Technology for High-Wage Countries

Building on the methods of information integration and visualization explored during the first funding period in the research area “The Ontology and Design Methodology of Virtual Production Systems”, the research area “Virtual Production Intelligence” continues the research on virtual production. During the second funding period of the Cluster of Excellence, the focus is on the systematic analysis of information from the production environment. This includes the identification, integration, extraction, analysis and visualization of production information. The concept of Virtual Production Intelligence (VPI) builds on successful, well-established concepts of information integration and aggregation as well as on their evaluation and interpretation by experts in the relevant application domains. The vision of this project is to reduce the discrepancy between planned and actual values by facilitating the aggregation and propagation of heterogeneous information generated within processes in real and virtual production environments.

 

Practical Issues


In established factory planning processes, an inspection of the factory is not possible until its completion. Due to this limitation, aspects such as the visibility between workstations cannot be considered during the planning phase. In many projects, such issues have a negative effect on the planning result. Another important aspect is the ideal configuration of machines when planning a factory. The variety of adjustment possibilities that influence the production result leads to a large parameter space which can no longer be grasped intuitively. As a consequence, ideal configurations for individual processes often cannot be found and are therefore substituted by approximations based on experience or technology tables.

 

Approach


The first scenario examines the inspection of a factory in a fully immersive environment. First, a factory model developed with the factory planning tool visTABLE is loaded into the aixCAVE and made available for exploration. The user can observe their immediate surroundings from different perspectives by physically walking through the aixCAVE, while remote areas can be reached by intuitive navigation techniques. Furthermore, the model can be analyzed for compliance with relevant criteria: the stereoscopic representation and natural mobility, for example, allow the visual accessibility of workplaces to be assessed and their layout to be estimated. These possibilities of immersive visualization represent a significant enhancement over previous planning tools.

The second scenario is an approach to visualizing so-called meta-models for the examination of production processes. These meta-models map multiple machining parameters to certain criteria, for example the impact of gas pressure or cutting speed on the roughness of the cut in laser cutting. To explore the resulting multi-dimensional parameter space, a visualization application based on HyperSlice has been developed. The technique represents the parameter space as a matrix of two-dimensional slices, one for every pairwise combination of the axes of the parameter space. In addition, a three-dimensional volume visualization shows a three-dimensional slice through the parameter space that is chosen individually by the user. The immersive presentation allows the parameter space to be explored in order to identify ideal configuration parameters for the desired process result; moreover, it makes it possible to identify interdependencies between output variables and individual input parameters. Such analyses can be used in research as well as in education.
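To illustrate the underlying idea of the HyperSlice technique, the following Python sketch samples one two-dimensional panel for every pair of parameter axes while all remaining parameters are held fixed at the current focal point. The function names and the toy laser-cutting meta-model are illustrative assumptions and not part of the actual application.

    import itertools
    import numpy as np

    def hyperslice_panels(f, focal_point, bounds, resolution=64):
        """Sample the 2-D panels of a HyperSlice matrix for a scalar
        meta-model f defined on an n-dimensional parameter space."""
        n = len(focal_point)
        panels = {}
        for i, j in itertools.combinations(range(n), 2):
            xi = np.linspace(bounds[i][0], bounds[i][1], resolution)
            xj = np.linspace(bounds[j][0], bounds[j][1], resolution)
            panel = np.empty((resolution, resolution))
            for a, vi in enumerate(xi):
                for b, vj in enumerate(xj):
                    p = np.array(focal_point, dtype=float)
                    p[i], p[j] = vi, vj      # vary only the two panel axes
                    panel[a, b] = f(p)       # all other axes stay at the focal point
            panels[(i, j)] = panel
        return panels

    # Hypothetical meta-model: cut roughness as a function of
    # (gas pressure, cutting speed, focus position) -- for illustration only.
    roughness = lambda p: (p[0] - 12.0) ** 2 + 0.5 * (p[1] - 3.0) ** 2 + 0.1 * p[2] ** 2
    panels = hyperslice_panels(roughness,
                               focal_point=[12.0, 3.0, 0.0],
                               bounds=[(6, 18), (1, 6), (-2, 2)])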

 

Technical Challenges


The first step of information integration in both scenarios consists of information modelling. Subsequently, the information is integrated into the Virtual Production Intelligence platform. Among other things, this platform provides the integration and provision of information through a service-oriented architecture, which enables, for example, the connection of various analysis methods and appropriate visualizations. While the factory model is being created in visTABLE touch, virtual access to the model shall already be possible. It is therefore a requirement that the factory model is not only loaded into the immersive environment but is also continuously synchronized with modifications made in visTABLE touch. This requires a communication channel between the immersive environment and visTABLE touch that transmits layout changes such as the creation, deletion or movement of machine models. The transmission must introduce only a short delay so that the workflow proceeds without unintended interruptions.
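A minimal sketch of such a communication channel is shown below, assuming newline-delimited JSON messages over a TCP socket; the message fields, operations and port are hypothetical and do not describe the actual visTABLE touch interface.

    import json
    import socket

    # Hypothetical wire format for layout changes sent from visTABLE touch
    # to the immersive visualization (field names are illustrative):
    #   {"op": "create" | "move" | "delete",
    #    "id": "<machine id>",
    #    "model": "<model file>",        # only for "create"
    #    "pose": [x, y, rotation_deg]}   # only for "create" and "move"

    def apply_layout_change(scene, msg):
        """Apply a single layout change to the in-memory scene description."""
        if msg["op"] == "create":
            scene[msg["id"]] = {"model": msg["model"], "pose": msg["pose"]}
        elif msg["op"] == "move":
            scene[msg["id"]]["pose"] = msg["pose"]
        elif msg["op"] == "delete":
            scene.pop(msg["id"], None)

    def listen_for_changes(scene, host="0.0.0.0", port=5000):
        """Receive newline-delimited JSON messages and apply them as they
        arrive, so that edits made in the planning tool appear with only
        a short delay in the immersive environment."""
        with socket.create_server((host, port)) as server:
            conn, _ = server.accept()
            with conn, conn.makefile("r") as stream:
                for line in stream:
                    apply_layout_change(scene, json.loads(line))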
Another challenge is the navigation through the factory model. While short distances can be covered by actually walking within the aixCAVE of RWTH Aachen University, greater distances have to be covered by appropriate navigation techniques due to the limited physical extent of the immersive system. Their implementation calls for special input devices that allow easy and quick operation.

The biggest technical challenge for the HyperSlice-based visualization of multi-dimensional meta-models is to guarantee sufficient interactivity of the application. The creation of slices through the parameter space is extremely CPU-intensive and can take several minutes at high resolutions. To address this issue, methods such as progressive rendering and task-based parallelization with user-centered scheduling are used. This allows visualizations to be displayed immediately at low resolution while interaction with the application remains possible, so that the user gets a first impression of the current situation while navigating through the parameter space. As soon as the user stops navigating, the resolution of the visualization is continually increased.
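The sketch below outlines one way such progressive refinement can be organized: a coarse slice is computed and displayed immediately, and refinement restarts at higher resolutions whenever the user stops interacting. The class, resolutions and callbacks are assumptions for illustration rather than the actual implementation.

    import threading

    class ProgressiveSlicer:
        """Progressively refine a HyperSlice panel: display a coarse result
        immediately and recompute at higher resolutions while the user is idle."""

        def __init__(self, sample_panel, resolutions=(16, 64, 256)):
            self.sample_panel = sample_panel    # callable: resolution -> 2-D array
            self.resolutions = resolutions
            self._cancel = threading.Event()
            self._worker = None

        def request(self, on_ready):
            """Cancel any running refinement and start a new one, e.g. after
            the user has moved the focal point and stopped navigating."""
            self._cancel.set()
            if self._worker is not None:
                self._worker.join()
            self._cancel = threading.Event()
            self._worker = threading.Thread(
                target=self._refine, args=(on_ready, self._cancel), daemon=True)
            self._worker.start()

        def _refine(self, on_ready, cancel):
            for res in self.resolutions:
                if cancel.is_set():
                    return                      # the user started navigating again
                panel = self.sample_panel(res)  # CPU-intensive sampling step
                if not cancel.is_set():
                    on_ready(panel, res)        # hand the refined panel to the renderer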