Synthetic Environments and Reality Simulations

Reality simulations have existed for a long time. They first gained practical credibility in military training and aerospace, and later in gaming and massively multiplayer online platforms. Simulated realities unfold within artificially created environments of synthetic origin – we call them synthetic environments (SE). They are becoming increasingly relevant, not only due to the rise of virtual platforms, but also due to progress in outer space exploration.

SEs have been defined as multi-layered spaces run by persistent infrastructure, built on data sets and simulated storylines that render a particular experience on demand. They do not occur naturally. Equipped with frameworks, limitations and opportunities, they are alternative realities that fabricate environments for the purpose of simulating experiences – both physically and digitally.

Physical synthetic environment. Physical SEs accommodate various types of conceptual models developed to guide problem-solving. They simulate the ongoing experience of a specific assemblage of physical conditions, enabling training within pre-programmed and pre-designed circumstances such as climate, weather, specific levels of pressure, temperature, wind, bathymetry and visibility. Based on these, teams can navigate and experience the effects of such conditions first-hand for training purposes, or create simulated livable environments within unlivable ones – such as extreme climates or extraterrestrial terrains – for survival purposes.

Digital synthetic environment. The most prevalent use of digital synthetic environments is currently in the commercial domain, as virtual immersive worlds. A virtual world is a synchronous, persistent network of interconnected entities (people represented by avatars) and computers (or other connected devices) occupying a digitally created layer of reality. It is a real-time-rendered, non-physical 3D environment that operates synchronously with the physical world.

Synthetic environments, both physical and digital, share the following characteristics:

  1. Spatial design and 3D coordinates – the same concepts apply as to physical space with three-dimensional coordinates, i.e. distance, simulated or physical objects placed within the space, and articulated terrain

  2. Time constant – may be real-time or may be based on an agreed time rate that will be constant for all users

  3. Entry and exit – users or participants can enter and exit the simulation

  4. On demand – the SE is available whenever desired, with a known, bounded pre-entry effort

  5. Usage asynchronicity – persistence does not require the presence of all users

  6. Continuity and sustained effects – the SE experience is continuous and sustained from session to session; actions have effects and consequences

  7. Multi-user and multi-player mode – the SE experience does not revolve around a single user, but is created as a group dynamic among actors whose actions affect one another

  8. Interactivity – there is fluent communication among users, simulated or non-simulated, for coordination, informational or entertainment purposes
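The characteristics above can be read as a loose specification for a data structure. The following is a minimal sketch of such a structure; all class, field and method names are illustrative assumptions, not a real API.

```python
from dataclasses import dataclass, field

@dataclass
class SyntheticEnvironment:
    size: tuple                # spatial extent along x, y, z (characteristic 1)
    time_scale: float = 1.0    # agreed time rate; 1.0 means real time (2)
    active_users: set = field(default_factory=set)  # who is currently inside
    event_log: list = field(default_factory=list)   # sustained effects (6)

    def enter(self, user: str) -> None:             # entry (3)
        self.active_users.add(user)
        self.event_log.append((user, "entered"))

    def exit(self, user: str) -> None:              # exit (3)
        self.active_users.discard(user)
        self.event_log.append((user, "exited"))

    def act(self, user: str, action: str) -> None:
        # actions persist in the log even after the user leaves (5, 6, 7)
        self.event_log.append((user, action))

se = SyntheticEnvironment(size=(100.0, 100.0, 50.0), time_scale=2.0)
se.enter("avatar_a")
se.enter("avatar_b")               # multi-user (7)
se.act("avatar_a", "placed object")
se.exit("avatar_a")
# the environment persists without avatar_a (5); its effects remain (6)
print(len(se.active_users))        # 1
print(se.event_log[-2])            # ('avatar_a', 'placed object')
```

Note how asynchronicity (5) and sustained effects (6) fall out of the same design choice: state lives in the environment, not in the users.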

Physical Reality Simulations

The development of physical synthetic environments is demanding, high-stakes work. It requires a precise understanding of the interplay of state-of-the-art information, interaction and visualization technologies, combined with world-class technical research and knowledge from both the research and commercial fields. The most prominent uses of such environments are therefore in highly advanced projects in the aerospace sector, environmental engineering and the military, often tied to transnational projects on a global scale and representing planetary interests.

The primary concern is simulation-based assessment of product concepts in manufacturing, design, deployment and operability. Such platforms may combine air, land, sea or space environments that need to be optimized and managed in real time. Physical environmental conditions are simulated via software and hardware resources that permit the interconnection of simulation models, and the process physically involves human participants (engineers, designers, operators etc.).

A synthetic environment is intended to assemble information about a specific reality and then reassemble that reality in order to make it suitable for a particular activity.

This information is highly specific and rests on thorough analysis, so as to adequately engineer a failure-resistant simulation and ensure persistently reliable conditions. Given the diversity of interests in the development of new systems, dedicated multidisciplinary research is always at play, bringing together a broad range of embedded-systems technologies and their operational environments.

Transition to Immersiveness

Simulation-based training requires highly realistic and immersive synthetic environments. Even though physical simulations contain a physical element, they are connected to a virtual model of the synthetic reality that acts as an operational modifier. Digital networks supplement higher-level operations with informational input, which is condensed at the physical level. Physical and non-physical reality simulations are thus not divorced from one another; yet while physical simulations always have a non-physical layer, non-physical simulations do not always have a physical counterpart – a twin.

Digital immersion technologies provide various degrees of immersion via:

  • Augmented Reality (AR) – physical environment with an informational overlay of virtual elements (augmentation)

  • Mixed Reality (MR) – environment is mostly virtual with some elements transferred from physical environment

  • Virtual Reality (VR) – environment is completely virtual

Using these, simulations create Computer-Based Training (CBT) environments. They may incorporate VR and AR to create Advanced Embedded Training (AET), yielding AR-AET or VR-CBT concepts. These environments, while technologically challenging, also involve research on human-factors design. Implementation techniques for properly including human factors in synthetic realities are guided by the international standard ISO 13407 on Human-Centred Design Processes for Interactive Systems. The scale varies from standalone, encapsulated simulated spaces to integrated Live-Virtual-Constructive (iLVC) environments.
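The immersion spectrum and its pairing with the training concepts named above can be sketched as a small classification. The pairing of AR with AET and VR with CBT follows the text; the enum values and the lookup itself are illustrative assumptions.

```python
from enum import Enum

class Immersion(Enum):
    # Degrees of immersion as defined in the text, from least to most virtual.
    AR = "physical environment with a virtual informational overlay"
    MR = "mostly virtual, with some elements transferred from the physical"
    VR = "completely virtual environment"

# Pairing immersion modes with the training concepts named in the text
# (AR-AET, VR-CBT); the mapping structure is an assumption for illustration.
TRAINING_CONCEPTS = {
    Immersion.AR: "AR-AET",   # Advanced Embedded Training with AR
    Immersion.VR: "VR-CBT",   # Computer-Based Training with VR
}

def concept_for(mode: Immersion) -> str:
    return TRAINING_CONCEPTS.get(mode, "no named concept in the text")

print(concept_for(Immersion.AR))  # AR-AET
```

Treating immersion as a closed enumeration mirrors how the text presents it: three discrete modes rather than a continuous scale.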

Human Factors – Perceptual and Cognitive

Experimental evidence suggests that the following human perceptual and cognitive capacities are crucial for a gainful experience (and persistence) in SE:

  • Tracking of moving objects

  • Tracking of multiple objects simultaneously

  • Filtering out irrelevant visual input

  • Identifying targets within a chaotic cluster

  • Switching and multitasking

  • Reacting to briefly presented stimuli

  • Mentally rotating 3D objects
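The last skill in the list, mentally rotating 3D objects, is typically probed with stimuli shown at different orientations. A minimal sketch of generating such a rotated stimulus follows; the function name and the sample points are illustrative assumptions.

```python
import math

def rotate_z(points, angle_deg):
    """Rotate (x, y, z) points by angle_deg about the z-axis."""
    a = math.radians(angle_deg)
    c, s = math.cos(a), math.sin(a)
    # standard 2D rotation applied to x and y; z is unchanged
    return [(c * x - s * y, s * x + c * y, z) for x, y, z in points]

# a vertical edge of a unit cube, used as a hypothetical test stimulus
cube_edge = [(1.0, 0.0, 0.0), (1.0, 0.0, 1.0)]
rotated = rotate_z(cube_edge, 90)
# (1, 0, z) rotated 90 degrees about z lands at (0, 1, z)
print([(round(x, 6), round(y, 6), round(z, 6)) for x, y, z in rotated])
```

A mental-rotation task would present the original and rotated shapes and ask whether they are the same object, which is exactly the transformation this function encodes.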

To design a simulation, the process must meet criteria of physical and psychological fidelity that people can relate to and cognize. Human-factors analysis then covers the elements of relevance that belong to 3D design and relate to fundamental perceptual and cognitive skills.

Perceptual – if the simulation is predominantly focused on perceptual skills (motor-based manual handling, hand-eye coordination and dextrous activity), it is designed primarily to foster a sequential procedure, usually requiring limited cognitive effort. The visual experience is secondary as long as skill transfer is not compromised. Such a simulation focuses on high-fidelity physical experiences.

Cognitive – if the target is primarily cognitive (decision making, spatial navigation, spatial awareness etc.), the simulation will focus predominantly on decision-triggered animation sequences to engineer high-fidelity psychological experiences.

To increase believability and accuracy, reality simulations benefit from “exploiting” physical environments to properly design simulated realities and their interfaces. Designing behavioral details that complement rather than interfere with performed tasks is difficult – technologically, psychologically and otherwise. In the current competitive market of virtual experiences, virtual spaces often leave users “blinkered”, overwhelmed or sometimes underwhelmed. Designing scenarios that are contextually accurate yet free of distracting elements is becoming increasingly common, and the more contexts are simulated, the more the field evolves. Simulated contexts that heavily feature user navigation pay close attention to believable settings and to the fidelity and consistency of their content, and are therefore more likely to address the core of the subject for future developments.