Projects | VR and AR Lab | NC State ISE


Scale Cognition through Advanced Learning Environments in Virtual Reality (SCALE-VR)

Scale Worlds allows users to “shrink and grow.” Watch a short video on Scale Worlds below.

Six proposed scientific entities in Scale Worlds.
A user viewing entities in the CAVE.

The Next Generation Science Standards propose “scale, proportion, and quantity” as a crosscutting concept that pervades science and can aid students in making connections across topics, disciplines, and grades in order to construct a more robust understanding of science. While cutting-edge STEM research, such as nanotechnology and exoplanet exploration, is being conducted at extremes of scale, research shows that learners of all ages hold inaccurate ideas about the size of scientifically relevant entities.

Funded by the National Science Foundation, the overarching goal of SCALE-VR is to examine scale learning experiences and outcomes in immersive virtual environments, and investigate and characterize students’ conceptions of scale. SCALE-VR entails the design, development, and evaluation of a virtual learning environment called Scale Worlds, which allows students to experience first-hand, and in an embodied manner, realistic size comparisons that cannot be replicated in everyday experience. The project team has already developed a functioning prototype of Scale Worlds in a Cave Automatic Virtual Environment (CAVE). Scale Worlds will advance understanding of the crosscutting concept of scale, proportion, and quantity, and how students learn about powers of ten, scientific and standard notation, and measurement. Broadly speaking, SCALE-VR will yield theoretical advances, techniques, and principles widely applicable to STEM subjects that share a “common core” of mathematics.
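To make the powers-of-ten reasoning behind Scale Worlds concrete, the sketch below computes size ratios between entities and formats sizes in scientific notation. The entity names and sizes here are illustrative example values, not the six entities used in the project.

```python
import math

# Example entity sizes in meters (illustrative values only; not the
# entities or measurements used in Scale Worlds).
ENTITY_SIZES_M = {
    "red blood cell": 8e-6,
    "ant": 5e-3,
    "human": 1.7e0,
    "blue whale": 3e1,
}

def scale_factor(a: str, b: str) -> float:
    """Return how many times larger entity b is than entity a."""
    return ENTITY_SIZES_M[b] / ENTITY_SIZES_M[a]

def in_scientific_notation(meters: float) -> str:
    """Format a size as mantissa x 10^exponent (scientific notation)."""
    exponent = math.floor(math.log10(meters))
    mantissa = meters / 10 ** exponent
    return f"{mantissa:g} x 10^{exponent} m"

if __name__ == "__main__":
    print(in_scientific_notation(1.7))          # 1.7 x 10^0 m
    print(round(scale_factor("ant", "human")))  # 340
```

Crossing each power of ten corresponds to one “shrink” or “grow” step a user might take in the environment; a human is on the order of 10^5 times larger than a red blood cell, a comparison impossible to experience outside an immersive environment.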

Virtual Instructor Application using Augmented Reality for Worker Posture Training

A screenshot of a user viewing through an augmented reality display to see a virtual instructor (VIA).

Musculoskeletal disorders are the largest category of workplace injuries in today’s workforce. We have attempted to provide novel posture training using cutting-edge augmented reality (AR) technology. AR enables users to see real-world objects and computer-generated virtual objects in the same context. Driven by the 10 Big Ideas of the National Science Foundation, this project created the Virtual Instructor Application, or VIA. We started with the first generation of VIA, or VIA-1. The overarching objective of VIA-1 was to establish an interactive and immersive training platform to help manual material handling (MMH) workers learn safer work postures. The identified MMH postures included squat lifting, stoop lifting, and overhead reaching. Users were able to superimpose their own bodies onto the virtual instructor’s posture in AR. Users could also walk around the point-cloud-generated virtual instructor in the lab and observe lifting postures from different angles.

Comparison of Psychological Responses to Proximate Robot Motion Between the Physical World and an Immersive Digital Twin

Three experimental conditions: physical environment, LiDAR scanned virtual environment, and CAD virtual environment.

One of the principal uses of virtual reality (VR) is to simulate hazardous environments for training and psychological assessment. Digital twins, for example, are virtual copies of manufacturing environments that can be used to safely explore human-robot interactions (HRI) without the risk of injury. But do virtual environments such as these accurately reproduce the real-world effects of moving robots on human performance, workload, and stress? To measure the effects of robot motion, a mock work-cell was created in which people performed a visual search task while a collaborative robot moved its arm as in a pick-and-place operation. Two digital twins were created in VR to compare real-world effects to those in high- and low-fidelity virtual environments, with and without sound. Quantitative measures of task performance, workload, stress, and presence were recorded. Semi-structured interviews were conducted and a thematic analysis was performed to explain any significant differences between real and virtual environments. Results showed that subjective measures of presence were significantly different among the three environments. The effect of robot motion on task performance was similar in all three environments, although people performed better in VR in part because of differences in the perception of robotic sound. However, the real-world effect of robot motion on subjective workload and stress was significantly attenuated in both virtual environments. This study shows that the results of human-factors evaluations in immersive digital twins should not necessarily be taken at face value.

Robot-Related Injuries in the Workplace

Anatomical distribution of the four injury types that accounted for 80% of reported injuries.

Industrial robots are becoming more widespread in the United States and across the world, yet the nature and extent of robot-related occupational injuries is not fully understood. This is a hindrance to the development of evidence-based safety and risk mitigation strategies. To address this knowledge gap, Severe Injury Reports (SIRs) associated with robot-related injuries from the U.S. Occupational Safety and Health Administration (OSHA) were retrieved and analyzed. Specifically, 61 relevant reports were identified and a taxonomic analysis was conducted to categorize the hazard scenarios which preceded robot-related mishaps as well as the injuries which resulted. The two most frequently occurring scenarios were being pinned, pinched, or struck by unexpected robot actuation after failing to de-energize machinery during maintenance activities; and being pinned or struck by normal robot actuation after unknowingly entering the work envelope of an adjacent robot during maintenance activities. Most injuries were caused by constrained impacts. Amputations of the fingers were the most common injury, followed by fractures to the extremities. The results suggest strategies for risk mitigation, in terms of both reducing the probability of robot-human impacts and reducing the severity of injury should impact events occur.

Emergency Medicine Patient Lift Training Simulation in Virtual Reality

A participant performing an emergency medicine patient transfer task.

Musculoskeletal disorders are the largest category of workplace injuries in today’s workforce, and patient handling is a common cause of such disorders among healthcare workers worldwide. Nine participants were first asked to complete the same three-part physical lift from the pilot study in order to calibrate the virtual exertion thresholds for three different weights: 100, 150, and 200 pounds. Participants then completed virtual lifts within the Cave Automatic Virtual Environment (CAVE).

Investigation of Virtual Reality Guided Upper Limb Exercises

Person performing a shoulder extension task in virtual reality by following a virtual instructor.

Shoulder pain is common in individuals with work-related musculoskeletal disorders; it can be disabling and hinder people from performing activities of daily living. While individuals seek conventional shoulder rehabilitation programs for recovery guidance and support, new technologies like virtual reality (VR) have gained increasing attention in physical rehabilitation.