The proposed augmented reality training system comprises an emulated hazard sensor, miniature Bluetooth Low Energy range-finding beacons (one per hazard), and a web application that calculates exposure levels and through which the instructor configures the exercise.
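Range-finding with Bluetooth Low Energy beacons is typically done by converting received signal strength (RSSI) into an approximate distance. As an illustration only (the source does not specify the system's ranging method), the sketch below uses the standard log-distance path-loss model, with a calibrated RSSI at one meter as the reference; all parameter names and defaults here are assumptions:

```python
import math

def estimate_distance(rssi_dbm, tx_power_dbm=-59.0, path_loss_exponent=2.0):
    """Estimate the distance (in meters) to a BLE beacon from its RSSI,
    using the log-distance path-loss model.

    tx_power_dbm is the calibrated RSSI measured at 1 m (a common beacon
    default); path_loss_exponent ~2.0 models free space, higher values
    model cluttered environments.  These values are illustrative.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exponent))

# A beacon heard at its calibrated 1 m power is ~1 m away;
# 20 dB weaker corresponds to ~10 m in free space.
print(estimate_distance(-59.0))  # ~1.0
print(estimate_distance(-79.0))  # ~10.0
```

In practice RSSI is noisy, so a deployed system would smooth readings (e.g., a moving average or Kalman filter) before converting to distance.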
It represents a step forward in the quality and relevance of hazardous operations (HAZOPS) training. eLearning has already improved training over traditional classroom teaching, and many believe that its next step is immersion, i.e., field training that engages mind and body by simulating the experience of acting autonomously in dangerous situations. Immersive experiential/physical learning has been widely adopted; for example, OPHP runs at least five exercises every year in which a simulated hazardous waste site is prepared for students as part of their hands-on training in the 40-hour Hazardous Waste Training course.
The Rutgers School of Public Health (RSPH) Office of Public Health Practice (OPHP) conducts hands-on exercises that are a vital part of its HAZMAT training programs. A key objective of these exercises is to provide learners with realistic experiences involving (1) hazards, (2) personal protective equipment (PPE), and (3) air monitoring instruments, and to assess learner performance in the context of these experiences.
OPHP learners work with actual PPE and very realistic-looking (and realistically behaving) simulated hazards. Missing from the experience, however, are hazard sensors, including chemical and radiological sensors, that provide learners in real time with readings representative of the current condition of the exercise, such as hazard types, hazard and learner positions, and wind speed and direction. Instead, instructors periodically shout exposure levels to learners from the sidelines during the exercise, which undoes much of the effort to make the overall exercise realistic to the learner, increases the burden on instructors, and impedes multi-learner exercises.