
Fire Smell in Virtual Reality - Why Include It?

By: Humayun Khan, Department of Civil and Natural Resources Engineering, University of Canterbury, New Zealand


Loss of life and destruction from fires remain a regular occurrence in today’s world. The survival of people involved in a fire depends partly on individual and collective behaviour. Research on human behaviour in fire (HBiF) has shown that people who are unprepared and lack clear evacuation guidelines often misinterpret cues, take a long time to respond [1, 2], and take inappropriate actions that can lead to serious injuries and, in the worst cases, fatalities. To investigate the behaviour of people during fire and evacuation, multiple approaches (observational and simulated) have been tried in the past, including hypothetical experiments, case studies, field studies, drills (announced and unannounced), laboratory experiments, and virtual reality (VR) experiments [3, 4, 5]. Among these methods, VR can immerse participants in scenarios and environments that would be hazardous or impossible to replicate in the real world, allowing human behaviour to be studied and effective guidelines to be derived. However, VR also has limitations, which are discussed later in the article.

VR is a computer-simulated 3D environment that creates the feeling of being physically present in a place, mainly by engaging the visual and auditory senses. VR is generally experienced through a head-mounted display (HMD) with two small screens, each rendering an image for one eye. The brain combines the two images in the visual cortex to register 3D shape or geometry; this is how humans perceive depth and is called stereoscopic vision. The stereoscopic images displayed on the HMD are generated by creating a pixel offset between the two images. The HMD’s position and orientation are constantly tracked at high frequency using either inside-out or outside-in tracking, and the stereoscopic images are updated as the HMD moves through physical space. Apart from HMDs, VR can also be experienced using multiple projected screens (CAVE systems), 3D glasses, and even a standard LCD screen; however, each technology offers a different level of immersion. For our research, we are using the HTC Vive Pro Eye HMD to create VR experiences.
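As a minimal illustrative sketch (not the HTC Vive rendering pipeline), the two per-eye viewpoints behind stereoscopic rendering can be approximated by offsetting the tracked head position by half the interpupillary distance (IPD) along the head’s local right axis; the IPD value and function names here are assumptions for illustration only:

```python
import numpy as np

IPD = 0.063  # assumed average interpupillary distance in metres

def eye_positions(head_pos, head_right):
    """Offset the tracked head position by half the IPD along the
    head's local right axis to obtain one viewpoint per eye.
    Rendering the scene from each viewpoint produces the pixel
    offset that the visual cortex fuses into depth."""
    half_offset = 0.5 * IPD * head_right
    left_eye = head_pos - half_offset
    right_eye = head_pos + half_offset
    return left_eye, right_eye

# Example: head 1.7 m above the floor, local right axis along +X.
left, right = eye_positions(np.array([0.0, 1.7, 0.0]),
                            np.array([1.0, 0.0, 0.0]))
```

In a real tracked system this function would be called every frame with the latest HMD pose, so the two rendered images follow the user’s head motion.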

Multimodal Sensory Input for Virtual Reality

Most of the above-mentioned VR technologies are based on audio-visual stimuli and do not engage the olfactory (smell), haptic (touch), gustatory (taste), and proprioceptive (kinaesthetic) senses. Experiments using traditional audio-visual VR systems have been shown to be effective for training and wayfinding drills in a digital twin of the real world [6]. However, researchers have also found inconsistent, non-valid behaviour exhibited by users in these virtual environments (VEs). Non-valid behaviour is a set of actions that does not match the behaviour observed in the same real-life situation. Researchers have postulated that the missing sensory inputs (smell, touch, kinaesthetic) might be the reason for this discrepancy, and that adding these inputs could increase the number of valid actions performed by the user [7].

Passive Fire Smell in Virtual Reality

Smell is potentially an important sensory input experienced by people during a building fire, and it can be a cue that triggers evacuation behaviour. Introducing smell in VR in sync with the events happening in the virtual environment could increase the immersiveness of the experience. However, olfactory displays in VR remain an open research problem, and no established method exists to deliver smell. In the VR Evacuation Lab at the University of Canterbury, we are developing a passive olfactory display, the Smell-O-Vator (shown in Figure 1), which stimulates the olfactory senses without actively emitting smell molecules into the air. Our system consists of a smell chamber (containing a cotton wick carrying a fire smell) covered by a sliding lid and attached to the end of an arm. The sliding lid is controlled by a linear actuator, and the arm is manipulated by a rotary servo motor. Both motors are driven by PWM signals from a microcontroller that is wirelessly connected to the server PC running the VR simulation. During the VR experience, when smoke starts in the environment, the arm moves down as shown in Figure 1 and the sliding lid opens to release the fire smell. After the smoke is suppressed through user behaviour and has completely stopped, the arm moves up and the sliding lid closes to remove the smell stimulus. With this mechanism, the smell can be introduced and taken away easily in VR, and we can prevent it from lingering in the environment.
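The event-to-actuator logic described above can be sketched as follows. The pulse-width values, event names, and function are hypothetical stand-ins for the actual firmware, which drives the servo and linear actuator with PWM on a microcontroller:

```python
# Hypothetical sketch of the Smell-O-Vator's control logic.
# Servo and actuator positions are commanded via PWM pulse widths;
# the microsecond values below are assumed, not the real calibration.

ARM_UP_US, ARM_DOWN_US = 1000, 2000      # assumed rotary-servo pulse widths
LID_CLOSED_US, LID_OPEN_US = 1100, 1900  # assumed linear-actuator pulse widths

def handle_event(event):
    """Map a simulation event from the server PC to target PWM pulse
    widths: smoke starting lowers the arm and opens the lid to release
    the smell; smoke stopping raises the arm and closes the lid."""
    if event == "smoke_started":
        return {"arm_us": ARM_DOWN_US, "lid_us": LID_OPEN_US}
    if event == "smoke_stopped":
        return {"arm_us": ARM_UP_US, "lid_us": LID_CLOSED_US}
    raise ValueError(f"unknown event: {event}")

# Example: the VR simulation reports that smoke has appeared.
targets = handle_event("smoke_started")
```

In the actual system, the returned targets would be written to the motors' PWM channels on the wirelessly connected microcontroller rather than held in a Python dictionary.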

We will be using our passive olfactory display in a recreated MGM Grand Hotel fire scenario in VR, as it is a well-documented incident with validated actions identified by the survivors. The MGM Grand Hotel fire occurred in 1980 and killed 85 people, most of whom died from smoke inhalation in the hotel tower. Arias et al. experimented with the MGM Grand fire scenario at Lund University without smell stimuli [3]. Figure 2 shows the MGM hotel room in the simulation, and Figure 3 shows the smoke coming through the vent. Participants will experience the fire smell when the smoke starts coming out of the vent. We will investigate the effect of smell stimuli on users’ behaviour during the fire. We anticipate that the addition of smell will increase the number of valid actions performed by users in the simulated MGM Grand Hotel fire scenario.


 Figure 1: HTC Vive HMD with the Passive Olfactory Display, Smell-O-Vator


Figure 2: MGM hotel room VR environment                  Figure 3: MGM hotel room with smoke

Limitations and Potential Opportunities

Even though VR is a promising tool for creating high-fidelity fire evacuation scenarios, it is limited from a risk-perception standpoint. As there are no consequences associated with performing non-valid behaviour, users take unrealistic actions that would not be possible in real life. The virtual world created in VR has high visual fidelity, but participants know that they are in an artificial, simulated world. This belief leads to actions that do not reflect the behaviour exhibited in a real fire, where evacuees feel immediate danger. However, adding multisensory stimuli such as smell, touch, and heat alongside the visuals would arguably make the VR experience more realistic and could bring user responses closer to those shown in real-life scenarios.

What’s Next?

VR offers a medium to simulate the real world with high visual and audio fidelity. There remains a lack of research on the effect of multimodal stimuli in evacuation and safety-related VR scenarios; in particular, research has not assessed the advantages and disadvantages of adding modalities to the environment. In our next planned user study, we will examine the effect of including smell on the fidelity of the behaviour exhibited by users. The study will be conducted using our newly designed passive smell-delivery system. Exploring this avenue could open new ways of creating more effective evacuation plans and guidelines that might save lives in future incidents.


[1] Wood PG. A survey of behaviour in fires. John Wiley & Sons, Ltd, UK; 1980.

[2] Chittaro L, Ranon R. Serious games for training occupants of a building in personal fire safety skills. In: 2009 Conference in Games and Virtual Worlds for Serious Applications; 2009 Mar 23. IEEE. p. 76-83.

[3] Arias S, Fahy R, Ronchi E, Nilsson D, Frantzich H, Wahlqvist J. Forensic virtual reality: investigating individual behavior in the MGM Grand fire. Fire Safety Journal. 2019;109:102861.

[4] Lovreglio R, Kinateder M. Augmented reality for pedestrian evacuation research: promises and limitations. Safety Science. 2020;128:104750.

[5] Kinateder M, Ronchi E, Nilsson D, Kobes M, Müller M, Pauli P, Mühlberger A. Virtual reality for fire evacuation research. In: 2014 Federated Conference on Computer Science and Information Systems; 2014 Sep 7. IEEE. p. 313-321.

[6] Shaw E, Roper T, Nilsson T, Lawson G, Cobb SVG, Miller D. The heat is on: exploring user behaviour in a multisensory virtual environment for fire evacuation. In: Proceedings of the CHI Conference on Human Factors in Computing Systems; 2019. p. 1-13.

[7] Dong Y, Webb M, Harvey C, Debattista K, Chalmers A. Multisensory virtual experience of tanning in medieval Coventry. In: GCH; 2017. p. 93-97.