
On-road Virtual Reality Driving Simulator

Human Media Interaction - Master's thesis

Author: David Goedicke, BSc
d.goedicke[at]student.utwente.nl
s1235079

Stanford University supervisor: Dr. Wendy Ju

University of Twente supervisor: Dr. Jamy Li

Prof. Dr. Vanessa Evers, Dr. Angelika Mader

    December 11, 2017

Contents

1 Introduction
2 Related Work
  2.1 Driving Simulators
  2.2 On-road driving experiments
3 VR-in-Car as a research platform
  3.1 Advantages of VR-in-Car
  3.2 Development Challenges
4 VR-in-Car System Design
  4.1 Inputs
    4.1.1 Car orientation tracking
    4.1.2 OBD II and CAN-Bus
    4.1.3 User input
  4.2 Virtual Environment Core
  4.3 Outputs
  4.4 Data Collection
  4.5 Testing set-up and support hardware
5 Setup and Protocol Considerations
  5.1 Vehicles, Wizards and Participants
  5.2 Driver Instructions/Training
  5.3 Motion Sickness
6 Validation Study
  6.1 Participants
  6.2 Virtual Course Scenario
  6.3 Physical Course
  6.4 Experiment Procedure
  6.5 Analysis
7 Results
  7.1 Motion Mirroring and Orientation
  7.2 User Experience
  7.3 Sound
  7.4 Feedback
8 Discussion
  8.1 Participant Experience
    8.1.1 Graphical Fidelity
    8.1.2 Motion Sickness
  8.2 Technical Improvements
9 Future work
10 Prospective Applications
  10.1 Autonomous Vehicle Design
  10.2 Human Behavior in Autonomous Vehicles
  10.3 Inertial Motion as signaling for AVs
  10.4 Handover scenarios
  10.5 Design for entertainment and secondary activities
11 Conclusion
A Data Collection Snippet
B Experiment Documents
  B.1 Procedure
  B.2 Participant recruiting text
C Motion cuing
  C.1 IMU evaluation and testing
D CAN-Bus implementation
E Course Selection

Abstract

For autonomous vehicles to become market-ready, it is important to better understand how drivers and passengers will interact with this new technology. This includes the interface design and how the vehicle's motion is perceived by the passengers. We propose a new method to evaluate this interaction, which combines the use of simulators and on-road methods.

Through virtual reality, participants can experience what it is like to drive in an autonomous vehicle. They are seated in the passenger seat of a regular vehicle, which is operated by the experimenter. Throughout the experiment, participants perceive a simulated traffic scenario through a virtual reality headset. Additionally, they can indicate steering motions through a steering wheel in front of them.

This new method allows for a rapid prototyping approach to autonomous vehicle design in a highly immersive virtual environment. The integration of virtual reality into a real car allows for a realistic and immersive experience while being cost-effective.

A preliminary validation study yielded promising results and showed that the proof-of-concept implementation of the system can immerse participants in a virtual driving scenario. The realism of the system was convincing enough to evoke reflexive behaviors in some test participants in response to the environment. The results from this study are thus promising; however, further research needs to be done to fully validate this method.

Chapter 1

    Introduction

The emerging technology of Autonomous Vehicles (AVs) poses new questions about how people interact with AVs and react to their interfaces, behaviors and motion. Existing methods to address these issues use simulators and on-road methods. However, simulators are often expensive (above $50k [10]) and on-road studies are very limited in the type of traffic scenarios that can be encountered.

Therefore, in this document we present VR-in-Car, a hybrid simulation and on-road method that addresses the aforementioned issues. It integrates off-the-shelf Virtual Reality (VR) technology into a moving vehicle. VR-in-Car allows participants to experience what it is like to drive in an AV. The method is comparatively low-cost and offers inertial vestibular cues and dynamic traffic environments. The main contributions of this work are the development, a proof-of-concept implementation and an evaluation of this new method.

The next section describes the different areas of related work on vehicle and simulator research. In Section 3, this new method is discussed and compared to the related work. Section 4 describes the technical implementation details. Sections 5 and 6 discuss the application of this method in a study. The results are described and discussed in Sections 7 and 8, respectively. Sections 9 and 10 lay out future work and prospective applications.

Figure 1.1: VR-in-Car allows participants to experience the physical sensations of real-world motion while being visually in a virtual environment.

Chapter 2

    Related Work

VR-in-Car is a research tool that is particularly cost-effective in creating immersive traffic experiences from the driver's and passenger's point of view. In the following section we look at existing work in the field of driver-AV interaction research to evaluate this new method against prior work.

    2.1 Driving Simulators

Driving simulators are a commonly used method to evaluate interface design and human behavior in simulated traffic situations.

In advanced driving simulators, a participant enters a mobile car chassis in which screens or projectors replace the actual windshield to create a visual virtual environment. Many types of driving simulators exist; they can primarily be differentiated by the level of immersion they can create. Generally, the more sensory inputs a simulator can synthesize, the higher its immersive potential. Immersiveness is important to provoke genuine responses in the participants [15].

Simulator research always has to negotiate between cost and the immersion required for any given research question. For instance, if the focus is on reflexive behavior, a high-fidelity simulator is required to provoke the required response [9, 23].

Simulators range in complexity from basic on-screen simulators like City Car Driving or OpenIV¹ to very advanced facilities that also include the capability to create a felt sense of inertial motion. Examples of these high-end simulators are the NADS simulator at the University of Iowa [12] and Ford's advanced VIRTTEX simulator [2]. However, even these expensive and elaborate set-ups are still only able to produce a fraction of the motion one feels driving in the real world [10, 8]. More information on the importance of motion cuing can be found in Appendix C.

¹http://citycardriving.com, http://openiv.com/

VR-in-Car is set up and run in a real car; the motion felt by the participant is therefore very realistic. At the same time, this method is much cheaper to set up. The cost to equip a regular car with VR-in-Car is, at the time of writing, about 5,000 to 7,000 euros. These costs may become even lower once the equipment used is more readily available. This is significantly less than even the simplest motion-base simulators, which start at about €50k. If the acquisition or rental of a research vehicle is required, the costs are about the same as those of a simple motion-base simulator [9].

    2.2 On-road driving experiments

In on-road driving experiments, participants enter a mobile vehicle, often an instrumented regular car, which is driven through the real world. This differs from on-road observational studies such as [11, 17], which observe without interacting with the drivers themselves. Past theoretical work that has analyzed the value of conducting research with users in naturalistic environments has claimed that the benefits include people's awareness of the physics, their own bodies, the environment, and their interaction [14].

Traditionally, on-road driving experiments or on-road tests have participants drive in a normal vehicle rather than an autonomous vehicle [18]. Given that researchers do not often have access to autonomous vehicles, this work uses a Wizard of Oz approach to create the illusion that the normal vehicle used in the experiment is autonomous. Prior work has used this technique to create the illusion in one of two ways: by hiding the driver [4, 24] or by controlling the vehicle through a hidden wire system [22]. Other studies have used a similar approach to investigate the interaction between pedestrians and AVs [21].

For general interaction- and behavior-related questions, on-road and simulator methods are a common choice. Each method also extends into different specialized areas to answer specific research questions. These specific versions are not covered here, as they do not differ significantly from the methods already introduced. The new research platform combines some of the advantages of these existing methods in an affordable way, enabling more research on passenger-AV interaction and on questions that have until now been difficult to answer.

Chapter 3

    VR-in-Car as a research platform

VR-in-Car aims to combine the benefits of the driving simulators and on-road experiments described above. In this hybrid approach, participants wear a head-mounted virtual reality headset as the simulator component, while riding in a regular vehicle that is made to appear autonomous as the on-road component.

The following components are in line with driving simulator methodology: the environment perceived by the participants is simulated, and fake control surfaces and hand tracking complete the illusion of sitting in an autonomous vehicle. The following components are in line with on-road experiments: the motion of the real vehicle is used to create the felt motion for the participant, and the test scenario can be set up in an easy and cost-effective way.

The idea for this approach is based on work done by NASA on pilot training. Bachelder used similar fused reality to train helicopter aircrews [3], and work at the NASA Armstrong Flight Research Center¹ used fused reality to train pilots in air refueling and landing sequences at a safe height.

¹https://www.nasa.gov/centers/armstrong/features/fused_reality.html

Using virtual reality in the car is not an entirely new idea in the automotive space either. Hock et al. and McGill et al. demonstrated the use of VR in the car for entertainment purposes and to alleviate motion sickness [13, 19].

3.1 Advantages of VR-in-Car

Figure 3.1: Methodological advantages of using VR-in-Car as a research tool kit.

Simulator research is based on the premise that participants need to suspend their disbelief while being in the virtual environment. Similar to watching a movie, the participant is expected to accept the virtual environment as their current new reality. It is highly likely that the more believable the illusion is, the more natural the participants' responses to the actions in the virtual environment will be. VR-in-Car meets the requirements for this type of simulator research on driving and interface design by creating an immersive experience for the participants, synthesizing or augmenting their sensory perception. The participants' experience can be broken down into the different sensory inputs this technology produces or uses:

Visual - synthesized The view of the participant is completely computer generated.

Acoustic - naturalistic In the proof-of-concept implementation the acoustic experience was not augmented, so only natural audio sources were present, e.g. engine and road noise. Later versions of VR-in-Car could play additional sounds emitted by virtual objects through the car speakers or open headphones to augment the already existing sounds.

Tactile - augmented The participants' hands are tracked and visible in the VR environment. The interior of the virtual vehicle is closely modeled after the physical car, which also means that the steering wheel shape and position are similar. A calibration procedure ensures that the physical objects line up with the virtual objects.

Motion - naturalistic The physical car motion maps directly to the virtual vehicle the participant sits in and sees in VR. The motion is hence limited by the driving abilities of the wizard (a minimal sketch of this mapping follows below).

The complete experience can be seen in Figure 4.2 and Figure 4.5.
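To make the motion channel concrete, the following is a minimal sketch, in Python rather than the Unity C# the system is actually scripted in, of how the naturalistic motion mapping can work: the virtual car copies the orientation reported by the car-mounted IMU and dead-reckons its position from the OBD II speed reading. The quaternion convention, the forward axis and the function names are illustrative assumptions, not the thesis implementation.

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate vector v by the unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

def step_virtual_car(position, q_car, speed_kmh, dt):
    """One update of the virtual car's pose.

    Orientation is copied verbatim from the physical car's IMU, so every
    turn the wizard driver makes is mirrored; position is dead-reckoned
    along the car's forward axis from the polled OBD II speed."""
    forward = quat_rotate(q_car, np.array([0.0, 0.0, 1.0]))     # assume +Z is forward
    new_position = position + forward * (speed_kmh / 3.6) * dt  # km/h -> m/s
    return new_position, q_car
```

In the actual system, this mirroring runs every frame inside the Car game object described in Section 4.2.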

VR-in-Car has three main advantages compared to existing methodologies. First, in comparison to driving simulators, it is a low-cost method to acquire and operate. Second, in comparison with on-road tests without a VR headset, this method can be used to evaluate almost all traffic scenarios, from normal city driving to critical highway situations. Third, the motion felt and the visual environment displayed are very immersive and close to advanced driving simulators or on-road studies. VR-in-Car aims to allow for an optimal trade-off between cost, realism and testable scenario types, as shown in Figure 3.1.

3.2 Development Challenges

The main challenge in developing VR-in-Car was to use off-the-shelf VR display and sensor technology to create an implementation that can reliably be used to test human participants. This underlying concept made repeatability and ease of use key factors in developing the technical components for this method.

The core functionality of the system is tracking the participant and the experiment car accurately and in a synchronized way, to consistently mirror their behaviors in the virtual environment. How these design and implementation objectives were achieved is described in the following implementation sections.

Chapter 4

    VR-in-Car System Design

The VR system is made up of several different components that work together to create an immersive experience for the participant. The explanation is split into three parts: input, core and output. The input consists mostly of sensor technology that is used to measure actions in the real world. The core keeps track of these changes in the virtual environment and renders appropriate changes through the outputs, e.g. screens or a Head Mounted Display (HMD), to the participant. The schematic layout of the related hardware components is depicted in Figure 4.1.

    4.1 Inputs

There are a few different input streams into the core, each contributing different pieces of information that are used to mirror the car's and the participant's behavior in the virtual environment. The car is tracked using internal and external sensors. The participant's head motion and hands are tracked with off-the-shelf VR technology.

    4.1.1 Car orientation tracking

An Inertial Measurement Unit (IMU) was used to track the orientation of the vehicle. Many different IMU types are available, with different cost-benefit trade-offs. Picking the right sensor is crucial to making the system work. In this section we describe the choice and integration of IMUs for this implementation.

An IMU is commonly a combination of different Micro-Electro-Mechanical Systems sensors that are used to determine the orientation of the device in three-dimensional (3D) space. Most IMUs that are available on the market use a type of sensor fusion, such as a Kalman filter, to merge accelerometer, gyroscope and magnetometer data into a quaternion orientation.

Figure 4.1: Sensor and hardware diagram showing the placement and organization of the different hardware components. Credit: Dr. Wendy Ju.

A variety of different IMUs exist, all with different sensors and fusion algorithms. For the proof-of-concept implementation we ended up using an MTi-1 from Xsens. The evaluation process for picking this particular IMU is laid out in Appendix C.1.

The IMU is mounted in the center console between the driver and the participant, as close as possible to the center of rotation of the car but also as far away as possible from any metal. The quaternion sent by the IMU can be used directly in the core to link the physical car's orientation to the orientation of the virtual car.

Marker Tracking and Drift Correction

An IMU is also used to track the orientation of the participant's head. However, since IMUs are imperfect sensors, the orientation vectors of the head and the car will slowly drift apart. In other words, after some time the rendered view in the participant's HMD might look out of the virtual car window even though the participant is sitting straight (i.e. looking forward) in the car. This mismatch between the virtual representation and the real situation can induce severe visual-vestibular conflicts that can lead to simulator sickness, since the motion of the actual car no longer aligns with the motion of the virtual car. Several different approaches were implemented and evaluated to detect and correct for this relative drift of the sensor systems.
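The drift problem can be stated compactly with quaternion algebra: the HMD view should be driven by the head orientation expressed in the car's body frame, and any slow change of that relative orientation while the participant sits still is accumulated drift. The sketch below is a hypothetical Python fragment illustrating this bookkeeping; it is not the system's Unity code, and the names are invented for the example.

```python
import numpy as np

def quat_conj(q):
    """Conjugate (the inverse, for unit quaternions), q = (w, x, y, z)."""
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def quat_mul(a, b):
    """Hamilton product of two quaternions."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def head_in_car_frame(q_car, q_head):
    """Head orientation relative to the car body frame; this is the
    rotation the rendered HMD view should follow."""
    return quat_mul(quat_conj(q_car), q_head)

def drift_angle_deg(q_rel, q_rel_at_calibration):
    """Angle between the current relative orientation and the one stored
    at calibration time; slow, steady growth signals IMU drift."""
    d = quat_mul(quat_conj(q_rel_at_calibration), q_rel)
    return np.degrees(2.0 * np.arccos(np.clip(abs(d[0]), 0.0, 1.0)))
```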

The first approach used ARToolkit's¹ marker tracking to track the head rotation relative to the car's orientation. The tracked rotation was then used to compare against and correct the drifting IMU rotation. However, the light sensitivity of the camera and an unstable tracking algorithm meant this approach could not reliably correct for the drift.

¹https://artoolkit.org/

Other algorithms and approaches, such as comparing the acceleration vectors between the car and the headset or LED tracking, were explored. None, however, could deliver the frame rate and accuracy necessary to reliably correct for the IMU drift. For the proof-of-concept implementation this problem was circumvented by recalibrating the orientation vectors between the different conditions of one experiment.

    4.1.2 OBD II and CAN-Bus

Another important piece of information needed to mirror the car's motion is its speed. This information stream needs to be frequent, reliable and accurate. In general, serial buses in the car offer this type of information; both the On-Board Diagnostics port (OBD II) and the CAN-Bus provide it. In the proof-of-concept implementation the OBD II port was used. Details on the CAN-Bus implementation can be found in Appendix D.

The OBD II port is a polled serial interface that is typically used for car diagnostics and emissions testing. The vehicle speed is one of the parameters that can be requested through this port. Its refresh rate and accuracy are lower compared to the CAN-Bus: OBD II has an average data throughput of about 14.4 KB/s, and the speed value is an integer with km/h accuracy.
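As an illustration, polling the speed over OBD II could look like the sketch below, assuming the python-obd library and an ELM327-style adapter; the wire format and the address of the rendering core are invented for the example.

```python
import socket
import time

import obd  # python-obd, talking to an ELM327-style OBD II adapter

connection = obd.OBD()  # auto-detects the adapter's serial port
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
CORE_ADDR = ("192.168.0.10", 5005)  # hypothetical address of the rendering core

while True:
    response = connection.query(obd.commands.SPEED)  # integer km/h from the ECU
    if not response.is_null():
        kmh = response.value.to("kph").magnitude
        # hypothetical wire format: name, value, send time
        sock.sendto(f"speed,{kmh:.0f},{time.time():.3f}".encode(), CORE_ADDR)
    time.sleep(0.1)  # OBD II is polled, so roughly 10 Hz is a realistic rate
```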

    4.1.3 User input

To further enhance the virtual experience, the user input consists of two parts. The first is the LEAP Motion, which tracks the hands of the participant. This enables the system to render the hands of the participant in real time in the virtual environment, which significantly supports the illusion of the virtual environment.

The second interactive component is a gaming steering wheel (Logitech G920) connected to the main computer. Its rotation is logged over time and directly linked to the visual representation of the virtual steering wheel. The virtual steering wheel is depicted in Figure 4.2, while its physical counterpart can be seen in the bottom right corner of Figure 1.1.

It is important to note that in this proof-of-concept implementation, the participant physically sat on the passenger side while virtually sitting on the driver side.

Both input methods are also used to calibrate the view position of the participant in the virtual environment, as their real-world positions relative to the car are known.

Figure 4.2: Visual appearance of the Virtual Environment.

    4.2 Virtual Environment Core

The virtual environment is designed and organized in Unity3D, a freely accessible game engine². It combines the various data streams and mirrors the vehicle and participant in the virtual environment. Dedicated virtual reality add-ons are used to render this environment back to the participant. The different game elements are scripted in C#. The most important elements enabling the core functionality are:

UDP Bus manages two to three different threads. These threads receive sensor information, such as speed and orientation, over different UDP ports and make it available as global variables to other game objects (a minimal receiver sketch follows this list).

Car uses the orientation and speed information available from the UDP Bus object to mirror the real car's motion and position. Additionally, it has a secondary camera, which renders a forward-facing view to a texture that is compressed and sent out over UDP to the tablet application. This is used as the view for the wizard driver.

LMHeadMountedRig manages the main position of the VR camera and is the reference point for the LEAP Motion tracked hands. This object keeps a relative position to the Car object and can rotate freely based on the HMD's IMU. This relative position is determined during the calibration process as described above.

GameManager keeps track of which environments and experiment conditions are loaded. Furthermore, it manages a data logger (running in a separate thread) which samples in-game variables, such as the car's speed and orientation, and sends them out over a UDP port to an external logging program.
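A receiver for such streams follows the same pattern in any language: one thread per UDP port, each overwriting a shared "latest value" slot that the render loop reads. The Python sketch below mirrors the role of the UDP Bus object under those assumptions; the ports and message formats match the hypothetical OBD II example above, not the thesis code.

```python
import socket
import threading

class UDPBus:
    """One listener thread per sensor stream; the latest parsed value is
    kept in shared state for the render loop to read."""

    def __init__(self):
        self.latest = {}
        self.lock = threading.Lock()

    def listen(self, name, port, parse):
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("0.0.0.0", port))

        def loop():
            while True:
                data, _ = sock.recvfrom(1024)
                with self.lock:
                    self.latest[name] = parse(data)

        threading.Thread(target=loop, daemon=True).start()

bus = UDPBus()
# "speed,52,1512345678.000" -> 52.0 (format from the OBD II sketch above)
bus.listen("speed", 5005, lambda d: float(d.split(b",")[1]))
# four comma-separated floats -> quaternion (w, x, y, z) from the car IMU
bus.listen("orientation", 5006, lambda d: [float(x) for x in d.split(b",")[:4]])
```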

The performance of the complete system was not extensively profiled. The only profiling measure was the resulting frame rate, with the most performance-hungry task being the wizard view camera streaming. Overall, around 60 frames per second was achieved, which is enough to create an immersive VR experience.

²https://unity3d.com

Figure 4.3: Software diagram showing what kind of information flows where.

    4.3 Outputs

The main output of the system is the rendered virtual environment that is presented to the participant through the HMD. Two different implementations were realized. The first used a combination of SteamVR and the OSVR headset; the second used the Oculus Home Unity plug-in and the Oculus DK2 as the VR headset³. An essential limitation of the implementation was the lack of any absolute tracking or on-the-fly recalibration. This issue is discussed in more detail in the future work section.

The other visual stream produced by the virtual environment was a forward-facing camera view. The program used JPEG compression to reduce the size of the images and sent them over a UDP port. This image stream was received by a tablet⁴ that displayed the images; the tablet was mounted in sight of the experimenter driving the car. Due to performance limitations this stream ran at a lower frame rate (a minimal sketch of this stream follows below).

³These implementation details are likely not going to help much in a replication study, as this technology, the hardware and the associated SDKs are changing quickly and being updated frequently. A discussion of this aspect can be found in the conclusion.

⁴Galaxy Tab A

Figure 4.4: On the left side is the screen showing the perspective of the participant; on the right side, a bird's eye view of the scene. This is the view of the experiment wizard.

    4.4 Data Collection

The data collection for this project involved several different parts. The values from the sensors, as well as other information from the virtual environment, were logged out of the game engine. Besides that, we recorded 360° video and audio of the participants, and the experimenter's laptop screen with in-game audio and the laptop's microphone. The recorded laptop screen showed the view of the participant as well as a wizard interface for controlling the scene and the conditions.

The internal Unity data was streamed over UDP to a Python program that combined the different messages into one timestamped data log. Variables were logged at the speed of the frame rate and all share a common time code. Additionally, specific events, e.g. button presses or certain interaction moments, were logged independent of the frame rate.
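The receiving side of such a logger reduces to a UDP socket and a timestamped, append-only file. The sketch below shows the general shape under those assumptions; the port and CSV layout are invented for the example, while the start time in the file name follows the convention described below.

```python
import csv
import socket
import time

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 7000))  # hypothetical port the game engine logs to

start = time.strftime("%Y-%m-%d_%H-%M-%S")  # start time and date in the file
with open(f"log_{start}.csv", "w", newline="") as f:  # name, to match recordings
    writer = csv.writer(f)
    writer.writerow(["timecode", "message"])
    while True:
        data, _ = sock.recvfrom(4096)
        # one common time code for every variable arriving in this frame
        writer.writerow([f"{time.time():.3f}", data.decode(errors="replace")])
        f.flush()  # a power failure then loses at most the last row
```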

For the pilot experiments, we selected 25 different values to be logged. Most of these were position and rotation information from the different actors in the scene. Additionally, interaction information from the wizard and from the participant's gaming steering wheel was collected. An excerpt from the data recording can be found in Table A.1 in Appendix A.

Each of these variables was logged at frame-rate level, between 14 and 60 times per second. The different video and data recordings include the start time and date in their file names, through which they can be associated with the corresponding participant data.

4.5 Testing set-up and support hardware

The rendering laptop, VR headset, sensors and interaction technologies were supported by a few infrastructure components that enabled the system to run. The fact that the complete system needed to operate in a car complicated some of these issues⁵.

Power was supplied to the gaming steering wheel through a DC/AC converter fed with 12 V from the car. Re-implementations of this study should include a 12 V battery and a sufficiently strong power converter so that the laptop can also be charged during an experiment.

Networking The different components needed to communicate through a fast and reliable network connection. A basic Wi-Fi router, powered directly from the car's power grid, operated a local Wi-Fi network. In this implementation, the laptop and the wizard driver's tablet used this connection to stream the wizard's driver view. The hosted network was a standard 2.4 GHz 802.11n Wi-Fi network.

Environment The area for the experiment needs to be carefully picked. A flat open space without obstacles is preferable (for more open research scenarios), for instance a black lake, an airport runway or a flat paved parking lot. The proof of concept of the system was tested on an empty parking lot. A different option is to remodel an existing scenery and use VR-in-Car strictly in that environment; this could be a suburban environment or a part of a highway. The parking lot used in the proof-of-concept implementation was not an ideal location, as humps in the space meant that the surface was not perfectly flat. This surface feature was not matched in the virtual environment.

In general, the physical environment should be matched to the virtual environment and the requirements of the experiment. For more information on course selection please see Appendix E.

Car The vehicle itself did not need to fulfill any specific technical requirements. All normal production cars nowadays have an OBD II port available, which is used to obtain the speed information from the car. The other basic requirement was the availability of a 12 V connector to supply the power system. The actual energy consumption should be calculated beforehand to make sure the car's energy provisioning is not stretched too much.

The system was deployed on three different cars: a Prius V, a right-hand-drive Jeep and an Audi A6. The implementation on the Prius worked particularly well due to the CAN-Bus access.

⁵A video explaining how the setup worked at the University of Twente: https://youtu.be/ZlXmFxDz17A

Figure 4.5: The top half of the picture shows the driver on the right, the participant on the left, and the experimenter in the back of the car. On the bottom left one can see the image the participant sees; on the bottom right, a bird's eye view as seen by the experimenter.

Chapter 5

Setup and Protocol Considerations

The operational context of this method is similar to running on-road experiments. Three aspects need to be prepared: first, a physical course needs to be selected; second, the vehicle with VR-in-Car needs to be prepared; and third, the experiment driver needs to be trained.

One additional important requirement is the approval of the internal review board or ethics board, as this research involves human-subject testing. The protocol for the proof-of-concept study was approved by the ethics board at the University of Twente.

    5.1 Vehicles, Wizards and Participants

The setup of the experiment in the car was straightforward. Conceivably this setup can be run in any car sized to hold five people. However, it is strongly advisable to use a four-door car to assure easy access for the experimenter.

During the experiment, three people (plus equipment such as cameras, laptops and tablets) were in the car. First, the experiment wizard sat in the back with the laptop running the VR-in-Car system; the wizard also needed to monitor the experience and well-being of the participant. Second, the experiment driver sat in the car's driver seat with a tablet showing the virtual environment. This screen was placed where a navigation screen would normally be. The last person was the participant, who sat in the front passenger seat wearing an HMD, with a gaming steering wheel mounted in front of them.

5.2 Driver Instructions/Training

The wizard driver needs to be trained for reasons of safety and to create a convincing, immersive and consistent driving experience for the participants.

Safety is the primary concern when running these experiments. To that end, the driver was trained to momentarily evaluate their actions in VR before focusing back on the road. Additionally, training was also important so that the wizard driver could act consistently as the autonomous system of the vehicle. This also included consistent acceleration and steering behavior. The tasks for the experiment driver can be broken down into the following three prioritized steps:

    1. Check the path of the vehicle in the real world.

2. If the participant grabs and turns the steering wheel, copy that motion.

3. Follow the virtual environment. This can be to follow a street, a car in front, or to mimic a specific driving style.

The path of the vehicle in the virtual environment was prepared as part of the experiment procedure and discussed with the driver, before practicing that path in several trial runs. After the training, the driver was familiar with the path, conditions and occurrences that could happen in the virtual environment, so that he or she could focus on the road during the experiment.

    5.3 Motion Sickness

As discussed in the related work section, motion sickness is a serious issue for simulator studies. There are a few things that can and need to be done to ensure a successful study execution and participant well-being.

Participants should be screened and excluded from the study if they have epilepsy and/or are prone to motion sickness in VR or simulators. Additionally, the experiment protocols should include clear instructions on what to do in case a participant gets motion sick or starts to feel generally unwell.

During the experiment, the experimenter needed to ensure that the frame rate of the VR environment was high enough and that the calibration between participant and car was still good. Recalibration can help combat this issue. In the validation study, the participant's orientation and position were recalibrated between the different experiment conditions.

Dealing with motion sickness, driver training and test preparation are key practical aspects of successfully using VR-in-Car as a research method. In the following section, we will examine how these procedures and the technical implementation were applied in a proof-of-concept validation study.

Chapter 6

    Validation Study

The novelty of the VR-in-Car approach requires that it be validated to assure that it can be used to produce meaningful data and results.

The full evaluation and validation of a new research method requires several different studies that evaluate the different important aspects of the new method. As a first step, the validation study evaluated whether or not the desired experience of driving in an autonomous vehicle could be elicited with this approach. The results are based on the qualitative experience of participants who used VR-in-Car.

This study is only an initial step towards validating this new method. Future evaluation may include pilot experiments or comparative studies where the same situation is tested with this method and other related methods, e.g. a static driving simulator.

    6.1 Participants

Six students and staff members (4 male, 2 female; ages ranging from 20s to 40s) from the University of Twente, selected through convenience sampling, participated in the pilot evaluation study.

All participants were screened to assure that they were not prone to epileptic seizures and indicated that they had not previously experienced motion sickness in virtual environments. In general, participants did not have much experience with VR. All participants were tested individually.

    6.2 Virtual Course Scenario

The virtual environment consisted of a simple suburban scene, with houses, trees and a car parked on the side. A bird's eye view of the complete scene can be seen in Figure 6.1.

Figure 6.1: Image showing a bird's eye view of the complete virtual environment. The participant's car starts on the bottom left side arm and drives around the center building three times.

In all experiments, the vehicle started on the bottom right road of the virtual environment, as indicated in black in Figure 6.2. All experiments also had the car follow the same path, circling around the house in the center (see Figure 6.1), as indicated in green in Figure 6.2. Each run consisted of three drives; after one circle, i.e. one condition, the car would back up into the starting position for recalibration. The three conditions were:

Normal condition The car drove around the block with no other entity in the virtual environment interacting with the participant or car.

Open Door condition The car drove around the block and encountered a second car that was parked on the side of the road. The driver-side door of the parked car opened into the path of the participant's autonomous vehicle right before it would pass by.

Cutoff condition During the drive around the block, the participant's car would be cut off by a second car that quickly drove past. Sometimes, if no action was taken by the participant, it would hit the participant's car. This encounter happened at the second turn the participant's car would take.

The different conditions are marked in Figure 6.2: green for the normal condition and yellow for the remaining two conditions. The participants were told that the focus of the study was on testing autonomous car behaviors, to create a convincing story for their participation and to motivate the different conditions. The participants' reactions and behaviors towards the autonomous vehicle are not further investigated in this study. However, the design and structure of this study could be used as a starting point when using VR-in-Car to address these aspects.

Figure 6.2: The virtual course overlaid on a satellite map obtained from Google Maps. The black box is the starting location, conditions one and two are marked by the yellow cars, and the white outline is the shape of the road. The gray boxes indicate the approximate locations of buildings in the virtual scene.

    6.3 Physical Course

The virtual course spanned a space of 45 by 26 meters; the physical space that was used was 74 by 41 meters. The padding between the virtual and physical space was necessary as a safety margin and to leave enough room to avoid obstacles. The parking lot was selected after surveying different open spaces on Google Maps in and around the university campus. In the end, the choice fell on the parking lot in front of the local ice rink¹.

All experiments were run in the month of August, during which the ice rink was still closed. The only other traffic that sometimes appeared during the tests were driving schools practicing driving. They did not, however, disrupt the experiments.

¹WGS84 location: 52°14'25"N 6°49'56"E

6.4 Experiment Procedure

The experiment started with the experimenter introducing the experiment task with a written text (see Appendix B.2) to the potential participant. If the participant understood the information and fell within the screening parameters, she or he was instructed to read and sign the consent form.

After the participant had read the information sheet and given consent to partake in the experiment, he or she was introduced to the experiment driver. Then participant, wizard and experiment driver entered the vehicle and drove to the starting point of the course.

At the starting point, the gaming steering wheel was installed in front of the participant. After that, the HMD was handed to the participant, who was instructed to strap it to their head. At this point, the experimenter started following the protocol to set up and run the experiment as laid out in Appendix B.1.

Prior to the experiment, a calibration phase was run, which required the participant to look forward and to put their hands on the steering wheel. This calibration took 10 seconds, during which the participant should move as little as possible.

The participant was then given some time to get acclimated to their new visual environment, which is a standard procedure for this type of head-mounted VR research. This step was also necessary to let the participant feel comfortable. Typically, a car is an environment which requires attention and oversight. In this experiment, it was important to make sure that the participant felt comfortable enough to fully focus on the virtual driving experience.

After the acclimatization phase, the task instructions were read out loud to the participant. Then the actual experiment started. The experiment driver drove around the trained course across the parking lot. This course led the virtual car around the central virtual house three times, each condition once, as described in Section 6.2.

Once all conditions were completed, the participant was asked to remove the HMD, followed by a short pause to let them adjust to the real environment.

The experiment concluded with a semi-structured interview (steps 14-22 in Appendix B.1), which was audio and video recorded. On the drive back to the pick-up location the participant had the chance to ask and discuss any questions they might have about the experiment.

This experiment was approved by the ethics board of EWI on the 24th of August 2017.

The total duration of the experiment was approximately 30 minutes for each participant.

6.5 Analysis

The analysis of this study was based on the responses to the semi-structured interview at the end of the experiment. The transcribed responses were compared to find common themes, strong and weak points, as well as comments regarding the quality of the experience.

The analysis focused especially on the motion experience and potential motion sickness. Of particular interest was also whether the participants were able to suspend their disbelief and trust, and whether they acted as if they were in an autonomous vehicle.

This open type of semi-structured interview was important in order to identify potentially hidden issues or inconsistencies that could threaten the integrity of the experience or the methodology.

Two out of the six responses were lost due to a power failure in the recording equipment. These responses could not be included.

Chapter 7

    Results

The results of the validation study are based on the post-test interviews with the participants. While not generalizable, they can still give an indication of how well the system performed in creating the illusion of driving in an autonomous vehicle.

The results showed that most (5 out of 6) participants had an easy time suspending their disbelief and accepting the VR environment as their new surroundings, and that none of them felt sick or unwell during the experience. Furthermore, the successful completion of the validation study indicates that this method can be safe and usable.

    7.1 Motion Mirroring and Orientation

The motion matching, or mirroring, is the essential part of this new methodology, enabling the simulated but immersive experience for the participant. The mirrored motion needs to function well enough to add to the immersive experience and to reduce induced (simulator) motion sickness.

The motion perception of the participant was therefore directly addressed in the post-test interview. The results showed that none of the participants felt motion sick during or right after the experiment. One participant mentioned:

    "I am not dizzy or nauseated but maybe if I do it for a longtime. I had it thefirst time when I was popping in and out of thecar, that was not pleasant."Mid 20s, Master student, male

The popping in and out of the car that the participant mentioned happened during the calibration phase, where the viewpoint of the participant was moving relative to their hands. This typically should not happen, and will not happen if the procedure is correctly followed.

Generally, the matched motion between the physical and virtual car was perceived well by the participants and functioned sufficiently well to not induce motion sickness.

That said, most participants (5 out of 6) reported feeling unwell about one hour after the experiment. Why that happened is not particularly clear and will be addressed in the discussion section.

For most participants, the motion mirroring was initially a strange or new experience when the experiment driver started driving. However, this new experience was accepted relatively quickly, to the point where only one participant mentioned the motion of the vehicle during the experiment.

During the interview, participants were specifically asked how they perceived the motion of the vehicle. The answers to this question varied greatly between participants. Some commented directly on the motion of the vehicle in virtual reality:

    "It felt like the car really turned, it really turns like inreal life and virtualreality." Mid 20s, Bachelor student, male

    Some other responses addressed the driving behavior of thevirtual autonomous vehicle.

    "Smooth a bit reckless at points. Yea but other then that, itwas ok." Early20s, Bachelor student, female

And while the motion of the virtual vehicle matched the motion of the physical car, the surface shapes of the physical and virtual world were not exactly the same: the parking spots on this testing ground are slightly elevated, and this was not translated to the virtual surface. This inconsistency was noticed by some participants.

    "I liked the physical motion. It wasn’t as in sync as it shouldbe I guess. Iwas quite impressed that the bumping of the car wasalso translated to mysimulation." Mid 20s, Master student, male

While this participant negatively noticed this disconnect between the physical and virtual world, some others did not seem to notice.

    "It was really interesting to feel the real road, which feelsreal in the VRworld where everything is fake." Mid 20s, Bachelorstudent, male

Another important aspect is the disorientation that one participant felt during the experiment. The participant felt disoriented after each condition, to the extent that he had to briefly remove the VR headset. He explained that lifting the headset was necessary for him to regain a sense of orientation in the physical world. This indicates that the participant was not fully immersed in the environment.

7.2 User Experience

After strapping on the HMD, participants had a few moments to get used to the virtual environment. Most participants were surprised by their tracked hands, which was reflected in extensive observation and turning of their hands. For many, the next logical step was to grab the virtual steering wheel that was right in front of them. This was a useful procedure on two levels: first, the hands needed to be on the steering wheel for the calibration, and second, it gave the participant a direct way to interact with the virtual world, helping them to accept this new environment.

When the car started driving, most participants were initially surprised by the motion of the virtual car, but all participants quickly accepted this matched motion. Directly after the initial surprise, they started to focus on events and objects present in the virtual environment.

Even though all participants had been informed that the wizard driver would be driving the car, most of them (five out of six) still grabbed the steering wheel to intervene at some point during the experiment. Three of them steered only at specific sections when it was necessary, whereas the other two participants steered almost constantly.

In some cases, the designed conditions indeed elicited the intended reflexive reactions from the participant. In reaction to the second condition, one participant reported:

    "The second car really that just came from the left by surprise.That wasreally a reflex to grab the steering wheel." Mid 20s,Master student, male

Two out of the six participants reacted to either of the scenarios by steering away or by mentioning it to the experimenter. However, this behavior was not recalled during any of the other interviews.

    7.3 Sound

Sound was a naturalistic modality in this implementation. The main sound came from the engine and the car itself. While, as mentioned above, other cars, especially from driving schools, were present, they did not come close enough or were not loud enough to be registered inside the car. Participants did not explicitly comment on the sound.

    7.4 Feedback

Participants were encouraged during the post-test interview to elaborate on aspects of the experience that they did not like or appreciate. All participants had different comments about what they did not like. Besides the aforementioned disorientation and VR headset quality, two critical comments focused on aspects of the VR-in-Car system.

    One participant mentioned that:

    "I am missing the pedals. After the last round [the experimentdriver] wasparking very close to the other car and I noticed I waspressing my foot down.Just to break. Reflex of pressing my footdown." Mid 20s, Master student,male

The interactions that the participants referenced all happened in the virtual environment. This shows a level of immersion, but at the same time it demonstrates a limit of the current proof-of-concept implementation.

Another participant mentioned the environment, and specifically the weather in the virtual environment.

    "The environment was not necessarily pleasant to go through,in." Early 20s,Bachelor student, female

    She repeated this point at the last open question of theinterview. She advised to:

    "Put up sunny weather next time its really dark and it changesthe mood."

Both comments pointed towards the rendering, the visual quality and specifically the lighting conditions of the virtual environment, and how she did not feel particularly comfortable being in that environment.

In general, however, the participants reported an engaging experience. Especially the key factor of motion mirroring was positively noted and clearly helped some participants to suspend disbelief and to act and respond as if they were in an autonomous vehicle.

Chapter 8

    Discussion

The results described above are based on the subjective impressions that the participants reported during the post-test interviews. In the following sections we discuss these findings and put them into the context of this method and future work.

    8.1 Participant Experience

From the interviews, several limitations, or points of improvement, of the system became apparent.

    8.1.1 Graphical Fidelity

The most common topic that came up was the graphical fidelity of the simulation. Three out of six participants mentioned this as an important point. The fidelity of the graphics was indeed somewhat limited in this context, as can be seen in Figure 6.1: most of the houses are more or less flat-shaded boxes without much detail.

This is an issue that can easily be improved through different strategies. One approach could be to improve the overall lighting of the scene. Recently added features in Unity include baked lighting, ambient occlusion and dynamic lights. These techniques can add a lot of visual realism to a scene by making objects blend together into one visual representation.

A different approach could be to remodel the houses and roads in one consistent style. In the proof-of-concept implementation the models are based on a variety of different styles, from realistic architectural to cartoon-like houses. A consistent model catalog would help to create a believable environment, even if it is not photorealistic.

8.1.2 Motion Sickness

An important issue that arose after the experiment was the delayed onset of motion sickness for five out of six participants. The participants were asked about their well-being during the interview, but no participant reported feeling unwell or motion sick at that time. However, most participants reported a slight sense of motion sickness about one hour after the experiment was over.

This delayed onset of motion sickness is a phenomenon that cannot yet be explained based on the data gathered during this test. This point needs to be investigated further before the method can be used for longer-form studies. A first simple way to investigate this issue would be to consistently monitor the participant after an experiment with VR-in-Car is complete.

    8.2 Technical Improvements

A simple improvement would be the addition of control pedals (accelerator, clutch and brake). In the version that was tested during the user test, only the steering wheel was available as a control surface. However, at least one participant noted the missing pedals as he tried to stop the car at one point during the experiment.

The IMU drift issue discussed in Section 4.1.1 needs to be addressed with a more robust solution. Different technologies for marker tracking and other approaches need to be evaluated to allow for longer-form studies that do not permit frequent recalibration. An ideal solution would be to fuse the different inertial measurements in the tracking algorithm for the headset. The OpenSourceVR (OSVR) platform would be suitable for such a modification, as its code is available.

In general, performance improvements would be desirable to create a more fluid environment and to rule out low frame rates as a cause of simulator sickness. In the current implementation, the most time-consuming task in each frame is compressing and sending the separate camera view for the experiment driver. This process can be optimized by synchronizing these tasks with the rest of the work happening on the GPU; it has been demonstrated how such an optimization can work in Unity¹. In general, however, a higher-performance laptop and a separate logging computer would also improve performance.

¹https://medium.com/google-developers/real-time-image-capture-in-unity-458de1364a4c

Chapter 9

    Future work

In addition to the improvements derived from the participants' responses, there are a few technical and methodological developments that will be the focus of future work on this method. While the aforementioned technical improvements focus more on enhancing the participant's experience, the main focus of future work should lie on validating this new method.

Further validation is an essential next step in establishing VR-in-Car as a research method; especially the comparison with existing simulators and simulator research is an important step. Further development should also look at the data logging stream, the integration of higher-quality sensors, and a streamlined data evaluation. Additionally, it would be worthwhile to develop quantitative measures of simulation quality, e.g. by measuring the orientation drift of the car over time.

Chapter 10

    Prospective Applications

VR-in-Car presents a new method to investigate human responses to AV-related behaviors and designs in a realistic and immersive way. The combination of immersion and low-cost implementation enables this type of AV research in more diverse settings than previously possible. In the following sections, the different new application areas that can be addressed with this new evaluation methodology are explored.

    10.1 Autonomous Vehicle Design

In traditional car design, the car interfaces, e.g. for media and environmental controls, have been kept simplistic and clear. Buttons with clear functions are optimized for cost reduction and clarity in the interface. AVs bring a new dimension to these design considerations, as these interfaces suddenly also become an important contact point for the communication between the autonomous technology and the passenger. This means that the evaluation of these interfaces becomes more context dependent and shifts from clarity and safety to entertainment and trust building.

The VR-in-Car method allows for rapid prototyping and evaluation of different interface designs in safe as well as critical traffic environments. These new types of interfaces need to be usable and understandable in any traffic situation.

    10.2 Human Behavior in Autonomous Vehicles

Once drivers are asked to relinquish control to the autonomous technology, they need a minimum level of trust in the system to drive the car safely through the traffic environment. VR-in-Car can be used to develop trust-building tasks and behavioral patterns for AVs. One simple but important example of a behavior question would be: How soon before a turn should the AV switch on the indicators?

These interactions between the passenger, the AV, and other traffic participants can easily be evaluated with this newly proposed method.

    10.3 Inertial Motion as signaling for AVs

The felt inertial motion of a vehicle is an important sensory cue that can have a great influence on how safe a passenger feels. Additionally, the inertial motion of an AV could be used, for example, to signal its intent. Accurate and flexible motion cuing is an essential part of VR-in-Car: trained wizard drivers can use different driving styles to induce the felt sense of different control parameters in the AV.

    10.4 Handover scenarios

Partially autonomous vehicles are likely going to be introduced before fully autonomous vehicles become available. These cars are capable of operating autonomously in only certain types of traffic environments, e.g. highways. The use of these vehicles implies the occurrence of handover scenarios, in which drivers need to relinquish control and, after the end of the autonomous period, regain it. These types of scenarios are often critical, especially if they happen unplanned. This new method can be used to evaluate how drivers respond in a handover scenario.

    10.5 Design for entertainment and secondary activities

    Secondary tasks like entertainment or work are expected to happen more frequently and to become the main occupation during the ride in an autonomous vehicle. The interface design of how these tasks could be incorporated is another design aspect for which VR-in-Car can be used as a prototyping and evaluation method. In particular, video games and video screens can easily be implemented in the virtual car environment and tested with users directly.


    Chapter 11

    Conclusion

    New requirements to rapidly develop and evaluate AV interface designs, together with advances in VR, have led to the proposition, implementation and validation of a new method for in-car research. VR-in-Car utilizes modern VR toolkits to create an immersive environment in which behaviors of AVs in different traffic scenarios can be tested with human participants.

    It combines established on-road and simulator methodologies to create, for the participant, the illusion of driving in an AV while in reality the car is driven by an experimenter. The multi-modal experience includes, among others, visual, acoustic, tactile and vestibular motion cues.

    The validation study of the proof-of-concept implementation produced promising results and points towards aspects that need to be improved to create a fully functioning and reliable evaluation tool. Future work should address the graphical fidelity and the sensors' drift.


    References

    [1] Hironori Akiduki, Suetaka Nishiike, Hiroshi Watanabe, Katsunori Matsuoka, Takeshi Kubo, and Noriaki Takeda. 2003. Visual-vestibular conflict induced by virtual reality in humans. Neuroscience Letters 340, 3 (2003), 197–200.

    [2] Bruce Artz, Larry Cathey, Peter Grant, Dan Houston, Jeff Greenberg, and Max Mariani. 2001. The design and construction of the visual subsystem for VIRTTEX, the driving simulator at the Ford research laboratories. In Driving Simulation Conference. 255–262.

    [3] Ed Bachelder. 2006. Helicopter aircrew training using fused reality. Technical Report. Systems Technology Inc., Hawthorne, CA.

    [4] Sonia Baltodano, Srinath Sibi, Nikolas Martelaro, Nikhil Gowda, and Wendy Ju. 2015. The RRADS platform: a real road autonomous driving simulator. ACM, New York, New York, USA.

    [5] Martin Brünger-Koch. 2005. Motion parameter tuning and evaluation for the DLR automotive simulator. 2005 DSC NA CD-ROM Proceedings (2005).

    [6] M. Bruschetta, F. Maran, and A. Beghi. 2017. A fast implementation of MPC-based motion cueing algorithms for mid-size road vehicle motion simulators. Vehicle System Dynamics 55, 6 (2017), 802–826.

    [7] Mattia Bruschetta, Fabio Maran, and Alessandro Beghi. 2017. A Nonlinear, MPC-Based Motion Cueing Algorithm for a High-Performance, Nine-DOF Dynamic Simulator Platform. IEEE Transactions on Control Systems Technology 25, 2 (2017), 686–694.

    [8] Sydne J. Carlson-Newberry, Rebecca B. Costello, et al. 1997. The Iowa Driving Simulator: Using Simulation for Human Performance Measurement. (1997).

    [9] Sergio Casas, Marcos Fernández, and José V. Riera. 2017. Four Different Multimodal Setups for Non-Aerial Vehicle Simulations—A Case Study with a Speedboat Simulator. Multimodal Technologies and Interaction 1, 2 (2017), 10.

    [10] Mehmet Dagdelen, Gilles Reymond, Andras Kemeny, Marc Bordier, and Nadia Maïzi. 2009. Model-based predictive motion cueing strategy for vehicle driving simulators. Control Engineering Practice 17, 9 (2009), 995–1003.

    [11] Thomas A. Dingus, Sheila G. Klauer, Vicki L. Neale, A. Petersen, Suzanne E. Lee, J. D. Sudweeks, M. A. Perez, J. Hankey, D. J. Ramsey, S. Gupta, et al. 2006. The 100-car naturalistic driving study, Phase II—results of the 100-car field experiment. Technical Report.

    [12] J. S. Freeman, G. Watson, Y. E. Papelis, T. C. Lin, A. Tayyab, R. A. Romano, and J. G. Kuhl. 1995. The Iowa driving simulator: An implementation and application overview. Technical Report. SAE Technical Paper.

    [13] Philipp Hock, Sebastian Benedikter, Jan Gugenheimer, and Enrico Rukzio. 2017. CarVR: Enabling In-Car Virtual Reality Entertainment. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. ACM, 4034–4044.

    [14] Robert J. K. Jacob, Audrey Girouard, Leanne M. Hirshfield, Michael S. Horn, Orit Shaer, Erin Treacy Solovey, and Jamie Zigelbaum. 2008. Reality-based interaction: a framework for post-WIMP interfaces. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 201–210.

    [15] Nico Kaptein, Jan Theeuwes, and Richard Van Der Horst. 1996. Driving simulator validity: Some considerations. Transportation Research Record: Journal of the Transportation Research Board 1550 (1996), 30–36.

    [16] Andras Kemeny and Francesco Panerai. 2003. Evaluating perception in driving simulation experiments. Trends in Cognitive Sciences 7, 1 (2003), 31–37.

    [17] Markus Kuderer, Shilpa Gulati, and Wolfram Burgard. 2015. Learning driving styles for autonomous vehicles from demonstration. 2015 IEEE International Conference on Robotics and Automation (ICRA) (2015), 2641–2646.

    [18] Hoe C. Lee et al. 2003. The validity of driving simulator to measure on-road driving performance of older drivers. Transport Engineering in Australia 8, 2 (2003), 89.

    [19] Mark McGill, Alexander Ng, and Stephen Brewster. 2017. I Am The Passenger: How Visual Motion Cues Can Influence Sickness For In-Car VR. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. ACM, 5655–5668.

    [20] Gilles Reymond, Andras Kemeny, Jacques Droulez, and Alain Berthoz. 2001. Role of lateral acceleration in curve driving: Driver model and experiments on a real vehicle and a driving simulator. Human Factors 43, 3 (2001), 483–495.

    [21] Dirk Rothenbücher, Jamy Li, David Sirkin, Brian Mok, and Wendy Ju. 2015. Ghost driver: A platform for investigating interactions between pedestrians and driverless vehicles. In Adjunct Proceedings of the 7th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, AutomotiveUI 2015, Stanford University, Palo Alto, United States. ACM Press, New York, New York, USA, 44–49.

    [22] G. Schmidt, M. Kiss, E. Babbel, and A. Galla. 2008. The Wizard on Wheels: Rapid Prototyping and User Testing of Future Driver Assistance Using Wizard of Oz Technique in a Vehicle. In Proceedings of the FISITA 2008 World Automotive Congress, Munich.

    [23] Jonathan Stevens, Peter Kincaid, and Robert Sottilare. 2015. Visual modality research in virtual and mixed reality simulation. The Journal of Defense Modeling and Simulation 12, 4 (2015), 519–537.

    [24] Peter Wang, Srinath Sibi, Brian Mok, and Wendy Ju. 2017. Marionette: Enabling On-Road Wizard-of-Oz Autonomous Driving Studies. In Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction. ACM, 234–243.

    Appendix A

    Data Collection Snippet


    Table A.1: Example data excerpt from one of the user validation studies, showing the timestamp and the rotation and position of the participant VR camera. The position columns are CenterEyeAnchor-xPos/-yPos/-zPos, the rotation columns are the quaternion components CenterEyeAnchor-Qx/-Qy/-Qz/-Qw, and the last column is UDPBus-CANSpeed.

    time      xPos         yPos         zPos          Qx           Qy            Qz            Qw          CANSpeed
    33.1060   -0.08095084  -0.0205598   -0.02973895   -0.1169599    0.5007497    -0.02769995   -0.8572064  0.690560
    33.2354   -0.08055592  -0.02031492  -0.02936840   -0.1155024    0.4986092    -0.02695162   -0.8586743  0.690560
    33.37512  -0.08027087  -0.02009382  -0.02921532   -0.1150438    0.4967721    -0.02627926   -0.8598207  0.690560
    33.49089  -0.07904724  -0.02037172  -0.02846303   -0.1168476    0.4847208    -0.02362210   -0.866506   0.690560
    33.61824  -0.07701690  -0.02003012  -0.02618088   -0.1158988    0.4672449    -0.02218239   -0.8762177  0.863200
    33.73357  -0.07449994  -0.02008322  -0.02546967   -0.1213208    0.4322064    -0.02299458   -0.8932805  0.8632
    33.87788  -0.06687431  -0.02030293  -0.02021930   -0.1224712    0.3277334    -0.0342998    -0.9361704  0.8632
    34.01057  -0.0425495   -0.02247774  -0.00582948   -0.127713     0.08062697   -0.01535903   -0.9884091  0.863200
    34.11627  -0.01841046  -0.02243997   0.003451466  -0.1199094   -0.07128521   -0.01002450   -0.9901715  0.863200
    34.24910   0.01168845  -0.02192903   0.01192851   -0.113892    -0.09755548    0.004145067  -0.988683   0.8632
    34.37261   0.01581646  -0.0204647    0.01330119   -0.09379995  -0.068815983   0.000719485  -0.9932096  0.8632
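    For later analysis, a log in this format lends itself to standard tooling. The following is a minimal sketch, assuming the log is exported as a CSV file with the column names of Table A.1; the file name is a placeholder:

        # Sketch of loading a VR-in-Car log for analysis; "validation_log.csv"
        # is a hypothetical export using the column names of Table A.1.
        import pandas as pd
        from scipy.spatial.transform import Rotation as R

        df = pd.read_csv("validation_log.csv")
        quats = df[["CenterEyeAnchor-Qx", "CenterEyeAnchor-Qy",
                    "CenterEyeAnchor-Qz", "CenterEyeAnchor-Qw"]].to_numpy()
        # recover the participant's head orientation as Euler angles
        df[["yaw", "pitch", "roll"]] = R.from_quat(quats).as_euler("zyx", degrees=True)
        print(df[["time", "yaw", "UDPBus-CANSpeed"]].head())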


    Appendix B

    Experiment Documents

    B.1 Procedure

    The introduction:

    1. They put on the VR-Headset

    2. Let them see their hands and look around.

    3. "Ok, Please look straight forward now and put your hands onthe steering wheel.We need about 10 seconds to calibrate yourposition so please hold still!"

    4. "Ok good!, Now can you shorty describe what you are seeing,where you are?"

    5. ...the participants describe their experience...

    6. "The autonomous car is now going to drive you around theblock, just so you knowit might fail in some cases so you maybehave to correct the cars path."

    7. Drive the first round (Normal)

    8. When stopping: "Good! Now we need to quickly recalibrate and reset, so just close your eyes for a moment."

    9. Drive the second round (Condition 1)

    10. When stopping: "Good! Now we need to quickly recalibrate and reset, so just close your eyes for a moment."

    11. Drive the third round (Condition 2)

    12. "Now we are done! so you can take the VR-Headset of."

    13. –short break– to let the participant arrive back in reality


    14. Can you explain in your own words (shortly) what you experienced?

    15. Did you notice a car rushing by or crashing into you?

    16. Did you see a car door opening from a parked car?

    17. How did you experience the motion of the car?

    18. What did you like about the experience?

    19. What did you not like about the experience?

    20. How nauseated do you feel right now?

    21. Do you feel unwell in any other way (related to the experience)?

    22. Is there anything you would like to add or mention?

    23. –––– End of experiment ––––

    24. Thank the participant for their time, and while driving back ask: "Is there something you would like to know from us?"

    B.2 Participant recruiting text

    Hello, I am looking for participants for my research project. It focuses on Autonomous Vehicles and how passengers and Autonomous Vehicles will interact in traffic in possible future scenarios. The experiment will take between 30 and 45 minutes.

    If yes, then:

    There are three requirements:

    1. Do you have a driver's license?

    2. Have you been in a traffic incident that still affects your ability to operate a vehicle normally in a normal traffic environment?

    3. Have you used/been in Virtual Reality before, and if so, did you ever feel unwell using it, specifically nausea or epilepsy?

    3a. If this does not apply: did you show any of these symptoms when playing a video game?


    Appendix C

    Motion cuing

    Motion cuing tries to address the visual-vestibular conflicts that can occur if the visuals, i.e. what the person is seeing, do not align with the motion information received by their vestibular system. For head-mounted Virtual Reality applications, the head motion has to be well measured and applied to the rendering so that it does not cause this conflict [1]. In complex driving simulators this is done through the motion base, which moves the participant and possibly the car in which they are situated. Tuning the control of a simulator's motion platform has been the focus of much prior work [16, 10, 5, 7, 6] and has also been identified as crucial to invoke more genuine driver responses [20].
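    In VR-in-Car, the real car takes the place of the motion base, so the rendering has to separate the participant's head motion from the car's motion. The following is a minimal sketch of this separation, assuming both the headset tracking and the vehicle IMU report orientations as quaternions in a shared world frame; the names are illustrative, not the thesis implementation:

        # Remove the car's own rotation from the headset pose so that only the
        # participant's head motion relative to the cabin drives the VR camera.
        # Quaternions are given as [x, y, z, w] in a shared world frame.
        from scipy.spatial.transform import Rotation as R

        def head_in_car(q_head_world, q_car_world):
            head = R.from_quat(q_head_world)  # headset orientation from VR tracking
            car = R.from_quat(q_car_world)    # car orientation from the vehicle IMU
            return (car.inv() * head).as_quat()  # head pose relative to the cabin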

    C.1 IMU evaluation and testing

    For this project, several different types of IMUs were tested to improve the performance of the system. There are various performance measures by which IMUs can be evaluated; primarily two of them are important for this project.

    First, the refresh rate. Typically, one would want to update the orientation of a tracked object at about a 1 kHz refresh rate. However, in this example 100 Hz appeared to be a high enough refresh rate for accurate tracking of the vehicle. This is likely due to the fact that the vehicle does not typically make fast orientation changes that would go unnoticed at a 100 Hz sampling rate.
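    A back-of-the-envelope check makes this plausible; the peak yaw rate assumed below is an illustrative value, not a measured one:

        # Illustrative numbers: even in an aggressive maneuver a passenger
        # car's yaw rate stays in the tens of deg/s, so at 100 Hz the
        # orientation change between two consecutive samples remains small.
        peak_yaw_rate_deg_s = 45.0  # assumed aggressive cornering value
        sample_rate_hz = 100.0
        print(peak_yaw_rate_deg_s / sample_rate_hz, "deg per sample")  # 0.45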

    Second, the yaw drift. Yaw is the rotational axis that is normal to the ground plane and thereby aligned with the reversed gravity vector. The fact that it coincides with the gravity vector makes it particularly difficult to track, as there is no good reference for it: roll and pitch can both be referenced against the gravity vector, whereas the yaw rotational axis is purely integrated from relative measurements and not referenced over time in most lower-end IMUs. Some IMUs attempt to reference yaw based on the earth's magnetic field; however, these sensors are often not accurate enough to do that reliably, especially when deployed inside of a vehicle made out of metal.
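    The asymmetry between roll/pitch and yaw can be illustrated with a standard complementary filter; the sketch below is a generic example with illustrative gains, not the fusion algorithm of any of the tested IMUs:

        # Minimal complementary-filter sketch illustrating why yaw drifts on a
        # low-cost IMU: roll and pitch are pulled back toward the gravity
        # reference on every step, while yaw can only integrate the gyro.
        import math

        def update(att, gyro, accel, dt, alpha=0.98):
            roll, pitch, yaw = att
            # integrate all three gyro rates (rad/s)
            roll += gyro[0] * dt
            pitch += gyro[1] * dt
            yaw += gyro[2] * dt  # no absolute reference: drift accumulates
            # gravity gives an absolute reference for roll and pitch only
            roll_acc = math.atan2(accel[1], accel[2])
            pitch_acc = math.atan2(-accel[0], math.hypot(accel[1], accel[2]))
            roll = alpha * roll + (1 - alpha) * roll_acc
            pitch = alpha * pitch + (1 - alpha) * pitch_acc
            return roll, pitch, yaw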

    Five different IMUs were experimented with; two of them stood out and performed well enough to track the car's orientation reliably. These IMUs were the BNO055 from Bosch and the MTi-1 from Xsens. For future projects and implementations, we would strongly advise getting a better-quality orientation tracking system like the MTi-300 from Xsens. These feature active yaw stabilization to further reduce or eliminate the yaw drift. The IMUs that did not perform well enough to map the motion were the LSM9DS1¹, the Razor IMU M0² and the UM7-LT³.

    ¹ https://www.sparkfun.com/products/13284
    ² https://www.sparkfun.com/products/14001
    ³ https://www.pololu.com/product/2763


    Appendix D

    CAN-Bus implementation

    The CAN bus, which is only accessible in some cars, is a car-internal data network where important operation information is shared between different components of the car. E.g., the Anti-lock Braking System (ABS) often uses the individual wheel speeds and other information from the CAN bus to determine if ABS needs to be engaged. CAN bus is a low-level protocol; its data rate can be as high as 1 Mbit/s. The accuracy depends on the car manufacturer; in the example implementation with a Prius V, the speed was reported in 10 m/h steps. Therefore, the CAN bus should always be used if it is accessible. This implementation was used in the Prius V setup at Stanford University.
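    To illustrate how such a speed readout could be obtained on a Linux SocketCAN interface, the following sketch uses the python-can library; the arbitration ID, byte positions and scaling are placeholders that would have to be reverse-engineered or taken from a DBC file for the specific vehicle:

        # Hedged sketch using the python-can library on a SocketCAN channel.
        # SPEED_ID, the byte slice and the 0.01 km/h scaling are assumptions;
        # the real values differ per manufacturer and model.
        import can

        SPEED_ID = 0x0B4  # hypothetical arbitration ID, not a verified value

        bus = can.interface.Bus(channel="can0", bustype="socketcan")
        for _ in range(1000):  # read a bounded number of frames
            msg = bus.recv(timeout=1.0)
            if msg is not None and msg.arbitration_id == SPEED_ID:
                raw = int.from_bytes(msg.data[5:7], "big")  # assumed byte layout
                print(f"speed = {raw * 0.01:.2f} km/h")     # assumed scaling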


    Appendix E

    Course Selection

    The underlying goal of the system is to enable the participant to feel the real motion during a drive with this VR-in-Car system. This, however, implies that a physical space is necessary to emulate these motions.

    Depending on the scenario that is to be tested, two options are available. The first option is to re-create the physical test environment, e.g. a highway entrance or a suburban environment, in the virtual environment. The behavior of the virtual car can then follow the physical car. This approach works best for evaluating primarily secondary or tertiary tasks of the driver/passenger, meaning for non-driving-related interface and behavior questions.

    The second option is to execute the study in an open, empty and paved area. These can be found in big parking lots close to stadiums, concert halls or other event locations; these parking spaces are often deserted during the day. Other options would be a designated research area like General Motors' Black Lake facility or a disused runway. This option gives the most freedom regarding the scenario and the repeatability: since these spaces are typically empty, the wizard driver can steer the vehicle more freely without having to worry about other drivers or obstacles. The main limitation for running experiments in e.g. a parking lot is the physical size and obstacles like light poles. The virtual world and the vehicle's path need to be adapted to this limitation. In most cases it is best to measure out the boundary conditions of the parking lot, then to design the experiment with that information, and finally to test and readjust the design to match any limitations, as sketched below.
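    As a sketch of that workflow, a planned path can be checked programmatically against the measured lot boundary; the coordinates, margin and use of the shapely library below are illustrative assumptions, not part of the original system:

        # Check that a planned wizard-driving path stays inside the measured
        # parking-lot boundary and clear of mapped obstacles (e.g. light poles).
        from shapely.geometry import Point, Polygon

        lot = Polygon([(0, 0), (120, 0), (120, 60), (0, 60)])  # measured corners (m)
        poles = [Point(30, 20).buffer(2.0), Point(80, 40).buffer(2.0)]  # keep-out zones

        def path_fits(path_xy, margin=3.0):
            drivable = lot.buffer(-margin)  # stay a safety margin inside the lot
            for zone in poles:
                drivable = drivable.difference(zone)
            return all(drivable.contains(Point(x, y)) for x, y in path_xy)

        print(path_fits([(10, 10), (60, 30), (110, 50)]))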


