Autónomo

A Virtual Reality Simulation
of a Self-Driving Car

NTT Data


Abstract

Simulating the experience of riding in a self-driving car by using Virtual Reality and Hand Tracking

As self-driving cars (or autonomous vehicles) reach the point where they are as good as, or better than, human drivers, it is important that people understand how autonomous cars work and what it feels like to ride in one. Autónomo does this by simulating the experience of riding in a self-driving car using Virtual Reality (VR) and Hand Tracking. Together, VR and Hand Tracking create a very realistic experience of riding in a self-driving car, at a lower cost than test rides. The application is also designed to appeal to a variety of training and learning styles and eliminates any risk and safety concerns. Potential adopters of this technology include car dealerships, which can use the application as a substitute for test rides - which are costly, time-consuming, and hence much less scalable. Similarly, potential buyers of self-driving cars can use Autónomo to experience riding in a self-driving car before buying one.

Our solution uses an Oculus Quest headset to simulate a feature-complete experience of a Level 5 autonomous car and our application is written using Unity and C#.

Virtual Reality

Get the complete self-driving experience - move around and look around just like in a real-life setting, interact with elements of a fully autonomous (Level-5) car, customise these elements to your preference, understand how the car will always choose the most optimal route, and much more - all in a Virtual Reality (VR) environment.

The Hand Tracking Experience

Interact with the simulation just like you would interact with a real object - choose your destination with a single touch on a map, and use gestures to enter/exit the simulation, control the entertainment system (which has a music library and a simple game), and choose and customise your car.

A Feature-Complete Simulation

Choose a car you like, customise its exteriors and interiors, choose a destination, look around a complete city environment (with roads, bridges, buildings, traffic lights, and more), play music, games, and more on the entertainment system, and most importantly, sit back and relax - the car drives on its own! Autónomo emulates and simulates a feature-complete self-driving experience.

A New Car Buying Experience

Autónomo is built to simulate a new car buying experience. Gone are the days when you had to visit multiple dealerships, kick tyres, and drive around the block to buy a car. With Virtual Reality (VR) and Hand Tracking, Autónomo allows users to view information (description, specifications, and performance figures) about the car while also simulating the rider experience all in the comfort of your own space!

Requirements

To gather and refine the requirements of the project, we liaised with our client, NTT Data, until common ground was found and realistic goals were set. We conducted one in-person meeting; all other meetings were held virtually over Skype.

MoSCoW List of features

Must Have

  •   A user-friendly interface to select a car
    • User-friendly elements include buttons for choosing a color for the car and arrows for navigating car options
    • Users must also be able to view information about a car - its specifications, description, and performance figures
  •   A simulation of interacting with elements of the car
    • Track user's hands and give visual feedback (if applicable) when interacting with elements of the car (like the steering wheel and other parts of the car that are visible to the user)
  •   A feature-complete simulation of riding in a self-driving car
    • Users must be able to see the car moving from a start point to a destination along a specific track
    • The car must have basic elements of a car’s interiors; must have a driver's seat, a front-passenger seat, and a Dashboard/instrument panel with at least a speedometer

Should Have

  •   A user-friendly interface to select a car
    • Should allow users to select a car using Hand Tracking features like Gestures
    • Users should be able to use gestures to switch between cars, view information about the cars, and also select a car
  •   An interface for customising the car
    • A menu that lists out all available cars with additional customising options for the car's exteriors (like the colour of the car and additional artistic elements like stickers)
  •   A feature-complete simulation of riding in a self-driving car
    • Users should be able to interact with the Dashboard/instrument panel (using Hand Tracking) and receive visual feedback
    • Users should be able to look around (must be free to rotate and change the view) just like looking around in a real-life setting
    • The application should have a feature-complete environment for the simulation similar to that of a city - with buildings, roads, highways, roundabouts, bridges etc.

Could Have

  •   An interface for customising the car
    • Could have an option to customise the interiors of the car like customising the colour of the seats, the information displayed on the Dashboard/Instrument panel (could add fuel level and oil pressure along with a broad array of gauges and controls), along with options for climate control (like regulating the temperature of the car and the fan speed of the air conditioner)
  •   Options for customising the route that the car will take
    • Could have a pathfinding algorithm that navigates the car from a starting point to a destination
    • Could have options that allow the user to choose a starting point, a destination, along with options for alternate routes and the estimated time each route will take
  •   A feature-complete simulation of riding in a self-driving car
    • A navigation system for the car - a virtual map with real-time updates of the car’s position
    • The car could also have additional interior elements that may be found in fully autonomous (Level-5) cars in the future
  •   An entertainment system for the car
    • Could have a simple game and a music system
    • The car could also have additional interior elements that may be found in fully autonomous (Level-5) cars in the future

Won't Have

  •   A Deep Learning algorithm that trains cars to navigate road courses

Our Users


I have always been fascinated by the idea of self-driving cars and want to experience the feeling of riding in one before buying one for myself.



John Doe, Senior Asset Manager

I have been thinking about using Virtual Reality as a substitute for test-rides for a while now and would like to see how I could use a VR simulation of a self-driving car as a selling point for potential buyers.

Luka Rakitic, Owner of a Luxury Car Dealership

Use-Case Diagram

Research

These are some of the projects Autónomo is inspired by.

In IEEE format

  Related Projects

  • City Car Driving[1] - Car driving simulator game on Steam
  • CARLA[2] - autonomous driving research
  Comparing the two

                       City Car Driving                       CARLA
Device                 Works on both VR headset and PC [3]    Compatible only with PC
Programming Language   Unity and C#                           C++ and Python [4]
Game Engine            Nebula                                 Unreal 4

  Final decision

We selected City Car Driving as our primary reference since it met almost all of our project requirements; even though City Car Driving is not exactly a self-driving car simulator, it still helped us create our UI design, implement our car design, and choose other Unity assets that are part of our simulator. For our final deliverable, we decided to use an Oculus Quest. We chose Oculus for development because Oculus supports both Unity and Unreal. The Oculus Quest also supports precise tracking, excellent hand controls, and room-scale play that lets users actually move in a virtual world, without the restrictive tethering cables associated with a PC[5].


These are some of the frameworks Autónomo is built on top of.

  Assets

  Why we chose these Assets

Asset Rationale for use
#067 Sportscar
  • Lightweight and easy-to-use with very detailed documentation
  • Customisable interiors - fulfills one of the features of our project
  • Made for simulators and racing games
Realistic Mobile Car #06
  • Very detailed interiors
  • Comes with a customisable Dashboard ready for animation - fulfills one of the features of our project
  • Emissive texture for lights included
  • Easy-to-understand documentation
Urban Construction Pack
  • Really good looking prefabs and objects
  • Made for creating optimized city environments for VR projects - exactly what our project needed
  • Very modular with reusable components
  • A scripted traffic light system, walls, sidewalks, poles, signs, and stairways - fulfilled most of our requirements for a city environment
Ultimate Sci-Fi UI Bundle
  • Easily editable components with adjustable layers
  • Design goes well with the overall persona of Autónomo

  References

Algorithms

At the heart of Autónomo lies an efficient pathfinding algorithm



Autónomo uses a variation of the A* Search algorithm to find the shortest path between two points. A flexible and efficient graph-traversal algorithm, A* Search is the most popular choice for pathfinding in most applications. Its superiority over other popular graph-traversal algorithms, like Depth-First Search (DFS) and Breadth-First Search (BFS), comes down to two reasons. First, unlike Depth-First Search, A* Search is guaranteed to find the shortest path (provided its heuristic never overestimates the true remaining cost). Secondly, with the help of the heuristic function, A* Search is likely to reach the destination faster than Breadth-First Search, whose worst case (O(|E|) = O(b^d)) can be very time-consuming.
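The project itself is written in C# for Unity, but the idea translates to any language. Below is a minimal, self-contained Python sketch of A* on a small weighted road graph; the graph, node names, and the straight-line (Euclidean) distance heuristic are illustrative assumptions, not the project's actual road network.

```python
import heapq

def a_star(graph, coords, start, goal):
    """Shortest path on a weighted graph using A* search.

    graph:  dict mapping node -> list of (neighbour, edge_cost)
    coords: dict mapping node -> (x, y), used by the heuristic
    The heuristic is straight-line distance, which never overestimates
    the true road distance, so the returned path is optimal.
    """
    def h(n):
        (x1, y1), (x2, y2) = coords[n], coords[goal]
        return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5

    # Priority queue of (f = g + h, g, node, path so far).
    open_set = [(h(start), 0.0, start, [start])]
    best_g = {start: 0.0}
    while open_set:
        f, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path, g
        for neighbour, cost in graph.get(node, []):
            new_g = g + cost
            if new_g < best_g.get(neighbour, float("inf")):
                best_g[neighbour] = new_g
                heapq.heappush(
                    open_set,
                    (new_g + h(neighbour), new_g, neighbour, path + [neighbour]),
                )
    return None, float("inf")  # goal unreachable

# A toy road network: four junctions on a unit square.
coords = {"A": (0, 0), "B": (1, 0), "C": (1, 1), "D": (0, 1)}
graph = {
    "A": [("B", 1.0), ("D", 1.0)],
    "B": [("C", 1.0)],
    "D": [("C", 3.0)],  # longer edge, so the A-B-C route wins
}
path, dist = a_star(graph, coords, "A", "C")
print(path, dist)  # ['A', 'B', 'C'] 2.0
```

Because the heuristic steers the search toward the goal, A* typically expands far fewer nodes than BFS on a city-sized road graph, which is what keeps a single-frame path computation feasible.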

Choosing an efficient pathfinding algorithm is very important to Autónomo's functionality because the shortest path is calculated within a single frame, which means an inefficient pathfinding algorithm could cause the simulation to lag and, in the worst case, crash. Currently, A* Search performs well: at the current size of the city environment, we have not encountered any low-FPS or other lag problems during testing.

Human Computer Interaction (HCI)

Autónomo's design is based on the principles of visibility, simplicity, and consistency

Car Interior Design

Car Selection Page Design

Car Selection Prototype

Car Interiors (Prototype)

Navigation Panel Sketch

Climate Control System Sketch

Map (Final Release)

City Map Prototype

Road Design (Final release)

Car Interiors (Final Release)

Buildings Design (Final release)

System Architecture

Testing

Our testing environment(s)

There are two different testing environments for our software - the Unity Editor and the Oculus Quest headset. Most of our unit and integration tests, however, were carried out in the Unity Editor, for a couple of reasons: first, the process of loading the game onto the Quest headset is very time-consuming. Secondly, the headset, unlike the editor, has no built-in Unity debugger, so when an error occurs we cannot see the error message or any other details unless we display the message on a game object (like a Canvas).

However, some tests could only be carried out on the Quest headset - for example, checking whether the Hand Tracking feature or the Gesture Detection system was working correctly.

Unit tests

We used the in-built Unity Test Runner for our unit tests, or simply ran the game and checked for error messages. Unit tests, in the context of our project, refer to the testing of individual Unity prefabs (the Navigation Page prefab), scripts (the Navigator script, the Traffic Light Control scripts), and methods (the Navigator.FindShortestRoute() method - our pathfinding algorithm, for example).
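The real tests run inside the Unity Test Runner in C#; as a rough, language-agnostic illustration of the kind of properties we assert about Navigator.FindShortestRoute(), here is a short Python sketch in which find_shortest_route is a hypothetical stand-in for that method (the graph and the Dijkstra-style stand-in are illustrative assumptions only):

```python
import heapq

def find_shortest_route(graph, start, goal):
    """Stand-in for the real method: uniform-cost search on a tiny
    weighted graph, returning the cheapest path or [] if unreachable."""
    queue, seen = [(0, start, [start])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph.get(node, []):
            heapq.heappush(queue, (cost + w, nxt, path + [nxt]))
    return []

# Three properties every computed route must satisfy:
graph = {"A": [("B", 1), ("C", 5)], "B": [("C", 1)]}
route = find_shortest_route(graph, "A", "C")
assert route[0] == "A" and route[-1] == "C"         # starts and ends correctly
assert route == ["A", "B", "C"]                     # prefers the cheaper detour
assert find_shortest_route(graph, "C", "A") == []   # unreachable -> empty route
```

Asserting on properties of the route (endpoints, optimality, graceful failure) rather than on internal state keeps the tests stable even when the pathfinding implementation changes.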

Integration tests

Integration tests refer to the testing of interactions/connections between game prefabs and broader functionality that requires scripts to work together. We carried out integration tests by playing the game in the Unity Editor as well as on the Oculus Quest headset to check whether the system met our requirements. For example, while exploring the car-interior UI we found that jumping from the music canvas to the game canvas stopped the music - a functionality bug.

System tests

System tests took place at the last stage of the development after our game passed all the unit and integration tests. We performed system tests on Oculus Quest only to make sure the overall user experience is delightful.

Performance tests

We performed performance testing on some of our features to check if there were any performance issues that could affect user experience - most of them were on the level of integration tests.

Evaluation

Achievements

ID Requirements Priority State Contributors
1 A User-friendly interface that allows the user to choose a car Must Yifan
2 A custom, indoor-like VR environment to view and select the car Must Yifan
3 User must be able to view information about a car - its specifications, description, and performance figures Must Yifan
4 A feature-complete simulation of riding in a self-driving car - users must be able to at least see the car moving autonomously from a start point to a destination along a specific track Must Xuyou
5 The car must have basic elements of a car’s interiors; must have a driver's seat, a front-passenger seat, and a Dashboard/instrument panel with at least a speedometer Must Xuyou
6 Users should be able to interact with the Dashboard/instrument panel (using Hand Tracking) and receive visual feedback Should Xuyou
7 Users should be able to look around (must be free to rotate and change the view) just like looking around in a real-life setting Should Xuyou
8 The applications should have a feature-complete environment for the simulation similar to that of a city - with buildings, roads, highways, roundabouts, bridges etc. Should Abir
9 An interface for customising the car's interiors Could
10 A simple user-interface for customising the car's exteriors with customising options like choosing the colour of the car and adding stickers/decals Could Yifan
11 A navigation system for the car - a virtual map with real-time updates of the car’s position Could Xuyou
12 Options for customising the route that the car will take - could have a pathfinding algorithm that navigates the car from a starting point to a destination Could Xuyou
13 An entertainment system for the car - with a simple game, a music system, and climate control options - that is controlled using touch and gestures Could Xuyou
Key functionality (Must have and Should have) 100% completed
Optional functionality (Could have) 80% completed

Individual Contribution

Work packages Xuyou Abir Yifan
Project Partners Liaison 33% 33% 33%
Requirement Analysis 33% 33% 33%
Research and Experiments 33% 33% 33%
UI Design 50% 0% 50%
Programming 60% 10% 30%
Testing 100% 0% 0%
Bi-weekly Reports 33% 33% 33%
Report Website 0% 80% 20%
Poster Design 0% 20% 80%
Video Editing 50% 25% 25%
Overall Contribution 33% 33% 33%
Main Roles Front-end and Back-end Developer, UI Designer Front-End Developer, Documentation Writer, Researcher, Elevator Pitch UI Designer, Front-end and Back-end Developer, Researcher

Critical Evaluation

User interface and user experience

UI elements in our VR environment include a hologram for choosing a car, buttons for customising the car, and gesture detection for controlling the entertainment system, the navigation system, and the climate inside the car. These UIs are generally understandable, as most are designed to look like the phone apps people use regularly (music player, maps, etc.). Moreover, users interact with all the UIs in the VR scene by touch and gestures instead of controllers, which is easier for those who have never used VR devices before. However, there are also some drawbacks to using Hand Tracking. Feedback from testers showed that it is hard to judge the actual distance between the user and the UI elements while using Hand Tracking; testers also reported that the position of the hands in the VR scene does not fully coincide with the actual position of their hands, so some buttons in the scene are not actually pressed when the user believes they have pressed them. However, Oculus Hand Tracking is still a beta feature, and this aspect of the project will improve with time.

Functionality

100% of the key functionality stated in our MoSCoW list is complete and fully functioning. The two main features that function but remain unstable are gesture detection and the navigation system. The gesture-detection problems are the same as those mentioned above. The navigation system works fine in most cases but can occasionally be buggy. Bugs mostly relate to the route displayed on the mini-map on the navigation panel - sometimes it does not display correctly and, in the worst case, the user could arrive at the wrong destination, although we have never encountered that scenario in testing or production.

Stability

The VR environment is generally stable, since the application is a standalone game and the actions the user can perform are quite restricted. However, the navigation system is a little unstable - the pathfinding algorithm may cause the game to freeze for a few frames, although, again, we have never encountered this scenario in testing or production.

Efficiency

The VR environment is very efficient because it is built on the sample framework provided by Oculus, which works very well on the Quest headset. The element of the project that affects efficiency the most is the city environment, which the Unity engine must render frame-by-frame and which greatly affects the game's FPS. Notes on the efficiency of the pathfinding algorithm can be found in the Algorithms section.

Compatibility

The VR environment is completely compatible with our target device - the Oculus Quest headset.

Maintainability

The product is relatively easy to maintain as it does not use any additional cloud services or resources that need to be constantly managed. The only part of the project that might need regular updating is the set of scripts that use the Oculus API, as it is recommended to update them whenever a new stable version of the Oculus API is released.

8192 Miles Simulated
and counting ...

Where To Find Us

Computer Science Department
University College London
Gower St
Bloomsbury
London WC1E 6BT