A Virtual Reality Simulation
of a Self-Driving Car
As self-driving cars (or autonomous vehicles) reach the point where they are as good as, or better than, human drivers, it is important that people understand how autonomous cars work and what it feels like to ride in one. Autónomo does this by simulating the experience of riding in a self-driving car using Virtual Reality (VR) and Hand Tracking. The use of VR and Hand Tracking creates a realistic experience of riding in a self-driving car at a much lower cost than test rides. The application is also designed to appeal to a variety of training and learning styles and eliminates any risk and safety concerns. Potential adopters of this technology include car dealerships, which can use the application as a substitute for test rides - these are costly, time-consuming, and hence much less scalable. Similarly, potential buyers of self-driving cars can use Autónomo to experience riding in a self-driving car before buying one.
Our solution uses an Oculus Quest headset to simulate a feature-complete experience of a Level 5 autonomous car, and our application is written in Unity and C#.
Get the complete self-driving experience - move around and look around just like in a real-life setting, interact with elements of a fully autonomous (Level-5) car, customise these elements to your preference, understand how the car will always choose the optimal route, and much more - all in a Virtual Reality (VR) environment.
Interact with the simulation just like you would interact with a real object - choose your destination with a single touch on a map, and use gestures to enter/exit the simulation, control the entertainment system (which has a music library and a simple game), and choose and customise your car.
Choose a car you like, customise its exteriors and interiors, choose a destination, look around a complete city environment (with roads, bridges, buildings, traffic lights, and more), play music, games, and more on the entertainment system, and most importantly, sit back and relax - the car drives on its own! Autónomo simulates a feature-complete self-driving experience.
Autónomo is built to simulate a new car-buying experience. Gone are the days when you had to visit multiple dealerships, kick tyres, and drive around the block to buy a car. With Virtual Reality (VR) and Hand Tracking, Autónomo lets you view information about a car (description, specifications, and performance figures) while also simulating the rider experience, all in the comfort of your own space!
To gather and refine the project's requirements, we liaised with our client, NTT Data, until common ground was found and realistic goals were set for the project. We conducted one in-person meeting; all other meetings were held virtually over Skype.
| City Car Driving | CARLA
---|---|---
Device | Works on both VR headset and PC [3] | Compatible only with PC
Programming Language | Unity and C# | C++ and Python [4]
Game Engine | Nebula | Unreal 4
We selected City Car Driving as the final solution since it met almost all of our project requirements; even though City Car Driving is not exactly a self-driving car simulator, it still helped us create our UI design, implement our car design, and choose other Unity assets that are part of our simulator. For our final deliverable, we decided to use an Oculus Quest. We chose Oculus for development because it supports both Unity and Unreal. The Oculus Quest also supports precise tracking, excellent hand controls, and room-scale play that lets users actually move in a virtual world, without the restrictive cable tethering associated with a PC [5].
Asset | Rationale for use
---|---
#067 Sportscar |
Realistic Mobile Car #06 |
Urban Construction Pack |
Ultimate Sci-Fi UI Bundle |
Autónomo uses a variation of the A* Search algorithm to find the shortest path between two points. A flexible and efficient graph-traversal algorithm, A* Search is the most popular choice for pathfinding in most applications. Its superiority over other popular graph-traversal algorithms, such as Depth-First Search (DFS) and Breadth-First Search (BFS), comes down to two reasons. First, unlike Depth-First Search, A* Search is guaranteed to find the shortest path (provided its heuristic never overestimates the remaining distance). Secondly, with the help of the heuristic function, A* Search is likely to reach the destination faster than Breadth-First Search, which can be very time consuming (O(|E|) = O(b^d)) in its worst case.
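To illustrate the idea, the sketch below is a minimal, self-contained A* implementation in C#. The graph representation, the integer node IDs, and the straight-line-distance heuristic are assumptions made for this example; it is not the actual Navigator.FindShortestRoute() implementation used in Autónomo.

```csharp
using System;
using System.Collections.Generic;

public class PathfindingExample
{
    // Adjacency list: node id -> list of (neighbour id, edge length).
    public Dictionary<int, List<(int to, float cost)>> Edges =
        new Dictionary<int, List<(int to, float cost)>>();
    // Node positions, used by the straight-line-distance heuristic.
    public Dictionary<int, (float x, float y)> Positions =
        new Dictionary<int, (float x, float y)>();

    float Heuristic(int a, int b)
    {
        var pa = Positions[a];
        var pb = Positions[b];
        float dx = pa.x - pb.x, dy = pa.y - pb.y;
        return (float)Math.Sqrt(dx * dx + dy * dy);
    }

    // Returns the node ids on the shortest route, or null if the goal is unreachable.
    public List<int> FindShortestRoute(int start, int goal)
    {
        var gScore = new Dictionary<int, float> { { start, 0f } };
        var cameFrom = new Dictionary<int, int>();
        // Open set ordered by f = g + h; a SortedSet stands in for a priority queue.
        var open = new SortedSet<(float f, int node)> { (Heuristic(start, goal), start) };

        while (open.Count > 0)
        {
            var current = open.Min;          // entry with the lowest f-score
            open.Remove(current);

            if (current.node == goal)
            {
                // Walk the cameFrom chain backwards to rebuild the route.
                var path = new List<int> { goal };
                while (cameFrom.ContainsKey(path[path.Count - 1]))
                    path.Add(cameFrom[path[path.Count - 1]]);
                path.Reverse();
                return path;
            }

            if (!Edges.TryGetValue(current.node, out var neighbours)) continue;
            foreach (var (to, cost) in neighbours)
            {
                float tentative = gScore[current.node] + cost;
                if (!gScore.TryGetValue(to, out float best) || tentative < best)
                {
                    cameFrom[to] = current.node;   // remember the best predecessor
                    gScore[to] = tentative;
                    open.Add((tentative + Heuristic(to, goal), to));
                }
            }
        }
        return null; // no route exists
    }
}
```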
Choosing an efficient pathfinding algorithm is very important to Autónomo's functionality because the shortest path is calculated within a single frame, so an inefficient algorithm could cause the simulation to lag and, in the worst case, crash. Currently, A* Search works well: with the current size of the city environment, we have not encountered any low-FPS or other lag problems during testing.
There are two different testing environments for our software - the Unity Editor and the Oculus Quest headset. Most of our unit and integration tests, however, were carried out in the Unity Editor. There are a couple of reasons for this decision: first, the process of loading the game onto the Quest headset is very time consuming. Secondly, the headset does not have an in-built Unity debugger, unlike the editor, so when an error occurs, one cannot see the error message or any other details about the error unless we display the message on a game object (such as a Canvas).
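To illustrate the Canvas workaround, the snippet below is a minimal sketch that mirrors Unity's log messages onto an in-scene Text element so errors become readable inside the headset. The class and field names are hypothetical and not taken from our codebase.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Attach to a world-space Canvas with a Text child so errors raised on the
// headset are visible in the scene (the headset has no attached debugger).
public class OnHeadsetLogger : MonoBehaviour
{
    [SerializeField] private Text debugText;   // illustrative field name

    void OnEnable()  { Application.logMessageReceived += HandleLog; }
    void OnDisable() { Application.logMessageReceived -= HandleLog; }

    // Prepend errors and exceptions to the in-scene text so they can be read in VR.
    void HandleLog(string message, string stackTrace, LogType type)
    {
        if (type == LogType.Error || type == LogType.Exception)
            debugText.text = type + ": " + message + "\n" + debugText.text;
    }
}
```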
However, some tests could only be carried out on the Quest headset - for example, checking whether the Hand Tracking feature or the Gesture Detection system was working correctly.
We used the in-built Unity Test Runner for our unit tests or simply ran the game and checked for error messages. Unit tests, in the context of our project, refer to the testing of individual Unity prefabs (e.g. the Navigation Page prefab), scripts (the Navigator script, the Traffic Light Control scripts), and methods (e.g. the Navigator.FindShortestRoute() method - our pathfinding algorithm).
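For illustration, the snippet below shows the shape of an edit-mode test run under the Unity Test Runner (NUnit). It exercises the illustrative PathfindingExample sketch from the pathfinding section above; a test for the real Navigator.FindShortestRoute() method would look similar but use the project's own types and scene data.

```csharp
using NUnit.Framework;
using System.Collections.Generic;

// Edit-mode unit test in the style used by the Unity Test Runner.
public class PathfindingTests
{
    [Test]
    public void FindShortestRoute_PrefersShorterOfTwoRoutes()
    {
        var graph = new PathfindingExample();
        graph.Positions[0] = (0f, 0f);
        graph.Positions[1] = (1f, 0f);
        graph.Positions[2] = (2f, 0f);
        graph.Edges[0] = new List<(int, float)> { (1, 1f), (2, 5f) }; // direct edge is longer
        graph.Edges[1] = new List<(int, float)> { (2, 1f) };

        var route = graph.FindShortestRoute(0, 2);

        // The two-hop route (total cost 2) should beat the direct edge (cost 5).
        Assert.AreEqual(new List<int> { 0, 1, 2 }, route);
    }
}
```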
Integration tests refer to the testing of interactions/connections between game prefabs and of broader functionality that requires several scripts to work together. We carried out integration tests by playing the game in the Unity Editor as well as on the Oculus Quest headset to check whether the system met our requirements. For example, while exercising the car-interior UI we found that jumping from the music canvas to the game canvas stopped the music, which was a functional bug.
System tests took place at the last stage of development, after our game had passed all the unit and integration tests. We performed system tests on the Oculus Quest only, to make sure the overall user experience was delightful.
We performed performance testing on some of our features to check for any performance issues that could affect the user experience - most of these tests were at the level of integration tests.
ID | Requirements | Priority | State | Contributors
---|---|---|---|---
1 | A user-friendly interface that allows the user to choose a car | Must | ✔ | Yifan
2 | A custom, indoor-like VR environment to view and select the car | Must | ✔ | Yifan
3 | User must be able to view information about a car - its specifications, description, and performance figures | Must | ✔ | Yifan
4 | A feature-complete simulation of riding in a self-driving car - users must be able to at least see the car moving autonomously from a start point to a destination along a specific track | Must | ✔ | Xuyou
5 | The car must have the basic elements of a car's interior: a driver's seat, a front-passenger seat, and a dashboard/instrument panel with at least a speedometer | Must | ✔ | Xuyou
6 | Users should be able to interact with the dashboard/instrument panel (using Hand Tracking) and receive visual feedback | Should | ✔ | Xuyou
7 | Users should be able to look around (must be free to rotate and change the view) just like in a real-life setting | Should | ✔ | Xuyou
8 | The application should have a feature-complete environment for the simulation similar to that of a city - with buildings, roads, highways, roundabouts, bridges, etc. | Should | ✔ | Abir
9 | An interface for customising the car's interiors | Could | ✖ |
10 | A simple user interface for customising the car's exteriors, with options like choosing the colour of the car and adding stickers/decals | Could | ✔ | Yifan
11 | A navigation system for the car - a virtual map with real-time updates of the car's position | Could | ✔ | Xuyou
12 | Options for customising the route that the car will take - could have a pathfinding algorithm that navigates the car from a starting point to a destination | Could | ✔ | Xuyou
13 | An entertainment system for the car - with a simple game, a music system, and climate control options - controlled using touch and gestures | Could | ✔ | Xuyou
Key functionality (Must have and Should have) | 100% completed | | |
Optional functionality (Could have) | 80% completed | | |
Work packages | Xuyou | Abir | Yifan
---|---|---|---
Project Partners Liaison | 33% | 33% | 33% |
Requirement Analysis | 33% | 33% | 33% |
Research and Experiments | 33% | 33% | 33% |
UI Design | 50% | 0% | 50% |
Programming | 60% | 10% | 30% |
Testing | 100% | 0% | 0% |
Bi-weekly Reports | 33% | 33% | 33% |
Report Website | 0% | 80% | 20% |
Poster Design | 0% | 20% | 80% |
Video Editing | 50% | 25% | 25% |
Overall Contribution | 33% | 33% | 33% |
Main Roles | Front-end and Back-end Developer, UI Designer | Front-End Developer, Documentation Writer, Researcher, Elevator Pitch | UI Designer, Front-end and Back-end Developer, Researcher |
UI elements in our VR environment include a hologram for the user to choose a car, buttons to customise the car, and gesture detection for users to control the entertainment system, the navigation system, and the climate inside the car. These UIs are generally understandable to users, as most of them are designed to resemble the phone apps people use regularly (music player, maps, etc.). Moreover, users interact with all the UIs in the VR scene by touch and gestures instead of controllers, which is easier for those who have never used VR devices before. However, there are also some drawbacks to using Hand Tracking. Feedback from testers showed that it is hard to judge the actual distance between the user and the UI elements while using hand tracking; testers also reported that the position of the hands in the VR scene does not fully coincide with their actual hand positions in reality, which results in some buttons in the scene not registering a press even though the user has pressed them. However, Oculus Hand Tracking is still a beta feature, and this aspect of the project will improve with time.
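For readers unfamiliar with how controller-free input like this is read, the snippet below is a minimal sketch of detecting an index-finger pinch with the OVRHand component from the Oculus Integration package. The PinchDetector class and its OnPinch event are illustrative assumptions and are much simpler than Autónomo's actual gesture-detection scripts.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Minimal sketch: fire an event once each time the tracked hand starts an
// index-finger pinch, which could then drive a UI action such as a button press.
public class PinchDetector : MonoBehaviour
{
    [SerializeField] private OVRHand hand;   // assigned in the Inspector
    public UnityEvent OnPinch;               // illustrative event, wired in the Inspector

    private bool wasPinching;

    void Update()
    {
        if (hand == null || !hand.IsTracked) return;

        bool isPinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);
        if (isPinching && !wasPinching)
            OnPinch.Invoke();                // fire only on the pinch-down edge
        wasPinching = isPinching;
    }
}
```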
100% of the key functionality stated in our MoSCoW list is completed and fully functioning. The two main features that function but remain unstable are gesture detection and the navigation system. Regarding gesture detection, the problems are the same as those mentioned above. The navigation system works fine in most cases; however, it can sometimes be buggy. The bugs are mostly related to the route displayed on the mini map on the navigation panel - sometimes it does not display correctly and, in the worst case, the user may arrive at the wrong destination, although we have never come across this scenario in testing or production.
The VR environment is generally stable because it is a standalone game and the actions the user can perform are quite restricted. However, the navigation system is a little unstable - the pathfinding algorithm may cause the game to freeze for a few frames. Again, we have never come across this scenario in testing or production.
The VR environment is very efficient because it is constructed using the sample framework provided by Oculus, which works very well on the Quest headset. The element of the project that affects efficiency the most is the city environment, which needs to be rendered by the Unity engine frame by frame and greatly affects the game's FPS. Notes on the efficiency of the pathfinding algorithm can be found here.
The VR environment is completely compatible with our target device - the Oculus Quest headset.
The product is relatively easy to maintain as it does not use any additional cloud services or resources that need to be constantly managed. The only part of the project that might need regular updates is the set of scripts that use the Oculus API, as it is recommended to update them when a new stable version of the Oculus API is released.