Self-Driving Vehicles (SDVs) are considered safety-critical systems. They may jeopardize the lives of passengers in the vehicle and people on the street, or damage public property such as transportation infrastructure. According to the National Transportation Safety Board report [1] on an Uber self-driving crash, the accident was caused by the internal components of the SDV: the AI module failed to detect the victim. The autonomous system was designed to hand control back to the human driver in unmanaged areas; however, the driver was distracted and did not react in time.
% BibTeX
@inproceedings{Alotaibi20,
author = {Fahad Alotaibi},
editor = {Alexander Raschke and
Dominique M{\'{e}}ry and
Frank Houdek},
title = {Improving Trustworthiness of Self-driving Systems},
booktitle = {Rigorous State-Based Methods - 7th International Conference, {ABZ}
2020, Ulm, Germany, May 27-29, 2020, Proceedings},
series = {Lecture Notes in Computer Science},
volume = {12071},
pages = {405--408},
publisher = {Springer},
year = {2020},
url = {https://doi.org/10.1007/978-3-030-48077-6\_32},
doi = {10.1007/978-3-030-48077-6\_32},
timestamp = {Tue, 16 Jun 2020 17:18:07 +0200},
biburl = {https://dblp.org/rec/conf/asm/Alotaibi20.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}