Concept Enforcement and Modularization for the ISO 26262 Safety Case of Neural Networks
Faculty/Professorship: | Cognitive Systems |
Author(s): | Schwalbe, Gesina |
Publisher Information: | Bamberg : Otto-Friedrich-Universität |
Year of publication: | 2020 |
Pages: | 6 |
Source/Other editions: | PhD Forum at the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD 2019), 6 pp. |
Year of first publication: | 2019 |
Language(s): | English |
DOI: | 10.20378/irb-47277 |
Licence: | Creative Commons - CC BY-SA - Attribution - ShareAlike 4.0 International |
URL: | https://ecmlpkdd2019.org/submissions/phdforum/ |
URN: | urn:nbn:de:bvb:473-irb-472771 |
Abstract: | The ability to formulate formally verifiable requirements is crucial for the safety verification of software units in the automotive industry. However, this ability is severely restricted for complex perception tasks involving deep neural networks (DNNs) due to their black-box character. As a solution, we propose to identify or enforce human-interpretable concepts as intermediate outputs of the DNN. Two effects are expected: requirements can be formulated using these concepts, and the DNN is modularized, which reduces complexity and thereby eases a safety case. A research project proposal for a PhD thesis is sketched in the following. |
GND Keywords: | ISO/DIS 26262; Network; Verification |
Keywords: | ISO 26262, neural networks, formal verification, concept enforcement |
DDC Classification: | 004 Computer science |
RVK Classification: | ST 300 |
Type: | Working paper |
URI: | https://fis.uni-bamberg.de/handle/uniba/47277 |
Release Date: | 3 July 2020 |
File | Description | Size | Format |
---|---|---|---|
fisba47277.pdf | | 889.15 kB | PDF |
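
The abstract proposes exposing human-interpretable concepts as intermediate outputs of the DNN, so that requirements can be phrased over these concepts and the network is split into simpler modules. The sketch below illustrates one common way such a concept-bottleneck-style architecture can look; it is not taken from the paper, and all names, concepts, and layer sizes (`ConceptBottleneckNet`, `CONCEPTS`, the feature dimensions) are illustrative assumptions.

```python
# Minimal sketch of a concept-bottleneck-style DNN: human-interpretable
# concepts are produced as an explicit intermediate output, and the final
# decision is computed only from those concepts. All names and sizes here
# are hypothetical, not from the paper.
import torch
import torch.nn as nn

CONCEPTS = ["has_wheels", "has_headlights", "is_pedestrian"]  # hypothetical


class ConceptBottleneckNet(nn.Module):
    def __init__(self, in_features: int = 2048, num_classes: int = 10):
        super().__init__()
        # Backbone: any feature extractor (e.g. a CNN); a linear stand-in here.
        self.backbone = nn.Linear(in_features, 512)
        # Concept head: each output unit predicts one named concept, so
        # requirements can be formulated over these interpretable outputs.
        self.concept_head = nn.Linear(512, len(CONCEPTS))
        # Task head: operates only on the concept scores, which modularizes
        # the network into "perceive concepts" and "decide from concepts".
        self.task_head = nn.Linear(len(CONCEPTS), num_classes)

    def forward(self, x: torch.Tensor):
        features = torch.relu(self.backbone(x))
        concept_logits = self.concept_head(features)
        # Concepts are exposed as an explicit intermediate output.
        concepts = torch.sigmoid(concept_logits)
        task_logits = self.task_head(concepts)
        return concepts, task_logits


if __name__ == "__main__":
    model = ConceptBottleneckNet()
    concepts, logits = model(torch.randn(4, 2048))
    print(concepts.shape, logits.shape)  # (4, 3), (4, 10)
```

In this reading of the abstract's idea, the explicit concept output is where formally verifiable requirements (for example, consistency constraints between concept scores and the final decision) could attach, while the task head downstream of the bottleneck forms a separate, much simpler module to verify.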
