Uncertainty, modeling and safety assurance: towards a unified framework

Abstract

Uncertainty occurs naturally in software systems, including those that are model-based. When such systems are safety-critical, they need to be assured, e.g., by arguing that the system satisfies its safety goals. But how can we rigorously reason about assurance in the presence of uncertainty? In this paper, we propose a vision for a framework for managing uncertainty in assurance cases for software systems, and in particular, for model-based software systems, by systematically identifying, assessing and addressing it. We also discuss a set of challenges that need to be addressed to realize this framework.

Publication
In 2020 IEEE Seventh International Workshop on Artificial Intelligence for Requirements Engineering (AIRE)
Mona Rahimi