Dynamic Assurance Cases for Autonomous Adaptive Systems

  • Cyber security: hardware and software
  • PhD
  • Paris-Saclay
  • Level 7
  • 2023-10-01
  • MRAIDHA Chokri (DRT/DILS//LSEA)

Providing assurances that autonomous systems will operate in a safe and secure manner is a prerequisite for their deployment in mission-critical and safety-critical application domains. Typically, assurances are provided in the form of assurance cases: auditable, reasoned arguments that a high-level claim (usually concerning safety or another critical property) is satisfied, given a body of evidence about the context, design, and implementation of a system. Assurance case development is traditionally an analytic activity carried out offline, prior to system deployment, and its validity relies on assumptions and predictions about system behavior, including its interactions with its environment. However, it has been argued that this approach is not viable for autonomous systems that learn and adapt in operation. The proposed PhD will address the limitations of existing assurance approaches by developing a new class of security-informed safety assurance techniques that continually assess and evolve the safety reasoning, concurrently with the system, to provide through-life safety assurance. That is, safety assurance will be provided not only during initial development and deployment, but also at runtime, based on operational data.
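To make the idea concrete, the dynamic assurance case described above can be pictured as an argument tree whose top-level claim is re-evaluated as runtime evidence changes. The following is a minimal illustrative sketch, not part of the proposal itself; all class and field names (`Claim`, `Evidence`, `update_evidence`, the example claims) are hypothetical choices made for this example.

```python
# Illustrative sketch of a dynamic assurance case: a claim tree whose
# validity is re-evaluated as operational (runtime) evidence arrives.
from dataclasses import dataclass, field


@dataclass
class Evidence:
    name: str
    valid: bool  # does current operational data still support this item?


@dataclass
class Claim:
    statement: str
    evidence: list = field(default_factory=list)
    subclaims: list = field(default_factory=list)

    def holds(self) -> bool:
        # A claim holds only if all its evidence items and subclaims hold.
        return (all(e.valid for e in self.evidence)
                and all(c.holds() for c in self.subclaims))


def update_evidence(claim: Claim, name: str, valid: bool) -> None:
    """Propagate a runtime observation into the assurance case."""
    for e in claim.evidence:
        if e.name == name:
            e.valid = valid
    for c in claim.subclaims:
        update_evidence(c, name, valid)


# Toy case: overall safety rests on a perception claim and a planning claim.
perception = Claim("Obstacle detection meets its accuracy target",
                   evidence=[Evidence("detection_accuracy_ok", True)])
planning = Claim("Planner respects the safety envelope",
                 evidence=[Evidence("envelope_monitor_ok", True)])
top = Claim("The autonomous system is acceptably safe",
            subclaims=[perception, planning])

print(top.holds())  # True: all evidence currently holds
update_evidence(top, "detection_accuracy_ok", False)  # runtime drift observed
print(top.holds())  # False: the case is invalidated and must be reassessed
```

The point of the sketch is the last two lines: when operational data undermines one evidence item, the invalidation propagates up to the top-level claim automatically, which is the behavior a through-life assurance approach must provide, rather than freezing the argument at deployment time.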

Engineering degree or Master 2 in computer science

