Trusted Digital Systems

We develop cryptographic methods, algorithms, tools, policies, and practices to safeguard privacy and security in key computation and communication processes. Using techniques from cognitive psychology, we also devise methods and tools to assess an individual's level of trust in digital systems, and we develop new verification techniques that provide evidence that digital systems can be trusted.

Key processes

Safeguarding key computation and communication processes, including:

  • human-machine symbiosis;
  • autonomous systems;
  • AI trustworthiness.

We use automated testing, verification, and abstract interpretation techniques to verify the correctness of embedded and cyber-physical systems.
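To make the abstract interpretation idea concrete, the C sketch below (an illustrative toy, not one of the group's tools) analyses a program over the interval domain: each variable is tracked as a range of possible values, and the ranges flowing out of the two sides of a conditional are merged with a join, so the analysis bounds the variable on every path without ever executing the program.

    #include <stdio.h>

    /* An abstract value: the set of integers in [lo, hi]. */
    typedef struct { int lo, hi; } Interval;

    /* Abstract transfer function for addition. */
    static Interval itv_add(Interval a, Interval b) {
        return (Interval){ a.lo + b.lo, a.hi + b.hi };
    }

    /* Join: smallest interval covering both branches. */
    static Interval itv_join(Interval a, Interval b) {
        return (Interval){ a.lo < b.lo ? a.lo : b.lo,
                           a.hi > b.hi ? a.hi : b.hi };
    }

    int main(void) {
        /* Abstractly execute: x = input in [0,10]; if (...) x = x + 1; */
        Interval x = {0, 10};                     /* unknown input, known bounds */
        Interval then_branch = itv_add(x, (Interval){1, 1});
        Interval merged = itv_join(x, then_branch); /* join at the merge point */
        printf("x in [%d, %d]\n", merged.lo, merged.hi); /* prints [0, 11] */
        /* The analysis proves x <= 11 on every path, enough to
           discharge a bounds check such as x < 12. */
        return 0;
    }

The same recipe, with richer abstract domains, a transfer function per statement, and a join at every control-flow merge, is what scales such analyses to real embedded code.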

The cluster also addresses autonomous decision-making, developing techniques for the design, analysis, and runtime monitoring of trustworthy autonomous behaviour.
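As an illustrative sketch of such runtime monitoring (the property, events, and automaton here are hypothetical, not taken from the group's work), a trustworthiness property can be compiled into a small state machine that watches the running system's event trace and flags the first violation:

    #include <stdio.h>

    /* Property (illustrative): the controller must not issue an
       actuate event before it has observed at least one sense event. */
    typedef enum { EV_SENSE, EV_ACTUATE } Event;
    typedef enum { ST_INIT, ST_READY, ST_VIOLATION } State;

    /* One transition of the property automaton per observed event. */
    static State step(State s, Event e) {
        switch (s) {
        case ST_INIT:  return e == EV_SENSE ? ST_READY : ST_VIOLATION;
        case ST_READY: return ST_READY;         /* property now satisfied   */
        default:       return ST_VIOLATION;     /* violations are sticky    */
        }
    }

    int main(void) {
        Event trace[] = { EV_ACTUATE, EV_SENSE };   /* a bad trace */
        State s = ST_INIT;
        for (unsigned i = 0; i < sizeof trace / sizeof trace[0]; i++) {
            s = step(s, trace[i]);
            if (s == ST_VIOLATION) {
                printf("violation at event %u\n", i);
                return 1;
            }
        }
        printf("trace satisfies the property\n");
        return 0;
    }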

The group also develops and maintains the MAMBO open-source tool for the analysis of Arm binary programs.

We have also made significant contributions to software verification and automated reasoning, developing both the theory and the implementation techniques needed to build trustworthy software systems.

Find out more about our partnership with the Arm Research and Development team. 

Priority areas of research interest

Systems and software security

  • cryptography;
  • data and identity privacy;
  • networked and distributed system security;
  • software security;
  • systemic view of governance in cyber resilience.

Automated reasoning

  • computational logic and verification;
  • runtime verification;
  • program synthesis;
  • software testing.

Autonomous systems

  • explainable AI;
  • machine ethics;
  • autonomous decision-making;
  • system trustworthiness.

We also develop award-winning theorem provers (Vampire and iProver) and software verification and testing tools (ESBMC and JBMC) to ensure safe and secure hardware and software systems. These tools have consistently won international competitions in theorem proving, software verification, and testing over the last ten years.
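As a hedged illustration of how such verification tools are used, a bounded model checker like ESBMC takes an ordinary C program and searches for inputs that violate its assertions. The harness below is a minimal sketch written for the checker rather than for native execution; the property and layout are illustrative:

    #include <assert.h>

    /* Verification harness: the undefined nondet_uint() is, by the
       usual convention of C bounded model checkers, treated as
       returning an arbitrary unsigned value, so every index is
       explored. (The harness is not meant to be linked natively.) */
    unsigned nondet_uint(void);

    int buffer[8];

    int read_at(unsigned idx) {
        assert(idx < 8);     /* safety property the checker tries to refute */
        return buffer[idx];
    }

    int main(void) {
        return read_at(nondet_uint());
    }

On this harness the checker reports the assertion as reachable, together with a concrete violating index (any value of idx >= 8); invocation is typically a single command such as esbmc harness.c, the file name here being illustrative.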

We have strong links with industry, including collaborations with Arm, Intel, Microsoft, AWS, NASA, and THG.

Research grants

Some of our research grants include:

Standards and regulation

Members of the cluster have a significant role in standards development across a range of relevant activities:

  • British Standards Institution AMT/10 committee on Robotics (the UK mirror of ISO/TC 299).
  • British Standards Institution BS 8611 standard on Robot Ethics.
  • IEEE P7001 standard on Transparency of Autonomous Systems.
  • IEEE P7009 standard on Fail-Safe Design of Autonomous Systems.
  • IEEE Technical Committee on the Verification of Autonomous Systems.
  • British Standards Institution BS 10754 standard on Information Technology – Trustworthy Systems.
  • British Standards Institution committee IST/15 on Software Engineering.
  • The IASME Governance Standard for Information and Cyber Security.

People

Cluster Lead: Dr Mustafa Mustafa.

Members: Dr Richard Banach, Dr Louise Dennis, Prof Clare Dixon, Prof Daniel Dresner, Prof Michael Fisher, Dr Konstantin Korovin, Dr Christos Kotselidis, Prof Mikel Luján, Dr Lucas Cordeiro, Dr Pierre Olivier, Dr Giles Reger, Dr Ning Zhang, Dr Youcheng Sun, Dr Bernardo Magri, Prof John Goodacre, Dr Renate Schmidt, Dr Ser-Huang Poon, Dr Emily Collins, Dr Marie Farrell, Dr Edoardo Manino, Dr Haiyu Li, Dr Alex Creswell, Dr Gaven Smith, Dr Mahdieh Sadabadi.