Trusted Digital Systems
We develop cryptographic methods, algorithms, tools, policies and practices to safeguard privacy and security in key computation and communication processes. We also devise methods and tools to assess an individual's level of trust in digital systems, using techniques from cognitive psychology, and to provide evidence of the trustworthy nature of digital systems through new verification techniques.
Key processes
Safeguarding key computation and communication processes, including:
- human-machine symbiosis;
- autonomous systems;
- AI trustworthiness.
Using automated testing, verification, and abstract interpretation techniques, we verify the correctness of embedded and cyber-physical systems.
The cluster also addresses autonomous decision-making, developing methods for the design, analysis, and monitoring of trustworthy autonomous systems.
The group also develops and maintains the MAMBO open-source tool for the analysis of Arm binary programs.
We have also made significant contributions to software verification and automated reasoning: theory and implementation techniques for writing trustworthy software systems.
Find out more about our partnership with the Arm Research and Development team.
Priority areas of research interest
Systems and software security
- cryptography;
- data and identity privacy;
- networked and distributed system security;
- software security;
- systemic view of governance in cyber resilience.
Automated reasoning
- computational logic and verification;
- runtime verification;
- program synthesis;
- software testing.
Autonomous systems
- explainable AI;
- machine ethics;
- autonomous decision-making;
- system trustworthiness.
We also develop award-winning theorem provers (Vampire and iProver) and software verification and testing tools (ESBMC and JBMC) to ensure safe and secure hardware and software systems. These tools have consistently won international competitions in theorem proving, software verification, and testing over the last ten years.
We have strong links with industry, including collaborations with Arm, Intel, Microsoft, AWS, NASA, and THG.
Research grants
Our research grants include:
- UKRI project Soteria: demonstrating the security capabilities of the Morello system in the e-commerce vertical industrial segment, under the call for ISCF Digital Security by Design: technology-enabled business-led demonstrator.
- UK RISE project 4212204/RFA 15971 rFAS: reconfigurable Accelerator Sandboxing, investigating security issues from side-channel analysis to protecting FPGAs against malicious configurations.
- EU project ELEGANT: secure and seamless edge-to-cloud analytics, under the call for software technologies.
- EPSRC project SCorCH: secure code for capability hardware, under the call for ISCF Digital Security by Design research projects.
- EPSRC project EnnCore: end-to-end conceptual guarding of neural architectures, under the call for security for all in an AI-enabled society.
- EPSRC EP/R026092 Future AI and robotics hub for space - tackling verification of reliability and security for space robotics.
- EPSRC EP/R026084 Robotics and artificial intelligence for nuclear - tackling verification of resilience, safety and security for nuclear robotics.
- EPSRC EP/N007565 Science of Sensor Systems Software programme grant - tackling verification of sensor-rich systems.
- EPSRC EP/V026801 UKRI TAS Node in verifiability - tackling verifiability issues related to Trustworthy Autonomous Systems.
- EPSRC EP/V012134 UniFaaS: A Unikernel-based Serverless Operating System.
- ERDF 15R18P02426 Greater Manchester Cyber Foundry.
- Bounded Model Checking for Verifying and Testing Ethereum Consensus Specifications.
- Google ASPIRE Fund 2021 Program: Formal Verification Driven Fuzzing of Trusty OS.
- UKRI IAA 373 EPSRC with SES Secure Ltd: Using Artificial Intelligence/Machine Learning to assess source code in Escrow.
Standards and regulation
Members of the cluster play a significant role in standards development across a range of relevant activities:
- British Standards Institution - AMT/10 (ISO 299) committee on Robotics.
- British Standards Institution BS 8611 standard on Robot Ethics.
- IEEE P7001 standard on Transparency in Autonomous Systems.
- IEEE P7009 standard on Failsafe Design of Autonomous Systems.
- IEEE Technical Committee on the Verification of Autonomous Systems.
- British Standards Institution - BS 10754 Information technology – Trustworthy Systems.
- British Standards Institution Committee IST/15 Software Engineering.
- The IASME Governance Standard for Information and Cyber Security.
People
Cluster Lead: Dr Mustafa Mustafa.
Members: Dr Richard Banach, Dr Louise Dennis, Prof Clare Dixon, Prof Daniel Dresner, Prof Michael Fisher, Dr Konstantin Korovin, Christos Kotselidis, Prof Mikel Luján, Dr Lucas Cordeiro, Dr Pierre Olivier, Dr Giles Reger, Dr Ning Zhang, Dr Youcheng Sun, Dr Bernardo Magri, Prof John Goodacre, Dr Renate Schmidt, Dr Ser-huang Poon, Dr Emily Collins, Dr Marie Farrell, Dr Edoardo Manino, Dr Haiyu Li, Dr Alex Creswell, Dr Gaven Smith, Dr Mahdieh Sadabadi.