Twins, pyramids and environments: unifying approaches to virtual testing

Louise Wright
Kathryn Khatry
Maurizio Bevilacqua
Liam Wright
Michael Chrubasik
Joao Gregorio

Abstract

There is a long history of the use of engineering simulation for design. This virtual approach is typically followed by physical prototyping, testing and refinement to reach a final design, followed in many cases by a physical testing regime to meet regulatory requirements. For many companies, the use of simulation tools has reduced the time and cost of getting new products to market by enabling exploration of multiple designs, and has reduced resource usage and improved product quality by enabling exploration of manufacturability and long-term in-use performance.


Companies are increasingly seeking to gain similar benefits beyond the design stage. Some companies are adopting a “smarter testing” approach that uses a combination of physical test and verified simulation to reduce testing and approval costs. Some manufacturers are seeking to improve lifetime prediction and understanding of real-world performance through digital twins. Regulators are seeking to gain demonstrable confidence in the safety of products such as autonomous vehicles by defining virtual tests of a vehicle’s artificial intelligence system that simulate rare events and safety-critical situations.


This paper discusses the common features of, and differences between, the fields of smarter testing, digital twins, and virtual test environments. Starting from a consideration of this commonality, we highlight areas where existing methods and expertise could be better exploited, and identify areas where further research and development of tools would accelerate the successful application of trustworthy digital assurance approaches in industry.

Article Details

How to Cite
Twins, pyramids and environments: unifying approaches to virtual testing. (2025). Engineering Modelling, Analysis and Simulation, 3(1). https://doi.org/10.59972/qvjabgtz
Section
Articles
