IMPACT AUTUMN 2025

Page 1


MILITARY UTILITY OF EMERGING TECHNOLOGY

OR to support emerging technology deployment in the military sector

TECHNOLOGY READINESS LEVELS FOR OR

How to better align operational research efforts with real-world needs

DIRTY DATA

On mitigating the carbon impact of literally every byte of data

VALUE FOR TAXPAYERS’ MONEY IN PUBLIC INFRASTRUCTURE

Enhancing the precision and relevance of Markovian Real Options Analysis

EDITORIAL

This issue dedicates several pages to military applications of OR. Jay Williams’ article presents a mixed Soft/Hard OR method to support the ready development and deployment of efficient novel technology to maintain military advantage. Richard Underwood’s piece discusses how ‘Wargaming’ supports the planning of military operations. Bazargani, Montoya and Spencer broaden the scope of the discussion on new technology deployment to include applications beyond the military.

As usual, we also feature contributions from impactful collaborations between universities and industry. For instance, May Hicks 2025 Winner Lei Zhou shares her experience of introducing OR tools to enhance production planning at an Italian frozen vegetable producer. We also publish the first contribution by our newly established ‘OR in Practice’ Special Interest Group, which elaborates on the role and impact of OR in the (primarily UK) health sector. Geoff Royston’s column is also back to gently remind us that our profession is just as much about creating enhanced futures as it is about extracting insights from analysis of data, past and present.

Much of what we publish in OR academic journals retains an aura of ‘obscurity’ that often prevents uptake by practitioners. In this issue we initiate a process of ‘translation for analysts and managers’: we pick an academic paper and unpack its models, algorithms and results, to (hopefully!) boost wider adoption and impact. In our inaugural article of this kind, Jacco Thijssen focuses on the valuation of major infrastructure projects.

The production of this issue wouldn’t have been possible without the precious help of Alan Robinson (Companion of OR 2024, former Chair of HORAF, familiar face to many of us), who joins us as Consulting Editor – welcome, Alan!

Maurizio Tomasella

OPERATIONAL RESEARCH AND DECISION ANALYTICS

The OR Society is the trading name of the Operational Research Society, which is a registered charity and a company limited by guarantee.

Seymour House, 12 Edward Street, Birmingham, B1 2RX, UK

Tel: + 44 (0)121 233 9300, Email: email@theorsociety.com

Executive Director: Colette Fletcher

President: Sanja Petrovic

Editor: Maurizio Tomasella

Consulting Editor: Alan Robinson

Senior Editorial Assistant: Sophie Rouse

Editorial Assistant: Chiara Carparelli

ImpactMagazine@theorsociety.com

Print ISSN: 2058-802X Online ISSN: 2058-8038 www.tandfonline.com/timp

Published by Taylor & Francis, an Informa business

All Taylor and Francis Group journals are printed on paper from renewable sources by accredited partners.

Operational Research (OR) is the discipline of applying appropriate analytical methods to help those who run organisations make better decisions. It’s a ‘real world’ discipline with a focus on improving the complex systems and processes that underpin everyone’s daily life – OR is an improvement science. For over 80 years, OR has focussed on supporting decision making in a wide range of organisations. It is a major contributor to the development of decision analytics, which has come to prominence because of the availability of big data. Work under the OR label continues, though some prefer names such as business analysis, decision analysis, analytics or management science. Whatever the name, OR analysts seek to work in partnership with managers and decision makers to achieve desirable outcomes that are informed and evidence-based. As the world has become more complex, problems tougher to solve using gut-feel alone, and computers increasingly powerful, OR continues to develop new techniques to guide decision-making. The methods used are typically quantitative, tempered with problem structuring methods to resolve problems that have multiple stakeholders and conflicting objectives. Impact aims to encourage further use of OR by demonstrating the value of these techniques in every kind of organisation – large and small, private and public, for-profit and not-for-profit. To find out more about how decision analytics could help your organisation make more informed decisions see https://www.theorsociety.com/ORS/About-OR/OR-in-Business.aspx. The OR Society is the home to the science + art of problem solving.

CONTENTS

DIRTY DATA

Brian Clegg introduces us to the work by Tom Jackson and Ian Hodgkinson of Loughborough Business School on ‘digital decarbonisation’, a way to mitigate the carbon impact of every byte of data that looks beyond the most obvious energy usage of data centres

MILITARY UTILITY OF EMERGING TECHNOLOGY

Jay Williams discusses in detail a method to conduct OR to support the development of emerging technologies in the military sector, from the demonstration that a technology concept provides military utility within the current force, to determining where it integrates best into the future force structure

23 TECHNOLOGY READINESS LEVELS: CHARTING INNOVATION FROM CONCEPT TO DEPLOYMENT

Mosab Bazargani and Paul S. Spencer of Bangor University and Laurent Montoya of Airbus introduce us to the TRL framework, an established approach to new technology introduction that has been successfully adopted by Airbus in recent years, suggesting it would prove beneficial for OR professionals to further align their research with real-world needs

TRUST: HARNESSING OPERATIONAL RESEARCH TO FOSTER EQUALITY, DIVERSITY, AND INCLUSION

Isma Shafqat, The OR Society’s Pro Bono OR Manager, illustrates how OR made a meaningful difference at Brandon Trust, a charity focused on empowering people with learning disabilities and autism to live the lives they choose

26 HOW NOT TO THROW GOOD MONEY AFTER BAD IN AN UNCERTAIN WORLD

Jacco Thijssen of the University of York writes about estimating value for money for public infrastructure projects. He shows how a new ratio-type metric can be implemented, one that better captures taxpayers’ interests when compared with the Benefit-to-Cost Ratio adopted by the UK Treasury

5 Seen Elsewhere

Analytics making an impact

44 Universities making an impact

Brief report on a student project from the University of Southampton

46 Design Thinking Thoughts

Geoff Royston’s column is a natural follow-on from his last, this time discussing ‘design thinking’ and its relevance to the OR profession

DISCLAIMER

The Operational Research Society and our publisher Informa UK Limited, trading as Taylor & Francis Group (“T&F”), make every effort to ensure the accuracy of all the information (the “Content”) contained in our publications. However, the Operational Research Society and our publisher T&F, our agents, and our licensors make no representations or warranties whatsoever as to the accuracy, completeness, or suitability for any purpose of the Content. Any opinions and views expressed in this publication are the opinions and views of the authors, and are not the views of or endorsed by the Operational Research Society or our publisher T&F. The accuracy of the Content should not be relied upon and should be independently verified with primary sources of information and any reliance on the Content is at your own risk. The Operational Research Society and our publisher T&F make no representations, warranties or guarantees, whether express or implied, that the Content is accurate, complete or up to date. The Operational Research Society and our publisher T&F, shall not be liable for any losses, actions, claims, proceedings, demands, costs, expenses, damages, and other liabilities whatsoever or howsoever caused arising directly or indirectly in connection with, in relation to or arising out of the use of the Content. Full Terms & Conditions of access and use can be found at http://www.tandfonline.com/page/terms-and-conditions.

Reusing Articles in this Magazine

All content is published under a Creative Commons Attribution-NonCommercial-NoDerivatives License which permits noncommercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited, and is not altered, transformed, or built upon in any way.

EMPOWERING INSIGHTS: AUTOMATED DATA INFRASTRUCTURE FOR BRAND PERFORMANCE

Aditya Pandey, Nitin Manhar Dhamelia and Bo-Yen Tsou report on recent successful joint work between Barilla Group and students in Business Analytics at the University of Bristol, looking at data-driven insights generation to support real-world brand health and market performance decision making

35 WARGAMING IN SUPPORT OF PLANNING MILITARY OPERATIONS

Richard Underwood of RED Scientific Ltd discusses some of the latest technology available for the British Army to plan its operations. He focuses on the Cirsium wargame, its innovative approach to time management, and support to actual military operational decision making

40 CAN OPERATIONAL RESEARCH RELIEVE OUR STRAINED HEALTHCARE SYSTEMS?

Xin Fei, Nalan Gülpinar and Christina Phillips of our newly established ‘OR in Practice’ Special Interest Group share with us the main findings from the SIG’s inaugural event that, on the 2nd of June, brought together a panel of expert clinicians, OR practitioners and academics with shared interests and relevant expertise in applying OR in healthcare

SEEN ELSEWHERE

DEFINITION OF EXPLANATION COMES FIRST!

The latest issue of Phalanx (https://bit.ly/Phalanx-Summer25), the ‘magazine of national security analysis’ published by the (US) Military Operations Research Society (MORS, http://www.mors.org/), includes a very interesting article by Dr Asim Roy, Arizona State University and Teuvonet Technologies, entitled: ‘A New Era for Explainable AI Where Definition of Explanation Comes First Rather Than Model Creation’. This looks at artificial intelligence (AI) systems for computer vision and image recognition and considers how explainability can be built in from the outset, and therefore used to influence the AI implementation, as opposed to more traditional approaches that tend to build the AI model first before subsequently considering how best to explain it. The author likens this to an architect first determining a person’s or family’s needs before designing and building a house according to those needs.

The article explains how this reversing of the approach to AI adoption from the current standard of ‘model first, explanation later’ to its exact opposite (a change of heart the author credits to DARPA, the US Defense Advanced Research Projects Agency) can also address the problem of adversarial measures that challenge more traditional approaches. These measures often result in partial visibility of objects. So, at the risk of somewhat oversimplifying matters: from an image showing bits of the tail of an airplane, it can be inferred that there is indeed a plane in the hangar! This line of reasoning can also be applied to, say, camouflage-type measures (in which parts of an object are covered up to make it look different to an object detection system) and decoy-type attacks (where similar objects may be painted on the ground to trick the same system). The author backs this up by adding examples of interesting ways in which Russia has tried to hide its military assets from satellites and unmanned aerial vehicles (UAVs) since the start of the Ukraine war. The article also explains how the new method would work, step by step. Overall, if this approach proves successful, applications in domains other than defence may follow. Only time will tell.

For even more information on OR and Analytics in the military sector (from a chiefly US perspective), the current issue of Phalanx is openly available (Summer 2025, at the time of writing) – earlier issues are only available to MORS members. Other publications from MORS include the MOR Journal (https://www.mors.org/Publications/MOR-Journal).

WARGAMING IN THE UK

Later in this issue, Richard Underwood’s article on wargaming may pique your interest in this topic area and its associated methodology. The approach described by Jay Williams in his article (also in this same issue) uses wargaming as one of its methods. Wargaming is a term mainly used in Defence and Security-related analysis – hence the ‘war’ part of wargaming – although the approach can be tailored for much wider use. Indeed, the key tenets of wargaming are that it is a process of adversarial challenge and creativity, delivered in a structured format and usually umpired or adjudicated. Wargames are dynamic events driven by player decision-making, with players typically taking on various roles within an immersive scenario. Its applicability well beyond its traditional bounds of Defence and Security can be readily seen, for example in the analysis of climate change (where the ‘adversary’ is the climate), in dealing with extreme weather events and their implications, in business market analysis, and many others.

For those interested in diving deeper into the topic, a Defence Wargaming Handbook (https://www.gov.uk/government/publications/defence-wargaming-handbook) was published in 2017 and is kept under review as the state of the art and practice advances. It provides guidance on how wargaming can be used to explore issues at the national strategic, operational and tactical levels and across all domains and environments. The handbook discusses how wargaming can be applied to education and training, planning – where provision of a safe-to-fail learning environment is particularly important – as well as in support of executive decision-making. Although the handbook is not intended to be a prescriptive or detailed ‘how-to’ guide, it does provide key guidance around the effective design, delivery and assurance of wargames, including some of the key methods and approaches typically adopted and the key roles involved.

Supporting guidance and advice for those commissioning, considering the commissioning of, or delivering wargames is also available via several active online communities. The most prominent of these is the Connections franchise. Connections UK (https://professionalwargaming.co.uk/) is one of a suite of international franchises linked to an original US community with the aim ‘to advance and preserve the art, science and application of wargaming’. For example, the recent Connections UK conference (https://professionalwargaming.co.uk/2025.html) was held in September 2025 with a combination of: introductory material for those newer to wargaming; deeper-dive topics such as uncertainty in wargaming; the actual and potential use of artificial intelligence in support of wargaming; the wargaming of resilience; and a games fair to give attendees hands-on experience of a variety of different gaming approaches and methodologies.

IS YOUR ANALYSIS FIT FOR PURPOSE?

UK Government best practice guidance on the assurance of OR – and wider analytical work – has just been refreshed and published in the form of the AQuA Book on Analytical Quality Assurance (https://www.gov.uk/guidance/the-aqua-book). By way of background, the original AQuA Book was produced in 2015 following a review of the quality assurance of analytical methods in Government (https://bit.ly/Macpherson-Report), which itself stemmed from shortcomings in the letting of the Government’s rail franchise for the West Coast Mainline around 2012. Conveniently, Aqua is also a colour and thus the guidance sits alongside and complements other ‘coloured’ books such as, amongst others, the Green Book on investment appraisal (https://bit.ly/GreenBook-HMTreasury), the Magenta Book on evaluation (https://bit.ly/MagentaBook-HMTreasury), and the Orange Book on risk management (https://bit.ly/OrangeBookRiskManagement).

The refreshed AQuA Book provides Government guidance on how to produce robust, fit-for-purpose analysis. It was produced specifically for analysis done by – and in support of – Government; however, it provides very useful guidance and best practice for all analysts, analytical managers and commissioners (those who commission analysis-related studies).

The original AQuA Book concentrated on the introduction and standardisation of a common taxonomy and methodology for assurance based around the needs of analysts, assurers and commissioners. It also addressed the life-cycle of analysis, from initial commissioning, through study work, to the delivery of results and insight, and the various needs for assurance at each stage. And it considered how to address proportionality in assurance, that is, how to ensure that any assurance processes are suitably independent and that they are tailored to reflect the importance of the decisions the analysis supports and the complexity, novelty and/or uncertainty of the methods (and data) employed.

The 2025 update reflects changes over the past decade alongside refinements gleaned from using the original guidance in anger. It doesn’t alter the fundamental tenets of the original work, but rather refines and updates them for current circumstances. For example, the original AQuA Book captured best practice in the early/mid 2010s, at which point the main analytical tools in use within Government were proprietary simulation models and spreadsheets. Clearly, the usage of methods has moved on greatly in the intervening years, with the use of AI – both Machine Learning (ML) and Large Language Models (LLMs) – and data analytics increasing significantly, alongside additional analysis models and methods including open-source software and ‘black-box’ approaches.

The guidance has also been updated in its content around proportionality in assurance, where a risk-based approach is now advocated to allow better tailoring of the assurance undertaken. It advocates for a more formal adoption of a continuous assurance review cycle rather than the emphasis of assurance at certain key points in the study lifecycle. The guide also emphasises the role of publishing and transparent treatment of models, data and assumptions, and assurance-related evidence. Finally, the updated guidance now takes a more holistic view of the assurance of analysis in the round – whichever analytical disciplines (OR, statistics, economics, social research…) are involved, individually or in combination – rather than the more model-centric approach adopted in the original version.

All of this should resonate with many across our community who still believe in OR as a rigorous process to support decision-making about relevant problem situations instead of yet another old-fashioned area of human enquiry where AI/ML/LLMs/etc. may, albeit incidentally, apply (is there any area where they do NOT apply, these days?).

ANALYTICS HELPS TO PAVE THE WAY TO CYCLING GOLD

In the latest issue of INFORMS’ Analytics magazine (https://bit.ly/4ojTYJx), Kara Tucker covers the success story of USA Cycling’s Women’s Team Pursuit squad and their data- and analytics-driven journey to the Gold Medal at the Paris 2024 Olympic Games. For their work with the squad and support staff, the team behind ‘PROJECT 4:05: Optimizing USA Cycling’s Women’s Team Pursuit Gold’ was recently awarded the prestigious 2025 INFORMS Franz Edelman Prize.

The name of the project is easily explained. In Tucker’s own words, 4min 05sec was ‘a threshold identified through deep analysis of historical medal-winning performances’. Achieving it would require not only technical refinement, but strategic allocation of every available resource: ‘physical, financial and human’. In an event where elite improvements are of the order of tenths of a second, Team USA’s gap to the leading countries (e.g., New Zealand and Great Britain) was about seven seconds (Team USA’s time at the Glasgow World Championships held in 2023 had been 4:12.684, failing to qualify for the bronze medal final by just 0.159 seconds). In the Paris final, Team USA clocked a fantastic 4:04.306 – an American record!

The analytics engine of PROJECT 4:05 consisted of four components. The ‘Race Strategy Mixed-Integer Programming Model’ helped to optimise cyclist rotation timing, power output and energy expenditure. The ‘Athlete Metric Simulation’ component ran countless race scenarios to inform team composition and tactical race choices. The ‘Data-driven Athlete Selection’ component supported USA Cycling in picking squad members based on quantifiable metrics such as ‘functional reserve capacity’, ‘critical power’ and ‘aerodynamic profiles’. Finally, the ‘Training Translation’ component helped to convert all relevant analytical insights into customized training targets for individual cyclists, and race-day plans. Such a comprehensive analytics suite had to undergo rigorous validation prior to being employed by the cycling team itself (the research team claims 1% error in predictions against historical performances).

The eventual squad – Jennifer Valente, Chloé Dygert, Lily Williams and Kristen Faulkner – excelled both in the strength deployed by each cyclist and in near-perfect synergy. More importantly, the team’s approach to the Olympic event showed excellent resilience, as the (inevitable) disruptions met on the way to the Paris final didn’t derail the plan. For instance, a crash in a road race suffered by one of the team members just days prior to the final meant that additional simulations had to be run. These showed alternative paths to gold in terms of renewed race tactics, ‘only’ requiring, on the side of the squad, perfect execution of the new plan. But that is exactly what happened, back in Paris.

DIRTY DATA

BRIAN CLEGG

In our drive to reduce human-initiated climate change it’s natural to think of carbon emissions in terms of the more obviously dirty processes, such as burning fossil fuels – yet our use (and hoarding) of data is having a rapidly increasing impact. The answer proposed by Tom Jackson and Ian Hodgkinson of Loughborough Business School (respectively left and right in the opening image) is digital decarbonisation. This involves looking far beyond the most obvious energy usage of data centres to trace and mitigate the carbon impact of every byte of data.

Jackson and Hodgkinson came to a realisation of the importance of the intersection between data and environmental concerns. ‘We’ve always been drawn to the intersection of information, systems, and impact. Our journey began by exploring how information flows through organisations, but it quickly became clear that our growing dependency on digital technologies carries serious environmental consequences. When we saw the scale of ‘dark data’ – data collected and stored but never used – we realised this wasn’t just an IT problem, but a sustainability crisis in the making. That realisation sparked the digital decarbonisation movement (a phrase we coined that was recognised by the World Economic Forum) and it has shaped much of our research ever since.’

A GROWING PROBLEM

Data centres are estimated to consume between 2 and 3 per cent of the world’s electricity supply, in the region of 400 TWh – a little more than the total UK consumption. This is expected to more than double to around 945 TWh by 2030, perhaps reaching 1,200 TWh by 2035 [1]. Of itself this is significant, but Jackson and Hodgkinson point out that data centres themselves are but a part of the energy impact of data. They advocate taking a lifecycle approach, recognising that data’s production, manipulation, passage through networks and storage can all have energy implications.

Tom Jackson commented: ‘It’s very much thinking around the lifecycle of digital. So what is digital? It is really all about the data that you get from A to B to make a decision, and the journey it takes. How was that data created in the first place? When it’s been created, where does it go next? It might go to the network right across the world, because it could be an Instagram image that has gone viral. Or it could be a piece of data that’s been used to train ChatGPT, and now it’s put somewhere and forgotten about, becoming dark data. The life analysis of data looks at where it flows, where it ends up, whether it comes back around again. And unfortunately, in data terms, it normally just ends up in a pile somewhere in a storage device, which is never accessed again. Large volumes of dark data exist within many storage setups.’

There are strong overlaps between this work and operational research around modelling, forecasting and systems optimisation. Tom Jackson noted: ‘The common theme throughout is having that desire to discover new things, to question things… a desire to unpick things and see if you can optimise them, which is a great OR trait to have.’ One such approach, scenario analysis, moves the focus away from data centres to attempt to calculate an overall energy usage per gigabyte of data. This shows the potential extreme impact of uncontrolled data usage, especially with the growth of energy-intensive applications such as AI and bitcoin mining.

DOOMSDAY SCENARIOS

In their recent co-authored article [2], Jackson and Hodgkinson with colleagues pose the question: ‘Will we reach a point where digital data requires more energy than can feasibly be produced?’ They outline a series of scenarios when considering the energy intensity of data and its predicted growth over the coming years. Drawing on a range of available estimates for kWh/GB (including 0.072 kWh/GB from The Shift Project [3], 0.060 kWh/GB from Aslan et al. [4], and their own calculations [2]), together with forecasts of renewable energy production globally, their ‘doomsday scenarios’ reveal that total renewable energy generation worldwide may fall short of digital data energy demand by 2030. While the reality could be far different, these projections are a wake-up call to take digital decarbonisation and the carbon footprint of data more seriously.
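To make the scale of this comparison concrete, the back-of-the-envelope Python sketch below multiplies a projected global data volume by the energy-intensity figures cited above and compares the result with an assumed renewable supply. The kWh/GB values are those referenced in the article; the data volume, growth rate and renewables figure are illustrative placeholders, not numbers taken from the cited paper.

# Rough sketch of the 'doomsday' comparison: projected energy demand of
# global data versus projected renewable generation.
# The kWh/GB intensities are those cited above; the data volumes and the
# renewables forecast below are illustrative placeholders only.

INTENSITY_KWH_PER_GB = {
    "Shift Project (2019)": 0.072,
    "Aslan et al. (2018)": 0.060,
}

# Illustrative assumptions (NOT from the cited paper):
data_2025_zettabytes = 180      # assumed global data volume in 2025, ZB
annual_growth = 0.23            # assumed compound annual growth rate
renewables_2030_twh = 12_000    # assumed renewable generation in 2030, TWh

GB_PER_ZB = 1e12                # 1 ZB = 1e12 GB
years = 2030 - 2025

data_2030_gb = data_2025_zettabytes * GB_PER_ZB * (1 + annual_growth) ** years

for source, kwh_per_gb in INTENSITY_KWH_PER_GB.items():
    demand_twh = data_2030_gb * kwh_per_gb / 1e9   # kWh -> TWh
    share = demand_twh / renewables_2030_twh
    print(f"{source}: ~{demand_twh:,.0f} TWh in 2030 "
          f"({share:.0%} of assumed renewable supply)")

Under these placeholder assumptions the projected demand exceeds the assumed renewable supply several times over, which is the shape of shortfall the ‘doomsday scenarios’ warn about; the point of such a sketch is sensitivity to the inputs, not the particular numbers.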

Ian Hodgkinson emphasises the importance of knowledge management in getting to grips with data. ‘Ultimately, knowledge management supports how organisations learn. And within that, data plays a critical role. Increasingly, with technological advancements, we see that organisations are very good at acquiring an awful lot of data, but then often fail to go through the processes of effective assimilation and transformation of that new data. This then impacts the use of that data, whether it be to generate commercial value or to provide value to other parties in the third sector and public sector. Technological developments have enabled or increased the potential to enhance organisational learning through knowledge management practices. But what we’re finding is that many organisations haven’t developed the capability at the same level as the ability to create and generate data.’

This need to consider effective data management applies both at the individual and corporate level. I recently upgraded my computer. To avoid spending too long transferring data from one machine to the other, I dumped unnecessary files, freeing up 300 GB of space. This data was not just stored on my computer, but backed up in the cloud. Jackson and Hodgkinson emphasise the value of taking a personal approach: ‘While it took you time, the environmental return is substantial when scaled across society. It’s not just about space; it’s about emissions. Think of it as a digital spring clean with planetary benefits. The more we encourage intentional data behaviour, deleting what’s unnecessary, archiving efficiently, minimising duplication, the better chance we have at avoiding the “data doomsday” scenario.’

MOVING TO CURATION

Organisations and individuals need to think about two classes of dubious data: dark data and ROT (redundant, obsolete and trivial). Between them, Jackson and Hodgkinson suggest, these make up around 88 per cent of organisational storage. The idea, as Jackson and Hodgkinson put it, is to move from blind retention to intentional curation: balancing utility with cost, which can be financial, environmental, and operational.

Ian Hodgkinson noted: ‘We take the broad perspective of Gartner’s definition [of dark data], which is effectively information assets that are used once and then never again, or maybe not even at all. ROT data obviously can become dark, but we find it’s helpful to explain them separately, especially when we’re working with organisations. So, for instance, a council that we were in discussion with a few months ago was talking about how they still have their Christmas lunch menus from 15 or 20 years ago stored, and that’s a great example of very trivial data that has no value. Whereas with dark data, there may still be value there, but it’s often forgotten about and not looked into.’

Tom Jackson expanded on this: ‘A good example of that has also been Internet of Things and sensors. You’ve got sensors on pretty much everything nowadays from washing machines to cars. Ninety per cent of the data generated from those sensors is never used. That goes straight into dark data. It’s not redundant, obsolete, or trivial, it’s just never used. It could still be of value, but people forget it’s actually there and don’t then consider the value of having that, because everyone’s thought ‘Let’s collect as much data as possible, it might come in useful in the future.’ That thinking’s a bit like a garden shed. We all start storing lots of stuff in our garden shed and we don’t clear it out, we just buy a bigger shed because we got so much stuff.’

Taming this data, Ian Hodgkinson suggests, is not just about storing less. ‘It’s the difference between hoarding and stewardship: converting data into information that can be meaningfully interpreted and applied, ultimately developing knowledge that supports decision-making and innovation. In a world of limited energy and environmental budgets, treating knowledge, not just data, as the end goal is key. Metadata, tagging, and access histories help distinguish valuable archives from digital clutter. It’s similar to managing physical archives: not every piece of paper is worth saving forever.’
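As a purely illustrative example of how access histories can inform such curation decisions, the short Python sketch below walks a directory tree and flags files that have not been accessed for several years as candidates for review. The three-year threshold, and the equation of ‘old and unread’ with potential ROT or dark data, are simplifications for illustration, not a rule proposed by Jackson and Hodgkinson.

# Illustrative 'digital spring clean' helper: list files not accessed for a
# given number of years, as candidates for archiving or deletion review.
# Thresholds are arbitrary; real curation needs business and legal context,
# and access times may not be tracked on all file systems.
import os
import sys
import time

def stale_files(root: str, years: float = 3.0):
    cutoff = time.time() - years * 365 * 24 * 3600
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # unreadable entries are skipped, not judged
            if st.st_atime < cutoff:   # last access older than the cutoff
                yield path, st.st_size

if __name__ == "__main__":
    root = sys.argv[1] if len(sys.argv) > 1 else "."
    total = 0
    for path, size in stale_files(root):
        total += size
        print(path)
    print(f"~{total / 1e9:.1f} GB not accessed in the last 3 years")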

LOOKING TO THE FUTURE

For Jackson and Hodgkinson, there are three key areas for future research:

• Lifecycle accounting – developing robust, standardised methodologies for tracing the energy impact of data across network, compute, and storage with increased transparency across the digital ecosystem.

• Policy tooling – they have recently helped draft the OECD Recommendation on Digital Technologies and the Environment, which is in force across 38 member countries. Now they are working on implementation frameworks to guide governments and industry.

• Forecasting models – building scenario-based tools that can map the tipping points between data demand, energy supply, and sustainability thresholds. These are crucial for both resilience planning and policy regulation.

Whether as individuals or organisations, the final message is that we need to change. Jackson and Hodgkinson say ‘Think of it as tracing and tackling the carbon impact of every byte, not just the building that houses it. A data-centric approach reframes our view, treating data as a resource with energy and environmental costs, much like we do with water or fuel.’ Their message that data is a wonderful resource, but not carbon neutral, is an important one to bear in mind in a world so dependent on information.

Brian Clegg is a science journalist and author who runs the www.popularscience.co.uk and his own www.brianclegg.net websites. After graduating with a Lancaster University MA in Operational Research in 1977, Brian joined the OR Department at British Airways. He left BA in 1994 to set up a creativity training business. He is now primarily a science writer: his latest title, Brainjacking, looks at the science of influence and manipulation.

FOR FURTHER READING

[1] IEA (2025). Energy and AI. IEA, Paris. iea.org/reports/energy-and-ai

[2] Castro, V., M. Georgiou, T. Jackson, I. Hodgkinson, L. Jackson, and S. Lockwood (2024). Digital data demand and renewable energy limits: Forecasting the impacts on global electricity supply and sustainability. Energy Policy 195:114404. DOI: 10.1016/j.enpol.2024.114404.

[3] The Shift Project (2019). Lean ICT: Towards digital sobriety [Report of the Working Group directed by Hugues Ferreboeuf for the Think Tank “The Shift Project”]. https://theshiftproject.org/en/article/lean-ict-our-new-report/

[4] Aslan, J., K. Mayers, J.G. Koomey and C. France (2018). Electricity intensity of internet data transmission: Untangling the estimates. Journal of Industrial Ecology 22:785–798.

ASSESSING MILITARY UTILITY OF EMERGING TECHNOLOGY

JAY WILLIAMS

The information age is well upon us, boasting unprecedented levels of rapid technological advancement across various fields. With rising global political tensions and active wars in Europe, the Defence sector is by no means an exception. The UK government is tackling this uncertainty head-on, pledging a ‘commitment to spend 5% of GDP on national security by 2035’ [1]. In practice, the acceleration of weapons development for the UK’s Ministry of Defence (MOD) is already taking place, with examples such as DragonFire, a UK-developed Laser Directed Energy Weapon which is being installed on Royal Navy warships for the first time in 2027 [2], five years sooner than previously projected.

The timing of this strategic change in Defence procurement is no coincidence. Events in Ukraine to date have shown a strong reliance on rapid capability developments and innovation to effectively counter adversary forces. A prime example is drone technology, with ‘approximately 100 different types of drones being used in theatre’ [3] and an ‘estimation that Ukraine will produce more than 5 million drones in 2025’ [4].

Computer Generated Image (CGI) illustrating the use of potential future Urban technology concepts.

OPERATIONAL RESEARCH AND MILITARY UTILITY

The rapidly evolving Defence landscape increases the scale and complexity of requirements for the MOD, but also decreases the timeline within which those requirements need to be fulfilled, creating a difficult challenge for Defence procurement. Whilst increases to Defence spending will help, the budget will always be stretched, and getting the best value for money in terms of investment in science & technology remains a necessity.

Operational Research (OR) for the assessment of military utility is critical for allocating these resources efficiently. Conducting OR to support the development of an emerging technology concept can maximise the effectiveness of resources by identifying development pathways with the highest potential military utility, reducing the risk of often costly changes to a product once it has matured.

Once a technology concept has been demonstrated to provide military utility within the current force, decisions need to be made to determine where it integrates best into the future force structure. The OR conducted in the initial development phases, exploring various hypotheses as to whether that concept can provide adequate military utility in a future force environment, also has applicability in such later stages of decision-making. Examples of exploitation of such analysis include determining the most effective roles of the new technology concept for the user, the best ways to use the technology in particular environments, and the procurement numbers required to meet the MOD’s needs.

DIFFICULTIES OF ANALYSING EMERGING TECHNOLOGY

Traditionally, OR supporting the end-to-end development of concepts has involved several discrete studies over a protracted period that has allowed for the generation of data sets and supporting information. However, the new strategic imperatives outlined previously mean that the required pace of development of concepts within a rapidly evolving environment reduces the effectiveness of such approaches. Without an approach tailored to data generation in this new context, this lack of data presents layered implications for OR, ranging from a limited ability to conduct quantitative analysis to higher uncertainty and risk present in any results (Figure 1).

An important factor when working with emerging technologies, heightened in the present day with faster development cycles, is stakeholder perception management. Using the different military Front Line Commands (FLC) as an example, the Army will have a different perspective from the Royal Navy, the Royal Air Force and so forth on how best an emerging technology could be developed and adapted for its operating environment. Similarly, it’s not only the FLCs that will be stakeholders in emerging technology: budget holders and technical experts will require engagement, leading to complex and conflicting agendas that ideally need to be accounted for in any OR.

Despite these challenges, data generation remains key to underpinning OR support to technology development. Data and information are required for further exploration of the emerging technology, to define the role and boundaries of the concept in the broader military force, and to estimate cost-effectiveness, limitations, and risks. Consideration of all these factors is an important step in evaluating a concept’s military utility and should be fundamental to the combined OR approach. Consequently, the development of OR approaches to meet this more rapid need requires novel methods to draw upon available data and information.

OR APPROACH

Considering the complexities of the topic and a push to develop deployment options quickly, a tailored approach was necessary. The approach used consisted of a multitude of different OR techniques. Softer OR methods, such as matrix gaming and Multi-Criteria Decision Analysis (MCDA), defined concept boundaries, transitioning into harder OR methods such as Discrete-Event Simulation (DES) to evaluate the technology in a simulated military mission setting (Figure 2).

This use of a coordinated set of OR approaches was underpinned by an iterative cycle of engagement between analysts and technology concept developers, focusing on a mutually expanding understanding of the utility, limitations, and remaining uncertainties relating to the concept, as well as the collation of available data and information. This exchange happened in real time, steering which aspects of the technology were the focus for further investment and development, and identifying and realising opportunities for the generation of further data and information.

MATRIX GAMING

An example of a softer approach used in the initial formative stages of the process is the application of matrix gaming. ‘Matrix games’ demand that players provide several specific arguments for the success of a proposed action. These are limited only by player imagination and feasibility. Other players can then make counterarguments. If opposed, a short discussion leads directly to an adjudication outcome. These characteristics stimulate free-thinking creativity and enable novel outcomes from the narrative generated in the game [5].

FIGURE 1 CGI ILLUSTRATING THE USE OF POTENTIAL FUTURE LITTORAL TECHNOLOGY CONCEPTS

FIGURE 2 MULTI-STAGE OR APPROACH

This approach was used to generate a shared understanding of the technology concept in the absence of any notable supporting quantitative information, and to start generating hypothetical applications and ways of using the technology in a military context. Without this bounding exercise, the art of the possible relating to the concept remains ill-defined and subsequent focused analysis becomes difficult (Figure 3).

In this case, the lack of restrictions placed on the players created a healthy environment for discussion and led to comprehensive insights about:

• how the concepts could be used;

• within what environments the concepts might provide the most military utility;

• the limitations and risks of using the concepts.

Technology concept developers were present within these games alongside military personnel, not only providing technical input towards the discussion, but also taking on-board ideas presented within the games about development pathways for their concepts that suit current and future military requirements. These matrix games were conducted iteratively, becoming progressively more bounded through the utilisation of new-found information from previous versions, amongst other OR methods. Transferring data dynamically between different components of the approach enabled both continuous improvements to an individual technique and multiple techniques to run in parallel without having to start from scratch with each change.

MULTI-CRITERIA DECISION ANALYSIS

As the approach progressed, additional information obtained deepened the understanding of the concepts and informed subsequent steps. The iterative accumulation of data allowed for more targeted OR. An MCDA tool was adopted to utilise this additional data and develop understanding about expendable¹ or indispensable² characteristics of the concept. At this stage of the process, there was an emerging understanding around basic information such as approximate physical dimensions of the concepts, along with required military enablers and platforms. This emerging information, alongside judgements from technology concept developers and military personnel based on their current understanding, was sufficient to serve as inputs into the MCDA activity to help steer technology development decisions. Several performance metrics were used, ranging from qualitative estimates relating to logistics, to ease of use for armed forces, to estimated net impact on the military operation. The results of the MCDA study for each concept were presented as a combined score between 1 and 20, representing expendability and indispensability respectively, enabling stakeholders to effectively visualise technology concepts. Furthermore, technology concept developers gained an understanding of which metrics would require technical development efforts if one end of the scale was particularly desirable for an in-service product.
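The scoring logic can be illustrated with a minimal weighted-sum sketch in Python. The criteria, weights and scores below are hypothetical, not those used in the study; the only element taken from the text is the combined 1–20 scale, with the low end denoting a more expendable concept and the high end a more indispensable one.

# Minimal weighted-sum MCDA sketch (illustrative only).
# Each concept is scored 1-20 per criterion by stakeholders; weights sum to 1.
# The combined score stays on the study's 1-20 expendable/indispensable scale.

criteria_weights = {            # hypothetical criteria and weights
    "logistic_burden": 0.25,
    "ease_of_use": 0.25,
    "net_operational_impact": 0.50,
}

concept_scores = {              # hypothetical 1-20 judgements per concept
    "Concept A": {"logistic_burden": 6, "ease_of_use": 12, "net_operational_impact": 16},
    "Concept B": {"logistic_burden": 14, "ease_of_use": 9, "net_operational_impact": 7},
}

def combined_score(scores: dict, weights: dict) -> float:
    """Weighted sum of criterion scores; result remains on the 1-20 scale."""
    return sum(weights[c] * scores[c] for c in weights)

for name, scores in concept_scores.items():
    print(f"{name}: {combined_score(scores, criteria_weights):.1f} / 20")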

DISCRETE-EVENT SIMULATION

Discrete-Event Simulation (DES) normally relies on complete data sets; however, the results from the previous OR techniques were sufficient for DES to yield valuable insights, enabling performance testing and trade-space analysis in the early design phase.

FIGURE 3 AN EXAMPLE OF A WARGAMING EXERCISE [6]

Traditionally, after drawing on existing models (if available), developing new ones would be the standard approach. As noted in the introduction, however, the military domain of interest was a novel one (no models existed), and time did not allow for the design and development of a bespoke solution. Consequently, the use of Commercial-off-the-shelf (COTS) tools was considered, and a survey of potential options highlighted that an adapted version of Battlespace Simulations Incorporated’s Modern Air Combat Environment (MACE) model might offer insights. MACE is ‘full-spectrum, Mission-level simulation software for building and executing scenarios across multiple domains’ (Figure 4) [7].

The model was adapted through the integration of custom-built plug-ins to emulate emerging technologies that were not represented within the existing model framework. This approach enabled the provision of quantitative insights on technologies in the early phase of development.

Limiting the number of variations tested through DES was critical for preventing combinatorial explosion. A small number of hypothesised applications and environments gained from matrix wargaming were represented within the model. Simulating both emerging technologies and current capabilities within these bounded environments produced insights about technology performance and allowed comparison against what the FLCs can do already, addressing the question: What does the concept provide us above and beyond our current capabilities?

A key benefit of DES in this case was the ability to vary technical parameters of the emerging technology. Because there was a high degree of uncertainty in the data used as inputs when emulating the technology, sensitivity testing was conducted to explore the technical trade space and illuminate the technical changes that generated significant variation in the simulation results in terms of military utility. However, because the number of combinations of elements within the simulation grows exponentially as the number of those elements increases, restrictions were placed on which variables could be changed.
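One common way of keeping such a sensitivity sweep tractable is to fix most parameters at baseline values and vary only a shortlist, as in the Python sketch below. The parameter names and levels are hypothetical, and the simulation itself is stubbed out; the sketch simply illustrates how restricting the varied variables contains the combinatorial growth described above.

# Sketch of a restricted sensitivity sweep to avoid combinatorial explosion.
# Only the short-listed parameters are varied; everything else stays at baseline.
# Parameter names, levels and the run_simulation stub are hypothetical.
from itertools import product

baseline = {"sensor_range_km": 10, "speed_kmh": 40, "payload_kg": 5,
            "endurance_h": 6, "unit_cost_k": 250}

# A full factorial over all five parameters at 3 levels each is 3**5 = 243 runs;
# restricting the sweep to two parameters keeps it to 3**2 = 9 runs.
varied = {"sensor_range_km": [5, 10, 20],
          "endurance_h": [3, 6, 12]}

def run_simulation(params: dict) -> float:
    """Stub standing in for the mission-level simulation; returns a utility score."""
    return params["sensor_range_km"] * 0.6 + params["endurance_h"] * 0.4

for combo in product(*varied.values()):
    params = baseline | dict(zip(varied.keys(), combo))
    print(params, "->", round(run_simulation(params), 2))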

COLLATING RESULTS

Throughout the study, information was systematically transferred from one method to the next, with the outputs of preceding methods serving as inputs for those subsequent, progressively bounding the analysis scope.

Upon completion of all methodological phases, the accumulated information was synthesised and employed as inputs for a decision conference aimed at prioritising the technologies according to their assessed potential military utility, taking into account Rough Order of Magnitude (ROM) costs³ available at the time.

The relative weight assigned to each input during the decision conference varied based on the perceived value and priorities of the stakeholders involved, allowing the process to reflect differing strategic perspectives and operational needs.

FIGURE 4 MACE GRAPHICAL USER INTERFACE (RIGHT) AUGMENTED REALITY MISSION REHEARSAL AND OBSERVATION (ARMOR) (LEFT)

CONCLUSION

There is mounting pressure on the field of OR to deliver results with greater speed and efficiency, particularly within military contexts where rapid innovation is no longer optional but essential for maintaining military advantage. Although data availability has long been a challenge, it is becoming an increasingly pervasive issue as the pace of the innovation cycle increases. However, initial attempts at responding to this increase in pace have shown that continuous collaboration between analysts and technology developers can help tackle this issue. Utilising a combined approach of soft and hard OR techniques allows for a fast and effective methodology that enables the real-time refinement of emerging technology. This positively contributes towards shorter procurement cycles and ensures the MOD remains at the forefront of innovation.

In the words of Elizabeth Harrison, Scientific Advisor, Defence Science and Technology (DST): ‘This work supported the critical “so what” of technology development for military application, steering customers towards concepts which would add the most utility within an operational context. The use of modelling software and in-person wargames provided a balance of scientific rigour with military expertise, optimising the assessment’s overall validity and allowing for OR to be conducted alongside the development of emerging technology. I hope this operational analysis is given wide exposure for its method and results to UK MOD.’

NOTES

1. Defined within the study as being of relatively little significance to the scenario outcome and therefore able to be abandoned or destroyed.

2. Defined within the study as being necessary for the operation.

3. A cost estimate made when specific details are scarce.

Jay Williams is an Operational Analyst at the Defence Science & Technology Laboratory (Dstl), an executive agency of the Ministry of Defence (MOD) that brings strategic advantage to UK defence and security through science and technology. Jay has analytical experience across multiple military domains, with a passion for emerging technology and innovation.

© Crown copyright (2025), Dstl. This information is licensed under the Open Government Licence v3.0. To view this licence, visit https://www.nationalarchives.gov.uk/doc/open-government-licence/. Where we have identified any third party copyright information you will need to obtain permission from the copyright holders concerned. Any enquiries regarding this publication should be sent to: centralenquiries@dstl.gov.uk.

FOR FURTHER READING

[1] UK Government (2025). UK to deliver on 5% NATO pledge as Government drives greater security for working people. Available at: UK to deliver on 5% NATO pledge as Government drives greater security for working people - GOV.UK (accessed 17th July 2025).

[2] Ministry of Defence and DSTL (2025) New procurement rules help rapid fitting of military laser to Royal Navy ships. Available at: New procurement rules help rapid fitting of military laser to Royal Navy ships - GOV UK (accessed 17th July 2025).

[3] Franke, U. (2025) Drones in Ukraine: Four lessons for the West, European Council on Foreign Relations. Available at: Drones in Ukraine: Four lessons for the West | ECFR (accessed 17th July 2025).

[4] Jakes, L. (2025) As Drones Transform Warfare, NATO May Be Vulnerable, The New York Times, 4 June. Available at: Drone Attacks Are the New Front in War. Can NATO Keep Up? - The New York Times (accessed 17th July 2025).

[5] Ministry of Defence (2017) Wargaming Handbook. Available at: Wargaming Handbook - GOV UK (accessed 19th July 2025).

[6] Defence Imagery, Ministry of Defence (2021). Marine Air Ground Task Force Warfighting Exercise. Available at: https://www.defenceimagery.mod.uk Filename: FLEET-20211031-BH0026-033.jpg (accessed 22nd July 2025). © Crown copyright (2025). This information is licensed under the Open Government Licence v3.0. To view this licence, visit https://www.nationalarchives.gov.uk/doc/open-government-licence/.

[7] Battlespace Simulations (2025) Modern Air Combat Environment (MACE), bssim. Available at: MACE | Battlespace Simulations (accessed 17th July 2025)

TECHNOLOGY READINESS LEVELS: CHARTING INNOVATION FROM CONCEPT TO DEPLOYMENT

The global shift to advanced Artificial Intelligence (AI) and Operational Research (OR) systems requires more than theoretical breakthroughs. It calls for reliable and scalable solutions that can integrate seamlessly into complex and sometimes critical operational environments. Bridging the gap between conceptual innovation and measurable impact requires a disciplined, structured progression, from foundational modelling to real-world deployment.

Technology Readiness Levels (TRLs), originally developed by NASA and now widely endorsed by organisations such as the European Commission, UK Research and Innovation (UKRI), and major industrial players, offer a robust nine-stage framework to guide this evolution. In fields such as OR, where the ultimate goal is robust implementation in live systems, TRLs provide a framework for evaluating, costing, ensuring reliability and safety, and communicating the progression of technological maturity.

ORIGINS AND FRAMEWORK

The origins of TRLs trace back to NASA in the 1970s, where they were conceived as a systematic approach to assess the maturity status of space-bound technologies. Formalised in 1989 and extended to a nine-level model by John Mankins in 1995, the TRL framework provided a consistent method to manage technical risk in high-stakes missions. Over time, its utility was recognised far beyond aerospace. Institutions such as the US Department of Defense, the European Space Agency, and ISO (under standard 16290:2013) adopted TRLs to guide procurement, investment, and innovation.

In 2014, the European Commission integrated TRLs into Horizon 2020, institutionalising them as a core component of research and innovation policy. Today, TRLs underpin project assessments across Horizon Europe (https://bit.ly/HorizonEurope_official), UKRI (ukri.org), and industrial technology development programmes spanning sectors from manufacturing and transport to healthcare and energy. The scale ranges from TRL 1, where basic scientific principles are observed, to TRL 9, which represents a technology proven through successful real-world operation. TRLs 1–3 capture fundamental research and early conceptual development. TRLs 4–6 represent progressive stages of prototype validation and demonstration, beginning in laboratory settings and advancing into increasingly relevant operational environments. TRLs 7–8 involve field qualification, industrialisation, system integration, and regulatory alignment, while TRL 9 confirms full operational deployment. These stages provide a structured map for aligning technical progress with investment, planning, and risk management. In AI and OR contexts, this framework helps teams accurately position their innovations, whether at the level of theoretical modelling, validated prototypes, or deployable systems.
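For teams tracking many projects at once, this banding can be captured in a simple lookup. The Python sketch below restates the stages just described as a small data structure; the band labels are paraphrases of the text above and the structure itself is purely illustrative.

# Minimal TRL lookup restating the bands described above (illustrative only).
TRL_BANDS = {
    range(1, 4): "Research: basic principles through early conceptual development",
    range(4, 7): "Development: prototype validation and demonstration",
    range(7, 9): "Qualification: field qualification, industrialisation and integration",
    range(9, 10): "Operations: proven through successful real-world deployment",
}

def trl_band(level: int) -> str:
    for levels, description in TRL_BANDS.items():
        if level in levels:
            return description
    raise ValueError("TRL must be between 1 and 9")

print(trl_band(3))   # Research band
print(trl_band(6))   # Development band
print(trl_band(9))   # Operations band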

TRLS IN PRACTICE: THE CASE OF TRANSPORTATION SCHEDULING

Consider an OR team developing an optimised bus scheduling system for a city-wide transit network. At TRLs 1–2 (Figure 1), the emphasis is on concept formulation, mathematical modelling and the exploration of algorithmic solutions. Techniques such as mixed-integer linear programming and metaheuristics are considered and compared. At this stage, the emphasis is on theoretical soundness, performance benchmarking, and computational relevance.

TRL 3 is a critical step where ‘feasibility’ is confirmed. At this stage, analytical studies or proof-of-concept tests using synthetic data within a simulation environment provide confidence that the proposed technology (in this case, an algorithm) can meet requirements in terms of performance, cost, integration, and related factors.
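A TRL 3 proof of concept of this kind might look like the Python sketch below: a toy assignment of synthetic timetabled trips to a small fleet, built with the PuLP modelling library, minimising the number of buses used while forbidding overlapping trips on the same vehicle. The trip times, fleet size and objective are invented for illustration and do not represent any particular transit network or team’s approach.

# Toy TRL 3 proof of concept: assign a small synthetic fleet to timetabled trips
# so that no bus is double-booked, minimising the number of buses in service.
# All data below is synthetic; the model is illustrative only.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, value

trips = {"t1": (0, 60), "t2": (30, 90), "t3": (100, 160), "t4": (120, 200)}  # start/end minutes
buses = ["b1", "b2", "b3"]

def overlaps(a, b):
    return trips[a][0] < trips[b][1] and trips[b][0] < trips[a][1]

prob = LpProblem("bus_assignment", LpMinimize)
x = {(t, b): LpVariable(f"x_{t}_{b}", cat=LpBinary) for t in trips for b in buses}
used = {b: LpVariable(f"used_{b}", cat=LpBinary) for b in buses}

prob += lpSum(used.values())                       # objective: minimise buses used
for t in trips:                                    # every trip covered exactly once
    prob += lpSum(x[t, b] for b in buses) == 1
for b in buses:                                    # no bus runs two overlapping trips
    for t1 in trips:
        for t2 in trips:
            if t1 < t2 and overlaps(t1, t2):
                prob += x[t1, b] + x[t2, b] <= 1
        prob += x[t1, b] <= used[b]                # link assignments to bus usage

prob.solve()
for (t, b), var in x.items():
    if value(var) == 1:
        print(f"{t} -> {b}")
print("buses used:", int(value(prob.objective)))

Passing such a toy model’s results against hand-checked schedules is exactly the kind of analytical study that would build the confidence TRL 3 asks for, before any real operational data is involved.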

Advancing to TRLs 4–5, the technology begins to incorporate elements of the so-called ‘relevant environment’ where the technology will be deployed operationally later. For instance, it begins to interact with real-world data sources, such as anonymised GPS logs, historical ridership statistics, and timetable archives. In a laboratory setting, these inputs are used to stress-test the algorithms under realistic scenarios. Integration and performance challenges emerge, stakeholder requirements are introduced, and system behaviour is evaluated against defined success criteria. This phase bridges theoretical development with engineering viability.

FIGURE 1 TRL FRAMEWORK (RESEARCH, DEVELOPMENT AND DEPLOYMENT) AND ITS NINE STAGES

At TRL 6, considered the second major milestone in the TRL scale, the prototype of the full technology (e.g., an algorithmic scheduling approach) is demonstrated in a constrained operational setting that is representative of the future deployment environment. For example, this could mean trialling the algorithm on historical datasets or in an offline format where it is not yet connected to live systems, with data uploaded manually. Real-world performance indicators (such as reductions in delays, improved trailer fill, or increased vehicle utilisation in transportation problems) are collected, providing evidence that the system can function outside controlled simulations. This marks a pivotal transition from internal confidence to external validation, offering the first concrete proof of robustness and operational potential.

TRL 7 involves expanding deployment from a representative setting to a broader operational environment. At this stage, the algorithmic approach is connected to upstream and downstream systems, receiving data via API feeds and generating outputs directly into operational workflows. Operational staff interact with the system in real time, uncovering usability challenges and testing integration under authentic operating conditions.

TRL 8 includes full system qualification, often conducted through parallel runs with incumbent solutions to validate performance in real-world scenarios. For example, the algorithm may be executed alongside the company’s existing scheduling software across the entire distribution network, with results compared directly under identical operating conditions. These qualification trials confirm not only technical performance but also compliance with safety, regulatory, and labour requirements, ensuring readiness for full adoption.

Finally, TRL 9 denotes system-wide deployment. At this level, the scheduler is embedded across the full transit infrastructure, operating continuously and reliably. It interfaces seamlessly with human resources, maintenance, and supply chain information systems, while benefiting from routine updates and dedicated support mechanisms. The system delivers measurable operational value, earns full trust from stakeholders, and aligns with long-term organisational objectives.

This progression highlights how TRLs structure the innovation and technology development pathway from initial concepts to full-scale deployment and operations. Crucially, each stage involves not only technical validation (in terms of computing performance, scalability, and result quality), but also consideration of the surrounding ecosystem, including user acceptance, regulatory compliance, cost modelling, and long-term maintainability.

STRATEGIC APPLICATION AND COMMON PITFALLS

Effectively leveraging TRLs requires more than technical progress; it demands disciplined transparency. One of the most frequent challenges is TRL inflation: prematurely declaring a technology field-ready when it has only been validated in controlled or simulated settings. This misrepresentation not only undermines credibility but also increases the risk of failure during funding reviews or early-stage deployment.

A balanced, well-evidenced TRL narrative enables early risk detection and strengthens both academic and commercial propositions, ensuring alignment between innovation ambitions and delivery capabilities.

Equally problematic is the neglect of non-technical dimensions such as regulatory approvals, stakeholder engagement, integration constraints and timelines, and logistical or supply chain dependencies. These factors form an essential part of TRL assessments, and if left unaddressed, can stall innovations that are otherwise technically sound.

To mitigate such risks, organisations like Airbus have instituted formal TRL review processes. Project teams must submit detailed documentation and address structured, standardised questions aligned with the specific TRL stage they are aiming to achieve. Panel reviews are conducted by independent groups comprising domain technical experts and business representatives, with progression contingent on satisfactorily addressing the actions and risks identified at earlier stages. These checkpoints serve not only as technical audits but also as mechanisms for defining the target schedule at the outset of technology development, and as gatekeepers during project execution to ensure funding continuity and effective resource allocation.

For researchers applying to programmes such as Horizon Europe, Innovate UK, or industry-sponsored initiatives, mapping work packages to specific TRL levels is now a common expectation. Accurate TRL alignment enhances proposal clarity, improves costing estimates, and supports recruitment planning. It also fosters credibility with funders, who increasingly prioritise not just scientific excellence but demonstrable pathways to impact.

Understating technology readiness can obscure progress and limit investment in operational integration, while overstating it risks rejection or costly mid-project course corrections. A balanced, well-evidenced TRL narrative enables early risk detection and strengthens both academic and commercial propositions, ensuring alignment between innovation ambitions and delivery capabilities.

CAREER DEVELOPMENT THROUGH TRLS

Beyond guiding innovation, the TRL framework also provides a strategic lens for career development (Figure 2). For students and researchers, understanding where their skills and interests align along the TRL spectrum can inform both immediate training goals and long-term career trajectories.

At TRLs 1–3, the focus is on foundational science and exploratory research, ideal for academic careers. Here, success is driven by strengths in hypothesis-driven inquiry, theoretical modelling, algorithmic design, and publication in peer-reviewed journals. Researchers working at this stage often contribute to the conceptual and analytical foundations of future technologies.

By aligning one’s skill set with the demands of specific TRL stages, individuals can plan targeted development and build credibility across sectors.

Those operating within TRLs 4–6 occupy a space where theory meets application. This transitional zone requires interdisciplinary expertise, including coding prototypes, integrating real-world datasets, and managing validation workflows in controlled settings. Proficiency in tools like Python, Gurobi, Git, and containerisation frameworks is essential. These roles thrive in applied research centres, industrial R&D labs, and Centres for Doctoral Training (CDTs), where academic rigor supports practical innovation.

At TRLs 7–9, roles evolve toward technology industrialisation, focusing on deployment and operational integration. Professionals here need competencies in agile development, DevOps (https://en.wikipedia.org/wiki/DevOps), regulatory navigation, certification procedures, and stakeholder communication. Experience in product delivery, systems engineering, and cross-functional project management becomes crucial. These positions are typically found in industry, start-ups, and large-scale collaborative ventures. By aligning one’s skill set with the demands of specific TRL stages, individuals can plan targeted development and build credibility across sectors. Whether aiming for academic distinction, applied innovation, or industry leadership, mastering the TRL framework equips researchers to navigate their careers with clarity and intent.

SUMMARY

TRLs provide a shared vocabulary for describing innovation and technology maturity, and an increasingly vital tool for assessing risks along the development pathway, securing funding, fostering industrial collaboration, and driving impactful research. For professionals in AI and OR, TRLs serve not only as a framework for structuring project development but also as a guide for aligning research with real-world needs and constraints.

The impact of embedding TRLs can be seen in the U.S. Government Accountability Office (GAO) Assessments of NASA Major Projects in 2024 (https://www.gao.gov/products/gao-24-106767): “Of the 11 projects that reported critical technologies in 2024, the projects assessed that nine matured their technologies to technology readiness level 6 by their preliminary design review. Achieving this level involves demonstrating a representative prototype of the technology in a relevant environment. GAO’s past work shows that maturing technologies prior to product development can help reduce technology-related cost increases and schedule delays”. In 2024, NASA’s portfolio cut cumulative cost overruns from $7.6B to $4.4B and schedule slippage from 20.9 to 14.5 years.

FIGURE 2 MODELLING ONE’S CAREER AFTER THE TRLS FRAMEWORK

As the UK’s Research Excellence Framework (REF) places growing emphasis on demonstrable impact, particularly in terms of economic and societal value, researchers in AI and OR must be equipped to navigate the TRL scale with clarity and precision. This includes understanding how to plan for higher TRL achievement, evaluate readiness, and engage stakeholders across sectors.

By embedding technical excellence within pathways to deployment, TRLs help ensure that promising innovations move effectively from theory to implementation. In doing so, they support the transformation of high-quality science into usable, trusted, and scalable technologies that deliver meaningful contributions to the economy and society. Complementary concepts such as Business Readiness Levels and Customer Readiness Levels extend this logic to commercial dimensions, asking whether a viable business model exists and whether a customer base is ready to adopt the product or service. These perspectives, particularly valuable in start-up contexts, emphasise that real impact depends not only on technological maturity but also on market and business readiness.

Mosab Bazargani is a Lecturer in Data Science and AI at Bangor University. His research focuses on large-scale industrial optimisation, connecting theory with real-world applications. He has worked as an Operational Research Data Scientist at Tesco and a Data Analytics Contractor at Marsh McLennan, and now collaborates with Airbus Endeavr, Air France–KLM, and Mace.

Laurent Montoya has over 30 years of experience in the aerospace industry. Within the Innovation & Technology Management organisation of Airbus Defence and Space, he is responsible for developing technology roadmaps, project setup, and maturity assessment, across a portfolio of projects related to Mission Optimisation and Computing & Data Processing technologies.

Professor Paul S. Spencer is Pro-Vice-Chancellor (Research) at Bangor University, responsible for research, innovation, and commercialisation across the institution. He is also a director of the University’s science park, M-SParc, and of a joint venture company that owns and operates the University’s research vessel Prince Madog.

BRANDON TRUST: HARNESSING OPERATIONAL RESEARCH TO FOSTER EQUALITY, DIVERSITY, AND INCLUSION

ISMA SHAFQAT

In the third sector, the need for evidence-based approaches to challenges like Equality, Diversity, and Inclusion (EDI) has never been greater. At Pro Bono OR, we aim to empower charitable organisations to make informed decisions through Operational Research (OR). By applying analytical methods, charities can better understand their impact, improve their practices, and enhance their services, all while creating more inclusive and equitable environments for the people they serve.

For many charities, particularly those focused on vulnerable or underserved communities, ensuring equality and inclusivity in every aspect of their work is a priority. However, measuring EDI can be complex. Charities may struggle to capture the full scope of how inclusivity manifests within their organisation, or how staff and service users experience equality. Without a solid foundation of data, it can be challenging to identify where improvements are needed or track progress over time. This is where operational research can make a significant difference.

Brandon Trust, a UK charity working with individuals with learning disabilities and autism, is a great example of how OR can support organisations in addressing their EDI challenges. Brandon Trust’s mission is to empower people with learning disabilities and autism to live the lives they choose. As part of their ongoing strategic planning, the charity identified the need to understand better how EDI was being experienced by their staff. They needed to establish a clear, evidence-based starting point to inform their future inclusion strategies, and that’s where we came in.

Our Pro Bono OR volunteers involved in this engagement piece began by conducting an in-depth discovery phase. The goal was to gain a thorough understanding of the organisation’s existing practices and the challenges they faced regarding EDI. Through a series of discussions and problem-structuring workshops, they engaged with both management and staff to explore how EDI was perceived within the organisation.

A significant part of the analysis involved reviewing past EDI-related concerns and evaluating existing organisational data. This helped to identify gaps and areas for further exploration. With this in mind, a specialised tool was designed to assess the current EDI standing of the charity. This tool was intended to gather baseline data on how staff experienced EDI in terms of opportunity, fairness, and inclusion. The design process was collaborative, ensuring that the instrument met Brandon Trust’s specific requirements and was aligned with their strategic vision.

The volunteers, Prof. Vincent Charles and Dr Tatiana Gherman, who worked on the project shared:

“What could be more fulfilling than contributing our knowledge and expertise to a good cause? Giving back to society has always been a dream of ours, and The OR Society has made it a reality. In the process, we have met some amazing people who are truly passionate about their mission and the communities they serve! We thought, ‘They’re actually doing something really good for the community; perhaps it is now the right time to do something for them.’ Pro Bono work has the potential to impact organisational direction and strategy in a very meaningful way, and we are honoured to be a part of that.”

The final instrument was then delivered to Brandon Trust, along with a detailed guide on how to implement it. The volunteers worked closely with the Director of People and Organisational Development to ensure that the tool was integrated into their ongoing monitoring efforts. By capturing staff feedback on EDI issues, the tool enables the charity to track progress, identify potential barriers, and uncover opportunities for further improvement. This data-driven approach is crucial in supporting Brandon Trust’s long-term goal of creating a more inclusive environment for both their staff and the people they support.

The success of this project extends beyond the immediate benefits of the tool itself. The collaboration also led to the proposal of an impact study to track EDI progress over time. This periodic assessment would allow Brandon Trust to identify emerging trends, refine their approach, and make evidence-backed decisions that continue to promote inclusion throughout the organisation.

Working with Brandon Trust has been an insightful experience, demonstrating how operational research can be used as a catalyst for driving inclusive change within the third sector. By using OR methodologies to measure and improve EDI, charities like Brandon Trust can ensure that they are truly serving their diverse communities in the best way possible. Through this partnership, we hope to encourage more organisations to utilise OR in their own journeys towards greater equality and inclusivity, enabling them to unlock new opportunities and make a lasting, positive impact on society.

Isma Shafqat, Pro Bono OR Manager at the OR Society, leads initiatives that apply operational research to support third-sector organisations, drawing on her STEM background and leadership in education and managing strategic corporate partnerships.

Contact email: ProBonoOR@theorsociety.com

HOW NOT TO THROW GOOD MONEY AFTER BAD IN AN UNCERTAIN WORLD

JACCO THIJSSEN

What constitutes value for money (VfM) for (public) infrastructure projects? This question actually contains two sub-questions. First, what constitutes value? Second, how much value needs to be created before a project is deemed fundable? In the UK, the Treasury uses the Benefit to Cost ratio (BCR) to measure value. That is, one estimates the present value of the expected (monetized) benefits of the infrastructure asset (possibly including wider economic benefits) and divides those by the present value of the expected costs to construct the asset.

The DfT uses a defined set of thresholds for the BCR [1] in classifying an asset’s VfM: from ‘Very High’ (BCR greater than 4), down to ‘High’ (between 2 and 4), ‘Medium’ (between 1.5 and 2), ‘Low’ (between 1 and 1.5), ‘Poor’ (between 0 and 1), and finally ‘Very poor’ (less than 0).
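Expressed in code, the DfT banding is a simple mapping. The small function below reproduces the thresholds quoted above; how ties at the exact boundary values (4, 2, 1.5, 1 and 0) are resolved is an assumption, since the bands are quoted only as ranges.

```python
def dft_vfm_category(bcr: float) -> str:
    """Map a Benefit to Cost Ratio to the DfT value-for-money bands quoted above."""
    if bcr >= 4:
        return "Very High"
    if bcr >= 2:
        return "High"
    if bcr >= 1.5:
        return "Medium"
    if bcr >= 1:
        return "Low"
    if bcr >= 0:
        return "Poor"
    return "Very Poor"

print(dft_vfm_category(1.7))  # "Medium" - the band quoted later for HS2
```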

Even though BCR calculations feature heavily in approving infrastructure spending across the globe, it is well-documented that the majority of large-scale (infrastructure) construction projects are finished over budget and behind schedule, often with large differences between ex ante and ex post project evaluations [2], especially when construction times are long and uncertain. Appraisal estimates tend to be too optimistic in the sense that the reported BCR is too high, with almost 9 out of 10 projects having higher costs than estimated and an average cost overrun of 28% [3] (for rail projects, this increases to 45%). Cost overruns are more prominent the longer the implementation phase of the project [4].

the majority of large-scale (infrastructure) construction projects are finished over budget and behind schedule

Even though these facts are well-known, cost overruns and construction delays continue to be with us. Some high-profile recent examples are the Elbphilharmonie concert hall in Hamburg, Germany, the Olkiluoto 3 nuclear plant in Finland, railways running between the cities of Dali, Ruili and Baoshan in the Southwestern province of Yunnan, China, and the Crossrail project in London, UK. A particularly interesting example is Berlin Brandenburg Airport, which, after an initially planned opening date of October 2011, finally opened for commercial traffic, in the middle of a pandemic, on 31 October 2020. The airport is now estimated to have cost EUR 10.3 bn, against an initial budget of EUR 2.83 bn.

This presents a challenge for the OR community to develop methods that allow managers: (i) to better appreciate the interplay between the uncertain evolution of free cash flows and construction costs; (ii) to value the flexibility of the option to abandon a project before construction has finished; and (iii) to integrate this flexibility in the optimal timing of initial investment.

using project-independent BCR thresholds is not helpful for the justification of a decision to start construction on a particular project.

In a recent paper [5], I focus on a particular issue, namely that actual revenues and costs are uncertain and are accrued at different points in time. After all, revenues cannot be earned before construction has finished, which, in many cases, is quite far in the future. There are numerous ways for the revenues and construction costs to be higher or lower than expected. The BCR is not a good measure for the evaluation of projects in the presence of both revenue and cost uncertainty, because a construction time delay affects the BCR in two ways. First, the construction costs rise because construction takes longer than expected. Second, because revenues will be generated at a later point in time, these revenues are worth less due to additional discounting. In addition, using project-independent BCR thresholds is not helpful for the justification of a decision to start construction on a particular project, because it ignores the fact that the optimal investment threshold for projects with different risk characteristics will be different. This is one of the basic insights from the real-options approach to capital budgeting [6]. However, if the BCR threshold is project-dependent, then it is more difficult to compare BCRs across projects. In fact, the notion of BCR itself is problematic, because to value the revenues, the time at which construction finishes has to be accounted for, which implies that benefits and costs cannot easily be separated. To put it succinctly, the clock of the numerator starts ticking when that of the denominator stops!
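A back-of-the-envelope calculation illustrates this double effect. In the sketch below, which uses invented figures and a flat discount rate purely for illustration, benefits only start once construction ends, so a delay simultaneously inflates the discounted costs and pushes the benefits further into the future.

```python
# Illustrative only: how a construction delay squeezes the BCR from both sides.
def present_value(cashflow_per_year, start_year, end_year, rate=0.035):
    return sum(cashflow_per_year / (1 + rate) ** t for t in range(start_year, end_year))

def bcr(build_years, annual_cost=100, annual_benefit=60, horizon=60):
    pv_costs = present_value(annual_cost, 0, build_years)               # costs during construction
    pv_benefits = present_value(annual_benefit, build_years, horizon)   # benefits only afterwards
    return pv_benefits / pv_costs

print(round(bcr(build_years=8), 2))   # on-time plan
print(round(bcr(build_years=12), 2))  # four-year delay: higher costs, later (more discounted) benefits
```

With these invented numbers, a four-year delay is enough to drag the ratio from comfortably above 1 to below it, even though the annual benefit itself is unchanged.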

Another issue with the BCR is that it ignores the fact that, during the construction process, the decision could be taken that the project should be abandoned. Examples of large-scale projects that were abandoned are the World Islands in Dubai, the Wonderland theme park in Chenzhuang, China, Marble Hill nuclear power plant in the USA, and Ryugyong hotel in Pyongyang, North Korea. In [5] I embed, within a construction project’s initial valuation, the option to abandon construction. It is to be expected that the presence of such an option creates value, because it allows for a ceiling to the losses that can be incurred due to construction delays and cost overruns. The flip side is, however, that such abandonment options are costly. For example, The Washington Post reported, on 22 September 2020, that the companies building the Purple Line, a light-rail project in the US state of Maryland, had stopped construction amid disputes with the state about cost overruns. Packing up the project required the remaining workers to secure 16 miles of construction sites—partly built bridges, a tunnel and miles of ripped up roads—through several counties. At that time Maryland transit officials said they were still trying to reach a settlement with the project’s concessionaire over what were claimed to be about USD 800 million in delay-related cost overruns.

A real option is like a financial call or put option, but its underlying asset is real rather than financial

In short, there is a need for the development of a VfM measure that: (i) is explicitly dynamic; (ii) does not rely on a separation of benefits and costs; (iii) incorporates the value of the option to abandon the project; and (iv) has a project-independent VfM threshold. In the context of a real-options model with both construction costs and revenues, I introduce the value ratio (VR) as an easy-to-interpret and straightforwardly implementable VfM measure that satisfies all four criteria.

A real option is like a financial call or put option, but its underlying asset is real rather than financial. In the case of infrastructure investment, one has the right, but not the obligation, to start construction at a time of one’s choosing. In addition, once construction has started, one has the right, but not the obligation, to abandon construction any time before construction has finished. The mathematical tools that have been developed for valuing financial options, the most famous being the Black-Scholes formula for European call options, can then be adapted to value such projects.

In terms of OR techniques used, the model is based on continuous-time Markovian stochastic processes, although this could easily be simplified to discrete-time Markov chains. The data inputs are quite similar to those for typical BCR calculations, with the caveat that measures of uncertainty, in particular (rates of) standard deviations around revenue growth and construction progress, are central to the valuation exercise. In standard BCRs these are often used to provide confidence intervals around base-case calculations. In a real options analysis, the confidence interval is an integral part of the valuation. The main trade-off that is emphasised in a real options analysis is between the value that is released by the asset upon investment (its net present value) and the value of waiting for more information. Typically, the value of waiting is increasing in the uncertainty surrounding the value of the underlying asset. Therefore, when there is more uncertainty, the value of the underlying asset must reach a higher ‘hurdle’ (the point at which investing is worth more than continuing to wait) before investment is optimal.

In [5], I define the value ratio (VR) as the expected present value of starting construction today relative to the project’s value under the optimal investment timing decision. Under this measure, a project is value for money whenever the VR exceeds 1. So, while the VR threshold is project-independent, both its numerator and denominator are not and are heavily dependent on the way one models the stochastic evolution of revenues and (construction) costs. In addition, computation of the VR does not rely on the ability to separate costs and benefits.
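The two-factor model in [5] is beyond a short snippet, but the flavour of such a ratio can be shown in the textbook single-factor real-options setting of [6], in which project value follows a geometric Brownian motion. The sketch below is an illustration under those textbook assumptions, not an implementation of the paper’s model: it computes the optimal investment hurdle and a VR-style ratio of the value of investing now to the value under optimal timing.

```python
# Textbook real-options illustration (after Dixit & Pindyck [6]); NOT the model of [5].
# Project value V follows a geometric Brownian motion; I is the (sunk) investment cost.
from math import sqrt

def value_ratio_single_factor(V, I, r=0.05, delta=0.03, sigma=0.3):
    """Ratio of the value of investing now to the value under optimal investment timing."""
    a = (r - delta) / sigma**2
    beta = 0.5 - a + sqrt((a - 0.5) ** 2 + 2 * r / sigma**2)  # positive root, beta > 1
    v_star = beta / (beta - 1) * I                            # optimal investment threshold
    npv_now = V - I                                           # value of committing today
    if V >= v_star:
        option_value = npv_now                                # waiting no longer adds value
    else:
        option_value = (v_star - I) * (V / v_star) ** beta    # value of optimal waiting
    return npv_now / option_value

print(round(value_ratio_single_factor(V=120, I=100), 2))  # well below 1: waiting beats investing now
print(round(value_ratio_single_factor(V=400, I=100), 2))  # reaches 1: immediate investment is optimal
```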

As an illustration, the model is applied to the High Speed 2 (HS2) project in the United Kingdom. This is a proposed high-speed railway that was planned to be built between London and Birmingham (first phase) and then further north to Manchester and Leeds (second phase). Over the years the project has suffered from cost overruns and political controversy. For example, the second leg has been scrapped and even the first phase is now in doubt. I use, as much as possible, data provided in the government’s case for the project [7], but it must be stressed that a real options analysis was not part of the original strategic case and, thus, several parameter values needed for my analysis had to be ‘guesstimated’.

Nevertheless, my results are robust against parameter changes. The DfT reports estimated benefits and costs that imply a BCR of 1.7, thus rendering the project medium VfM. By my calculations, the value of immediate (2013) investment is GBP 1.68 billion, whereas the value of investment at the optimal time is GBP 11.71 billion; that is, a VR of 0.14. Hence, investment is nowhere near optimal and the probability that investment will be optimal at any time in the next 10 years is only 4.3%.

In conclusion, a combined analysis of benefit and cost uncertainty will typically lead to a more cautious estimate of VfM and, thus, make it less likely that a particular project reaches a pre-set VfM threshold. However, introducing additional optionality in the project, e.g., by staging the investment, will generally increase the VR. The main reason for this is that if you view a project as a sequence of smaller projects rather than as one big now-or-never project, you are capping the downside potential in case the future evolves less positively than expected at the time the initial investment decision was made. While abandoning a high-profile project midstream may be reputationally damaging, it surely beats throwing good money after bad.

Jacco Thijssen is Professor of Mathematical Finance at the University of York. His research interests include the theory of investment under uncertainty (“real options”) and its applications in OR. Jacco is a Board member of the Operational Research Society and Chair of its Education Committee.

FOR FURTHER READING

[1] https://www.gov.uk/government/publications/percentage-of-dfts-appraised-project-spending-that-is-assessed-as-good-or-verygood-value-for-money/value-for-money-indicator-2019

[2] Pohl, G., and D. Mihaljek (1992), “Project evaluation and uncertainty in practice: A statistical analysis of rate-of-return divergences of 1,015 world bank projects.” World Bank Economic Review, 6, 255–277. https://doi.org/10.1093/wber/6.2.255

[3] Flyvbjerg, B., M. Skamris–Holm, and S. Buhl (2002), “Underestimating costs in public works projects: Error or lie?” Journal of the American Planning Association, 68, 279–295. https://doi.org/10.1080/01944360208976273

[4] Flyvbjerg, B., M. Skamris–Holm, and S. Buhl (2004), “What causes cost overrun in transport infrastructure projects?” Transport Reviews, 24, 3–18.

[5] Thijssen, J. (2022), “Optimal investment and abandonment decisions for projects with construction uncertainty.” European Journal of Operational Research, 298, 368–379. https://doi.org/10.1016/j.ejor.2021.07.003

[6] Dixit, A. and R. Pindyck (1994), Investment under Uncertainty. Princeton University Press.

[7] Department for Transport (2013), The Strategic Case for HS2. Crown Copyright. Available at https://www.gov.uk/government/publications/hs2-strategic-case [last accessed: 8/8/2025].


EMPOWERING INSIGHTS: AUTOMATED DATA INFRASTRUCTURE FOR BRAND PERFORMANCE

In the Consumer-Packaged Goods (CPG) space, maintaining brand equity demands a comprehensive understanding of brand health and market performance across one’s portfolio. For Barilla, which operates in several food categories, much of the recent interest has been focused on the US market, where consumer preferences are increasingly influenced by health-conscious trends. Market insights into the US pasta market (valued at $9.24 billion in 2024 [1]) identified a rapidly growing protein pasta products segment.

ADITYA PANDEY, NITIN MANHAR DHAMELIA AND BO-YEN TSOU

Barilla, a house of Italian food brands, has long leveraged traditional market performance indicators such as sales data and market share – similar in operation to other large enterprise businesses. However, in today’s multifaceted landscape, where consumer behaviour is increasingly shaped by multi-platform interactions, traditional metrics can potentially benefit from complementary digital datapoints, adding colour and depth to the state of the market at a given point in time.

Recognising this, Barilla began a journey of experimentation in activating workstreams designed to explore different ways of leveraging additional sources from the vast digital realm. As part of those workstreams and internal ESG (Environmental, Social and Governance) initiatives, Barilla’s London Hub collaborated with MSc Business Analytics students from the University of Bristol to develop an integrated data concept that synthesised multiple digital data points to provide a comprehensive view of the brand’s market performance. This innovative concept provided at-a-glance insights into online consumer behaviour – complementing current tracking methodologies to offer an enhanced picture of the existing market.

With an intuitive user interface, the tool could enable marketing teams to track brand performance, identify emerging trends, and assess the effectiveness of marketing campaigns [2]. Ultimately, the solution aimed to empower data-driven decisions in an organically, yet rapidly evolving category.

… classical OR principles like data structuring, scenario testing, and optimisation can be embedded within modern data infrastructure to support brand decisionmaking in fast-moving markets.

SOLUTION DEVELOPMENT

The longstanding collaboration between Barilla Group and the University of Bristol demonstrates how classical OR principles like data structuring, scenario testing, and optimisation can be embedded within modern data infrastructure to support brand decision-making in fast-moving markets. The solution is built on an automated data infrastructure designed to capture critical market components, including brand positioning across various brand equity platforms, consumer search trends, brand perception data, and competitor mapping. Advanced time intelligence and seasonality tracking further enhance the accuracy of the data, allowing for more precise analysis of market movements.

Annalisa Capobianco, HR Business Partner at Barilla Group, commented:

“The partnership with Bristol University demonstrates how ESG can be lived through action. By combining academic talent with real business challenges, we are cultivating the next generation of data scientists while advancing our commitment to sustainable growth. This collaboration is more than a project—it’s a model for how business and education can come together to create lasting impact for people, brands, and society.”

The project team (Figure 1) designed a robust data infrastructure that integrates seamlessly with Barilla’s existing technology stack, ensuring minimal disruption to their operational workflows.

Leveraging Python’s data cleaning capabilities, the team implemented a Star Schema for data modelling, creating an organized data warehouse that facilitated efficient analysis. The structured Star Schema approach is aligned with OR’s focus on creating simplified, decision-ready representations of complex systems.
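A star schema of this kind separates descriptive ‘dimension’ tables from a numeric ‘fact’ table keyed against them. The pandas sketch below, with invented table and column names, shows the general shape of the approach rather than Barilla’s actual warehouse.

```python
# Illustrative star schema in pandas: dimension tables plus a central fact table.
# Table and column names are invented; this is not Barilla's actual data model.
import pandas as pd

dim_product = pd.DataFrame({
    "product_id": [1, 2],
    "product_line": ["Classic", "Protein+"],
    "category": ["pasta", "pasta"],
})
dim_date = pd.DataFrame({
    "date_id": [202401, 202402],
    "month": ["2024-01", "2024-02"],
    "season": ["winter", "winter"],
})

# Central fact table: one row per product per month, keyed to the dimensions.
fact_brand_signals = pd.DataFrame({
    "product_id": [1, 2, 1, 2],
    "date_id": [202401, 202401, 202402, 202402],
    "search_volume": [9800, 4200, 9500, 5100],
    "mention_sentiment": [0.62, 0.71, 0.60, 0.74],
})

# Typical analysis: join facts to dimensions and aggregate by product line.
monthly = (fact_brand_signals
           .merge(dim_product, on="product_id")
           .merge(dim_date, on="date_id")
           .groupby(["product_line", "month"], as_index=False)["search_volume"].sum())
print(monthly)
```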

FIGURE 1 BARILLA AND UNIVERSITY OF BRISTOL TEAM MEET-UP AT BARILLA ACCELERATION OFFICE, LONDON

Leveraging Python’s data cleaning capabilities, the team implemented a Star Schema for data modelling, creating an organized data warehouse that facilitated efficient analysis.

The proof of concept allowed for the integration of diverse digital data sets, accounting for factors like seasonality and product types, thus enabling Barilla to analyse trends not only in isolation but also within the broader context of market and product category influences. For example, by analysing monthly trends for specific pasta varieties alongside key data points, Barilla was able to gather intelligence on which products resonated most with consumers. Ultimately, this added a layer of brand insight which would play a role in supporting the adjustment of brand marketing strategies.

After comprehensive data testing and transformation, the team developed a user-intuitive concept dashboard using Microsoft Power BI, offering two key functionalities: the ‘proactive’ section, which allows users to select specific criteria for analysis (e.g., month, brand, product type, and product line), and the ‘reactive’ section, which highlights the value real-time insights would bring based on the selected criteria. This intuitive and dynamic dashboard can provide Brand Managers with a bird’s-eye view of the market, enabling them to make informed, real-time decisions coupled with other traditional business data points. This is a classic use case for a Decision Support System (DSS) (Figure 2): the dashboard enables managers to assess the assumptions and trade-offs of tabled options interactively.

OPERATIONAL IMPACT

In the CPG market, the ability to extract actionable insights from data is highly sought after. Since the pandemic, data has increasingly become a strategic asset, and arguably, companies that leverage it are able to make better-informed decisions. A 2023 Salesforce report [3] revealed that 92% of CPG market leaders rely heavily on data to drive decisions.

Through the easy-to-use dashboard, Barilla’s marketing teams can monitor key metrics, track emerging trends, and identify consumer preferences, enabling them to adjust strategies in real-time.

Barilla’s commitment to data-driven decision-making has allowed the company to appreciate and embrace data as a critical lever for success in marketing insights. This integrated data solution concept could potentially enhance Barilla’s agility in responding to market shifts, enabling the company to optimise its marketing strategies and product development efforts.

Through the easy-to-use dashboard, Barilla’s marketing teams can monitor key metrics, track emerging trends, and identify consumer preferences, enabling them to adjust strategies in real-time. By enabling deeper insights into consumer behaviour, the solution not only strengthens Barilla’s existing marketing efforts but also informs strategic decisions regarding product development. For instance, the dashboard has allowed Barilla to monitor the demand for protein pasta products and adapt its portfolio to meet consumer preferences more effectively. This capacity to make informed, data-driven decisions is especially valuable in the context of the rapidly moving food industry.

Moving on from here, the solution deployed at Barilla can be extended into prescriptive analytics, using linear programming or constraint-based optimisation to help allocate promotional budgets across regions or time periods. For example, optimisation models could recommend media spend across campaigns, given constraints on cost, seasonality, and product-category targets, enabling Barilla to achieve higher marketing ROI.
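To indicate what such a prescriptive extension might look like, the sketch below allocates a promotional budget across campaigns with a small linear programme in PuLP. The campaigns, return coefficients and caps are invented assumptions; a real model would estimate them from the data infrastructure described above and would typically add diminishing returns and seasonality.

```python
# Hypothetical promotional budget allocation as a linear programme (PuLP).
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, PULP_CBC_CMD

campaigns = ["search_ads", "social", "in_store", "tv"]
roi_per_pound = {"search_ads": 1.8, "social": 1.5, "in_store": 1.2, "tv": 1.1}  # assumed returns
max_spend = {"search_ads": 40_000, "social": 60_000, "in_store": 30_000, "tv": 120_000}
total_budget = 150_000

model = LpProblem("promo_budget", LpMaximize)
spend = {c: LpVariable(f"spend_{c}", lowBound=0, upBound=max_spend[c]) for c in campaigns}

model += lpSum(roi_per_pound[c] * spend[c] for c in campaigns)   # maximise estimated return
model += lpSum(spend.values()) <= total_budget                   # stay within the budget
model += spend["tv"] >= 0.2 * total_budget                       # e.g. a brand-visibility floor

model.solve(PULP_CBC_CMD(msg=0))
print({c: spend[c].value() for c in campaigns})
```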

The solution is built on three of the four fundamental pillars of marketing: Product, Promotion and Price (Figure 3). The ability to track these in equivalent metric form - alongside real-time data on consumer behaviour and market trends - could enable Barilla to use this collaboration to refine its strategies and maintain a strong brand presence in the market. Ultimately, this tool concept –which acts as a springboard in the workstream - is designed to empower Barilla’s marketing teams with the insights needed to drive growth and optimise brand decision making.

FIGURE 2 DECISION SUPPORT SYSTEM

CONCLUSION AND FUTURE DIRECTIONS

The continued integration of advanced data analytics into Barilla’s marketing strategy represents a significant step forward in the company’s understanding of the trends and dynamics of the US market. The exploration and development of this type of solution underscores the growing importance of operational research and data-driven decision-making in the CPG sector. As Barilla continues to refine its data infrastructure, the company will be able to respond more quickly to market changes, optimise its product offerings, and ensure that its marketing strategies are aligned with consumer demand.

Arsalan Baig, Global DS & AI Sr Manager at Barilla Group, said:

“The Brand Tracking project was a great example of collaboration between Barilla and Bristol University. Working together, we were able to test a proof-of-concept that demonstrated how innovative analytical approaches can provide meaningful insights into brand performance.”

Future iterations of the solution may include the integration of simulation modelling or scenario planning - both classical operational research methods - to test marketing strategies under various demand and budget conditions. This could allow marketing teams to theoretically assess the probabilistic outcomes of their campaigns before committing to full-scale execution.

Enrico Bazzani, AI & Data Engineering Sr Manager at Barilla Group, said of this collaboration:

“The Brand Tracking project was also a great showcase of the Python skills developed by Bristol University students. Through dedicated coaching, they were able to apply advanced coding techniques to real business challenges—something that initially seemed ambitious, but ultimately impressed us with the quality and impact of the results in the proof-of-concept.”

The solution delivered by this joint work is not only a powerful tool but is also one avenue that Barilla can leverage upon to further explore this space, as well as a testament to the increasing business value of operational research in today’s business world. As the field of operational research evolves, solutions like this demonstrate the potential for advanced data analytics to support and help drive meaningful business outcomes in ever-evolving markets.

The authors would like to thank all other team members on this joint work for all their invaluable contributions. These are Enrico Bazzani, Arsalan Baig and Annalisa Capobianco from Barilla, and Sanyukta Jain, Yi Wang, Jun He and Dr Marios Kremantzis from the University of Bristol.

Aditya Pandey is a Data and Analytics Specialist with an MSc in Business Analytics from the University of Bristol and industry experience in automating insights and building scalable Data & Business Intelligence solutions. He has collaborated on projects bridging advanced analytics and Operational Research, with a focus on empowering organisations to make data-driven decisions with strong strategic initiatives.

Nitin Manhar Dhamelia is a senior digital marketing & transformation leader at Barilla Group, driving digital-first and A.I. marketing change across the Fast-Moving Consumer Goods sector, automotive, financial services, and regulated industries.

Bo-Yen Tsou is a data-driven business professional with an MSc in Business Analytics from the University of Bristol, and industry experience in digital marketing, sales, and process improvement, using tools like Data Intelligence to support customer engagement, and drive business growth across regional and international markets.

FOR FURTHER READING

[1] Statista: Pasta, United States. https://www.statista.com/outlook/cmo/food/bread-cereal-products/pasta/united-states#revenue

[2] Martins, N., S. Martins and D. Brandão (2022). Design principles in the development of dashboards for business management. In D. Raposo, J. Neves, and J. Silva (eds) Perspectives on Design II: Research, Education and Practice, 353–365. Cham: Springer International Publishing. https://doi.org/10.1007/978-3-030-79879-6_26

[3] Salesforce, 2023. https://www.salesforce.com/uk/news/stories/consumer-goods-industry-research-2023/2023/

WARGAMING IN SUPPORT OF PLANNING MILITARY OPERATIONS

Planning Army operations is immensely challenging. The efforts of tens of thousands of soldiers and their weapons must be coordinated as part of a strategy to prevail over an enemy who is equally committed to avoiding defeat.

Wargaming has been part of British Army operational planning for over a hundred years [1], and continues to develop. The Army commissioned a new digital wargaming environment, CIRSIUM (Figure 1), to support operational planning at the British framework headquarters of the NATO Allied Rapid Reaction Corps and the two British Army divisional headquarters. It has been developed by Hampshire-based consultancy RED Scientific Ltd.

FIGURE 1 THE CIRSIUM WARGAME DISPLAY DURING A REPLAY OF THE 2008 CONFLICT BETWEEN RUSSIA AND GEORGIA. THE DISPLAY USES STANDARD MILITARY ICONS TO SHOW THE NOMINAL POSITION OF EACH UNIT, WHILE IN PRACTICE EACH UNIT IS HIGHLY DISPERSED (REPRODUCED WITH KIND PERMISSION FROM RED SCIENTIFIC LTD © 2025).

Armed forces use wargaming for many different reasons: as part of training and exercising, to inform the development of doctrine and to support equipment procurement decisions [2]. Each purpose imposes different requirements and constraints on the games. Operational planning in a corps or divisional headquarters is fast paced and the time available for wargaming is limited. It must be possible to complete a game in a matter of hours with just a small team on each side. Despite this, the games must be able to represent all aspects of the modern battlefield and give reasonable estimates of the time and resources required to manoeuvre and engage enemy forces, and the likely outcome of each engagement.

Armed forces use wargaming for many different reasons: as part of training and exercising, to inform the development of doctrine and to support equipment procurement decisions.

A key focus of the design of the wargame is intelligence and target acquisition. Over many decades the increasing lethality of weapons has forced armies to ever greater levels of dispersion. The modern battlefield is very empty – the first rule of battlefield survivability is ‘don’t be seen’, since if you haven’t been seen, you can’t be attacked. While models of individual sensors are well established, understanding of how hundreds, or even thousands, of individual sightings of enemy forces build into an intelligence picture is lacking. The solution adopted was to model in detail, keeping track of every small group of soldiers and vehicles and whether and when they were last sensed, while aggregating this information when presenting it to players so as not to overload them. Information from target acquisition feeds the Intelligence Picture. The players or player teams only get a partial picture of what is happening. They know where their own forces are, but the information they have on the enemy is limited and often out of date (Figure 2). The intelligence picture is built by comparing what has been sensed with what is known about the strengths and organisation of the enemy in order to calculate the most likely distribution of their forces.

The modern battlefield is very empty …

Another key element of the wargame is the representation of the support functions, logistics and equipment support. Armies consume vast quantities of fuel and ammunition; heavily armoured vehicles have limited range before they fail and must be repaired. Therefore, any plan must be sustainable. By the same token, it is only by considering the enemy’s logistics that strategies based on attacking deep targets or manoeuvring to outflank can be represented. In principle, representing logistics is relatively straightforward. Fuel and ammunition are consumed at known rates when units do things. Fuel bowsers and trucks bring more stock from the rear area. If a unit’s stocks run low, they can’t do certain things anymore. The difficulty is modelling the management of the system without adding to the workload of the players. The resupply system is a complex problem for mathematical optimisation.

The game’s approach to time management is innovative, using timesteps, or bounds, of varying length. This allows the game play to focus in detail on critical periods in the battle, without wasting time stepping through less intense periods. It is managed using a Synchronisation Matrix (Figure 3), similar to a civilian Gantt chart, which shows how the activities are distributed in time, complementing the spatial distribution of the map display.

The game’s approach to time management is innovative, using timesteps, or bounds, of varying length.

Balancing the fidelity of the model with its efficiency is critical. In the initial search for options, lots of ideas need to be rapidly evaluated and filtered. Those that survive are developed further before a final choice is made. Once the core strategy is set, it is further tested and refined. Analysts use CIRSIUM, and a range of other tools, to support each stage.

When making these decisions, it is important to consider the range of alternative strategies that the enemy might adopt. Hence the focus on wargaming against an active adversary. While many analytical problems involve optimising across a range of possible scenarios, games, in their broadest sense, are uniquely challenging because the adversary is actively seeking out the weak points in your strategy.

With the stakes so high, it is essential that the modelling is thoroughly tested and reflects real world experience. This validation used data from past conflicts, both more recent, such as the Russia-Georgia conflict in 2008 (Figure 4), and earlier conflicts back to the two World Wars. It has included detailed statistical analysis of particular facets of the conflicts, such as how long individual battles take and how quickly forces can advance. It has also involved modelling whole operations to test the completeness of the modelling, and to ensure the results are realistic. This does not mean each game must always play out in exactly the same way as the real conflict: there is far too much uncertainty in combat to expect that! But if a battle that took 5 days in real life took just a day in the wargame, or if total casualties were many times higher, this would be a significant cause for concern.
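The sanity check just described is easy to express in code: compare game outcomes with the historical record and flag large discrepancies. The snippet below is purely illustrative; the battles, figures and the factor-of-three tolerance are invented and have nothing to do with CIRSIUM’s actual validation data.

```python
# Illustrative validation check: flag wargame outcomes that diverge sharply from history.
# All figures are invented; this is not CIRSIUM's validation data.
import pandas as pd

cases = pd.DataFrame({
    "battle": ["Case A", "Case B", "Case C"],
    "historical_duration_days": [5.0, 2.0, 9.0],
    "wargame_duration_days": [4.0, 0.5, 8.0],
    "historical_casualties": [1200, 300, 2500],
    "wargame_casualties": [1000, 1400, 2300],
})

cases["duration_ratio"] = cases["wargame_duration_days"] / cases["historical_duration_days"]
cases["casualty_ratio"] = cases["wargame_casualties"] / cases["historical_casualties"]

# A battle finishing several times faster, or casualties several times higher,
# than history would be a cause for concern (here flagged at a factor of three).
cases["flag"] = (cases["duration_ratio"] < 1 / 3) | (cases["casualty_ratio"] > 3)
print(cases[["battle", "duration_ratio", "casualty_ratio", "flag"]])
```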

Naturally, having developed the wargame environment for human vs human play, the team is looking at the possibility of automating the roles of one or both players using Artificial Intelligence (AI). The latter would have immense benefits. Two AI ‘players’ could play hundreds or thousands of games in the time it normally takes to play one. This would allow the distribution of outcomes due to chance events to be mapped, and it would allow pairwise combinations of strategy options to be investigated to understand how robust different options are to the adversary’s choice of strategy.

FIGURE 2 PLAYERS MUST WORK WITH LIMITED INFORMATION ABOUT THEIR ADVERSARY’S FORCES DERIVED FROM LAND-, AIR-, AND SPACE-BASED SENSORS (REPRODUCED WITH KIND PERMISSION FROM RED SCIENTIFIC LTD © 2025).

FIGURE 3 THE SEQUENCING OF ACTIVITIES IS MANAGED USING THE SYNCHRONISATION MATRIX, SIMILAR TO A GANTT CHART. MUCH OF THE DATA ON THE TIME TO UNDERTAKE ENGAGEMENTS COMES FROM MILITARY DOCTRINE PUBLICATIONS BUT THE DURATION OF CLOSE COMBAT ENGAGEMENTS IS BASED ON STATISTICAL ANALYSIS OF OVER 150 HISTORICAL BATTLES. (REPRODUCED WITH KIND PERMISSION FROM RED SCIENTIFIC LTD © 2025).

FIGURE 4 THE RUSSIA-GEORGIA CONFLICT IN 2008 HAS BEEN USED AS PART OF THE VALIDATION OF THE WARGAME. (CREDIT: YANA AMELINA, CREATIVE COMMONS ATTRIBUTION-SHARE ALIKE 3.0 UNPORTED).

The problem can usefully be separated into two parts, tactical decision making, i.e. controlling individual units or groups of units over a short period of time, and strategic decision making, i.e. determining sequences of actions that should be taken in a particular situation in order to achieve specified goals [3].

The problem of tactical control appears the more tractable. It is possible to write algorithms to coordinate the movement of small groups of units in ways that appear credible to military staff. These will then be refined using AI to optimise the control parameters.

Two AI ‘players’ could play hundreds or thousands of games in the time it normally takes to play one.

Strategic decision making is the greater challenge. AI has had some obvious successes, defeating the top human players in traditional board games like Chess and Go, and computer strategy games like StarCraft. But in these games, however complex, the board, the pieces and the starting positions are fixed. Each operational plan supported by CIRSIUM is unique. Each iteration of the plan can involve different forces or different starting positions. The challenge is therefore to develop intelligent agents capable of playing on any battlefield, using any set of forces, against any opponent. The solution is likely to involve the AI starting by practicing, playing the scenario thousands of times against its mirror image, in order to determine a unique set of control parameters that work for the particular circumstances of the game.

The CIRSIUM wargame is now in use in British Army headquarters and the response from the Operational Research Branch has been extremely positive: ‘The CIRSIUM wargame has provided a significant enhancement to the performance of the British Army’s Operational Research Branch (ORB). The ORB provides deployable Operational Analysts (OA) to all UK formation HQs from Brigades to HQ ARRC. Utilised by OAs, CIRSIUM fully encompasses the complexity of integrating all elements of Combined Arms Manoeuvre warfare into potential Courses of Action. Critically, it does so fast enough for its outputs to be relevant and valuable to Commanders in their operational planning. As such CIRSIUM is a step change in capability’.

Dr Richard Underwood (BSc and PhD in Physics, Newcastle) began his career as a military operational analyst in 1991, shortly after the Gulf War. In addition to wargaming and combat modelling, he is heavily involved with Historical Analysis techniques, using statistical analysis of data from and about past conflicts in OR studies and for model validation. While working in the Ministry of Defence he founded the Historical Analysis for Defence and Security Symposium. Now, working for the consultancy RED Scientific Ltd, he continues to be involved as the industry representative on the organising committee.

FOR FURTHER READING

[1] Choy, C. (2013). British War-Gaming, 1870-1914. MA Dissertation. Department of War Studies, King's College London.

[2] Ministry of Defence (2017). Wargaming Handbook.

[3] Robertson, G. and Watson, I. (2014). A review of real-time strategy game AI. AI Magazine, 35(4): 75–104.

CAN OPERATIONAL RESEARCH RELIEVE OUR STRAINED HEALTHCARE SYSTEMS?

A SYSTEM UNDER PRESSURE

In the UK’s National Health Service (NHS), headlines of ambulances queued outside hospitals, growing treatment waiting lists, and widespread staff burnout have become fixtures of public discourse. These are not isolated incidents but symptoms of deep, complex systemic issues. The pressures of unpredictable demand, staff shortages, and financial constraints can create a vicious circle where inefficiencies compound, quality of care is impacted, and retaining motivated staff becomes a significant challenge (Figure 1). This environment demands more than quick fixes; it requires a smarter way to manage complexity and make the most of what we have.

That’s where Operational Research (OR) comes in. Described by practitioners as ‘Management Science’, OR offers a powerful toolkit for solving messy, real-world problems with data and analytics. In healthcare, it can mean predicting patient demand with greater accuracy, optimising complex staff schedules, or redesigning clinical pathways to cut waiting times. Yet, despite its promise, OR remains a largely untapped resource within the health service.

This article draws on insights from a panel of clinicians, practitioners, and academics who gathered on the 2nd of June for ‘Implementing OR for Better Healthcare Systems’, the inaugural event of the OR in Practice Special Interest Group (ORPSIG). As part of the Operational Research Society, the group’s network is designed to support OR practitioners and academics from both public and private institutions, fostering collaboration, knowledgesharing, and mutual learning. Drawing on the panel’s discussion, this article delves into the cultural, financial, and educational barriers holding OR back and maps out a practical vision for how it can become a vital part of building a more resilient and effective healthcare system for all.

THE DIAGNOSIS: A LACK OF AWARENESS

One big hurdle is awareness - or the lack of it. One long-serving practitioner on the panel captured the problem succinctly, stating that operationally, “the National Health Service (NHS) doesn’t know what it doesn’t know” about OR. Clinicians and managers, swamped with daily pressures, often don’t even know OR exists, let alone what it can do.

Analytics skills, instead of being front and centre, get stuck in back offices, churning out basic charts rather than tackling real operational challenges (Figure 2).

This lack of awareness fosters a reliance on overly simplistic methods and what one panellist termed the “flaw of averages”. In one striking example, a hospital was found to be using a simple four-point moving average to forecast its A&E attendance. An OR practitioner introduced a far more accurate machine learning model that accounted for holidays and other variables, but the management team would not engage with it, largely because it was outside their realm of understanding.

This extends to the highest levels of financial planning. One panellist described policymakers pushing hospitals to run at full capacity to save money, not grasping that this leaves no room for sudden patient spikes. It took reframing the issue – “How often should we turn patients away?” – to shift the conversation. The answer, “Never!”, sparked a real conversation about planning for uncertainty, an OR strength that’s still overlooked.
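The reframed question has a classical OR answer in the Erlang loss model, which makes the trade-off between occupancy and turn-away probability explicit. The sketch below is a generic textbook illustration with invented numbers, not an analysis of any particular hospital.

```python
# Erlang loss (M/M/c/c) illustration: probability an arriving patient finds no bed free.
# Generic textbook model with invented numbers, not a model of any particular hospital.
def erlang_b(servers: int, offered_load: float) -> float:
    """Blocking probability for `servers` beds and offered load in Erlangs (arrival rate x stay)."""
    b = 1.0
    for k in range(1, servers + 1):
        b = offered_load * b / (k + offered_load * b)
    return b

offered_load = 36.0  # e.g. 6 admissions/day x 6-day average stay
for beds in (38, 40, 45, 50):
    occupancy = offered_load * (1 - erlang_b(beds, offered_load)) / beds
    print(f"{beds} beds: turn-away prob {erlang_b(beds, offered_load):.1%}, occupancy {occupancy:.0%}")
```

It shows why a plan that looks efficient on average occupancy can still turn patients away uncomfortably often: the headroom above the average load is exactly what absorbs the spikes.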

“They were using a moving four-point average for their forecast for A&E attendance […] the model we developed was far more accurate than what they were using… They just would not engage with it.”
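That contrast, between a four-point moving average and a model that knows about the calendar, is easy to demonstrate on simulated data. The sketch below is hypothetical: it compares the naive average with a least-squares fit on day-of-week and holiday indicators, the kind of enrichment the panellist described, rather than the actual model built for that hospital.

```python
# Hypothetical comparison: 4-point moving average vs. a calendar-aware regression
# for daily A&E attendance. Data are simulated; this is not the hospital in question.
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(365)
weekday = days % 7
holiday = np.isin(days % 91, [0, 1])                      # crude stand-in for holiday periods
attendance = 300 + 40 * np.isin(weekday, [5, 6]) + 60 * holiday + rng.normal(0, 15, days.size)

# Forecast 1: four-point moving average of the previous four days.
ma4 = np.array([attendance[t - 4:t].mean() for t in range(4, days.size)])

# Forecast 2: least-squares fit on weekend and holiday dummies (fit on the first 300 days only).
X = np.column_stack([np.ones(days.size), np.isin(weekday, [5, 6]), holiday])
coef, *_ = np.linalg.lstsq(X[:300], attendance[:300], rcond=None)
reg = X @ coef

mae_ma4 = np.abs(attendance[304:] - ma4[300:]).mean()
mae_reg = np.abs(attendance[304:] - reg[304:]).mean()
print(f"MAE moving average: {mae_ma4:.1f}   MAE calendar regression: {mae_reg:.1f}")
```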

THE HUMAN FACTOR: PEOPLE, PRESSURES, AND PERVERSE INCENTIVES

Beyond the technical gap, the application of OR runs into the complex reality of human and organisational factors. It is crucial to recognise that healthcare managers work under intense and unrelenting pressure. With staggering amounts of correspondence crossing their desks daily, they are often forced to prioritise immediate financial and performance targets, leaving little room for deep engagement with unfamiliar methodologies.

FIGURE 1 HEALTHCARE STAFF BURNOUT

FIGURE 2 INTEGRATING ADVANCED ANALYTICS AND ARTIFICIAL INTELLIGENCE INTO HEALTHCARE OPERATIONS CAN BE A CHALLENGE DUE TO A LACK OF AWARENESS AND UNDERSTANDING AMONG SOME CLINICIANS AND MANAGERS

This tension is starkly illustrated by a case where financial incentives directly conflicted with patient-centric care. A surgeon established a ‘hot clinic’ to provide rapid specialist assessment for emergency cases, aiming to avoid unnecessary hospital admissions. The trial was a clinical success, with most patients managed safely without needing an acute bed. However, the initiative was abruptly shut down. The reason was purely financial: the hospital could only charge £160 for an outpatient appointment, whereas a full admission would generate £700. This powerful example shows how the system itself can create perverse incentives that constrain innovation and lead to the ‘over-engineering’ of patient care.

Even when an initiative is a clear win for patients, internal resistance can be a significant barrier. In another example, a plastic surgeon designed a ‘one-stop shop’ where patients could have a lesion assessed and removed in a single visit. The trial was highly successful; the registrar surgeons loved the efficiency, and patients praised it as the “best care delivery” they had experienced. Yet, the administrative staff “absolutely hated it” and complained ceaselessly. This highlights a deeper challenge of role rigidity, where established processes and professional boundaries create ‘barriers to flexibility,’ and even the most logical improvements falter if they disrupt familiar workflows.

“The manager who was managing that service went absolutely berserk. Because rather than being able to accrue money… for an admission, which I think were about £700, it could only charge 160 quid for a new appointment. So because they were losing money on it, they’d rather over-engineer the patient and admit them.”

BRIDGING THE GAP: FROM A GOOD IDEA TO A SUSTAINABLE SOLUTION

Despite the challenges, there are clear models for how OR can be implemented successfully. One panellist pointed to the National Bowel Cancer Screening Programme as a gold standard. This initiative succeeded because it moved beyond just the medical evidence; there was political and financial alignment, buy-in from all key stakeholders, and sophisticated modelling to predict demand and plan resources. Crucially, it was not a one-off project; the programme was continuously monitored to maintain standards, ensuring its long-term impact (Figure 3).

This highlights that successful implementation often depends on framing analytical work in a language that resonates with staff. The goal is not the mathematical model itself, but what it can achieve. If you can approach a nursing team and explain that OR techniques can reduce the likelihood of a chaotic, exhausting shift, you create powerful buy-in. The idea that OR can improve predictability and help retain motivated staff is a hugely attractive proposition for any healthcare trust. As one clinician noted, “even the fact that you’re trying to do it is welcomed”.

However, even the most successful projects face the critical challenge of sustainability. One panellist shared a pointed anecdote about a comprehensive guidebook on reducing outpatient waiting times. Years after its launch, he visited a clinic and found it was chaotic, breaking the most basic rules outlined in the guide. The realisation was stark: since the guide was published, seven or eight generations of outpatient managers had passed through. The knowledge was never embedded, and the chain of wisdom was broken. This demonstrates that for improvements to last, the skills and knowledge must become part of the organisation’s DNA.

FIGURE 3 SUCCESSFULLY
© patpitchaya/Shutterstock

“It suddenly struck me, well, of course they haven’t read the book… Since then, there have been seven or eight generations of outpatient managers. Well, what is the chance that at some point, even if the first manager looked at that guidance and followed it, that they pass it on to the second one? …Somewhere that chain has got to be broken.”

A NEW PRESCRIPTION FOR LASTING CHANGE

The path to a more resilient healthcare system lies not in simply procuring more tools, but in fundamentally changing the culture around decision-making. A forward-thinking vision emerged from the panel: that the credibility of any business case would be immensely enhanced if it demonstrated a basic understanding of core OR concepts. This is not a call for every manager to become a master modeller, but a proposal for embedding literacy in the principles that govern complex systems. A competency in understanding variability and uncertainty should be as fundamental as training in patient safety.

This vision requires embedding OR literacy at every level of the organisation. A ward sister should be able to explain the basics of patient flow; a trust board should be able to critically evaluate the assumptions behind a capacity plan. Emerging technologies may play a crucial role here. Large Language Models, for instance, have the potential to democratise advanced analytics by translating complex data and model outputs into clear, natural language that non-experts can understand and act upon.

Ultimately, the goal is to foster a culture of ‘Human-Centric Analytics’. This approach transforms OR from a niche, technical tool into a shared language used to improve the working lives of staff and the care experienced by patients. By combining robust analytical methods with an empathetic understanding of the system’s human factors, healthcare organisations can move beyond firefighting and begin to build a system that is not only more efficient and sustainable, but also more predictable and humane for everyone involved.

ACKNOWLEDGEMENTS

We extend our sincere thanks to the following panellists for their contributions to the discussion that shaped this article: Lucy Morgan, OR Practitioner at the Strategy Unit and Visiting Researcher at Lancaster University; Garry Fothergill, Semi-retired OR Practitioner with over 20 years of experience across the NHS; Geoff Royston, Former Head of Strategic Analysis and Operational Research at the Department of Health; Guillaume Lame, Associate Professor at Paris-Saclay University, specialising in health services organisation; and a senior Consultant Physician and Health Innovation Clinical Director who wishes to remain anonymous. Their diverse perspectives and deep expertise in healthcare and OR were instrumental in exploring the potential of OR to address the challenges facing healthcare systems.

Xin Fei is a Lecturer in Business Analytics at the University of Edinburgh Business School. His research focuses on simulation and stochastic optimisation, particularly their applications in transportation and supply chain management.

Nalan Gülpınar is a Professor of Operational Research and Business Analytics at Durham University Business School. Her research focuses on decision-making under uncertainty and risk management. Her work has practical applications in diverse areas, including supply chain networks, finance, revenue management, dynamic pricing, and healthcare systems.

Christina Phillips is a Senior Lecturer in Business Analytics at Liverpool Business School, Liverpool John Moores University. Specialising in mathematical modelling and Human-Centric Analytics, her industry-focused research explores methods to facilitate and maximise benefits from participative modelling. She works to bridge the gap between technical models and practical, user-centred design and implementation.

UNIVERSITIES MAKING AN IMPACT

OPTIMISING FROZEN VEGETABLE PRODUCTION AT ITTELLA ITALY SRL

Traditional manufacturing industries, despite their long histories and innovations in production technologies, machinery, and quality control practices, sometimes lag in modernizing their core decision-making processes. Operational Research offers a powerful pathway for these industries to evolve beyond reliance on historical practices.

In the food processing industry, frozen food technology stands as a particularly significant, albeit long-established, innovation. Dating back to the early 20th century, it revolutionized the storage and year-round availability of seasonal produce while preserving vital nutrients. Europe dominates the global frozen vegetable market, serving as both the largest consumer and the largest producer worldwide. Rising consumer demand and intensifying competition have increased the criticality of effective production planning within the frozen vegetable food processing industry.

Companies like Ittella Italy Srl, an Italian frozen vegetable producer, face special operational challenges, including the strict seasonality of their agricultural inputs and the extremely short shelf-lives of key raw materials. For example, onions require processing within a single day after harvest.

The vast literature on capacitated lot-sizing and production scheduling rarely addresses product perishability in a way that facilitates prompt implementation at companies like Ittella, which, as a consequence, often end up relying on empirical, experience-based decision-making for production planning. This dissertation project was set up jointly so that Ittella would no longer miss out on the optimisation potential offered by OR methods.

Lei Zhou was awarded the OR Society’s May Hicks Award 2025 (Figure 1) for her work, jointly supervised by Dr Stefano Cipolla (University of Southampton) and Dr Pericle Pirri (Ittella Srl).

To directly address Ittella’s operational needs, a novel Mixed-Integer Linear Programming model was developed and tested. The model adapts existing work on sequence-dependent setups for multi-product scheduling on a single production line, on modelling raw material perishability, and on linking production periods to the timing of raw material deliveries.

The model minimises total costs, including setup costs (for production-line changeovers between products), workforce costs, and costs associated with discarding raw materials that spoil before processing.

Model constraints reflect the full complexity of Ittella’s shop floor: production lines can actively process only one specific product at a time, undergo time-consuming changeovers to prepare for a different product, and must be stopped and left idle for various kinds of preventive maintenance carried out at specified frequencies (i.e. after a pre-defined number of uninterrupted operational shifts). The model tracks when raw materials are delivered and strictly ensures they are only used for production within their defined shelf-life periods. Materials not processed within this timeframe are discarded, incurring a cost penalty. Total production must meet the overall demand for each finished product. Demand for Ittella’s primary operation, which processes all harvested vegetables, is determined by the harvest volume and the achievable transformation rate from raw vegetable to frozen product. Specific ad-hoc customer orders can be accommodated too. Deliveries of raw materials are constrained to periods when they are available (based on harvest schedules) and are governed by batch sizes and maximum delivery capacities.

Model inputs include:

• harvest schedules and material availability windows;
• raw material shelf lives;
• production line capacities for each product;
• setup costs for product changeovers;
• workforce costs per shift for each product;
• raw material discard costs;
• transformation rates (raw material units needed per unit of finished product);
• raw material delivery batch sizes and limits on delivery quantities per period;
• maximum allowed continuous production periods before maintenance;
• total product demands; and
• the number of planning periods and of products.

Within the above context, the model outputs a recommended optimal production schedule (which product ought to be processed in which period and in what quantity), a detailed raw material delivery plan (timing and quantity of each delivery), an optimal sequence of setup activities, expected quantities of raw materials to be discarded, an optimal preventive maintenance plan (identification of idle periods), and the corresponding minimum total cost.
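To give a feel for how such a formulation fits together, the sketch below sets up a deliberately simplified model of the same flavour in Python, using the open-source PuLP library. It is not Lei Zhou’s model: every name, cost, shelf life, demand and availability window is invented, a 1:1 transformation rate is assumed, and sequence-dependent changeovers, preventive maintenance and delivery batch sizes are omitted. Its only purpose is to show how perishability can link the timing of deliveries to the production schedule.

```python
# A deliberately simplified, illustrative sketch of a perishability-aware
# lot-sizing model -- NOT Lei Zhou's actual formulation. All names, costs,
# shelf lives, demands and availability windows are invented; a 1:1
# transformation rate is assumed; sequence-dependent changeovers, maintenance
# and delivery batching are omitted. Requires PuLP (pip install pulp).
from pulp import (LpProblem, LpMinimize, LpVariable, lpSum, LpBinary,
                  LpStatus, value, PULP_CBC_CMD)

PERIODS = range(10)                                   # planning periods (e.g. shifts)
PRODUCTS = ["courgette", "onion", "corn"]
SHELF_LIFE = {"courgette": 6, "onion": 1, "corn": 3}  # periods a delivery stays usable
DEMAND = {"courgette": 40, "onion": 25, "corn": 30}   # tonnes over the horizon
AVAILABLE = {"courgette": set(PERIODS),               # harvest windows for deliveries
             "onion": set(range(0, 5)),
             "corn": set(range(2, 8))}
CAPACITY, MAX_DELIVERY = 12, 20                       # tonnes per period
SETUP_COST, LABOUR_COST, DISCARD_COST = 300, 80, 50   # invented cost figures

m = LpProblem("perishable_lot_sizing_sketch", LpMinimize)

run = LpVariable.dicts("run", (PRODUCTS, PERIODS), cat=LpBinary)        # line set up for p in t
proc = LpVariable.dicts("process", (PRODUCTS, PERIODS), lowBound=0)     # tonnes processed
deliver = LpVariable.dicts("deliver", (PRODUCTS, PERIODS), lowBound=0)  # tonnes delivered
waste = LpVariable.dicts("waste", (PRODUCTS, PERIODS), lowBound=0)      # tonnes spoiled
# use[p, t, u]: tonnes of p delivered in period t and processed in period u,
# defined only while the delivery is still within its shelf life
use = {(p, t, u): LpVariable(f"use_{p}_{t}_{u}", lowBound=0)
       for p in PRODUCTS for t in PERIODS for u in PERIODS
       if t <= u < t + SHELF_LIFE[p]}

# Objective: setup plus labour cost for every period the line runs, plus discard cost
m += lpSum((SETUP_COST + LABOUR_COST) * run[p][t] + DISCARD_COST * waste[p][t]
           for p in PRODUCTS for t in PERIODS)

for t in PERIODS:
    m += lpSum(run[p][t] for p in PRODUCTS) <= 1             # one product on the line per period
for p in PRODUCTS:
    m += lpSum(proc[p][t] for t in PERIODS) >= DEMAND[p]     # meet total demand
    for t in PERIODS:
        m += proc[p][t] <= CAPACITY * run[p][t]              # capacity, and only if set up
        m += deliver[p][t] <= (MAX_DELIVERY if t in AVAILABLE[p] else 0)
        # every tonne delivered is processed within its shelf life or discarded
        m += (lpSum(use[p, t, u] for u in PERIODS if (p, t, u) in use)
              + waste[p][t] == deliver[p][t])
        # processing in period t draws only on still-fresh deliveries
        m += proc[p][t] == lpSum(use[p, s, t] for s in PERIODS if (p, s, t) in use)

m.solve(PULP_CBC_CMD(msg=False))
print(LpStatus[m.status], "- total cost:", value(m.objective))
for t in PERIODS:
    for p in PRODUCTS:
        if (run[p][t].value() or 0) > 0.5:
            print(f"period {t}: process {proc[p][t].value():.1f} t of {p}")
```

The use variables carry the perishability logic: every tonne delivered must either be processed while still within its shelf life or be discarded at a cost, which is what ties the delivery plan to the production schedule and makes waste visible in the objective.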

Model validation, implementation and testing involved extensive operational data sets from Ittella’s recent history, and a standard commercial solver. Experiments addressed Ittella’s most challenging three-month production cycle for the three most critical raw materials: courgettes (24-day shelf life), onions (1-day shelf life), and corn (3-day shelf life). Results from all numerical experiments run for Ittella demonstrated several advantages over the traditional planning methods in place at the company. The model generated optimal production schedules significantly faster than the empirical approach while revealing a potential 33% reduction in active processing time, with total costs further reduced. Thanks to its level of detail, the model provided Ittella with granular operational blueprints, which in turn enable precise execution at the operational level. Comparative analysis further highlighted the limitations of the experience-based planning currently in place, which struggles to quantify hidden costs such as raw material waste.

Sensitivity analysis using the model also yielded strategically valuable insights. Smaller raw material batch sizes delivered substantially improved flexibility, significantly reducing waste and total costs, and highlighted various logistical trade-offs that are crucial to understand before full-scale real-world implementation. Simulated shelf-life extensions (e.g., increasing onions’ viability from 1 to 3 days) showed tangible benefits through reduced delivery frequency and lower discard risk, suggesting value in preservation technologies. The model also enabled the quantification of the nonlinear relationships between input parameters (setup costs, labour rates, production capacities) and optimal total costs, supporting data-driven investment decisions at Ittella.

Traditional industries have irreplaceable value and deep-rooted expertise. Embedding OR tools, and an analytics-driven culture, to support critical decision-making processes such as production planning is not about discarding this experience, but about strategically enhancing it with quantitative precision and optimisation capabilities. This can ultimately yield substantial competitive advantages through superior cost control, minimised waste, and enhanced operational flexibility, paving the way for sustained modernization and operational resilience.

Asked to comment about Lei’s work at Ittella, Production Manager Pericle Pirri said “Lei Zhou’s project has demonstrated significant potential for transforming Ittella Italy Srl’s production planning processes. Ittella has recognized the effectiveness of Lei’s model and is planning to integrate it into its production operations in the upcoming season”. He then added “The model’s improved scheduling mechanism is expected to reduce overtime and streamline workforce allocation, leading to better productivity and cost savings” and concluded “The project served as a stepping stone for Ittella’s transition to data-driven decision-making, reinforcing the role of optimization models in the digital transformation of the frozen food industry. The methodology developed by Lei has the potential to be scaled across other frozen food processing operations, inspiring broader industry adoption of similar optimization techniques”.

FIGURE 1 LEI ZHOU PICTURED WITH HER AWARD CERTIFICATE

DESIGN THINKING THOUGHTS

If you are a manager, when are you most likely to seek input from analysts? Certainly, hopefully, when you have decisions to make that involve, say, option appraisal, resource allocation or risk assessment. But what if your task is to design a product, process or policy? Would bringing in an OR analyst spring equally to mind? This article suggests that it should, and why.

In a past article for Impact (Spring 2020) I noted that while design lies at the very heart of professions like engineering, it may appear to be somewhat peripheral to OR and analytics, which are typically presented as being focused on decision making. True, up to a point, but this undersells OR, which, as well as informing decisions, often involves building better systems – undoubtedly a design task.

Indeed, right from its origins back in the days of WW2, the first OR teams worked on design problems. Their work on devising a system that integrated the early warning information from radar stations to guide fighter pilots to incoming waves of bomber attacks was reckoned to have doubled the operational efficacy of air defences.

The OR contribution to design tasks continues to the present day. We need to look no further than Impact magazine itself for evidence of this. Although only a few articles explicitly mention design in their title, e.g. ‘Real world problems in designing supply chains’ (Spring 2023) and ‘Optimization for survival – captive breeding design’ (Spring 2025), it is clear that many – arguably most – of the articles in Impact concern the design or redesign of systems or processes.

As another quick ‘dipstick’ test, I checked the abstracts for the presentations at the recent European OR conference EURO2025, held in Leeds (at which Christina Phillips and I ran a ‘design thinking’ workshop). Unsurprisingly, they contained many mentions of the word decision [1491] and of analysis [633], but they also contained many mentions of the word design [528].

So, for this article I want to say a bit more about ‘design thinking’, partly because it is a natural follow-on from my last article, on creativity, but more importantly because managers have been increasingly realising the relevance of design principles to their businesses and may (or should!) be increasingly calling on OR and other analysts to assist in design-related tasks. As usual it has been stimulated and informed by various books on design, particularly: The Sciences of the Artificial [1] by the Nobel laureate Herbert Simon; Idealized Design [2] by the OR luminary Russ Ackoff; Design Thinking for Strategic Innovation [3] by the applied design thinker Idris Mootee; and The Design of Business [4] by the strategist Roger Martin.

DESIGN ISN’T……

Let’s start with a few things that design is not. Many people seem to think that design is mostly about the look of things, but as Steve Jobs said, ‘Design is not just what it looks like and feels like. Design is how it works.’ And design is not just about material objects; it extends to abstract artefacts such as software, systems, or strategies. Nor is designing confined to professions like architecture or engineering – as Herbert Simon said, ‘Everyone designs who devises courses of action aimed at changing existing situations into preferred ones’. So policy makers, managers, and OR analysts (recall ‘the science of better’ tagline) are all involved in design.

DECISIONS ARE NOT THE ONLY FRUIT

Decision analysis, at its simplest, involves choosing between a number of pre-conceived options. Important for solving some business problems, but not the whole story – what about searching for new options, or fitting together components of a solution? The latter are tasks, respectively, of creation and of synthesis – tasks of design.

(Note that traditional analysis involves deduction and induction, but innovative design cannot be derived from logical analysis of inputs alone; it involves a third ‘duction’ – abduction – a creative jump or wondering ‘what could be?’)

ELEMENTS OF DESIGN THINKING

Debbie Millman (host of the podcast Design Matters) says ‘Design is one of the few disciplines that is a science as well as an art. Effective, meaningful design requires intellectual, rational rigor along with the ability to elicit emotions and beliefs.’ Or, succinctly, in the words of the design entrepreneur Robin Matthew: ‘Design is where science and art break even’.

Key elements of design thinking include:

• a deep observation of and empathy with users to discover their needs;

• a holistic approach, looking at problems and solutions from a ‘system’ perspective;

• creative expansion of the boundaries of problems and possible solutions; and

• a commitment to prototyping and real-world experiment.

These are challenging tasks because they involve seeing the world from other people’s perspectives, conceiving a large range of alternatives, considering the interplay of elements of often quite complex systems, and much iteration to find feasible and desirable solutions. Indeed, the process of designing often brings about changes in the very perception of what needs to be designed.

INTEGRATIVE THINKING AND ‘WHOLE BRAIN’ OR

If analysis typically features rigour, logic, modelling and quantification, and design typically involves client empathy, intuition, creativity, and prototyping, then clearly solving the problems of businesses and other enterprises will often require both – the integration of analysis and synthesis, involving convergent and divergent thinking (Figure 1).

Allowing considerable neurological licence about the roles of the left and right sides of the brain, this combination might be termed ‘whole brain OR’ (Figure 2).

THE TELEPHONE SYSTEM HAS JUST BEEN DESTROYED

Analysts who have paid attention to the emphasis that Simon and Ackoff put on design should be well-placed to practise ‘whole-brain OR’. And those who are experienced in systems or scenario thinking will recognise the important role these can play in supporting design work. Idealized Design gives a great example, which I summarise below.

Ackoff describes how, way back in 1951, he was invited by chance to an extraordinary meeting of all the senior staff, called by the vice president of Bell Laboratories. The VP strode in and announced gravely that ‘the telephone system of the United States was destroyed last night’. This was (obviously) a joke, but one with a profoundly serious purpose. The VP had been thinking about the most important contributions Bell Laboratories had made to telecommunications – the top three being the phone dial, call multiplexing, and the coaxial cable. He pointed out that these innovations all took place before any of the research staff had been born! So his question was, ‘what have you all been doing?’

The laboratory had been working on improving parts of the system rather than focusing on the system as a whole. So the VP told his staff to go away and produce a design idea for a completely new integrated phone system to replace the current one. They were free to design whatever they wanted subject only to it being technologically feasible and operationally viable in the current environment.

FIGURE 1 CONVERGENT AND DIVERGENT THINKING (credit: Tim Brown IDEO)
FIGURE 2 THINKING WITH BOTH SIDES OF THE BRAIN (Source: Wikipedia)

As a result, the teams (which included Ackoff) went away and came up with ideas for innovations that would feature in a redesigned phone system better meeting people’s needs – innovations such as push-button telephones, call waiting, caller ID, speaker phones, conference calls, and even mobile phones – remember, this was back in 1951!

Ackoff tells this story in person in a presentation that can be seen at https://www.youtube.com/watch?v=spm2HUxgI30 – it builds up slowly, but you will not regret sticking with it for the full 28 minutes!

(There are of course many effective design thinking approaches, but I particularly like this example because elements of this approach featured in the work I led on the design and implementation of the national helpline NHS Direct [now NHS 111]).

CONCLUSION

Hopefully, if you are a manager, when you are facing a design task, you will be considering bringing an analyst into the team – and right from the start (as Tom Peters noted: ‘Design is a ‘day one’ issue’). Hopefully, if you are an analyst, you will be reflecting on how your skills in, say, problem structuring, systems modelling, or scenario thinking can assist with design tasks.

In either case (hopefully) you will want to learn more about the practical application of design thinking to management challenges (there is lots of guidance in books such as The Design Thinking Toolbox [5] by Michael Lewrick).

Best wishes for ventures in what design thinking author Idris Mootee describes as ‘the search for a magical balance between business and art, structure and chaos, intuition and logic, concept and execution, playfulness and formality and control and empowerment’.

Dr Geoff Royston is a former President of the OR Society and a former Chair of the UK Government Operational Research Service. He was Head of Strategic Analysis and Operational Research in the Department of Health for England, where for almost two decades he was the professional lead for a large group of health analysts.

FOR FURTHER READING

[1] Simon, H. A. (2019). The Sciences of the Artificial, Reissue of the Third Edition with a New Introduction by John Laird. Cambridge: MIT Press.

[2] Ackoff, R. L., J. Magidson, and H. J. Addison (2006). Idealized Design: How to Dissolve Tomorrow’s Crisis… Today. Upper Saddle River: FT Press.

[3] Mootee, I. (2013). Design Thinking for Strategic Innovation: What They Can’t Teach You at Business or Design School. Hoboken: John Wiley & Sons.

[4] Martin, R. L. (2009). The Design of Business: Why Design Thinking Is the Next Competitive Advantage. Brighton: Harvard Business Press.

[5] Lewrick, M., P. Link, and L. Leifer (2020). The Design Thinking Toolbox: A Guide to Mastering the Most Popular and Valuable Innovation Methods. Hoboken: John Wiley & Sons.
