For most of my career in asset management I have watched organisations work incredibly hard to "get ready" for standards and for technology. Policies are drafted, frameworks adopted, APM platforms procured, sensors deployed, dashboards built. On paper and on screen, the asset management system looks exemplary. Yet in too many cases, the lived reality on the plant floor or in the network control room tells a different story: chronic firefighting, reactive heroics and outcomes that stubbornly refuse to match the promise of the artefacts.
That disconnect is what I call the performativity gap.
Asset management is something you do – not something you own
Standards like ISO 55001, PAS 55 and the GFMAM Asset Management Landscape have been invaluable. They have given us a shared language, a coherent clause structure and 39 subject areas that describe "what good looks like" across strategy, lifecycle delivery, information, people and risk. At the same time, Asset Performance Management (APM) has emerged as the technology banner under which many organisations now invest: condition monitoring, predictive analytics, digital twins and integrated dashboards.
In both cases, the dominant logic has been similar: if we can get the right artefacts in place – the right standard, the right model, the right platform, the right sensors – performance will follow.
The ethnographic reality is different. Real outcomes – reliability, cost, safety, environmental performance, resilience – are produced in the messy, adaptive, situational work of people making decisions under pressure. Technicians adjusting routines to handle equipment that does not match design assumptions. Planners juggling competing constraints. Engineers reconciling technical standards with economic realities. Asset management is not just a system to be audited or a stack to be deployed; it is a practice to be performed in sociotechnical settings.
A critique of APM as it is often practised
APM, as marketed and implemented today, is frequently artefact-centric rather than system-centric.
- It focuses on things: sensors, historians, cloud platforms, anomaly-detection algorithms, alerts and dashboards.
- It assumes that more data and smarter models will directly translate into better decisions and outcomes.
- It often treats people and organisations as passive endpoints – "users" of analytics – rather than as active shapers of what performance even means in context.
The result is that many APM programmes become sophisticated camera systems pointed at the assets but not at the organisation. They are very good at telling you what is happening (or might happen) to equipment; they are much less effective at helping you understand whether your asset management system and practices are performatively coherent with your stated strategy.
In other words, current APM practice tends to:
- Over-index on the technical artefacts of sensing and analysis
- Under-attend to the sociotechnical embeddedness of those artefacts – the roles, routines, power structures, incentives, cultures and informal workarounds they plug into
- Assume that better visibility is the same as better capability
It isn’t.
You can have world-class condition monitoring and still be unable to act on warnings because of fragmented decision rights, brittle planning processes or misaligned incentives. You can have exquisite digital twins and still revert to reactive heroics because overtime is easier to authorise than a shutdown. You can have streaming analytics and still ignore the results because the frontline does not trust the model or cannot reconcile it with lived experience.
APM, interpreted narrowly as "more data and smarter dashboards", risks becoming another framing exercise: a technologically enhanced view of the world that still stops short of engaging with the performative reality of how work is actually done.
The performativity gap: where standards, APM and reality part company
Drawing on Michel Callon's performativity theory and 27 months of ethnographic research in a major oil and gas company, I saw the same pattern repeat: on paper (and on screen), key routines were stable, well-documented and supported by increasing instrumentation. In performance, those same routines and tools were in constant motion. People adapted, improvised and negotiated – not because they were undisciplined, but because real assets, real markets and real organisations are never static.
The performativity gap is the structural distance between:
- How we frame asset management and asset performance through standards, policies, procedures, maturity levels, APM architectures and dashboards
- How asset management is actually performed through daily routines, operational decisions, workarounds and interactions in specific sociotechnical contexts
Crucially, this gap is not simply a sign that we are "not implementing APM properly". It is inherent to complex systems. Models and artefacts are abstract and general; real settings are concrete and specific. The gap is where organisations innovate, adapt and build resilience.
The real risk is not that the gap exists, but that our current instruments – including most APM deployments – cannot show it to us clearly.
Why point-in-time conformance and artefact-centric APM are no longer enough
Callon's distinction between models as cameras and models as engines is helpful here. Cameras describe what is there; engines help produce the reality they describe. Asset management standards and APM platforms are both engines: they are meant to shape organisational behaviour.
But engines only work under the right conditions. J.L. Austin called these felicity conditions – the circumstances that allow a performative act ("this system is now live", "this alert is serious", "this risk is acceptable") to actually change the world.
In organisational life, those conditions include:
- Clear authority and decision rights about when to intervene
- Competence and confidence in roles interpreting and acting on data
- A culture that tolerates surfacing ambiguity and re-examining assumptions
- Incentives and governance that reward proactive, not just reactive, performance
You can have a beautifully architected APM stack and still fail performatively if those conditions are missing. Conversely, you can have modest tooling but strong sociotechnical conditions and outperform peers with much more technology.
Layer on top of this the problem of time. Asset-intensive environments are dynamic: equipment ages, markets shift, technologies evolve, people move roles, organisations restructure. Any assessment or dashboard viewed as a "state of the nation" at a single point in time is, at best, a high-quality photograph. It cannot show trajectory:
- Are APM insights actually changing routines and decisions sustainably?
- Are we holding performance together through disciplined capability or through unsustainable heroics?
- Are our investments in standards and APM compounding into coherent capability, or fragmenting into islands of excellence?
To answer those questions, we need to stop thinking about APM as a set of artefacts and start thinking about it as part of a sociotechnical performance system – and then measure that system longitudinally.
Introducing GARPI™: a performativity instrument – not another maturity model or dashboard
It was this realisation that led to the design of the Global Asset Reliability & Performance Index – GARPI™.
GARPI™ is built on a simple but demanding principle: if we want to manage the performativity gap, we must measure both sides of it – standards and APM artefacts on one side, lived performance on the other – together, over time.
That leads to four design commitments:
1. Measure framing and performance together
GARPI™ integrates management system "maturity" (including how you use technology and APM tooling) with hard operational outcomes across eight weighted dimensions. These dimensions are explicitly aligned with ISO 55001 clauses and the GFMAM's 39 subjects, so organisations can build on their existing governance and APM work. The key is that GARPI™ always asks a paired question:
- What does your system – standards, processes, tools, APM stack – say should happen?
- What are your people and assets actually achieving in practice?
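To make the paired-question idea concrete, here is a minimal, purely illustrative sketch of a weighted composite index. The dimension names, weights and scores below are invented for illustration and are not GARPI™'s actual methodology or scoring scale; the point is simply that framing and performance are scored on the same dimensions, so the distance between them can be quantified.

```python
# Illustrative only: hypothetical dimensions, weights and scores,
# not GARPI's actual methodology.

def composite_score(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of per-dimension scores (here on a 0-100 scale)."""
    total_weight = sum(weights.values())
    return sum(scores[d] * weights[d] for d in weights) / total_weight

# Paired readings for the same (hypothetical) dimensions:
framing = {"strategy": 82, "information": 74, "lifecycle": 68}      # what the system says should happen
performance = {"strategy": 61, "information": 55, "lifecycle": 70}  # what is achieved in practice
weights = {"strategy": 0.4, "information": 0.3, "lifecycle": 0.3}

# A simple distance between framing and performance:
gap = composite_score(framing, weights) - composite_score(performance, weights)
```

The design choice worth noting is that the gap is a first-class output, not a by-product: a high framing score with a large gap tells a very different story from a modest framing score with a small one.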
2. Run longitudinally, on an annual cycle
GARPI™ is neither a one-off certification nor a static dashboard. It runs annually. One measurement gives you a baseline. Three measurements reveal your trajectory. Five or more surface deeper patterns of framing, overflow and reframing as your strategies, systems and APM artefacts interact with changing realities.
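The difference between a photograph and a trajectory can be shown with a small sketch. The annual readings below are invented, and a real longitudinal analysis would be far richer than a single slope, but even a least-squares trend over a few annual cycles answers a question no single snapshot can: are we improving, plateauing or eroding?

```python
# Illustrative only: invented annual readings, not real GARPI data.

def trend_per_year(readings: list[float]) -> float:
    """Least-squares slope of index readings taken one year apart."""
    n = len(readings)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(readings) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, readings))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

annual_scores = [58.0, 61.5, 60.0, 64.5, 66.0]  # five hypothetical annual cycles
slope = trend_per_year(annual_scores)           # average points gained per year
```

A single year's score of 66 looks identical whether the organisation climbed to it or is sliding down through it; the slope is what distinguishes disciplined capability-building from drift.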
3. Treat sociotechnical conditions as first-class data
GARPI™ explicitly examines the human and organisational conditions that allow both standards and APM to perform: workforce capability, organisational culture, clarity of decision rights, supply chain readiness and resilience. It treats sensors, models and platforms as embedded in sociotechnical systems, not as neutral add-ons.
4. Enable peer learning – not just scoring
Finally, GARPI™ is structured as a cohort benchmark. Participants can see not only their absolute scores but also how they sit relative to organisations with similar asset portfolios and regulatory contexts. The aim is to create a learning community grounded in rigorous, comparable data about what truly drives performance – beyond the artefacts.
From conformance and tools to performance and capability
Reframing asset management in performative and sociotechnical terms changes the questions we ask:
- Boards and executives move from "Are we compliant and well-equipped?" to "Are we performatively coherent – are our systems, tools, people and assets delivering the outcomes we say we value, and is that coherence strengthening or weakening over time?"
- Regulators and policymakers gain a more nuanced understanding of systemic risk and resilience in critical infrastructure than binary yes/no conformance or technology adoption metrics can provide.
- Practitioners gain language and evidence for what many already feel: that the real work of asset management happens where standards, APM artefacts and lived practice meet – in the performativity space.
Asset management has come a long way by professionalising around standards, frameworks and, more recently, powerful APM technologies. That journey was necessary. But if we want to unlock the next step-change in reliability, cost-effectiveness and resilience, we now need to complement conformance and tooling with a deeper, ongoing focus on sociotechnical performance.
That is the shift GARPI™ is designed to support.
If your organisation is interested in making its own performativity gap – including the gap between what its APM stack promises and what its sociotechnical system delivers – visible and actionable, explore GARPI™ and the underlying research at optimal.world/garpi.