“Most mining companies have legacy infrastructure in place that both poses high security risk as well as being an impediment to the adoption of new technologies.”
Over the past decade in which we have been surveying the global mining industry, we have witnessed a significant shift in this mindset, from a majority of self-described ‘fast followers’ to ‘industry leaders’.
The legacy remains, however. As is the case for many high-risk industries, a surprisingly large number of core operational technology systems are married to extremely outdated applications. This is as true for the space and airline industries as it is for mining. Simply put, the risk of making changes (with all their poorly understood flow-on effects) can be deemed too great.
“Security is a terrible word; it makes you think there is safety in a big door. But in reality, there are a million little holes and you just need to find one and then you’re in.”
Complexity is becoming unmanageable as such legacy systems become intertwined with new technologies, driven by the sheer rate of adoption, both planned and unplanned. Historically, software was purchased at very senior levels and licensed to all users in large commercial deals. Whilst this still happens with cloud platforms and ERP systems (to name just two), there is an increasing trend towards users having the discretion to subscribe to software themselves. It is this sales model that has catapulted start-ups such as Atlassian to market capitalisations in the tens of billions of dollars. The trend is also underpinned by organisational innovation and cultural initiatives.
“You have to remember you spent millions getting systems up for a reason, the major problem is that these are based on the assumption that everything will go as expected.”
Alongside new software is the exponential growth in connected devices on mining networks due to investments in automation and monitoring. Unlike much legacy technology on mine sites, these sensors and actuators have relatively short expected lives (both through technology obsolescence and reliability), creating ‘bow waves’ of sustaining capital expenses, depreciation and asset management complexity. With this increased automation and integration, ‘air gapping to contain and isolate cybersecurity attacks…is no longer an option and is ineffective.’
For cybersecurity teams, the complexity created through the melding of new technology with legacy systems is a double-edged sword. In one respect, the complexity has created a broad playing field for cyber adversaries to find and exploit vulnerabilities. In another respect however, it has also created a layered defensive arrangement that in some cases may complicate the speed and breadth of attacks. On balance, complexity with all of its technical debt is considered to be undesirable by cyber professionals in any industry – and things will get worse until newly adapted security processes and protocols are in place. As one interviewee put it, ‘a secure system is a simple, well-maintained system.’
“There must be transparency in systems put in place as well as a deep understanding of how their roles interact.”
Simplicity of systems and architecture design is going to go a long way towards solving many of these issues. In fact, building cyber resilience will itself give impetus to a shift towards architecture that is simplified, standardised and more efficient. Lifting cyber as a priority in the design of these systems is highly aligned with the objectives of digital capability and systems performance in general – modular, upgradeable, testable and integrated. This is quite the contrary of the ‘handbrake’ to digital transformation that cybersecurity is commonly considered to be.
A major implication of these shifts is the move to cloud-based data and software systems. One interviewee with deep technical expertise explained that this will reduce the number of security layers ‘from nine to one’. Modern architectures also imply different possibilities for how cyber software is deployed – edge computing offers the opportunity to decentralise defence software and protect data and commands before they are transmitted. Another pathway unlocked by this shift is the capacity to utilise the computing power available on large cloud data servers to run simulations that foresee potential catastrophic risks and act on them proactively. This offers the tantalising possibility of breaking the very real catastrophe-to-regulation cycle.
It also offers the opportunity to outsource large parts of the security task to global behemoths whose core value proposition relies on securing their clients’ data – these businesses include Microsoft and Amazon, with their thousands of cybersecurity professionals. As mining represents only a small part of the revenue of these businesses, however, the imperative to address its specialist security needs will always be marginal. In addition, like aircraft carriers, they are well protected for a reason; ‘did you know that cloud customers were hit with over six hundred million cyber-attacks last year?’ As several interviewees reiterated, every business ultimately owns its own risk – it cannot be outsourced.
What we are likely to see, because of the above requirements and advantages, is the growth of service firms that excel in the design of integrated, cyber-resilient systems and software. In these companies, skills in cyber, machine learning and user-centred design will come together. Businesses such as Singtel, Google and Palantir are at the vanguard of this movement. Prioritising users is critical for cyber, and indeed for success in digital transformation, enabling people to engage and do the right things intuitively. The best start-ups understand this viscerally.