Report reveals: A surprising number of organizations still tethered to outdated backup practices

You’re probably well aware that efficiency and optimisation are the name of the game for organisations in 2025.

But many businesses are unknowingly leaving immense value on the table, simply because they don’t maximise their backup strategies.

Added to this challenge is the reality that the winds of change aren’t letting up. With more of your teams depending on digital data, and with data security playing an ever-larger role, the backup and restore software market is predicted to keep climbing from 2025 to 2032.

While backing up data may seem like a no-brainer, outdated practices like manual backups remarkably continue to plague nearly a third of organisations.

For those of us trapped in the now-archaic practice of tape storage, the time has come to jump into the 21st century and consider an updated approach to our data management and protection strategies.

Today’s state of backup disarray is more common than you’d think   

In addition to the alarming 29% of organisations still manually copying backup data, 5% of businesses leave their SaaS applications completely unprotected.

While these statistics may seem shocking, they’re a symptom of an outdated mindset that backups are merely ‘nice to have’.

To understand how far we’ve come, let’s take a step back and consider the legacy of tape backups.

Manual intervention used to be the norm. We didn’t have the option to question the laborious practice of swapping tapes out daily and transporting them offsite for security. As well as being taxing, getting to a point of recovery frequently chewed up months of valuable time and resources that nobody had to spare.

Annoyance with backup tapes was something that Colin, Biggers & Paisley, a Sydney-based law firm, could relate to. Their systems administration manager, Steven West, has looked after their backup and recovery process for over a decade, and remembers the frustration well.

“Using backup tapes was a massive commitment in terms of both time and effort. We ran old-school backup and sync products (like tape and disks) to cover on-premises and virtual machines throughout the week, and on weekends and public holidays.

“Unfortunately, it didn’t leave us a lot of room for system maintenance windows. By the time our backup technician finished, we’d have over a dozen tapes or disks to remove from the offices and store off-site in a bank vault. Then we’d start the whole backup process again.”

Since investing in a modern, fully managed solution, the Colin, Biggers & Paisley team has come out the other side with business-enhancing advantages: streamlined processes, reduced downtime, and significantly reduced disaster recovery risk. What’s not to love?

Backing up responsibly is in your hands, so don’t sit on them

The age of cloud backups is saving the day, but to truly safeguard your organisation against the inevitability of a cyberattack, your backup strategy needs to be automated, tested, and proactive.
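What does ‘automated and tested’ look like in practice? As a rough sketch (the paths, naming scheme, and scheduling here are hypothetical, not any specific product’s behaviour), a scheduled job can copy the data and immediately verify the copy against a checksum:

```python
import hashlib
import shutil
from datetime import datetime, timezone
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Checksum used to verify that the copy matches the source."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def backup_and_verify(source: Path, backup_dir: Path) -> Path:
    """Copy `source` into `backup_dir` under a timestamped name,
    then verify the copy byte-for-byte before declaring success."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    dest = backup_dir / f"{source.stem}-{stamp}{source.suffix}"
    shutil.copy2(source, dest)
    if sha256_of(source) != sha256_of(dest):
        raise RuntimeError(f"Backup verification failed for {dest}")
    return dest
```

Run something like this under cron or a task scheduler and alert on any failure. The point is that no human has to remember to do it, and an unverifiable backup is treated as no backup at all.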

Rather than mopping up the mess after a disaster, the first prize is to get ahead of the problem.

Enter the proactive approach of data resilience. With this key piece of the disaster recovery puzzle in place, you can weather a cyberattack, retrieve critical data and applications during the attack, and get back to work in record time.

To shed light on how to implement robust cybersecurity plans that defend your business during that inevitable attack, these three standards can help you cut through the clutter:

  1. The Shared Responsibility Model

    The essence of this security and compliance framework may come as news to some: protecting your data is your responsibility.

    Australian businesses use an average of three public cloud service providers, and many assume that data protection is part of the deal. But while platforms like Microsoft 365, Salesforce, and Google Workspace manage the infrastructure, many cloud applications offer only limited protection for your data.

    Building cyber resilience requires action and should be top of your agenda. Make a start today by confirming that your SaaS solutions like Xero, Salesforce, or Intuit are truly backed up.

  2. Australian Signals Directorate (ASD) Essential Eight

    According to guidelines from the ASD, data protection plans should include:
    • Consistent and trustworthy backups that support the continuation of your business services
    • Disaster recovery activities that stress-test the effectiveness of your data restoration strategy
    • Guardrails that prevent backups from being modified or deleted during their retention period (even by backup administrator accounts)
    • Proactive retention procedures that guarantee secure compliance

  3. The National Institute of Standards and Technology (NIST) 3-2-1 Rule

    While navigating the expanse of security threats can feel complex and daunting, the NIST 3-2-1 Rule is simple, clear, and succinct:
    • Keep three copies of your data
    • Store the copies on two different types of media
    • Keep one copy offsite (for example, in the cloud)
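The rule is simple enough to check programmatically. As an illustrative sketch (the inventory format below is invented for this example), a few lines of Python can validate a backup inventory against 3-2-1:

```python
def meets_3_2_1(copies: list[dict]) -> bool:
    """Check backup-copy records against the 3-2-1 rule: at least three
    copies, on at least two media types, with at least one offsite.

    Each record looks like:
        {"location": "sydney-dc", "media": "disk", "offsite": False}
    """
    enough_copies = len(copies) >= 3
    two_media = len({c["media"] for c in copies}) >= 2
    one_offsite = any(c["offsite"] for c in copies)
    return enough_copies and two_media and one_offsite

inventory = [
    {"location": "prod-server", "media": "disk", "offsite": False},
    {"location": "office-nas", "media": "disk", "offsite": False},
    {"location": "cloud-bucket", "media": "object-storage", "offsite": True},
]
print(meets_3_2_1(inventory))  # True: 3 copies, 2 media types, 1 offsite
```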

Subpar backup strategies can result in eye-watering costs

According to Forbes, the global average cost of a data breach amounted to a whopping $4.88 million in 2024. As we all become increasingly data-dependent in our workplaces, the expected consequences for those of us who don’t have plans to upgrade to automated, cloud-based solutions are clear.

While your teams may already be lagging behind in their backup strategy, all is not lost. The promise of efficiency and peace of mind are well within reach if you’re ready to fight—and win—the war on risk.

A heads up for 2025: 10 data management trends to keep your eye on

Welcome to 2025. So, what are the emerging data management trends that will frame up the future? Yours, and ours.

We’ve all become ever more reliant on data for decision-making, strategic planning and operational efficiency. These are some of the trends that will change and improve how we manage information in our businesses.

1. Data fabric architecture is (still) hot to trot

The concept of data fabric was first floated in the early 2010s (and Forrester made the term official in 2013), but it didn’t gain real traction until around 2018.

This year (and beyond), it’s expected that data fabric architectures will continue to rise in popularity. In its Top Trends in Data & Analytics (D&A) through 2030, Gartner predicts that by 2027, “30% of enterprises will use data ecosystems enhanced with elements of data fabric supporting composable application architecture to achieve a significant competitive advantage.”

Data fabric is a fundamentally different approach to data management. It integrates disparate data sources across on-premises and cloud environments into a single, cohesive framework.

What will adopting a data fabric architecture mean for you? With a unified view of your data, you can streamline how you access, manage, and analyse it. Both data governance and decision-making will become easier, as your stakeholders can easily get their hands on real-time data. And as you expand – and your data silos multiply – data fabric will help you maintain visibility across all of your organisational data.

2. Cloud data management is here to stay

First up, we need to say that, in all honesty, the shift towards any cloud-based solution can no longer be called a trend. It’s a fundamental, done-and-dusted change in how today’s businesses approach data storage and management.

We all know that businesses have moved away in droves from traditional on-premises solutions in favour of scalable, flexible cloud environments. (As of 2021, O’Reilly Media reported that cloud adoption by businesses, from SMEs to mega-enterprises, was already over 90%.)

So, when it comes to cloud data management solutions, what’s in it for you? Try the big three: accessibility, collaboration, and cost-efficiency. You’ll be able to leverage the best features of multiple cloud services, support data redundancy and disaster recovery, and also tick all the boxes for scalability, flexibility, and cost savings. Best of all, you can choose the best services for your needs – while avoiding vendor lock-in. Not to forget, you’ll be able to support dynamic workloads and drive innovation.

3. Data governance and compliance will just get tougher

As data privacy regulations tighten even further, effective data governance is flagged as a critical priority for organisations around the world.

To comply with regulations such as GDPR, CCPA, and HIPAA, the pressure will be on to implement robust governance frameworks. This includes defining clear policies for how you access, use, and protect your data while also maintaining transparency with your stakeholders regarding your data practices.

What does this mean for you? You’ll need to invest in tools and technologies that will automate your compliance reporting and track the lineage of your data. The role of Data Stewardship will become more prominent, so it’s likely you’ll need to designate responsibility to someone in the business for overseeing data quality and compliance.

It’s also worth noting the larger governance picture: Gartner says that “current data governance practices are often too rigid and insensitive to the business context. By 2027, for example, 60% of organizations will fail to realise the anticipated value of their AI use cases due to incohesive data governance frameworks.”

4. Real-time data processing for real-time decisions

The rise in demand for real-time data processing is another trend to watch for as businesses increasingly seek to make on-the-spot decisions.  

In The Wall Street Journal’s article ‘In an On-Demand World, Real-Time Data Is Becoming an Expectation’, AWS’s VP of Messaging and Streaming says: “Anything outside of using real-time data can frustrate end consumers and feel unnatural. Having real-time data always available is becoming an expectation for customers. It’s the world we’re living in.”

As for your business? Be on the lookout for technologies such as stream processing and event-driven architecture. These will allow you to analyse data as it arrives so you can respond quickly to changing market conditions and consumer behaviour. Another bonus is that real-time analytics can improve the customer experience. You can delight them by acting and reacting more quickly and making personalised recommendations to ramp up both engagement and loyalty.

Are you in the finance, healthcare, or e-commerce industry? Then, you’ll find real-time analytics particularly beneficial for fraud detection and patient monitoring.
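To make the idea concrete, here is a toy stream-processing sketch (the event format, window, and threshold are invented for illustration; production systems would build this on platforms such as Kafka or Flink). It flags a card the moment its spend within a sliding time window exceeds a limit – a decision made as each event arrives, not in a nightly batch:

```python
from collections import defaultdict, deque

def detect_fraud(events, window_seconds=60, limit=1000.0):
    """Process (timestamp, card_id, amount) events as they arrive and
    yield (timestamp, card_id) whenever a card's spend inside the
    sliding window exceeds `limit`."""
    recent = defaultdict(deque)  # card_id -> deque of (ts, amount)
    for ts, card, amount in events:
        q = recent[card]
        q.append((ts, amount))
        # Evict events that have fallen out of the window.
        while q and ts - q[0][0] > window_seconds:
            q.popleft()
        if sum(a for _, a in q) > limit:
            yield ts, card
```

Because the check happens per event, the alert fires seconds after the suspicious transaction, rather than the next morning.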

5. Decentralised data management makes its move

With the rise of blockchain technology, decentralised data management is gaining momentum. This approach enhances data security and integrity by allowing multiple parties to access and verify data – without relying on a central authority.

So, how is this helpful? If you’re in an industry where consumers turn to you for trust and transparency (think finance, healthcare, and supply chain management), then decentralisation is particularly beneficial. It also helps you reduce the risk of data breaches and ensure that your private information remains…private.

6. Emerging data privacy and security enhancements

With privacy regulations tightening and cyber threats evolving, expect a steady stream of privacy and security enhancements across the tools you already use – from stronger default protections to zero-trust approaches that verify every request for access. The takeaway? Review how your current platforms protect your data, and where the gaps are.

7. Data is no longer a byproduct but a standalone product

The view of data is changing from a byproduct of a business’s operations to a product in its own right – or, in techspeak, Data as a Product (DaaP).

How will DaaP impact you? The DaaP approach encourages you and your team to treat your data with the same care and strategy as your most valued (and valuable)  products and services. By focusing on the quality, usability, and customer value of your data, you can create new revenue streams and, again, enhance the customer experience. A DaaP approach also fosters accountability and ownership within your teams as the value of the business’s data is in their hands.    

8. Sustainability in data management will become a thing

As awareness of environmental issues rises, many organisations are being pushed to consider how sustainable their data management practices are. Considerations include optimising data storage to reduce energy consumption and adopting cloud solutions that prioritise eco-conscious operations.

How can you do your bit? Talk to your data centre about how they manage energy consumption, utilise renewable energy sources, and apply practices that reduce the carbon footprint of data storage and processing. (Any data centre worth its salt will have this information at its fingertips.) Also, consider the lifecycle of your data – from creation to disposal. Does it align with your sustainability goals?

9. Data democratisation, self-service data platforms, and data literacy programmes

As we all increasingly focus on the trend of making data accessible to all employees (regardless of their technical expertise), there will also be more demand for user-friendly analytics platforms and self-service tools. Obviously, training programs will be essential to allow your employees to make confident data-driven decisions.

What will this mean for you? If you aim to foster self-sufficiency in a data-driven culture, then you need to plan to investigate self-service platforms that allow your non-technical users to access, analyse, and visualise data. All without turning to an already under-the-pump IT department for support. And back it up with data literacy programs so your employees can read, understand, and communicate data effectively.

10. AI and Machine Learning integration

No list of trends is complete without a very large nod to artificial intelligence (AI) and machine learning (ML). And yes, it’s a pretty fair bet that every trends list for this year will include both – and with good reason.

What will this mean for you? AI and ML are transforming data management by automating the time-consuming work of data classification, cleaning, linking, and analysis. If your team never again has to trawl through a large dataset by hand, they’ll thank you for it.

You can also look forward to using AI to forecast market trends and customer behaviour with uncanny precision – so, in turn, your marketing team can use their talents to develop more targeted and effective marketing strategies.

And with AI, you can finally convert those large volumes of data that have been left to languish into actionable insights and drive predictive analytics. For example, AI algorithms can identify patterns in vast datasets, so you can anticipate trends and make those data-driven decisions pronto. AI can also personalise data experiences for your users so they can find what they need more easily – and reduce errors.

So, what does this all mean?

So, given these trends, what will 2025 look like? The data management landscape this year will be typified by a blend of technological advancements, regulatory compliance, and a strong emphasis on security and sustainability. In many ways – no big surprises.

Those organisations (and we sincerely trust you will be one of them) that proactively adapt to these trends will be in a much better position to turn their data into a strategic asset to fuel growth and innovation.

Why Data Governance Risk & Compliance is a risky business. And what you can do about it.

While Data GRCaaS (Data Governance Risk and Compliance as a Service) may sound like yet another incomprehensible IT acronym to many, it’s likely to greatly interest those in your business responsible for managing risk. They know exactly how important GRC is to the well-being of your business and its future.

Non-compliance with data protection regulations is both risky and expensive. It can cost dearly in terms of reputation and financial penalties that few can recover from. One such example is the Medibank data breach of 2022. The Office of the Australian Information Commissioner (OAIC) has started court proceedings against Medibank for failing to take reasonable steps to protect customers’ personal information from misuse and unauthorised access or disclosure, in breach of the Privacy Act 1988. If the prosecution is successful, the maximum civil penalty order theoretically available under the Privacy Act in this case is an unimaginable AU$21.5 trillion. It’s unlikely that a fine of this magnitude will be awarded, but it signals how seriously the OAIC takes the case – and the importance of GRC.

But let’s back up a bit first and define Governance Risk and Compliance (GRC) and why it’s relevant to your data.

A (very quick) guide to GRC

Governance, Risk, and Compliance (GRC) is a structured approach used by organisations to align their IT and business goals while managing any risks. It helps to ensure compliance with regulations and maintain effective governance practices.

In plain language:

  • The Governance part refers to the framework of rules, processes, and practices your organisation follows. It encompasses establishing policies, taking accountability for meeting those policies, and overseeing your business performance.
  • The Risk aspect focuses on identifying, assessing, and managing risks that could impact your ability to achieve your objectives. It includes risk management strategies and practices to mitigate potential threats (for example, cyber threats).
  • And the Compliance part is the process of making sure you follow the letter of the law and adhere to both external regulations and your own internal policies. This includes monitoring and reporting on any compliance-related issues and ensuring your business meets its legal and ethical standards.

So, what’s Data GRC?

In adding Data to the GRC mix, the focus moves quite specifically to the areas of risk associated with data, like uncontrolled or illegal data access, exposure to data breaches, cyberattacks, and insider threats.

Safeguarding your data (and handling the ‘what comes next’) presents a unique set of challenges and adds still more layers of complexity to your GRC initiatives. Given the dynamic nature of cybercrime and the increasingly heavy fines for those who fail to protect their data, it’s a genuine worry for most businesses. And managing it internally, without expert and dedicated resources who have the time and knowledge to monitor, manage and protect your data 24/7, comes with its risks.

What are some of the everyday data risks we’re talking about?

  1. The effort of keeping up and responding to ever-evolving legal, industry, and internal requirements regarding how you protect your data, what you must do in case of a breach, and by when.
  2. Being blindsided by an incomplete view of your data.
  3. Slow responses at times of need, caused by manual remediation processes for mitigating risks.
  4. The struggle to implement and maintain a zero-trust posture to strengthen your security and compliance initiatives.
  5. No audit trail – leaving you with no idea who has accessed, deleted, created, or moved your data.
  6. The inability to identify, prioritise, and address data security needs in real time (before it’s too late).
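Several of these risks come down to visibility, and risk 5 – the missing audit trail – is often the easiest place to start. As a minimal sketch (the record format and in-memory storage are invented for illustration; a real trail would live in an append-only, tamper-evident store), every data operation can be wrapped so that who did what, to which resource, and when is always recorded:

```python
from datetime import datetime, timezone

AUDIT_LOG = []  # stand-in for an append-only, tamper-evident store

def audited(action):
    """Decorator that records who performed which action on which resource."""
    def wrap(fn):
        def inner(user, resource, *args, **kwargs):
            AUDIT_LOG.append({
                "when": datetime.now(timezone.utc).isoformat(),
                "who": user,
                "action": action,
                "resource": resource,
            })
            return fn(user, resource, *args, **kwargs)
        return inner
    return wrap

@audited("delete")
def delete_file(user, resource):
    # The real deletion logic would go here.
    return f"{resource} deleted by {user}"
```

With every sensitive operation wrapped this way, ‘who accessed, deleted, created, or moved your data’ stops being a mystery.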

What is Data GRCaaS?

Data GRCaaS uses a service-based modular strategy designed to help you safeguard your data and ensure it is managed according to an agreed data compliance framework. And because the service is cloud-based and therefore scalable (and supported by industry-leading best practices and committed resources), it replaces the costly in-house infrastructure and experts you’d need to do the same job. It works across your entire environment – on-prem, cloud, or hybrid.

In real-world terms, what’s in it for you? How will it improve your GRC? Let’s take a look.

How does Data GRCaaS deliver on your compliance wish list?

Regulatory compliance is at the top of the GRC list. The good news is that if you need to comply with and report on data standards like SOCI, ACSC ISM, GDPR, PCI, HIPAA, HITECH, SEC, SOX, CJIS, CMMC, or PIPEDA in addition to your internal policies, you’re covered. With Data GRCaaS, it’s far harder to slip up.

Data GRCaaS allows you to get to grips with your data. You’ll be able to discover, identify, classify, and label your sensitive data at scale in preparation for implementing DLP (data loss prevention). And this is a very good thing; DLP solutions help you protect your critical information, whether stored on endpoints, in the cloud, or in transit. Deep integration with Microsoft means it will also identify and categorise sensitive information in your emails, Teams, and SharePoint and pick up any unauthorised data exposure or behaviours. You’ll also save money with your newfound ability to identify stale data and decide if it can be archived or deleted – driving down your data storage costs.
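Under the hood, discovery and classification at scale is largely pattern-driven. Here’s a deliberately simplified sketch of the idea (the patterns are toy versions invented for this example; real DLP engines use validated, context-aware detection rather than bare regular expressions):

```python
import re

# Toy detection patterns -- real DLP rules are validated and context-aware.
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "au_tfn": re.compile(r"\b\d{3} ?\d{3} ?\d{3}\b"),  # Tax File Number shape
}

def classify(text: str) -> set[str]:
    """Return the set of sensitivity labels whose patterns match `text`."""
    return {label for label, rx in PATTERNS.items() if rx.search(text)}
```

Once content carries labels like these, downstream policy – block the email, encrypt the file, flag the share – can be applied automatically.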

You’ll also improve your security posture. Data GRCaaS helps you mitigate the risk of data breaches, cyberattacks, and insider threats by adopting a zero-trust, least-privilege approach. Other compliance improvements include managing your permissions and understanding who is accessing, deleting, creating, and moving data – so you have control and visibility.

Peace of mind (and this can’t be overstated in terms of importance for those responsible for your GRC). A Data GRCaaS solution will mitigate your risk when it comes to data breaches, cyberattacks, and insider threats. It will also identify and action file-level security breaches as they happen. This includes insider threats, malware, and ransomware.

Lastly, your back is covered 24/7. Data GRCaaS is supported by real people who continuously oversee the management, reporting, and remediation of your data security, governance, and compliance risks – day in and day out.

What next?

With Data GRCaaS, you’ll be able to understand and remediate industry-relevant data risks by type, sensitivity, regulation, risk, policy, and more. And we guarantee that’s going to make a lot of people happy – and better able to sleep at night.

Beyond backup: The compelling case for data resilience

Thinking that simply backing up your data will save the day is a shortsighted strategy with little or no place in today’s world. Because when it happens – that inevitable cyberattack or natural disaster – you’ll find that just having a copy of your data is far from enough.

And if you have a hybrid cloud environment, with data sprawled across myriad locations and platforms, then you assuredly need more than just backups to save your bacon.

If you haven’t yet developed a data resilience strategy, there’s no time to waste. The latest Notifiable Data Breaches Report from the Office of the Australian Information Commissioner revealed a rapid rise nationwide in notifiable data breaches in the first six months of 2024.

At the risk of sounding like a broken record, we once again say: It’s no longer a matter of if (you’re attacked), but when.

Backup vs. data resiliency

Just so there’s no confusion:

Should you be creating backups? Obviously – that’s a yes. Backing up your data is essential for data recovery – but it’s a reactive approach, a pink band-aid applied after the accident in the hope that it will hasten recovery. Yes, backups restore your lost data. But they won’t prevent you from losing it in the first place, and the post-disaster backup process can lead to significant downtime, as your systems may need to be taken offline to restore data.

By comparison, data resilience is a proactive approach. It focuses on preventing data loss and ensuring continuous availability. So, when disaster strikes (as it will), your business can keep running, downtime is minimised, and data integrity is maintained.

In short, if you’re not thinking about data resilience, you’re not thinking far enough ahead.

What does disaster look like?

What happens to your business when you experience a natural disaster or cyber-attack? Why can this sort of event stop your people and operations in their tracks?

Here’s what can happen:

  1. Operational systems out of commission: Your core business applications and systems may become inaccessible, halting production, sales, or service delivery. Everything you rely on to run a business is in ‘off’ mode.  
  2. Employee productivity plummets: Your staff may be unable to perform their tasks effectively, leading to decreased productivity, frustration, fear, and low morale.
  3. No access to data: Being unable to access essential data, including customer information, financial records, and operational data, can severely impact your decision-making and operations.
  4. You can’t communicate: Your communication tools (think email, messaging platforms, etc.) can be compromised. Your team members can’t talk to each other, let alone to your customers and suppliers.
  5. Disrupted financial transactions: Your payment processing systems may be disrupted, preventing sales and impacting your cash flow.
  6. Zero customer service: If your customer support systems go down, it’s a red flag for your customer relationships. Few customers are impressed with delayed responses to their queries and requests for help and are fast to change loyalties.
  7. You can quickly get a bad rep: Trust can be rapidly eroded if customers learn of the breach, leading to potential loss of business and reputation damage.
  8. Failed regulatory compliance: Your compliance with data protection laws may be at risk, resulting in legal consequences and significant fines.
  9. Disrupted supply chains: If your suppliers or partners are affected, it may disrupt your supply chain, impacting inventory and delivery.
  10. The cost of recovery: Then, there’s the financial burden of remediation efforts, including IT forensics, system repairs, and potential legal fees. All of which can place a heavy strain on your people and your bank balance.

Given the potential impact on your business, relying on backups to dig you out of the deep hole of disaster is highly optimistic.

Data resilience – a holistic approach

Data resilience is about ensuring business continuity. It’s accepting that the impact of an attack can be wide and varied and that just restoring data via back-ups isn’t going to be enough to get you back in business.

Don’t get us wrong – backups are essential (and play an important role in a data resilience approach) – but they’re only part of the picture. Big-picture data resilience also encompasses recovery, redundancy, disaster recovery (DR) planning and cybersecurity. And it requires you to implement measures that ensure data availability, integrity, and security even in the face of unexpected events to minimise data loss and maintain business continuity.

Adopting a data resilience strategy can help your business before, during, and after an incident in three ways.

  1. It enables you to better withstand a cyber-attack.
  2. If you’re already impacted, it helps you to access your most important data and applications despite network disruptions or failures.
  3. It supports your rapid recovery and return to BAU.

How about data resiliency in a hybrid or multi-cloud environment?

Security and recovery are not assured simply because you’re in the cloud – whether public or private. And scarily, backup repositories are targeted in 96% of attacks, with bad actors ‘successfully’ affecting those repositories in 76% of cases.

If you count yourselves amongst the 89% of organisations with a multi-cloud strategy, you’re probably well aware of the challenges of backing up in the cloud. Legacy systems don’t deliver; relying on native backup tooling for each environment fragments management and creates inefficiencies and higher costs; and some first-party vendor solutions restrict flexibility and compromise performance, which drives up costs.

However, as said earlier, just investing in backup (no matter how good) on its own is a shortsighted strategy. Achieving data resilience requires your backup and cybersecurity teams to be aligned. To quote Veeam’s 2024 Ransomware Trends Report, “Recovery from a ransomware attack is a team sport.”

Yet most organisations struggle with this alignment, with 63% saying they need a complete overhaul or significant improvement to be fully aligned.

When asked why their teams weren’t better aligned, the most common answer (by respondents to Veeam’s report) was “a lack of integration between backup tools and cybersecurity tools.”

Summary

It’s been said that backup is easy, but recovery is hard – especially if you’re relying on your saved data to do more than it was ever intended. And with the rate at which we generate data and the increasing complexity of our technology environments, ‘hard’ isn’t a word that any of us want to hear.

A data resilience strategy that utilises integrated backup and cybersecurity tools is essential to survive D-day.

Whether it’s your first, tenth, or hundredth attack, you need to be able to face every event with the confidence that you will come out the other side with your data and business intact. Resilient to the end.

Is the Government being a tad overprotective of our critical infrastructure?

In our previous critical infrastructure blog, we discussed the Security Legislation Amendment (Critical Infrastructure Protection) Act 2022 – aka the SLACIP Act, whether it applies to you, and if yes, what you need to know.

But backing up a bit – why exactly did this act come about? What’s changed in the last few years, and has our Government overreacted?

Worrying trends

Let’s look at the ASD (Australian Signals Directorate) Cyber Threat Report 2022-2023 to get some local perspective.

In its report, ASD says upfront: “…Australian governments, critical infrastructure, businesses and households continue to be the target of malicious cyber actors…This threat extends beyond cyber espionage campaigns to disruptive activities against Australia’s essential services.”

Key trends identified by ASD in FY 2022-23 (as relating to critical infrastructure) include:

  1. State actors focused on critical infrastructure – data theft and business disruption. Here, ASD reports that, as part of their ongoing information-gathering campaigns or disruption activities, state cyber actors have targeted government and critical infrastructure networks globally. (A state actor is an individual or group acting on behalf of, or with the backing of, a government.) Cyber operations, says ASD, are “increasingly the preferred vector for state actors to conduct espionage and foreign interference.” In recognition of this, ASD joined international partners in 2022-23 to call out Russia’s Federal Security Service’s use of ‘Snake’ malware for cyber espionage. It also highlighted the actions of a People’s Republic of China state-sponsored cyber actor that used ‘living-off-the-land’ (LOTL) techniques to compromise critical infrastructure organisations. A LOTL attack uses legitimate and trusted system tools to launch cyberattacks and evade detection. State actors often possess advanced capabilities and, given the nature of their backers, have significant resources at their disposal.
  2. Australian critical infrastructure was targeted via increasingly interconnected systems. ASD reports that ‘operational technology connected to the internet and into corporate networks provided opportunities for malicious cyber actors to attack these systems.’
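Because LOTL attacks use tools defenders already trust, detection usually means hunting for suspicious usage of those tools rather than the tools themselves. As a toy illustration (the indicator strings below are invented for the example and are nowhere near a real detection ruleset), a log scan might flag command lines that pair trusted binaries with high-risk arguments:

```python
# Toy indicators: trusted binaries paired with arguments that are rarely
# legitimate in day-to-day use. Real detections are far richer.
SUSPICIOUS_USAGE = {
    "powershell": ["-encodedcommand", "downloadstring"],
    "certutil": ["-urlcache"],
    "wmic": ["process call create"],
}

def flag_lotl(command_lines):
    """Return command lines where a trusted tool appears with a
    high-risk argument -- a rough living-off-the-land heuristic."""
    hits = []
    for line in command_lines:
        lowered = line.lower()
        for tool, indicators in SUSPICIOUS_USAGE.items():
            if tool in lowered and any(i in lowered for i in indicators):
                hits.append(line)
                break
    return hits
```

The same binaries in ordinary use pass straight through; it’s the combination of trusted tool plus unusual argument that raises the flag.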

Stats and facts

Over the 2020–21 financial year, ACSC (the Australian Cyber Security Centre) received over 67,500 cybercrime reports. This was an increase of nearly 13% over the previous year. The self-reported losses totalled $33 billion. Of these reported incidents, ACSC estimated that approximately 25% were associated with Australia’s critical infrastructure or essential services.

During the 2022-23 period, ASD notified seven critical infrastructure entities of suspicious cyber activity (it was five the previous year).

Over that time, ASD responded to 143 incidents that were directly reported by entities that self-identified as critical infrastructure (the previous year saw 95 incidents reported). Luckily, nearly all these incidents were low-level malicious attacks or isolated compromises.

57% of the incidents affecting critical infrastructure involved compromised accounts or credentials, compromised assets, networks or infrastructure, or DoS attacks. Other ‘popular’ attacks included data breaches and malware infection.

So, why do bad actors attack?

There’s no one reason for attacking critical infrastructure.

The sensitive information they hold, their high levels of connectivity with other organisations and critical infrastructure sectors, and the essential services they provide make them alluring targets for those keen to disrupt life as usual, profit from insider knowledge, or wreak revenge for perceived political slights.

From hospitals losing access to patient records – as happened in France in 2022, where the health system reportedly sustained a number of cyber incidents that resulted in cancelled operations and shut-down hospital systems – to the widespread fallout from a 2023 attack on Denmark’s energy infrastructure, the impacts are significant.

The reality is that it only takes one successful attack to cripple regions, economies, and communities – and it takes a huge amount of work (and can involve significant human distress) to restore the status quo.

Why is critical infrastructure such a good target?

Critical infrastructure networks are known for their interconnected nature. This, along with the third parties in their ICT supply chain, broadens the attack surface for many entities. Weak points include remote access and management solutions, which are becoming prevalent in critical infrastructure networks.

Operational technology (OT) and connected systems are also a dangling carrot for bad actors. They can target OT to access corporate networks – and the other way around. This allows them to move laterally through systems to reach their target destination. Even if an offensive isn’t directly on an OT, attacking via connected corporate networks can disrupt operations.

And, of course, any internet-facing system where the hardware or software isn’t updated with the latest security patches is vulnerable to exploitation, as are ICT supply chains and managed service providers.

Is the Government overreacting?

We’d say not.

In justifying the need for further reforms to more tightly regulate Australia’s critical infrastructure, the Government stated in 2022 that ‘Australia is facing increasing cybersecurity threats to essential services, businesses and all levels of government’.

At the time, the Prime Minister warned that cyberattacks were a ‘present threat’ and acknowledged they were a ‘likely response from Russia’ following the Government’s decision to impose sanctions in response to Russia’s recent aggression against Ukraine.

In its overview of the 2022 SLACIP bill, the Government also noted that the Parliamentary Joint Committee on Intelligence and Security (PJCIS) had ‘received compelling evidence that the pervasive threat of cyber-enabled attack and manipulation of critical infrastructure assets is serious, considerable in scope and impact, and increasing at an unprecedented rate’.

To be forewarned but not forearmed is a shortsighted strategy. We’re pleased to say that introducing SLACIP to protect our critical infrastructure shows that the Australian Government has paid close attention to ensuring we can protect what makes the world down under go around.

AI: 101 (part 2) – Making a business case for AI

A quick recap: In our previous blog, we discussed the challenges and complexities behind the rush to embrace AI. We talked about what you needed to run AI (lots and lots of good data, and specialised hardware and software), the GenAI hype cycle, and the potential for use cases to simply bomb without delivering value.

Lastly, we finished with some somewhat worrying stats that questioned the readiness of Australian organisations to harness AI, let alone understand what they’re going to do with it.

So, now that you’re up to date, let’s move on to use cases: which ones are well defined and understood, and why others are just pie in the sky.

AI legal eagles

Earlier this year, Melbourne law firm Lander & Rogers set up their own AI Lab within the practice. They are currently working up “three or four” prototypes a day (mainly using Microsoft Copilot) to test how they can leverage AI to interact with various data types. Some of the most valuable use cases uncovered (from a flood of ideas generated internally) are those that save lawyers time in finding and surfacing the information they need.

Among Lander & Rogers’ winning use cases is engaging AI to build a chronology of events in legal cases.

What’s not on their agenda, though, is using AI to rewrite a lawyer’s work. Their Head of AI Engineering, Jared Woodruff, says, “That’s not what AI is meant to do. The AI is there to give them [lawyers] all the information that they need to execute that decision and execute it with precision, saving them time.” 

Lander & Rogers has taken a strategic approach to its areas of focus and has defined where AI can be used to deliver measurable business value in the legal profession.

More great legal tech

Working with AI service provider Automatise, Ethan, an Australian-owned technology service provider, has invested heavily in building Cicero, a pioneering AI tool specifically designed for the Australian legal sector.

Ethan says that Cicero has already been adopted by several mid-tier and enterprise law firms in Australia and is transforming workflows and enhancing productivity. To quote: “As these firms integrate Cicero into their operations, they experience firsthand the benefits of high quality, coherent summaries and analyses of legal documents, a feat made possible by the fine tuning of LLMs for local use cases.”

Again, this is another great, well-thought-out use case that meets specific industry needs. If all goes to plan, it will transform the Australian legal industry and deliver an impressive ROI.

AI pie in the sky

There are numerous high-profile examples of poor AI use cases. Some are simply ill-conceived, ethically irresponsible, dangerous, or just plain thoughtless. Others have used insufficient or inadequate training data, which produced skewed and reprehensible outcomes.

This hasn’t daunted the would-be AI adopters, though.

McKinsey’s 2024 global survey on AI reports that 65% of respondents said they regularly use generative AI for at least one business function. However, only 10% of those organisations had implemented gen AI – at scale – for any use case.

Further to this, a senior partner at McKinsey, when speaking at the MIT Sloan CIO Symposium, said that while there are many organisational initiatives, “a lot of the efforts are scattershot and don’t contribute to the bottom line.” McKinsey’s survey confirms this, saying that only 15% of the responding companies realised an improvement in earnings for those AI initiatives.

AI isn’t cheap or easy

Why is the failure rate (or inability to generate an ROI or measurable business value) so high? This is where we dig out the old axiom: ‘fail to plan, plan to fail.’

Like any technology project, there needs to be rigour around the ‘what, why, and how’ of the business case. Major considerations include:

  • Setting out your commercial objectives – in other words, defining the problems you’d like to solve within your business, as well as the desired outcomes. Then, deciding if AI is, in fact, the right solution.
  • Ensuring your data is up to scratch – remembering that AI is data, your data needs to be up to date, accurate, relevant, ample – and used appropriately. All of this requires preparing and adhering to a sound data governance strategy.
  • Realistic expectations – yes, AI can be wonderful, but it’s not a magical cure-all. It’s critical not to overestimate the capabilities of AI, and it is essential to test and validate systems to ensure they meet the basic requirements of safety, compliance, accuracy, ethics, transparency, fairness, and security.
  • Making sure you’re resourced up – adopting AI comes at a cost. Just as you wouldn’t let a newly qualified driver loose in your brand-new Tesla, you wouldn’t (or shouldn’t) place your trust in anyone who doesn’t understand the legal, ethical, and data considerations mentioned earlier. A successful AI project also requires an investment in technology, data and infrastructure. Poor infrastructure can result in performance issues and failure to support the implementation of advanced AI models, compromising both their efficiency and reliability.
  • Scalability – it’s also critical to test AI projects at scale. What works perfectly as a test project may disappoint in terms of efficiency and reliability when rolled out to the entire organisation. 
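The ‘data up to scratch’ point above is the one most often skipped in practice. As a minimal sketch, a pre-training gate might check each record for completeness, plausibility, and freshness before it enters a training set; the field names and thresholds below are illustrative assumptions only:

```python
from datetime import date, timedelta

# Hypothetical data-readiness gate: before a record enters an AI training
# set, verify it is complete, plausible, and recent. The required fields
# and the 365-day freshness threshold are illustrative assumptions.

REQUIRED_FIELDS = {"customer_id", "value", "updated"}
MAX_AGE = timedelta(days=365)

def is_training_ready(record, today=date(2024, 6, 1)):
    if not REQUIRED_FIELDS.issubset(record):
        return False                      # incomplete record
    if record["value"] is None or record["value"] < 0:
        return False                      # implausible value
    if today - record["updated"] > MAX_AGE:
        return False                      # stale data
    return True

fresh = {"customer_id": 1, "value": 42.0, "updated": date(2024, 5, 1)}
stale = {"customer_id": 2, "value": 10.0, "updated": date(2021, 1, 1)}
print(is_training_ready(fresh), is_training_ready(stale))  # → True False
```

A real data governance strategy covers far more (lineage, consent, access control), but even simple automated gates like this catch the silent quality problems that derail AI projects.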

Blue skies or uncertain horizons?

We know of many businesses that are keen to increase their compute power so they can train their own AI. And we’re supportive of that; we love to see organisations innovate.

But what concerns us is that few know what they actually want to train AI to do.

Without clarity of purpose, a strong business case, and a structured, disciplined approach, AI has the potential to become an expensive toy rather than a transformative technology that contributes to the bottom line.

AI: 101 (When, why, and what the hell?)

AI is going to change the world. It’s bigger than the internet. All of our jobs will disappear.

And so, the headlines continue. Everybody who’s anybody has made a meaningful quote about AI, and every technology business has jumped on the AI bandwagon with the same ready-or-not alacrity they embraced delivering cybersecurity services.

But you’ll have to excuse us if we’re going to take a bit more time to think about this. Because AI poses significant new challenges and complexities, we want to take the time to process the implications, not just pick it up and run with it.

If you’re feeling equally cautious about AI, you are not alone. In their (well-worth-a-read) feature article from April 2024, “Despite the Buzz, Executives Proceed Cautiously With AI,” Reworked raises the same concerns and cautions.   

So, backing up a bit, let’s take a 101 approach to AI and start at the beginning.

What does AI even mean?

We all know that AI stands for artificial intelligence. The term was coined in 1955 in a proposal for a two-month, 10-man study of artificial intelligence. The ‘AI’ workshop itself took place a year later, in July and August 1956, which is generally considered the field’s official birthdate.

Today, AI describes the simulation of human intelligence processes by machines (mainly computers), and can be seen in expert systems, natural language processing (NLP), speech recognition, and machine vision. To note: AI is frequently confused, by vendors and users alike, with machine learning (aka ML). But whereas AI mimics human intelligence, machine learning identifies patterns and then uses that information to teach a machine how to perform specific tasks and produce accurate results.

So, how does AI work? Basically, AI systems ingest large amounts of labelled training data. AI analyses the data, identifies relationships and patterns within it, and uses what it learns to make predictions about future states. Much as a human brain will access everything it knows at any given point and make a (hopefully) rational and informed decision about what happens next, so will AI. 
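The ingest-learn-predict loop described above can be shown in miniature. In this deliberately tiny sketch, ‘training’ is just memorising labelled points, and a prediction copies the label of the nearest known example (a 1-nearest-neighbour classifier); the data and labels are invented for illustration:

```python
import math

# Toy illustration of the ingest-learn-predict loop: the system ingests
# labelled examples, and predicts a new point's label from the nearest
# known example (1-nearest-neighbour). Data and labels are invented.

training_data = [
    ((1.0, 1.0), "low-risk"),
    ((1.2, 0.8), "low-risk"),
    ((8.0, 9.0), "high-risk"),
    ((9.0, 8.5), "high-risk"),
]

def predict(point):
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    # 'Learning' here is simply finding the closest labelled example.
    _, label = min(training_data, key=lambda item: dist(item[0], point))
    return label

print(predict((1.1, 0.9)))  # → low-risk
print(predict((8.5, 8.8)))  # → high-risk
```

Real AI systems learn vastly richer patterns from vastly more data, but the shape is the same: labelled examples in, generalised predictions out.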

What do you need to run AI?

AI needs three things to work: 1. data – and lots of it; 2. specialised hardware; and 3. specialised software.

Let’s talk about the importance of hardware first, as without access to it, you have nothing. What do you need to know? The process of using a trained AI model to make predictions and decisions based on new data is called AI inference. While you can, at a pinch, run AI inference tasks on a well-optimised CPU, you really need the parallel processing grunt of a GPU (graphics processing unit) for the compute-intensive task of AI training.

GPUs play an important role in data centres as they deliver the performance needed to accelerate AI and ML tasks, facilitate video and graphics processing, and run scientific computing and simulation applications.

Given the importance of AI training, it’s probably no surprise that leading vendors have advised that the demand for high-end, AI-ready GPUs has exceeded supply. The wait time for the average buyer can exceed 260 working days – roughly a full working year.

(The silver lining is that you have more time to define your AI strategy rather than rushing in).

Where are we in the AI hype cycle?

Gartner uses ‘Hype Cycle’ to describe “innovations and techniques that offer significant and even transformational benefits while also addressing the limitations and risks of fallible systems.” So, what’s hot now, what’s coming, and when can we expect these innovations to become mainstream – or fail to follow through on their promise?

Gartner’s 2023 Hype Cycle for Artificial Intelligence calls out two kinds of GenAI innovation that dominate. The first is innovations that will be fuelled by GenAI, impacting content discovery, creation, authenticity and regulation, as well as automating human work and the customer and employee experience.

The second is innovations that will fuel GenAI advancements. This includes using AI to build new GenAI tools and applications. In effect, it’s using innovation to create more innovation – so it’s a popular use case for business startups.

The hype cycle illustrates user expectations of AI and how and where it will be used against where Gartner sees these innovations in 2-10 years (see graph). So, while the ‘innovation trigger’ and ‘peak of inflated expectations’ are crammed full of use cases and solutions, some will fall by the wayside while others will surface and go on to be productive.

The problem being?

While the list of potential use cases is exciting, Gartner’s Hype Cycle does show that AI isn’t going to deliver business value (or even last the distance) in every instance.

Yes, you may get a head start on the competition by being an early adopter, but you may also become that cautionary tale shared in hushed tones in GenAI blogs and headlines of the future.

Forbes certainly reached that same conclusion in its article, ‘AI Reality Check: Why Data Is The Key To Breaking The Hype Cycle’, suggesting that GenAI reached its ‘peak of inflated expectations’ in August 2023, when many companies came face-to-face with the reality of extracting genuine and meaningful value from AI.

Earlier, we listed access to data as a must-have for AI to work. And in its article, Forbes agrees, referring to its research, which firmly points the accusing finger of failure and disappointment at data silos. Nearly 75% of respondents who had implemented AI pilot projects in their organisations said that data silos were the primary barrier to enterprise-wide AI integration. “The number one thing keeping GenAI initiatives from reaching their fullest potential inside large corporations,” says Forbes, “is data.”

What the hell are we going to do with AI, anyway?

Worryingly, ADAPT’s CIO Edge Survey from February 2024 says that 66% of Australian CFOs say their organisations are unprepared to harness AI. 25% are non-committal, and only 9% say they’re AI-ready. AI-ready or not, 48% of the CIOs surveyed say they haven’t even defined any clear use cases for AI.

This leaves us asking, where to from here? How do you ensure that your investment in AI not only delivers business value through the availability of data, hardware, and software but that you are ready to use it and can justify the investment?

In part two of this topic, we discuss some real-world use cases in action and suggest some of the hard questions you should consider before committing to the shiny new thing that is AI.

Chicken or egg: Cyber resistance vs cyber resilience

In a digital world where data is the new ‘everything’, it’s unsurprising that it has become a prime target for criminals. Data is the modern-day equivalent of a stash of gold bullion – and it can be stolen, ransomed, and sold for profit with less effort and risk than a bank heist.

The unrelenting waves of global cyberattacks mean that the cost of business survival is escalating – with the cost of cyberattacks doubling between 2022 and 2023. To combat this, Infosecurity Magazine reports that 69% of IT leaders saw or expected cybersecurity budget increases of between 10 and 100% in 2024.

The cost of crime

At the pointy end of the problem, organisations face damaged or destroyed data, plundered bank accounts, financial fraud, lost productivity, purloined intellectual property, the theft of personal and financial data, and more.

The blunt end is no less damaging. There’s the cost of recovering data, rebuilding your reputation, and getting your business back to a state of BAU as soon as possible, as well as the hefty price tag that comes with forensic investigation, restoring systems, deleting compromised data, and even prosecution.

Generative AI to the cyber-rescue?

Many see the rise of generative AI and expansion into hybrid and multi-cloud environments as the means to alleviate the ongoing attacks. But, of course, the democratisation of generative AI (in other words, goodies and baddies have equal access to its powers) means that potential risks are also heightened.

Despite this, it’s hard to overcome the optimism that generative AI will be a cyber-saviour. According to Dell Technologies’ 2024 Global Data Protection Index (APJ Cyber Resiliency Multicloud Edition), 46% of respondents believe that generative AI can initially provide an advantage to their cybersecurity posture, and 42% are investing accordingly.

But here’s the rub: 85% agree that generative AI will create large volumes of new data that will need to be protected and secured. So generative AI will, by default, (A) potentially offer better protection and (B) increase the available attack surface due to data sprawl and unstructured data.

Resistance vs resilience

Of the APJ organisations (excluding China) that Dell surveyed, 57% say they’ve experienced a cyberattack or cyber-related incident in the last 12 months.

And a good 76% have expressed concern that their current data protection measures are unable to cope with malware and ransomware threats. 66% say they’re not even confident that they can recover all their business-critical data in the event of a destructive cyber-attack.

So why, if 66% of organisations doubt their ability to recover their data, are 54% investing more in cyber prevention than recovery?

Can you separate the cyber chicken from the egg?

In a recent cybersecurity stats round-up, Forbes Advisor reported that in 2023, there were 2,365 cyberattacks impacting 343 million victims.

Given the inevitability of cyberattack, it’s critical that your methods of resistance are robust, and if disaster strikes, your ability to recover is infallible.

Look at it this way: While a cruise liner obviously must have radar to detect and try to avoid approaching icebergs, angry orcas, and other collision-prone objects, it’s just as important that it has lifeboats, lifeboat drills, lifejackets, and locator devices available to minimise loss of life and keep everyone afloat.

In the words of Harvard Business Review: “Simply being security-conscious is no longer enough, nor is having a prevention-only strategy. Companies must become cyber-resilient—capable of surviving attacks, maintaining operations, and embracing new technologies in the face of evolving threats.”

So, how do you bolster your cyber resilience?

According to Dell, 50% of the organisations they surveyed have brought in outside support (including cyber recovery services) to enhance cyber resilience.

While AI will undoubtedly introduce some initial advantages, as suggested earlier, those could be quickly offset as cybercriminals leverage the very same tools. Not only are traditional system and software vulnerabilities under attack, but due to the sprawl of AI-generated data, there are more and newer opportunities.

So – can we rely on generative AI to save the day? Probably not – or not yet anyway. What about outside help? Yes, most definitely. However, cyber resilience begins at home, with a top-down strategy based on some inarguable facts:  

  1. Attacks are inevitable. Once you accept that this is the new reality of the digital age, the logical next step is to develop a clear, holistic strategy focusing on business continuity and crisis planning.
  2. People are the first and best line of defence. Ensure your entire organisation takes responsibility and is cyber-aware – to the extent that your procedures are included in your company policies and onboarding processes.  This should include delivering ongoing cyber awareness training and introducing regular drills.
  3. When disaster strikes, survival is in your hands. Establish clear cybersecurity governance that aligns with your business objectives. Everyone in the organisation should know what they need to do to protect the organisation, its data, and its clients and ensure continuity of operations.  
  4. No one is trustworthy. Assume everything around your network is a potential threat. Adopt a zero-trust mindset that requires continual verification and rigidly controls access based on preset policies.  
  5. What you don’t know can hurt you. The ability to detect and prevent threats is critical. Invest in Security as a Service to provide visibility into your data, regardless of where it’s located, so that you can see and address your weaknesses.
  6. Disaster will strike. We live in unexpected times, where cybercrime and unprecedented natural disasters conspire to stop us in our tracks. With cloud-based Disaster Recovery as a Service, the risk of permanently losing data and disrupting business as usual is significantly reduced.
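Point 4’s zero-trust mindset is the most mechanical of the six, so it lends itself to a sketch. In a zero-trust model, every request is verified against preset policies rather than trusted because it comes from ‘inside’ the network; the roles, resources, and policy table below are illustrative assumptions only:

```python
# Hypothetical zero-trust access check: every request is verified
# against preset policies, regardless of where it originates.
# Roles, resources and the policy table are illustrative assumptions.

POLICIES = {
    "backup-admin": {"backup-console", "restore-jobs"},
    "analyst": {"reports"},
}

def is_allowed(request):
    """Allow only verified requests whose role covers the resource."""
    if not request.get("mfa_verified"):
        return False                       # re-verify identity every time
    allowed = POLICIES.get(request.get("role"), set())
    return request.get("resource") in allowed

ok = {"role": "backup-admin", "resource": "restore-jobs", "mfa_verified": True}
bad = {"role": "analyst", "resource": "restore-jobs", "mfa_verified": True}
print(is_allowed(ok), is_allowed(bad))  # → True False
```

Production zero-trust platforms layer in device posture, network context, and continuous re-evaluation, but the core idea is exactly this: no request is trusted by default, and access is always checked against policy.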

Get in touch for a Free, No‑Obligation Consultation

Arrange a chat with our experienced team to discuss your data protection, disaster recovery, cloud or security requirements.

  • Arrange an introductory chat about your requirements
  • Gain a proposal and quote for our services
  • View an interactive demo of our service features

Prefer to call now?
Sales and Support
1300 88 38 25

By filling out this form you are consenting to our team reaching out to you. You may unsubscribe at any time. Learn more by visiting our Privacy Policy

© 2021 Global Storage. All rights reserved. Privacy Policy Terms of Service

The Global Storage website is accessible.

Download
Best Practices For Backing Up Microsoft 365

Download
5 Myths About Backing Up Microsoft 365 Debunked
