Why government isn’t using the cloud to drive measurable benefits
Governments are massive data-generating machines. Some larger agencies, such as Services Australia, collect and store an incredible amount of data. But many data owners have no idea what data they hold, which is a little scary when you think about it.
The biggest challenge for any public or private sector organisation is knowing their data. That many don’t is a significant reason government and industry leaders can’t use the latest technology – especially cloud – to make better decisions and drive measurable business results.
Knowing what data you hold – and who needs access to it – influences all your data decisions. Does the data need to be encrypted? How can we protect it effectively from cyber criminals? Do we need to hold this information indefinitely? If it’s currently sitting on tapes, what format should the data be in so that those who need it in future can access it easily?
It goes deeper still. Once you know your data, how do you make the most of it?
Think about healthcare. Imagine how health departments (federal and state) could run hospitals and other facilities more efficiently with their fingers on the most recent and accurate healthcare data. They could use many different datasets they don’t use today, perhaps from customers themselves.
This might not be the situation for every department, agency or organisation as there are often many complexities and moving parts. But collecting and analysing the correct data can make a real difference in people’s lives, allowing leaders to make more timely and proportionate decisions on budget and resource allocations.
There are some excellent tools that provide thorough data audits. Using artificial intelligence and machine learning, they trawl through file systems and databases and work out if data isn’t reliable or if there is any sensitive data that should not be held, such as credit card or password data.
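The core of a sensitive-data audit is more mundane than the AI label suggests: pattern-match candidate values, then validate them so phone numbers and invoice IDs don’t get flagged. As a hedged sketch (not how any particular audit product works), here is how a scanner might detect candidate credit card numbers using a regex plus the standard Luhn checksum:

```python
import re

def luhn_valid(number: str) -> bool:
    """Return True if the digit string passes the Luhn checksum
    (the standard check digit scheme for payment card numbers)."""
    digits = [int(d) for d in number[::-1]]
    total = sum(digits[0::2])  # digits in odd positions from the right
    for d in digits[1::2]:     # every second digit is doubled
        d *= 2
        total += d - 9 if d > 9 else d
    return total % 10 == 0

# 13-16 digits, optionally separated by spaces or hyphens
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def find_card_numbers(text: str) -> list[str]:
    """Flag likely card numbers: match the pattern, strip separators,
    then Luhn-validate to discard random digit runs."""
    hits = []
    for match in CARD_PATTERN.finditer(text):
        digits = re.sub(r"[ -]", "", match.group())
        if 13 <= len(digits) <= 16 and luhn_valid(digits):
            hits.append(digits)
    return hits
```

A real audit tool layers much more on top – file-system crawling, OCR, context scoring – but this two-stage “match, then validate” pattern is what keeps false positives manageable.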
Sometimes it might not be actual data that’s crucial to capture. It could be metadata. Often “data about the data” might be all organisations need to create an application or search – something above and beyond trending data – that provides valuable insights.
Regulatory issues
The primary data security driver in government, at least at the federal level, is the Information Security Manual (ISM). This is a cyber security framework produced by the Australian Cyber Security Centre that outlines how organisations protect their information and systems from cyber threats.
But when you look at the broader data security ecosystem and compare government with commercial users, government has been slow to adapt. Despite recent well-known data breaches, commercial operations have long understood the need for sophisticated frameworks to keep customer or patient details safe – the Financial Industry Regulatory Authority (FINRA) rules and the Health Insurance Portability and Accountability Act (HIPAA) in the United States are two examples.
Government is probably the last to go through this process, via the adoption and maturing of the ISM. We still find public-sector clients today that haven’t turned on disk encryption – something you would think is essential and, most of the time, available at no cost.
There is no excuse for poor data security; generally, there has been a strong drive towards better data hygiene. These days, you need to make a conscious effort to turn security features off.
We need more governance tools that manage data across the hybrid cloud – tools that let managers know where their data is and how it’s currently secured.
If a user or administrator turns off a security setting, it should be flagged. This wouldn’t just be when hardware or software is installed but across the lifetime of the product.
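That kind of lifetime monitoring amounts to drift detection against a security baseline. As a minimal sketch – the baseline settings and their names here are hypothetical, not drawn from the ISM or any product – a governance tool could periodically compare a system’s current configuration with the required baseline and flag anything that has been switched off:

```python
from datetime import datetime, timezone

# Hypothetical security baseline: settings that must stay enabled for
# the life of the system, not just at install time.
BASELINE = {
    "disk_encryption": True,
    "audit_logging": True,
    "mfa_required": True,
}

def check_drift(current: dict) -> list[str]:
    """Return a timestamped flag for every baseline security setting
    that has been turned off or removed since the baseline was set."""
    flags = []
    for setting, required in BASELINE.items():
        if required and not current.get(setting, False):
            flags.append(
                f"{datetime.now(timezone.utc).isoformat()} "
                f"ALERT: '{setting}' is disabled but the baseline requires it"
            )
    return flags
```

Run on a schedule (or triggered by configuration-change events), this is enough to catch a setting that was quietly disabled years after installation.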
Impediments to success
When government looks to engage a cloud provider, managers need to establish if the storage arrangements suit their dataset and whether they meet an agency’s specific security requirements. In Australia, this means all sovereign data needs to be housed on-shore – a stipulation commercial operators don’t generally need to think twice about.
It isn’t surprising that the federal government is nervous about data safety. If data is managed and housed offshore, there are ways foreign nationals could potentially have access to it. The data may not be safe or securable in the face of a cyber attack a long way from home.
The perceived retrieval costs are more concerning for departments and agencies looking to move their data into the public cloud. Their issue is how technology spending has traditionally been structured, with perhaps millions of dollars set aside in a CAPEX budget to set up on-premises facilities and cover necessary upgrade costs. This arrangement gives managers a sense of certainty and stability.
Shifting to the public cloud would be cheaper and offer greater flexibility and adaptability, but it would also necessitate a shift to OPEX budget planning to pay for a service provider rather than maintaining on-premises systems.
Some agencies aren’t flexible enough to make this budget change work effectively. If they are forced to deal with a catastrophic outage, they might have to restore data from the cloud. The retrieval process could cost them an extra $200,000 or $300,000 – unallocated funds that may need ministerial approval before being released, slowing down the process. Many managers prefer not to deal with this level of anxiety.
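The arithmetic behind a figure like that is simple: a large-scale restore is billed per gigabyte, for both rehydrating archived data and moving it out over the network. The rates below are purely illustrative – not any provider’s actual pricing – but show how a petabyte-scale dataset lands in that range:

```python
def retrieval_cost(dataset_tb: float, egress_per_gb: float,
                   restore_per_gb: float) -> float:
    """Estimate a one-off disaster-recovery retrieval bill:
    (archive restore fee + network egress fee) * dataset size."""
    gb = dataset_tb * 1024
    return gb * (egress_per_gb + restore_per_gb)

# Illustrative rates only: $0.09/GB network egress,
# $0.02/GB to rehydrate from archive-tier storage.
cost = retrieval_cost(dataset_tb=2000, egress_per_gb=0.09, restore_per_gb=0.02)
# A ~2 PB dataset at these rates comes out between $200,000 and $300,000.
```

The exact rates vary by provider and storage tier, but the shape of the bill – an unbudgeted, size-proportional lump sum at the worst possible moment – is what makes managers nervous.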
The right skills
Of course, many agencies choose to battle away with ageing facilities that need a refresh because they like the idea of on-premises storage. But who will maintain and optimise the old servers in their on-premises server room?
The biggest challenge for government is attracting technology experts. When you talk to graduates, hardly any choose to work in traditional storage technologies, and universities aren’t teaching them today because most clients want cloud-based storage solutions.
And even if you can find the right tech person during a global skills shortage, can the government agency afford to bring in a full-time employee to do the work? What happens three years later, when these platforms need redesigning or software upgrades are required? That’s why there have been calls for on-premises managed services: the expertise may only be needed one day a week, or a couple of hours a day.
In the end, the managers of government’s data machines need to be bold and choose an option that offers certainty, flexibility and long-term security. Because the future is cloud.
Matt Shelley is the chief technology officer at Cirrus Networks.