
The critical steps when transforming siloed assets into actionable data in cloud

The rate of change in the cloud and data space is phenomenal. On the back of the pandemic and constantly advancing technologies, expectations from managers and customers are forcing business units to adopt new tools and ways of working. Demand is hot right now.

I’m seeing a lot of organisations being driven into adopting cloud, whether through political pressure or an assumption of better capabilities. Some organisations initially go all in for cloud to bank cost savings and find they don’t have the necessary processes, tools and governance in place. Without proper readiness, the business case for cloud adoption rapidly diminishes.

Whatever the reason for change, organisations need to understand that they shouldn’t be pushed into making hasty decisions.

Understanding your digital portfolio is key to choosing the ‘right’ cloud platform to move to, supported by a robust decision framework. Adopting cloud without a transformation agenda will get you some quick wins, but ultimately a ‘lift and shift’ doesn’t address legacy architecture. Neither does it allow organisations to maximise the value of the features built into cloud platforms.

Interestingly enough, data is often overlooked when embarking on a new platform journey, but you must be clear about its anticipated lifecycle. During planning for migration to the cloud, there’s a lot of focus on the workload (application or virtual machine), but outside of data sovereignty and data classification, I’m not seeing data assets given the priority they deserve.

The most important thing is adopting a data portfolio approach. It’s vital to understand your data assets – whether it’s an important spreadsheet, customer database or record system you need to maintain.

What’s really crucial is knowing where your data lives, what it can do and who has access to it. You also need to understand how other systems (perhaps secondary data systems) feed from or to your data. Then you need to apply the appropriate data protection and retention capabilities across any cloud-based system it lives in.
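To make that concrete, here is a minimal sketch of what a single entry in a data-portfolio inventory might capture. It is written in Python purely for illustration; every field name here is an assumption, not a standard, and should be adapted to your own classification and retention frameworks.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DataAsset:
    """One entry in a hypothetical data-portfolio inventory (illustrative only)."""
    name: str                   # e.g. "customer records database"
    location: str               # where the data lives: platform, region, tenancy
    classification: str         # e.g. "OFFICIAL", "PROTECTED"
    retention_years: int        # how long records must be kept
    owners: List[str] = field(default_factory=list)     # accountable roles
    feeders: List[str] = field(default_factory=list)    # systems that write to it
    consumers: List[str] = field(default_factory=list)  # systems that read from it
    backup_policy: str = "unknown"  # protection applied where it currently lives

# Example entry: a customer records system already hosted in a SaaS platform
crm = DataAsset(
    name="customer records",
    location="SaaS vendor, AU region",
    classification="PROTECTED",
    retention_years=70,
    owners=["records-management"],
    feeders=["web portal"],
    consumers=["billing", "analytics warehouse"],
    backup_policy="vendor-managed, unverified",
)
```

Even a register this simple forces the questions above to be answered asset by asset: where the data lives, who has access, and which systems feed from or to it.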

Let’s go back five years – or even just pre-pandemic – when many organisations started testing the water with cloud. Everything was once contained in one or two data centres – maybe a few buildings. Now, everything is being pushed out through different service providers across multiple data centres.

Furthermore, it isn’t just about understanding the value of your data in isolation but collectively as a department or organisation. When assessing how to transform siloed assets into actionable data in cloud, the main challenge is balancing immediacy with good strategy.

The first thing you need to do is establish whether you want a departmental or a business-unit approach. This will have a significant impact on your overall data strategy: where will your data reside, and what services can you provide based on the data you will have available? This is especially true when some organisational services need to prioritise cloud adoption and transformation ahead of others, creating a multi-speed approach.

You also need to be aware of the capabilities or services those data assets contribute to and the lifecycle around these.

We see many organisations move to cloud-based platform-as-a-service (PaaS) or software-as-a-service (SaaS) models and find their backup or archival mechanisms are no longer relevant. This is a major problem in government if you have a data retention policy that dictates records be kept for 50, 60 or perhaps 70 years. What would you do if the only way to access government information from 60 years ago was through a legacy system accessed via Exchange 2000?
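This kind of gap is easy to surface once assets are catalogued. Below is a minimal sketch, reusing the hypothetical DataAsset inventory from earlier, that flags assets whose platform cannot retain records as long as policy demands; the retention figures are invented for illustration and would need to be verified with each provider.

```python
def retention_gaps(assets, platform_retention_years):
    """Flag assets the current platform cannot retain long enough.

    platform_retention_years maps an asset's location to the retention
    the platform actually guarantees -- figures you must confirm with
    each provider, not assume.
    """
    gaps = []
    for asset in assets:
        guaranteed = platform_retention_years.get(asset.location, 0)
        if asset.retention_years > guaranteed:
            gaps.append((asset.name, asset.retention_years, guaranteed))
    return gaps

# Example: the SaaS vendor only guarantees 10 years of recoverable history
print(retention_gaps([crm], {"SaaS vendor, AU region": 10}))
# [('customer records', 70, 10)] -- a 60-year gap to close before migrating
```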

In reality, many organisations need a better understanding of their data capabilities and how to leverage them fully. What they used to do in a traditional technology stack doesn’t necessarily apply in a new world. Adopting feature-rich API platforms allows more integration, orchestration and automation (see the sketch below). This means you’ve got to start thinking about how you change your team’s way of working and what ‘good’ looks like. Importantly, you need to change how you measure success.
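As one sketch of what that automation can look like: an API-first platform lets governance metadata be applied programmatically rather than by hand. The endpoint, payload and field names below are entirely hypothetical.

```python
import json
from urllib import request

# Hypothetical platform API for tagging a data asset with governance
# metadata; the URL and payload schema are illustrative assumptions.
API_URL = "https://platform.example.gov.au/api/v1/assets/customer-records/tags"

payload = {
    "classification": "PROTECTED",
    "retention_years": 70,
    "backup_tier": "immutable",
}

req = request.Request(
    API_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="PUT",
)
# request.urlopen(req)  # uncomment to call a real endpoint with real auth
```

The same call can sit inside a pipeline, which is what changes the team’s way of working: governance becomes code that is reviewed and versioned, not a manual checklist.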

Finding the best mix of workforce skills is also essential. Moving from a traditional network and data-centre play to more of a scripting, ‘DevOps’ environment is no mean feat, especially amid the current scarcity of skills. Finding enough people with the right mix of capabilities is difficult.

Last but not least, it’s all about adopting a security-by-design approach and understanding how you must protect your digital and data assets.

We’ve seen what happens when organisations are unable to protect their data. Primary and secondary data have recently been the focus of a lot of cyber crime activity, with personal data in particular coming under constant attack. Furthermore, many cyber criminals, through ransomware for instance, are targeting data backup repositories to minimise the victim’s ability to recover from an attack. Hence, backup strategies are now an essential line of defence in the cyber war.
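A widely cited rule of thumb here is the 3-2-1 pattern: at least three copies of the data, on two different platforms or media, with one copy offline or immutable, so an attacker who compromises production cannot also destroy the backups. A minimal sketch of that check, with illustrative catalogue fields:

```python
def satisfies_3_2_1(copies):
    """copies: list of dicts like {"platform": str, "immutable": bool, "offline": bool}."""
    platforms = {c["platform"] for c in copies}
    isolated = any(c.get("immutable") or c.get("offline") for c in copies)
    return len(copies) >= 3 and len(platforms) >= 2 and isolated

backups = [
    {"platform": "primary cloud", "immutable": False},
    {"platform": "secondary cloud", "immutable": False},
    {"platform": "offline vault", "offline": True},  # the copy ransomware can't reach
]
print(satisfies_3_2_1(backups))  # True
```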

Citizens will continue challenging organisations to prove their data is safe. They will want to know how their data is being used and how it’s being stored. Having an end-to-end focus on data security will be an ongoing challenge for public- and private-sector organisations.

The Essential Eight, the model developed by the federal government’s Australian Cyber Security Centre to combat cyber threats, has a great set of controls. Although it’s quite onerous to implement, it provides a good level of security. But we often see clients wrestle with the trade-off between richer functionality and jumping through the hoops needed to ensure the data they hold is secure and trustworthy.

Still, many technology trends (and challenges) are being driven by consumer demand. Customers often set expectations for how they want to deal with organisations, especially government, which can create challenges when bringing data together to deliver the services consumers want, in the way they want to consume them.

Government has no choice; it needs to embrace change. But leaders and procurement managers shouldn’t be too hasty and jump into the wrong or most expensive solution. A well-considered data strategy needs to come first.

Matthew Gooden is chief innovation and technology officer at Datacom.


 
