As Sean Price discusses in his ‘2023 Public Sector Predictions’ blog, European government departments and agencies are under pressure to reduce costs, improve efficiency and provide a better citizen experience. Governments need to offer more services at higher quality at a time when it costs more to heat buildings and to employ people to run the services. As a result, many government departments have embarked on large programmatic digital transformation initiatives to deliver long-term cost savings and government-wide efficiency gains. Cloud often plays a key role, but more on that later.
Last week, I attended Cheltenham Festival - a famous horse racing event nestled away in the English countryside. Many of the races involved hurdles, each slightly different but nonetheless a challenge for the jockeys. Depending on your position and view (jockey, trainer, owner, developer, manager, leader, executive etc.), the hurdles all look different; some even have additional hidden obstacles that make them more challenging than first anticipated. The same can be said for any digital transformation. Leaders and jockeys alike plan and prepare their course, anticipating their route around likely significant hurdles. But during the mane* event problems evolve and new obstacles appear - a fallen horse, unexpected operational pressures, budget cuts - that cause us to reconsider and reroute.
You’d probably expect these hurdles to be the most common reason digital transformations fall short. Yet research shows that whilst these obstacles certainly make things more challenging, most of the time the struggle stems from a lack of ongoing executive commitment and from the scale of the all-encompassing change not being fully recognised. Such a vast organisation-wide shift is undoubtedly challenging - change fatigue and lack of stamina are common. It’s analogous to a jockey setting off for a typical 5-furlong race (5/8th of a mile) only to realise it’s a long-distance race; organisations that don’t prepare for the long haul fail more often.
Over the past few years, we’ve seen a trend in the public and private sectors to move from significant programmatic digital transformation to several smaller, more achievable goals with accountable business owners, often governed by a more lightweight programme function. More often than not (70% according to Accenture), the central supporting pillar is to build the foundational technology. For many organisations it means implementing a cloud-first policy, creating a data strategy and performing analysis to support the migration of existing apps to the cloud. However, it turns out that even this much smaller ambition is challenging. And it appears to be more complex in the public sector. But why?
The migration of compute, storage and higher-level services to the public cloud has the potential to offer tremendous benefits to almost any organisation. It seems very compelling: zero capital outlay, improved ability to innovate, near-unlimited scalability and the freeing up of skilled people to work on mission-critical services instead of generic infrastructure and platforms. 72% of IT decision-makers in the private sector cite efficiency as the main driver behind cloud adoption. The stories from successful private sector organisations add fuel to the already practically irresistible argument for cloud investment. It becomes apparent that to be a more agile and resilient government, capable of adapting to new requirements whilst meeting citizen expectations and managing rising costs, you need a flexible hosting platform. The public cloud promises just that, but many government departments fall at the first hurdle.
Research performed by several system integrators and consultancies working across European government departments indicates three main reasons why the public sector struggles more than private companies:
Let’s take a brief look at each.
Firstly, civil servants are driven by an obligation to spend public money wisely whilst delivering as much as possible; a need unparalleled in other sectors to deliver the best for the cost. Delivery of new or enhanced customer-facing services often takes priority over migrating existing apps or rearchitecting them to make the best use of cloud services, only to be left with a service that looks, at least on the surface, the same as it does today. As a result, critical underpinning activities (such as migrating existing services to the cloud) usually aren’t championed, don’t get board support and aren’t adequately resourced. Fundamentally, many departments default to thinking the problem is capacity rather than flow. So they put more and more horses on the course, thinking that one will get across the finish line faster; yet a more congested racetrack has the opposite effect - the competing priorities result in everything getting slower.
“Why are we going so slowly?”
“We just need a little more horsepower.”
Secondly, legislation and policy are changing across Europe. Governments hold sensitive information about their citizens, and moving this data to a US-based cloud provider is often a risk that’s hard to understand and a tough policy decision to make. In 2021, US companies stored and processed nearly 70% of European data. This problem is exacerbated as various EU member states have voiced their intentions for more robust digital sovereignty (the ability to act independently in the digital world) to keep data within national borders and stimulate home-grown technological economic growth, i.e. local cloud providers.
The EU has recently published its view on digital sovereignty, which has further fuelled calls for digital sovereignty following concerns around the threat of extraterritorial data access under the US Cloud Act. Other efforts such as the OECD’s ‘Declaration on Government Access to Personal Data Held by Private Sector Entities’, ongoing G7 efforts on ‘Data Free Flow with Trust’, the UK GDPR reform, and EU work towards Privacy Shield 2.0 (currently under review by the European Data Protection Board) all complicate the future legal landscape. Additionally, cloud service providers will shortly be considered ‘essential entities’ under the NIS2 Directive, with additional cybersecurity risk management measures enforced. It’s like running a race where the rules and constraints change while you’re on the racetrack.
France has also introduced a cloud certification scheme to better protect ‘sensitive data’ and drive data sovereignty. Its ‘Cloud in the Centre’ strategy (2021) introduces a ‘trusted cloud’ label, which requires a security classification by ANSSI (SecNumCloud). It mandates that CSPs demonstrate immunity to third-country legislation and imposes requirements on European capital ownership of the company. In response, US CSPs are collaborating with European partners, such as Bleu, to improve the chances of qualifying. The EU Cloud Security Certification Scheme, which is still in discussion at ENISA (the EU cyber agency), aims to achieve similar outcomes across all EU members.
This is why many public sector departments are actively slowing down cloud migration efforts for sensitive services until the water clears. There’s no point in migrating twice - so hold your horses.
Thirdly, many government departments have built large, complex technical systems to meet evolving requirements, interconnected both within the department and with systems managed by other government and private sector organisations. Enterprise architecture is challenging within a company that has a relatively small number of capabilities, but it’s a whole different beast when it requires various government departments under different leadership to work together. It’s why the UK launched the Central Digital and Data Office (CDDO) in 2021, to lead the Digital, Data and Technology (DDaT) function and put the conditions in place for digital transformation at scale. And even with dedicated effort and funding, they still struggle.
Simply lifting and shifting capabilities to the cloud can be expensive and may deliver little of the anticipated value. Yet rearchitecting requires knowledge of the interconnected systems and a whole range of non-functional requirements: security rules, performance and reliability expectations, usability and accessibility requirements, and data protection regulations. Moreover, the typical programmatic waterfall nature of many government programmes struggles to capture and action much of the understanding required in a desirable timeframe.
This is tough, so let’s take a step back for a moment. Complexity experts distinguish between complicated and complex situations (the Cynefin framework). Putting a man on the moon was undoubtedly a complicated task, but there was a clear objective and success was easily measured. The effect can be predicted from the cause. Making significant, long-term plans and being reasonably confident of achieving them is possible. We do well with physical-world tasks where analysis, extensive design, programmatic management and the “build” mindset prevail.
On the other hand, with complex situations, such as organisational culture and digital transformation, it’s hard to define a route to tangible and measurable outcomes. It’s difficult to measure improvement. The effect can be deduced from the cause, but only in retrospect. Progress is best made using emergent approaches – trying things, seeing what works and adapting accordingly. Responsiveness, collaboration and a ‘growth’ mindset are more important. The typical programmatic delivery approach simply doesn’t work; it requires a different approach. PA Consulting discusses what this means here.
Keep in mind the benefits you are chasing. As Sean points out in his blog, technology is a small but essential part of digital transformation. And the truth is that very few organisations get their migration to the cloud right the first time. It takes time and effort, but it’s worth getting these foundations right. Here are my top tips for successful cloud migration.
I hope this has been insightful, or at a minimum, you enjoyed the horse puns… Yay or neigh?
* This horse pun and all other puns are intended, sorry