Big but not scary: how to use big data to shape government policy and delivery

Following a session on big data at Public Service Data Live in London last year, one of the expert panellists – Deepak Shukla, head of data and analytics capabilities at Amazon Web Services (AWS) – provides a comprehensive summary of the key considerations for governments, from sharing and collaboration to generative AI
Government departments and agencies collect large amounts of structured and unstructured data from both anonymised and aggregated sources. Locked in that data are insights that can be used to inform the development of public policies and evaluate the delivery of services. But it can be difficult for government organisations to exploit the power of their data repositories.
Here, Deepak Shukla shares his insights on maximising value from data, touching on the importance of facilitating data sharing across public sector bodies, the merits of taking a holistic look at future data strategy, and why governments should prepare to take advantage of generative AI.
How are public sector organisations making the most of big datasets to help develop better policy and deliver better services?
Public sector organisations are investing extensively in cloud-based modern data platforms and are continuing to enhance their capabilities to make decisions based on big data.
There are numerous examples where government is using big data for policymaking.
The Scottish Government has been exploring the use of geospatial data to inform the delivery of strategic and national outcomes, from helping to develop National Planning Frameworks to directing millions of pounds in community funding to those affected by COVID-19.
The NHS in England is using big data to measure and report the ‘great strides’ the health service is making, publishing data on improvements in response times, long waits, urgent and emergency services, and cancer care. For example, the number of patients waiting more than 18 months for treatment was down by more than 90% in May 2023 compared to September 2021.
The UK government – which has set an ambitious target to reduce carbon emissions by 68% by 2030 compared to 1990 levels – is using extensive datasets to track and measure its progress towards its sustainability and net zero ambitions.
And government is also using data and analytics in its fight against public sector fraud. Data is helping it to shape strategies and investments that prioritise the areas of maximum impact, and to track and communicate the progress agencies such as the Public Sector Fraud Authority (PSFA) are making in recovering taxpayers’ money.
Across public safety, healthcare, local administration and every other area of the public sector, a great deal of effort is going into becoming more data driven. There is a huge opportunity to maximise the value of the large amounts of data that public sector bodies hold, driving efficiencies in operating costs and reimagining citizen experiences through data-led innovation.
What is your advice to public service organisations looking to make the most of the data they collect for policymaking?
Data plays a key role in the whole process of policy formulation, implementation and monitoring. By using data and leveraging advancements in generative AI, organisations can develop policy options and strategies, and by applying advanced analytics techniques, they can identify patterns and correlations within the data and predict trends. These data insights help to highlight the potential impacts of different policy choices, identify issues, and assess the effectiveness of proposed policy interventions.
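To make that concrete, here is a minimal Python sketch of the kind of trend analysis described above: fitting a simple linear trend to a monthly demand series and projecting it forward. The dataset, the figures and the ‘applications’ measure are all invented for illustration; real policy analysis would use validated data and richer models.

```python
# Minimal illustration: fit a simple trend to a (synthetic) monthly
# service-demand series and project it forward, the kind of basic
# analysis that can feed a policy option appraisal.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

# Hypothetical data: monthly applications for a public service.
months = pd.date_range("2022-01-01", periods=24, freq="MS")
demand = 1000 + 25 * np.arange(24) + np.random.default_rng(0).normal(0, 40, 24)
df = pd.DataFrame({"month": months, "applications": demand})

# Fit a linear trend against the month index.
X = np.arange(len(df)).reshape(-1, 1)
model = LinearRegression().fit(X, df["applications"])

# Project demand six months ahead to inform capacity planning.
future = np.arange(len(df), len(df) + 6).reshape(-1, 1)
print(model.predict(future).round(0))
```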

For me, the most important thing is how we automate the end-to-end process and coordination across the relevant public sector bodies. The current processes are lengthy, costly and labour intensive. Adopting modern data platforms could help, but technology alone is not going to solve this challenge: it will also need changes to data-sharing agreements, policies and processes. A cross-government body with ownership for enabling data sharing and collaboration is also required, to ensure we’re not creating more data silos for the future.
How do you see government and public sector data strategies evolving?
As government organisations continue on their data-driven transformation journey, in my engagements with data leaders I encourage them to look at their future data strategy holistically. By holistically, I mean they need to shape their end-to-end data foundation, from data collection through management to delivering insights.

They also need to look at how advancements in AI and generative AI impact the underlying data platform, and whether that platform is fit for purpose for these new and emerging needs.
Generative AI innovation has challenged the cloud infrastructure foundation, and we see our customers switching their generative AI workloads to advanced compute services to drive cost efficiencies and performance. Organisations need to consider the role their data will play when new business applications combine pre-trained large language models (LLMs) with internal proprietary data and third-party data.
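As a rough illustration of that combination, the sketch below shows the common retrieval-augmented pattern: grounding a pre-trained LLM in internal documents at query time rather than retraining it. The documents, the naive keyword retrieval and the `call_llm` placeholder are all assumptions made for illustration; a production system would use vector search and a hosted model API.

```python
# Sketch of the retrieval-augmented pattern: ground a pre-trained LLM in
# internal proprietary data at query time. `call_llm` is a hypothetical
# placeholder for whichever hosted model API an organisation uses.

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Naive keyword-overlap retrieval; real systems use vector search."""
    q_terms = set(query.lower().split())
    scored = sorted(documents, key=lambda d: -len(q_terms & set(d.lower().split())))
    return scored[:k]

def call_llm(prompt: str) -> str:
    # Placeholder so the sketch runs end to end; swap in a real model call.
    return f"[model response for prompt of {len(prompt)} characters]"

def answer(query: str, documents: list[str]) -> str:
    # Build a prompt that grounds the model in the retrieved internal data.
    context = "\n".join(retrieve(query, documents))
    prompt = (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n\nQuestion: {query}"
    )
    return call_llm(prompt)

# Invented internal documents, purely for illustration.
internal_docs = [
    "Grant scheme A closes on 31 March and covers community projects.",
    "Grant scheme B supports flood-defence works for local authorities.",
]
print(answer("When does the community grant scheme close?", internal_docs))
```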
I see four main trends impacting the future data strategies of public sector organisations:
- Path to real-time data: organisations are looking for more opportunities and use cases for real-time insights. This will help accelerate value and also improve citizen experiences as people interact with government agencies in real time (a minimal sketch of the pattern follows this list).
- Building data assets as a product: given the focus on data collaboration and sharing, organisations are adopting practices to build future data pipelines, shaping and managing data as a product.
- Evolving the chief data officer (CDO) role: CDOs have started to own the AI agenda in public sector organisations and will take more accountability for enabling value and outcome delivery from enterprise data.
- AI and data: what organisations could do with AI in future will depend on how mature their data management practices are. AI will drive a lot of decisions on platform choices, guardrails and approach to data storage and warehousing.
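As a minimal sketch of that first trend, the Python below reacts to events as they arrive rather than in an overnight batch: it keeps a rolling window over a simulated stream of wait-time readings and raises an alert when the average crosses a threshold. The event source, service name and threshold are invented for illustration; in practice the stream would come from a managed service such as Amazon Kinesis or Apache Kafka.

```python
# Minimal "path to real-time data" sketch: act on events as they arrive
# instead of waiting for an overnight batch report.
import time
from collections import deque

def event_stream():
    """Stand-in for a real event source: yields wait-time readings."""
    for wait_minutes in [12, 14, 15, 31, 35, 16, 13]:
        yield {"service": "passport-desk", "wait_minutes": wait_minutes}
        time.sleep(0.1)  # simulate gaps between arrivals

window = deque(maxlen=3)  # rolling window of the most recent readings
for event in event_stream():
    window.append(event["wait_minutes"])
    avg = sum(window) / len(window)
    if avg > 25:  # illustrative threshold
        print(f"ALERT: average wait {avg:.0f} min, redeploy staff now")
```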
You were on the panel for an AWS session at the Public Service Data Live conference in London last year in which experts talked about how governments could use big data. What were your key takeaways from the discussion?
Reflecting on the discussion, my three key takeaways were:
- Making data accessible and reducing the pain of data sharing and collaboration remains a top priority for public sector bodies. While the technology to automate data sharing exists, with capabilities like AWS Clean Rooms and Amazon DataZone, the UK government still lacks the right framework and policies to enable value from data sharing efficiently.
- There is a need to educate citizens on how their data is being used by the government and how it helps the public sector deliver superior services to them. A lot needs doing to build citizen confidence and trust around their data privacy and security.
- Becoming data driven is more of a change programme than a data transformation programme. Having the right mechanism to bring people, process and cultural change is important to maximise the return on investment in big data and AI technologies.
– Deepak Shukla took part in the AWS session on big data at Public Service Data Live in London on 14 September 2023, along with fellow panellists Dr Ravinder Singh, modernising technology programme manager at the Central Digital and Data Office; Ed Towers, head of advanced analytics and data science at the Financial Conduct Authority; Aydin Sheibani, chief data officer at HM Revenue & Customs; and Shruti Kohli, head of data science (innovation), Innovation Lab, Department for Work and Pensions.
To learn how your organisation can become more data driven, register for AWS Public Sector Day 2024, taking place on 19 March in London. AWS’ flagship one-day conference for the UK public sector, the event features public sector and industry keynotes, hands-on workshops and 20+ breakout sessions.
About Deepak Shukla
Head, data and analytics capabilities, Amazon Web Services
Deepak heads data and analytics capabilities at Amazon Web Services (AWS) for the public sector in the UK and Ireland.
Deepak is a commercially focused senior leader with over 16 years of management consulting and advisory experience (to CIOs, CTOs and CDOs) in cloud and digital transformation enabled by data, analytics, AI and generative AI. He is passionate about innovation and enabling large-scale AI and data-led transformation programmes that deliver multi-million-dollar benefits to his customers.
Before joining AWS, Deepak worked as a senior manager at Accenture, shaping and delivering large-scale, data-driven transformation programmes for customers across industries.