Accelerating data and analytics maturity in the US public sector


The McKinsey Global Institute estimates that data and analytics could create approximately $1.2 trillion a year in value across the public and social sectors (“Accelerating data and analytics transformations in the public sector,” McKinsey, March 18, 2021). The past two and a half years have seen no shortage of new public-sector analytics use cases that bear out this potential value. Government entities have created real-time pandemic dashboards, conducted geospatial mapping for drawing new public transportation routes, and analyzed public sentiment to inform economic recovery investment.

While many of these examples were born out of necessity, public-sector agencies are now embracing the impact that data-driven decision making can have on residents, employees, and other agencies. Embedding data and analytics at the core of operations can help optimize government resources by targeting them more effectively and enable civil servants to focus their efforts on activities that deliver the greatest results.

In this article, we examine how state and local governments could benefit from data-driven decision making, and the common obstacles an informed data strategy could overcome. We also discuss six factors that have helped public-sector agencies accelerate their data and analytics maturity.

The need for data muscle in government

State and local governments could benefit from learning to flex data-driven decision-making muscle both internally (in and among agencies) and externally (with and for residents). Internally, this capability can give leaders near real-time visibility into priority measures of success to inform faster interventions when projects or priorities need attention from a given department or agency. State workers’ daily business resources could be enhanced to include targeted data and analytics tools that can improve efficiency and productivity in crucial areas such as outbreak response.

Externally, a robust government data platform could enable residents to access and control their own data (for example, in tax and state records) as well as communicate with government agencies through trusted and helpful shared tools. Local departments, community-based agencies, and faith-based agencies could also use the platform to liaise between government and residents and coordinate their outreach on initiatives that benefit the community. Feedback loops from resident-facing apps could further inform future actions taken by the state.

The muscle-building regimen for data-driven decision making often depends on how well data and analytics are currently used. A thorough understanding of an agency’s current maturity—and its place on the overall maturity spectrum (exhibit, and see sidebar “Four stages of the data and analytics maturity journey”)—can inform a consistent routine and the skilled expansion of data and analytics tools.

Public-sector agencies can start understanding their current state to begin advancing their data and analytics maturity.

But even an informed data strategy faces inevitable challenges, from choosing the right product to ensuring new tools can integrate with existing systems. Each stage could face hurdles such as risk-averse mindsets, entrenched behaviors, infrastructure gaps, and data use agreements that reinforce silos. Some state and local agencies have overcome these challenges by implementing change programs and making investments to embed data and analytics use at the core of how they work.

Moreover, it can be tempting—and dizzying—to consider the vast array of advanced technologies in the market in various phases of development. The global data and analytics space is evolving rapidly thanks to emerging technologies at the cutting edge of prescriptive analytics, as well as continuous innovation by industry leaders and start-ups. While some state and local governments are ready to adopt such advanced technologies, most public-sector agencies can still find opportunities to use simpler and highly accessible data tools that are more widely available.

Six factors for accelerating data and analytics maturity

We have identified six factors that could help accelerate data and analytics maturity, along with examples from public-sector agencies that have achieved success in this area.

Build priorities around the state or agency strategy

Every agency’s data and analytics approach ideally aligns with and cascades from its overall strategic goals and intended public-sector impact. With so many competing priorities across a state’s or agency’s agenda, it can be easy to lose sight of what is important and what is extraneous. A clearly articulated vision could ensure that time and effort are focused on the most important issues and on the areas where analytics can meaningfully influence decisions and outcomes. A strong, clear vision could also determine which specific analytics use cases would deliver the greatest results and inform plans for the underlying data and technology needed to power those analytics.

One state, for example, ensures its data strategy focuses on specific data to solve agency and organizational challenges within identified policy areas and consistently aligns with the governor’s established priorities. This state also plans to invest analytics resources in a focused set of strategic priority areas such as public safety and criminal justice, housing and homelessness, workforce development, and COVID-19 recovery.

Demonstrate success early to generate momentum

A major data and analytics transformation can be daunting, especially if funding for the undertaking is limited. Rather than attempt to create a massive cross-agency, decade-long overhaul, leaders could start small and build momentum. Early wins could help establish proof points to celebrate and cultivate excitement across the agency. When effectively communicated across the agency, these early wins could also help catalyze more organic data and analytics transformations.

In one mid-Atlantic state, the Department of Health started small by focusing its data and analytics transformation efforts on the division of childhood lead exposure. The division was selected because it aligned with the department’s goals related to improving child health, had relatively few interdependencies with other divisions, and possessed a highly motivated, data-driven leader. Over the course of several weeks, a cross-functional team of program leaders and analytics experts aligned on their targets and key measures to identify initiatives that could deliver the greatest outcomes and then conceptualized the data and analytics needed to inform interventions.

Dashboard views incorporating these data and analytics were then built for several levels of government (for example, the Department of Health, local health departments, and municipal, county, and state offices) and shared across state and local levels. The solution and the speed of delivery generated interest from other divisions within the Department of Health in pursuing the same approach and leveraging centralized infrastructure such as a data lake.
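To make the shared-infrastructure idea concrete, the following is a minimal sketch of how dashboard-ready views might be aggregated from a central data lake. It is not the state’s actual implementation; the data path, column names, threshold, and groupings are illustrative assumptions.

```python
import pandas as pd

# Hypothetical extract from a shared data lake: one row per childhood blood-lead test.
# Columns (illustrative): test_date, county, facility_id, result_ug_dl
tests = pd.read_parquet("s3://state-health-data-lake/lead_tests/")

tests["test_month"] = pd.to_datetime(tests["test_date"]).dt.to_period("M")
tests["elevated"] = tests["result_ug_dl"] >= 3.5  # illustrative threshold (ug/dL)

# State-level view: monthly testing volume and share of elevated results.
state_view = (
    tests.groupby("test_month")
    .agg(tests=("elevated", "size"), elevated_rate=("elevated", "mean"))
    .reset_index()
)

# Local-health-department view: the same measures broken out by county.
county_view = (
    tests.groupby(["county", "test_month"])
    .agg(tests=("elevated", "size"), elevated_rate=("elevated", "mean"))
    .reset_index()
)

# Persist the views where a dashboarding tool can pick them up.
state_view.to_csv("state_lead_summary.csv", index=False)
county_view.to_csv("county_lead_summary.csv", index=False)
```

Because every view draws on the same underlying extract, other divisions could reuse the pattern by swapping in their own measures rather than building separate pipelines.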

Lead with a data and analytics center of excellence

Given the complex intersection of programmatic and functional expertise needed for a data and analytics transformation, state and local leaders could consider establishing a data and analytics center of excellence (CoE). A CoE generally consists of a specialized team that, especially in early stages of data and analytics maturity, could help accelerate results through coordination and know-how. For instance, the CoE could take responsibility for developing the initial strategic plan, incorporating linkages to the agency’s vision, and assembling and prioritizing a strategic road map for data and analytics. Over time, the CoE could serve a broader set of functions (see sidebar “Additional functions of a data and analytics center of excellence”).

Due to the potential breadth of its remit, a data and analytics CoE may need to assemble a diverse set of roles and capabilities, including a central analytics team leader, business analysts and translators, data scientists, data engineers, visualization experts, and back-end engineers. The exact composition may evolve over time as priorities shift, capability gaps in the rest of the agency close, and new challenges emerge. There may also be additional operational design choices to consider related to roles and responsibilities, and how each agency’s CoE interfaces with similar teams in other parts of the state government.

Connecticut’s Data and Policy Analytics unit operates like an analytics CoE. It is a centralized group designed to support state agencies with their data and analytics needs, including open data (such as publishing and visualizations), data integration (for example, identifying data sources and data linkage or matching), analytics (such as strategy development or use-case road maps), mapping and geospatial data, best practices, and coordination and facilitation. This cross-functional team supports multiple agencies, affording a systemwide perspective on data and analytics.

Scale through agile operating principles

As a public-sector agency begins to accelerate across its data and analytics maturity journey, the number of use cases that must be developed, built, and scaled is likely to increase quickly. This is when the “how” often becomes just as important as the “what,” and the agency may need to consider adopting an agile operating model. This approach could enable teams to progress sustainably through a stream of use cases in time-boxed sprints (“The five trademarks of agile organizations,” McKinsey, January 22, 2018). The model could also facilitate greater flexibility and responsiveness, attributes that are especially valuable in emergencies such as the COVID-19 pandemic.

In one case, the Department of Motor Vehicles (DMV) of a large state saw a sharp increase in fraudulent activity during the pandemic due to a host of factors, including higher rates for commercial driver services, higher prices of used vehicles, and new avenues for monetizing stolen IDs such as Pandemic Unemployment Assistance. In response, the DMV’s investigations division launched a major effort to tap into the state’s wealth of disparate data sets in hopes of identifying risks that fraud investigators may not have been detecting.

To enable this, the division workshopped a new operating model that focused on agility. Teams conducted rapid data engineering sprints to connect a growing portfolio of internal and external data sources.
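Purely as an illustration of what one such sprint’s output could look like, the sketch below joins a hypothetical DMV transaction extract with an external watchlist and applies simple rules to rank cases for investigators. The file names, columns, thresholds, and scoring are assumptions, not the DMV’s actual logic.

```python
import pandas as pd

# Hypothetical extracts produced during a data engineering sprint.
transactions = pd.read_csv("dmv_transactions.csv")  # txn_id, customer_id, txn_type, ip_address
watchlist = pd.read_csv("external_watchlist.csv")   # customer_id, source

# Join internal transactions against the external fraud signal.
joined = transactions.merge(
    watchlist[["customer_id", "source"]].drop_duplicates("customer_id"),
    on="customer_id",
    how="left",
)
joined["on_watchlist"] = joined["source"].notna()

# Simple rule: many transactions sharing one IP address.
ip_counts = joined.groupby("ip_address")["txn_id"].transform("count")
joined["shared_ip"] = ip_counts >= 5  # illustrative threshold

# Rank cases so investigators review the riskiest first.
joined["risk_score"] = joined["on_watchlist"].astype(int) * 2 + joined["shared_ip"].astype(int)
cases = joined[joined["risk_score"] > 0].sort_values("risk_score", ascending=False)

cases.to_csv("fraud_review_queue.csv", index=False)
```

In an agile setup, each sprint could add another data source or rule to the same pipeline, which is what allows insights to compound quickly.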

Through this new way of working, the DMV was able to continuously generate new insights and delegate response measures to the appropriate parties. This enhanced use of analytics based on agile operating principles led to a more than 55 percent increase in detection of verified fraudsters, a more than 50 percent reduction in the false-positive fraud detection rate, an improved employee experience with a bench of modern tools to fight fraud, and ultimately a streamlined resident experience.

Invest in a modern technology backbone

A public-sector agency’s ability to achieve its analytics ambitions depends in large part on adopting the appropriate technology to underpin them. However, public-sector leaders may be accustomed to infrastructure that is older, not integrated, and highly complex. For example, one state department was running approximately 125 enterprise applications across 15 divisions, the majority of which were on premises and had no method for sharing data or insights with one another. Constraints from categorical funding had reinforced investment in siloed systems rather than shared technology infrastructure.

The state set a goal of integrating the data across departments and agencies to gain a 360-degree view of populations served by multiple public-health programs, such as immunizations and the Women, Infants, and Children (WIC) program. The state’s data and analytics upgrades reduced duplicate applications, which revealed opportunities to adopt enterprise-wide cloud solutions. Those solutions boosted productivity and generated cost savings. The department also used more commercial off-the-shelf and software as a service (SaaS) solutions instead of building bespoke ones.
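As a minimal sketch of what such cross-program integration could involve, the snippet below matches immunization and WIC records on a shared identifier to summarize which residents each program reaches. The data sources, identifier, and columns are hypothetical; real linkages typically require probabilistic matching and strict privacy and data-use controls.

```python
import pandas as pd

# Hypothetical program extracts after cleaning and privacy review.
immunizations = pd.read_csv("immunization_registry.csv")  # person_id, last_vaccine_date
wic = pd.read_csv("wic_enrollment.csv")                   # person_id, enrollment_status

# One row per resident in each program.
imm_people = immunizations[["person_id"]].drop_duplicates()
wic_people = wic[["person_id"]].drop_duplicates()

# Deterministic match on a shared identifier; real programs often need
# probabilistic matching on name, date of birth, and address instead.
combined = imm_people.merge(wic_people, on="person_id", how="outer", indicator=True)

combined["programs"] = combined["_merge"].map({
    "left_only": "immunizations only",
    "right_only": "WIC only",
    "both": "both programs",
})

# A simple 360-degree summary: how many residents each program reaches,
# and how many are served by both.
print(combined["programs"].value_counts())
```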

Identify and navigate structural barriers systematically

Public-sector agencies routinely face structural barriers to advancing data and analytics maturity. One McKinsey Global Institute report found only 10 to 20 percent of potential value from data and analytics was captured in the public sector over a five-year period due to barriers stemming from siloed data within and across agencies as well as a lack of analytics talent (“The age of analytics: Competing in a data-driven world,” McKinsey Global Institute, December 7, 2016). The public sector has a relatively harder time attracting and retaining data and analytics talent (“Transforming the US government’s approach to hiring digital talent,” McKinsey, September 9, 2020), and contracts often require a multiyear commitment, preventing agile technology builds and the incorporation of best-of-breed technology and data services vendors. Moreover, funding to advance data and analytics may be earmarked for specific programmatic use cases instead of for building foundational infrastructure.

Acknowledging and navigating these institutional barriers systematically could prove pivotal, especially at early stages of the data and analytics maturity journey (“Accelerating data and analytics transformations in the public sector,” McKinsey, March 18, 2021). For example, one state set up novel partnerships with local academic institutions and established the role of a rotating data scientist training fellow to secure a pool of motivated, highly trained specialist talent. Another state’s analytics CoE established a field-and-forum learning journey that helped analysts at varying capability levels enhance their skills (“Lessons in leadership: Using data to drive public-health decisions,” McKinsey, August 16, 2022).


In our experience, agencies that acknowledge, highlight, and systematically review barriers to data and analytics maturity are more likely to overcome them. Agencies may face different challenges, including access to talent, data governance, contracting restrictions, or organizational design. Once they clearly identify which obstacles are specific to their paths, they can develop plans to overcome them. This includes getting buy-in and sponsorship from senior leadership.


These six factors can help jump-start progress for public-sector agencies looking to accelerate their capabilities and advance in data and analytics maturity. These factors in no way diminish the complexities of specific domains and contexts that public-sector agencies may have to face; instead, they suggest an approach that agencies might take regardless of their current capabilities to ensure analytics can drive meaningful insights, decisions, and results.
