When officials at the Centers for Disease Control and Prevention (CDC) realized that a new, potentially deadly virus was spreading rapidly, they knew they had to get ahead of it. Because the agency had worked successfully with Northrop Grumman on many projects over the years, it turned to the systems integrator for help building, as quickly as possible, a national infrastructure for all COVID line-level testing data.
The goal was to create an authoritative source of comprehensive COVID testing data, one that would capture every test event occurring anywhere in the United States or its territories. The system had to be able to deliver that data to downstream entities that were setting policy or mounting rapid responses. The team chose an event-driven architecture because it would allow the system to evolve and keep delivering value as conditions changed.
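In an event-driven architecture, each occurrence, here a single COVID test, is published as an immutable event to a stream that any downstream consumer can subscribe to. The sketch below shows that basic pattern using Confluent's Python client for Apache Kafka; the broker address, topic name and event fields are hypothetical illustrations, not details of the CDC system.

```python
import json
from confluent_kafka import Producer  # pip install confluent-kafka

# Hypothetical broker and topic, not the CDC's actual configuration.
producer = Producer({"bootstrap.servers": "localhost:9092"})

# One test result becomes one immutable event on the stream.
event = {
    "test_id": "t-000123",
    "jurisdiction": "US-VA",
    "result": "positive",
    "collected_at": "2020-11-02T14:30:00Z",
}

# Keying by jurisdiction keeps each jurisdiction's events ordered
# within a partition.
producer.produce(
    "covid-test-events",
    key=event["jurisdiction"],
    value=json.dumps(event).encode("utf-8"),
)
producer.flush()  # block until the broker acknowledges the event
```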
The infrastructure, based on Confluent’s event-streaming platform, combines multiple streams of data and creates a record for each individual’s test event. As records travel through the pipeline, they are enriched with metadata.
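Stream enrichment of this kind is typically a consume-transform-produce loop: read a raw event, attach metadata, and write the enriched record to a new topic for the next stage. A minimal sketch of that loop, again with hypothetical topic names and fields:

```python
import json
from datetime import datetime, timezone
from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "enrichment-stage",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["covid-test-events"])
producer = Producer({"bootstrap.servers": "localhost:9092"})

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    record = json.loads(msg.value())
    # Attach pipeline metadata; a real stage might also join against
    # reference data such as lab or jurisdiction registries.
    record["enriched_at"] = datetime.now(timezone.utc).isoformat()
    record["pipeline_stage"] = "enrichment-v1"
    producer.produce(
        "covid-test-events-enriched",
        key=msg.key(),
        value=json.dumps(record).encode("utf-8"),
    )
    producer.poll(0)  # serve delivery callbacks
```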
Confluent’s enterprise event-streaming platform helps agencies unlock and repurpose their existing data with a universal data pipeline, an architecture that makes data an active asset available to any credentialed user.
This process provides a consolidated, situational-awareness view across streams, and it allows data to be aggregated and streamed to any destination at any time. The Northrop Grumman team could then set up automated “data storefronts” that deliver data in real time to the White House Coronavirus Task Force and the Department of Health and Human Services. A self-service data storefront also lets jurisdictions view their own data.
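One way such a storefront view could be materialized is with a consumer that aggregates the enriched stream, for example a running count of positive tests per jurisdiction, and then serves or republishes the result. A hypothetical sketch, not a description of the actual storefronts:

```python
import json
from collections import Counter
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "storefront-aggregator",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["covid-test-events-enriched"])

# Running tally of positive results per jurisdiction. A production
# storefront would keep this in a durable store (e.g., a Kafka Streams
# state store or a ksqlDB table) rather than in-process memory.
positives_by_jurisdiction = Counter()

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    record = json.loads(msg.value())
    if record.get("result") == "positive":
        positives_by_jurisdiction[record["jurisdiction"]] += 1
```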
Report after report emphasizes the importance of data in agency decision-making, meeting mission goals and improving citizen services. Agencies must find a way to meet those goals while dealing with disparate data stores, legacy systems and databases that are slow and expensive to access, and existing data integration frameworks that don’t communicate well with one another. Achieving all of this requires a new way of thinking and a new way of managing data: event-driven streaming.
This article is an excerpt from GovLoop’s recent report, “Connecting the Dots: Getting Maximum Value From Data.” To learn how to use event streaming to process, store, analyze and act on both historical and real-time data in one place, download the full report here.