There’s no denying that big data is transforming the way government operates, and the applications and needs for big data analysis are wide-ranging. The challenge of managing the volume, variety and velocity of data is not going away anytime soon. In fact, it is getting harder for agencies to tackle, as we are creating more data than ever before.
The Cloudera Federal Forum educated government leaders on how to make smart big data investments and decisions. The event was moderated by Bob Gourley, publisher of CTOvision.com, DelphiBrief.com and the new analysis-focused Analyst One. Gourley’s background is as an all-source intelligence analyst and an enterprise CTO. Find him on Twitter at @BobGourley, and be sure to follow his blog for the great content and stories he shares. Gourley led us through an exciting day focused on big data. Some of the challenges highlighted during the sessions included:
- What kind of data ecosystem is needed to share data across the federal government?
- How do we secure the privacy of data?
- What are the key issues for big data analysis in the intelligence community?
- What can we do to prevent unauthorized access to information and reduce insider threats?
- What opportunities does big data present for the federal government?
- What are the challenges facing agencies to adopt big data solutions?
These are just some of the questions that panelists and audience members discussed during the Cloudera Federal Forum. The morning keynote speaker, David R. Shedd, Deputy Director of the Defense Intelligence Agency, provided insights into the challenges government faces in adopting big data tools and technology solutions. Shedd offered an interesting observation: “Bureaucracy will choose failure over change. The reason I say that is by and large, the bureaucratic construct where you work will look in the rear view mirror and say, we’re doing ok based on how we got here, rather than where we need to go.”
He went on to describe how critical it is for agencies to have champions and leadership driving them where they need to go. He also noted several challenges to adopting big data tools; I’ve highlighted five of them below.
1. Remove barriers to innovation: Organizations should continue to move toward a culture of innovation, taking calculated risks and encouraging staff to think creatively and outside the box.
2. The acquisition process is arcane: The acquisition process moves too slowly and is far too complex. It does not allow government to swiftly procure tools to meet mission needs. With a more rapid acquisition process, agencies could be more agile and flexible in leveraging big data.
3. Leaders must understand the power of big data: More sharing and more case studies are needed to help connect with leaders and teach them about the power of big data.
4. A dynamic new world of threats: Simply put, the threat landscape for government is changing. There are more threats facing government than ever before, and they are risking the economic viability of our nation. It is essential that organizations understand how to securely protect the data in their information systems.
5. Finding the right data: creating the data ecosystem: There’s the saying, “You don’t know what you don’t know,” and this couldn’t be more true of big data analysis. Because so much data is being created, stored and managed across multiple departments and agencies, there are millions of insights waiting to be uncovered. For government, a challenging question is: how do we get people the right information, at the right time, to make the right decision? Data must be shared broadly across an agency while still addressing privacy and security concerns.
6 Lessons Learned for Big Data Adoption
Yet the event certainly was not all about the challenges facing government in adopting big data. During the morning panel, government leaders discussed best practices and lessons learned from their agencies. The panel included:
- Moderated by LTG (R) Richard Zahner
- John A. Marshall, Deputy Program Manager, NSG Program Management Office, IT Service Directorate, National Geospatial Intelligence Agency
- Skip McCormick, Systems Architect, Central Intelligence Agency
- Kevin Ford, Deputy Director, Technology Directorate, National Security Agency
- Todd G. Myers, Lead Global Compute Architect, National Geospatial-Intelligence Agency (NGA)
I looked back at my notes and pulled out the six core best practices I heard from the panelists.
1. Get buy-in by showing value: One strategy to get buy-in is to show the value that big data analysis will bring to your agency. Show core metrics and a clear problem that you will be solving by leveraging data. Conduct estimates and make it real and relevant for your agency.
2. Governance is essential: Governance is essential to managing data. Your agency must make sure your policies are up-to-date. Also, be sure that you are enforcing your policy. You could have a great policy written, but without any kind of enforcement mechanism, it will fall flat.
3. Tight budgets lead to innovation: I heard this a lot during the panel, and we’ve also seen this trend highlighted on GovLoop. With tight budgets, agency leaders must think even more critically about how to be prudent users of limited funding.
4. Define personas to manage data: This was an interesting observation from the panel, and a strategy that can be used to combat insider threats. Every employee needs a different level of access; make sure your data system maps to those needs and does not provide unauthorized access.
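To make the persona idea concrete, here is a minimal sketch of deny-by-default, persona-based access checks. The persona names, datasets and grants are purely illustrative assumptions, not any agency’s actual policy:

```python
# Hypothetical sketch: map each persona to the datasets it is
# explicitly allowed to see. Anything not granted is denied.
PERSONA_ACCESS = {
    "analyst": {"reports", "sanitized_logs"},
    "administrator": {"reports", "sanitized_logs", "audit_trail"},
    "contractor": {"reports"},
}

def can_access(persona: str, dataset: str) -> bool:
    """Return True only if the persona was explicitly granted the dataset."""
    return dataset in PERSONA_ACCESS.get(persona, set())

# Deny-by-default: unknown personas and ungranted datasets are refused.
print(can_access("analyst", "reports"))        # True
print(can_access("contractor", "audit_trail")) # False
print(can_access("unknown_role", "reports"))   # False
```

The design choice that matters here is the default: access is refused unless a persona-to-dataset mapping explicitly allows it, which is what keeps an over-broad grant from quietly becoming an insider-threat vector.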
5. Start small, build a framework and work out: When beginning any big data project, or any IT initiative, starting small and building out is a great way to test and iterate. You can spot roadblocks, understand challenges and catch errors early. Ultimately, this leads to stronger products and services for stakeholders.
6. Create communities of practice: Big data requires the sharing of knowledge, best practices and case studies. If you can create a community of practice within your agency, you can help to build support across your department, encourage knowledge sharing and start to build a data culture.
The reality facing government is that big data is moving fast, and many agencies are capitalizing on the opportunity it presents. With these new opportunities, it is essential that government keep security paramount and create infrastructure that supports agility, flexibility and security.
Cloudera is revolutionizing enterprise data management with the first unified platform for big data: the Enterprise Data Hub. Cloudera offers enterprises one place to store, process and analyze all their data, empowering them to extend the value of existing investments while enabling fundamental new ways to derive value from their data. Learn more here:
Thanks, Pat, for a very insightful post. I thought you might find my recent report on big data of interest. The report was released by the IBM Center for the Business of Government (see here – http://kevindesouza.net/2014/02/big-data-ibm/). It outlines what I learned from interviews with CIOs at all three levels of government (local, state, and federal). I would appreciate your thoughts on it.