Federal policy and its associated guidelines are extensive and constantly evolving. Whether they are citizens or federal employees, people spend countless hours navigating government databases and content management systems seeking answers to their questions. The answer is often scattered across numerous data sources, requiring people to piece together policy, guidance, and procedures to reach a complete answer – or, in many cases, to make decisions based on incomplete information. We’ve seen this result in misinformed or non-compliant action, frustration, and employee disengagement.
Traditional approaches to solving this problem include:
- Streamlining information sharing through designated communications channels, normally equipped with keyword search
- Maintaining a knowledge base, limited by the need for a person to maintain, curate and regularly update the information and often constrained by disparate knowledge solutions across the organization
- Providing a service center staffed with people who are trained on changing policies and guidelines to answer questions, so employees and citizens don’t have to find the answers themselves
Still, these resources cannot keep up with the pace at which information changes and evolves. So how can AI help?
The Future of Knowledge Management
We envision a future where AI enables knowledge to come to life – leveraging existing information channels to provide succinct answers to conversational questions. With an estimated 80% of enterprise data being unstructured – emails, video and audio streams, policy documents, service tickets, standard operating procedures – AI technologies such as machine learning (ML) and natural language processing (NLP) are needed to extract knowledge from that data and return more relevant, complete answers to questions. And with the evolution of AI, this possibility is becoming a reality. According to Gartner’s Magic Quadrant for Insight Engines, by 2022 information will proactively find more employees more often, providing the insight needed to progress decisions and actions and reducing reactive searching by 20%.
The future state is a technology-agnostic search engine that filters SharePoint sites, websites, policy documents, emails, and other information sources to give users a single source of truth without the cumbersome process of navigating countless interfaces. ML will continuously improve the relevance of the content it surfaces, much as Google continuously refines its search results. And all of this can be done in natural language, with access through multiple channels (chat, search, text, voice). Let’s walk through a few examples.
Today, data calls from OMB, the Hill, the president, and others are answered through the distribution of countless emails, often rerouted numerous times before reaching a person with the answer, and the responses then require manual consolidation. What if an individual could ask, perhaps even by voice, “How many vehicles are under maintenance in the department’s fleet?” and receive an immediate answer?
Today, federal employees supporting the mission in the field search policy databases, official guidelines, associated SOPs, and in-office documentation to determine the compliant process for accomplishing their goal. What if they could input a single question and receive a human-like answer that summarizes the key information, with references to the most relevant sources?
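To make this concrete, here is a minimal sketch of that kind of cross-source lookup. It uses simple TF-IDF keyword similarity rather than the full ML/NLP stack described above, and the sources, snippets, and question are hypothetical placeholders:

```python
# A minimal, illustrative sketch only. The sources, snippets, and question below
# are hypothetical; a real system would index far more content and use NLP models
# rather than TF-IDF keyword matching.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical snippets pulled from different information sources.
documents = [
    {"source": "Fleet SOP (SharePoint)",
     "text": "Vehicles flagged for maintenance must be logged in the fleet system within 24 hours."},
    {"source": "OMB policy memo",
     "text": "Departments shall report fleet maintenance status in the quarterly data call."},
    {"source": "Service desk FAQ",
     "text": "To request a replacement vehicle, submit the request form to the fleet office."},
]

question = "How do I report vehicles that are under maintenance?"

# Build one TF-IDF vocabulary over the snippets plus the question.
vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform([d["text"] for d in documents] + [question])

# Rank the snippets by similarity to the question and show the best matches
# along with the source they came from, so the answer carries its references.
scores = cosine_similarity(matrix[-1], matrix[:-1]).flatten()
ranked = sorted(zip(scores, documents), key=lambda pair: pair[0], reverse=True)

print(f"Question: {question}\n")
for score, doc in ranked[:2]:
    print(f"[{doc['source']}] (relevance {score:.2f})")
    print(f"  {doc['text']}\n")
```

A production insight engine would replace this keyword-style ranking with trained language models, add summarization to produce the human-like answer, and expose the same lookup through chat, search, text, and voice channels.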
How do you get started?
While it may sound magical, bringing knowledge to life doesn’t come without work. And the value may not yet outweigh that work. To be successful, an organization must define user priorities; identify the relevant data sources, which may sit both within and outside the organization; establish governance around data access and data source management; tag the data; and train models based on the organization’s specific needs.
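As a rough illustration of the “tag the data, then train” step, the sketch below routes incoming questions to a policy domain using a handful of hypothetical tagged examples. In practice this would require far more labeled data, plus the governance described above around where that data comes from:

```python
# An illustrative sketch only: the domains, example questions, and tags are
# hypothetical stand-ins for an organization's own curated, labeled data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Questions tagged by the policy domain (and therefore data source) they belong to.
training_examples = [
    ("How many vehicles are under maintenance?", "fleet"),
    ("When is the quarterly fleet report due?", "fleet"),
    ("What is the telework approval process?", "hr_policy"),
    ("Who approves remote work agreements?", "hr_policy"),
    ("How do I submit a travel voucher?", "travel"),
    ("What are the current per diem rates?", "travel"),
]
texts, tags = zip(*training_examples)

# Train a simple classifier that routes a new question to the right domain,
# which in turn determines which data sources to search for the answer.
router = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
router.fit(texts, tags)

print(router.predict(["Where do I file my travel expenses?"]))  # likely ['travel']
```

Prototyping something this small against real organizational content is often enough to gauge how much tagging, governance, and model training a full solution would demand.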
We recommend identifying a scalable use case that can be prototyped to evaluate the complexity and value of these technologies within your environment. Developing the business case to support an investment to scale is key to achieving success.
Co-authored by Christina Bone, Accenture Federal Services AI Value Architect
Dominic Delmolino is a GovLoop Featured Contributor. He is the Chief Technology Officer at Accenture Federal Services and leads the development of Accenture Federal’s technology strategy. He has been instrumental in establishing Accenture’s federal activities in the open source space and has played a key role in the business by fostering and facilitating federal communities of practice for cloud, DevOps, artificial intelligence and blockchain. You can read his posts here.