AI in government is about people, not programming

The US government’s investment in AI has grown rapidly in the past few years, as evidenced by the additional funding for AI research in President Biden’s fiscal 2023 budget.

With more than $2 billion allocated to the National Institute of Standards and Technology and the Department of Energy for AI research and development, it’s clear that enthusiasm for the technology is growing in government.

The push to apply AI is driven by the urgent need to address federal employee fatigue. A recent study found that nearly two-thirds of government employees suffer from burnout, a much higher rate than observed in the private sector. Moreover, nearly half of respondents are considering leaving their government jobs within the next year because of increased burnout and stress.

One immediate solution to help address this potential crisis is the responsible implementation of artificial intelligence. AI can mitigate burnout by taking on repetitive, time-consuming tasks and streamlining processes, reducing the overall burden of government work. However, effective AI investment requires more than funding and technology.

Agencies must balance efforts to scale up investment in AI with responding to the unique needs and challenges of the many diverse teams that make up the federal workforce.

Currently, there is a lack of coherent guidance driving government efforts around artificial intelligence. While organizations such as NIST have published foundational guidance like the AI Risk Management Framework (AI RMF), agencies with no prior AI experience may struggle to build the foundation needed to develop agile, mature AI. To quickly lay the groundwork for a long-term AI strategy that delivers short-term gains for the federal workforce, agencies should consider three guiding components of AI.

Ensure acceptable levels of data maturity

Funding for AI is only one component of an effective AI strategy. Before implementing new technologies, agencies must start with their existing processes, beginning with their level of data maturity.

If an agency does not have enough historical data to analyze, or if the data it does have is unstructured, applying artificial intelligence can create additional front-end work for federal workers, who may find themselves sifting through inaccurate or incomplete data processed by the AI. In response to this challenge, for example, the Department of Defense stood up the Chief Digital and Artificial Intelligence Office (CDAO) to lead the adoption of artificial intelligence across the department, including the department’s data strategy and policy.

Once agencies reach a baseline of data maturity, they can pilot core AI applications such as task automation. These pilots enable agencies to collect high-quality data, generate analysis and insight from that data, and gather the information needed to create a scalable AI roadmap that can integrate with other IT modernization efforts.

But a roadmap alone is not enough to ensure that AI-driven technologies are beneficial to the federal workforce.

Understanding the needs of federal employees

To implement AI that truly supports federal workers, agencies need to understand federal employees’ key pain points and challenges. For most private-sector organizations adopting new technology, user experience surveys are an essential part of the pilot program, ensuring an analytics-driven understanding of the technology’s successes and shortcomings.

However, although employee input is an important part of the AI planning process, government surveys are often expensive and can take months or even years to translate into actionable data.

One way to combat this difficulty is to use existing AI to inform AI investments. For example, instead of sending out a survey whose results could take months to arrive, an AI-powered dashboard could provide a real-time view of the workforce, highlighting areas that need more support, or automate a short pulse survey through which employees can provide input.

Using relatively basic AI to assess implementation allows agencies to gain insight into workforce needs more sustainably and effectively than traditional surveys do, showing IT leaders where to implement AI for the greatest impact.

Using artificial intelligence to improve employee experience

Once agencies have an AI baseline and understand workers’ needs, the final step for implementing employee-centered AI is to create a robust employee experience program powered by AI.

There are many ways AI can help agencies manage experience, from automating schedules to simplifying business decisions. When AI is designed with these improvements in mind, the tangible benefits of AI support both broader organizational goals and the people working to achieve them.

Expanding AI beyond pilot programs remains a challenge. One of the CDAO’s primary responsibilities is to develop and operationalize AI-powered capabilities and field them at scale across the defense enterprise.

The CDAO addressed this issue by selectively scaling only proven AI solutions to enterprise and shared use cases. Prioritizing proven solutions ensures that the AI being implemented runs smoothly, is easy to use and, most importantly, is familiar to the workforce. As AI solutions become more sophisticated, agencies can continue to expand until they have a fully scalable AI network designed by and for people.

Contrary to much of the conversation around AI, people are the most important component of a successful AI program. For implementation to succeed, agencies need a human-centered AI mindset. Following these three guidelines for human-centered artificial intelligence creates space for the federal workforce to be more creative, productive and ultimately more effective in advancing agency missions, and it equips leaders to unlock the full potential of their teams.

Dr. Allen Badow is chief technology officer at Empower AI and director of the Empower AI Center for Rapid Engagement and Agile Technology Exchange (CREATE).

Do you have an opinion?

This article is an editorial and the opinions expressed are those of the author. If you would like to respond, or have an editorial of your own you would like to submit, please email Federal Times Senior Editorial Director Carrie O’Reilly.
