On September 23, 2024, Principal Deputy Assistant Attorney General Nicole M. Argentieri announced that the U.S. Department of Justice (DOJ) had updated its guidance on the Evaluation of Corporate Compliance Programs (ECCP). The ECCP serves as a roadmap for federal prosecutors evaluating the effectiveness of corporate compliance programs, so companies should pay close attention to it when reviewing their own programs. Ultimately, a company's efforts to design, regularly evaluate, and update its compliance program in line with this guidance could inform criminal investigations, charging decisions, and case resolutions.

Emerging Technologies and AI

Most notably, the DOJ’s recent update focuses on processes for identifying and managing internal and external risks associated with new technologies, including artificial intelligence (AI).

This focus on AI is neither surprising nor new. Deputy Attorney General Lisa O. Monaco stated during remarks in February 2024, and again in March 2024, that the DOJ is focused on AI. She announced the formation of Justice AI to help DOJ better understand and prepare for how AI will affect its mission and accelerate its potential for good, while also guarding against risks associated with AI. She foreshadowed that DOJ enforcement will be “robust” and that federal prosecutors will seek stiffer sentences for offenses made significantly more dangerous by the misuse of AI.

The updated ECCP focuses on risk assessment and management associated with new technologies, including AI, and contemplates their use in both a company’s business and its compliance program. Now more than ever, with respect to using AI for business and compliance purposes, companies must consider how they will:

  • Assess the risks associated with using AI and other new technologies;
  • Integrate those risks into broader enterprise risk management strategies and compliance programs;
  • Implement effective ways to curb potential negative or unintended consequences resulting from the use of new technologies such as AI;
  • Mitigate the potential for deliberate or reckless misuse of AI, including by company insiders;
  • Put controls in place to monitor and test AI to ensure it is used for its intended purposes and is trustworthy, reliable, and consistent with the company’s code of conduct;
  • Maintain human oversight of AI and AI decision-making;
  • Detect and correct decisions made by AI that are inconsistent with the company’s values or applicable law; and
  • Train company employees on using emerging technologies, including AI.

The ECCP defines AI broadly: no system is too simple to qualify as a covered AI system for lack of technical complexity, and the definition covers systems that are fully autonomous, partially autonomous, and not autonomous, whether they operate with or without human oversight. AI is defined as:

  1. Any artificial system that performs tasks under varying and unpredictable circumstances without significant human oversight, or that can learn from experience and improve performance when exposed to data sets.
  2. An artificial system developed in computer software, physical hardware, or other context that solves tasks requiring human-like perception, cognition, planning, learning, communication, or physical action.
  3. An artificial system designed to think or act like a human, including cognitive architectures and neural networks.
  4. A set of techniques, including machine learning, that is designed to approximate a cognitive task.
  5. An artificial system designed to act rationally, including an intelligent software agent or embodied robot that achieves goals using perception, planning, reasoning, learning, communicating, decision-making, and acting.

The definition of AI encompasses, but is not limited to, the AI technical subfields of machine learning (including, but not limited to, deep learning as well as supervised, unsupervised, and semi-supervised approaches), reinforcement learning, transfer learning, and generative AI. It does not include robotic process automation or other systems whose behavior is defined only by human-defined rules or that learn solely by repeating an observed practice exactly as it was conducted.

Whistleblower Protections

In addition to the DOJ’s focus on emerging technologies, the updates highlight a commitment to anti-retaliation and whistleblower protections. Companies must evaluate whether they have effective mechanisms for reporting misconduct and whether their policies and/or practices incentivize or chill reporting. The DOJ will consider a company’s anti-retaliation policy and whistleblower protections in writing and in practice when evaluating its compliance program, including how employees who report misconduct internally are treated and the training employees receive about these protections. This ECCP update follows the DOJ’s recent introduction of its Corporate Whistleblower Awards Pilot Program on August 1. This pilot program provides financial incentives for whistleblowers to report certain categories of corporate misconduct to DOJ for potential investigation and charging.

Use of Data and Resources

The updated ECCP instructs prosecutors to consider whether a company’s compliance department has access to, and is appropriately leveraging, data to test the effectiveness of, and identify issues in, its compliance program and its third-party vendor relationships. Importantly, DOJ will consider the extent to which the company’s assets, resources, and technology are available to compliance and risk management as compared to the resources the company devotes to its revenue-generating business.

Lessons Learned

Lastly, DOJ emphasizes the need for companies to have a process in place for updating policies, procedures, and training to reflect their own lessons learned or those of other companies operating in the same industry or geographic region.

Conclusion

DOJ’s new ECCP guidance provides a helpful roadmap for corporate compliance leaders and their teams in assessing and updating their current compliance framework. With the increased incentives for whistleblowers to report corporate misconduct, it is critical to ensure your corporate compliance and risk framework is not only robust, but also evolving in line with these announced priorities.