The U.S. Department of Justice (DOJ) recently updated its Evaluation of Corporate Compliance Programs (ECCP), which prosecutors consider when investigating, charging, and negotiating plea or other agreements with corporations. The update highlights considerations related to a corporation's use of artificial intelligence (AI) and whether the corporation's compliance program identifies and mitigates the risks associated with AI. It also gives corporations a list of AI-specific questions to help guide the development and ongoing assessment of their compliance programs.
It is the latest in a series of signals that regulators will prioritize AI compliance reviews in 2025. As such, we expect 2025 to see a surge in federal and state regulators sending requests for information, conducting compliance sweeps, and bringing enforcement actions targeted at industries where the risks posed to consumers may be higher.
Corporations leveraging AI-driven systems and advanced data analytics, two data processing activities increasingly integral to modern operations, should closely review the ECCP’s considerations given the unique compliance risks posed by these processing activities. The DOJ places particular emphasis on a proactive approach, stating "prosecutors may credit the quality and effectiveness of a risk-based compliance program that devotes appropriate attention and resources to high-risk transactions, even if it fails to prevent an infraction."
Three areas of the ECCP received significant updates: evaluating risk management of emerging technologies, emphasizing the role of data analysis, and strengthening whistleblower protections.
AI and Emerging Technology Risk Management
The DOJ places heightened scrutiny on corporations using AI and data-driven technologies, focusing on how a corporation leverages data to gain insight into the effectiveness of its compliance program and to promote a culture of compliance. The ECCP also instructs prosecutors to assess whether corporate compliance programs thoroughly document the risks associated with AI across various domains, such as data confidentiality, cybersecurity, bias in automated decisions, and overall data reliability. In addition, compliance programs must demonstrate how the corporation proactively identifies misconduct and gaps in the program itself. In other words, the DOJ makes clear that compliance is an ongoing function of a well-designed program, not a one-time "check the box" exercise.
One key question the DOJ added in the update is "what baseline of human decision-making is used to assess AI?" Corporations need to establish this baseline and compare AI outputs against it to ensure those outputs continue to meet ethical and regulatory expectations.
Best practices for compliance include conducting routine performance checks, auditing input data quality, and establishing technical safeguards to monitor, test, and evaluate the AI system's outputs. Even where these practices fail to prevent an issue, the DOJ instructs prosecutors to consider whether the company implemented robust compliance tools such as monitoring systems, continuous testing, and data validation processes. Given the DOJ's emphasis on "proactive" risk management, corporations need to ensure that AI risks and misuse can be promptly identified and mitigated.
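As a concrete illustration of the human-decision baseline concept, the sketch below shows one way a compliance team might periodically compare sampled AI outputs against the outcomes a human reviewer reached on the same cases, escalating disagreements when agreement falls below a chosen threshold. The data structure, field names, and 95% threshold are illustrative assumptions for this sketch, not DOJ requirements.

```python
# Minimal sketch (not DOJ-prescribed): comparing AI outputs against a
# human-decision baseline and flagging drift for compliance review.
# All names, fields, and thresholds below are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Decision:
    case_id: str
    ai_outcome: str      # e.g., "approve" / "deny"
    human_outcome: str   # outcome a human reviewer reached on the same case

def agreement_rate(decisions: list[Decision]) -> float:
    """Share of sampled cases where the AI matched the human baseline."""
    if not decisions:
        return 1.0
    matches = sum(1 for d in decisions if d.ai_outcome == d.human_outcome)
    return matches / len(decisions)

def review_ai_against_baseline(decisions: list[Decision],
                               min_agreement: float = 0.95) -> list[str]:
    """Return case IDs to escalate if agreement falls below the threshold."""
    if agreement_rate(decisions) >= min_agreement:
        return []  # within tolerance; log the check and move on
    # Escalate every disagreement for human compliance review.
    return [d.case_id for d in decisions if d.ai_outcome != d.human_outcome]

# Example periodic check on a sampled batch of decisions:
sample = [
    Decision("C-001", "approve", "approve"),
    Decision("C-002", "deny", "approve"),
    Decision("C-003", "deny", "deny"),
]
flagged = review_ai_against_baseline(sample)
print(f"Agreement: {agreement_rate(sample):.0%}; escalate: {flagged}")
```

In practice, the sampling approach, thresholds, and escalation paths would themselves be documented in the compliance program so the review process is auditable.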
Data-Driven Compliance and Analytics
Expanding on previous guidance, the ECCP update underscores the importance of data analytics in a robust compliance program. The DOJ prompts prosecutors to consider whether a corporation's compliance team has access to the data needed to evaluate risk holistically. Specifically, the DOJ focuses on the assets, resources, and technology available to the compliance and risk management function compared to those available to other teams within the corporation. Corporations that under-invest in compliance, or that otherwise fail to give the compliance function tools at least as sophisticated as those used by the business, face significant legal risk if a prosecutor comes knocking.
Another focus area for the DOJ is third-party risk management. The department emphasizes the importance of data analytics in continuously monitoring vendor risk. Supply chain risk is among the top risks cited by information security officers, and corporations will be assessed on how vendor relationships are reviewed and tracked.
Strengthening Whistleblower Protections and Reporting Structures
The DOJ aligned the ECCP updates with its new Corporate Whistleblower Awards Pilot Program, introduced in August 2024, which reinforces the department's stance on whistleblower protections. The aligned guidance gives prosecutors a framework for evaluating a corporation's commitment to protecting the whistleblower process, signaling DOJ support for employees who report potential misconduct. Corporations will continue to be assessed on their anti-retaliation policies and whether employees are actively trained on internal and external reporting mechanisms.
Prosecutors may place heightened emphasis on whether employees who report misconduct are disciplined fairly compared to others involved in the misconduct. For instance, the DOJ views it as essential that reporting misconduct be treated as a mitigating factor in the company's disciplinary response to a whistleblower's own involvement, which promotes fairness and encourages future reporting. The external training aspect is relatively new and may pose practical challenges, as corporations will need to balance internal confidentiality with external whistleblower engagement.
Takeaway for Businesses on Corporate Compliance Programs
The DOJ's updated guidance underscores its focus on well-designed, continuously monitored, and routinely improved compliance programs, with particular attention to AI governance, data transparency, and whistleblower protections. As the DOJ's compliance oversight expands, corporations that invest in nimble, adaptable compliance programs will be well positioned to meet the DOJ's expectations and limit regulatory scrutiny. For corporations navigating these changes, legal counsel can provide essential guidance on establishing risk management practices aligned with the DOJ's directives.
Here is a summary of steps corporations can take to align their compliance programs with the DOJ's expectations:
- Strengthen Risk Assessment and Documentation. Implement comprehensive, AI-specific risk assessments that document confidentiality, bias, and accuracy risks. Ensure these assessments are reviewed and updated regularly to reflect evolving regulatory standards.
- Establish Continuous Monitoring and Human Oversight for AI Systems. Corporations should deploy monitoring mechanisms to detect potential non-compliance or misuse of AI tools and integrate human oversight at critical decision points (see the sketch following this list). The use of AI technologies does not excuse a failure to meet legal compliance requirements.
- Prioritize Data Access and Analytics for Compliance. Provide compliance personnel access to the same data tools as business units, facilitating a data-driven approach to risk management. Ensure these analytics are regularly audited and validated to comply with the DOJ’s evolving standards.
- Ensure Proportional Allocation of Resources. As companies invest in emerging technologies, a proportional amount of resources should be dedicated to establishing corresponding compliance programs.
- Enhance Whistleblower Training and Protection. Develop robust anti-retaliation policies that are clearly communicated across the organization. Consider incentivizing whistleblowing and ensure employees feel confident in internal and external reporting processes.
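For the human-oversight point above, the following sketch illustrates one possible approach to oversight at critical decision points: AI recommendations that are high-impact or low-confidence are routed to a human reviewer rather than executed automatically. The action categories and confidence threshold are illustrative assumptions and are not drawn from the DOJ guidance.

```python
# Minimal sketch (illustrative only): routing AI recommendations to human
# review at critical decision points. The categories and threshold are
# assumptions, not regulatory standards.

HIGH_IMPACT_ACTIONS = {"deny_claim", "terminate_vendor", "flag_for_fraud"}

def requires_human_review(action: str, model_confidence: float,
                          confidence_floor: float = 0.90) -> bool:
    """Route high-impact or low-confidence AI recommendations to a person."""
    return action in HIGH_IMPACT_ACTIONS or model_confidence < confidence_floor

def decide(action: str, model_confidence: float) -> str:
    if requires_human_review(action, model_confidence):
        # In practice this would open a ticket or queue item for a reviewer.
        return f"escalated to human reviewer: {action}"
    return f"auto-approved: {action}"

print(decide("approve_invoice", 0.97))   # auto-approved
print(decide("deny_claim", 0.99))        # escalated (high-impact action)
print(decide("approve_invoice", 0.62))   # escalated (low confidence)
```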
For more information, please contact us or your regular Parker Poe contact. You can subscribe to our latest alerts and insights here.