State attorneys general (AGs) continue to play a pivotal role as regulatory innovators, leveraging their expertise and resources to shape policy and practice. The public-facing nature of AG offices across the U.S. compels them to respond to constituent concerns on abbreviated timetables. This political sensitivity, combined with the AGs’ authority to address both local and national issues, underscores their significant influence in the current regulatory environment.

In a recent interview, Karen White, the executive director of the Attorney General Alliance (AGA), discussed the organization’s impactful partnership with PBS, its involvement in the Bipartisan Leadership Project, and its proactive stance on artificial intelligence (AI). Originally a regional group, the AGA has grown into a significant force addressing complex issues through bipartisan collaboration and innovative partnerships.

Published in Law360 on January 22, 2025. © Copyright 2025, Portfolio Media, Inc., publisher of Law360. Reprinted here with permission.

As discussed in the first installment of this two-part article, state attorneys general across the U.S. took bold action in 2024 to address what they perceived as unlawful activities by corporations in several areas, including privacy and data security, financial transparency, children’s internet safety, and other consumer protection claims.

Missouri’s attorney general (AG) announced on X.com (formerly Twitter) that he is “issuing a rule requiring Big Tech to guarantee algorithmic choice for social media users.” [X.com post (January 17, 2025, roughly 3:35 p.m. EST)] He intends to use his authority “under consumer protection law,” Missouri’s Merchandising Practices Act, “to ensure Big Tech companies are transparent about the algorithms they use and offer consumers the option to select alternatives.” [X.com post] The Missouri AG touts this rule as the “first of its kind” in an “effort to protect free speech and safeguard consumers from censorship.” [Press release]

On January 9, New Jersey Attorney General (AG) Matthew J. Platkin and the Division on Civil Rights (DCR) launched a new Civil Rights and Technology Initiative aimed at addressing the potential for discrimination and bias associated with artificial intelligence (AI) and other decision-making technologies. The announcement is one of many recent examples of AGs leading the development of AI regulation. The New Jersey initiative is informed by recommendations from Governor Phil Murphy’s Artificial Intelligence Task Force, which emphasized the need for public education on bias and discrimination related to AI deployment.

As one of her last acts in office, on December 24, 2024, Oregon Attorney General (AG) Ellen Rosenblum issued guidance for businesses deploying artificial intelligence (AI) technologies. The guidance highlights the risks associated with the commercial use of AI and underscores that, despite the absence of a specific AI law in Oregon, a company’s use of AI must still comply with existing laws.

The Internet of Things (IoT) represents a transformative shift in how consumers interact with technology, integrating physical devices with sophisticated services to create interconnected ecosystems. As the adoption of IoT devices skyrockets, with projections estimating 75 billion connected devices by 2025, the legal landscape surrounding these hybrid transactions — comprising goods, software, and services — remains unsettled. Traditional legal frameworks, such as the Uniform Commercial Code (UCC), struggle to address the complexities of IoT transactions. Consumer advocacy groups are increasingly calling for regulatory intervention to protect consumers from emerging issues, given a legislative landscape that has not kept pace with rapidly evolving technology.

Introduction

The National Defense Authorization Act (NDAA) for 2025 includes a mandate that contractors furnish information and documentation to enable the military to modify and repair equipment and systems. Not surprisingly, industry is pushing back on that mandate. On September 25, Senator Elizabeth Warren (D-MA) sent a letter to various industry associations, questioning their motives for opposing a right-to-repair requirement that the Senate included in its proposed defense budget for fiscal year (FY) 2025. Warren also sent a separate letter to Secretary of Defense Lloyd Austin, expressing concern about contractual restrictions that void contractor warranties when third parties perform repairs and that prevent access to operations, maintenance, integration, and training data.