Reassessing AB 3211 with Agile Methodology
Understanding AI July 29, 2024
In a guest post, Dean Ball, an AI policy analyst at the Mercatus Center and author of the Substack newsletter Hyperdimensional, examines California's AB 3211, a deepfake bill that has already passed one legislative house. Unlike many other state deepfake laws, California's is unusually ambitious, and Ball argues it needs rethinking.
SB 1047, another California AI bill, has received more attention because of its potential to disadvantage small tech companies. But AB 3211, authored by Assemblymember Buffy Wicks, could have an even bigger impact on the AI industry. The bill mandates watermarking for all AI-generated content, posing serious problems for platforms like Hugging Face, which might have to remove many of the generative models they host.
The bill also requires a database of digital fingerprints for any potentially deceptive content an AI system produces, a substantial burden for developers. This provision looks nearly impossible for open-weight model developers to comply with, since once the weights are released they have no control over the outputs users generate.
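To make that burden concrete, here is a minimal sketch of what a provider-side fingerprint registry might look like. It is purely illustrative: AB 3211 does not specify a fingerprinting scheme, so a plain SHA-256 digest and a SQLite table stand in for whatever a real system would use, and the function names are our own.

```python
import hashlib
import sqlite3

# Illustrative only: a SHA-256 digest and a SQLite table stand in for whatever
# "digital fingerprint" scheme a real registry would use.

def fingerprint(content: bytes) -> str:
    """Identify one piece of generated content by a hex digest."""
    return hashlib.sha256(content).hexdigest()

def record_output(db: sqlite3.Connection, content: bytes, model_id: str) -> None:
    """Provider-side step: log every output the hosted service produces."""
    db.execute(
        "INSERT OR IGNORE INTO fingerprints (digest, model_id) VALUES (?, ?)",
        (fingerprint(content), model_id),
    )
    db.commit()

def was_generated_here(db: sqlite3.Connection, content: bytes) -> bool:
    """Check suspect media against the registry (exact match only)."""
    row = db.execute(
        "SELECT 1 FROM fingerprints WHERE digest = ?",
        (fingerprint(content),),
    ).fetchone()
    return row is not None

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE fingerprints (digest TEXT PRIMARY KEY, model_id TEXT)")
record_output(db, b"generated image bytes", "hosted-model-v1")
print(was_generated_here(db, b"generated image bytes"))   # True
print(was_generated_here(db, b"re-encoded image bytes"))  # False: any change defeats an exact hash
```

The hosted case is at least tractable, because the provider sees every output and can log it. With an open-weight model running on a user's own machine, there is no server in the loop to do the logging, which is the compliance gap described above. And even for hosted services, an exact hash fails as soon as content is cropped or re-encoded, so a real registry would need far more robust, and far more expensive, perceptual fingerprinting.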
Moreover, AB 3211 stipulates that chatbots must disclose that they are AI at the start of every conversation and obtain the user's acknowledgment, much like the annoying cookie notifications on European websites.
Ball highlights that deceptive deepfakes are a genuine problem requiring legislative action, but AB 3211’s current form is poorly drafted and overly broad. It mandates “maximally indelible” watermarks for all generative AI content, which might be impossible to achieve given current technology limitations. The bill’s stringent requirements could lead to a false sense of security, as bad actors can still remove watermarks and disseminate deceptive media.
In its effort to protect society from synthetic media, AB 3211 imposes severe constraints on the AI industry, covering every generative AI system regardless of size or purpose, from a grad student's small project to a large company's multimodal models.
Ball concludes that while the intent behind AB 3211 is valid, the bill’s execution is flawed. It would benefit from being shelved and revisited with a more nuanced approach. This post offers a critical perspective on the bill and urges readers to consider the complexities of AI policy.
To fully understand the implications of AB 3211 and Ball's arguments, readers are encouraged to read the entire post. His insights are crucial for anyone interested in the future of AI legislation and its impact on the industry.
Agile Methodology
Having experienced the benefits of Agile Methodology in software development, we wonder if it could be applied to legislation. Think big, move fast, start small. Wishful thinking?
Iterative Development: Start by developing a simpler pilot version of the bill with fewer requirements. This initial version should focus on the most critical aspects, such as basic watermarking standards for AI-generated content, and gather feedback from stakeholders.
Sprint Planning: Divide the bill's provisions into manageable sprints. Each sprint could focus on a specific area, like watermarking, database requirements, or chatbot disclosures. Collaborate with AI developers and legal experts to refine each section during these sprints.
Feedback Loops: Establish regular intervals for gathering feedback from AI industry stakeholders, legislators, and the public. Use this feedback to make iterative improvements to the bill.
Cross-Functional Teams: Form cross-functional teams including policymakers, AI experts, legal advisors, and industry representatives to ensure diverse perspectives are considered during the bill's development and refinement process.
Prototyping and Testing: Create prototypes of compliance systems (e.g., watermarking methods) and test them in real-world scenarios to identify potential issues and refine the approaches before full-scale implementation; a toy sketch of such a prototype follows this list.
Continuous Improvement: Adopt a continuous improvement mindset. Even after the bill is passed, remain open to amendments and updates based on technological advancements and feedback from the AI community.
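To make the Prototyping and Testing step concrete, here is a deliberately naive sketch of the kind of prototype a pilot bill could put through real-world testing: an invisible tag built from zero-width characters and appended to generated text. The scheme and names are our own illustration; nothing in AB 3211 or any existing watermarking standard works this way.

```python
# Toy watermark prototype, purely illustrative: an invisible tag encoded in
# zero-width characters and appended to generated text.

ZW_ZERO, ZW_ONE = "\u200b", "\u200c"  # zero-width space, zero-width non-joiner

def embed_watermark(text: str, tag: str = "AI") -> str:
    """Append the tag's bits as invisible characters."""
    bits = "".join(f"{byte:08b}" for byte in tag.encode("utf-8"))
    return text + "".join(ZW_ONE if bit == "1" else ZW_ZERO for bit in bits)

def detect_watermark(text: str) -> str | None:
    """Recover the tag if the invisible suffix survived; otherwise None."""
    bits = "".join("1" if ch == ZW_ONE else "0" for ch in text if ch in (ZW_ZERO, ZW_ONE))
    if not bits or len(bits) % 8:
        return None
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8)).decode("utf-8", "replace")

marked = embed_watermark("The mayor announced a new policy today.")
print(detect_watermark(marked))  # "AI"

# A trivial cleanup pass strips the mark entirely: nowhere near "maximally indelible."
cleaned = "".join(ch for ch in marked if ch not in (ZW_ZERO, ZW_ONE))
print(detect_watermark(cleaned))  # None
```

The value of prototyping is not that a scheme like this is any good; it is trivially removable, which is exactly the false-sense-of-security worry raised above. The point is that testing candidate requirements against real pipelines, such as copy-paste, re-encoding, and screenshots, would surface these robustness limits before they harden into statewide mandates.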
An Agile approach like this would let AB 3211 evolve in a practical, inclusive, and effective manner, addressing both the needs of society and the capabilities of the AI industry. Of course, “agile” is not a word typically associated with government. Your feedback is much appreciated.