April 20, 2026

Artificial Intelligence in the Art Market

Holland & Knight IP/Decode Blog
Neda M. Shaheen | Cindy A. Gierhart

A recent Artsy survey of more than 300 gallery professionals underscores a widening gap between operational adoption of artificial intelligence (AI) and cultural acceptance of AI as an artistic medium. Though galleries are integrating AI into back-office functions, artists, collectors and market professionals remain cautious about AI's role in the art industry.

Works that blend human and machine inputs continue to complicate authorship, ownership and valuation analyses. There is also concern that AI models may favor certain styles or artists, raising questions about market distortion, fairness and copyright infringement. Platforms with access to large datasets may gain disproportionate influence over market visibility. For stakeholders across the art ecosystem, these findings raise intellectual property (IP), privacy and governance considerations.

Key Findings

  1. No Definition of "AI Art" Creates Classification and Disclosure Risk. The survey reveals that galleries lack a uniform definition of "AI art," with approaches ranging from "fully prompt-based" works to broader "AI-assisted" frameworks. This lack of consensus means that debates around AI art are happening without a common vocabulary. Absent standardized definitions, galleries face potential misrepresentation risk, particularly where the use of AI is material to valuation, authenticity or authorship.
  2. Authorship and Training Data Raise IP and Privacy Concerns. A significant proportion of artists and galleries express ethical concerns regarding AI systems trained on existing artworks without consent. Training AI models on copyrighted works without authorization remains an active area of litigation. Additionally, the use of third-party AI tools may involve ingestion of sensitive or proprietary images, raising confidentiality and trade secret concerns.
  3. AI Adoption Is Focused on Operations and Governance. As noted above, the survey indicates that galleries using AI are deploying it primarily for back-office functions such as drafting communications, research and data management, operations and exhibition planning. Even so, nearly one-third of galleries report not using AI in operations at all.
  4. AI Systems Raise Concerns on Competition and Transparency. Galleries are divided on AI as an artistic medium. While few regard AI art as a legitimate new medium in its own right, many see it as either complementary or potentially disruptive. At the same time, collector demand for AI art is low. According to the survey, only 15 percent of galleries reported receiving questions about AI art from collectors, and even collectors who inquire often do not ultimately purchase AI art.
  5. AI as a Tool, Not a Market Category – For Now. Though most artists working with the galleries surveyed do not use AI in their artistic production, it is a key tool in administrative and operational support. Some galleries predict that AI will be absorbed into the art ecosystem, much as photography was, rather than upend it.

Considerations for Clients

In light of the survey findings outlined above, clients operating in or adjacent to the art market may consider the following steps to manage risk and position themselves for the evolving landscape.

  • Assess AI Use Across Your Organization. Conduct an internal review to identify where and how AI tools are being used – whether for administrative tasks, communications, research or creative processes. Understanding your current AI footprint is the first step toward managing associated risks.
  • Establish Clear AI Policies and Definitions. Develop internal guidelines that define what constitutes "AI art" or "AI-assisted work" for your organization. Clear definitions help avoid misrepresentation risks and ensure consistent communication with artists, collectors and partners.
  • Understand the Limitations of Ownership in AI Works. The U.S. Copyright Office and courts have consistently found that AI-generated works are not entitled to copyright registration, as they lack human authorship. See Thaler v. Perlmutter, 687 F. Supp. 3d 140, 150 (D.D.C. 2023), aff'd, 130 F.4th 1039 (D.C. Cir. 2025). A creator of a work that contains human-generated input but also contains AI-generated content can register the work as a whole but must disclaim ownership of the portions that were AI generated. See, e.g., Zarya of the Dawn (letter from the U.S. Copyright Office indicating that a graphic novel's author owns the "text as well as the selection, coordination, and arrangement of the Work's written and visual elements" but not the AI-generated images within the graphic novel). Furthermore, check with your insurance carrier to determine whether AI outputs are covered.
  • Review Contracts and Disclosures. Examine existing agreements with artists, collectors, consignors and service providers to ensure they adequately address AI-related issues, including authorship, IP rights and disclosure obligations. Update standard forms as needed. Be mindful of representations and warranties that you own the works provided under an agreement; if those works are AI-generated, you may not actually own them.
  • Evaluate Data Privacy and Confidentiality Practices. If your organization uses AI tools that process images, client information or proprietary data, assess whether appropriate safeguards are in place. Be mindful of confidentiality obligations when uploading sensitive materials to third-party AI platforms.
  • Do Not Attempt to Copy Known Works or Real People's Images Through AI. Prompting AI to create a replica of another's work and then displaying or distributing that replica (or a substantially similar work) may infringe on the original owner's copyright. Likewise, prompting AI to create a celebrity lookalike and then using the lookalike for a commercial purpose could violate the individual's right of publicity. Additionally, using an AI-generated image of a real person in a nonconsensual intimate depiction may also violate federal or state deepfake laws.
  • Monitor Regulatory and Legal Developments. The legal landscape around AI – particularly regarding copyright, training data and liability – continues to evolve. Stay informed about relevant litigation, regulatory guidance and industry standards that may affect your operations or collection.
  • Engage in Industry Dialogue. As the art market grapples with questions of AI authenticity, valuation and ethics, participating in industry conversations can help shape emerging norms and ensure your voice is represented in the development of best practices.

Conclusion

The 2026 survey confirms that AI is already embedded in the operational fabric of the art market, even as its legitimacy as an artistic medium remains contested. For legal and compliance teams, this divergence is critical: Risk is being created not at the point of artistic innovation, but in the everyday use of AI across business processes.

Organizations that proactively implement governance frameworks, clarify contractual positions and manage data risks will be best positioned to navigate the next phase of AI adoption in the art world.
