from Human in the Loop

It started not with lawyers or legislators, but with a simple question: has my work been trained on? In late 2022, when artists began discovering their distinctive styles could be replicated with a few text prompts, the realisation hit like a freight train. Years of painstaking craft, condensed into algorithmic shortcuts. Livelihoods threatened by systems trained on their own creative output, without permission, without compensation, without even a courtesy notification.

What followed wasn't resignation. It was mobilisation.

Today, visual artists are mounting one of the most significant challenges to the AI industry's data practices, deploying an arsenal of technical tools, legal strategies, and market mechanisms that are reshaping how we think about creative ownership in the age of generative models. From data poisoning techniques that corrupt training datasets to blockchain provenance registries that track artwork usage, from class-action lawsuits against billion-dollar AI companies to voluntary licensing marketplaces, the fight is being waged on multiple fronts simultaneously.

The stakes couldn't be higher. AI image generators trained on datasets containing billions of scraped images have fundamentally disrupted visual art markets. Systems like Stable Diffusion, Midjourney, and DALL-E can produce convincing artwork in seconds, often explicitly mimicking the styles of living artists. Christie's controversial “Augmented Intelligence” auction in February 2025, the first major AI art sale at a prestigious auction house, drew over 6,500 signatures on a petition demanding its cancellation. Meanwhile, more than 400 Hollywood insiders published an open letter pushing back against Google and OpenAI's recommendations for copyright exceptions that would facilitate AI training on creative works.

At the heart of the conflict lies a simple injustice: AI models are typically trained on vast datasets scraped from the internet, pulling in copyrighted material without the consent of original creators. The LAION-5B dataset, which contains 5.85 billion image-text pairs and served as the foundation for Stable Diffusion, became a flashpoint. Artists discovered their life's work embedded in these training sets, essentially teaching machines to replicate their distinctive styles and compete with them in the marketplace.

But unlike previous technological disruptions, this time artists aren't simply protesting. They're building defences.

The Technical Arsenal

When Ben Zhao, a professor of computer science at the University of Chicago, watched artists struggling against AI companies using their work without permission, he decided to fight fire with fire. His team's response was Glaze, a defensive tool that adds imperceptible perturbations to images, essentially cloaking them from AI training algorithms.

The concept is deceptively simple yet technically sophisticated. Glaze makes subtle pixel-level changes, barely noticeable to human eyes, that dramatically confuse machine learning models: where a human viewer sees the artwork essentially unchanged, an AI model might perceive something entirely different. The example Zhao's team uses is striking: human eyes see a shaded image of a cow in a green field largely unchanged, but an AI model trained on that image might instead perceive a large leather purse lying in the grass.
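The underlying idea, bounded pixel perturbations that steer a model's feature representation toward a decoy, can be sketched in a few lines. This is not Glaze's actual algorithm: Glaze targets real diffusion-model encoders and uses perceptual similarity bounds, whereas this sketch uses a fixed random linear map as a stand-in feature extractor, with illustrative parameters throughout.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "feature extractor": a fixed random linear map from a 32x32
# greyscale image to a 64-dimensional feature vector.
W = rng.normal(size=(64, 32 * 32))

def features(img):
    return W @ img.ravel()

def cloak(img, target_feat, eps=4 / 255, steps=50, lr=1 / 255):
    """Nudge img (pixel values in [0, 1]) so its features move toward
    target_feat, while keeping every pixel within eps of the original."""
    x0 = img.ravel()
    x = x0.copy()
    for _ in range(steps):
        # Gradient of ||Wx - t||^2 w.r.t. x is 2 W^T (Wx - t); take its sign.
        grad = 2 * W.T @ (W @ x - target_feat)
        x = x - lr * np.sign(grad)
        x = np.clip(x, x0 - eps, x0 + eps)  # imperceptibility constraint
        x = np.clip(x, 0.0, 1.0)            # stay a valid image
    return x.reshape(img.shape)

img = rng.uniform(size=(32, 32))
decoy = features(rng.uniform(size=(32, 32)))  # features of a decoy "style"
cloaked = cloak(img, decoy)

# Pixels barely move (at most 4/255 each), but the feature representation
# shifts measurably toward the decoy.
print(np.abs(cloaked - img).max())
print(np.linalg.norm(features(cloaked) - decoy)
      < np.linalg.norm(features(img) - decoy))
```

The same tension the article describes is visible even here: a stronger `eps` cloaks more robustly but risks becoming visible, which is why the real tool calibrates perturbations against human perception rather than a fixed per-pixel bound.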

Since launching in March 2023, Glaze has been downloaded more than 7.5 million times, according to 2025 reports. The tool earned recognition as a TIME Best Invention of 2023, won the Chicago Innovation Award, and received the 2023 USENIX Internet Defense Prize. For artists, it represented something rare in the AI age: agency.

But Zhao's team didn't stop at defence. They also built Nightshade, an offensive weapon in the data wars. Whilst Glaze protects individual artists from style mimicry, Nightshade allows artists to collectively disrupt models that scrape their work without consent. By adding specially crafted “poisoned” data to training sets, artists can corrupt AI models, causing them to produce incorrect or nonsensical outputs. Since its release, Nightshade has been downloaded more than 1.6 million times. Shawn Shan, a computer science PhD student who worked on both tools, was named MIT Technology Review Innovator of the Year for 2024.
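The general principle behind data poisoning can be illustrated with a deliberately crude toy. Nightshade's real attack crafts optimised perturbations for text-to-image models and succeeds with far fewer samples; this sketch, in which a "model" learns a concept as the mean of its training examples in a 2-D feature space, only shows that injected mislabelled data shifts what a model associates with a concept. All names and numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two clusters in a 2-D feature space: 'cow' images near (0, 0),
# 'purse' images near (5, 5).
cow_imgs = rng.normal(loc=(0.0, 0.0), scale=0.5, size=(100, 2))
purse_cluster = (5.0, 5.0)

def learn_concept(examples):
    # The toy "model" learns a concept as the mean of its training examples.
    return examples.mean(axis=0)

def dist(a, b):
    return float(np.linalg.norm(np.asarray(a) - np.asarray(b)))

# Clean training: the learned 'cow' sits in cow territory.
clean_cow = learn_concept(cow_imgs)

# Poisoned training: purse-like images are injected into the 'cow' data.
poison = rng.normal(loc=purse_cluster, scale=0.5, size=(150, 2))
poisoned_cow = learn_concept(np.vstack([cow_imgs, poison]))

# The poisoned model's notion of 'cow' now lies closer to purse territory,
# so generating a 'cow' would yield something purse-like.
print(dist(clean_cow, (0, 0)) < dist(clean_cow, purse_cluster))        # True
print(dist(poisoned_cow, purse_cluster) < dist(poisoned_cow, (0, 0)))  # True
```

Note that this naive version needs a majority of poisoned samples to flip the concept; the significance of Nightshade's optimised perturbations is precisely that they achieve comparable drift with far less poisoned data.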

Yet the arms race continues. By 2025, researchers from the University of Texas at San Antonio, University of Cambridge, and Technical University of Darmstadt had developed LightShed, a method capable of bypassing these protections. In experimental evaluations, LightShed detected Nightshade-protected images with 99.98 per cent accuracy and effectively removed the embedded protections.

The developers of Glaze and Nightshade acknowledged this reality from the beginning. As they stated, “it is always possible for techniques we use today to be overcome by a future algorithm, possibly rendering previously protected art vulnerable.” Like any security measure, these tools engage in an ongoing evolutionary battle rather than offering permanent solutions. Still, Glaze 2.1, released in 2025, includes bugfixes and changes to resist newer attacks.

The broader watermarking landscape has similarly exploded with activity. The first Watermarking Workshop at the International Conference on Learning Representations in 2025 received 61 submissions and accepted 51 papers, a dramatic increase from the fewer than 10 watermarking papers submitted just two years earlier.

Major technology companies have also entered the fray. Google developed SynthID through DeepMind, embedding watermarks directly during image generation. OpenAI supports the Coalition for Content Provenance and Authenticity standard, better known as C2PA, which proposes adding encrypted metadata to generated images to enable interoperable provenance verification across platforms.

However, watermarking faces significant limitations. Competition results demonstrated that top teams could remove up to 96 per cent of watermarks, highlighting serious vulnerabilities. Moreover, as researchers noted, “watermarking could eventually be used by artists to opt out of having their work train AI models, but the technique is currently limited by the amount of data required to work properly. An individual artist's work generally lacks the necessary number of data points.”
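To make that fragility concrete, here is a deliberately naive least-significant-bit watermark, a sketch only: production schemes such as SynthID embed signals during generation and are engineered to survive transformations, but the embed-then-detect structure is the same.

```python
import numpy as np

rng = np.random.default_rng(2)

def embed(img, bits):
    """Write watermark bits into the lowest bit of the first len(bits) pixels."""
    out = img.copy().ravel()
    out[:len(bits)] = (out[:len(bits)] & 0xFE) | np.asarray(bits, dtype=np.uint8)
    return out.reshape(img.shape)

def detect(img, n):
    return (img.ravel()[:n] & 1).tolist()

img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
mark = [1, 0, 1, 1, 0, 0, 1, 0]
marked = embed(img, mark)
print(detect(marked, 8) == mark)   # True: the mark reads back intact

# Mild processing (re-quantisation, akin to lossy compression) zeroes the
# low bits, erasing the naive mark entirely:
compressed = ((marked.astype(np.int32) // 4) * 4).astype(np.uint8)
print(detect(compressed, 8) == mark)  # False
```

Robust schemes spread the signal across many pixels or frequency bands rather than single bits, which is why removing them requires the kind of concerted adversarial effort the competitions measured.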

The European Parliament's analysis concluded that “watermarking implemented in isolation will not be sufficient. It will have to be accompanied by other measures, such as mandatory processes of documentation and transparency for foundation models, pre-release testing, third-party auditing, and human rights impact assessments.”

The Legal Front

Whilst technologists built digital defences, lawyers prepared for battle. On 12 January 2023, visual artists Sarah Andersen, Kelly McKernan, and Karla Ortiz filed a landmark class-action lawsuit against Stability AI, Midjourney, and DeviantArt in federal court. The plaintiffs alleged that these companies scraped billions of images from the internet, including their copyrighted works, to train AI platforms without permission or compensation.

Additional artists soon joined, including Hawke Southworth, Grzegorz Rutkowski, Gregory Manchess, Gerald Brom, Jingna Zhang, Julia Kaye, and Adam Ellis. The plaintiffs later amended their complaint to add Runway AI as a defendant.

Then came August 2024, and a watershed moment for artist rights.

US District Judge William Orrick of California ruled that the visual artists could pursue claims that the defendants' image generation systems infringed upon their copyrights. Crucially, Judge Orrick denied Stability AI and Midjourney's motions to dismiss, allowing the case to advance towards discovery, where the inner workings of these AI systems would face unprecedented scrutiny.

In his decision, Judge Orrick found both direct and induced copyright infringement claims plausible. The induced infringement claim against Stability AI proved particularly significant. The plaintiffs argued that by distributing their Stable Diffusion model to other AI providers, Stability AI facilitated the copying of copyrighted material. Judge Orrick noted a damning statement by Stability's CEO, who claimed the company had compressed 100,000 gigabytes of images into a two-gigabyte file that could “recreate” any of those images.

The court also allowed a Lanham Act claim for false endorsement against Midjourney to proceed. Plaintiffs alleged that Midjourney had published their names on a list of artists whose styles its AI product could reproduce and included user-created images incorporating plaintiffs' names on Midjourney's showcase site.

By 2024, the proliferation of generative AI models had spawned well over thirty copyright infringement lawsuits against AI developers. In June 2025, Disney and NBCUniversal escalated the legal warfare, filing a copyright infringement lawsuit against Midjourney, alleging the company used trademarked characters including Elsa, Minions, Darth Vader, and Homer Simpson to train its image model. The involvement of such powerful corporate plaintiffs signalled that artist concerns had gained heavyweight institutional allies.

The legal landscape extended beyond courtroom battles. The Generative AI Copyright Disclosure Act of 2024, introduced in the US Congress on 9 April 2024, proposed requiring companies developing generative AI models to disclose the datasets used to train their systems.

Across the Atlantic, the European Union took a different regulatory approach. The AI Act, which entered into force on 1 August 2024, included specific provisions addressing general purpose AI models. These mandated transparency obligations, particularly regarding technical documentation and content used for training, along with policies to respect EU copyright laws.

Under the AI Act, providers of AI models must comply with the European Union's Copyright Directive (EU) 2019/790. The Act requires AI service providers to publish summaries of material used for model training. Critically, the AI Act's obligation to respect EU copyright law extends to any operator introducing an AI system into the EU, regardless of which jurisdiction the system was trained in.

However, creative industry groups have expressed concerns that the AI Act doesn't go far enough. In August 2025, fifteen cultural organisations wrote to the European Commission stating: “We firmly believe that authors, performers, and creative workers must have the right to decide whether their works can be used by generative AI, and if they consent, they must be fairly remunerated.” European artists launched a campaign called “Stay True To The Act,” calling on the Commission to ensure AI companies are held accountable.

Market Mechanisms

Whilst lawsuits proceeded through courts and protective tools spread through artist communities, a third front opened: the marketplace itself. If AI companies insisted on training models with creative works, perhaps artists could at least be compensated.

The global market for licensing datasets for AI training reached USD 2.1 billion in 2024, with a projected compound annual growth rate of 22.4 per cent. The market for AI datasets and licensing in academic research and publishing specifically was estimated at USD 381.8 million in 2024, and is projected to reach USD 1.59 billion by 2030, growing at 26.8 per cent annually.

North America leads this market, accounting for approximately USD 900 million in 2024, driven by the region's concentration of leading technology companies. Europe represents the second-largest regional market at USD 650 million in 2024.

New platforms have risen to facilitate these transactions. Companies like Pip Labs and Vermillio have launched AI content-licensing marketplaces that enable content creators to monetise their work via paid AI training access. Some major publishers have struck individual deals: HarperCollins forged an agreement with Microsoft to license non-fiction backlist titles for training AI models, offering authors USD 2,500 per book in exchange for a three-year licensing agreement, though many authors criticised the relatively modest compensation.

Perplexity AI's Publishing Programme, launched in July 2024, takes a different approach, offering revenue share based on the number of a publisher's web pages cited in AI-generated responses to user queries.

Yet fundamental questions persist about whether licensing actually serves artists' interests. The power imbalance between individual artists and trillion-dollar technology companies raises doubts about whether genuinely fair negotiations can occur in these marketplaces.

One organisation attempting to shift these dynamics is Fairly Trained, a non-profit that certifies generative AI companies for training data practices that respect creators' rights. Launched on 17 January 2024 by Ed Newton-Rex, a former vice president of audio at Stability AI who resigned over content scraping concerns, Fairly Trained awards its Licensed Model certification to AI operations that have secured licenses for third-party data used to train their models.

The certification is awarded only to generative AI models that use no copyrighted work without a license. It is withheld from models that rely on a “fair use” copyright exception, since reliance on fair use indicates that rights-holders haven't given consent.

Fairly Trained launched with nine generative AI companies already certified: Beatoven.AI, Boomy, BRIA AI, Endel, LifeScore, Rightsify, Somms.ai, Soundful, and Tuney. By 2025, Fairly Trained had expanded its certification to include large language models and voice AI. Industry support came from the Association of American Publishers, Association of Independent Music Publishers, Concord, Pro Sound Effects, Universal Music Group, and the Authors Guild.

Newton-Rex explained the philosophy: “Fairly Trained AI certification is focused on consent from training data providers because we believe related improvements for rights-holders flow from consent: fair compensation, credit for inclusion in datasets, and more.”

The Artists Rights Society proposed a complementary approach: voluntary collective licensing wherein copyright owners affirmatively consent to the use of their copyrighted work. This model, similar to how performing rights organisations like ASCAP and BMI handle music licensing, could provide a streamlined mechanism for AI companies to obtain necessary permissions whilst ensuring artists receive compensation.

Provenance Registries and Blockchain

Beyond immediate protections and licensing, artists have embraced technologies that establish permanent, verifiable records of ownership and creation history. Blockchain-based provenance registries represent an attempt to create immutable documentation that survives across platforms.

Since the first NFT was minted in 2014, digital artists and collectors have praised blockchain technology for its usefulness in tracking provenance. The blockchain serves as an immutable digital ledger that records transactions without the aid of galleries or other centralised institutions.

“Minting” a piece of digital art on blockchain documents the date an artwork is made, stores on-chain metadata descriptions, and links to the crypto wallets of both artist and buyer, thus tracking sales history across future transactions. Christie's partnered with Artory, a blockchain-powered fine art registry, which managed registration processes for artworks. Platforms like The Fine Art Ledger use blockchain and NFTs to securely store ownership and authenticity records whilst producing digital certificates of authenticity.

For artists concerned about AI training, blockchain registries offer several advantages. First, they establish definitive proof of creation date and original authorship, critical evidence in potential copyright disputes. Second, they create verifiable records of usage permissions. Third, smart contracts can encode automatic royalty payments, ensuring artists receive compensation whenever their work changes hands or is licensed.

Artists can secure a resale right of 10 per cent that will be paid automatically every time the work changes hands, since this rule can be written into the code of the smart contract. This programmable aspect gives artists ongoing economic interests in their work's circulation, a dramatic shift from traditional art markets where artists typically profit only from initial sales.
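The article names no specific contract, so the following is an illustrative model in Python rather than an actual smart-contract language, but it shows the logic such a contract encodes: a minting record fixing authorship and creation date, and a resale royalty routed to the artist automatically on every sale. All names are hypothetical.

```python
import hashlib
import time
from dataclasses import dataclass, field

ROYALTY_RATE = 0.10  # the 10 per cent resale right written into the contract

@dataclass
class Artwork:
    artist: str
    image_hash: str            # fingerprint of the artwork file
    minted_at: float           # proof of creation date
    owner: str
    sale_history: list = field(default_factory=list)

class Registry:
    """A toy on-chain registry: minting records plus automatic royalties."""

    def __init__(self):
        self.works = {}
        self.balances = {}

    def mint(self, artist, image_bytes):
        h = hashlib.sha256(image_bytes).hexdigest()
        self.works[h] = Artwork(artist, h, time.time(), owner=artist)
        return h

    def sell(self, image_hash, buyer, price):
        work = self.works[image_hash]
        royalty = price * ROYALTY_RATE
        # The split is enforced by the contract code, not by goodwill:
        self.balances[work.artist] = self.balances.get(work.artist, 0) + royalty
        self.balances[work.owner] = self.balances.get(work.owner, 0) + (price - royalty)
        work.sale_history.append((work.owner, buyer, price))
        work.owner = buyer

registry = Registry()
h = registry.mint("artist", b"...artwork bytes...")
registry.sell(h, "collector_a", 1000)  # primary sale: artist is also the seller
registry.sell(h, "collector_b", 5000)  # resale: 500 flows back to the artist
print(registry.balances["artist"])     # 1500.0
```

Because the royalty transfer happens inside `sell`, no later owner can opt out of it, which is precisely the shift from traditional markets the article describes.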

However, blockchain provenance systems face significant challenges. The ownership of an NFT as defined by the blockchain has no inherent legal meaning and does not necessarily grant copyright, intellectual property rights, or other legal rights over its associated digital file.

Legal frameworks are slowly catching up. The March 2024 joint report by the US Copyright Office and Patent and Trademark Office on NFTs and intellectual property took a comprehensive look at how copyright, trademark, and patent laws intersect with NFTs. The report did not recommend new legislation, finding that existing IP law is generally capable of handling NFT disputes.

Illegal minting has become a major issue, with works tokenised without their creators' consent. Piracy losses in the NFT industry are estimated at between USD 1 billion and USD 2 billion per year. As of 2025, no NFT-specific federal legislation exists in the US, though general laws can be invoked.

Beyond blockchain, more centralised provenance systems have emerged. Adobe's Content Credentials, based on the C2PA standard, provides cryptographically signed metadata that travels with images across platforms. The system allows creators to attach information about authorship, creation tools, editing history, and critically, their preferences regarding AI training.

Adobe Content Authenticity, released as a public beta in Q1 2025, enables creators to include generative AI training and usage preferences in their Content Credentials. The preference lets creators request that generative AI models not train on or use their work. Content Credentials are available in Adobe Photoshop, Lightroom, Stock, and Premiere Pro.

The “Do Not Train” preference is currently supported by Adobe Firefly and Spawning, though whether other developers will respect these credentials remains uncertain. However, the preference setting makes it explicit that the creator did not want their work used to train AI models, information that could prove valuable in future lawsuits or regulatory enforcement actions.
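The mechanism behind such credentials can be sketched as follows. Real C2PA manifests use certificate-based signatures (COSE with X.509 chains), not the shared HMAC key used here; this toy only shows how a training preference, once cryptographically bound to the image bytes, makes tampering detectable. Key, author, and field names are illustrative.

```python
import hashlib
import hmac
import json

SIGNING_KEY = b"creator-signing-key"  # hypothetical key, for illustration only

def attach_credentials(image_bytes, author, ai_training="not_allowed"):
    manifest = {
        "author": author,
        "content_hash": hashlib.sha256(image_bytes).hexdigest(),
        "ai_training": ai_training,  # the 'Do Not Train' preference
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    signature = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return {"manifest": manifest, "signature": signature}

def verify(image_bytes, credentials):
    manifest = credentials["manifest"]
    payload = json.dumps(manifest, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, credentials["signature"])
            and manifest["content_hash"] == hashlib.sha256(image_bytes).hexdigest())

img = b"...image bytes..."
creds = attach_credentials(img, author="Jane Artist")
print(verify(img, creds))                      # True: credential intact

creds["manifest"]["ai_training"] = "allowed"   # a scraper flips the preference...
print(verify(img, creds))                      # False: tampering breaks the signature
```

The signature cannot force a scraper to honour the preference, but it does create exactly the kind of verifiable evidence of the creator's stated intent that the article suggests could matter in litigation or enforcement.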

What's Actually Working

With technical tools, legal strategies, licensing marketplaces, and provenance systems all in play, a critical question emerges: what's actually effective?

The answer is frustratingly complex. No single mechanism has proven sufficient, but combinations show promise, and the mere existence of multiple defensive options has shifted AI companies' behaviour.

On the technical front, Glaze and Nightshade have achieved the most widespread adoption among protection tools, with combined downloads exceeding nine million. Whilst researchers demonstrated vulnerabilities, the tools have forced AI companies to acknowledge artist concerns and, in some cases, adjust practices. The computational cost of bypassing these protections at scale creates friction that matters.

Watermarking faces steeper challenges. The ability of adversarial attacks to remove 96 per cent of watermarks in competition settings demonstrates fundamental weaknesses. Industry observers increasingly view watermarking as one component of multi-layered approaches rather than a standalone solution.

Legally, the August 2024 Andersen ruling represents the most significant victory to date. Allowing copyright infringement claims to proceed towards discovery forces AI companies to disclose training practices, creating transparency that didn't previously exist. The involvement of major corporate plaintiffs like Disney and NBCUniversal in subsequent cases amplifies pressure on AI companies.

Regulatory developments, particularly the EU AI Act, create baseline transparency requirements that didn't exist before. The obligation to disclose training data summaries and respect copyright reservations establishes minimum standards, though enforcement mechanisms remain to be tested.

Licensing marketplaces present mixed results. Established publishers have extracted meaningful payments from AI companies, but individual artists often receive modest compensation. The HarperCollins deal's USD 2,500-per-book payment exemplifies this imbalance.

Fairly Trained certification offers a market-based alternative that shows early promise. By creating reputational incentives for ethical data practices, the certification enables consumers and businesses to support AI systems that respect creator rights. The expanding roster of certified companies demonstrates market demand for ethically trained models.

Provenance systems like blockchain registries and Content Credentials establish valuable documentation but depend on voluntary respect by AI developers. Their greatest value may prove evidentiary, providing clear records of ownership and permissions that strengthen legal cases rather than preventing unauthorised use directly.

The most effective approach emerging from early battles combines multiple mechanisms simultaneously: technical protections like Glaze to raise the cost of unauthorised use, legal pressure through class actions to force transparency, market alternatives through licensing platforms to enable consent-based uses, and provenance systems to document ownership and preferences. This defence-in-depth strategy mirrors cybersecurity principles, where layered defences significantly raise attacker costs and reduce success rates.

Why Independent Artists Struggle to Adopt Protections

Despite the availability of protection mechanisms, independent artists face substantial barriers to adoption.

The most obvious barrier is cost. Whilst some tools like Glaze and Nightshade are free, they require significant computational resources to process images. Artists with large portfolios face substantial electricity costs and processing time. More sophisticated protection services, licensing platforms, and legal consultations carry fees that many independent artists cannot afford.

Technical complexity presents another hurdle. Tools like Glaze require some understanding of how machine learning works. Blockchain platforms demand familiarity with cryptocurrency wallets, gas fees, and smart contracts. Content Credentials require knowledge of metadata standards and platform support. Many artists simply want to create and share their work, not become technologists.

Time investment compounds these challenges. An artist with thousands of existing images across multiple platforms faces an overwhelming task to retroactively protect their catalogue. Processing times for tools like Glaze can be substantial, turning protection into a full-time job when applied to extensive portfolios.

Platform fragmentation creates additional friction. An artist might post work to Instagram, DeviantArt, ArtStation, personal websites, and client platforms. Each has different capabilities for preserving protective measures. Metadata might be stripped during upload. Blockchain certificates might not display properly. Technical protections might degrade through platform compression.

The effectiveness uncertainty further dampens adoption. Artists read about researchers bypassing Glaze, competitions removing watermarks, and AI companies scraping despite “Do Not Train” flags. When protections can be circumvented, the effort to apply them seems questionable.

Legal uncertainty compounds technical doubts. Even with protections applied, artists lack clarity about their legal rights. Will courts uphold copyright claims against AI training? Does fair use protect AI companies? These unanswered questions make it difficult to assess whether protective measures truly reduce risk.

The collective action problem presents perhaps the most fundamental barrier. Individual artists protecting their work provides minimal benefit if millions of other works remain available for scraping. Like herd immunity in epidemiology, effective resistance to unauthorised AI training requires widespread adoption. But individual artists lack incentives to be first movers, especially given the costs and uncertainties involved.

Social and economic precarity intensifies these challenges. Many visual artists work in financially unstable conditions, juggling multiple income streams whilst trying to maintain creative practices. Adding complex technological and legal tasks to already overwhelming workloads proves impractical for many. The artists most vulnerable to AI displacement often have the least capacity to deploy sophisticated protections.

Information asymmetry creates an additional obstacle. AI companies possess vast technical expertise, legal teams, and resources to navigate complex technological and regulatory landscapes. Individual artists typically lack this knowledge base, creating substantial disadvantages.

These barriers fundamentally determine which artists can effectively resist unauthorised AI training and which remain vulnerable. The protection mechanisms available today primarily serve artists with sufficient technical knowledge, financial resources, time availability, and social capital to navigate complex systems.

Incentivising Provenance-Aware Practices

If the barriers to adoption are substantial, how might platforms and collectors incentivise provenance-aware practices that benefit artists?

Platforms hold enormous power to shift norms and practices. They could implement default protections, applying tools like Glaze automatically to uploaded artwork unless artists opt out, inverting the current burden. They could preserve metadata and Content Credentials rather than stripping them during upload processing. They could create prominent badging systems that highlight provenance-verified works, giving them greater visibility in recommendation algorithms.

Economic incentives could flow through platform choices. Verified provenance could unlock premium features, higher placement in search results, or access to exclusive opportunities. Platforms could create marketplace advantages for artists who adopt protective measures, making verification economically rational.

Legal commitments by platforms would strengthen protections substantially. Platforms could contractually commit not to license user-uploaded content for AI training without explicit opt-in consent. They could implement robust takedown procedures for AI-generated works that infringe verified provenance records.

Technical infrastructure investments by platforms could dramatically reduce artist burdens. Computing costs for applying protections could be subsidised or absorbed entirely. Bulk processing tools could protect entire portfolios with single clicks. Cross-platform synchronisation could ensure protections apply consistently.

Educational initiatives could address knowledge gaps. Platforms could provide clear, accessible tutorials on using protective tools, understanding legal rights, and navigating licensing options.

Collectors and galleries likewise can incentivise provenance practices. Premium pricing for provenance-verified works signals market value for documented authenticity and ethical practices. Collectors building reputations around ethically sourced collections create demand-side pull for proper documentation. Galleries could require provenance verification as a condition of representation.

Resale royalty enforcement through smart contracts gives artists ongoing economic interests in their work's circulation. Collectors who voluntarily honour these arrangements, even when not legally required, demonstrate commitment to sustainable creative economies.

Provenance-focused exhibitions and collections create cultural cachet around verified works. When major museums and galleries highlight blockchain-verified provenance or Content Credentials in their materials, they signal that professional legitimacy increasingly requires robust documentation.

Philanthropic and institutional support could subsidise protection costs for artists who cannot afford them. Foundations could fund free access to premium protective services. Arts organisations could provide technical assistance. Grant programmes could explicitly reward provenance-aware practices.

Industry standards and collective action amplify individual efforts. Professional associations could establish best practices that members commit to upholding. Cross-platform alliances could create unified approaches to metadata preservation and “Do Not Train” flags, reducing fragmentation. Collective licensing organisations could streamline permissions whilst ensuring compensation.

Government regulation could mandate certain practices. Requirements that platforms preserve metadata and Content Credentials would eliminate current stripping practices. Opt-in requirements for AI training, as emerging in EU regulation, shift default assumptions about consent. Disclosure requirements for training datasets enable artists to discover unauthorised use.

The most promising approaches combine multiple incentive types simultaneously. A platform that implements default protections, preserves metadata, provides economic advantages for verified works, subsidises computational costs, offers accessible education, and commits contractually to respecting artist preferences creates a comprehensively supportive environment.

Similarly, an art market ecosystem where collectors pay premiums for verified provenance, galleries require documentation for representation, museums highlight ethical sourcing, foundations subsidise protection costs, professional associations establish standards, and regulations mandate baseline practices would make provenance-aware approaches the norm rather than the exception.

An Unsettled Future

The battle over AI training on visual art remains fundamentally unresolved. Legal cases continue through courts without final judgments. Technical tools evolve in ongoing arms races with circumvention methods. Regulatory frameworks take shape but face implementation challenges. Market mechanisms develop but struggle with power imbalances.

What has changed is the end of the initial free-for-all period when AI companies could scrape with impunity, face no organised resistance, and operate without transparency requirements. Artists mobilised, built tools, filed lawsuits, demanded regulations, and created alternative economic models. The costs of unauthorised use, both legal and reputational, increased substantially.

The effectiveness of current mechanisms remains limited when deployed individually, but combinations show promise. The mere existence of resistance shifted some AI company behaviour, with certain developers now seeking licenses, supporting provenance standards, or training only on permissioned datasets. Fairly Trained's growing roster demonstrates market demand for ethically sourced AI.

Yet fundamental challenges persist. Power asymmetries between artists and technology companies remain vast. Technical protections face circumvention. Legal frameworks develop slowly whilst technology advances rapidly. Economic models struggle to provide fair compensation at scale. Independent artists face barriers that exclude many from available protections.

The path forward likely involves continued evolution across all fronts. Technical tools will improve whilst facing new attacks. Legal precedents will gradually clarify applicable standards. Regulations will impose transparency and consent requirements. Markets will develop more sophisticated licensing and compensation mechanisms. Provenance systems will become more widely adopted as cultural norms shift.

But none of this is inevitable. It requires sustained pressure from artists, support from platforms and collectors, sympathetic legal interpretations, effective regulation, and continued technical innovation. The mobilisation that began in 2022 must persist and adapt.

What's certain is that visual artists are no longer passive victims of technological change. They're fighting back with ingenuity, determination, and an expanding toolkit. Whether that proves sufficient to protect creative livelihoods and ensure fair compensation remains to be seen. But the battle lines are drawn, the mechanisms are deployed, and the outcome will shape not just visual art, but how we conceive of creative ownership in the algorithmic age.

The question posed at the beginning was simple: has my work been trained? The response from artists is now equally clear: not without a fight.


References and Sources

Artists Rights Society. (2024-2025). AI Updates. https://arsny.com/ai-updates/

Artnet News. (2024). 4 Ways A.I. Impacted the Art Industry in 2024. https://news.artnet.com/art-world/a-i-art-industry-2024-2591678

Arts Law Centre of Australia. (2024). Glaze and Nightshade: How artists are taking arms against AI scraping. https://www.artslaw.com.au/glaze-and-nightshade-how-artists-are-taking-arms-against-ai-scraping/

Authors Guild. (2024). Authors Guild Supports New Fairly Trained Licensing Model to Ensure Consent in Generative AI Training. https://authorsguild.org/news/ag-supports-fairly-trained-ai-licensing-model/

Brookings Institution. (2024). AI and the visual arts: The case for copyright protection. https://www.brookings.edu/articles/ai-and-the-visual-arts-the-case-for-copyright-protection/

Bruegel. (2025). The European Union is still caught in an AI copyright bind. https://www.bruegel.org/analysis/european-union-still-caught-ai-copyright-bind

Center for Art Law. (2024). AI and Artists' IP: Exploring Copyright Infringement Allegations in Andersen v. Stability AI Ltd. https://itsartlaw.org/art-law/artificial-intelligence-and-artists-intellectual-property-unpacking-copyright-infringement-allegations-in-andersen-v-stability-ai-ltd/

Copyright Alliance. (2024). AI Lawsuit Developments in 2024: A Year in Review. https://copyrightalliance.org/ai-lawsuit-developments-2024-review/

Digital Content Next. (2025). AI content licensing lessons from Factiva and TIME. https://digitalcontentnext.org/blog/2025/03/06/ai-content-licensing-lessons-from-factiva-and-time/

Euronews. (2025). EU AI Act doesn't do enough to protect artists' copyright, groups say. https://www.euronews.com/next/2025/08/02/eus-ai-act-doesnt-do-enough-to-protect-artists-copyright-creative-groups-say

European Copyright Society. (2025). Copyright and Generative AI: Opinion of the European Copyright Society. https://europeancopyrightsociety.org/wp-content/uploads/2025/02/ecs_opinion_genai_january2025.pdf

European Commission. (2024). AI Act | Shaping Europe's digital future. https://digital-strategy.ec.europa.eu/en/policies/regulatory-framework-ai

Fairly Trained. (2024). Fairly Trained launches certification for generative AI models that respect creators' rights. https://www.fairlytrained.org/blog/fairly-trained-launches-certification-for-generative-ai-models-that-respect-creators-rights

Gemini. (2024). NFT Art on the Blockchain: Art Provenance. https://www.gemini.com/cryptopedia/fine-art-on-the-blockchain-nft-crypto

Glaze. (2023-2024). Glaze: Protecting Artists from Generative AI. https://glaze.cs.uchicago.edu/

Hollywood Reporter. (2024). AI Companies Take Hit as Judge Says Artists Have “Public Interest” In Pursuing Lawsuits. https://www.hollywoodreporter.com/business/business-news/artist-lawsuit-ai-midjourney-art-1235821096/

Hugging Face. (2025). Highlights from the First ICLR 2025 Watermarking Workshop. https://huggingface.co/blog/hadyelsahar/watermarking-iclr2025

IEEE Spectrum. (2024). With AI Watermarking, Creators Strike Back. https://spectrum.ieee.org/watermark-ai

IFPI. (2025). European artists unite in powerful campaign urging policymakers to 'Stay True To the [AI] Act'. https://www.ifpi.org/european-artists-unite-in-powerful-campaign-urging-policymakers-to-stay-true-to-the-ai-act/

JIPEL. (2024). Andersen v. Stability AI: The Landmark Case Unpacking the Copyright Risks of AI Image Generators. https://jipel.law.nyu.edu/andersen-v-stability-ai-the-landmark-case-unpacking-the-copyright-risks-of-ai-image-generators/

MIT Technology Review. (2023). This new data poisoning tool lets artists fight back against generative AI. https://www.technologyreview.com/2023/10/23/1082189/data-poisoning-artists-fight-generative-ai/

MIT Technology Review. (2024). The AI lab waging a guerrilla war over exploitative AI. https://www.technologyreview.com/2024/11/13/1106837/ai-data-posioning-nightshade-glaze-art-university-of-chicago-exploitation/

Monda. (2024). Ultimate List of Data Licensing Deals for AI. https://www.monda.ai/blog/ultimate-list-of-data-licensing-deals-for-ai

Nightshade. (2023-2024). Nightshade: Protecting Copyright. https://nightshade.cs.uchicago.edu/whatis.html

Tech Policy Press. (2024). AI Training, the Licensing Mirage, and Effective Alternatives to Support Creative Workers. https://www.techpolicy.press/ai-training-the-licensing-mirage-and-effective-alternatives-to-support-creative-workers/

The Fine Art Ledger. (2024). Mastering Art Provenance: How Blockchain and Digital Registries Can Future-Proof Your Fine Art Collection. https://www.thefineartledger.com/post/mastering-art-provenance-how-blockchain-and-digital-registries

The Register. (2024). Non-profit certifies AI models that license scraped data. https://www.theregister.com/2024/01/19/fairly_trained_ai_certification_scheme/

University of Chicago Maroon. (2024). Guardians of Creativity: Glaze and Nightshade Forge New Frontiers in AI Defence for Artists. https://chicagomaroon.com/42054/news/guardians-of-creativity-glaze-and-nightshade-forge-new-frontiers-in-ai-defense-for-artists/

University of Southern California IP & Technology Law Society. (2025). AI, Copyright, and the Law: The Ongoing Battle Over Intellectual Property Rights. https://sites.usc.edu/iptls/2025/02/04/ai-copyright-and-the-law-the-ongoing-battle-over-intellectual-property-rights/

UTSA Today. (2025). Researchers show AI art protection tools still leave creators at risk. https://www.utsa.edu/today/2025/06/story/AI-art-protection-tools-still-leave-creators-at-risk.html

Adobe. (2024-2025). Learn about Content Credentials in Photoshop. https://helpx.adobe.com/photoshop/using/content-credentials.html

Adobe. (2024). Media Alert: Adobe Introduces Adobe Content Authenticity Web App to Champion Creator Protection and Attribution. https://news.adobe.com/news/2024/10/aca-announcement


Tim Green

UK-based Systems Theorist & Independent Technology Writer

Tim explores the intersections of artificial intelligence, decentralised cognition, and posthuman ethics. His work, published at smarterarticles.co.uk, challenges dominant narratives of technological progress while proposing interdisciplinary frameworks for collective intelligence and digital stewardship.

His writing has been featured on Ground News and shared by independent researchers across both academic and technological communities.

ORCID: 0009-0002-0156-9795
Email: tim@smarterarticles.co.uk

 

from Larry's 100

Merry Christmas, Ted Cooper
Hallmark, 2025 (4 out of 5 Hot Chocolates)

Read more #100HotChocolates reviews

The season’s first banger. Atypical male lead, fresh story beats, and riffing on zany Anchorman-style comedies. The romance is a Christmas Ale buzz. You root for leads Ted and Hope.

Supporting cast matters in holiday movies. From Ted’s high-strung sibling to the Sole Sisters, an a-hole newsman nemesis, and Hope’s straight-talkin' coworker, this movie has a fruitcake of an ensemble.

The plot had a third-act problem with an asinine Three’s Company miscommunication “conflict.” Hope is unnecessarily mean, as she delivers a brutal, undeserved Ted takedown. Justice for Ted! But the hot, handsy, elongated on-camera make-out scene saved Christmas. Watch it.

Ted

#movies #ChristmasMovies #HallmarkMovies #RomCom #HolidayMovies #100HotChocolates #ChristmasReview #100WordReview #Larrys100 #100DaysToOffload

 

from Shad0w's Echos

Rayeanna; Baddie, Caretaker, Sage

#nsfw #glass

Rayeanna is twenty-seven — young but already seasoned in the art of tending to people who can’t tend to themselves. She works long shifts as a night nurse — psych patients, elderly patients, the weeping, the furious. She’s seen breakdowns and bathroom floors. But never like this. Never a thin, pale white woman half-possessed by a porn loop and a secret worship that leaks out of her in unstoppable waves. Rayeanna’s first instinct is to be ready: she checks her battle purse — sanitizer wipes, a comb, travel tissues, a small pink canister of mace, a pink taser tucked behind her phone charger. If this Karen flips the switch from helpless to hateful, she’ll be ready. But right now, all she sees is a woman so cracked open by her own secrets that she can’t even stand up straight.

Rayeanna wipes Meredith’s cheeks first — gently, with a mother’s touch. Meredith hasn’t felt this kind of touch since her own mother died in a hospice bed fifteen years ago. She wipes away sweat from her upper lip, tears from her chin. She pulls a clean sanitary wipe from the pouch and presses it into Meredith’s clammy palm. “Here, baby. Clean your hands. Come on.” Her voice is low, soft — but there’s steel under it. She can command a grown man in a full psych break to sit the hell down. She can handle this.

Meredith stares at her, wide-eyed, trembling. Her mind tries to form the usual armor — the frost, the cutting line about privacy or decency. But it won’t come. She just stares at the soft slope of Rayeanna’s belly in the tight dusty pink one-piece. The faint sheen of gloss on her full lips. The scent — warm coconut and faint summer sweat — that makes Meredith’s thighs twitch even now.

Rayeanna pulls a small comb from her bag, smooths a few strands of Meredith’s hair away from her forehead. No one has done this for Meredith since she was a child. Not a hug — not an empty “there, there” — but real touch. Real, practical mercy. She wants to weep again.

They walk out of the restroom. Meredith, full of shame, steps back into the light. She’s a hot mess, but at least her arousal has subsided to a dull throb. She can focus. She can walk. Slowly.

When Meredith’s knees stop buckling, Rayeanna watches her carefully, taking silent stock of her slow breaths, her pink, raw eyes, her sudden stutters. Meredith wants to get away. Get to her car. Escape this embarrassment. Rayeanna knows this energy all too well once she sees where the woman is trying to go.

“You’re not driving,” Rayeanna says, flat and kind all at once. “No way. We’re gonna sit. Over there.” She nods toward a shaded gazebo — half-broken picnic table under peeling white paint. Safe enough. Open enough. She doesn’t trust Meredith not to snap, but the woman looks more like she’ll break herself than anyone else.

They walk together. Meredith clasps her bag to her chest like a child hugging a stuffed animal. Her phone — still open to black porn loops — buzzes in her pocket. She looks longingly at the screen one more time before she puts her phone on mute and locks the screen. Rayeanna sees this and takes note of it. Clearly this woman has some sort of porn problem and needs serious mental help.

They sit. Rayeanna crosses her legs, stays close but not too close — her free hand resting on her purse in case she needs the taser. Meredith opens her mouth. The first words tumble out clumsy, jumbled. But once they start, they don’t stop:

She talks about the hotel pay-per-view when she was barely twenty, drunk with her first husband’s snores in the next bed. How she watched a black woman ride and laugh and shine under cheap lamplight and how something inside her broke — or maybe opened.

She talks about her carefully pruned marriages — good on paper, dead in bed. How men called her frigid, cold, hollow. They were right, she says, they were all right — because she didn’t want them. She only wanted black porn.

She explains carefully how it’s not some blacked or BBC cuck fantasy — she hates that trash. She sees how cringe it is, how forced it feels, like a corporate machine of manipulation built to maintain racism and oppress black people.

Rayeanna blinks. How in the heck did this HOA suburban queen, who’s probably never talked to a black person, get this right? She has to pay more attention now.

She wants the real bodies, the real softness, the real wildness she’ll never have for herself. “Not tools, not toys — goddesses.” She says the word like a prayer that can’t stop falling from her lips.

She admits how she ruined herself with a pagan ritual. Clearly she can’t trust herself in public anymore. That’s why she’s like this now. Openly admitting she has a dull aching sexual throb just by being in Rayeanna’s presence. Rayeanna raises an eyebrow, her hand silently arming the taser in her purse.

Meredith continues her confession about how the pagan ritual was successful, but clearly too successful… how it failed her and turned her into a degenerate pervert stalker. She was commanded by the spirits to go across town. They told her she needed to see black women in person for the first time. How she had to steal glances at black women with her eyes, with her ears, with her phone playing porn in her headphones as she went through the park.

She knows how sick and depraved it all sounds. She says she’s broken, fundamentally. She knows it. She half expects Rayeanna to spit in her face. Or hit her. Or scream.

She almost hopes for it. She deserves it. Meredith closes her eyes, expecting violence, certain she has enraged this stranger.

Rayeanna listens. She doesn’t interrupt, doesn’t flinch, doesn’t edge away. The whole time her dark eyes stay locked on Meredith’s flushed, mortified and defeated face.

When it’s over, Meredith’s voice cracks. She wipes her nose with the back of her hand. “I’m so sorry. I know what this looks like. I’m sick. I’d pay you — if you’d keep quiet. Or — or if you’d just… talk to me. Please.” She looks so small now, so paper-thin in her dirty prim blouse and pearls that don’t fit her sins.

Her thighs still ache. Her clit still pulsates with her heartbeat. She keeps telling herself it’s normal and she wants this, but at the same time, she accepts she was not careful what she wished for. For now, the porn demon is silent — waiting to see what this real goddess will do.

Rayeanna takes a slow breath. She feels pity. Confusion. A flicker of disgust she buries under layers of her nurse’s professional calm.

But also… something else. A dark, unwanted sympathy. She knows what it means to chase things you’ll never be — to crave softness, realness, life beyond what the world says you’re allowed to have. She’s been fetishized before — ugly, entitled hands, cheap dirty talk, black girl jokes in bed. But this? This isn’t that.

This is worse, in a way — sadder. This broken Karen didn’t want to use her. She wanted to be her. Or worship her. Or both.

Rayeanna sighs. Her eyes soften, even as her fingers rest on the armed taser inside her purse. She leans in, just enough for Meredith to smell that warm coconut scent again.

“Okay,” she says, voice quiet but unflinching. “I’m not gonna beat you. I’m not gonna run. But you need help. Real help. And you need to promise me you’ll listen. Do you hear me?”

Meredith nods so fast it looks like her neck might snap. Her eyes glisten. She looks at Rayeanna the way a dying thing looks at a cup of water. “I’ll listen,” Meredith whispers. “I’ll listen to anything. I swear. Please help me if you can.”

And for the first time in Meredith’s life, the word please tastes clean on her tongue.

Rayeanna sits opposite Meredith on the cracked bench, legs crossed, purse tucked tight against her hip. She watches Meredith’s trembling hands, her bitten lip, her watery eyes. She sees the thing inside her — the itch that never sleeps, the shine behind her pupils. It makes her stomach knot. Rayeanna just needed a closer look to confirm. Meredith just thought it was a gesture of kindness.

Rayeanna can’t stop thinking about those words… ‘pagan ritual’. It pulls at a dark corner of her mind — something old her grandmother whispered when she was little, stories she brushed off as old island fears: Spirits that listen if you talk too loud. Spirits that wake if you knock on the wrong door.

Rayeanna clears her throat. “I need you to tell me exactly what you did. Not the porn part. Not the, um… gooning. The ritual. Step by step. What did you say? What did you light? What did you promise?”

Meredith shifts under her prim skirt. She notices the warmth trickling down her inner thigh, a sign that her arousal is far from normal. However, she doesn’t notice the faint, sweet scent drifting up. Like sugar lilies, like jasmine after rain, the kind of scent that should come from a candle — not her own leaking cunt.

She doesn’t smell it. She’s probably too humiliated to care or notice. But Rayeanna does. She frowns, subtle, but doesn’t interrupt. She does see the puddle forming at Meredith’s feet but doesn’t acknowledge it. Rayeanna has already seen the impossible: the pulsating pussy of something otherworldly, leaking something no grown woman should be capable of in deep arousal. She’s just never seen any kind of spiritual possession like this before, and she has seen a lot.

Meredith clutches her phone with both hands. Her voice trembles. “I… I found it on this board. An old thread — just old porn addicts, some of them said they’d tried it. It was just dumb words. Like an incantation. They called it a devotional goon binding. Said it makes the urge stronger but contained. Makes you pure for what you worship.”

She shudders at her own phrasing — but pushes through. “I bought candles — black ones, purple ones from a wiccan site. I drew this shape on my mirror like the guide said. It looked like an eye. I sat naked with my laptop in front of it — folders of all my favorite porn clips. I lit the candles and kept whispering: Make me pure for them. Make me need them more than air. Make me useless for anything else but this. Over and over. For hours. I masturbated the whole time. Edging. Begging. Wanting.”

Her voice drops, hoarse. “I came once. And then again. And then I woke up and I couldn’t stop. And now… now it’s like something’s watching. Or… feeding.”

Meredith is too far gone to care at this point. Her confession has triggered her arousal to full tilt. The small bloom of wet pussy juice darkens the bench under her skirt. But Rayeanna’s eyes flick down, just once. She catches the faint shimmer trailing down Meredith’s knee — a thin line that glistens in the hot daylight like dew. The scent is unmistakable: sweet. Too sweet for sweat. Too floral for sex alone. It is an omen. Death is coming for this woman.

Rayeanna’s heart knocks against her ribs. She thinks of her grandmother’s cracked voice, the soft Haitian Creole prayers before bed: Some spirits stick to the desperate. Some pacts stick to the unclean. She glances at Meredith’s face — all sharp cheekbones and watery shame, but her body is ripe. She’s leaking like an overripe fruit under a polished shell. And the smell makes the little hairs rise on Rayeanna’s neck.

“Did you close it?” Rayeanna asks, voice low. “Did you thank it? Did you break the circle?”

Meredith shakes her head, confused. “No… the post didn’t say anything about that. I just… passed out masturbating while watching porn in the circle.”

A drop of her sweetness lands on the wooden slat between her shoes. Rayeanna shifts her purse closer. One hand finds the cool edge of the comb, the other brushes the taser.

‘Old spirits love a fool,’ her grandmother used to say. ‘But they’ll bleed you dry if you don’t pay them right.’

Meredith is not paying in blood this time. She is paying with her sexual essence and whatever is left of her soul, slowly melting out of her pussy. And she doesn’t even know how much danger she is truly in.

Rayeanna leans in, close enough to smell the bloom of Meredith’s living perfume. “Honey… you didn’t bind anything. You fed it. You opened something. And now it’s feeding on you.”

Meredith’s eyes flicker — the words land, but her thighs twitch too. The idea that something that doesn’t belong is alive inside her, sucking at her bones, makes her wet again. A ghost moan rattles behind her tongue.

Neither of them fully notice it yet. But the ritual does. It weaves through Meredith’s bloodstream — an old, sticky echo of an idea she could never pronounce right. It likes being spoken of. It likes being named. Acknowledgement gives it power.

It kisses her pulse points with the smell of warm, blooming flowers — sweet, fertile, impossible.

And both women feel it at the edges of their skin: Something alive. Something awake. Something that wants to be fed.

Rayeanna has encountered dark spirits like this before. She has to save this poor naive woman.

 

from sun scriptorium

libra star, pierced the ages 'fore crystallise patterns into rings scales adjusting: what betrayals come to pass?

yet

rings will spin and splinter dark the spear haft cleanly through three times three times three more eons the elder you, a kind of current rivers do —

again.

panic i will not, then to see another wound and trust a misted trail rumbling against the orbit:

(again)

[#2025dec the 12th, #wander]

 

from Ladys Album Of The Week

Cover art: A partially cloudy sky, dynamic in appearance, sun obscured.

On Bandcamp. On MusicBrainz.

For the second of my “albums released in 2025 which you should really listen to before the year is out” recommendations, Yosi Horikawa’s Impulse is a no-brainer. I’ve been following Horikawa’s output ever since Wandering was released in 2012, and I’m happy to say that this latest album holds up against the high bar set by its predecessors.

Each Yosi Horikawa album has its own feel: Vapor, in 2013, was accessible and familiar, driven by beats and sonic textures; Spaces, in 2019, was more experimental, leaning into Horikawa’s use of field recordings and elevating a sense of place. Impulse lands somewhere between the two, but is more than just a compromise; rather, it feels like a refinement, a homing in on a feeling and a sound.

Favourite track: Some of my favourite Yosi Horikawa songs are the ones which make use of vocals, but I think “Snow Bird” deserves special mention for taking its vocals not from a human but from a bird. Of course, this is an appropriation of bird speech for human ears, but it serves as an important reminder that we are far from the progenitors of melody or sound. Computers and technology may give us the ability to arrange and understand these things in a different way, but this music is, in some way, just a new form of appreciation for what has always been there.

#AlbumOfTheWeek

 

from The Understory

This year, I didn’t break any records in terms of total number of books read, but I did have fun and learn a few things. Here are some of my favorite reads from this past year.

Fiction

The Mistborn Saga by Brandon Sanderson

Finished:

This was my first Brandon Sanderson series, and I was floored by the depth of the world-building, the complexity of the magic system, and the richness of the characters. My partner and I started the first book together, but lost steam midway through due to the lengthy dialogue and slower pacing. I came back a few weeks later and realized what we were missing. I made it through the first two books this year, and I’m eagerly looking forward to finishing the third soon.

The Dungeon Crawler Carl Series by Matt Dinniman

Finished:

This was my favorite series of the year! Carl and his girlfriend’s cat, Princess Donut, fight to survive after an alien corporation restructures the Earth’s crust into a multi-level dungeon game show. The writing is creative, engaging, and *hilarious*—I couldn’t put the books down once I started. The next book is scheduled to release in June of 2026.

The Bobiverse Series by Dennis E. Taylor

Finished:

This was a fun series. The protagonist, Bob, is resurrected as an artificial intelligence installed in a Von Neumann probe after being cryogenically frozen. The series follows their attempts to save humanity from itself and an unforgiving universe.

James and the Giant Peach by Roald Dahl

This is one of our bedtime books with the kids.

Witch King by Martha Wells

I loved the Murderbot series, so when I saw this on the shelf at Powell’s I figured I would give it a shot. This novel was interesting. It had a similar structure to Murderbot, but went deeper on lore and the magic system. It was a really fun read, and the toggling between past and present throughout created an exciting pace.

Shroud by Adrian Tchaikovsky

Tchaikovsky is one of my favorite sci-fi authors. This standalone novel follows two humans as they struggle to survive on a pitch-black, inhospitable alien moon called Shroud.

Blindsight by Peter Watts

Another first-contact, hard sci-fi novel that had some really interesting concepts related to identity, consciousness, and neurology.

Project Hail Mary by Andy Weir

This one needs little introduction, as it’s on its way to becoming a feature film. I listened to it as an audiobook, which is the best way to experience the story, in my opinion. I love the attention to detail and theme of perseverance in Andy’s writing.

Diaspora by Greg Egan

Greg Egan writes some of the most cerebral hard sci-fi I’ve come across. I read Permutation City last year and felt like I needed more. Diaspora didn’t disappoint—it explores the nature of life and intelligence against a backdrop of compelling (fictional) theories in physics and mathematics. It’s hard to summarize the world and story within it. If you’re into post-humanism, consciousness, and exploration of theories of the universe, you’ve gotta check this one out.

Non-Fiction

Boom: Bubbles and the End of Stagnation by Byrne Hobart and Tobias Huber

I was a bit conflicted on this book. On the one hand, it does an excellent job of arguing the dangers of stagnation and the benefits of bubbles with examples and evidence from history. On the other hand, it felt a bit dogmatic and rigid at times. That, for me, is a good signal that this book is worth reading if you want to better understand the current trends in technology and AI in particular. I finished the book with a stronger desire for innovation and creativity than I had going into it.

Notes on Complexity by Neil Theise

An introduction to complexity theory, an examination of how interconnected systems behave and evolve. Neil provides an approachable, research-based perspective on consciousness and the interconnectedness of all things in a small package. This was one of the most intriguing reads of this year and I would recommend it to anyone who’s interested in a theory that joins philosophy, quantum mechanics, physics, and consciousness.

Essentialism by Greg McKeown

Thematically aligned with where I’m at right now, this book provides a framework for doing less while accomplishing more.

The Untethered Soul by Michael A. Singer

I listened to this as an audiobook and had a hard time with the narrator and the repetitiveness of some of the content. I stuck with it though, and found a timeless message of acceptance and self-realization.

Meditations for Mortals by Oliver Burkeman

Similar to Essentialism, this one was a solid contribution toward embracing finitude and focusing on what’s important rather than trying to do it all.

 
