By Odin Rasco
It takes only a look around to see AI technology is in the midst of a meteoric rise. The proliferation of it was most recently highlighted during the Super Bowl, with nearly a quarter of the game's ads (15 out of 66) involving AI in some way, according to a report by Adweek. Considering ChatGPT had more than 800 million weekly users in 2025, and that the platform is helping design everything from toothpaste to specialty Coca-Cola products, Resourcera estimates that around 1 in 10 people alive today use AI.
Part of the appeal of the tech is its user-friendliness, allowing anyone to give an AI model instructions in plain language, no coding required. Because AIs like ChatGPT or Anthropic's Claude are large language models, trained on nearly immeasurable samples of the written word, many users turn to them to generate essays, cover letters, summaries and other kinds of writing.
But for individuals who engage with writing not as a task to get done but rather as a craft to hone โ journalists, authors, screenwriters and educators โ the rapid spread of AI has raised multiple concerns about how the new technology is impacting their fields.
"I feel that in a short period of time I've become very counter-cultural without meaning to, because I have a kind of like 'kill it with fire' attitude towards [AI]," author and educator Lisa Locascio Nighthawk remarked. "I didn't consent to this, you know? And I guess, you know, we don't get to consent to the cultural changes that impact us; but I don't appreciate how it's all happened in what feels like about two years."
Tech-enabled thought theft?
Nighthawk is the author of New York Times Editors' Choice novel "Open Me," as well as executive director of the Mendocino Coast Writers' Conference and chair of the Antioch MFA in Creative Writing. Her own words may now, against her will, be a small part of how one AI model works: "Open Me" was one of the more than 7 million books Anthropic downloaded or digitized to support the training of its large language models.
Anthropic was taken to court for its unlicensed use of books in its LLM training in Bartz v. Anthropic, one of the first major lawsuits brought by authors against an AI company. The U.S. District Court for the Northern District of California issued a summary judgment on the case authored by Judge William Alsup on June 23, 2025. The court's judgment found Anthropic's use of books, many of which were obtained on pirating websites, "inherently, irredeemably infringing." The court did maintain, however, that use of legally obtained books in the training of an LLM constituted fair use.
The Authors Guild, the oldest professional organization in the United States for published writers, strongly disagreed with the fair use portion of the judgment, publishing a response stating, "It feels as though the court rushed to issue a decision without fully understanding the copyright law and legal issues or the potential harm."
The class action suit was later settled out of court, with Anthropic agreeing to pay $1.5 billion to the rightsholders of 500,000 titles out of the 7 million books it used to train its models, meaning each title is expected to net $3,000 (to be split between the author and publisher).
For Nighthawk, the potential payout from the settlement doesn't address the feelings that come from knowing her work was used to train an AI without her permission.
"It's impossible to put a dollar figure on that," Nighthawk explained. "I mean, I worked on that novel for seven years. It's galling, especially given how hard it is to do anything as a writer. You just feel really disempowered, basically."
After urging from friends and colleagues to look into the Anthropic settlement, Sacramento-based author Naomi J. Williams discovered her 2015 novel "Landfalls" was another book Anthropic may have used without permission to train its AI.
"We are working on starvation wages, right?" Williams noted. "When I found my book in that database, I said, 'Woohoo, $3,000.' I haven't seen a dime of that settlement yet ... Certainly I'm not counting on it to pay bills or anything, but I mean, there's something a little bit creepy about knowing that the work of your own mind has been used to help train a machine."
Williams, who has also published multiple short stories and essays, is co-director of CapLit, a reading series and literary organization founded in Sacramento in 2025. Although she's had experiences with AI that have raised some red flags, Williams doesn't see the technology as an apocalyptic issue for authors so far.
"I'm 62, and so I very much came of age even before the internet, right?" Williams mused. "So, I've seen these big changes come and everybody gets up in arms of 'this is the death of literature' or whatever, and I feel like I've seen enough changes come and go, and the much talked of death of books and literature has yet to occur. So, although I've been alarmed by some of the AI-related stuff I've seen, I'm not squarely in the doom and gloom, 'this is just horrible' camp."
Hallucinogenic feedback loops for journalism?
When it comes to ethical concerns stemming from AI, writers' concerns don't stop at unauthorized use of their work to train the models. Journalist Krys Shahin, a Sac State alum, recently took a deep dive into the potential ecological impacts from AI data centers, including air pollution, water use and energy use.
Shahin's reporting started with trying to find the source for the pervasive claim that an AI prompt uses the equivalent of a glass of water; though that estimate is disputed, the Environmental and Energy Study Institute found that large data centers can consume up to 5 million gallons of water a day.
"That equals roughly the daily demand for a community of 50,000 people," Shahin wrote.
Although there are real ecological impacts to consider, AI is not without its positives, according to Shahin.
"I'm not anti-AI inherently, I think AI can be used for a lot of good things," Shahin acknowledged. "I think it's great that Cal Fire uses AI to help detect fires in really remote areas. I think it's good that local municipal companies and organizations use it for road maintenance or just generic city health. I think these are all great things, but once it breaches into the creative aspects of things, the things that require human connection, that require human touch, it gets a little bit more finicky."
One of the finicky cases, according to Shahin, is when journalism and AI overlap.
"My main concern is the lack of regulation with AI, especially in such an ethical field like journalism," Shahin explained. "We, as humans, have to make a lot of decisions based on our own past experiences; what we've learned going through journalism school or just doing journalism and talking to people. Those chatbots can't make the same decisions that I can. They can't make any decisions, actually. They just feed you back what you want it to."
"Garbage people" with "scam book" objectives
Though the massive quantities of water needed to cool data centers are concern enough, for self-published authors like Sacramento-based Wayne Campbell, there's also the struggle to grab a buyer's attention amidst a constant flood of AI-generated titles on ebook storefronts.
"As somebody who's in the self-published space, you see that all the time with the crowded field in Kindle Digital Publishing, where the market has been just completely flooded by garbage written by garbage people who are only seeking a profit," Campbell remarked. "They'll type a prompt into an LLM and have a complete novel or novella, push it out with a garbage cover and a garbage blurb priced at $0.99 and they'll do that at volume. KDP is just completely awash with that stuff even with the protections that they say that they have."
A 2024 NPR article backs up Campbell's observations, with others in the self-published space finding the marketplace filled with AI-generated "scam books."
As companies and consumers continue to use AI, more concerns are likely to crop up. Some, like Nighthawk, believe AI may be a bubble fit to burst. Yet even if that ends up being the case, the technology itself, and the questions that come along with it, won't be going anywhere.
"The thing is, it's here," Williams admitted. "We can no longer live in a world where it does not exist."
This story is part of the Solving Sacramento journalism collaborative. This story was funded by the City of Sacramento's Arts and Creative Economy Journalism Grant to Solving Sacramento. Following our journalism code of ethics, the city had no editorial influence over this story. Our partners include California Groundbreakers, Capital Public Radio, Hmong Daily News, Outword, Russian America Media, Sacramento Business Journal, Sacramento News & Review and Sacramento Observer. Sign up for our "Sac Art Pulse" newsletter here.
