In March, an aspiring author got a troubling message: All of her works in progress were no longer accessible. What happened next is every writer’s worst fear.
On the evening of March 24, 2024, writer K. Renee was doing what she often does: curling up on the couch and watching hockey with her husband. It was the Dallas Stars versus the Arizona Coyotes. Renee has followed the Stars her whole life. She was born the season they won the Stanley Cup. As she watched, she got a strange message: a friend texted to say the shared Google folders where Renee kept her works in progress were no longer accessible. Her friend had planned to read and make notes on one of Renee’s stories and was surprised to be locked out.
“You no longer have permission to view this document,” said the pop-up message. “If you believe this is an error, contact the document owner.”
This was how Renee experienced a moment that most of us have heart-pounding 3 am stress nightmares about. All 10 of her works in progress—some 222,000 words across multiple files and folders—were frozen. Not just locked to her friend, but inaccessible on Renee’s own phone and tablet. When her husband fetched her laptop, Renee logged into Docs and tried sharing the documents again. Then she received her own message from Google.
“Can’t share item,” was the header. “You cannot share this item because it has been flagged as inappropriate,” read the body text.
Renee writes hockey romance. People who get to see her drafts first, her community of alpha and beta readers, all have that in common. Renee describes her work as “open-door spice.” Aside from being an amazing name for an overpriced cocktail, the term serves as a descriptor for the level of explicitness in romance fiction. Simply put, “open-door” means more explicit; “closed-door” means less. Reading an open-door romance is like watching a John Wick movie. You see the knife go in. Closed-door romances are like watching a Marvel movie. You know something is happening to someone’s body, but you never really see it.
When she saw the word inappropriate in the notification, Renee worried her work had been dinged for its spice. “I thought I was the problem,” she says. “I thought I had somehow messed it up.”
But she hadn’t. At least, she hadn’t messed it up in any way she could hope to avoid in the future. Google never specified which of her 222,000 words was inappropriate. There were no highlighted sections, no indicators of what had rendered her documents unshareable. Had one of her readers flagged the content without discussing it with her first? Was it a malicious attack on the files? Had someone at Google decided her content was too spicy? Renee hadn’t turned on any of the AI functions in Google Workspace, so she doubted it could be chalked up to a bot banning her books. After all, a 2016 paper coauthored by Google researchers revealed that its recurrent neural network language models had been fed thousands of romances. If for some reason a bot was crawling her work, wouldn’t it recognize what it was looking at?
As a test, Renee tried sharing files that contained no fictional content at all, but access was still denied. It remained that way for two days. During those two days, her email slowed to a crawl—and she heard from others who had gone through the same experience. “Truthfully, there’s more communities that have been affected by this; they’re just scared to say anything,” she says.
This may be true. Posting, under your own name, that your content has been found “inappropriate” means that search engines, Google among them, might associate your name and that word together forever. There is a genuine potential for social stigma when making a complaint of this nature in the public square—which is to say, online. And given the way that large language models dredge everything written online in an effort to sort which words belong together in context, the words most commonly associated with any individual name gain greater semantic weight and are more likely to be reproduced when users of these models “ask” the model for an “answer.”
Generally speaking, files containing violence, abuse, child sexual abuse material, and gore violate the terms of service for Google Drive and its associated products, like Docs and Sheets. Exceptions are made, though, for educational, artistic, and journalistic purposes. “There are clear policies for how our tools may be used, and we may review and take action on any content that violates our policies,” Google spokesperson Jenny Thomson said when reached for comment on this story. “If a user receives a notification that their file violates Google’s terms of service or program policies, and they believe it’s an error, they can request an appeal.”
The night Renee’s docs got frozen, the Dallas Stars won 4-2, continuing their winning streak. As ESPN uploaded video highlights of the game, Renee was submitting a report to Google. As of this writing, she has not heard back.
Renee likely isn’t alone. As she tried to figure out what happened, she shared the situation with a writers’ Slack channel she frequents. From there, it went to Instagram, where other writers advised their fellow scribes to back up their work. One even posted that a similar thing had happened to them: “I’ve been scouring the internet for answers because this just happened last night. I have 8 betas, and randomly last night it all shut down. 😭Same content, I’m an adult romance fantasy author. Any news?!”
That author later posted a video to Instagram explaining that it wasn’t the adult content in the files but rather “Google thought I was spamming people.” Apparently, sending the same doc to scores of people—for example, alpha and beta readers—can make it appear as though the doc was unsolicited.
Automated moderation is nothing new. In 2021, activists raising awareness of Missing and Murdered Indigenous Women and Girls found some posts mentioning MMIWG had disappeared from Instagram. In 2020, one Black Lives Matter activist discovered that Facebook had mistakenly flagged their account for IP violations, erasing documentary footage that included instances of alleged police brutality. (The activist’s profile was soon restored.) In 2018, Tumblr banned “adult content,” and queer communities warned that losing their spaces online would make things even more difficult for queer youth and others questioning or discovering their sexuality. (Tumblr modified this ban years later.) But what the sudden loss of an online community most resembles happened in 2007: Strikethrough.
On May 29, 2007, journals and communities began disappearing from LiveJournal. The missing journals and groups went unclickable, mute, struck through with a single-line font effect. To a banhammer, everything looks like a nail: depictions of rape disappeared, but so did posts by rape survivors. The same was true of incest, abuse, and violence. The ensuing exodus of users led to the founding of Dreamwidth, the Organization for Transformative Works, and its Archive of Our Own. Today, all three are still operational.
While it’s still unclear exactly what happened to Renee’s docs, or whether the whole episode was a fluke, the effects of mishaps like this are complex. Even though entrusting personal writing to major corporations is now commonplace, there can still be unease about it. For authors who write about sex, say, or queer people trying to find a voice, hearing that your content could be flagged as “inappropriate” can have a chilling effect. The problem, says bestselling pseudonymous author Chuck Tingle, is that companies like Google now function like utilities. “It’s the same as water and electric,” he says.
Tingle would know: His “Tinglers,” erotica pieces he releases as Kindle Singles, led to his contract at Macmillan for the queer horror novels Camp Damascus and Bury Your Gays. Those early singles were written without the benefit of editors, often within a matter of hours. They’re sloppy. “They’re punk rock,” he says, but they also helped him build a community around the “underdog genres” of erotica, horror, and comedy that his work falls into. If Amazon decided to stop selling his Tinglers, it would be a big blow, even though he now has a book deal.
Appropriate is a word with two common usages. The first is as an adjective, as in the message Google sent to Renee: it describes suitability in context, fitness to purpose. The second is as a verb, and it’s much closer to the original Latin appropriatus, which means “to make one’s own” or “to take possession of.”
Whether we’re discussing the “appropriation” of cultural slang or a piece of real estate, we mean a transfer of ownership. But both meanings of the word spring from that Latin origin and its antecedent, the word privus: the word that begat (among others) the words private, property, and proper. All of these words grew from the same source, and in one way or another they all describe qualities of belonging.
This is a story about belonging.
Accessibility, infrastructure, and organization are all important to Renee as a writer and as a person in daily life. She tracks more than just her word count: she tracks meals, moods, and medications. “We have to be organized,” she says.
By “we,” Renee means her fellow disabled people. The first time one of her patient portals experienced a privacy breach and sent her a letter about it, she was 16. By then, she’d had to give up hockey, moving from the ice to the bench to the couch. “I’m always in pain. That’s a part of my illnesses. That’s going to be my life. I’ve come to terms with that. I’ve accepted that.” She tracks her symptoms meticulously in part because the faster her appointments end, the sooner she can be back in bed.
“Listening to me now, you wouldn’t know that I’m chronically ill and disabled,” Renee says. “You can’t really see it either. My illnesses, my diagnoses, are invisible.” For this reason, Renee has experienced disbelief and gatekeeping when she uses a cane, wheelchair, or forearm crutches as a twentysomething. She has written similar moments into her fiction, like a scene wherein one character is second-guessed because she’s in a wheelchair one day and not using it the next.
Renee sees her work as opening conversations about disability and the perception of disability. Until Google Docs locked her out, she had the data to back up her hypothesis, in the form of long comment threads between reader and author. Opening those conversations remains the goal of her published work. “If even one person second-guesses” the way they think about disability, she says, “I feel my writing has done what it needs to do.”