A few months ago, a forty-five-year-old homemaker living in Georgia, whom I’ll call Robin, started playing around with an A.I. image generator. Growing up, Robin had loved reading; she dabbled in writing, too, but after her first child was born, the habit faded. A.I. offered something different—a kind of world-building that allowed her to project herself into places and situations she’d never inhabited. She made a photo of herself standing outside a café in Paris, and another of herself sitting on a private jet next to a Birkin bag. But she started to wonder, too, if she could make some extra cash by breaking into the influencer market. Soon, she had created Isabella, an A.I. avatar that looked nothing like her.
Isabella is thin and tall, and wears high heels and low-cut dresses. She appears to be in her twenties, and she alternates between flashing friendly smiles and smoldering, come-hither looks. In one photo that Robin uploaded to TikTok, Isabella poses in a sweatshirt and bike shorts from the yoga brand Alo, her chiselled abs exposed and gold bracelets dangling from her wrists. Robin is Black, but Isabella is white. It was a strategy, Robin explained, to widen her appeal to brands who might be interested in offering her partnership deals. “White women just have a broader audience,” she told me.
Across social media, an influx of A.I.-generated avatars is reshaping what it means to be an influencer. A Facebook group called Baddies in AI, geared toward women who are using A.I. to either augment their own social-media presence or create entirely new figures from scratch, has more than three hundred thousand members. In one post, a Black woman named Whitney shared A.I.-generated images of a white woman drinking iced coffee in a sunny apartment with blond-wood flooring. “Okay yall I’m going undercover,” she wrote. “May the odds be in my favor.” In the comments, one member jokingly called it “whitefishing.” Whitney mentioned that she’d already tried the approach on LinkedIn, uploading a white avatar but keeping all the other details—her name, experience, and posts—the same. “Recruiter outreach and post circulation jumped,” she wrote. “So for me it’s a data proven experiment, not self-hate.”
Ryan Milner, a professor of communication at the College of Charleston, told me that A.I. avatars seemed less like a rupture and more like a clarification. “One of the liberatory potentials of the internet was that you could divorce the mind from the body, and so the utopian read was that things would become more of a meritocracy where disability or race or other social hindrances wouldn’t get in the way. We would just be our intellects and be measured by that,” he said. But, with the advent of platforms such as Instagram and YouTube, the online self became highly saleable. “The internet has gone from a text-based medium to a visual one over the last two decades,” Milner said. “It’s not surprising that when people are playing with identity online in the age of A.I., that we’re still going to see the norms replicated. The tools aren’t changing that.”
The virtual influencer is not, strictly speaking, new. In 2016, a C.G.I. avatar named Lil Miquela appeared on Instagram, presenting as an aspiring musician from Southern California. Miquela, who was created by Trevor McFedries, has hazel eyes, olive skin, freckles, and a tasteful tooth gap; she often wears her brown hair in twin buns with pin-straight bangs. Her racial ambiguity was perfectly calibrated to an era in which brands were clamoring to amplify their social-media presence by appealing to as many audiences as possible. It was clear to anyone looking closely that she wasn’t real, but that was part of the appeal. Miquela partnered with Prada, made out with Bella Hadid for a Calvin Klein ad, and walked the red carpet at the Grammys. The project helped McFedries and his team raise millions of dollars in venture-capital funding for their start-up. Cyan Banister, a former partner at Founders Fund, told the Wall Street Journal that the appeal was simple: “You can create the Kardashians without any of the inherent issues that come with being human.”
Not everyone is enthusiastic about the new possibilities. In late March, a Black New York-based influencer named Tatiana Elizabeth discovered that a white influencer named Lauren Blake Boultier had used A.I. to swap her own face onto a picture of Elizabeth from the U.S. Open in 2024. (Boultier issued a statement blaming a “third-party AI content agency” for the oversight.) “The low barrier to entry with A.I. is disgusting,” Elizabeth told me. “I had to wake up in the morning and get a nanny for my son to go to the U.S. Open all the way in Queens. I had to put in eight years of work to get that opportunity.” When I mentioned Baddies in AI, Elizabeth was critical. “Where does the line get drawn? Where’s the respect for each other and each other’s experiences? I don’t think that it’s right, especially without any transparency,” she said.
At a certain level, attaining celebrity requires a body: it’s hard to imagine how fake accounts could imitate, say, the rise of Addison Rae or the feud between Alix Earle and Alex Cooper. McFedries, who, with his team, gave Miquela a rich backstory—she had a blond, Trump-supporting nemesis named Bermuda—told me that he thought the new crop of A.I.-generated accounts was too short-sighted to succeed. “We were trying to build Disney for a new world,” he told me. “The technology enabled the storytelling which enabled the affinity which enabled the commerce. People are skipping steps.” But as A.I. gets better, it seems as if it will only get easier to manufacture the sort of narrative that made Miquela popular. Influencer culture has always been about commodifying intimacy—and, at a certain point, authenticity stopped seeming to matter much to people. Sienna Rose, a neo-soul singer who is widely suspected to be A.I.-generated, has released tracks that have been shared by Selena Gomez and the BTS member V, and she has made it into Spotify’s Viral 50 in the U.S. (In January, on TikTok, whoever runs the account posted a defiant video of Rose with a text overlay that read “when half of the world thinks you’re fake, but you’re really just out here living your dream life.”) Jessica Foster, an A.I.-generated character who claimed to be in the Army and posted photos with Donald Trump, amassed more than a million Instagram followers before Meta took down her account.
In the adult-content industry, identity cosplay takes a different form. One woman wrote in the Baddies in AI group that she was in “several Discord groups with 95% men using a woman’s image for their Fanvue content.” This aligned with what I found in YouTube tutorials about so-called pornbots: men teaching men how to make women for other men. Last year, a video posted by an OnlyFans strategist named Markuss Kohs laid out the value proposition with candor, contrasting the difficulties of human models (“50% Profit Split,” “Hard to work with”) with the rewards of A.I. creations (“Works around the clock”). On various Discord servers for pornbot creators, the tone is eerily convivial: there are debates about the best A.I. models for video generation, pointers on how to avoid one’s account getting flagged on different social-media platforms, and words of encouragement for those just starting out with their first models. “I’m looking for people who are at a similar level to brainstorm and shoot ideas with,” a user named Lorenzo posted. An account called Papa Sesh sent the group a picture of a recent job that he’d sent off to a client: “should’ve fixed the nipple up a bit more but oh well he was still happy.”
Marie Sweets, an OnlyFans creator, told me that pornbots are a natural extension of a culture that views women’s work in these spaces as easy and exploitable. “This is how we get to the point of men generating A.I. content to swindle other men into giving them their money, or how we’ve gotten to the point of agencies generating A.I. content of real models while they take fifty per cent of their income.” She said that she isn’t threatened by competition. “This is a brand-new toy that has come to market. I think it will soon be found tedious and cheap by most buyers.”
But one creator who makes content with a handful of different pornbots said that his target audience tends not to notice the difference between fake A.I. characters and human actors. “The results don’t have to be perfect to work,” he said. He spends hours researching and analyzing engagement on Instagram, Threads, and Reddit. “I try to include skin texture, pores, small imperfections, and avoid making them look too smooth or overly ‘model-like.’ I prefer when the girls feel approachable,” he said.
In talking to some OnlyFans users, I sensed a funny sort of knowingness, akin to the suspension of disbelief one has at a magic show. Many adult-film stars already use A.I. to chat with their subscribers. The idea that a porn actress would be able to talk to thousands of users simultaneously is clearly absurd—but it doesn’t make the fantasy any less compelling, even if the words are being typed by a machine. The audience participates anyway.
Robin hasn’t received any brand-partnership inquiries for Isabella, but she remains hopeful. Human influencers, she said, should pay attention. “They’d better learn how to use A.I. if they want to keep up.” Last month, a generative-A.I. studio and a creator-revenue platform launched the A.I. Personality of the Year Awards, with cash prizes across content categories such as fitness, comedy, and fantasy. The winning creators won’t need to attend a ceremony. No one cares what they look like. ♦
