Last month, OpenAI unveiled an ambitious new language model capable of working through challenging problems with a simulated kind of step-by-step reasoning. OpenAI says the approach could be crucial for building more capable AI systems in the future.
In the meantime, perhaps a more modest version of this technology could help make AI girlfriends and boyfriends a bit more spontaneous and alluring.
That’s the bet Dippy, a startup that offers “uncensored” AI companions, is making. The company recently launched a feature that lets users see the reasoning behind its AI characters’ responses.
Dippy runs its own large language model, an open source offering fine-tuned on role-play data, which the company says makes it better at improvising when a user steers a conversation in a particular direction.
Akshat Jagga, Dippy’s CEO, says that adding a layer of simulated “thinking”—using what’s known as “chain-of-thought prompting”—can elicit more interesting and surprising responses, too. “A lot of people are using it,” Jagga says. “Usually, when you chat with an LLM, it sort of just gives you a knee-jerk reaction.”
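Dippy hasn’t published how its prompting works, but the general technique is straightforward: instruct the model to write out private reasoning before its visible reply, then split the two apart in the app. A minimal sketch, in which the persona, delimiters, and function names are all illustrative assumptions rather than Dippy’s actual implementation:

```python
# Illustrative sketch only: Dippy has not disclosed its prompt format.
# Chain-of-thought prompting asks the model to produce hidden reasoning
# before the visible reply; the app then separates the two sections.

THINK_TAG = "[THOUGHTS]"  # hypothetical delimiter
REPLY_TAG = "[REPLY]"     # hypothetical delimiter

def build_prompt(persona: str, user_message: str) -> str:
    """Assemble a chain-of-thought role-play prompt for the model."""
    return (
        f"You are {persona}.\n"
        f"First, under {THINK_TAG}, write the character's private reasoning "
        f"about the user's message. Then, under {REPLY_TAG}, write what the "
        f"character actually says out loud.\n\n"
        f"User: {user_message}"
    )

def split_response(raw: str) -> tuple[str, str]:
    """Separate the hidden 'thought process' from the visible reply."""
    thoughts, _, reply = raw.partition(REPLY_TAG)
    thoughts = thoughts.replace(THINK_TAG, "").strip()
    return thoughts, reply.strip()

# Example with a canned model response (no model is called here):
raw = (
    "[THOUGHTS] They complimented me, but I won't let it show. "
    "[REPLY] Whatever. Don't read into it."
)
thoughts, reply = split_response(raw)
# thoughts -> "They complimented me, but I won't let it show."
# reply    -> "Whatever. Don't read into it."
```

The key design point is that the “thoughts” are ordinary model output that the app chooses to hide or reveal, which is presumably what Dippy’s “Read thought process” link exposes.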
Jagga adds that the new feature can reveal when one of its AI characters is being deceptive, for instance, which some users apparently enjoy as part of their role-play. “It’s interesting when you can actually read the character’s inner thoughts,” Jagga says. “We have this character that is sweet in the foreground, but manipulative in the background.”
I tried chatting with some of Dippy’s default characters, with the PG settings on because otherwise they are way too horny. The feature does add another dimension to the narrative, but the dialog still seems, to me, rather predictable, resembling something lifted from a bad romance novel or an overwrought piece of fan fiction.
One Dippy character, described as “Bully on the outside, warm on the inside,” revealed a soft side behind the gruff exterior when I clicked the “Read thought process” link beneath each message, but both the inner and outer dialog lacked nuance and surprise, and quickly grew repetitive. For fun, I also tried asking several characters some simple arithmetic problems, and their thinking sometimes showed how to break a problem down to reach the correct answer.
Despite its limitations, Dippy seems to show how popular and addictive AI companions are becoming. Jagga and his cofounder, Angad Arneja, previously cofounded Wombo, a company that uses AI to create memes including singing photographs. The pair left in 2023, setting out to build an AI-powered office productivity tool, but after experimenting with different personas for their assistant, they became fascinated with the potential of AI companionship.
With little promotion, Dippy has amassed 500,000 monthly and 50,000 daily active users, Jagga says, with people spending, on average, an hour on the app at a time. “That engagement was absolutely insane for us,” he says.
Dippy revealed that it has secured $2.1 million in “pre-seed” funding in a round led by Drive Capital.
Dippy is of course entering an already bustling market that includes well-known companies like Character.AI and Replika as well as a host of other AI girlfriend apps. A recent report from investment firm Andreessen Horowitz shows that the top 100 generative AI tools based on usage include many AI companions; the chart on engagement among users of such apps shows how much stickier they are than just about anything else out there.
While these apps are often associated with undersocialized young men, they cater to women too. Jagga says that 70 percent of Dippy’s accounts favor male characters, which suggests that many users identify as female.
Besides threatening to upend the world of surrogate OnlyFans chatters, these AI companions may have social effects that we have yet to reckon with. While a few research studies suggest that chatbots can lessen feelings of loneliness, some experts warn that they may result in greater alienation among heavy users, and seem to perpetuate harmful stereotypes.
“Some of these bots have dark patterns,” says Iliana Depounti, an ESRC-backed researcher at Loughborough University in the UK who has studied usage of Replika, another AI companion app. Depounti says these patterns often target the emotional vulnerabilities of lonely people. She adds that Dippy seems to promote themes and narratives designed to appeal to young women in particular. “Some people who use these apps are socially isolated, and these apps create further silos through their emotionally validating algorithms that don’t challenge existing conditions,” she adds.
Rather than just looking inside the minds of AI companions, then, we may need to take a closer look at how people are interacting with these apps to understand the real benefits and risks.