Can I send my digital doppelganger to a meeting?

By Arjun Harindranath April 7, 2026

There’s something a little off about the first time I meet Daniel Park, founder of Pickle.

It’s not anything he says, exactly, but the way his image comes up during our short video call. A few minutes in, I ask him if he’s really there or if he has sent in a digital double. He immediately admits he has, and switches over to his regular video feed.

The product that Park is building pushes at our wildest science-fiction ideas and poses some interesting ethical questions. We already have people sending their Fireflies accounts into meetings they can’t attend, quietly recording, transcribing, and summarizing conversations in their absence. It’s a small step from there to a more unsettling question: should we be sending our digital doppelgangers instead?

For Park, that question didn’t begin as a philosophical exercise. It started as survival. As a medical student during the COVID pandemic, he found himself trapped in a relentless cycle of online classes.

“It was very tough because we had 10 classes a day and it was very fatiguing for me and I thought this was going to have a psychological effect on me.”

Out of that exhaustion came an idea. What if he didn’t have to be there, at least not physically or even visually? What if something that looked and sounded like him could take his place?

The notion feels, at first glance, deceptive. Sending a digital double into a meeting seems like a more sophisticated version of pretending your camera is broken. Park is aware of that instinctive discomfort. 

“There are many people saying that Pickle is very deceptive and it’s like cheating but real users aren’t just using it for fun but because they need to,” he told StartupBeat. 

The use cases he points to are less recreational and more about constraint. They include parents juggling childcare and work commitments, people living with mental or physical disabilities, individuals with religious reasons for not appearing on camera, or simply professionals buried under the burden of back-to-back meetings.

Seen in that light, the digital doppelganger becomes less of a gimmick and more of an accessibility tool.

Still, the technology has yet to catch up with the vision. Today’s avatars might not be able to fool tech journalists, but Park is confident that day is coming soon. The pace of AI development, he believes, is so rapid that highly realistic digital doubles are not a distant possibility but an imminent one.

In fact, he’s already putting the concept into practice. Park sends his own digital double into investor meetings regularly, a quiet admission that even founders pitching the future don’t always have the time—or energy—to show up themselves.

That raises an obvious concern: if anyone can generate a convincing digital version of themselves, what stops them from generating someone else? 

Park is unequivocal on this point. “We have a very strong policy at Pickle that disallows people from using other people’s data or personal characteristics. We have a strong belief that the avatar and voice should be matched to a person like a fingerprint.”

The metaphor is telling. A fingerprint is both uniquely identifying and, in theory, difficult to replicate. But in practice, digital systems can struggle to maintain that kind of integrity. Deepfakes, voice cloning, and identity theft already blur the boundaries between authentic and artificial. A “fingerprint” avatar system would need to be more than a policy; it would need to be enforceable, verifiable, and trusted across contexts.

And the contexts are expanding quickly. What begins as a tool for skipping a meeting could evolve into something far more consequential. 

Park describes a future goal they call the “soul computer”: a fully functioning AI avatar capable not just of appearing on your behalf, but of thinking, responding, and presenting as you would.

From CEOs too busy to take a call while driving to customers who don’t want to put on makeup, the early use cases may seem mundane. 

But scale changes meaning. Can a digital twin negotiate, decide, or act independently? Should it?

At that point, the ethical questions sharpen. Who is responsible for what the avatar says or does? If your digital double signs a contract, does it bind you? If it mishandles a conversation, misrepresents your views, or even causes harm, where does accountability lie? With the user? The company? The creator of the algorithm?

There are no clear answers yet, and even Park acknowledges that some of these questions remain unresolved. Still, he sees the broader trajectory less as a risk and more as a transformation.

“In the near future an avatar can represent your voice and your thoughts better than the real world. It can capture some root norms that are more authentic to your real self,” he said. 

It’s a striking claim: that a synthetic version of you might one day be more authentic than the person typing, speaking, or showing up on screen. That authenticity, in this vision, is not about physical presence but about consistency—an AI that reflects your values, patterns, and intentions without the distortions of fatigue, distraction, or mood.

Yet history—and literature—suggests we should be cautious about doubling ourselves. In folklore, encountering your doppelganger was often a sign of death, a harbinger of something fundamentally out of balance. The idea persists because it touches on a deeper anxiety: that in creating a copy, we might lose control over what it means to be the original.

The Double by Fyodor Dostoevsky also captures this unease, though as a fate worse than death. When the protagonist Mr Golyadkin finds the city around him populated with multiple Golyadkins “stretched in a long chain like a file of geese”, he is dragged to jail, accompanied by all of his doppelgangers in the cells beside him.

Here’s hoping that the visionaries of digital twins keep us from facing that same fate.

This article is a part of our series on the confluence of startups and ethics. If you have a take on a particularly spicy moral conundrum in the world of startups, drop us a line at [email protected].