Imagine an actor who never ages, never walks off set or demands a higher salary. That's the promise behind Tilly Norwood, a fully AI-generated "actress" currently being courted by Hollywood's top talent agencies. Her synthetic presence has ignited a media firestorm, denounced as an existential threat to human performers by some and hailed as a breakthrough in digital creativity by others.

But beneath the headlines lies a deeper tension. The binaries used to debate Norwood — human versus machine, threat versus opportunity, good versus bad — flatten complex questions of art, justice and creative power into soundbites. The question isn't whether the future will be synthetic; it already is. Our challenge now is to ensure that it is also meaningfully human.

All agree Tilly isn't human

Ironically, at the centre of this polarizing debate is a rare moment of agreement: all sides acknowledge that Tilly is not human. Her creator, Eline Van der Velden, the CEO of AI production company Particle6, insists that Norwood was never meant to replace a real actor. Critics agree, albeit in protest. SAG-AFTRA, the union representing actors in the U.S., responded:

"It's a character generated by a computer program that was trained on the work of countless professional performers — without permission or compensation. It has no life experience to draw from, no emotion, and from what we've seen, audiences aren't interested in watching computer-generated content untethered from the human experience."

Their position is rooted in recent history: in 2023, actors went on strike over AI, and the resulting agreement secured protections around consent and compensation.

So if both sides insist Tilly isn't human, the controversy isn't just about what Tilly is; it's about what she represents.

Complexity as a starting point

Norwood represents more than novelty. She's emblematic of a larger reckoning with how rapidly artificial intelligence is reshaping our lives and the creative sector.
The velocity of change is dizzying, and now the question is: how do we shape the hybrid world we've already entered? It can feel disorienting trying to parse ethics, rights and responsibilities while being bombarded by newness. Especially when that "newness" comes in a form that unnerves us: a near-human likeness that triggers long-standing cultural discomfort. Indeed, Norwood may be a textbook case of the "uncanny valley," a term coined by Japanese roboticist Masahiro Mori to describe the unease people feel when something looks almost human, but not quite.

But if all sides agree that Tilly isn't human, what happens when audiences still feel something real while watching her on screen? If emotional resonance and storytelling are considered uniquely human traits, maybe the threat posed by synthetic actors has been overstated. On the other hand, who hasn't teared up in a Pixar film? A character doesn't have to feel emotion to evoke it.

Still, the public conversation remains polarized. As my colleague Owais Lightwala, assistant professor in the School of Performance at Toronto Metropolitan University, puts it: "The conversation around AI right now is so binary that it limits our capacity for real thinking. What we need is to be obsessed with complexity."

Synthetic actors aren't inherently villains or saviours, Lightwala tells me; they're a tool, a new medium. The challenge lies in how we build the infrastructures around them, such as rights, ownership and distribution.

He points out that while some celebrities see synthetic actors as job threats, most actors already struggle for consistent work. "We ask the one per cent how they feel about losing power, but what about the 99 per cent who never had access to that power in the first place?"

Too often missing from this debate is what these tools might make possible for the creators we rarely hear from. The current media landscape is already deeply inequitable.
As Lightwala notes, most people never get the chance to realize their creative potential — not for lack of talent, but due to barriers like access, capital, mentorship and time.

Now, some of those barriers might finally lower. With AI tools, more people may get the opportunity to create.

Of course, that doesn't mean AI will automatically democratize creativity. While tools are more available, attention and influence remain scarce.

Sarah Watling, co-founder and CEO of JaLi Research, a Toronto-based AI facial animation company, offers a more cautionary perspective. She argues that as AI becomes more common, we risk treating it like a utility: essential yet invisible. In her view, the inevitable AI economy won't be a creator economy; it will be a utility commodity. And "when things become utilities," she warns, "they usually become monopolized."

Where do we go from here?

We need to pivot away from reactionary fear narratives, as Lightwala suggests. Instead of shutting down innovation, we need to keep experimenting. We need to use this moment, when public attention is focused on the rights of actors and the shape of culture, to rethink what was already broken in the industry and to make space for new creative modalities to emerge.

Platforms and studios must take the lead in setting transparent, fair policies for how synthetic content is developed, attributed and distributed. In parallel, we need to push creative institutions, unions and agencies to collaborate in co-designing ethical and contractual guardrails now, before precedents are set in stone, putting consent, fair attribution and compensation at the centre.

And creators, for their part, must use these tools not just to replicate what came before, but to imagine what hasn't been possible until now. That responsibility is as much creative as it is technical.

The future will be synthetic.
Our task now is to build pathways, train talent, fuel imagination, and have nuanced, if difficult, conversations. Because while technology shapes what's possible, creators and storytellers have the power to shape what matters.

Ramona Pringle does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.