Challenge before tech leaders — build & communicate responsibly

4 min read | Feb 28, 2026, 06:12 AM IST | By Prasad Shejale

I was closely following the Indian Express event with Sam Altman when a 30-second excerpt began circulating widely online. In that clip, Altman compared the energy used to train advanced AI systems with the “20 years of life and food” required to develop human intelligence. The reaction was swift. Many found the analogy unsettling. Some called it clinical. Others saw it as a technologist reducing human life to an energy calculation. That discomfort is understandable.

We are at a moment when AI no longer feels abstract. It feels personal. So when comparisons between machines and humans are framed in quantitative terms, they trigger emotional responses. But context matters.

I was fortunate to be there in person, listening to Altman. Those 60 minutes gave great insight into his perspective. I urge you to watch the full interview; you’ll find something quite different: a technologist deeply grounded in human values.

In the context of the entire conversation, the remark did not land as a moral comparison. It was framed as a technical explanation, an attempt to illustrate that intelligence, whether biological or artificial, requires enormous input. Human cognition is shaped by decades of nutrition, education, care, and social investment. AI systems require data, compute power, electricity and infrastructure.

The analogy was not presented as a statement about the value of human life. It was a systems-level explanation of cost and creation. What struck me more in the broader discussion was not detachment, but reflection. Altman spoke about a hospital experience where a nurse’s empathy left a lasting impression, something he acknowledged no machine could replicate. He was clear that while AI can assist, synthesise and accelerate, it cannot replace warmth, presence or human care.
When asked what he would never use AI for, he said he would not turn to ChatGPT for guidance on happiness. Information is not wisdom. Data is not lived experience. That distinction remains deeply human.

He also spoke as a parent, expressing concern about excessive screen exposure and the growing culture of infinite scroll. Perhaps the most interesting part of the conversation was his view on value in an AI-driven future. As technology makes knowledge and services cheaper and more abundant, he suggested that what becomes scarce, and more precious, is human connection.

This episode revealed something larger. Public trust in AI leaders is fragile. Every metaphor is scrutinised through a lens of anxiety: about jobs, agency, and what it means to be human in a world increasingly shaped by algorithms. In this environment, communication matters more than ever. Engineers often speak in terms of systems and inputs. Audiences hear implications about identity and value. Both reactions are real.

The challenge before technology leaders is not only to build responsibly, but to communicate responsibly. For audiences, the challenge is to allow space for full conversations before forming definitive conclusions.

The debate around AI’s energy consumption is legitimate. So are concerns about its social impact. These are not trivial discussions. But reducing a nuanced exchange to a single line risks oversimplifying both the technology and the intent behind it. AI forces us to re-examine how we define intelligence, value and creativity. That examination deserves depth. In an era when clips outrun conversations, perhaps the most radical act is patience.

The writer is founder and CEO, LS Digital Group