Your AI Fitness Trainer Can Do More Harm Than Good

Want customized workout plans, real-time feedback, and 24/7 motivation, all without the cost of a human trainer? AI personal trainers sound like a perfect solution. Download an app, answer some questions about your goals and fitness level, and receive a personalized training program.

I've tested some of these apps myself, and I definitely see the appeal. But more than anything, I see companies stuffing AI into apps where it doesn't belong. There's Strava's Athlete "Intelligence," Garmin's underwhelming Connect+ subscription, and Whoop's recovery recommendations, to name a few. And as Lifehacker senior health editor Beth Skwarecki points out, people are increasingly asking ChatGPT for training advice, which is infuriating considering how many high-quality, free programs are already out there.

Medical professionals and trainers are increasingly noticing clients experiencing anxiety around optimization and performance, becoming discouraged when AI labels their efforts as inadequate. Think of the chokehold that "closing your rings" on Apple Watch activity goals has had on the nation. Or Fitbit step counts, even when step goals are bullshit in the first place. When metrics don't align with expectations, people feel like failures. Not because they haven't made progress, but because an algorithm told them so.

Blind trust and data obsession

Certified personal trainer Cara D'Orazio describes what she calls "digital guilt": the anxiety that creeps in when you miss a workout notification or can't keep up with your app's demands. She recalls clients who arrived at her gym burnt out and demoralized, including one woman whose AI coach prescribed six consecutive training days without rest.
The woman felt "lazy" for being sore, a natural physiological response her digital trainer couldn't recognize or validate.

"People begin relying so much on the algorithm that they lose connection with how their body actually feels," D'Orazio says. "A real coach can tell when your stress levels are high, when you didn't sleep, or when you just need to talk for five minutes before starting. AI doesn't do that. It only sees numbers—calories, steps, heart rate—not emotions, hormones, or mindset."

Movement should enhance your relationship with your body, not create anxiety around it. This disconnection is particularly dangerous when you consider how deeply intertwined your fitness can be with your mental health. Marshall Weber, a certified personal trainer and owner of Jack City Fitness, has witnessed the psychological toll firsthand. "I have definitely seen folks get discouraged and even anxious when they rely too heavily on AI fitness enhanced tools," he explains. "While it is great that these apps can track everything, they are lacking a bit on the balance and self compassion side of fitness."

I know that when I'm in a vulnerable mental state, this lack of empathy can be devastating. As D'Orazio warns, "If we're not careful, we're going to see a whole new wave of people who are 'fit' on paper but emotionally exhausted and disconnected from their bodies." The constant performance feedback is a recipe for an unhealthy fixation on fitness goals.

The human touch AI simply can't replace

In fitness as elsewhere, one of AI's most significant limitations is its inability to read context. Adrian Kelly, a business and sports performance coach, emphasizes the risks here: "Exercise can be quite an emotional experience with highs and lows generated by meeting, or failing to meet, our own expectations." He notes that traditional trainer-client relationships provide something AI cannot replicate: empathy, accountability, and trust built through genuine human connection.
A skilled coach recognizes early warning signs of disordered eating, overtraining, or emotional distress. They celebrate non-scale victories, adjust plans when life gets complicated, and remind you that rest is productive.

"The healthiest results come from building trust, flexibility, and self-awareness—things a machine simply can't measure," D'Orazio says. "Movement should make you feel more human, not less."

Dr. Ayesha Bryant, a clinical advisor at Alpas Wellness, warns about the unhealthy fixation on health data that AI systems encourage. "This heavy quantification of fitness can drive clients and patients to perfectionism or body dysmorphia tendencies, especially in vulnerable individuals," Bryant says. The problem is compounded by blind trust in the algorithm: users keep following AI recommendations even when experiencing pain, burnout, or clear signs they need rest or medical attention.

Even if someone is self-aware enough to override AI recommendations, there's still the pull of algorithmic validation. It's all too easy to shift from intrinsic to extrinsic motivation, forgetting that the whole point of moving your body is that it feels good.

The bottom line: Find some balance

This isn't to say AI fitness tools have no place in a healthy lifestyle. They can be useful for tracking data, setting reminders, or logging workouts. But they should complement, not replace, human guidance and your own body awareness.

Weber recommends that anyone training regularly "consider checking in with a PT of some sort just to make sure you are still being kind to yourself." Bryant agrees, emphasizing that "long-term wellness and quality of life is driven by empathy, adaptability, and human connections."

If the fitness industry's AI revolution has arrived, we need to approach it with clear eyes. Your body is not a machine to be optimized.
It's a complex, intelligent system that deserves compassion, flexibility, and human understanding—things no algorithm can provide.