In the age of automation, who decides when you’re done?
In 2021, Bloomberg published a sobering exposé on Amazon’s Flex program, revealing how contract drivers were being hired, rated, and terminated by algorithms, with little to no human oversight. The article, titled “Fired by Bot”, tells the story of Stephen Normandin, a 63-year-old Army veteran who was abruptly deactivated by an automated system after years of high performance. His appeals were met with generic emails, algorithmic indifference, and ultimately, silence.
Normandin’s experience isn’t an anomaly; it’s a symptom of a deeper issue: the rise of algorithmic authority in workforce management. At Amazon, machines don’t just assist managers; they are the managers. Flex drivers are monitored in real time, scored against opaque metrics, and removed from the platform without meaningful recourse. Human nuance, such as locked gates, snow-covered roads, or malfunctioning lockers, is often invisible to the system.
This is where code without compassion becomes more than a metaphor. It’s a structural reality.
Why This Matters
At Sakara Digital, we believe technology should serve people, not replace their dignity. The Amazon Flex case highlights several urgent concerns:
- Lack of Due Process: Workers are terminated by bots, with no clear path to appeal or explanation.
- Opacity of Metrics: Performance ratings are algorithmically assigned, but drivers don’t know how they’re calculated.
- Emotional Toll: For many, gig work is a lifeline. Losing access without warning can mean losing housing, transportation, or stability.
What We’re Exploring
This post kicks off our new series, Code Without Compassion, where we’ll examine the ethical, operational, and emotional consequences of algorithmic decision-making in the workplace. Over the coming weeks, we’ll explore:
- How algorithms are reshaping HR and gig platforms
- What ethical AI governance should look like
- How businesses can design systems that respect human complexity
We’re not anti-automation. We’re pro-accountability.
Coming Next:
Part 2: When Algorithms Manage Humans. A deeper dive into algorithmic HR systems and the risks of removing empathy from management.
This post is part of our series, Code Without Compassion. View the full series for past and upcoming installments.
This article was created in collaboration with GenAI and shaped by intentional human insight.
Further Reading
- How AI Is Changing the Labor Market (Harvard Business Review)
- Algorithmic Management and the Future of Human Work (arXiv)
#FractionalConsulting #LifeSciences #DigitalTransformation #AI #EthicalAI

Your perspective matters. Join the conversation.