
Code Without Compassion, Part 1: Fired by Bot — The Amazon Flex Case

In the age of automation, who decides when you’re done?

[Illustration: a futuristic digital gavel, symbolizing algorithmic decision-making in workforce management]

Normandin’s experience isn’t an anomaly; it’s a symptom of a deeper issue: the rise of algorithmic authority in workforce management. At Amazon, machines don’t just assist managers; they are the managers. Flex drivers are monitored in real time, scored against opaque metrics, and removed from the platform without meaningful recourse. Human nuance, such as a locked gate, a snow-covered road, or a malfunctioning locker, is often invisible to the system.

This is where code without compassion becomes more than a metaphor. It’s a structural reality.

Why This Matters

At Sakara Digital, we believe technology should serve people, not replace their dignity. The Amazon Flex case highlights several urgent concerns:

  • Lack of Due Process: Workers are terminated by bots, with no clear path to appeal or explanation.
  • Opacity of Metrics: Performance ratings are algorithmically assigned, but drivers don’t know how they’re calculated.
  • Emotional Toll: For many, gig work is a lifeline. Losing access without warning can mean losing housing, transportation, or stability.

What We’re Exploring

This post kicks off our new series, Code Without Compassion, where we’ll examine the ethical, operational, and emotional consequences of algorithmic decision-making in the workplace. Over the coming weeks, we’ll explore:

  • How algorithms are reshaping HR and gig platforms
  • What ethical AI governance should look like
  • How businesses can design systems that respect human complexity

We’re not anti-automation. We’re pro-accountability.

Coming Next:

Part 2: When Algorithms Manage Humans. A deeper dive into algorithmic HR systems and the risks of removing empathy from management.

This article was created in collaboration with GenAI and shaped by intentional human insight.


#FractionalConsulting #LifeSciences #DigitalTransformation #AI #EthicalAI

Amie Harpe, Founder and Principal Consultant
Amie Harpe is Co-founder, Managing Partner, and Principal Consultant at Sakara Digital, a boutique consulting firm helping pharma, biotech, and medical device organizations navigate digital transformation. Before founding Sakara Digital, Amie spent 23 years at Pfizer in global IT, leading implementations of quality management, document management, learning management, complaints, and change control systems across up to 65 manufacturing sites worldwide. She specializes in quality management systems (QMS), data quality and integrity, ALCOA+ compliance, AI readiness and governance in regulated environments, digital adoption platforms, and fractional IT leadership for life sciences. Amie writes extensively on pharma data quality, AI foundations, and human-centered digital transformation.


Your perspective matters—join the conversation.
