Jean-Christophe Bélisle-Pipon warns that as artificial intelligence (AI) agents begin hiring humans for physical tasks, we must ensure this inversion of labour does not reduce health care to a series of gig-economy transactions directed by algorithms.
__________________________________________
“Robots need your body.”
This pitch for RentAHuman.ai sounds like the opening line of a dystopian satire. It is actually the slogan for a new marketplace where autonomous AI agents hire human beings to perform physical tasks. AI is becoming sophisticated, but it lacks hands. Until physical AI catches up, AI agents need us to act as their actuators in the physical world, or, as the platform describes it without irony, their “meatspace layer.” Current listings range from the mundane task of picking up packages to bizarre requests like “touching grass” for an algorithm that wants to vicariously experience nature. Humans are paid in cryptocurrency to serve as the physical extension of a digital will.
Rent-A-Human makes no effort to frame the human worker as a collaborator or a professional. The human is engaged as an actuator, a living interface capable of performing bounded actions in environments the system itself cannot access. What is sold is execution under constraint. Complex goals are decomposed into discrete tasks, each defined in advance, verified after the fact, and compensated at a fixed rate.
This same logic has been shaping healthcare for decades, though it is rarely named so directly. The practice of care has been progressively disassembled into measurable units, not as a response to clinical need, but as a condition of administrative legibility. Time-and-motion studies, Relative Value Units, standardized care pathways, and compressed appointment slots all participate in a single project: rendering care divisible, auditable, and optimizable. The relational, interpretive, and improvisational dimensions of clinical work are not eliminated, but they are systematically displaced, tolerated as residual rather than protected as essential.
Within this architecture, clinicians are increasingly positioned not as moral agents navigating uncertainty, but as biological peripherals embedded in bureaucratic systems. Empathy is reframed as inefficiency. Presence is treated as unproductive time. These qualities persist rhetorically, invoked in mission statements and professional codes, but they are structurally discounted by the systems that govern everyday practice. Treating skilled clinicians as interchangeable units is not an unintended consequence of this design, but a functional requirement, achieved through the systematic erasure of context, continuity, and responsibility.
This mode of organization depends on the active suppression of friction. In technical systems, friction is waste. In care, friction is a signal. It is the pause that allows concern to surface, the hesitation that interrupts an otherwise compliant workflow, the moment when something does not align even though the system reports normalcy. Friction is where responsibility emerges and where judgment acquires meaning. Systems optimized to eliminate it do not merely streamline care; they reconfigure its moral structure.
When an AI agent hires a human to verify a physical environment, it openly acknowledges its own limitations. It admits blindness and dependency. Healthcare institutions rely on clinicians in much the same way, treating them as sensors and actuators within larger systems, yet they deny that this is the role being assigned. They extract emotional labour while measuring only throughput, and they invoke professionalism while enforcing algorithmic schedules that render professional judgment functionally irrelevant.
As I have argued previously, care resides in tacit knowledge: the reassuring touch, the ability to read a silence, the subtle observation of a patient’s declining mood. These are not data points to be captured by a “human sensor” and fed back to a central model; they are the therapeutic intervention itself. If we allow the logic of the gig economy to merge with health AI, we risk creating a class of “care technicians” who are accountable to an algorithm rather than to the patient. The AI may verify that the task was completed (perhaps by demanding a photo, as RentAHuman does), but it cannot measure whether the care was compassionate.
This contradiction is now widely described as moral injury: not simple exhaustion, but the psychological wound that arises when individuals are compelled to act against their own ethical standards in service of institutional demands they neither designed nor control. It is not a failure of resilience, but a predictable outcome of governance by optimization. Rent-A-Human removes the contradiction entirely. It does not ask the worker to internalize institutional values. It does not pretend that autonomy exists where it does not.
Public anxiety about AI in healthcare remains centered on replacement, on the fear, for instance, that machines will displace human clinicians. The more plausible trajectory is subtler and more corrosive. AI systems will not eliminate clinical labour; they will intermediate it, assigning micro-tasks, monitoring compliance, and arbitrating value. They will function as middle management, automating bureaucracy rather than care. In such a configuration, presence becomes optional, listening discretionary, and anything that cannot be verified or priced becomes suspect. Care persists only as residue. Rent-A-Human is not a warning about what may come. It is diagnostic of what has already been normalized, rendered visible by an algorithm that is merely more honest than the systems it mirrors.
__________________________________________
Jean-Christophe Bélisle-Pipon is an Assistant Professor in Health Ethics, Faculty of Health Sciences, Simon Fraser University.

