The Risk of Making Robots Too Human

Robots are not the problem. The problem is making them overly human-like.

For decades, automation reshaped American industry with little drama. Robotic machines took over dangerous and repetitive tasks in auto plants. Machines like laser screeds transformed concrete work, delivering speed and precision no crew could match. I remember watching those machines for the first time. What struck me was not fear, but acceptance. People stepped aside. Jobs changed or disappeared, but there was no uprising.

That is because those machines stayed in the background. They lived in factories, warehouses, and job sites. They did not pretend to be us.

That line is now being crossed.

The current push is not simply for smarter machines, but for robots that look, move, and interact like humans. They are designed to walk among us, speak to us, and perform work that involves judgment, service, and trust. This is not a technical upgrade. It is a cultural provocation, and its consequences are being underestimated.

Americans are already uneasy about the future of work. Artificial intelligence has accelerated fears about displacement and insecurity. While I believe AI will increase opportunity, many people believe opportunity is shrinking, not expanding. Trust in institutions is thin. Optimism is fragile. In that environment, rolling out machines that resemble people sends a blunt message, intended or not, that humans are becoming interchangeable.

That perception matters more than the technology itself.

A robotic arm welding a car frame does not feel like a rival. A machine with a face, a voice, and human mannerisms does. When robots begin to imitate people, the issue stops being efficiency and becomes dignity. Work is not just a transaction. It is tied to identity, pride, and social standing. Once machines appear to encroach on that territory, resistance is no longer theoretical. It is emotional, and it is inevitable.

We are already seeing the early signs. Public skepticism toward artificial intelligence is growing. Online discourse has hardened into anger and distrust. Platforms like X amplify resentment rather than resolve it. When people feel ignored or replaced, they look for symbols. Humanlike robots are an easy target. They turn abstract anxiety into something visible and personal.

This is why the coming backlash will not resemble past transitions. When mobilized labor, hard-hit communities, and Wall Street start feeling the impact, it will not be quiet, gradual, or limited to labor markets and regulatory hearings. It will be cultural. And once that opposition takes hold, it will be difficult to unwind.

None of this is an argument against innovation. Machines that improve safety, productivity, and quality are essential. Autonomous vehicles like Waymo, industrial robots, inspection systems, and automated construction equipment expand human capability without challenging human identity. They do the work we should not have to do. They stay in their lane.

The problem arises when designers and companies assume that human likeness equals progress. A robot does not need a face to be effective. It does not need a personality to deliver value. Adding human traits may create compelling demos and viral videos, but it also raises expectations and fears that technology cannot manage.

Progress is not defined solely by what engineers can build. It is defined by what society is willing to accept. If the goal is adoption at scale, trust matters more than novelty. Pushing too far toward imitation risks triggering a resistance strong enough to slow real innovation altogether.

Efficiency matters. Advancement matters. But restraint matters too. Making robots more human may turn out to be a costly mistake, not because the technology fails, but because it ignores the people who have to live alongside it.

Bob Clark