Does AI Require a Physical Form to Achieve Human-Like Intelligence?

July 1, 2025

Have you ever wondered if a machine really needs a body to learn and reason like a human? While most of us interact with chatbots that exist solely in cyberspace, a growing number of experts believe that a physical form might be essential for truly human-like intelligence.

Remember Rosie from The Jetsons or C-3PO from Star Wars? Early portrayals of AI often gave machines a physical presence. Yet for many of us, the first encounter with disembodied AI came through the film WarGames, whose Joshua was an intelligence with no body at all. This contrast raises a simple but intriguing question: can a machine understand human emotions, ethics, or logic without physically interacting with our world?

Today’s AI systems – especially large language models – show impressive capabilities but still fall short in complex reasoning. A study from Apple reveals that as problems become more intricate, these systems sometimes struggle to apply consistent logic. Former Google researcher Nick Frosst puts it plainly: what these models do is predict the next word, not truly grasp what we mean.

Early efforts in AI, known as Good Old-Fashioned Artificial Intelligence (GOFAI), focused on symbolic logic and abstract thought. Nowadays, some researchers suggest that to develop real artificial general intelligence, we might have to look at how a body interacts with its environment. This approach, known as embodied cognition, argues that a physical form allows machines to learn from real-world experiences in a way that pure computation simply can’t match.

Cecilia Laschi, a pioneer in soft robotics, sees potential in giving machines softer, more adaptable bodies – inspired by the flexibility of an octopus, for example. By adapting mechanically to their surroundings rather than relying solely on raw computational power, these robots could navigate unpredictable environments more effectively.

Over at UCLA, Associate Professor Ximin He is working on materials that come with a dash of autonomous physical intelligence (API). These materials can sense and react to changes, much like biological tissues. Imagine a future where devices adjust themselves automatically to meet our needs – from medical equipment to everyday technology.

Soft robotics might still be in its early days, but the potential applications are vast. Whether it’s creating medical devices that respond to patient needs or designing autonomous systems that learn from experience, this approach could pave the way for machines that understand the world more deeply.

Bioengineering expert Giulio Sandini sums it up well: if we want machines that mirror human intelligence, they must be able to gather their own experiences. This journey towards artificial general intelligence isn’t just about sharper code – it might require building bodies that let machines truly engage with the world.