The next wave of automation may or may not be humanoid, but the success of robotics – otherwise known as “physical AI” or “embodied AI” – will rely on a technology ecosystem that is still in its early stages, according to industry experts interviewed by EE Times.
Universal Robots is the largest manufacturer of collaborative robots, or cobots, a type of industrial robot designed to work in the same space as humans. Anders Billesø Beck, vice president of technology at Universal Robots, told EE Times that cobots unlock a range of applications he calls “human-scale automation”: augmenting or replacing humans in the workflow.
“We’ve been struggling with [the variability of tasks] in traditional engineering for decades – we can model things if they are consistent or expected, that’s a solvable engineering problem, [but where it isn’t,] AI can solve it,” Beck said.
Logistics, where warehouse operations are largely automated today, would have been impossible to automate without AI, Beck said. Packages come in different shapes and sizes (some retailers Universal Robots works with have as many as half a million SKUs), and pallets in the real world look very different from what comes out of the pallet factory – they are dented, painted and shrink-wrapped. A Universal Robots autonomous pallet jack had to be trained on 100,000 real images and 1.5 million synthetic images of pallets to create a robust pallet-recognition model.
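As a rough illustration of how such a mixed dataset can feed training, the sketch below combines a smaller real-image set with a much larger synthetic one and weights sampling toward the real images. The directory layout, weights and classification-style loading are assumptions made for illustration, not Universal Robots' actual pipeline.

```python
# Illustrative only: mixing real and synthetic pallet images for training,
# with sampling weighted so the scarcer real images are not drowned out.
import torch
from torch.utils.data import ConcatDataset, DataLoader, WeightedRandomSampler
from torchvision import datasets, transforms

tf = transforms.Compose([transforms.Resize((480, 640)), transforms.ToTensor()])

# Hypothetical directory layout; the sizes in the comments mirror the article.
real = datasets.ImageFolder("data/pallets/real", transform=tf)        # ~100,000 images
synth = datasets.ImageFolder("data/pallets/synthetic", transform=tf)  # ~1.5 million images

combined = ConcatDataset([real, synth])

# Each real image gets a higher sampling weight than each synthetic one,
# so a typical batch still contains a healthy share of real-world pallets.
weights = [3.0] * len(real) + [1.0] * len(synth)
sampler = WeightedRandomSampler(weights, num_samples=len(combined), replacement=True)

loader = DataLoader(combined, batch_size=64, sampler=sampler, num_workers=8)
```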

“The beauty [of AI] is that it is software-updateable,” Beck said. “Some customers might have their own custom pallet, which might not look like a pallet, it might look more like a big plastic tub with a pallet on the bottom. To add that to our model – we can augment the model, it doesn’t need a lot of work because the foundation layers are still there – just a bit of refinement, and off you go.”
A team working without AI would need months of bespoke engineering to adapt to these changes, he said.
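The refinement Beck describes maps onto standard transfer learning: keep the foundation layers frozen and retrain only the task-specific heads on a handful of images of the new pallet type. The sketch below uses an off-the-shelf torchvision detector as a stand-in for the pallet model; it is an assumption for illustration, not Universal Robots' actual model or tooling.

```python
# Illustrative only: refine a pretrained detector on a small customer-specific
# dataset while keeping the foundation (backbone) layers frozen.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

def refine_on_custom_pallets(loader, lr=1e-3):
    model = fasterrcnn_resnet50_fpn(weights="DEFAULT")

    # The foundation layers stay as they are: freeze the backbone.
    for p in model.backbone.parameters():
        p.requires_grad = False

    # Only the region-proposal and box-head parameters are refined.
    trainable = [p for p in model.parameters() if p.requires_grad]
    opt = torch.optim.SGD(trainable, lr=lr, momentum=0.9)

    model.train()
    for images, targets in loader:          # images: list of tensors; targets: list of box/label dicts
        loss_dict = model(images, targets)  # detection models return a dict of losses in train mode
        loss = sum(loss_dict.values())
        opt.zero_grad()
        loss.backward()
        opt.step()
    return model
```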
Future AI-enabled robots will be able to take on applications like assembly, which makes up around 40% of all industrial processes. Assembly of small parts – screwdriving, plugging in modules or cables, routing wires – has a very high mix of subtly different tasks, which is too varied for today’s cobots.
“We’re going to start to see robots moving into jobs where the workflows change daily or hourly, where some of these tasks are really complex, where human finesse is needed and where mobility and the task execution integrates that,” Beck said. “That will also spark growth in non-industrial applications.”
Non-industrial robots could include home assistants that perform complex tasks like cooking, or hospitality helpers in restaurants and shops.

Humanoid robots
Visit the show floor at Nvidia’s GTC and you’ll hardly be able to turn a corner without coming across a humanoid robot. This form factor is complex and requires advanced AI, and while it seems to have captured the industry’s imagination, it’s not yet suitable for industrial automation applications, Beck said.
Industrial processes are already optimized for performance, making them relatively low-hanging fruit when it comes to automation, Beck said.
“What’s left is applications where you need the extreme flexibility and capabilities that humans have, because humans are amazing in their versatility – here, humanoids could fit,” Beck said. “But I do think there will be many form factors rolling out the deployment of AI, and there’s no doubt we’ve just started to overcome a lot of the technical hurdles that the humanoid form factor has to overcome.”
Safety is one of these hurdles. An unstable humanoid cobot could fall on a person in its working environment, which isn’t a trivial problem to solve, Beck said.
Safety requirements
Functional safety for robotics has historically been complex, Beck said, and AI is making it even more complicated. Processes like reasoning do not themselves need to be functionally safe, because they are balanced by safety measures such as power and force limitations, but anything that increases unpredictability, AI included, is naturally at odds with safety.
“Everything around safety likes predictability,” Beck said. “If you can describe what the robot will do, then you can do your risk assessment based on an understanding of what it will do. If you can’t fully describe what it will do, then you need to be more open-minded in how you describe the behaviors of the robot.”
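One common way to reconcile the two is to keep the AI planner outside the safety function entirely and put a simple, fully predictable limiting layer between it and the drives. The sketch below shows that pattern in schematic form; the limits and names are illustrative, not taken from any real controller or standard.

```python
# Illustrative only: a deterministic safety layer that clamps whatever an
# unpredictable AI planner commands into a certified power/force/speed envelope.
from dataclasses import dataclass

@dataclass(frozen=True)
class SafetyLimits:
    max_speed_mps: float = 0.25   # illustrative collaborative speed limit, m/s
    max_force_n: float = 150.0    # illustrative contact force limit, N

def enforce(cmd_speed: float, cmd_force: float, limits: SafetyLimits) -> tuple[float, float]:
    """Clamp the planner's command before it reaches the motors; this layer,
    not the planner, is what the risk assessment describes."""
    speed = max(-limits.max_speed_mps, min(cmd_speed, limits.max_speed_mps))
    force = max(0.0, min(cmd_force, limits.max_force_n))
    return speed, force

# The planner's output is advisory; only the clamped values are executed.
safe_speed, safe_force = enforce(cmd_speed=1.2, cmd_force=400.0, limits=SafetyLimits())
```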
Innovation in safety will be a big topic for the next 10 years, Beck said, both from the silicon and the ecosystem side.
“We need to continuously make those workflows easier, because the barrier of entry for innovating on safety is high, it needs specialists,” he said. “We’ve seen that with many robot companies, that has been their stumbling block – they’re just getting to that point where certified safety is a core part of what they do.”

Arm is watching safety and security standards closely, Paul Williamson, senior vice president and general manager of IoT at Arm, told EE Times.
Williamson said that while advanced safety features are needed for some robotics applications, some systems will handle safety at a different level, perhaps by separating the human workspace from the robot's or by having a separate kill switch.
“It’s important to provide those different levels of safety capability to suit the different deployment models, because to drive innovation, you don’t want to burden every form factor with the cost of the most advanced systems,” he said.
Software ecosystem
Universal Robots has a partner ecosystem in which third parties can build hardware peripherals (grippers, sanders, glue dispensers, welders, cameras and the like) and write software plugins for its cobots.
“No company in the world can solve everything on its own,” Beck said. “Robotics is both the science and the art of integration – you need a lot of systems and technologies playing well together, and that needs good silicon infrastructure, good technologies at every level, and good interfaces for integration.”
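In schematic terms, such a plugin ecosystem rests on a stable interface that every third-party peripheral implements. The sketch below is a generic illustration of that idea, not Universal Robots' actual SDK; the class and method names are hypothetical.

```python
# Illustrative only: a generic peripheral-plugin contract a cobot platform
# might expose so third-party grippers, welders or cameras all integrate the
# same way. Names are hypothetical, not from any real SDK.
from abc import ABC, abstractmethod

class PeripheralPlugin(ABC):
    """Interface a third-party device driver implements."""

    @abstractmethod
    def connect(self, bus_address: str) -> None: ...

    @abstractmethod
    def execute(self, command: str, **params) -> dict:
        """Run a device-specific command and report status to the controller."""

class HypotheticalGripper(PeripheralPlugin):
    def connect(self, bus_address: str) -> None:
        print(f"gripper online at {bus_address}")

    def execute(self, command: str, **params) -> dict:
        if command == "grip":
            return {"ok": True, "width_mm": params.get("width_mm", 0)}
        return {"ok": False, "error": f"unknown command {command!r}"}

# The controller sees only the common interface, so any certified peripheral
# can be swapped in without rewriting the robot program.
tool: PeripheralPlugin = HypotheticalGripper()
tool.connect("modbus:/dev/ttyUSB0")
print(tool.execute("grip", width_mm=40))
```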
Last year, Universal Robots introduced AI deployment infrastructure for AI companies that want to deploy their models on Universal Robots' hardware, a move that was welcomed by many software-centric startups, Beck said.
“[Startups] created great software products that they wanted to take out to customers, but they realized they need to come up with computing infrastructure, and support it,” Beck said. “We’ll have an Arm-based Nvidia-accelerated computer to run the AI models, and we offer long-term industrial product life cycle management and support – so startups can focus on being startups.”
This lets startups focus on what they do best without having to build and support their own compute infrastructure. The partner ecosystem currently includes more than 500 certified cobot-compatible products designed by more than 380 third parties, Beck said.
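For a startup, handing off the hardware typically comes down to exporting a trained model into a portable format that an Arm-based, GPU-accelerated controller can run. The sketch below shows one common route, ONNX export from PyTorch; this is a generic illustration, not Universal Robots' or Nvidia's actual deployment toolchain.

```python
# Illustrative only: export a trained PyTorch model to ONNX so it can be served
# on an Arm-based, GPU-accelerated controller by whatever runtime the platform
# provides (for example ONNX Runtime or TensorRT).
import torch
import torchvision

model = torchvision.models.resnet18(weights="DEFAULT").eval()  # stand-in for a real perception model
dummy_input = torch.randn(1, 3, 480, 640)                      # one camera frame, NCHW

torch.onnx.export(
    model,
    dummy_input,
    "pallet_model.onnx",
    input_names=["camera_frame"],
    output_names=["predictions"],
    dynamic_axes={"camera_frame": {0: "batch"}},  # allow variable batch size
)
```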

A broad software ecosystem is essential, and the smartphone industry shows what one can unlock, Williamson said.
“The reason this excites Arm is we’re all about building these ecosystems and seeing where we can unlock potential through software commonality and ecosystems,” he said. “[As with smartphones, Arm’s] enabling different performance levels and price points and capabilities and devices… allowed software developers to then target those multiple performance and price points in the industry.”
The same underlying architecture can enable both high-end and more power- and cost-efficient devices to exist in a consistent software ecosystem, Williamson said.
“Our goal is to ensure the software ecosystem allows for that flexibility, for customization and tuning, whether that be Cortex-A Linux-based platforms and virtualization spanning different silicon vendors’ platforms to allow models to be deployed for different computing environments, or whether it’s closer to camera operation or low power sensing, having that common software developer flow and tools in the cloud that are already tuned and optimized for Arm allows that portability,” he said.
Chip customers
Arm’s customers in robotics today include chipmakers like NXP, Infineon, Qualcomm and Nvidia. Does Williamson see a future where, perhaps driven by the desire for custom innovation at the silicon level, big robotics companies make their own chips?
“The economics for vertical integration in the robotics sector today isn’t there,” he said.
Some chipmakers with automotive offerings are trying to redeploy their investment into the robotics space, as there is some crossover in requirements, but in general, there is such a breadth of potential form factors that a range of platforms will be required, Williamson said. The trend is towards higher-performance silicon in robots as use cases evolve and volumes grow, he added.
Arm provides compute subsystem (CSS) designs that integrate multiple Arm cores and interconnect into ready-made subsystems for particular verticals, which today include the data center, mobile devices and automotive. Is a CSS for robotics coming?
“We’re always looking at market demands,” Williamson said. “At the moment [in robotics] there is a breadth of performance and form factor, but if that tends to standardize, and if we see clear demands that we think need to be met at the system architecture level, we’d be keen to provide those for the industry.”

Humanoid potential
Both experts agreed that the humanoid form factor could develop into one of the most exciting forms of robotics, albeit a little further into the future.
“I’m super excited about [humanoid robots] because they’re driving innovation, but I think we’re going to see a lot more form factors become more relevant in the short term, and solve real world commercial problems in the short term, which are going to be as important as the long-term opportunity that a humanoid form factor could provide,” Williamson said.
“[Humanoids] generate a whole new wave of innovation within robotics, which is very welcome after maybe a decade of a bit of an innovation drought within robotics, so I’m really excited about that,” Beck said. “There’s no doubt there’s going to be a market for humanoids. Is it going to displace everything? I’m not so sure, but it’s exciting to follow.”