Synopsys Shows Off First Synopsys-Ansys Products



SANTA CLARA, Calif. — At Synopsys Converge, the EDA giant unveiled the first results of its Ansys acquisition last summer, demonstrating how Ansys’ simulation tools will work with Synopsys tools and workflows to combine multi-physics design with silicon design from a single platform.

“When we closed the acquisition of Ansys, our customers’ first question was: When do I get the technology?” Synopsys CEO Sassine Ghazi said during his keynote at the event. “We promised the first half of ’26 because we know the team—we’ve had a partnership since 2017. We know customers’ requirements very well because we work closely with them, but to deliver on that fused technology in that timeframe was a fantastic execution from our team.”

Combined co-design

The new combined company is investing in three key areas, Ghazi said: co-design, digital twins, and agentic AI.

“We’ve been doing co-design and digital twins for decades, true—the concepts are not new,” he said. “But engineers don’t make an investment in their workflow just because; you do it because you have constraints to deliver. For future products, the constraints are massive.”

For a robotics system, this could mean satisfying electronic, mechanical, thermal, and fluidic constraints simultaneously, but tighter constraints lead to innovation, Ghazi said, since they require engineers to think outside the box.

Synopsys thinks of co-design both vertically (for example, within electronics subsystems) and horizontally (for example, across mechanical and electronic subsystems). Vertical co-design has been the norm for some time in silicon design, Ghazi said, but horizontal co-design is now needed. The complexity of designs is leading to over-designing, where handing off between different domains requires design margins, which stack up. The key is how to reduce this margin, he said.
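To illustrate why stacked hand-off margins drive over-design, consider the difference between summing each domain's guard band worst-case and combining the same uncertainties statistically. The numbers and domain names below are purely hypothetical, not from any Synopsys flow:

```python
# Hypothetical illustration of margin stacking across design domains.
# Each team adds a guard band at hand-off; naive worst-case summation is
# far more pessimistic than a root-sum-square (statistical) combination.
import math

# Illustrative per-domain guard bands, in picoseconds of timing slack
margins_ps = {"thermal": 12.0, "voltage-drop": 9.0, "packaging": 7.0, "aging": 5.0}

worst_case = sum(margins_ps.values())                      # naive stacking
rss = math.sqrt(sum(m * m for m in margins_ps.values()))   # statistical combination

print(f"worst-case stack: {worst_case:.1f} ps")
print(f"RSS combination:  {rss:.1f} ps")
print(f"margin reclaimed: {worst_case - rss:.1f} ps")
```

With these made-up numbers, the worst-case stack is 33.0 ps while the statistical combination is about 17.3 ps; the gap is the margin that a horizontal co-design flow, which sees all domains at once, could in principle reclaim.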

“In the AI superchips era, it’s no longer a vertical optimization, it’s a horizontal co-design,” he said. “[We need to] take into account thermal, warpage, the cracking of the dies, the mechanical aspect—and if you do it too late—think about the cost.”

The first combination of Synopsys and Ansys technologies will be called Multiphysics Fusion technology, aimed at building Ansys’ golden multiphysics engines into Synopsys EDA products to tackle electromagnetic, thermal, and mechanical effects alongside silicon design. The first release will include multiphysics solutions for timing signoff, multi-die, design closure, and analog design. These combinations are under beta test with customers today, Ghazi said.

As an example of a system requiring multiphysics design, Ghazi showed the world’s first HBM4 test chip (produced with a memory partner), using Synopsys IP to connect the logic die and memory stack.  

Synopsys and Ansys products are starting to combine. (Source: Synopsys)

As well as Synopsys products hosting Ansys technology, there are some product lines where Ansys is the host, Ghazi said.

“We had the first major Ansys product release since the acquisition, the R1 release, and again, we did not miss a beat,” he said. “There were a lot of worries from customers. This is a massive acquisition, complex integration, and we did it incredibly well based on our commitment to both our team internally and understanding the customer requirements to deliver to it.”

The R1 release provides a unified Synopsys-Ansys workflow that combines products from the two portfolios for the first time, for tasks such as safety analysis, materials discovery and development, manufacturing process improvement, photonic design and optical simulation, and test automation software.

Synopsys CEO Sassine Ghazi holds up the world’s first HBM4 test chip at Synopsys Converge (Source: EE Times)

Digital twinning

While the concept of co-design has been around for years, design complexity is driving the need for digital twins both at the silicon level and at the system level, at multiple levels of abstraction, Ghazi said. Physical prototyping is getting harder, to the point where it will become impossible for intelligent systems, he said.

Applications like automotive need digital twins of the physical product, of the electronics, and of the environment—all are separate engineering efforts, which means an ecosystem is needed to deliver.

Ghazi announced eDT (electronic digital twins), an open platform for digital twins based in the cloud, which can plug into ecosystem efforts for digital twins in different domains. It will initially focus on automotive use cases.

“Think of it as the operating system you need to design a digital twin for an autonomous vehicle,” Ghazi said. “This is the modern way of providing software-defined hardware-assisted verification to our customers.”

The company has already partnered with Nvidia on Omniverse for digital twins of the environment. Ansys Fluent (a computational fluid dynamics product) and Ansys AV Accelerate (for autonomous vehicles) will accelerate the bring-up of other domains’ digital twins inside Omniverse.

Agentic AI

“When I talk to customers about agentic AI, at the user level, there is still a mix of excitement and fear,” Ghazi said. “The fear is around how will it change my job? The excitement is [around] a huge productivity booster. As you go up the organizational chain, there’s a lot of excitement.”

Higher-level managers see the opportunities for agentic AI, he said, since they understand that the number of engineers they have is the limiting factor most of the time.

Synopsys is making progress up the levels of autonomy for agentic AI, he said. Six Synopsys co-pilot agents are available at L1; these can generate outcomes that the user would otherwise have to produce manually. There are 24 Synopsys task agents at L2, whose role is to deliver on specific tasks assigned by human engineers. At L3, Synopsys has three multi-agent workflows with an orchestration layer, which means agents can manage each other. L4 and L5 require contextual awareness across multiple agents; an intelligent system based on a reasoning agent will dynamically orchestrate other agents.

Synopsys announced its first L4 agentic workflow at Converge. This workflow can, for example, handle architectural spec to RTL, build test plans, and handle formal verification, static verification, coverage, and debug with separate task agents.

“We’ve made tremendous progress here,” Ghazi said. “My urgency to you is [to] explore, be open-minded. I know scepticism is always good, but not good to a point that not taking advantage of what’s possible and the speed at which things are moving.”

Customers can also plug in their own agents into the workflow, he said.

Nvidia endorsement

Special guest Nvidia CEO Jensen Huang joined Ghazi on stage at the end of the keynote to add context to all three of Ghazi’s key announcements.

Reinventing computer graphics multiple times, as Nvidia has done, certainly requires a level of co-design, Huang said.

Ghazi was joined by Nvidia CEO Jensen Huang. (Source: EE Times)

“Now we want to stack co-design from chips all the way out to the system, and now multi-systems, because in order for us to do distributed computing at the scale that we do, the computing fabric is inclusive of the CPU, the GPU, the scale-up switch, the scale-out switch, and the network processors are all part of the software stack,” Huang said.

The next step will be computers the size of a building: gigawatt data centers in which the building itself is part of the system.

“We have to design the whole thing and refactor and redesign our algorithms all at the same time,” he said. “We work from the top-down, bottom-up, inside-out, outside-in, all at the same time. That’s extreme co-design.”

Test benches for physical AI need to be representative of the physical world and obey the laws of physics, which is “incredibly hard,” Huang said, especially with both software and hardware in the loop, mixing algorithms with AI and agents, and doing this for multiple robots at the same time.

“Omniverse is one of the most complex software systems the world’s ever made, and it took us almost a decade to get here,” he said.

Huang also stressed the importance of AI agents in the design process.

“This is the thing that almost every single analyst gets wrong,” Huang said. “The limitation of Nvidia is not anything except for the number of engineers we have. And that’s the reason why we’re constantly hiring more engineers.”

Every human Nvidia engineer will be given multiple Synopsys agents, specialized in different parts of the design phase, to collaborate with, he said. 

“I was in the first generation of engineers that were able to use tools to design chips rather than just using schematics and doing it by hand,” Huang said. “The last generation of engineers before me never believed it, but the generation of engineers now, after me, can’t imagine living without it.”

Tomorrow’s engineers will rely on extreme co-design, digital twins, and agentic AI.

“We’re in a new phase,” Huang said.

