NeuroBlade Raises Funds to Launch its Compute-in-Memory Chip

NeuroBlade, the compute-in-memory startup, has secured $83 million to help market its data analytics accelerator based on its XRAM computational memory chip. The Series B round brings NeuroBlade’s total funding to $110 million since its founding in 2018.

The Israel-based startup has developed a data analytics architecture that eliminates major data movement bottlenecks by integrating data processing functions in memory. While its focus so far has been on the architecture enabling the chip, the company will also deliver a data analytics tool, dubbed Xiphos, billed as a system-level appliance.

The approach addresses latency created by the constant shuffling of data among storage, memory and processors. The company argues that data movement is the primary cause of poor application performance and slow response times. With current architectures unable to scale to meet future data analytics requirements, NeuroBlade designed a computational architecture that reduces data movement while boosting data analytics performance. Xiphos, the appliance based on the XRAM computational memory chip, is aimed at accelerating data analytics, unclogging traditional I/O bottlenecks and overcoming system bandwidth limitations.

NeuroBlade claims to be among the first to bring in-memory processing to production. The biggest stumbling block has been a lack of software, according to Elad Sity, the startup’s CEO and co-founder. “We tried to understand why previous computation in-memory systems failed. Software was a key contributor to that lack of success. That’s because nobody really wants to write software for a new processor,” Sity added in an interview.

NeuroBlade co-founders Eliad Hillel (left) and Elad Sity. (Source: NeuroBlade)

NeuroBlade considered various memory architectures to build upon, ultimately integrating its computational cores with DRAM. “We rebuilt the entire DRAM chip from scratch, as well as the toolchain and verification software. Since there were no tools… we had to build everything from the ground up,” Sity said. “We started building the XRAM technology in 2018 and got the first chip back in 2020.”

The patented XRAM technology is a combination of DRAM and embedded processing logic, with processing elements integrated near memory banks to provide massively parallel execution at high bandwidth. As a result, data-intensive workloads can be processed with far less data movement and higher performance, accelerating operations such as data comparison and manipulation, aggregations, filtering and lookups.
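A rough way to see why pushing filtering and aggregation toward memory helps: in a conventional pipeline, every row crosses the memory/I/O boundary to the host CPU before being filtered, while a processing-in-memory design filters and partially aggregates near the memory banks so only a small result crosses. The sketch below is purely illustrative (the function names and data layout are invented for this example, not NeuroBlade’s API), but it captures the difference in bytes moved:

```python
# Illustrative sketch only -- not NeuroBlade's API. It contrasts how much
# data crosses the memory-to-CPU boundary under two strategies for the
# same filtered aggregation (sum of "amount" where region == "EU").

rows = [{"region": "EU" if i % 2 == 0 else "US", "amount": i}
        for i in range(100_000)]

def host_side_total(rows):
    """Conventional path: every row is transferred, then the CPU
    filters and aggregates. Returns (result, rows_moved)."""
    moved = list(rows)  # stands in for transferring all rows to the host
    total = sum(r["amount"] for r in moved if r["region"] == "EU")
    return total, len(moved)

def pushdown_total(rows):
    """Near-memory path: filtering and partial aggregation happen where
    the data lives; only one aggregate value is transferred."""
    total = sum(r["amount"] for r in rows if r["region"] == "EU")
    return total, 1  # a single value crosses the boundary

host_total, host_moved = host_side_total(rows)
push_total, push_moved = pushdown_total(rows)
assert host_total == push_total   # same answer...
assert push_moved < host_moved    # ...with far less data movement
```

The operations NeuroBlade lists (comparison, aggregation, filtering, lookups) all share this shape: the output is much smaller than the input scanned, which is exactly when reducing data movement pays off.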

Along with the compute architecture, “We built a data analytics accelerator that processes and analyzes data over 100 times faster than existing systems,” Sity claimed.

NeuroBlade promotes its appliance as addressing data analytics challenges faced by giant data center operators. “When we talked to the hyperscalers, they all said data analytics was a key part of what they are looking for from their hardware,” Sity said.

The Xiphos appliance based on NeuroBlade’s XRAM computational memory chip accelerates data analytics by overcoming I/O bottlenecks and system bandwidth limits. (Source: NeuroBlade)

Hence, NeuroBlade’s emphasis is on providing OEMs with servers rather than chips. “Partnerships will be key, and we are working closely with companies like Intel and SAP,” Sity said. “Most of our investors are also strategic partners.”

Investors include Corner Ventures, Intel Capital, StageOne Ventures, Grove Ventures and Marius Nacht along with technology companies like MediaTek, Pegatron, Powerchip Semiconductor Manufacturing Corp., United Microelectronics Corp. and Marubeni.

Sity said the new funding will also be used to expand the engineering team in Tel Aviv. With more than 100 employees, NeuroBlade has begun shipping its data accelerator to customers and partners, who are integrating it into some of the world’s largest data centers.

“We see that computational memory also has so much potential in other areas, so we will also invest in research and development,” Sity added. Over the next two years, the startup will also build its customer base before entering volume production, which is expected in 2023.

Among NeuroBlade’s partners is Intel. Lance Weaver, an Intel vice president and general manager of its data center and cloud strategy, said, “Despite being tested like never before this past year, the data center kept the world operating at a critical time. This market is poised for explosive growth, and we think that NeuroBlade looks to have a promising journey ahead.”

The NeuroBlade platform is based on Intel’s Xeon Scalable processors, FPGAs, Optane memory technology and network interface controllers.

Partner SAP said it expects to continue working with NeuroBlade on a new data analytics accelerator based on in-memory processing. Said Patrick Jahnke, head of SAP’s innovation office: “The performance projections and breadth of use cases [show] great potential for significantly increased performance improvements for [database management services] at higher energy efficiency and reduced total cost of ownership,” either on-premises or in the cloud.
