Deci and Intel Collaborate to Optimise Deep Learning Inference on Intel’s CPUs

Deci, the deep learning startup building the next generation of AI, has announced a broad strategic business and technology collaboration with Intel Corporation to optimise deep learning inference on Intel Architecture (IA) CPUs. As one of the first companies to participate in the Intel Ignite startup accelerator, Deci will now work with Intel to deploy innovative AI technologies to mutual customers.

The collaboration between Deci and Intel takes a significant step towards enabling deep learning inference at scale on Intel CPUs, reducing costs and latency, and enabling new applications of deep learning inference.

New deep learning tasks can be performed in real time on edge devices, and companies running large-scale inference workloads can dramatically cut cloud or data centre costs simply by switching their inference hardware from GPUs to Intel CPUs.

“By optimising the AI models that run on Intel’s hardware, Deci enables customers to get even more speed and will allow for cost-effective and more general deep learning use cases on Intel CPUs,” said Deci CEO and Co-founder, Yonatan Geifman. “We are delighted to collaborate with Intel to deliver even greater value to our mutual customers and look forward to a successful partnership.”

Deci and Intel’s collaboration began with MLPerf, where Deci’s AutoNAC (Automated Neural Architecture Construction) technology accelerated the inference speed of the well-known ResNet-50 neural network on several popular Intel CPUs, reducing the submitted models’ latency by a factor of up to 11.8x and increasing throughput by up to 11x. Deci’s AutoNAC technology uses machine learning to redesign any model and maximise its inference performance on any hardware - all while preserving its accuracy.

Monica Livingston, AI Solutions and Sales Director at Intel said: “Deci delivers optimised deep learning inference on Intel processors as highlighted in MLPerf. Optimising advanced AI models on general purpose infrastructure based on Intel Xeon Scalable CPUs allows our customers to meet performance SLAs, reduce cost, decrease time to deployment, and gives them the ability to effectively scale.”  

This collaboration started with Deci being one of the first companies to join Intel Ignite, an accelerator program designed to support innovative startups advancing new technologies in disruptive markets. The two companies are developing pilots of Deci’s platform for select customers in the enterprise, cloud, communications, and media segments, enabling them to scale up and further accelerate their deep learning usage on Intel CPUs. As the results of these engagements are shared, the companies plan to roll out the Deci platform to a broader base of customers.

Startup Details

Deci AI

Deci, which means 'tenth' (decimus in Latin), is ushering in a new AI paradigm by using AI to build and operate AI models. Deci’s deep learning platform enables data scientists to transform their AI models into production-grade solutions on any hardware, crafting the next generation of AI for enterprises across the board.

Deci’s proprietary AutoNAC (Automated Neural Architecture Construction) technology autonomously redesigns an enterprise’s deep learning models to squeeze the maximum utilisation out of its hardware.

Founded in 2019 and based in Tel Aviv, Deci has a team of deep learning experts dedicated to eliminating production-related bottlenecks across the AI lifecycle, freeing developers and engineers to do what they do best - create innovative AI solutions for our world’s complex problems.

  • Headquarters Regions
    Tel Aviv, Israel
  • Founded Date
    2019
  • Founders
    Jonathan Elial, Ran El-Yaniv, Yonatan Geifman