Neural Magic is a startup that develops technology enabling deep learning models to run on commodity CPUs rather than specialized hardware such as GPUs. The company was founded in 2018 by Alexander Matveev, a former MIT researcher, and Nir Shavit, a professor of computer science at MIT. To date, the company has raised a total of $50 million over three funding rounds from investors including Comcast Ventures, NEA, Andreessen Horowitz, Pillar VC, and Amdocs.
The company’s core technology is the Neural Magic Platform, whose 1.4 product release comprises the DeepSparse, SparseML, and SparseZoo libraries. The software optimizes deep learning models to run on CPUs with high performance and efficiency. It achieves this through a combination of algorithmic and hardware-aware optimizations that exploit characteristics unique to CPUs, such as large cache sizes, multi-threading capabilities, and vector instructions.
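To make the workflow concrete, here is a minimal sketch of how a sparsified model is typically served on a CPU through DeepSparse’s Python API. It is illustrative only: it assumes the deepsparse package is installed, and the task name, call signature, and model path are placeholders to be checked against Neural Magic’s own documentation rather than verified specifics.

```python
# Minimal, illustrative sketch of CPU inference with DeepSparse (not an official
# example). Assumes `pip install deepsparse`; MODEL_PATH is a placeholder for a
# SparseZoo stub ("zoo:...") or a local ONNX file taken from Neural Magic's docs.
from deepsparse import Pipeline

MODEL_PATH = "path/to/sparsified-model.onnx"  # placeholder, not a real path

# Build a task-specific pipeline that runs the sparsified model on the CPU.
sentiment = Pipeline.create(
    task="sentiment-analysis",   # assumed task name; check the DeepSparse docs
    model_path=MODEL_PATH,
)

# Run inference on ordinary text input; no GPU is involved.
# (The exact argument form may differ by library version.)
print(sentiment(["Running this model on a commodity CPU was surprisingly fast."]))
```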
The Neural Magic libraries are compatible with popular deep learning frameworks like TensorFlow and PyTorch, making it easy for developers to integrate them into their existing workflows. The software can be used for a variety of applications, including natural language processing, computer vision, and time series analysis.
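For example, a model trained in PyTorch is commonly exported to ONNX before being handed to a CPU inference runtime. The short sketch below shows that export step; the ResNet-18 model and input shape are placeholders chosen for illustration, not part of Neural Magic’s tooling, which also ships its own export utilities.

```python
# Illustrative sketch: exporting a trained PyTorch model to ONNX, the interchange
# format commonly consumed by CPU inference runtimes. The model and input shape
# below are placeholders; substitute your own trained network.
import torch
import torchvision.models as models

model = models.resnet18(weights=None).eval()   # stand-in for a trained model
dummy_input = torch.randn(1, 3, 224, 224)      # example input with the expected shape

torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",                              # file the CPU runtime will load
    input_names=["input"],
    output_names=["output"],
    dynamic_axes={"input": {0: "batch"}, "output": {0: "batch"}},
)
```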
One of the key advantages of using CPUs instead of specialized hardware is cost. CPUs are much cheaper and more widely available than GPUs, which can be prohibitively expensive for some organizations. This makes deep learning more accessible to a wider range of businesses and researchers.
Another advantage is flexibility. CPUs are general-purpose processors that can handle many kinds of workloads, whereas GPUs are designed specifically for massively parallel computation. A CPU-based deployment can therefore be shared across a wider range of applications, making it a more versatile option.
Neural Magic also offers improved performance compared to running deep learning models on CPUs without optimization: in benchmarks conducted by the company, the software achieved up to 10 times faster performance than unoptimized models running on the same CPUs.
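Readers who want to sanity-check such claims on their own hardware can use a simple timing harness like the sketch below. It is generic Python, not Neural Magic’s benchmarking tooling; the two predict functions are placeholders standing in for an unoptimized forward pass and a sparsified model running in a CPU-optimized runtime.

```python
# Generic latency-comparison harness (illustrative; not Neural Magic's benchmark
# suite). Replace the placeholder callables with real baseline and optimized
# inference calls to measure a speedup on your own CPU.
import time
import statistics

def median_latency_ms(fn, runs=50, warmup=5):
    """Median wall-clock latency of fn() in milliseconds over `runs` calls."""
    for _ in range(warmup):
        fn()
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append((time.perf_counter() - start) * 1000.0)
    return statistics.median(samples)

# Placeholder workloads standing in for real model inference calls.
baseline_predict = lambda: sum(i * i for i in range(200_000))
optimized_predict = lambda: sum(i * i for i in range(20_000))

baseline_ms = median_latency_ms(baseline_predict)
optimized_ms = median_latency_ms(optimized_predict)
print(f"baseline: {baseline_ms:.2f} ms  optimized: {optimized_ms:.2f} ms  "
      f"speedup: {baseline_ms / optimized_ms:.1f}x")
```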
The company has already attracted a number of customers and partners. Striveworks, a pioneer in responsible MLOps for national security and other highly regulated spaces, has announced a partnership under which it will integrate Neural Magic’s core offerings into the training and model services of its Chariot MLOps platform.
Neural Magic has also developed a number of pre-trained, pre-optimized models, available through its SparseZoo repository, that can be used out of the box for a variety of applications. Because these models are already optimized to run on CPUs, they offer a fast and efficient option for organizations that need to get up and running quickly.
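As a rough sketch of what “out of the box” looks like in practice, the snippet below loads a pre-optimized ONNX model into the DeepSparse engine and runs it on a batch of inputs. The function names reflect my reading of Neural Magic’s engine-level documentation and should be treated as assumptions that may differ by version; the model path and input shape are placeholders.

```python
# Hedged, illustrative sketch of engine-level CPU inference. `compile_model` and
# `engine.run` follow my recollection of the DeepSparse docs and may differ by
# version -- treat the API names as assumptions. MODEL_PATH is a placeholder.
import numpy as np
from deepsparse import compile_model

MODEL_PATH = "path/to/pretrained-sparse-model.onnx"  # placeholder path
BATCH_SIZE = 1

# Compile the pre-optimized model for this machine's CPU.
engine = compile_model(MODEL_PATH, batch_size=BATCH_SIZE)

# Run a single batch of (placeholder) image-shaped inputs through the engine.
inputs = [np.random.rand(BATCH_SIZE, 3, 224, 224).astype(np.float32)]
outputs = engine.run(inputs)
print([o.shape for o in outputs])
```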
Neural Magic’s technology has the potential to transform the field of deep learning by making it more accessible and affordable for a wider range of organizations. By enabling deep learning models to run on CPUs with high performance and efficiency, the company is helping to democratize AI and unlock its potential for new applications.
Neural Magic also has strong ties to the AI research community and has published a number of research papers on model sparsification and compression.
In conclusion, Neural Magic is a promising startup company that is developing innovative technology to optimize deep learning models for CPUs. The company’s software libraries and tools enable high-performance and efficient deep learning on commodity hardware. Its technology has the potential to make deep learning more accessible and affordable for a wider range of organizations, unlocking its potential for new applications.
Contributed by Daniel D. Gutierrez, Editor-in-Chief and Resident Data Scientist for insideAI News. In addition to being a tech journalist, Daniel is also a consultant in data science, an author, and an educator, and sits on a number of advisory boards for various start-up companies.
Sign up for the free insideAI News newsletter.
Join us on Twitter: https://twitter.com/InsideBigData1
Join us on LinkedIn: https://www.linkedin.com/company/insidebigdata/
Join us on Facebook: https://www.facebook.com/insideAINewsNOW