NXP Announces Expansion of its Scalable Machine Learning Portfolio and Capabilities

NXP Semiconductors today announced that it is enhancing its machine learning development environment and product portfolio. Through an investment, NXP has established an exclusive, strategic partnership with Canada-based Au-Zone Technologies to expand NXP’s eIQ™ Machine Learning (ML) software development environment with easy-to-use ML tools and expand its offering of silicon-optimized inference engines for Edge ML.

Additionally, NXP announced that it has been working with Arm as the lead technology partner in evolving Arm® Ethos-U™ microNPU (Neural Processing Unit) architecture to support applications processors. NXP will integrate the Ethos-U65 microNPU into its next generation of i.MX applications processors to deliver energy-efficient, cost-effective ML solutions for the fast-growing Industrial and IoT Edge.

Ron Martino, Senior Vice President and General Manager of Edge Processing at NXP Semiconductors, said: “NXP’s scalable applications processors deliver an efficient product platform and a broad ecosystem for our customers to quickly deliver innovative systems. Through these partnerships with both Arm and Au-Zone, in addition to technology developments within NXP, our goal is to continuously increase the efficiency of our processors while simultaneously increasing our customers’ productivity and reducing their time to market. NXP’s vision is to help our customers achieve a lower cost of ownership, maintain high levels of security for critical data, and stay safe with enhanced forms of human-machine interaction.”

Enabling Machine Learning for All 

Au-Zone’s DeepView™ ML Tool Suite will augment eIQ with an intuitive graphical user interface (GUI) and workflow, enabling developers of all experience levels to import datasets and models, then rapidly train and deploy NN models and ML workloads across the NXP Edge processing portfolio. To meet the demanding requirements of today’s industrial and IoT applications, NXP’s eIQ-DeepViewML Tool Suite will provide developers with advanced features to prune, quantize, validate, and deploy public or proprietary NN models on NXP devices. Its on-target, graph-level profiling capability will provide developers with unique run-time insights to optimize NN model architectures, system parameters, and run-time performance. By adding Au-Zone’s DeepView run-time inference engine to complement the open-source inference technologies in NXP eIQ, users will be able to quickly deploy and evaluate ML workloads and performance across NXP devices with minimal effort. A key feature of this run-time inference engine is that it optimizes system memory usage and data movement uniquely for each SoC architecture.
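To make the quantization step in the workflow above concrete, the sketch below shows the affine int8 quantization arithmetic commonly used when tools shrink NN weights for edge deployment. This is a generic illustration of the technique, not DeepView’s or eIQ’s actual implementation; all function and variable names are ours.

```python
# Minimal sketch of post-training int8 quantization: map a real-valued
# range [rmin, rmax] onto the integer range [qmin, qmax] with an affine
# (scale, zero-point) transform, as common quantization tools do.

def quantize_params(rmin, rmax, qmin=-128, qmax=127):
    """Derive scale and zero-point so [rmin, rmax] maps into [qmin, qmax]."""
    rmin, rmax = min(rmin, 0.0), max(rmax, 0.0)  # range must include 0.0
    scale = (rmax - rmin) / (qmax - qmin)
    zero_point = int(round(qmin - rmin / scale))
    return scale, zero_point

def quantize(values, scale, zero_point, qmin=-128, qmax=127):
    """Convert floats to clamped int8 codes."""
    return [max(qmin, min(qmax, int(round(v / scale)) + zero_point))
            for v in values]

def dequantize(codes, scale, zero_point):
    """Recover approximate floats from int8 codes."""
    return [(c - zero_point) * scale for c in codes]

weights = [-0.9, -0.25, 0.0, 0.4, 1.1]          # toy "layer weights"
scale, zp = quantize_params(min(weights), max(weights))
codes = quantize(weights, scale, zp)
approx = dequantize(codes, scale, zp)
```

Each recovered value differs from the original by at most about half a quantization step (roughly `scale / 2`), which is why 8-bit inference can track float accuracy closely when the weight range is well chosen.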

Brad Scott, CEO of Au-Zone, said: “Au-Zone is incredibly excited to announce this investment and strategic partnership with NXP, especially with its exciting roadmap for additional ML-accelerated devices. We created DeepView™ to provide developers with intuitive tools and inferencing technology, so this partnership represents a great union of world-class silicon, run-time inference engine technology, and a development environment that will further accelerate the deployment of embedded ML features. This partnership builds on a decade of engineering collaboration with NXP and will serve as a catalyst to deliver more advanced machine learning technologies and turnkey solutions as OEMs continue to transition inferencing to the Edge.”

Expanding Machine Learning Acceleration 

To accelerate machine learning in a wider range of Edge applications, NXP will expand its popular i.MX applications processors for the Industrial and IoT Edge with the integration of the Arm Ethos-U65 microNPU, complementing the previously announced i.MX 8M Plus applications processor with integrated NPU. The NXP and Arm technology partnership focused on defining the system-level aspects of this microNPU, which supports up to 1 TOPS (512 parallel multiply-accumulate operations at 1 GHz). The Ethos-U65 maintains the MCU-class power efficiency of the Ethos-U55 while extending its applicability to higher-performance, Cortex-A-based systems-on-chip (SoCs). The Ethos-U65 microNPU works in concert with the Cortex-M core already present in NXP’s i.MX families of heterogeneous SoCs, resulting in improved efficiency.
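The quoted throughput figure can be checked with a line of arithmetic: by the usual convention, each multiply-accumulate (MAC) counts as two operations (one multiply, one add), so 512 MACs per cycle at 1 GHz works out to just over 1 TOPS. A quick sketch of that calculation, using only the numbers stated above:

```python
# Derive the "up to 1 TOPS" figure from the stated configuration:
# 512 parallel MACs per cycle, 1 GHz clock, 2 ops per MAC (multiply + add).
macs_per_cycle = 512
clock_hz = 1_000_000_000   # 1 GHz
ops_per_mac = 2            # conventional counting: multiply and accumulate

tops = macs_per_cycle * clock_hz * ops_per_mac / 1e12
print(tops)  # 1.024 -> quoted as "up to 1 TOPS"
```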

Dennis Laudick, Vice President of Marketing, Machine Learning Group at Arm, said: “There has been a surge of AI and ML across industrial and IoT applications, driving demand for more on-device ML capabilities. The Ethos-U65 will power a new wave of edge AI, providing NXP customers with secure, reliable, and smart on-device intelligence.”

Availability 

The Arm Ethos-U65 will be available in future NXP i.MX applications processors. The eIQ-DeepViewML Tool Suite and the DeepView run-time inference engine, integrated into eIQ, will be available in Q1 2021. End-to-end software enablement, from training and validating to deploying existing or new neural network models, for the i.MX 8M Plus and other NXP SoCs, as well as future devices integrating the Ethos-U55 and U65, will be accessible through NXP’s eIQ Machine Learning software development environment. To learn more, read our blog and register for the joint NXP and Arm webinar on November 10.

Source: NXP USA, Inc.
