Nvidia’s Jetson Nano Puts AI In The Palm Of Your Hand

We (TIRIAS Research) recently had an opportunity to evaluate the latest Jetson platform from Nvidia. At just 45mm x 70mm, the Jetson Nano is the smallest Artificial Intelligence (AI) platform Nvidia has produced to date. The Jetson Nano is powered by the Tegra X1 SoC, which features four 1.43 GHz Arm Cortex-A57 CPU cores and a 128-core Maxwell GPU. The Jetson Nano also uses the same JetPack Software Development Kit (SDK) as the other Jetson platforms, the TX2 and AGX Xavier, allowing for cross-platform development. At only $99, plus a little extra for accessories, the Jetson Nano is an amazing platform.

In addition to the Tegra X1 SoC, the Nano developer kit comes configured with 4GB of LPDDR4 memory and plenty of I/O options, including a MIPI CSI connector, four USB 3.0 Type-A ports, one USB 2.0 Micro-B port, one Gigabit Ethernet port, and 40 GPIO pins. The Nano can drive dual displays through its DisplayPort and HDMI ports, offers a microSD card slot for storage, and includes a somewhat hidden M.2 Key E connector for expansion modules/daughter cards that add optional functions like wireless connectivity. The Jetson Nano developer kit comes with a sizable heatsink for passive cooling but has holes drilled for add-on fans. For our evaluation, we used a Noctua NF-A4x20 5V PWM fan and a Raspberry Pi MIPI Camera Module v2 from RS Components and Allied Electronics.

For development software, the Nano runs an Ubuntu Linux OS and uses the JetPack SDK, which supports Nvidia’s CUDA development environment as well as other common AI frameworks and libraries, such as TensorRT, VisionWorks, and OpenCV.
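Because the same JetPack stack spans the whole Jetson family, application code can probe at runtime for whichever accelerated libraries are installed and degrade gracefully elsewhere. A minimal sketch of that pattern (the library names are real, but this probing function is our own illustration, not an official JetPack API):

```python
def pick_backend():
    """Return the best available inference backend on this machine.

    Tries TensorRT first (installed by JetPack on Jetson boards),
    then CUDA-enabled OpenCV, and falls back to plain CPU.
    """
    try:
        import tensorrt  # noqa: F401 -- present on JetPack installs
        return "tensorrt"
    except ImportError:
        pass
    try:
        import cv2
        # cv2.cuda is only usable when OpenCV was built with CUDA support
        if cv2.cuda.getCudaEnabledDeviceCount() > 0:
            return "opencv-cuda"
    except Exception:  # cv2 missing, or built without CUDA
        pass
    return "cpu"

print(pick_backend())
```

On a Jetson Nano with JetPack installed this would report "tensorrt"; on a plain desktop without GPU libraries it falls back to "cpu", so the same application code can run in both places.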

The Nano platform lets you do everything from writing code to testing solutions developed on other platforms. The combination of the Jetson Nano with a GeForce-enabled PC built on the same CUDA architecture makes a very powerful development environment for IoT applications, one that many system developers are likely to explore. We are still experimenting with deep learning training on a high-end PC combined with inference processing on the Nano platform.
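The train-on-PC, infer-on-Nano split hinges on a portable model artifact: in practice an ONNX file or serialized TensorRT engine exported on the workstation and loaded on the device. As a toy stand-in using only the Python standard library, the same save-then-load handoff looks like this (the file name and the tiny linear "model" are illustrative, not a real deployment format):

```python
import json
import os
import tempfile

def export_model(weights, path):
    """Training machine: persist learned parameters to a portable file."""
    with open(path, "w") as f:
        json.dump({"weights": weights}, f)

def load_and_infer(path, x):
    """Device: load the artifact and run a trivial linear 'inference'."""
    with open(path) as f:
        w = json.load(f)["weights"]
    return sum(wi * xi for wi, xi in zip(w, x))

path = os.path.join(tempfile.gettempdir(), "toy_model.json")
export_model([0.5, -1.0, 2.0], path)
print(load_and_infer(path, [2.0, 1.0, 1.0]))  # 0.5*2 - 1.0*1 + 2.0*1 = 2.0
```

Real pipelines swap the JSON file for an exported network and the dot product for TensorRT execution, but the division of labor between the two machines is the same.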

TIRIAS Research has forecast that more than 95% of all new devices produced by the end of 2025 will leverage some form of machine learning (ML) or AI. Platforms may leverage AI from the cloud, on the device, or in some hybrid fashion. While the majority of AI training and inference (the use of a trained model) is done in the cloud today, more and more will be pushed to devices, or what is often referred to as the edge of the network or just the “edge,” because operations need to be performed in real time and/or because of limitations on network or cloud resources.
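The real-time argument comes down to a per-frame latency budget: at 30 frames per second, each inference gets roughly 33 ms, any cloud round trip included, which is why the work migrates onto the device. A small standard-library harness for checking whether an inference callable fits such a budget (the function and the 30 fps target are our own illustration):

```python
import time

def fits_realtime_budget(infer, frames, fps_target=30.0):
    """Return True if `infer` keeps up with `fps_target` on average."""
    budget = 1.0 / fps_target  # seconds allowed per frame (~33 ms at 30 fps)
    start = time.perf_counter()
    for frame in frames:
        infer(frame)
    elapsed = time.perf_counter() - start
    return elapsed / max(len(frames), 1) <= budget

# A no-op "model" easily fits the budget; one that takes 50 ms per frame does not.
print(fits_realtime_budget(lambda f: None, list(range(30))))             # True
print(fits_realtime_budget(lambda f: time.sleep(0.05), list(range(5))))  # False
```

Substituting a real camera feed and a TensorRT-backed `infer` turns this into a quick sanity check that an edge deployment can hold its target frame rate.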

There is a plethora of small form factor development boards on the market, and any compute core can perform ML/AI functions, but none comes close to the capabilities of the Jetson Nano combined with the JetPack SDK. The Jetson Nano is a perfect example of how ML/AI can be accomplished in small form factors and in battery-powered devices. While we were not able to test the maximum performance of the platform, which Nvidia specs at 472 GFLOPS, we were impressed with its capabilities. The platform makes both a great educational learning tool and a development board. Because of its size and configuration, companies are likely to use the Jetson Nano module for production purposes. Nvidia has already adapted the Nano to a robotics platform called the JetBot, which could easily be adapted to drones or other industrial applications for autonomous control.

The Jetson Nano developer kit is available on Amazon, and a number of cases and expansion packs are already available through Amazon, Newegg, Etsy, and a number of eBay sellers. Nvidia provides free specifications for the JetBot, but users can also find complete kits on the internet for just over $100.





