Description
* Performs high-speed ML inferencing: runs TensorFlow Lite models locally with low power draw and a small footprint.
* Supports all major platforms: Connects via USB 3.0 Type-C to any system running Debian Linux (including Raspberry Pi), macOS, or Windows 10.
* Supports TensorFlow Lite: no need to build models from the ground up. TensorFlow Lite models can be compiled to run on the Edge TPU.
* Supports AutoML Vision Edge: easily build and deploy fast, high-accuracy custom image classification models at the edge.
* Compatible with Google Cloud.
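
The workflow above can be sketched in Python: after compiling a model for the Edge TPU, inference goes through the TensorFlow Lite runtime with the Edge TPU delegate loaded. This is a minimal sketch, assuming the `tflite_runtime` package and Edge TPU runtime library are installed; the model filename is hypothetical, and the code falls back gracefully when the accelerator or runtime is absent.

```python
# Sketch: loading a compiled *_edgetpu.tflite model via the Edge TPU delegate.
# The model path below is a placeholder; availability checks keep this runnable
# on machines without the accelerator or the tflite_runtime package.
try:
    from tflite_runtime.interpreter import Interpreter, load_delegate

    try:
        # libedgetpu.so.1 ships with the Edge TPU runtime on Linux;
        # macOS uses libedgetpu.1.dylib, Windows uses edgetpu.dll.
        delegate = load_delegate("libedgetpu.so.1")
        interpreter = Interpreter(
            model_path="mobilenet_v2_edgetpu.tflite",  # hypothetical model file
            experimental_delegates=[delegate],
        )
        interpreter.allocate_tensors()
        status = "Edge TPU interpreter ready"
    except (OSError, ValueError):
        status = "Edge TPU runtime not found; run the CPU model instead"
except ImportError:
    status = "tflite_runtime not installed"

print(status)
```

Compilation itself is a separate offline step (the `edgetpu_compiler` tool converts a quantized TensorFlow Lite model into an Edge TPU-compatible one); only the compiled model runs on the accelerator.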