
Google's 'Coral' Edge TPU Dev Boards

I received a pair of Google's 'Coral' Edge TPU Dev Boards today. They have the exact same dimensions as Raspberry Pi devices, including the location of the mounting post holes. This is convenient if you have project boxes or mounting plates that you normally use for Pis and want to use them with the Coral Dev Board.

When comparing the Dev Boards to the Accelerators I received last week, the accuracy of the models is nearly identical. Of course, this isn't surprising, since the Accelerators and the Dev Boards are using the exact same trained models and I'm using the same photos for scoring.

Obligatory beer test:

python3 demo/classify_image.py \
--model test_data/inception_v4_299_quant_edgetpu.tflite \
--label test_data/imagenet_labels.txt \
--image test_data/beverage.jpg

beer glass
Score :  0.972656
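
The demo script essentially runs inference, pairs each score with its label, and prints the top results. A rough pure-Python sketch of just the ranking-and-printing step (the function name and the example scores are hypothetical, not taken from classify_image.py):

```python
def print_top_k(scores, labels, k=3):
    """Pair each class score with its label and print the top k,
    mimicking the demo's 'label / Score' output format."""
    ranked = sorted(zip(scores, labels), reverse=True)[:k]
    for score, label in ranked:
        print(label)
        print(f"Score :  {score:f}")

# Hypothetical example with made-up scores:
print_top_k([0.01, 0.972656, 0.02], ["cup", "beer glass", "mug"], k=1)
```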

And to provide a test where the results aren't quite as clear cut, I decided to use a photograph from a recent vacation to see how it handled a scenery picture.

python3 demo/classify_image.py \
--model test_data/inception_v4_299_quant_edgetpu.tflite \
--label test_data/imagenet_labels.txt \
--image test_data/bonaire.jpg

seashore, coast, seacoast, sea-coast
Score :  0.554688
Score :  0.230469
Score :  0.136719
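
Incidentally, the scores themselves hint at the model's quantization: every score printed above is an exact multiple of 1/256 (0.554688 is 142/256, and the beer test's 0.972656 is 249/256), which is consistent with a uint8-quantized output layer. A quick sanity check (the raw values below are reverse-engineered from the printed scores, not read from the device):

```python
# Every score the demo prints appears to be a multiple of 1/256, which is
# consistent with the model's uint8-quantized output. The raw values below
# are inferred from the printed scores, not read from the hardware.
raw = [249, 142, 59, 35]            # inferred uint8 outputs
scores = [r / 256 for r in raw]
for s in scores:
    print(f"Score :  {s:.6f}")      # matches the demo's printed precision
```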

Some of the things I liked about the Dev Board ...

  • When you flash the Dev Board with the Mendel image, everything you need to make the board operational is pre-installed: there's no need to update or install Python, TensorFlow, etc., and the edgetpu software comes pre-installed as well.
  • The TPU assembly on the Dev Board is actually removable and can be integrated into your own hardware stack if needed.
  • Though there's a microSD card slot, no card is actually required to make the device operational. It comes with 8GB of storage onboard.
  • Wireless setup was completely painless. I had no problems connecting it to my 5GHz AC network.
  • The fan is not cosmetic. Those TPUs get hot very fast!
  • There's a nice video streaming demo included (command edgetpu_demo --stream). The video is not live; it's an H.264-encoded video of traffic. The demo displays the inference speed (tracking up to 20 vehicles per frame) in ms, and translates that into the number of frames per second it could process.
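
That ms-to-fps translation is simple arithmetic; a one-line sketch (the 20 ms figure here is hypothetical, not a measured value from the demo):

```python
def ms_to_fps(inference_ms: float) -> float:
    """Frames per second implied by a per-frame inference latency in milliseconds."""
    return 1000.0 / inference_ms

# A hypothetical 20 ms per-frame inference would translate to:
print(ms_to_fps(20.0))   # 50.0 fps
```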

All in all, the Dev Boards were easier to get up and running than the Accelerators and have a decent amount of hardware packed into a tiny form factor. I'll move on from static image analysis to video streaming analytics once the 2 cameras arrive.


About James Conner

Scuba dive master, wildlife photographer, anthropologist, programmer, electronics tinkerer and big data expert.