NVIDIA JetBot First Look

Assembling the bot took about 30 minutes. I used the Waveshare kit; its enclosure is made of metal, so it is very sturdy. It also came with a motor HAT and an OLED display showing useful things like the IP address. My kit did not come with 18650 batteries, so make sure to buy your own. I charged the bot while it was off, as recommended.

Before I turned it on, I made sure either the tires were removed or it was raised off the ground (with a block), so it didn’t drive off the table by accident while I was working on it!

I set up roscore to run as a service so I didn't have to start it manually every time I restarted the bot. This made things a lot more convenient.
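If you want to do the same, here is a minimal sketch of a systemd unit; the file name, user, and the ROS Melodic path are my assumptions, so adjust them for your own install:

```ini
# /etc/systemd/system/roscore.service  (hypothetical name and location)
[Unit]
Description=Run roscore at boot
After=network.target

[Service]
Type=simple
User=jetbot
# Source the ROS environment, then start the master
ExecStart=/bin/bash -c "source /opt/ros/melodic/setup.bash && exec roscore"
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Enable it once with `sudo systemctl enable --now roscore.service` and it will come up on every boot.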

The JetBot
My current workspace

Autonomous Robotics

I played around with the JetBot notebooks.

The notebook I spent a lot of time with was collision avoidance, as it contained the whole ML workflow, i.e.:

  • Gather training data on the device (Jetson Nano)
  • Train the ML model on the host (my PC with a GPU)
  • Deploy the model onto the device
  • Record results
  • Repeat

This workflow made a lot of sense, although I wonder if there is a more efficient way to move models and data between the embedded device and the host PC!
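The simplest approach I know of is rsync over SSH; here is a hypothetical round trip (the hostname, paths, and best_model.pth file name are all made up for illustration):

```bash
# Pull the collected dataset from the Nano to the host for training
rsync -avz jetbot@jetbot.local:~/collision_avoidance/dataset/ ./dataset/

# Push the trained weights back to the Nano for deployment
rsync -avz ./best_model.pth jetbot@jetbot.local:~/collision_avoidance/
```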

Regarding data collection, one thing I learned is that it should be done using the sensors your device will actually use in deployment:

  • e.g. the camera on the robot

Small things can influence how your model behaves, e.g. a person being in a picture can make a difference, so your data needs to cover all possible scenarios!
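As a rough sketch of what collection looks like (this uses the jetbot library's Camera API from the notebooks; the directory names and helper function are my own):

```python
import os
from uuid import uuid1

from jetbot import Camera, bgr8_to_jpeg

# Collect with the same camera the deployed model will see through,
# so training data matches deployment conditions.
camera = Camera.instance(width=224, height=224)

def save_snapshot(directory):
    """Save the current camera frame as a uniquely named JPEG."""
    os.makedirs(directory, exist_ok=True)
    path = os.path.join(directory, str(uuid1()) + '.jpg')
    with open(path, 'wb') as f:
        f.write(bgr8_to_jpeg(camera.value))

# Label each frame by the robot's situation when you snap it
save_snapshot('dataset/free')     # path ahead is clear
save_snapshot('dataset/blocked')  # obstacle ahead
```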

For the training portion, I took advantage of transfer learning:

  • Use a large pretrained net (e.g. AlexNet) and fine-tune it on our collected images
  • Leverage the fact that AlexNet has already learned many low-level features
  • Replace AlexNet's 1000-output final layer with our own 2-output layer (representing blocked or not blocked)
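That last step is only a couple of lines in PyTorch; a minimal sketch (variable names are mine, but this mirrors what the notebook does):

```python
import torch
import torchvision

# AlexNet pretrained on ImageNet (1000 output classes)
model = torchvision.models.alexnet(pretrained=True)

# Swap the final 1000-way classifier layer for a 2-way one:
# blocked vs. not blocked
model.classifier[6] = torch.nn.Linear(model.classifier[6].in_features, 2)
```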

My first attempt at Autonomous Robotics

Experiment Results

I have done six training iterations so far:

  • v1: ~100 images of each class (free and blocked)
    • images were taken haphazardly
    • the JetBot kept falling off the desk
  • v2: ~300 images of each class (~100 from v1 + ~200 new images)
    • the JetBot performed a little better but kept falling off the desk
  • v3 to v5: ~450 images of each class (taken from scratch)
    • much better performance; it still fell off the desk, but at a lower rate
  • v6: ~400 images of each class (taken from scratch)
    • moved the JetBot to the floor, as it had fallen off the desk one too many times
    • still not perfect, as you will see below

JetBot after some training

As you can see, I have much to learn about training algorithms and camera work!

That being said, this was a very good experience. Nothing beats actually getting robot parts in your hands and making said parts do cool things! I have learned more in the past week than in months of watching online videos. If you want to get into robotics, this is a great way to start!

Some things to look forward to

The tutorials were done using Jupyter notebooks and Python; I am going to implement this stuff in C++ using ROS, e.g.:

  • Motor control
  • Inferencing
  • etc.

I am also going to turn this into a legged walking robot. Why, you ask? Because I can! So look forward to posts about reinforcement learning and using ROS to control this robot!

But before you get to see the legged robot, I will be detailing my experiences with Audio Generative Networks on the Jetson Nano. Spoiler: it ends in FAILURE, but I learned some important lessons that will definitely help you on your own autonomous journey.

Stay tuned.
