
An open-source autonomous car project


Tonic logo


demo

⚠️ ➡️ Video here ⬅️ ⚠️

a.k.a.: "Roomba, that does not suck"

Written in Python 🐍 (mostly).


Introduction

This repository contains the main software and documentation for the Tonic project. The project aims to create an open-source autonomous driving system, along with a hardware prototype implementation. Some essential parts of the project live in other related repositories; see the list of related repos below. The core idea of how this works is as follows:

  1. After setting up the robot/car, drive it manually, and dump the video and steering feed (this part is called data taking).
  2. Create a 3D mapping of the environment with Tonic/autonomous.
  3. Define checkpoints, through which the machine will drive.
  4. Program the car to drive on the defined paths.

All of this is meant to be as cheap as possible, using a Raspberry Pi and only a single camera.

Features

  • Camera live feed and recording.
  • Live steering system and recording.
  • Working IMU live streaming and recording.
  • Working odometry live streaming and recording.
  • Qt GUI client for driving and data taking.
  • SLAM mapping and navigation implemented with ORB_SLAM2 (via a custom fork), custom Python bindings, and serialisation.

How does it work

As of now, this repository (mmajewsk/Tonic) contains guides and software for building, running and steering the car 🚘 for data taking. The code is divided into Tonic/control and Tonic/car.

Tonic/control contains the code meant to run on your laptop/PC/Mac, which controls the Raspberry Pi running Tonic/car.

The machine and the control interface communicate over a WiFi network using sockets.

Sensors, camera, and steering are each implemented as a separate socket service. You can steer the car with the keyboard on your PC while watching the live camera feed. All of the sensor, steering and video streams can be dumped to files on the PC. You don't need to turn on all of the sensors to make this work.

Odometry and IMU are not necessary for environment mapping.

OK, so how do I start?

  1. Take a look at previous versions and the current one in video and screenshots.
  2. First start by assembling the hardware.
  3. Then set up the machine and interface software.
  4. Do the data-taking run, recording steering and video data, as described here.

To make your machine drive autonomously, follow the guide in Tonic/autonomous repo.

Contribute

🧑‍🔧 This project is meant to be open for everyone, and contributions are welcome. If you would like to help, see what's listed in the issues, or add something yourself.

Also, you can join the 🗣️ Discord server if you are looking for quick help, or just want to say hi ;)

Related repos