
Robotics and Autonomous Systems - UTU 2021 student project


dmonteroh/ras-bots

 
 


The original repo can be found at https://github.com/kimmk/ras-bots; this fork exists for showcase purposes. This project was done for the "Perception and Navigation in Robotics" class of the Smart Systems Programme at the University of Turku, together with my teammates. A demonstration video can be found in this LinkedIn post.

Instructions - WIP

Necessary installations:

Update submodules:

git submodule init
git submodule update

Drone-related drivers and dependencies: installation of the TIERS ROS driver and the DJITelloPy Python driver.

sudo apt install ros-melodic-camera-info-manager-py ros-melodic-codec-image-transport python-catkin-tools python3-dev python3-pip python-dev python-pip
sudo -H pip3 install --upgrade pip
sudo -H pip3 install https://github.com/damiafuentes/DJITelloPy/archive/master.zip
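To verify the Python driver, a minimal DJITelloPy smoke test can be run (a sketch; connect to the Tello's Wi-Fi network first):

# Minimal DJITelloPy connectivity check -- run while connected to the Tello's Wi-Fi
from djitellopy import Tello

tello = Tello()
tello.connect()                          # raises an exception if the drone is unreachable
print("Battery:", tello.get_battery())   # sanity check that the link works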

Building the project

cd ~/ras-bots
catkin init
catkin build
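After a successful build, source the workspace in every new shell so ROS can find the project's packages (standard catkin workflow; adjust the path if the workspace lives elsewhere):

source ~/ras-bots/devel/setup.bash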

Open Project Plan: Aircraft Carrier & Discovery

1. Team name and team members (up to 4 persons/team)

Sebastian Icking - [email protected]

Kimmo Korpelin - [email protected]

Daniel Montero - [email protected]

Gabriel Pirlogeanu - [email protected]

2. Application / Use-case

We want to navigate an unknown environment using coordinated drone and Jetbot movement, where parts of the environment might be inaccessible to one of the robots.

3. The system

Landing the drone on top of the Jetbot requires synchronized positioning between the two robots.

  • Robots: 1 Jetbot and 1 Drone

  • Computing Platform: Because the Jetbot's processing power is limited, we might add one of our laptops for data processing; however, the laptop would not act as the ROS master.

  • Sensors: LIDAR sensor and Cameras for the Jetbot

  • Communication: Most communication should happen through ROS topics, as it's the most stable way to send camera information.

  • Algorithms:

    • Lidar:
      • SLAM
      • Potential Fields (Jetbot avoidance)
    • Visual:
      • Landing algorithm
      • Fiducial recognition
      • Drone search/navigation algorithm
  • Data flow: Communication happens through ROS, and there should be a constant flow of information between the Drone and the Jetbot. The Jetbot makes decisions and commands the Drone to move around (a sketch of this command loop follows the list below).

    • Jetbot-LIDAR
      • Sensor data is processed to allow Jetbot to move around
      • Jetbot decides on resting place
      • Jetbot commands the Drone to take off
    • Drone-Camera
      • Drone streams camera data to Jetbot
      • Jetbot instructs the Drone to move around for Discovery
      • Jetbot recognizes Fiducial and instructs Drone to come back
    • Jetbot-Camera
      • Once the Drone is in the field of view, the Jetbot sends commands to correct its positioning
      • Jetbot commands Drone to land
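As a minimal sketch of this command loop (an illustration, not the project's actual node; the topic names /tello/image_raw and /tello/cmd_vel are assumptions and depend on the driver's launch configuration):

# jetbot_commander.py - hypothetical sketch of the Jetbot-to-Drone command loop
import rospy
from sensor_msgs.msg import Image
from geometry_msgs.msg import Twist

def on_drone_image(msg):
    # Placeholder for the perception step (fiducial detection, search logic, ...)
    pass

rospy.init_node("jetbot_commander")
rospy.Subscriber("/tello/image_raw", Image, on_drone_image)   # drone camera stream
cmd_pub = rospy.Publisher("/tello/cmd_vel", Twist, queue_size=1)

rate = rospy.Rate(10)            # 10 Hz command loop
while not rospy.is_shutdown():
    cmd = Twist()
    cmd.linear.x = 0.1           # example: creep forward slowly during discovery
    cmd_pub.publish(cmd)
    rate.sleep()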


4. GitHub repo link

https://github.com/kimmk/ras-bots

5. Background

The team already has some experience receiving data from a robot on a laptop using ROS topics, so we are confident that the communication part will work; however, we have not done robot-to-robot communication in the past.

We're also somewhat confident about LIDAR-SLAM and Potential Fields driving with basic obstacle-avoidance navigation for the Jetbot, but we have not yet combined both systems at the same time.
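As an illustration of the potential-fields idea (a sketch, not the final implementation; all gains and thresholds are assumptions), each LIDAR beam is treated as a repulsive force and the goal direction as an attractive one, and the Jetbot steers along the resulting vector:

# Illustrative LIDAR-based potential-fields step (gains/thresholds are assumptions)
import math

def potential_fields_step(ranges, angle_min, angle_increment,
                          goal_angle=0.0, attract_gain=1.0, repulse_gain=0.5):
    """Return a steering angle from a LaserScan-style range array."""
    fx = attract_gain * math.cos(goal_angle)   # attraction pulls toward the goal
    fy = attract_gain * math.sin(goal_angle)
    for i, r in enumerate(ranges):
        if not (0.05 < r < 2.0):               # skip invalid or distant readings
            continue
        angle = angle_min + i * angle_increment
        push = repulse_gain / (r * r)          # closer obstacles push harder
        fx -= push * math.cos(angle)           # repulsion points away from the obstacle
        fy -= push * math.sin(angle)
    return math.atan2(fy, fx)                  # heading of the net force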

Finally, the drone "discovery" path is one of the riskiest parts of the project. While we have successfully received and transformed data from a drone in the past, we have yet to build a "search & find" functionality.

6. Expected challenges and wishes to learn

The main expected challenge is the "search & find" algorithm. Other important challenges are communication between the robots, managing the Jetbot's limited processing capabilities, and not crashing the Drone.

We would like to learn more about path planning and obstacle avoidance, if possible. If we add mapping to our project, some information on it would be welcome; otherwise we will need to investigate on our own how to combine visual images with sensor data.

Another known challenge is getting quality imagery from the drone, since the data tends to be poor.
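For the fiducial recognition itself, we are considering something along the lines of OpenCV's ArUco module (a sketch assuming ArUco markers, the opencv-contrib package, and the pre-4.7 OpenCV API; the marker dictionary is our assumption):

# Sketch of ArUco fiducial detection (requires opencv-contrib-python)
import cv2
import cv2.aruco as aruco

dictionary = aruco.Dictionary_get(aruco.DICT_4X4_50)   # marker family is an assumption
params = aruco.DetectorParameters_create()

def find_fiducials(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = aruco.detectMarkers(gray, dictionary, parameters=params)
    return corners, ids   # ids is None when no marker is visible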

7. Team roles

  • Sebastian

    • Physical modification of the Jetbot
    • Landing Algorithm
  • Kimmo

    • Drone Data Interpretation
    • Drone Search & Find Algorithm
  • Daniel

    • Drone Commands
    • Jetbot Commands
    • Fiducial Interpretation
  • Gabriel

    • SLAM (Mapping room with LIDAR)
    • Jetbot Autonomous Movement

8. Work packages (how is the work going to be divided among team members and in time), with tentative project schedule.

Tentative Project Schedule: Drive GANTT

9. Description of final experiment or demonstration.

We release the Jetbot, with the Drone docked on top, into a space. The Jetbot drives around and eventually settles. The Jetbot then commands the Drone to fly around for discovery, and once the goal is reached (the fiducial is found), the Drone autonomously lands on the Jetbot and the Jetbot can leave.
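The landing step can be sketched as simple proportional centering (hypothetical gains, tolerances, and frame size; the real controller would also handle altitude and timeouts): the drone's pixel offset from the fiducial in the camera image is turned into velocity corrections until the drone is centred, after which a land command is sent.

# Sketch of proportional centering before landing (all constants are assumptions)
def centering_command(marker_cx, marker_cy, frame_w=960, frame_h=720, gain=0.002):
    """Map the marker's pixel offset from the image centre to velocity corrections."""
    err_x = marker_cx - frame_w / 2.0
    err_y = marker_cy - frame_h / 2.0
    vx = -gain * err_y    # forward/back cancels the vertical image offset
    vy = -gain * err_x    # left/right cancels the horizontal image offset
    centered = abs(err_x) < 20 and abs(err_y) < 20   # pixel tolerance (assumption)
    return vx, vy, centered   # when centered, issue the land command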
