My next step is to build a robot for SLAM and navigation! I chose ROS because there are ready-to-use packages, which I think will give me a boost. The main processor is a Raspberry Pi 3 and the low-level controller is an STM32-based control board. The LiDAR is a Slamtec A1, which costs only about £100. The camera is a very fast 120 fps PS3 Eye, which is nice for machine vision purposes. I can't fit my Kinect onto this platform: I am not sure the Raspberry Pi has enough computing power, and converting power for the Kinect is also an issue.
▲Picture of the robot platform. The robot has three omni wheels. On the top layer there is a LiDAR produced by Slamtec. The middle layer contains a high-framerate camera and a Raspberry Pi 3. On the bottom layer there are a low-level microcontroller and a motor driver.
▲Test of the LiDAR and Hector Mapping
A geometric filter that uses a machine learning technique to define a trust factor over geometric orientations and constraints. The LC filter minimizes the amount of data used for detection and for applications such as SLAM, preventing the system from critical over-computation when entering landmark-dense scenes. The filter is designed for edge-based detection and for advanced cameras such as event cameras. As another notable property, thanks to its two-layer expert evaluation, the LC filter can estimate objects approaching the camera relying only on the vehicle's IMU sensor.
A side project during the summer holiday. The hardware of this self-balancing robot is off-the-shelf, which eases my work and lets me focus on the software. The system is controlled by an STM32F103 Cortex-M3 microcontroller. The main sensor is an MPU6050 6-axis motion sensor. Attitude estimation is achieved with a complementary filter, and balance is maintained by a PI controller. As I don't have a model of the system, I had to tune the controller parameters manually, which is crucial to achieving satisfactory stability.
▲Front view of the robot.
▲Top view of the robot.
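The attitude estimation and balance loop can be sketched in a few lines of Python (the real firmware is C on the STM32; the gains and the 200 Hz rate below are illustrative assumptions, not the values tuned on the actual robot):

```python
# Illustrative sketch of the complementary filter + PI balance loop.
# DT, ALPHA, KP, KI are hypothetical values, not the tuned firmware gains.
DT = 0.005        # control period in seconds (assumed 200 Hz loop)
ALPHA = 0.98      # complementary-filter weight on the gyro path
KP, KI = 25.0, 1.5

def complementary_filter(angle, gyro_rate, accel_angle, dt=DT, alpha=ALPHA):
    """Fuse the integrated gyro rate (fast but drifts) with the
    accelerometer tilt estimate (noisy but drift-free)."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

class PIController:
    """Plain PI controller: output = Kp * e + Ki * integral(e)."""
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, error):
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral
```

Each cycle, the fused tilt angle is fed to the PI controller as the error from upright, and the output drives the motor PWM; the complementary filter is what makes manual tuning tractable, since it gives a smooth, drift-free angle without a full Kalman filter.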
▲Picture of the visual tracking systems (right - version 1, left - version 2).
▲Screenshot of vision tracking using a web camera
This is a FIRA form factor robot designed for educational purposes. The platform is built by a friend's company, Embedded Dream Studio. I was involved in the Arduino library design and in improving the PID controller.
▲ The robot is only 10 cm × 10 cm × 10 cm. A one-pound coin is shown for size comparison.
▲ The robot has a well-designed structure. It is driven by two differential wheels, with PCBs placed vertically on both sides. Four AA batteries power the platform. The robot communicates with a computer through an nRF24L01 wireless transceiver.
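The kind of PID improvement involved can be sketched as follows. The structure below (integral clamping for anti-windup and derivative-on-measurement to avoid setpoint kicks) is a common textbook formulation shown in Python for readability; the gains and the toy plant are illustrative, not the ones shipped in the Arduino library:

```python
class PID:
    """Discrete PID with two common improvements: integral clamping
    (anti-windup) and derivative taken on the measurement rather than
    the error (avoids a spike when the setpoint jumps)."""
    def __init__(self, kp, ki, kd, dt, out_limit):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.out_limit = out_limit
        self.integral = 0.0
        self.prev_meas = None

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        # anti-windup: keep the integral inside the actuator range
        self.integral += self.ki * error * self.dt
        self.integral = max(-self.out_limit, min(self.out_limit, self.integral))
        # derivative on measurement (zero on the very first call)
        if self.prev_meas is None:
            deriv = 0.0
        else:
            deriv = -(measurement - self.prev_meas) / self.dt
        self.prev_meas = measurement
        out = self.kp * error + self.integral + self.kd * deriv
        return max(-self.out_limit, min(self.out_limit, out))

# drive a toy first-order wheel-speed model to its setpoint
pid = PID(kp=2.0, ki=5.0, kd=0.05, dt=0.01, out_limit=10.0)
speed = 0.0
for _ in range(1000):
    u = pid.update(setpoint=1.0, measurement=speed)
    speed += (u - speed) * 0.01   # toy plant: first-order lag
```

On a small differential-drive robot like this, the clamp matters because the motor PWM saturates easily at low battery voltage, and an unclamped integral would then cause large overshoot.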
▲UGV System Diagram
▲Trajectory and Control Performance plotted in MATLAB
▲Picture of the robot platform. Raspberry Pi is used as the main controller and an Arduino-like low-level controller is used to control motors.
This is a course project from when I was at the University of Sheffield. A closed-loop controller for a model helicopter first needs to be developed in MATLAB/Simulink. To simulate the controller hardware-in-the-loop, NI LabVIEW is used with the same control structure and parameters as the earlier simulation. A data acquisition device, the NI myDAQ, serves as the interface connecting LabVIEW to the helicopter hardware.
▲A Picture of the Helicopter System
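The hardware-in-the-loop idea can be illustrated with a minimal Python sketch: the same control code runs whether the plant is a simulated model or the real helicopter behind the myDAQ. The first-order model and the gain below are hypothetical stand-ins, not the actual Simulink parameters:

```python
class SimulatedPlant:
    """Hypothetical first-order stand-in for the helicopter dynamics.
    In the HIL setup, this object is replaced by DAQ writes/reads."""
    def __init__(self, gain=1.0, tau=0.5, dt=0.01):
        self.gain, self.tau, self.dt = gain, tau, dt
        self.y = 0.0

    def step(self, u):
        # first-order lag: tau * dy/dt = gain * u - y (Euler integration)
        self.y += (self.gain * u - self.y) * self.dt / self.tau
        return self.y

def run_closed_loop(plant, setpoint, kp, steps):
    """Proportional control loop; only plant.step() would change
    when switching from simulation to the real hardware."""
    y = 0.0
    for _ in range(steps):
        u = kp * (setpoint - y)
        y = plant.step(u)
    return y
```

With proportional control alone, a first-order plant settles at kp·K/(1 + kp·K) of the setpoint (a steady-state error), which is one reason a practical design adds integral action as in the Simulink controller.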
▲180° Rotatable Ultrasonic Sensors
▲Top view of the robot
▲Mechanical design of the robot (using Solidworks)
▲PCB - Dual Channel High Current DC Motor Driver (designed from scratch)
▲PCB - ATmega128 Robot Controller (designed from scratch)
▲RoboFish Waterpolo Competition
▲RoboFish Simulation (multi-agent)
Copyright (C) 2017 - 2018 by Xiaotian Dai