The Vision Guidance Filming Drone is a research project studying the feasibility of a general-purpose autonomous flying quadcopter.
Project Overview
The culture of sharing videos of outdoor activities is growing. The Vision Guidance Filming Drone is a research project studying the feasibility of an autonomously guided quadcopter designed for the mass market.
The Vision Guidance Filming Drone tracks and follows the protagonist of the video so that the quadcopter can keep filming them. Currently, people take selfies using poles or helmet mounts to hold the camera. The Vision Guidance Filming Drone automates this process by following the protagonist and filming the selfie (which is not exactly a selfie, since it is the drone filming the person rather than the person filming themselves).
This can serve various applications, such as filming outdoor activities like skiing, surfing, and driving, as well as indoor filming.
The project was funded and exhibited at the Seoul National University Creative Design Fair, the International Capstone Design Fair, and the Electrical Engineering Fair in 2013.
Project Members
- Jaeyoung Lim, Seoul National University
- Dongho Kang, Seoul National University
- Sanghyun Hong, Seoul National University
- Jaemin Cho, Seoul National University
Prizes and honors
- Bronze Prize, International Capstone Design Fair, Nov. 2013
- Excellence Award, Seoul National University Creative Design Fair, Sep. 2013
- Bronze Prize, Seoul National University Electrical Fair, Oct. 2013
Results
The drone has been developed to successfully identify and track the protagonist of the video.
This is a demonstration of the drone controlling its distance and orientation to keep the protagonist in the middle of the frame.
System Overview
Hardware
- AR.Drone
The AR.Drone is a radio-controlled flying quadcopter built by the French company Parrot. The drone is designed to be controlled from mobile or tablet operating systems such as iOS or Android, with communication done over Wi-Fi. The API is open for development, which makes the platform handy for building various applications on the AR.Drone hardware.
AR.Drone 1.0 was used for the project, as the AR.Drone 2.0 is not compatible with the ARDroneForP5 library that runs on Processing.
- GoPro
GoPro is an American brand of high-definition personal cameras, often used in extreme action video photography. GoPro cameras are lightweight and durable in harsh conditions.
A GoPro HERO3 is used for filming on the Vision Guidance Filming Drone, as weight is important for the flight time and performance of the drone.
The GoPro was operated independently from the drone control; the operator controls the GoPro through its Wi-Fi interface.
Software
Setup
- Processing
Processing was used to write the program that processes the video streaming data. Processing is both a programming language and a development environment. The Processing language comes with a set of libraries that shorten the image processing development cycle.
The AR.Drone is controlled by a Processing program running on a laptop. The library for controlling an AR.Drone from Processing was written by Shigeo Yoshida under the name ARDroneForP5. The library supports connecting to the drone, receiving navigation and video data, and sending flight commands.
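As a rough illustration of how the library is used (a minimal sketch; takeOff() and landing() are the commands documented by the library, and 192.168.1.1 is the AR.Drone's default address):

import com.shigeodayo.ardrone.processing.*;

ARDroneForP5 ardrone;

void setup() {
  ardrone = new ARDroneForP5("192.168.1.1");
  ardrone.connect();   // open the command channel
  ardrone.start();     // start the communication threads
  ardrone.takeOff();   // lift off and hover
}

void keyPressed() {
  ardrone.landing();   // land on any key press
}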
- NyAR Toolkit
For tracking the position and orientation of the protagonist, NyAR Toolkit was used. NyAR Toolkit tracks a fiducial marker. Markers can be obtained from the pattern generation program included in NyAR Toolkit (for example, NyIdMarkerCard_00_09_KJ-VHA10LB).
Flight Control Software
The flight control software consists of functions that track the protagonist and fly the quadcopter, and that present the flight data through sound and visuals.
Below is the setup code of the Processing sketch. It initializes the connection with the AR.Drone over which the flight data and video data are transferred.
// Libraries for marker tracking and AR.Drone control
import jp.nyatla.nyar4psg.*;
import com.shigeodayo.ardrone.processing.*;

MultiMarker nya;       // NyAR marker tracker
ARDroneForP5 ardrone;  // AR.Drone connection

void setup() {
  size(320, 240, P3D);
  colorMode(RGB, 100);
  println(MultiMarker.VERSION);

  // Initialize the marker tracker with the camera calibration file
  nya = new MultiMarker(this, width, height, "camera_para.dat", NyAR4PsgConfig.CONFIG_PSG);
  nya.addARMarker("patt.hiro", 80);

  // Connect to the AR.Drone over Wi-Fi and open the navdata and video streams
  ardrone = new ARDroneForP5("192.168.1.1");
  ardrone.connect();
  ardrone.connectNav();
  ardrone.connectVideo();
  ardrone.start();

  // Open the second window used for the flight data visualization
  PFrame f = new PFrame();
}
Two windows are displayed in the flight control software: one displays the video stream and tracking status, and the other is the flight data visualization window, which presents the flight data in an intuitive way.
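The per-frame loop is not reproduced in full in this write-up; a minimal sketch of what it might look like, assuming the library's getVideoImage() call and the NyAR4psg detect()/isExistMarker()/getMarkerVertex2D() calls, is:

PVector[] pos2d;  // the four marker corners used by the control code below

void draw() {
  // Grab and display the latest video frame from the drone
  PImage frame = ardrone.getVideoImage(false);
  if (frame == null) return;
  image(frame, 0, 0);

  // Run marker detection on the frame
  nya.detect(frame);
  if (nya.isExistMarker(0)) {
    pos2d = nya.getMarkerVertex2D(0);
    // ...compute the control parameters and send velocity commands (next sections)
  }
}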
- Track and Guidance
- Quadcopter degrees of freedom
Quadcopters have four controllable degrees of freedom, because they are actuated by four motors. A quadcopter can control its yaw, its altitude, and its planar translational velocity, as sketched below.
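As a sketch, the four channels map onto the ARDroneForP5 commands used later in this write-up roughly as follows (the function itself is illustrative, not from the project code):

import com.shigeodayo.ardrone.processing.*;

// One command channel per controllable degree of freedom
void commandAxes(ARDroneForP5 drone, int yaw, int pitch, int roll) {
  drone.spinRight(yaw);  // yaw rate
  drone.forward(pitch);  // pitch tilt, giving forward velocity
  drone.goRight(roll);   // roll tilt, giving lateral velocity
  drone.up();            // altitude (down() for the opposite)
}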
For the drone to hover and keep the protagonist in the center of the frame, the distance and the orientation between the drone and the protagonist must be controlled to a reference. If the drone controls its position without a reference orientation to the marker, it tends to circle around the protagonist, which results in losing sight of the marker.
- Marker Orientation
The marker orientation is calculated from the ratio between the horizontal length and the vertical length at which the marker is seen. The ratio is obtained from the positions of the four corners of the marker in the camera image, and the inverse cosine of the ratio is the angle of the marker. For example, a square marker viewed head-on has a ratio of 1 (0 degrees), while a marker turned 60 degrees away appears half as wide as it is tall (acos(0.5) = 60 degrees). The marker angle has poor accuracy below 40 degrees, so the drone tends to drift around within 40 degrees.
- Marker Distance
The distance is calculated from the marker size. The size is computed as the square root of the visible area of the marker: assuming the marker is a uniform square, the square root of its area is the length of one side. Since the apparent size shrinks as the marker moves away, it serves as a distance measurement.
// Control parameters derived from the four marker corners (pos2d)
posX = (int) pos2d[0].x;
posY = (int) pos2d[0].y;

// Lengths of the right and left edges of the marker in the image
Rlength = sqrt((pos2d[2].x - pos2d[1].x) * (pos2d[2].x - pos2d[1].x)
             + (pos2d[2].y - pos2d[1].y) * (pos2d[2].y - pos2d[1].y));
Llength = sqrt((pos2d[3].x - pos2d[0].x) * (pos2d[3].x - pos2d[0].x)
             + (pos2d[3].y - pos2d[0].y) * (pos2d[3].y - pos2d[0].y));

// Average horizontal and vertical extent of the marker
horizon  = ((pos2d[1].x + pos2d[2].x) - (pos2d[0].x + pos2d[3].x)) / 2;
vertical = ((pos2d[2].y + pos2d[3].y) - (pos2d[1].y + pos2d[0].y)) / 2;

// Marker orientation: inverse cosine of the width/height ratio;
// the longer of the two side edges tells which way the marker is turned
attackAngle = (180 / 3.14) * acos(horizon / vertical);
if (Llength > Rlength) {
  attackAngle = -attackAngle;
}

// Marker size, treating the marker as a uniform square
objectsize = sqrt(horizon * vertical);
Since the quadcopter gains velocity by tilting its body, controlling the distance and orientation with a PD controller yields a type-2 system. This means that the drone can track both a standing and a moving protagonist.
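For illustration, a minimal PD loop on the yaw channel could look like the sketch below; the gains kP and kD and the variable names are hypothetical, and the actual project code that follows uses proportional terms only.

// Hedged sketch of a PD loop on the yaw channel
float kP = 0.5, kD = 0.2;
float prevYawError = 0;

int pdYawCommand(float markerX) {
  float error = markerX - 160;                  // offset from the image center
  float derivative = error - prevYawError;      // per-frame error difference
  prevYawError = error;
  return (int) (kP * error + kD * derivative);  // positive output -> spinRight
}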
The control code is shown below. The ARDroneForP5 library takes a velocity parameter for controlling the AR.Drone. The attitude is controlled through this velocity input, as the pitch or roll angle is proportional to the velocity of the quadcopter. In quadcopter dynamics the translational velocities from roll and pitch, the altitude, and the yaw are decoupled from each other, so each speed can be calculated and commanded independently. The velocity input is calculated from the offset between the center of the video image and the center of the marker, and from the orientation of the marker.
// Control the yaw of the drone: keep the marker horizontally centered
// (image center x = 160)
if (posX > 160) {
  yawSpeedf = yawGain * (pos2d[0].x - 160);
  yawSpeed = (int) yawSpeedf;
  ardrone.spinRight(yawSpeed);
  delay(15);
}
if (posX < 160) {  // mirror branch, reconstructed from the spinRight case
  yawSpeedf = yawGain * (160 - pos2d[0].x);
  yawSpeed = (int) yawSpeedf;
  ardrone.spinLeft(yawSpeed);
  delay(15);
}

// Lateral control: fly sideways to face the marker head-on
if (attackAngle > 0) {
  rollSpeedf = rollGain * (attackAngle - 25);
  rollSpeed = (int) rollSpeedf;
  ardrone.goRight(rollSpeed);
  delay(15);
}
if (attackAngle < 0) {
  rollSpeedf = rollGain * (-attackAngle - 25);
  rollSpeed = (int) rollSpeedf;
  ardrone.goLeft(rollSpeed);
  delay(15);
}

// Distance control: hold the marker size near its reference value
if (objectsize < 15) {  // forward branch, reconstructed as the mirror of the backward case
  distanceSpeedf = distanceGain * (20 - objectsize);
  distanceSpeed = (int) distanceSpeedf;
  ardrone.forward(distanceSpeed);
  delay(15);
}
if (objectsize > 15) {
  distanceSpeedf = distanceGain * (objectsize - 20);
  distanceSpeed = (int) distanceSpeedf;
  ardrone.backward(distanceSpeed);
  delay(15);
}

// Altitude control: hold the drone within a 900-1000 mm band
if (altitude > 1000) {
  ardrone.down();
}
if (altitude < 900) {
  ardrone.up();
}
Visualizing Flight Data
Flight data visualization is done with the mountain terrain displayed in the lower middle part of the screen. The peaks of the mountains flow in the direction in which the drone is flying, and the number of peaks shows the altitude of the drone.
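A hedged sketch of the terrain idea is below; the peak count and scroll scale are illustrative, not taken from the project code.

float scroll = 0;

void drawTerrain(float altitude, float vx) {
  int peaks = max(1, (int) (altitude / 250));  // assumed scale: one peak per 250 mm
  scroll += vx * 0.01;                         // peaks flow along the flight direction
  float w = width / (float) peaks;
  for (int i = 0; i < peaks; i++) {
    float x = ((i * w + scroll) % width + width) % width;  // wrap around the screen
    triangle(x, height, x + w / 2, height - 40, x + w, height);
  }
}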
The marker display shows where the four corners of the NyID marker are placed once the drone has identified the protagonist. Markers are a good way to distinguish the protagonist compared to color tracking or silhouette recognition. A sound is generated when the marker is identified, giving the protagonist audible feedback that tracking is active.
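The write-up does not say how the sound was produced; a minimal sketch using Processing's Minim library (an assumption, as is the beep.wav file name) could look like:

import ddf.minim.*;

Minim minim;
AudioPlayer beep;

void setupSound() {
  minim = new Minim(this);
  beep = minim.loadFile("beep.wav");  // hypothetical feedback sound file
}

// Call this whenever the marker is newly identified
void playTrackingFeedback() {
  beep.rewind();
  beep.play();
}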
Discussion
The possibility of a general-purpose quadcopter was studied through the application of a visually guided autonomous filming drone. Filming was done while the drone was autonomously controlled so that the protagonist stayed centered in the frame of the video image.
One of the problems was that the drone had no capability to navigate obstacles within its flight path. Obstacle navigation can be done using object recognition or additional sensors such as SONAR or LIDAR, but this requires complex payloads or makes the system heavier, which in turn requires a bigger flying platform.
The experience itself was somewhat problematic. The idea was that the protagonist would turn the drone on, do their activity (snowboarding, etc.), then turn the drone off afterwards and check out the video.
The problem with the concept was that the user needed feedback indicating whether the drone was really tracking the protagonist or not. In this project, this was tackled by providing sound feedback while the drone is tracking and, when needed, a visual flight data representation to check the drone's status at a glance.
It turned out that the user's awareness of whether the drone is tracking them disturbs the protagonist's concentration on the main task. The current selfie-filming experience has the same problem, for example filming with a GoPro attached to a long pole; yet it turned out that removing the drone from sight did not reduce the workload of the person being filmed. Maybe this is due to the absence of certainty that the drone is following, or maybe the protagonist simply has to get used to the strange feeling that 'something' will definitely be filming them.