Effective communication during a forest fire is vital both to tackling the fire and to keeping teams in the field safe. Teams can be spread over huge remote areas, with difficult terrain interrupting standard radio communications. Little to no communications infrastructure exists in these environments, adding to the challenge, and what infrastructure does exist may be damaged or unreliable because of the fire and smoke.
The solution proposed in this project implements a movable, robust and fast-acting communications network using a team of autonomous quadcopters. The management team specifies where the crew is working and where communications are required, and a swarm of drones automatically ensures coverage of the specified area by deploying a network of radio relays. A simplified model of the system is shown below.
The quadcopters will carry radio relays and communicate with neighbouring beacons to ensure constant connectivity. This keeps isolated groups of firefighters connected to each other and to outside teams, so that organisation and coordination can be maintained in the chaotic environment of a remote forest fire.
To conserve battery life, the quadcopters will land in elevated locations that maximise signal coverage. They will also be equipped with temperature sensors as a safety precaution and will automatically relocate if they detect they are in the path of the fire.
Proposed Implementation Schedule
For an initial proof of concept, the system will allow a user to specify where the communication network is needed, and a drone will fly to that location and land with its radio beacon.
Stage 1 is to build the quadcopter and achieve stable flight.
Stage 2 is to integrate the Google Maps API into the user application.
Stage 3 is to be able to define waypoints on the map.
Stage 4 is to be able to draw a user-defined area of operations in which to deploy radio beacons.
Stage 5 is to calculate the distances and angles between GPS coordinates.
Stage 6 is to automatically calculate where drones should be within the user-defined operating area and output the resulting GPS coordinates.
Future work will be required to output these GPS coordinates to the NXP IoT module to allow for fully autonomous flights. Drones will be commanded to specific coordinates and hover in place until their battery is low, then fly back to a defined base point for recharging so they can be sent back out. Alternatively, drones could fly out to their locations and land, increasing service time.
An additional feature that would improve system safety is temperature sensing, to preserve drones in high-risk areas and mitigate the risk of fire damage when operating near the blaze.
Due to changing drone laws in the UK and delays in package deliveries, the duration of the project was cut short. As a result, much of the automation planned for the project had to be removed to achieve a reasonable time frame; it is planned for a future competition.
Stage 1 Implementation
Once the drone had been built, I tested the controls to ensure it would hover, the control systems were stable, and everything was working properly.
The PID settings could be further optimised, as shown below: when turning quickly the drone can get a little twitchy, but for the most part it performs well enough to move on to the next stage of implementation and integration.
Stage 2 Implementation
Now the drone is flying, the software fun can begin. An initial GUI that simply allows the user to define an operating area has been developed. The first step is to implement a GUI with an integrated Google Maps window, to allow for operation anywhere in the world.
The project has been implemented as a Windows Forms application, as opposed to a web-based server, so that it can be used anywhere no matter what connectivity is available. The mapping component used is the GMap.NET plugin, chosen for its flexibility and ease of use. The map is centred on the GPS latitude and longitude entered in the appropriate text boxes; these are converted from strings to doubles and stored as global variables.
Stage 3 Implementation
Stage 3 adds the ability for the user to define waypoints on the map. These waypoints store their GPS coordinates in a list for use in later calculations, and are generated automatically when the user clicks anywhere in the map window.
Stage 4 Implementation
Stage 4 adds the ability to draw a user-defined area of operations in which to deploy radio beacons. Converting the user's mouse clicks into a polygon overlay on the map is the first step towards automating drone coordination within an operating area.
To achieve this, each mouse click is stored as a GPS coordinate in a list; once the user confirms that the listed points bound the intended area, they are converted into a polygon overlay.
Other features added at this stage include a "clear all waypoints" button that restores the form to its default state: it removes all markers and polygons, resets the map to its original scale and location, and clears all background variables.
Another feature is the ability to move the map to anywhere in the world rather than a pre-entered GPS coordinate: changing the Lat/Long coordinates automatically re-centres the map on the new position.
Stage 5 Implementation
Stage 5 is to calculate the distances and angles between GPS coordinates. These functions will be needed for Stage 6, and are implemented using the cosine rule and Pythagoras' theorem.
To calculate the length of a line with a known height and width, Pythagoras' theorem is all that is required: a^2 + b^2 = c^2, therefore c = sqrt(a^2 + b^2). Here a is the latitude difference between the two points and b is the longitude difference.
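The application itself is a Windows Forms program, so its code is not reproduced here; as a language-neutral illustration, the same flat-plane calculation can be sketched in Python (treating the latitude and longitude differences as planar offsets, which is adequate over the small areas involved):

```python
import math

def length(p1, p2):
    """Planar separation, in degrees, between two (lat, lon) points.

    Flat-plane approximation: the latitude and longitude differences
    form the two short sides of a right triangle, so Pythagoras gives
    the hypotenuse c = sqrt(a^2 + b^2).
    """
    a = p1[0] - p2[0]  # latitude difference
    b = p1[1] - p2[1]  # longitude difference
    return math.sqrt(a**2 + b**2)
```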
The angle function uses the cosine rule, cos C = (b^2 + c^2 - a^2) / (2bc). Each of the three lengths is calculated using the length formula outlined above and passed into the cosine formula; the function returns the resulting angle.
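Continuing the Python sketch (again an illustration of the maths, not the project's actual code), the angle at a vertex can be found by computing the three side lengths and applying the cosine rule:

```python
import math

def length(p1, p2):
    # Planar separation between two (lat, lon) points, as in Stage 5.
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def angle_at(vertex, p1, p2):
    """Angle in degrees at `vertex`, subtended by p1 and p2.

    Cosine rule: cos C = (b^2 + c^2 - a^2) / (2bc), where a is the
    side opposite the vertex and b, c are the sides adjacent to it.
    """
    a = length(p1, p2)
    b = length(vertex, p1)
    c = length(vertex, p2)
    return math.degrees(math.acos((b**2 + c**2 - a**2) / (2 * b * c)))
```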
Stage 6 Implementation
Stage 6 is where the fun automation comes in: it automatically calculates where drones should be within the user-defined operating area and outputs the resulting GPS coordinates.
The basic theory of this automation is that the user defines an operating area as shown, and this area is broken down into triangles, each defined by three GPS coordinates. Using the functions discussed above, the properties of these individual triangles can then be calculated. This iteration also adds the ability to zoom the map in and out by defining a zoom radius: enter an approximate size for your operating area and the application calculates the best zoom level for it.
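One simple way to break an operating area into triangles is a fan triangulation from the first vertex. This is a Python sketch under the assumption that the drawn polygon is convex; the project's own splitting routine is not shown:

```python
def fan_triangulate(polygon):
    """Split a convex polygon into triangles sharing the first vertex.

    `polygon` is an ordered list of (lat, lon) vertices; each returned
    triangle is a 3-tuple of GPS coordinates, as described in Stage 6.
    """
    return [(polygon[0], polygon[i], polygon[i + 1])
            for i in range(1, len(polygon) - 1)]
```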
Once the area has been broken into triangles, a square grid of candidate waypoints is overlaid on the area. Determining which of these waypoints fall within the user-defined area involves some fun maths. For each grid waypoint, the angles subtended at that point by each pair of the triangle's vertices are calculated. If the three angles sum to 360 degrees (or very close to it, due to floating-point rounding), the point lies within the triangle; if the sum is less than or greater than 360 degrees, it lies outside. This calculation is performed for every point and every triangle using nested for loops.
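The angle-sum test and the nested loops can be sketched in Python as follows (an illustration of the method described above, not the application's C# code; note that, as discussed next, a point matching more than one triangle is kept more than once):

```python
import math

def angle_at(vertex, p1, p2):
    # Angle at `vertex` subtended by p1 and p2 (cosine rule on planar
    # lat/lon offsets, as in Stage 5).
    a = math.hypot(p1[0] - p2[0], p1[1] - p2[1])
    b = math.hypot(vertex[0] - p1[0], vertex[1] - p1[1])
    c = math.hypot(vertex[0] - p2[0], vertex[1] - p2[1])
    return math.degrees(math.acos((b*b + c*c - a*a) / (2 * b * c)))

def inside_triangle(point, tri, tol=1e-6):
    # The angles from the point to each pair of triangle vertices sum
    # to 360 degrees if and only if the point lies inside the triangle.
    a, b, c = tri
    total = (angle_at(point, a, b) + angle_at(point, b, c)
             + angle_at(point, c, a))
    return abs(total - 360.0) < tol

def waypoints_in_area(grid, triangles):
    # Nested loops over every grid point and every triangle; a point
    # inside several triangles appears once per match (de-duplicated
    # in a later step).
    kept = []
    for point in grid:
        for tri in triangles:
            if inside_triangle(point, tri):
                kept.append(point)
    return kept
```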
This method of evaluating waypoints means that a waypoint falling inside multiple triangles will be generated more than once, so additional processing is needed to remove these duplicates before the list is output to the drones.
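The de-duplication could be as simple as an order-preserving pass over the list. A hypothetical Python sketch (the project's actual post-processing is not shown, and the rounding precision is an assumption):

```python
def dedupe_waypoints(waypoints, places=6):
    """Remove duplicate (lat, lon) waypoints, keeping first occurrences.

    Coordinates are rounded to 6 decimal places (roughly 0.1 m) so that
    waypoints generated by neighbouring triangles compare equal despite
    floating-point noise; order is preserved.
    """
    seen = set()
    unique = []
    for lat, lon in waypoints:
        key = (round(lat, places), round(lon, places))
        if key not in seen:
            seen.add(key)
            unique.append((lat, lon))
    return unique
```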
All of the waypoints within the user-specified operating area are then drawn on the map, with their GPS coordinates output in the text box. Outputting these to the IoT module is the next step.
Future Work Required to Improve the System
The next step is to integrate this user application with the IoT module and command the drone to fly. QGroundControl already supports outputting GPS coordinates to the drone and automating missions, so mirroring this API in the application should be relatively straightforward. The main area for further work is safety and risk mitigation. Detecting low battery and automatically flying back to a designated safe point can be handled in the PX4 firmware and is straightforward; however, temperature sensing, automatic repositioning of drones that deem an area too hot or unsafe, and communicating this back to the command station are essential for a completely autonomous and smart system.
To conclude, a user application has been developed that allows the user to define where ground operators are working and where a radio communications network is required. The range of the radio relay is defined as a global variable in the code, so whichever commercially available relay is attached to the drones, its range can be entered and the spacing between drones updated automatically.
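For illustration, converting a relay's quoted range in metres into a waypoint grid spacing in degrees might look like the sketch below. The function and variable names are hypothetical (not taken from the project code), and the 111,320 m figure is the approximate length of one degree of latitude:

```python
import math

METRES_PER_DEGREE_LAT = 111_320  # approximate length of 1 degree of latitude

def grid_spacing_degrees(relay_range_m, latitude_deg):
    """Convert a relay range in metres to (lat, lon) grid spacing in degrees.

    Latitude degrees have roughly constant length; longitude degrees
    shrink by cos(latitude), so the east-west spacing in degrees must
    be widened to keep the same physical separation between drones.
    """
    lat_spacing = relay_range_m / METRES_PER_DEGREE_LAT
    lon_spacing = lat_spacing / math.cos(math.radians(latitude_deg))
    return lat_spacing, lon_spacing
```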
The generated array of coordinates can then be output to the drones automatically, commanding them to fly to their positions and creating a self-updating communications network that allows operators to communicate freely without relying on local infrastructure that may not exist in remote areas.
Using the Application
Instructions for using the application .exe, found in the GitHub repository (GPSTest3/bin/Release/GPSTest3.exe), are outlined below. It is advisable not to set the beacon radius to a small number or to draw a large operating area, as this is a large drain on CPU resources with the current implementation. Keeping the values at their defaults should produce a realistic scenario.