Sensor comparison for underground void mapping by UAVs

Unmanned aerial vehicles (UAVs) are being employed in a rapidly increasing number of applications in mining. This includes the emerging area of mapping underground void spaces such as stopes, which are otherwise inaccessible to humans, automated ground vehicles and survey technologies. Void mapping can provide both visual rock surface and 3D structural information about stopes, supporting more effective planning of ongoing blast designs. Underground stope mapping by UAVs, however, involves overcoming a number of engineering challenges to allow flights beyond operator line‐of‐sight where there is no global positioning system (GPS), natural or artificial light, or existing communications infrastructure.

This paper describes the construction of a UAV sensor suite that uses sound navigation and ranging (SONAR) data to create a rough 3D model of the underground UAV operational environment in real time, giving operators high situational awareness for beyond-line-of-sight operations. The system also serves as a backup when dust obscures visual sensors, maintaining situational awareness and producing a coarser, but still informative, 3D model of the underground space.
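As an illustration of the kind of processing involved, the sketch below (Python, with hypothetical beam directions and pose inputs that are not taken from the paper) shows how sparse SONAR ranges could be projected into world-frame 3D points that accumulate into a rough model for the operator's display.

```python
import numpy as np

# Hypothetical fixed beam directions in the UAV body frame (unit vectors):
# forward, back, left, right, up, down.
BEAM_DIRECTIONS = np.array([
    [ 1,  0,  0], [-1,  0,  0],
    [ 0,  1,  0], [ 0, -1,  0],
    [ 0,  0,  1], [ 0,  0, -1],
], dtype=float)

def sonar_to_points(ranges, position, rotation):
    """Project one set of SONAR ranges into world-frame 3D points.

    ranges   : (6,) measured distances in metres (np.nan where no echo)
    position : (3,) UAV position in the world frame
    rotation : (3, 3) body-to-world rotation matrix from the state estimator
    """
    valid = np.isfinite(ranges)
    beams_world = (rotation @ BEAM_DIRECTIONS[valid].T).T   # rotate beams into the world frame
    return position + beams_world * ranges[valid][:, None]  # scale by range, translate by pose

# Example: hovering level at 2 m altitude with walls roughly 2.5-3 m away.
points = sonar_to_points(
    ranges=np.array([3.0, 3.0, 2.5, 2.5, 8.0, 2.0]),
    position=np.array([0.0, 0.0, 2.0]),
    rotation=np.eye(3),
)
print(points)  # accumulated over many poses, these points form the coarse 3D model
```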

Typically, light detection and ranging (LIDAR) systems have superseded SONAR sensors for similar applications. LIDAR is much more accurate than SONAR but has several disadvantages. SONAR data is sparse and therefore much easier to process in real time on-board the UAV than LIDAR data. SONAR sensor hardware is also lighter than current LIDAR systems, which matters given the constrained payload capacity of UAVs. The most important factor in favour of SONAR for this application, however, is its ability to operate in dusty or smoke-filled environments.

The UAV system was tested both above and below ground using a predefined path with checkpoint locations for the UAV to follow. Because GPS was unavailable, the UAV's location was estimated from surveyed reference points combined with photogrammetry. This made it possible to quantify how accurate the SONAR data is compared with 3D models built by photogrammetry from images captured with a separate digital single-lens reflex camera.
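As a simple illustration of how such position estimates can be checked against surveyed checkpoints, the sketch below uses placeholder coordinates (not values from the paper) to compute per-checkpoint position errors and their RMSE.

```python
import numpy as np

# Placeholder data: surveyed checkpoint positions and the positions
# recovered for the UAV by photogrammetry at those checkpoints.
surveyed  = np.array([[0.0, 0.0, 1.5],
                      [5.0, 0.0, 1.5],
                      [5.0, 4.0, 1.5]])
estimated = np.array([[0.1, -0.05, 1.48],
                      [5.2,  0.10, 1.55],
                      [4.9,  3.95, 1.42]])

errors = np.linalg.norm(estimated - surveyed, axis=1)  # per-checkpoint error (m)
rmse = np.sqrt(np.mean(errors ** 2))
print(f"per-checkpoint errors (m): {errors}, RMSE: {rmse:.3f} m")
```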

Comparing the shape of void surfaces determined by photogrammetry with that determined by SONAR gives a quantifiable measure of accuracy when the photogrammetry models are used as ground truth. Above-ground and underground pilot studies have shown that SONAR sensors deliver acceptable accuracy compared with photogrammetric modelling, sufficient for effective situational awareness during human operation of the UAV beyond line-of-sight.
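One common way to quantify such agreement is a cloud-to-cloud nearest-neighbour comparison; the sketch below illustrates the idea on synthetic data and is not necessarily the exact comparison method used in the paper.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
photogrammetry_pts = rng.uniform(0, 10, size=(50_000, 3))  # dense ground-truth cloud
sonar_pts = rng.uniform(0, 10, size=(500, 3))              # sparse SONAR-derived cloud

# Distance from each SONAR point to its nearest ground-truth point.
tree = cKDTree(photogrammetry_pts)
dists, _ = tree.query(sonar_pts)
print(f"mean: {dists.mean():.3f} m, 95th percentile: {np.percentile(dists, 95):.3f} m")
```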

To access the full paper, "A comparison of sensors for underground void mapping by unmanned aerial vehicles", click here.