
Automatic Detection of Quarries and the Lithology below them in Switzerland

Huriel Reichel (FHNW) - Nils Hamel (UNIGE)
Supervision : Nils Hamel (UNIGE) - Raphael Rollier (swisstopo)

Proposed by swisstopo - PROJ-DQRY
June 2021 to January 2022 - Published on January 30th, 2022


Abstract: Mining is an important economic activity in Switzerland and is therefore monitored by the Confederation through swisstopo. Until now, the identification of quarries has been done manually; although this work is of very high quality, it unfortunately cannot keep up with the constant appearance, disappearance and evolution of these features. For this reason, swisstopo contacted the STDL to automatically detect quarries over the whole country. Training was performed on SWISSIMAGE imagery with 10 cm spatial resolution using the STDL Deep Learning Framework. Two iterations were carried out with the domain expert, including a manual correction of the detections used for a new training. This interaction with the domain expert proved highly relevant for the final results: in addition to his positive assessment, an F1 score of 85% was obtained in the end, which, given the peculiar characteristics of quarries, can be considered an excellent result.

1 - Introduction

Mining is an important economic activity worldwide, and this is also the case in Switzerland. The Federal Office of Topography (swisstopo) is responsible for monitoring the presence of quarries as well as the materials being extracted. This is highly relevant for planning the demand and supply of extracted materials and their transportation through the country. As this is of federal importance, the mapping of these features is already carried out. Although this work is very detailed and accurate, quarries have a very characteristic update pattern: they can appear and disappear in a matter of a few months, especially when they are relatively small, as is the case in Switzerland. It is therefore in swisstopo's interest to detect quarries automatically, in a way that is also reproducible over time.

A strategy often offered by the Swiss Territorial Data Lab is the automatic detection of objects in aerial imagery through deep learning, following our Object Detection Framework. It is fully applicable here: quarries in Switzerland are relatively small, so high-resolution imagery is required, something our neural network has proven to handle well in past projects. This high-resolution imagery is available through SWISSIMAGE, the swisstopo aerial image collection covering almost the whole country with a 10 cm pixel size (GSD).

Nevertheless, in order to train our neural network, and as is usually the case in deep learning, a large number of labelled images is required. These data serve as ground truth so that the neural network "learns" which objects should be detected and which should not. For this purpose, the work of the topographic landscape model (TLM) team of swisstopo has been of extreme importance: among many other surface features, quarries have been mapped all over Switzerland at a highly detailed scale.

Despite the high quality and precision of the TLM labels, quarries are constantly changing, appearing and disappearing, and the labels are therefore not always synchronized with the SWISSIMAGE acquisitions. This lack of synchronization between the two datasets can be seen in Figure 1, which shows the year of TLM mapping on the left and the year of the SWISSIMAGE flights on the right.


Figure 1 : Comparison of TLM (left) and SWISSIMAGE (right) temporality.

For this reason, two rounds of interaction with the domain expert were necessary. In order to obtain a ground truth fully synchronized with SWISSIMAGE, two stages of training were required: one using the TLM data and a second one using a manual correction of the labels predicted in the first iteration. It is of crucial importance to state that this correction had to be made by the domain expert, who carefully checked each detection in pre-defined tiles. With that in hand, we could proceed with a more trustworthy training.

As stated, swisstopo is also interested in identifying the material extracted in every quarry. For that purpose, the use of the GeoCover dataset from swisstopo was recommended. This dataset is a vector layer of the geological cover of the whole of Switzerland, which challenged us to cross the detector predictions with this vector information.

In summary, the challenge for the STDL was to investigate to what extent it is possible to automatically detect quarries from aerial imagery using deep learning, considering their high update rate.

2 - Methodology

First of all, the area of interest must be defined: this is where training and detection take place. In this case, a polygon covering the whole of Switzerland was used. The area of interest is then divided into tiles of fixed size, which define the slicing of SWISSIMAGE (served as WMS). Tiles of different sizes were tested, and 500x500 m tiles were retained for final usage. The image resolution must then be defined, which, again after several tests, was set to 512x512 pixels.
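As an illustration, a minimal sketch of such a tiling step is given below, using shapely and geopandas. The tile size follows the value above, while the function and variable names, the file name and the choice of the Swiss LV95 frame (EPSG:2056) are assumptions made for illustration and do not reproduce the exact STDL framework code.

```python
import geopandas as gpd
from shapely.geometry import box

TILE_SIZE = 500  # tile edge length in metres (500 x 500 m tiles)

def make_tile_grid(aoi_gdf, tile_size=TILE_SIZE):
    """Cover the bounding box of the area of interest with square tiles
    and keep only the tiles intersecting the AOI polygon."""
    minx, miny, maxx, maxy = aoi_gdf.total_bounds
    tiles = []
    y = miny
    while y < maxy:
        x = minx
        while x < maxx:
            tiles.append(box(x, y, x + tile_size, y + tile_size))
            x += tile_size
        y += tile_size
    grid = gpd.GeoDataFrame(geometry=tiles, crs=aoi_gdf.crs)
    # keep only the tiles that actually intersect the area of interest
    return grid[grid.intersects(aoi_gdf.unary_union)].reset_index(drop=True)

# hypothetical usage:
# aoi = gpd.read_file("switzerland.geojson").to_crs(2056)
# grid = make_tile_grid(aoi)
```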

For validation purposes, the data are then split into training, validation and test sets. The training set is used by the network for learning; the validation set is kept completely apart from training and used only to check the results, and the test set is used for the final, independent assessment. 70% of the data was used for training, 15% for validation and 15% for testing.
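A minimal sketch of such a random split, assuming the tiles are stored in a GeoDataFrame as above, is shown below; the 70/15/15 ratios follow the text, while the function name and the random seed are illustrative.

```python
import numpy as np

def split_tiles(grid, train=0.7, val=0.15, seed=42):
    """Randomly assign each tile to the training, validation or test set."""
    rng = np.random.default_rng(seed)
    shuffled = rng.permutation(len(grid))
    n_train = int(train * len(grid))
    n_val = int(val * len(grid))
    labels = np.empty(len(grid), dtype=object)
    labels[shuffled[:n_train]] = "trn"
    labels[shuffled[n_train:n_train + n_val]] = "val"
    labels[shuffled[n_train + n_val:]] = "tst"
    grid = grid.copy()
    grid["split"] = labels
    return grid
```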

Regarding the labels, the TLM labels were manually checked so that a group of approximately 250 labels fully synchronized with SWISSIMAGE was identified and recorded. The first round of training then follows the same framework as former STDL projects: we use a region-based convolutional neural network (R-CNN) with a ResNet-50 backbone provided by Detectron2. A deeper explanation of how the network works can be found here and here.
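For reference, a minimal Detectron2 training setup along these lines could look as follows. The choice of the Mask R-CNN instance-segmentation configuration, the dataset registration in COCO format, the file names and the single-class setting are assumptions for illustration, not the exact configuration used by the STDL framework.

```python
from detectron2 import model_zoo
from detectron2.config import get_cfg
from detectron2.data.datasets import register_coco_instances
from detectron2.engine import DefaultTrainer

# register the tiles and their quarry annotations (COCO format assumed here)
register_coco_instances("quarries_trn", {}, "annotations_trn.json", "images/trn")
register_coco_instances("quarries_val", {}, "annotations_val.json", "images/val")

cfg = get_cfg()
# R-CNN with a ResNet-50 FPN backbone, pre-trained on COCO
cfg.merge_from_file(model_zoo.get_config_file(
    "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url(
    "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml")
cfg.DATASETS.TRAIN = ("quarries_trn",)
cfg.DATASETS.TEST = ("quarries_val",)
cfg.MODEL.ROI_HEADS.NUM_CLASSES = 1  # single class: quarry

trainer = DefaultTrainer(cfg)
trainer.resume_or_load(resume=False)
trainer.train()
```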

Even with different parameter settings, it was observed that the predictions included too many false positives, mostly caused by snow: the reflectance of snow is probably similar to that of quarries, and this needed special treatment. For this purpose, the results were filtered. The features were first filtered based on their score value (threshold of 0.9) and then by elevation, using the SRTM digital elevation model: since snow cover is rarely found below approximately 1155 m, this value was used as a threshold. Finally, an area threshold is also applied, discarding the smallest predictions, and the remaining predictions are merged. A more detailed description of how to operate this first filter can be seen here.
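A simplified sketch of such a post-processing filter is shown below. The score and elevation thresholds follow the text, while the file names, the minimum-area value and the sampling of the DEM at prediction centroids are assumptions made for illustration; the actual filter of the framework may differ in its details.

```python
import geopandas as gpd
import rasterio

SCORE_MIN = 0.9       # minimum detection score
ELEVATION_MAX = 1155  # metres; above this, detections are likely snow
AREA_MIN = 5000       # square metres; hypothetical minimum quarry area

def filter_predictions(pred_path, dem_path):
    preds = gpd.read_file(pred_path)

    # 1. keep only confident detections
    preds = preds[preds["score"] >= SCORE_MIN]

    # 2. drop detections above the elevation threshold (snow areas)
    with rasterio.open(dem_path) as dem:
        coords = [(p.x, p.y) for p in preds.to_crs(dem.crs).geometry.centroid]
        preds["elevation"] = [v[0] for v in dem.sample(coords)]
    preds = preds[preds["elevation"] <= ELEVATION_MAX]

    # 3. drop the smallest polygons, then merge overlapping detections
    preds = preds[preds.geometry.area >= AREA_MIN]
    merged = preds.dissolve().explode(index_parts=False).reset_index(drop=True)
    return merged
```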

Once several tests had been performed, the new predictions were sent back to the domain experts for a detailed revision following a strict protocol, which mainly consisted of the removal of false positives and the addition of false negatives. This was performed by 4 different experts from swisstopo over 4 regions containing the same number of tiles to be analyzed. It is important to stress again the importance of domain expertise in this step, as a very careful, manual assessment of what is and what is not a quarry must be made.

Once the predictions were corrected, a new training session was performed using different parameters. The same tile size and resolution were used as in the first iteration (500x500 m tiles at 512x512 pixels), although this time a new filtering procedure was developed. It is very similar to the first one but applies the steps in a different order, yielding cleaner-looking predictions in the end, something the domain expert also cared about.

This procedure is summarized in figure 2.


Figure 2 : Methodology applied for the detection of quarries and new training sessions.

In the end, in order to also include the geological information for the detected quarries, a third layer is created from the intersection of the predictions with the GeoCover labels. This is done in such a way that the final user can click a feature to obtain both the information on the quarry (when it is not a pure prediction) and the information on the geology/lithology of this part of the quarry. As a result, each intersection polygon carries the attributes of both the quarry and GeoCover.
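A minimal sketch of this intersection step with geopandas is shown below; the file names are placeholders and the attribute columns carried over are simply those of the two input layers.

```python
import geopandas as gpd

quarries = gpd.read_file("quarry_predictions.shp")
geocover = gpd.read_file("geocover.shp").to_crs(quarries.crs)

# each output polygon keeps the attributes of both the quarry prediction
# and the intersecting GeoCover (lithology) unit
quarries_lithology = gpd.overlay(quarries, geocover, how="intersection")
quarries_lithology.to_file("quarries_with_lithology.shp")
```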

In order to evaluate the obtained results, the F1 score was computed, and the final predictions were also compared to the labels corrected by the domain experts. This comparison was done visually, by extracting the centroid of each detected quarry and by computing a heat-map, allowing one to inspect the spatial pattern of the detections. The heat-map was computed using a 10'000 m radius and a 100 m pixel size.
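For reference, precision, recall and the F1 score follow from the counts of true positives (TP), false positives (FP) and false negatives (FN); a minimal sketch of this standard computation:

```python
def detection_metrics(tp, fp, fn):
    """Precision, recall and F1 score from detection counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1
```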

3 - Results & Discussion

In the first iteration, when the neural network was trained with a subset of the labels from the TLM vector data, an optimal F1 score of approximately 0.78 was obtained. Figure 3 shows the behavior of the precision, recall and F1 score for the final model selected.


Figure 3 : Precision, Recall and F1 score of the first iteration (using TLM data).

With the predictions corrected by the domain experts, there was an outstanding improvement in the F1 score, which reached approximately 0.85 at its optimum, as seen in figure 4. A total of 1265 quarries were found in Switzerland after filtering.


Figure 4 : Precision, Recall and F1 score of the second iteration (using data corrected by the domain expert).

Figure 5 shows some examples of detected quarries, from which one can get a sense of the quality of the detection shapes and of how well they delineate the real-world quarries. Examples of false positives and false negatives, unfortunately still present in the detections, are also shown. This also illustrates how some objects can look very similar to quarries from a non-expert point of view and how they may influence the results. These examples of errors are a further indication of the importance of domain expertise in evaluating machine-made results.


Figure 5 : Examples of detected quarries, with true positive, false negative and false positive.

To check the validity of the new predictions, their centroids were plotted along with the centroids of the corrected labels, so that their spatial patterns could be compared and one could evaluate whether they show the same behavior. Figure 6 shows this plot.


Figure 6 : Disposition of the centroids of assessed predictions and final predictions.

One can see that, despite some slight differences, the overall spatial patterns of the two sets of predictions are very similar. A very similar conclusion can be drawn from the heat-map computed from these points, shown in figure 7.


Figure 7 : Heatmap of assessed predictions and final predictions.

There is a small area in the west of the country with fewer detections than expected, and in general there are more predictions than before. The objective of the heat-map is to give a general view of the results rather than an exact comparison, as a point is created for every feature and the new filter tends to smooth the results and to merge many features into a single one.

In the end, the results were also intersected with GeoCover, which provides the detailed lithology of the Swiss soil; an example of the result, visualized in the QGIS software, can be seen below.


Figure 8 : Intersection of predictions with GeoCover seen in QGIS.

Finally, and most importantly, the domain expert was highly satisfied with this work, given the support it can provide to swisstopo and the TLM team in mapping future quarries. The domain expert also expressed interest in pursuing the work by investigating the temporal pattern of quarries and estimating the volume of material in each quarry.

4 - Conclusion

Through this collaboration with swisstopo, we demonstrated that data science can provide relevant and efficient tools to ease complex and time-consuming tasks. With the produced inventory of quarries over the whole Swiss territory, we were able to provide a quasi-exhaustive view of the situation to the domain expert, giving him a better overview of the exploitation sites.

This is a major step forward compared to the previous situation. Indeed, before this project, the only solution available to the domain expert was to gather all the federal and cantonal data through a non-standardized and time-consuming process, hoping to obtain the beginnings of an inventory, with temporality issues. With the developed prototype, the entire SWISSIMAGE dataset can be processed within hours and turned into a full-scale inventory, guiding the domain expert directly toward his interests.

The resulting geographical layer can then be seen as the output of this demonstrator, which is able to turn aerial images into a simple polygonal layer representing the quarries, with few false positives and false negatives, providing the view the domain expert needs to understand the Swiss situation. With such a result, it is possible to combine it with all the other existing data, with GeoCover in the first place: this lithology model of the Swiss soil can be intersected with the produced quarry layer to create a secondary geographical layer merging quarry locations and soil types, leading to a powerful analysis tool for the domain expert.

The produced demonstrator shows that it is possible to derive, within hours, a simple and reliable geographical layer from a set of orthomosaics. The STDL was thus able to prove that the process can be repeated along the time dimension, for future and past images, opening the way to building and rebuilding the history and evolution of the quarries. With such a process, it will be possible to compute statistical quantities over the long term to capture the evolution of the resources, leading to a more reliable strategic understanding of Swiss resources and sovereignty.