Talk, CC BY 4.0
Published

Integrating context-based recommendation with deep NN image classification for plant identification tasks

Accurate plant species identification is essential for many scenarios in botanical research and biodiversity conservation. Since a main obstacle is the large number of candidate species to consider, assistance through automatic identification techniques is highly desirable. On the one hand, photos of plant organs taken by users in the field can be used effectively in machine learning-based image classification to predict the most likely matching taxa. On the other hand, metadata on the user's spatio-temporal context usually goes unused, despite its potential to serve as an additional signal to augment and improve prediction quality. We develop a recommender system that utilizes a user's context to predict a list of plant taxa most likely to be observed at a given geographical location and time. Using a data-driven approach, we integrate knowledge on plant observations, species distribution maps, phenology and environmental geodata in order to calculate contextual recommendations on a local scale. The resulting model facilitates fine-grained ranking of plant taxa expected to occur in close proximity to a user in the field. Focusing on the territory of Germany with a list of the most common wild flowering plant taxa, we are presented with a 2.8k-class problem. Using a NASNet deep convolutional neural network trained on 860k taxon-labelled plant images, we presently achieve an 82% top-1 prediction accuracy. For the recommender system, the combination of biogeographical information, phenology and habitat suitability models shows viable results: based on contextual metadata alone, it reduces the list of candidate taxa on average more than threefold, with a recall of 25% for the top 20 list positions, 50% for the first 70, and 90% for the full recommended list. We show how prediction performance
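One way the two signal sources described above could be integrated is a late fusion of the per-taxon scores: the image classifier's softmax output and the contextual recommender's prior are combined into a single ranking. The following sketch is a hypothetical illustration, not the authors' published method; the function names, the geometric-mean fusion rule, and the `weight` parameter are all assumptions.

```python
# Hypothetical late-fusion sketch: combine per-taxon probabilities from an
# image classifier with a contextual prior from a recommender system.
# The multiplicative (log-linear) fusion rule is an assumption for
# illustration, not the method described in the abstract.

def fuse_scores(image_probs, context_probs, weight=0.5):
    """Return renormalized fused scores per taxon.

    image_probs:   dict taxon -> softmax probability from the CNN
    context_probs: dict taxon -> contextual prior (location/season)
    weight:        relative influence of the image score (0..1)
    """
    fused = {}
    for taxon, p_img in image_probs.items():
        # Taxa absent from the contextual list get a tiny floor prior
        # rather than zero, so an image-only match is not eliminated.
        p_ctx = context_probs.get(taxon, 1e-9)
        fused[taxon] = (p_img ** weight) * (p_ctx ** (1.0 - weight))
    total = sum(fused.values())
    return {taxon: score / total for taxon, score in fused.items()}

def top_k(fused, k=3):
    """Rank taxa by fused score, highest first."""
    return sorted(fused, key=fused.get, reverse=True)[:k]
```

For example, if the CNN slightly prefers taxon A but the contextual recommender strongly favors taxon B at the user's location and date, the fused ranking can promote B to the top, which is the intuition behind augmenting image classification with context.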

