Show simple item record

dc.contributor.advisor: Burud, Ingunn
dc.contributor.advisor: Osten, Julia
dc.contributor.author: Jäkel, Annika
dc.coverage.spatial: Europe [en_US]
dc.date.accessioned: 2020-07-01T12:10:39Z
dc.date.available: 2020-07-01T12:10:39Z
dc.date.issued: 2020
dc.identifier.uri: https://hdl.handle.net/11250/2660341
dc.description.abstract: The objective of this thesis was to develop an automated labeling system for RGB (red, green, blue) images of sugar beet and weed plants with the help of multispectral imaging. 863 image pairs of sugar beet and 18 weed species, each consisting of one RGB and one multispectral image of the same plants, were acquired in the lab. Different apertures and bandpass filters were tested, and the multispectral camera captured 15 wavebands between 654 and 866 nm. The pixels of the multispectral images were classified with a pipeline of two fully connected artificial neural networks (ANNs) of the same architecture with ten hidden layers. The first ANN distinguished between plants and background, and the second between sugar beet and weed. The transfer of the classifications based on the multispectral images onto the RGB images was attempted with a local motion model (imregdemons, Matlab) and a global motion model (projection). One projection matrix (homography) was computed for each acquisition session during which the camera position did not change and the plants had a similar height. The best homographies were chosen based on spatial parameters, the mean squared error, and the sum of the correlation matrix between the RGB image and the warped spectral image. The classification accuracy of the background versus plant classifier was ≥ 98 % for both classes. The sugar beet versus weed classifier reached per-class accuracies between 73 and 95 % and Dice coefficients between 0.71 and 0.92 on the evaluation data set. On a plant level, classification results were very satisfactory if plants of the same age (sugar beet) and species (weed) had been included in the training data set. The application of the local motion model failed, most likely due to large differences in resolution, reflectance values, and image sections. After the filtering of the projection matrices, 95 % of the image pairs reached a satisfactory projection of bounding boxes.
The accuracy of the projection was not high enough to transfer pixel segmentation masks; this could be achieved by additionally applying a local motion model. The overall goal, to automatically label sugar beet and weed plants, was achieved for bounding boxes. Nevertheless, the system, and especially the image registration, can be further improved with regard to reliability and performance. The developed labeling system will be tested with field data. [en_US]
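The two-stage classification described in the abstract (background vs. plant first, then sugar beet vs. weed on the remaining pixels) can be sketched as follows. The thesis used trained ten-hidden-layer fully connected networks in Matlab; this NumPy sketch uses randomly initialized weights and an illustrative hidden width of 32 purely to show the pipeline structure, not the trained models.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_mlp(n_in, n_hidden, n_layers, n_out):
    """Weights for a fully connected net with n_layers hidden layers.
    Random initialization here; stands in for the trained networks."""
    sizes = [n_in] + [n_hidden] * n_layers + [n_out]
    return [(rng.standard_normal((a, b)) * 0.1, np.zeros(b))
            for a, b in zip(sizes[:-1], sizes[1:])]

def forward(layers, x):
    """Forward pass; ReLU on hidden layers, argmax class per pixel."""
    for i, (W, b) in enumerate(layers):
        x = x @ W + b
        if i < len(layers) - 1:
            x = np.maximum(x, 0.0)
    return x.argmax(axis=1)

# Each pixel is a 15-band spectrum, as in the acquisition setup.
pixels = rng.random((1000, 15))

bg_vs_plant  = make_mlp(15, 32, 10, 2)   # stage 1
beet_vs_weed = make_mlp(15, 32, 10, 2)   # stage 2, same architecture

# Stage 1: classify every pixel as background (0) or plant (1).
is_plant = forward(bg_vs_plant, pixels) == 1

# Stage 2: only plant pixels are passed to the beet/weed classifier.
labels = np.full(len(pixels), -1)        # -1 marks background
labels[is_plant] = forward(beet_vs_weed, pixels[is_plant])
```

The point of the cascade is that the second classifier never sees background pixels, so it only has to separate the two plant classes.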
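The global motion model amounts to estimating one homography per acquisition session and keeping only the good ones. A minimal NumPy sketch of that idea, assuming point correspondences between the two images are available: estimate the 3×3 matrix with the standard direct linear transform (DLT), then score it by reprojection mean squared error, analogous to the error-based filtering in the thesis (which additionally used spatial parameters and a correlation sum).

```python
import numpy as np

def estimate_homography(src, dst):
    """DLT estimate of a 3x3 homography from >= 4 correspondences.
    src, dst: Nx2 arrays of matched (x, y) points."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1,  0,  0,  0, u * x, u * y, u])
        A.append([ 0,  0,  0, -x, -y, -1, v * x, v * y, v])
    # The homography is the right singular vector of the smallest
    # singular value, reshaped to 3x3 and normalized.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def project(H, pts):
    """Apply homography H to Nx2 points (homogeneous divide)."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

def reprojection_mse(H, src, dst):
    """Mean squared reprojection error; the filtering criterion."""
    return float(np.mean(np.sum((project(H, src) - dst) ** 2, axis=1)))

# Synthetic check: recover a known scale-and-shift homography.
H_true = np.array([[2.0, 0.0,  5.0],
                   [0.0, 2.0, -3.0],
                   [0.0, 0.0,  1.0]])
src = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [2, 3], [4, 1]], float)
dst = project(H_true, src)
H_est = estimate_homography(src, dst)
mse = reprojection_mse(H_est, src, dst)
```

With noise-free correspondences the DLT recovers the transform essentially exactly; with real data, a threshold on `mse` would accept or reject each session's homography, mirroring the filtering step that retained 95 % of the image pairs.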
dc.language.iso: eng [en_US]
dc.publisher: Norwegian University of Life Sciences, Ås [en_US]
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/deed.no
dc.subject: Automated labeling [en_US]
dc.subject: Computer vision [en_US]
dc.title: Automation of the labeling of images of sugar beet cultivation with hyperspectral imaging [en_US]
dc.type: Master thesis [en_US]
dc.subject.nsi: VDP::Mathematics and natural science: 400::Information and communication science: 420::Simulation, visualization, signal processing, image processing: 429 [en_US]
dc.subject.nsi: VDP::Agriculture and fishery disciplines: 900::Agriculture disciplines: 910::Agricultural technology: 916 [en_US]
dc.source.pagenumber: 88 [en_US]
dc.description.localcode: M-DV [en_US]



Attribution-NonCommercial-NoDerivatives 4.0 International
Unless otherwise stated, this item is licensed as Attribution-NonCommercial-NoDerivatives 4.0 International