
2021-12-05 · Zeitschriftenartikel · DOI: 10.18452/24215
Comparing End-to-End Machine Learning Methods for Spectra Classification
dc.contributor.author: Sun, Yue
dc.contributor.author: Brockhauser, Sandor
dc.contributor.author: Hegedus, Peter
dc.date.accessioned: 2022-02-24T13:37:12Z
dc.date.available: 2022-02-24T13:37:12Z
dc.date.issued: 2021-12-05
dc.date.updated: 2022-01-03T12:26:25Z
dc.identifier.uri: http://edoc.hu-berlin.de/18452/24881
dc.description.abstract: In scientific research, spectroscopy and diffraction experimental techniques are widely used and produce huge amounts of spectral data. Learning patterns from spectra is critical during these experiments, because it provides immediate feedback on the actual status of the experiment (e.g., the time-resolved state of the sample), which helps guide the experiment. The two major spectral changes that we aim to capture are either a change in the intensity distribution (e.g., the drop or appearance) of peaks at certain locations, or a shift of those peaks along the spectrum. This study aims to develop deep learning (DL) classification frameworks for one-dimensional (1D) spectral time series. In this work, we approach the spectra classification problem from two different perspectives: as a general two-dimensional (2D) space segmentation problem, and as a common 1D time series classification problem. We focus on two proposed classification models under these two settings, namely the end-to-end binned Fully Connected Neural Network (FCNN) model with automatically captured weighting factors and the convolutional Spatial-Channel-Temporal (SCT) attention model. Under the 1D time series classification setting, several other end-to-end structures based on FCNN, Convolutional Neural Network (CNN), ResNet, Long Short-Term Memory (LSTM), and Transformer architectures were also explored. Finally, we evaluated and compared the performance of these classification models on the High Energy Density (HED) spectra dataset from multiple perspectives, and further performed a feature importance analysis to explore their interpretability. The results show that all the applied models can achieve 100% classification confidence, but the models applied under the 1D time series classification setting are superior. Among them, Transformer-based methods consume the least training time (0.449 s). Our proposed convolutional SCT attention model takes 1.269 s, but its self-attention mechanism, applied across the spatial, channel, and temporal dimensions, suppresses indistinguishable features better than the others and selectively focuses on obvious features with high separability. [eng]
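Note: the abstract compares several end-to-end 1D classifiers (FCNN, CNN, ResNet, LSTM, Transformer, and the proposed convolutional SCT attention model) applied directly to raw spectra. The following minimal PyTorch sketch is purely illustrative of that end-to-end 1D setting and is not the authors' architecture; the layer sizes, the 1000-point spectrum length, and the two-class output are assumptions.

    # Illustrative sketch only (not the paper's model): an end-to-end 1D CNN
    # that maps a raw spectrum directly to class logits.
    import torch
    import torch.nn as nn

    class Spectra1DCNN(nn.Module):
        def __init__(self, n_classes=2):  # two classes assumed for illustration
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(1, 16, kernel_size=7, padding=3),
                nn.ReLU(),
                nn.MaxPool1d(2),
                nn.Conv1d(16, 32, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),  # global pooling: head is independent of spectrum length
            )
            self.classifier = nn.Linear(32, n_classes)

        def forward(self, x):  # x: (batch, 1, n_points) intensity values
            return self.classifier(self.features(x).squeeze(-1))

    # Example: classify a batch of 4 spectra with 1000 sampling points each.
    model = Spectra1DCNN()
    logits = model(torch.randn(4, 1, 1000))
    print(logits.shape)  # torch.Size([4, 2])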
dc.language.iso: eng
dc.publisher: Humboldt-Universität zu Berlin
dc.rights: (CC BY 4.0) Attribution 4.0 International [ger]
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: end-to-end machine learning [eng]
dc.subject: spectra classification [eng]
dc.subject: deep learning (DL) [eng]
dc.subject: convolutional neural network (CNN) [eng]
dc.subject: transformer [eng]
dc.subject: long short-term memory (LSTM) [eng]
dc.subject: self-attention [eng]
dc.subject.ddc: 004 Informatik
dc.subject.ddc: 600 Technik und Technologie
dc.title: Comparing End-to-End Machine Learning Methods for Spectra Classification
dc.type: article
dc.identifier.urn: urn:nbn:de:kobv:11-110-18452/24881-9
dc.identifier.doi: http://dx.doi.org/10.18452/24215
dc.type.version: publishedVersion
local.edoc.pages: 26
local.edoc.type-name: Zeitschriftenartikel
local.edoc.container-type: periodical
local.edoc.container-type-name: Zeitschrift
dc.description.version: Peer Reviewed
dc.identifier.eissn: 2076-3417
dcterms.bibliographicCitation.doi: 10.3390/app112311520
dcterms.bibliographicCitation.journaltitle: Applied Sciences : open access journal
dcterms.bibliographicCitation.volume: 11
dcterms.bibliographicCitation.issue: 23
dcterms.bibliographicCitation.articlenumber: 11520
dcterms.bibliographicCitation.originalpublishername: MDPI
dcterms.bibliographicCitation.originalpublisherplace: Basel
bua.department: Mathematisch-Naturwissenschaftliche Fakultät
