Posted on 2021-12-21, 14:05. Authored by Ankur Baliyan, Hideto Imai, Akansha Dager, Olga Milikofu, Toru Akiba.
Simultaneously
detecting multiple Raman spectral signatures in two-dimensional/three-dimensional
(2D/3D) hyperspectral Raman analysis is a daunting challenge. The
underlying reasons include the enormous volume of the data
and the complexities involved in the end-to-end Raman analytics
pipeline: baseline removal, cosmic noise elimination, and extraction
of trusted spectral signatures and abundance maps. Elimination of
cosmic noise is the bottleneck in the entire Raman analytics pipeline.
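As one concrete illustration of the baseline-removal step in such a pipeline, the sketch below uses an iterative polynomial fit with peak clipping; this is a common generic choice, not necessarily the algorithm used in this work, and the degree and iteration count are illustrative:

```python
import numpy as np

def polynomial_baseline(spectrum, degree=3, n_iter=20):
    """Iterative polynomial baseline estimate: fit a low-order
    polynomial, then repeatedly refit after clipping points that
    lie above the current fit, so Raman peaks stop pulling the
    baseline upward. A generic sketch, not this paper's method."""
    x = np.arange(spectrum.size)
    y = spectrum.astype(float).copy()
    for _ in range(n_iter):
        fit = np.polyval(np.polyfit(x, y, degree), x)
        y = np.minimum(y, fit)  # clip peaks above the current fit
    return fit

# Baseline-corrected spectrum:
# corrected = spectrum - polynomial_baseline(spectrum)
```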
Unless this issue is addressed, the realization of autonomous Raman
analytics is impractical. Here, we present a learner-predictor strategy-based
“automated hyperspectral Raman analysis framework” to
rapidly fingerprint the molecular variations in the hyperspectral
2D/3D Raman dataset. We introduce the spectrum angle mapper (SAM)
technique to eradicate the cosmic noise from the hyperspectral Raman
dataset. The learner-predictor strategy eliminates the need for human
intervention, so the analytics can run in autonomous mode. The learner
has the ability to learn: it automatically eliminates the baseline
and cosmic noise from the Raman dataset, extracts the predominant
spectral signatures, and renders the respective abundance maps. In
a nutshell, the learner precisely learns the spectral feature space
during the hyperspectral Raman analysis. Afterward, the learned spectral
feature space is translated into a learned neural network (LNN) model. In
the predictor, machine-learned intelligence (LNN) is utilized to predict
the alternate batch specimen’s abundance maps in real time.
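The SAM screening introduced above can be sketched in a few lines: a cosmic-ray spike rotates a pixel's spectral vector away from the bulk signature, so a large spectral angle to a reference flags the pixel. In this sketch the median reference spectrum and the angle threshold are illustrative choices, not values from the paper:

```python
import numpy as np

def spectral_angle(a, b):
    """Angle (radians) between two spectra treated as vectors."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def flag_cosmic_spikes(cube, threshold=0.15):
    """Flag pixels whose spectrum deviates, by spectral angle, from
    the median spectrum of the cube. cube has shape
    (n_pixels, n_wavenumbers); returns a boolean mask of suspects."""
    ref = np.median(cube, axis=0)
    angles = np.array([spectral_angle(s, ref) for s in cube])
    return angles > threshold
```

The angle metric is insensitive to overall intensity scaling, which is why it separates sharp cosmic spikes from ordinary brightness variations between pixels.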
The qualitative/quantitative evaluation of abundance maps
lays the foundation for monitoring offline/inline industrial
quality assurance/quality control (QA/QC) processes. The present strategy is best suited
for 2D/3D/four-dimensional (4D) hyperspectral Raman spectroscopic
techniques. The proposed ML framework is practical because it obviates
the need for human intervention and sophisticated computational hardware;
a personal computer alone is enough for the end-to-end pipeline.
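For readers unfamiliar with abundance maps, the sketch below shows how per-pixel abundances can be rendered once spectral signatures have been extracted; ordinary least-squares unmixing here is a generic stand-in, not the paper's LNN predictor:

```python
import numpy as np

def abundance_maps(cube, signatures):
    """Least-squares estimate of how strongly each spectral signature
    contributes to every pixel spectrum.

    cube:       (n_pixels, n_wavenumbers) hyperspectral data
    signatures: (n_components, n_wavenumbers) extracted signatures
    Returns (n_pixels, n_components) abundances, clipped to >= 0."""
    # Solve cube.T ~= signatures.T @ A.T for the abundance matrix A.
    coeffs, *_ = np.linalg.lstsq(signatures.T, cube.T, rcond=None)
    return np.clip(coeffs.T, 0.0, None)
```

Reshaping the returned abundances back to the 2D/3D scan grid yields one abundance map per signature.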