Multiscale Feature Learning From Coupled Neuroelectrical and Hemodynamic Time Series
Keywords:
Multiscale analysis, feature learning, neuroelectrical signals, hemodynamic signals, time-series fusion, signal processing, EEG–fNIRS fusion

Abstract
Coupled neuroelectrical and hemodynamic measurements provide complementary views of brain activity, combining high-temporal-resolution electrical processes with spatially localized vascular responses. Joint analysis of these modalities remains difficult because of their incompatible sampling rates, heterogeneous signal properties, and the intrinsic delays introduced by neurovascular coupling. This article proposes a multiscale feature learning framework for coupled neuroelectrical and hemodynamic time series. The framework is grounded in multiresolution signal processing: each modality is first decomposed into scale-specific representations that capture both transient and long-term temporal structure. Discriminative features are then extracted from these representations and matched through a cross-modal coupling analysis that accounts for time warping and scale variations. Finally, a feature fusion strategy produces an integrated representation suitable for downstream classification and pattern recognition tasks. The proposed methodology is evaluated on established multimodal neurophysiological benchmark datasets, with performance measured by representation quality, classification accuracy, and cross-scale stability. The experimental results show consistent improvements over single-scale and single-modal baselines, demonstrating the benefit of multiscale modelling for heterogeneous brain signals. They further indicate that explicitly integrating multiresolution signal characteristics and cross-modal interrelations considerably improves multimodal neurophysiological analysis.
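The decomposition-then-fusion pipeline described above can be sketched in a minimal form. The sketch below uses a simple Haar wavelet decomposition and per-scale log-variance features as illustrative stand-ins; the specific wavelet, feature choice, and three-level depth are assumptions for the example, not the method fixed by the text.

```python
import numpy as np

def haar_multiscale(x, levels=3):
    """Decompose a 1-D signal into scale-specific detail components
    via a Haar wavelet (an illustrative multiresolution analysis)."""
    details = []
    approx = np.asarray(x, dtype=float)
    for _ in range(levels):
        if len(approx) % 2:                      # pad to even length
            approx = np.append(approx, approx[-1])
        pairs = approx.reshape(-1, 2)
        details.append((pairs[:, 0] - pairs[:, 1]) / np.sqrt(2))
        approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2)
    return details, approx

def scale_features(details, approx):
    """Per-scale log-variance features, one value per scale."""
    feats = [np.log(np.var(d) + 1e-12) for d in details]
    feats.append(np.log(np.var(approx) + 1e-12))
    return np.array(feats)

rng = np.random.default_rng(0)
eeg = rng.standard_normal(256)                    # toy neuroelectrical trace
fnirs = 0.1 * np.cumsum(rng.standard_normal(256)) # toy hemodynamic trace

# Fusion here is simple concatenation of the per-modality scale features.
fused = np.concatenate([scale_features(*haar_multiscale(eeg)),
                        scale_features(*haar_multiscale(fnirs))])
print(fused.shape)  # → (8,): 4 scale features per modality
```

Concatenation is the simplest fusion strategy; a learned fusion (e.g. a trained projection or classifier over the concatenated vector) would follow the same interface.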
This research highlights the value of multiscale signal processing models for the joint analysis of challenging time-series data and contributes a systematic foundation for future signal-based multimodal brain-monitoring applications.
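The time-warping aspect of the cross-modal coupling analysis mentioned above can be illustrated with classic dynamic time warping (DTW), used here as a generic stand-in for aligning a fast electrical signal to its delayed hemodynamic counterpart; the delay value and sinusoidal toy signals are assumptions for the example.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic-time-warping distance between two 1-D series,
    allowing nonlinear temporal alignment (e.g. neurovascular lag)."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

t = np.linspace(0.0, 1.0, 100)
electrical = np.sin(2 * np.pi * 3 * t)             # fast rhythm (toy)
hemodynamic = np.sin(2 * np.pi * 3 * (t - 0.05))   # same rhythm, delayed

# Warping can only reduce the cost relative to rigid sample-by-sample matching.
print(dtw_distance(electrical, hemodynamic)
      <= np.sum(np.abs(electrical - hemodynamic)))  # prints True
```

In the full framework this alignment would be applied per scale, so that coupling is assessed between comparable temporal structures rather than raw samples.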
