GBD 2019 Diseases and Injuries Collaborators. Global burden of 369 diseases and injuries in 204 countries and territories, 1990–2019: a systematic analysis for the Global Burden of Disease Study 2019. Lancet 396, 1204–1222 (2020).
Mitchell, A. J., Vaze, A. & Rao, S. Clinical diagnosis of depression in primary care: a meta-analysis. Lancet 374, 609–619 (2009).
de Aguiar Neto, F. S. & Rosa, J. L. G. Depression biomarkers using non-invasive EEG: A review. Neurosci. Biobehav. Rev. 105, 83–93 (2019).
Stolicyn, A., Steele, J. D. & Series, P. Prediction of depression symptoms in individual subjects with face and eye movement tracking. Psychol. Med. 52, 1784–1792 (2022).
Toto, E., Tlachac, M. & Rundensteiner, E. A. AudiBERT: a deep transfer learning multimodal classification framework for depression screening. In Proceedings 30th ACM International Conference on Information & Knowledge Management 4145–4154. https://doi.org/10.1145/3459637.3481895 (2021).
Francese, R. & Attanasio, P. Emotion detection for supporting depression screening. Multimed. Tools Appl. 82, 12771–12795 (2023).
Hasler, G. Pathophysiology of depression: do we have any solid evidence of interest to clinicians? World Psychiatry 9, 155 (2010).
Han, S. et al. Orbitofrontal cortex-hippocampus potentiation mediates relief for depression: a randomized double-blind trial and TMS-EEG study. Cell Rep. Med. 4, 101060 (2023).
Rajkowska, G. et al. Morphometric evidence for neuronal and glial prefrontal cell pathology in major depression. Biol. Psychiatry 45, 1085–1098 (1999).
Drevets, W. C. et al. Subgenual prefrontal cortex abnormalities in mood disorders. Nature 386, 824–827 (1997).
Mayberg, H. S. et al. Reciprocal limbic-cortical function and negative mood: converging PET findings in depression and normal sadness. Am. J. Psychiatry 156, 675–682 (1999).
Esterman, M. et al. Frontal eye field involvement in sustaining visual attention: evidence from transcranial magnetic stimulation. Neuroimage 111, 542–548 (2015).
Corbetta, M. & Shulman, G. L. Control of goal-directed and stimulus-driven attention in the brain. Nat. Rev. Neurosci. 3, 201–215 (2002).
Lazarov, A., Ben-Zion, Z., Shamai, D., Pine, D. S. & Bar-Haim, Y. Free viewing of sad and happy faces in depression: a potential target for attention bias modification. J. Affect. Disord. 238, 94–100 (2018).
Girard, J. M. et al. Nonverbal social withdrawal in depression: evidence from manual and automatic analysis. Image Vis. Comput. 32, 641–647 (2014).
Deligianni, F., Guo, Y. & Yang, G. Z. From emotions to mood disorders: a survey on gait analysis methodology. IEEE J. Biomed. Health Inform. 23, 2302–2316 (2019).
Kebets, V. et al. Somatosensory-motor dysconnectivity spans multiple transdiagnostic dimensions of psychopathology. Biol. Psychiatry 86, 779–791 (2019).
Schrijvers, D., Hulstijn, W. & Sabbe, B. G. Psychomotor symptoms in depression: a diagnostic, pathophysiological and therapeutic tool. J. Affect. Disord. 109, 1–20 (2008).
Sobin, C. & Sackeim, H. A. Psychomotor symptoms of depression. Am. J. Psychiatry 154, 4–17 (1997).
He, L. et al. Deep learning for depression recognition with audiovisual cues: a review. Inf. Fusion 80, 56–86 (2022).
Abd-Alrazaq, A. et al. Systematic review and meta-analysis of performance of wearable artificial intelligence in detecting and predicting depression. npj Digit. Med. 6, 84 (2023).
Mangalik, S. et al. Robust language-based mental health assessments in time and space through social media. npj Digit. Med. 7, 109 (2024).
Kaczmarczyk, R., Wilhelm, T. I., Martin, R. & Roos, J. Evaluating multimodal AI in medical diagnostics. npj Digit. Med. 7, 205 (2024).
Kline, A. et al. Multimodal machine learning in precision health: a scoping review. npj Digit. Med. 5, 171 (2022).
Cunningham, S., Hudson, C. C. & Harkness, K. Social media and depression symptoms: a meta-analysis. Res. Child Adolesc. Psychopathol. 49, 241–253 (2021).
Yoon, S., Kleinman, M., Mertz, J. & Brannick, M. Is social network site usage related to depression? A meta-analysis of Facebook–depression relations. J. Affect. Disord. 248, 65–72 (2019).
Yang, J. et al. Cross-subject classification of depression by using multiparadigm EEG feature fusion. Comput. Methods Prog. Biomed. 233, 107360 (2023).
Mohammadi, M. et al. Data mining EEG signals in depression for their diagnostic value. BMC Med. Inform. Decis. Mak. 15, 1–14 (2015).
Koo, P. C. et al. Combined cognitive, psychomotor and electrophysiological biomarkers in major depressive disorder. Eur. Arch. Psychiatry Clin. Neurosci. 269, 823–832 (2019).
Nassibi, A., Papavassiliou, C. & Atashzar, S. F. Depression diagnosis using machine intelligence based on spatiospectrotemporal analysis of multi-channel EEG. Med. Biol. Eng. Comput. 60, 3187–3202 (2022).
Soni, S., Seal, A., Yazidi, A. & Krejcar, O. Graphical representation learning-based approach for automatic classification of electroencephalogram signals in depression. Comput. Biol. Med. 145, 105420 (2022).
Seal, A. et al. DeprNet: a deep convolution neural network framework for detecting depression using EEG. IEEE Trans. Instrum. Meas. 70, 1–13 (2021).
Liu, B., Chang, H., Peng, K. & Wang, X. An end-to-end depression recognition method based on EEGNet. Front. Psychiatry 13, 864393 (2022).
Li, X. et al. Depression recognition using machine learning methods with different feature generation strategies. Artif. Intell. Med. 99, 101696 (2019).
Li, X., Hu, B., Sun, S. & Cai, H. EEG-based mild depressive detection using feature selection methods and classifiers. Comput. Methods Prog. Biomed. 136, 151–161 (2016).
Song, X., Yan, D., Zhao, L. & Yang, L. LSDD-EEGNet: an efficient end-to-end framework for EEG-based depression detection. Biomed. Signal Process. Control 75, 103612 (2022).
Shen, J., Zhang, X., Wang, G., Ding, Z. & Hu, B. An improved empirical mode decomposition of electroencephalogram signals for depression detection. IEEE Trans. Affect. Comput. 13, 262–271 (2019).
Soni, S., Seal, A., Mohanty, S. K. & Sakurai, K. Electroencephalography signals-based sparse networks integration using a fuzzy ensemble technique for depression detection. Biomed. Signal Process. Control 85, 104873 (2023).
Jiang, W. et al. EEG-based subject-independent depression detection using dynamic convolution and feature adaptation. In Proceedings International Conference on Swarm Intelligence 272–283 (2023).
Tasci, G. et al. Automated accurate detection of depression using twin Pascal’s triangles lattice pattern with EEG signals. Knowl. Based Syst. 260, 110190 (2023).
Sadiq, M. T., Akbari, H., Siuly, S., Yousaf, A. & Rehman, A. U. A novel computer-aided diagnosis framework for EEG-based identification of neural diseases. Comput. Biol. Med. 138, 104922 (2021).
Shi, Q. et al. Depression detection using resting state three-channel EEG signal. Preprint at arXiv https://doi.org/10.48550/arXiv.2002.09175 (2020).
Chen, T., Guo, Y., Hao, S. & Hong, R. Exploring self-attention graph pooling with EEG-based topological structure and soft label for depression detection. IEEE Trans. Affect. Comput. 13, 2106–2118 (2022).
Wang, H.-G., Meng, Q.-H., Jin, L.-C., Wang, J.-B. & Hou, H.-R. AMG: a depression detection model with autoencoder and multi-head graph convolutional network. In Proceedings 2023 42nd Chinese Control Conference (CCC) 8551–8556 (2023).
Shivcharan, M., Boby, K. & Sridevi, V. EEG based machine learning models for automated depression detection. In Proceedings 2023 IEEE International Conference on Electronics, Computing and Communication Technologies (CONECCT) 1–6 (2023).
Bai, R., Guo, Y., Tan, X., Feng, L. & Xie, H. An EEG-based depression detection method using machine learning model. Int. J. Pharma Med. Biol. Sci. 10, 17–22 (2021).
Fan, Y., Yu, R., Li, J., Zhu, J. & Li, X. EEG-based mild depression recognition using multi-kernel convolutional and spatial-temporal feature. In Proceedings 2020 IEEE International Conference on Bioinformatics and Biomedicine (BIBM) 1777–1784 (2020).
Tian, F. et al. The three-lead EEG sensor: introducing an EEG-assisted depression diagnosis system based on ant lion optimization. IEEE Trans. Biomed. Circuits Syst. 17, 1305–1318 (2023).
Zhu, G. et al. Detecting depression using single-channel EEG and graph methods. Mathematics 10, 4177 (2022).
Zhang, B. et al. Feature-level fusion based on spatial-temporal of pervasive EEG for depression recognition. Comput. Methods Prog. Biomed. 226, 107113 (2022).
Li, D., Tang, J., Deng, Y. & Yang, L. Classification of resting state EEG data in patients with depression. In Proceedings 2020 IEEE International Conference on E-health Networking, Application & Services (HEALTHCOM) 1–2 (2020).
Mahato, S., Goyal, N., Ram, D. & Paul, S. Detection of depression and scaling of severity using six channel EEG data. J. Med. Syst. 44, 1–12 (2020).
Lin, H. et al. MDD-TSVM: a novel semisupervised-based method for major depressive disorder detection using electroencephalogram signals. Comput. Biol. Med. 140, 105039 (2022).
Rafiei, A., Zahedifar, R., Sitaula, C. & Marzbanrad, F. Automated detection of major depressive disorder with EEG signals: a time series classification using deep learning. IEEE Access 10, 73804–73817 (2022).
Duan, L. et al. Machine learning approaches for MDD detection and emotion decoding using EEG signals. Front. Hum. Neurosci. 14, 284 (2020).
Acharya, U. R. et al. Automated EEG-based screening of depression using deep convolutional neural network. Comput. Methods Prog. Biomed. 161, 103–113 (2018).
Cohn, J. F. et al. Detecting depression from facial actions and vocal prosody. In Proceedings 2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops 1–7 (2009).
Yang, M. et al. Undisturbed mental state assessment in the 5G era: a case study of depression detection based on facial expressions. IEEE Wirel. Commun. 28, 46–53 (2021).
Shangguan, Z. et al. Dual-stream multiple instance learning for depression detection with facial expression videos. IEEE Trans. Neural Syst. Rehabil. Eng. 31, 554–563 (2022).
Wang, Q., Yang, H. & Yu, Y. Facial expression video analysis for depression detection in Chinese patients. J. Vis. Commun. Image Represent. 57, 228–233 (2018).
Hu, B., Tao, Y. & Yang, M. Detecting depression based on facial cues elicited by emotional stimuli in video. Comput. Biol. Med. 165, 107457 (2023).
Lin, L., Chen, X., Shen, Y. & Zhang, L. Towards automatic depression detection: a BiLSTM/1D CNN-based model. Appl. Sci. 10, 8701 (2020).
Kiss, G. & Vicsi, K. Comparison of read and spontaneous speech in case of automatic detection of depression. In Proceedings 2017 8th IEEE International Conference on Cognitive Infocommunications (CogInfoCom) 000213–000218 (2017).
Mobram, S. & Vali, M. Depression detection based on linear and nonlinear speech features in I-vector/SVDA framework. Comput. Biol. Med. 149, 105926 (2022).
Vázquez-Romero, A. & Gallardo-Antolín, A. Automatic detection of depression in speech using ensemble convolutional neural networks. Entropy 22, 688 (2020).
Huang, Z., Epps, J. & Joachim, D. Speech landmark bigrams for depression detection from naturalistic smartphone speech. In Proceedings ICASSP 2019-2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) 5856–5860 (2019).
Huang, Z., Epps, J., Joachim, D. & Sethu, V. Natural language processing methods for acoustic and landmark event-based features in speech-based depression detection. IEEE J. Sel. Top. Signal Process. 14, 435–448 (2019).
Huang, Z., Epps, J. & Joachim, D. Investigation of speech landmark patterns for depression detection. IEEE Trans. Affect. Comput. 13, 666–679 (2019).
Yang, W. et al. Attention guided learnable time-domain filterbanks for speech depression detection. Neural Netw. 165, 135–149 (2023).
Li, X., Cao, T., Sun, S., Hu, B. & Ratcliffe, M. Classification study on eye movement data: towards a new approach in depression detection. In Proceedings 2016 IEEE Congress on Evolutionary Computation (CEC) 1227–1232 (2016).
Shen, R., Zhan, Q., Wang, Y. & Ma, H. Depression detection by analysing eye movements on emotional images. In Proceedings ICASSP 2021–2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) 7973–7977 (2021).
Pan, Z., Ma, H., Zhang, L. & Wang, Y. Depression detection based on reaction time and eye movement. In Proceedings 2019 IEEE International Conference on Image Processing (ICIP) 2184–2188 (2019).
Le, C., Ma, H. & Wang, Y. A method for extracting eye movement and response characteristics to distinguish depressed people. In Proceedings Image and Graphics: 9th International Conference, ICIG 2017, Shanghai, China, September 13–15, 2017, Revised Selected Papers, Part I, 489–500 (2017).
Zhang, D. et al. Effective differentiation between depressed patients and controls using discriminative eye movement features. J. Affect. Disord. 307, 237–243 (2022).
Diao, Y. et al. A combination of P300 and eye movement data improves the accuracy of auxiliary diagnoses of depression. J. Affect. Disord. 297, 386–395 (2022).
Shao, W. et al. A multi-modal gait analysis-based detection system of the risk of depression. IEEE J. Biomed. Health Inform. 26, 4859–4868 (2021).
Lu, H., Shao, W., Ngai, E., Hu, X. & Hu, B. A new skeletal representation based on gait for depression detection. In Proceedings 2020 IEEE International Conference on E-health Networking, Application & Services (HEALTHCOM) 1–6 (2020).
Wang, T. et al. A gait assessment framework for depression detection using kinect sensors. IEEE Sens. J. 21, 3260–3270 (2020).
Li, W., Wang, Q., Liu, X. & Yu, Y. Simple action for depression detection: using kinect-recorded human kinematic skeletal data. BMC Psychiatry 21, 205 (2021).
Yu, Y. et al. Depression and severity detection based on body kinematic features: using kinect recorded skeleton data of simple action. Front. Neurol. 13, 905917 (2022).
Wang, Y., Wang, J., Liu, X. & Zhu, T. Detecting depression through gait data: examining the contribution of gait features in recognizing depression. Front. Psychiatry 12, 661213 (2021).
Zhang, X. et al. Multimodal depression detection: fusion of electroencephalography and paralinguistic behaviors using a novel strategy for classifier ensemble. IEEE J. Biomed. Health Inform. 23, 2265–2275 (2019).
Chen, T., Hong, R., Guo, Y., Hao, S. & Hu, B. MS²-GNN: exploring GNN-based multimodal fusion network for depression detection. IEEE Trans. Cybern. 53, 7749–7759 (2022).
Qayyum, A., Razzak, I., Tanveer, M., Mazher, M. & Alhaqbani, B. High-density electroencephalography and speech signal based deep framework for clinical depression diagnosis. IEEE/ACM Trans. Comput. Biol. Bioinform. 20, 2587–2597 (2023).
Ahmed, S., Yousuf, M. A., Monowar, M. M., Hamid, M. A. & Alassafi, M. Taking all the factors we need: a multimodal depression classification with uncertainty approximation. IEEE Access 11, 99847–99861 (2023).
Zhu, J. et al. Content-based multiple evidence fusion on EEG and eye movements for mild depression recognition. Comput. Methods Prog. Biomed. 226, 107100 (2022).
Othmani, A., Zeghina, A.-O. & Muzammel, M. A model of normality inspired deep learning framework for depression relapse prediction using audiovisual data. Comput. Methods Prog. Biomed. 226, 107132 (2022).
Yang, L., Jiang, D. & Sahli, H. Integrating deep and shallow models for multi-modal depression analysis—hybrid architectures. IEEE Trans. Affect. Comput. 12, 239–253 (2018).
Yang, S., Cui, L., Wang, L., Wang, T. & You, J. Enhancing multimodal depression diagnosis through representation learning and knowledge transfer. Heliyon 10, e25959 (2024).
Muzammel, M., Salam, H. & Othmani, A. End-to-end multimodal clinical depression recognition using deep neural networks: a comparative analysis. Comput. Methods Prog. Biomed. 211, 106433 (2021).
Joshi, J. et al. Multimodal assistive technologies for depression diagnosis and monitoring. J. Multimodal Use. Interfaces 7, 217–228 (2013).
Chen, J. et al. IIFDD: Intra and inter-modal fusion for depression detection with multi-modal information from Internet of Medical Things. Inf. Fusion 102, 102017 (2024).
Chen, X. et al. MGSN: Depression EEG lightweight detection based on multiscale DGCN and SNN for multichannel topology. Biomed. Signal Process. Control 92, 106051 (2024).
Ying, M. et al. EDT: An EEG-based attention model for feature learning and depression recognition. Biomed. Signal Process. Control 93, 106182 (2024).
Sun, C., Jiang, M., Gao, L., Xin, Y. & Dong, Y. A novel study for depression detecting using audio signals based on graph neural network. Biomed. Signal Process. Control 88, 105675 (2024).
Shao, X., Ying, M., Zhu, J., Li, X. & Hu, B. Achieving EEG-based depression recognition using Decentralized-Centralized structure. Biomed. Signal Process. Control 95, 106402 (2024).
Xu, X., Wang, Y., Wei, X., Wang, F. & Zhang, X. Attention-based acoustic feature fusion network for depression detection. Neurocomputing 601, 128209 (2024).
Zou, B. et al. Semi-structural interview-based Chinese multimodal depression corpus towards automatic preliminary screening of depressive disorders. IEEE Trans. Affect. Comput. 14, 2823–2838 (2022).
Chen, D. et al. Comparative efficacy of multimodal AI methods in screening for major depressive disorder: machine learning model development predictive pilot study. JMIR Form. Res. 9, e56057 (2025).
Zhou, L., Hu, B. & Guan, Z.-H. MDRA: a multimodal depression risk assessment model using audio and text. IEEE Signal Process. Lett. 32, 2045–2049 (2025).
Xie, W. et al. Interpreting depression from question-wise long-term video recording of SDS evaluation. IEEE J. Biomed. Health Inform. 26, 865–875 (2021).
Al Hanai, T., Ghassemi, M. M. & Glass, J. R. Detecting depression with audio/text sequence modeling of interviews. In Proceedings Interspeech 1716–1720 (ISCA, 2018).
Toto, E., Tlachac, M., Stevens, F. L. & Rundensteiner, E. A. Audio-based depression screening using sliding window sub-clip pooling. In Proceedings 2020 19th IEEE International Conference on Machine Learning and Applications (ICMLA) 791–796 (2020).
Sardari, S., Nakisa, B., Rastgoo, M. N. & Eklund, P. Audio based depression detection using convolutional autoencoder. Expert Syst. Appl. 189, 116076 (2022).
Ding, H. et al. IntervoxNet: a novel dual-modal audio-text fusion network for automatic and efficient depression detection from interviews. Front. Phys. 12, 1430035 (2024).
Shen, Y., Yang, H. & Lin, L. Automatic depression detection: an emotional audio-textual corpus and a GRU/BiLSTM-based model. In Proceedings ICASSP 2022–2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) 6247–6251 (2022).
Siuly, S. et al. Electroencephalogram (EEG) and its background. In EEG Signal Analysis and Classification: Techniques and Applications 3–21 (Springer International Publishing, 2016).
Fox, N. A. If it’s not left, it’s right: electroencephalograph asymmetry and the development of emotion. Am. Psychol. 46, 863 (1991).
Thibodeau, R., Jorgensen, R. S. & Kim, S. Depression, anxiety, and resting frontal EEG asymmetry: a meta-analytic review. J. Abnorm. Psychol. 115, 715 (2006).
Chen, T. & Guestrin, C. XGBoost: a scalable tree boosting system. In Proceedings 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining 785–794 (ACM, 2016).
Ke, G. et al. LightGBM: a highly efficient gradient boosting decision tree. Adv. Neural Inf. Process. Syst. 30 (2017).
Prokhorenkova, L., Gusev, G., Vorobev, A., Dorogush, A. V. & Gulin, A. CatBoost: unbiased boosting with categorical features. Adv. Neural Inf. Process. Syst. 31 (2018).
Gönen, M. & Alpaydın, E. Multiple kernel learning algorithms. J. Mach. Learn. Res. 12, 2211–2268 (2011).
Bucak, S. S., Jin, R. & Jain, A. K. Multiple kernel learning for visual object recognition: a review. IEEE Trans. Pattern Anal. Mach. Intell. 36, 1354–1369 (2013).
Lahat, D., Adali, T. & Jutten, C. Multimodal data fusion: an overview of methods, challenges, and prospects. Proc. IEEE 103, 1449–1477 (2015).
Litjens, G. et al. A survey on deep learning in medical image analysis. Med. Image Anal. 42, 60–88 (2017).
Vaswani, A. et al. Attention is all you need. Adv. Neural Inf. Process. Syst. 30 (2017).
Lin, C.-C., Lin, K., Wang, L., Liu, Z. & Li, L. Cross-modal representation learning for zero-shot action recognition. In Proceedings IEEE/CVF Conference on Computer Vision and Pattern Recognition 19978–19988 (IEEE, 2022).
Lundberg, S. M. & Lee, S.-I. A unified approach to interpreting model predictions. Adv. Neural Inf. Process. Syst. 30 (2017).
Fu, T.-C. A review on time series data mining. Eng. Appl. Artif. Intell. 24, 164–181 (2011).
Cerutti, S. In the spotlight: biomedical signal processing. IEEE Rev. Biomed. Eng. 1, 8–11 (2008).
Cavanagh, J. F., Napolitano, A., Wu, C. & Mueen, A. The patient repository for EEG data + computational tools (PRED+CT). Front. Neuroinform. 11, 67 (2017).
Schuller, B. et al. AVEC 2011: the first international audio/visual emotion challenge. In Proceedings Affective Computing and Intelligent Interaction: Fourth International Conference, ACII 2011, Memphis, TN, USA, October 9–12, 2011, Part II 415–424 (2011).
Schuller, B., Valster, M., Eyben, F., Cowie, R. & Pantic, M. AVEC 2012: the continuous audio/visual emotion challenge. In Proceedings 14th ACM International Conference on Multimodal Interaction 449–456 (ACM, 2012).
Valstar, M. et al. AVEC 2013: the continuous audio/visual emotion and depression recognition challenge. In Proceedings 3rd ACM International Workshop on Audio/Visual Emotion Challenge 3–10 (ACM, 2013).
Valstar, M. et al. AVEC 2014: the 4th International audio/visual emotion challenge and workshop. In Proceedings 4th International Workshop on Audio/Visual Emotion Challenge 3–10 (ACM, 2014).
Valstar, M. et al. AVEC 2016: depression, mood, and emotion recognition workshop and challenge. In Proceedings 6th International Workshop on Audio/Visual Emotion Challenge 3–10 (ACM, 2016).
Ringeval, F. et al. AVEC 2017: real-life depression, and affect recognition workshop and challenge. In Proceedings 7th Annual Workshop on Audio/Visual Emotion Challenge 3–9 (ACM, 2017).
Ringeval, F. et al. AVEC 2018 workshop and challenge: bipolar disorder and cross-cultural affect recognition. In Proceedings 2018 on Audio/Visual Emotion Challenge and Workshop 3–13 (ACM, 2018).
Ringeval, F. et al. AVEC 2019 workshop and challenge: state-of-mind, detecting depression with AI, and cross-cultural affect recognition. In Proceedings 9th International on Audio/Visual Emotion Challenge and Workshop 3–12 (ACM, 2019).
Cai, H. et al. A multi-modal open dataset for mental-disorder analysis. Sci. Data 9, 178 (2022).
Stahlschmidt, S. R., Ulfenborg, B. & Synnergren, J. Multimodal deep learning for biomedical data fusion: a review. Brief. Bioinform. 23, bbab569 (2022).
Baltrušaitis, T., Ahuja, C. & Morency, L.-P. Multimodal machine learning: a survey and taxonomy. IEEE Trans. Pattern Anal. Mach. Intell. 41, 423–443 (2018).
Custers, B. Click here to consent forever: expiry dates for informed consent. Big Data Soc. 3, 2053951715624935 (2016).
Flores, R., Tlachac, M., Toto, E. & Rundensteiner, E. AudiFace: multimodal deep learning for depression screening. In Proceedings Machine Learning for Healthcare Conference 609–630 (2022).
Page, M. J. et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ 372, n71 (2021).
Whiting, P. F. et al. QUADAS-2: a revised tool for the quality assessment of diagnostic accuracy studies. Ann. Intern. Med. 155, 529–536 (2011).
Jackson, D. & Turner, R. Power analysis for random-effects meta-analysis. Res. Synth. Methods 8, 290–302 (2017).