Research Interests

Fundamental & Applied AI, Machine Learning, Deep Learning, Computer Vision, Parameter-Efficient Transfer Learning, Digital Health, Time-Series Analysis, Electrical Circuits

Publications

  1. Paper 1
    Sel, K., Mohammadi, A., Pettigrew, R. I., & Jafari, R. (2023). Physics-informed neural networks for modeling physiological time-series for cuffless blood pressure estimation. npj Digital Medicine, 6(1), 110. Read Paper

    We model pulse waveforms with physics-informed constraints so the network respects cardiovascular dynamics while learning from limited data, enabling cuffless blood pressure estimation that is robust across subjects. A minimal sketch of the physics-informed loss idea appears after the publication list.

  2. Paper 2
    Mohammadi, A., Fakharzadeh, M., & Baraeinejad, B. (2022). An integrated human stress detection sensor using supervised algorithms. IEEE Sensors Journal, 22(8), 8216-8223. Read Paper

    This work presents a compact hardware–software pipeline that fuses physiological sensing with supervised learning to classify mental stress in real time, emphasizing low-power design and consistent performance across users.

  3. Preprint 1
    Mohammadi, A., Masabarakiza, I., Barnes, E., Carreiro, D., Van Dine, A., & Peeples, J. Investigation of Time-Frequency Feature Combinations with Histogram Layer Time Delay Neural Networks. IEEE OCEANS 2025. Read Preprint

    We systematically study time–frequency representations paired with histogram layers inside TDNNs for audio recognition, highlighting which feature combinations yield stable accuracy and offering guidance on feature selection. A small sketch of combining such features appears after the publication list.

  4. Preprint 2
    Mohammadi, A., Kelhe, T., Carreiro, D., Van Dine, A., & Peeples, J. Transfer Learning for Passive Sonar Classification using Pre-trained Audio and ImageNet Models. IEEE OCEANS 2025. Read Preprint

    This paper benchmarks transfer learning from large pre-trained audio and ImageNet backbones to passive sonar spectrograms, examining how the choice of pre-training domain affects downstream classification. A minimal fine-tuning sketch appears after the publication list.

  5. Preprint 3
    Mohammadi, A., Carreiro, D., Van Dine, A., & Peeples, J. (2025). Histogram-based Parameter-efficient Tuning for Passive Sonar Classification. arXiv preprint. Read Preprint

    We introduce histogram adapters inserted in parallel with backbone layers to capture distributional shifts; training only a small fraction of the parameters yields efficient adaptation and strong downstream performance. An illustrative adapter sketch appears after the publication list.
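The physics-informed formulation in Paper 1 can be pictured as a network fit with two loss terms: one matching measured samples and one penalizing violation of a governing equation. The sketch below is a minimal illustration only, assuming a hypothetical damped-oscillator ODE as a stand-in for the paper's cardiovascular dynamics; the network size and constraint are not the published model.

```python
import torch
import torch.nn as nn

# Minimal PINN sketch for Paper 1: an MLP maps time t to a pressure-like signal p(t).
# The physics residual uses a hypothetical damped-oscillator ODE, not the paper's
# actual cardiovascular constraints.
class WaveformNet(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, t):
        return self.net(t)

def pinn_loss(model, t_data, p_data, t_phys, omega=2.0, zeta=0.1):
    # Data-fidelity term on the limited measured samples.
    data_loss = torch.mean((model(t_data) - p_data) ** 2)

    # Physics residual p'' + 2*zeta*omega*p' + omega^2*p ~= 0 on collocation points.
    t = t_phys.clone().requires_grad_(True)
    p = model(t)
    dp = torch.autograd.grad(p.sum(), t, create_graph=True)[0]
    d2p = torch.autograd.grad(dp.sum(), t, create_graph=True)[0]
    physics_loss = torch.mean((d2p + 2 * zeta * omega * dp + omega ** 2 * p) ** 2)

    # The physics term regularizes the fit where labeled data are scarce.
    return data_loss + physics_loss
```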
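For Preprint 1, the idea of combining time–frequency representations can be sketched by stacking two standard features as input channels. The feature choices, sizes, and synthetic waveform below are assumptions for illustration; a histogram-layer TDNN or any other classifier would consume the resulting tensor.

```python
import torch
import torchaudio

# Illustrative feature-combination sketch for Preprint 1 (assumed settings).
sr = 16000
waveform = torch.randn(1, sr * 2)  # stand-in for a 2-second mono recording

mel = torchaudio.transforms.MelSpectrogram(sample_rate=sr, n_mels=64)(waveform)
mfcc = torchaudio.transforms.MFCC(sample_rate=sr, n_mfcc=64)(waveform)

# Stack the two representations as channels: (batch, features=2, bins=64, frames).
features = torch.stack([mel.clamp(min=1e-6).log(), mfcc], dim=1)
# `features` would then feed a histogram-layer TDNN or similar classifier.
```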
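A minimal transfer-learning baseline in the spirit of Preprint 2: start from an ImageNet-pre-trained backbone, adapt its input and output layers to single-channel sonar spectrograms, and optionally freeze the rest. The backbone, class count, and freezing choice here are assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 4  # hypothetical number of sonar target classes

# ImageNet-pre-trained ResNet-18 as an illustrative backbone.
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
# Spectrograms are single-channel, so swap the first convolution.
model.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)
# Replace the ImageNet head with a sonar classification head.
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

# Lightweight variant: train only the replaced layers, keep the rest frozen.
for name, param in model.named_parameters():
    param.requires_grad = name.startswith(("conv1", "fc"))

spectrogram_batch = torch.randn(8, 1, 128, 256)  # (batch, channel, mel bins, frames)
logits = model(spectrogram_batch)                # (8, NUM_CLASSES)
```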
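The parameter-efficient idea in Preprint 3 can be sketched as a small soft-binning histogram module running in parallel with a frozen backbone block, so only the adapter is trained. The bin count, soft-binning form, and 1-D layout below are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class HistogramAdapter(nn.Module):
    """Illustrative soft-binning histogram adapter (assumed form, not the paper's design)."""
    def __init__(self, channels, n_bins=16):
        super().__init__()
        self.centers = nn.Parameter(torch.linspace(-1.0, 1.0, n_bins))  # learnable bin centers
        self.widths = nn.Parameter(torch.ones(n_bins))                  # learnable bin widths
        self.proj = nn.Conv1d(channels * n_bins, channels, kernel_size=1)

    def forward(self, x):                                   # x: (batch, channels, time)
        b, c, t = x.shape
        diff = x.unsqueeze(-1) - self.centers               # soft-assign each value to every bin
        memberships = torch.exp(-(self.widths ** 2) * diff ** 2)
        hist = memberships.mean(dim=2)                      # pool over time -> (b, c, bins)
        hist = hist.reshape(b, -1, 1).expand(-1, -1, t)     # broadcast histogram back over time
        return self.proj(hist)                              # project to (b, c, t)

class ParallelAdapterBlock(nn.Module):
    """Frozen backbone block with a trainable histogram adapter on a parallel path."""
    def __init__(self, frozen_block, channels):
        super().__init__()
        self.frozen_block = frozen_block
        for p in self.frozen_block.parameters():
            p.requires_grad = False                         # backbone weights stay fixed
        self.adapter = HistogramAdapter(channels)

    def forward(self, x):
        return self.frozen_block(x) + self.adapter(x)

# Usage: wrap a pre-trained block whose input/output shapes match (batch, channels, time);
# only the adapter's small set of parameters receives gradients.
block = ParallelAdapterBlock(nn.Conv1d(32, 32, kernel_size=3, padding=1), channels=32)
out = block(torch.randn(4, 32, 100))
```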