arXiv:2508.10954v2 Announce Type: replace-cross
Abstract: Modern AI models are typically trained on static datasets, limiting their ability to continuously adapt to rapidly evolving real-world environments. Continual learning (CL) addresses this limitation, but most CL methods are designed for natural images and often underperform or fail to transfer to medical data due to domain bias, institutional constraints, and subtle inter-stage boundaries. We propose UniPrompt-CL, a medical-oriented prompt-based continual learning method that improves prompt-pool design via a minimally expanding unified prompt pool and a new regularization term, achieving a better stability-plasticity trade-off at lower computational cost. Across two domain-incremental learning settings, UniPrompt-CL reduces inference cost while improving AvgACC by 1-3 percentage points. Beyond strong headline performance, extensive experiments validate the motivation and effectiveness of each proposed component.
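The core ideas named in the abstract, a single shared prompt pool that grows only minimally per new domain plus a stability regularizer on previously learned prompts, can be sketched as follows. This is a hedged illustration, not the paper's implementation: the class name, pool sizes, cosine-similarity selection, and the L2 drift penalty are all assumptions standing in for the method's actual design.

```python
import numpy as np

class UnifiedPromptPool:
    """Illustrative sketch of a minimally expanding unified prompt pool.

    Assumptions (not from the paper): a single pool shared across all
    stages, grown by only `expand_per_stage` prompts per new domain
    rather than allocating a separate per-task pool, with an L2 penalty
    discouraging drift of prompts learned in earlier stages.
    """

    def __init__(self, embed_dim=8, base_size=4, expand_per_stage=1, seed=0):
        self.rng = np.random.default_rng(seed)
        self.embed_dim = embed_dim
        self.expand_per_stage = expand_per_stage
        self.prompts = self.rng.standard_normal((base_size, embed_dim))
        self.frozen = self.prompts.copy()  # snapshot used by the regularizer

    def new_stage(self):
        """Minimal expansion: append a few prompts for the new domain."""
        self.frozen = self.prompts.copy()
        extra = self.rng.standard_normal((self.expand_per_stage, self.embed_dim))
        self.prompts = np.vstack([self.prompts, extra])

    def select(self, query, top_k=2):
        """Pick the top-k prompts by cosine similarity to an input query."""
        q = query / np.linalg.norm(query)
        p = self.prompts / np.linalg.norm(self.prompts, axis=1, keepdims=True)
        idx = np.argsort(p @ q)[::-1][:top_k]
        return self.prompts[idx]

    def stability_penalty(self):
        """Assumed regularizer: squared drift of previously learned prompts."""
        n = self.frozen.shape[0]
        return float(np.sum((self.prompts[:n] - self.frozen) ** 2))
```

In this sketch, training on a new domain would add `stability_penalty()` to the task loss, so old prompts stay near their snapshot (stability) while the few newly appended prompts adapt freely (plasticity).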