This is an overview page with metadata for this scientific work. The full article is available from the publisher.
The Forgotten Shield: Safety Grafting in Parameter-Space for Medical MLLMs
Citations: 0 · Authors: 10 · Year: 2025
Abstract
Medical Multimodal Large Language Models (Medical MLLMs) have achieved remarkable progress in specialized medical tasks; however, research into their safety has lagged, posing potential risks for real-world deployment. In this paper, we first establish a multidimensional evaluation framework to systematically benchmark the safety of current SOTA Medical MLLMs. Our empirical analysis reveals pervasive vulnerabilities across both general and medical-specific safety dimensions in existing models, particularly highlighting their fragility against cross-modality jailbreak attacks. Furthermore, we find that the medical fine-tuning process frequently induces catastrophic forgetting of the model’s original safety alignment. To address this challenge, we propose a novel “Parameter-Space Intervention” approach for efficient safety re-alignment. This method extracts intrinsic safety knowledge representations from original base models and concurrently injects them into the target model during the construction of medical capabilities. Additionally, we design a fine-grained parameter search algorithm to achieve an optimal trade-off between safety and medical performance. Experimental results demonstrate that our approach significantly bolsters the safety guardrails of Medical MLLMs without relying on additional domain-specific safety data, while minimizing degradation to core medical performance.
Similar Works
Rethinking the Inception Architecture for Computer Vision
2016 · 30,684 citations
MobileNetV2: Inverted Residuals and Linear Bottlenecks
2018 · 24,960 citations
CBAM: Convolutional Block Attention Module
2018 · 21,777 citations
An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale
2020 · 21,493 citations
Xception: Deep Learning with Depthwise Separable Convolutions
2017 · 18,690 citations