Fig. 1 | BMC Medical Imaging


From: A survey of the impact of self-supervised pretraining for diagnostic tasks in medical X-ray, CT, MRI, and ultrasound


Example of a typical SSL workflow, with an application to chest X-ray classification. (1) Self-supervised pretraining: A parameterized model \(g_\phi (f_\theta (\textbf{x}))\) is trained to solve a pretext task using only the chest X-rays. The labels for the pretext task are derived from the inputs themselves, and the model is trained to minimize the pretext objective \(\mathcal {L}_{\text {pre}}\). At the end of this step, \(f_\theta\) should output useful feature representations. (2) Supervised fine-tuning: A parameterized model \(q_\psi (f_\theta (\textbf{x}))\) is trained to solve the supervised learning task of chest X-ray classification using labels specific to the classification task. Note that the previously learned \(f_\theta\) is reused for this task, as it produces feature representations specific to chest X-rays.
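The two-stage workflow in the caption can be sketched in code. The following is a minimal NumPy illustration, not the paper's method: the mirror-prediction pretext task, the linear encoder \(f_\theta\) (weights `W1`), the pretext head \(g_\phi\) (weights `w2`), the fine-tuning head \(q_\psi\) (weights `w3`), and the synthetic data are all assumptions chosen to keep the sketch self-contained. The key point it demonstrates is that the pretext labels come from the inputs themselves, and that the encoder weights learned in stage 1 are reused in stage 2.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic stand-ins for chest X-rays: random feature vectors (illustration only).
d, k, n = 8, 4, 64
base = rng.normal(size=(n, d))

# --- Stage 1: self-supervised pretraining ---
# Pretext task (assumed here): predict whether each input was mirrored.
# The labels are derived from the inputs themselves -- no manual annotation.
flip = rng.integers(0, 2, size=n)                       # pretext labels
x = np.where(flip[:, None] == 1, base[:, ::-1], base)   # apply the transform

W1 = 0.1 * rng.normal(size=(k, d))  # encoder f_theta
w2 = 0.1 * rng.normal(size=k)       # pretext head g_phi

lr, losses = 0.05, []
for _ in range(300):                # minimize the pretext objective L_pre
    H = x @ W1.T                    # feature representations f_theta(x)
    p = sigmoid(H @ w2)             # g_phi(f_theta(x))
    losses.append(-np.mean(flip * np.log(p + 1e-9)
                           + (1 - flip) * np.log(1 - p + 1e-9)))
    err = (p - flip) / n            # dL/dz for binary cross-entropy
    w2 -= lr * (err @ H)
    W1 -= lr * np.outer(w2, err @ x)

# --- Stage 2: supervised fine-tuning ---
# Downstream labels (synthetic rule standing in for pathology annotations).
y = (base[:, 0] > 0).astype(float)
w3 = 0.1 * rng.normal(size=k)       # new task head q_psi; W1 is REUSED
ft_losses = []
for _ in range(300):
    H = base @ W1.T                 # reused encoder f_theta
    p = sigmoid(H @ w3)             # q_psi(f_theta(x))
    ft_losses.append(-np.mean(y * np.log(p + 1e-9)
                              + (1 - y) * np.log(1 - p + 1e-9)))
    err = (p - y) / n
    w3 -= lr * (err @ H)
    W1 -= lr * np.outer(w3, err @ base)
```

In a real pipeline the encoder would be a deep network and the pretext task something like contrastive learning or masked-image modeling, but the parameter flow is the same: only `W1` carries over from pretraining to fine-tuning, while the pretext head `w2` is discarded and replaced by the task head `w3`.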
