Fluorescence in situ hybridization (FISH) is widely regarded as the gold standard for evaluating human epidermal growth factor receptor 2 (HER2) status in breast cancer; however, it requires specialized training and suffers from signal degradation caused by dye quenching. Silver-enhanced in situ hybridization (SISH) is an automated alternative that uses permanent staining suitable for bright-field microscopy. HER2 status is determined by distinguishing "Amplified" from "Non-Amplified" regions based on HER2 and centromere 17 (CEN17) signals in SISH-stained slides. This study is the first to leverage deep learning for classifying Normal, Amplified, and Non-Amplified regions within HER2-SISH whole slide images (WSIs), which are notably more complex to analyze than hematoxylin and eosin (H&E)-stained slides. Our approach is a two-stage process: first, we evaluate deep-learning models on annotated image regions; then we apply the best-performing model to WSIs for region identification and localization. Pseudo-color maps representing each class are then overlaid, and the WSIs are reconstructed with these mapped regions. On a private dataset of HER2-SISH breast cancer slides digitized at 40× magnification, transfer learning with a Vision Transformer (ViT) model achieved a patch-level classification accuracy of 99.9% and a generalization accuracy of 78.8%. Model robustness was further evaluated through k-fold cross-validation, yielding an average accuracy of 98%, with metrics reported alongside 95% confidence intervals to ensure statistical reliability. This method shows significant promise for clinical applications, particularly in assessing HER2 expression status in HER2-SISH histopathology images.
It provides an automated solution that can aid pathologists in efficiently identifying HER2-amplified regions, thus enhancing diagnostic outcomes for breast cancer treatment.