Sensors (Basel, Switzerland)

AF-SENet: Classification of Cancer in Cervical Tissue Pathological Images Based on Fusing Deep Convolution Features



Abstract

Cervical cancer is the fourth most common cancer in the world. Whole-slide images (WSIs) are an important standard for the diagnosis of cervical cancer. Missed diagnoses and misdiagnoses often occur because of the high similarity among pathological cervical images, the large number of slides to read, the long reading time, and the insufficient experience of pathologists. Existing models have insufficient feature extraction and representation capabilities, and their pathological classification accuracy is therefore limited. This work first designs an image processing algorithm for data augmentation. Second, deep convolutional features are extracted by fine-tuning pre-trained deep network models, including ResNet50 v2, DenseNet121, Inception v3, VGGNet19, and Inception-ResNet, and traditional image features are extracted with local binary patterns (LBP) and the histogram of oriented gradients (HOG). Third, the features extracted by the fine-tuned models are serially fused according to the feature representation ability parameters proposed in this paper and the accuracy of multiple experiments, and spectral embedding is used for dimension reduction. Finally, the fused features are input into the Analysis of Variance-F value-Spectral Embedding Net (AF-SENet) for classification. The dataset contains four classes of pathological images: normal, low-grade squamous intraepithelial lesion (LSIL), high-grade squamous intraepithelial lesion (HSIL), and cancer, and is divided into a training set (90%) and a test set (10%). The serial fusion of the deep features extracted by ResNet50 v2 and DenseNet121 (C5) performs best, with an average classification accuracy of 95.33%, which is 1.07% higher than ResNet50 v2 alone and 1.05% higher than DenseNet121 alone. The recognition ability is significantly improved, especially for LSIL, reaching 90.89%, which is 2.88% higher than ResNet50 v2 and 2.1% higher than DenseNet121. Thus, this method significantly improves the accuracy and generalization ability of pathological cervical WSI recognition by fusing deep features.
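
The deep-feature extraction and serial-fusion steps described in the abstract can be illustrated with a short sketch. This is a minimal illustration only, assuming TensorFlow/Keras backbones with ImageNet weights, a 224x224 input size, and global average pooling; the fine-tuning on cervical WSI patches that the paper performs before extraction is omitted, and the input array here is a random stand-in for image tiles.

```python
# Sketch: deep-feature extraction with two backbones and serial fusion (concatenation).
# Assumptions: TensorFlow/Keras, ImageNet weights, 224x224 inputs, global average pooling.
import numpy as np
from tensorflow.keras.applications import ResNet50V2, DenseNet121
from tensorflow.keras.applications.resnet_v2 import preprocess_input as resnet_pre
from tensorflow.keras.applications.densenet import preprocess_input as dense_pre

IMG_SHAPE = (224, 224, 3)

# Pre-trained backbones; in the paper they are fine-tuned on the cervical
# pathology dataset before feature extraction, which is omitted here.
resnet = ResNet50V2(weights="imagenet", include_top=False,
                    pooling="avg", input_shape=IMG_SHAPE)    # 2048-d features
densenet = DenseNet121(weights="imagenet", include_top=False,
                       pooling="avg", input_shape=IMG_SHAPE)  # 1024-d features

def extract_fused_features(images):
    """Serially fuse (concatenate) the deep features of both backbones."""
    f_res = resnet.predict(resnet_pre(images.copy()), verbose=0)
    f_den = densenet.predict(dense_pre(images.copy()), verbose=0)
    return np.concatenate([f_res, f_den], axis=1)  # shape: (N, 3072)

# Random patches standing in for pathological image tiles.
patches = (np.random.rand(8, *IMG_SHAPE) * 255.0).astype("float32")
fused = extract_fused_features(patches)
print(fused.shape)  # (8, 3072)
```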
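The classification stage combines ANOVA F-value feature selection with spectral embedding. The abstract does not specify AF-SENet's final classifier or its hyperparameters, so the sketch below uses scikit-learn with a linear SVM purely as a stand-in; the values of k and n_components, and the random features, are assumptions, while the 90%/10% split follows the abstract.

```python
# Sketch: ANOVA F-value selection + spectral embedding + a stand-in classifier.
# The exact AF-SENet head is not given in the abstract; a linear SVM is used here.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.manifold import SpectralEmbedding
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 3072))      # fused deep features (random stand-in)
y = rng.integers(0, 4, size=400)      # classes: normal / LSIL / HSIL / cancer

# ANOVA F-value scoring keeps the most class-discriminative fused features.
X_sel = SelectKBest(f_classif, k=512).fit_transform(X, y)

# Spectral embedding for non-linear dimension reduction. It is transductive
# (no transform() for unseen samples), so it is applied to the full feature
# matrix before the 90%/10% train/test split.
X_emb = SpectralEmbedding(n_components=32, random_state=0).fit_transform(X_sel)

X_tr, X_te, y_tr, y_te = train_test_split(
    X_emb, y, test_size=0.10, stratify=y, random_state=0)

clf = SVC(kernel="linear").fit(X_tr, y_tr)
print("test accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```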
