
Fundus Image Based Cataract Classification


Record Details


Indexed in: CPCI (ISTP)

Affiliations:
[1] Tsinghua Univ, Dept Automat, Tsinghua Natl Lab Informat Sci & Technol, Beijing, Peoples R China
[2] Res Inst Informat & Technol, Beijing, Peoples R China
[3] Tsinghua Univ, Res Inst Applicat Technol Wuxi, Beijing, Jiangsu, Peoples R China
[4] Beijing Univ Technol, Sch Software Engn, Beijing, Peoples R China
[5] Capital Med Univ, Beijing Tongren Hosp, Beijing Tongren Eye Ctr, Beijing, Peoples R China

Keywords: cataract classification; fundus image; 2-dimensional discrete Fourier transform; Principal Component Analysis; Linear Discriminant Analysis; AdaBoost

Abstract:
Cataract is one of the leading causes of visual impairment worldwide, and people with cataracts often face difficulties in many aspects of daily life. Although early treatment can relieve patients' suffering and prevent visual impairment from progressing to blindness, people in less developed areas still cannot get timely treatment because of poor eye care services or a lack of professional ophthalmologists. Moreover, the commonly used methods for cataract diagnosis, clinical assessment and photographic grading, must be performed at a slit lamp by an ophthalmologist, which is complicated and expensive for many patients. Reducing the cost and simplifying the process of early cataract diagnosis is therefore of great importance. In this paper, we propose a fundus image based cataract classification method using pattern recognition, which can be applied to early cataract screening. Taking the 2-dimensional discrete Fourier transform spectrum of a fundus image as the feature representation, cataract classification and grading are performed with linear discriminant analysis boosted by the AdaBoost algorithm as the classifier. A preliminary test was carried out on a sample set of 460 fundus images, in which the numbers of normal, mild, moderate, and severe cataract images are 158, 137, 86, and 79, respectively. The two-class and four-class classification accuracies of the proposed method are 95.22% and 81.52%, respectively. We believe the proposed method has great potential in practical applications.
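The pipeline the abstract describes (2-D DFT spectrum as features, dimensionality reduction, a boosted linear classifier) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the images here are random synthetic arrays standing in for the 460 fundus images, the 32×32 spectrum crop and 10 PCA components are arbitrary choices, and scikit-learn's `AdaBoostClassifier` boosts decision stumps by default rather than the LDA weak learners used in the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import AdaBoostClassifier
from sklearn.pipeline import make_pipeline

def dft_features(image, keep=32):
    """Magnitude spectrum of the 2-D DFT, low frequencies shifted to the
    center, cropped to a keep x keep block and flattened into a vector."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image)))
    c = image.shape[0] // 2
    h = keep // 2
    block = spectrum[c - h:c + h, c - h:c + h]
    return np.log1p(block).ravel()  # log-compress the dynamic range

# Synthetic stand-ins for fundus images (the paper used 460 real images).
rng = np.random.default_rng(0)
images = rng.random((40, 64, 64))
labels = rng.integers(0, 2, size=40)  # two-class case: normal vs. cataract

X = np.array([dft_features(img) for img in images])

# PCA for dimensionality reduction, then AdaBoost. Note: the paper boosts
# LDA weak learners; scikit-learn's AdaBoost defaults to decision stumps.
clf = make_pipeline(PCA(n_components=10), AdaBoostClassifier(n_estimators=50))
clf.fit(X, labels)
print(clf.predict(X[:5]))
```

With real data, the same `dft_features` step would be applied to each grayscale fundus image, and the four-class grading case would simply use labels 0–3 in place of the binary labels.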

First author's affiliation: [1] Tsinghua Univ, Dept Automat, Tsinghua Natl Lab Informat Sci & Technol, Beijing, Peoples R China

