An automatic diagnosis method for the knee meniscus tears in MR images


EXPERT SYSTEMS WITH APPLICATIONS, vol.36, no.2, pp.1208-1216, 2009 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 36 Issue: 2
  • Publication Date: 2009
  • Doi Number: 10.1016/j.eswa.2007.11.036
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus
  • Page Numbers: pp.1208-1216
  • Keywords: Medical image processing, Statistical image processing, Bone segmentation, Knee meniscus location, Meniscus tears, Automatic diagnosis, Segmentation, Signals, Fusion, Volume
  • Karadeniz Technical University Affiliated: Yes


Every day, vast amounts of information accumulate in medical databases. These databases contain useful information that could be exploited to improve the diagnosis and treatment of illnesses. However, classifying this information is becoming increasingly difficult. In this paper, an automatic method to diagnose knee meniscus tears from MR medical images is presented. The proposed system uses a histogram-based method with edge-detection filtering and statistical segmentation-based methods to locate the meniscus at the knee joint. A template-matching technique is also employed to extract the meniscus. Finally, the meniscus area is analyzed to detect meniscus tears automatically. Accurate segmentation of the statistical pattern requires a technique that eliminates background effects; hence, the density distributions of the statistical patterns on images with varying backgrounds are corrected. Here, the statistical segmentation method also extracts a representative image of the statistical patterns, such as bone, and uses this image to enhance the segmentation. The performance of the method is examined on MR images of varying quality. The results show that the method is quite successful in segmenting knee bones and diagnosing meniscus tears; the system achieved an accuracy of about 93% in the diagnosis of meniscus tears on MR images. (C) 2007 Elsevier Ltd. All rights reserved.
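The template-matching step mentioned in the abstract is commonly implemented as normalized cross-correlation (NCC), where a small template is slid over the image and the position with the highest correlation score is taken as the match. The sketch below is a minimal, illustrative NumPy implementation of NCC matching on a synthetic image; it is not the authors' code, and the template and image used here are stand-ins, not actual MR data.

```python
import numpy as np

def match_template(image, template):
    """Slide `template` over `image` and return the (row, col) of the
    best match under normalized cross-correlation (NCC)."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = image[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            if denom == 0:
                continue          # flat patch: NCC undefined, skip
            score = (p * t).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

# Synthetic test: embed a bright blob (a stand-in for a meniscus
# template) into a noisy background and try to recover its position.
rng = np.random.default_rng(0)
img = rng.normal(0.0, 0.1, size=(64, 64))
tmpl = np.zeros((8, 8))
tmpl[2:6, 1:7] = 1.0              # hypothetical template shape
img[20:28, 30:38] += tmpl         # planted at row 20, col 30
pos, score = match_template(img, tmpl)
```

In practice a library routine such as OpenCV's `cv2.matchTemplate` would be used instead of this double loop, but the brute-force version makes the NCC computation explicit.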