Detecting and Labeling Diseases in Chest X-Rays with Deep Learning

Researchers from the National Institutes of Health in Bethesda, Maryland, are using NVIDIA GPUs and deep learning to automatically annotate diseases in chest x-rays.

Accelerated by Tesla GPUs, the team trained their convolutional neural networks on a publicly available dataset of chest x-rays and accompanying radiology reports to generate descriptions of disease characteristics such as location, severity, and the affected organs.
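
To give a sense of how an image-to-annotation model of this kind can be structured, here is a minimal sketch of a CNN encoder paired with an RNN decoder that maps an x-ray to a short text annotation. This is an illustrative assumption, not the NIH team's actual architecture or training setup; the model name, backbone, vocabulary size, and dimensions are all hypothetical.

```python
# Illustrative sketch: CNN encoder + RNN decoder for generating a short text
# annotation (e.g. disease, location, severity) from a chest x-ray.
# All names, sizes, and hyperparameters are assumptions for illustration only;
# they do not reproduce the researchers' model.
import torch
import torch.nn as nn
import torchvision.models as models


class XRayAnnotator(nn.Module):
    def __init__(self, vocab_size, embed_dim=256, hidden_dim=512):
        super().__init__()
        # CNN encoder: a standard ImageNet backbone with its classifier head
        # replaced by a projection into the decoder's embedding space.
        backbone = models.resnet18(weights=None)
        backbone.fc = nn.Linear(backbone.fc.in_features, embed_dim)
        self.encoder = backbone
        # RNN decoder: generates the annotation token by token, conditioned on
        # the image feature, which is fed in as the first time step.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, images, captions):
        # images:   (batch, 3, H, W) x-rays (grayscale repeated to 3 channels)
        # captions: (batch, seq_len) token ids of the reference annotation
        img_feat = self.encoder(images).unsqueeze(1)    # (batch, 1, embed_dim)
        tok_emb = self.embed(captions)                  # (batch, seq_len, embed_dim)
        inputs = torch.cat([img_feat, tok_emb], dim=1)  # prepend image feature
        hidden, _ = self.rnn(inputs)
        return self.out(hidden)                         # (batch, seq_len+1, vocab_size)


if __name__ == "__main__":
    model = XRayAnnotator(vocab_size=1000)
    dummy_images = torch.randn(2, 3, 224, 224)
    dummy_captions = torch.randint(0, 1000, (2, 12))
    print(model(dummy_images, dummy_captions).shape)  # torch.Size([2, 13, 1000])
```

In practice, a model like this would be trained with a cross-entropy loss over the report tokens and decoded greedily or with beam search at test time, which is one common way to produce captions such as those shown in the figure below.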

Figure: Labeling chest x-rays with GPUs. Examples of generated annotations (light green box) compared to true annotations (yellow box) for input images in the test set.

The researchers note that, to the best of their knowledge, this is the first study to mine a publicly available radiology image and report dataset not only to classify and detect disease in images, but also to describe its context the way a human observer would.

Read the research paper >>

About Brad Nemire

Brad Nemire is on the Developer Marketing team and loves reading about all of the fascinating research being done by developers using NVIDIA GPUs. Reach out to Brad on Twitter @BradNemire and let him know how you’re using GPUs to accelerate your research. Brad graduated from San Diego State University and currently resides in San Jose, CA.