Singapore General Hospital (SGH) is Singapore’s largest acute tertiary hospital and national referral centre, with over 800 specialists offering medical care to about 1 million citizens each year. As an academic healthcare institution, SGH also plays a key role in nurturing doctors, nurses and allied health professionals, and is committed to innovative translational and clinical research to deliver the best care and outcomes to patients.
One area the hospital is looking to improve is the speed and accuracy of diagnosing a specific type of breast tumour known as fibroepithelial lesions (FELs).
FELs are biphasic tumours that comprise benign fibroadenomas (FAs) and the less common phyllodes tumours (PTs). Diagnosing FELs can be a challenge because FAs and PTs have overlapping features but require different clinical management. While FAs are usually monitored without further treatment, PTs typically require surgery to remove the tumours. It is very important, therefore, to make the correct diagnosis by analysing, under a microscope, the morphological features of cellular FA and benign PT tissue samples obtained from core needle biopsies.
SGH and AI Singapore (AISG) collaborated on a 100 Experiments (100E) project that leverages artificial intelligence (AI) to assist the laboratory in solving this problem. A novel bespoke two-stage computer vision model was developed and trained on high-resolution whole-slide images of tissue samples. At the time of development, this was the first known study utilising AI to evaluate core biopsy images of FA and PT.
In the first stage of the model development, the large, gigapixel-scale whole-slide image was divided into multiple, smaller patches. A convolutional neural network (CNN) was then built to identify and extract the tissue samples’ discriminative features within each patch. This was guided by detailed annotations of lesional areas within each slide, as provided by breast pathology experts in the SGH team.
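The patching step in this first stage can be sketched as below. This is an illustrative NumPy example only: the patch size, the non-overlapping stride, and the grid layout are assumptions for the sketch, not the parameters used in the SGH model.

```python
import numpy as np

def tile_slide(slide, patch_size=256, stride=256):
    """Divide a whole-slide image array of shape (H, W, 3) into patches.

    Returns a list of (row, col, patch) tuples so that each patch's
    grid position is retained for the later spatial-aggregation stage.
    """
    h, w, _ = slide.shape
    patches = []
    for r in range(0, h - patch_size + 1, stride):
        for c in range(0, w - patch_size + 1, stride):
            patches.append((r, c, slide[r:r + patch_size, c:c + patch_size]))
    return patches

# Toy example: a 1024x1024 "slide" yields a 4x4 grid of 256x256 patches.
slide = np.zeros((1024, 1024, 3), dtype=np.uint8)
patches = tile_slide(slide)
print(len(patches))  # 16
```

In practice a gigapixel slide is read region by region (for example via a library such as OpenSlide) rather than loaded whole into memory, and each patch would then be passed through the CNN feature extractor.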
In the second stage, the features extracted from the small patches in the first stage were fed into a recurrent neural network (RNN), which analysed how the lesional patches were arranged spatially within each slide. In doing so, the model was able to take a bird’s eye view of the whole slide and produce an overall evaluation of whether it was an FA or PT.
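As a rough illustration of the second stage, the sketch below runs a minimal vanilla RNN over a sequence of per-patch feature vectors and reads out a single slide-level probability. All dimensions, weights, and the patch ordering are invented for the example; the actual architecture of the SGH model is not specified here.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_classify(features, W_h, W_x, w_out):
    """Run a vanilla RNN over a sequence of patch feature vectors and
    return a slide-level probability (sigmoid of a linear read-out on
    the final hidden state)."""
    h = np.zeros(W_h.shape[0])
    for x in features:                    # one step per patch, in grid order
        h = np.tanh(W_h @ h + W_x @ x)    # update hidden state
    logit = w_out @ h
    return 1.0 / (1.0 + np.exp(-logit))   # e.g. probability of PT vs FA

# Toy dimensions: 16 patches, each with a 32-dim CNN feature vector.
feat_dim, hid_dim = 32, 8
features = rng.normal(size=(16, feat_dim))
W_h = rng.normal(scale=0.1, size=(hid_dim, hid_dim))
W_x = rng.normal(scale=0.1, size=(hid_dim, feat_dim))
w_out = rng.normal(scale=0.1, size=hid_dim)
p_pt = rnn_classify(features, W_h, W_x, w_out)
```

Feeding the patches in their slide-grid order is what lets the recurrent stage account for how lesional areas are arranged spatially, rather than scoring each patch in isolation.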
The AI solution was also accompanied by an explanatory module that provided a visual heatmap of the key discriminative features driving each prediction. This helped end users to better understand the decision-making process behind the AI model.
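One simple way to picture such a heatmap is to map per-patch relevance scores back onto slide coordinates. The sketch below assumes non-overlapping fixed-size patches and a known grid shape; how the actual explanatory module computes relevance is not described in the source.

```python
import numpy as np

def patch_heatmap(scores, grid_shape, patch_size=256):
    """Expand a grid of per-patch relevance scores into a pixel-level
    heatmap by repeating each score over its patch's area, so the
    result can be overlaid on the original slide image."""
    grid = np.asarray(scores, dtype=float).reshape(grid_shape)
    return np.kron(grid, np.ones((patch_size, patch_size)))

# A 4x4 grid of patch scores becomes a 1024x1024 overlay.
scores = np.linspace(0.0, 1.0, 16)
heatmap = patch_heatmap(scores, (4, 4))
print(heatmap.shape)  # (1024, 1024)
```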
When tested on a new set of whole-slide images, the model achieved an accuracy of 87.5 per cent in its predictions.
The model was subsequently packaged into a minimum viable solution and is being trialled for use on clinical samples.
Dr Cheng Chee Leong, Senior Consultant with the Department of Anatomical Pathology at SGH, said the collaboration with AISG has been an eye-opening experience for him.
A more objective and rapid detection tool could translate into better treatment decisions, potentially reducing the need for surgical management, alleviating patient anxiety, and delivering significant cost savings.