Systematic approach to a channelized Hotelling model observer implementation for a physical phantom containing mass-like lesions: Application to digital breast tomosynthesis.

Affiliation

Dept. of Medical Physics and Quality Assessment, KU Leuven, Leuven, Belgium. Electronic address: [Email]

Abstract

OBJECTIVE: To develop a channelized Hotelling observer (CHO) that matches human reader (HR) scoring of a physical phantom containing breast-simulating structure and mass-like target lesions, for use in quality control of digital breast tomosynthesis (DBT) imaging systems.
METHODS: A total of 108 DBT scans of the phantom were acquired using a Siemens Inspiration DBT system. The detectability of the mass-like targets was evaluated by human readers using a four-alternative forced-choice (4-AFC) method. The resulting percentage correct (PC) values served as the benchmark for tuning the CHO, which was applied with the same 4-AFC method. Three channel functions were considered: Gabor, Laguerre-Gauss and difference-of-Gaussians. With regard to the observer template, various methods for generating the expected signal were studied, along with the influence of the number of training images used to form the covariance matrix for the observer template. The impact of bias in the training process on the observer template was then evaluated, as well as HR and CHO reproducibility.
RESULTS: HR performance was most closely matched by 8 Gabor channels with tuned phase, orientation and frequency, using an observer template generated from the training image data. Just 24 DBT image stacks were sufficient for robust CHO performance at 0% bias, and a bias of up to 33% in the training images still gave acceptable performance. CHO and HR reproducibility were similar (on average 3.2 PC versus 3.4 PC).
CONCLUSIONS: The CHO algorithm developed here matches human reader performance and is therefore a promising candidate for the automated readout of phantom studies.
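The observer pipeline summarized above (channelization, a Hotelling template built from training-image covariance, and 4-AFC scoring) can be sketched as follows. This is a minimal illustration on synthetic ROIs only: the ROI size, Gabor parameters, Gaussian-blob signal model and training/test counts are placeholders, not the tuned values or phantom data from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 32  # ROI side length in pixels (illustrative)

def gabor_channel(size, f, theta, phase, width):
    """Gaussian-windowed cosine grating; `width` is the envelope FWHM in pixels."""
    y, x = np.mgrid[:size, :size] - (size - 1) / 2.0
    xr = x * np.cos(theta) + y * np.sin(theta)
    env = np.exp(-4.0 * np.log(2.0) * (x**2 + y**2) / width**2)
    return env * np.cos(2.0 * np.pi * f * xr + phase)

# An 8-channel Gabor bank (2 frequencies x 2 orientations x 2 phases);
# in the study these parameters are tuned against human-reader scores.
params = [(f, th, ph) for f in (0.06, 0.12)
                      for th in (0.0, np.pi / 2)
                      for ph in (0.0, np.pi / 2)]
U = np.stack([gabor_channel(N, f, th, ph, 14.0).ravel()
              for f, th, ph in params], axis=1)
U /= np.linalg.norm(U, axis=0)  # unit-norm channels

# Synthetic stand-in for a mass-like target: Gaussian blob in white noise
yy, xx = np.mgrid[:N, :N] - (N - 1) / 2.0
signal = 1.5 * np.exp(-(xx**2 + yy**2) / (2.0 * 3.0**2))

def rois(n, with_signal):
    g = rng.normal(size=(n, N, N))
    return g + signal if with_signal else g

# Training: Hotelling template in channel space from labelled training ROIs
vs = rois(100, True).reshape(100, -1) @ U    # channel outputs, signal present
vb = rois(100, False).reshape(100, -1) @ U   # channel outputs, signal absent
K = 0.5 * (np.cov(vs, rowvar=False) + np.cov(vb, rowvar=False))
w = np.linalg.solve(K, vs.mean(axis=0) - vb.mean(axis=0))

# Testing: 4-AFC trials -- the CHO is "correct" when the signal ROI scores highest
n_trials, n_correct = 200, 0
for _ in range(n_trials):
    alts = np.concatenate([rois(1, True), rois(3, False)])  # index 0 holds the signal
    t = alts.reshape(4, -1) @ U @ w
    n_correct += int(np.argmax(t) == 0)
pc = 100.0 * n_correct / n_trials  # percentage correct, comparable to HR PC
print(f"CHO percent correct (4-AFC): {pc:.1f}")
```

The 4-AFC percent correct computed this way is what would be benchmarked against the human-reader PC values when tuning the channel parameters.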

Keywords

Channelized Hotelling observer; Digital breast tomosynthesis; Human observer; Mass lesions; Physical phantom