We have designed, implemented, and validated an algorithm capable of 3D PET-CT registration in the chest, using mutual information as a similarity criterion. Inherent differences in the imaging protocols produce significant non-linear motion between the two acquisitions. To recover this motion, local deformations modeled with cubic B-splines are incorporated into the transformation. The deformation is defined on a regular grid and is parameterized by potentially several thousand coefficients. Together with a spline-based continuous representation of images and Parzen histogram estimates, the deformation model allows for closed-form expressions of the criterion and its gradient. A limited-memory quasi-Newton optimization package is used in a hierarchical multiresolution framework to automatically align the images. To characterize the performance of the algorithm, 27 scans from patients involved in routine lung cancer screening were used in a validation study. The registrations were assessed visually by two observers in specific anatomic locations using a split-window validation technique. The visually reported errors are in the 0-6 mm range and the average computation time is 100 minutes.
Keywords: registration; nonrigid; deformation; nonlinear; multimodality; validation; multiresolution; mutual information; positron emission tomography (PET); computed tomography (CT)
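To make the deformation model in the abstract concrete, the following is a minimal sketch of a tensor-product cubic B-spline free-form deformation evaluated at a single 3D point. The function names, grid layout, coefficient ordering, and boundary handling are illustrative assumptions and not the authors' implementation; the standard uniform cubic B-spline blending functions are used.

```python
import numpy as np

def cubic_bspline_weights(t):
    """Uniform cubic B-spline basis values B0..B3 at fractional offset t in [0, 1)."""
    return np.array([
        (1 - t) ** 3 / 6.0,
        (3 * t**3 - 6 * t**2 + 4) / 6.0,
        (-3 * t**3 + 3 * t**2 + 3 * t + 1) / 6.0,
        t**3 / 6.0,
    ])

def ffd_displacement(point, coeffs, origin, spacing):
    """Displacement at a 3D point from a regular grid of B-spline coefficients.

    coeffs: array of shape (nx, ny, nz, 3), one 3-vector per control point.
    Boundary handling (points near the grid edge) is omitted for brevity.
    """
    u = (np.asarray(point, dtype=float) - origin) / spacing  # continuous grid coordinates
    i = np.floor(u).astype(int)                              # nearest lower control-point index
    t = u - i                                                # fractional offsets per axis
    w = [cubic_bspline_weights(t[d]) for d in range(3)]      # per-axis basis weights
    disp = np.zeros(3)
    # Each point is influenced by the 4x4x4 neighborhood of control points.
    for a in range(4):
        for b in range(4):
            for c in range(4):
                disp += (w[0][a] * w[1][b] * w[2][c]
                         * coeffs[i[0] - 1 + a, i[1] - 1 + b, i[2] - 1 + c])
    return disp

# Example: a hypothetical 10x10x10 control-point grid with 5-unit spacing.
rng = np.random.default_rng(0)
coeffs = rng.normal(scale=0.5, size=(10, 10, 10, 3))
print(ffd_displacement([12.3, 7.8, 20.1], coeffs, origin=np.zeros(3), spacing=5.0))
```

Because each point depends only on its local 4x4x4 neighborhood of control points, the grid can carry several thousand coefficients while keeping the criterion and its gradient inexpensive to evaluate, which is what makes the quasi-Newton multiresolution optimization described above practical.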