Cavities formed by osteoclasts on the surface of cancellous bone during bone remodeling (resorption cavities) are believed to act as stress risers and impair cancellous bone strength and stiffness. Although resorption cavities are readily detected as eroded surfaces in histology sections, identification of resorption cavities in three-dimensional images of cancellous bone has been rare. Here we use sub-micrometer-resolution images of rat lumbar vertebral cancellous bone obtained through serial milling (n = 5) to determine how measures of the number and surface area of resorption cavities are influenced by image resolution. Three-dimensional images of a 1 mm cube of cancellous bone were collected at 0.7 × 0.7 × 5.0 μm/voxel using fluorescence-based serial milling and uniformly coarsened to four other resolutions ranging from 1.4 × 1.4 × 5.0 to 11.2 × 11.2 × 10 μm/voxel. Cavities were identified in the three-dimensional images as indentations on the cancellous bone surface and were confirmed as eroded surfaces by viewing two-dimensional cross-sections (mimicking histology techniques). The number of cavities observed in the 0.7 × 0.7 × 5.0 μm/voxel images (22.0 ± 1.43, mean ± SD) was not significantly different from that in the 1.4 × 1.4 × 5.0 μm/voxel images (19.2 ± 2.59), and on average 79% of the cavities were detected at both of these resolutions. At lower resolutions, however, cavity detection was confounded by low sensitivity (< 20%) and high false positive rates (> 40%). Our results demonstrate that when image voxel size exceeds 1.4 × 1.4 × 5.0 μm/voxel, identification of resorption cavities by bone surface morphology is highly inaccurate. Experimental and computational studies of resorption cavities in three-dimensional images of cancellous bone may therefore require images to be collected at resolutions of 1.4 μm/pixel in-plane or better to ensure consistent identification of resorption cavities.
Keywords:
Bone remodeling; Cancellous bone; Imaging; Histomorphometry; Bone resorption
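The uniform coarsening described above (e.g., from 0.7 × 0.7 × 5.0 to 1.4 × 1.4 × 5.0 μm/voxel) can be illustrated with a short sketch. The abstract does not specify the downsampling method, so this example assumes simple block averaging over integer per-axis factors; the function name `coarsen` and the array shapes are illustrative, not taken from the study.

```python
import numpy as np

def coarsen(volume, factors):
    """Coarsen a 3D image by block-averaging (an assumed method).

    volume  : 3D array (z, y, x) of voxel intensities
    factors : per-axis integer downsampling factors (fz, fy, fx)
    """
    fz, fy, fx = factors
    # Trim each axis so it divides evenly by its factor.
    z, y, x = (s - s % f for s, f in zip(volume.shape, (fz, fy, fx)))
    v = volume[:z, :y, :x]
    # Group voxels into blocks and average each block.
    return v.reshape(z // fz, fz, y // fy, fy, x // fx, fx).mean(axis=(1, 3, 5))

# Hypothetical example: coarsening in-plane by 2x while keeping the
# slice spacing, as in going from 0.7 x 0.7 x 5.0 to 1.4 x 1.4 x 5.0
# um/voxel (axis order here is z, y, x).
fine = np.random.rand(100, 512, 512)
coarse = coarsen(fine, (1, 2, 2))
print(coarse.shape)  # (100, 256, 256)
```

Block averaging preserves mean intensity within each block, so a bone/marrow threshold applied to the coarsened image behaves like a partial-volume measurement, which is why fine surface features such as shallow cavities disappear at coarse voxel sizes.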