The decreased risk of osteoporotic fractures among thiazide diuretic (TD) users may be explained not only by an increase in bone mineral density (BMD) but also by an increase in other determinants of bone strength, such as the trabecular bone score (TBS). To test this hypothesis, we studied the cross-sectional association between TD use and both lumbar spine BMD (LS-BMD) and lumbar spine TBS (LS-TBS) in 6096 participants from the Rotterdam Study, as well as the association between TD use and bone turnover estimated by serum osteocalcin levels. We found that past and current use of TD were associated with a higher LS-BMD (β = 0.021 g/cm² (95% CI: 0.006; 0.036) and β = 0.016 g/cm² (95% CI: 0.002; 0.031), respectively). Use of ≥1 defined daily dose (DDD) (β = 0.028, 95% CI: 0.010; 0.046; p for trend across DDD of use <0.001) and use for >365 days (β = 0.033, 95% CI: 0.014; 0.052; p for trend across duration of use <0.001) were positively associated with LS-BMD. No significant association between TD use and LS-TBS was observed. Mean serum osteocalcin levels differed significantly between users and non-users of TD (20.2 ng/ml (SD 8.3) versus 22.5 ng/ml (SD 17.0); p < 0.001). Furthermore, linear regression analysis showed that TD use was associated with a 3.2 ng/ml (95% CI: −4.4; −2.0) lower serum osteocalcin level than non-use of TD, after adjustment for Rotterdam Study cohort, age, and sex. Our results may indicate that the decreased fracture risk in TD users is explained by increased bone mass rather than by improved bone microarchitecture. Alternatively, changes in bone microarchitecture might not be detectable with TBS, and more sophisticated techniques may be needed to study a potential effect of TD on bone microarchitecture.
Keywords:
Thiazide diuretics; Bone mineral density; Trabecular bone score; Osteoporosis