Previously, a tumor dose-response model based on resource competition and cell-cycle-dependent radiosensitivity accurately predicted local failure (LF) rates in early-stage non-small cell lung cancer (NSCLC) cohorts. Here, we used the model to determine non-uniform inter-fraction intervals that minimize local failures at equivalent normal tissue toxicity risk, i.e., iso-BED3 (iso-NTCP), for the fractionation schemes 18 Gy × 3, 12 Gy × 4, 10 Gy × 5, 7.5 Gy × 8, 5 Gy × 12, and 4 Gy × 15. We then used these optimized schedules to reduce toxicity risk (BED3) while holding the predicted local failure rate constant (iso-TCP).
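For context, BED3 is the standard linear-quadratic biologically effective dose, BED = D(1 + d/(α/β)), evaluated at α/β = 3 Gy for late-responding normal tissue; in this simple LQ form there is no time factor, so re-timing fractions at fixed dose per fraction leaves BED3 unchanged, which is why interval optimization is automatically iso-BED3. A minimal sketch of that bookkeeping for the six schemes, assuming the conventional LQ form without repair or repopulation corrections:

```python
# Standard LQ biologically effective dose: BED = n*d * (1 + d / (alpha/beta)).
# alpha/beta = 3 Gy (late-responding normal tissue) gives the BED3 used as the
# iso-toxicity constraint. No time, repair, or repopulation terms (assumption).

def bed(n_fractions: int, dose_per_fx: float, alpha_beta: float = 3.0) -> float:
    """Biologically effective dose for n fractions of dose_per_fx Gy each."""
    total_dose = n_fractions * dose_per_fx
    return total_dose * (1.0 + dose_per_fx / alpha_beta)

schemes = [(3, 18.0), (4, 12.0), (5, 10.0), (8, 7.5), (12, 5.0), (15, 4.0)]
for n, d in schemes:
    print(f"{d:g} Gy x {n}: physical {n * d:g} Gy, BED3 {bed(n, d):.1f} Gy")
```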
Optimal schedules consistently favored a "primer shot" first fraction followed by a roughly 2-week break that allows tumor reoxygenation. Raising or lowering the assumed baseline hypoxia lengthened or shortened the optimal break by up to one week. Fraction sizes of 7.5 Gy and above required only a single primer shot, whereas smaller fractions needed one or two additional fractions to achieve full reoxygenation. Compared with consecutive weekday fractionation, the optimized schedules predicted absolute LF reductions of 4.6%-7.4%, except for 18 Gy × 3, whose LF rate was already near-optimal. Primer-shot schedules could also reduce BED3 at iso-TCP, with the largest improvements for the shortest schedules (a 94.6 Gy reduction for 18 Gy × 3).
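To make the reoxygenation mechanism concrete, the sketch below uses a generic two-compartment (oxic/hypoxic) LQ survival model, not the authors' validated resource-competition model: hypoxic cells see their dose scaled down by an oxygen enhancement ratio (OER), and the hypoxic fraction decays during each inter-fraction gap. All parameter values (ALPHA, BETA, OER, the reoxygenation half-time, and the initial hypoxic fraction) are illustrative assumptions.

```python
import math

# Toy two-compartment LQ model illustrating why a "primer shot" plus a break
# can help: hypoxic cells are radioresistant (effective dose reduced by OER),
# and the hypoxic fraction decays toward zero during inter-fraction gaps.
# All parameter values are illustrative assumptions, not the paper's.

ALPHA, BETA = 0.35, 0.035        # LQ parameters for oxic cells (1/Gy, 1/Gy^2)
OER = 2.5                        # oxygen enhancement ratio for hypoxic cells
REOX_HALFTIME_DAYS = 7.0         # assumed reoxygenation half-time

def surviving_fraction(d: float, hypoxic_frac: float) -> float:
    """Population-averaged LQ survival for one fraction of d Gy."""
    sf_oxic = math.exp(-ALPHA * d - BETA * d * d)
    d_eff = d / OER              # hypoxic cells see a reduced effective dose
    sf_hyp = math.exp(-ALPHA * d_eff - BETA * d_eff * d_eff)
    return (1 - hypoxic_frac) * sf_oxic + hypoxic_frac * sf_hyp

def course_survival(doses_and_gaps, f_hyp0=0.2):
    """doses_and_gaps: list of (dose_Gy, days_until_next_fraction)."""
    sf, f_hyp = 1.0, f_hyp0
    for dose, gap in doses_and_gaps:
        sf *= surviving_fraction(dose, f_hyp)
        f_hyp *= 0.5 ** (gap / REOX_HALFTIME_DAYS)  # reoxygenation in the gap
    return sf

# 10 Gy x 5: consecutive weekdays vs. primer shot followed by a 2-week break.
standard = [(10, 1)] * 5
primer = [(10, 14)] + [(10, 1)] * 4
print(f"standard SF: {course_survival(standard):.3e}")
print(f"primer   SF: {course_survival(primer):.3e}")
```

In this toy model a longer break can only help, because tumor repopulation during the gap is ignored; the paper's model balances the reoxygenation gain against regrowth, which is why the optimal break is finite (about 2 weeks) and shifts with the assumed baseline hypoxia.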
Radiotherapy is traditionally delivered in equally spaced weekday fractions. We hypothesize that heterogeneous inter-fraction intervals can increase radiosensitivity via tumor reoxygenation, and we use modeling to investigate whether such intervals minimize local failures and toxicity in early-stage NSCLC.
A validated simulation model supports non-standard "primer shot" fractionation, which reduces the impact of hypoxia-induced radioresistance. A limitation of this study is that primer-shot fractionation lies outside prior clinical experience and will therefore require clinical trials for definitive testing.