15-11-2011 | General practice | Article

3-month BCR-ABL1 level best predictor of on-imatinib CML outcome

MedWire News: Measuring transcript levels of the BCR-ABL1 oncogene at 3 months is the best way to identify patients with chronic myeloid leukemia in chronic phase (CML-CP) treated with imatinib who are likely to have a poor prognosis, UK researchers report.

Performing this assessment at 3 months will allow early clinical intervention in patients predicted to have a poor outcome, write David Marin (Imperial College London) and colleagues in the Journal of Clinical Oncology.

The researchers explain that the introduction of tyrosine kinase inhibitors such as imatinib has proved to be a major advance in the management of patients with CML-CP.

"In the first year of treatment, patients are most commonly monitored by regular examination of the bone marrow and are classified as responders or nonresponders on the basis of the achievement of major cytogenetic response (MCyR) or complete cytogenetic response (CCyR) at given time points," they note. However, the cytogenetic method is not always accurate.

Therefore, to identify molecular milestones that would predict overall survival (OS) and other outcomes more reliably than serial marrow cytogenetics, Marin and team measured BCR-ABL1 transcript levels in 282 patients with CML-CP at 3, 6, and 12 months after starting treatment with imatinib 400 mg/day as first-line therapy. The patients received dasatinib or nilotinib if treatment with imatinib failed.

Overall, the 8-year OS and progression-free survival (PFS) rates were 84.3% and 83.7%, respectively.

Using receiver operating characteristic curve analysis, the researchers identified the optimal cutoff in transcript level that allowed them to classify the patients as high- or low-risk with maximal sensitivity and specificity for each individual outcome and time point.
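
The article does not detail the ROC procedure, but a common way to derive such a cutoff is to maximize Youden's J statistic (sensitivity + specificity − 1) across candidate thresholds. The sketch below illustrates that general approach on entirely synthetic data using scikit-learn's roc_curve; the variable names and numbers are illustrative assumptions, not the study's data.

```python
# Illustrative sketch only: picking a cutoff that maximizes sensitivity and
# specificity (Youden's J) from an ROC curve. All data here are synthetic.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)

# Hypothetical 3-month BCR-ABL1 transcript levels (%) and long-term outcomes
# (1 = event such as death or progression, 0 = no event); made-up values.
transcript_level = np.concatenate([
    rng.lognormal(mean=0.5, sigma=1.0, size=200),  # mostly low levels
    rng.lognormal(mean=3.0, sigma=0.5, size=80),   # mostly high levels
])
event = np.concatenate([
    rng.binomial(1, 0.10, size=200),               # low event rate
    rng.binomial(1, 0.55, size=80),                # higher event rate
])

# Higher transcript level is treated as "more positive" for an event.
fpr, tpr, thresholds = roc_curve(event, transcript_level)

# Youden's J = sensitivity + specificity - 1; take the threshold maximizing it.
j = tpr - fpr
best = j.argmax()
print(f"optimal cutoff ~ {thresholds[best]:.2f}% "
      f"(sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f})")
```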

They report that, at 3 months, the optimal cutoff for predicting OS and event-free survival (EFS) was 9.84%. Compared with patients whose transcript levels fell below this cutoff (low-risk), those with above-cutoff levels (high-risk) had significantly lower 8-year OS (56.9% vs 93.3%) and EFS (6.9% vs 65.1%).

The optimal cutoff for PFS was slightly lower, at 9.54%. At this level, high-risk patients had a significantly worse PFS rate than low-risk patients, at 57.0% versus 92.8%.

Similarly, transcript levels above 1.67% at 6 months and above 0.53% at 12 months identified patients at high risk of death or disease progression.

However, multivariate analysis adjusted for age, gender, and clinical features revealed that the 3-month transcript level (above or below 9.84%) was the only independent predictor of OS (relative risk [RR]=7.33), PFS (RR=7.16), and EFS (RR=9.71).
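
The report does not state which multivariable model was used; adjusted survival analyses of this kind are commonly Cox proportional hazards regressions, in which the exponentiated coefficients are reported as hazard ratios (described here as relative risks). The sketch below, built on synthetic data with the lifelines library, shows the general shape of such an analysis; the covariate names and values are assumptions for illustration only.

```python
# Illustrative sketch only: a multivariable Cox proportional hazards model of
# the kind typically used to adjust a prognostic marker for age, sex, and
# other clinical features. Data and covariate names are synthetic assumptions.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 282  # cohort size taken from the article; everything else is made up

df = pd.DataFrame({
    "high_risk_3mo": rng.binomial(1, 0.25, n),  # transcript level above cutoff at 3 months
    "age": rng.normal(50, 12, n).round(),
    "male": rng.binomial(1, 0.6, n),
    "followup_years": rng.uniform(0.5, 8.0, n), # observed follow-up time
    "died": rng.binomial(1, 0.2, n),            # event indicator
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="died")

# The exp(coef) column of the summary gives the adjusted hazard ratios.
cph.print_summary()
```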

Furthermore, the prognostic value of the 3-month assessment was independent of whether or not patients had had their dose of imatinib temporarily reduced or discontinued because of adverse effects.

This indicates that the 3-month measurement is the "most informative," say the researchers.

They add: "The 6- and 12-month assessments contributed little (if anything) more to identifying patients with a high risk of progression.

"This suggests that therapeutic strategies for which patients are supposed to achieve successive milestones at specific time points, for example MCyR at 6 months and then CCyR at 12 months and so on, could be abandoned in favor of a single assessment at an early time point that would direct high-risk patients to alternative therapy," the team concludes.

By Laura Dean
