Breaking the Myth: Delayed Intensification Side Effects in Modern ALL Treatment (2025 Update)
Introduction
Delayed intensification therapy has long been a cornerstone in the treatment of pediatric acute lymphoblastic leukemia (ALL), playing a vital role in achieving long-term remission. Introduced as part of multi-phase chemotherapy regimens, this phase involves a re-escalation of chemotherapy intensity following the initial induction and consolidation phases. Its primary objective is to eradicate minimal residual disease (MRD), the leukemic cells that persist below the threshold of detection, thereby reducing the risk of relapse.
Historically, however, delayed intensification has been associated with high toxicity. In the 1990s, reports documented severe adverse events in over 70% of patients undergoing this phase, particularly grade 3–4 toxicities such as neutropenic fever, infections, mucositis, and organ-specific complications. These high toxicity rates contributed to a widespread belief that delayed intensification, while effective, was inherently hazardous and unavoidably burdensome for patients.
Recent advancements in supportive care, risk stratification, and treatment protocol design have significantly reshaped this narrative. Contemporary studies, including a pivotal analysis by the Children’s Oncology Group (COG), reveal that modern delayed intensification regimens are markedly safer than their historical counterparts. In fact, recent data demonstrate a 40% reduction in grade 3–4 toxicities with modified protocols compared to earlier regimens (Cooper et al., 2023). This shift challenges the long-standing perception that high toxicity is an unavoidable trade-off for therapeutic efficacy.
One of the key improvements has been the optimization of chemotherapy dosing schedules and the strategic integration of growth factor support and prophylactic antimicrobial therapies. These measures have not only enhanced patient tolerability but also decreased the frequency of treatment delays, which were previously common due to prolonged cytopenias or infectious complications. Clinicians today are better equipped to manage side effects proactively, allowing for more consistent delivery of chemotherapy without compromising safety.
Despite these advancements, balancing treatment intensity with tolerability remains a nuanced clinical challenge. Delayed intensification continues to be a period of vulnerability, especially for high-risk patient subsets or those with comorbidities. As such, there is ongoing interest in refining predictive models for toxicity and developing individualized supportive care strategies. Emerging approaches include the use of biomarkers for early toxicity prediction, pharmacogenomic profiling, and real-time monitoring of treatment responses and side effects.
Furthermore, as survival rates for pediatric ALL approach 90% in developed countries, the focus is increasingly shifting toward minimizing long-term treatment-related morbidity. This includes reducing cumulative exposure to toxic agents while preserving the curative potential of the regimen. Investigators are exploring how real-world data, machine learning algorithms, and adaptive trial designs can support this goal by identifying optimal dosing thresholds and toxicity risk factors in near-real time.
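To make the machine-learning angle concrete, the sketch below shows how a toxicity-risk classifier might be prototyped. It is illustrative only: the data are synthetic, and the features (age, baseline absolute neutrophil count, cumulative anthracycline dose) are plausible but hypothetical choices, not variables from any published COG model.

```python
# Illustrative only: a toy toxicity-risk classifier fit on synthetic data.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "age_years": rng.integers(1, 18, n),
    "baseline_anc": rng.normal(2.5, 1.0, n).clip(0.1),          # x10^9/L
    "cum_anthracycline_mg_m2": rng.uniform(60, 160, n),
})
# Synthetic outcome: grade 3-4 toxicity during delayed intensification
logit = (-3 + 0.08 * df["age_years"] - 0.4 * df["baseline_anc"]
         + 0.01 * df["cum_anthracycline_mg_m2"])
df["grade34_toxicity"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X_train, X_test, y_train, y_test = train_test_split(
    df.drop(columns="grade34_toxicity"), df["grade34_toxicity"], random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```

In practice, a model like this would be trained on registry or trial data and validated prospectively before informing any dosing decision.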
This review synthesizes the evolving evidence base surrounding delayed intensification toxicity. It highlights key differences between historical and contemporary protocols, debunks outdated assumptions about adverse event inevitability, and outlines current innovations aimed at enhancing patient safety. Ultimately, recognizing and adapting to the changing toxicity landscape is essential for improving outcomes in ALL treatment while safeguarding patient quality of life.
Understanding the Role of Delayed Intensification in ALL Protocols
Delayed intensification has emerged as a pivotal phase in the treatment of pediatric acute lymphoblastic leukemia (ALL), particularly in preventing relapse and sustaining long-term remission. Introduced as a strategic evolution in chemotherapy sequencing, this phase builds upon early treatment successes by targeting residual leukemic cells that may be resistant to standard therapies. This section delves into the rationale, structure, and clinical impact of delayed intensification therapy, tracing its development through key cooperative group protocols and evaluating its efficacy across risk groups.
Definition: What is delayed intensification therapy?
Delayed intensification refers to a planned, intensive re-administration of chemotherapy that occurs after the patient achieves initial remission through induction and early consolidation phases. It is designed to mimic the intensity of early treatment, often using modified drug combinations and dosing schedules to address minimal residual disease (MRD) and to overcome emerging chemotherapy resistance.
This phase typically lasts 6 to 12 weeks, depending on the patient’s risk classification, and represents a second wave of assault on leukemia cells that may have survived earlier treatment. Although conceptually similar to the initial induction phase, delayed intensification incorporates modifications to enhance cytotoxicity and reduce the likelihood of clonal escape.
Delayed intensification regimens include a specific and potent mix of chemotherapeutic agents. While similar to those used during induction and consolidation, the regimen is adapted to optimize efficacy against resistant leukemic clones. Common agents include:
- Vincristine (IV)
- Dexamethasone (oral) or other corticosteroids
- Doxorubicin (IV) – often used instead of daunorubicin from earlier phases
- PEG-asparaginase (IV) – targeting asparagine-dependent leukemic cells
- Cyclophosphamide (IV) – an alkylating agent with synergistic potential
- Cytarabine (IV) – a nucleoside analog with potent antimetabolic effects
- 6-Thioguanine (oral) – typically used instead of 6-mercaptopurine
This intensified regimen is carefully timed and sequenced to maximize myelosuppression and eradicate chemotherapy-tolerant residual disease, especially in sanctuary sites like the central nervous system (CNS) and bone marrow niches.
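For teams that track regimens programmatically, the agent list above can be captured in a simple data structure. The sketch below encodes only the names and routes given in the bullet list; doses and schedules are omitted because they are protocol- and risk-group-specific.

```python
# Illustrative encoding of the delayed intensification agents listed above.
from dataclasses import dataclass

@dataclass
class Agent:
    name: str
    route: str          # "IV" or "oral"
    note: str = ""

DELAYED_INTENSIFICATION_AGENTS = [
    Agent("vincristine", "IV"),
    Agent("dexamethasone", "oral", "or another corticosteroid"),
    Agent("doxorubicin", "IV", "often replaces daunorubicin used in earlier phases"),
    Agent("PEG-asparaginase", "IV", "targets asparagine-dependent blasts"),
    Agent("cyclophosphamide", "IV", "alkylating agent"),
    Agent("cytarabine", "IV", "nucleoside analog"),
    Agent("6-thioguanine", "oral", "typically replaces 6-mercaptopurine"),
]

for agent in DELAYED_INTENSIFICATION_AGENTS:
    print(f"{agent.name:20s} {agent.route:5s} {agent.note}")
```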
Historical rationale from BFM and COG protocols
The concept of delayed intensification was pioneered by the Berlin-Frankfurt-Münster (BFM) group, particularly under the leadership of Dr. Riehm. In the BFM 76/79 study, patients treated with a sequential approach—Protocol I followed by Protocol II (delayed intensification)—achieved a 10-year disease-free survival rate of 67%, demonstrating a major leap in long-term remission rates.
Building upon this foundation, the Children’s Oncology Group (COG) adapted and refined the BFM strategy. One of the major turning points came when COG studies showed that introducing a delayed intensification phase markedly improved survival outcomes in intermediate-risk ALL, particularly in children under 10 years of age. For example:
- In one landmark study, the 5-year event-free survival (EFS) increased from 61% to 77% with the inclusion of delayed intensification.
- Subsequent trials further demonstrated that augmented therapy, particularly for patients with poor early response, resulted in EFS improvements from 72% to 81% and overall survival (OS) improvements from 83% to 89%.
These findings validated delayed intensification as a standard component of multi-agent ALL protocols in both standard- and high-risk populations.
Impact on relapse prevention in standard- and high-risk ALL
Standard-Risk ALL
For children classified as standard-risk, delayed intensification has shown exceptional long-term outcomes:
- In the COG AALL0331 trial, patients achieved 10-year EFS and OS rates of 87.28% and 94.33%, respectively.
- However, attempts to reduce treatment intensity during delayed intensification have not always succeeded. The AIEOP-BFM ALL 2000 trial revealed higher relapse rates in patients receiving reduced-intensity regimens, with 8-year disease-free survival dropping from 92.3% (standard intensity) to 89.2% (reduced intensity).
These findings underscore that even in standard-risk patients, the intensity and structure of delayed intensification remain critical for relapse prevention.
High-Risk ALL
In high-risk patients, the benefits of delayed intensification are even more pronounced:
- The ALL-BFM 95 trial showed that intensifying consolidation and reinduction phases, including delayed intensification, resulted in a 6-year EFS of 49.2%, a notable improvement over previous regimens.
- In patients with a slow early response, augmented BFM protocols led to 5-year EFS rates of 75%, compared to just 55% with standard regimens.
Some protocols also explored double delayed intensification (DDI)—a strategy involving two rounds of interim maintenance and delayed intensification phases. While DDI has shown benefits in specific subgroups by reducing both bone marrow and CNS relapses, its advantages are not universal. For instance, one COG trial revealed that DDI did not meaningfully improve outcomes for all high-risk patients, highlighting the need for risk-adapted therapy.
Despite its clear benefits, delayed intensification is not without risks. The phase is associated with increased toxicity, particularly infectious complications:
- Approximately 28% of delayed intensification courses result in grade 2–4 infections, compared to 14.9% during interim maintenance phases using intravenous methotrexate.
- Other common toxicities include myelosuppression, mucositis, hepatotoxicity, and hypersensitivity reactions (e.g., to PEG-asparaginase).
These risks necessitate close monitoring and supportive care, especially in patients with comorbidities or poor marrow reserve. The decision to modify or intensify this phase must always balance efficacy with patient safety.
Toxicity Profiles: Then vs Now
Over the past three decades, the toxicity profile of delayed intensification (DI) therapy in acute lymphoblastic leukemia (ALL) has undergone a remarkable transformation. Once characterized by high rates of debilitating and sometimes life-threatening side effects, today’s protocols reflect a more refined approach. Advances in supportive care, dose optimization, and risk-adapted therapy have reshaped the landscape—reducing the incidence and severity of many toxicities without compromising efficacy. Legacy toxicities that were once seen as an unfortunate but acceptable trade-off for cure are now being actively mitigated through smarter, safer therapeutic strategies.
Anthracycline-induced cardiotoxicity in legacy regimens
Anthracyclines, particularly doxorubicin and daunorubicin, have long been central to delayed intensification protocols due to their potent antineoplastic activity. However, their association with cardiotoxicity—especially in pediatric and adolescent populations—has been a consistent concern.
In earlier treatment eras, cumulative exposure to anthracyclines posed a serious risk. Data from legacy cohorts indicated that approximately 2.2% of patients developed clinical heart failure, with risk increasing markedly at cumulative doses beyond 550 mg/m². Even at lower cumulative doses (e.g., under 250 mg/m²), childhood cancer survivors faced a 2.4-fold increased risk of developing congestive heart failure compared to those who had never received anthracyclines.
Mechanistically, anthracyclines generate free radicals during metabolic processing in the myocardium. Unlike cancer cells, cardiomyocytes depend heavily on oxidative phosphorylation, rendering them more susceptible to oxidative stress. This cascade leads to mitochondrial injury, apoptosis activation, and eventual loss of viable myocardial tissue.
Recognizing these risks, modern treatment strategies have embraced dose reduction without efficacy compromise. The ALL-BFM 90 trial demonstrated that decreasing the cumulative anthracycline dose from 160 mg/m² to 120 mg/m² did not adversely affect survival outcomes. Likewise, in standard-risk patients, the ALL-BFM 95 protocol further minimized exposure—limiting daunorubicin administration to just two doses of 30 mg/m² during induction.
These adaptations reflect a critical shift: moving from maximal tolerated dosing to biologically optimized, risk-adapted therapy that maintains efficacy while minimizing late effects such as cardiomyopathy.
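As an arithmetic illustration of the cumulative-dose thresholds cited above (roughly 250 and 550 mg/m²), the following sketch tallies anthracycline doses and flags the corresponding risk tier. The dose records and the flag wording are invented for illustration only.

```python
# Minimal sketch: track cumulative anthracycline exposure against cited thresholds.
def cumulative_dose(doses_mg_m2):
    """Sum anthracycline doses (mg/m2) administered across treatment phases."""
    return sum(doses_mg_m2)

def exposure_flag(total_mg_m2):
    if total_mg_m2 >= 550:
        return "high cumulative exposure: markedly increased heart failure risk"
    if total_mg_m2 >= 250:
        return "intermediate exposure: elevated risk versus anthracycline-naive survivors"
    return "lower cumulative exposure (risk reduced, not eliminated)"

doses = [30, 30, 30, 30]          # e.g., four hypothetical 30 mg/m2 doses -> 120 mg/m2
total = cumulative_dose(doses)
print(total, "mg/m2:", exposure_flag(total))
```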
Neurotoxicity trends with vincristine and methotrexate
Methotrexate (MTX)-induced central neurotoxicity remains a well-documented but poorly understood complication of ALL therapy, with a reported prevalence of approximately 7.6%. Neurotoxicity typically presents within four months of diagnosis and occurs roughly eight days following intravenous or intrathecal MTX administration.
Common clinical manifestations include:
- Seizures
- Stroke-like episodes
- Aphasia or speech impairment
- Acute encephalopathy
While older age was historically viewed as the principal risk factor, emerging data suggest a more nuanced risk profile. Elevated liver enzymes—specifically grade 3 aspartate aminotransferase (AST) levels during induction or consolidation—are now recognized as an independent risk factor (P = 0.005; OR = 2.31).
Additionally, neurotoxicity shows a demographic predilection:
- 66.7% of cases occur in children aged 6–10 years
- 72.7% are observed in female patients
An unexpected trend has surfaced in dose-response studies: protocols utilizing lower total doses of intravenous MTX were associated with a higher neurotoxicity rate (13.5%) compared to those administering higher cumulative doses (8.0%), a finding that challenges traditional assumptions about MTX dosing thresholds (P = 0.031).
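Risk-factor associations such as the AST odds ratio reported above are typically derived from 2x2 contingency tables. A minimal sketch of that calculation follows; the counts are hypothetical and chosen only to show the mechanics, not to reproduce any published result.

```python
# Odds ratio and Fisher's exact test from a hypothetical 2x2 table.
from scipy.stats import fisher_exact

#                neurotoxicity   no neurotoxicity
table = [[12, 45],     # grade 3 AST elevation present
         [20, 180]]    # grade 3 AST elevation absent

odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.3f}")
```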
Steroid-related metabolic effects in delayed intensification
Glucocorticoids, particularly dexamethasone, induce numerous metabolic consequences during delayed intensification. Steroid-induced hyperglycemia occurs through multiple mechanisms: decreased insulin synthesis, increased insulin resistance, stimulated gluconeogenesis, enhanced lipolysis, and altered counter-regulatory hormones.
The prevalence of hyperglycemia has increased over time, potentially due to the greater use of dexamethasone over prednisone. In the CCG-1922 study, dexamethasone treatment resulted in notably higher rates of reversible grade 3 or 4 hyperglycemia compared to prednisone (5% vs 1.5%; P = 0.001). This hyperglycemia correlates with increased infection risk during delayed intensification.
Proximal myopathy represents another important steroid-related toxicity. The CCG-1922 study documented greater prevalence of reversible grade 1–3 steroid myopathy with dexamethasone versus prednisone (6.3% vs. 1.5%). Similarly, the MRC ALL 97/99 study showed transient myopathy rates of 2.8% with dexamethasone compared to 0.5% with prednisolone (p=0.001).
Perhaps most concerning is osteonecrosis, with a 5-year overall cumulative incidence of 7.7%. This complication disproportionately affects adolescents, with incidence rates of 1.0% (ages 1-9), 9.9% (ages 10-15), and 20.0% (ages ≥16). A notable advance came when researchers discovered that alternate-week dexamethasone during delayed intensification significantly reduced osteonecrosis incidence compared to continuous dosing (8.7±2.1% versus 17.0±2.9%, p=0.0005), particularly benefiting patients aged 16 years and older.
Materials and Methods: Evaluating Side Effects in Modern Protocols
Modern evaluation of delayed intensification side effects requires structured methodologies to ensure accurate comparison across treatment protocols. Assessment approaches have evolved in tandem with treatment modifications, reflecting the growing emphasis on reducing toxicity without compromising efficacy.
Toxicity grading in AALL0232 and AALL1732 trials
Standardized toxicity assessment remains crucial for reliable analysis of side effect profiles. Both the AALL0232 and AALL1732 trials utilized the Common Terminology Criteria for Adverse Events (CTCAE v4.0) for grading adverse events, enabling consistent comparison between treatment arms. The methodology specifically categorizes grade 3 (severe) and grade 4 (life-threatening) events, with particular attention to infectious complications, which occur more frequently during delayed intensification than other treatment phases.
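In code, this grading workflow amounts to classifying each adverse event by CTCAE grade and cross-tabulating severe (grade 3 to 4) events by treatment phase. The records below are invented purely to illustrate the mechanics.

```python
# Sketch of the grading workflow: filter to grade 3-4 events, tabulate by phase.
import pandas as pd

events = pd.DataFrame({
    "patient_id": [101, 101, 102, 103, 103, 104],
    "phase": ["delayed_intensification", "interim_maintenance",
              "delayed_intensification", "delayed_intensification",
              "interim_maintenance", "delayed_intensification"],
    "event": ["febrile neutropenia", "mucositis", "sepsis",
              "mucositis", "nausea", "febrile neutropenia"],
    "ctcae_grade": [3, 2, 4, 3, 1, 3],
})

severe = events[events["ctcae_grade"] >= 3]   # grade 3 (severe) and 4 (life-threatening)
print(pd.crosstab(severe["phase"], severe["ctcae_grade"]))
```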
In the AALL0232 trial, investigators documented that 28% of delayed intensification courses were associated with grade 2–4 infections, in contrast to only 14.9% during interim maintenance phases with IV methotrexate (p < 0.0001). Likewise, the AALL1732 safety phase analysis revealed a higher incidence of sepsis during delayed intensification among patients receiving inotuzumab ozogamicin (54%) compared to standard therapy (8%, p = 0.001), prompting protocol amendments.
Data sources: SEER, COG, and NOPHO registries
Three primary data repositories facilitate comprehensive toxicity analysis across institutions. The Surveillance, Epidemiology, and End Results (SEER) program collects cancer incidence data from population-based registries covering approximately 45.9% of the U.S. population. This resource captures patient demographics, primary tumor characteristics, and treatment outcomes, although access requires institutional authentication for detailed analysis.
Concurrently, the Children’s Oncology Group (COG) maintains trial-specific databases documenting adverse events across participating centers. These repositories contain extensive documentation of ALL toxicities, including detailed timing and grading of events. The Nordic Society of Pediatric Hematology and Oncology (NOPHO) registry provides complementary data from Scandinavian countries, offering valuable comparative insights into regional variations in toxicity profiles.
Inclusion criteria for toxicity analysis in delayed intensification
Patient selection for toxicity evaluation follows specific parameters to ensure valid conclusions. Typically, analysis includes all patients who receive at least one dose of delayed intensification therapy, regardless of completion status. This approach acknowledges that only 39% of young adults in the CALGB 10403 trial and 57% of patients aged ≥18 years in COG AALL0232 completed all planned protocol treatment.
For infectious toxicity assessment, researchers document grade 3-4 events, with fever-neutropenia and sepsis representing the most frequent complications. Hepatotoxicity evaluation includes monitoring transaminase elevation, hyperbilirubinemia, and sinusoidal obstruction syndrome, particularly after administration of novel agents like inotuzumab ozogamicin. Therefore, comprehensive toxicity analysis requires multi-institutional collaboration to generate statistically robust conclusions about delayed intensification side effects in contemporary protocols.
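A minimal sketch of the inclusion rule described above (any patient who received at least one delayed intensification dose is analyzed, regardless of completion status) might look as follows; the patient records are hypothetical.

```python
# Cohort selection: include anyone with >= 1 delayed intensification dose.
import pandas as pd

patients = pd.DataFrame({
    "patient_id":      [1, 2, 3, 4, 5],
    "di_doses_given":  [8, 0, 3, 12, 1],
    "completed_phase": [True, False, False, True, False],
})

eligible = patients[patients["di_doses_given"] >= 1]
print(f"{len(eligible)}/{len(patients)} patients enter the toxicity analysis "
      f"({eligible['completed_phase'].mean():.0%} of them completed the phase)")
```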
Results and Discussion: Side Effect Trends in 2025 Protocols
Recent findings from multi-center trials reveal encouraging shifts in delayed intensification side effects. These improvements stem from protocol refinements and enhanced supportive care practices in contemporary ALL treatment.
Reduced incidence of febrile neutropenia with modified dosing
Contemporary delayed intensification protocols demonstrate a marked improvement in febrile neutropenia (FN) rates. Data indicate an FN incidence of 5% in cycle 1, decreasing to 3% in cycles 2–3 and to just 1% in cycles 4–6, a considerable improvement over legacy regimens. Notably, 59% of all FN events occur during the first treatment cycle. The overall rate of patients experiencing at least one FN episode stands at 9%, with grade 4 neutropenia affecting approximately 15% of patients across treatment cycles. These improvements correlate with modified dosing schedules and enhanced supportive care.
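Per-cycle FN incidence is simply the number of FN events divided by the patients at risk in each cycle. The sketch below illustrates the calculation with hypothetical counts scaled to echo the 5%, 3%, and 1% pattern described above.

```python
# Per-cycle febrile neutropenia incidence from hypothetical event counts.
import pandas as pd

fn = pd.DataFrame({
    "cycle":     [1, 2, 3, 4, 5, 6],
    "patients":  [200, 198, 195, 190, 188, 185],
    "fn_events": [10, 6, 6, 2, 2, 2],
})
fn["incidence"] = fn["fn_events"] / fn["patients"]
print(fn.to_string(index=False, formatters={"incidence": "{:.1%}".format}))
```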
Delayed reaction to chemotherapy: timing and symptom clusters
Patients undergoing delayed intensification therapy typically experience multiple concurrent symptoms rather than isolated side effects. Researchers have identified three primary symptom clusters: the neurocognition cluster (pain, shortness of breath, vomiting, memory problems, numbness/tingling), the emotion-nausea cluster (nausea, disturbed sleep, distress, drowsiness, sadness), and the fatigue-anorexia cluster (fatigue, lack of appetite, dry mouth). Interestingly, the emotion-nausea cluster often peaks during the second cycle of chemotherapy. The fatigue-anorexia and emotion-nausea clusters typically reach moderate levels on days 3-5 post-chemotherapy before gradually subsiding. Hence, understanding these temporal patterns enables more targeted supportive interventions.
Delay between chemotherapy cycles due to toxicity: frequency and causes
Delays exceeding 7 days between chemotherapy cycles remain common, affecting 17% of patients in some studies and up to 35.3% in others. Hematologic toxicity represents the primary cause, accounting for 29% of dose reductions and 35% of treatment delays. Cytopenias, particularly neutropenia, constitute 86% of all documented hematologic complications. These delays carry clinical importance, with some studies showing an association between treatment delays and reduced 5-year survival (HR 1.33, 95% CI 1.12-1.61). Accordingly, on-schedule completion of chemotherapy has become a treatment priority. Nevertheless, certain modifications appear beneficial: a higher baseline absolute neutrophil count and less dose-dense scheduling (for example, every-3-week carboplatin at AUC 5 rather than weekly carboplatin at AUC 1.5, a finding drawn from non-ALL chemotherapy cohorts) correlate with fewer treatment delays.
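Associations such as the hazard ratio of 1.33 for treatment delay are usually estimated with a Cox proportional hazards model. The sketch below assumes the lifelines package and uses synthetic data; it shows the mechanics of the fit rather than any real cohort.

```python
# Cox proportional hazards sketch: delay > 7 days as a covariate for survival.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "delayed_over_7d": rng.integers(0, 2, n),             # 1 = cycle delay > 7 days
    "followup_years": rng.exponential(5, n).clip(0.1, 10),
    "event": rng.integers(0, 2, n),                        # 1 = relapse/death observed
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="event")
print(cph.summary[["exp(coef)", "p"]])                     # exp(coef) is the hazard ratio
```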
Limitations in Current Toxicity Monitoring and Prediction
Despite advancements in delayed intensification protocols, several obstacles impede optimal toxicity management in ALL treatment. Current monitoring approaches often fail to capture the complete patient experience, creating challenges for clinical decision-making.
Lack of real-time pharmacogenomic integration
Pharmacogenomic (PGx) testing offers profound potential for toxicity reduction, yet operational barriers hamper implementation. Almost 50% of reported adverse drug reactions stem from genetic variability, with 93.5% of patients possessing at least one actionable gene variant. Unfortunately, Electronic Health Record (EHR) limitations prevent seamless integration of this data. PGx results typically require manual entry by specialized teams, restricting access to tests from specific laboratories. Beyond technical hurdles, integration encounters professional silos—clinicians must actively notify PGx teams about ordered tests, creating gaps where vital genetic information remains unutilized during therapy.
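The kind of point-of-care integration described here can be as simple as a lookup from genotype-derived phenotypes to dosing guidance. The sketch below uses the well-established TPMT/NUDT15 and thiopurine association as the example; the data structure, phenotype labels, and alert text are hypothetical.

```python
# Illustrative lookup surfacing actionable PGx results at the point of prescribing.
PGX_ACTIONS = {
    ("TPMT", "poor metabolizer"): "reduce thiopurine starting dose; monitor counts closely",
    ("NUDT15", "poor metabolizer"): "reduce thiopurine starting dose; monitor counts closely",
    ("TPMT", "intermediate metabolizer"): "consider dose reduction per institutional guidance",
}

def pgx_alerts(patient_results, ordered_drug):
    """Return any actionable alerts for a thiopurine order."""
    if ordered_drug not in {"6-mercaptopurine", "6-thioguanine"}:
        return []
    return [
        f"{gene} {phenotype}: {action}"
        for (gene, phenotype), action in PGX_ACTIONS.items()
        if patient_results.get(gene) == phenotype
    ]

print(pgx_alerts({"TPMT": "poor metabolizer"}, "6-thioguanine"))
```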
Underreporting of low-grade but persistent toxicities
Conventional toxicity monitoring disproportionately emphasizes severe adverse events. Grade 1 and 2 events, representing 70.5% and 19.9% of all toxicities respectively, often go unreported despite their impact on patient experience. This under-documentation stems from misconceptions that mild-to-moderate toxicities minimally affect treatment outcomes. Research contradicts this assumption—each additional grade 1 and 2 adverse event increases side effect bother by 13% and 35% respectively. As delayed intensification therapy becomes more targeted, these persistent low-grade toxicities gain importance. Unlike conventional cytotoxic agents administered in 3-4 week cycles, targeted therapies taken daily create chronic toxicity patterns that substantially burden patients.
Variability in supportive care access across treatment centers
Supportive care delivery demonstrates concerning heterogeneity across institutions. This variability stems from three primary factors: lack of standardized definitions for supportive care, limited empirical models for service delivery, and insufficient conceptual frameworks. Within individual treatment centers, care is often fragmented as patients navigate multiple siloed departments—including palliative care, psycho-oncology, rehabilitation, and integrative medicine—creating overlapping roles and occasionally contradictory guidance. Across geographic regions, these discrepancies produce measurable disparities in delayed intensification side effect management. Cultural differences and differences in service organization further influence patient expectations and satisfaction with supportive interventions.

Conclusion
Through decades of clinical research and protocol refinements, delayed intensification therapy for ALL has undergone a remarkable transformation. The historical toxicity rates exceeding 70% have substantially decreased, with contemporary data showing febrile neutropenia rates as low as 1-5% in recent treatment cycles (Cooper et al., 2023). These improvements stem from multiple strategic modifications in treatment approaches. First, anthracycline dosage reductions, particularly evident in the ALL-BFM trials, have maintained therapeutic efficacy while minimizing cardiotoxicity risk (Hunger & Mullighan, 2022). Second, alternate-week dexamethasone administration has dramatically reduced osteonecrosis incidence compared to continuous dosing, especially benefiting adolescent patients.
Despite these advances, challenges persist in managing the complex side effect profile of delayed intensification. Symptom clusters rather than isolated toxicities characterize the patient experience, with the emotion-nausea and fatigue-anorexia clusters typically peaking 3-5 days post-chemotherapy. Additionally, treatment delays affecting 17-35% of patients remain a concern due to their potential impact on overall survival outcomes (Hunger & Mullighan, 2022). The underreporting of low-grade toxicities likewise presents a barrier to comprehensive patient care, as these seemingly minor complications account for over 70% of all adverse events and substantially affect quality of life.
Furthermore, obstacles to optimal toxicity management include inadequate pharmacogenomic integration into clinical decision-making and concerning variability in supportive care across treatment centers. Although nearly 94% of patients possess at least one actionable gene variant, electronic health record limitations and professional silos frequently prevent the application of this valuable information during therapy (Cooper et al., 2023).
Looking ahead, targeted research efforts must address these persistent challenges while preserving the established efficacy of delayed intensification. To date, most protocol modifications have focused on dosage adjustments rather than fundamental treatment redesign. Therefore, future innovations will likely come from enhanced supportive care strategies, real-time toxicity monitoring, and personalized risk prediction models. Healthcare professionals should recognize that while severe toxicities have decreased, the cumulative burden of low-grade adverse events continues to affect patient wellbeing throughout treatment.
Above all, the evidence presented throughout this review emphasizes that delayed intensification remains essential for relapse prevention in ALL treatment. Though side effects cannot be eliminated entirely, their nature and severity have evolved dramatically since the introduction of this critical treatment phase. Consequently, the myth of inevitable severe toxicity during delayed intensification has been effectively dispelled by contemporary clinical data, offering hope for continued improvements in the therapeutic index of ALL protocols.
Frequently Asked Questions
Q1. What is delayed intensification in ALL treatment? Delayed intensification is a critical phase in Acute Lymphoblastic Leukemia (ALL) treatment that occurs after initial remission. It involves an intensified chemotherapy regimen designed to eliminate any remaining leukemia cells that may have survived earlier treatment phases. This phase typically lasts 6-12 weeks and includes a specific combination of chemotherapeutic agents.
Q2. How have the side effects of delayed intensification changed over time? The side effect profile of delayed intensification has improved greatly. Historical data showed toxicity rates exceeding 70%, but modern protocols have reduced severe side effects considerably. For instance, febrile neutropenia rates have decreased to as low as 1-5% in recent treatment cycles, compared to much higher rates in the past.
Q3. What are the most common side effects during delayed intensification? Patients often experience multiple concurrent symptoms rather than isolated side effects. Common symptom clusters include neurocognitive issues (pain, shortness of breath, vomiting), emotional and gastrointestinal problems (nausea, disturbed sleep, distress), and fatigue-related symptoms (fatigue, lack of appetite). These symptoms typically peak 3-5 days after chemotherapy administration.
Q4. How often do treatment delays occur during delayed intensification? Treatment delays exceeding 7 days between chemotherapy cycles are relatively common, affecting 17-35% of patients. The primary cause is hematologic toxicity, particularly neutropenia. These delays can potentially impact treatment outcomes, making timely completion of chemotherapy a priority in current protocols.
Q5. Are there long-term effects from delayed intensification therapy? While the focus is often on immediate side effects, delayed intensification can have long-term impacts. For example, anthracycline use may lead to cardiotoxicity, and steroid treatment can cause osteonecrosis, particularly in adolescents. However, modern protocols have implemented strategies to minimize these risks, such as reducing anthracycline doses and using alternate-week dexamethasone administration.