Impact of preexisting cardiovascular disease (CVD) on treatments and outcomes of patients with breast or lung cancer.

2020
Vol 38 (15_suppl)
pp. 12063-12063
Author(s):  
Atul Batra ◽  
Shiying Kong ◽  
Rodrigo Rigo ◽  
Winson Y. Cheung

Background: Prior cardio-oncology and geriatric oncology research has mainly focused on cancer treatments and their late effects on cardiac health, but little is known about how cardiac health may influence subsequent cancer treatments. This real-world study aimed to evaluate the associations of pre-existing CVD with treatment adherence and survival in patients with breast or lung cancer. Methods: We linked administrative data from the population-based cancer registry, electronic medical records, and billing claims in a large province (Alberta, Canada) over a 10-year period (2006-2015). Multivariable logistic regression analyses were performed to identify associations of CVD with cancer treatments. Multivariable Cox proportional hazards models were constructed to determine the effect of CVD on overall survival (OS), while adjusting for receipt of cancer treatments. Results: We identified 46,227 patients with breast or lung cancer, of whom 77% were women; the median age was 65 years. While 82% of patients with breast cancer had early-stage disease, 50% of those with lung cancer had metastases. The prevalence of pre-existing CVD was 20%, with congestive heart failure being the most frequent diagnosis. In logistic regression, CVD was associated with lower odds of receiving appropriate chemotherapy (OR, 0.60; 95% CI, 0.56-0.65; P<.0001), radiotherapy (OR, 0.76; 95% CI, 0.72-0.81; P<.0001), and surgery (OR, 0.60; 95% CI, 0.54-0.66; P<.0001), irrespective of tumor site (Table). The 5-year OS was lower in patients with baseline CVD than in those without (46% vs 58%, P<.0001). After adjusting for stage and treatment, CVD remained associated with worse OS (HR, 1.23; 95% CI, 1.19-1.26; P<.0001). Conclusions: Cancer patients with prior CVD were less likely to receive standard cancer therapy. Even among those who underwent cancer treatments, worse outcomes were observed in those with CVD. Early cardio-oncology and geriatric oncology engagement may reduce treatment bias and ensure that carefully selected patients with a cardiac history are still offered appropriate cancer therapy.
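The analysis pattern described here couples a logistic model for treatment receipt with a survival model adjusted for that treatment. A minimal Python sketch of this two-stage approach follows; the data, variable names (cvd, chemo, os_months), and outputs are simulated stand-ins, not the Alberta dataset.

```python
# Two-stage analysis sketch on simulated data:
# (1) logistic regression for the odds of receiving chemotherapy given CVD,
# (2) Cox proportional hazards for overall survival, adjusted for treatment.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "cvd": rng.integers(0, 2, n),          # pre-existing CVD (0/1)
    "age": rng.normal(65, 10, n),
    "stage": rng.integers(1, 5, n),        # tumor stage I-IV
    "chemo": rng.integers(0, 2, n),        # received chemotherapy (0/1)
    "os_months": rng.exponential(36, n),
    "death": rng.integers(0, 2, n),
})

# (1) Odds of receiving chemotherapy, adjusted for age and stage.
logit = smf.logit("chemo ~ cvd + age + C(stage)", data=df).fit(disp=False)
print("OR for CVD:", np.exp(logit.params["cvd"]))

# (2) OS model adjusted for receipt of treatment, as in the abstract.
cph = CoxPHFitter()
cph.fit(df, duration_col="os_months", event_col="death")
print("HR for CVD:", cph.hazard_ratios_["cvd"])
```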

2008
Vol 56 (7)
pp. 954-957
Author(s):  
Jeanette M. Tetrault ◽  
Maor Sauler ◽  
Carolyn K. Wells ◽  
John Concato

Background Multivariable models are frequently used in the medical literature, but many clinicians have limited training in these analytic methods. Our objective was to assess the prevalence of multivariable methods in medical literature, quantify reporting of methodological criteria applicable to most methods, and determine if assumptions specific to logistic regression or proportional hazards analysis were evaluated. Methods We examined all original articles in Annals of Internal Medicine, British Medical Journal, Journal of the American Medical Association, Lancet, and New England Journal of Medicine, from January through June 2006. Articles reporting multivariable methods underwent a comprehensive review; reporting of methodological criteria was based on each article's primary analysis. Results Among 452 articles, 272 (60%) used multivariable analysis; logistic regression (89 [33%] of 272) and proportional hazards (76 [28%] of 272) were most prominent. Reporting of methodological criteria, when applicable, ranged from 5% (12/265) for assessing influential observations to 84% (222/265) for description of variable coding. Discussion of interpreting odds ratios occurred in 13% (12/89) of articles reporting logistic regression as the primary method and discussion of the proportional hazards assumption occurred in 21% (16/76) of articles using Cox proportional hazards as the primary method. Conclusions More complete reporting of multivariable analysis in the medical literature can improve understanding, interpretation, and perhaps application of these methods.
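Since this survey found the proportional hazards assumption discussed in only 21% of Cox-based articles, it is worth showing how that check can be run in practice. A sketch using lifelines' scaled Schoenfeld residual test on simulated data (column names are illustrative):

```python
# Testing the proportional hazards assumption with lifelines.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "exposure": rng.integers(0, 2, n),
    "age": rng.normal(60, 12, n),
    "time": rng.exponential(24, n),
    "event": rng.integers(0, 2, n),
})

cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
# Regresses each covariate's scaled Schoenfeld residuals on time; a small
# p-value suggests the hazard ratio is not constant over follow-up.
cph.check_assumptions(df, p_value_threshold=0.05)
```

For logistic regression, the analogous rarely-reported step is interpretive rather than diagnostic: exponentiated coefficients are odds ratios, not risk ratios, and overstate relative risk when the outcome is common.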


Author(s):  
Laurie Grieshober ◽  
Stefan Graw ◽  
Matt J. Barnett ◽  
Gary E. Goodman ◽  
Chu Chen ◽  
...  

Abstract Purpose The neutrophil-to-lymphocyte ratio (NLR) is a marker of systemic inflammation that has been reported to be associated with survival after chronic disease diagnoses, including lung cancer. We hypothesized that the inflammatory profile reflected by pre-diagnosis NLR, rather than the well-studied pre-treatment NLR at diagnosis, may be associated with increased mortality after lung cancer is diagnosed in high-risk heavy smokers. Methods We examined associations between pre-diagnosis methylation-derived NLR (mdNLR) and lung cancer-specific and all-cause mortality in 279 non-small cell lung cancer (NSCLC) and 81 small cell lung cancer (SCLC) cases from the β-Carotene and Retinol Efficacy Trial (CARET). Cox proportional hazards models were adjusted for age, sex, smoking status, pack years, and time between blood draw and diagnosis, and stratified by stage of disease. Models were run separately by histotype. Results Among SCLC cases, those with pre-diagnosis mdNLR in the highest quartile had 2.5-fold increased mortality compared to those in the lowest quartile. For each unit increase in pre-diagnosis mdNLR, we observed 22-23% increased mortality (SCLC-specific hazard ratio [HR] = 1.23, 95% confidence interval [CI]: 1.02, 1.48; all-cause HR = 1.22, 95% CI 1.01, 1.46). SCLC associations were strongest for current smokers at blood draw (interaction P values = 0.03). Increasing mdNLR was not associated with mortality among NSCLC overall, nor within adenocarcinoma (N = 148) or squamous cell carcinoma (N = 115) case groups. Conclusion Our findings suggest that increased mdNLR, representing a systemic inflammatory profile on average 4.5 years before a SCLC diagnosis, may be associated with mortality in heavy smokers who go on to develop SCLC but not NSCLC.
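Both estimates above (per-unit and highest-versus-lowest quartile) come from stage-stratified Cox models. A sketch of that structure on simulated data, with hypothetical column names standing in for the CARET variables:

```python
# Stage-stratified Cox models for mdNLR, per unit and top-vs-bottom quartile.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 360
df = pd.DataFrame({
    "mdnlr": rng.gamma(4, 0.5, n),             # methylation-derived NLR
    "age": rng.normal(63, 6, n),
    "pack_years": rng.normal(50, 15, n),
    "stage": rng.integers(1, 4, n),            # stratification variable
    "time": rng.exponential(18, n),            # survival time, months
    "death": rng.integers(0, 2, n),
})

# Per-unit hazard ratio (cf. the ~1.22-1.23 estimates above).
cph = CoxPHFitter().fit(df, duration_col="time", event_col="death",
                        strata=["stage"])
print(cph.hazard_ratios_["mdnlr"])

# Highest vs. lowest quartile (cf. the ~2.5-fold SCLC estimate).
q = df["mdnlr"].quantile([0.25, 0.75])
sub = df[(df["mdnlr"] <= q[0.25]) | (df["mdnlr"] > q[0.75])].copy()
sub["top_quartile"] = (sub["mdnlr"] > q[0.75]).astype(int)
cph_q = CoxPHFitter().fit(sub.drop(columns="mdnlr"),
                          duration_col="time", event_col="death",
                          strata=["stage"])
print(cph_q.hazard_ratios_["top_quartile"])
```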


2021
Vol 7 (1)
Author(s):  
Raquel Araujo-Gutierrez ◽  
Kalyan R. Chitturi ◽  
Jiaqiong Xu ◽  
Yuanchen Wang ◽  
Elizabeth Kinder ◽  
...  

Abstract Background Cancer therapy-related cardiac dysfunction (CTRD) is a major source of morbidity and mortality in long-term cancer survivors. Decreased global longitudinal strain (GLS) predicts decreased left ventricular ejection fraction (LVEF) in patients receiving anthracyclines, but knowledge regarding the clinical utility of baseline GLS in patients at low risk of CTRD is limited. Objectives The purpose of this study was to investigate whether baseline echocardiographic assessment of GLS before treatment with anthracyclines is predictive of CTRD in a broad cohort of patients with normal baseline LVEF. Methods Study participants comprised 188 patients at a single institution who underwent baseline 2-dimensional (2D) speckle-tracking echocardiography before treatment with anthracyclines and at least one follow-up echocardiogram 3 months after chemotherapy initiation. Patients with a baseline LVEF <55% were excluded from the analysis. The primary endpoint, CTRD, was defined as an absolute decline in LVEF >10% from baseline and an overall reduced LVEF <50%. Potential and known risk factors were evaluated using univariable and multivariable Cox proportional hazards regression analysis. Results Twenty-three patients (12.2%) developed CTRD. Among patients with CTRD, the mean GLS was -17.51% ± 2.77%. The optimal GLS cutoff point for CTRD was -18.05%, with a sensitivity of 0.70, a specificity of 0.70, and an area under the ROC curve of 0.70. After adjustment for cardiovascular and cancer therapy-related risk factors, both continuous GLS and decreased baseline GLS (≥-18%) were predictive of CTRD (adjusted hazard ratio 1.17; 95% confidence interval 1.00, 1.36; p = 0.044 for GLS; hazard ratio 3.54; 95% confidence interval 1.34, 9.35; p = 0.011 for decreased GLS), along with history of tobacco use, pre-chemotherapy systolic blood pressure, and cumulative anthracycline dose. Conclusions Baseline GLS and decreased baseline GLS were predictive of CTRD before anthracycline treatment in a cohort of cancer patients with a normal baseline LVEF. These data support the implementation of strain-protocol echocardiography in cardio-oncology practice to identify and monitor patients at elevated risk of CTRD.
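An "optimal cutoff" with paired sensitivity and specificity like this is commonly chosen by maximizing the Youden index on the ROC curve; the abstract does not state the exact procedure, so the sketch below is one plausible reconstruction on simulated GLS values (more-negative GLS indicates better strain).

```python
# Choosing a GLS cutoff by maximizing the Youden index (sensitivity +
# specificity - 1) on the ROC curve; simulated data only.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(3)
n = 188
ctrd = rng.binomial(1, 0.12, n)                  # 1 = developed CTRD
# CTRD cases tend to have less-negative (worse) baseline GLS.
gls = np.where(ctrd == 1, rng.normal(-17.5, 2.8, n),
                          rng.normal(-20.0, 2.5, n))

# Less-negative GLS predicts CTRD, so GLS itself is the risk score.
fpr, tpr, thresholds = roc_curve(ctrd, gls)
best = np.argmax(tpr - fpr)                      # Youden index maximizer
print("AUC:      ", roc_auc_score(ctrd, gls))
print("cutoff:   ", thresholds[best])            # analogous to -18.05%
print("sens/spec:", tpr[best], 1 - fpr[best])
```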


Author(s):  
Joshua R Ehrlich ◽  
Bonnielin K Swenor ◽  
Yunshu Zhou ◽  
Kenneth M Langa

Abstract Background Vision impairment (VI) is associated with incident cognitive decline and dementia. However, it is not known whether VI is associated only with the transition to cognitive impairment, or whether it is also associated with later transitions to dementia. Methods We used data from the population-based Aging, Demographics and Memory Study (ADAMS) to investigate the association of visual acuity impairment (VI; defined as binocular presenting visual acuity <20/40) with transitions from cognitively normal (CN) to cognitive impairment no dementia (CIND) and from CIND to dementia. Multivariable Cox proportional hazards models and logistic regression were used to model the association of VI with cognitive transitions, adjusted for covariates. Results There were 351 participants included in this study (weighted percentages: 45% male, 64% age 70-79 years) with a mean follow-up time of 4.1 years. In a multivariable model, the hazard of dementia was elevated among those with VI (HR=1.63, 95%CI=1.04-2.58). Participants with VI had a greater hazard of transitioning from CN to CIND (HR=1.86, 95%CI=1.09-3.18). However, among those with CIND and VI a similar percentage transitioned to dementia (48%) and remained CIND (52%); there was no significant association between VI and transitioning from CIND to dementia (HR=0.94, 95%CI=0.56-1.55). Using logistic regression models, the same associations between VI and cognitive transitions were identified. Conclusions Poor vision is associated with the development of CIND. The association of VI and dementia appears to be due to the higher risk of dementia among individuals with CIND. Findings may inform the design of future interventional studies.
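The abstract fits the same transition (e.g., CN to CIND) with both Cox and logistic models. A compact sketch of that dual analysis, with simulated stand-ins for the ADAMS variables:

```python
# The CN -> CIND transition modeled two ways, as in the abstract:
# a Cox model for time to transition and a logistic model for its occurrence.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 351
cn = pd.DataFrame({
    "vi": rng.integers(0, 2, n),        # presenting visual acuity <20/40 (0/1)
    "age": rng.normal(75, 6, n),
    "years": rng.exponential(4.1, n),   # follow-up time
    "cind": rng.integers(0, 2, n),      # transitioned to CIND (0/1)
})

cph = CoxPHFitter().fit(cn, duration_col="years", event_col="cind")
print("HR for VI:", cph.hazard_ratios_["vi"])    # cf. HR = 1.86 above

logit = smf.logit("cind ~ vi + age", data=cn).fit(disp=False)
print("OR for VI:", np.exp(logit.params["vi"]))
```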


2020
Author(s):  
Linlin Wang ◽  
Lihui Ge ◽  
Guofeng Zhang ◽  
Yi Ren ◽  
Yongyu Liu

Abstract Background: Whether lung segmentectomy is a safe and effective surgical treatment in patients with early non-small cell lung cancer (NSCLC) remains controversial. We therefore reviewed the clinicopathologic characteristics and survival outcomes of patients receiving a lobectomy vs. segmentectomy to treat early T (>2 cm and ≤3 cm) N0M0 NSCLC. Methods: We obtained data from the Surveillance, Epidemiology, and End Results (SEER) database for patients who underwent lobectomy or segmentectomy between 2004 and 2015. To reduce bias and imbalance between the treatment groups, propensity score matching (PSM) analysis was performed. We used Kaplan-Meier curves to estimate overall survival (OS) and lung cancer-specific survival (LCSS), performed univariate and multivariate Cox proportional hazards regression analyses to identify independent prognostic factors for OS and LCSS, and applied the Cox proportional hazards model to create forest plots. Results: A total of 5783 patients from the SEER database were included. Of these, 5531 patients underwent lobectomy, and 252 patients underwent segmentectomy. Before matching, both univariate and multivariate Cox regression analyses showed that patients who underwent lobectomy had better OS (hazard ratio [HR]: 1.561; 95% confidence interval [CI] 1.292-1.885; P<0.001) and LCSS (HR: 1.551; 95% CI 1.198-2.009; P=0.001) than patients who underwent segmentectomy. After matching, however, survival differences between the groups were not significant for OS (P=0.160) or LCSS (P=0.097). Regression analyses revealed that age, sex, lymph node dissection, and grade were independent predictors of OS and LCSS (P<0.05). Conclusions: For patients with stage T (>2 cm and ≤3 cm) N0M0 non-small cell lung cancer, segmentectomy can achieve OS and LCSS equivalent to lobectomy. These findings warrant confirmation in larger cohorts with long-term follow-up.
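The propensity score matching step can be sketched as a logistic model for the probability of segmentectomy followed by 1:1 greedy nearest-neighbor matching on the score. Covariates below are illustrative, not the full SEER covariate set:

```python
# Propensity score matching sketch: logistic model for the probability of
# segmentectomy, then 1:1 greedy nearest-neighbor matching without replacement.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 2000
df = pd.DataFrame({
    "segmentectomy": rng.binomial(1, 0.05, n),   # rare, as in the cohort
    "age": rng.normal(68, 9, n),
    "grade": rng.integers(1, 4, n),
    "tumor_size_mm": rng.uniform(21, 30, n),     # >2 cm and <=3 cm
})

ps_model = smf.logit("segmentectomy ~ age + C(grade) + tumor_size_mm",
                     data=df).fit(disp=False)
df["ps"] = ps_model.predict(df)

treated = df[df["segmentectomy"] == 1]
controls = df[df["segmentectomy"] == 0].copy()
matched_ids = []
for _, row in treated.iterrows():
    j = (controls["ps"] - row["ps"]).abs().idxmin()  # closest score
    matched_ids.append(j)
    controls = controls.drop(j)                      # without replacement
matched = pd.concat([treated, df.loc[matched_ids]])

# Covariate balance check after matching:
print(matched.groupby("segmentectomy")[["age", "tumor_size_mm", "ps"]].mean())
```

Survival in the matched cohort would then be compared with Kaplan-Meier curves and log-rank tests, as the abstract describes.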


2021
Vol 8 (2)
pp. 27-33
Author(s):  
Jiping Zeng ◽  
Ken Batai ◽  
Benjamin Lee

In this study, we aimed to evaluate the impact of surgical wait time (SWT) on outcomes of patients with renal cell carcinoma (RCC), and to investigate risk factors associated with prolonged SWT. Using the National Cancer Database, we retrospectively reviewed the records of patients with pT3 RCC treated with radical or partial nephrectomy between 2004 and 2014. The cohort was divided based on SWT. The primary outcome was 5-year overall survival (OS). Logistic regression analysis was used to investigate the risk factors associated with delayed surgery. Cox proportional hazards models were fitted to assess relations between SWT and 5-year OS after adjusting for confounding factors. A total of 22,653 patients were included in the analysis. Patients with SWT > 10 weeks had a higher occurrence of upstaging. Using logistic regression, we found that female patients, African-American or Spanish origin patients, treatment in academic or integrated network cancer centers, lack of insurance, median household income of <$38,000, and a Charlson-Deyo score of ≥1 were more likely to have prolonged SWT. SWT > 10 weeks was associated with decreased 5-year OS (hazard ratio [HR], 1.24; 95% confidence interval [CI], 1.15-1.33). This risk was not markedly attenuated after adjusting for confounding variables, including age, gender, race, insurance status, Charlson-Deyo score, tumor size, and surgical margin status (adjusted HR, 1.13; 95% CI, 1.04-1.24). In conclusion, the vast majority of patients underwent surgery within 10 weeks. There is a statistically significant trend of increasing SWT over the study period. SWT > 10 weeks is associated with decreased 5-year OS.
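The survival analysis reduces to dichotomizing SWT at 10 weeks and fitting unadjusted and adjusted Cox models. A minimal sketch with hypothetical NCDB-style field names and simulated values:

```python
# SWT dichotomized at 10 weeks; unadjusted, then adjusted, Cox models for OS.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(6)
n = 5000
df = pd.DataFrame({
    "swt_weeks": rng.gamma(3.0, 2.0, n),       # surgical wait time
    "age": rng.normal(62, 11, n),
    "charlson": rng.integers(0, 3, n),         # Charlson-Deyo score
    "tumor_size_cm": rng.gamma(5.0, 1.5, n),
    "months": np.minimum(rng.exponential(70, n), 60.0),  # censor at 5 years
    "death": rng.integers(0, 2, n),
})
df["swt_gt10"] = (df["swt_weeks"] > 10).astype(int)
cols = ["swt_gt10", "months", "death"]

# Unadjusted (cf. HR 1.24), then adjusted for confounders (cf. HR 1.13).
print(CoxPHFitter().fit(df[cols], "months", "death").hazard_ratios_)
print(CoxPHFitter()
      .fit(df[cols + ["age", "charlson", "tumor_size_cm"]], "months", "death")
      .hazard_ratios_)
```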


2021
pp. 1-11
Author(s):  
Dennis London ◽  
Dev N. Patel ◽  
Bernadine Donahue ◽  
Ralph E. Navarro ◽  
Jason Gurewitz ◽  
...  

OBJECTIVE Patients with non–small cell lung cancer (NSCLC) metastatic to the brain are living longer. The risk of new brain metastases when these patients stop systemic therapy is unknown. The authors hypothesized that the risk of new brain metastases remains constant for as long as patients are off systemic therapy. METHODS A prospectively collected registry of patients undergoing radiosurgery for brain metastases was analyzed. Of 606 patients with NSCLC, 63 met the inclusion criteria of discontinuing systemic therapy for at least 90 days and undergoing active surveillance. The risk factors for the development of new tumors were determined using Cox proportional hazards and recurrent events models. RESULTS The median duration to new brain metastases off systemic therapy was 16.0 months. The probability of developing an additional new tumor at 6, 12, and 18 months was 26%, 40%, and 53%, respectively. There were no additional new tumors 22 months after stopping therapy. Patients who discontinued therapy due to intolerance or progression of the disease and those with mutations in RAS or receptor tyrosine kinase (RTK) pathways (e.g., KRAS, EGFR) were more likely to develop new tumors (hazard ratio [HR] 2.25, 95% confidence interval [CI] 1.33–3.81, p = 2.5 × 10⁻³; HR 2.51, 95% CI 1.45–4.34, p = 9.8 × 10⁻⁴, respectively). CONCLUSIONS The rate of new brain metastases from NSCLC in patients off systemic therapy decreases over time and is uncommon 2 years after cessation of cancer therapy. Patients who stop therapy due to toxicity or who have RAS or RTK pathway mutations have a higher rate of new metastases and should be followed more closely.
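The abstract mentions "recurrent events models" for repeated new metastases per patient; one standard choice is the Andersen-Gill counting-process Cox model, which lifelines fits on (start, stop] interval data. The sketch below assumes that formulation (the authors' exact model is not specified), on simulated intervals:

```python
# Andersen-Gill recurrent-events Cox model on (start, stop] intervals:
# each row is one at-risk interval per patient; event=1 marks a new tumor.
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(7)
rows = []
for pid in range(63):                          # 63 patients, as in the cohort
    ras_rtk = int(rng.integers(0, 2))          # RAS/RTK pathway mutation
    stopped_for_progression = int(rng.integers(0, 2))
    t = 0.0
    for _ in range(int(rng.integers(1, 4))):   # 1-3 intervals per patient
        gap = rng.exponential(8.0)             # months between events
        rows.append({"id": pid, "start": t, "stop": t + gap,
                     "event": int(rng.random() < 0.6),
                     "ras_rtk": ras_rtk,
                     "stopped_for_progression": stopped_for_progression})
        t += gap
df = pd.DataFrame(rows)

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()   # HRs analogous to the ~2.25 and ~2.51 reported above
```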


2020
Vol 35 (Supplement_3)
Author(s):  
Tomoko Namba-Hamano ◽  
Takayuki Hamano ◽  
Masahiro Kyo ◽  
Yutaka Yamaguchi ◽  
Kawamura Masataka ◽  
...  

Abstract Background and Aims Few studies have evaluated long-term graft histology. The aims of this study were to reveal the histological characteristics peculiar to long-term grafts and to identify clinical manifestations and histological findings predicting graft survival after biopsy. Method In this retrospective study, we enrolled all allograft biopsies conducted in two institutions between 2002 and 2018 in recipients who had undergone transplantation 10 years before (n=107). The revised Banff criteria were used to evaluate histological findings. For the baseline cross-sectional study, we employed logistic regression analyses to explore clinical factors associated with each histological parameter. Restricted cubic spline functions were used for non-linear associations. In the longitudinal study, log-rank tests and Cox proportional hazards models were used to evaluate death-censored graft loss. Results Median (IQR) time after transplantation, recipient age at biopsy, and donor age were 13 (11, 19), 49 (42, 59), and 51 (43, 58) years, respectively. Median (IQR) eGFR and proteinuria at biopsy were 29 (24, 40) mL/min/1.73m2 and 0.46 (0.18, 0.80) g/day, respectively. Seventeen patients (16%) had an FSGS lesion, which was the most common glomerular abnormality in this cohort. Figure 1 shows the distribution of histological parameters. Donor age, in addition to proteinuria, was found to be associated with the presence of an FSGS lesion [odds ratio 2.37 (95% CI 1.16-4.88) per 10 years]. In a non-linear model, the estimated prevalence of FSGS lesions increased in grafts from donors older than 40 years (Figure 2). Logistic regression analyses revealed that eGFR at biopsy and transplantation vintage were associated with the presence of ci [odds ratio 0.48 (95% CI 0.32-0.71) per 10 mL/min/1.73m2, and 1.17 (1.05-1.30) per 10 years, respectively]. We also found that eGFR at biopsy and proteinuria were associated with the presence of ct [odds ratio 0.40 (95% CI 0.26-0.63) per 10 mL/min/1.73m2, and 2.02 (1.07-3.84) per 1 g/day, respectively]. Figure 3 shows Kaplan-Meier curves for death-censored graft survival after biopsy. During 3.5 years of observation, 33% of patients lost graft function. Log-rank tests revealed that the risk of graft loss was increased in groups with the presence of ct (p=0.001), the presence of an FSGS lesion (p=0.0001), and a higher cg score (p<0.0001). In the multivariate Cox proportional hazards model, the highest cg score, in addition to greater proteinuria and lower eGFR at biopsy, was associated with a higher risk of graft loss after biopsy [hazard ratio 3.26 (95% CI 1.25-8.53) as compared to cg0, 1.64 (1.09-2.46) per g/day, and 0.39 (0.24-0.64) per 10 mL/min/1.73m2, respectively]. Conclusion Grafts from older donors, especially those older than 40 years, more frequently have FSGS lesions. Only the cg score, not the ct score or FSGS lesion, predicts graft survival after biopsy in patients with a long transplantation vintage, independent of clinical findings.
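The non-linear donor-age model uses restricted cubic splines; in Python, a comparable fit puts patsy's natural cubic spline basis cr() inside a statsmodels logistic formula. The data below are simulated to mimic the reported rise in FSGS prevalence above donor age 40; column names are illustrative:

```python
# Logistic model for FSGS with a restricted (natural) cubic spline on donor
# age via patsy's cr(); proteinuria enters linearly.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
n = 107
df = pd.DataFrame({
    "donor_age": rng.uniform(25, 70, n),
    "proteinuria": rng.gamma(2.0, 0.3, n),     # g/day
})
# Simulate prevalence rising above donor age ~40.
logit_p = 0.08 * (df["donor_age"] - 40) + 0.8 * df["proteinuria"] - 2
df["fsgs"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

fit = smf.logit("fsgs ~ cr(donor_age, df=4) + proteinuria",
                data=df).fit(disp=False)

# Predicted prevalence across donor age, proteinuria fixed at its median.
grid = pd.DataFrame({"donor_age": np.linspace(25, 70, 10),
                     "proteinuria": df["proteinuria"].median()})
print(fit.predict(grid).round(2))              # non-linear rise after ~40
```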


2001
Vol 19 (6)
pp. 1671-1675
Author(s):  
Shari Gelber ◽  
Alan S. Coates ◽  
Aron Goldhirsch ◽  
Monica Castiglione-Gertsch ◽  
Gianluigi Marini ◽  
...  

PURPOSE: To evaluate the impact of subsequent pregnancy on the prognosis of patients with early breast cancer. PATIENTS AND METHODS: One hundred eight patients who became pregnant after diagnosis of early-stage breast cancer were identified in institutions participating in International Breast Cancer Study Group (IBCSG) studies. Fourteen had relapse of breast cancer before their first subsequent pregnancy. The remaining 94 patients (including eight who relapsed during pregnancy) formed the study group reported here. A comparison group of 188 patients was obtained by randomly selecting from the IBCSG database, for each pregnant patient, two patients matched for nodal status, tumor size, age, and year of diagnosis who were free of relapse for at least as long as the time between breast cancer diagnosis and completion of pregnancy. Survival comparisons used Cox proportional hazards regression models. RESULTS: Overall 5- and 10-year survival percentages (± SE) measured from the diagnosis of early-stage breast cancer among the 94 study group patients were 92% ± 3% and 86% ± 4%, respectively. For the matched comparison group, survival was 85% ± 3% at 5 years and 74% ± 4% at 10 years (risk ratio, 0.44; 95% confidence interval, 0.21 to 0.96; P = .04). CONCLUSION: Subsequent pregnancy does not adversely affect the prognosis of early-stage breast cancer. The superior survival seen in this and other controlled series may merely reflect a healthy-patient selection bias, but it is also consistent with an antitumor effect of the pregnancy.
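The comparison-group construction is a form of risk-set matching: each case's controls must be relapse-free at least as long as the case's diagnosis-to-pregnancy interval. A sketch of that sampling logic on simulated data (field names are illustrative, and matching here is on nodal status only for brevity):

```python
# Risk-set-style matched sampling: two relapse-free controls per case,
# eligible only if relapse-free at least as long as the case's
# diagnosis-to-pregnancy interval.
import numpy as np
import pandas as pd

rng = np.random.default_rng(9)
pool = pd.DataFrame({
    "nodal_status": rng.integers(0, 2, 5000),
    "relapse_free_years": rng.exponential(8.0, 5000),
})
cases = pd.DataFrame({
    "nodal_status": rng.integers(0, 2, 94),
    "years_to_pregnancy": rng.uniform(1.0, 5.0, 94),
})

controls = []
for _, c in cases.iterrows():
    eligible = pool[(pool["nodal_status"] == c["nodal_status"]) &
                    (pool["relapse_free_years"] >= c["years_to_pregnancy"])]
    pick = eligible.sample(2, random_state=int(rng.integers(1 << 30)))
    controls.append(pick)
    pool = pool.drop(pick.index)               # sample without replacement
comparison = pd.concat(controls)
print(len(comparison))                         # 188 matched controls
```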

