Atrial fibrillation (AF) during sepsis is associated with increased morbidity and mortality, but practice patterns and outcomes associated with rate- and rhythm-targeted treatments in this setting are unclear.
This was a retrospective cohort study using enhanced billing data from approximately 20% of United States hospitals. We identified factors associated with IV AF treatments (β-blockers [BBs], calcium channel blockers [CCBs], digoxin, or amiodarone) during sepsis. We used propensity score matching and instrumental variable approaches to compare mortality between AF treatments.
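To illustrate the propensity-score matching approach named above, the following is a minimal sketch of the general technique, not the study's actual analysis: the cohort, the single confounder, the treatment-assignment model, and the caliper value are all invented for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy cohort (illustration only): 1 = beta-blocker, 0 = CCB.
n = 1000
severity = rng.normal(size=n)                # a single confounder, e.g. illness severity
p_treat = 1 / (1 + np.exp(-0.8 * severity))  # treatment assignment depends on severity
treated = rng.random(n) < p_treat

# Step 1: the propensity score is P(treatment | measured confounders).
# The true model is used here for brevity; in practice it is estimated,
# typically with logistic regression on the measured confounders.
ps = p_treat

# Step 2: greedy 1:1 nearest-neighbor matching within a caliper.
caliper = 0.05
t_idx = np.flatnonzero(treated)
c_idx = np.flatnonzero(~treated)
available = set(c_idx.tolist())
pairs = []
for i in t_idx:
    if not available:
        break
    pool = np.fromiter(available, dtype=int)
    j = pool[np.argmin(np.abs(ps[pool] - ps[i]))]
    if abs(ps[j] - ps[i]) <= caliper:
        pairs.append((i, j))
        available.remove(j)

matched_t = np.array([p[0] for p in pairs])
matched_c = np.array([p[1] for p in pairs])

# After matching, the propensity score (and hence the measured confounders)
# should be balanced between the matched treatment groups.
print(len(pairs), round(abs(ps[matched_t].mean() - ps[matched_c].mean()), 4))
```

Outcomes (e.g. hospital mortality) are then compared only between the matched pairs, which mimics randomization with respect to the measured confounders; unmeasured confounding remains possible, which is one motivation for the complementary instrumental variable analysis.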
Among 39,693 patients with AF during sepsis, mean age was 77 ± 11 years, 49% were women, and 76% were white. CCBs were the most commonly selected initial AF treatment during sepsis (14,202 patients [36%]), followed by BBs (11,290 [28%]), digoxin (7,937 [20%]), and amiodarone (6,264 [16%]). Initial AF treatment selection differed according to geographic location, hospital teaching status, and physician specialty. In propensity-matched analyses, BBs were associated with lower hospital mortality when compared with CCBs (n = 18,720; relative risk [RR], 0.92; 95% CI, 0.86-0.97), digoxin (n = 13,994; RR, 0.79; 95% CI, 0.75-0.85), and amiodarone (n = 5,378; RR, 0.64; 95% CI, 0.61-0.69). Instrumental variable analysis showed similar results (adjusted RR fifth quintile vs first quintile of hospital BB use rate, 0.67; 95% CI, 0.58-0.79). Results were similar among subgroups with new-onset or preexisting AF, heart failure, vasopressor-dependent shock, or hypertension.
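The relative risks above compare hospital mortality between matched treatment groups. As a worked illustration of how a relative risk and its large-sample 95% CI are computed, here is the standard log-scale Wald formula applied to invented 2×2 counts (these are not the study's data):

```python
import math

# Hypothetical 2x2 counts (NOT the study's data): deaths / group sizes.
deaths_bb, n_bb = 180, 1000    # beta-blocker group
deaths_ccb, n_ccb = 200, 1000  # calcium channel blocker group

risk_bb = deaths_bb / n_bb
risk_ccb = deaths_ccb / n_ccb
rr = risk_bb / risk_ccb        # relative risk of death, BB vs CCB

# Standard large-sample standard error of log(RR), then a Wald 95% CI.
se_log_rr = math.sqrt(1/deaths_bb - 1/n_bb + 1/deaths_ccb - 1/n_ccb)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)

print(f"RR = {rr:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
# -> RR = 0.90, 95% CI 0.75-1.08
```

An RR below 1 with a CI excluding 1 (as in the study's BB comparisons) indicates lower mortality in the BB group; in this invented example the CI crosses 1, so the difference would not be statistically significant.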
Although CCBs were the most frequently used IV medications for AF during sepsis, BBs were associated with lower hospital mortality in all subgroups analyzed. Our findings provide rationale for clinical trials comparing the effectiveness of AF rate- and rhythm-targeted treatments during sepsis.