Among the 1 million patients hospitalized each year for sepsis in the United States, nearly one in four will experience atrial fibrillation (AF). Although β-blockers, calcium channel blockers, digoxin, and amiodarone have been used for decades to treat sepsis-associated AF, the effect of medication choice on outcomes has never been studied.
In this issue of CHEST (see page 74), Walkey et al examined this important question using an administrative database capturing 20% of all discharges from nonfederal US hospitals from 2010 to 2013. The authors identified patients receiving intravenous treatment for sepsis-associated AF via (1) International Classification of Diseases, Ninth Revision, Clinical Modification codes for sepsis and AF and (2) pharmacy billing records for administration of antibiotics and intravenous calcium channel blockers, β-blockers, digoxin, or amiodarone. Among > 500,000 patients with a first hospitalization for sepsis, 20% had AF, and 35% of those with AF received intravenous therapy. Initial treatment was most commonly a calcium channel blocker (36%), followed by a β-blocker (28%), digoxin (20%), and amiodarone (16%). Choice of therapy was linked to patient characteristics (eg, higher rates of digoxin and amiodarone use in patients receiving vasopressors) but also, inexplicably, to geographic region and to hospital characteristics unrelated to patient condition or prognosis. For example, hospitals in the Northeast showed a strong preference for β-blockers over calcium channel blockers, whereas hospitals in the West showed an equally strong preference in the opposite direction. After propensity matching, in-hospital mortality was lower with β-blockers than with calcium channel blockers (risk ratio, 0.92; 95% CI, 0.86-0.97), digoxin (risk ratio, 0.79; 95% CI, 0.75-0.85), or amiodarone (risk ratio, 0.64; 95% CI, 0.61-0.69).
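For readers who want to see the arithmetic behind such estimates, the sketch below computes a risk ratio with a Wald 95% CI on the log scale from a matched 2 × 2 table. The counts are hypothetical, chosen only to yield a point estimate near the reported β-blocker vs calcium channel blocker comparison; they are not the study's data, and the authors' propensity-matched analysis is considerably more involved.

```python
import math

def risk_ratio_ci(events_exposed, n_exposed, events_control, n_control, z=1.96):
    """Risk ratio with a Wald 95% CI computed on the log scale."""
    risk_exp = events_exposed / n_exposed
    risk_ctl = events_control / n_control
    rr = risk_exp / risk_ctl
    # Standard error of ln(RR) for a 2 x 2 table
    se_log_rr = math.sqrt(
        1 / events_exposed - 1 / n_exposed
        + 1 / events_control - 1 / n_control
    )
    lower = math.exp(math.log(rr) - z * se_log_rr)
    upper = math.exp(math.log(rr) + z * se_log_rr)
    return rr, lower, upper

# Hypothetical counts (illustrative only, not the study's data):
# 2,000 deaths among 10,000 β-blocker recipients vs 2,170 deaths
# among 10,000 matched calcium channel blocker recipients.
rr, lo, hi = risk_ratio_ci(2000, 10000, 2170, 10000)
print(f"Risk ratio, {rr:.2f}; 95% CI, {lo:.2f}-{hi:.2f}")
```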