The profession of medicine has come a long way in public esteem in the United States.2 In the early days of the republic, physicians strove for recognition as elite healers who deserved a monopoly on medical practice. In practice, however, healers of all types existed on more or less equal footing, for several reasons. First, democratic ideals encouraged people to consider themselves equal to anyone else. Status was earned, not hereditary. And so natural healers asserted that they had as much right as anyone to diagnose and treat disease. Second, doctors did not have much to offer sick people. Scientific understanding of disease was weak, and treatments were ineffective and, in the case of bloodletting, often dangerous. Third, the public, especially during the decades of Jacksonian democracy, did not recognize physicians’ right to set the standards of medical practice and judge one another. Guides to self-care were bestsellers, in part because the US economy was weak and few could afford medical care. Transportation was painfully slow, which limited access to physicians and raised the cost of their services. To make a living in this world of do-it-yourself health care, physicians developed side occupations such as selling medicines, fruits, and vegetables, which further blurred the distinction between professionals and other people.