The Evolution of the U.S. Healthcare System
Overview
Between the years 1750 and 2000, healthcare in the United States evolved from a simple system of home remedies and itinerant doctors with little training to a complex, scientific, technological, and bureaucratic system often called the "medical industrial complex." The complex is built on medical science and technology and the authority of medical professionals. The evolution of this complex includes the acceptance of the "germ theory" as the cause of disease, professionalization of doctors, technological advancements in treating disease, the rise of great institutions of medical training and healing, and the advent of medical insurance. Governmental institutions, controls, health care programs, drug regulations, and medical insurance also evolved during this period. Most recently, the healthcare system has seen the growth of corporations whose business is making a profit from healthcare.
Background
Prior to 1800, medicine in the United States was a "family affair." Women were expected to take care of illnesses within the family, and doctors were summoned only for very serious, life-threatening illnesses. Called "domestic medicine," early American medical practice was a combination of home remedies and a few scientifically practiced procedures carried out by doctors who, without the kind of credentials they must now have, traveled extensively as they practiced medicine.
The practice of midwifery—attending women in childbirth and delivering babies—was a common profession for women, since most births took place at home. Until the mid-eighteenth century, Western medicine was based on the ancient Greek principle of "four humors"—blood, phlegm, black bile, and yellow bile. Balance among the humors was the key to health; disease was thought to be caused by too much or too little of these fluids. The healing powers of hot, cold, dry, and wet preparations, and of a variety of plants and herbs, were also highly regarded. When needed, people called on "bone-setters" and surgeons, most of whom had no formal training.
Physicians with medical degrees and scientific training began showing up on the American landscape in the late colonial period. The University of Pennsylvania opened the first medical college in 1765 and the Massachusetts Medical Society (publishers of today's New England Journal of Medicine), incorporated in 1781, sought to license physicians. Medical schools were often opened by physicians who wanted to improve American medicine and raise the medical profession to the high status it enjoyed in Europe and in England. With scientific training, doctors became more authoritative and practiced medicine as small entrepreneurs, charging a fee for their services.
In the early 1800s, both in Europe and in the United States, physicians with formal medical training began to stress the idea that germs and social conditions might cause and spread disease, especially in cities. Many municipalities created "dispensaries" that dispensed medicines to the poor and offered free physician services. Epidemics of cholera, diphtheria, tuberculosis, and yellow fever, along with concerns about sanitation and hygiene, led many city governments to create departments of health. New advances in studying bacteria were put to practical use as "germ theory" became the accepted explanation of illness. It was in the face of epidemics and poor sanitation that government-sponsored public health and private healthcare began to systematically diverge.
Impact
As America became increasingly urbanized in the mid 1800s, hospitals, first built by city governments to treat the poor, began treating the not-so-poor. Doctors, with increased authority and power, stopped traveling to their sickest patients and began treating them all under one roof. Unlike hospitals in Europe where patients were treated in large wards, American patients who could pay were treated in smaller, often private rooms.
In the years following the Civil War (1865), hospitals became either public or private. More medical schools and institutions devoted to medical research emerged. A trend toward requiring more training of physicians led Johns Hopkins University's medical school to require, beginning in 1893, that all medical students arrive with a four-year degree and spend another four years becoming a physician.
The earliest efforts of doctors to create a unified professional organization began in the mid-1800s, and in 1846 the American Medical Association (AMA) was established. Although it had little early impact on American medicine, by the next century the AMA had gained great influence over the politics and practice of medicine. An early AMA victory was the regulation of drugs.
How has the evolution of medical technology, graduate medical education, and the professionalization of medical and nursing staff affected the delivery of care?
At the beginning of the 21st century, health care facilities supported continuing professional development and ongoing education to keep pace with the ever-changing management of the health care system and medical practice. These efforts came under scrutiny, however, because of problems with cost of care, redundancy, and continuity. To address these problems, the Advisory Committee on Interdisciplinary, Community-Based Linkages (ACICBL) called for integrating the five central competencies needed to transform the quality, safety, and education of patient care. The five competencies comprise the use of information technology, patient-centered care, the employment of evidence-based practice, work in interdisciplinary teams, and the application of quality improvement.
The healthcare system has also improved greatly with changing technology. Technology has changed many aspects of life, creating breakthroughs in biomedical science, communication, medicine, and informatics (the collection, storage, and use of patient data). Research, the development of advanced medical equipment, and advances in treatment methods have allowed medical providers to use innovative means and new tools to practice medicine. Telehealth has played a great role in enhancing patient education, promoting self-care among patients with chronic illness, and supporting good health by monitoring individual lifestyle. At the same time, the increase in treatment accessibility has increased the chances of breaches and security compromises. The integration of health information technology into the health care system opens up opportunities for research and exploration that make the health care system more efficient than ever before. Technology has contributed greatly to enhancing patient care: care is more reliable because doctors and nurses can now use hand-held devices connected to the healthcare system to enter, access, and instantly update data in a patient's medical record (Bahouth, Esposito-Herr, & Babineau, 2007).
Patient assessments and test results, including lab results, EHR entries, vital sign records, and other important patient data, can now be viewed in one menu, which is a true transformation. Barcode readers are also being used to improve medication administration in hospitals. This is mainly meant to minimize medication errors, since the technology matches each scanned prescription against the medical database. Barcode technology and the ability to retrieve patient records on hand-held devices also alert nurses and doctors to patients' allergies and other possible medical complications. Technology has therefore enhanced quality of care and efficiency and reduced medical error, especially with regard to data and patient history. Together, these changes demonstrate a deep transformation of the healthcare system toward efficient, reliable care.
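As an illustration of the barcode-verification logic described above, here is a minimal sketch in Python of how a scanned medication might be checked against a patient's record; the record structure, drug names, doses, and function names are hypothetical examples, not the interface of any particular hospital system.

from dataclasses import dataclass, field

@dataclass
class PatientRecord:
    # Hypothetical, simplified view of an electronic medical record.
    patient_id: str
    allergies: set = field(default_factory=set)        # known drug allergies
    prescriptions: dict = field(default_factory=dict)  # drug name -> prescribed dose

def verify_administration(record, scanned_drug, scanned_dose):
    # Compare a scanned barcode (drug and dose) against the patient's record.
    # Returns a list of alert messages; an empty list means the check passed.
    alerts = []
    if scanned_drug not in record.prescriptions:
        alerts.append(f"{scanned_drug} is not prescribed for patient {record.patient_id}")
    elif record.prescriptions[scanned_drug] != scanned_dose:
        alerts.append(f"Dose mismatch for {scanned_drug}: prescribed "
                      f"{record.prescriptions[scanned_drug]}, scanned {scanned_dose}")
    if scanned_drug in record.allergies:
        alerts.append(f"ALLERGY ALERT: patient {record.patient_id} is allergic to {scanned_drug}")
    return alerts

# Example: a nurse scans a wristband and a medication barcode.
patient = PatientRecord(patient_id="12345",
                        allergies={"penicillin"},
                        prescriptions={"penicillin": "500 mg", "metformin": "850 mg"})
for message in verify_administration(patient, "penicillin", "250 mg"):
    print(message)  # prints a dose-mismatch warning and an allergy alert

In a real deployment this check would run against the live medical record database rather than an in-memory object, but the matching-and-alert pattern is the same idea the paragraph above describes.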
Why has the United States been unsuccessful in evolving the current health care system into a national health care system?
The United States is the only western industrialized nation that fails to provide universal coverage and the only nation where health care for the majority of the population is financed by for-profit, minimally regulated private insurance companies. These arrangements leave one-sixth of the population uninsured at any given time, and they leave others at risk of losing insurance as a result of normal life course events. Political theorists of the welfare state usually attribute the failure of national health insurance in the United States to broader forces of American political development, but they ignore the distinctive character of the health care financing arrangements that do exist. Medical sociologists emphasize the way that physicians parlayed their professional expertise into legal, institutional, and economic power, but not the way this power was asserted in the political arena. A better explanation is stakeholder mobilization, which has been the primary obstacle to national health insurance. The evidence supports the argument that powerful stakeholder groups, first the American Medical Association, then organizations of insurance companies and employer groups, have been able to defeat every effort to enact national health insurance across an entire century because they had superior resources and an organizational structure that closely mirrored the federated arrangements of the American state. The exception occurred when the AFL-CIO, with its national leadership, state federations, and union locals, mobilized on behalf of Medicare.