It has been a long journey from healthcare at the founding of America to our current healthcare environment, and the path to today's body of knowledge about the human body and healthcare in the United States was full of challenges. In the early days of the United States, as settlers from Europe made America their home from around 1607 through the late 1700s, they did not bring enough people trained as physicians, surgeons, or apothecaries (early pharmacists who prepared and sold medicines and drugs) to care for all the settlers who would need them. There were no hospitals and few doctors' offices.
IN CONTEXT
Family members cared for those who were sick; healers also may have visited those who were ill. Healers were often pastors and ministers who had little to no training in medicine, although they may have known about plants and herbs that were thought to help with certain illnesses. If the illness was contagious, other family members would also become sick, and it was not unusual for whole families to die of infectious diseases. There was little knowledge about disease prevention or ways to minimize the spread of disease. There were no formal medicines, and opium or alcohol was used to relieve pain.
Some of the common diseases at this time were mumps, measles, smallpox, yellow fever, diphtheria, scarlet fever, and influenza.
The United States made real progress in medical care during and after the American Revolution (1775–1783). This type of progress often occurred during and after wars, as medical personnel learned from caring for sick and injured soldiers. They began to understand the concept of contamination and the need for cleanliness and sanitation to reduce infection and the spread of disease. Smallpox inoculation was used to help soldiers build immunity to the disease. Medical personnel also experimented with draining infections and drawing blood. They taught their new discoveries and findings to doctors and surgeons beyond the battlegrounds, and progress in medicine entered a new chapter.
By the mid-1700s, it was apparent that larger cities needed hospitals, and the first, Pennsylvania Hospital, opened in Philadelphia in 1751.
You may be surprised to learn that Benjamin Franklin had a large role in opening the hospital, and he became the hospital's secretary. Many of the hospital's earliest records about patients are in his handwriting. He made notes about each patient's name, address, illness, and admission and discharge dates. Other large hospitals followed, with New York Hospital opening in 1771 and Massachusetts General Hospital in Boston opening in 1821.
As settlers began heading west, they brought their diseases with them. As in the 1600s, they did not plan for illness, and not every group of settlers included a medical practitioner. It was not unusual for settlers to have to bury some of their group at every stop along the way.
During this time, demand for medical practitioners far exceeded the supply. Medical training was often informal, provided through apprenticeships rather than schooling, and it was unregulated. There were no medical education standards, so the quality of training depended entirely on who was doing the teaching. As a consequence, many private schools opened that graduated medical students in 6 months or less. They were considered by many to be diploma mills, meaning their main goal was to make money by moving people in and out quickly with a diploma. By the late 1800s, there were an estimated 75 of these schools in the U.S.
By the 1850s, scientific knowledge was developing rapidly. There was substantial knowledge of kidney, heart, and chest diseases. New laboratory techniques were being developed to help with diagnosing symptoms and diseases. The stethoscope had been invented, and anesthesia had been introduced. Just like the American Revolution, the Civil War (1861–1865) brought a major evolution in American medicine. There were significant advancements in surgical procedures, and the official role of nursing made a profound difference in the way medicine was practiced, for the soldiers and beyond.
While the 19th century brought significant changes to the healthcare landscape, the 20th century ushered in a new dedication to quality care, standards of medical training and care, and the new concept of health insurance.
In 1906, the Carnegie Foundation hired a man named Abraham Flexner to study the status of medical education in the U.S. From 1906 to 1910, Flexner traveled the country surveying medical schools against a pre-established set of criteria considered necessary for quality medical training. At the end of his study, he published the Flexner Report, which described many inferior schools, listed the schools that did not meet the criteria, and suggested closing many of them due to oversupply and underperformance. As a result, many of the schools closed, and the expectations for medical schools were raised to a much higher standard.
During World War II, employers tried new ways to attract and retain workers, and one of the new ideas was to offer a form of health insurance as a benefit. Even earlier, it had become popular for hospitals to offer a certain number of days of hospital care, if needed, for a small monthly “insurance” fee; such a plan is said to have saved Baylor Hospital and University from bankruptcy in 1923. As these plans became more robust, hospitals struggled to manage them and asked the American Hospital Association (AHA) to help. The AHA stepped in, and by 1933, it was managing the hospitals' insurance plans. In 1946, this branch of the AHA was renamed Blue Cross, marking the birth of the Blue Cross and Blue Shield that we know today. Blue Cross broke away from the AHA in 1972 and continued to evolve. From the ideas of Blue Cross, other health insurance organizations began to appear in the industry, and private health insurance through employment became the ideal and the norm.
IN CONTEXT
The American Hospital Association began as the Association of Hospital Superintendents in 1899 with an informal gathering of nine hospital administrators (called superintendents at that time). In its early years, it was a private club that admitted only hospital superintendents as members. By 1906, the name had changed to the American Hospital Association, but it continued as a private membership group for 12 more years. The group's mission was “the promotion of economy and efficiency in hospital management” (American Hospital Association, n.d.). In 1913, the AHA allowed other leaders within hospitals to become members, and by 1918, it allowed hospitals to hold an institutional membership. The organization continued to evolve and in 1936 began publishing a journal called Hospitals. The AHA championed the first college degree programs for hospital administration, and in 1945, it published the first directory of hospitals in the U.S.
Through the years, the organization advocated for many quality-of-care initiatives as well as for quality education for hospital executives. It supported the establishment of Medicare and Medicaid and formed many committees to address other issues affecting hospitals and healthcare. In 1973, it established the first Patient's Bill of Rights, which you will learn more about later in the course. In 1995, it adopted a new mission statement which is still current today: “To advance the health of individuals and communities. AHA leads, represents and serves hospitals, health systems and other related organizations that are accountable to the community and committed to health improvement” (American Hospital Association, n.d.).
The mid-20th century was marked by significant federal involvement in healthcare. The creation of Medicare and Medicaid in 1965 under President Lyndon B. Johnson was a landmark achievement. Medicare provides health insurance for Americans ages 65 and older, while Medicaid extends coverage to low-income individuals and families. These programs significantly expanded access to healthcare and reduced the financial burden on vulnerable populations. Their establishment marked a shift toward greater federal responsibility for healthcare.
The introduction of the Health Insurance Portability and Accountability Act (HIPAA) (a U.S. law enacted to protect patient health information and ensure privacy and security in healthcare) in 1996 was a great disruptor for healthcare organizations. It required years of preparation because of the need to develop new policies and procedures and to hire staff who could specialize in privacy and security. It also required significant training of all staff. In fact, annual HIPAA training is still required for all employees working in healthcare facilities.
The late 20th and early 21st centuries have been characterized by efforts to address the challenges of rising costs, disparities in access, and the need for quality improvement. A major trend in the early 21st century was the evolution of technology in healthcare.
Electronic health records (EHRs) changed the way patient health information is managed. Replacing the paper medical records used for decades with EHRs has significantly improved access to patient information, which can lead directly to better care. It has also eliminated illegible handwritten records, many monotonous clerical tasks, and the large amount of space needed to manage and house paper records. Other digital tools have the potential to improve care coordination, enhance patient engagement, and reduce costs. Artificial intelligence is being used to help physicians make quicker and more accurate diagnoses, analyze medical imaging data, and surface robust information about clinical and administrative challenges, all of which assist with decision making.
The Affordable Care Act (ACA) of 2010, often called Obamacare, was a reform unlike any other in the history of the U.S. The goal was to reduce the number of uninsured Americans and make access to healthcare more equitable for all. The ACA required most Americans to have health insurance or pay a penalty if they did not. People who were insured through their employer and happy with the cost and coverage did not have to make any changes, while those who had no coverage or wanted to explore options could shop for an insurance plan through a newly established healthcare marketplace. The ACA also established that those with preexisting conditions could not be denied coverage or forced to pay a higher premium than others in the same plan.
IN CONTEXT
While the ACA has had a significant impact on reducing the number of uninsured Americans, it was not without complications. Some unexpected consequences were that many insurance providers raised their prices in the marketplace, many healthcare consumers were still unable to afford an insurance plan, and many of the plans had very high deductibles. These issues remain, and many advocates are working to make improvements.
The cost of healthcare in the U.S. remains a hot topic in economics, society, and politics. Because healthcare costs and expenditures in the United States are among the highest in the world, many agree that the system is broken and that solutions must be found. At the same time, equity and disparities in access to care continue to be widely discussed and studied.
Source: THIS TUTORIAL WAS AUTHORED BY SOPHIA LEARNING. PLEASE SEE OUR TERMS OF USE.
REFERENCES
American Hospital Association. (n.d.). American Hospital Association timeline. www.aha.org/about/history
Public Broadcasting Service. (n.d.). Guns, germs & steel: Variables: Smallpox. PBS. www.pbs.org/gunsgermssteel/variables/smallpox.html