Miller Report for the Week of July 18th, 2022; by William Miller, M.D.
Given current negotiations between Adventist Health and Anthem Blue Cross, I thought it might be interesting to examine how our health care system came to rely upon health insurance as an integral component of providing healthcare services in America. This is the first in a series; subsequent articles will examine Medicare and other forms of single-payer healthcare, including the U.S. Veterans Health Administration and the Canadian healthcare system.
The story begins in 1850 when the Franklin Health Assurance Company of Massachusetts first offered “accident insurance” to cover disabilities that might occur as a result of injuries while traveling by railroad or steamboat. Think modern “airplane travelers’ insurance”. This was so popular that by 1866 there were over 60 companies in the US offering such insurance. This eventually led to the addition of insurance against disability from other causes, including illnesses. By 1911, disability-from-illness policies were common.
In the 1920s, hospitals started offering their own pre-paid plans, in which participants paid a fixed fee in advance and the hospital covered the cost of treating their illnesses, provided the care was given at that hospital. Coverage of surgical procedures was emphasized in particular, because surgery was the service hospitals most commonly provided; most other healthcare was delivered outside the hospital setting. This arrangement would eventually lead to the disparity we see today between high reimbursement for surgical procedures and relatively low reimbursement for primary care, including health maintenance, preventive care, pediatric care, and psychiatric care.
Another pivotal event of the 1920s was that insurance companies offering healthcare policies began to band together in associations. The driver was that if a member company experienced an unusually high level of payouts in a given year, the association would shore it up to prevent bankruptcy. Recall that, at this point, the idea behind insurance was that people paid into the insurance company as a collective, so that if and when any one participant needed expensive care, payment would be made out of the collective pot of money. The association thus acted as a sort of insurance for the insurance companies themselves. The first such association, Blue Cross, was founded in 1929.
Around this time, employers began offering coverage of healthcare expenses as a benefit of employment. Initially, the employer shouldered the risk of such expenses, but soon companies sprang up to insure employers: the insurance company would bear the risk, and the employers would pay premiums. In 1939, insurance companies offering this kind of coverage came together to form Blue Shield, an association that initially focused on lumber and mining operations in California.
At the start, insurance companies reimbursed patients for the expenses they incurred; healthcare providers, such as doctors and hospitals, were thus paid fee-for-service. In other words, when a doctor or hospital billed an insured patient, the patient paid the bill directly, and the insurance company then reimbursed the patient. The cost of healthcare was thereby moderated by market forces: charges were controlled by supply and demand and by what the market would bear, the basic underpinnings of our whole concept of a capitalist system.
However, two things began to change in the 1970s. The first was technology. The 1970s saw the widespread adoption of mechanical ventilators and, with them, dedicated units within hospitals providing intensive, life-supporting care. Medical technology began to advance at an explosive rate. Advancements in computing gave us computed axial tomography, or CAT scans, which entered clinical use in hospitals in the early 1970s; MRI and PET scanning soon followed in the 1980s. Advancements in cancer treatment gave us new chemotherapy options, and the number of antibiotics available to treat infections expanded. Implantable devices such as pacemakers and artificial joints became mainstream in the 1980s. A major breakthrough in organ transplantation came in 1983 with the approval of cyclosporine, an immune suppressant that prevents rejection of the organ by the recipient's immune system; before that, transplant recipients often survived only a short time before rejection set in. Bone marrow transplantation also matured into a treatment for some of the most aggressive cancers.

All of these new technologies came at tremendous cost. What is more, in a capitalist system, hospitals were driven to compete with one another by offering the newest and greatest technologies at ever-rising expense, and hospital bills quickly rose to cover those expenses. Whole new physician specialties also developed around the new technologies: critical care medicine specialists in ICUs, cardiothoracic surgeons performing heart valve replacements and coronary artery bypass surgeries, and orthopedic surgeons specializing in total joint replacements.
The second big change was the rise of malpractice litigation. Previously, people understood that nothing was perfect and that doctors, being human beings, could not guarantee perfect outcomes for everyone. Fueled by the promise of new technologies and advances in medical science, however, patients and their families came to expect perfect outcomes every time, and lawyers were all too eager to tap this previously underused source of litigation dollars. Between expensive new technology and the expectation that it should be available to everyone everywhere, lest a costly malpractice lawsuit ensue, the cost of healthcare in the United States skyrocketed.
By the early 1990s, it was clear to insurance companies that they had to do something to curb their rising payouts. After all, they had a responsibility to the insured to keep enough money on hand to cover claims, and they had investors to consider who expected to receive stock dividends. It was during this time that insurers stopped paying hospitals and doctors whatever they charged and started negotiating contracts to pay reduced rates for the services provided. Out of this concept grew Health Maintenance Organizations (HMOs), managed-care capitation plans, and Preferred Provider Organizations (PPOs). In all of these, the insurance company acts as a middleman, negotiating higher premiums from employers on the front end while negotiating lower payments to healthcare providers on the back end; the insurance company, in the middle, profits. The unfortunate consequence, however, is that hospitals are squeezed between rising costs and falling reimbursements. As a result, many hospitals across the country have closed, and healthcare is becoming more difficult to provide. Some suggest that socialized medicine, or at least a single-payer system, is the answer. We will explore this in subsequent Miller Report articles.
You can access all previous Miller Reports online at www.WMillerMD.com.
Dr. Miller is a practicing hospitalist and the Chief of Staff at Adventist Health Mendocino Coast hospital in Ft. Bragg, California. The views shared in this weekly column are those of the author and do not necessarily represent those of the publisher or of Adventist Health.