The economic model of the United States’ health care system — which really isn’t a system — is not the supply and demand free market model some would have you believe. And some of those who want you to believe it is a free market know full well it isn’t, but fooling the public serves certain purposes.
And one of the business sectors foisting this lie on the public is the insurance industry.
It is little known that Blue Cross/Blue Shield not only has its roots in Texas, but that its seeds were planted by a group of Dallas teachers contracting with Baylor University Hospital, according to a University of Iowa master’s thesis by Frederic R. Hedinger, which also credited Baylor. What’s more important than that detail is that the plan worked and spread across the country, especially during the Depression.
Blue Cross was a benefit plan, meaning it paid for what the doctors thought the patient needed. If the patient needed surgery, the patient got it. And Blue Cross paid the hospital for its charges. The premiums were based on community ratings, which meant that the entire population’s medical needs were estimated and were the basis for the premiums.
Meanwhile, in California, Kaiser-Permanente was formed, becoming the first true HMO. It was a closed panel HMO because the physicians worked for the Permanente Medical Group, and the medical group contracted with the insurance plan, Kaiser, which also owned the hospitals. This gave the insurers control of the costs, and it succeeded in lowering the cost of health care.
But something happened on the way to the 21st Century.
First, some hospitals and some doctors took advantage of the benefit plans’ no-questions-asked fee-for-service system. Some greedy doctors (surgeons got most of the bad rap) got profligate with diagnoses based on wallet biopsies and performed hysterectomies and tonsillectomies and adenoidectomies. The old joke about hysterectomies was that the only criteria for those procedures were the presence of a uterus and a Blue Cross card, and the uterus wasn’t always needed.
The few rotten apples spoiled the barrel for everyone, because the people who paid for health care discovered the greed and looked for ways to curb the abuses. So began such things as peer review and utilization review, with the former being doctors who looked over each other’s shoulders and the latter involving non-doctors — and that opened the door for the insurance companies to gain control of the medical decision-making.
Kaiser had a better handle on how to deal with inappropriate utilization. The Permanente Medical Group, which contracted with Kaiser, was essentially told: Here are X-number of lives, and our actuarial tables tell us that they will require Y-amount of care. So, here’s the math and the dollars. You take care of folks within that amount; anything you spend over it, you eat, and anything under it, you keep as the difference to distribute among yourselves.
The doctors watched over the utilization and took care of the over-users. In short, at this point, they retained some semblance of control of the decision-making.
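The capitation arrangement described above boils down to simple arithmetic. A minimal sketch, with entirely hypothetical numbers and function names chosen for illustration (the structure, not the figures, is the point):

```python
# Illustrative sketch of the Kaiser-style capitation incentive.
# All dollar figures and names here are hypothetical.

def capitation_settlement(covered_lives: int,
                          per_member_budget: float,
                          actual_cost: float) -> float:
    """Return the surplus (positive) the medical group keeps,
    or the loss (negative) it absorbs."""
    budget = covered_lives * per_member_budget  # prepaid pool from actuarial estimate
    return budget - actual_cost

# 10,000 covered lives prepaid at $500 each; the group spent $4.8M on care.
surplus = capitation_settlement(10_000, 500, 4_800_000)
print(surplus)  # a positive result is distributed among the physicians

# Had the group spent $5.1M instead, the negative result would be a loss it eats.
```

Under this structure, the incentive to police over-utilization sits with the doctors themselves, not with an outside reviewer.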
Another part of the story is important and coincides with these mid-20th Century events.
Recognize that the structure of the system was that the hospitals were predominantly nonprofit, so there was a certain lack of business pressure with respect to performance and the bottom line. In addition, the people with the financial responsibility on the boards of directors were the moneyed movers and shakers of the community. Physicians, while having plenty of interest in the hospitals’ functioning well, were neither employees nor contractors. They were free agents within their own organization.
In some communities, working with the medical staff was like herding cats. It’s not that the doctors were evil. It’s that they wanted some things and the hospitals sometimes couldn’t provide them. The entire structure was complicated.
At the same time — this was post-World War II and Korea — the population was growing, a full range of services was being demanded, and the federal government was there to help. The passage of the Hill-Burton Act spurred a boom in hospital construction that lasted from the late forties until the mid-1970s. While the main purpose was to build hospitals in communities that didn’t have them — read rural — as with most federal programs, the people who played the game well got the money, and hospitals sprang up everywhere. Or existing hospitals got new wings.
Policymakers came to realize, however, that supply created its own demand. If a hospital built the beds, ORs and x-ray suites and so on, they had to be used to cover the debt service. At the micro level, the community had to pay for the over-building and cover the costs of the resources. At the macro level, pumping patients through the system to fill beds and finance the growth added up to higher costs for the country.
And guess what came along to help fill them? Medicare and Medicaid. They created another revenue stream for the system. In economic terms, they created demand. Remember, though, that some of that demand wasn’t need grounded in clinical judgment and evidence-based medicine. The model of paying doctors by the procedure and service still prevailed in the early days of those huge programs.
But they also filled hospitals and clinics and added to the total medical and health care costs for the nation. Combine this influx of revenue with supply creating its own demand, and medical costs skyrocketed. By the way, some researchers noticed that all of these medical resources didn’t improve what were then considered key indicators of a nation’s health — infant and maternal mortality, life expectancy and other measures of morbidity and mortality.
Around the mid-1960s, scientific advances and medical technology were having a greater effect on medical care. The first kidney transplant had already occurred in Boston in the 1950s. By 1967, Dr. Christiaan Barnard stunned the world with the first heart transplant. The first CAT scanner came along in the 1970s, but the people who came up with the idea had developed it in the 1960s. And so it went.
But for all the good the technology did — and it did good, no doubt — it also drove up the costs. And it was around the mid-1960s that we began to hear the word “crisis,” which we’re still hearing today. Some of those people talking about “crisis” were members of Congress. One would think that if they could call it a crisis for 40-plus years, they might have found a way to solve it, but that’s another story.
One of the answers Congress had for cutting costs in the 1970s was to control capital expenditures. It passed a series of laws establishing coordinated planning for communities, requiring states to institute programs to assure that expensive capital expenditures could be justified by community need and would not duplicate other services. This was called Certificate of Need, or CON, and it remains on the books in some states. If a hospital wanted a new CAT scanner, or someone wanted to build a new nursing home, they had to get approval through a process that justified the expense. The theory was that if a need was proven, utilization would be optimized.
Some hospital administrators understood this was a good thing for the community, the nation and the system. But as a group they fought CON. Then they got smart, began to work with their doctors and gamed the system — except in two-hospital towns. Behind the scenes, medical equipment and hospital construction interests (contractors, architects, etc.) fought CON. But despite the political nature of the local competition for new toys, many felt that the legislation provided for a rational distribution of resources and, to some degree, held back the growth of health care costs, if only a little.
Was the approach to controlling costs by treating the system as a quasi-public utility a success?
Policymakers will never know. The programs were not in place long enough to fully test them because in 1982, something happened to change the landscape once again.
That and how these forces got the United States to where the system is today are in the next installment.