Today’s post is a guest post submitted by Dr. Kenneth Fisher, a nephrologist, blogger @ People for Progress in Healthcare, and author of several books, including the one you see to the left. Dr. Fisher is also the co-founder of the Michigan chapter of the FMMA.
Additionally, he is the lead author of In Defiance of Death: Exposing the Real Costs of End-of-Life Care, and offers a free e-book, The Ten Questions Walter Cronkite Would Have Asked About Health Care Reform.
We have no financial relationship to disclose.
Government Assumes a Role in Healthcare
The concept of sickness insurance began in Germany in 1883, when Chancellor Otto von Bismarck initiated insurance for the poor. The decision about how those services were to be delivered is critical to understanding the contentious debates around healthcare ever since. Could Bismarck have given vouchers to be used for care as needed? Or should the government control the needed healthcare facilities?
Although individuals were the recipients of the care, they had no involvement in its cost or in the menu of available coverage. Perhaps if Chancellor Bismarck could have foreseen our information age, he would have realized that patients could, with a trusted physician, make appropriate decisions for themselves. In 1912, Theodore Roosevelt — running for president under the Bull Moose Party — proposed a similar federalized form of national health insurance.
In 1913, the American Medical Association (AMA) Council on Health & Public Instruction suggested sending a representative to Europe to study the issue of care for the poor. The board of directors quashed the idea, and with the advent of World War I, interest waned.
Subsequently, several state medical societies tried unsuccessfully to deal with expanding coverage, but in 1920, a resolution passed by the AMA House of Delegates put an abrupt end to all discussion of the issue.
The AMA, which at that time was at the peak of its influence, could not resolve the issue of having government support healthcare for the needy while maintaining the independence of the patient-physician relationship. In essence, they were stuck thinking only about a federal Bismarck model; their failure to propose a solution (i.e., direct payment) to this quandary has plagued our country ever since.
World War II and Beyond
During World War II and its accompanying wage controls, companies needed to lure more domestic workers to increase weapons production, and thus introduced health insurance as a pre-tax benefit. Over time, this benefit became more expansive, morphing into all-inclusive pre-paid healthcare rather than true insurance. Thus, covered individuals were insulated from any concern about cost, even for minor issues. Unlike in other countries, employer-based coverage came to insure most American families.
What about coverage for retirees or the unemployed? This became an increasing political issue after World War II. President Truman defeated Dewey in 1948 in part because of healthcare, but another war (Korea) delayed any serious action. President Eisenhower signed the Kerr-Mills Act, which provided federal-state support for means-tested healthcare for the elderly poor. Unfortunately, only four states provided full services.
Upon his election, John F. Kennedy wanted federal care for all those over age 65. He tried mightily, but was defeated by two main adversaries. One was the AMA, which fought intensely against any federal funding for healthcare while proposing no creative alternative. This was rather odd, as physicians were by then benefitting from federally funded medical research that was expanding medical options and giving them the opportunity to increase their incomes.
The second obstacle was Wilbur Mills (D-Ark), chair of House Ways & Means. His concerns were rather profound. Unlike Social Security, the proposed plans for the elderly and poor were open-ended. Costs per person had no limits. Another concern was that suddenly increasing demand with no increase in supply would cause prices to explode.
He was also aware that after World War II there was a baby boom, which meant the worker-to-retiree ratio would eventually fall, placing a greater burden on the younger generation. What he could not have anticipated was the increase in life expectancy and the ever-growing availability of more expensive medical therapies. Today, the typical Medicare recipient receives approximately three times as much in care as they contributed. Unfortunately for us, these still-valid concerns are not being addressed, hence the chaos.
The election of 1964, a landslide for Lyndon Johnson and a super Democratic majority in Congress, meant that a federal program for the elderly and the poor would pass. Mills, knowing Johnson had the votes, crafted what is now Medicare/Medicaid, addressing none of his previous concerns. Another problem Mills did not anticipate was cost shifting: hospitals negotiating higher prices from private insurers to cover losses from inadequate Medicare/Medicaid reimbursements. This increases private insurance premiums, contributing significantly to stagnant employee wages.
Perpetuating the chaos, Congress, rather than addressing fundamental cost issues, remains stuck in the late 19th century, attempting a slew of government-directed, top-down, price-controlled, heavily bureaucratic fixes.
These futile attempts include:
1) Diagnosis Related Groups (DRGs) in 1983: a price-fixing system that eliminated hospital market forces such as real prices, outcomes, and efficiency, and entrenched the lack of price transparency.
2) Ever more complicated physician Current Procedural Terminology (CPT) codes in 1992: a price-fixing system that disregards extensive training, skill levels, and outcomes, and has been adopted by all third-party payers.
3) The “Sustainable” Growth Rate (SGR) in 1997: a law to decrease doctor payments if Medicare costs grew faster than GDP — a complete failure, as Congress deferred enforcement year after year.
4) The HITECH Act of 2009: based on a since-retracted RAND study claiming computerized medical records would save Medicare billions per year. Paradoxically, it has increased costs by adding facility fees, takes significant face time away from patients, and represents cronyism at its worst, favoring a few software companies whose systems still cannot share test data.
5) The Medicare Access and CHIP Reauthorization Act (MACRA) in 2015: enacted to end the failed SGR debacle, it attempts to reward quality rather than volume via computer algorithms run on 40 trillion transactions per year of imprecise data — an obvious absurdity.
These actions have led to runaway costs along with deep discontent among patients and physicians alike, both yearning for more personal relationships.
A Solution Proposed
The solution: fix the fundamentals!
Put individuals, not the government, in charge of their care by depositing an actuarially adjusted amount each year into a health account that could fund routine needs, an insurance plan, and direct care. Monies left over could be carried into the following year, with the next yearly deposit somewhat reduced, so the government could share in the savings.
At a young age, Americans could choose to have their Medicare payroll deductions paid directly into a special account that would pay for care in old age. These monies could be invested in a conservative allocation of half stocks, half bonds.
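The compounding effect behind this idea can be sketched with a few lines of code. The deposit size, blended return, and time horizon below are purely illustrative assumptions for the sake of example — not actuarial figures from the proposal:

```python
# Illustrative sketch only: hypothetical deposit, return, and horizon,
# not actuarial figures from Dr. Fisher's proposal.

def project_account(annual_deposit=4000.0,  # hypothetical yearly payroll deduction
                    annual_return=0.05,     # assumed blended return on 50/50 stocks/bonds
                    years=40):              # assumed working career length
    """Compound a fixed yearly deposit in a 50/50 stock-bond account."""
    balance = 0.0
    for _ in range(years):
        # Deposit at the start of each year, then apply one year of growth.
        balance = (balance + annual_deposit) * (1 + annual_return)
    return balance

if __name__ == "__main__":
    total = project_account()
    print(f"Balance after 40 years: ${total:,.0f}")
```

Under these assumed inputs, a modest yearly deduction compounds to roughly half a million dollars over a working career — the intuition behind letting workers redirect payroll deductions into a personal account.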
There is a way out of this chaos. Give Medicare patients the option of directing their own care.