Healthcare is interesting; prior to the end of WWII, there
Posted on: September 7, 2020 at 11:58:27 CT
MU-TULSA MU
wasn't health insurance. After WWII, healthcare became a way to get people to work for companies and to join unions. During the '50s and '60s, healthcare was pretty much up to individuals to pay for; doctors actually posted what an office visit, shots, etc. cost on a bulletin board at their offices. Then Johnson signed Medicare into law in 1965, and things began to change. First, doctors raised hell, saying the government had no right telling them what they could and could not charge. Within a few years, though, doctors' offices were full of old people, and doctors began to embrace Medicare. More and more companies offered healthcare as an incentive for new employees, and it became the norm instead of the exception. Now people are saying it is a RIGHT and everyone should have FREE healthcare. Amazing how things change.