Currently, 37% of Americans have their health insurance paid for by Medicare or Medicaid and 11% are uninsured, while the majority, 52%, have private coverage paid for either by individuals or, primarily, by employers (Auter & Newport, 2017). Should employers continue to bear the burden of providing health care benefits to employees, or should the government institute a form of national health insurance instead?
The government should institute a form of national health insurance for those who are uninsured: per the data cited in the question above, the 11% of Americans without coverage.
If an employer is providing insurance to its employees, it is because it can afford to. Every organisation works to generate profit, and if some portion of that profit goes toward ensuring that employees have a stable healthcare plan to fall back on in medical emergencies, that is entirely reasonable; it will not bankrupt the company.
If a company earns above a certain threshold of profit, the government should make it mandatory for that organisation to provide healthcare insurance to its employees.
For employees of organisations that do not fit the above scenario, the government should step in to provide insurance.
Access to basic and emergency healthcare should be a fundamental right that no citizen can be denied.
To ensure healthcare is available and accessible to all, the government should formulate a plan together with large organisations and firms so that no individual is left alone in trying times.