Originally posted by Imran Siddiqui
You DO realize that employers don't HAVE TO PROVIDE health insurance, don't you? It is something they do to entice workers.
Health insurance has nothing to do with one's job. It is a business cost, and it is part of the reason businesses are moving jobs to regions where they can hire workers without providing health insurance.