John Goodman: Should Employers Be Required To Provide Health Insurance To Their Employees?
The Obamacare mandate for large employers kicks in this year, and for smaller employers it kicks in next year. But a growing number of economists, on both the right and the left, are saying that mandated health insurance benefits in the workplace are a bad idea. Are they right?
One reason this question is so difficult to even discuss is
Posted: February 19, 2015 Thursday 08:00 AM