The Obamacare mandate for large employers kicks in this year, and for smaller employers it kicks in next year. But an increasing number of economists, on both the right and the left, are saying that mandated health insurance benefits at the workplace are a bad idea. Are they right?