Is my employer required to provide me with health insurance coverage?


Asked April 15, 2013

1 Answer


Health insurance is not mandatory at this time. Typically, health insurance is offered as an employee benefit, meaning that the availability of health insurance is used to entice employees to join or stay with a company. Usually, the employer pays for a portion of the health insurance benefits, and the employee pays for the remainder of the premiums. In some cases, the employer may offer to pay the full premiums.

With that said, it is important to note that the law regarding health insurance is changing with the Affordable Care Act. Beginning in 2014, employers with 50 or more employees will be required to make health insurance available. Under the new law, the employer may not be required to pay for any of the insurance, but it must be made available or the employer will face penalties. An employer that refuses to make health insurance available will be liable for fines in the neighborhood of $2,000 per full-time employee, which could add up to a very large sum. Employers with fewer than 50 employees are not required to offer health insurance, and no such regulations are currently being considered for them.

However, your employer does not have to make health insurance available to all employees. For example, an employer's benefits package might not offer health insurance until you have been with the company for a certain number of years or until you have reached a certain rank within the company. Some companies may offer tiers of benefits, where the lowest tier does not include health insurance but other tiers do, and the employer has the option of paying differing amounts of the premiums based on which tier an employee is in. This is most commonly seen as a benefits package offered to full-time employees but not to those who work part time for the company.

You can also be required to purchase health insurance if it is offered. If your employment contract states that you must participate in the health insurance plan offered by the company, then you must do so. In other words, buying the health insurance package offered by your employer can legally be a condition of employment, and failure to participate could even result in termination.

Answered April 15, 2013 by Anonymous
