John Larrimer

Are Employers Required to Carry Workers Comp Insurance?

Q: Are all companies required to have insurance for workers compensation? Which types of companies are required to carry it, and which are not?

A: Excellent question. Workers compensation is a form of insurance that provides wage replacement and medical benefits to employees injured on the job. It is not the same as punitive or general damages for an injury caused by negligence. Workers compensation laws vary by state. Maryland became the first state to require employers to provide workers compensation coverage in 1902, and by 1949 every state had enacted some form of law requiring employers to carry it. Texas is the only state where employers can opt out of workers comp insurance. In every other state, employers face harsh penalties if they do not carry the required coverage. Many states also maintain public uninsured employer funds for injured workers whose employers failed to carry insurance. In exchange for carrying workers comp insurance, employers are generally immune from other damages, unless the act that caused the injury was intentional or grossly negligent.

-John Larrimer

Larrimer & Larrimer, LLC—Columbus Workers Comp Attorneys

