Can employers require employees to get the covid-19 vaccine before they come to work? As with many legal questions, the answer isn’t a concrete yes or no, but it’s at least a probably. There is no federal law prohibiting employers from requiring vaccines for their employees. And the CDC says whether an employer can mandate the covid-19 vaccine is a matter of state, not federal law. But in a roundabout way, there sort of is federal law addressing mandatory vaccines. Even if an employer decides to mandate vaccines for their employees, they still have to provide what are called reasonable accommodations for people with disabilities or religious objections.
And of course, don’t forget about collective bargaining agreements, which may require some negotiation between the employer and the union before vaccines can be mandated. There is an already established history of certain employers requiring vaccines, and that, of course, is in health care. Not surprisingly, if you’re a health care worker, your employer probably has some pretty stringent vaccine requirements. The Supreme Court has long said that governments can make vaccines compulsory. And the Supreme Court has also said that vaccines can be required in schools.
Employees who feel like they just don’t trust the vaccine yet may find that alone won’t get them an exemption from the vaccine requirement at work. It follows from the default rule of at-will employment that employers can impose certain requirements on their employees as long as they don’t violate federal or state law. There will be litigation over whether employers can mandate vaccines. Some is going on right now, but if you were to lay odds, the odds are in favor of the employer on this legal issue.