bn:01790599n
Noun Concept
Categories: Health education, Health care reform, Health insurance, Universal healthcare, Health policy in the United States
EN
health insurance mandate, employer mandate, employer mandates, health care mandate, health insurance mandates
A health insurance mandate is either an employer or individual mandate to obtain private health insurance instead of a national health insurance plan. (Wikipedia)