• UnderpantsWeevil@lemmy.world
    2 months ago

    Your employer usually pays your healthcare

    In the United States? That’s not true at all. Roughly half of American workers don’t get health insurance from their employer, and that’s before you count the folks who lose their jobs after suffering a medical emergency.