Florida may soon become the first state to end all vaccine mandates, including those for schoolchildren, setting the stage for deadly infections to make a comeback
By Grace Wade
8 September 2025
Florida surgeon general Joseph Ladapo speaking at an anti-vaccine event in Sarasota, Florida
Scepticism around vaccines has reached new heights in the US, with Florida officials pushing to eliminate all vaccine mandates, including those for schoolchildren. The move may embolden other states to follow suit and risks triggering a resurgence of childhood illnesses long kept at bay.
“If I were a virus, I would throw a party right now,” says Cynthia Leifer at Cornell University in New York state. “The potential removal of all vaccine mandates in Florida will allow diseases that have been kept in check for decades to rear their ugly heads again.”
Once relegated to the sidelines, the anti-vaccine movement has become a significant force in the US since the covid-19 pandemic. Florida is a prime example. In 2022, it became the first state to recommend against covid-19 mRNA vaccination for most children. Two years later, it extended that guidance to everyone. Now, it could become the only state to eliminate vaccine mandates entirely.
“The Florida Department of Health, in partnership with the governor, is going to be working to end all vaccine mandates in Florida law,” announced Joseph Ladapo, the state’s top public health official, on 3 September. “Every last one of them is wrong and drips with disdain and slavery.”
Like every US state, Florida legally requires children to be vaccinated against several diseases before they enter school. While the Florida Department of Health, which Ladapo oversees, has significant authority over school vaccine mandates, only state lawmakers can repeal all vaccine requirements.