Will the COVID-19 vaccine become mandatory? That's a question many are asking these days and, by the looks of it, the answer may well be yes — although as I'll explain later, I suspect the harms of the vaccine will become so apparent that it'll kill such efforts before they become widespread.
In a Jan. 1, 2021, Newsweek interview, Dr. Anthony Fauci said he was "sure" some institutions and businesses will require employees to be vaccinated, and that it's "quite possible" the vaccine will be required for overseas travel.
When asked about the possibility of mandating the vaccine at the local level, such as for children attending school, he stated that "Everything will be on the table for discussion." That said, he pointed out that since "we almost never mandate things federally" with regard to health, he doesn't believe a national vaccine mandate will be enacted.
In related news, on Dec. 21, 2020, President-elect Joe Biden rolled up his sleeve to get publicly inoculated against COVID-19, stating that the vaccine was "nothing to worry about." He has also gone on record saying he will push for a mask mandate in federal buildings during his first 100 days in office.