No one can “force” you to believe things are a certain way, but (maybe I am wrong) it appears our nation (USA) is under a coordinated attack of government and media bias aimed at misinforming us into believing whatever they want, true or not. That is un-American; it is immoral and unethical, and it is destroying the principles of the Constitutional Republic this great nation was founded upon. And they are getting away with it.

As far as I can tell, this is what we are now supposed to believe as Americans, in this first year of the Glorious Deep State’s Blessed and Eternal Reign: Our country is so irredeemably racist that people of all races keep sneaking …
Source: What You Must Believe as an American in 2021