11 Jul 2022
Asserting that America is a Christian nation implies the country belongs to Christians and excludes others.
Because that's essentially saying: this country is about us, not you. We own this nation, and we allow you to live here.