Genuinely, why do Christians think they are oppressed?
Like, I saw a post where a Black Christian woman captioned it "me when Christianity becomes illegal" and posed as if her mugshot was being taken. Why do (some) Christians fantasize about an imaginary world where they are oppressed, as if they haven't oppressed literally everyone else?
Not to mention a lot of Christians pick and choose which beliefs they're gonna follow, saying "gay is a sin" when so is eating shellfish, wearing mixed fabrics, adultery, stealing, etc., but y'all conveniently forget about those. Women aren't even allowed to speak in church or become priests, and are expected to rely on their husbands.
Then y'all wonder why people are choosing atheism...
Edit: I should have clarified that I meant the Americas and Europe. But the comments have really opened my eyes to the mistreatment Christians face in other parts of the world. I wasn't surprised about the Middle East, but places like Nigeria surprised me.