Tag: Africa
Many believe that white men have rewritten history and excluded Blacks so they can take credit for everything, even the existence of God. This is a lie, and people need to understand the truth.
Wealthy white men brought us here, and wealthy white men should send us back if they want an all-white country.
The United States is famous for propaganda and lies about Africa. It shows pictures of African children living in poverty while neglecting the poverty in its own country.
A Christian Nation Bought Slaves from African Kings: Somebody Lying
Counselor | Jan 30, 2024 | 0 comments | 1,066 views
Human bondage is not in the character of a truly Christian people. So apparently they lied then, and they are lying now when they claim America was founded on Christian values...
The spiritual world is not a joke or something to be played with simply because you are feeling ethnic or cultural.