Do whites realize that once they become a minority, they will no longer control America?
