So I was going through my history textbook and soon found all kinds of biased views in it. It talks about how white people basically invaded villages in Africa and dragged people out of their homes; in some cases, yes, this happened. But in most cases slaves were sold by other Africans, and sometimes Africans even captured whites and used them as slaves. No one mentions that, however. It also talks about all the wrongs Americans did to the Indians. Wrongs were committed on both sides, and I won't try to justify what Americans did, but Indians also raided villages and towns and killed men, women, and children. The bias in these textbooks is sickening to a point, yet we as Americans happily go along with it.