Why do Americans get a bad rap for conquering the Indians?

Didn’t the Indians fight against other tribes? They conquered, raped, and pillaged. Is there one country in history that hasn’t taken land from other people?

5 Answers

  • Anonymous
    1 month ago

 Our biased history books brainwash those with no brain.


  • 1 month ago

Americans and "Indians" are the same people, a term that encompasses all of the Americas. Perhaps you meant to reference the foreigners from Europe who today are better known as citizens of whatever corporation claims them, e.g., U.S. Citizens: property transferred from being under the crown to being under a corporation as citizens (still slaves, just a different name). 

To conquer means to take control of by military force. The answer to your question is self-explanatory. Many advanced nations existed, as described by the explorers and noted in their diaries. Some nations did fight each other, just as the European nations fought one another over trivial excuses such as belief; however, the population consisted of FREE MEN. We all have the responsibility to uphold the divine law, which also embraces our divine rights. The European stockholders were, and still are to this very day, on a mission to subjugate the world population, including ignorant and cowardly "white" people (so, so ignorant they are). 

To answer your question more directly: Europeans get a bad rap because they committed genocide against the Americans, both physical violence and paper genocide (misnomers and mislabeled identities), crimes for which they have yet to be held accountable. The descendants of those Americans are extant, and although they are not looking to make a show out of the truth, they want to be left alone in peace.

The ambiguously so-called "Native Americans" are not the aboriginals and are, in fact, wards of the state. During the Homestead Act, many of the "white" slaves found an opportunity to claim land by claiming "Native American" ancestry. While it was an act of survival, it further diminished the identity of the aboriginals. 

    Begin your research on the "Indian" wars. 

  • Anonymous
    1 month ago

    Because they have nothing better to say …

Here’s a simple breakdown:

    Natives fought amongst each other

    American Natives aren't even the same people

    There’s no evidence all of them had the concept of land ownership

American Natives would have attacked other nations, or someone else would have claimed the Americas if we hadn't.

    It wasn’t the US before the Europeans "discovered" it and built it.

  • 1 month ago

America was built on violence and stolen land. That's not an admirable trait, and it's very un-American. Americans like to think of themselves as the best the world has to offer, the good guys, a shining light in the dark (although I think most are finally seeing themselves the way the rest of the world has for decades). It's no surprise they don't talk about it. 

But in the end, the Native Americans were unable to defend what was theirs. I don't agree with it, but it's similar to evolution: the weak can't adapt and die out; the strong adapt, survive, and thrive. It wouldn't have gone any other way, not with human nature being what it is.

  • 1 month ago

The Native Americans did fight among their tribes; however, they all knew that the land belonged to no one. Then some white guys came, killed a lot of them, raped their women, stole their land, took their children, put them in special schools, changed their names, and whitewashed them. And Americans still act like it's their country.
