Returning to American history, colonialism indeed had cultural and political impacts on America. Spain was just one of the colonizers that successfully left its influence on Americans' lives; consider California, Florida, and the Southwest, areas that obviously got