The Effects of Imperialism

In the late nineteenth century, the United States and the nations of western Europe began to imperialize developing countries in Africa and Asia. Believing the white race to be superior, both the U.S. and the European powers forcibly took control of these countries, convinced that it was their destiny and God-given right to ‘civilize’ native peoples. Imperialism was also spurred on in Europe by competition among the nations, which was