At the dawn of World War I, the United States was still a rising power. Its reputation among Middle Easterners was relatively benign: they saw no "imperial ambitions" in the American presence and were grateful for the educational and philanthropic services Americans provided. Yet by September 11, 2001, everything had changed. The U.S. had become a "world colossus so prominent in the political, economic, and cultural life of the Middle East that it was the unquestioned target of those bent on attacking the West for its perceived offenses against Islam."