Applied to American Studies, imagology (the term derives from the Latin imago, "image") asks not how Americans themselves see America, but how other cultures view the United States. In other words, what is the image of the United States as seen in literatures from outside the United States?
Source: Wiktionary