Definition: The view that the United States should play an active role in world affairs.
Source: https://digestiblepolitics.wordpress.com/dictionary-of-political-words-simply-explained/