It's hard to truly understand the vibe of a place unless you've spent significant time there. This is part of the reason we rely on stereotypes: France and baguettes, England and questionable food, and so on.
How do non-Americans view the United States? Is it all Ronald McDonald and John Wayne, or is the picture more nuanced? This r/AskReddit thread asked Europeans what they imagine life in the U.S. to be like.