What are the biggest differences between the US you see in fiction and the real life US?
Just wondering. Most people around the world form their view of the USA based on what they see in movies, TV shows, cartoons, etc. Even if you don't want to do that, it's inevitable that a lot of it ends up influencing your opinion and view of a foreign country. It's just like how a lot of Americans, and non-Japanese people in general, will subconsciously assume that anime and video games are accurate depictions of Japanese culture and life in Japan: they've never visited the country, so that's the main exposure they have to it.
I'd say in Europe we generally view the US as a country of extremes. I feel like a huge chunk of Europeans view Americans as either morbidly obese or super fit and good looking, and view the country as a place full of adrenaline and excitement, for better or worse. I also feel like, to a lot of Europeans, one half of the USA is full of casual sex and gun violence, while the other half is full of super religious nuts and far-right politicians who make European right-wing politicians seem like left-wingers.
So, since this forum has a lot of Americans and I've never been to your country, I'd like to ask: what are the biggest differences between life in the US and what you see on TV? Aside from the obvious things, of course, like not everyone owning a big expensive house, having a very well-paying job, or being attractive.