Honest opinion: do you think they were better when they were mainstream and part of pop culture, or are they better off not chasing that crossover mainstream appeal? WWE had its best eras when it was mainstream, in the '80s and again in the late '90s-early 2000s. After the wrestling bubble popped around 2002, they started losing their mainstream appeal and audience. Granted, the Ruthless Aggression era was also a great time for WWE (and I wish today's WWE were more like that era), but fast-forward to now and they aren't producing the best storylines, booking, or creative. So what's your opinion?