Is the WWE Wrestling?
Seriously, that's a question I'm beginning to ask myself. It really looks like wrestling IS the WWE.

If the WWE does something bad, nobody is going to stop watching, nobody is gonna try something else. It's the WWE or nothing, and that's pretty bad, because it means that out of all the guys who love wrestling, only a small number are willing to try something else. I'm even ready to believe that the WWE controls the business: if they want the ENTIRE industry to be about something, it's going to be awesome, but if they want something to look like a joke across the ENTIRE industry, they can do that too. Total control of the business.
What's your opinion on it? Is this all bullshit, or is wrestling really just the WWE these days?