I ask this because I've heard too many negative opinions about the industry. I've heard the phrase, "It's not what you know, it's who you know." I've also heard the phrase, "You have to play the game to win."
Hahahaha! And maybe it's just the rebel in me that refuses to accept such hand-me-down sayings, but I also acknowledge that the people who have told me how much Hollywood sucks are people who aren't exactly in the industry anymore. I won't go so far as to say they've failed or are now "washed up," but what I am saying is that surely Hollywood doesn't suck for everyone.
Yeah, I'm sure that in this business you're gonna run into producers who are blunt with you and may come off as assholes. But given their line of work, I'd expect them to be.
I know I could google some article about it, but articles like that have to be politically correct and make sure nothing can come back to bite the writer in the ass.
Right now, we're gathered around a polished table in a dimly lit lounge. There's a pianist throwing down a jazzy tune. I'm pouring you a drink and asking you flat out: "Tell me your thoughts. If you could, would you create another Hollywood in a city outside of New York and California? Tell me about it."