I'm over "The Oscars" and have been for years. Quit trying to church it up by saying "I only want to see what people are wearing." Why should ANYBODY watch a bunch of overpaid, overprivileged people pat each other on the back for no apparent reason? Add to that the fact that Hollywood doesn't really care what you like; they have some high notion that they're making "art" instead of just play-acting. Okay, that's my grumble. The best picture nominees are next.