Can someone tell me why:
1. Robert Redford is at the 2007 Sundance Film Festival demanding in the opening speech that President Bush apologize to the American people for everything that has happened since 9/11? Why, tell me, does anyone give a rat's fundament what Robert Redford thinks about anything, much less politics? Can someone tell me why people who make movies in America are seen as smarter and as having higher political acumen than God Himself? For that matter, why is anyone in this country revering some schmuck who appears on TV or film to the point where these stars are virtually deified on everything from their opinions on life itself to what brand of underwear they wear?
2. There is such a proliferation of sex-crazed fiends living in America who routinely take other people's kids for their dastardly and horrid schemes? Can someone tell me why this seems so prevalent now? Is it better reporting, or is the moral fabric of American society indeed broken beyond repair? Does anyone have a clue? Those whom I have asked this question, people who grew up in the '30s, '40s, and '50s, are unable to recall such a preponderance of these cases, even from distant memory...