I have a love-hate relationship with Disney. When I was a kid I was into the movies, and I went to Disney World in Florida and had a good time. I know it's a tradition for many families to take Florida holidays with the kids to see Disney World. But now that I'm an adult, I just don't feel the same way anymore. Their cartoons seem annoying and manipulative to me, and it's all so merchandised these days. I'm not even sure they're a good idea for kids anymore.

I'm not sure where I would spend my Florida holidays now. Maybe on the beach somewhere, or relaxing in a spa. I don't think Disney would be at the top of my list.