What would you recommend as some of the more authentic destinations in the USA? I went to Texas when I was younger and it really did feel like a different place. I got the feeling that I was truly in America, cowboys and everything. Then I went to Florida, and as a European it didn't feel any different from some of the places here. You have your beach, you've got the bars and restaurants, you've got big shopping centers... exactly the same as you'll find anywhere else.