I agree. I played around with the Surface RT for over 30 minutes at a Microsoft store with the intention of buying it for my parents for Christmas, and I walked away because even I couldn't figure out what the "rules" were for interacting with Metro. I'm sure I could with more time, but there is no way my parents, who still use XP, would be able to figure it out.

I wasn't sure what I needed to do to get to the "Desktop" mode where it looked like Windows 7, how to flip back and forth, which things I could swipe, and so on. It felt like a big mess because a lot of the UI features we've come to expect just weren't there. In contrast, the iPhone, and subsequently the iPad, were intuitive right off the bat.

To be fair, I'm seeing this kind of terrible UI experience elsewhere as well. For example, when you're reading a PDF in Chrome, it's not obvious how to save it or zoom. You have to somehow know to hover over the bottom-right corner before the buttons show themselves, and there are no visual cues indicating that's what you're supposed to do. It's fancy, but it's terrible UI.

The same thing happens on Facebook, where people are just expected to know where to hover to reveal functionality. I don't know where this trend came from, but it's terrible, and I think this article shows an extension of how we're moving away from the visual cues and everything we've learned about UX over the past 30 years. Sure, it's different, but that doesn't mean it's better, especially when it forces people to hunt, peck, and guess for functionality, which is exactly what good UX is supposed to eliminate.