This article on Masa from January gives some insight into what engineers were doing: https://www.fastcompany.com/90285552/the-most-powerful-person-in-silicon-valley

tl;dr: They used sensors to determine that the morning coffee line was too long, so they added a barista. They also used sensors to determine that small groups of people were booking large conference rooms, so they are making more small conference rooms.

Relevant part:

*WeWork's potential lies in what might happen when you apply AI to the environment where most of us spend the majority of our waking hours. I head down one floor to meet Mark Tanner, a WeWork product manager, who shows me a proprietary software system that the company has built to manage the 335 locations it now operates around the world. He starts by pulling up an aerial view of the WeWork floor I had just visited. My movements, from the moment I stepped off the elevator, have been monitored and captured by a sophisticated system of sensors that live under tables, above couches, and so forth. It's part of a pilot that WeWork is testing to explore how people move through their workday. The machines pick up all kinds of details, which WeWork then uses to adjust everything from design to hiring. For example, sensors installed near this office's main-floor self-serve coffee station helped WeWork discern that the morning lines were too long, so they added a barista. The larger conference rooms rarely got filled to capacity–often just two or three people would use rooms designed for 20–so the company is refashioning some spaces for smaller groups. (WeWork executives assure me that "the sensors do not capture personal identifiable information.")*