> Still, the sheer scale of the problem is daunting. “Any reasonable search engine has to have 20 billion-50 billion pages in its active index,” Mr Ramaswamy said. When a user runs a query, the retrieval system must sift through vast troves of data then rank them in milliseconds.

This sounds interesting, but as an outsider to these topics I have many questions.

How is a search completed in milliseconds (hardware/software), possibly for millions of users simultaneously across the world?

And what is so difficult about building an index of 20-50 billion entries? I'd imagine FAANG etc. have the resources to do this with little effort.
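
For context, my rough mental model of the "sift then rank" step is an inverted index: a map from each term to the documents containing it, which gets intersected and scored at query time. This is just my own toy sketch to frame the question, not how any real engine actually does it, and the names and scoring are made up:

```python
from collections import defaultdict

# Tiny corpus standing in for billions of pages.
docs = {
    1: "fast web search at scale",
    2: "building a search engine index",
    3: "ranking web pages in milliseconds",
}

# Build: term -> {doc_id: term_frequency}
index = defaultdict(dict)
for doc_id, text in docs.items():
    for term in text.split():
        index[term][doc_id] = index[term].get(doc_id, 0) + 1

def query(q):
    """Intersect posting lists for all query terms, rank by summed term frequency."""
    postings = [index.get(t, {}) for t in q.split()]
    if not postings:
        return []
    # Keep only documents that contain every query term.
    candidates = set(postings[0])
    for p in postings[1:]:
        candidates &= set(p)
    # Naive score: total occurrences of the query terms in the document.
    scored = [(sum(p[d] for p in postings), d) for d in candidates]
    return [d for _, d in sorted(scored, reverse=True)]

print(query("web search"))  # -> [1]
```

My question is basically what it takes to make something like this work when the index is tens of billions of pages and has to answer in milliseconds: is it mostly sharding across huge fleets of machines, clever data structures, caching, or something else entirely?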