I'm happy that Mr. Ramaswamy is atoning for advertising. I noticed neevabot crawling a website I have access to just the other day. I have been trying to prevent SEO bots like Ahrefs, Screaming Frog, etc. from crawling and indexing my websites, out of spite. SEO is a big part of why the web kind of sucks.
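For reference, the robots.txt I use looks roughly like this. Treat it as a sketch: the user-agent tokens (AhrefsBot, Screaming Frog SEO Spider) are the commonly documented defaults, and any crawler that chooses to ignore robots.txt has to be blocked server-side instead.

    # Block the SEO crawlers I care about; token names may vary per bot
    User-agent: AhrefsBot
    Disallow: /

    User-agent: Screaming Frog SEO Spider
    Disallow: /

    # Everyone else can keep crawling
    User-agent: *
    Disallow: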
The search engine is called Neeva [1]. It seems to be pretty advanced already. The results are responsive and accurate. I was surprised that they offered a maps feature, but this appears to be Apple Maps.

One thing that surprised me is that searching for "php strpos" revealed not just the PHP docs for the function, as I expected, but actually showed the sections of that page right in the first result, as well as a succinct code snippet in the description. The text does get cut off with no way to scroll, but still, having the function signature at a glance is really cool.

[1] https://neeva.com/
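For context, strpos is simple enough that the signature really is most of what you need at a glance. A minimal example (mine, from memory, not Neeva's snippet; check the actual PHP docs for the exact signature):

    <?php
    // strpos(string $haystack, string $needle, int $offset = 0): int|false
    // Returns the 0-based offset of the first match, or false if none.
    $pos = strpos('Hello, world', 'world');

    if ($pos !== false) {
        echo "Found at offset $pos\n";   // prints: Found at offset 7
    } else {
        echo "Not found\n";
    }

The strict !== false check matters because a match at the very start returns 0, which is falsy.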
I observed that on some searches it seems to get stuck in a loop, repeating earlier hits over and over on subsequent pages. It seems to me that a good search engine should, at the very least, do something to trap repeats.
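Trapping repeats across pages doesn't need anything fancy. A rough sketch of the idea (hypothetical result shape, just tracking URLs that already appeared on earlier pages):

    <?php
    // Hypothetical sketch: skip results whose URL already appeared
    // on an earlier page. $pages is a made-up array-of-pages shape.
    function dedupeResults(array $pages): array
    {
        $seen = [];    // url => true for results already emitted
        $unique = [];
        foreach ($pages as $page) {
            foreach ($page as $result) {
                if (isset($seen[$result['url']])) {
                    continue;            // repeat from an earlier page, drop it
                }
                $seen[$result['url']] = true;
                $unique[] = $result;
            }
        }
        return $unique;
    }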
  And almost all - but not the BBC - had at least one belonging to Google, meaning Google is receiving anonymised information about users visiting those pages.
BBC.com and BBC.co.uk both run Google ads via DoubleClick and thus have Google trackers.
aka "Tech exec says he has remorse of building a bad plateform, says the new one will be better and really nothing like the previous one". What a joker! Either they play dumb or they really don't understand the mechanics of how big internet plateforms work. If they want to show off their immense knowledge and experience they should contribute to community projects. There are tons. Only this can produce a qualitatively different result.