Hey everyone! Metaphor team here.<p>We launched Metaphor earlier this morning! It's a search engine based on the same sorts of generative modeling ideas behind Stable Diffusion, GPT-3, etc. It's trained to predict the next <i>link</i> (similar to how GPT-3 predicts the next <i>word</i>).<p>After GPT-3 came out we started thinking about how pretraining (for large language models) and indexing (for search engines) feel pretty similar. In both you have some code that's looking at all the text on the internet and trying to compress it into a better representation. GPT-3 itself isn't a search engine, but it got us thinking, what would it look like to have something GPT-3-shaped, but able to search the web?<p>This new self-supervised objective, next link prediction, is what we came up with. (It's got to be self-supervised so that you have basically infinite training data – that's what makes generative models so good.) Then it took us about 8 months of iterating on model architectures to get something that works well.<p>And now you all can play with it! Very excited to see what sorts of interesting prompts you can come up with.
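To make "next link prediction" concrete, here is a minimal sketch of how (context, next-link) training pairs could be pulled out of a crawled page. The class name, the 64-word context window, and the HTML handling are illustrative assumptions, not Metaphor's actual pipeline:<p><pre><code>
# Minimal sketch: turning a crawled HTML page into (context, next-link)
# training pairs for a next-link-prediction objective. Illustrative only.
from html.parser import HTMLParser


class LinkContextExtractor(HTMLParser):
    """Collects the text that precedes each hyperlink on a page."""

    def __init__(self):
        super().__init__()
        self.text_so_far = []
        self.pairs = []  # list of (context, url) tuples

    def handle_data(self, data):
        self.text_so_far.append(data)

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                words = " ".join(self.text_so_far).split()
                # Keep the last 64 words of preceding text as the "prompt".
                self.pairs.append((" ".join(words[-64:]), href))


page = '<p>My favorite explanation of transformers is <a href="https://example.com/post">this post</a>.</p>'
extractor = LinkContextExtractor()
extractor.feed(page)
for context, url in extractor.pairs:
    print(f"context: {context!r} -> next link: {url}")
</code></pre><p>Each pair plays the same role a (prefix, next word) pair plays for a language model: the model sees the surrounding text and learns to predict which link comes next.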
One search string that really illustrated the problems with modern-day Google for me is "best things to do in hawaii". Try it and see what I mean. It's just link after link of blogspam. You get extremely long pages filled with ads and generic stock photos of Hawaii but bereft of any actual content. I just want a single person's account of how they went to Hawaii and what they liked/didn't like, but it's impossible to find, even though I'm sure it's out there on the internet somewhere.<p>The best thing to google if you want an answer to this question is something like "reddit best thing to do in hawaii", which gets you actual accounts from actual real people who actually went to Hawaii and have interesting things to say about it.<p>I tried this with metaphor.systems as well, using their prompting language: "My favorite place to go in Hawaii is:". Unfortunately, I still didn't get great results, though some of them showed some promise.
I've been using Metaphor for a few weeks now and have almost entirely switched from Google and other search engines. Keyword-based search simply doesn't come close when it comes to getting the _right_ results. While I have to sift through a few pages of results on Google and then maybe find what I'm looking for, on Metaphor there's almost no SEO spam or Wikipedia-style links dominating the top results. It directs you to sources that are relevant to your search query. I don't know how they did this (probably a lot of very specific and targeted tricks), but Alex and team have created a marvelous product and I'm excited to see where this goes! Congrats on the launch!
I used to work on Google search, but it was a long time ago, so hopefully I am not too biased here.<p>I think it would really help the UI to have better snippets, i.e. the text that appears below the blue link for each search result. In Google search results the keywords are often bolded as well. It helps you skim through and see which of the results are going to be a good fit.<p>Maybe there is some fancy AI thing you can do to generate snippets, or tell me more about the page. For example, one of the search results for your sample query is:<p><i>Online resources in philosophy and ethics</i><p><i>sophia-project.org/</i><p>That doesn't really tell me anything without clicking on it. Is it good? I don't know... I usually don't click on that many results from a Google search; people often decide after only selecting one or two, based on the snippet.
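Even a simple non-ML baseline can get most of the way there: pick the sentence on the page that overlaps most with the query and bold the matched terms, the way Google-style snippets do. A rough sketch, where the scoring and the <b> markup are illustrative assumptions rather than how any particular engine does it:<p><pre><code>
# Rough sketch: build a snippet by choosing the sentence with the most
# query-term overlap, then bolding the matched terms for skimmability.
# A learned model could replace pick_snippet entirely.
import re


def pick_snippet(query: str, page_text: str) -> str:
    query_terms = {w.lower() for w in re.findall(r"\w+", query)}
    sentences = re.split(r"(?<=[.!?])\s+", page_text)

    def score(sentence: str) -> int:
        words = {w.lower() for w in re.findall(r"\w+", sentence)}
        return len(words & query_terms)

    best = max(sentences, key=score)
    # Wrap matched query terms in <b>...</b>, as HTML snippets do.
    return re.sub(
        r"\b(" + "|".join(map(re.escape, query_terms)) + r")\b",
        r"<b>\1</b>",
        best,
        flags=re.IGNORECASE,
    )


print(pick_snippet(
    "online ethics resources",
    "The Sophia Project collects essays on philosophy. "
    "It offers online resources in philosophy and ethics for students.",
))
</code></pre>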
How will you afford to keep the search engine up to date without expensive retraining of the entire model? My understanding is that fine-tuning will not result in the same accuracy as a full retrain.
Going by the supposedly curated examples, the Wikipedia page for the "most Jackson Pollock-like", the "most Dalai Lama-like" and the "most Elon Musk-like" figure from the 2nd century is Secundus the Silent.<p>Given that his name is Secundus and his Wikipedia short blurb mentions twice that he lived in the 2nd century AD, I think your AI has decided that he is just the most 2nd-century figure.
Congrats on launching! I found myself using this more than I expected in the closed beta. I used it most for opinionated prompts (e.g. "the PG essay I gave my parents to help them understand startups was..."), but also had some luck with finding content by its description (e.g. "I really like the intuitive explanation of [college math topic] at ...").
Apart from the gains of the civil rights movement, other factors have also been at work; for example, African American communities have suffered from very high imprisonment rates among young males.
This worked really well when I tried using it to search for papers. I didn't record the details but it was something related to converting a mesh to boundary representation.
Someone else posted a link to their for-profit search engine. I find having to log in to use the product a bit disturbing. What if I don't want my data collected?
When I read the title, I expected a search engine that finds metaphors based on my text input. Too bad there still isn't anything like that :(
It would be interesting to be able to search with descriptors of the content rather than questions / keywords / content-match searches, maybe?<p>But I feel this page is off-putting. The templates make it feel less flexible than it probably is.<p>> Here's a wikipedia page about the most Elon Musk-like figure from the 19th century:<p>This is an interesting query that you can't do in Google. I like it.<p>> Here's a cool demo of GPT-3<p>This one is bad. It's more cumbersome than a search for "GPT-3 demo" and probably not going to give you anything more noteworthy.<p>I'm curious if there's a reason three of your prompts try to identify content that is "cool"?