This is very creative. One thing to note is that this is not free (or at least is unlikely to be; OpenAI is charging for APIs). We are talking about ~1c per search at current GPT pricing (2c per ~750 words in/out). It is also indiscriminate when deployed this way, triggering on every search, even those where ChatGPT will give suboptimal responses, like 'starbucks near me', navigational queries like 'twitter', or anything else ChatGPT handles poorly in the context of search (a lot of things!). And it is non-trivial to predict which searches GPT will be good for (especially within the typical result-latency requirements of modern search engines).

We are doing some experiments with this at Kagi, and the main trick is to manage cost, possibly through an on-demand triggering mechanism (which can also help manage accuracy). One thing to keep in mind is that this is likely going to get better/faster/cheaper in the future.
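Rough back-of-the-envelope math on that, assuming the quoted pricing and some made-up word counts per query:

  // Cost per search at ~2 cents per 750 words of combined input and output.
  // The per-query word counts below are illustrative assumptions.
  const centsPer750Words = 2;
  const wordsIn = 50;    // query plus prompt scaffolding (assumed)
  const wordsOut = 300;  // typical answer length (assumed)
  const costCents = ((wordsIn + wordsOut) / 750) * centsPer750Words;
  console.log(costCents.toFixed(2)); // ~0.93 cents, i.e. roughly 1c per search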
I started trying to use ChatGPT instead of Google for a lot of my queries. The tricky part is unlearning how to search.

For instance, I had a timestamp 2022-11-27 00:00:00.000 and wanted to display it as a local string. The problem was (depending on your time zone) that if you do new Date("2022-11-27 00:00:00.000").toLocaleString(), you get 11/26/2022.

So I would have googled something like "pass timezone to toLocaleString", which would give me this answer [0]. But it's still a lot.

Instead, in ChatGPT I would ask:

> assume i'm in EST timezone, en-US
> const d = "2022-11-27T00:00:00.000Z"
> const dd = new Date(d)
> How can I print out local string as Sun Nov 27 2022?

To which it replied:

> const d = "2022-11-27T00:00:00.000Z";
> const dd = new Date(d);
> console.log(dd.toLocaleDateString("en-US", { timeZone: "America/New_York" }));

But the answer was actually wrong, which is fine! Easy to test:

> This actually prints out 11/26/2022, 7:00:00 PM. Try again

To which it replied:

> The output you are getting is correct, because the date you provided is in the UTC time zone, and you are formatting it using the Eastern Time (ET) time zone...
> const d = "2022-11-27T00:00:00.000Z";
> const dd = new Date(d);
> console.log(dd.toLocaleString("en-US", { timeZone: "UTC" }));
> // Output: "11/27/2022, 7:00:00 PM"

It was pretty incredible and much better than Stack Overflow, as now I don't have to worry about generalizing my question or adapting the answer to my particular use case. ChatGPT tends to be verbose, but the code is clearly marked and you can often ignore the words for simple tasks.

[0]: https://stackoverflow.com/questions/17478086/chrome-timezone-option-to-date-tolocalestring
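For what it's worth, neither of its suggestions prints exactly "Sun Nov 27 2022"; something like this gets closer (the option set is my own guess, not ChatGPT's output), formatting in UTC so the calendar date doesn't shift with the viewer's zone:

  const d = "2022-11-27T00:00:00.000Z";
  const dd = new Date(d);
  // Spell out the date parts and format in UTC so the day matches the ISO string.
  console.log(dd.toLocaleDateString("en-US", {
    timeZone: "UTC",
    weekday: "short",
    year: "numeric",
    month: "short",
    day: "numeric",
  }));
  // -> "Sun, Nov 27, 2022"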
ChatGPT is an absolute game changer. And I was very conservative about systems like GPT and DALL-E 2. For example, I've been very lazy about automating some things on my work laptop using PowerShell. Now I just had to ask ChatGPT to "write a PowerShell script that toggles 'use setup script' in the Windows proxy settings" and I was done in 5 minutes. Amazing. I foresee a paradigm shift in how we use Google in the next 1-2 years.
The future is likely a GPT-plus-level model, browsing-enabled, specialized in formulating queries, reading snippets and the underlying web pages.

It'll take a single expressed intent and conduct a series of queries, page readings, refined queries, and rerankings/summarizations before providing you a synthesized response.

In a second or two.

With ads.

And deeply embedded 'sponsored recommendations'.
Does it make sense to put this on the Chrome store? OpenAI is eventually going to end the beta period or Azure will run out of GPUs, whichever comes first, and then it'll no longer be accessible.

In theory we can replicate this with GPT-3, but ChatGPT has better access to its knowledge, which makes it much friendlier to interact with, when it's not being a nagging busybody.
  Human: So a train is hurtling down a track and you get to save an old man or 4 babies. What would you do?
  AI: I would choose to save the four babies, as this would have a greater long-term benefit to society as a whole.
OK, that's me dead when I get near the end of my life.
Why does this extension on Firefox need "Access your data for all websites", when it seemingly could have made do with "Access your data for sites in the “named” domain"?

https://support.mozilla.org/en-US/kb/permission-request-messages-firefox-extensions
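For comparison, my understanding is that a manifest scoped only to the sites it actually touches would look something like this (the match patterns here are my guess, not the extension's actual manifest):

  {
    "manifest_version": 3,
    "content_scripts": [
      {
        "matches": ["https://www.google.com/search*"],
        "js": ["content.js"]
      }
    ],
    "host_permissions": ["https://chat.openai.com/*"]
  }

As far as I know, it's a broad pattern like "<all_urls>" or "*://*/*" in matches or host_permissions that triggers the all-websites warning.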
Is Google in deep water? And not only because of how good ChatGPT can be, but also as far as "content" goes, because from what I can tell the quality of ChatGPT's writing is very passable for articles, especially if you know the topic you're writing on inside and out.

I have been thinking about this over the weekend, but I haven't been able to get a good feel for it on my own.
Super cool extension! And it's amazing that the interaction with ChatGPT is so easy to set up (for users).

It inspired me to fork the background code and create an extension that proposes improvements to tweets while composing them: https://github.com/matthieubulte/chat-gpt-twitter-extension
As a side note, I tried to use ChatGPT to collaboratively build a Chrome extension. It seemed a little out of date, preferring Manifest V2, and it wasn't entirely "correct" in how it generated code to match my intent.

Has anyone else tried this and had luck with it? I wonder if ChatGPT will be limited for niche integration-type code (modding games, Chrome extensions in this example, etc.) where there are not as many code examples out there.

Further, the development cycle and the ability to test easily for some of these integrations is itself a barrier, so I wouldn't put all the blame on the language models.
UPDATE: The extension has been published to the web stores:

https://chrome.google.com/webstore/detail/chatgpt-for-google/jgjaeacdkonaoafenlfkkkmbaopkbilf

https://addons.mozilla.org/addon/chatgpt-for-google/
Google Search has degraded so much (thanks to blogspam gaming SEO, and to Google, a predominantly ads business, optimizing for... well, ads) that most of us likely append "reddit" or "site:reddit.com" to our Google queries to get better-quality results.

ChatGPT is so insanely good, especially for actual knowledge questions, that it will likely replace Google as the default search for many people.
Does ChatGPT actually come with an official API as of now?

Also, regarding that extension: is it currently using the API key of the person who developed it? I believe OpenAI used to have a rule in their terms of service that forbade applications which essentially hand the API key owner's reins over to third parties.
I scanned all the comments and the GitHub issues and I can't tell if I'm the only one annoyed by this: is there any way to turn off the fake conversation/typing effect? It's incredibly distracting to have animated "typing". I think I'd almost rather it wait until the response was complete before showing it. After trying ChatGPT on OpenAI's website I see it uses the same effect, which I find annoying and gimmicky.

I went to check the network tab in the dev tools, expecting to see that the full response came back right away. Instead, I see an empty "event stream" tab, and it looks like the request took about as long as it took to "type" out the response. Maybe this is a way to show results as soon as possible, but I find it jarring. I dislike faking human-like limitations for no good reason.
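If I were patching it, I'd expect the fix to look roughly like this: buffer the streamed chunks and only render once the stream ends. The endpoint and payload below are placeholders, not the extension's actual API:

  // Read the streamed body to completion and return the full text,
  // so the UI can render it in one go instead of animating each chunk.
  async function fetchFullAnswer(prompt) {
    const resp = await fetch("https://example.com/chat", {  // hypothetical endpoint
      method: "POST",
      body: JSON.stringify({ prompt }),
    });
    const reader = resp.body.getReader();
    const decoder = new TextDecoder();
    let text = "";
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;  // stream closed: the answer is complete
      text += decoder.decode(value, { stream: true });
    }
    return text;
  }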
I just made a Chrome extension to summarize web pages (blogs, articles, news, etc.) using ChatGPT:

https://github.com/clmnin/summarize.site
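The gist of such an extension is a content script along these lines (the message shape and prompt here are illustrative sketches, not the repo's exact code):

  // Grab the page's readable text, truncate it to stay within prompt limits,
  // and hand it to the background script to send to ChatGPT for summarization.
  const pageText = document.body.innerText.slice(0, 4000);
  chrome.runtime.sendMessage({
    type: "summarize",
    prompt: `Summarize the following page in a few bullet points:\n\n${pageText}`,
  });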
And here's a Twitter bot powered by ChatGPT: https://github.com/transitive-bullshit/chatgpt-twitter-bot
Wow, I tried it, and for anyone still in doubt: it's incredibly good. I usually tend to ignore all the AI hype and "smart chatbot things", but boooy, this is soo much better.
I am quite impressed with ChatGPT, but I wish it were more opinionated. I asked it to tell me a story about Jesus and Hitler switching bodies and got the canned "I'm sorry, Dave, I'm afraid I can't do that." It seems to try to give the most CYA, PC answer sometimes.