This is why I don’t think Google is threatened by LLMs: 1) ChatGPT doesn’t do links, and 2) it has accuracy issues. I don’t understand the hysteria about Google being replaced by ChatGPT.
> A Google executive the Times spoke to but didn't name said AI chatbots like ChatGPT could upend the search giant's business, which relies heavily on ads and e-commerce found in Google Search. In a memo and audio recording obtained by the Times, the publication says CEO Sundar Pichai has been in meetings to "define Google's AI strategy" and has "upended the work of numerous groups inside the company to respond to the threat that ChatGPT poses."
Google is threatened by LLMs because they're going to improve and Google's advantages can be overcome, and because LLMs are going to completely ruin the things Google indexes (you can already see the damage from content farms, bots, etc), meaning fewer eyeballs on their ads.
Not having links can be considered a feature, depending on what kind of front end you are using. When using Siri or Alexa, I typically want an answer, not a quote from a page that needs further interaction, or a link to a page I'd have to read myself. In these cases, I'd much prefer a system like ChatGPT giving an answer directly.
It's going to take a long time until Google is obsolete, especially because a sizeable number of people will still want links. But in a voice-query situation, I'd actually prefer not having to interact with HTML at all.
I would never trust Siri or Alexa for an answer that actually mattered. The population of Budapest or tomorrow's weather, sure. The correct torque for my wheel nuts or whether it's legal to carry a gun in ____, hell no.
How often do you actually use voice queries? The only thing I use Siri for is switching music while I'm driving, and even then it's maybe 80/20 on understanding me correctly.
What's the use case anyway? Fact-checking someone in a conversation? Doesn't sound like something I'd pay for.
You're mixing things up. I totally agree that Siri and Alexa are quite unusable right now. You can basically just ask for the time and the weather, everything else fails. However, with ChatGPT as a potential backend, I think the situation would change.
It doesn’t seem impossible to extend these AIs to do maths correctly.
It would be a simple classification problem to detect whether a prompt requires a more classic maths model. If it's maths, use something like Wolfram Alpha; if it's finance, show random numbers; otherwise use a language model.
Add some relevant links on top of that and you are good to go.
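The routing idea above can be sketched in a few lines. This is a minimal illustration, not a real classifier: it uses hand-written keyword patterns as a stand-in, and the backend names (`math_engine`, `finance_feed`, `language_model`) are made up for the example.

```python
import re

def route_prompt(prompt: str) -> str:
    """Naive router: decide which backend should handle a prompt.

    A production system would use a trained classifier; this
    keyword-based sketch only illustrates the routing idea.
    """
    # Arithmetic operators between digits, or common maths terms
    if re.search(r"\d\s*[-+*/^]\s*\d|integral|derivative|solve", prompt, re.I):
        return "math_engine"      # e.g. hand off to Wolfram Alpha
    # Finance-flavoured keywords get live data, not a language model
    if re.search(r"stock|share price|exchange rate|ticker", prompt, re.I):
        return "finance_feed"
    # Everything else falls through to the LLM
    return "language_model"

print(route_prompt("What is 12 * 7?"))         # math_engine
print(route_prompt("AAPL share price today"))  # finance_feed
print(route_prompt("Who wrote Hamlet?"))       # language_model
```

A real system would replace the regexes with an intent classifier, but the dispatch structure stays the same.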
Of course, they also need to fix the confidence problem and find a way for these AIs to say that they don't know.
I recall that asking GPT-3 to write Python code to compute the answer, instead of trying to come up with the answer itself, worked a ton better.
You still have to execute the Python yourself somehow, but even with just a super simple interpreter you can build your own math-enhanced LLM assistant.
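The "execute it yourself" step can be as small as this sketch. The `generated` string here is a stand-in for whatever code the model would actually emit, and this uses plain `exec` with an emptied builtins dict; that is not a real sandbox, so untrusted model output would need proper isolation in practice.

```python
def run_generated_code(code: str) -> dict:
    """Execute model-generated Python in a bare namespace and
    return the variables it defined.

    WARNING: exec with stripped builtins is NOT a security
    boundary; a real deployment needs real sandboxing.
    """
    namespace: dict = {}
    exec(code, {"__builtins__": {}}, namespace)
    return namespace

# Stand-in for what a model might emit when asked
# "write Python to compute 6 times 7":
generated = "answer = 6 * 7"

result = run_generated_code(generated)
print(result["answer"])  # 42
```

The point is that the LLM only has to produce the program; the arithmetic itself is done deterministically by the interpreter.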