Yes, this exactly. That's why we didn't go with chat for our UX here, and we likely won't for future product areas either. We already have good UX for our kind of product, and we haven't seen feedback, or been convinced by any other means, that adding chat would help more than it would hurt.
One of the weirdest parts of using Bing Chat is that it has a tab-to-autocomplete function that is almost always wrong about what I want to say. I wish there were an LLM that actually was an "autocorrect on steroids", because that's honestly one of my most-anticipated features of this technology.
Having an LLM spell-checker that would autocorrect my spelling as I typed, based on the context of what I was typing? That would be magnificent.
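Just to make the idea concrete, here's a rough sketch of what I'm picturing: hand the surrounding context to a plain completion model and ask it to fix only the spelling. This assumes the legacy (pre-1.0) openai Python client; the model name and prompt wording are placeholders, not a real product.

    # Rough sketch of the "autocorrect on steroids" idea: feed the surrounding
    # context to a plain completion model and ask it to fix only the spelling.
    # Assumes the legacy (pre-1.0) openai Python client; the model name and
    # prompt wording are placeholders, not a real product.
    import os
    import openai

    openai.api_key = os.environ["OPENAI_API_KEY"]

    def autocorrect(text: str) -> str:
        prompt = (
            "Correct only the spelling mistakes in the text below, keeping the "
            "wording and tone unchanged. Return just the corrected text.\n\n"
            f"Text: {text}\nCorrected:"
        )
        response = openai.Completion.create(
            model="text-davinci-003",                    # assumed completion model
            prompt=prompt,
            max_tokens=max(16, 2 * len(text.split())),   # rough output budget
            temperature=0,                               # fix typos, don't rewrite
        )
        return response.choices[0].text.strip()

    print(autocorrect("Teh quick brwon fox jumps ovre the lazy dog."))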
Yeah, Copilot serves this purpose wonderfully. I've actually started writing documentation straight in VSCode, and occasionally even things like emails, Jira tickets, or general notes on anything vaguely technical, solely because Copilot is quite good at acting as a technical writing assistant.
Since it's just OpenAI's text completion model with a code finetune and without the chat/assistant RLHF, it works much better as an "advanced autocomplete" than ChatGPT or even OpenAI's Turbo model via their API. I can be much more surgical with how I use it (often accepting just a few words at a time), and it's good at following my usual tone.
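The difference is easy to feel if you hit the plain completion endpoint yourself. A rough sketch of that "few words at a time" usage, assuming the legacy (pre-1.0) openai Python client and a completion-style model name as a stand-in for whatever you have access to:

    # Rough sketch of "advanced autocomplete" against a completion endpoint:
    # raw text in, short continuation out, no chat framing or system prompt.
    # Assumes the legacy (pre-1.0) openai Python client; model name is a stand-in.
    import os
    import openai

    openai.api_key = os.environ["OPENAI_API_KEY"]

    def suggest_next_words(draft: str, n_words: int = 5) -> str:
        response = openai.Completion.create(
            model="text-davinci-003",   # assumed non-chat completion model
            prompt=draft,               # just the document so far, no chat wrapper
            max_tokens=16,              # keep suggestions short and surgical
            temperature=0.2,            # stay close to the existing tone
            stop=["\n\n"],              # don't run on into a new paragraph
        )
        continuation = response.choices[0].text
        return " ".join(continuation.split()[:n_words])

    draft = "To deploy the service, first build the container image and then"
    print(draft + " " + suggest_next_words(draft))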