What you're describing applies to coffee shops where a latte has the same amount of coffee regardless of the cup size.
Others are mostly describing someone who makes coffee for themselves at home or in a break room. That person likely chooses a cup size depending on how much coffee they want, how frequently they want a refill, etc.
You're partially right. It wouldn't matter if they had specified the grams of coffee beans used to produce those cups. It would have been better to specify both the number of cups and how they were produced.
I've been burned on OpenRouter getting routed through terrible quants with equally terrible quality, while paying maybe 15% less.
Nearly a year ago it was impossible to avoid, due to OpenRouter's silly routing algorithm and API; you had to set multiple things just right to make it work.
Similar to their other API quirks. You want a valid JSON-format response? Sure, set response_format to "json", just like our documentation suggests. Oh, it only works some of the time? How silly, why would you expect it to work all of the time? If you want it to work more often, set require_params to true. We may still use other providers that don't support it, but you want that, right? You don't? Well, then set our "very_require_params" to "very_true". And then switch a few toggles in the frontend. Oh, and also add these 7 lines just so your other config options don't break. Oh wait, they will break, how silly of us. Is there any way to make it work as advertised? Of course not!
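For what it's worth, the two real knobs involved were roughly these (a minimal sketch of an OpenRouter chat-completions payload; the field names follow their docs as I remember them, so treat the exact shape as an assumption):

```python
# Hypothetical sketch: asking OpenRouter for JSON output AND telling it
# not to route to providers that silently ignore that parameter.

def build_payload(model: str, prompt: str) -> dict:
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        # Request a JSON-formatted response...
        "response_format": {"type": "json_object"},
        # ...and restrict routing to providers that actually honor
        # every parameter in the request (assumed field name).
        "provider": {"require_parameters": True},
    }

payload = build_payload("some/open-model", "Reply with a JSON object.")
```

Without the `provider` block, the request is still "valid" and a provider that drops `response_format` can be silently picked, which is exactly the failure mode above.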
Sorry for the semi-offtopic rant. I still use them every day, just not for open models anymore.
A fairer comparison would be writing or talking about the Russian language in English. That way you'd still focus on Russian. Same with programming: it's not like you stop seeing any code, so why should you forget it?
Seems plausible. I used to refer to StackOverflow before LLMs, and a good portion of the examples there were flawed code presented as working. If the LLM had less junk in its training data, it might benefit even though the volume of training on that language is lower.