My "trick" was to divide things into batches (which can be big with LLMs with larger context sizes) and classify the items in each batch, then take the resulting categories from each batch and feed them into an LLM to group semantically similar categories into groups with a representative category for each group. The representative category can be chosen from the group or created by the LLM. This is an over-simplification of the process but that's the gist of it.
Language support is not mentioned in the repo.
The paper, though, mentions extensive multilingual support (nearly 100 languages), which is good, but I need to test it to see how it compares to Gemini and Mistral OCR.
Claude Skills seem to be the option that offers the most flexibility for adding capabilities with the least complexity. Better than MCP, in my opinion. I hope it becomes a standard and gets adopted by OpenAI and the rest of the labs.
Good question! I selected the edition with the smallest Goodreads ID¹ that has both a publication date and a cover photo available. If no edition has them, we fall back to the one with the smallest ID overall.
And you're right: in a few cases this resulted in a less widely read edition being picked for some books.
1: Assuming smaller ID means earlier addition to Goodreads' database.
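For illustration, the selection rule is roughly the following (field names like `goodreads_id`, `pub_date`, and `cover_url` are placeholders, not the actual schema):

```python
def pick_edition(editions: list[dict]) -> dict:
    """Prefer the smallest-ID edition that has both a publication date and a
    cover photo; otherwise fall back to the smallest ID overall."""
    complete = [e for e in editions if e.get("pub_date") and e.get("cover_url")]
    pool = complete or editions
    return min(pool, key=lambda e: e["goodreads_id"])
```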
I have Raycast extensions for GPT and Claude models. Whenever I have a question, the most powerful LLMs in the world are two keystrokes away.
This is easier than switching to the browser, finding the ChatGPT tab, and creating a new chat, for example.
I found myself using LLMs more and getting more out of them because of this frictionless interaction. They've become more like actual "helpful assistants."
Can you explain more? Like which tool do you use for this wiki page? Or is it an internal tool? And do you use it to write meeting notes and then discuss on the same page?
If it's a discussion that's "too big" for Slack/Teams, we create a Confluence wiki page to lay out the details, discuss it using the Talk add-on (which lets you make in-line comments), and then have a meeting to go over it.