Experience with medium sized LLMs
I have tried several models on my 8 GB RAM MacBook and concluded that 4B-parameter models are just "stupid" for my tasks (e.g. summarising PDFs, language learning, etc.).