I can see why, a year or so ago, MaxQDA needed to impose a limit of 120,000 characters (which corresponds to roughly 30,000 tokens), but the context windows of current LLMs are far larger than this: Anthropic's models, for example, have a 200,000-token (not character!) limit. So I'm wondering whether there are any plans to increase the limits in MaxQDA's AI Assist...
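To illustrate the character-to-token relationship mentioned above, here is a minimal sketch using the common rule of thumb of about 4 characters per token for English text (this ratio is only a heuristic and varies by language and tokenizer, not an exact figure from any vendor):

```python
# Heuristic average for English text; actual tokenizers vary.
CHARS_PER_TOKEN = 4

def estimate_tokens(char_count: int) -> int:
    """Rough token estimate for a given character count."""
    return char_count // CHARS_PER_TOKEN

def max_chars_for_window(token_window: int) -> int:
    """Rough character capacity of a given token context window."""
    return token_window * CHARS_PER_TOKEN

# The 120,000-character limit from the post maps to ~30,000 tokens:
print(estimate_tokens(120_000))       # -> 30000

# A 200,000-token window could fit roughly 800,000 characters:
print(max_chars_for_window(200_000))  # -> 800000
```

Under this back-of-the-envelope assumption, a model with a 200,000-token window could accept documents several times longer than the current 120,000-character cap.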