I can see why, a year or so ago, MaxQDA needed to impose a limit of 120,000 characters (which corresponds to roughly 30,000 tokens), but the context windows of current LLMs are far larger: Anthropic's models, for example, have a 200,000-token (not character!) limit. So I'm wondering whether there are any plans to increase the limits in MaxQDA's AI Assist...
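A back-of-the-envelope sketch of the numbers above, assuming the common rule of thumb of ~4 characters per token for English text (actual ratios vary by tokenizer and language):

```python
# Rough character-to-token estimate using the ~4 chars/token heuristic.
MAXQDA_CHAR_LIMIT = 120_000          # AI Assist limit mentioned above
CHARS_PER_TOKEN = 4                  # rule of thumb, not exact

estimated_tokens = MAXQDA_CHAR_LIMIT // CHARS_PER_TOKEN
print(estimated_tokens)              # 30000

ANTHROPIC_CONTEXT_TOKENS = 200_000   # Claude context window (tokens, not characters)
print(round(ANTHROPIC_CONTEXT_TOKENS / estimated_tokens, 1))  # 6.7
```

So under that heuristic, the current limit uses only about a sixth of the context window Anthropic's models offer.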
Khaled Alostath posted about 1 month ago (Administrator)
Dear Haug,
For feedback and suggestions about future functions, please submit your inquiry via this link: https://www.maxqda.com/maxqda-feedback
Best regards,
Khaled