I can see why, a year or so ago, MAXQDA needed to impose a limit of 120,000 characters (roughly 30,000 tokens). But the context windows of current LLMs are far larger than this — Anthropic's models, for example, have a 200,000 token (not character!) limit — so I'm wondering whether there are any plans to increase the limits in MAXQDA's AI Assist.
Khaled Alostath (Admin) posted about 1 month ago
Dear Haug,
For feedback and suggestions about future functions, please submit your inquiry via this link: https://www.maxqda.com/maxqda-feedback
Best regards,
Khaled