It would be great if you could allow an optional (large!) download of an open-source LLM. We'd have to investigate the performance of such a model, both in terms of latency and accuracy.
There could also be a cheaper subscription plan for people who only use offline model(s), since Selkie wouldn't have any per-user monthly overhead for them.
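A minimal way to start the latency side of that investigation is a small timing harness around whichever local-model call eventually gets wired in. The `generate` callable and the `fake_generate` stand-in below are purely illustrative assumptions, not Selkie's actual API:

```python
import time
import statistics

def measure_latency(generate, prompts, warmup=1):
    """Time a text-generation callable over a set of prompts.

    `generate` is a placeholder for whatever local-model call is
    under evaluation (e.g. a llama.cpp or ONNX Runtime wrapper).
    Returns median and worst-case latency in milliseconds.
    """
    # Warm-up runs so one-time model loading doesn't skew the numbers.
    for _ in range(warmup):
        generate(prompts[0])

    timings = []
    for prompt in prompts:
        start = time.perf_counter()
        generate(prompt)
        timings.append((time.perf_counter() - start) * 1000.0)

    return {
        "median_ms": statistics.median(timings),
        "max_ms": max(timings),
    }

# Stand-in for a real local-model call, purely for illustration.
def fake_generate(prompt):
    return prompt.upper()

stats = measure_latency(fake_generate, ["hello", "world", "test"])
print(sorted(stats))  # prints ['max_ms', 'median_ms']
```

Running the same harness against a cloud model and a local one would give a direct latency comparison; accuracy would need a separate evaluation set.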
Completed
Feature Request
About 1 year ago

joethephish