Support offline LLMs

It would be great if you could allow an optional (large!) download of an open-source LLM. We'd have to investigate the performance of such a model, in terms of both latency and accuracy.
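
For reference, a minimal sketch of what running a locally downloaded model and timing a response could look like, assuming a GGUF model file and the llama-cpp-python bindings (both illustrative assumptions, not Selkie's actual stack; the model path is hypothetical):

```python
# Sketch: load a locally downloaded open-source model and measure
# the latency of a single completion. Assumes llama-cpp-python is
# installed and a GGUF model file has already been downloaded.
import time
from llama_cpp import Llama

MODEL_PATH = "models/mistral-7b-instruct.Q4_K_M.gguf"  # hypothetical local download (~4 GB)

llm = Llama(model_path=MODEL_PATH, n_ctx=2048, verbose=False)

start = time.perf_counter()
result = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarise this note in one sentence: ..."}],
    max_tokens=128,
)
elapsed = time.perf_counter() - start

print(result["choices"][0]["message"]["content"])
print(f"Latency: {elapsed:.2f}s")
```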

There could potentially also be a cheaper subscription tier for people who only use offline model(s), since Selkie wouldn't incur any per-user monthly overhead for them.

Status: Completed
Board: 💡 Feature Request
Date: About 1 year ago
Author: joethephish
