Global community pools to share LocalAI federated instances and workers #3113
Comments
This would likely be a new golang app that can be deployed e.g. in Vercel and would need a simple form for users to submit tokens. I see two sections in this app:
Thinking about it again: there's no need for Vercel or a dynamic web app at all. It can all be static, with GitHub workflow pipelines running "cron" jobs to update the data.
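The static approach above could look something like the following workflow sketch. This is a hypothetical pipeline: the schedule, paths (`./cmd/refresh`, `data/pools.json`), and job names are assumptions for illustration, not the actual explorer's setup.

```yaml
# Hypothetical scheduled workflow that refreshes static pool data.
name: refresh-pool-data
on:
  schedule:
    - cron: "*/30 * * * *"   # every 30 minutes
  workflow_dispatch:          # allow manual runs too
jobs:
  refresh:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Update pool data
        run: go run ./cmd/refresh > data/pools.json
      - name: Commit refreshed data
        run: |
          git config user.name "github-actions"
          git config user.email "actions@github.com"
          git add data/pools.json
          git commit -m "refresh pool data" || true
          git push
```

The static site then just reads `data/pools.json` at page load, so no server-side component is needed.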
Scratch that - too complicated to add new tokens then.
https://explorer.localai.io is now live. It's still missing some UX around how to run things, but that's low-hanging fruit on the documentation side that will be addressed in follow-ups.
What do you think about some kind of "credit points"? Basically, we could reward worker nodes that do computation in the cluster, and these credits could be spent directly as completion tokens in the same cluster. In that case, I think we could just take the ERC-20 standard as an example and deploy it to a blockchain.
Now that we have federation support (#2915 and #2343), it makes sense to build a place under the LocalAI website to list and visualize community pools.
By community pools, I'm referring to a way for people to share swarm tokens, so they can both provide hardware capabilities and use the federation for inference (like Petals, but with more "shards").
The idea is to have an "explorer" or "dashboard" that shows a list of active pools, how many federated instances or llama.cpp workers each has, and their reported capability and availability.
Things to notice:
This would allow users to: