Problem
Waiting is painful, and re-embedding hundreds of thousands of inputs over and over takes a long time. On top of that, larger models take up a lot of disk space: one million inputs for three models across both baselines comes to roughly 6 GB.
Beyond that, there is a vast range of data visualizations that could be built, plus warnings and alerts, recommendations based on which flags are being triggered, and so on.
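Even before any hosted service exists, the re-embedding cost described above can be avoided locally with a content-addressed cache. The sketch below is a minimal, hypothetical example (the `embedding_cache` directory name and `embed_fn` callback are assumptions, not part of this project): each embedding is keyed by a hash of the model name plus the input text, so unchanged inputs are never embedded twice.

```python
import hashlib
import json
import os

CACHE_DIR = "embedding_cache"  # hypothetical local cache directory


def _cache_path(model_name: str, text: str) -> str:
    # Stable key: hash of the model name plus the input text,
    # so the same input embedded by different models gets separate entries.
    digest = hashlib.sha256(f"{model_name}:{text}".encode()).hexdigest()
    return os.path.join(CACHE_DIR, f"{digest}.json")


def get_or_embed(model_name: str, text: str, embed_fn):
    """Return a cached embedding if present; otherwise compute and store it.

    embed_fn is any callable mapping text -> list of floats (e.g. a model call).
    """
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = _cache_path(model_name, text)
    if os.path.exists(path):
        with open(path) as f:
            return json.load(f)
    vec = embed_fn(text)
    with open(path, "w") as f:
        json.dump(vec, f)
    return vec
```

On a second run over the same inputs, `embed_fn` is only invoked for inputs that were not seen before, which is exactly the "embedding the same inputs over and over" cost this issue is about. A remote version of the service would just move this cache behind an API.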
Solution
This whole platform could be a paid service where people upload their I/Os and their embeddings are stored remotely. That would involve building a full front end with user login, a backend API for one-off calls, and a file-upload system for large training-data files. It would be a fun project in its own right, but it obviously involves cloud server costs to process embeddings quickly. If you do decide to make a business out of this, give us a call; we would love to help.
Additional information
No response
👨‍👧‍👦 Contributing
- 🙋‍♂️ Yes, I'd love to make a PR to implement this feature!