Private AI is designed primarily to be self-hosted by the user via a container, providing the best possible latency and keeping sensitive data within the user's own infrastructure. Sending sensitive data across the Internet to a third-party system would be counter-productive when the goal is preserving privacy. Self-hosting also ensures that Private AI never sees or handles customer data, unlike alternative services, which typically retain the right to use any data passed through their systems for service improvements and ML model development.
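As an illustration, a self-hosted deployment might look like the following. The image name, tag, port, and environment variable here are placeholders, not the actual product values; consult the installation pages for the real ones.

```shell
# Pull the container image (hypothetical name and tag).
docker pull example/private-ai:latest

# Run it locally. All processing happens on this host;
# no customer data leaves the machine.
docker run --rm -p 8080:8080 \
  -e LICENSE_KEY="<your-license-key>" \
  example/private-ai:latest
```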

Instead of running the container locally, it is also possible to use a cloud version of the API, hosted at the following endpoint:
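For the cloud API, a request could be constructed roughly as sketched below. The route, payload shape, and header names here are assumptions for illustration only; the actual endpoint URL, API version, and request schema are defined in the API reference.

```python
import json
import urllib.request

# Hypothetical endpoint -- substitute the real cloud URL and route.
API_URL = "https://<cloud-endpoint>/v1/process_text"

def build_request(text: str, api_key: str) -> urllib.request.Request:
    """Construct (but do not send) a de-identification request.

    The {"text": [...]} payload shape and the x-api-key header are
    assumptions; check the API reference for the real field names.
    """
    payload = json.dumps({"text": [text]}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={"Content-Type": "application/json", "x-api-key": api_key},
        method="POST",
    )

req = build_request("John lives in Berlin.", api_key="<your-key>")
```

The request object can then be sent with `urllib.request.urlopen(req)` (or any HTTP client) once a real endpoint and key are in place.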

Custom integrations that do not rely on Docker can also be delivered upon request.

Installation is organized as follows:

© Copyright 2024 Private AI.