Security is not a feature we added. It is how the architecture works. Source code stays local, API keys stay in the OS keychain, and PII gets redacted before any LLM call.
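To make the redaction claim concrete, here is a minimal sketch of a pre-flight PII redactor. The class name and patterns are illustrative, not the plugin's actual implementation; a real redactor would cover more categories (names, phone numbers, keys) than the two shown.

```java
import java.util.regex.Pattern;

// Illustrative sketch of pre-flight PII redaction (class name and
// patterns are hypothetical, not the plugin's actual rule set).
class PiiRedactor {
    // Example patterns: email addresses and US-style SSNs.
    private static final Pattern EMAIL =
        Pattern.compile("[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\\.[A-Za-z]{2,}");
    private static final Pattern SSN =
        Pattern.compile("\\b\\d{3}-\\d{2}-\\d{4}\\b");

    public static String redact(String source) {
        String out = EMAIL.matcher(source).replaceAll("[REDACTED_EMAIL]");
        return SSN.matcher(out).replaceAll("[REDACTED_SSN]");
    }

    public static void main(String[] args) {
        // prints: contact [REDACTED_EMAIL], SSN [REDACTED_SSN]
        System.out.println(redact("contact jane@example.com, SSN 123-45-6789"));
    }
}
```

The key property is that `redact` runs on the outbound payload before it leaves the machine, so raw identifiers never reach the provider.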
These rules are enforced at the code level. They cannot be overridden by configuration, user input, or admin settings.
The plugin runs inside Android Studio on the developer's machine. When a flow unit reaches an AI phase, the plugin sends the code directly to the LLM provider API using the developer's own key.
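The direct plugin-to-provider call can be sketched as a plain HTTPS request built on the developer's machine. The endpoint URL and payload shape below are placeholders, not a specific provider's API; the point is that the developer's key goes in the `Authorization` header and the request never transits our infrastructure.

```java
import java.net.URI;
import java.net.http.HttpRequest;

// Sketch of the direct plugin -> provider call. The endpoint and JSON
// shape are illustrative placeholders; the real provider and model are
// whatever the developer has configured.
class DirectLlmCall {
    public static HttpRequest build(String apiKey, String redactedCode) {
        String body = "{\"input\":" + jsonString(redactedCode) + "}";
        return HttpRequest.newBuilder(
                URI.create("https://api.provider.example/v1/completions"))
            .header("Authorization", "Bearer " + apiKey) // developer's own key
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build();
    }

    private static String jsonString(String s) {
        return "\"" + s.replace("\\", "\\\\").replace("\"", "\\\"") + "\"";
    }
}
```

Sending it with `HttpClient.newHttpClient().send(...)` completes the round trip; nothing in this path touches our backend.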
Our backend server never sees source code. It receives only metadata: flow unit status changes, phase transitions, team reports, and LLM usage logs (token counts and costs, not content).
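A metadata-only event might look like the sketch below (field names are hypothetical). The structural guarantee is that the type simply has no field that could carry source code or prompt content.

```java
import java.util.Locale;

// Sketch of a metadata-only usage event (field names are assumptions).
// There is no field for code, prompts, or completions -- only counts.
class UsageEvent {
    public final String flowUnitId;
    public final String phase;          // e.g. a phase-transition label
    public final int promptTokens;
    public final int completionTokens;
    public final double costUsd;

    public UsageEvent(String flowUnitId, String phase,
                      int promptTokens, int completionTokens, double costUsd) {
        this.flowUnitId = flowUnitId;
        this.phase = phase;
        this.promptTokens = promptTokens;
        this.completionTokens = completionTokens;
        this.costUsd = costUsd;
    }

    public String toJson() {
        return String.format(Locale.ROOT,
            "{\"flowUnitId\":\"%s\",\"phase\":\"%s\",\"promptTokens\":%d,"
            + "\"completionTokens\":%d,\"costUsd\":%.4f}",
            flowUnitId, phase, promptTokens, completionTokens, costUsd);
    }
}
```

Because the schema only admits identifiers and numbers, a bug cannot quietly start shipping content to the backend; it would require changing the type itself.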
API keys for LLM providers are stored in the OS keychain via IntelliJ's PasswordSafe API. They are never written to settings files, environment variables, or log output.
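A common complement to keychain storage is making the in-memory key hard to leak by accident. The wrapper below is an illustrative pattern, not the plugin's actual code: the secret read from the keychain is held behind a type whose `toString()` masks it, so log statements and debug dumps never print the raw value.

```java
// Illustrative guard (not the actual plugin code): the value fetched from
// the OS keychain is wrapped so that logging or toString() can never
// expose it; the raw key is revealed only when building the auth header.
final class ApiKey {
    private final char[] value;

    public ApiKey(char[] value) {
        this.value = value.clone();
    }

    // Called only at the point the Authorization header is assembled.
    public String reveal() {
        return new String(value);
    }

    @Override
    public String toString() {
        return "ApiKey(****)"; // safe to log
    }
}
```

With this shape, writing the key to a settings file or log would require an explicit `reveal()` call, which is easy to forbid in review.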