For end users and teams who want full control over their data and lower costs, including support for local LLMs.
You can run the alwyse backend (API, agents, memory store) on your own hardware or in your own cloud. Your memories, profile, and all AI processing stay under your control. The alwyse apps (web, iOS, macOS) connect to your instance instead of the shared cloud.
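In practice, "connect to your instance instead of the shared cloud" means the apps send their API calls to a base URL you control. A minimal sketch of that idea, with purely illustrative names and URLs (the real alwyse apps expose an equivalent setting in their own configuration):

```python
from dataclasses import dataclass

# Hypothetical default for the shared cloud; placeholder URL only.
DEFAULT_CLOUD = "https://api.alwyse.example"

@dataclass
class ClientConfig:
    base_url: str = DEFAULT_CLOUD  # where the app sends all API calls

    def is_self_hosted(self) -> bool:
        # Anything other than the shared-cloud URL is your own instance.
        return self.base_url != DEFAULT_CLOUD

# Point the app at your private deployment instead of the shared cloud.
cfg = ClientConfig(base_url="https://alwyse.internal.mynet:8443")
print(cfg.is_self_hosted())  # True
```

Once the base URL points at your instance, every request (memories, profile, agent calls) goes to hardware you operate.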
To use your private instance from anywhere, for example from your phone while away from home, alwyse offers a relay feature. The relay is a lightweight tunnel that lets the app reach your server without exposing your home network or opening ports.
When you use the relay to connect to your private deployment, traffic is encrypted end-to-end. TLS is terminated on your machine (your server), not on the relay. The relay server only sees encrypted bytes—it cannot read your API calls, your memories, or your voice sessions.
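The principle behind this is standard TLS: the TCP connection may go to the relay, but the TLS handshake, and therefore certificate verification, is done against your server's hostname, so only your server can complete it. A hedged sketch using Python's `ssl` module; the hostnames and port are illustrative, not real alwyse endpoints:

```python
import socket
import ssl

# Illustrative addresses only.
RELAY_HOST, RELAY_PORT = "relay.alwyse.example", 443
ORIGIN_HOSTNAME = "alwyse.home.mynet"  # the name on YOUR server's certificate

# A default context verifies the peer certificate and its hostname,
# so the relay cannot impersonate your server.
ctx = ssl.create_default_context()
assert ctx.check_hostname

def open_end_to_end(sock: socket.socket) -> ssl.SSLSocket:
    # The relay only shuttles these encrypted bytes onward; the TLS
    # session terminates on the machine holding ORIGIN_HOSTNAME's
    # private key, i.e. your server, not the relay.
    return ctx.wrap_socket(sock, server_hostname=ORIGIN_HOSTNAME)
```

Because `server_hostname` names your origin rather than the relay, the relay sees ciphertext only: it can neither read nor modify the API calls, memories, or voice sessions flowing through it.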
No intermediary (including the relay operator or the hosting provider) can read your data. You get the convenience of reaching your private instance from anywhere, with the same privacy guarantees as if you were on your local network.
Every agent in alwyse has a trustworthiness report so you know what it does before you activate it. In a private deployment, you combine that transparency with full control over where data lives and who can access it.
You choose which agents are active. Your data stays in your environment. The relay does not see the content of your traffic. Together, this gives you both informed choice (trust reports) and technical control (private deployment and end-to-end encryption).
Learn about privacy & trust →

alwyse is a personal intelligence: all the agents you activate use LLMs to work with your memories, patterns, and context. When you host alwyse on your own private server, you can run those LLMs locally, on your own hardware or in your own VPC, instead of sending every call to a cloud LLM provider.
That keeps prompts and responses inside your infrastructure and can significantly reduce your alwyse costs: you pay only for your own compute and hosting, with no third-party token fees for that traffic. For teams and power users, private deployment with local LLMs is a way to get the full benefit of your personal intelligence while controlling both cost and data.
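As an illustration of what "run those LLMs locally" can look like: many local LLM servers (Ollama and vLLM, for example) expose an OpenAI-compatible chat endpoint on localhost. The URL and model name below are assumptions; point them at whatever you actually run on your hardware or in your VPC:

```python
import json

# Assumed local endpoint (11434 is Ollama's default port); adjust to
# whatever OpenAI-compatible server you run inside your infrastructure.
LOCAL_LLM_URL = "http://localhost:11434/v1/chat/completions"

def build_request(prompt: str, model: str = "llama3") -> bytes:
    # The prompt and the eventual response never leave your network:
    # the only hop is to a server you operate, so no per-token fees.
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")

payload = build_request("Summarize my notes from yesterday.")
print(json.loads(payload)["model"])  # llama3
```

Swapping between a cloud provider and a local model is then just a matter of which URL and model name the backend is configured with.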
Trustworthiness reports for the alwyse infrastructure and for each agent are authored by security experts. They give you a clear view of how the system and each helper are designed and reviewed.
When alwyse is hosted in the cloud, it uses confidential-computing infrastructure in combination with modern on-device trusted platform technologies. Together, these provide a provable guarantee: no CVOYA employee, no employee of the cloud provider, and no law enforcement agency can ever see your data.
alwyse is built so that users and organizations can have a single “second brain” experience while keeping data and spend under their control when they need it.