Security and encryption in Privatemode

Privatemode is the first AI service with true end-to-end confidential computing. This page describes why you need this and how it works.

The problem

Existing AI services process your data in plaintext

Current AI services—like ChatGPT for end users and Azure AI for businesses—don't have technical mechanisms in place to enforce data security and privacy end-to-end.

Thus, your data—such as prompts and responses—remains vulnerable to inside-out leaks and outside-in attacks. This is why many businesses and individuals are reluctant to share sensitive data with AI services.

Potential threats include malicious insiders and hackers.

The solution

Privatemode protects your data end-to-end

In Privatemode, your data is processed in a shielded environment. This environment is created with the help of a hardware-based technology called confidential computing, which—among other mechanisms—keeps your data encrypted even during processing in main memory.

Finally, you can process even your sensitive data with AI.

The remainder of this page explains how this works.


The three pillars of Privatemode

Pillar #1
Encryption at rest, in transit, and in use

End-to-end confidential computing

In Privatemode, prompts and responses are fully protected from external access. Prompts are encrypted client-side using AES-256 and decrypted only within Privatemode’s confidential-computing environment (“the box”), enforced by AMD CPUs and Nvidia H100 GPUs. Within the box, the data remains encrypted in use, ensuring it never appears as plaintext in main memory.

Pillar #2
Remote attestation

End-to-end attestation and verification

Remote attestation lets the Privatemode client verify, before any data is sent, that it is connected to a genuine confidential-computing environment running exactly the expected software. The CPU and GPU hardware issue cryptographically signed attestation statements describing the environment and its software stack. The client checks these statements against known reference values; only if verification succeeds does it encrypt and release data to the service.

Pillar #3
Black-box architecture

Black-box architecture

Privatemode is architected such that user data can neither be accessed by the infrastructure provider (for example, Azure), nor the service provider (Edgeless Systems), nor other parties such as the provider of the AI model (for example, Meta). While confidential-computing mechanisms prevent outside-in access, sandboxing mechanisms and end-to-end remote attestation prevent inside-out leaks.
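The client-side rule behind the black-box guarantee can be summarized in a few lines: encrypted data is released only to an endpoint whose attestation has verified. The function name and types below are illustrative, not the Privatemode API.

```go
package main

import (
	"errors"
	"fmt"
)

// submitPrompt releases ciphertext only if attestation of the remote
// environment succeeded. Otherwise, no data leaves the client.
func submitPrompt(attestationOK bool, ciphertext []byte) error {
	if !attestationOK {
		return errors.New("attestation failed: refusing to send data")
	}
	// At this point the client knows it is talking to the verified
	// environment, so sending the encrypted prompt is safe.
	fmt.Printf("sent %d encrypted bytes\n", len(ciphertext))
	return nil
}

func main() {
	ct := []byte{0xde, 0xad, 0xbe, 0xef}
	fmt.Println(submitPrompt(true, ct))  // sends the data, then prints "<nil>"
	fmt.Println(submitPrompt(false, ct)) // prints the refusal error
}
```

Combined with memory encryption inside the environment, this gate means neither the infrastructure provider nor the service provider ever handles plaintext user data.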

Want to learn more about confidential computing?

Read the Confidential Computing Wiki