Low Level Design: Zero Trust Network Architecture

Introduction

Zero trust replaces implicit perimeter-based trust with continuous verification. The core principle is: never trust, always verify. Every request is authenticated, authorized, and encrypted regardless of network location. The model is motivated by the collapse of the traditional perimeter — remote work, cloud migration, and insider threats have made network location an unreliable signal of trust.

Core Principles

Verify explicitly: authenticate and authorize every request using all available data — identity, location, device health, service context, and data sensitivity. Use least privilege access: limit access to the minimum required for the user’s role and task. Assume breach: minimize blast radius through segmentation, restrict lateral movement, and monitor everything with the assumption that adversaries are already inside.

Identity Verification

Every user and service has a strong, verifiable identity. Humans authenticate via an IdP with MFA required. Services authenticate via workload identity — SPIFFE/SPIRE, service account tokens, or mTLS client certificates. Identity is never implicit from network location: being on the corporate network grants nothing by itself. Identity tokens are short-lived (1 hour) and continuously refreshed to limit the window of token compromise.
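The short-lived-token idea above can be sketched as a minimal signing-and-verification flow. This is illustrative only: it uses a symmetric HMAC key and hand-rolled claims, whereas a real IdP would issue asymmetrically signed JWTs; the key and TTL values are assumptions.

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"demo-signing-key"  # hypothetical key; a real IdP uses asymmetric keys
TOKEN_TTL = 3600              # one-hour lifetime, per the design above

def issue_token(subject, now=None):
    """Issue a short-lived, HMAC-signed identity token (illustrative only)."""
    now = time.time() if now is None else now
    claims = {"sub": subject, "iat": int(now), "exp": int(now) + TOKEN_TTL}
    body = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode()
    sig = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    return f"{body}.{sig}"

def verify_token(token, now=None):
    """Return claims if the signature is valid and the token unexpired, else None."""
    now = time.time() if now is None else now
    body, _, sig = token.rpartition(".")
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # tampered or forged token
    claims = json.loads(base64.urlsafe_b64decode(body))
    return claims if claims["exp"] > now else None  # expired tokens are rejected
```

The point of the short expiry is visible in `verify_token`: a stolen token stops working on its own once `exp` passes, without any revocation infrastructure.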

Device Trust

Device compliance is checked before granting access. MDM (Mobile Device Management) reports device posture: OS version, disk encryption status, screen lock enforcement, and security agent installation. A device certificate is issued by the MDM and presented alongside user identity at the access layer. Non-compliant devices are denied access or restricted to low-sensitivity resources. BYOD devices receive more restricted access than managed corporate devices.
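A posture-to-tier mapping like the one described might look as follows. The field names, minimum OS version, and tier labels are assumptions for illustration, not a real MDM schema:

```python
from dataclasses import dataclass

@dataclass
class DevicePosture:
    """Posture report as an MDM might surface it (illustrative field names)."""
    os_version: tuple      # e.g. (14, 5)
    disk_encrypted: bool
    screen_lock: bool
    agent_installed: bool
    managed: bool          # corporate-managed vs. BYOD

MIN_OS = (14, 0)  # hypothetical minimum OS version

def access_tier(p: DevicePosture) -> str:
    """Map device posture to an access tier: full, restricted, or denied."""
    compliant = (p.os_version >= MIN_OS and p.disk_encrypted
                 and p.screen_lock and p.agent_installed)
    if not compliant:
        return "denied"      # non-compliant devices are denied access
    # BYOD devices pass compliance but receive a reduced tier, per the design above.
    return "full" if p.managed else "restricted"
```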

Policy Enforcement Points

In the BeyondCorp-style model, all requests to internal resources are routed through an access proxy that evaluates policy — user identity, device trust, resource sensitivity, and request context — before forwarding. Alternatively, enforcement can be pushed to the service level via a sidecar in a service mesh (e.g., Istio with OPA). The policy engine (OPA) evaluates policy rules centrally; the Policy Enforcement Point (PEP) enforces the decision. Policies support hot reload without requiring a restart.
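The engine/PEP split can be sketched with an in-process stand-in for the central policy engine (a real deployment would query OPA over its REST API; the rule schema here is an assumption). Note the default-deny behavior and the hot-reload method:

```python
class PolicyEngine:
    """In-process stand-in for a central policy engine such as OPA."""

    def __init__(self, rules):
        # rules: {(resource, action): required request attributes}
        self._rules = rules

    def reload(self, rules):
        """Hot-reload: swap the rule set without restarting the enforcement point."""
        self._rules = rules

    def decide(self, request):
        required = self._rules.get((request["resource"], request["action"]))
        if required is None:
            return False  # default-deny: no matching rule means no access
        return all(request.get(k) == v for k, v in required.items())

def enforce(engine, request):
    """The PEP consults the engine and enforces whatever it decides."""
    return "forward" if engine.decide(request) else "deny"
```

Keeping `decide` and `enforce` separate mirrors the design: policy logic lives in one central place, while many PEPs (proxies, sidecars) simply apply its answers.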

Micro-Segmentation

The network is divided into small segments with traffic between segments controlled by explicit policy. Lateral movement is limited: a compromised service cannot freely reach all other services. Micro-segmentation is implemented via service mesh mTLS (each service can only call explicitly authorized peers), cloud security groups (per-workload firewall rules), or software-defined networking. Firewall rules are defined as code and reviewed via pull request, creating an auditable change history.
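The rules-as-code idea can be illustrated with an explicit allow-list of caller/callee pairs (the service names are hypothetical). A second helper computes the transitive reach of a service — a rough measure of the blast radius if it is compromised:

```python
# Firewall/mesh rules as code: an explicit allow-list of (caller, callee) pairs.
# In practice these would live in reviewed config files, not Python literals.
ALLOWED_CALLS = {
    ("web", "api"),
    ("api", "db"),
    ("api", "cache"),
}

def may_call(src, dst):
    """Default-deny: traffic is allowed only if an explicit rule exists."""
    return (src, dst) in ALLOWED_CALLS

def reachable_from(src):
    """Blast radius of a compromised service: everything reachable transitively."""
    seen, frontier = set(), {src}
    while frontier:
        frontier = {d for (s, d) in ALLOWED_CALLS if s in frontier and d not in seen}
        seen |= frontier
    return seen
```

Note that `web` cannot call `db` directly, yet `db` is in its transitive reach via `api` — segmentation limits lateral movement but does not eliminate multi-hop paths, which is why each hop is still authenticated and authorized.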

Continuous Authorization

Authorization is not evaluated only at login — it is re-evaluated on each request. Context signals include: user risk score (recent failed logins, unusual location), device health (compliance check timestamp), data sensitivity level, and time of day. Step-up authentication is triggered when the risk score crosses a threshold — the user is prompted to re-authenticate before accessing a sensitive operation. Sessions are revoked immediately if the risk score exceeds a critical threshold.
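A minimal sketch of the two-threshold scheme above, with assumed signal names and illustrative weights (a production system would tune these against observed incidents):

```python
def risk_score(ctx):
    """Combine context signals into a risk score (weights are illustrative)."""
    score = 0
    score += 20 * ctx.get("failed_logins", 0)
    if ctx.get("unusual_location"):
        score += 30
    if ctx.get("stale_device_check"):  # compliance check older than policy allows
        score += 25
    if ctx.get("sensitive_resource"):
        score += 15
    return score

STEP_UP = 50   # above this, prompt the user to re-authenticate
REVOKE = 100   # above this, revoke the session immediately

def authorize(ctx):
    """Per-request decision: allow, demand step-up auth, or revoke the session."""
    s = risk_score(ctx)
    if s >= REVOKE:
        return "revoke"
    if s >= STEP_UP:
        return "step_up"
    return "allow"
```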

BeyondCorp Implementation

Google’s BeyondCorp model enforces access at the application layer rather than the network layer. The access proxy checks user identity (via LDAP or IdP) and device certificate validity on every request. Resource policies define (user_group, device_type, access_level) tuples that govern what can be accessed and under what conditions. Because enforcement is application-layer, VPN is not required — users work from any network with the same security posture as they would have on the corporate LAN.
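The (user_group, device_type, access_level) tuples above can be sketched as a lookup the access proxy performs per request. The resource names, groups, and levels are invented for illustration:

```python
# Resource policies as (user_group, device_type, access_level) tuples,
# following the BeyondCorp model described above (values are illustrative).
POLICIES = {
    "wiki":    [("employees", "managed", "read_write"),
                ("employees", "byod",    "read_only")],
    "payroll": [("finance",   "managed", "read_write")],
}

def access_level(resource, user_groups, device_type):
    """Return the granted access level, or None (deny) if no tuple matches."""
    for group, device, level in POLICIES.get(resource, []):
        if group in user_groups and device == device_type:
            return level
    return None  # default-deny: unmatched requests get nothing
```

Because the proxy resolves these tuples per request, the same user gets read-write access to the wiki from a managed laptop but only read-only access from a BYOD phone — and no payroll access at all without the finance group.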
