Snap Interview Guide 2026: AR Engineering, Camera Systems, and Real-Time Communication at Scale

Snap Inc. is a camera company — their words, and accurate. While known for Snapchat, their core technical differentiation is augmented reality (Lens Studio, Spectacles), real-time video processing, and ephemeral messaging. This guide covers SWE and graphics/ML engineering interviews at L3–L6.

The Snap Interview Process

  1. Recruiter screen (30 min)
  2. Technical phone screen (1 hour) — 1–2 coding problems; may include camera/graphics domain question for relevant teams
  3. Virtual onsite (4–5 rounds):
    • 2× coding (algorithms, data structures; graphics/CV questions for camera teams)
    • 1× system design (messaging, stories, AR platform, or ads)
    • 1× domain depth (for AR/ML roles: linear algebra, rendering, neural networks)
    • 1× behavioral (Snap values)

Snap has distinct engineering orgs: Core Infrastructure, Advertising (major revenue driver), Camera Platform (AR/ML), Maps, and Snap Kit (SDK). Interview style varies by team.

Core Algorithms: Computer Vision and AR

Face Landmark Detection (Simplified)

import math
from typing import List, Tuple

class FaceLandmarkTracker:
    """
    Simplified face landmark tracking for AR filters.
    Snap's Lens Studio runs landmark detection at 30+ FPS on mobile.

    Real system uses deep learning (MobileNet-based landmark detector)
    with 68+ facial landmarks (eyes, nose, mouth corners, jawline).

    Key optimizations for mobile:
    - Quantized INT8 models (4x faster than FP32)
    - Region of Interest: only process detected face region
    - Temporal smoothing: use Kalman filter to smooth jitter
    - Two-stage: face detection → landmark refinement
    """

    def __init__(self, n_landmarks: int = 68):
        self.n_landmarks = n_landmarks
        self.smoothing_factor = 0.7  # for exponential moving average

    def smooth_landmarks(
        self,
        prev_landmarks: List[Tuple[float, float]],
        new_landmarks: List[Tuple[float, float]]
    ) -> List[Tuple[float, float]]:
        """
        Exponential moving average smoothing to reduce jitter.
        α=0.7 means: 70% current frame, 30% previous frame.

        Higher α = more responsive to movement, more jitter.
        Lower α = smoother but lags behind fast movements.
        """
        if not prev_landmarks:
            return new_landmarks

        α = self.smoothing_factor
        smoothed = []
        for prev, curr in zip(prev_landmarks, new_landmarks):
            x = α * curr[0] + (1 - α) * prev[0]
            y = α * curr[1] + (1 - α) * prev[1]
            smoothed.append((x, y))
        return smoothed

    def fit_face_oval(
        self,
        landmarks: List[Tuple[float, float]],
        jawline_indices: List[int] = None
    ) -> Tuple[float, float, float, float]:
        """
        Fit an ellipse to jaw landmarks for face oval detection.
        Used for AR face contour effects.

        Returns: (center_x, center_y, width, height)
        """
        if not landmarks:
            return (0, 0, 0, 0)

        indices = jawline_indices or list(range(len(landmarks)))
        points = [landmarks[i] for i in indices if i < len(landmarks)]
        if not points:
            return (0, 0, 0, 0)

        xs = [p[0] for p in points]
        ys = [p[1] for p in points]
        center_x = sum(xs) / len(points)
        center_y = sum(ys) / len(points)
        width = max(xs) - min(xs)
        height = max(ys) - min(ys)
        return (center_x, center_y, width, height)

    def estimate_head_pose(
        self,
        landmarks_2d: List[Tuple[float, float]],
        key_point_indices: dict
    ) -> Tuple[float, float, float]:
        """
        Estimate pitch, yaw, roll from landmark geometry.
        Used to align 3D AR objects (glasses, hats) with head orientation.

        Simplified geometric approach — real systems use PnP algorithm
        (Perspective-n-Point) with 3D model correspondence.

        Returns: (pitch, yaw, roll) in degrees
        """
        nose_tip = landmarks_2d[key_point_indices.get('nose_tip', 33)]
        left_eye = landmarks_2d[key_point_indices.get('left_eye', 36)]
        right_eye = landmarks_2d[key_point_indices.get('right_eye', 45)]

        # Yaw: horizontal head turn. Rough estimate from the nose tip's
        # horizontal offset relative to the eye midpoint: the nose appears
        # to drift toward the far eye as the head turns.
        eye_center_x = (left_eye[0] + right_eye[0]) / 2
        yaw = (nose_tip[0] - eye_center_x) / 10.0  # normalize, as for pitch

        # Roll: head tilt
        roll = math.degrees(math.atan2(
            right_eye[1] - left_eye[1],
            right_eye[0] - left_eye[0]
        ))

        # Pitch: very rough estimate from nose position relative to eyes
        eye_center_y = (left_eye[1] + right_eye[1]) / 2
        pitch_raw = nose_tip[1] - eye_center_y
        pitch = pitch_raw / 10.0  # normalize

        return (pitch, yaw, roll)
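The tracker's docstring lists a Kalman filter as one temporal-smoothing option. Below is a minimal sketch of a 1D constant-position Kalman filter for a single landmark coordinate; the noise values q and r are illustrative defaults, not Snap's tuning:

```python
class Kalman1D:
    """Constant-position Kalman filter for one landmark coordinate.

    Unlike the fixed-α EMA above, the Kalman gain adapts over time:
    it trusts measurements more while the state estimate is uncertain,
    then settles into heavier smoothing as confidence grows.
    """

    def __init__(self, q: float = 1e-3, r: float = 1e-2):
        self.q = q      # process noise: how much the landmark may drift per frame
        self.r = r      # measurement noise: detector jitter variance
        self.x = None   # state estimate
        self.p = 1.0    # estimate variance

    def update(self, z: float) -> float:
        if self.x is None:
            self.x = z                   # initialize on first measurement
            return z
        self.p += self.q                 # predict: uncertainty grows
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z - self.x)       # correct toward the measurement
        self.p *= (1 - k)                # uncertainty shrinks after correction
        return self.x
```

Run one filter per coordinate (x and y) per landmark; feeding it a noisy near-constant signal drives the gain down, so the output jitters less than an EMA with a fixed α.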

Ephemeral Message Storage with TTL

import time
import heapq
from typing import Any, Dict, List, Optional, Tuple

class EphemeralMessageStore:
    """
    Message store with automatic TTL expiry — core to Snapchat's
    ephemeral messaging model.

    Snaps expire after:
    - Viewed: 1–10 seconds (sender's choice)
    - Unopened: 30 days
    - Stories: 24 hours

    Implementation: heap-based TTL eviction + hash map for O(1) lookup.
    Real Snap uses Cassandra with TTL column families for this.
    """

    def __init__(self):
        self.messages: Dict[str, Dict] = {}  # msg_id -> {content, expires_at}
        self.expiry_heap: List[Tuple[float, str]] = []  # (expires_at, msg_id)

    def store(self, msg_id: str, content: Any,
              ttl_seconds: float) -> float:
        """Store message with TTL. Returns expiry timestamp."""
        expires_at = time.time() + ttl_seconds
        self.messages[msg_id] = {
            'content': content,
            'expires_at': expires_at,
            'viewed': False,
        }
        heapq.heappush(self.expiry_heap, (expires_at, msg_id))
        return expires_at

    def retrieve(self, msg_id: str) -> Optional[Any]:
        """
        Retrieve message if not expired. Mark as viewed.
        Snap-specific: after view, content is deleted within seconds.
        """
        self._evict_expired()

        if msg_id not in self.messages:
            return None

        msg = self.messages[msg_id]
        if time.time() > msg['expires_at']:
            del self.messages[msg_id]
            return None

        # If already viewed, cannot retrieve again
        if msg['viewed']:
            return None

        content = msg['content']
        msg['viewed'] = True
        # Schedule deletion shortly after viewing (in production: async job after ACK)
        msg['expires_at'] = time.time() + 5  # 5-second grace for replay protection
        heapq.heappush(self.expiry_heap, (msg['expires_at'], msg_id))
        return content

    def _evict_expired(self):
        """Lazy eviction: clean up expired messages from heap."""
        now = time.time()
        while self.expiry_heap and self.expiry_heap[0][0] <= now:
            expires_at, msg_id = heapq.heappop(self.expiry_heap)
            if msg_id in self.messages:
                msg = self.messages[msg_id]
                if msg['expires_at'] <= now:
                    del self.messages[msg_id]

System Design: Snap Stories at Scale

Common question: “Design Snap Stories — content visible to followers for 24 hours.”

"""
Snap Stories Architecture:

Content Upload:
  Mobile → [Upload Service] → S3 (video/image)
                            → [Transcoding] → multiple resolutions
                            → [CDN distribution] (Fastly/Cloudflare)

Story Creation:
  POST /story → [Story Service] → creates story record
  Record: {story_id, creator_id, media_url, created_at, expires_at=+24h}

Story Feed (for a viewer):
  GET /feed → [Feed Service] → fetches stories from all followed accounts
  Sorted by: recency (most recent first)
  Viewed status tracked per viewer (Redis SET: viewer_id:viewed_stories)

Discovery (public stories):
  Snap Map shows stories pinned to geographic locations
  "Our Story" aggregates community content around events/locations
  Requires geospatial indexing (S2 geometry or geohashing)

Scale numbers:
  - 400M daily active users
  - 4B+ Snaps created daily
  - Stories served with <300ms latency globally

CDN strategy:
  - Popular stories (celebrity accounts) eagerly pushed to edge nodes
  - Unknown creators: pull-through cache (cached on first request)
  - 24h TTL: CDN entries automatically invalidate at story expiry
"""

Snap Engineering Culture

  • Innovation-focused: Snap pioneered Stories, AR filters, ephemeral messaging — culture rewards novel ideas
  • Mobile-first: Everything is evaluated on mobile performance; battery impact, frame rate, bandwidth matter
  • Privacy: Ephemeral design is intentional; privacy is a product differentiator
  • Hardware integration: Spectacles (AR glasses) — Snap is investing in the AR hardware stack

Behavioral at Snap

  • “What’s a product you think is badly designed and how would you fix it?” — Snap values design sensibility
  • Creativity: “Tell me about something you built that you’re genuinely proud of.”
  • Speed vs. craft: Social apps iterate fast; show you can ship quickly without sacrificing stability

Compensation (L3–L6, US, 2025 data)

Level | Title      | Base      | Total Comp
L3    | SWE        | $150–180K | $195–245K
L4    | Senior SWE | $185–220K | $260–350K
L5    | Staff SWE  | $220–260K | $360–480K
L6    | Principal  | $260–310K | $480–650K+

Snap is publicly traded (NYSE: SNAP). RSUs vest quarterly over 4 years. Stock has been challenging post-ATH; evaluate on long-term AR/Spectacles potential.

Interview Tips

  • Use Snapchat and Lens Studio: Build an AR Lens before interviewing for the camera platform team
  • Computer vision basics: Know CNNs, object detection, facial landmark detection at a conceptual level
  • Mobile optimization: Battery, latency, frame rate — these metrics matter at Snap more than elsewhere
  • LeetCode: Medium difficulty; image processing, graph algorithms, and sliding window patterns are common

Practice problems: LeetCode 315 (Count Smaller Numbers After Self), 239 (Sliding Window Max), 307 (Range Sum Query Mutable), 56 (Merge Intervals).
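As a warm-up for the sliding-window pattern, here is the standard monotonic-deque solution to LeetCode 239 (Sliding Window Maximum), O(n) overall:

```python
from collections import deque
from typing import List

def sliding_window_max(nums: List[int], k: int) -> List[int]:
    """Max of every window of size k, using a deque of candidate indices.

    Invariant: values at the deque's indices are strictly decreasing,
    so the front is always the current window's maximum.
    """
    dq = deque()  # indices into nums, front = current max
    out = []
    for i, x in enumerate(nums):
        while dq and nums[dq[-1]] <= x:
            dq.pop()          # x dominates smaller trailing candidates
        dq.append(i)
        if dq[0] <= i - k:
            dq.popleft()      # front index slid out of the window
        if i >= k - 1:
            out.append(nums[dq[0]])
    return out

# sliding_window_max([1, 3, -1, -3, 5, 3, 6, 7], 3) → [3, 3, 5, 5, 6, 7]
```

Each index is pushed and popped at most once, which is where the linear bound comes from.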
