Runway is an AI video creation platform, known for its Gen-3 text-to-video models and a deep set of creative tools used by film and advertising professionals. The company raised its Series D in 2024. The interview process emphasizes deep generative-video research, creative-tools product engineering, and the blend of professional and consumer use cases.
Process
Recruiter screen → 60-minute coding round (Python or TypeScript) → virtual onsite: two coding rounds, one ML or product system design, one craft deep-dive, and one behavioral. Research candidates get an additional paper-discussion round. Typical cycle: 4–5 weeks.
What they actually ask
- Design a video generation pipeline that runs at multiple quality tiers
- Design a credit / billing system for variable-cost generation jobs
- Design a creative editor that integrates generative with classic editing
- Coding: medium-to-hard data structures and algorithms
- Behavioral: ownership, taste, customer empathy for film/creative professionals
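The credit/billing prompt above is a common sticking point: generation cost is unknown until the job finishes, so a reasonable pattern is to hold the worst-case cost up front and settle the actual cost on completion. A minimal sketch, assuming hypothetical per-second tier rates (`TIER_RATES`, `CreditAccount` are illustrative names, not Runway's API):

```python
from dataclasses import dataclass, field

# Hypothetical per-second credit rates for each quality tier (illustrative values).
TIER_RATES = {"draft": 1, "standard": 4, "pro": 10}

@dataclass
class CreditAccount:
    """Tracks a credit balance with hold/settle semantics for variable-cost jobs."""
    balance: int
    holds: dict = field(default_factory=dict)  # job_id -> credits reserved

    def reserve(self, job_id: str, tier: str, max_seconds: int) -> bool:
        """Place a hold for the worst-case cost before the job starts."""
        cost_cap = TIER_RATES[tier] * max_seconds
        if self.balance - sum(self.holds.values()) < cost_cap:
            return False  # insufficient available (unheld) credits
        self.holds[job_id] = cost_cap
        return True

    def settle(self, job_id: str, tier: str, actual_seconds: int) -> int:
        """Charge the actual cost on completion; the rest of the hold is released."""
        hold = self.holds.pop(job_id)
        actual = min(TIER_RATES[tier] * actual_seconds, hold)
        self.balance -= actual
        return actual
```

For example, reserving a 30-second "pro" job holds 300 credits, and settling at 12 actual seconds charges only 120. In an interview, be ready to discuss how this extends to idempotent settlement, refunds on failure, and a durable ledger.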
Levels and comp (2026)
- SE: $185K–$255K total
- Senior SE: $265K–$365K total
- Staff / ML Research: $380K–$570K+ total
Prep priorities
- Be fluent in Python (research / serving) and TypeScript (product)
- Understand video diffusion, motion priors, and temporal coherence
- Brush up on video editing pipelines, NLE concepts, and color science
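For the temporal-coherence bullet above, it helps to be able to articulate a concrete metric. A deliberately crude sketch: mean absolute per-pixel difference between consecutive frames, where lower values suggest smoother motion and less flicker (production systems use learned or optical-flow-based measures; this function and its frame format are illustrative assumptions):

```python
def temporal_coherence(frames):
    """Crude temporal-coherence proxy for a clip.

    frames: list of 2D luminance frames (lists of lists of floats in [0, 1]).
    Returns the mean absolute per-pixel difference between consecutive
    frames; lower means smoother frame-to-frame change. Illustrative only.
    """
    if len(frames) < 2:
        return 0.0
    total, count = 0.0, 0
    for prev, curr in zip(frames, frames[1:]):
        for row_p, row_c in zip(prev, curr):
            for p, c in zip(row_p, row_c):
                total += abs(p - c)
                count += 1
    return total / count
```

A static clip scores 0.0; a clip that jumps from all-black to all-white frames scores 1.0. Being able to critique a metric like this (it penalizes legitimate fast motion, for instance) is exactly the kind of discussion a craft deep-dive rewards.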
Frequently Asked Questions
Is Runway remote-friendly?
Partially. Engineering hubs are in NYC (HQ) and San Francisco, and many engineering roles are remote within the US and Europe.
How does Runway compare to Pika or OpenAI Sora?
Runway has the strongest professional/film customer base and the deepest editing product. Pika competes on consumer simplicity; Sora offers top-tier quality but limited access. Runway pays competitively at the top of the AI-startup tier.
What is the engineering culture?
Blended research and engineering, with a strong art-and-tech identity and a roadmap driven by the needs of film and advertising professionals.