Salaries

Three coworkers would like to know their average salary. How can they compute it without disclosing their individual salaries?

Solution

How about this: Person A writes down her salary plus a random amount (AS + AR) and hands the paper to B, without showing C. B adds his salary plus a random amount (BS + BR) and passes the new total to C (at each step, they write on a fresh sheet and never show the third person). C adds CS + CR and passes the total back to A. Now A subtracts her random number (AR) and passes to B; B and C each subtract their own random numbers in turn. After C is done, he reveals the result, and they divide by three to get the average.
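The protocol above can be simulated in a few lines. This is a sketch of the arithmetic only (the real scheme depends on the papers being passed privately); the function name and mask range are illustrative choices, not part of the original puzzle.

```python
import random

def average_salary(salaries):
    """Simulate the pass-around protocol: in round one each person adds
    (salary + personal random mask) to the running total; in round two
    each person subtracts only their own mask. The masks cancel exactly."""
    masks = [random.randint(1, 10**9) for _ in salaries]
    total = 0
    # Round 1: each person adds salary plus a private random mask.
    for salary, mask in zip(salaries, masks):
        total += salary + mask
    # Round 2: each person subtracts their own mask.
    for mask in masks:
        total -= mask
    return total / len(salaries)
```

Because every mask is added once and subtracted once, the final value is exactly the true sum, yet each intermediate total any one person sees is offset by masks they do not know.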

As has already been noted, there is no way to make the scheme liar-proof: any participant can add a false salary and skew the result.

It’s also worth noting that once they know the average, each of the three can deduce the sum of the other two salaries.

2026 Update: Secure Computation and Privacy Puzzles

The classic salary puzzle — N people want to compute their average salary without anyone revealing their individual salary — is now directly relevant to production engineering: differential privacy, federated learning, and secure multi-party computation have become mainstream ML concerns.

The classic solution: Person 1 adds a large random number to their salary and passes the sum to Person 2. Each person in turn adds their salary and passes the total along, until it returns to Person 1, who subtracts the random number and divides by N. Nobody ever sees an intermediate value that reveals a single salary.
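This simpler ring variant needs only one secret. A minimal sketch, assuming honest participants and an illustrative mask range:

```python
import random

def ring_average(salaries):
    """Person 1 picks a secret offset, passes (offset + own salary),
    everyone else adds their salary, and Person 1 removes the offset
    from the returned total before dividing by N."""
    secret = random.randint(1, 10**12)
    running = secret + salaries[0]   # Person 1 starts the masked total
    for s in salaries[1:]:           # each other person adds their salary
        running += s
    true_sum = running - secret      # Person 1 strips the mask
    return true_sum / len(salaries)
```

One design caveat worth knowing: in this variant Person 1 learns the sum of everyone else's salaries before the result is announced, so the three-mask version above leaks strictly less to any single participant.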

Modern version in production ML (2026):

  • Federated Learning: Each mobile device trains on local data and shares only gradient updates (not raw data). The server aggregates gradients without seeing individual training examples — the same “sum without revealing individuals” principle.
  • Differential Privacy: Add calibrated noise to statistics before sharing, so individual records can’t be reconstructed. Apple and Google use this for telemetry collection.
  • Secure Aggregation: Google’s production federated learning system uses cryptographic secure aggregation so even the aggregation server can’t see individual device updates.
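The secure-aggregation idea can be sketched with pairwise masks that cancel in the sum. This is a toy illustration of the cancellation principle only — the production protocol derives masks from cryptographic key agreement and handles dropouts, which this sketch omits; all names here are illustrative.

```python
import random

def secure_aggregate(updates):
    """Toy pairwise-masking aggregation: each pair (i, j) shares a random
    mask; i adds it and j subtracts it, so every mask cancels in the sum
    while each individual masked update looks random to the server."""
    n = len(updates)
    # Shared pairwise masks (in practice: derived via key agreement).
    pair_mask = {(i, j): random.randint(-10**9, 10**9)
                 for i in range(n) for j in range(i + 1, n)}
    masked = []
    for i, u in enumerate(updates):
        m = u
        for j in range(n):
            if i < j:
                m += pair_mask[(i, j)]   # lower index adds the shared mask
            elif j < i:
                m -= pair_mask[(j, i)]   # higher index subtracts it
        masked.append(m)
    # The server sums only masked values; the masks cancel exactly.
    return sum(masked)
```

The server never sees an unmasked update, yet the sum it computes equals the true total — the same trick as the salary puzzle, scaled to gradient vectors.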

Still asked at (2026): Google (privacy engineering roles), Apple (differential privacy team), and security-focused ML companies. Understanding the principle demonstrates awareness of the privacy-utility tradeoff that matters in every data-driven system.
