Packet Loss
Packet loss occurs when packets sent across a network never arrive at their destination. In multiplayer games, even small amounts of packet loss (1–3%) cause rubber-banding, missing inputs, and broken hit registration.
High latency feels bad. Packet loss feels broken. The two are often confused by players, but they have different root causes and different symptoms.
What causes packet loss
Network congestion is the most common cause. Routers and switches drop packets when they receive more data than they can forward. This happens at ISP infrastructure, home routers under load, and data centre uplinks during traffic spikes.
Wi-Fi interference causes significant packet loss in home environments. Competing devices, walls, and distance from the router all contribute. A player on Wi-Fi will often experience higher packet loss than a player on a wired connection, even at the same reported latency.
Faulty hardware — bad cables, degraded network interface cards, and failing switches — produces intermittent packet loss that is hard to diagnose because it does not show up consistently.
Distance and routing can cause packets to pass through many hops, each one an opportunity for a congested or failing node to drop the packet. This is why geographic proximity between players and game servers matters — fewer hops means fewer places for packets to disappear.
How packet loss manifests in games
In UDP-based games (most real-time multiplayer titles), a lost packet is simply gone; UDP does not retransmit. The client’s interpolation system runs out of data and has to extrapolate or freeze the affected player until the next update arrives. This produces the characteristic rubber-band or teleport effect when the correction lands.
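The fallback behaviour described above can be sketched as an interpolation buffer that linearly interpolates between bracketing snapshots, extrapolates along the last known velocity when an update is missing, and freezes once the gap grows too large. This is a minimal illustration, not any engine's actual implementation; the class name, snapshot format, and the 0.25-second extrapolation cap are all assumptions.

```python
class InterpolationBuffer:
    """Buffers timestamped positions; extrapolates briefly when updates are lost."""

    def __init__(self, max_extrapolation=0.25):
        self.snapshots = []                         # (timestamp, position), oldest first
        self.max_extrapolation = max_extrapolation  # seconds to extrapolate before freezing

    def add_snapshot(self, timestamp, position):
        self.snapshots.append((timestamp, position))
        self.snapshots = self.snapshots[-32:]       # keep a short history

    def sample(self, render_time):
        # Normal case: two snapshots bracket render_time -> interpolate.
        for (t0, p0), (t1, p1) in zip(self.snapshots, self.snapshots[1:]):
            if t0 <= render_time <= t1:
                alpha = (render_time - t0) / (t1 - t0)
                return p0 + alpha * (p1 - p0)
        # No bracketing pair: the expected update was lost or is late.
        if len(self.snapshots) >= 2:
            (t0, p0), (t1, p1) = self.snapshots[-2], self.snapshots[-1]
            overshoot = render_time - t1
            if 0 < overshoot <= self.max_extrapolation:
                velocity = (p1 - p0) / (t1 - t0)
                return p1 + velocity * overshoot    # guess along last known velocity
        # Gap too large (or no data): freeze at the last known position.
        return self.snapshots[-1][1] if self.snapshots else None
```

When the delayed update finally arrives, the gap between the extrapolated guess and the authoritative position is what the player sees as a rubber-band correction.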
Inputs sent from client to server can also be lost. A jump or shot command that never arrives means the server never processes it — the action simply does not happen. Some networking implementations send inputs redundantly across multiple packets specifically to protect against this.
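Redundant input sending can be sketched as follows: each outgoing packet repeats the last few sequence-numbered inputs, and the server applies each sequence number exactly once. The class names, the redundancy factor of 3, and the packet format are illustrative assumptions, not a specific engine's protocol.

```python
from collections import deque

REDUNDANCY = 3  # each packet repeats the most recent 3 inputs

class InputSender:
    """Client side: every packet carries the last few inputs, so a single
    lost packet does not lose an input."""
    def __init__(self):
        self.seq = 0
        self.recent = deque(maxlen=REDUNDANCY)

    def build_packet(self, new_input):
        self.seq += 1
        self.recent.append((self.seq, new_input))
        return list(self.recent)  # [(seq, input), ...], oldest first

class InputReceiver:
    """Server side: applies each input exactly once, in order."""
    def __init__(self):
        self.last_applied = 0
        self.applied = []

    def on_packet(self, packet):
        for seq, inp in packet:
            if seq > self.last_applied:  # skip duplicates from redundancy
                self.applied.append(inp)
                self.last_applied = seq
```

With a redundancy of 3, an input survives unless three consecutive packets are lost, which is far rarer than losing any single packet. The cost is a few extra bytes per packet, which is cheap because inputs are small.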
On TCP-based connections, a lost packet triggers retransmission, which introduces a latency spike rather than a missing update: because TCP delivers data in order, everything sent after the lost packet waits behind the retransmission (head-of-line blocking). Players experience this as the game freezing momentarily and then snapping forward.
Acceptable packet loss thresholds
- 0–0.5%: Essentially unnoticeable. Normal background noise on most internet connections.
- 0.5–2%: Noticeable in competitive play. Some rubber-banding, occasional missed inputs.
- 2–5%: Significant degradation. Consistent rubber-banding, unreliable hit registration.
- 5%+: Effectively unplayable for fast-paced multiplayer. Some turn-based or low-frequency games remain functional.
These thresholds are lower than most people expect. A game sending 64 updates per second at 2% loss drops an average of 1.28 updates every second, which is enough to produce visible artifacts and inconsistent gameplay.
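The arithmetic behind that claim is straightforward: expected losses per second are simply the update rate multiplied by the loss rate. A short sketch (the function name and the 64 Hz tick rate are illustrative):

```python
def dropped_per_second(tick_rate, loss_rate):
    """Expected number of state updates lost each second."""
    return tick_rate * loss_rate

# A 64 Hz server at the threshold values listed above:
for loss in (0.005, 0.02, 0.05):
    print(f"{loss:.1%} loss -> {dropped_per_second(64, loss):.2f} updates lost per second")
```

Even at the "unnoticeable" 0.5% level a 64 Hz game loses an update roughly every three seconds; at 5% it loses more than three per second.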
Packet loss vs. latency
Latency adds a fixed delay that every player experiences and that the netcode is designed to handle. Packet loss is irregular — it removes specific updates unpredictably, breaks the assumptions of interpolation and prediction systems, and cannot be compensated for until the missing data either arrives (TCP) or is reconstructed from context (UDP with redundant sends).
A game at 80 ms of latency with 0% packet loss usually feels better than the same game at 40 ms with 3% packet loss.
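Because loss and latency leave different signatures, the two are easy to tell apart in telemetry: latency delays every packet, while loss leaves gaps in the sequence numbers. A minimal loss counter might look like the following sketch (the class name is illustrative, and for simplicity it ignores reordering and duplicates):

```python
class LossTracker:
    """Estimates packet loss from gaps in per-packet sequence numbers."""
    def __init__(self):
        self.highest_seen = 0
        self.received = 0

    def on_packet(self, seq):
        self.received += 1
        if seq > self.highest_seen:
            self.highest_seen = seq

    def loss_rate(self):
        # Fraction of expected packets (1..highest_seen) that never arrived.
        if self.highest_seen == 0:
            return 0.0
        return 1.0 - self.received / self.highest_seen
```

This is essentially how games drive the packet-loss indicator on a net graph: the sender stamps each packet with an incrementing sequence number, and the receiver compares what arrived against what should have.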
See also: Jitter · Latency · Netcode · UDP vs TCP