Attribution · April 2026 · 5 min read

Why your MMP says one thing and GA4 says another

Attribution model differences, lookback windows, and the data gaps that make your reports disagree.

Servet Demirhan · Performance Marketing and Growth

If you have ever pulled a weekly install report from AppsFlyer or Adjust, then cross-checked it against GA4, you already know the punchline: the numbers never match. Not even close. A campaign that your MMP credits with 4,200 installs might show up in GA4 as 2,900 first_open events. The reverse can happen too. Understanding why those gaps exist is the first step toward building a reporting framework your team can actually trust.

The attribution model gap

MMPs like AppsFlyer, Adjust, and Branch typically operate on a last-touch, deterministic model with probabilistic (fingerprinting) fallbacks. They assign credit to the last click (or view) before an install, pulling data from device-level ad interactions. GA4, on the other hand, has used a data-driven attribution model by default since late 2023, distributing fractional credit across touchpoints based on a machine-learning algorithm trained on your conversion paths. Two different philosophies, two different numbers.
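The difference is easiest to see on a single conversion path. The sketch below is illustrative only: the touchpoint names are made up, and GA4's actual data-driven weights come from a proprietary model, so an even split is just a stand-in for "fractional credit".

```python
from collections import defaultdict

# Hypothetical conversion path for one install: touchpoints in time order.
path = ["meta_prospecting", "google_uac", "meta_retargeting"]

# Last-touch (MMP-style): the final touchpoint gets all the credit.
last_touch = {path[-1]: 1.0}

# Fractional (GA4-style): credit is split across touchpoints. GA4's real
# weights are model-driven; an even split is used here purely to illustrate.
fractional = defaultdict(float)
for touch in path:
    fractional[touch] += 1.0 / len(path)

print(last_touch)        # {'meta_retargeting': 1.0}
print(dict(fractional))  # each touchpoint gets roughly a third
```

Run the two models over the same week of paths and the per-campaign totals diverge immediately, even before lookback windows and iOS data loss enter the picture.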

Lookback windows and timing differences

Most MMPs default to a 7-day click-through and 24-hour view-through lookback window, though networks like Meta often negotiate their own. GA4 uses a 30-day default engagement lookback. This means an install that happened eight days after a click will be "organic" in your MMP but could still be attributed to a paid campaign in GA4 if there was a mid-funnel website visit. Aligning lookback windows is one of the cheapest fixes you can make, yet most teams never do it.
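The eight-day example from the paragraph above reduces to a simple window check. The window lengths below are the defaults mentioned in this post; your actual values live in your MMP dashboard and GA4 attribution settings.

```python
from datetime import datetime, timedelta

# Window lengths as discussed above; confirm yours in each platform's settings.
MMP_CLICK_WINDOW = timedelta(days=7)
GA4_WINDOW = timedelta(days=30)

click = datetime(2026, 4, 1)
install = click + timedelta(days=8)  # install lands eight days after the click

def attributed(event_time, touch_time, window):
    """True if the event falls inside the lookback window of the touchpoint."""
    return event_time - touch_time <= window

print(attributed(install, click, MMP_CLICK_WINDOW))  # False -> "organic" in the MMP
print(attributed(install, click, GA4_WINDOW))        # True  -> paid in GA4
```

Every install that falls between the two window edges is a guaranteed discrepancy, which is why aligning the windows is such cheap leverage.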

SKAdNetwork and the iOS privacy layer

Post-ATT, a significant share of iOS conversions arrive via SKAdNetwork postbacks, which are delayed, aggregated, and stripped of device-level identifiers. MMPs ingest these postbacks and attempt to map them back to campaigns, but the data is inherently lossy. GA4 does not consume SKAN postbacks directly; it relies on its own SDK events. The result is a structural gap on iOS that no amount of dashboard configuration can fully close. The practical move is to treat MMP data as the source of truth for media buying decisions and GA4 as the source of truth for on-site and in-app behavioural analysis.

Building a reconciliation layer

The teams that handle this well build a lightweight reconciliation layer, usually a weekly BigQuery job or a Looker Studio page, that pulls raw data from both sources, normalises campaign naming conventions, and outputs a single blended view. The key columns are mmp_installs, ga4_first_opens, delta_pct, and a flag for anything over a 20 percent variance. When the delta spikes, you investigate. When it stays within tolerance, you move on and spend your energy on creative testing instead.
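A minimal version of that blended view can be sketched in a few lines of pandas. The campaign names and numbers below are invented for illustration; a real pipeline would pull both frames from BigQuery and do heavier campaign-name normalisation than a lowercase pass.

```python
import pandas as pd

# Hypothetical weekly exports; in practice these come from BigQuery.
mmp = pd.DataFrame({
    "campaign": ["ua_meta_us", "ua_google_uk"],
    "mmp_installs": [4200, 1800],
})
ga4 = pd.DataFrame({
    "campaign": ["ua_meta_us", "ua_google_uk"],
    "ga4_first_opens": [2900, 1700],
})

# Normalise naming before joining so near-matches don't silently drop out.
for frame in (mmp, ga4):
    frame["campaign"] = frame["campaign"].str.strip().str.lower()

blended = mmp.merge(ga4, on="campaign", how="outer").fillna(0)
blended["delta_pct"] = (
    (blended["mmp_installs"] - blended["ga4_first_opens"]).abs()
    / blended["mmp_installs"] * 100
).round(1)
blended["investigate"] = blended["delta_pct"] > 20  # the 20% variance flag

print(blended)
```

With the example numbers, ua_meta_us lands at a 31 percent delta and gets flagged, while ua_google_uk sits within tolerance. An `outer` join matters here: a campaign present in only one source is itself a signal, usually a naming mismatch or a tracking gap.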