
Why Meta and Google Analytics Never Agree (And How to Reconcile Them)

Meta says $87K. GA4 says $41K. Both are technically correct. Here's the three-source attribution divergence and the reconciliation framework that resolves it.

Jordan Glickman · May 10, 2026
Attribution

Meta says your campaigns generated $87,000 in revenue last week. GA4 shows $41,000 attributed to paid social. Your Shopify dashboard shows $63,000 in total store revenue for the same period.

Three tools. Three numbers. None of them matching.

This is one of the most common frustrations in performance marketing, and one of the most consistently misunderstood. Most brands treat it as a technical problem — a misconfigured pixel, a UTM issue, or a data sync delay. They spend hours troubleshooting integrations, looking for the setting that will make the numbers align.

The numbers will never fully align. That is not because the tools are broken, but because they measure fundamentally different things using fundamentally different methodologies. Understanding what each system is actually counting is the prerequisite to using any of them correctly for decisions.

Image brief: Four-row card layout — Meta Ads Manager (relative platform performance), GA4 (on-site behavior), Shopify Revenue (business truth), MER (budget allocation). Each card shows decision it drives and decisions it should not be used for. Clean minimal design. alt: "Four-source reporting framework for eCommerce attribution." caption: "The goal is not one number that all tools agree on. The goal is understanding what each number is actually measuring."

Why the gap exists: three structural causes

The discrepancy between Meta and Google Analytics is not random. It is predictable and it comes from three distinct measurement divergences that will always produce different numbers — not occasionally, but every single reporting period.

Cause 1: Attribution window differences

Meta's default attribution model credits a conversion to an ad if the user clicked that ad within a 7-day window, or viewed it within a 1-day window. This means if someone sees your Meta ad on Monday but does not click, then returns and purchases on Tuesday through a direct session, Meta attributes that purchase to the ad impression.

GA4, by default, attributes conversions to the session in which the conversion occurred. If Tuesday's purchase came through a direct session, GA4 credits direct traffic — not the Meta ad the person saw the day before.

Same customer. Same purchase. Two completely different attribution outcomes.

The view-through attribution window in Meta is the single largest source of discrepancy between the two platforms. It is not illegitimate — the customer did see the ad, and the impression may well have contributed to the decision. But GA4's session-based, click-focused methodology will never reflect those view-through attributions because it has no way to connect the impression event to the later session.
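
To make the divergence concrete, here is a minimal sketch of the two rules applied to the Monday-view, Tuesday-purchase example above. The rules are deliberately simplified (Meta's real model is more involved, and GA4's reporting defaults vary by report), and the function names are illustrative, not any platform's API.

```python
from datetime import datetime, timedelta

def meta_attribution(touch, purchase_time):
    """Simplified Meta rule: credit the ad on a click within 7 days,
    or a view (impression) within 1 day, of the purchase."""
    age = purchase_time - touch["time"]
    if touch["type"] == "click" and age <= timedelta(days=7):
        return "meta_ad"
    if touch["type"] == "view" and age <= timedelta(days=1):
        return "meta_ad"
    return None  # outside both windows: no credit

def ga4_session_attribution(session_source):
    """Simplified GA4 session-scoped rule: credit whatever source
    started the session in which the purchase occurred."""
    return session_source

# Monday evening: user sees (does not click) a Meta ad.
# Tuesday evening: user returns via a direct session and purchases.
ad_view = {"type": "view", "time": datetime(2026, 5, 4, 20, 0)}
purchase_time = datetime(2026, 5, 5, 19, 0)

print(meta_attribution(ad_view, purchase_time))   # meta_ad  (view within 1 day)
print(ga4_session_attribution("direct"))          # direct   (session-scoped)
```

Same purchase, two answers: Meta's view-through window captures the impression, while the session-scoped rule sees only the direct session.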

Cause 2: Cross-device tracking gaps

A customer sees a Meta ad on their phone while commuting. They browse the product page on mobile, do not purchase, then return the next evening on their laptop and complete the order through a direct session.

Meta's logged-in tracking may connect both sessions because the user was logged into Facebook on both devices. GA4, which relies on cookies, treats them as two separate, unconnected users. The Meta pixel fires on both sessions and Meta sees the full path. GA4 sees a new desktop user purchasing via direct traffic with no prior touchpoint.

Cross-device purchase behavior is common in mobile-first advertising environments, and it consistently produces Meta-reported conversions that have no corresponding paid social session in GA4 — because the purchase path crossed a device boundary that cookie-based tracking cannot follow.

Cause 3: ITP, ad blockers, and cookie restrictions

Safari's Intelligent Tracking Prevention limits third-party cookie lifespan to as little as 24 hours. Firefox's Enhanced Tracking Protection has similar effects. Ad blockers prevent tracking scripts from firing entirely on affected sessions.

When a user with these settings clicks a Meta ad and purchases within the 7-day attribution window, Meta's server-side Conversions API may successfully log the conversion. GA4's client-side JavaScript may not fire at all for ad-blocker users, and may attribute the purchase to direct traffic rather than paid social for ITP-affected Safari users whose cookies expired before the return session.

The result is a population of conversions that Meta records and GA4 either misses entirely or assigns to a different channel.

The four-source reporting framework

The mistake most brands make is looking for a single number all tools will agree on. That number does not exist. The productive reframe is assigning each data source to the specific decision it is actually suited to make.

Meta Ads Manager: Use for relative performance within the platform.

Meta's numbers are internally consistent. The same methodology applies to every campaign, ad set, and creative within the account. This means that while the absolute revenue figure will exceed business reality, the relative performance between two ads or two campaigns is meaningful and reliable. Ad A's ROAS being 30% higher than Ad B's ROAS within Meta is a valid signal, even if neither absolute number maps to Shopify revenue. Use Meta for within-platform optimization decisions — creative testing, bid strategy, audience comparisons. Never use it to quantify Meta's total contribution to business revenue.

GA4: Use for traffic quality and on-site behavior.

GA4 is the best tool for understanding what happens after the click. Engagement rate, pages per session, time on site, checkout funnel drop-off, and audience behavior patterns are all most reliably measured here. The session-based attributed revenue figures are useful for directional purposes but will systematically undercount Meta's contribution for all three reasons described above. Use GA4 for site optimization, funnel diagnosis, and audience behavior analysis. Do not use it as the authority on channel-level revenue contribution.

Shopify or payment processor: Use for total revenue truth.

Your eCommerce platform or payment processor knows exactly how much revenue changed hands, when, and in what amount. No attribution model touches this figure. It is the denominator against which every other number should be interpreted. Total platform-attributed revenue from Meta plus Google plus every other channel combined will almost always exceed Shopify revenue, because multiple platforms are each claiming credit for the same purchases. The Shopify figure is the reality; the platform figures are each platform's claim on that reality.

Marketing Efficiency Ratio: Use for budget allocation between channels.

MER — total Shopify revenue divided by total marketing spend across all channels — cuts through attribution disagreements entirely. It does not ask how much Meta generated or how much Google generated. It asks how much revenue the full marketing program generated per dollar of total investment. When MER improves, the program is working. When it deteriorates despite strong individual platform numbers, something structural is wrong. MER-based decision-making is what makes cross-channel budget allocation honest rather than dependent on whichever platform's attribution model wins the argument.
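
The calculation itself is a one-liner, which is part of its appeal. A minimal sketch, where the $63,000 matches the Shopify figure from the opening and the spend figure is hypothetical:

```python
def mer(shopify_revenue: float, total_marketing_spend: float) -> float:
    """Marketing Efficiency Ratio: blended revenue per dollar of
    total marketing spend, across all channels."""
    if total_marketing_spend <= 0:
        raise ValueError("total marketing spend must be positive")
    return shopify_revenue / total_marketing_spend

# $63,000 Shopify revenue (from the opening example);
# $21,000 total spend is a hypothetical figure for illustration.
print(mer(63_000, 21_000))  # 3.0
```

Because both inputs come from systems that apply no attribution model, no platform's methodology can move this number.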

Discrepancy benchmarks: what is normal vs. what is a problem

| Discrepancy Type | Typical Range | What It Indicates |
|---|---|---|
| Meta reported vs. GA4 paid social | Meta 30–80% higher | Normal and structural. View-through + cross-device attribution. |
| Total platform-attributed vs. Shopify revenue | Platforms 40–120% higher | Normal at scale. Cross-platform double-counting. |
| GA4 direct traffic as % of total sessions | 15–40% of sessions | High direct often signals untracked paid conversions. |
| Meta vs. GA4 on click-only, single-device window | 10–25% higher | Residual gap from ITP and ad blocker limitations. |

A 40–60% gap between Meta-reported and GA4 paid social revenue is structural and expected, not a sign of a broken setup. A gap consistently above 80% warrants investigation — it may indicate view-through attribution windows set too broadly, a Conversions API configuration double-counting events, or a pixel firing duplicate events. But chasing the gap below 40% is not a productive goal. It is not achievable without changing what the tools measure, which is not in your control.
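
Operationalizing the benchmark is straightforward: express the gap as how much higher Meta reports than GA4, and flag it only above the structural range. A sketch, using the opening example's figures (which land above the 80% line, so by this table they would warrant a look):

```python
def meta_ga4_gap(meta_revenue: float, ga4_revenue: float) -> float:
    """Gap as a fraction: how much higher Meta reports than GA4
    for the same period. 0.5 means Meta is 50% higher."""
    return meta_revenue / ga4_revenue - 1

gap = meta_ga4_gap(87_000, 41_000)  # the opening example's figures
print(f"{gap:.0%}")                 # 112% -> above the 80% threshold
```

The point of the check is not to drive the gap down, but to notice when it leaves the structural range.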

Building a reconciliation report

Rather than trying to make numbers agree, build a weekly dashboard that shows each source alongside the others in a way that produces clarity instead of confusion.

The reconciliation report covers five figures in a single view:

  1. Shopify total revenue — the ground truth. No attribution model applied.
  2. Meta Ads Manager reported revenue — with attribution window noted. Useful for relative platform optimization but expected to exceed GA4.
  3. GA4 paid social attributed revenue — expected to be lower than Meta. Useful as a floor estimate of traceable click-through contribution.
  4. Total platform-attributed revenue (all channels combined) — expected to substantially exceed Shopify due to cross-platform double-counting. Never compare this to Shopify revenue as if they are measuring the same thing.
  5. Marketing Efficiency Ratio — Shopify revenue divided by total marketing spend. The number that drives allocation decisions.
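
As a sketch, the weekly view can be assembled from five inputs pulled from each source. All field and function names here are illustrative, and the example figures reuse the article's opening numbers plus a hypothetical spend and Google figure:

```python
def reconciliation_report(shopify_revenue, meta_reported, ga4_paid_social,
                          platform_attributed_by_channel, total_spend):
    """Assemble the five-figure weekly view. Inputs are pulled from
    each source's reporting; names are illustrative, not any API."""
    total_platform_attributed = sum(platform_attributed_by_channel.values())
    return {
        "shopify_total_revenue": shopify_revenue,               # ground truth
        "meta_reported_revenue": meta_reported,                 # 7d click / 1d view
        "ga4_paid_social_revenue": ga4_paid_social,             # click-through floor
        "total_platform_attributed": total_platform_attributed, # double-counted sum
        "mer": round(shopify_revenue / total_spend, 2),         # allocation driver
    }

report = reconciliation_report(
    shopify_revenue=63_000,        # from the opening example
    meta_reported=87_000,          # from the opening example
    ga4_paid_social=41_000,        # from the opening example
    platform_attributed_by_channel={"meta": 87_000, "google": 35_000},
    total_spend=25_000,            # hypothetical
)
print(report["total_platform_attributed"])  # 122000 -- exceeds Shopify, as expected
print(report["mer"])                        # 2.52
```

Note that the double-counted platform total (122,000) exceeding Shopify revenue (63,000) is the expected shape of the report, not an error to fix.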

Presented together consistently, these five figures stop generating confusion and start generating a coherent picture. The team stops asking "why don't these numbers match" and starts asking "what does each number tell us that the others cannot?" That shift in question is worth more than any attribution tool.

UTM tagging standard that reduces the avoidable gap

While the structural gap cannot be eliminated, poor UTM tagging inflates it unnecessarily. Every Meta ad driving off-platform traffic needs a UTM structure that allows GA4 to correctly attribute the session.

The most common failure is inconsistent campaign naming, which fragments what should be a single campaign into dozens of unrecognizable UTM variants in GA4 reports. Cross-campaign analysis in GA4 then becomes unreliable not because of measurement limitations but because of labeling failures.

A clean standard uses utm_source matching the exact platform name used in internal reporting (meta, google), utm_medium standardized across all paid channels (paid-social for social platforms, paid-search for search), and utm_campaign matching the platform campaign name with spaces replaced by hyphens and consistent casing. When this is applied consistently, GA4 becomes significantly more useful for cross-channel trend analysis because the source and medium categories are reliable over time.
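
A small tagging helper enforces the standard mechanically rather than relying on every media buyer remembering it. This is a sketch under the conventions above; the medium map covers only the two examples named, and all names are illustrative:

```python
from urllib.parse import urlencode

# Illustrative source-to-medium map, per the standard described above.
MEDIUM_BY_SOURCE = {"meta": "paid-social", "google": "paid-search"}

def tag_url(base_url: str, source: str, campaign_name: str) -> str:
    """Apply the UTM standard: exact source name, standardized medium,
    campaign name lowercased with spaces replaced by hyphens."""
    params = {
        "utm_source": source,
        "utm_medium": MEDIUM_BY_SOURCE[source],
        "utm_campaign": campaign_name.strip().lower().replace(" ", "-"),
    }
    return f"{base_url}?{urlencode(params)}"

print(tag_url("https://example.com/p/shoe", "meta", "Spring Sale 2026"))
# https://example.com/p/shoe?utm_source=meta&utm_medium=paid-social&utm_campaign=spring-sale-2026
```

Generating the final URL from the platform campaign name, rather than typing it by hand, is what keeps GA4's source and medium categories stable over time.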

How to handle this conversation with clients

The attribution discrepancy conversation is one of the most important credibility moments in an agency-client relationship. When a client first sees that Meta reports roughly double what GA4 shows for the same campaign, the instinctive response is to question whether the campaigns are working at all — or to question why the agency is "allowing" the numbers to diverge.

The agency that responds by trying to make the numbers agree, or by defending Meta's methodology over GA4's, loses credibility. The agency that explains the structural reasons for the divergence, introduces MER as the shared decision framework, and provides a reconciliation report that makes each number interpretable earns trust precisely because it demonstrates measurement sophistication rather than measurement anxiety.

At Impremis, we introduce the reconciliation framework in the first month of every client relationship. We explain upfront that the numbers will never fully match and why, and we establish MER as the shared language for budget decisions before the discrepancy has the chance to become a source of confusion or conflict.

That proactive framing turns a potential credibility problem into a credibility advantage.

FAQ

Should we use Meta-reported revenue or GA4 for ad set-level optimization within Meta? Meta-reported revenue. GA4 cannot replicate Meta's internal relative performance data because it does not have visibility into which ad set drove which session across view-through and cross-device paths. For decisions happening within Meta — creative performance, bid strategy, audience comparison — Meta's numbers are the right input despite their absolute overstatement.

How should we report marketing performance to leadership or investors? Lead with Shopify revenue as the business reality. Present MER as the efficiency metric. Include platform-reported metrics as context, with a brief explanation that they measure channel-level relative performance rather than absolute business contribution. Never present total platform-attributed revenue as if it represents business revenue — that number will always exceed Shopify revenue and will confuse or mislead anyone not familiar with cross-platform attribution behavior.

Can we reduce the discrepancy by switching to a shorter Meta attribution window? Yes. Setting Meta's attribution to 7-day click only, with no view-through window, removes view-through attribution and significantly reduces the gap. This is worth testing if you want a more conservative revenue estimate from Meta. The tradeoff is that Meta's delivery optimization will change: it will optimize toward click-converting audiences rather than including view-through converters, which may change campaign performance. Test before applying globally.

Why does GA4 show more direct traffic than we would expect? High direct traffic in GA4 is frequently a symptom of untracked paid conversions. When Meta Conversions API fires a conversion that GA4 cannot attribute because the click was not tracked through a browser cookie, GA4 assigns the session to direct by default. A direct traffic percentage consistently above 30% of total sessions is a signal worth investigating through UTM coverage audits.

Closing

The gap between Meta and Google Analytics is not a measurement failure. It is a measurement feature — a predictable consequence of two tools built with different methodologies for different purposes.

The path forward is not finding the single number that all tools agree on. It is building a reporting framework that assigns each tool to the decision it is actually suited to make, and reading MER as the coherent signal that cuts through the attribution disagreements that will otherwise never be resolved.

Clean UTMs. Weekly reconciliation. MER as the allocation driver. Everything else is context.
