Attribution After Cookies: What Marketers Actually Measure Now
Third-party cookies are gone. Apple killed cross-app tracking. The attribution models that justified billions in ad spend have collapsed. Here is what smart brands are doing instead.
In January 2025, Google finally removed third-party cookies from Chrome. The move had been delayed three times since its original 2022 deadline, giving the industry years to prepare. Most of the industry did not prepare.
The immediate impact was quieter than expected. No systems crashed. No campaigns stopped running. But across thousands of marketing departments, a slow panic set in as dashboards went dark. Retargeting audiences shrank. Attribution reports became unreliable. The measurement infrastructure that had justified digital ad budgets for a decade started showing numbers that nobody trusted.
This wasn't a sudden crisis. It was the final stage of a slow-motion collapse that began with GDPR in 2018, accelerated with Apple's ATT in 2021, and finished with Chrome's cookie deprecation in 2025. The question is no longer whether the old measurement model is broken. The question is what replaces it.
What We Lost
To understand the scale of the problem, you need to understand what third-party cookies actually did.
A third-party cookie was a small file placed on your browser by a domain other than the one you were visiting. When you visited a shoe website, Facebook's pixel placed a cookie. When you later visited Facebook, that cookie told Facebook you'd been shopping for shoes. Facebook then showed you shoe ads. When you clicked and bought, the attribution system credited Facebook with the conversion.
This mechanism powered three pillars of digital marketing:
Cross-site tracking let advertisers follow users across the internet, building behavioral profiles. A single user might generate data from hundreds of websites, creating a rich picture of interests, purchase intent, and browsing habits.
Retargeting used that behavioral data to show ads to people who had already expressed interest. Someone who visited your pricing page but didn't buy would see your ads on other websites for days afterward. Retargeting typically generated the highest return on ad spend of any tactic — conversion rates 3-5x higher than prospecting campaigns, according to Criteo's 2023 data.
Multi-touch attribution tracked a user's journey across multiple ad interactions to determine which touchpoints contributed to a conversion. A user might click a Google ad, later see a Facebook ad, then click a retargeting ad, and finally convert. Attribution models assigned fractional credit to each touchpoint.
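The fractional-credit idea can be sketched in a few lines of Python. This is a hypothetical journey using linear attribution, where every touchpoint gets an equal share:

```python
# Linear multi-touch attribution sketch: split one conversion's credit
# evenly across every touchpoint in the journey (hypothetical path).

def linear_credit(path):
    """Return each channel's fractional share of a single conversion."""
    credit = {}
    for touch in path:
        credit[touch] = credit.get(touch, 0.0) + 1 / len(path)
    return credit

journey = ["google_search", "facebook_ad", "facebook_retargeting", "facebook_ad"]
print(linear_credit(journey))
# → {'google_search': 0.25, 'facebook_ad': 0.5, 'facebook_retargeting': 0.25}
```

Other models just weight the shares differently — position-based ("U-shaped") attribution gives 40% each to the first and last touch. Every variant, though, assumes the full path is observable, which is exactly what cookie loss breaks.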
All three depended on the ability to identify the same user across different websites. Without third-party cookies, that ability evaporated for Chrome's roughly 65% share of browser traffic. Combined with Safari (which blocked third-party cookies by default in 2020) and Firefox (2019), close to 90% of browser traffic now blocks cross-site tracking by default.
The Three Approaches That Are Working
No single methodology replaces what cookies provided. Instead, the industry is converging on three complementary approaches.
First-Party Data Infrastructure
The most immediate response has been a scramble to collect first-party data — information that customers provide directly through interactions with your own properties.
Email addresses. Phone numbers. Purchase history. Account creation. Newsletter signups. Loyalty programs. Every piece of data a customer gives you voluntarily becomes far more valuable in a cookieless world.
The brands that invested in first-party data infrastructure before 2025 have a significant advantage. Sephora's Beauty Insider loyalty program, with over 34 million members, provides rich behavioral data that doesn't depend on third-party tracking. Amazon's entire advertising business — which generated $46.9 billion in 2023 — runs on first-party purchase and browsing data collected on its own properties.
For smaller brands, the first-party data gap is harder to close. Building a meaningful customer database requires time, volume, and a compelling reason for users to share information. Brands that spent the past decade renting audiences from Facebook instead of building their own now face the consequences.
The technical challenge is consolidation. Most brands have customer data scattered across their website analytics, email platform, CRM, payment processor, and customer support tools. A Customer Data Platform (CDP) — from vendors like Segment, mParticle, or Snowplow — aggregates these sources into unified customer profiles. According to the CDP Institute, adoption of CDPs grew 25% year-over-year in 2024, with mid-market companies driving most of the growth.
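A toy version of that consolidation step — identity stitching keyed on a shared identifier — might look like the sketch below. The field names and sources are hypothetical; real CDPs also handle fuzzy matching, consent flags, and schema mapping:

```python
# CDP-style identity stitching sketch: merge records from several tools
# into one profile per customer, keyed on a normalized email address.

def unify(records):
    """records is a list of (source_name, record_dict) pairs."""
    profiles = {}
    for source, rec in records:
        key = rec["email"].lower()                 # normalize the identity key
        profile = profiles.setdefault(key, {"sources": []})
        profile["sources"].append(source)
        for field, value in rec.items():
            profile.setdefault(field, value)       # first source to supply a field wins

    return profiles

records = [
    ("crm",      {"email": "Ana@example.com", "name": "Ana"}),
    ("payments", {"email": "ana@example.com", "lifetime_value": 420}),
    ("support",  {"email": "ana@example.com", "open_tickets": 1}),
]
profiles = unify(records)
print(profiles["ana@example.com"]["sources"])
# → ['crm', 'payments', 'support']
```

The hard part in production is not the merge itself but deciding which source wins when fields conflict — the "first wins" rule above is the simplest possible policy.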
Marketing Mix Modeling (MMM) Renaissance
Marketing mix modeling isn't new. It's a statistical technique from the 1960s that uses aggregate data — total spend by channel, total revenue by period, seasonal adjustments — to estimate the impact of each marketing channel on business outcomes.
MMM fell out of favor during the precision-targeting era because it couldn't match the granularity of click-level attribution. Why use a statistical estimate when you could track actual conversions to actual clicks?
Now that click-level tracking is degraded, MMM is back. But the modern version looks different from the 1960s original.
Modern MMM is faster. Traditional models required months of data and weeks of analysis. Tools like Google's Meridian (open-sourced in 2024), Meta's Robyn, and startups like Recast and Paramark run models in days using Bayesian statistics and machine learning.
Modern MMM incorporates more variables. Beyond channel spend, models now include weather, competitive activity, macroeconomic indicators, and organic social metrics. The models are still estimates — they'll never match the precision of individual-level tracking — but they're getting more accurate.
The limitation remains granularity. MMM tells you that Facebook drove approximately 18% of last quarter's revenue. It doesn't tell you which Facebook campaigns, which creatives, or which audiences performed best. For tactical optimization, you still need channel-level data. MMM provides strategic direction, not tactical guidance.
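To make the idea concrete, here is a deliberately minimal sketch of the regression at the heart of MMM — ordinary least squares on aggregate weekly data. The numbers are hypothetical and the model omits adstock, saturation curves, and Bayesian priors, which are precisely the parts tools like Meridian and Robyn add:

```python
# Minimal MMM sketch: fit revenue ~ baseline + beta_i * channel_spend_i
# by ordinary least squares (normal equations + Gaussian elimination).

def solve(A, b):
    """Solve the linear system A x = b by Gaussian elimination with pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]   # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_mmm(spend_rows, revenue):
    """OLS fit of revenue on an intercept plus per-channel spend columns."""
    X = [[1.0] + row for row in spend_rows]            # prepend intercept column
    k = len(X[0])
    XtX = [[sum(X[t][i] * X[t][j] for t in range(len(X))) for j in range(k)]
           for i in range(k)]
    Xty = [sum(X[t][i] * revenue[t] for t in range(len(X))) for i in range(k)]
    return solve(XtX, Xty)

# Hypothetical weekly data: [search_spend, social_spend] in $k, revenue in $k.
spend = [[10, 5], [12, 8], [8, 4], [15, 10], [11, 6], [9, 9]]
revenue = [100 + 3.0 * s + 1.5 * m for s, m in spend]  # synthetic ground truth
base, b_search, b_social = fit_mmm(spend, revenue)
print(round(base, 1), round(b_search, 2), round(b_social, 2))
# → 100.0 3.0 1.5
```

On real data the coefficients come with wide uncertainty bands rather than exact recovery, which is why the serious tools are Bayesian: the posterior tells you how much to trust each channel estimate.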
Incrementality Testing
Incrementality testing answers the most important question in marketing measurement: what would have happened if we hadn't spent this money?
The methodology is straightforward. Take a market or audience and split it into two groups: one that sees the ad (test) and one that doesn't (holdout). Compare the outcomes. The difference is the incremental impact of the advertising.
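Reading out such a test is simple arithmetic, sketched below with hypothetical numbers: the holdout's conversion rate serves as the counterfactual for the test group.

```python
# Incrementality readout sketch: use the holdout group's conversion rate
# as the counterfactual for the test group (hypothetical geo test).

def incremental_lift(test_conv, test_pop, holdout_conv, holdout_pop):
    """Conversions in the test group beyond what the holdout implies."""
    expected = holdout_conv * test_pop / holdout_pop   # counterfactual conversions
    return test_conv - expected

lift = incremental_lift(test_conv=5400, test_pop=100_000,
                        holdout_conv=4800, holdout_pop=100_000)
spend_in_test_geos = 30_000
print(lift, spend_in_test_geos / lift)
# → 600.0 50.0
```

So of the 5,400 conversions in the test geos, only 600 are attributable to the ads, at a cost of $50 per truly incremental conversion — a very different number than spend divided by all 5,400 conversions.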
Geo-based testing — running ads in some cities but not others and comparing sales — is the most common implementation. Meta, Google, and TikTok all offer built-in incrementality testing frameworks. Third-party platforms like Measured and InMarket offer cross-channel incrementality measurement.
The insight that incrementality testing reveals is often uncomfortable. Studies consistently show that 30-50% of performance marketing spend is non-incremental — meaning the conversions would have happened anyway. Brand search campaigns are the worst offenders. People searching for your brand name were already going to find you. Paying for the click just moves the conversion from organic to paid without creating additional revenue.
A 2024 analysis by Nielsen found that brands running incrementality tests alongside traditional attribution models reduced wasted ad spend by an average of 22%. The tests don't just measure differently — they measure something more useful.
The Emerging Standard
The measurement approach that's crystallizing among sophisticated marketing organizations looks like this:
Strategic layer (quarterly): Marketing mix modeling determines overall channel allocation. How much to spend on Google versus Meta versus TikTok versus TV versus sponsorships. This replaces the "attribution model says Facebook drives the most conversions, so put more money there" approach.
Validation layer (twice a year): Incrementality tests validate the MMM findings. If MMM says YouTube is contributing 12% of revenue, an incrementality test in a subset of markets can confirm or deny that estimate.
Tactical layer (daily/weekly): Platform-native attribution, fed by first-party data pipelines like Meta's Conversions API and Google's enhanced conversions, handles day-to-day campaign optimization. These systems combine first-party data with modeled conversions (statistical estimates for untracked users). They are less accurate than the old cookie-based models, but sufficient for tactical decisions like which creatives to scale and which to pause.
Qualitative layer (ongoing): Self-reported attribution ("How did you hear about us?") provides a check on all the quantitative models. Post-purchase surveys consistently reveal channel contributions that no model captures — podcast mentions, friend recommendations, Reddit threads.
What Smaller Brands Should Do
The measurement framework above requires resources that many smaller brands don't have. The budget for MMM tools, incrementality tests, and CDPs starts at $50K per year and scales quickly.
For brands spending under $500K annually on marketing, a simpler approach works:
Blended CAC as the primary metric. Stop trying to attribute customers to individual channels. Instead, divide total marketing spend by total new customers. If blended CAC is below your target, increase spend. If it's above, reduce. This is crude but honest — it doesn't pretend to know things it can't know.
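In code, the whole model fits in a few lines (spend figures below are hypothetical):

```python
# Blended CAC sketch: total marketing spend over total new customers,
# plus the scale-up / pull-back decision (hypothetical monthly numbers).

spend_by_channel = {"google": 18_000, "meta": 12_000, "email": 2_000}
new_customers = 400
target_cac = 95.0

blended_cac = sum(spend_by_channel.values()) / new_customers
print(blended_cac)                                    # → 80.0
print("increase spend" if blended_cac < target_cac else "reduce spend")
# → increase spend
```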
Self-reported attribution on every form. Add "How did you hear about us?" as a required field on signup, purchase, and inquiry forms. Make the options specific (not just "social media" — which platform? what kind of content?). Aggregate monthly. The data is imperfect but directionally useful.
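Aggregating those responses is trivial once the answer options are consistent (responses below are hypothetical):

```python
# Monthly tally of "How did you hear about us?" survey responses.
from collections import Counter

responses = ["podcast", "youtube", "friend", "podcast",
             "reddit", "podcast", "youtube"]
print(Counter(responses).most_common(3))
# → [('podcast', 3), ('youtube', 2), ('friend', 1)]
```

The work is in the form design, not the counting: free-text fields need to be bucketed before a tally like this means anything.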
Channel-off testing. Once a quarter, turn off one paid channel entirely for two to four weeks and observe the impact on blended metrics. If you pause Facebook ads and total revenue drops 5%, Facebook is probably contributing about 5% of revenue — regardless of what Facebook's attribution says.
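The readout is a back-of-the-envelope comparison of the pause window against a matched baseline window (numbers below are hypothetical):

```python
# Channel-off test readout: revenue with the channel paused vs. a
# comparable baseline window. The gap approximates the channel's
# true contribution (hypothetical weekly figures).

baseline_weekly_revenue = 250_000   # average weekly revenue before the pause
paused_weekly_revenue = 237_500     # average weekly revenue with the channel off

contribution = 1 - paused_weekly_revenue / baseline_weekly_revenue
print(f"{contribution:.1%}")        # → 5.0%
```

In practice you would control for seasonality by comparing against the same weeks last year or against regions where the channel stayed on.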
UTM discipline. Tag every link, everywhere. UTMs are first-party by nature and survive cookie deprecation. A proper UTM taxonomy — consistent naming conventions for source, medium, campaign, and content — provides the foundation for whatever measurement tools you adopt later.
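A small helper that enforces the taxonomy keeps tags consistent. The naming convention below (lowercase, hyphenated values) is illustrative — the point is that every link goes through one function:

```python
# UTM tagging sketch: build consistently tagged links with the stdlib,
# normalizing values so reports aggregate cleanly.
from urllib.parse import urlencode

def utm_link(base_url, source, medium, campaign, content=None):
    params = {"utm_source": source, "utm_medium": medium,
              "utm_campaign": campaign}
    if content:
        params["utm_content"] = content
    # enforce one convention: lowercase, hyphens instead of spaces
    params = {k: v.lower().replace(" ", "-") for k, v in params.items()}
    return f"{base_url}?{urlencode(params)}"

link = utm_link("https://example.com/pricing",
                "newsletter", "email", "Spring Launch", "cta-button")
print(link)
# → https://example.com/pricing?utm_source=newsletter&utm_medium=email&utm_campaign=spring-launch&utm_content=cta-button
```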
The Bigger Shift
Cookie deprecation forced a measurement reckoning, but the implications extend beyond measurement.
The era of precision targeting created a specific kind of marketing: bottom-funnel, conversion-optimized, performance-obsessed. It rewarded short-term thinking and punished brand building. Why invest in something that won't show up in an attribution report for six months when you can run a retargeting campaign that converts tomorrow?
Without precision targeting, that short-term calculus breaks down. Brands are being forced — many for the first time — to think about building demand rather than just capturing it. To invest in content, community, and brand awareness that can't be attributed to a click but can be measured through blended metrics, incrementality, and market share.
The irony is that this is what marketing was always supposed to be. The cookie era was the aberration — a two-decade period where an unusually precise tracking mechanism convinced brands that marketing was an engineering problem with deterministic outputs.
Marketing has never been deterministic. It's probabilistic at best. The post-cookie era just makes that obvious.