
Dynamic Creative Optimization (DCO): How It Works on Meta Ads

An automated ad-serving method that assembles the best-performing combination of creative elements in real time for each individual viewer.

Dynamic Creative Optimization (DCO) is an ad-serving approach where the platform automatically assembles and delivers the best-performing combination of creative elements for each individual viewer in real time.

Instead of publishing a finished ad and hoping it works for everyone, you provide raw ingredients - multiple images, headlines, body copy options, and CTAs - and Meta’s algorithm builds and tests combinations on the fly, routing each user to the version most likely to drive a conversion.

How DCO works on Meta

When a user is eligible to see your ad, Meta evaluates their profile - interests, past behavior, device, time of day, placement - and selects the combination of ingredients most likely to perform for that specific person.

You enable DCO in Meta Ads Manager by toggling “Dynamic Creative” inside an ad set. The asset limits are:

  • Up to 10 images or videos
  • Up to 5 primary text options
  • Up to 5 headlines
  • Up to 5 descriptions
  • Up to 5 CTA button labels

Meta generates all viable combinations from these inputs and allocates delivery toward the ones performing best across the audience. You do not choose which combination runs - the algorithm does.
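For reference, these ingredient lists map onto a structured payload in Meta's Marketing API: the `asset_feed_spec` field on an ad creative. The sketch below is illustrative only - the field names follow Meta's public API documentation, but the values are hypothetical placeholders, and you should verify the exact payload shape against the current API version before using it.

```python
# Illustrative Dynamic Creative asset feed, modeled on the Marketing API's
# asset_feed_spec field. Values are hypothetical placeholders.
ASSET_LIMITS = {"images": 10, "bodies": 5, "titles": 5,
                "descriptions": 5, "call_to_action_types": 5}

asset_feed_spec = {
    "images": [{"hash": f"<image-hash-{i}>"} for i in range(1, 4)],
    "bodies": [{"text": "Free shipping on every order."},
               {"text": "Join 50,000 happy customers."}],
    "titles": [{"text": "Built to Last"},
               {"text": "Your New Everyday Carry"}],
    "descriptions": [{"text": "Lifetime warranty included."}],
    "call_to_action_types": ["SHOP_NOW", "LEARN_MORE"],
}

def validate(spec: dict) -> list[str]:
    """Return the fields that exceed Meta's documented asset caps."""
    return [field for field, cap in ASSET_LIMITS.items()
            if len(spec.get(field, [])) > cap]

print(validate(asset_feed_spec))  # [] -> all fields within limits
```

The validation step mirrors what Ads Manager enforces in the UI: exceed any cap and the ad set will not save.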

With maximum assets uploaded (10 images x 5 texts x 5 headlines x 5 descriptions x 5 CTAs), there are theoretically 6,250 possible combinations. Meta won’t test all of them - it focuses on combinations with the most signal - but the combinatorial scope is what makes DCO different from standard A/B testing.
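The arithmetic behind that figure is just a product over the slots - one choice per slot per assembled ad:

```python
from math import prod

# Asset options per slot at Meta's Dynamic Creative limits.
slots = {"images": 10, "primary_texts": 5, "headlines": 5,
         "descriptions": 5, "ctas": 5}

# Each assembled ad picks exactly one option per slot,
# so the combination space is the product of the counts.
combinations = prod(slots.values())
print(combinations)  # 6250
```

Swap in your own asset counts to see how quickly the space grows - even 3 options per slot yields 243 combinations.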

DCO vs. standard creative testing

In standard A/B testing, you publish two or three complete ads and measure which performs better across the entire audience over a defined period. DCO goes further in two ways.

First, it tests at the ingredient level rather than the assembled-ad level. You don’t need to manually construct every combination you want to test - Meta does the combinatorial work.

Second, it personalizes per user rather than per audience segment. A/B testing finds the best single creative for an audience. DCO finds the best creative for each individual within that audience. The same ad set might show Image A + Headline 3 to one user and Image C + Headline 1 to another, based on what Meta predicts each person responds to.

This is the theoretical advantage. In practice, the personalization signal is only as good as Meta’s data about each user. Apple’s iOS 14 privacy changes (App Tracking Transparency) degraded that signal for some user segments. The performance improvement from DCO over well-run manual testing tends to be 5-15% rather than transformational.

Expected performance

Meta’s internal benchmarks for DCO: 10-20% higher CTR and 8-15% lower CPA versus single static creatives. Independent data from Segwise’s 2025 analysis of 1.1 million ad variations found IPM (installs per mille) lifts of up to 65% on social platforms when DCO was used versus static single creatives.

The lift is largest for advertisers who were previously running 1-2 static variants - there is simply more room to improve. Advertisers already running 5-10 manually tested variants typically see a smaller improvement (5-10%) because the manual testing has already found most of the signal.

How to interpret DCO reports

Meta’s reporting for DCO shows aggregate performance at the ad set level plus a breakdown of which creative assets got the most delivery. You can see which images, headlines, and copy variants are winning - but the report shows each element’s performance independently, not specific combinations.

Meta does not expose true combination-level data. The most granular view is Meta’s Creative Reporting: filter by “Dynamic Creative Element” and look at the asset-level breakdown. This tells you which image generated the best CPA, which headline drove the most CTR, and which CTA had the best conversion rate. It does not tell you which image+headline combination was optimal.

For most advertisers, the asset-level breakdown is enough to make decisions. Pair the best-performing image with the best-performing headline and run that as a static ad - this is the standard “DCO as a testing tool, static as the delivery vehicle” workflow.

When to use DCO

DCO is most valuable when:

  • You are in early discovery mode for a product category or audience and haven’t identified what works
  • You have multiple meaningfully different creative angles to test (not just image color variations)
  • You have at least 3-5 variations of each creative element to make the combinatorics worthwhile
  • You want to reduce manual creative management overhead at scale

In our Ridge Wallet teardown, Ridge uses DCO for 50% of their ring product ads - a category where they’re still learning what resonates - but only 9% of their wallet ads, where years of testing have produced proven winners. That ratio makes sense: DCO for discovery, static for known performers.

When not to use DCO

DCO is counterproductive when:

  • You have a single proven hero creative that reliably outperforms everything else. Running DCO introduces algorithmic noise and can actually reduce performance by pulling delivery away from what works.
  • You need clean test data on a specific variable. DCO mixes variables simultaneously, so if you want to know whether Headline A beats Headline B while holding everything else constant, run a standard A/B test instead.
  • Your creative assets are too similar. If all 10 images are product shots with slightly different backgrounds, DCO has very little variation to exploit. The algorithm will quickly settle on one option.

DCO vs. Advantage+ Creative

These are related but different features. DCO (Dynamic Creative toggle in Ads Manager) lets you upload multiple named assets and test combinations, giving you visibility into which ingredients win.

Advantage+ Creative is a separate feature that automatically applies enhancements to a single creative: generating background variations, adjusting brightness/contrast, adding text overlays. You upload one image; Meta creates variants automatically.

Advantage+ Creative produces more variants but gives you less control and visibility. DCO produces fewer variants but you choose exactly what those variants are. For most systematic creative testing, DCO is the more useful tool.

Frequently asked questions

How many impressions does DCO need before producing reliable data? At least 1,000 impressions per asset to identify directional signals, and 50-100 conversions per element before making scaling decisions with confidence. Below that volume you cannot reach 95% statistical confidence, and under-testing produces false winners. If your budget doesn’t support that volume within 7-14 days, a standard manual A/B test with two variants will be more reliable.
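The budget check above can be sketched as a quick back-of-envelope calculation. The 1,000-impressions-per-asset threshold and 7-14 day window come from the guidance here; the $15 CPM is an assumption - replace it with your own account's number:

```python
def days_to_test(num_assets: int, daily_budget: float, cpm: float = 15.0,
                 impressions_per_asset: int = 1_000) -> float:
    """Days needed for every asset to clear the impression threshold.

    cpm is the cost per 1,000 impressions; the $15 default is an
    assumption, not a Meta benchmark.
    """
    needed_impressions = num_assets * impressions_per_asset
    daily_impressions = daily_budget / cpm * 1_000
    return needed_impressions / daily_impressions

# 30 assets (10 images + 5 texts + 5 headlines + 5 descriptions + 5 CTAs)
# at $50/day and a $15 CPM:
print(round(days_to_test(30, 50), 1))  # 9.0 -> inside the 7-14 day window
```

At $25/day the same test stretches to 18 days, past the window - which is the point of the FAQ answer below on small budgets.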

Can I use DCO with my product catalog? No. DCO uses manually uploaded creative assets, not catalog feeds. For catalog-based personalization, use Dynamic Product Ads (DPA) instead. They serve different purposes: DPA for showing specific products from your catalog based on browsing behavior, DCO for testing creative variation across a defined set of assets.

Does DCO work at small budgets? Not well. The algorithm needs enough impressions to identify patterns across combinations. At very low daily budgets ($20-50/day), DCO often converges on one combination quickly due to limited data, which defeats the purpose. Better to run a clean two-variant test at that budget level.

Where we've analyzed DCO

Obvi Runs 4x More Google Ads Than Meta Ads. But When You Look at What's Actually Active - the Ratio Reverses.

Meta Ads · Google Ads · Full Teardown · Ads Strategy · Feb 28 · 14 min read

119 Google ads vs 30 Meta ads - but only 22 Google ads are live. Obvi's real ad engine is Meta: 70% video, zero discounts, three pain-point funnels, and a two-track CTA strategy.

Ridge Wallet Marketing Strategy: 273 Meta Ads, 50 Instagram Posts and Website Scraped, Full Funnel Analyzed

Meta Ads · Instagram · Full Teardown · Feb 20 · 18 min read

I scraped Ridge Wallet's entire Meta Ad Library - all 273 active creatives - and analyzed their Instagram, tech stack, and email flows. 88% of their ads lead with value, not discounts.

I Scraped 400 of RYZE's Meta Ads. Here's What a $50M Mushroom Coffee Brand's Ad Machine Actually Looks Like.

Meta Ads · Full Teardown · Ads Strategy · DTC · Mar 6 · 16 min read

400 active ads, 28 body copy variants, one copy powering 56% of the sample. Inside RYZE's two-track Meta strategy - workhorse acquisition engine vs. 207-day brand play - plus a product reformulation their ads gave away.
