
The Weekly Dashboard We Run Across 300+ Brand Accounts

The 9-metric weekly dashboard we run across 300+ brand accounts — covering MER, CPA, creative health, and spend pacing with clear flag thresholds.

Jordan Glickman·May 10, 2026·9
Operations

Data without a review rhythm is just noise.

Most agencies have access to the same data their clients do. The difference between an agency that drives results and one that just reports on them is not the data. It is the discipline of knowing which numbers to look at, in which order, and on what cadence.

At Impremis, I manage media across hundreds of brand accounts simultaneously. That scale forces operational discipline that smaller shops can avoid for a while — until they can't. When you're accountable for hundreds of accounts, the monitoring process cannot be improvised. It needs to be a system.

Here is the weekly dashboard review we actually run: what's on it, why each metric earned its slot, and how we use what we find on Monday morning.

Image brief: Three-column header row — "Business Health / Media Performance / Creative Performance" — with three metrics listed beneath each column. alt: "Three-tier performance dashboard." caption: "Data without a review rhythm is just noise."

Weekly is the right cadence

Before getting into the metrics, it is worth defending the cadence itself.

Daily monitoring is reactive and noisy. Performance marketing accounts have natural day-of-week variance — Monday looks nothing like Friday. A single bad day is almost never a signal worth acting on, but teams that monitor daily tend to over-optimize on short data windows. That disrupts algorithm learning phases and introduces variance instead of reducing it.

Monthly reviews are too slow. A problem that surfaces in week one and goes undetected until the end-of-month report has already consumed four weeks of budget and potentially damaged a client relationship that earlier intervention could have protected.

Weekly is the right balance. Frequent enough to catch emerging problems before they become expensive. Slow enough that the data volume is meaningful. The Monday morning dashboard review is non-negotiable across every account we manage.

The three-tier structure

The dashboard is organized by purpose, not by platform. Each tier answers a different question at a different lag.

Tier 1: Business Health. Is the business working? These are the numbers the client cares about most — and the ones that determine whether the account relationship is at risk.

Tier 2: Media Performance. Is the media buying working? One layer below business outcomes. Lets you diagnose channel-level efficiency before it surfaces in revenue.

Tier 3: Creative Performance. Are the creative assets working? Leading indicators. Creative metrics move before media efficiency moves, which moves before business outcomes move. Catching creative fatigue here — before the client feels it in revenue — is where most of the proactive value gets created.

Tier 1: Business health metrics

Marketing Efficiency Ratio (MER)

MER is total revenue divided by total ad spend, measured at the business level, not the platform level. It is the single most important number in the weekly review because it cannot be gamed by attribution windows or platform-level discrepancies.

If MER is healthy and trending in the right direction, most other problems are manageable. If MER is declining, there is something structural happening — and that signal comes before it shows up in ROAS, CPA, or the client's P&L. We track it weekly against a 4-week rolling average and year-over-year when seasonal data is available.
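As a minimal sketch of that weekly check (function names, the flag helper, and the sample numbers are illustrative, not our internal tooling):

```python
def mer(revenue: float, ad_spend: float) -> float:
    """Marketing Efficiency Ratio: total revenue / total ad spend, business-level."""
    return revenue / ad_spend

def mer_flagged(current_week: float, prior_4_weeks: list[float],
                drop_threshold: float = 0.10) -> bool:
    """Flag when this week's MER is down more than 10% vs. the 4-week rolling average."""
    baseline = sum(prior_4_weeks) / len(prior_4_weeks)
    return current_week < baseline * (1 - drop_threshold)

print(mer(155_000, 50_000))                    # 3.1
print(mer_flagged(3.1, [3.5, 3.6, 3.7, 3.6]))  # True: more than 10% below the 3.6 baseline
```

The rolling baseline, rather than a fixed target, is what keeps the flag meaningful across accounts with very different absolute MER levels.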

New Customer Acquisition Rate

What share of this week's purchases came from first-time buyers? This metric tells you whether paid media is expanding the customer base or just recirculating revenue through existing buyers via retargeting.

A healthy account typically holds a new customer rate at or above 40%. When that number drops consistently, it usually means the retargeting pool is being over-served relative to prospecting, or that top-of-funnel investment has been cut and the awareness pipeline is drying up.
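The check itself is simple arithmetic; a sketch with hypothetical weekly numbers:

```python
def new_customer_rate(first_time_purchases: int, total_purchases: int) -> float:
    """Share of the week's purchases that came from first-time buyers."""
    return first_time_purchases / total_purchases

def below_floor(rate: float, floor: float = 0.40) -> bool:
    """Flag when the new customer rate drops under the 40% floor."""
    return rate < floor

rate = new_customer_rate(84, 240)   # 0.35
print(below_floor(rate))            # True: recirculating revenue, not expanding the base
```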

Revenue Per Visitor (RPV)

Total revenue divided by total site visitors for the week. As I've covered in the RPV framework post, RPV is the metric that ties media quality, landing page performance, and offer strength into a single number.

We track RPV by traffic source. Blended RPV tells you the health of the overall funnel. RPV by channel tells you where the leak is.
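A toy example of why the channel split matters (all figures hypothetical): the blended number can look acceptable while one channel quietly drags it down.

```python
def rpv(revenue: float, visitors: int) -> float:
    """Revenue Per Visitor: total revenue / total site visitors."""
    return revenue / visitors

by_channel = {
    "meta":   (42_000.0, 15_000),
    "tiktok": ( 6_000.0, 12_000),
    "email":  (18_000.0,  4_000),
}
channel_rpv = {ch: round(rpv(rev, vis), 2) for ch, (rev, vis) in by_channel.items()}
blended = round(rpv(sum(r for r, _ in by_channel.values()),
                    sum(v for _, v in by_channel.values())), 2)

print(blended)       # 2.13 -- the overall funnel looks serviceable
print(channel_rpv)   # {'meta': 2.8, 'tiktok': 0.5, 'email': 4.5} -- the leak is TikTok
```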

Tier 2: Media performance metrics

CPA by campaign type

We separate CPA reporting by campaign function — prospecting, retargeting, and retention. Blended CPA is a useful headline number but it masks the efficiency of each layer.

A rising prospecting CPA usually signals one of three things: creative fatigue, audience saturation, or a targeting issue. A rising retargeting CPA tells you the audience quality from the prospecting layer has declined, or retargeting creative is wearing out. Separating them makes the diagnosis precise instead of approximate.
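To make the masking effect concrete, here is a hedged sketch with hypothetical layer splits:

```python
def cpa(spend: float, purchases: int) -> float:
    """Cost per acquisition for one campaign layer."""
    return spend / purchases

# Hypothetical week. The blended CPA looks fine; the prospecting layer does not.
layers = {
    "prospecting": (30_000.0, 400),   # CPA 75.00
    "retargeting": (12_000.0, 480),   # CPA 25.00
    "retention":   ( 3_000.0, 150),   # CPA 20.00
}
by_layer = {name: cpa(s, p) for name, (s, p) in layers.items()}
blended = cpa(sum(s for s, _ in layers.values()),
              sum(p for _, p in layers.values()))

print(round(blended, 2))        # 43.69 -- a healthy-looking headline number
print(by_layer["prospecting"])  # 75.0 -- where the actual efficiency problem lives
```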

ROAS with attribution context

ROAS is on the dashboard because clients track it and because it is a useful directional signal. But it is always presented with the attribution window and model explicitly noted — because ROAS without attribution context is misleading, and the same 3.5x can represent completely different economic realities depending on the model.

We standardize attribution settings across all accounts and flag any platform-driven changes when they appear in the weekly review.

Spend pacing vs. target

Are we on track to hit the monthly budget target at the current daily run rate? Underspending is as much a problem as overspending. An account that ends the month at 80% of target spend did not just leave money unspent — it left audience reach, creative testing volume, and algorithm learning on the table.

We flag accounts more than 10% off pacing by Wednesday of each week. That gives enough time to make adjustments before the weekend, which is typically the highest-volume conversion window across most DTC categories.
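The pacing check is a run-rate projection; a sketch of one way to compute it (dates and amounts are hypothetical):

```python
import calendar
from datetime import date

def pacing_ratio(spend_to_date: float, monthly_target: float, today: date) -> float:
    """Projected month-end spend at the current daily run rate, as a share of target."""
    days_in_month = calendar.monthrange(today.year, today.month)[1]
    projected = (spend_to_date / today.day) * days_in_month
    return projected / monthly_target

def off_pacing(ratio: float, tolerance: float = 0.10) -> bool:
    """Flag when the projection is more than 10% over or under target."""
    return abs(ratio - 1.0) > tolerance

ratio = pacing_ratio(18_000, 50_000, date(2026, 5, 13))
print(round(ratio, 3))    # 0.858 -- pacing toward ~86% of the monthly target
print(off_pacing(ratio))  # True: underspending, flag it
```

Note the flag is symmetric: a projection of 115% of target trips it just as an 85% projection does.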

Tier 3: Creative performance metrics

Thumb stop rate by asset

The percentage of viewers who watch past the first three seconds of each video creative, tracked at the individual asset level — not blended at the campaign level.

When thumb stop rate drops on a previously strong asset, it is an early signal of creative fatigue. The asset has reached enough of the target audience that the hook no longer generates a stop reaction. This is a leading indicator that CPA increases are coming if no creative action is taken. We flag any asset whose thumb stop rate declines more than 15% week over week for immediate creative team review.
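A minimal sketch of the week-over-week fatigue check (helper names are illustrative):

```python
def thumb_stop_rate(three_second_views: int, impressions: int) -> float:
    """Share of viewers who watch past the first three seconds of a video asset."""
    return three_second_views / impressions

def fatigue_flag(current: float, prior: float, drop_threshold: float = 0.15) -> bool:
    """Flag an asset whose thumb stop rate fell more than 15% week over week."""
    return current < prior * (1 - drop_threshold)

print(fatigue_flag(0.22, 0.28))  # True: down ~21% WoW, send to creative review
print(fatigue_flag(0.26, 0.28))  # False: within normal week-to-week variance
```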

Creative efficiency score

An internal composite metric that combines thumb stop rate, link click-through rate, and cost per purchase into a single score — allowing us to stack-rank all active creatives in an account at a glance.

Top-scoring creatives get increased budget allocation. Bottom-scoring creatives get flagged and evaluated for pause. This happens weekly, so budget is always flowing toward the creative assets that are currently earning it rather than assets that earned it last month.
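The exact weights and normalization in our internal score are not public; the following is one plausible way to build a composite like it, with illustrative weights (0.4 / 0.3 / 0.3) and sample assets:

```python
def normalize(values: list[float]) -> list[float]:
    """Min-max scale to [0, 1]; a constant column maps to 0.5."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.5 for v in values]

def rank_creatives(assets: list[tuple[str, float, float, float]]) -> list[tuple[str, float]]:
    """assets: (name, thumb_stop_rate, link_ctr, cost_per_purchase).
    Higher thumb stop and CTR score up; lower cost per purchase scores up."""
    tsr = normalize([a[1] for a in assets])
    ctr = normalize([a[2] for a in assets])
    cpp = normalize([a[3] for a in assets])
    scored = [(a[0], round(0.4 * t + 0.3 * c + 0.3 * (1 - p), 3))
              for a, t, c, p in zip(assets, tsr, ctr, cpp)]
    return sorted(scored, key=lambda s: s[1], reverse=True)

ranked = rank_creatives([
    ("hook_a", 0.30, 0.012, 38.0),
    ("hook_b", 0.22, 0.015, 45.0),
    ("hook_c", 0.18, 0.008, 60.0),
])
print([name for name, _ in ranked])  # ['hook_a', 'hook_b', 'hook_c']
```

The stack-rank output is what drives the weekly budget reallocation: top of the list gets more spend, bottom quartile gets evaluated for pause.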

New creative volume vs. target

How many new creative concepts entered testing this week versus the account's target testing volume? This is a process metric, not a performance metric — but it is one of the most important things we track.

Creative pipeline health is a leading indicator of future performance. An account falling behind on new creative testing is accumulating risk, even if current performance looks stable. When existing creatives fatigue and there are no new assets ready to replace them, there is always a performance gap. We set a minimum monthly testing target for every account based on spend level, and the weekly dashboard tracks whether production is on pace. (The creative testing system post has more on how we structure that volume.)

The full dashboard

| Metric | Tier | Flag Threshold |
|---|---|---|
| Marketing Efficiency Ratio (MER) | Business Health | Down 10% vs. 4-week average |
| New Customer Acquisition Rate | Business Health | Below 40% of total purchases |
| Revenue Per Visitor by Channel | Business Health | Down 15% vs. prior week |
| CPA by Campaign Type | Media Performance | Up 20% vs. 4-week average |
| ROAS with Attribution Context | Media Performance | Directional — window must be noted |
| Spend Pacing vs. Target | Media Performance | More than 10% off by Wednesday |
| Thumb Stop Rate by Asset | Creative Performance | Down 15% week over week |
| Creative Efficiency Score | Creative Performance | Bottom 25% of assets flagged |
| New Creative Volume vs. Target | Creative Performance | Behind target by more than one concept |

How the review actually runs

The dashboard is reviewed by account leads every Monday morning. Not a passive read — an active triage.

Every flagged metric produces one of three outcomes:

  1. A defined action the account lead takes before end of Monday
  2. An escalation to a senior strategist if the flag is ambiguous or the stakes are high
  3. An acknowledgment that the flag is being monitored with no immediate action required

Nothing stays in limbo. Every flag has a resolution or escalation path by Monday afternoon.

That discipline is what makes scale manageable. Flags that are acknowledged and forgotten are worse than no flags at all — they create the illusion of monitoring without the substance of it.

What the portfolio reveals that individual accounts can't

One of the most undervalued benefits of running a consistent weekly review across hundreds of accounts is what it reveals at the macro level.

When MER is declining across multiple accounts in the same category simultaneously, it is not an account-level problem. It is a market-level signal — CPMs rising, category competition increasing, consumer demand shifting. That macro view changes the strategic advice I give to every client in that category, not just the ones whose accounts are flagged.

When creative fatigue is accelerating across the portfolio, it tells me something about platform behavior — not just a single account. TikTok feed velocity increases. The window before frequency fatigue shortens. That insight goes into every creative brief we write, not just the briefs for struggling accounts.

That portfolio perspective is one of the things a multi-account agency can offer that a single brand's in-house team genuinely cannot replicate. The dashboard is how we systematize access to it.

FAQ

What's the minimum account size for this dashboard to make sense? $30K/month in spend and above. Below that, the data volume can make weekly thresholds noisy. At sub-$30K, biweekly or monthly review with a simpler 4-metric version works better.

What if a metric is flagged every week? That is a structural problem, not a weekly optimization issue. Chronic flagging means something in the underlying account setup or offer is broken. The dashboard surfaces it — but the fix requires a deeper audit, not a Tuesday adjustment.

Should clients see this dashboard? A simplified version, yes. Internal flag thresholds and composite scoring should stay internal. What clients should see weekly: MER trend, new customer rate, spend pacing, and one creative performance callout. More than that is noise for most clients.

What tools do you use to build it? The source data comes from the ad platforms and GA4. We assemble it in a standardized reporting layer. The specific tool matters less than the standardization — the dashboard must look identical across every account, or the review process breaks down.

Closing

If you are running performance marketing at meaningful scale without a formalized weekly review process, the time to build it is before performance degrades — not after.

Start with the nine metrics above. Set your flag thresholds. Assign clear ownership. Define the escalation path for every flag type. Run it consistently for 90 days.

The rhythm is the advantage. Not the data.
