Cohort Analysis Deep Dive: Why 'Last Week vs. This Week' Is Useless
Understanding why comparing arbitrary time periods misleads you—and how cohort analysis reveals the truth about your business.
Your marketing manager walks in with great news.
“This week we had 50 signups. Last week we only had 40. We’re up 25%!”
You celebrate. You tell the founder. You update the board.
Then, a week later, you have 35 signups. You’re down 30% week-over-week.
You panic. Did something break? Did we lose momentum? Should we cut marketing spend?
But here’s the thing: The number of signups you get this week is mostly random noise. Weather, day of the week, whether you happened to publish a viral post—all of that creates variation.
Cohort Analysis fixes this problem by comparing apples to apples.
The Problem With Week-Over-Week Comparisons
“Last week vs. this week” is useless for several reasons:
Reason 1: Day of week matters
If your comparison windows don't cover the same set of days (say, "last week" ran Monday through Sunday, but you pulled "this week" on Saturday, a day short), the comparison is meaningless.
Signups might be:
- Monday: 5
- Tuesday: 6
- Wednesday: 7
- Thursday: 8
- Friday: 9
- Saturday: 3
- Sunday: 2
Last week’s total: 40
But if this week's window covers a different set of days (or is missing a weekend day), the total will differ purely because of timing, not because of any real change in the business.
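You can see this with a toy model: repeat the daily pattern above every week. Any full 7-day window sums to the same total no matter where it starts, but clip even one day and the "weekly" number moves. (The day names and counts are the article's illustrative figures, not real data.)

```python
# The illustrative weekly signup pattern from above, assumed to repeat every week.
daily = {"Mon": 5, "Tue": 6, "Wed": 7, "Thu": 8, "Fri": 9, "Sat": 3, "Sun": 2}
days = list(daily)  # Mon..Sun, in order

def window_total(start_index, length=7):
    """Total signups over `length` consecutive days starting at
    days[start_index], wrapping around the week."""
    return sum(daily[days[(start_index + i) % 7]] for i in range(length))

print(window_total(0))     # full Mon-Sun week: 40
print(window_total(3))     # Thu-Wed, still a full 7-day window: 40
print(window_total(0, 6))  # drop one weekend day and the "week" falls to 38
```

Same underlying behavior, different totals, purely from how the window was cut.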
Reason 2: You can’t see seasonal patterns
If every September is slow for your business, whether because your consumer audience is spending elsewhere on back-to-school or because B2B buyers come back from summer to tightened budgets, you won't notice it in a week-over-week comparison.
You’ll think “something’s wrong” when really, it’s just seasonality.
Reason 3: You can’t separate “new customer quality” from “retention”
Let’s say week 1 you acquired 50 signups. Week 2 you acquired 40 signups. Week 3 you acquired 60 signups.
Revenue is fluctuating wildly. Is it because the quality of new customers is changing? Or because old customers are churning at different rates?
Week-over-week comparisons don’t tell you.
The Cohort Solution
A Cohort is a group of customers who share a common characteristic or experience within a defined time period.
Most commonly, we group by “signup date.”
Example cohort:
“Customers who signed up in January 2024”
This cohort includes everyone who created an account in January, regardless of when we measure them.
Now we can ask: “How many of the January 2024 cohort were active 1 month later? 3 months later? 6 months later?”
| Cohort | Size | Active after 1 month | Active after 3 months | Active after 6 months |
|---|---|---|---|---|
| Jan 2024 | 500 | 85% (425 users) | 60% (300 users) | 40% (200 users) |
| Feb 2024 | 450 | 87% (392 users) | 58% (261 users) | 38% (171 users) |
| Mar 2024 | 480 | 90% (432 users) | 62% (298 users) | 42% (202 users) |
| Apr 2024 | 510 | 88% (449 users) | 64% (326 users) | 45% (229 users) |
| May 2024 | 520 | 91% (473 users) | 66% (343 users) | — |
| Jun 2024 | 495 | 92% (455 users) | — | — |
Now you can see a real pattern: Later cohorts (Apr, May, Jun) have better retention than earlier cohorts (Jan, Feb, Mar).
This tells you: Something improved in your product or onboarding between January and April.
Maybe you fixed a bug. Maybe you added a killer feature. Maybe you improved onboarding. Whatever it was, customers who signed up in April stick around longer than customers who signed up in January.
This is a real, actionable insight. Week-over-week comparisons would never show you this.
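One way to build a table like this from raw data is to bucket users by signup month, then count how many of each bucket show up in the activity log *k* months after signup. A minimal pure-Python sketch, with made-up user IDs and events for illustration:

```python
from collections import defaultdict

# Hypothetical data: user_id -> cohort (signup month), plus activity
# observations as (user_id, months_after_signup) pairs.
signups = {
    1: "2024-01", 2: "2024-01",
    3: "2024-02", 4: "2024-02", 5: "2024-02",
}
activity = [(1, 1), (1, 3), (2, 1), (3, 1), (4, 1), (5, 3)]

def retention_table(signups, activity):
    """Fraction of each cohort seen active k months after signup,
    keyed by (cohort, k)."""
    cohort_size = defaultdict(int)
    for cohort in signups.values():
        cohort_size[cohort] += 1
    active = defaultdict(set)  # (cohort, k) -> distinct active users
    for user, k in activity:
        active[(signups[user], k)].add(user)
    return {
        (cohort, k): len(users) / cohort_size[cohort]
        for (cohort, k), users in active.items()
    }

table = retention_table(signups, activity)
print(table[("2024-01", 1)])  # both Jan users active in month 1 -> 1.0
print(table[("2024-01", 3)])  # one of two Jan users active in month 3 -> 0.5
```

Using a set per `(cohort, k)` cell means a user who triggers several events in the same month is still counted once.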
Cohort Retention Analysis
The most common cohort analysis is retention (what percentage stay active over time).
Here’s how to read it:
| Cohort | Month 0 | Month 1 | Month 2 | Month 3 | Month 6 |
|---|---|---|---|---|---|
| Jan 2024 | 100% | 85% | 72% | 60% | 40% |
| Feb 2024 | 100% | 87% | 75% | 58% | 38% |
| Mar 2024 | 100% | 90% | 78% | 62% | 42% |
Interpretation:
- Month 0 (signup): Everyone is active (100%)
- Month 1: 85% of Jan cohort is still active. 15% have churned.
- Month 2: 72% are still active. 28% have churned overall (though some of the 15% who left in month 1 might have come back).
- Month 3: 60% still active. 40% have churned.
- Month 6: 40% still active. 60% have churned.
By comparing across cohorts, you see:
- Jan cohort: lower retention at every point (85% at month 1, down to 60% by month 3)
- Mar cohort: higher retention throughout (90% at month 1, 62% at month 3)
Later cohorts retain better. Your product is improving.
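That cross-cohort comparison is easy to automate. A small sketch, using the retention fractions transcribed from the table above:

```python
# Retention fractions from the table above, keyed by month offset.
retention = {
    "Jan 2024": {0: 1.00, 1: 0.85, 2: 0.72, 3: 0.60, 6: 0.40},
    "Feb 2024": {0: 1.00, 1: 0.87, 2: 0.75, 3: 0.58, 6: 0.38},
    "Mar 2024": {0: 1.00, 1: 0.90, 2: 0.78, 3: 0.62, 6: 0.42},
}

def best_cohort_at(month):
    """Cohort with the highest retention at a given month offset."""
    return max(retention, key=lambda c: retention[c][month])

# Mar 2024 leads at every offset from month 1 onward.
print(best_cohort_at(1), best_cohort_at(3), best_cohort_at(6))
```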
Cohort Revenue Analysis
You can do the same analysis with revenue instead of retention.
| Cohort | Month 0 | Month 1 | Month 2 | Month 3 | Month 6 |
|---|---|---|---|---|---|
| Jan 2024 | $50k | $35k | $20k | $12k | $4k |
| Feb 2024 | $45k | $33k | $19k | $11k | $3.5k |
| Mar 2024 | $48k | $36k | $22k | $13k | $5k |
This shows:
- Immediate revenue from signup (Month 0): Fluctuates between $45k-$50k
- Revenue drops 25-30% in Month 1 ($50k → $35k for Jan, $48k → $36k for Mar)
- Revenue continues to decline through Month 6
But look at March cohort: Month 0 revenue is slightly lower ($48k vs. $50k), but Month 6 revenue is higher ($5k vs. $4k).
This tells you: Later cohorts might acquire at lower initial value but retain better and generate more long-term value.
This changes your strategy: Maybe you should focus on customer success and retention (since later cohorts keep more value), rather than just trying to sign up bigger customers at the beginning.
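You can make the "lower start, higher long-term value" observation concrete by totaling each cohort's observed revenue. A sketch using the table's numbers (note the table skips months 4-5, so these totals understate true lifetime revenue):

```python
# Revenue in $k by cohort and month offset, transcribed from the table above.
revenue = {
    "Jan 2024": {0: 50, 1: 35, 2: 20, 3: 12, 6: 4},
    "Feb 2024": {0: 45, 1: 33, 2: 19, 3: 11, 6: 3.5},
    "Mar 2024": {0: 48, 1: 36, 2: 22, 3: 13, 6: 5},
}

def observed_total(cohort):
    """Total revenue across the observed month offsets, a rough LTV proxy.
    Months 4-5 are missing from the table, so this understates the true total."""
    return sum(revenue[cohort].values())

print(observed_total("Jan 2024"))  # 121
print(observed_total("Mar 2024"))  # 124: more in total despite a lower month 0
```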
Cohort Acquisition Analysis
You can also analyze by acquisition source.
| Cohort | Organic Search | Paid Search | Referral | Facebook |
|---|---|---|---|---|
| Jan 2024 | 35% | 40% | 15% | 10% |
| Feb 2024 | 38% | 38% | 18% | 6% |
| Mar 2024 | 42% | 35% | 16% | 7% |
| Apr 2024 | 45% | 32% | 15% | 8% |
This shows your organic search acquisition is growing (35% in Jan → 45% in Apr) while paid search is shrinking (40% in Jan → 32% in Apr).
Now combine with retention:
- Organic customers have 50% 6-month retention
- Paid search customers have 35% 6-month retention
- Facebook customers have 25% 6-month retention
- Referral customers have 60% 6-month retention
Your best customers come from referral (60% retention) and organic (50% retention).
Your worst come from Facebook (25% retention).
This is a signal: Double down on organic and referral. Cut or improve Facebook.
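Combining the two tables shows why the channel shift matters even before any cohort matures: the mix-weighted (expected) 6-month retention rises as spend moves toward better channels. A sketch using the figures above:

```python
# Acquisition mix per cohort (fractions) and 6-month retention per channel,
# both taken from the figures above.
mix = {
    "Jan 2024": {"organic": 0.35, "paid": 0.40, "referral": 0.15, "facebook": 0.10},
    "Apr 2024": {"organic": 0.45, "paid": 0.32, "referral": 0.15, "facebook": 0.08},
}
retention_6m = {"organic": 0.50, "paid": 0.35, "referral": 0.60, "facebook": 0.25}

def expected_retention(cohort):
    """Mix-weighted 6-month retention: shifting acquisition toward stickier
    channels raises the blend even if no single channel improves."""
    return sum(share * retention_6m[ch] for ch, share in mix[cohort].items())

print(expected_retention("Jan 2024"))  # 0.35*0.50 + 0.40*0.35 + ... = 0.43
print(expected_retention("Apr 2024"))  # ~0.447, higher purely from the mix shift
```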
Avoiding Cohort Analysis Mistakes
Mistake 1: Incomplete cohorts
If you're looking at June 2024 signups and measuring 6-month retention, you need data through December 2024. If it's only October, the cohort isn't 6 months old yet, and any "6-month retention" figure for it is meaningless.
Solution: Only look at cohorts that are “mature” (you have complete data for the time period you’re measuring).
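A maturity check is a one-liner once you count the whole months elapsed since the cohort began. A minimal sketch (dates are illustrative; the strict `>` requires month N to be fully observed before the cohort counts as mature):

```python
from datetime import date

def is_mature(cohort_month: date, horizon_months: int, today: date) -> bool:
    """True once the cohort's N-month metric is fully observable: N full
    months must have elapsed since the cohort month began."""
    elapsed = (today.year - cohort_month.year) * 12 \
        + (today.month - cohort_month.month)
    return elapsed > horizon_months  # strict: month N must be fully over

today = date(2024, 10, 1)
# June 2024 cohort is only 4 months old: no valid 6-month retention yet.
print(is_mature(date(2024, 6, 1), 6, today))  # False
print(is_mature(date(2024, 1, 1), 6, today))  # True
```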
Mistake 2: Survivorship bias
If you measure "average revenue per customer" within a cohort, you may only be averaging over customers who haven't churned.
The Jan 2024 customers who churned contribute $0, and if they're excluded from the average, the cohort looks healthier than it is.
Solution: Measure both retention and revenue. A cohort might have high revenue per staying customer but low retention, meaning you’re losing most of them.
Mistake 3: Confusing cohorts with segments
A cohort is a time-based group (customers who signed up in January).
A segment is a behavioral or demographic group (customers from healthcare industry, or customers who used feature X).
You can combine them: “Healthcare customers who signed up in January.” But they’re different concepts.
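In code, the distinction is simply which field you filter on: a cohort slices by time, a segment by an attribute, and combining them means applying both filters. A sketch with hypothetical user records:

```python
# Hypothetical user records: each has a signup month (cohort field)
# and an industry attribute (segment field).
users = [
    {"id": 1, "cohort": "2024-01", "industry": "healthcare"},
    {"id": 2, "cohort": "2024-01", "industry": "fintech"},
    {"id": 3, "cohort": "2024-02", "industry": "healthcare"},
]

# Cohort filter, segment filter, and their combination.
jan_cohort = [u for u in users if u["cohort"] == "2024-01"]
healthcare = [u for u in users if u["industry"] == "healthcare"]
jan_healthcare = [u for u in users
                  if u["cohort"] == "2024-01" and u["industry"] == "healthcare"]

print([u["id"] for u in jan_healthcare])  # [1]
```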
Building Your Cohort Analysis
We typically recommend looking at:
1. Retention (% of cohort still active) by signup month
- Shows if product quality is improving or degrading
2. Revenue (or MRR) by signup month
- Shows if revenue quality is improving
3. Churn rate by cohort
- Similar to retention but inverted; shows what % leave
4. Activation by cohort
- What % completed key actions (like “used feature X” or “made a purchase”)
- Shows if onboarding is improving
5. CAC by cohort
- What did it cost to acquire this cohort?
- Combined with LTV, shows if unit economics are improving
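For point 5, the unit-economics check reduces to an LTV:CAC ratio per cohort, and the thing to watch is whether it rises across cohorts. A sketch with hypothetical spend and LTV figures (not from the tables above):

```python
# Hypothetical per-cohort acquisition spend and average lifetime value.
cohorts = {
    "Jan 2024": {"spend": 25_000, "signups": 500, "ltv_per_user": 240},
    "Apr 2024": {"spend": 28_000, "signups": 510, "ltv_per_user": 300},
}

def ltv_to_cac(c):
    """LTV:CAC ratio; a rising ratio across cohorts means unit economics
    are improving even if CAC itself creeps up."""
    cac = c["spend"] / c["signups"]       # cost to acquire one user
    return c["ltv_per_user"] / cac

print(ltv_to_cac(cohorts["Jan 2024"]))  # CAC $50 -> ratio 4.8
print(ltv_to_cac(cohorts["Apr 2024"]))  # CAC ~$54.90 -> ratio ~5.46
```

Here April's CAC is actually higher than January's, but the ratio still improves because LTV grew faster, exactly the kind of nuance a per-cohort view surfaces.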
The Actionable Insight
The power of cohort analysis is that it separates signal from noise.
Week-over-week: “Signups down 20%, oh no!”
Cohort analysis: “March cohort has 45% retention vs. January cohort’s 40%. Our product is improving. The drop in signups this week is just noise.”
One creates panic. The other creates insight.
We help you set up cohort analyses so you understand whether your business is actually improving or just fluctuating.