
Thursday, November 18, 2010

Measuring User Engagement with Cohort Reports

Getting a reporting infrastructure in place is always a challenge because it requires dev resources.

The good news for your beta period is that you can save your dev hours for features - you don't need a big dashboard report right now. Just focus on measuring user engagement.

Why? Because beta is all about nailing product/market fit and your on-boarding process – and user engagement is the metric that most directly reflects both.

There are two critical reports for measuring engagement. The first is a blunt instrument; the second is more granular – and during beta, a lot more useful.

Here’s how they work.

1. Rolling % Active

The idea here is that you want to get some sense of what % of your overall user base is active in any given week or month. The calendar week or month is an artificial and rigid way to measure engagement activity. So instead of saying ‘50% of our users were active in November, vs. 45% in October’ – you generate a weekly report showing the rolling % of users active in the past 7 and 30 days. It gives you a smoother curve.
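To make that concrete, here's a minimal sketch of the rolling report in pandas. It assumes you've got your activity events and sign-ups in DataFrames with (hypothetical) `user_id`, `event_at`, and `signed_up_at` columns – the post doesn't specify any of that, so treat it as an illustration, not the exact query:

```python
import pandas as pd

def rolling_pct_active(events, signups, window_days=7, as_of=None):
    """% of all existing users with at least one event in the trailing window."""
    as_of = as_of or events["event_at"].max()           # default: the latest event date
    window_start = as_of - pd.Timedelta(days=window_days)

    # Everyone who had signed up by the report date
    eligible = set(signups.loc[signups["signed_up_at"] <= as_of, "user_id"])

    # Everyone with at least one event inside the trailing window
    recent = events[(events["event_at"] > window_start) & (events["event_at"] <= as_of)]
    active = set(recent["user_id"]) & eligible

    return 100.0 * len(active) / max(len(eligible), 1)

# Run it once a week with window_days=7 and window_days=30 to get the smoother curves.
```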

While overall % active is clearly a key stat, it has some shortcomings especially during the beta phase.

The primary shortcoming is that you can’t tell *which* of your users are active or inactive.

For example, are your active users mostly just new sign-ups who go nuts with rookie enthusiasm for a few days and then go silent, or are they your old powerhouse users who’ve been with you since the start -- or both?

Relatedly, an overall % active number doesn’t allow you to easily map engagement fluctuations to product changes. For example, have users who signed up after you implemented some new on-boarding technique engaged better than the users who signed up before that technique was implemented?

Also, I think it’s common during the first 12-24 months of a new product to have some subset of people sign up, go dormant, and then come back as real, engaged users x months later (it took me at least 9 months to actually start using Twitter, and I’m still dragging my feet on Evernote). An overall % Active metric won’t give you a sense of whether or not the initial sign-ups who did not convert to active users are coming back to you over time - or just staying dormant.

2. Weekly Sign-Up Cohorts

Cohort reports are the best way to get behind the overall % Active stat and understand which users are active in any given time period.

The best way to understand cohorts is to look at an example.

Let’s say you launched your product on October 4th and have therefore been live in beta for 6 weeks. The following report shows, for every week you’ve been live, how many people signed up that week and how many of those users (per weekly sign-up cohort) remained active in each subsequent week:

(*these numbers are goofy btw, just trying to show the concept)

[Cohort table image: sign-ups per week in the rows, with the number of each cohort still active in each subsequent week across the columns]
So if 30 people signed up in your first week of being live, then per this report, 14 of them were still active 6 weeks later.
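If you want to churn this report out yourself, here's a rough pandas sketch – it uses the same assumed `events`/`signups` DataFrames and column names as the earlier snippet, so again it's an illustration rather than the exact query:

```python
import pandas as pd

def weekly_cohort_counts(events, signups):
    """Rows = sign-up week (cohort); columns = activity week; values = distinct active users."""
    df = events.merge(signups, on="user_id")

    # Bucket sign-up dates and event dates into calendar weeks
    df["cohort_week"] = df["signed_up_at"].dt.to_period("W").dt.start_time
    df["event_week"] = df["event_at"].dt.to_period("W").dt.start_time

    # Distinct users from each cohort active in each week
    counts = (df.groupby(["cohort_week", "event_week"])["user_id"]
                .nunique()
                .unstack(fill_value=0))

    # Cohort size (sign-ups per week) as the first column, for reference
    cohort_size = (signups.assign(cohort_week=signups["signed_up_at"]
                                  .dt.to_period("W").dt.start_time)
                          .groupby("cohort_week")["user_id"]
                          .nunique())
    counts.insert(0, "signed_up", cohort_size)
    return counts
```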

Notice that of those first 30 sign-ups, lots of them went dormant in weeks 2-4, but came roaring back in week 5. Why did they come back? Maybe you implemented your weekly digest in that 5th week and reminded users you existed. Or maybe social proof kicked in and they realized how awesome you are.

Whatever the causes – cohorts let you map product or marketing activities to user engagement outcomes.

Note that you can run these *monthly* as well – for all users who signed up in any given month, how many of those users are still active in each subsequent month.

There are lots of other helpful ‘flavors’ of cohort reports that you can easily churn out once you’ve got your basic queries in place. The most obvious of these is % of users active. This is really the same report as above, just showing % of each cohort active instead of absolute #, like this:

[Cohort table image: the same report expressed as % of each cohort active in each subsequent week]
The cool thing about the % view is that you can benchmark engagement very easily week to week. So in the above report, look at the ‘Week 2’ column. The November 1st cohort is doing lots better in its second week than the October cohorts. Maybe you tested a new tutorial video that week – looks like it worked!
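Getting from the count view to the % view is just dividing each row by its cohort size – continuing the (assumed) weekly_cohort_counts() sketch above:

```python
def weekly_cohort_pct(counts):
    """Same report as weekly_cohort_counts(), expressed as % of each cohort still active."""
    cohort_size = counts["signed_up"]
    return (counts.drop(columns="signed_up")
                  .div(cohort_size, axis=0)   # divide each row by that cohort's size
                  .mul(100)
                  .round(1))
```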

Although I won’t get into queries, one fundamental input to mention is your definition of ‘active user.’

As a product person, I always want to define ‘active’ very narrowly or stringently - but as big Rick Eaton always says, you don’t get points for ‘difficulty level’ in the active user contest :) Maybe more on this in another post.
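For what it's worth, one common way to keep whatever definition you pick explicit (not necessarily what we do – the event names below are made up) is to filter down to the product's core action before running any of the reports above:

```python
# Hypothetical event names – swap in whatever your product's core action is
CORE_EVENTS = {"created_item", "shared_item"}

def core_events_only(events):
    """Keep only 'real' usage (assumes an 'event_type' column on the events DataFrame)."""
    return events[events["event_type"].isin(CORE_EVENTS)]
```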

At any rate – cohorts are helpful. Hope this makes sense! (Also interested in what other folks out there are doing to measure engagement.)

