Imagine getting a call from the Chief Learning Officer of a country. I did once (from Canada), and she was kind and asked me pointedly about how I'd arrived at my numbers and conclusions in a paper I'd published. Why? Because the world of corporate whitepapers rarely has the same rigor as academic papers for peer-reviewed journals. Fortunately I do have that rigor ready for such conversations, even if it doesn't go into the paper. And it's from this perspective that I keep a commercial report such as ON24's in context.
ON24 recently released their annual Webinar Benchmarks Report, and a couple of recent conversations I've had in the industry made me think others might benefit from bringing those conversations into a public arena.
To their credit, ON24 shares data in a webinar world that has always been starved for anything it can get its hands on. My objective here is simply to offer a few alternative discussion points to what ON24 has graciously shared; take it with a grain of salt and feel free to chime in.
Consider that their numbers are skewed upward
In their methodology, ON24 notes that the report only represents webinars with 100 or more attendees. They don't share why, but keep in mind that this choice skews all downstream analyses. For many, many years, average webinar attendance has been under 100 participants, often 75-80. I base this on a) 18 years in the industry at PlaceWare, Microsoft, Corvent, and 1080 Group and b) conversations with many vendors with whom I'm under NDA (nobody wants to tell you that's the number, but reality is nothing to be ashamed of).
A couple reasons to keep ON24's numbers in context include:
- It’s not based on all webinars.
- If averages in their analysis aren’t based on all the webinars they did but only the larger ones, the numbers will be higher (like “average number of attendees”).
- It likely represents mostly better-resourced organizations.
- The vast majority of companies are small: 97% of companies in the United States have fewer than 500 employees. Just as company sizes follow an exponential distribution (the "hockey stick"), so do their budgets and, correspondingly, the number of people they can potentially drive to webinars.
- By definition, ON24’s solution is not entry level in features or price relative to the market as a whole. The customers that they attract (who are the source of the data for the report) are only a subset of the market, and probably not reflective of market averages.
Takeaway: Cut off the handle of the hockey stick and you dramatically raise the mean (average). Unless you're a really big company, keep this in mind as you benchmark your own results. Even then, remember that this is a data set that purposefully leaves out smaller sessions (which, to their credit, they disclosed).
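To make the truncation effect concrete, here is a minimal sketch with entirely made-up attendance numbers (not ON24's data) showing how dropping sessions under 100 attendees inflates the "average attendance" figure:

```python
# Hypothetical attendance counts for ten webinars (illustrative only).
# Most sessions are small, a few are large -- the "hockey stick" shape.
attendance = [30, 45, 60, 75, 80, 90, 110, 150, 300, 900]

# The true average across all webinars.
overall_avg = sum(attendance) / len(attendance)

# An ON24-style benchmark: keep only sessions with 100+ attendees.
benchmark_only = [a for a in attendance if a >= 100]
benchmark_avg = sum(benchmark_only) / len(benchmark_only)

print(overall_avg)    # 184.0
print(benchmark_avg)  # 365.0 -- the cutoff nearly doubles the "average"
```

The exact numbers are invented, but the direction of the effect is not: any cutoff that removes the low end of a right-skewed distribution raises the mean of what remains.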
Start your promotion cycle as early as you can
ON24 is dead on about this. Email is (still) the default promotional tactic, but you can only hit your list so much before fatigue sets in. If you have enough lead time, you can space out multiple touches (in the direct mail industry of old, you'd send the same offer to the same list repeatedly until response rates started dropping). Further, social media and other means of promotion don't have the same response profiles (i.e., only a fraction of your Twitter following sees any given post unless you pay).
Takeaway: There is (almost) nothing to lose and everything to gain by starting your promotion as far in advance as possible.
Test your own time and day of week for your webinars
ON24 reports that, in order, Thursday, Wednesday, and Tuesday are the best days of the week and 11am, 10am, and Noon Pacific are respectively the best time slots. This is consistent with patterns in more than one of my studies and even that of the conference call (telecom) industry that predates webinars.
That said, saying "This is what most people do, so you should do it" is a bit like saying "The best time to have a television news show is at 6pm." If everybody rushes that primetime slot, dilution occurs because people can only watch/attend one show at a time (presumably!). There is more competition for the eyeballs you're trying to reach during primetime, and webinar primetime is no different.
The other challenge? Just be aware that the report numbers are stated in North America-centric terms.
I’ve had clients who’ve experienced great success with non-primetime hours either because they avoid the rush or because they need to also reach Europe or Australia with a webinar – and 11am Pacific doesn't work so well for either of those geographies. And I’ve had clients who found that Fridays worked best for them in their situation and market.
Takeaway: This is an easy thing to test for yourself, and it’s important to do so because results vary by industry, audience, and geography. Your webinar solution’s ability to gather data during registration, polls, etc. will make it easy to survey your audience and dial in what's right for you.
Don’t confuse “time logged in” with engagement
I'm going to gently say this is one part of the report that commits a logical fallacy. ON24’s report shows that the “average viewing time” is 55 minutes, arguing that this is evidence of webinars "holding audiences' attention."
Being logged in for 55 minutes and “holding their attention” are different things, and you need to be more discerning than that. True, being logged in for 55 minutes is better than "you suck so bad I'm outta here." But being present and paying attention are very different things in an in-person classroom, and they are online, too.
If your vendor's solution can detect the active application on a participant's desktop (one proxy for attention), you'll likely see that "attention" waxes and wanes for a number of reasons (the subject of another post). One obvious dropoff point is when you hit Q&A at the end of the presentation (an unfortunate but typical webinar practice). ON24's numbers later in the report on average view time for recordings may corroborate this to some degree. Elsewhere in the report, ON24 comments on usage by widget (another proxy for engagement), but those numbers aren't correlated with when during the webinar the widgets were used, and if I'm not mistaken, that's a correlation they could report on.
Takeaway: ON24's metrics are a good start, but look a little more deeply to really understand where engagement is or isn't happening in your own webinars.
Have your presenters appear on camera
ON24's report shows that fewer than one in five webinars used presenter video. That's a mistake (despite some of the arguments I make here!). Why? To add one thing to the arguments at that link: you'll stand out from the crowd.
Takeaway: If most of the world of webinars is missing a critical element of personal connection, presenting on camera is an easy way to improve that.
Think “programs,” not “webinar and recording”
One third of attendees in ON24's report only attended the recordings. If you're curious about recordings, and about how to create momentum with programs in ways you can't with one-off events, join me for our own upcoming webinar on the topic. I'll share a big "why" behind my assertion and how ON24's data corroborates a trend I'll be helping you turn to your advantage.
The bottom line
ON24 has shared some good data, and we all should appreciate that. To be fair, it's also a marketing piece for them, not an academic paper, so I offer some leeway for the fact that they're motivated to motivate you to do webinars.
It might be useful to remember that knowing a metric (e.g., what percentage of registrants attend) doesn't tell you how to improve that metric for yourself; at best it lets you compare to see if you're in the ballpark. Again, to be fair to ON24, it is a benchmark report, even if it contains a couple of points that would be suspect in a more rigorous setting.
Here's to better webinars for you and yours!