Video Marketing Metrics: How to Score Leads

Marketing | Lead Scoring

By Scott Slater, Senior Director, Marketing Operations and Analytics at Brightcove

For B2B buyers, video’s value is well known. According to our own research, 95% of buyers say video plays an important role in moving forward with a purchase. Further, 93% say video is important in building brand trust.

For B2B marketers, video’s value is often overlooked. There’s no shortage of uses: Product demos, product reviews, how-to videos, and brand videos are all commonly employed. But the incredible utility of video marketing metrics is rarely explored beyond the realm of performance reporting.

Video Is More Than Content

Video, like any content format, is more than a vehicle for communication; it’s a data generator. It not only connects your brand with your customer; it also quantifies the quality of that connection and experience. And unlike other formats, video offers three levels of content consumption data.

Like most content formats, video can tell you about customer interest and whether the audience is the right fit for the product. And like some formats, it can tell you about customer intent and whether the audience is interested enough to take action. But better than any other format, video can tell you about customer engagement.

Engagement scoring is critical to B2B marketers because we use that kind of data to score leads and make business decisions about audience targeting. With video, we can measure how much of the content was consumed without relying on less accurate metrics like scroll depth or time on page.

Incorporating video into a lead scoring model isn’t complicated. Like any other content format, it all starts by doing your homework.

Research Video’s Role in Conversions

While there is a level of subjectivity in traditional lead scoring, research plays an important role. You need to work backward from won deals to determine how video contributed to the buying process. There are several ways to do this, but the following methods can help you get started.

  • Buyer personas. Some of the necessary research should already be done. Buyer personas contain market and industry research as well as customer data and feedback. Reviewing these existing resources should help you determine “what” buyers watch. Interactive video metrics elevate buyer personas even further by telling you what buyers interact with in the video and in what order. These data points offer unparalleled precision and can help reduce the subjectivity in lead scoring.
  • Analytics. Once you know “what” your buyers watch, you need to know “how much” of it they watch. This includes both the number of videos and the average percent viewed of each video. You can get that information by integrating your online video platform (OVP) with Google Analytics 4 (see the aggregation sketch after this list).
  • Sales team and customers. Even the most accurate and granular data can lead you to the wrong conclusions. That’s why you should always run your theories by your sales team and, if possible, your customers. For example, a lot of buyers may have watched a particular video because they couldn’t find the information they needed. Behavioral data can tell you “what” they watch and “how much,” but sometimes it can’t tell you what they “don’t” want to watch.
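
As a concrete illustration of that analytics step, here’s a minimal aggregation sketch in Python. It assumes a hypothetical user-level export (e.g., from your OVP or GA4) with contact_id, video_id, and percent_viewed columns; your actual field names and export path will differ.

```python
# Aggregate user-level video views into per-lead metrics:
# how many distinct videos each contact watched, and how much of them on average.
# The CSV name and column names are hypothetical placeholders.
import pandas as pd

views = pd.read_csv("video_views.csv")  # hypothetical export of user-level view events

per_lead = (
    views.groupby("contact_id")
    .agg(
        videos_viewed=("video_id", "nunique"),          # number of distinct videos watched
        avg_percent_viewed=("percent_viewed", "mean"),  # average completion across those views
    )
    .reset_index()
)

print(per_lead.head())
```

These two per-lead figures, “what” they watch and “how much,” are the raw inputs the KPIs and thresholds below are applied to.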

Set the Right Video KPIs

Video is rich with data points. But not all data is useful for every purpose, nor should it be analyzed in isolation. For lead scoring to make the most of video analytics, you need to set the right video KPIs. And that means using the proper metrics in conjunction with thresholds.

Metrics

Most OVPs offer several different metrics like impressions, play rate, and minutes watched. However, most of those are better suited to B2C or to campaigns where awareness is the objective. Minutes watched, in particular, can be misleading. For example, watching 10 seconds each of 100 videos demonstrates less intent than watching 100 seconds each of 10 videos, even though the total watch time is the same.

The best metric for lead scoring is engagement, sometimes known as retention or completion rate. It tells you how much of a video your viewers watched on average. High engagement for the right kind of video is a strong indicator of a good lead.
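To make the minutes-watched pitfall concrete, here’s a quick worked comparison, assuming (purely for illustration) that every video runs two minutes. Both scenarios produce identical minutes watched but very different engagement.

```python
# Compare the two scenarios above under the assumption that each video is 120 seconds long.
VIDEO_LENGTH_SECONDS = 120  # illustrative assumption

scenarios = {
    "10 seconds each of 100 videos": {"videos": 100, "seconds_each": 10},
    "100 seconds each of 10 videos": {"videos": 10, "seconds_each": 100},
}

for name, s in scenarios.items():
    minutes_watched = s["videos"] * s["seconds_each"] / 60
    engagement = s["seconds_each"] / VIDEO_LENGTH_SECONDS  # average completion per video
    print(f"{name}: {minutes_watched:.1f} minutes watched, {engagement:.0%} engagement")

# Both scenarios print 16.7 minutes watched, but engagement is 8% vs. 83%.
```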

Engagement data can also be enhanced with other advanced options.

  • Viewability. Viewability allows you to pause playback when the player is no longer viewable. This prevents your engagement from being inflated when a user scrolls further down the page or opens another tab.
  • Interactivity. With interactive video, you can add clickable overlays to a video and give viewers the opportunity to take action. High engagement along with a high interaction rate is a strong indicator of viewer interest.
  • Audience Insights. If you have Brightcove Audience Insights, you have access to our proprietary Attention Index. This metric subtracts bottom engagement from top engagement, controlling for more passive viewers and providing a clearer picture of affinity.

Thresholds

Once you know your metrics, you need to set your thresholds; the sketch after the list below shows one way they might combine.

  • Percent viewed. This could be a single threshold (e.g., over 75%) or a graduated threshold (e.g., incremental points at 25%, 50%, and so on). Just remember that source channel and video duration can heavily affect this: top-of-funnel channels and longer videos tend to have lower engagement and may require separate thresholds.
  • Videos viewed. This threshold should build on the previous one: count only videos viewed at or above the minimum percent viewed. And as with any other lead scoring model, the timeframe is key. These views need to occur within the period when most buyers convert, whether that’s a day, a week, or a month.
  • Negative thresholds. Depending on the precision of your audience targeting and the volume of video data, it can also be helpful to set negative thresholds. For example, you may want to apply negative scores when the percent viewed is under 5%. Similarly, you could apply a time decay (i.e., a graduated threshold) as time elapses since the last view. This prevents video engagement from contributing to a lead’s current score if the view happened months or even a year ago.
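
Here’s a sketch of how these thresholds might combine into a single video score. The point values, the 5% negative cutoff, and the 30-day window are illustrative assumptions, not recommendations.

```python
def score_view(percent_viewed: float) -> int:
    """Score one view with graduated positive thresholds and a negative threshold."""
    if percent_viewed < 0.05:   # negative threshold: the viewer bounced almost immediately
        return -2
    points = 0
    if percent_viewed >= 0.25:
        points += 1
    if percent_viewed >= 0.50:
        points += 2
    if percent_viewed >= 0.75:
        points += 3
    return points


def score_lead(views: list[dict]) -> int:
    """Sum view scores, counting only views inside a 30-day conversion window."""
    recent = [v for v in views if v["days_since_view"] <= 30]
    return sum(score_view(v["percent_viewed"]) for v in recent)


example_views = [
    {"percent_viewed": 0.80, "days_since_view": 3},    # +6
    {"percent_viewed": 0.40, "days_since_view": 12},   # +1
    {"percent_viewed": 0.03, "days_since_view": 20},   # -2
    {"percent_viewed": 0.90, "days_since_view": 200},  # outside the window, ignored
]
print(score_lead(example_views))  # 5
```

A graduated decay, such as halving the contribution of views older than 30 days rather than dropping them entirely, is a straightforward extension of the same pattern.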

Assign Points to Video KPIs

Classic lead scoring models follow similar patterns in assigning points: Product/branded content is scored higher; generic/problem content is scored lower. This is because we expect early-stage buyers to watch sizzle reels and later-stage buyers to watch product demos.

However, the buyer journey isn’t linear. B2B buyers chart their own journeys, regardless of our funnels and stages. In other words, weighting isolated pieces of content will lead to inaccurate scores. The better approach is to assign points to video KPIs in sequences.

Buyer behavior usually falls into two categories: browsing and researching.

  • Browsing behavior. Tends to be more erratic and less engaged but it can still trigger thresholds. Sequences of unrelated topics, especially unrelated to the brand or products, usually indicate browsing behavior.
  • Researching behavior. Usually more focused and more engaged. Sequences of related topics, especially related to a single product or similar ones, usually indicate researching behavior.

Using these sequences, you can more accurately assign points based on actual behavioral data, not isolated interactions.
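
One way to operationalize this is sketched below: label each lead’s recent sequence of views by how concentrated it is on a single topic, assuming a hypothetical topic tag on each video. The 0.6 focus cutoff and the weights are illustrative, not prescriptive.

```python
from collections import Counter


def classify_sequence(topics: list[str]) -> str:
    """Label a sequence of viewed-video topics as 'researching' or 'browsing'."""
    if not topics:
        return "browsing"
    dominant_count = Counter(topics).most_common(1)[0][1]
    focus = dominant_count / len(topics)  # share of views on the most common topic
    return "researching" if focus >= 0.6 else "browsing"


def sequence_weight(topics: list[str]) -> float:
    """Weight applied to a lead's video points based on the behavior pattern."""
    return 1.5 if classify_sequence(topics) == "researching" else 0.5


# A lead whose recent views cluster on one product topic is weighted up.
recent_topics = ["ott-platform", "ott-platform", "pricing", "ott-platform", "ott-platform"]
print(classify_sequence(recent_topics), sequence_weight(recent_topics))  # researching 1.5
```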

Build and Test the Model

To build a video-based lead scoring model, you’ll need access to user-level data by integrating your OVP with your MAP or CRM. However, most OVPs have a limited number of integrations, so make sure your video platform has the one you need. For example, Brightcove Audience Sync connects with most of the popular tools available.
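
Once the scores are computed, the last step is getting them into the system your sales team actually uses. The sketch below pushes a score to a placeholder endpoint; the URL, field names, and auth header stand in for whatever native integration, webhook, or custom-field API your MAP or CRM provides.

```python
import requests

CRM_WEBHOOK_URL = "https://example.com/crm/video-score"  # placeholder, not a real endpoint


def push_video_score(contact_email: str, video_score: int) -> None:
    """Send a computed video engagement score to the MAP/CRM (hypothetical payload shape)."""
    response = requests.post(
        CRM_WEBHOOK_URL,
        json={"email": contact_email, "video_engagement_score": video_score},
        headers={"Authorization": "Bearer <token>"},  # placeholder credential
        timeout=10,
    )
    response.raise_for_status()


push_video_score("buyer@example.com", 5)
```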

Once your model is built, all that’s left to do is test and refine it. Lead scoring best practices aren’t based on metrics and calculations; they’re based on dedication and patience.

Certainly, some metrics are better than others, and video metrics are especially valuable as lead scoring criteria. But even they can’t help a rushed B2B marketing strategy. Give your model time to work, compare it to the old one, and improve it as new data becomes available.

