Chapter 4

Examples of lead scoring models

Enough theory — how about some examples? As we talk to companies about their approach to lead scoring, we find a lot of diversity in how they actually pull it off.

Here's a sampling of real-life scoring methods from Clearbit customers and friends, including a couple of anonymized and composite illustrations to cover for the spotlight-shy. They include basic, intermediate, and advanced examples of SaaS qualification systems.

Basic lead scoring systems

Heap's scoring is more advanced now, but their early-days setup makes a good illustration. Heap enriched leads in Salesforce, where a simple model qualified companies on two firmographic fit points: employee count and industry. It then bucketed companies into Low, Medium, and High priority.

For routing, they used Salesforce rules to send companies to the right sales team tier based on employee count: SMB (fewer than 100), mid-market (100-499), or enterprise (500 and up).
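As a toy sketch, a two-point fit model and employee-count routing like the one described above could look like this in Python. The target industries and the mapping from points to buckets are my assumptions for illustration, not Heap's actual Salesforce rules:

```python
# Hypothetical sketch of a Heap-style model. TARGET_INDUSTRIES and the
# point-to-bucket mapping are invented; only the employee-count tiers
# come from the text above.

TARGET_INDUSTRIES = {"software", "ecommerce", "fintech"}  # assumed list

def fit_bucket(employee_count, industry):
    """Bucket a company into Low/Medium/High on two firmographic fit points."""
    points = 0
    if industry in TARGET_INDUSTRIES:
        points += 1
    if employee_count >= 100:
        points += 1
    return ["Low", "Medium", "High"][points]

def sales_tier(employee_count):
    """Route to a sales team tier by employee count, per the rules above."""
    if employee_count < 100:
        return "SMB"
    elif employee_count < 500:
        return "mid-market"
    return "enterprise"
```

The appeal of a model this simple is that it runs entirely on enriched CRM fields, with no behavioral tracking required.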

This B2B SaaS communications company uses a point-based lead score, calculated in Salesforce, that drops prospects into A/B/C/F buckets. They award 1-5 points for firmographic attributes such as industry, country, and technology used. A lead's Clearbit tech tags tell them whether the lead uses competing products or technologies that integrate well with Mystery Customer's tool.
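A minimal sketch of a point-per-attribute model like this might look as follows. The specific weights and grade cutoffs are invented for illustration; only the 1-5 point range and the A/B/C/F buckets come from the description above:

```python
# Illustrative only: attribute weights and cutoffs are assumptions,
# not the company's real Salesforce formula.

POINTS = {  # 1-5 points per firmographic attribute
    "industry":   {"communications": 5, "software": 3},
    "country":    {"US": 5, "CA": 4, "UK": 4},
    "technology": {"competitor_tool": 5, "integration_partner": 3},
}

def grade(lead):
    """Sum attribute points and map the total to an A/B/C/F bucket."""
    total = sum(POINTS[attr].get(lead.get(attr), 0) for attr in POINTS)
    if total >= 12:
        return "A"
    if total >= 8:
        return "B"
    if total >= 4:
        return "C"
    return "F"
```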

Intermediate lead scoring systems

Proposify scores leads in Marketo. Then when leads pass a score threshold, they get pushed into Salesforce (and to sales to work).

They use both firmographic attributes and behavioral data:

  • Clearbit data like job title, industry, company size, and annual revenue
  • Behavioral data collected from Proposify's web app and website, using tracking from Marketo and Segment. Activities tracked include pricing-page views, free trial activations, and feature usage within the web app.

Their stack includes Marketo, Salesforce, Clearbit, and Segment, and they worked with the agency Outshine to set it up.

This company, a developer tool, uses both fit and behavioral data in a point-based lead score.

Their company fit data centers on number of employees, while behavioral signals include activities like opt-ins, event attendance, a recent contact request, and high-value page visits. These behaviors roll into an intent score (and a hot/warm/viable label) that's used as a multiplier on the firmographic score.

To detect and fast-track surging accounts, they use a Clearbit Reveal-Salesforce integration. Reveal detects when a target account visits their site and tracks the week-over-week increase or decrease. This change in website activity is pushed to Salesforce. For example, if 10 people from the same account visited the site in the past week, the sales team can see that and reach out ASAP.
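The surge signal itself is simple arithmetic. Here's a minimal sketch, where the visitor and growth thresholds are illustrative defaults rather than Clearbit Reveal's actual logic:

```python
# Toy surge detector: thresholds are assumptions, not Reveal's real rules.

def wow_change(visits_this_week, visits_last_week):
    """Week-over-week fractional change in visitors from one account."""
    if visits_last_week == 0:
        return float("inf") if visits_this_week else 0.0
    return (visits_this_week - visits_last_week) / visits_last_week

def is_surging(visits_this_week, visits_last_week,
               min_visitors=10, min_growth=0.5):
    """Fast-track accounts with enough visitors and a sharp WoW increase."""
    return (visits_this_week >= min_visitors
            and wow_change(visits_this_week, visits_last_week) >= min_growth)
```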

Advanced lead scoring system

This company, a work management platform, uses two machine-learning models to score and route leads.

The first model predicts a user's intent to purchase based on their in-product usage behaviors.

The second model predicts whether the prospect will have high or low MRR. They send high-MRR folks to sales and low-MRR folks to a self-serve product track. This model uses firmographic data, such as industry and company size, by enriching data in Snowflake through the Clearbit Enrichment API.
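The routing step downstream of that second model reduces to a threshold check. A trivial sketch, where the cutoff value is a placeholder and the MRR prediction itself would come from the ML model:

```python
# Placeholder threshold; the real cutoff and the MRR prediction come
# from the company's ML model, not shown here.

HIGH_MRR_THRESHOLD = 500  # assumed monthly-revenue cutoff, in dollars

def route_by_mrr(predicted_mrr):
    """High predicted MRR goes to sales; low goes to the self-serve track."""
    return "sales" if predicted_mrr >= HIGH_MRR_THRESHOLD else "self-serve"
```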

LeanData uses an account scoring platform to score at the company level, giving the accounts in their database a strong/moderate/weak fit score based on historical data. It also assigns an intent score (such as "Decision" or "Purchase" stage) based on behaviors like website visits and branded keyword searches. Outbound prospects are routed to an SDR or AE based on their score. The sales rep can see a lead's behavioral history and reach out directly, or use Clearbit to find other decision makers at that company and take an account-based approach.

Inbound leads are routed based on how they came in ("campaign-based routing"). Leads get a P1 priority level for high-intent behaviors like hand raises, AppExchange downloads, and Drift chat—P1 leads get a response within 10 minutes.

Leads get a P2 priority level for high-value opt-ins, like attending a webinar or downloading a specific piece of content. These leads get a 48-hour response time, which gives the rep a moment to craft a personalized message.

P1 and P2 inbound leads are sent to an AE if they're in an open opportunity, CSM if they're a current customer, or an SDR otherwise.
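Putting the campaign-based routing above into a sketch: the behavior lists mirror the text, while the P3 catch-all and the field names are my assumptions layered on top of it.

```python
# Behavior lists follow the description above; the P3 default bucket
# and the lead field names are assumptions for illustration.

P1_BEHAVIORS = {"hand_raise", "appexchange_download", "drift_chat"}
P2_BEHAVIORS = {"webinar", "content_download"}

def priority(behavior):
    if behavior in P1_BEHAVIORS:
        return "P1"  # respond within 10 minutes
    if behavior in P2_BEHAVIORS:
        return "P2"  # respond within 48 hours
    return "P3"      # assumed default bucket for everything else

def route(lead):
    """Send a P1/P2 inbound lead to the right owner."""
    if lead.get("open_opportunity"):
        return "AE"
    if lead.get("current_customer"):
        return "CSM"
    return "SDR"
```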

The evolution from basic to advanced

Like many companies, Clearbit has evolved its lead qualification system over time, so here are three snapshots of our scoring at different stages.

When we started out, we didn't use official scoring, and our sales reps spoke to a wide variety of leads. We filtered out leads that were obviously individuals (with Gmail addresses) or spam. This was our "very basic" stage.

Our reps soon developed a good sense for what makes a good lead, and we also ran regression analyses to see what qualities our closed-won customers shared. We used those shared attributes to build a point-based lead scoring system in Salesforce Process Builder when we were about three years old. It bucketed companies into A/B/F groups and routed higher scores to AEs while SDRs vetted the lower ones. Reps could see a lead's A/B/F score right in Salesforce.

It used several Clearbit fit data points, as well as self-reported data:

  • Firmographic:
    • Industry
    • Business model tags (SaaS, B2B, etc.)
    • Technology used (Salesforce, Marketo, Drift, etc.)
    • Estimated annual revenue
    • Employee range
    • Country
  • Demographic:
    • Sales/marketing team
    • Leadership level (indicator of purchasing power)
  • Use case survey:
    • When someone signed up for a Clearbit account, they answered a one-question onboarding survey about their use case (e.g., they could select "personalize your website" and "see which companies are visiting your website"). This answer was factored into the lead score.

Once a lead was scored, Process Builder routed leads to the right rep. We'd call that a "medium-stage" setup.
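A rough re-creation of that medium-stage model, using the attribute categories listed above: the point values and cutoffs here are invented (the real formula lived in Process Builder), but the inputs match the list.

```python
# Hypothetical weights and cutoffs; attribute names follow the list above.

SURVEY_ANSWERS = {"personalize your website",
                  "see which companies are visiting your website"}

def score(lead):
    pts = 0
    if lead.get("business_model") in {"SaaS", "B2B"}:
        pts += 2
    if lead.get("technology", set()) & {"Salesforce", "Marketo", "Drift"}:
        pts += 2
    if lead.get("employees", 0) >= 50:
        pts += 2
    if lead.get("leadership"):  # indicator of purchasing power
        pts += 2
    if lead.get("use_case") in SURVEY_ANSWERS:
        pts += 1
    return pts

def bucket(pts):
    """Map points to A/B/F: A goes to an AE, B and F get SDR vetting."""
    return "A" if pts >= 7 else "B" if pts >= 4 else "F"
```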

As we matured, we stepped into a more advanced system. Today, we use:

  • LeanData for routing
  • MadKudu machine learning to score leads on fit (low, medium, good, very good)
  • A custom-built model for our intent scoring

Our custom intent model uses Segment to scoop up a lead's behaviors across our marketing channels and Clearbit products (e.g., product usage, website visits, email opens). It feeds this information into Redshift, our data warehouse. A tool called dbt sits on Redshift and transforms the activity data into one nice master table. Census pushes the data from that table into tools like Salesforce and Customer.io, where our sales and marketing teams can see a lead's interactions and behaviors to use in personalized outreach.
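The transform step in that pipeline, collapsing raw activity events into one row per lead, can be shown in miniature. This is a toy in-memory stand-in for what dbt does with SQL on Redshift; the event shape and field names are invented:

```python
# Toy stand-in for the dbt transform: roll raw (email, activity) events
# into one row of activity counts per lead. Real version is SQL on Redshift.

from collections import defaultdict

def build_master_table(events):
    """events: iterable of (email, activity) pairs, e.g. from Segment."""
    table = defaultdict(lambda: defaultdict(int))
    for email, activity in events:
        table[email][activity] += 1  # counts of visits, opens, product usage
    return {email: dict(counts) for email, counts in table.items()}
```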

We're still calibrating when to send a lead to sales based on this activity data — it's a work in progress.


Remember, these examples are just for illustrative purposes to show the range of scoring implementations across different companies. The situations we've described are ever-evolving and may have changed by the time you're reading this. Like Clearbit, everyone's figuring it out and iterating over time.
