The problem: Marketing was passing a high volume of junk leads to sales.
At a SaaS software provider, everyone who signed up for a free trial of the company’s software was automatically passed to Sales for follow-up, whether or not they met sales-ready criteria.
The solution: I revamped the marketing-automation system so that it scored and filtered leads before passing them to Sales.
I diagnosed tough technical issues, rewrote lead-scoring rules and got Sales and Marketing buy-in on which leads should be filtered out.
The results: The conversion rate of marketing-qualified leads (MQLs) to sales-accepted leads (SALs) went up 60 percent year-over-year.
The details: Here’s how I diagnosed and fixed the underlying issues with the marketing-automation system.
The primary cause of the company’s inability to score and filter leads was poor integration between the company’s Pardot marketing-automation software, Sugar CRM system and proprietary back-end user database.
When prospects signed up for a free trial, the data initially populated the company’s back-end user database. This data then had to be passed to both Pardot and Sugar in order for Sales to follow up.
IT implemented Pardot about 15 months before I joined the company. Initially, IT tried to pass data from the user database exclusively to the marketing-automation system, but for reasons no one had diagnosed, not all of the records were making it through. To meet the tight implementation deadline, IT decided to push all free-trial records simultaneously to both Pardot and Sugar instead.
As a result, junk leads flooded the CRM system: records with missing phone numbers, test accounts from company employees and prospects that simply weren't sales-ready.
I painstakingly diagnosed the root cause. Using Excel's VLOOKUP function, I compared exports from both systems and identified records that had been successfully pushed to the Sugar CRM system but never made it into Pardot. I discovered that the user database wasn't pushing records that already existed in Pardot.
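The same comparison can be done programmatically. Here's a minimal sketch of that VLOOKUP-style analysis in Python, assuming each export is a CSV with an email column (the sample data and column name are invented for illustration):

```python
import csv
import io

# Hypothetical mini-exports standing in for the real Sugar and Pardot files.
sugar_csv = """email
alice@example.com
bob@example.com
carol@example.com
"""
pardot_csv = """email
alice@example.com
carol@example.com
"""

def emails(csv_text):
    """Return the set of (normalized) email addresses in a CSV export."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return {row["email"].strip().lower() for row in reader}

# Records that made it into Sugar but never reached Pardot -- the same
# question the VLOOKUP comparison answered.
missing_from_pardot = emails(sugar_csv) - emails(pardot_csv)
print(sorted(missing_from_pardot))  # ['bob@example.com']
```

Set difference over a unique key answers "in system A but not system B" in one step, which is exactly what a VLOOKUP with an #N/A filter approximates in a spreadsheet.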
I then researched our API calls and discovered that we were using the wrong command. Instead of a command that created new records and updated existing records, we were using a command that only created new records.
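The create-versus-upsert distinction is easy to see in a toy sketch (this is illustrative pseudologic, not Pardot's actual API): a create-only call rejects records whose key already exists, while an upsert creates new records and updates existing ones.

```python
def create(store, email, fields):
    """Create-only: refuses any record whose key already exists."""
    if email in store:
        raise ValueError(f"record already exists: {email}")
    store[email] = dict(fields)

def upsert(store, email, fields):
    """Create new records and update existing ones."""
    store.setdefault(email, {}).update(fields)

prospects = {"alice@example.com": {"score": 40}}  # already in the system

try:
    create(prospects, "alice@example.com", {"score": 55})
except ValueError:
    pass  # this is why existing prospects never made it through

upsert(prospects, "alice@example.com", {"score": 55})  # updates in place
upsert(prospects, "bob@example.com", {"score": 10})    # creates new record
print(prospects["alice@example.com"]["score"])  # 55
```

Swapping the create-only call for an upsert-style call is what let records flow whether or not they already existed downstream.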
Armed with this information, I circled back with IT and worked with them on a fix. We turned off the two simultaneous API calls and pushed all prospects to Pardot for lead scoring. For the first time since the marketing-automation software was implemented, we were able to take advantage of lead scoring and filtering.
In addition to fixing the integration issues, I thoroughly audited dozens of lead-scoring rules. I corrected flawed logic and wrote new rules to ensure that we properly scored all leads based on:
- demographic criteria, such as geography, company size, lead source, purchasing time frame and decision-making authority.
- behaviors, such as taking the free trial, attending webinars, downloading white papers and visiting key web pages.
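A lead-scoring rule set of this kind boils down to weighted attributes summed into a score. Here's a minimal sketch; the rule names and point values are invented for illustration and are not the company's actual rules:

```python
# Hypothetical demographic and behavioral rules with invented weights.
DEMOGRAPHIC_RULES = {
    "target_geography": 10,
    "company_size_50_plus": 15,
    "decision_maker": 20,
}
BEHAVIOR_RULES = {
    "started_free_trial": 25,
    "attended_webinar": 10,
    "downloaded_white_paper": 5,
    "visited_pricing_page": 15,
}

def score_lead(attributes):
    """Sum the point values for every rule the lead matches."""
    rules = {**DEMOGRAPHIC_RULES, **BEHAVIOR_RULES}
    return sum(points for rule, points in rules.items() if attributes.get(rule))

lead = {"started_free_trial": True, "decision_maker": True, "visited_pricing_page": True}
print(score_lead(lead))  # 60
```

Keeping demographic and behavioral rules in separate tables mirrors the audit above: each category can be reviewed and reweighted independently.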
Before the changes went live, I modeled the new lead-scoring rules in our sandbox environment. I projected how many leads would be screened out and what types of leads would be affected. I presented these findings to the VP of marketing and to Sales managers for approval.
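That sandbox projection amounts to re-scoring a sample of historical leads and counting how many fall below the qualification cutoff. A tiny sketch, with an invented threshold and invented sample scores:

```python
MQL_THRESHOLD = 50  # hypothetical cutoff, chosen for illustration

# Scores a sample of historical leads would receive under the new rules.
historical_scores = [12, 65, 48, 80, 5, 55, 30, 90, 44, 70]

passed = [s for s in historical_scores if s >= MQL_THRESHOLD]
filtered_pct = 100 * (len(historical_scores) - len(passed)) / len(historical_scores)
print(f"{len(passed)} of {len(historical_scores)} leads pass; "
      f"{filtered_pct:.0f}% would be screened out")
```

Numbers like these, broken down by lead type, are what make the "how many leads will Sales stop seeing?" conversation concrete before anything goes live.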
When the changes went live, I got immediate anecdotal reports from front-line telesales reps that the leads were of much better quality. Over the course of the next 11 months, the average rate at which marketing-qualified leads (MQLs) became sales-accepted leads (SALs) grew by 60 percent.