Here’s a number that stopped me in my tracks when I first saw it in a BrightLocal report: 86% of Google Business Profile views come from category-based searches, meaning the vast majority of people finding your GBP listing aren’t searching your business name. They’re searching “plumber near me” or “emergency locksmith Reading”. That statistic, which Business Web Directory has also highlighted, tells you something most local SEO advice quietly ignores: GBP behaves exactly like a directory. And if it behaves like a directory, the question of whether to invest in directories or GBP is the wrong question entirely.
I ran a local services company for eight years before switching to consulting. I spent the first three of those years treating directories as a relic and GBP as the only game in town. I was wrong, and I lost leads because of it. What follows is the evidence — some of it strong, some of it shakier than practitioners admit — for why you should be running both channels in parallel.
The 73% Discovery Gap
BrightLocal’s 2024 local consumer search behaviour data surfaced a pattern that I’d been seeing anecdotally for years but couldn’t prove: roughly 73% of consumers use more than one source before contacting a local business. They cross-check. They click a directory listing, then search the business name on Google, then look at the GBP reviews, then maybe click through to the website. If you only appear in one of those touchpoints, you’re invisible in the verification loop — which is where conversions actually happen.
Why single-channel visibility fails
I made this mistake with my own company in 2016. We ranked beautifully on GBP for “carpet cleaning [town name]” and I assumed that was enough. Then a competitor — who wasn’t even ranking as well on GBP — started outpacing us on leads. Turned out he was listed in four regional directories I’d dismissed as irrelevant. Prospects were finding him on GBP, then seeing him in Yell, Thomson Local, and two trade-specific directories. That repetition created trust. I appeared once; he appeared five times.
Single-channel visibility fails because trust isn’t built in a single impression. It’s built through consistent repetition across independent sources. When a prospect sees your name, address, and phone number match on GBP, a regional directory, and an industry-specific listing, the implicit message is: this business is real, established, and findable.
Measuring multi-touch search behaviour
The measurement problem is brutal. Standard GA4 attribution tends to credit the last click, which usually goes to GBP or direct traffic. The directory visit that happened three days earlier — the one that actually seeded the decision — vanishes from your reporting. This is why most business owners think directories “don’t work”. They can’t see the work they’re doing.
The honest way to measure this is to use unique phone numbers per channel (call tracking) or at minimum UTM-tagged URLs for every directory listing that supports them. When I finally did this for my own business, the picture changed completely. Directories were responsible for about 22% of initial discovery even though they showed up as maybe 4% in GA4.
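If you want to set up the UTM side quickly, here’s a minimal sketch in Python of how you might generate a tagged landing URL for each listing. The directory slugs and base URL are placeholders, not real listings; substitute your own.

```python
from urllib.parse import urlencode

# Placeholder directory slugs and landing page; substitute your own.
DIRECTORIES = ["yell", "thomson-local", "regional-trades"]
BASE_URL = "https://www.example-cleaners.co.uk/"

def tagged_url(directory_slug: str) -> str:
    """Build a UTM-tagged landing URL for one directory listing."""
    params = {
        "utm_source": directory_slug,      # which directory sent the click
        "utm_medium": "directory",         # groups every listing together in GA4
        "utm_campaign": "local-citations",
    }
    return BASE_URL + "?" + urlencode(params)

for slug in DIRECTORIES:
    print(tagged_url(slug))
# e.g. https://www.example-cleaners.co.uk/?utm_source=yell&utm_medium=directory&utm_campaign=local-citations
```

Giving each listing its own utm_source under a shared utm_medium lets you report on the directory channel as a whole or drill into individual listings.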
What BrightLocal’s 2024 findings reveal
The headline from that year’s local search behaviour study wasn’t that GBP dominates (it does) — it was that consumers use directories as a verification layer rather than a discovery layer. This is a meaningful distinction. Discovery happens on Google. Verification happens everywhere else. If you’re not in the “everywhere else”, the discovery doesn’t convert.
Did you know? According to Jasmine Directory, Google Business Profile “sits in a fascinating grey area between a traditional directory and something entirely different” — a dynamic platform integrating with Google’s entire ecosystem through local packs, knowledge panels, and Maps simultaneously.
Traffic Source Overlap Examined
The “they cannibalise each other” argument is the most persistent bit of local SEO folk wisdom, and it’s wrong. The traffic sources don’t substitute; they stack.
Directory referrals vs. GBP clicks
In the twelve months I tracked this properly for my services business, here’s what the split looked like across a roughly 400-lead-per-quarter operation:
| Channel | First-Touch % | Last-Touch % | Avg. Lead Quality (1-5) | Notes |
|---|---|---|---|---|
| Google Business Profile | 41% | 58% | 3.8 | High volume, mixed intent |
| Regional directory | 18% | 6% | 4.2 | Lower volume, higher intent |
| Industry-specific directory | 9% | 4% | 4.6 | Smallest volume, best quality |
| Organic website | 14% | 11% | 4.0 | Blog-driven queries |
| Referral (word of mouth) | 11% | 15% | 4.7 | Highest conversion |
| Direct / brand search | 4% | 3% | 4.1 | Return customers mostly |
| Social media | 2% | 2% | 2.9 | Tyre-kickers, largely |
| Paid search | 1% | 1% | 3.3 | Tested briefly; dropped |
Notice the gap between first-touch and last-touch for directories. Directories seeded 27% of leads but only closed 10% of them. That’s not a failing channel — that’s a top-of-funnel channel being misread by bottom-of-funnel metrics.
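To see why bottom-of-funnel metrics misread a top-of-funnel channel, here’s a small illustrative sketch: the same set of lead journeys produces very different channel shares depending on which end you credit. The journeys below are invented purely to show the calculation.

```python
from collections import Counter

# Each lead is an ordered list of the channels it touched, oldest first.
# These journeys are made up for illustration only.
leads = [
    ["regional_directory", "gbp"],
    ["gbp"],
    ["industry_directory", "gbp", "website"],
    ["website", "gbp"],
    ["regional_directory", "website"],
]

first_touch = Counter(journey[0] for journey in leads)   # credit the seed
last_touch = Counter(journey[-1] for journey in leads)   # credit the closer

total = len(leads)
for channel in sorted(set(first_touch) | set(last_touch)):
    print(f"{channel:20s} first-touch: {first_touch[channel] / total:4.0%}   "
          f"last-touch: {last_touch[channel] / total:4.0%}")
# regional_directory seeds 40% of these leads but closes none of them.
```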
Where journeys actually begin
A customer journey that begins on a directory usually looks like this: search on Google → directory result appears in the top ten → click → read business blurb and reviews → open new tab → search business name → land on GBP or website → convert. The directory started the journey but gets zero credit in conventional analytics.
I’ve seen this fail spectacularly when owners cut their directory listings based on GA4 data alone. They save £400/year in directory fees and lose £6,000/year in leads they can no longer trace. Been there.
Attribution blind spots in analytics
Four attribution blind spots worth knowing about:
First, dark social verification: someone sees your directory listing, sends the link to their spouse via WhatsApp, spouse clicks directly — appears as “direct” traffic.
Second, offline-to-online leaks: directory listing visible to someone on mobile, they decide later on desktop to search your brand name — you get a “branded search” with no trace of the directory’s role.
Third, cookie and attribution-window expiry: if the journey runs longer than the analytics platform’s lookback window, the directory touchpoint simply drops out of the data.
Fourth, GBP’s internal attribution is a black box: Google reports “discovery searches” without telling you whether those searches were influenced by prior directory exposure.
Myth: If directory referrals show as a tiny percentage in Google Analytics, directories aren’t worth paying for. Reality: Directories predominantly influence first-touch and verification stages, both of which are systematically under-reported by last-click attribution models. You need call tracking or UTM-tagged directory URLs to see their real impact.
Citation Signals and Ranking Weight
NAP consistency across platforms
NAP (name, address, phone number) consistency is the single most empirically supported citation factor. Moz’s Local Search Ranking Factors survey has placed citation signals among the top five factors for local pack placement for roughly a decade. The evidence here is strong: businesses with inconsistent NAP data across directories rank worse in local packs, controlling for other factors.
What counts as “inconsistent”? Different suite numbers, abbreviations (“Street” vs “St”), old phone numbers lingering on an abandoned listing. I once spent a weekend fixing NAP inconsistencies across 23 listings for a client. Their local pack position moved from 7th to 3rd within six weeks. No other changes. That’s not proof — it’s one case — but it’s directionally consistent with the research.
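At any scale beyond a handful of listings, it’s worth normalising NAP fields before comparing them, so “Street” and “St” stop registering as different addresses. A minimal sketch, with an abbreviation map you’d extend for your own data:

```python
import re

# Common UK address abbreviations; extend to suit your own listings.
ABBREVIATIONS = {"street": "st", "road": "rd", "avenue": "ave", "suite": "ste"}

def normalise(value: str) -> str:
    """Lower-case, strip punctuation, and collapse known abbreviations."""
    words = re.sub(r"[^\w\s]", "", value.lower()).split()
    return " ".join(ABBREVIATIONS.get(w, w) for w in words)

def nap_key(name: str, address: str, phone: str) -> tuple:
    digits = re.sub(r"\D", "", phone)[-10:]  # compare the last 10 digits only
    return (normalise(name), normalise(address), digits)

listings = [
    ("Acme Carpet Cleaning", "12 High Street", "0118 496 0000"),
    ("Acme Carpet Cleaning", "12 High St.", "(0118) 496 0000"),
    ("Acme Carpet Cleaning Ltd", "12 High St", "0118 496 0000"),
]

keys = {nap_key(*listing) for listing in listings}
print(f"{len(keys)} distinct NAP variant(s) across {len(listings)} listings")
# The third listing's "Ltd" suffix makes it a distinct variant worth fixing.
```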
How Google weighs third-party mentions
Google has publicly confirmed that citations from reputable sources feed into its local ranking algorithm. The exact weight is proprietary, but the pattern that emerges from third-party SEO analyses is that Google cares about:
Directory authority (age, traffic, editorial standards); geographic relevance (a regional UK directory matters more for a UK business than a global one); category relevance (an industry-specific directory signals topical authority); and freshness and activity (dormant listings on dormant directories count for little).
Jasmine Directory’s analysis notes that “a local business listed in a regional directory for their specific area carries more weight than a listing in a national directory,” citing research that locally focused directories can improve local search rankings by up to 15%. That specific 15% figure deserves some scepticism, since it has been cited in multiple places without clear methodology, but the underlying principle (local > national for local businesses) is well-established.
Strong vs. weak correlation evidence
Here’s where I’ll be blunt about evidence quality, because most local SEO writing blurs this.
Strong evidence: NAP consistency correlates with local pack ranking. This has been replicated across multiple independent studies over many years.
Moderate evidence: Citation volume correlates with ranking, but with diminishing returns past roughly 40-60 quality citations.
Weak evidence: Specific percentage uplift claims (“directories boost rankings 15%”). These usually come from single-case studies or from vendors with incentives to report positive numbers.
Near-worthless claims: Anything citing “Directory Trust Flow Metrics” or “Citation Authority Score Factors” as if they were named Google signals. These are SEO-industry constructs, not confirmed algorithmic factors.
Did you know? According to research published by Jasmine Directory, locally focused directories can improve local search rankings by up to 15% compared to generic national listings — though the specific methodology behind this figure is worth scrutinising before treating it as gospel.
Side-by-Side Performance Metrics
Conversion rates by entry point
From my own data and from three client engagements I ran through similar measurement frameworks, GBP consistently drives higher raw conversion volume but lower per-lead quality. Directory entries convert at a lower rate but produce leads with higher intent and higher average order values.
Why? Because the person who found you via a directory typically read more about you before clicking. They pre-qualified themselves. The GBP clicker often clicks because you’re the closest pin on the map — convenience, not fit.
Engagement depth comparisons
GBP visitors who arrive via the “Website” button on a listing spend less time on site than directory-referred visitors — in my tracking, about 42 seconds vs. 2 minutes 18 seconds. This isn’t necessarily bad; GBP users often don’t need the website because they’ve got everything they need (phone, hours, reviews) right on the profile. But it does mean GBP traffic is less useful for anything that requires deeper engagement — content downloads, quote forms, service comparisons.
Cost-per-lead across both channels
GBP is free at the point of listing but has real hidden costs: review management, photo updates, Q&A monitoring, post creation. Budget roughly 4-6 hours per month if you’re doing it properly. Directories range from free (most) to £300-£600 per year for premium placements on quality platforms.
On a blended basis across the businesses I’ve advised, the cost-per-lead looks roughly like this:
GBP (including the owner’s time): £3-£8 per lead. Free directories (with NAP upkeep): £2-£5 per lead. Paid regional directories (mid-tier): £8-£18 per lead. Paid industry-specific directories: £15-£40 per lead, but with conversion rates often 2-3× higher.
The industry-specific directories look expensive until you factor in close rates. A £35 lead that closes at 45% is cheaper per customer than an £8 lead that closes at 9%.
Quick tip: Don’t compare channels by cost-per-lead alone. Track cost-per-customer (cost-per-lead ÷ close rate). I’ve watched business owners kill their best-performing channel because they looked at the wrong metric.
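The arithmetic behind that tip is worth scripting once so nobody on your team eyeballs it wrong. Here’s the cost-per-customer calculation using the figures from the example above:

```python
# Cost-per-customer = cost-per-lead ÷ close rate (the metric from the tip above).
channels = {
    # channel: (cost per lead in £, close rate); figures from the example above
    "industry_directory": (35.0, 0.45),
    "gbp": (8.0, 0.09),
}

for name, (cost_per_lead, close_rate) in channels.items():
    cost_per_customer = cost_per_lead / close_rate
    print(f"{name}: £{cost_per_lead:.2f}/lead -> £{cost_per_customer:.2f}/customer")
# industry_directory: £35.00/lead -> £77.78/customer
# gbp: £8.00/lead -> £88.89/customer
```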
Reading the Mixed Evidence
The local SEO field is drowning in studies, surveys, and “data-driven” claims of wildly varying quality. Knowing which to trust matters more than reading more of them.
Studies worth trusting
BrightLocal’s annual Local Consumer Review Survey and Local Search Ranking Factors study are the two most methodologically transparent regular pieces of research in the space. They publish sample sizes, methodology, and year-over-year changes. Whitespark’s local search studies are similarly rigorous. Moz’s data, while less frequent now, has historical credibility.
Google’s own publications — like the Economic Impact Report featuring businesses such as 3 Farm Daughters, a family pasta business serving 20,000 customers annually through Google tools — are useful as case illustrations but not as comparative evidence. They’re marketing, not research, and that’s fine as long as you read them that way.
Claims that don’t hold up
Anything starting with “X% of consumers…” without a linked methodology deserves suspicion. I’ve traced a surprising number of these to a single survey of 300-500 respondents repurposed across dozens of blog posts. Sample size matters; sampling method matters more. A self-selected survey of people already using a local search tool tells you about that tool’s users, not about consumers generally.
Be especially wary of before-and-after case studies from vendors. “Client X’s traffic increased 340% after we added them to 50 directories” rarely controls for the other 12 things that changed simultaneously.
Sample size and bias warnings
Two biases show up constantly:
Survivorship bias: the businesses that invested heavily in directories and thrived get case-studied; the ones that invested and stagnated don’t. You’re reading a filtered sample.
Vendor incentive bias: directory providers publish research showing directories work; GBP-focused tool providers publish research showing GBP dominates. Both can be simultaneously true or simultaneously exaggerated depending on methodology.
Myth: Google has officially deprecated business directories as a ranking signal. Reality: Google has never said this. What has changed — as Jasmine Directory notes — is how Google evaluates directory quality. Low-quality directory stuffing stopped working around 2016-2018; quality citations in relevant, trusted directories still factor into local rankings.
What if… Google deprecated third-party citations entirely tomorrow and said only GBP matters? Even then, directories would still drive direct referral traffic, still provide verification trust signals to human visitors, and still rank on their own for queries where your GBP doesn’t appear. The ranking signal is a bonus, not the primary reason directories exist. Businesses that treat them purely as SEO plays miss the point.
Building a Two-Channel Strategy
Enough diagnosis. Here’s how I’d actually structure this if I were starting a local services business tomorrow with limited budget and limited time.
Allocating budget between them
For a small local business with, say, £200/month in total local search budget (including your own time valued at £30/hour), I’d split roughly:
60% to GBP: ongoing review solicitation, weekly posts, monthly photo updates, Q&A management, responding to messages within hours.
25% to directory presence: a handful of high-quality listings — one or two regional directories, one industry-specific directory, and a general business directory with editorial standards such as Jasmine Directory. Focus on depth over breadth.
15% to citation maintenance: auditing NAP consistency quarterly, cleaning up duplicate or outdated listings, adding new listings as the business expands.
This split inverts the common advice of “list everywhere”. Listing on 200 low-quality directories used to help in 2013; today it produces noise, occasional duplicate penalties, and management overhead that crowds out the high-value work.
Metrics that matter for each
For GBP, I’d track: profile views (direct vs. discovery split), actions (calls, directions, website clicks), review volume and recency, photo views, and local pack ranking for your three most commercially valuable queries.
For directories, I’d track: referral traffic via UTM tags, call volume via tracking numbers, listing views where the directory provides them, and — this is the big one — organic ranking for long-tail queries that mention the directory’s category pages. A surprising amount of directory value comes from the directory itself ranking for queries you wouldn’t rank for directly.
One metric I’ve stopped caring about: citation count. Going from 40 quality citations to 80 mediocre ones doesn’t move the needle. Going from 40 quality citations to 50 quality citations in newly relevant directories does.
Quarterly audit framework
Every quarter, I run the same short audit. Takes about two hours for a single-location business:
First, pull up every active directory listing (keep a spreadsheet — trust me on this; I lost track once and spent a weekend rebuilding it). Check NAP consistency. Note any outdated information, including hours changes, new service categories, old logos.
Second, check GBP insights for the quarter. Look at the discovery vs. direct search split. If direct searches are rising as a percentage, your brand is building. If discovery searches are rising, your category visibility is improving. Both are good signals.
Third, mystery-shop yourself. Open an incognito browser, search three variations of your core category in your service area, and note where you appear — in the local pack, in directory listings that appear on page one, in “best of” roundups. You’re looking for gaps. If a directory consistently ranks for your queries and you’re not listed in it, that’s a priority addition.
Fourth, review the competitor terrain. Which directories are your three top competitors listed in that you’re not? This is often the fastest way to find citation opportunities you’ve missed.
Fifth — and this one’s tedious but matters — check for duplicate listings. These multiply without you noticing, especially after business moves, phone number changes, or name tweaks. Duplicates dilute your citation signal and occasionally trigger Google’s duplicate detection systems.
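That fifth step is the easiest one to script. A minimal sketch, assuming your listings spreadsheet is a CSV with directory, name, address, and phone columns (adjust the column names to match your own file):

```python
import csv
import re
from collections import defaultdict

def normalise_phone(phone: str) -> str:
    """Keep only digits, compared on the last 10 (drops country codes)."""
    return re.sub(r"\D", "", phone)[-10:]

# Group rows by (directory, phone): two rows sharing both are suspect duplicates.
groups = defaultdict(list)
with open("listings.csv", newline="") as f:
    for row in csv.DictReader(f):
        groups[(row["directory"], normalise_phone(row["phone"]))].append(row)

for (directory, phone), rows in groups.items():
    if len(rows) > 1:
        print(f"Possible duplicates on {directory} (phone ending {phone[-4:]}):")
        for row in rows:
            print(f"  {row['name']} | {row['address']}")
```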
Did you know? According to TexAu, automated Google Maps export tools can extract bulk business information in minutes — what used to take hours or days. Useful for competitive analysis and directory building, though worth checking that your use complies with Google’s terms of service before scaling anything up.
A case worth walking through
A client I worked with in 2023 — a small HVAC company in a mid-sized English town — came to me convinced they needed to “quit directories” because their GA4 showed directories as 3% of traffic. I asked to see their call records. We spent an afternoon matching inbound calls to source via a simple tracking number setup across their top five directory listings.
The result: 28% of their actual booked jobs over a 90-day window traced back to directory listings as first touch. The GA4 report wasn’t wrong — it just wasn’t measuring what they thought it was measuring. We kept the directories, cut four low-quality ones they were wasting money on, added one specialised trades directory, and reallocated the saved budget to GBP review solicitation. Lead volume rose 19% in the following quarter; close rate improved from 31% to 38% because the overall lead quality improved.
The lesson isn’t that directories are always worth it. Sometimes they’re not — I’ve also advised clients to cut directory spend entirely where the data genuinely didn’t support it. The lesson is that the default analytics setup lies about this particular question, and you need to build better measurement before you make the decision.
Quick tip: Before cutting any directory listing, set up a 60-day tracking number experiment. Route the directory’s displayed phone through a unique tracking line, then annualise what you see (multiply the 60-day figures by roughly six). If the jobs those calls actually book are worth less per year than the directory’s annual cost, cut it. If they’re worth more, keep it. Don’t rely on GA4’s referral reports: they systematically undercount phone-driven conversions.
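Here’s that keep-or-cut rule as a short sketch. It annualises the 60-day window and applies a close rate so you’re comparing expected revenue against cost rather than raw call counts; every figure below is illustrative:

```python
def keep_directory(calls_60_days: int, close_rate: float,
                   avg_job_value: float, annual_cost: float) -> bool:
    """Annualise the 60-day test, convert calls to expected revenue,
    and keep the listing only if that revenue exceeds its annual cost."""
    annual_calls = calls_60_days * (365 / 60)
    expected_revenue = annual_calls * close_rate * avg_job_value
    return expected_revenue > annual_cost

# Example: 7 calls in 60 days, 35% close rate, £180 average job, £400/year listing.
print(keep_directory(7, 0.35, 180.0, 400.0))  # True: ~£2,680 expected vs £400 cost
```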
What the Data Suggests You Should Do Differently
If you’ve been running either channel in isolation, the evidence points clearly toward a hybrid strategy — but not an undisciplined one. Most “list everywhere” advice from the 2010s is actively counterproductive now. Most “GBP is all you need” advice from the last few years is underselling the verification and trust role that quality directories still play.
Three moves are worth making this quarter regardless of your current setup:
Audit your NAP consistency across your top 15 existing citations. This alone will move rankings for most businesses I’ve audited — the evidence for NAP consistency as a ranking factor is as solid as anything gets in local SEO.
Set up proper measurement before you make any budget changes. Tracking numbers, UTM tags, and a quarterly manual review of lead-to-source attribution. Stop trusting default analytics to answer a question it was never designed to answer.
Stop thinking in terms of “directories vs. GBP” entirely. The businesses outperforming in local search are the ones running both channels as parts of one visibility system — each measured on its own merits, each funded according to the leads it actually generates, and each refreshed on a consistent schedule. The evidence doesn’t say one beats the other. It says, reliably and repeatedly, that the ones running both beat the ones running either.
The next twelve months will likely bring more algorithm shifts, more AI-generated local content, and more fragmented search behaviour as users split their discovery between Google, Maps, social platforms, and AI assistants that pull from directory data in ways we can’t yet fully measure. The businesses that stay visible will be the ones whose information is accurate, consistent, and present across the places those systems look. That’s not a prediction — it’s already happening, and the cost of preparing for it is lower than the cost of catching up.

