Two-thirds of the local business profiles I audit fail a basic NAP consistency check. Not “could be better” — fail. That’s the headline finding from a sample of 500 sites I’ve worked through over the past four years, and it’s the number I keep returning to whenever a client asks why their map pack rankings won’t budge.
The frustrating part isn’t that mistakes happen. It’s that the same ten mistakes happen, in roughly the same order, across industries that have nothing in common. A solicitor in Manchester and an HVAC contractor in Bristol both manage to misspell their own street name across forty directories. The pattern is so consistent it’s almost comforting — it means the fix is repeatable too.
What follows is a data-led look at where directory submissions go wrong, what the numbers actually say (versus what gets repeated on Twitter), and which mistakes deserve your weekend versus which can wait until the next quarterly audit.
The 67% NAP Inconsistency Problem
NAP — name, address, phone — is the foundation of citation work, and it’s where most campaigns quietly bleed authority. In my sample of 500 SMB profiles audited between 2021 and early 2024, 67% had at least one material NAP discrepancy across their top 20 citations. “Material” meaning the kind a search engine can’t safely reconcile through fuzzy matching: a different suite number, a transposed digit in the phone, a “Ltd” appearing in some listings and not others.
How we measured citation accuracy across 500 sites
The methodology was unglamorous. For each site, we pulled citations from BrightLocal’s citation tracker, cross-referenced with manual checks on Google Business Profile, Bing Places, Apple Business Connect, and the dominant local aggregators (Foursquare, Data Axle, Localeze). We flagged any deviation from the canonical NAP listed on the client’s contact page — even cosmetic ones like “Street” versus “St.”
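The flagging step itself is only a few lines of logic. Here’s a minimal sketch of the idea — normalise the cosmetic stuff first, then flag whatever still deviates. The canonical NAP, the abbreviation map, and the function names are hypothetical illustrations, not the production tooling:

```python
import re

# Hypothetical canonical NAP, as it would appear on the client's contact page.
CANONICAL = {
    "name": "Acme Plumbing Ltd",
    "address": "12 High Street, Suite 4B, Bristol BS1 4ST",
    "phone": "+441179460000",
}

# Expand the most common abbreviations so "St" vs "Street" doesn't flag.
ABBREVIATIONS = {"st": "street", "rd": "road", "ave": "avenue"}

def normalise_phone(raw: str) -> str:
    """Keep digits only, so formatting differences don't count as discrepancies."""
    return re.sub(r"\D", "", raw)

def normalise_address(raw: str) -> str:
    """Lower-case, strip punctuation, and expand abbreviations word by word."""
    words = re.sub(r"[^\w\s]", " ", raw.lower()).split()
    return " ".join(ABBREVIATIONS.get(w, w) for w in words)

def flag_discrepancies(citation: dict) -> list:
    """Return the NAP fields that still deviate after cosmetic normalisation."""
    flags = []
    if citation["name"].lower() != CANONICAL["name"].lower():
        flags.append("name")
    if normalise_address(citation["address"]) != normalise_address(CANONICAL["address"]):
        flags.append("address")
    if normalise_phone(citation["phone"]) != normalise_phone(CANONICAL["phone"]):
        flags.append("phone")
    return flags
```

A “High St.” listing passes this check; a transposed phone digit or a missing “Ltd” doesn’t — which mirrors the cosmetic-versus-material distinction the audit relied on.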
Then we banded the discrepancies by severity:
| Severity | Definition | Frequency in sample |
|---|---|---|
| Severe | Wrong phone number or wrong street address | 23% |
| High | Missing suite/unit, wrong postcode format | 31% |
| Medium | Inconsistent business name suffix (Ltd, LLC, Inc) | 44% |
| Low | Abbreviation differences (St vs Street, Rd vs Road) | 71% |
| Cosmetic | Capitalisation, punctuation in phone format | 89% |
The percentages add to more than 100 because most sites had multiple categories of error. The “low” and “cosmetic” tiers don’t matter much in isolation — Google’s been good at normalising these since around 2018. The severe and high-severity errors are where rankings actually move.
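The over-100% arithmetic is easy to see once you count per site rather than per error. A quick sketch with three hypothetical sites, each tagged with every severity band it triggered:

```python
from collections import Counter

# Hypothetical per-site findings: each site maps to the set of severity bands
# in which it had at least one discrepancy.
sites = {
    "site-a": {"low", "cosmetic"},
    "site-b": {"severe", "medium", "cosmetic"},
    "site-c": {"high", "low", "cosmetic"},
}

# Count how many sites fall into each band, then express that as a share.
band_counts = Counter(band for bands in sites.values() for band in bands)
frequency = {band: n / len(sites) * 100 for band, n in band_counts.items()}
# A site can appear in several bands at once, so these percentages sum past 100.
```

Every site here had a cosmetic error, so “cosmetic” hits 100% even though the bands together exceed it — the same shape as the table above.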
Why minor variations tank local rankings
Here’s the bit that surprises clients: a wrong suite number on twelve out of forty citations doesn’t just produce twelve weak signals. It can fragment the entity entirely, splitting your citation equity between two phantom businesses that Google can’t confidently merge. I’ve seen a dental practice lose 40% of its map pack visibility for three months after a rebrand because the old “Suite 4” persisted on roughly 30% of its citations while the new “Suite 4B” appeared on the rest.
Did you know? Listings in major directories often cascade automatically into smaller ones — meaning a single error you didn’t introduce can propagate across dozens of secondary listings without your knowledge.
Industries hit hardest by inconsistent data
Multi-location businesses suffer the most, obviously, but the worst offenders by ratio of errors-to-locations were medical practices and law firms. My theory: these industries change suite numbers and add partners with the frequency of a teenager changing TikTok bios, but they update directories with the urgency of a Victorian solicitor responding to telegrams.
| Industry | Avg severe NAP errors per business | Avg time-to-detect (days) |
|---|---|---|
| Medical/dental practices | 4.2 | 187 |
| Legal services | 3.8 | 142 |
| Trades (HVAC, plumbing, electrical) | 2.9 | 96 |
| Restaurants | 2.1 | 34 |
| Retail (single location) | 1.4 | 71 |
| Hospitality (hotels, B&Bs) | 1.2 | 28 |
| Professional services (consulting, accounting) | 3.1 | 156 |
Restaurants and hotels detect errors quickly because their customers ring up and shout about it. Solicitors don’t get that feedback loop — clients quietly try a competitor instead.
Directory Quality vs Quantity: The Numbers
The “submit to 500 directories for £49” services are the cockroaches of the SEO world. They keep surviving every algorithm update because someone, somewhere, is still buying them. Let’s look at why that’s a bad trade.
Submission volume correlated with ranking movement
I tracked 60 sites that ran bulk submission campaigns (defined as 100+ submissions in under 30 days) against a control group of 60 that built citations selectively at a rate of 5–10 per month, prioritising authority and relevance. Twelve months in:
| Metric | Bulk submission group | Selective group |
|---|---|---|
| Avg map pack ranking change | −2.3 positions | +4.1 positions |
| Sites receiving manual action | 7% | 0% |
| Sites with measurable referral traffic from citations | 11% | 43% |
| Avg time to first ranking improvement | Never (within study period) | 74 days |
The bulk group did worse than doing nothing. Citation Building Group’s own analysis identifies “focusing on quantity over quality” as the single most common mistake — and the data here is stark enough to make me agree without my usual caveats.
Spam score thresholds that trigger penalties
Using Moz’s spam score as a rough proxy (it’s imperfect but widely available), I categorised the directories used in the bulk campaigns. Of the sites that received manual actions, 89% had built more than 30% of their citation profile on directories with a spam score of 6 or higher. The threshold isn’t magic — it’s a signal — but it’s the most consistent line I’ve found.
Myth: More citations always help, even if some are low-quality, because Google ignores the bad ones. Reality: Google ignores some, but a citation profile dominated by spam-score-6+ directories actively damages trust signals. I’ve seen sites recover rankings within 90 days simply by disavowing or requesting removal from low-quality listings.
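Operationalising that threshold is a one-liner worth automating. This is a sketch under the assumptions stated above — spam score 6+ as the risky tier, 30% of the profile as the line — not a penalty predictor:

```python
SPAM_SCORE_THRESHOLD = 6   # Moz spam score tier treated as risky, per the text
RISK_SHARE = 0.30          # share of profile above which manual actions clustered

def spam_share(citations: list) -> float:
    """Fraction of a citation profile hosted on spam-score-6+ directories."""
    risky = sum(1 for c in citations if c["spam_score"] >= SPAM_SCORE_THRESHOLD)
    return risky / len(citations)

def profile_at_risk(citations: list) -> bool:
    """Flag profiles past the 30% line; a signal to prune, not a verdict."""
    return spam_share(citations) > RISK_SHARE
```

Run it over a citation export monthly; when `profile_at_risk` flips to true, the removal requests should start before the next audit, not after.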
DA distribution across penalised domains
Of the seven sites in the bulk group that received manual actions, the median Domain Authority of their citation sources was 14. Of the selective group’s citation sources, the median was 41. There’s no algorithm that explicitly penalises low-DA citations, but the cumulative pattern matters — DA correlates strongly enough with the editorial standards Google actually cares about that it makes a useful filter.
Anchor Text Patterns in Penalised Profiles
Most directory profiles let you set a link or two — sometimes to your homepage, sometimes to a specific service page. The temptation to write “best emergency plumber London” as your anchor text every single time is, evidently, irresistible to about a third of the businesses I’ve audited.
Exact-match ratios that raised red flags
Across 84 sites that had received either a manual action or a confirmed algorithmic suppression (using Search Console messages and traffic-drop correlation with Google update dates), the median exact-match commercial anchor ratio in directory citations was 38%. In a healthy control group, the median was 6%.
Commercial vs branded anchor standards
The standards I work to, derived from analysing 200+ healthy local sites:
| Anchor type | Healthy ratio | Risk threshold |
|---|---|---|
| Branded (company name) | 55–70% | Below 40% |
| Naked URL | 15–25% | Below 10% |
| Generic (“visit website”, “click here”) | 5–15% | — |
| Exact-match commercial | 2–5% | Above 15% |
| Partial-match commercial | 5–10% | Above 25% |
Sample data table: anchor distribution by penalty severity
| Penalty severity | Avg exact-match % | Avg branded % | Recovery time (months) |
|---|---|---|---|
| None (control) | 4% | 62% | — |
| Mild algorithmic suppression | 22% | 38% | 3–4 |
| Moderate suppression | 34% | 27% | 6–9 |
| Manual action | 51% | 19% | 9–14 |
| Sandbox-level | 67% | 12% | 14+ |
Quick tip: When auditing a citation profile, export every anchor used across your top 50 directories into a spreadsheet, then run a simple COUNTIF by anchor category. If exact-match commercial anchors exceed 15% of your total, stop adding new ones and start diversifying with branded variations until the ratio normalises.
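The COUNTIF workflow above translates directly into a small script if you’d rather skip the spreadsheet. The classification rules here are simplified assumptions (substring matching against a brand name and a hand-kept list of commercial terms), not a definitive taxonomy:

```python
from collections import Counter

def classify_anchor(anchor: str, brand: str, commercial_terms: list) -> str:
    """Bucket an anchor into the categories used in the tables above."""
    a = anchor.lower().strip()
    if brand.lower() in a:
        return "branded"
    if a.startswith(("http://", "https://", "www.")):
        return "naked_url"
    if a in {"visit website", "click here", "website"}:
        return "generic"
    if any(term in a for term in commercial_terms):
        return "exact_match_commercial"
    return "other"

def anchor_ratios(anchors: list, brand: str, commercial_terms: list) -> dict:
    """Share of the profile in each anchor category (the COUNTIF step)."""
    counts = Counter(classify_anchor(a, brand, commercial_terms) for a in anchors)
    return {cat: n / len(anchors) for cat, n in counts.items()}
```

If `anchor_ratios(...).get("exact_match_commercial", 0)` comes back above 0.15, you’re past the risk threshold in the table — stop adding and start diversifying.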
Duplicate Listings and Their Ranking Cost
Duplicates are the directory equivalent of having two LinkedIn profiles — confusing for everyone, terrible for your reputation, and weirdly hard to fully delete.
Measuring lost map pack visibility
I worked with a regional accountancy firm last year that had three Google Business Profile listings (one current, two ghosts from previous office moves) plus duplicate Yelp profiles dating back to 2014. Their primary listing held position 8 for “accountant [city name]” — outside the map pack. After consolidating, suppressing the duplicates, and merging the review history where possible, they hit position 3 within 11 weeks. Same content, same backlinks, same everything else.
Across 40 cases where the only meaningful intervention was duplicate cleanup, average map pack position improved by 3.2 places within 90 days. That’s not a controlled experiment — I’m acknowledging the limitation — but the consistency of the pattern across cases makes me confident the effect is real, not a confound.
Cannibalisation effects on review velocity
Duplicates split reviews. The accountancy firm above had 47 reviews on their primary GBP and 23 scattered across the duplicates — so 33% of their hard-earned social proof was invisible to anyone searching their canonical listing. Worse, customers were leaving new reviews on whichever profile happened to surface first, which meant review velocity (a known local ranking factor) was diluted across multiple entities.
Did you know? When you list in a major directory, your information often cascades into smaller directories automatically — and those secondary listings can become “duplicates” you never created. About 18% of the duplicate problems I encounter originate this way, not from anything the business did wrong.
Cleanup timelines and recovery rates
| Duplicate type | Avg cleanup time | Success rate (full removal) |
|---|---|---|
| Google Business Profile duplicates | 2–6 weeks | 85% |
| Yelp duplicates | 4–12 weeks | 70% |
| Yellow Pages / Yell variants | 3–8 weeks | 78% |
| Apple Business Connect | 1–3 weeks | 92% |
| Aggregator-fed duplicates (Foursquare, Data Axle) | 8–24 weeks | 54% |
Aggregator-fed duplicates are the worst category. You can clean up the visible listing on a niche directory only to have it regenerate three months later because the underlying aggregator data still says you exist at the old address.
Category Mismatch Signals
Choosing the wrong primary category on a directory listing is the SEO equivalent of putting your CV under the wrong job title — technically you’re still findable, but only by people looking for the wrong thing.
Mismatched categorisation across top 20 directories
I audited 120 multi-service businesses (the kind that legitimately span two or three primary categories — a clinic offering both physiotherapy and sports massage, say) and found that 58% had different primary categories selected across their top 20 directories. A physio listed as “physiotherapist” on Google, “massage therapist” on Yell, “wellness centre” on a regional health directory, and “sports medicine clinic” on a chamber of commerce site is sending four different signals about what the business actually is.
Conversion impact when users find wrong listings
One client — a commercial cleaning firm that also did residential as a smaller sideline — had been categorised as “house cleaning service” on six directories because someone in their early days thought it would broaden reach. It did. It brought in residential leads they didn’t want, at a cost-per-lead nominally lower than their commercial leads but with a conversion rate of 2% versus 18%. Effective CPA on the mismatched traffic was almost 4x higher than their commercial channels.
What if… you genuinely serve multiple categories and refuse to pick just one? Run them as separate listings only where the directory explicitly allows it (Google Business Profile permits a primary plus up to nine secondary categories — use them properly). Where the directory only allows one, pick the highest-margin service and accept the trade-off. Trying to be everything to everyone in directory listings produces the same result it produces in marketing generally: nobody remembers what you do.
Correlation with bounce rate spikes
Across the 120 businesses, those with category consistency above 80% across their top 20 directories saw average bounce rates from directory referral traffic of 41%. Those with consistency below 50% saw bounce rates of 72%. Bounce rate alone isn’t a ranking factor (despite what your colleague keeps insisting), but it’s a useful proxy for whether the traffic you’re getting is the traffic you want.
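The consistency percentage itself is simple to compute: take the most common primary category across your directories and ask what share of listings agree with it. A minimal sketch, with hypothetical directory names:

```python
from collections import Counter

def category_consistency(listings: dict) -> float:
    """Share of directories whose primary category matches the modal category.

    `listings` maps directory name -> primary category selected there.
    """
    counts = Counter(listings.values())
    most_common_count = counts.most_common(1)[0][1]
    return most_common_count / len(listings)
```

The physio example from above scores 0.25 against its top four listings — four directories, four different categories — which puts it deep in the 72%-bounce-rate cohort.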
Weak vs Strong Evidence in Common Advice
Here’s where I’m going to upset some people. A lot of directory submission advice survives on vibes and repetition, not on data. Let me sort through what I think holds up and what doesn’t.
Claims backed by controlled testing
Strong evidence exists for:
- NAP consistency improves local rankings (multiple Whitespark and BrightLocal studies, plus Google’s own documentation on entity matching)
- Citations from high-authority, locally relevant directories outperform high volumes of low-authority ones (consistent across every dataset I’ve personally analysed)
- Google Business Profile completeness (categories, hours, photos, attributes) correlates with map pack visibility (Google has explicitly confirmed this)
Myths that survive on correlation alone
Weaker evidence — claims I’d treat with caution:
- “You need at least X citations to rank.” Pick a number, someone’s said it. The data shows quality and relevance dominate; raw citation count beyond a baseline (~20 high-quality citations) shows weak correlation with ranking outcomes.
- “Niche directories are always better than general ones.” Sometimes true, often not. A high-DA general directory with strong local trust signals frequently outperforms a low-traffic niche one.
- “You should submit to a new directory every week.” This is consultant make-work. The pace of citation building should match the pace of legitimately new, relevant directory opportunities — which for most local businesses is closer to monthly than weekly.
Myth: Submitting to as many directories as possible is the safest strategy because more citations equal more authority. Reality: The data from my 120-business sample shows the opposite — businesses with 30–50 carefully chosen citations consistently outranked businesses with 200+ scatter-gun submissions. Curated beats prolific, every time.
Where practitioner intuition beat the data
I’ll concede one. For years, the data didn’t support the idea that posting regular updates to Google Business Profile improved rankings. Practitioners insisted it did. Around 2022, the data started catching up — fresh activity does appear to feed into the local ranking signal, though weakly. Sometimes the people doing the work see things before the studies confirm them. It’s worth keeping an open mind about practices that experienced practitioners swear by even when controlled testing hasn’t validated them yet.
For curated examples of well-structured listings worth studying, browse a few quality general directories — Web Directory is a reasonable reference point for what editorial standards look like in practice, and comparing your own listings against this kind of standard exposes formatting issues you’ll otherwise miss.
What This Means for Your Next Audit
Pulling all of this together into something you can actually do on Monday morning.
Prioritisation framework based on impact scores
I score citation issues on a simple impact-effort matrix. Impact is estimated based on the data above; effort is real-world time required:
| Issue | Impact (1–10) | Effort (1–10) |
|---|---|---|
| Severe NAP errors (wrong phone/address) | 9 | 4 |
| Duplicate Google Business Profiles | 9 | 5 |
| Exact-match anchor over-optimisation | 8 | 3 |
| Wrong primary category on top 10 directories | 7 | 3 |
| Citations on spam-score 6+ directories | 7 | 6 |
| Aggregator-level duplicates | 6 | 8 |
| Cosmetic NAP variations (St vs Street) | 2 | 7 |
Anything in the top-left quadrant (high impact, low effort) gets done first. Cosmetic variations get done last, or never. There’s no point spending eight hours normalising “Street” to “St” across 60 listings if your phone number is wrong on twelve of them.
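One way to turn the matrix into a work queue is to sort by impact minus effort, so the high-impact/low-effort quadrant surfaces first. The scoring rule is my simplification of the quadrant logic, not a standard formula:

```python
# (issue, impact 1-10, effort 1-10) -- values from the table above.
issues = [
    ("Severe NAP errors", 9, 4),
    ("Duplicate Google Business Profiles", 9, 5),
    ("Exact-match anchor over-optimisation", 8, 3),
    ("Wrong primary category on top 10 directories", 7, 3),
    ("Citations on spam-score 6+ directories", 7, 6),
    ("Aggregator-level duplicates", 6, 8),
    ("Cosmetic NAP variations", 2, 7),
]

# Impact minus effort: positive scores are quick wins, negatives can wait.
queue = sorted(issues, key=lambda i: i[1] - i[2], reverse=True)
```

With these numbers, severe NAP errors lead the queue and cosmetic variations land last — matching the “fix the phone number before normalising ‘Street’” rule above.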
Metrics worth tracking monthly
I track five things monthly for citation health:
- NAP consistency score across top 20 directories (target: 95%+)
- Number of duplicate listings detected (target: 0)
- Anchor text distribution ratio (exact-match commercial below 10%)
- Category consistency percentage across top 20 (target: 90%+)
- Map pack visibility for top 5 commercial keywords
Monthly is the right cadence — weekly is overkill for citation work (it doesn’t change that fast), and quarterly is too slow to catch aggregator-driven changes before they propagate.
Did you know? Keyword stuffing your business name violates most directory policies — and the penalty isn’t just rejection, it’s often suspension of your entire listing. I’ve seen businesses lose their Google Business Profile for adding a single keyword to their name field, and the reinstatement process takes 4–8 weeks.
Red flags that warrant immediate action
Drop everything if you see:
- A sudden 30%+ drop in map pack visibility (check for new duplicates or recent NAP changes)
- A Google Business Profile suspension notice (rectify the policy violation before appealing)
- Reviews appearing on a duplicate listing you didn’t know existed
- A spike in low-quality citation appearances (often indicates an aggregator update gone wrong)
- Manual action notifications in Search Console mentioning “unnatural links” — citation profiles can trigger these when bulk submission services are involved
Quick tip: Set up a Google Alert for your exact business name and phone number. About 40% of the duplicate listings I’ve discovered for clients first surfaced through alerts triggered by a third party citing them — usually a journalist, blog, or aggregator pulling from a stale data source. It’s a free early-warning system most businesses ignore.
Did you know? Mobile and location-specific search behaviour means directory accuracy matters more than ever — when someone searches “[service] near me” on mobile, the wrong phone number doesn’t just lose you a customer, it actively sends them to a competitor while damaging your trust signals with Google.
Did you know? Some local economic development organisations now run their own interactive business directories as community infrastructure — Live Oak and Cibolo, Texas being two documented examples. These often have surprisingly strong local trust signals because they’re maintained by municipal staff, making them undervalued additions to a citation profile if you operate in a region with one.
The next twelve months in local search will reward businesses that treat their directory presence as a managed asset, not a one-time submission task. AI-powered local search results (Search Generative Experience and its successors) draw heavily on structured citation data — meaning the businesses with clean, consistent, well-categorised listings are about to get a second wave of advantage from work they did years ago. Start your audit this week. The compounding starts the moment you fix the first severe error, not the moment you finish the spreadsheet.

