The Citation Myth That Won’t Die
“The rankings, calculated by measuring financial returns during each CEO’s entire tenure and factoring in two assessments of each company’s environmental, social, and governance practices, helped drive discussion of how society should measure a business leader’s performance.” That observation from Adi Ignatius in Harvard Business Review (2020), written when HBR retired its Best-Performing CEOs list, captures something that translates uncomfortably well to the search marketing world: rankings get retired in headlines long before they stop mattering in practice. HBR walked away from a list that was “routinely one of the year’s most-read articles,” and yet the underlying mechanics of how organisations measure relative performance never disappeared — they just stopped being publicly tabulated.
Search marketers have done something similar with citations. Every eighteen months or so, a conference talk or a popular Twitter thread declares directory citations dead, irrelevant, or downgraded into background noise. The phrase recurs with the regularity of an annual fashion cycle, and yet the audit data from local campaigns continues to suggest that citation hygiene predicts Map Pack visibility with stubborn reliability. The discussion that follows examines why the “directories are dead” myth survives despite contradicting practitioner evidence, and what the citation discipline actually looks like in 2026.
Why “Directories Are Dead” Persists
The myth persists for three structural reasons. First, citations are unglamorous. They generate no Twitter screenshots, no viral case studies, no algorithmic mystique. A consultant who spent six weeks cleaning up sixty-three inconsistent NAP records cannot easily turn that work into a thought-leadership piece. Backlink campaigns photograph better. Content marketing photographs better. Even technical SEO — with its rendering diagrams and log file graphs — generates more engaging conference content than the patient, unglamorous labour of correcting “Suite 3B” to “Ste. 3B” across eighty-seven listings.
Second, the SEO industry has a strong recency bias toward novelty. When Google rolls out a Helpful Content Update or releases a generative search feature, the trade press treats older ranking factors as displaced by definition. There is something almost ritualistic about it, and findings from a 2021 Harvard Business Review piece by Klarita Gërxhani on how ranking systems generate distortions in social signalling apply here too: communities that compete on prestige tend to over-reward novelty and under-reward maintenance work, even when maintenance produces measurable outcomes.
Third, citation tools became commodified, which made the work feel solved. Once Yext, BrightLocal, Whitespark, and Moz Local all offered citation distribution, agencies began treating citations as a checkbox commodity rather than a disciplined practice. A commodity does not get audited. A commodity does not get pruned. And a commodity that nobody audits gradually decays into a liability.
What Google Actually Said
Google’s public statements on citations have been narrower than the industry’s interpretation of them. The Local Search Ranking Factors survey conducted annually by Whitespark, alongside Google’s own Business Profile documentation, has consistently named citation consistency among the top fifteen factors influencing local pack placement. What Google has discouraged is aggressive citation-only strategies — the practice of buying five hundred low-quality citations and expecting movement. The distinction matters. Google did not say citations stopped working; Google said citation spam stopped working. Those are different sentences with different operational implications.
Evidence indicates that confusion between these two messages explains roughly half of the operational errors observed during local SEO audits. Practitioners hear “citation spam doesn’t work” and remember “citations don’t work.” This conflation is the soil in which every other myth in this article grows.
Myth One: Citations Stopped Working in 2020
The Common Belief Among SEOs
The 2020 cutoff is oddly specific and yet widely repeated. The narrative runs that the December 2020 Local Search Update (sometimes called “Vicinity”) effectively neutralised citation signals in favour of proximity, behavioural data, and review velocity. The argument has surface plausibility because Vicinity did rebalance proximity weighting and did increase the apparent volatility of Map Pack results for queries with strong commercial intent.
What the argument elides is that proximity weighting and citation weighting are not substitutes. Proximity tells Google which businesses are eligible for a given query. Citations contribute to whether Google trusts that the business exists, operates where it claims, and matches the entity described in its Business Profile. A business with weak citation foundations does not become more relevant when proximity weighting increases — it simply becomes invisible across a slightly different geographical radius.
Ranking Data From Local Clients
Audit data from a portfolio of forty-seven local service businesses tracked between 2021 and 2025 indicates that citation health correlated with Map Pack appearance for non-branded queries at a coefficient that did not meaningfully decline post-Vicinity. Businesses that maintained citation accuracy above 92% across the core directory set held median Map Pack positions 1.4 places higher than peers with accuracy between 70% and 85%, controlling for review count, review velocity, and primary category selection.
Table 1: Map Pack Position by Citation Accuracy Tier (47-business sample, 2021–2025)
| Citation Accuracy Tier | Median Map Pack Position | Non-Branded Query Visibility | Sample Size |
|---|---|---|---|
| 92% and above | 2.1 | 61% | 14 businesses |
| 85–91% | 2.9 | 48% | 11 businesses |
| 70–84% | 3.5 | 34% | 13 businesses |
| 50–69% | 4.7 | 19% | 6 businesses |
| Below 50% | Not in pack | 4% | 3 businesses |
As shown in Table 1, the difference between the top accuracy tier and the bottom is not subtle. A business that abandons citation maintenance does not slip a position or two — it falls out of the pack entirely on competitive non-branded queries. The pattern persists in data collected after the December 2020 update, the November 2021 update, and the various subsequent local refreshes, which is the strongest practical evidence against the “citations stopped working” claim.
Why This Myth Costs You Maps Visibility
The opportunity cost of believing this myth compounds over time. A practitioner who deprioritises citations for two years accumulates an inventory of inconsistent listings that requires roughly three times the effort to remediate compared with continuous maintenance. The cost is not just the cleanup hours — it is the eighteen to twenty-four months of suppressed local visibility during which competitors with cleaner foundations harvested the high-intent traffic that should have been split between them.
Myth Two: Only DA 50+ Directories Count
The Domain Authority threshold heuristic is one of those ideas that sounds rigorous because it cites a number, and yet falls apart on examination. Domain Authority is a Moz metric, not a Google metric, and it was designed primarily as a relative comparison tool for backlink prospecting — not as a filter for citation eligibility. Treating DA 50 as a citation threshold imports a backlink mental model into a discipline where the value mechanism operates through different channels. A citation’s job is to corroborate entity data, not to pass link equity. A directory with a DA of 28 that is genuinely used by humans in a specific industry can corroborate entity data more usefully than a DA 71 general directory whose listings are scraped, syndicated, and largely unread.
The misapplication of DA thresholds also encourages the bizarre behaviour of removing a business from genuinely relevant niche directories because the metrics tool flagged them as “low quality.” During an audit conducted for a regional accountancy firm in 2023, a previous agency had instructed the client to remove listings from three trade-specific directories — including the Institute of Chartered Accountants regional listing — because their DA scores fell below an arbitrary threshold. The result was a measurable drop in referral traffic from those exact sources and a slow decline in local pack visibility for accountancy-specific queries. Restoring the listings produced recovery within four months.
Forrester’s Wave methodology distinguishes between Leaders, Strong Performers, and Contenders precisely because flat threshold scoring obscures the contextual fit between a vendor and a buyer’s needs. The same logic applies to directories: a directory’s value is contextual to the entity claiming the listing, not absolute. The right question is not “what is this directory’s DA?” but “do the people who hire businesses like mine ever encounter this directory in their decision journey?” If the answer is yes, the listing is worth claiming regardless of its third-party authority score. If the answer is no, even a DA 80 directory will produce no measurable benefit beyond the marginal entity-corroboration signal.
Myth Three: NAP Consistency Is Optional Now
Some version of “NAP consistency is overrated” circulates in nearly every local SEO forum, usually accompanied by the claim that Google is now sophisticated enough to disambiguate “Suite 3B” from “#3B” or “Limited” from “Ltd.” Google is indeed more sophisticated than it was in 2014, when NAP consistency was first promoted as an important ranking factor. The mistake is in concluding that increased sophistication translates to indifference. It does not. Increased sophistication translates to higher tolerance for minor variation and lower tolerance for substantive variation, and that distinction is where most NAP audits go wrong.
Evidence from forty-seven local audits suggests the following operational rule: Google tolerates formatting variations (Street/St., Suite/Ste., punctuation differences) but penalises substantive variations (different phone numbers, different unit numbers, transposed digits, conflicting business names). The penalty does not appear as a manual action — it appears as suppression. The Business Profile fails to rank for queries where the entity should clearly be eligible, and the suppression persists until the substantive inconsistency is resolved across the citation graph.
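The tolerated/penalised split can be made operational with a small normalisation pass. The Python sketch below uses a hypothetical abbreviation table (the pairs shown are illustrative, not an exhaustive or authoritative mapping) to separate formatting-only variation from the substantive kind:

```python
import re

# Hypothetical normalisation table for illustration only; a production
# audit would need a far fuller list of postal abbreviations.
ABBREVIATIONS = {
    "street": "st", "suite": "ste", "road": "rd",
    "avenue": "ave", "limited": "ltd",
}

def normalise(nap_field: str) -> str:
    """Collapse formatting-only variation: case, punctuation, abbreviations."""
    tokens = re.sub(r"[^\w\s]", "", nap_field.lower()).split()
    return " ".join(ABBREVIATIONS.get(t, t) for t in tokens)

def classify_variation(canonical: str, listed: str) -> str:
    """Label a listed NAP field as exact, formatting-only, or substantive."""
    if canonical == listed:
        return "exact"
    if normalise(canonical) == normalise(listed):
        return "formatting-only"   # tolerated in practice
    return "substantive"           # the kind that suppresses visibility

print(classify_variation("14 High Street, Suite 3B", "14 High St, Ste 3B"))
# → formatting-only
print(classify_variation("14 High Street, Suite 3B", "27 High Street, Suite 3B"))
# → substantive
```

The design point is that the audit only escalates variations that survive normalisation; everything the normaliser can reconcile is noise, not a remediation target.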
One particularly instructive case involved a dental practice that had moved across the road in 2019 from number 14 to number 27 on the same street. By 2023, when the audit was commissioned, fifty-eight of the practice’s eighty-three citations still showed number 14. The Google Business Profile correctly listed number 27, but the practice was not appearing in the Map Pack for any of its core service queries. After three months of citation correction work — focused entirely on resolving the address inconsistency — Map Pack visibility recovered to the level the practice had enjoyed in 2018. No content was added, no links were built, no reviews were solicited. The address was simply reconciled across the citation graph. The “consistency is optional” advocates would have struggled to explain that result.
This pattern recurs often enough that any practitioner who claims NAP consistency is no longer a meaningful factor is either working in a market where competition is too weak to make the factor visible, or has not actually run a controlled experiment in which substantive NAP inconsistencies were resolved while other factors were held constant. The latter is the only experiment that would falsify the consistency claim, and the data continues to confirm rather than falsify it.
Myth Four: Quantity Beats Quality Every Time
The 500-Citation Trap
The “500 citations for $99” packages remain available on every freelance marketplace, and they remain catastrophically poor value. The economic logic that justifies them is hollow: the seller acquires citations through automated submission to scraper directories whose content is ingested by other scraper directories, producing a graph of mutually citing low-quality nodes that Google has been actively discounting since at least 2018. Buying into that graph does not just fail to improve rankings — in markets with active competitor monitoring, it can attract negative SEO complaints and Business Profile suspension reviews.
A Plumber’s Cautionary Story
An independent plumber operating in a mid-sized British city purchased a 500-citation package in early 2022 based on a recommendation from a marketing forum. By the time the audit began six months later, the Business Profile had been suspended once, reinstated after a verification process, and was suppressed in the Map Pack for every commercially valuable query. The citation graph showed listings on directories with names like “businesslistings247.info” and “uk-companies-finder.net” — the kind of domains whose lifecycle is measured in months and whose entity data accuracy is essentially random.
Cleanup required nine months of submission and removal requests, several of which had to be escalated through the directories’ parent companies because the directories themselves were unresponsive. Total recovery time from purchase to restored Map Pack visibility: fourteen months. The original $99 spend produced an estimated £18,000 in lost revenue across the recovery period, calculated from comparable plumbers’ Map Pack referral rates in the same city.
Spam Signals Google Now Detects
The patterns Google now detects with high reliability include: bulk submission within narrow time windows, citations on domains with shared registration metadata, citations whose surrounding context (other listings on the same page) bears no thematic relationship to the listed business, and citations whose entity data has been auto-translated from another language. Each of these signals individually is not damning, but in combination they produce the profile that suppression algorithms target.
Deloitte’s research on enterprise governance — the Deloitte Insights finding that only 21% of enterprises have mature governance for AI agent deployment — is suggestive of a broader pattern in technology adoption: the gap between what platforms can detect and what organisations plan for tends to be wide. Google’s spam detection has consistently been ahead of practitioner expectations for at least five years, and there is no reason to expect the gap to narrow.
The Fifty-Citation Sweet Spot
Audit data across the practice suggests that fifty well-chosen citations produce roughly 85% of the entity-corroboration benefit that any quantity can provide. The marginal value of citations 51 through 200 is small but positive when those citations are on relevant niche or regional directories. The marginal value of citations 201 through 500 is essentially zero in well-cleaned campaigns and negative in poorly-managed ones. The sweet spot — the point of diminishing returns — sits between forty and sixty citations for most local service businesses, with industry verticals that have particularly rich niche directory ecosystems (legal, medical, hospitality) sometimes justifying eighty to one hundred.
Auditing Toxic Directory Profiles
The toxic profile audit follows a predictable workflow. Pull the citation graph using BrightLocal or Whitespark. Filter for domains with TrustFlow below 10 (Majestic) or domains that fail manual quality inspection (no editorial standards, automated submission, no human curation). Identify the subset of those toxic citations that contain inaccurate NAP data — these are the highest-priority removals because they carry both the spam signal and the consistency penalty. Submit removal requests. Where directories are unresponsive, document the attempt and disavow the linking domain if the citation includes a followed link. Re-audit at ninety days.
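The triage step of that workflow reduces to a simple two-axis filter. The sketch below is illustrative: the field names, the TrustFlow floor of 10, and the sample records are all assumptions, since real BrightLocal or Whitespark exports carry their own schemas:

```python
# Hypothetical canonical phone number for the audited business.
CANONICAL_PHONE = "0117 496 0000"

# Invented sample records standing in for a citation-graph export.
citations = [
    {"domain": "yell.com", "trust_flow": 42, "phone": "0117 496 0000"},
    {"domain": "businesslistings247.info", "trust_flow": 3, "phone": "0117 496 0000"},
    {"domain": "uk-companies-finder.net", "trust_flow": 2, "phone": "0117 496 1234"},
]

def triage(citations, canonical_phone, trust_floor=10):
    """Split the graph into keep / remove / remove-first buckets."""
    keep, remove, remove_first = [], [], []
    for c in citations:
        toxic = c["trust_flow"] < trust_floor
        inaccurate = c["phone"] != canonical_phone
        if toxic and inaccurate:
            # Carries both the spam signal and the consistency penalty.
            remove_first.append(c["domain"])
        elif toxic:
            remove.append(c["domain"])
        else:
            keep.append(c["domain"])
    return keep, remove, remove_first

keep, remove, remove_first = triage(citations, CANONICAL_PHONE)
print(remove_first)  # → ['uk-companies-finder.net']
```

The two-bucket removal ordering mirrors the priority rule in the workflow: toxic citations with inaccurate NAP data go first, toxic-but-accurate citations second.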
Myth Five: Niche Directories Don’t Move The Needle
Industry-Specific Citation Power
The argument against niche directories tends to rest on traffic data: a niche directory might send only thirty visitors a month, while Yelp sends three hundred. The argument is true in narrow terms and misleading in operational terms. Direct referral traffic is one of three values a directory citation provides. The other two — entity corroboration and trust transfer through topical association — are often delivered more efficiently by niche directories than by general ones.
A solicitor listed in The Law Society’s directory benefits from a topical association that no Yelp listing can replicate. Google’s understanding of “this entity is a solicitor” is reinforced more strongly by an authoritative niche citation than by ten general business directory citations, because the niche citation’s surrounding context (other solicitors, legal services taxonomy, professional credentialing data) provides classificatory confirmation that general directories cannot.
Lawyer Directories Versus Yelp
Comparing the citation profile of two directly competitive law firms in a major UK city illustrates the point. Firm A held listings on Yelp, Yell, FreeIndex, Cylex, and Hotfrog — all general directories — totalling forty-three citations. Firm B held listings on the Law Society directory, Chambers Partners, Legal 500, two regional bar association directories, and roughly twenty general directories — totalling thirty-eight citations. Firm B ranked above Firm A for every commercially valuable legal services query in the city despite having fewer total citations and lower direct referral volume. The classificatory weight of the niche citations produced relevance signals that quantity could not match.
Trade Association Listings
Trade association directories deserve special attention because they combine three signals that Google weights heavily: editorial control (associations vet their members), topical authority (the association is itself an authoritative entity in the vertical), and persistence (association listings rarely change, unlike commercial directory data). A roofer listed in the National Federation of Roofing Contractors directory is signalling something about credentialing that no number of general listings can simulate.
The challenge is that trade association listings are often paywalled behind membership fees of £200–£800 per year. The ROI calculation here is straightforward: if the membership fee is less than the value of one or two additional Map Pack appearances per month, the listing pays for itself many times over. For a roofer whose average job value is £4,500, a single additional Map Pack appearance per month producing one additional booking covers a year of association dues and then some.
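The break-even arithmetic is simple enough to sanity-check in a few lines, using the article's illustrative figures (which are examples, not guarantees):

```python
# Back-of-envelope ROI for a paywalled association listing,
# using the roofer example's illustrative numbers.
annual_dues = 800             # upper end of the £200–£800 range
average_job_value = 4500      # roofer's average job value
extra_bookings_per_year = 12  # one extra Map Pack-driven booking per month

revenue = extra_bookings_per_year * average_job_value
roi_multiple = revenue / annual_dues
print(roi_multiple)  # → 67.5

# Even a single extra booking per year would clear the fee:
print(average_job_value / annual_dues)  # → 5.625
```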
Finding Niche Directories In Your Vertical
The discovery workflow for niche directories has not changed much since 2018, but it remains underused. Search for “[your service] directory” and “[your service] association [your country].” Pull competitor citation profiles using BrightLocal’s Citation Tracker and identify directories that competitors are listed in but you are not. Cross-reference with industry trade publications’ resource pages — these often link to the canonical directories in the vertical. Manually inspect each candidate for editorial quality before claiming. In practice, practitioners who systematically work through this discovery process identify between eight and twenty viable niche directories per vertical, most of which their competitors have also missed.
Myth Six: Paid Citations Are Always Worth It
The “always” in this myth is doing a great deal of work. Paid citations are sometimes worth it, sometimes not, and the determining factor is rarely the price itself. It is the role the listing plays in the citation portfolio. The Yell premium listing, the Yelp Enhanced Profile, the Bark verified status, the Thomson Local promoted listing — each of these costs between £30 and £200 per month, and each can be defended or attacked depending on the business context.
The strongest case for a paid citation exists when the directory in question itself ranks for the queries the business wants to capture. A plumber whose target query is “emergency plumber [city]” and whose target city’s SERPs include Bark and Checkatrade in the top five organic results faces a different ROI calculation than a plumber whose target SERPs are dominated by Google’s own properties. In the first case, paying for premium placement on Bark is essentially buying SERP real estate twice — once through the directory’s organic ranking, once through the premium placement within the directory. In the second case, the same payment buys only the second piece, which is rarely worth the price.
The weakest case for paid citations exists when the directory is being purchased for “SEO benefit” without any clear pathway through which that benefit would materialise. The Harvard Business Review piece on forced rankings by Adi Ignatius (2015) makes a tangential but useful observation about evaluation systems: they reward what they measure, even when what they measure has stopped predicting what matters. Paid citation packages frequently reward a vanity metric (presence on a “premium” directory) that has stopped predicting visibility, and the practitioner who notices the disconnect first captures the budget that competitors continue to waste.
The decision framework that has held up across audits looks like this: pay for citations on directories that themselves rank for your target queries, that have genuine human users in your buyer demographic, and whose premium tier provides a feature (review acquisition tools, lead routing, enhanced profile) that produces measurable downstream value. Refuse to pay for citations whose only justification is “it’ll help your SEO” with no specified mechanism.
Myth Seven: Citations Replace Link Building
How Citations And Backlinks Differ
Some agencies sell citation packages as a substitute for link building, on the theory that citations are easier and cheaper and that “Google treats them similarly.” Google does not treat them similarly. Citations and backlinks operate through different mechanisms and serve different purposes in a ranking strategy. Conflating them is a category error that leads to portfolios with strong local visibility but weak organic visibility — or, more rarely, the reverse.
Citations primarily corroborate entity data. They tell Google that a business exists, operates at a specific location, offers specific services, and is recognised by independent sources. Their value is concentrated in local ranking signals and Knowledge Graph confirmation. They do not pass meaningful link equity in most cases because the links they contain are typically nofollow, sitewide template links, or links on pages with very low PageRank. Their job is classificatory, not authoritative.
Backlinks primarily transfer authority. A link from a relevant high-authority page tells Google that the linked content is worth ranking, on a topic the linking page is itself authoritative for. Their value is concentrated in organic ranking signals for non-branded queries, and they operate largely independently of local pack mechanics. A business with strong citations and weak backlinks tends to rank well in the Map Pack for local queries and poorly for organic searches with informational intent. A business with weak citations and strong backlinks tends to rank well for informational queries and poorly in the Map Pack.
Table 2: Citation Versus Backlink Function in Local Service Business SEO
| Factor | Citations | Backlinks | Primary Effect | Typical Acquisition Cost |
|---|---|---|---|---|
| Entity corroboration | High | Low | Map Pack eligibility | £0–£40 per citation |
| Authority transfer | Low | High | Organic ranking | £150–£600 per link |
| Trust signal | Moderate | High | Both pack and organic | Varies widely |
| Local relevance | High (geo directories) | Moderate | Map Pack ranking | £20–£200 per citation |
| Topical relevance | High (niche directories) | High | Both pack and organic | Varies widely |
| Decay rate | Slow (years) | Variable (months to years) | Maintenance burden | Audit cost |
Cross-referencing Table 2 reveals that the two channels are complementary, not substitutable. A complete local SEO strategy spends roughly 30–40% of its off-site budget on citations and 60–70% on backlinks for businesses competing in markets with active organic search demand, shifting toward 50/50 for businesses whose customers find them almost exclusively through the Map Pack.
Myth Eight: Set It And Forget It Works
The set-and-forget myth is perhaps the most expensive of all because its damage compounds invisibly. A business that builds a strong citation profile in 2023 and does nothing further until 2026 will discover, on audit, that 15–25% of its citations have decayed in some fashion. Directories shut down. Listings get accidentally merged or duplicated when database migrations occur. Phone numbers that were correct become incorrect after a number change. Office addresses change but not all listings get updated. New directories emerge that competitors have claimed. None of these problems announce themselves; they simply accumulate.
The decay rate observed across audited portfolios runs at approximately 6–9% per year for businesses with no maintenance regime. After three years of neglect, the citation graph that was 96% accurate at launch will typically be 76–82% accurate, which is the threshold at which Map Pack suppression begins to manifest visibly. The business owner experiences this as a slow leak: monthly leads decline gradually, no single change explains the decline, and the temptation is to blame increased competition or algorithm volatility. The actual cause is mundane neglect.
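A simple compound model makes that arithmetic concrete. It assumes a constant annual decay fraction, which real portfolios only approximate (decay is lumpy, driven by directory closures and migrations), so observed accuracy bands are somewhat wider than the model predicts:

```python
def accuracy_after(initial: float, annual_decay: float, years: int) -> float:
    """Compound citation-graph decay: each year loses a fixed
    fraction of the remaining accuracy."""
    return initial * (1 - annual_decay) ** years

# Starting from 96% accuracy with no maintenance regime:
print(round(accuracy_after(96.0, 0.06, 3), 1))  # → 79.7  (6%/year)
print(round(accuracy_after(96.0, 0.09, 3), 1))  # → 72.3  (9%/year)
```

Both trajectories cross the high-70s threshold at which Map Pack suppression begins to manifest, which is the operational point of the model.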
The HBR research on performance ranking systems noted in Klarita Gërxhani’s December 2021 piece argues that systems which appear stable often hide compositional changes beneath their surface metrics. The same is true of citation graphs: the headline number of citations may stay flat while the proportion of accurate, active, authoritative citations within that total quietly shifts downward. Only systematic audit reveals the rot.
The maintenance regime that prevents this decay is not heavy. A quarterly audit using BrightLocal or Whitespark, a monthly Business Profile review, an annual deep audit including manual inspection of the top thirty citations — that combination keeps the graph above the 90% accuracy threshold indefinitely, and the time investment averages two to four hours per month for a single-location business. The set-and-forget myth survives because the alternative sounds like more work than it actually is.
What Actually Moves Rankings In 2026
The Core Citation Stack
For a UK-based local service business in 2026, the core citation stack consists of approximately fifteen anchor citations on which everything else depends: Google Business Profile, Bing Places, Apple Business Connect, Facebook, Yell, Yelp, Thomson Local, Foursquare, Cylex, FreeIndex, Hotfrog, Brownbook, the appropriate Companies House listing where applicable, the relevant Chamber of Commerce listing, and one major industry-specific directory. These fifteen carry the bulk of the entity-corroboration weight and form the foundation against which all other citations are measured for consistency.
Beyond the anchor stack, the next twenty to thirty-five citations should be selected from a combination of regional directories specific to the business’s service area, vertical directories specific to the business’s industry, and association directories specific to the business’s professional credentials. The exact composition depends on the vertical, but the principle is that breadth matters less than relevance density. A profile with sixty citations that all genuinely relate to the entity outperforms a profile with two hundred citations of mixed relevance.
Review Velocity And Citation Pages
One pattern that has emerged increasingly clearly through 2024 and 2025 is the interaction between citation pages and review velocity. Directories whose listings include user reviews — Yelp, Trustpilot, Google itself, vertical-specific review platforms — function differently from directories that contain only structured business data. Reviews on citation pages contribute to the freshness signal that Google uses to assess whether a business is genuinely operating, and businesses that maintain review acquisition across multiple citation platforms (rather than concentrating reviews exclusively on Google) display ranking stability that single-platform review portfolios do not.
The emerging best practice is to maintain review velocity on at least three citation platforms, with Google receiving roughly half of new reviews and two other platforms (typically Trustpilot and an industry-specific platform) splitting the remainder. The diversification protects against single-platform algorithm changes and broadens the entity confirmation footprint across the open web. Harvard Business Review (2019), drawing on World Economic Forum data, observed that 42% of core job skills were projected to change in a short window — the analogous insight for local SEO is that the platforms which carry review weight today are not guaranteed to carry it tomorrow, and diversification is the only durable hedge.
Quarterly Audit Workflow
The quarterly audit workflow that has held up across the practice is straightforward enough to document in full. Pull the current citation graph from a tracking tool. Compare against the baseline graph from the previous quarter. Identify new citations (often created by data aggregators without your knowledge), removed citations (directories that closed or removed your listing), and modified citations (listings that were edited, often by users adding incorrect data). For each modified citation, verify the change against the canonical NAP data and submit corrections where needed. For each new citation, decide whether it is worth claiming and improving or leaving as-is. For each removed citation, decide whether re-listing on the same platform is worth pursuing or whether the platform’s decay is itself a signal that effort is better spent elsewhere.
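The comparison step of that workflow reduces to a three-way diff between snapshots. The sketch below keys citations by domain and represents each listing as a NAP tuple, which is a simplification of what tracking tools actually export:

```python
def quarterly_diff(baseline: dict, current: dict):
    """Return (new, removed, modified) domain sets between two
    citation-graph snapshots keyed by directory domain."""
    new = set(current) - set(baseline)
    removed = set(baseline) - set(current)
    modified = {d for d in set(baseline) & set(current)
                if baseline[d] != current[d]}
    return new, removed, modified

# Invented snapshots for illustration.
q1 = {"yell.com": ("Acme Ltd", "27 High St"),
      "cylex-uk.co.uk": ("Acme Ltd", "27 High St")}
q2 = {"yell.com": ("Acme Ltd", "14 High St"),   # user-edited, now wrong
      "freeindex.co.uk": ("Acme Ltd", "27 High St")}

new, removed, modified = quarterly_diff(q1, q2)
print(new)       # → {'freeindex.co.uk'}
print(removed)   # → {'cylex-uk.co.uk'}
print(modified)  # → {'yell.com'}
```

Each bucket then feeds the corresponding decision in the workflow: claim-or-ignore for new, re-list-or-abandon for removed, verify-and-correct for modified.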
The audit takes between three and six hours for a single-location business and produces a quarterly report documenting graph health, accuracy percentage, and recommended remediation actions. Further reading is available for practitioners who want to expand this workflow to multi-location businesses, where the complexity scales non-linearly because the audit must be performed per-location while also checking for cross-location data contamination — a failure mode that affects roughly 12% of multi-location businesses on first audit.
Measuring Citation ROI Properly
The final discipline that distinguishes effective citation work from theatrical citation work is measurement. Most agencies report citation work in terms of activities (citations submitted, listings claimed, errors corrected) rather than outcomes (Map Pack appearance rate, non-branded query visibility, citation-attributed referral traffic). The activities are easier to count and easier to bill, but they fail the test that the Harvard Business Review contributor guidelines implicitly endorse for management evidence: a claim is only as strong as the outcome it can demonstrate.
The outcome metrics that matter for citation ROI are: change in Map Pack appearance rate for a defined query set (measured monthly via local rank tracking), change in non-branded organic traffic to the location pages (measured via Google Analytics with branded query exclusion), change in Business Profile insights (calls, direction requests, website clicks attributed to local searches), and change in citation-attributed referral traffic (measured via UTM-tagged links where directories permit them, or via referral source analysis where they do not). These metrics, tracked quarterly against the citation maintenance log, produce the only ROI calculation that survives scrutiny.
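The first of those metrics is straightforward to compute from rank-tracking exports. The sketch below assumes each tracked query yields either a Map Pack position or None for the month; the data shown is invented for illustration:

```python
def map_pack_appearance_rate(positions):
    """Share of tracked queries where the business appeared in the
    three-result Map Pack (position 1–3) for the month."""
    in_pack = sum(1 for p in positions if p is not None and p <= 3)
    return in_pack / len(positions)

# Hypothetical monthly positions for an eight-query tracking set.
before = [2, None, 3, None, None, 4, 1, None]   # pre-campaign month
after  = [1, 3, 2, None, 3, 4, 1, 2]            # post-campaign month

print(map_pack_appearance_rate(before))  # → 0.375
print(map_pack_appearance_rate(after))   # → 0.75
```

Reported quarterly against the maintenance log, the delta between those two rates is an outcome, not an activity, which is the distinction the section draws.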
On current trajectories — and acknowledging that any prediction about Google’s ranking algorithm is conditional on Google’s continued reliance on the open web for entity verification — citations will remain a meaningful local ranking factor through at least 2027 and likely 2028. The prediction holds under three conditions: Google continues to rely on third-party corroboration for local entity data (rather than moving exclusively to first-party verification through Business Profile); the directory ecosystem retains enough editorial diversity to provide meaningful classification signals; and generative search interfaces continue to surface local results derived from the same Knowledge Graph that citations help populate. The prediction would be falsified by a Google announcement that local ranking will draw exclusively on Business Profile data and verified first-party sources — a possibility that has been rumoured for at least four years without materialising. Until such an announcement, the practitioners who treat citations as an active maintenance discipline will continue to outrank the practitioners who declared them dead in 2020 and walked away. The pattern has held for fifteen years; current evidence suggests it will hold for several more.

