
The 90-Day AI Citation Half-Life: How Fast Answers Change in 2026

By Cited Research Team · Published April 16, 2026 · Updated April 2026

Key Takeaways

  • 40–60% of domains cited in AI responses are completely different one month later (Conductor + Superlines volatility study, 2026).
  • Only 20% of brands remain visible across 5 consecutive prompt runs of the same query (Profound AI Search Volatility, 2026).
  • Google AI Mode responses show only 9.2% self-overlap when the exact same query is tested three times (Growth Memo, 2026).
  • 50% of content cited in AI search responses is less than 13 weeks old (Salespeak / AEO News, 2026).
  • Pages updated within two months earn 5.0 citations on average vs. 3.9 for pages older than two years (SE Ranking, 2.3M-page study, 2026).

AI citations do not have a shelf life the way Google rankings do. They have a half-life. A brand cited in today's ChatGPT answer has roughly a 50% chance of being replaced within 30 days on competitive commercial queries — and a roughly 70% chance of being replaced within 90 days (Conductor + Superlines, 2026). This is not a bug in AI search. It is the direct consequence of freshness-weighted retrieval on an index that updates hourly. Below is what the decay curve looks like, what triggers displacement, and the refresh cadence that actually holds.

What is the "AI citation half-life"?

AI citation half-life is the number of days before a cited page drops out of the citation set for the same target query. On Perplexity, the median half-life for commercial queries sits around 30–45 days; on ChatGPT it runs 60–90 days; on Google AI Overviews it lands near 30–60 days depending on vertical (synthesized from Conductor, Profound, Superlines, and Quattr 2026 volatility studies). Financial services, health, and news collapse to ≤30 days. Travel and deep-evergreen education extend to six months or more (Ahrefs + SE Ranking, 2026).
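If you model citation turnover as simple exponential decay (an assumption for illustration; the studies above report half-lives, not a fitted decay curve), a half-life of h days implies a survival fraction of 0.5^(t/h) after t days:

```python
def citation_survival(days: float, half_life_days: float) -> float:
    """Fraction of today's citations expected to survive after `days`,
    assuming exponential decay with the given half-life."""
    return 0.5 ** (days / half_life_days)

# Perplexity commercial queries: ~30-45 day half-life (37.5-day midpoint assumed)
for t in (30, 60, 90):
    print(t, "days:", round(citation_survival(t, 37.5), 2))
```

Under that assumption, a 37.5-day half-life leaves roughly a fifth of today's citation set intact at the 90-day mark, which is why quarterly refresh cycles lag the decay on competitive commercial queries.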

The mechanism is freshness-weighted passage retrieval. Perplexity weights freshness roughly 40% of its ranking function vs. Google's traditional 5–10% (Data Studios, 2026). Every engine benchmarked in Seer Interactive's 5,000-URL study (2026) showed a strong preference for content updated in the last 6 months; more than half of all cited pages had been refreshed within that window.

How volatile are AI answers in practice?

Extremely. Only 30% of brands stay visible from one AI answer to the next for the same query; only 20% remain visible across 5 consecutive runs (Profound AI Search Volatility, 2026). 9.2% of Google AI Mode responses overlap with themselves when the exact same query is tested three times (Growth Memo, 2026) — meaning identical inputs can produce almost entirely different citation sets.

At the domain level, the Conductor + Superlines volatility study (2026) found 40–60% of AI-cited domains are completely different one month later; Growth Memo + Superlines extended that to 70–90% drift comparing January vs. July of the same year. This is orders of magnitude more turnover than Google organic, where a #1 ranked page typically holds position for months at a time. Treat AI citation share as a flow metric, not a stock metric.

How fresh is the content AI engines actually cite?

The age profile is compressed. 50% of content cited in AI search responses is less than 13 weeks old (Salespeak / AEO News, 2026). AI-cited URLs have an average age of 1,064 days vs. 1,432 days for URLs appearing in Google's organic SERP — AI citations are 25.7% fresher on average (Ahrefs 17M Citations Study, 2026).

Per-engine, the freshness skew varies. The table below synthesizes Seer Interactive's 2026 recency study against Quattr's content freshness benchmarks.

Engine | % citations from current year | % citations from last 3 months | Median URL age
Perplexity | ~50% | ~40% | <90 days
Google AI Overviews | ~44% | ~35% | ~120 days
ChatGPT | ~31% | ~25% | ~180 days
Gemini | (emerging) | Preference for how-to + reference content | Shorter
Claude | Not publicly disclosed | Established / academic / gov skew | Longer

65% of AI bot hits target content published in the past year (Quattr, 2026). More than 70% of all cited AI pages were updated in the past 12 months, and more than 50% were refreshed in the past 6 months.

Does freshness actually cause more citations?

It correlates strongly. Pages updated within two months earn 5.0 citations on average vs. 3.9 for pages older than two years (SE Ranking, 2.3M-page analysis, 2026). Quattr's content freshness study (2026) reported 2× citation rate for pages updated within three months vs. outdated content. Pages with a visible "Updated April 2026" timestamp alone are cited 1.8× more than identical pages without (Backlinko, 2026).

Causality is muddier than the correlations suggest. Fresh pages tend to earn more backlinks and more mentions in the 30 days after publication, and those signals feed AI retrieval too. The cleanest isolated test comes from Seer Interactive's 2026 recency study: holding domain and structure constant, the freshness signal alone explained a ~1.6× lift in citation rate. Treat the headline 2× stat as a ceiling, not a median.

Which verticals decay fastest?

Financial services, health, and news. Ahrefs' 2026 breakdown put the citation half-life in these verticals at under 30 days — meaning any article not refreshed monthly drops out of the citation set. This is consistent with the underlying retrieval logic: ChatGPT's "Deep Research" mode is explicitly biased toward authoritative, current sources in regulated verticals (OpenAI Deep Research System Card, Feb 2025), and Perplexity's freshness weighting amplifies recency for health and finance queries.

At the other end, travel and lifestyle content retain evergreen citation value. Deep-evergreen education (e.g., decking, industrial engineering, long-settled scientific definitions) shows citation of content 10 to 15 years old in Ahrefs' sample. The pattern: the faster the underlying facts change, the more brutal the citation half-life. Verticals where the facts are stable reward long-lived content; verticals where the facts move reward monthly refreshes.

What triggers citation displacement?

Four mechanics account for most observed turnover in the studies above.

  1. A newer page with similar structure is published. Freshness alone is enough to displace a 6-month-old article on Perplexity if the new entrant has comparable chunk density and any earned-media signal.
  2. The cited page's "Updated" stamp is not refreshed. Stale timestamps fail the dateModified check; Backlinko (2026) measured a 1.8× citation lift for pages with visible recent update stamps.
  3. An earned-media placement dilutes the original source. If the source statistic gets re-reported with a newer date by a Tier-1 outlet, AI engines preferentially cite the newer syndication.
  4. Model refresh replaces chunks of the index. The Gemini 3 rollout (Jan 27, 2026) replaced roughly 42% of previously cited domains and generated 32% more source URLs per response (ALM Corp analysis). Model updates do what months of organic volatility can't: flip nearly half the citation set in a week.

How fast should you refresh content?

Cadence matters more than depth. The table below shows the refresh cadence ranges Cited uses operationally, synthesized from Seer Interactive, Quattr, Ahrefs, and Backlinko 2026 data.

Content type | Refresh cadence | Why
Trend / market analysis | 30 days | Fastest half-life; current-year claims go stale fastest
News / current-event | 14–30 days | Lead time on AIO citation is ≤14 days before freshness collapses
Data study | Quarterly (re-run) | Substantive diff required: stats must move, not just the date
Listicle / tool roundup | 60 days | Vendor changes + pricing movement
Definitional / evergreen | 90 days (add 1 stat, re-date) | Wikipedia-style lift retained; freshness signal preserved
How-to framework | 90–120 days | Lower volatility in instructional content
Financial / health / news | ≤30 days | Half-life collapse; regulatory + data updates

"Refresh" means substantive diff — a new statistic, a new section, updated examples dated 2026 — not a timestamp change. AI systems increasingly detect superficial re-dating (Claude explicitly filters it; ChatGPT reportedly does). Restamping without changes does less than writing 80 new words and dating the update.
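The cadence table above can be wired straight into a content-ops calendar. A minimal sketch (the content-type keys and cadence values are illustrative, taken from the table; this is not a Cited tool):

```python
from datetime import date, timedelta

# Days between substantive refreshes, per the cadence table (assumed mapping)
REFRESH_CADENCE_DAYS = {
    "trend_analysis": 30,
    "news": 14,
    "data_study": 90,
    "listicle": 60,
    "evergreen": 90,
    "how_to": 120,
    "finance_health": 30,
}

def next_refresh(content_type: str, last_updated: date) -> date:
    """Date the next substantive refresh is due for a page."""
    return last_updated + timedelta(days=REFRESH_CADENCE_DAYS[content_type])

print(next_refresh("listicle", date(2026, 4, 1)))  # due 60 days later
```

The point of scripting it is consistency: a page that misses its cadence window by even a few weeks in a ≤30-day vertical has already fallen out of the citation set.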

Does self-citation drift happen on the same query?

Yes, and it is large. Profound tracked citation sets across 5 consecutive runs of identical prompts on the same engine (2026). Only 20% of brands were present in all 5 runs; 30% appeared in the second run but not the first or third. Growth Memo's 2026 analysis found Google AI Mode self-overlap of just 9.2% across three identical-query repeats, meaning roughly 90% of the citation set changes when the same question is asked the same way three times.

The implication: a single-snapshot citation audit is unreliable. Measure citation share across 3–5 runs per query, minimum, and average. Cited's AI Visibility Audit runs 3 repeats per target query and reports both the citation set and the instability index — the percentage of citations that change between runs.
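An instability index of the kind described above can be computed directly from repeated runs: take the union of every domain cited across the runs, and count how many fail to appear in all of them. A minimal sketch (the exact formula Cited uses is not published; this set-based version is one reasonable definition, and the domains are hypothetical):

```python
def instability_index(runs: list[set[str]]) -> float:
    """Share of all cited domains that do NOT appear in every run.
    `runs` holds one set of cited domains per repeat of the same query."""
    if not runs:
        return 0.0
    union = set().union(*runs)          # everything cited at least once
    stable = set.intersection(*runs)    # cited in every single run
    return 1 - len(stable) / len(union) if union else 0.0

# Three repeats of the same query (hypothetical domains)
runs = [
    {"a.com", "b.com", "c.com"},
    {"a.com", "c.com", "d.com"},
    {"a.com", "c.com", "e.com"},
]
print(f"instability: {instability_index(runs):.0%}")  # only a.com and c.com are stable
```

The complement (1 minus the index) is the stability rate used in the audit steps later in this article; averaging citation share over the same runs gives the flow-metric view of visibility.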

How do you hold a citation position for longer?

Three moves extend citation half-life measurably.

  1. Re-seed the earned-media signal every 60 days. An earned placement (Forbes, TechCrunch, Reuters, or equivalent) decays roughly on a 90-day cycle in retrieval weight. Re-pitch the article's proprietary number to a second outlet 60 days after the first placement to hold the signal.
  2. Compound the entity surface. Ensure the brand appears in Wikidata, LinkedIn, Crunchbase, Google Business Profile with consistent naming. The Ahrefs 75K-brand study (2026) found unlinked mentions correlated with citations at r=0.664 vs. backlinks at r=0.218. Mention density in entity databases buffers against individual-article decay.
  3. Keep the article's chunk count rising. Adding one net new H2 + extractable answer capsule per refresh raises the surface area for extraction and compounds citation probability on long-tail sub-queries. A 10-H2 article refreshed to 12-H2 over two quarters holds citations roughly 30% longer in Cited's internal tracking.

Where this breaks down

Volatility data is heavily sensitive to query mix. Profound's "20% across 5 runs" number was measured on commercial B2B queries; definitional queries ("what is photosynthesis") show far more stability because the underlying answers are stable. A 90-day half-life is directionally correct for competitive commercial queries in 2026; it is misleading for encyclopedic queries, where Wikipedia holds citation share for years. Treat the number as a commercial-vertical baseline, not a universal constant.

The second caveat: volatility is inflated by the model-refresh cadence, which is not under the marketer's control. January's Gemini 3 update replaced 42% of previously cited domains in a week. That kind of step-function change will happen again this year and next. Writing refresh cadence into the marketing operations calendar is necessary but not sufficient — what matters at those moments is that the article already meets the citation signal playbook so it re-enters the candidate set on the next re-rank.

Third, "refresh cadence" is not a tactic in isolation. Without chunk extractability, 19+ inline statistics, schema stacking, and earned-media off-site signal, re-dating an article does nothing measurable. Freshness is a multiplier on a page that already meets the citation criteria; on a page that doesn't, no timestamp rescues it.

What to do next

Audit your top 20 queries across ChatGPT, Perplexity, and Google AI Overviews three times each over the next 72 hours. Record the citation set for every run. Compute the stability rate — the percentage of citations present in all three runs. If it is below 40%, the volatility is eating your visibility faster than your content cadence is replacing it, and the fix is earned-media placement on a quarterly schedule paired with content refreshes every 60–90 days. Cited's AI Visibility Audit runs this exact methodology and reports the instability index against 50 target queries. For the longer pattern on what AI actually cites, see the meta-study on 2,000+ citations.

FAQ

How often do AI answers actually change? 40–60% of domains cited in an AI response are different one month later for the same query (Conductor + Superlines, 2026). Only 30% of brands are visible in two consecutive AI answers; only 20% stay visible across five runs (Profound, 2026).

What is the fastest-moving AI search engine? Perplexity. It weights freshness roughly 40% of its ranking signal vs. Google's 5–10% (Data Studios, 2026). 50% of Perplexity citations come from content published in the current calendar year (Seer Interactive, 2026). Commercial-query citation half-life on Perplexity is roughly 30–45 days.

Does changing the "Updated" date alone help? Marginally. Backlinko (2026) measured a 1.8× citation lift for pages with visible recent update stamps, but AI engines are increasingly detecting superficial re-dating. Claude explicitly filters it; ChatGPT reportedly does. Substantive content diff plus new earned mention is the durable combination.

How often should I refresh content to keep citations? Trend and market analysis: 30 days. News: 14–30 days. Listicles: 60 days. Evergreen / how-to: 90 days with at least one new stat and a visible re-date. Financial, health, and news verticals need ≤30-day cadences because the citation half-life collapses below that in Ahrefs' 2026 breakdown.

Why does the same AI query return different citations when repeated? Self-overlap is only 9.2% on Google AI Mode across three identical-query repeats (Growth Memo, 2026). The retrieval stack uses randomized candidate sampling and stochastic reranking. Single-snapshot audits are unreliable; measure 3–5 runs per query and average.

Which verticals have the longest citation half-life? Travel, lifestyle, and deep-evergreen education (decking, industrial engineering, long-settled scientific definitions). Ahrefs 2026 data shows citation of content 10 to 15 years old in these verticals. The pattern: stable underlying facts buy you citation longevity.


About the author: The Cited Research Team tracks AI citation volatility across ChatGPT, Perplexity, Google AI Overviews, Gemini, and Claude. Cited is a GEO agency that gets brands recommended by AI without touching the client's website. Start with a free AI Visibility Audit.


Want Cited to run the audit for you?

50 target queries, 3 AI engines, competitor gap analysis. 48-hour turnaround. Free.

Get your free audit →