Authenticity Index · SociaVault Labs · ~15 min read

The SociaVault Score: One Number for Influencer Authenticity

The SV-Score is a 0–100 index that measures how genuine an influencer's audience and engagement really are. Built on 120 million+ data points, calibrated against real fraud patterns, and updated continuously.

The Scale

0 · 25 · 50 · 75 · 100
Critical · Weak · Moderate · Strong · Exceptional

6 Signal Categories · Real-Time Scoring · Instagram & TikTok · 120M+ Data Points

What Is the SV-Score?

The SociaVault Score (SV-Score) is a composite authenticity index that rates any influencer account on a scale of 0 to 100. A higher score means a more authentic audience and genuine engagement. A lower score indicates significant signs of fake followers, artificial engagement, or suspicious growth patterns.

Think of it like a credit score — but for influencer trustworthiness. Instead of asking “Does this influencer have fake followers?” you ask “What's their SV-Score?”

The score synthesizes six signal categories into a single number, each weighted based on its predictive power for detecting inauthenticity. The weights were calibrated using data from our Fake Follower Study — the largest independent assessment of influencer fraud, covering 100,000 accounts and 120 million data points.

For Brands & Agencies

Vet influencers before signing deals. An SV-Score of 75+ means you're paying for real reach. Below 50? Walk away.

For Influencers

Prove your audience is real. A high SV-Score is a competitive advantage when negotiating brand partnerships.

For Researchers

A standardized, reproducible metric for academic studies on influencer marketing effectiveness and fraud.

Score Bands

Every SV-Score falls into one of five bands, each with a clear interpretation and recommended action.

90–100 · Exceptional
~8% of accounts

Best-in-class authenticity. Verified organic growth, highly engaged real audience, no detectable anomalies. Top 8% of all accounts.

75–89 · Strong
~22% of accounts

High authenticity with minor anomalies at most. Safe for brand partnerships. Engagement is overwhelmingly genuine.

50–74 · Moderate
~33% of accounts

Mixed signals. Some genuine engagement alongside suspicious patterns. Worth investigating before committing budget. May include accounts that grew fast legitimately but attracted bot attention.

25–49 · Weak
~25% of accounts

Significant authenticity issues. Likely purchased followers, engagement pods, or artificial growth. High risk for brand partnerships.

0–24 · Critical
~12% of accounts

Strong indicators of systematic fraud. Overwhelmingly fake followers, bot-driven engagement, or extreme growth manipulation. Avoid.

Note: Band percentages are based on our dataset of 100,000 accounts. Real-world distribution varies by platform, niche, and follower tier. High-competition niches like Beauty may have a higher proportion of low-scoring accounts.
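The five bands reduce to a simple threshold lookup. This is an illustrative sketch of the mapping, not SociaVault's production code:

```python
def score_band(sv_score: float) -> str:
    """Map a 0-100 SV-Score to its published band label."""
    if not 0 <= sv_score <= 100:
        raise ValueError("SV-Score must be in [0, 100]")
    if sv_score >= 90:
        return "Exceptional"
    if sv_score >= 75:
        return "Strong"
    if sv_score >= 50:
        return "Moderate"
    if sv_score >= 25:
        return "Weak"
    return "Critical"
```

For example, `score_band(87)` returns `"Strong"`.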

The Six Signals

The SV-Score is composed of six weighted signal categories. Each captures a different dimension of authenticity. The weights reflect each signal's predictive power, as validated against known fraudulent and known authentic accounts in our study dataset.

Follower Authenticity (30%)

Are the followers real people? This is the heaviest-weighted signal because it's the most direct measure of fraud.

  • Ratio of real vs. bot/inactive accounts in follower base
  • Follower account age, completeness, and activity patterns
  • Geographic distribution vs. expected audience location
  • Follower-to-following ratio anomalies across the audience

Engagement Quality (25%)

Is the engagement real? Measures whether likes, comments, and shares come from genuine accounts with organic patterns.

  • Like-to-follower ratio compared to niche benchmarks
  • Engagement velocity — how fast engagement accumulates post-publish
  • Proportion of engaged accounts that are themselves authentic
  • Engagement consistency across posts (vs. sudden spikes)
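The like-to-follower comparison in the first bullet is simple arithmetic once post-level metrics are in hand. A sketch, where the niche benchmark would come from SociaVault's baselines (the benchmark value in the usage below is a placeholder, not a published figure):

```python
def engagement_rate(likes: int, comments: int, followers: int) -> float:
    """Per-post engagement as a percentage of followers."""
    return 100.0 * (likes + comments) / followers

def vs_benchmark(rate: float, niche_benchmark: float) -> float:
    """Ratio of observed rate to the niche benchmark; a rate far below 1.0
    on a large following is a classic bought-followers signature (sketch)."""
    return rate / niche_benchmark
```

For a post with 850 likes and 50 comments on a 30K-follower account, `engagement_rate` gives 3.0%; against a hypothetical 4.0% niche benchmark, `vs_benchmark` returns 0.75.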

Comment Quality (20%)

Are the comments meaningful? This signal catches engagement pods, comment bots, and purchased commentary.

  • Semantic relevance — do comments relate to the actual content?
  • Linguistic diversity — unique phrasing vs. copy-paste patterns
  • Comment depth — substantive replies vs. generic emoji responses
  • Commenter authenticity — are commenters themselves real?
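Linguistic diversity, in its simplest form, is the share of distinct comments after light normalization; copy-paste pod comments collapse to a handful of unique strings. A minimal sketch (the production analysis is semantic, not just string matching):

```python
def comment_diversity(comments: list[str]) -> float:
    """Share of comments that remain unique after lowercasing and
    whitespace normalization (illustrative heuristic)."""
    normalized = {" ".join(c.lower().split()) for c in comments}
    return len(normalized) / len(comments)
```

A thread of four comments where two are copy-paste duplicates scores 0.75; pod-driven threads trend much lower.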

Growth Pattern (15%)

How did they grow? Organic growth follows recognizable curves. Purchased followers create detectable spikes.

  • Historical follower growth trajectory analysis
  • Spike detection — sudden jumps inconsistent with content performance
  • Growth-to-content correlation — do follower gains match viral posts?
  • Sustained growth consistency vs. stagnation-spike patterns
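Spike detection of the kind described above is often done with a rolling z-score over daily follower gains. The window and threshold below are illustrative assumptions, not the production values:

```python
from statistics import mean, stdev

def detect_spikes(daily_gains: list[float], window: int = 14,
                  z_threshold: float = 4.0) -> list[int]:
    """Return indices of days whose follower gain is a z-score outlier
    relative to the preceding `window` days (illustrative heuristic)."""
    spikes = []
    for i in range(window, len(daily_gains)):
        baseline = daily_gains[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and (daily_gains[i] - mu) / sigma > z_threshold:
            spikes.append(i)
    return spikes
```

A 20K-follower jump on an account gaining ~100/day flags immediately; steady organic growth produces no flags.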

Audience-Content Alignment (7%)

Does the audience match the content? A fitness influencer whose audience is primarily inactive accounts from a different country is a red flag.

  • Audience demographic alignment with content niche
  • Audience interest overlap with posted content topics
  • Language consistency — content language vs. audience language
  • Geographic plausibility for the creator's content type

Cross-Platform Consistency (3%)

Do their numbers add up across platforms? Wildly different metrics on IG vs. TikTok can signal manipulation on one platform.

  • Follower count ratios across platforms vs. expected norms
  • Engagement rate consistency across active platforms
  • Growth timing — do multiple platforms grow together organically?
  • Content performance correlation across channels

Signal Weights

Follower Authenticity: 30%
Engagement Quality: 25%
Comment Quality: 20%
Growth Pattern: 15%
Audience-Content Alignment: 7%
Cross-Platform Consistency: 3%

Weights sum to 100%. They were derived through logistic regression against a labeled dataset of 5,000 manually verified accounts (2,500 confirmed authentic, 2,500 confirmed fraudulent) and validated against the broader 100,000-account study population.
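The derivation described above can be sketched as a plain gradient-descent logistic regression whose coefficient magnitudes are renormalized into weights summing to 1. Synthetic data stands in for the labeled accounts; this illustrates the shape of the procedure, not the actual pipeline:

```python
import numpy as np

def fit_signal_weights(X: np.ndarray, y: np.ndarray,
                       lr: float = 0.1, steps: int = 2000) -> np.ndarray:
    """Fit logistic regression by gradient descent on log-loss, then
    renormalize coefficient magnitudes into weights that sum to 1."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(steps):
        z = np.clip(X @ w + b, -30, 30)   # clip logits to avoid overflow
        p = 1.0 / (1.0 + np.exp(-z))      # predicted probabilities
        grad_w = X.T @ (p - y) / n        # gradient of mean log-loss
        grad_b = float(np.mean(p - y))
        w -= lr * grad_w
        b -= lr * grad_b
    w_abs = np.abs(w)
    return w_abs / w_abs.sum()
```

On synthetic data where the first signal dominates the label, the first weight comes out largest, mirroring how Follower Authenticity earned the heaviest weight.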

How the Score Is Calculated

A step-by-step breakdown of what happens when an account is scored.

01

Data Collection

We pull the account's public profile data, recent posts, engagement metrics, follower samples, and comment data via the SociaVault API. For cross-platform scoring, we identify linked accounts.

02

Signal Extraction

Each of the six signal categories is analyzed independently. Follower samples are evaluated for bot characteristics. Engagement patterns are compared to niche benchmarks. Comments are run through semantic analysis. Growth curves are modeled.

03

Normalization

Raw signal values are normalized to a 0–100 sub-score per category. Normalization accounts for platform-specific baselines (Instagram norms differ from TikTok) and follower tier expectations (nano accounts naturally have different patterns than mega accounts).
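One simple way to realize this step is percentile ranking against a reference sample drawn from the same platform and follower tier. An illustrative sketch (the production normalization is not published in this detail):

```python
from bisect import bisect_right

def normalize_subscore(raw_value: float, reference_sample: list[float]) -> float:
    """Normalize a raw signal to 0-100 as its percentile rank within a
    platform- and tier-specific reference sample (illustrative approach)."""
    ranked = sorted(reference_sample)
    rank = bisect_right(ranked, raw_value)
    return 100.0 * rank / len(ranked)
```

The same raw engagement figure can thus yield different sub-scores on Instagram versus TikTok, because each is ranked against its own platform's baseline sample.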

04

Weighted Aggregation

The six normalized sub-scores are combined using the published weights: Follower Authenticity (30%), Engagement Quality (25%), Comment Quality (20%), Growth Pattern (15%), Audience-Content Alignment (7%), Cross-Platform Consistency (3%).

05

Calibration

The raw composite score passes through a calibration function trained on our labeled dataset. This ensures that scores are meaningful within each band — a 75 truly represents 'Strong' authenticity, not just a mathematical average.
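A common way to implement such a monotonic mapping is piecewise-linear interpolation through anchor points learned from labeled data. The anchor values in the usage below are invented for illustration:

```python
def calibrate(raw: float, anchors: list[tuple[float, float]]) -> float:
    """Piecewise-linear monotonic mapping from raw composite score to
    calibrated score. `anchors` are (raw, calibrated) pairs that would be
    learned from labeled data; this is a sketch, not the trained function."""
    anchors = sorted(anchors)
    if raw <= anchors[0][0]:
        return anchors[0][1]
    if raw >= anchors[-1][0]:
        return anchors[-1][1]
    for (x0, y0), (x1, y1) in zip(anchors, anchors[1:]):
        if x0 <= raw <= x1:
            return y0 + (y1 - y0) * (raw - x0) / (x1 - x0)
    return raw  # unreachable with sorted anchors
```

With hypothetical anchors [(0, 0), (50, 40), (80, 75), (100, 100)], a raw composite of 65 calibrates to 57.5: the mapping stretches or compresses regions so each band boundary lands where the labeled data says it should.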

06

Output

The final SV-Score (0–100) is returned alongside the six sub-scores, the score band label, and a confidence indicator based on data completeness. Accounts with limited public data receive a lower confidence flag.

Simplified Formula

SV-Score = calibrate(
  0.30 × follower_authenticity +
  0.25 × engagement_quality +
  0.20 × comment_quality +
  0.15 × growth_pattern +
  0.07 × audience_alignment +
  0.03 × cross_platform
)

Where each sub-score is normalized 0–100 per platform and tier, and calibrate() is a monotonic mapping function trained on labeled data.
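The formula translates directly into code. A sketch, with an identity function standing in for the trained calibrate():

```python
WEIGHTS = {
    "follower_authenticity": 0.30,
    "engagement_quality": 0.25,
    "comment_quality": 0.20,
    "growth_pattern": 0.15,
    "audience_alignment": 0.07,
    "cross_platform": 0.03,
}

def sv_score(subscores: dict[str, float], calibrate=lambda raw: raw) -> float:
    """Weighted aggregation of the six normalized sub-scores (0-100 each).
    The identity default stands in for the trained calibrate() mapping,
    which is not public."""
    missing = WEIGHTS.keys() - subscores.keys()
    if missing:
        raise ValueError(f"missing sub-scores: {sorted(missing)}")
    raw = sum(weight * subscores[name] for name, weight in WEIGHTS.items())
    return calibrate(raw)
```

With Account A's sub-scores from the examples below (91, 85, 88, 82, 90, 79), the raw composite comes to 87.12, in line with that account's published score of 87.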

Score Distribution

How 100,000 accounts scored across our study dataset. The distribution reveals that most accounts cluster in the Moderate-to-Strong range, with a significant tail of low-scoring accounts.

Overall Distribution

90–100 Exceptional: 8%
75–89 Strong: 22%
50–74 Moderate: 33%
25–49 Weak: 25%
0–24 Critical: 12%

Key Statistics

Mean SV-Score: 58.3
Median SV-Score: 62.1
Standard Deviation: 21.4
Accounts scoring 75+: 30%

Platform Differences

SV-Scores are normalized per platform, but average scores still differ. TikTok accounts score higher on average because the platform's algorithm-driven discovery model makes follower purchasing less impactful — and therefore less common.

Instagram

Mean SV-Score: 53.7
Median SV-Score: 56.2

Exceptional (90–100): 5.8%
Strong (75–89): 18.4%
Moderate (50–74): 32.6%
Weak (25–49): 28.7%
Critical (0–24): 14.5%

TikTok

Mean SV-Score: 62.9
Median SV-Score: 67.8

Exceptional (90–100): 10.2%
Strong (75–89): 25.6%
Moderate (50–74): 33.4%
Weak (25–49): 21.3%
Critical (0–24): 9.5%

Why TikTok Scores Higher

Algorithm-Driven Discovery

TikTok's For You Page means follower count matters less for reach, reducing the incentive to buy followers.

Younger Ecosystem

The TikTok influencer market is newer. The fake follower marketplace is less mature than Instagram's.

Views-Based Monetization

Brand deals on TikTok increasingly focus on views and engagement rates, not raw follower counts.

SV-Score Benchmarks by Follower Tier

Expected SV-Scores vary by follower tier. Nano influencers consistently score highest; macro influencers score lowest — consistent with our fraud-by-tier findings.

Follower Tier         Mean SV-Score   Median
Nano (1K–10K)         68.2            71.4
Micro (10K–50K)       61.3            64.7
Mid (50K–100K)        55.1            57.8
Macro (100K–500K)     48.6            50.2
Mega (500K+)          52.3            55.1

The Macro Tier Gap

Macro influencers (100K–500K) have the lowest mean SV-Score at 48.6, consistent with their 48.3% fraud rate in our study. This is the tier where buying followers has the highest ROI — crossing the 100K threshold can increase per-post rates by $5,000+.

Real Examples (Anonymized)

To illustrate how the SV-Score works in practice, here are three anonymized accounts from our dataset representing different bands.

Account A

Fitness micro-influencer · Instagram · 34K followers

SV-Score: 87 (Strong)

Follower Auth. 91 · Engagement 85 · Comments 88 · Growth 82 · Alignment 90 · Cross-Platform 79

Steady organic growth over 18 months. High comment relevance (fitness questions, meal prep tips). Audience is 89% US/UK-based, matching content language. No suspicious spikes.

Account B

Lifestyle mid-tier · TikTok · 87K followers

SV-Score: 61 (Moderate)

Follower Auth. 58 · Engagement 72 · Comments 64 · Growth 45 · Alignment 68 · Cross-Platform 55

One viral video caused a 15K follower spike in 48 hours — legitimate, but attracted bot followers as a side effect. Growth pattern sub-score is lower due to the spike. Overall engagement is healthy, but the follower base has ~18% suspicious accounts that followed during the viral period.

Account C

Beauty macro-influencer · Instagram · 240K followers

SV-Score: 22 (Critical)

Follower Auth. 14 · Engagement 28 · Comments 19 · Growth 31 · Alignment 22 · Cross-Platform 35

Three distinct follower purchase events visible in growth history (20K+ spikes with no corresponding content). 62% of comments are generic (“Nice!”, “Love this”, emoji-only). Audience geography is 44% from regions mismatched with English-language beauty content. Like-to-comment ratio is 300:1 (benchmark for niche is 80:1).
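The like-to-comment check at the end is a one-line ratio against the niche benchmark. A sketch using Account C's numbers:

```python
def like_comment_anomaly(likes: int, comments: int, niche_ratio: float) -> float:
    """How many times the account's like-to-comment ratio exceeds the niche
    benchmark; values far above 1.0 suggest bought likes or suppressed
    genuine commenting (illustrative check, not the production signal)."""
    return (likes / max(comments, 1)) / niche_ratio
```

At 30,000 likes to 100 comments against an 80:1 benchmark, the function returns 3.75, i.e. the account runs 3.75× the expected ratio.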

What the SV-Score Is Not

Transparency means being clear about what the score doesn't measure. Misuse of any metric undermines trust.

Not a Content Quality Score

The SV-Score measures audience authenticity, not whether someone's content is good. A mediocre creator with a real audience will outscore a talented creator who bought followers.

Not a Performance Predictor

A high SV-Score means the audience is real — it doesn't guarantee high engagement rates, conversions, or ROI on a specific campaign.

Not an Absolute Fraud Verdict

The score indicates probability, not certainty. Some legitimate accounts may score lower due to factors like viral bot attention or unusual but authentic growth patterns.

Not Static

Scores change over time. An account that cleans up fake followers will see its score improve. An account that buys followers will see it drop.

Recalibration Schedule

Fraud evolves. The SV-Score evolves with it. We recalibrate the model on a fixed schedule to ensure scores remain meaningful as fraud techniques and platform dynamics change.

Quarterly: Full Model Recalibration

Re-train weights and calibration function against an updated labeled dataset. New fraud patterns detected in the previous quarter are incorporated.

Monthly: Benchmark Refresh

Update platform-specific and niche-specific normalization baselines. As platform norms shift (e.g., average engagement rates change), the baselines need to reflect current reality.

Continuous: Signal Monitoring

Real-time monitoring for new fraud patterns, bot behavior changes, or platform API changes that could affect data collection and signal accuracy.

Annually: Public Methodology Review

Full public update of the methodology page, including any weight changes, new signals added, or signals retired. Published as a versioned changelog.

Research Foundation

The SV-Score isn't built on theory. It's calibrated against real-world data from the largest independent influencer fraud study published to date.

The Fake Follower Problem: 2026 State of Influencer Fraud

100,000 accounts · 120 million data points · Instagram & TikTok · 5 follower tiers · 10 content niches · 12-indicator methodology

Overall Fraud Rate: 37.2%
Annual Waste: $4.6B
Macro Tier Fraud: 48.3%
Detection Accuracy: 87%


Manually labeled training accounts: 5,000 (2,500 authentic + 2,500 fraudulent)
Classification accuracy on held-out test set: 92.4% (SV-Score ≥ 75 vs. labeled authentic)
AUC-ROC for binary classification (authentic vs. fraudulent): 0.89

Frequently Asked Questions

Can an influencer game their SV-Score?

Not easily. The score uses six independent signal categories, each drawing from different data sources. Gaming one dimension (e.g., buying real-looking followers) typically creates anomalies in other dimensions (engagement patterns, growth curves). As fraud techniques evolve, our quarterly recalibration process adapts the model to detect new patterns.

How often is a score updated?

Scores are computed on demand using the latest available data. Each API request fetches current profile data and recalculates the score in real time. Historical scores are snapshots — they reflect the account's state at the time of calculation.

Why only Instagram and TikTok?

Instagram and TikTok are the two largest influencer marketing platforms and the platforms with the most developed fake follower ecosystems. We're expanding to YouTube, LinkedIn, and Twitter/X scores as our labeled training datasets for those platforms reach sufficient size.

What if a legitimate account scores low?

This can happen — especially for accounts that had a viral moment and attracted bot followers passively, or accounts in high-fraud niches. The score reflects what the data shows; it doesn't make a moral judgment. We recommend looking at the sub-scores to understand which dimension is pulling the score down.

How is this different from other fake follower tools?

Most tools give you a binary yes/no or a percentage of fake followers. The SV-Score is a composite index across six dimensions, calibrated against a labeled dataset of 5,000 accounts, and normalized by platform, tier, and niche. It's not just counting bots — it's measuring overall authenticity holistically.

Is the methodology peer-reviewed?

The SV-Score methodology is published transparently on this page for public scrutiny. We have submitted our Fake Follower Study to academic pre-print servers and are pursuing formal peer review. Our goal is full transparency — you can evaluate the methodology yourself.

What data do you collect?

Only publicly available data. We analyze public profile information, public post metrics, public comments, and publicly available follower lists. We never access private data, DMs, or any information behind login walls.

Can I check my own SV-Score?

Yes — the SV-Score will be available through the SociaVault API. We're also building a free lookup tool on this page where you can enter any public handle and see their score.

Stop Guessing. Start Measuring.

The SV-Score gives you a single, transparent number to evaluate influencer authenticity. Built on real data, updated continuously, and designed to become the industry standard.