There's an anomaly in X's web traffic data that has been nagging at me for several months now.
According to data from SEMRush, which measures referral traffic from Google, X saw a sudden, massive increase in web referrals in April of last year.
Organic traffic to "www.twitter.com" rose from a little over 1 billion visits per day through April 5th, 2023, to more than 2 billion on April 7th. It has continued to rise since, and now averages around 3 billion per day.
That's a huge increase, and I can find nothing to suggest why Twitter, or X, would have suddenly seen such a big jump in web referrals at that time (note: I asked SEMRush about this, and it said that a change in its reporting methodology correlates with this spike, though it still seems like a big jump).
My first thought was that it could relate to X restricting access to its API (which began at the end of March), which may have prompted a surge of researchers accessing the platform through other means. It could also relate to the rate limits X implemented to stop AI companies from scraping the platform, though those didn't actually begin until July.
Worth noting, also, that "www.x.com" has experienced a similar spike in traffic very recently, up from around 390k organic visits per day.
The Super Bowl was on the 12th, so that's not the cause, and it doesn't align with Valentine's Day either. And the change here is big, a multiple of around 3.4x.
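For rough scale, here's that multiple applied to the baseline figure above (a back-of-the-envelope sketch, since SEMRush's exact post-spike daily count isn't quoted here):

```python
# Rough scale of the x.com spike, using the figures cited above.
baseline_daily_organic = 390_000  # pre-spike organic visits to "www.x.com"
spike_multiple = 3.4              # the observed jump in SEMRush's data

post_spike_estimate = baseline_daily_organic * spike_multiple
print(f"Implied post-spike traffic: ~{post_spike_estimate:,.0f} visits per day")
# => Implied post-spike traffic: ~1,326,000 visits per day
```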
So what's going on? What could be happening inside X's systems, or Google's, to cause such dramatic swings in these numbers?
Perhaps, for x.com at least, the spike is simply an artifact of the company's ongoing effort to cut over to the new domain?
It's also worth noting that other analytics providers haven't recorded these same increases, and actually show X traffic, web and mobile alike, falling year-over-year.
None of these systems has full data access, but they generally provide solid indicators of trends and relative performance. And as noted, SEMRush has explained that it began tracking "SERP Features" in April 2023, which could account for some of the volatility in its figures.
But something still doesn't add up, which has made it hard to get a real read on X's actual performance.
Then today, I saw this report from Mashable, citing data from ad measurement provider CHEQ, which found that nearly 76 percent of the referral traffic driven by X ads over Super Bowl weekend came from sources that CHEQ labeled as likely "fake."
As Mashable notes:
"CHEQ monitors bots and fake users across the internet in order to minimize online ad fraud for its clients.". The company does this by tracking exactly how people coming from disparate sources, like X are interacting with a client page after having clicked one of their links. The company may also notice when a bot is pretending to be some actual user, for instance, an individual committing some form of fraud by purporting to use any particular sort of operating system to browse a website".
Now, while there is methodology at play here, the evidence is also somewhat anecdotal, in that CHEQ can only measure this activity for its own client list. But for comparison's sake, CHEQ reported that the same measurement for last year's Super Bowl found that just 2.8 percent of the traffic coming from X was likely fake, out of 159,000 visits.
For perspective, this year, CHEQ scanned 144,000 visits to its clients' sites that came from X over Super Bowl weekend.
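To put CHEQ's percentages into absolute terms, here's a quick calculation from the reported figures (the visit counts and rates are exactly as cited above; nothing else is assumed):

```python
# CHEQ's reported Super Bowl weekend figures, converted to absolute counts.
visits_2024, fake_rate_2024 = 144_000, 0.76    # ~76% flagged as likely fake
visits_2023, fake_rate_2023 = 159_000, 0.028   # just 2.8% flagged a year earlier

fake_2024 = visits_2024 * fake_rate_2024  # ~109,440 likely-fake visits
fake_2023 = visits_2023 * fake_rate_2023  # ~4,452 likely-fake visits

print(f"2024: ~{fake_2024:,.0f} of {visits_2024:,} X-referred visits flagged")
print(f"2023: ~{fake_2023:,.0f} of {visits_2023:,} X-referred visits flagged")
print(f"Year-over-year jump in flagged visits: ~{fake_2024 / fake_2023:.0f}x")
```

That's a jump from a few thousand suspect visits to more than a hundred thousand in a single year, within CHEQ's client sample at least.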
Could X be inflating its numbers with fake and spam traffic to pump up its reach figures, and thereby induce more ad spend?
That seems unlikely, given the business risk involved. Then again, X owner Elon Musk made fake profiles on the platform a key focus of his effort to get out of buying it, an argument the courts rejected.
In July 2022, when Musk was trying to get out of his $44 billion deal for the app, he argued that the platform wasn't actually worth that price, due to the high number of bot profiles, which Twitter had continually included in its active user figures.
Twitter had long maintained that fake profiles made up less than 5% of its total mDAU count, based on its own sampling. But Musk said the real figure was far higher, with his own team's analysis concluding that around 33% of Twitter's active profiles were likely fake. Musk eventually settled on a more modest 20%, while maintaining that he believed the true number was much higher.
So Musk himself claimed that at least 20% of X's active usage came from bots and spam. Yet since taking over the platform, he's made no mention of this, instead citing "record high" usage figures and declaring victory over bots in the app.
But both of these things can't be true.
For example, if bots really were 20% of Twitter's mDAU count at the time Musk bought it, as Elon claimed, then X would have had to delete approximately 47.6 million bot accounts just to clean house and start fresh, without adding those bot accounts back into its active user count. Yet X's user count has since risen from 238 million daily actives to more than 250 million, a figure Musk first reported in November 2022, just a month after he took control of the app. That would suggest that, accounting for bot removals, X added around 62 million users in that brief period.
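Here's that math laid out. The 238 million and 20% figures are as cited above; the 253 million is my own assumption, reading "more than 250 million" as the roughly 253 million Musk cited in November 2022:

```python
# Bot-removal math from Musk's own claims, per the figures cited above.
mdau_at_purchase = 238_000_000    # Twitter's last reported daily actives
claimed_bot_share = 0.20          # Musk's "at least 20%" bot estimate
mdau_post_takeover = 253_000_000  # assumption: "more than 250 million" read as ~253M

bots_to_remove = mdau_at_purchase * claimed_bot_share         # ~47.6M
real_users_after_cleanup = mdau_at_purchase - bots_to_remove  # ~190.4M
implied_new_users = mdau_post_takeover - real_users_after_cleanup

print(f"Bot accounts to purge: ~{bots_to_remove / 1e6:.1f}M")
print(f"Genuine users left:    ~{real_users_after_cleanup / 1e6:.1f}M")
print(f"New users implied:     ~{implied_new_users / 1e6:.1f}M")  # ~62.6M
```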
The platform had never previously added more than 30 million actives in any 12-month period, so either X saw astronomical growth as a result of Elon taking over, or Elon's bot estimates were wrong, and were likely inflated as part of his bid to wriggle out of the deal.
Or X still has a pile of bots, some 20 percent of its user base, and Musk and Co. have simply opted to keep counting them in its usage stats, just as Twitter management allegedly did before them.
But X also has ad verification partners, which can provide third-party analysis on this front, and should presumably be able to confirm whether or not X ads are reaching real people. And if X were, in fact, padding its numbers with bot traffic, those verification partners would be able to catch that too.
Is that right?
Well, it depends on what, precisely, each of its verification partners is measuring.
Integral Ad Science (IAS) offers brand safety and suitability measurement for X, and it has also provided viewability and invalid traffic verification in the past. It's unclear whether IAS still offers traffic validity measurement on X.
DoubleVerify (DV) also provides brand safety measurement for X, along with fraud and viewability measurement across both display and video campaigns (we've also asked DV for comment on this new report).
As of last June, X was also working with additional measurement providers, including Zefr and Unitary, to offer more control over ad placement, though with no specific mention of audience verification.
So, in essence, X's ad verification partners are focused more on ad placement than on audience verification, and they only measure campaigns for the partners that commission them, not all of X's traffic. As such, it's possible that even they couldn't offer definitive data to settle this question.
So does this mean that X is fudging its numbers to make its performance look better than it actually is?
No, it doesn't. All of these data points could have alternative explanations (SEMRush, as noted, has provided more insight on its end), and because CHEQ only has visibility into a selection of X traffic, it's not clear whether this is happening across the board.
The real insight will come from your own ad performance data, and what you're seeing in your campaigns. If your X insights show a lot of clicks and engagement, but that activity isn't reflected in your own analytics and performance data, then something's probably off, and your X ads may be attracting bot traffic.
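As a rough illustration of that kind of check (the numbers and threshold here are hypothetical placeholders, not from any real campaign or API):

```python
# Hypothetical sanity check: compare the clicks X reports for a campaign
# against the matching sessions your own analytics records. All figures
# and the threshold below are illustrative, not real data.
x_reported_clicks = 10_000   # clicks per the X Ads dashboard (example value)
analytics_sessions = 3_200   # matching landing-page sessions (example value)

discrepancy = 1 - (analytics_sessions / x_reported_clicks)
print(f"Unaccounted-for clicks: {discrepancy:.0%}")  # 68% in this example

# A gap can have benign causes (blocked trackers, quick bounces,
# redirect loss), but a large, persistent one is worth investigating.
if discrepancy > 0.5:  # illustrative threshold; tune to your own baseline
    print("High discrepancy, review traffic quality before scaling spend.")
```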
Essentially, as with everything in digital marketing, results vary from business to business. So if you're still seeing high-quality ad performance on X, it may well be worth staying.
Either way, it's worth keeping a close eye on your X performance metrics, so that you can catch any such anomalies as they arise.