Looking at the recent traffic data for Findory.com, I was surprised to see traffic spiking.
In fact, including all traffic, Findory.com is up to 26M page views per month, about 10 page views per second on average.
That's odd, I thought. Findory's advertising revenue and third party analyses from sites like Alexa both show slow but steady declines at Findory.com, not a traffic spike. What is keeping Findory's web servers so darn busy?
Turns out that the vast majority (in excess of 95%) of these page views are various forms of robots, mostly hitting Findory's free APIs.
Those page views are not people. They generate no revenue directly. They have little to no value to Findory.
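Findory never published how it separated robot traffic from human traffic, but the usual approach (and the one tools like awstats take) is to match the request's user agent against a list of known bot tokens. A minimal sketch, assuming a hypothetical token list and simple substring matching:

```python
# Hypothetical list of substrings that mark a user agent as a robot.
# Real tools like awstats ship much longer lists.
BOT_TOKENS = ("bot", "crawler", "spider", "slurp", "curl", "wget")

def is_robot(user_agent: str) -> bool:
    """Flag a request as robot traffic if its user agent contains a bot token."""
    ua = user_agent.lower()
    return any(token in ua for token in BOT_TOKENS)

def robot_share(user_agents) -> float:
    """Fraction of requests whose user agent self-identifies as a robot."""
    if not user_agents:
        return 0.0
    return sum(1 for ua in user_agents if is_robot(ua)) / len(user_agents)

sample = [
    "Mozilla/5.0 (Windows NT 5.1) Firefox/2.0",
    "Googlebot/2.1 (+http://www.google.com/bot.html)",
    "libwww-perl/5.805",  # scripted client, slips past the tokens above
    "curl/7.15.5",
]
print(robot_share(sample))  # flags 2 of 4 in this sample
```

Note the obvious limitation: this only catches robots polite enough to identify themselves. Scripted clients that spoof a browser user agent count as people under this scheme, so a figure like 95% robots is, if anything, a lower bound.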
In fact, I suspect that most of these API accesses are being used for various forms of weblog spam. For example, I suspect some are accessing Findory content, stripping all the links out, then placing AdSense ads or link farm links next to that content. Ah, spam, wonderful spam.
I never have been particularly idealistic when it comes to APIs. I tend to take a cynical view on the motivations of companies that offer free APIs.
I also have suspected that most people using APIs seek short term profits, not innovation or building something substantial. While it is just one data point, Findory's experience appears to confirm that view.
3 comments:
You're still measuring hits, not pageviews, right?
What are these numbers excluding robots?
Hi, Kevin. This time it is page views, not hits, though the overall pattern is the same using hits.
Identifiable robots are roughly 60-70% of traffic according to awstats. But, I should point out, plenty of robots may not be nice enough to identify themselves.
Yeah, careful with those APIs. My Simpy [1] has a nice and free REST-like API with a bunch of open-source clients available to talk to it, but behind the scenes there are monitors that prevent abuse. Without those monitors... oh la la...
[1] http://simpy.com/ (surprise!)