Friday, June 30, 2023

Attacking the economics of scams and misinformation

We see so many scams and so much misinformation on the internet because it is profitable.

It's cheap to create bogus accounts. It's cheap to use hordes of those accounts to shill your scams and feign popularity. Posting false customer reviews can easily make crap look trustworthy and useful.

Bad actors are even more effective when they manipulate ranking algorithms. When fake crowds of bogus accounts like, share, and click on content, algorithms that use customer behavior -- such as trending, search, and recommenders -- think crap is genuinely popular and show it to even more people.
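
To make that concrete, here's a minimal sketch (in Python, with entirely hypothetical items and accounts) of how a ranker that naively counts behavior can be gamed: a horde of cheap bot accounts clicking on a scam makes it outrank genuinely popular content.

    # Minimal sketch of a popularity-based ranker being gamed by fake
    # engagement. Items, accounts, and counts are all hypothetical.

    from collections import Counter

    clicks = Counter()

    # Genuine engagement: a few distinct, real users.
    for user, item in [("alice", "good_article"),
                       ("bob", "good_article"),
                       ("carol", "good_article")]:
        clicks[item] += 1

    # Fake engagement: a horde of cheap bot accounts clicking the scam.
    for bot in range(100):
        clicks["scam_listing"] += 1

    # A ranker that trusts raw behavior now ranks the scam first.
    print([item for item, _ in clicks.most_common()])
    # ['scam_listing', 'good_article']

Real trending, search, and recommender systems are far more sophisticated than this, but the underlying vulnerability is the same: if behavior is cheap to fake, behavior-based popularity is cheap to fake too.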

Today, the FTC announced rules that change the game: "Federal Trade Commission Announces Proposed Rule Banning Fake Reviews and Testimonials."

These rules make it much riskier and more costly to post fake reviews of your products. By some estimates, about a third of customer reviews are fake!

These new rules also make it much more costly to manipulate social media using fake accounts and shills. From the FTC: "Businesses would be prohibited from selling false indicators of social media influence, like fake followers or views. The proposed rule also would bar anyone from buying such indicators to misrepresent their importance for a commercial purpose."

What's important is that this changes the economics of spam for the bad guys. Before, faking and shilling was nearly free advertising and often profitable. Now businesses that shill and use fake followers face the risk of much higher costs, likely tipping the balance and making a lot of scams unprofitable.

A paper from a decade ago, "The Economics of Spam", describes how difficult it already is to make money as an email spammer. Then it brilliantly anticipates interventions like the FTC's recent action, saying, "The most promising economic interventions are those that raise the cost of doing business for the spammers, which would cut into their margins and make many campaigns unprofitable."
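
As a back-of-the-envelope illustration of that point (all numbers hypothetical, not figures from the paper), spam lives on razor-thin margins, so even a modest increase in per-message cost flips a campaign from profitable to unprofitable:

    # Back-of-the-envelope spam economics. All numbers are hypothetical
    # illustrations, not figures from "The Economics of Spam".

    messages = 1_000_000
    conversion_rate = 0.00001        # 1 sale per 100,000 messages
    revenue_per_sale = 50.0

    revenue = messages * conversion_rate * revenue_per_sale  # $500

    for cost_per_message in (0.0001, 0.001):
        cost = messages * cost_per_message
        print(f"cost/message ${cost_per_message}: profit ${revenue - cost:,.0f}")

    # cost/message $0.0001: profit $400   -- profitable
    # cost/message $0.001: profit $-500   -- a 10x cost increase kills it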

Riskier and less profitable means less of it. This action from the FTC is great news for anyone who uses the internet.

Tuesday, June 13, 2023

Optimizing for the wrong thing

Many companies that think of themselves as data-driven underestimate how easy it is for metrics to go terribly wrong.

Take a simple example. Imagine an executive who will be bonused and promoted if they increase advertising revenue next quarter.

The easiest way for this exec to get their payday is to put a lot more ads in the product. That annoys customers over time, producing a short-term lift in revenue but a long-term decline for the company.
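
A toy model (with made-up rates, purely for illustration) shows how this plays out: the ad-heavy version of the product wins for exactly the few quarters the exec is measured on, then falls behind as churn compounds.

    # Toy model of "more ads now, fewer customers later". All rates are
    # hypothetical illustrations, not real data.

    def revenue_by_quarter(ads_per_visit, quarterly_churn, quarters=8,
                           customers=1_000_000, revenue_per_ad=0.001):
        revenue = []
        for _ in range(quarters):
            revenue.append(customers * ads_per_visit * revenue_per_ad)
            customers *= 1 - quarterly_churn  # annoyed customers leave
        return revenue

    light = revenue_by_quarter(ads_per_visit=2, quarterly_churn=0.02)
    heavy = revenue_by_quarter(ads_per_visit=6, quarterly_churn=0.25)

    for q, (lo, hi) in enumerate(zip(light, heavy), start=1):
        winner = "heavy ads" if hi > lo else "light ads"
        print(f"Q{q}: light ${lo:,.0f}  heavy ${hi:,.0f}  -> {winner}")

    # Heavy ads win Q1-Q5 -- the window the exec is bonused on -- then
    # fall behind for good as churn compounds.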

By the time those costs show up, that exec is out the door, on to the next job. Even if they stay at the company, it's hard to prove that the increased ads caused a broad decline in customer growth and satisfaction, so the exec gets away with it.

It's not hard for A/B-tested algorithms to go terribly wrong too. If the algorithms are optimized over time for clicks, engagement, or immediate revenue, they'll eventually favor scams, walls of ads, deceptive ads, and propaganda, because that's the content that maximizes those metrics.
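
Here's a small sketch of that gap (items and rates are hypothetical): the experiment that optimizes for clicks picks a different winner than one that optimizes for what the company actually cares about.

    # Sketch of the gap between what a click-optimizing experiment measures
    # and the company's real goals. Items and rates are hypothetical.

    items = [
        # (name, click rate, long-term retention effect per impression)
        ("helpful_review",    0.02, +0.00010),
        ("deceptive_ad",      0.08, -0.00040),
        ("outrage_clickbait", 0.12, -0.00025),
    ]

    by_clicks = max(items, key=lambda item: item[1])
    by_retention = max(items, key=lambda item: item[2])

    print("Click-optimized winner:    ", by_clicks[0])     # outrage_clickbait
    print("Retention-optimized winner:", by_retention[0])  # helpful_review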

If your goal metrics aren't the actual goals of the company -- which should be long-term customer growth, satisfaction, and retention -- then you can easily make ML algorithms optimize for things that hurt your customers and the company.

Data-driven organizations and A/B testing are great, but they create serious problems if the metrics aren't well-aligned with the long-term success of the company. Lazily picking the metrics you use to measure teams is likely to cause high costs and decline later.