This article is part of the On Tech newsletter. You can sign up here to receive it on weekdays.
This week, Amazon acknowledged the obvious: it has a problem with fake reviews.
The problem is that Amazon pointed the finger at almost everyone involved in unreliable ratings, and not nearly enough at the company itself. Amazon criticized Facebook, but did not acknowledge that the two companies share an underlying problem that could erode people’s confidence in their services: an inability to effectively police their vast sites.
Learning from the masses is a promise of the digital age that has not fully panned out. It can be great to weigh others’ feedback before we buy a product, book a hotel or visit a doctor. But it is so common and profitable for companies and services to pay for or otherwise manipulate ratings on all sorts of sites that it’s hard to trust anything we see.
The persistence of fake reviews raises two big questions for Amazon: How much attention does it really pay to stopping fraudulent customer feedback? And would shoppers be better off if Amazon reconsidered its essence as an anything-goes online bazaar?
Amazon’s rules prohibit businesses from offering people money or other incentives for reviews. Amazon says it catches most false ratings and works to stay ahead of the offenders. Still, the global industry of ratings fraud operates openly on Amazon, and everyone knows it.
Amazon appears to have been spurred to take action against manipulated ratings by the Federal Trade Commission, according to Vox’s Recode publication, and by journalists.
After a Wall Street Journal columnist wrote this week about buying a RAVPower electric charger that came with the offer of a $35 gift card in exchange for a review, the seller said Thursday that it had been banned from Amazon. (The statement is in Chinese, and I read it via Google Translate.) This follows bans of several other major sellers that had apparently been buying reviews for years.
If government watchdogs and newspaper columnists can spot sellers openly manipulating reviews, how hard is the company looking for them?
Maybe you think that’s just how the world works: caveat emptor. If I read ratings of products on Amazon or of doctors on Zocdoc, the feedback is helpful, but I take it with a grain of salt.
But unfortunately, many people are harmed by fake reviews, and they are not always easy to spot. The Washington Post recently wrote about a family that was fooled by purchased Google ratings for an alcohol addiction treatment center. I wrote last year about research that found Amazon does catch a lot of purchased reviews, but often only months later, after buyers showed signs of being misled into purchasing a product.
I wish Amazon would take more responsibility for the problem. In its statement this week, the company blamed social media companies and lax enforcement by regulators for false reviews. Amazon has a point. Fraudulent online ratings are a big business with many enablers. Facebook and China’s WeChat app do not do enough to shut down forums where companies coordinate review manipulation.
But Amazon hasn’t said much about what it can do differently. For example, University of California researchers I spoke to last fall found that purchased reviews were far more common among China-based sellers and for product categories in which many merchants offered nearly identical items. Maybe that means Amazon needs to better police sellers in China? Or that it would help to limit the number of sellers offering the same product?
Strong reviews also help sellers appear prominently when we search for products on Amazon, which creates a huge financial incentive to cheat. Should Amazon reconsider how it weights ratings in its search results? The company did not say.
It is particularly disappointing that Amazon does not acknowledge that fake reviews are partly the result of its choice to prioritize quantity over quality.
People can buy almost anything on Amazon, from almost any seller. That may be good for shoppers, but it comes with compromises. Being an all-purpose store – and one that tries to operate with as little human intervention as possible – makes it harder for Amazon to root out counterfeit or dangerous products and purchased reviews.
Before we go …
No more “speed filter”: NPR reports that Snapchat will phase out an app feature that lets people record and share how fast they are moving. Traffic safety advocates say the feature has for years encouraged young people to drive recklessly for bragging rights.
Using WhatsApp to debunk myths: During the pandemic, health workers in rural India used WhatsApp to counter false information about the virus, The Verge reports. It takes health workers a lot of time to check information on the app, but the online messages, along with in-person conversations, are keeping many people safe.
LOOK AT THE GIANT BUNNY: My colleague Amanda Hess chatted with people who post online videos of their numerous and exotic animals. The niche, called Pet Tube, caters to our love of sight gags like a heap of snakes swaying on a piano, but these people also love animals – “even potentially riotous swarms of animals,” Amanda wrote.
Hugs to this
A baby seal tests the water. The little one quickly goes from unsure to cheerful.
We want to hear from you. Tell us what you think of this newsletter and what else you would like us to explore. You can reach us at [email protected]
If you have not yet received this newsletter in your inbox, please sign up here. You can also read past On Tech columns.