So here's some perspective from an Amazon seller doing 7 figures+ annually.
- On average, only 1-3% of customers review products.
- Each review is worth a lot of money, often a multiple of the product's price, especially if you're just starting out.
- Each category on Amazon has its own average rating. For example, electronics typically have lower ratings because more things can go wrong and there are more usability issues than with something like kitchenware, where fewer things fail outright.
- If you play in a category with a certain failure rate, it is absolutely essential that you do everything you can to mitigate bad reviews as enough of them will sink your business, even if you have a great product.
- It takes 8+ 5-star reviews to counteract a 1-star review if you want to maintain a 4.5-star average, which is the bar for a good product. This is extremely hard to do without manipulation.
- People who complain about fake reviews are only seeing half the problem, the other half is that legit businesses who do it the fair way can't compete. How do you launch a great product on Amazon with 0 reviews? Hope that 500 people buy it to maybe get 5 reviews? Alternatively you spend thousands on product ads hoping that enough people buy... or just succumb to the dark side and pay for reviews which is WAY cheaper.
- If you hate Amazon reviews, do your part and start reviewing the good products on Amazon. It is worth more to the seller than you think!
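For anyone who wants to sanity-check the 8+ figure a few bullets up, the arithmetic works out; here's a quick sketch, using a plain average rather than whatever weighting Amazon actually applies:

```python
def five_star_reviews_needed(target: float = 4.5) -> int:
    """Smallest n such that one 1-star review plus n 5-star reviews
    averages at least `target` (simple mean, not Amazon's weighted one)."""
    n = 0
    while (1 + 5 * n) / (n + 1) < target:
        n += 1
    return n

# Seven 5-star reviews bring the simple average back to exactly 4.5;
# staying strictly above 4.5 takes eight or more.
print(five_star_reviews_needed())      # 7
print(five_star_reviews_needed(4.51))  # 8
```

So one disappointed customer really does cost you roughly eight delighted ones, before any weighting Amazon applies on top.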
To me, a five star review means that a product went above and beyond my expectations in some extraordinary way. I have bought products that fit that description, but not many.
Everything else, I'd give either 4 stars or a still-very-satisfactory 3 stars.
The problem is, I know these 4- and 3-star reviews actually hurt sellers, which isn't my intention at all. So I just don't leave feedback.
It was here on HN a number of months ago that I learned that in the Uber world, 3 stars does not, in fact, mean "acceptable". That knowledge altered my view of all reviews, including Amazon's, in two ways:
1) Like you, it means that I'm no longer willing to give reviews/ratings. If there is no consensus on what the different numbers of stars mean, then I can't be at all comfortable that my rating will indicate to others what I intended to indicate.
2) It means that I no longer put any weight whatsoever on ratings I see from others, for the exact same reason: I can't know that what I think the rating means is at all what the rater intended it to mean.
That 3 and 4 star ratings hurt Amazon sellers underlines this problem.
Knowing how the system works and that 5 stars is the expectation, the way I look at Uber ratings is that all drivers start with 5 stars. A good-enough journey keeps that, but stars come off for things like a dirty car or dangerous driving. A driver starts at 5 stars rather than having to earn them.
I get why this is, but it's really never been this way. Even on early eBay, if a transaction simply occurred without errors it was pure etiquette to rate them a full 5 with a comment along the lines of “A++++ SELLER! SHIPPED ON TIME, GOT EXACTLY WHAT WAS LISTED!”
Really, the same thing goes for Amazon and Uber, etc.: the etiquette is to start at a full rating by default and deduct based on what goes wrong. You can’t “earn” extra stars... if I order a box of batteries, and they arrive on time and the box is full, that’s a perfect transaction. 5/5, no problem. You can’t realistically expect extra batteries, or for the batteries to perform above their rating. They’re batteries.
Not sure I understand your problem. What prevents you from simply changing your interpretation of review scores so that it will better fit the expectations of others, instead of avoiding leaving helpful reviews altogether?
I totally agree. For five stars it would have to be exceptionally good. Something I'm very satisfied with I might give four stars, which ironically drags the score down.
I recently purchased an alarm clock. When I put the clock in my bedroom, the light from the display was so bright it kept me awake. I went back to the item on Amazon and read all the reviews. One person mentioned the same issue yet still gave it four stars!
i believe that's the reasoning for why many systems have devolved to boolean choices rather than multi-tier (distortions in voting cause misrepresentations of the customer population's sentiments).
but boolean choices are really 3-tier systems where it's assumed that only the extremely satisfied or dissatisfied customer will vote (up or down), and the lukewarm/indifferent customer is assumed not to vote. however, that assumption very likely misrepresents the sentiments of the (majority) non-voters and thus the population as a whole.
you might address this by moving to a 3-tier system: (1) unsatisfied/bad, (2) acceptable/fine, (3) exceeded expectations/great to more accurately differentiate the non-voting/indifferent customers, but non-voters would have no incentive to suddenly voice their opinions and make the system more accurate.
you might be able to counteract that impulse by incentivizing customers to vote on every product/service delivery event (like earning points for future discounts) to lower response bias. you could also do a separate study to see how the voter/non-voter population differ, and adjust the boolean ratings accordingly.
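you can see the distortion with made-up numbers; the sentiment distribution below is invented purely for illustration:

```python
# Hypothetical sentiment of 100 customers on a 1-5 scale: mostly lukewarm.
sentiment_counts = {1: 5, 2: 10, 3: 50, 4: 25, 5: 10}

# Boolean up/down system, assuming only the extremes (1s and 5s) vote.
up, down = sentiment_counts[5], sentiment_counts[1]
upvote_share = up / (up + down)

# Ground truth: share of customers who found the product at least acceptable.
satisfied_share = sum(n for s, n in sentiment_counts.items() if s >= 3) / 100

print(f"voters' up-vote share: {upvote_share:.0%}")    # 67%
print(f"actually satisfied:    {satisfied_share:.0%}")  # 85%
```

with these (invented) numbers, the boolean score reads far more negative than the population actually feels, which is exactly the misrepresentation described above.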
> or just succumb to the dark side and pay for reviews which is WAY cheaper.
I thought this was what the Amazon Vine program was for -- a legitimate way for retailers to provide free products to reviewers in return for a review, and those reviews are clearly labeled as such.
Seems to work pretty well - I've bought some products that only had Vine reviews, and I've agreed with the reviewer.
So much so that if you leave a 1-star review, there's a good chance you'll eventually get an email asking you to remove the 1-star review in exchange for a refund, a giftcard worth the price of the product, etc.
It encourages bad behavior though; you can basically gamble on which sellers will send you that email and thus give you the product for free.
I once bought an O-ring for my blender. It's literally a 1cm-wide circle of rubber with a diameter of about 2 inches. It has 1 job - keep liquid from leaking out the bottom of the blender.
The day after I received it, the company sent me an email asking me to please please please rate my new product!
Fine! I gave it 3 stars and wrote, "It's an O-ring. It does exactly what I expected it to and nothing more."
The next day I get an email saying, "We see you gave our product a 3-star review. What can we do to improve our product? What didn't you like about our product?"
WTF? It's a goddamn O-ring! There's nothing to review beyond "it works" or "it doesn't." What the heck do you want me to say about it? No O-ring is ever going to be 5 stars. Sorry! That's just the nature of the product.
At this point, it's just harassment. Stop begging for my approval, and especially when I give you my opinion, please don't question me about why. I explained it in the review.
> there's a good chance you'll eventually get an email asking you to remove the 1-star review in exchange
Wow. I've never given an Amazon product a 1 star review, but if I did, and I got such an email, I would absolutely update my review to warn everyone that this is happening, and that the real average review for the product is likely to be lower than is shown.
Isn’t it more of business problem that someone is hoping to stand out and become successful by selling on Amazon or any of the big platforms without creating an “unfair advantage”?
I don’t mean by gaming the system or doing something unethical, I mean in the classic business sense of creating a differentiator or in MBA speak thinking about Porter’s Five Forces.
It seems that Amazon stores (or some franchise) could act as a good proxy for launching / improving these products - imagine running a store, and lining up products to be "test driven" in store, with all the drivers licenses and check ups one might want. I can happily imagine trying out a fairly wide range of electronics - others may prefer shoes on Thursday or Fly traps on Friday.
TBH, retail is supposed to be becoming "experience"-based, so this might be a runner. If someone tells me Jeff's mobile number I will persuade him.
Upwork, Fiverr, and Freelancer also have jobs posted which are for fake reviews, even though the job description is cleverly disguised as a blog post or some other innocent thing.
In my experience, making reviews easier to give causes the review quality and usefulness to go down. This happened when Netflix went from 5 stars to a simple like/dislike. I’m not sure why Amazon didn’t just block non-verified-purchaser reviews, increasing spammer costs significantly.
I suspect that Amazon reviews are going to be even less useful now. Especially given things like this:
> Amazon does not provide many specifics about how a product’s overall star rating is calculated, other than stating that it is not a simple average but instead uses “machine-learned models” that take into account factors such as how recent the rating or review is and whether it was a verified purchase or not.
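Just to illustrate how far a "machine-learned" aggregate can drift from a simple mean, here's a toy weighted average. The weights (a verified-purchase boost and a one-year recency half-life) are completely made up, since Amazon doesn't disclose its model:

```python
def weighted_rating(reviews) -> float:
    """Toy non-simple average: weight each (stars, age_days, verified)
    review by recency and verified-purchase status. Weights are invented
    for illustration; Amazon's actual model is not public."""
    num = den = 0.0
    for stars, age_days, verified in reviews:
        weight = (1.5 if verified else 1.0) * 0.5 ** (age_days / 365)
        num += weight * stars
        den += weight
    return num / den

reviews = [(5, 30, True), (1, 800, False), (4, 200, True)]
simple_mean = sum(s for s, _, _ in reviews) / len(reviews)  # about 3.33
weighted = weighted_rating(reviews)  # noticeably higher: the old 1-star fades
```

With invented weights like these, one stale unverified 1-star review barely dents the score, which is why a displayed rating can say little about what buyers actually reported.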
I'm pretty sure that a ton of the review fraud on Amazon comes through verified purchases. In the best case, you refund/pay people to buy the product on Amazon and leave a review; in the worst case, you operate accounts buying your own stuff, flow the inventory back, and basically pay an Amazon tax for leaving good reviews.
IMO this is pretty solvable by looking at an account's purchase history too, but I don't think it's just as simple as blocking non-verified-purchase reviews.
And while you're dreaming that purchase history solves that, the scam world has long moved on.
Currently, companies pay people to buy items. They can keep those items, they just need to leave a good review. There are intermediaries who handle lots of sellers, so people buy a mixed bag of random garbage in exchange for the occasional review.
Yes, you can probably test for statistical anomalies, but I'm willing to bet that's quickly countered too - just have people buy occasional legit items so their profile is "statistically normal".
As far as I can tell, Amazon tries to fight that by keeping their ML model secret so scammers don't learn too quickly, but essentially, they're currently finding out what the Internet learned about SEOs manipulating search results.
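To make the arms race concrete, here's a toy version of such an anomaly check and the padding counter; the heuristic is invented for illustration and is nothing like Amazon's real signals:

```python
def five_star_share(ratings: list[int]) -> float:
    """Crude suspicion heuristic: the fraction of an account's reviews
    that are 5-star. Real detectors would combine many more signals
    (timing, product overlap across accounts, review text, etc.)."""
    return ratings.count(5) / len(ratings) if ratings else 0.0

paid_account = [5] * 8
print(five_star_share(paid_account))    # 1.0 -- flags easily

# The counter described above: pad the profile with a few legit purchases.
padded_account = [5] * 8 + [4, 3, 5, 2]
print(five_star_share(padded_account))  # 0.75 -- much less conspicuous
```

Every such threshold the detector publishes (or lets scammers infer) just becomes the new target to pad toward, which is the SEO dynamic in a nutshell.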
Yes, I've seen articles where people who do this say there are a bunch of Facebook groups for exactly this purpose. Manufacturers/sellers ask people on the group to buy their goods and write good reviews. Once they get proof of the purchase/review, they PayPal (or whatever) money to the "reviewer." I remember an article where some woman's house was overflowing with junk she didn't want because she was writing so many of these fake reviews. I would think Amazon could apply their awesome machine learning to figure out which accounts are pumping out these suspect reviews, and perhaps de-prioritize, rate-limit, or delay publishing their reviews. I'd think that would do a lot to alleviate the problem.
I suspect solving it is hard to impossible. You can flag anything as suspicious at a glance, but moderation at scale fails. You personally can decide "this review is bullshit" and block it without issue, but once you start flagging, a false positive (or even a true positive by a manipulator looking to start shit) can cause considerable backlash as a futile attempt at pleasing everyone is made. Transparent and consistent rules invite gaming, opaque ones invite accusations of malfeasance, and the uncertainty promotes bad feelings and bad behavior.
Worse still, even taking the stance of not taking a stance, because you know it cannot be done, will bring backlash, as there are many who demand you take their stance; even allowing easy distribution of self-chosen block and filter lists is not enough. There are many who demand the appliance experience.
There's also low-scale review fraud. My sister bought a pet cam and gave it a 3-star review, and the seller made escalating offers for her to remove or raise the rating: a full refund, then another camera, then a refund plus a free camera, then a refund plus money for the inconvenience.
Perhaps they could limit reviews of refunded products to 3 stars or below? Really, there isn't a good reason to positively review something and then return it; if the return was your fault, e.g. it didn't fit, you could just not review it.
> I’m not sure why Amazon didn’t just block non-verified-purchaser reviews, increasing spammer costs significantly.
I don't think it'd do much good. There are already large groups that subsidize verified purchases to get 5-star reviews. I read an article about it and confirmed it with some investigation of my own on Facebook.
Basically someone runs a Facebook group where sellers advertise free product, promising Paypal reimbursement of the purchase price in exchange for reviews. There are at least (or were, I haven't checked recently) hundreds of Facebook groups across many languages with thousands of members each doing these activities. The reviewers are randos who like free stuff, and I don't think anyone could detect them if they only casually participate in the review scams.
I'd rather look at the content of the reviews than their ratings; there have been times when someone left what looks like an honest review and it's 4/5 but they only write positive things. I don't know what the solution is, but I definitely think more thorough reviews would help.
If you return a bad product, try and get a replacement to see if it was just bad luck, but also please do write a detailed review.
If you get a good product and it barely has reviews please review it.
> someone left what looks like an honest review and it's 4 / 5 but they only write positive things
That may well be me. I don't think that "meets requirements" should merit a 5. When I do reviews, I view a bare "meets requirements" as 3 or maybe 4, depending on the type of product. I want to leave some headroom to be able to point out products that really do excel.
Another ambiguity is whether the rating is on an absolute scale, or normalized for value (it's not a perfect product, but it's super-cheap).
I think that on Netflix's case they are not going for reviews. They want to know what you like, to suggest something you'll probably like too. Doesn't matter if it's 4 or 5 stars.
> I’m not sure why Amazon didn’t just block non-verified-purchaser reviews, increasing spammer costs significantly.
I'm not sure this would help much. I've been asked by vendors to leave a review and if it's 5 stars, I will get something in return. I think for some people this would be a "well, nobody is getting hurt, right?" decision and they'd just do it.
The (edit: average) Amazon star rating has been such a poor indication of the quality of a product for a very long time - at best, it's a weak indicator of which products you might check out first.
For me personally, the most valuable bits of feedback are the negative (edit: 1- and 2-star written) reviews - they are pretty much the only review content I trust. I'm looking for patterns of issues that multiple reviewers raise about the product.
The positive reviews have so little value when anyone can post a review. So many shallow positive reviews from unverified 'buyers'.
I think we are in agreement - I didn't word my comment well.
When I was referring to the star rating, I meant the overall average rating of the product (ie what Amazon is trying to promote now over star rating plus written reviews).
When I mentioned the negative reviews, I was referring to the 1 and 2-star written reviews. I use the negative written reviews exactly how you described - and it's what I find valuable.
A useful heuristic that I also use, but it's good to be aware that fake reviews go in the other direction, too, of sabotaging competitors' product reviews.
plus with some of the low star reviews you can compare what they stated was wrong with your own expectations of what could go wrong with the product.
the only two considerations for me are the number of reviews and the quality of the low-star reviews. the dates of reviews are very useful as well; if a product doesn't have many recent good reviews, it can offset the number of reviews in my view
The smart sellers realize this as well. Often the most "useful" review is also fake. Let's just create a 2 or 3 star review and list out the pros and (fake) cons and get people to upvote it.
The downside of this approach is for sellers to use this as a weapon. Pay people to buy a competitor's product and give it a nasty 1 star review. It becomes an arms race.
It is also super annoying when there are reviews like "came in three days not two. 1 STAR!!!" that are not product-related. This doesn't even get into the commingling of products, which is the real problem.
This may help a bit with fake reviews, but it amplifies the other substantial problem -- authentic but untrustworthy reviews. I see two types of those all the time:
1) Reviews by people who have not used the product. The tipoff for this is when the customer says something like "I just got this, it looks great! I can't wait to use it." If someone hasn't even used it, then they can't possibly give a useful review of any sort.
2) Reviews in product listings that contain multiple products, or products that have changed since the original listing. The tipoff for this is comments describing a product that is clearly not the one being listed, or is one of several different products in the same listing.
If there is an increase in reviews that are just stars without comment, it becomes impossible to root these out at all. I don't see how this would lead to more reliable reviews.
Amazon has been my wheelhouse, full-time, for the past 7 months. I'm building a site that looks at most listings per category and determines "true" scores for each one. The site also lets you filter and sort, instantly, on a variety of attributes, like "price", "unit price", "shipping time", "recent price drop %", "used price discount", "popularity", "brand quality", etc. Spending 7 months collecting and analyzing data across several Amazon categories has been exhausting, but quite revealing.
As you can imagine, dealing with low quality products with fake reviews is a challenge -- but it turns out it's not too hard to handle, even with my dataset which is far more limited than Amazon's. Without looking at any reviews or any metadata of reviews (author, count, chronology, etc), one could filter out "impostor" products with 95%+ accuracy.
Here's a neat trick: Next time you're unsure if a product has fake reviews, click on the brand of the product and see what else they sell. If you're looking at binoculars, and that same brand also sells dog food bowls, then maybe you should reconsider.
I've concluded that Amazon really doesn't care about fake reviews -- they will show users whatever listing has the maximum expected value (conversion rate * revenue) for your context (search term, category, or both). Even if a product has obvious fake reviews, if enough other people are buying it, it will float to the top, and Amazon is fine with that.
The main problem they have, I imagine, is sparse data. There are only certain fields (depending on category) which they force sellers to populate, eg: "name", "brand", etc. Item weight (which is distinct from _package_ weight), and "number of units" do not seem required, and so not many items have that information filled.
So, with sparse data, they have three choices:
1) Allow filter/sort by "unit price" and do not show the X% of listings that are missing this data -- many of which the user may actually be aware of and/or interested in.
2) Don't allow the option at all, and just rely on the fact your customer will do comparisons manually.
3) Try to derive the number of units from text cues in the product name, features, and description, then do #1.
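Option 3 is quite tractable with a few text patterns; here's a minimal sketch (the patterns are my own guesses, not anything Amazon actually does):

```python
import re

# Text cues that commonly encode a unit count in listing titles (illustrative).
UNIT_PATTERNS = [
    re.compile(r"(\d+)\s*[- ]?(?:pack|pk|count|ct|pcs|pieces)\b", re.I),
    re.compile(r"\b(?:pack|set|box) of\s*(\d+)\b", re.I),
]

def guess_units(title: str) -> int:
    """Guess how many units a listing contains from its title text."""
    for pattern in UNIT_PATTERNS:
        match = pattern.search(title)
        if match:
            return int(match.group(1))
    return 1  # no cue found: assume a single unit

def unit_price(price: float, title: str) -> float:
    return price / guess_units(title)

print(guess_units("AA Alkaline Batteries, 48 Count"))          # 48
print(round(unit_price(11.99, "Dish Sponges, Pack of 6"), 2))  # 2.0
```

The fallback to 1 is exactly the sparse-data compromise above: listings with no cue still show up, they just get a possibly wrong unit price.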
I haven't ordered from Amazon in over a year. I go to physical stores and/or order from a reputable vendor online.
Actually it's nice. If I truly want it, I'll actually go and get it, or spend the time to find a reputable vendor. The added friction ensures I don't buy random things that I thought I wanted but don't.
I've not ordered from Amazon for 3+ years now. You can buy the same products at the same price elsewhere or generally always from ebay.
The only real advantages Amazon has are Prime (if you pay for it) and simple returns; aside from that, you have to deal with a drone if anything goes wrong. Most other retailers have actual people who can make decisions answering queries.
For anything outdoor-related (climbing, hiking, cycling), I'll always go to REI and talk with a salesperson there. REI seems to hire people who have actually used the equipment they sell; I don't mind paying a small markup for the experience.
Amazon reached that point for me a number of years ago. Since Amazon reviews are largely worthless, I don't use them to decide whether or not to buy a product. I do read them, though, because they often contain useful information about specific issues and usage tips.
Amazon is fourth in my list of where I buy things from. First is a local physical store. If I can't find what I need there, then I seek out manufacturers websites, or websites of authorized retailers. If I can't find what I need there, then I check out eBay. If that doesn't work, then it's Amazon.
This changed over the last year or so, though -- before, I would go to Amazon before eBay.
I think this parallels really well with the question of how much privacy people are willing to give up for the convenience of digital services. In both cases, it seems like we are increasingly willing to give up privacy and trust in the name of convenience, but I do wonder if there is a breaking point. If there is a breaking point, is it even a system that can be reversed?
Amazon has been more expensive lately and most other stores have free-shipping now too. I've definitely found myself being much more open to shopping around and purchasing elsewhere than I was 4-5 years ago.
To me a bigger problem than fake reviews is the rampant bait-and-switch. Way too often I will see a product with a 4 or 5 star rating, and find the reviews are talking about something completely different.
That, or there will be 10 different products nested under the same identifier. Not just different colors, but different products. So I have no idea which one the person is praising or calling crap. Maybe instead of avatars next to the reviewer's name, Amazon should just attach an expandable screencap of the product page at the time the "verified purchase" was made.
> “For brands, this means the black-hat review clubs and sellers will have less impact, as fake reviews as a percentage of legit reviews should decrease.”
This is also why I don't rate Uber drivers.
I binary-ize any score that isn't already a yes/no, because that's what every score turns into.
https://imgur.com/A1umWh2
in any case, rating systems are tricky.
Legitimate sellers go for Vine, and they would probably have kinda earned those reviews anyway.
And who will own my reviews? I'd rather put my reviews on a more open review system.
I may be able to trust a reviewer on product A if their opinions on B,C and D were like my own.
https://www.amazon.com/launchpad/startups/benefits
https://thehustle.co/underground-world-of-kindle-ebooks
https://thehustle.co/part-2-confessions-from-the-scammy-unde...
https://thehustle.co/part-3-confessions-from-the-scammy-unde...
I suspect that Amazon reviews are going to be even less useful now. Especially given things like this:
> Amazon does not provide many specifics about how a product’s overall star rating is calculated, other than stating that it is not a simple average but instead uses “machine-learned models” that take into account factors such as how recent the rating or review is and whether it was a verified purchase or not.
IMO this is pretty solvable by looking at an account's purchase history too, but I don't think it's just as simple as blocking non-verified-purchase reviews.
Currently, companies pay people to buy items. They can keep those items, they just need to leave a good review. There are intermediaries who handle lots of sellers, so people buy a mixed bag of random garbage in exchange for the occasional review.
Yes, you can probably test for statistical anomalies, but I'm willing to bet that's quickly countered too - just have people buy occasional legit items so their profile is "statistically normal".
As far as I can tell, Amazon tries to fight that by keeping their ML model secret so scammers don't learn too quickly, but essentially, they're currently finding out what the Internet learned about SEOs manipulating search results.
Worse still is that even taking a stance to not take a stance because you know it cannot be done will bring backlash as there are many who demand you take their stance, even allowing easy distribution of self chosen block and filter lists are not enough. There are many who demand the appliance experience.
Dead Comment
I don't think it'd do much good. There are already large groups that subsidize verified purchases to get 5 star reviews. I read an article about it, confirmed some investigation on Facebook of on my own.
Basically someone runs a Facebook group where sellers advertise free product, promising Paypal reimbursement of the purchase price in exchange for reviews. There are at least (or were, I haven't checked recently) hundreds of Facebook groups across many languages with thousands of members each doing these activities. The reviewers are randos who like free stuff, and I don't think anyone could detect them if they only casually participate in the review scams.
If you return a bad product, try to get a replacement to see if it was just bad luck, but also please write a detailed review.
If you get a good product and it barely has reviews please review it.
That may well be me. I don't think that "meets requirements" should merit a 5. When I do reviews, I view a bare "meets requirements" as 3 or maybe 4, depending on the type of product. I want to leave some headroom to be able to point out products that really do excel.
Another ambiguity is whether the rating is on an absolute scale, or normalized for value (it's not a perfect product, but it's super-cheap).
I'm not sure this would help much. I've been asked by vendors to leave a review and if it's 5 stars, I will get something in return. I think for some people this would be a "well, nobody is getting hurt, right?" decision and they'd just do it.
Allow me to generalize:
making X easier causes its quality and usefulness to go down.
For me personally, the most valuable bit of feedback are the negative (edit: 1 and 2-star written) reviews - they are pretty much the only review content I trust. I'm looking for patterns of issues that multiple reviewers raise about the product.
The positive reviews have so little value when anyone can post a review. So many shallow positive reviews from unverified 'buyers'.
4/5 star reviews are useless, but 2/1 star reviews reveal a great deal of useful information about a product.
A bunch of 1 star reviews of "UPS damaged my product" indicates a product that's as advertised and isn't astroturfed (much?).
No 1 star reviews at all, or 1 star reviews indicating misleading advertising, tend to indicate astroturfing.
1 star reviews indicating product failures, support failures, DOA etc... indicate to not purchase the product at all.
It's not a pure indicator, but I've found it to be more reliable than other review systems.
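The triage above can be sketched as a crude keyword heuristic. The keyword lists and priority order are my own assumptions, not a vetted classifier, but they capture the pattern: product failures are the strongest "avoid" signal, shipping complaints are noise about the carrier rather than the product.

```python
# Crude triage of 1-star review text into the buckets described above.
# Keyword lists are illustrative assumptions, not a tuned classifier.
SHIPPING_HINTS = ("ups", "fedex", "box was damaged", "arrived late", "packaging")
FAILURE_HINTS = ("stopped working", "broke", "dead on arrival", "doa",
                 "defective", "support")
MISLEADING_HINTS = ("not as described", "nothing like the picture",
                    "false advertising", "misleading")

def triage_one_star(text: str) -> str:
    t = text.lower()
    # Product failures take priority: strongest signal to avoid the product.
    if any(h in t for h in FAILURE_HINTS):
        return "product-failure"
    if any(h in t for h in MISLEADING_HINTS):
        return "misleading-ad"   # possible astroturfing around the listing
    if any(h in t for h in SHIPPING_HINTS):
        return "shipping"        # carrier's fault, not the product's
    return "other"
```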
When I was referring to the star rating, I meant the overall average rating of the product (ie what Amazon is trying to promote now over star rating plus written reviews).
When I mentioned the negative reviews, I was referring to the 1 and 2-star written reviews. I use the negative written reviews exactly how you described - and it's what I find valuable.
The only two considerations for me are the number of reviews and the quality of the low-star reviews. The dates of the reviews are very useful as well; if a product doesn't have many recent good reviews, that can offset the total review count in my view.
Source: I'm an Amazon seller.
It is also super annoying when there are reviews like "came in three days not two. 1 STAR!!!" that are not product related. This doesn't even get into the commingling of products, which is the real problem.
1) Reviews by people who have not used the product. The tipoff for this is when the customer says something like "I just got this, it looks great! I can't wait to use it." If someone hasn't even used it, then they can't possibly give a useful review of any sort.
2) Reviews in product listings that contain multiple products, or products that have changed since the original listing. The tipoff for this is comments describing a product that is clearly not the one being listed, or is one of several different products in the same listing.
If there is an increase in reviews that are just stars without comment, it becomes impossible to root these out at all. I don't see how this would lead to more reliable reviews.
As you can imagine, dealing with low quality products with fake reviews is a challenge -- but it turns out it's not too hard to handle, even with my dataset which is far more limited than Amazon's. Without looking at any reviews or any metadata of reviews (author, count, chronology, etc), one could filter out "impostor" products with 95%+ accuracy.
Here's a neat trick: Next time you're unsure if a product has fake reviews, click on the brand of the product and see what else they sell. If you're looking at binoculars, and that same brand also sells dog food bowls, then maybe you should reconsider.
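That brand-coherence trick can be automated: collect the top-level categories of everything a brand sells and flag brands whose catalog is incoherently spread out. The cutoff here is an arbitrary illustrative threshold, not a researched value.

```python
def brand_looks_suspicious(categories, max_distinct=3):
    """Flag a brand whose listings span many unrelated top-level categories.

    `categories` lists the category of every item the brand sells.
    A cutoff of 3 distinct categories is an illustrative assumption:
    a "binoculars plus dog food bowls" brand trips it, a brand that
    stays within a couple of related categories does not.
    """
    return len(set(categories)) > max_distinct
```

A real version would want a category taxonomy (sporting optics and camera optics are related; optics and pet supplies are not), but even counting distinct categories catches the most blatant drop-shipped catalogs.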
I've concluded that Amazon really doesn't care about fake reviews -- they will show users whatever listing has the maximum expected value (conversion rate * revenue) for your context (search term, category, or both). Even if a product has obvious fake reviews, if enough other people are buying it, it will float to the top, and Amazon is fine with that.
This is quite an important feature. Amazon already shows this information (sometimes at least) and yet they don't let you sort by it.
So, with sparse data, they have three choices:
1) Allow filter/sort by "unit price" and do not show the X% of listings that are missing this data -- many of which the user may actually be aware of and/or interested in.
2) Don't allow the option at all, and just rely on the fact your customer will do comparisons manually.
3) Try to derive the number of units from text cues in the product name, features, and description, then do #1.
It seems they chose #2. I'm going for #3.
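Option #3 could start with a regex pass over the listing title. The patterns below are guesses at common pack-size phrasings ("Pack of 12", "24-Count", "6 Pack") and would need tuning against real titles:

```python
import re

# Common ways pack sizes appear in titles. Patterns are illustrative
# guesses, not an exhaustive or validated set.
_UNIT_PATTERNS = [
    re.compile(r"pack of (\d+)", re.I),
    re.compile(r"(\d+)[\s-]*(?:pack|count|ct|pcs|pieces|sheets)\b", re.I),
]

def units_from_title(title: str):
    """Return the unit count parsed from a product title, or None."""
    for pat in _UNIT_PATTERNS:
        m = pat.search(title)
        if m:
            return int(m.group(1))
    return None

def unit_price(price: float, title: str):
    """Price per unit, falling back to the listing price when no count is found."""
    n = units_from_title(title)
    return price / n if n else price
```

The fallback matters: when no count is parsed, treating the listing as a single unit (choice #1's behavior) at least keeps the item visible instead of hiding it.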
Or would people shift to other online retailers?
Actually it's nice. If I truly want it, I'll actually go and get it, or spend the time to find a reputable vendor. The added friction ensures I don't buy random things that I thought I wanted but don't.
The only real advantage Amazon has is Prime (if you pay for it) and simple returns, but aside from that you have to deal with a drone if anything goes wrong. Most other retailers have actual people who can make decisions answering queries.
Amazon is fourth in my list of where I buy things from. First is a local physical store. If I can't find what I need there, then I seek out manufacturers websites, or websites of authorized retailers. If I can't find what I need there, then I check out eBay. If that doesn't work, then it's Amazon.
This changed over the last year or so, though -- before, I would go to Amazon before eBay.
That, or there will be 10 different products nested under the same identifier. Not just different colors, but different products. So I have no idea which one the person is praising or calling crap. Maybe instead of avatars next to the reviewer's name, Amazon should just attach an expandable screencap of the product page at the time the "verified purchase" was made.
Well, at least they're thinking happy thoughts.