Review farming, likes farming, upvote farming, etc.: a system of bots or software that artificially inflates the popularity of a product, often misleading consumers into buying it.

Example video of how farming happens: https://9gag.com/gag/aGdrew5

Please note the video may be fake, but the practice of farming is certainly possible.

I use words such as "product", "review", "buying", etc., but their meaning may change with context. For example, apply the context of this site...

Note: I am not pointing fingers at any particular online application or service.

How to stop this?

The consumer cannot do much. At best, he or she can report the comment or review, provided the consumer can spot it in the first place.

It is up to the enterprise concerned to investigate and take action; otherwise it will lose the trust of its customers. I am highlighting a few approaches.

(1) IP address analysis.

If all reviews of a product come from one IP address, we can take action. This is the simplest case to expect.

(2) Study the user accounts.

(2.1) If several accounts share the same creation date and have little review history, you can investigate and identify the actual owner, or, more practically, ban the accounts.

(2.2) What if there is a geographic separation? Find out whether the user has frequently changed locations. This ties into point #1.

(2.3) Suspicious users. Mostly they won't have a photo, bio, etc. A relative air of anonymity is a red flag, but it requires more investigation.

Final notes...

I could only highlight a few approaches, but I think a better proactive approach is to delay publishing reviews: run these analyses, clean up, and only then release them.

"Farming" indicates that a bigger social problem exists. I hope you can expand on what more can be done here. It's actually more complicated than it looks.
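To make approaches (1) and (2.1) concrete, here is a minimal sketch in Python. The record types, field names, and thresholds are all my own assumptions for illustration; a real platform would have richer data and would tune these numbers empirically.

```python
from collections import Counter, defaultdict
from dataclasses import dataclass
from datetime import date

# Hypothetical minimal records; real platforms track many more signals.
@dataclass
class Review:
    product_id: str
    user_id: str
    ip: str

@dataclass
class Account:
    user_id: str
    created: date
    review_count: int

def flag_ip_clusters(reviews, threshold=0.5, min_reviews=5):
    """Heuristic (1): flag products where a large share of reviews
    comes from a single IP address. Returns {product_id: suspect_ip}."""
    by_product = defaultdict(list)
    for r in reviews:
        by_product[r.product_id].append(r.ip)
    flagged = {}
    for product, ips in by_product.items():
        if len(ips) < min_reviews:
            continue  # too few reviews to judge
        ip, count = Counter(ips).most_common(1)[0]
        if count / len(ips) >= threshold:
            flagged[product] = ip
    return flagged

def flag_account_batches(accounts, min_batch=3, max_history=2):
    """Heuristic (2.1): flag groups of accounts created on the same
    day that each have little review history."""
    by_date = defaultdict(list)
    for a in accounts:
        if a.review_count <= max_history:
            by_date[a.created].append(a.user_id)
    return [users for users in by_date.values() if len(users) >= min_batch]
```

These flags are only a starting point for a human investigator, not an automatic ban list: a shared office or university IP can trip the first check, and a product launch can trip the second.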
My 2 cents: get rid of stars, and have each reviewer submit something positive about the product as well as something negative, since nothing is ever perfect. Eventually people will figure out that reviewers who write "it works too good!" or "it has no problems!" as their negatives are fake. Even if the bots manage to get around automatic detection (i.e. IP and activity tracking), they will still be submitting negatives, which will hopefully balance out their positives and encourage the user to examine each argument instead of being subconsciously manipulated by the 10/10s and 5 stars.