Forum upvotes and downvotes have no correlation with quality, though.<p>As implemented on Twitter and Reddit and HN, they are simply engagement features. In other words, the point of voting is not to surface quality content, it is to make site visitors feel like they’re doing something, and therefore be more likely to return. Karma scores, too, are not about rewarding quality but simply about creating artificial incentives for return visits.<p>Quality in forums comes mostly from careful human moderation. HN is carefully moderated by dang and others. Likewise, the highest-quality subreddits are those that have strict rules and active moderators that enforce them.
Any algorithm is going to get gamed sooner or later. The biggest problem I have with Google ranking is this:<p>1. Everyone now knows that Google favors long content pieces which cover the topic in depth (what, why, when, etc.)<p>2. So an army of content marketing firms are writing 2000-word posts on simple topics that could be covered in 200 words.<p>3. As the user gets lost in the 2000-word article, trying to find what they really need, Google treats this as a positive "lots of time spent on page" signal and rewards the behavior further.<p>The result is people now trying to write 3000-word articles to "one up" the already long posts dominating the first few results.
I guess I’m all in favor of having better search engine results, but the pretentious tone and outright naïveté of this piece is astonishing.<p>> a ranking algorithm that is immune to SEO<p>Counterpoint: voting rings.<p>> An object with a higher market value is probably better<p>Counterpoint: pricing psychology, Tesla stock.<p>> They are willing to bid up the price by spending their upvote on it.<p>Nobody spends anything. This sentence alone is a deal breaker. There is no downside to upvoting, and I can upvote everything. Makes no sense.
As far as I can tell, dang does a pretty good job fighting off sock puppets and vote manipulation as things stand.<p>Imagine if there were a multi-billion-dollar industry focused on creating fake accounts to inflate vote counts on HN posts. I don't think dang would be very successful in fighting them off. So the claim that this is "immune to SEO" is debatable.
I don't know whether this new algorithm has any fundamental advantage over what Google is currently doing, other than that it isn't yet being targeted by SEO attacks. You can probably create a new algorithm that works significantly better than Google for the current configuration of the web, but only until everyone decides to "optimize" their sites against the new algorithm.<p>Someday, perhaps, general computational intelligence could solve this problem by understanding the content itself and using that understanding for ranking in a meaningful way. But if you're just exploiting some arbitrary, accidental structure in the data set for ranking, there will always be a way to game it. That's the whole point of SEO anyway.
SEO will respond by creating 100% robot-operated communities that upvote their blogs, essentially shifting the burden onto Reddit/HN/Twitter staff, whose defenses will surely fall just like Google's did.<p>It looks so obvious to me, yet the article doesn't even mention the word "bot" as a factor on the modern Internet. To solve such a hard problem one has to try a lot harder than this.
A lot of the Google SEO “secrets” are not so secret. If you have an inquisitive mind, start a new blog with the aim of getting your articles ranked on page 1. In 3 months you will know exactly how it works, because there is real-time feedback. Google says it doesn’t use site authority, but it clearly does.<p>People will pay $5,000 (a number I saw once) to get a link from Forbes. Not because they give a shit about Forbes traffic but because Google still thinks that Forbes is some “above them all” content site.<p>But if you want to see the real dirty stuff people do, buy a subscription to a tool like Ahrefs and see how people manipulate links for SEO purposes. It might just give you gray hairs.
Have a quick look at the /newest page here and you’ll see it dominated by people who are already gaming it for conventional SEO, just in case Google starts factoring this in some day.<p>If you make it so Fake Internet Points are worth real money, all you’ll do is fill all these places with SEO people and ruin them.<p>Please don’t.
There is a strange apparent paradox of Google floundering with attempting to address SEO spam, and alternative approaches including the one in the post and indeed my own demonstrating great success.<p>The truth is any alternative search ranking to Google's will probably work well. The problem is the search mono-culture with one very dominant search engine, which permits extreme specialization. If you have real competition with multiple mutually incompatible ranking algorithms, black hat SEO patterns <i>will</i> drastically lose effectiveness. The unreasonable effectiveness of SEO spam is largely water assuming the shape of the cup.
This is incredibly naive. Of course this is not immune to SEO; it is highly susceptible to SEO. When it is in the financial interests of millions of people, with billions of dollars at stake, people will simply tweak the signals feeding into your algorithm's inputs and "break" your "anti-SEO".
The author mentions the second-order effect of SEO upon the web: an entire industry gaming their client's sites to the top of search results has debased the quality of the web overall. How is that same industry not going to pivot to debasing social sites like HN in order to do the same thing? The entire market analogy is busted because HN and reddit aren't markets and up/down votes aren't currency. You don't "spend" your upvote on something. You have an unlimited supply.
This seems incredibly biased and naive. If you want to just search a tiny fraction of issues relevant to startups and coding, just add your site: filter or use HN search and move on. There's no anti-SEO going on here; it's ranking by echo chamber on a metric you hope hasn't been manipulated (that much). Given the breadth and depth of a fully functioning search engine, this wouldn't cover much at all either.
I think you'd want to index by subreddit as well in the case of Reddit. Getting 500 upvotes on /r/pics is probably not the same as 500 upvotes on some relatively obscure subreddit.
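One way to account for this (a sketch of my own, not something from the article; the subreddit names and vote counts are made up) is to score each post relative to a per-subreddit baseline rather than by raw upvotes:

```python
# Hypothetical sketch: normalizing upvotes by a subreddit baseline, so
# 500 upvotes on a huge subreddit and 500 on an obscure one aren't
# treated alike. All names and numbers are illustrative assumptions.
from statistics import median

# (subreddit, upvotes) pairs for a set of indexed posts
posts = [
    ("pics", 12000), ("pics", 8000), ("pics", 500),
    ("smallsub", 40), ("smallsub", 500), ("smallsub", 25),
]

# Baseline: median upvote count observed in each subreddit
by_sub = {}
for sub, votes in posts:
    by_sub.setdefault(sub, []).append(votes)
baselines = {sub: median(v) for sub, v in by_sub.items()}

def normalized_score(subreddit, upvotes):
    """Score a post relative to what is typical for its subreddit."""
    return upvotes / baselines[subreddit]

# 500 upvotes means very different things in the two communities:
print(normalized_score("pics", 500))      # well below the pics median
print(normalized_score("smallsub", 500))  # far above the smallsub median
```

The median is deliberately robust here: a few viral outliers in a big subreddit won't drag the baseline around the way a mean would.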
If the public knows exactly what the algorithm is, they will game it. If they don't know but can approximate its behavior by observation, they will game that.<p>I also think that lots of upvotes on influential sites will soon turn into improved page rank so don't think that it will make a difference. Google isn't using the original, naive page rank algorithm, they have repeatedly refined it to fight the SEO folks.
<i>and so, the oversimplified description of MarketRank is “just add up all the upvotes”. We have some more work to do to make the values accurate, but this is the general idea.</i><p>"1,000 FB Post LIKES $15 (On sale for $12)"[1]<p>"1,000 Facebook Likes $14.50 (On sale for $8.82)" [2]<p>No, this idea isn't going to work.<p>[1] <a href="https://www.fbpostlikes.com/" rel="nofollow">https://www.fbpostlikes.com/</a><p>[2] <a href="https://www.instafollowers.co/buy-facebook-likes" rel="nofollow">https://www.instafollowers.co/buy-facebook-likes</a>
Would going back to PageRank, in its near-original form with minor tweaks from lessons learnt, be better than current search engines?<p>This seems like a decent attempt to solve the search engine problem. It has flaws, yes, and it's limited to blogs with upvotes, so it's not universal.<p>Perhaps a universal search engine is impossible. Instead, each type of content could have its own search algorithm.
Surely Google has already thought of this.<p>Is it that Google actually benefits from having an algorithm that is susceptible to being gamed and therefore they are disincentivized to move away from that?
This is a very bad idea that is very well presented. Thanks for a fun read.<p>Why bad? People have largely said, but I’ll note that a new blog with a single legendary-quality post on a niche topic would have no chance to rank, just like on Google. Plus this could be gamed if you engage in social media promotion, which many people choose not to do.
Sometimes wrong things get upvoted and reach the top on Reddit. How does this algorithm stop that? Case: in football (soccer for US folks), there was a report by Marca about two players fighting in the dressing room after a terrible loss. It got to 6000 upvotes on Reddit. It later turned out that everyone involved denied it. Now, would that still show up, unless corrected at the source? Because you can't invalidate keywords. What happens when things are in flux?<p>For a market analogy: while the market hyped up Apple and Microsoft, it also hyped up Theranos. The market had the '99 boom, then the 2000 crash, and the 2008 crash. That is, the market can very easily be misled. Replacing a system with something only slightly better is probably not a good idea. There are ways Google can be improved or replaced, but things like this are not the way.
> Our naive currency conversion will work exactly like our naive inflation adjustment. We will compare the cost of a similar basket of goods across the different platforms. In this case, our basket of goods will be similar to our inflation calculation, i.e the average of the top 50 highest ranking inflation adjusted websites on a platform.<p>> We can now calculate the “GDP” of a domain by adding the total value of all webpages produced by a domain.<p>This is very easy to game by spamming posts with one upvote on any platform, because the formula for calculating score increases strictly positively with the number of links.<p>You could have low-voted posts count against the total score, but then anyone could easily de-rank a competitor's content by creating low-voted posts about it.
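The exploit is easy to demonstrate. Assuming, as the quoted post says, that a domain's "GDP" is the plain sum of its pages' values (the function and figures below are my illustration, not the author's code):

```python
# Sketch of the spam exploit: if domain score is a strictly increasing
# sum over pages, volume beats quality. Numbers are made up.

def domain_gdp(page_scores):
    """Naive MarketRank-style GDP: total value of all pages on a domain."""
    return sum(page_scores)

quality_domain = [250, 180, 90]   # a few well-received pages
spam_domain = [1] * 1000          # a thousand one-upvote spam posts

print(domain_gdp(quality_domain))  # 520
print(domain_gdp(spam_domain))     # 1000 -- spam wins on volume alone
```

Averaging instead of summing fixes this particular exploit but creates the opposite one: a domain could delete or suppress its weakest pages to inflate its rank.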
>What we need is a ranking algorithm that can only be gamed by creating genuinely good content.<p>No. We need an algorithm that returns relevant results for the search terms. A machine shouldn't try to decide what "good content" is.
What I took away from this article is that the author(s) don't seem to have a feedback cycle outside their bubble.<p>Hopefully the discussion here might help a bit. I read a lot of valid criticism here that should (in best case) already have been incorporated into the thinking process while researching the idea of a 'Market Rank'.<p>> An object with a higher market value is probably better<p>That was the moment the article lost me personally.
I knew this was comedy when I saw that their top ranked site was paulgraham.com and they counted hacker news points as almost double Reddit. This is a poorly thought out idea that I just can't get on board with. This also doesn't account for new content that hasn't been ranked on a social aggregator site. I thought it would be about training an AI to recognise good content.
Instead of using upvotes from sites such as Reddit and HN, why not instead allow the users of the search engine itself upvote and downvote results? This might allow quality results outside the scope of larger online communities to still be highly ranked, and also tailor the results to the audience of the search engine itself.
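If you went this route, you'd want something smarter than raw vote totals, or popular-but-mediocre results dominate. A standard option (my suggestion, not from the article) is ranking by the lower bound of the Wilson score interval on the upvote fraction:

```python
# Rank user-voted search results by the lower bound of the Wilson score
# confidence interval, so a good ratio with few votes can beat a mediocre
# ratio with many votes. This is a well-known technique; the example
# vote counts are made up.
import math

def wilson_lower_bound(upvotes, downvotes, z=1.96):
    """Lower bound of the ~95% confidence interval on the upvote fraction."""
    n = upvotes + downvotes
    if n == 0:
        return 0.0
    p = upvotes / n
    denom = 1 + z * z / n
    centre = p + z * z / (2 * n)
    spread = z * math.sqrt((p * (1 - p) + z * z / (4 * n)) / n)
    return (centre - spread) / denom

# 9 up / 1 down outranks 60 up / 40 down: the better ratio wins even
# with far fewer votes.
print(wilson_lower_bound(60, 40))
print(wilson_lower_bound(9, 1))
```

Of course this only addresses the statistics, not the vote-buying problem other commenters raise.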
This "algorithm" can be gamed with ease.<p>And the huge number of false negatives is not as unimportant as the author likes to think. Using an idea like this will fragment the Internet into bubbles, and if you are unlucky you will never find information that lies outside your bubble.
It's a good article, but I don't think it's a sensible suggestion. I don't think the problem can be solved by looking at how content scores. There should be a dynamic structure that checks whether a site is of good quality, and this quality control has to be fed from very different signals. To be good at SEO right now, it's enough to add meta tags, write articles of a certain length, and do a few more things like these. That has nothing to do with quality. Good page speed has nothing to do with quality. Using SSR has nothing to do with quality content. I don't know exactly for what purposes OpenTelemetry is used, but maybe we need something like OpenTelemetry for quality monitoring.
/me not a Twitter user<p>Are Twitter upvotes really a useful guide to article quality?<p>I note that the searches the MarketRank people have reported on are all concerned with startups and software. I wonder how it performs on e.g. political analysis, recipes, DIY, living on a budget and so on.<p>I find it particularly hard to find product reviews that aren't rather shallow re-hashes of marketing material; a product review that points up the deficiencies of a product is a rare animal indeed.
<i>And so, the oversimplified description of MarketRank is “just add up all the upvotes”.</i><p>Wait… the sum of fake internet points is what we’re using to measure pages? I agree that search results could be better, but this doesn’t seem to be the way.
A Reddit upvote is not a single currency. By this logic a submitted link would get an equal number of upvotes (on average) when submitted to any subreddit. This is not the case. Smaller subreddits have less potential for upvotes.
Everyone's pretty skeptical here, but I think something like this is the way forward. Not based on Reddit or HN, of course, but based on the same principle, namely, user curation. It's argued that Google was so successful at first with PageRank because at the time it was common for users to have personal web pages that had long lists of curated links to other websites. That signal has gone down the toilet, but why can't we just skip the middleman and get people to rank sites they think are interesting by, you know, upvoting them or something like that. We need collaborative filtering for content aggregation.
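A toy version of that collaborative-filtering idea (entirely my sketch; the users, sites, and votes are invented): score sites a user hasn't voted on by the votes of users with similar tastes.

```python
# Minimal user-based collaborative filtering over site votes, assuming
# each user has a sparse vector of +1/-1 votes. Illustrative data only.
import math

votes = {
    "alice": {"siteA": 1, "siteB": 1, "siteC": -1},
    "bob":   {"siteA": 1, "siteB": 1, "siteD": 1},
    "carol": {"siteC": 1, "siteD": -1},
}

def cosine(u, v):
    """Cosine similarity between two sparse vote vectors."""
    common = set(u) & set(v)
    num = sum(u[s] * v[s] for s in common)
    du = math.sqrt(sum(x * x for x in u.values()))
    dv = math.sqrt(sum(x * x for x in v.values()))
    return num / (du * dv) if du and dv else 0.0

def recommend(user):
    """Score unseen sites by similarity-weighted votes of other users."""
    scores = {}
    for other, ovotes in votes.items():
        if other == user:
            continue
        sim = cosine(votes[user], ovotes)
        for site, vote in ovotes.items():
            if site not in votes[user]:
                scores[site] = scores.get(site, 0.0) + sim * vote
    return sorted(scores.items(), key=lambda kv: -kv[1])

print(recommend("alice"))  # siteD surfaces via bob's similar tastes
```

The appeal over a global score is that a voting ring only poisons the recommendations of users who resemble the ring, though Sybil attacks on the similarity graph are still possible.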
Why can't we have upvote and downvote on search results, and use that as a measure? I guess a lot of people would downvote spammy sites; not sure about upvoting legitimate ones.
If you've heard of 'karma farming', you know that Reddit upvotes can be gamed. There are numerous accounts that accumulate upvotes (similar to HN commenter score) by simply reposting popular posts from 4-6 weeks ago, aka the right amount of time for most daily users to forget that they had previously seen it. Usually this would occur on high-volume, high-density subreddits like r/pics, r/funny etc where there are lots of new posts each day.<p>Then, those high karma accounts can be used to astroturf on more valuable real estate, the sections of Reddit where people ask for product recommendations. Posts made by high karma accounts are given greater visibility in the Reddit algorithm, which correlates to a higher number of upvotes as more people see it in their feed.
The first thing an anti-SEO anything should do is ban Pinterest, by far the most annoying site on the internet.<p>See also the brilliantly named plugin, Unpinterested.
I stopped reading when I saw the algorithm that is purportedly 'immune to SEO' relies entirely on metrics that are already heavily manipulated.