How viable is generic SEO in 2023?
I recall that back in the dark ages of web search (the AltaVista era), search engines were naive and keywords were central to SEO. People would even create landing pages stuffed with keywords in tiny, transparent text to gain higher rankings.
When Google first came on the scene, the revolution was that its ranking relied on something the webmaster did not control: how many other pages link to the page in question (PageRank). This, too, was eventually gamed in other ways.
Over the last decade or so the trend among major search engines seems to be "personalization", where each user is tracked and results are biased towards what that user "would want". Moreover, search engines are reportedly introducing arbitrary biases into their algorithm, based on their belief that certain types of content are "better" for their users to see (I think Covid is a good example). Lastly, there are rumors of pay-for-play schemes that boost rankings as a form of paid advertising (I mean the "native" rankings, not the ones labeled as sponsored).
My point is that it seems every major search engine today has a lot of its own proprietary "secret sauce", and the details of their algorithms are no longer common knowledge. Given this, doesn't that invalidate the idea of "SEO" in a general sense? Isn't SEO becoming something very specific to a given search engine, such that instead of talking about "improving SEO", you would talk only about "improving Google SEO" or "improving Bing SEO", with the understanding that these may involve very different (and possibly mutually exclusive) measures?
Or is there still enough common ground between search engines that it makes sense to talk about general SEO advice?
1 answer
The fundamentals of SEO haven't changed in two decades:
- Create great content.
- Use the words and phrases for which you want to be found in search engines.
- Ensure that search engines can crawl your site so that it gets indexed.
- Grow your site's reputation by getting links from other sites.
- Follow the guidance from search engines so that your site doesn't get banned for spam.
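The crawlability point in particular is concrete enough to check on any site. As an illustration (the domain and paths here are hypothetical, not from the original post), a minimal robots.txt that permits crawling while keeping a private section out of the index, and that points crawlers at a sitemap, might look like:

```text
# Hypothetical robots.txt, served at https://www.example.com/robots.txt
User-agent: *        # rules below apply to all crawlers
Disallow: /admin/    # keep a private section out of search results
Allow: /

# Tell crawlers where to find the list of indexable URLs
Sitemap: https://www.example.com/sitemap.xml
```

A common mistake is an accidental `Disallow: /`, which blocks the entire site; checking this file is one of the quickest SEO sanity tests available.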
Search engine algorithms have never been public knowledge. SEO has always been a process of reverse engineering them to figure out which techniques help sites rank well.
There have always been differences between search engines, and ranking well in one has never guaranteed ranking well in the others. If anything, Google and Bing today use algorithms that are more similar than those of past search engines. Think about how vast the differences were between AltaVista and Google in the late '90s: rankings in AltaVista were purely keyword driven, whereas Google used link-based authority ("link juice").
I'd even go so far as to say that today it is much harder to find techniques that work well in one particular search engine but not another. Google in particular has made it harder to evaluate your SEO: it no longer publishes PageRank as a metric, it no longer sends search phrases in the referrer header, and it has cracked down hard on scrapers that check rankings. In many ways, generic SEO advice is all you have today, because ranking tricks for particular search engines are harder to find.