Top 10 SEO Myths Debunked 2018

Top 10 SEO myths! Yes, you read that right. SEO is well known for its misconceptions. Many webmasters treat these misconceptions as facts, but they are actually myths. Below are ten SEO myths that people commonly take as truth. Some are even believed to trigger penalties, but Google doesn't penalise in these cases; it simply takes measures to limit the search results it shows based on quality, with the intent of providing quality results.

Google has taken steps to become more transparent through increased activity in the SEO community, such as regular Webmaster Hangouts, conference appearances, and insightful discussions on Twitter.

Big names like John Mueller, Gary Illyes, and Danny Sullivan at Google are helping to dispel the myths of SEO with facts.

For further clarity, I've put together a list of 10 common myths about Google and SEO and why they're wrong.

Myth 1: Duplicate Content Gets Google Penalty

There is no Google penalty for duplicate content.

Google understands that duplicate content is unavoidable on the web. It simply aims to index the highest-quality, most relevant page so that searchers aren't shown the same content repeatedly in the search results.

Unless a site is trying to manipulate rankings with duplicate content across the entire site (for example, by creating doorway pages), the worst that usually happens is that similar pages are folded together in the index and an alternative version of the page is shown instead.

SEO professionals use several techniques to signal to search engines which pages should be indexed, such as canonical tags, sitemap inclusion or exclusion, and internal links pointing to the desired page.
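As a small, hedged illustration of the first of those signals, the Python sketch below fetches a page and reports which URL it declares as canonical. It assumes the third-party requests and beautifulsoup4 packages are installed, and the example URL is a hypothetical placeholder.

# Minimal sketch: check which URL a page declares as canonical.
# Assumes "requests" and "beautifulsoup4" are installed; the URL is hypothetical.
import requests
from bs4 import BeautifulSoup

def get_declared_canonical(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    link = soup.find("link", rel="canonical")
    return link["href"] if link and link.has_attr("href") else None

print(get_declared_canonical("https://www.example.com/some-page/"))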

Myth 2: Google Looks at Canonical URL as the Preferred Version for Indexing

The rel=canonical tag is a signal to Google about the preferred page, but it isn't always honoured. Just because you set a URL as the preferred version for indexing via a canonical tag doesn't mean Google will respect it; sometimes Google will not select it for indexing.

You can observe such instances in the new version of Google Search Console: in the Index Coverage report, they appear under the flag 'Submitted URL not selected as canonical'.

Google can choose another page from a set of duplicates for indexing (other than the one you selected as the canonical) if it determines that page is the better version for users and should be shown in search results.

In such cases, you need to reinforce the preferred version through other techniques, such as sitemap inclusion, robots.txt exclusion, or internal links.

The key is to ensure you’re sending Google consistent signals as to the preferred version of the page.
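As a hedged example of what "consistent signals" can mean in practice, the sketch below compares the URLs submitted in a sitemap against the canonical each page declares and flags mismatches. The sitemap URL is hypothetical, and it reuses the requests/beautifulsoup4 approach from the earlier snippet.

# Minimal consistency check: every URL submitted in the sitemap should declare
# itself as canonical, otherwise Google receives mixed signals.
# Assumes requests and beautifulsoup4 are installed; the sitemap URL is hypothetical.
import requests
import xml.etree.ElementTree as ET
from bs4 import BeautifulSoup

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_url):
    xml = requests.get(sitemap_url, timeout=10).content
    return [loc.text.strip() for loc in ET.fromstring(xml).findall(".//sm:loc", NS)]

def declared_canonical(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    link = soup.find("link", rel="canonical")
    return link.get("href") if link else None

for url in sitemap_urls("https://www.example.com/sitemap.xml"):
    canonical = declared_canonical(url)
    if canonical and canonical != url:
        print(f"Mixed signals: sitemap lists {url} but page points to {canonical}")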

Myth 3: Quality Updates Result in Algorithmic Penalties

In a recent interview, former Google engineer Fili Wiese spoke about the myth of algorithmic penalties:

“One misconception that often exists is around things like Google Panda or Phantom as the industry called it, or Fred. Quality updates basically. But people think those are penalties or algorithmic penalties (when they’re not).

The thing is there is no such thing as an algorithmic penalty, it’s actually a recalculation. It’s like a big black box with the formula inside, you put something in, something comes out, and what comes out is the rankings and what goes in is your website.

The algorithm changes are just basically changes within the black box, which means that what comes out on the other side is slightly different now. Does that mean you’ve been penalized? No. it may feel like it, but you’re not penalized.”

This is a subtle difference that Wiese raises, but an important one in understanding how Google’s search algorithms operate.

Myth 4: Google Has 3 Top Ranking Factors

This was big news in March 2016 when Andrei Lipattsev announced that links, content, and RankBrain made up the top 3 Google ranking factors.

Mueller has since dismissed this statement in a Webmaster Hangout, saying that it isn't possible to name the most important ranking factors because they change from query to query and from day to day.

Instead of focusing on individual ranking signals, SEO pros should focus on optimising their sites to improve user experience, match user intent, and improve overall site quality in line with Google's latest developments.

Myth 5: Google’s Sandbox Applies a Filter When Indexing New Sites

Another big misconception concerns how Google treats new sites in the index. There is a strong belief among some in the SEO community that Google applies a filter to new websites soon after launch in order to stop spammy sites from ranking.

Mueller put the Google sandbox to bed in a Webmaster Hangout, when he said that there was no such filter being applied to new sites.

Google's algorithms simply assess how a website fits in with others trying to rank for the same queries and then try to return the best results to searchers.

Myth 6: Use of Disavow File to Maintain a Site’s Link Profile

One SEO activity that many practitioners consider essential is maintaining a disavow file to keep a site's backlink profile clean.

However, Google's algorithms can recognise these types of low-quality backlinks and know when they should be ignored, so constantly maintaining and updating a disavow file carries little significance for most sites.

At BrightonSEO in September 2017, Illyes said that if backlinks are coming in organically to a site, it's extremely unlikely that the site will receive the kind of manual action that appears in Google Search Console.

You only need to use the disavow file when a site receives a manual action, in order to have the spammy or low-quality links ignored.
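If you do need to submit a disavow file after a manual action, the sketch below shows one way to generate it. The structure (comment lines starting with '#', 'domain:' entries for whole domains, plain URLs otherwise) follows Google's documented disavow file format; the file name and the example links are hypothetical placeholders.

# Minimal sketch: build a disavow.txt in the format Google's disavow tool accepts.
# The example domains and URLs are hypothetical placeholders.
spammy_domains = ["spammy-directory.example", "link-farm.example"]
spammy_urls = ["http://low-quality-blog.example/paid-links-post/"]

lines = ["# Disavow file generated after a manual action for unnatural links"]
lines += [f"domain:{d}" for d in spammy_domains]   # disavow every link from the domain
lines += spammy_urls                               # disavow individual URLs

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")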

Myth 7: Google Values Backlinks from All High Authority Domains

Getting backlinks from a high-authority website is often seen as a link building win. However, alongside the authority of the linking domain, relevancy is also an important factor.

An insight from Illyes’ BrightonSEO Q&A revealed that Google takes into account the context of backlinks, meaning that SEO pros should perhaps give more importance to link relevance when going after links.

Illyes thinks there is value in fixing internal and external links, but it is important to keep context and relevancy in mind. If a poor quality article (which has nothing to do with your site) links to your site, Google will ignore it because the context doesn’t match.

Myth 8: Google Uses Page Speed as a Major Ranking Signal

Site speed has been a ranking signal since 2010, and many assume it is a key part of Google's algorithms.

However, Mueller has said that while there are plans to introduce a speed update later in 2018, Google currently only uses speed to differentiate between slow pages and those in the normal range. In fact, DeepCrawl found that Googlebot will crawl and index pages that take up to three minutes to respond.

Speed influences rankings through feedback from user experience, like visitors bouncing from a page that takes too long to load, but as it stands Google’s use of speed in their algorithms is rudimentary for the time being.
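As a rough, hedged illustration of separating "slow" pages from those in the normal range, the sketch below times how long a handful of URLs take to respond. The URLs and the three-second threshold are arbitrary assumptions for the example, not Google's criteria.

# Minimal sketch: flag pages whose response time falls well outside a "normal" range.
# The URLs and the 3-second threshold are illustrative assumptions only.
import time
import requests

urls = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
]

for url in urls:
    start = time.monotonic()
    requests.get(url, timeout=30)
    elapsed = time.monotonic() - start
    label = "SLOW" if elapsed > 3.0 else "ok"
    print(f"{label:4} {elapsed:.2f}s {url}")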

Myth 9: Fred Was an Algorithm Update Related to Link Quality

Google continuously updates its search algorithms at an average rate of 2-3 changes per day. The Fred update in March 2017 was widely thought to be an update related to link quality.

However, Illyes has laughed this off, saying there was no specific algorithm update like Panda or Penguin. He clarified that 95-98 percent of these ongoing updates are not actionable for webmasters. Fluctuations can always happen, but you should keep maintaining the quality of your site and encouraging people to talk about your brand through links, social networks, and so on.

Myth 10: Crawl Budget is not Important

Crawl budget is an important consideration, but it is overlooked by some who overestimate Google's ability to crawl all the pages on a given site.

For small to medium sites (up to around 200,000 pages), Google can usually crawl all of the pages without trouble. However, crawl budget is a pressing issue for those managing large enterprise sites, because they need to ensure the important pages are being crawled regularly.

One way to check if Google has crawled the vast majority of your site is by looking to see if Googlebot is crawling lots of 404 pages, as this indicates most of the important pages have already been crawled.
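As a hedged example of that check, the sketch below scans a combined-format access log for Googlebot requests and reports what share of them hit 404s. The log path is a hypothetical placeholder, and the user-agent match is a simple string test rather than a verified reverse-DNS check.

# Minimal sketch: estimate how much of Googlebot's crawl is spent on 404s,
# using a combined-format access log. The log path is a hypothetical placeholder.
import re

LOG_PATH = "access.log"
# Combined log format: ... "GET /path HTTP/1.1" 404 ... "user agent"
line_re = re.compile(r'"(?:GET|HEAD) \S+ [^"]*" (?P<status>\d{3}) ')

googlebot_hits = 0
googlebot_404s = 0

with open(LOG_PATH) as f:
    for line in f:
        if "Googlebot" not in line:
            continue
        m = line_re.search(line)
        if not m:
            continue
        googlebot_hits += 1
        if m.group("status") == "404":
            googlebot_404s += 1

if googlebot_hits:
    share = 100 * googlebot_404s / googlebot_hits
    print(f"{googlebot_404s}/{googlebot_hits} Googlebot requests ({share:.1f}%) hit 404 pages")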

Source: http://www.searchitonweb.com/top-10-seo-myths-debunked-2018/
