SEO: 7 Reasons Your Site's Indexation Is Down
2024-05-20 16:57:51 · Ryan New
Summary: Misuse of page removal tools at Bing Webmaster Tools and Google Search Console can lower overall indexation.
Without indexation, there is no possibility of ranking in natural search results. When indexation numbers drop, there are a handful of potential reasons.
Increasingly, slower site speeds are masquerading as 500 server errors in Google Search Console and Bing Webmaster Tools, impacting indexation.
When search engine crawlers can’t access a page at all — or at least within the maximum time allotted for each page to load — it registers as a mark against that page. With enough failed crawl attempts, search engines will demote a page in the rankings and eventually remove it from the index. When enough pages are impacted, it becomes a sitewide quality issue that could erode the rankings for the entire site.
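The failure logic described above can be sketched as a simple heuristic. The five-second budget and the helper function below are illustrative assumptions, not values published by any search engine:

```python
# Illustrative sketch: classify a crawl attempt as a mark against the page.
# MAX_LOAD_SECONDS is an assumed per-page time budget, not a documented value.
MAX_LOAD_SECONDS = 5.0

def crawl_outcome(status_code: int, elapsed_seconds: float) -> str:
    """Return 'failed' when the fetch errored or exceeded the time budget."""
    if status_code >= 500 or elapsed_seconds > MAX_LOAD_SECONDS:
        return "failed"  # enough of these and the page gets demoted, then deindexed
    return "ok"

print(crawl_outcome(200, 1.2))  # ok
print(crawl_outcome(500, 0.3))  # failed
print(crawl_outcome(200, 9.0))  # failed (too slow)
```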
There’s no value to a search engine in indexing two or more copies of the same page. So when duplicate content starts to creep in, indexation typically starts to go down. Rather than deciding which of two or more pages that look the same should be indexed, search engines may decide to pass on the whole group and index none of them.
This extends to very similar pages of content as well. For example, if your browse grids for two subcategories share 75 percent of the same products, there’s no upside to the search engine in indexing them both.
Duplicate content can accidentally be introduced as well when pages that are truly different look identical or very similar because they do not have any unique characteristics that search engines look for, such as title tags, headings, and indexable content. This can plague ecommerce sites in particular because browse grids can start to look very similar when their reason for existing isn’t clearly labeled in the copy on the page.
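A quick way to spot this kind of near-duplication is to measure how much two browse grids overlap and to check whether any pages share a title tag. The helper names and sample SKUs below are illustrative; only the idea of comparing product sets and titles comes from the text above:

```python
def product_overlap(grid_a: set, grid_b: set) -> float:
    """Fraction of products shared between two browse grids (Jaccard similarity)."""
    if not grid_a and not grid_b:
        return 0.0
    return len(grid_a & grid_b) / len(grid_a | grid_b)

def duplicate_titles(pages: dict) -> set:
    """Return title tags that appear on more than one URL."""
    seen, dupes = {}, set()
    for url, title in pages.items():
        if title in seen:
            dupes.add(title)
        seen[title] = url
    return dupes

a = {"sku1", "sku2", "sku3", "sku4"}
b = {"sku1", "sku2", "sku3", "sku5"}
print(product_overlap(a, b))  # 0.6 -- three shared products out of five total
print(duplicate_titles({"/x": "Shoes", "/y": "Shoes", "/z": "Boots"}))  # {'Shoes'}
```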
Changes to a site’s header and footer navigational structures often impact categories and pages. When areas of the site are removed from those sitewide navigational elements, search engines demote the value of those pages because they receive fewer internal links. Demoted value can result in deindexation.
Likewise, changes in design can affect indexation if the amount of content on the page is reduced or the text is suddenly concealed within an image as opposed to being readily indexable as plain HTML text. As with duplicate content, a page can have value that isn’t readily apparent to search engines; make sure it’s apparent via indexable text to retain indexation.
Ecommerce platforms can make unexpected changes to URLs based on changes to taxonomy or individual product data.
When a URL changes but the content does not, the search engines have a dilemma. Do they continue to index the old page that they know how to rank? Do they index the new page, with which they have no history? Or do they index both, or neither? All four are options. If both are indexed, indexation doubles; if neither is, it falls to zero.
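The usual way to avoid that dilemma is to 301-redirect every old URL to its new counterpart, so the engines can carry the old page's history over. A minimal sketch, using a hypothetical URL map:

```python
# Hypothetical old-to-new URL map after a platform taxonomy change.
REDIRECT_MAP = {
    "/widgets-blue": "/widgets/blue",
    "/widgets-red": "/widgets/red",
}

def resolve(path: str) -> tuple:
    """Return (status, location): 301 for moved URLs, 200 otherwise."""
    if path in REDIRECT_MAP:
        return 301, REDIRECT_MAP[path]
    return 200, path

print(resolve("/widgets-blue"))  # (301, '/widgets/blue')
print(resolve("/gadgets"))       # (200, '/gadgets')
```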
Likewise, when a page is removed from the site, or when a redirect is created to another site, the number of viable URLs for that site decreases. In this instance, you’d expect to see indexation decrease.
The robots commands have great power to affect crawl and indexation rates. They are always the first and easiest place to look when you have concerns about indexation.
Robots.txt is an archaic text file that tells search bots which areas of the site they can crawl and which they should stay out of. Each bot can choose to obey, or not, the robots.txt file. The major search engines usually respect them. Thus, a decrease in indexation would come as a result of disallowing bots from crawling certain files and directories.
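A single disallowed directory can silently drop everything under it out of the crawl. Python's standard urllib.robotparser can verify what a rule set permits; the rules and URLs below are illustrative:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt: block all bots from /private/ and allow the rest.
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("*", "https://example.com/products/shoes"))  # True
print(parser.can_fetch("*", "https://example.com/private/report"))  # False
```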
Similarly, the noindex attribute of the robots meta tag instructs bots not to index an individual piece of content. The content will still be crawled, but the major search engines typically obey the command not to index — and therefore not to rank — pages that bear the noindex stamp.
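The tag itself is a one-liner placed in the page's head element:

```html
<!-- Tells compliant bots not to index (and therefore not to rank) this page. -->
<meta name="robots" content="noindex">
```

Because the engines must crawl the page to see the tag, a page blocked by robots.txt can never deliver its noindex instruction; the two mechanisms should not be combined on the same URL.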
Last but not least, Google Search Console and Bing Webmaster Tools offer page removal tools. These tools are very powerful and very effective. Content entered here will be removed from the index if it meets the requirements stated by the engines.
However, it is easy for someone to remove too much content and accidentally deindex large swaths of the site. After checking the robots.txt file and meta tags, make these tools your second stop to check for any recent manual removals.