MindMap Gallery URL Parameter - A Mind Map SEO Guide
A complete mind map summary of URL parameters. It includes 4 parts: 1. The common SEO issues with URL parameters; 2. How to evaluate your parameter problem; 3. 6 major SEO solutions to tame URL parameters; 4. Practices & Tips.
Edited at 2021-02-04 09:14:04
URL Parameter
SEO Issues with URL Parameters
1. Parameters Create Duplicate Content
Static URL
https://www.example.com/widgets
Tracking parameter
https://www.example.com/widgets?sessionID=32764
Reordering parameter
https://www.example.com/widgets?sort=newest
Identifying parameter
https://www.example.com?category=widgets
Searching parameter
https://www.example.com/products?search=widget
2. Parameters Waste Crawl Budget
Crawling redundant parameter pages drains crawl budget, reducing your site's ability to index SEO-relevant pages and increasing server load.
3. Parameters Split Page Ranking Signals
If you have multiple permutations of the same page content, links and social shares may be coming in on various versions. This dilutes your ranking signals. When you confuse a crawler, it becomes unsure which of the competing pages to index for the search query.
4. Parameters Make URLs Less Clickable
Parameter URLs are unsightly. They’re hard to read. They don’t seem as trustworthy. As such, they are less likely to be clicked.
Assess the Extent of Your Parameter Problem
Run a crawler
With a tool like Screaming Frog you can search for “?” in the URL.
Look in Google Search Console URL Parameters Tool
Google auto-adds the query strings it finds.
Review your log files
See if Googlebot is crawling parameter-based URLs.
Search with site: inurl: advanced operators
See how Google is indexing the parameters you found by querying site:example.com inurl:key for each parameter key.
Look in Google Analytics All Pages report
Search for “?” to see how users interact with each of the parameters you found. Be sure to check that URL query parameters have not been excluded in the view settings.
SEO Solutions to Tame URL Parameters
Limit Parameter-Based URLs
1. Eliminate Unnecessary Parameters
Ask your developer for a list of every parameter on the website and its function. Chances are, you will discover parameters that no longer perform a valuable function. Any parameters caused by technical debt should be eliminated immediately.
2. Prevent Empty Values
URL parameters should be added to a URL only when they have a function. Don’t permit parameter keys to be added if the value is blank.
3. Use Keys Only Once
Avoid applying multiple parameters with the same parameter name and a different value.
4. Order URL Parameters
Apply consistent ordering rules so that URLs with the same parameters in a different order resolve to a single URL; otherwise, search engines interpret each ordering as a separate page.
Pros
Allows more efficient use of crawl budget.
Reduces duplicate content issues.
Consolidates ranking signals to fewer pages.
Suitable for all parameter types.
Cons
Moderate technical implementation time
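The four limiting rules above (eliminate unnecessary parameters, prevent empty values, use keys only once, order parameters consistently) can be sketched as a small URL-normalization routine. This is an illustrative sketch, not a drop-in implementation; the allow-list of parameter keys is a hypothetical example.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical allow-list of parameters that still serve a function.
ALLOWED_KEYS = {"category", "sort", "search"}

def normalize_url(url: str) -> str:
    """Drop unnecessary and empty parameters, keep one value per key,
    and emit keys in a consistent (alphabetical) order."""
    parts = urlsplit(url)
    params = {}
    for key, value in parse_qsl(parts.query, keep_blank_values=True):
        if key not in ALLOWED_KEYS or not value:
            continue  # eliminate unnecessary parameters and empty values
        params.setdefault(key, value)  # use each key only once
    query = urlencode(sorted(params.items()))  # consistent ordering
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, ""))
```

For example, normalize_url("https://www.example.com/widgets?sort=newest&sessionID=32764") would drop the tracking parameter and keep the functional sort parameter.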
Rel=”Canonical” Link Attribute
The rel=”canonical” link attribute calls out that a page has identical or similar content to another. This encourages search engines to consolidate the ranking signals to the URL specified as canonical.
Pros
Relatively easy technical implementation.
Very likely to safeguard against duplicate content issues.
Consolidates ranking signals to the canonical URL.
Cons
Wastes crawl budget on parameter pages.
Not suitable for all parameter types.
Interpreted by search engines as a strong hint, not a directive.
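As a sketch, using the tracking-parameter example from earlier, the parameterized page would carry a canonical link pointing at the clean URL:

```html
<!-- In the <head> of https://www.example.com/widgets?sessionID=32764 -->
<link rel="canonical" href="https://www.example.com/widgets" />
```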
Meta Robots Noindex Tag
Set a noindex directive for any parameter-based page that doesn’t add SEO value. This tag prevents search engines from indexing the page. URLs with a “noindex” tag are also likely to be crawled less frequently, and if the tag is present for a long time, Google will eventually nofollow the page’s links.
Pros
Relatively easy technical implementation.
Very likely to safeguard against duplicate content issues.
Suitable for all parameter types you do not wish to be indexed.
Removes existing parameter-based URLs from the index.
Cons
Won’t prevent search engines from crawling URLs, but will encourage them to do so less frequently.
Doesn’t consolidate ranking signals.
Interpreted by search engines as a strong hint, not a directive.
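A minimal example of the tag on a parameter page that should stay out of the index:

```html
<!-- In the <head> of a parameter-based page with no SEO value -->
<meta name="robots" content="noindex" />
```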
Robots.txt Disallow
The robots.txt file is what search engines look at first before crawling your site. If they see something is disallowed, they won’t crawl there at all. You can use this file to block crawler access to every parameter-based URL (with Disallow: /*?*) or only to specific query strings you don’t want crawled.
Pros
Simple technical implementation.
Allows more efficient use of crawl budget.
Avoids duplicate content issues.
Suitable for all parameter types you do not wish to be crawled.
Cons
Doesn’t consolidate ranking signals.
Doesn’t remove existing URLs from the index.
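For example, a robots.txt that blocks every parameter-based URL, or only a specific tracking key, could look like this (the sessionID key follows the earlier example):

```
# robots.txt
User-agent: *
# Block every URL containing a query string:
Disallow: /*?*
# Or, instead, block only a specific parameter key:
# Disallow: /*?*sessionID=
```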
URL Parameter Tool in Google Search Console
Configure Google’s URL parameter tool to tell crawlers the purpose of your parameters and how you would like them to be handled.
Pros
No developer time needed.
Allows more efficient use of crawl budget.
Likely to safeguard against duplicate content issues.
Suitable for all parameter types.
Cons
Doesn’t consolidate ranking signals.
Interpreted by Google as a helpful hint, not a directive.
Only works for Google, with lesser control available for Bing.
Move From Dynamic to Static URLs
Many people think the optimal way to handle URL parameters is to simply avoid them in the first place. To achieve this, you can use server-side URL rewrites to convert parameters into subfolder URLs.
URL with parameters: www.example.com/view-product?id=482794
Subfolder URL: www.example.com/widgets/blue
Pros
Shifts crawler focus from parameter-based to static URLs which have a higher likelihood to rank.
Cons
Significant investment of development time for URL rewrites and 301 redirects.
Doesn’t prevent duplicate content issues.
Doesn’t consolidate ranking signals.
Not suitable for all parameter types.
May lead to thin content issues.
Doesn’t always provide a linkable or bookmarkable URL.
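As a sketch of the rewrite-plus-redirect setup, assuming an Apache server and a hypothetical one-to-one mapping between the example id and slug above:

```apache
# Apache .htaccess sketch (the id-to-slug mapping is hypothetical).
RewriteEngine On

# Internally map the static URL onto the parameterized handler.
RewriteRule ^widgets/blue$ /view-product?id=482794 [L]

# 301-redirect direct requests for the old parameterized URL.
# THE_REQUEST is checked so the internal rewrite above doesn't loop.
RewriteCond %{THE_REQUEST} \s/view-product\?id=482794[\s&]
RewriteRule ^view-product$ /widgets/blue? [R=301,L]
```

In practice, each parameter-to-slug mapping would be generated from the product database rather than hand-written per URL.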
Practice & Tips
Often the SEO solutions actively conflict with one another.
If you implement a robots.txt disallow, Google cannot see any meta noindex tag on the blocked pages.
You also shouldn’t combine a meta noindex tag with a rel=canonical link attribute.
Practice Plan
Do keyword research to understand which parameters should become search-engine-friendly static URLs.
Implement correct pagination handling with rel=”next” and rel=”prev”.
For all remaining parameter-based URLs, implement consistent ordering rules, which use keys only once and prevent empty values to limit the number of URLs.
Add a rel=canonical link attribute to suitable parameter pages to consolidate ranking signals.
Configure URL parameter handling in both Google and Bing as a failsafe to help search engines understand each parameter’s function.
Double check that no parameter-based URLs are being submitted in the XML sitemap.
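As a quick check for that last step, a short script can flag parameter-based URLs in an XML sitemap. This is a minimal sketch assuming a standard sitemap in the sitemaps.org 0.9 namespace:

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def parameter_urls(sitemap_xml: str) -> list[str]:
    """Return every <loc> entry in the sitemap that contains a query string."""
    root = ET.fromstring(sitemap_xml)
    locs = [el.text.strip() for el in root.findall(".//sm:loc", NS) if el.text]
    return [url for url in locs if "?" in url]
```

Any URL this returns should either be removed from the sitemap or replaced with its canonical, parameter-free version.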