URL Parameter - A Mind Map SEO Guide

This mind map walks through the SEO issues URL parameters cause, how to assess the extent of your parameter problem, and the solutions available to tame them.
SEO Issues with URL Parameters
1. Parameters Create Duplicate Content
Often, multiple parameter-based URLs return essentially the same content as the static URL. For example:
Static URL: https://www.example.com/widgets
Tracking parameter: https://www.example.com/widgets?sessionID=32764
Reordering parameter: https://www.example.com/widgets?sort=newest
Identifying parameter: https://www.example.com?category=widgets
Searching parameter: https://www.example.com/products?search=widget
2. Parameters Waste Crawl Budget
Crawling redundant parameter pages drains crawl budget, reducing your site’s ability to index SEO-relevant pages and increasing server load.
3. Parameters Split Page Ranking Signals
If multiple permutations of the same page content exist, links and social shares may be spread across the various versions. This dilutes your ranking signals and leaves the crawler unsure which of the competing pages to index for a search query.
4. Parameters Make URLs Less Clickable
Parameter URLs are unsightly. They’re hard to read. They don’t seem as trustworthy. As such, they are less likely to be clicked.
Assess the Extent of Your Parameter Problem
Run a crawler: with a tool like Screaming Frog, you can search for “?” in the URL.
Look in the Google Search Console URL Parameters Tool: Google auto-adds the query strings it finds.
Review your log files: see if Googlebot is crawling parameter-based URLs (a sample command follows this list).
Search with site: and inurl: advanced operators: learn how Google is indexing the parameters you found by putting the key in a site:example.com inurl:key combination query.
Look in the Google Analytics All Pages report: search for “?” to see how each of the parameters you found is used by users. Be sure to check that URL query parameters have not been excluded in the view settings.
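For the log file review, a quick command-line check can surface what Googlebot is requesting. A minimal sketch, assuming an Apache-style access.log (the file name, path, and field positions will vary by server):

# Count parameter-based URLs requested by Googlebot, most-crawled first
grep "Googlebot" access.log | grep "?" | awk '{print $7}' | sort | uniq -c | sort -rn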
SEO Solutions to Tame URL Parameters
Limit Parameter-Based URLs
1. Eliminate Unnecessary Parameters
Ask your developer for a list of every website parameter and its function. Chances are, you will discover parameters that no longer perform a valuable function. Any parameters caused by technical debt should be eliminated immediately.
2. Prevent Empty Values
URL parameters should be added to a URL only when they have a function. Don’t permit parameter keys to be added if the value is blank (e.g., https://www.example.com/widgets?color=).
3. Use Keys Only Once
Avoid applying multiple parameters with the same parameter name and a different value. For example, prefer a single key with combined values (?colors=blue,red) over a repeated key (?colors=blue&colors=red).
4. Order URL Parameters
If the same URL parameters are rearranged, search engines interpret the URLs as separate pages even though the content is identical: ?color=blue&size=large and ?size=large&color=blue return the same page at two crawlable URLs. Ask your developer to always place parameters in a consistent order.
Pros
Allows more efficient use of crawl budget.
Reduces duplicate content issues.
Consolidates ranking signals to fewer pages.
Suitable for all parameter types.
Cons
Moderate technical implementation time.
Rel=”Canonical” Link Attribute
The rel=”canonical” link attribute calls out that a page has identical or similar content to another. This encourages search engines to consolidate the ranking signals to the URL specified as canonical.
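For example, a sorted product list could point back to the unsorted static URL with a link element in its <head>. A minimal sketch using the example URLs above:

<!-- On https://www.example.com/widgets?sort=newest -->
<link rel="canonical" href="https://www.example.com/widgets" />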
Pros
Relatively easy technical implementation.
Very likely to safeguard against duplicate content issues.
Consolidates ranking signals to the canonical URL.
Cons
Wastes crawl budget on parameter pages.
Not suitable for all parameter types.
Interpreted by search engines as a strong hint, not a directive.
Meta Robots Noindex Tag
Set a noindex directive for any parameter-based page that doesn’t add SEO value. This tag will prevent search engines from indexing the page. URLs with a “noindex” tag are also likely to be crawled less frequently, and if the tag is present for a long time, it will eventually lead Google to nofollow the page’s links.
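For instance, an internal search results page could carry the tag in its <head>. A minimal sketch:

<!-- On https://www.example.com/products?search=widget -->
<meta name="robots" content="noindex">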
Pros
Relatively easy technical implementation.
Very likely to safeguard against duplicate content issues.
Suitable for all parameter types you do not wish to be indexed.
Removes existing parameter-based URLs from the index.
Cons
Won’t prevent search engines from crawling URLs, but will encourage them to do so less frequently.
Doesn’t consolidate ranking signals.
Interpreted by search engines as a strong hint, not a directive.
Robots.txt Disallow
The robots.txt file is the first thing search engines look at before crawling your site. If they see something is disallowed, they won’t go there at all. You can use this file to block crawler access to every parameter-based URL (with Disallow: /*?*) or only to specific query strings you don’t want crawled.
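For example, to block all query strings, or only a specific key such as the sessionID tracking parameter (a sketch; swap in your own parameter keys):

User-agent: *
# Block every URL that contains a query string
Disallow: /*?*
# Alternatively, block only a specific parameter key
# Disallow: /*?*sessionID=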
Pros
Simple technical implementation.
Allows more efficient use of crawl budget.
Avoids duplicate content issues.
Suitable for all parameter types you do not wish to be crawled.
Cons
Doesn’t consolidate ranking signals.
Doesn’t remove existing URLs from the index.
URL Parameter Tool in Google Search Console
Configure Google’s URL parameter tool to tell crawlers the purpose of your parameters and how you would like them to be handled.
Pros
No developer time needed.
Allows more efficient use of crawl budget.
Likely to safeguard against duplicate content issues.
Suitable for all parameter types.
Cons
Doesn’t consolidate ranking signals.
Interpreted by Google as a helpful hint, not a directive.
Only works for Google, with lesser control for Bing.
Move From Dynamic to Static URLs
Many people think the optimal way to handle URL parameters is to simply avoid them in the first place. To achieve this, you can use server-side URL rewrites to convert parameters into subfolder URLs.
URL with parameters: www.example.com/view-product?id=482794
Subfolder URL: www.example.com/widgets/blue
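How the rewrite works depends on your server. A minimal Apache mod_rewrite sketch (.htaccess) for the example above, assuming a hypothetical one-to-one mapping between id=482794 and /widgets/blue; an equivalent rule pair would be needed for every mapped product:

RewriteEngine On
# 301-redirect the old parameter URL, matching only the original
# client request (THE_REQUEST is unaffected by internal rewrites,
# which prevents a redirect loop)
RewriteCond %{THE_REQUEST} \s/view-product\?id=482794\s
RewriteRule ^view-product$ /widgets/blue? [R=301,L]
# Internally map the static URL back to the existing dynamic script
RewriteRule ^widgets/blue$ /view-product?id=482794 [L]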
Pros
Shifts crawler focus from parameter-based to static URLs, which have a higher likelihood to rank.
Cons
Significant investment of development time for URL rewrites and 301 redirects.
Doesn’t prevent duplicate content issues.
Doesn’t consolidate ranking signals.
Not suitable for all parameter types.
May lead to thin content issues.
Doesn’t always provide a linkable or bookmarkable URL.
Practice & Tips
Often the SEO solutions actively conflict with one another.
For example, if you implement a robots.txt disallow, Google can’t crawl the page, so it will never see a meta noindex tag on it.
You also shouldn’t combine a meta noindex tag with a rel=canonical link attribute, as the conflicting signals can confuse search engines.
Practice Plan
Do keyword research to understand which parameters should become search-engine-friendly, static URLs.
Implement correct pagination handling with rel=”next” and rel=”prev” (see the markup sketch after this plan).
For all remaining parameter-based URLs, implement consistent ordering rules, use keys only once, and prevent empty values to limit the number of URLs.
Add a rel=canonical link attribute to suitable parameter pages to consolidate ranking signals.
Configure URL parameter handling in both Google and Bing as a failsafe to help search engines understand each parameter’s function.
Double check that no parameter-based URLs are being submitted in the XML sitemap.
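For the pagination step above, the link elements go in each page’s <head>. A minimal sketch for page 2 of a hypothetical paginated series:

<!-- On https://www.example.com/widgets?page=2 -->
<link rel="prev" href="https://www.example.com/widgets?page=1" />
<link rel="next" href="https://www.example.com/widgets?page=3" />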