For most Internet users, a URL (Uniform Resource Locator) is nothing more than an address that points to a unique web page. But for an SEO expert, it is a whole lot more than that.
A URL provides a number of signals to search engines such as:
- What kind of content is published on the page
- What is the purpose of the page
- Who is the intended audience of the page
These signals can be strong or weak, but it is important to optimize your website's URLs for the best results. Below is a compilation of the most important guidelines to keep in mind:
1. Always create URLs that can be easily read by humans
If a URL looks like the output of an encryption algorithm, neither search engines nor humans can make sense of it.
Example of a bad URL: www.yourdomain.com/askj203948908pagenext98709
Example of a good URL: www.yourdomain.com/office-furniture-in-chicago
2. Use Hyphens NOT underscores in Web Page URLs
Google's crawlers treat underscores as word joiners rather than word separators, so office_chairs may be read as a single word while office-chairs is read as two.
Example of a bad URL: www.yourdomain.com/office_chairs_in_chicago
Example of a good URL: www.yourdomain.com/office-chairs-chicago
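The two rules above can be automated when pages are generated. Here is a minimal sketch (the `slugify` helper is an illustration, not a standard library function) of turning a page title into a readable, hyphen-separated slug:

```python
import re

def slugify(title):
    """Turn a page title into a readable, hyphen-separated URL slug."""
    slug = title.lower()
    # Replace any run of non-alphanumeric characters (spaces,
    # underscores, punctuation) with a single hyphen
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    # Trim leading/trailing hyphens left over from punctuation
    return slug.strip("-")

print(slugify("Office Chairs in Chicago!"))  # office-chairs-in-chicago
```

Because every non-alphanumeric run collapses to one hyphen, the same helper also converts underscore-separated titles into hyphenated slugs.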
3. Got any bad URLs? Use Robots.Txt file to block them
This way, you can explicitly tell search engines like Google NOT to crawl these URLs. A bad URL may also be one that points to the same page as another, well-optimized URL.
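As an illustration, a robots.txt file at the root of your domain could block the bad URL from the earlier example for all crawlers (the path here is just the sample URL from above):

```
User-agent: *
Disallow: /askj203948908pagenext98709
```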
4. Keyword Optimize Your URLs
Use target keywords in your URLs to let search engines know what your pages are about. Just don't make the URLs unrelated to the content published on the pages.
5. Start Using 301 Redirects
Whenever the URLs of your website change, make it a point to use 301 redirects to 'inform' Google about the change. You do not want to lose your hard-earned page rankings, do you?
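On an Apache server, for example, a 301 redirect can be declared in an .htaccess file with a single line (the paths below reuse the sample URLs from earlier; your old and new paths will differ):

```
Redirect 301 /office_chairs_in_chicago /office-chairs-chicago
```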
6. Have any Mobile Friendly URLs?
Make sure to add them all to your sitemaps. This increases the chance of these mobile-friendly pages ranking higher in Google's mobile search results.
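A sitemap is an XML file following the sitemaps.org protocol; a minimal entry for one page looks like this (the URL is the sample domain from earlier):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.yourdomain.com/office-chairs-chicago</loc>
  </url>
</urlset>
```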
7. Create an easily identifiable category structure for your URLs
You may simply have all URLs sitting flat, directly under your domain's root, or you may have them organized in a meaningful, hierarchical way.
The idea is to create a directory structure that search engine crawlers can easily understand.
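As a hypothetical illustration (the category names are invented), the difference between a flat and a hierarchical structure might look like this:

```
Flat:         www.yourdomain.com/office-chairs-chicago
Hierarchical: www.yourdomain.com/furniture/chairs/office-chairs-chicago
```

In the hierarchical version, a crawler can infer from the path alone that the page belongs to the chairs category within furniture.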