Technical SEO

Get everything technically right. Technical SEO is like building the road: build a solid road and your site will survive in the long run. So if you want a strong road map for your site, you must have technical SEO in place.
Before you take a site live, make sure your technical SEO is in place and there are no technical errors. Once Google has indexed your site, it is very difficult to roll back and very hard to rank, and in some cases Google can penalise you. So spend plenty of time on technical SEO: crawl the whole site, check the URL structure, fix error issues, and clean up duplicate and blank meta tags.

Here are some key technical SEO points:

Proper Redirects and Status Codes
      200: The page loads just fine.
      301: The content has permanently moved to a new location.
      302: The page has moved for the time being, but it won't stay away forever.
      404: The page doesn't exist.

As a technical SEO tip, we recommend always using a 301 (permanent) redirect when mapping an old URL to a new URL. A 301 tells the search engine to drop the old page from its index and replace it with the new URL. Search engines transfer most of the link equity from the old page to the new one, so you won't suffer a loss in rankings.
If you are redirecting permanently, the status code needs to be a 301, not a 302. Otherwise you aren't passing any value from Page X to Page Y, and Page Y will probably never rank.
Don't redirect everything to the home page.
If a page doesn't exist, the status code needs to be 404, not 200. A missing page that returns 200 is called a soft 404, and it creates confusion between Google and your web server.
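
If you want to double-check what your server is actually returning, here is a minimal sketch in Python (using the requests library; the old URL is just a hypothetical example) that prints each hop in a redirect chain:

    import requests

    # Hypothetical old URL; replace with a page you have redirected.
    resp = requests.get("http://www.example.com/old-page", allow_redirects=True)

    for hop in resp.history:
        # Each hop should be a 301 for a permanent move, not a 302.
        print(hop.status_code, hop.url)

    # The final destination should return 200.
    print(resp.status_code, resp.url)

If any hop prints 302 where you intended a permanent move, fix the redirect before search engines index the wrong signal.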

Canonicals

Canonical URLs improve link and ranking signals when the same content is available through multiple URL structures, i.e. the same content can be accessed through multiple URLs. You need to identify the preferred URL or page, the one you want to rank.
Suppose "Page B" is the preferred URL. Then in the source code of Page A, you add <link rel="canonical" href="http://www.example.com/page/b">.

This tells the search engine: you are on Page A, but the content you should really read (and rank) is on Page B, the preferred URL. Two common cases:
1. Duplicate home pages, like www.example.com and www.example.com/index.html; here you can apply a canonical tag.
2. Duplicate paths to the same page, like www.example.com/page/a and www.example.com/page/a/1.

The canonical page is the one you want indexed by search engines, so that URL needs to be in your sitemap and your internal linking structure.
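
As a rough self-check, this minimal Python sketch (using the requests library; the URL is hypothetical, and the regex assumes the rel attribute comes before href in the tag) fetches a page and prints the canonical URL it declares:

    import re
    import requests

    # Hypothetical duplicate URL whose canonical tag we want to inspect.
    html = requests.get("http://www.example.com/page/a").text

    # Rough pattern: assumes rel="canonical" appears before href in the tag.
    match = re.search(r'<link[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']+)', html)
    print(match.group(1) if match else "No canonical tag found")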


URL Structure and Parameters

Check that the same URL does not exist in both upper and lower case, such as http://www.example.com/page and http://www.example.com/Page. Search engines can treat these as different URLs, which creates duplicate content. I generally keep all URLs lowercase; people rarely type capitalized URLs into a search bar. Where a capitalized variant does exist, redirect it to the lowercase version with a 301 status code, as in the sketch below.
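
Here is one way to do that redirect: a minimal sketch for a Flask application (the framework is my assumption; any web server or CMS has an equivalent) that 301-redirects mixed-case paths to lowercase:

    from flask import Flask, redirect, request

    app = Flask(__name__)

    @app.before_request
    def force_lowercase_path():
        # 301-redirect any path containing upper-case letters to its lowercase form.
        if request.path != request.path.lower():
            return redirect(request.path.lower(), code=301)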
Use parameters in URLs carefully. In some cases different parameters (or the same parameters in a different order) serve the same content, and this leads to duplicate page content; Google treats these as two different URLs. For example:

http://www.example.com/book?pub=tata&page=15&print=1
and
http://www.example.com/book?print=1&pub=tata&page=15
In Google Webmaster Tools, you can tell Google how it should read your parameters, but you should implement canonicals too.
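
One defensive technique is to normalize parameter order on your side, so every variant maps to a single canonical URL. A minimal sketch using Python's standard urllib.parse:

    from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

    def normalize(url):
        # Lowercase the URL and sort its query parameters so that
        # every parameter ordering maps to one canonical form.
        parts = urlsplit(url.lower())
        query = urlencode(sorted(parse_qsl(parts.query)))
        return urlunsplit((parts.scheme, parts.netloc, parts.path, query, ""))

    print(normalize("http://www.example.com/book?print=1&pub=tata&page=15"))
    # -> http://www.example.com/book?page=15&print=1&pub=tata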

So, when creating URLs, you should check these points:

1. Create clear, friendly URLs.
2. Each page should have one URL.
3. The home page should have one URL.
4. Make sure paginated URLs are handled properly.
5. Don't make URLs long; keep them short.
6. Structure URLs by section or category.

Protect Site Performance

Your website's server speed and page loading time, collectively called "site performance", can affect the user experience and impact SEO. The longer a web page takes to load, the lower the conversion rate and the higher the bounce rate. Search engine spiders also take longer to crawl slow pages, or may skip them and never index them, which hurts your rankings.

There can be many technical factors behind page load time:

1. Pages should be served gzip-compressed.
2. Load JavaScript, CSS and ads asynchronously where you can.
3. Defer files that are only needed later in the page load.
4. Check that server-side and page-level caching are in place.
5. Cache static files for 10-15 days (it depends on how frequently you update them).
6. A CDN is also a good option.
7. Use Google's free tool PageSpeed Insights.
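
You can spot-check points 1, 4 and 5 straight from the response headers. A minimal sketch with Python's requests library (the URL is hypothetical):

    import requests

    resp = requests.get("http://www.example.com/",
                        headers={"Accept-Encoding": "gzip"})

    print(resp.headers.get("Content-Encoding"))  # "gzip" if the page is compressed
    print(resp.headers.get("Cache-Control"))     # e.g. "max-age=1209600" for ~14 days
    print(resp.elapsed.total_seconds())          # server response time in seconds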


Indexing

1. Create an HTML sitemap for users and search engines.
2. If necessary, create an XML sitemap.
3. Check your use of the meta robots tag.
4. Check your use of the nofollow tag.
5. Check for broken links (404s).
6. Check redirects (302s and 301s).
7. Monitor site crawls and indexing.
8. Robots.txt file: is it set up correctly?
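
Points 5 and 6 are easy to automate. Here is a minimal sketch (Python with the requests library; the URL list is hypothetical and would normally come from a site crawl) that flags broken links and temporary redirects:

    import requests

    # Hypothetical list of URLs; in practice, feed in URLs from a site crawl.
    urls = [
        "http://www.example.com/page/a",
        "http://www.example.com/old-page",
    ]

    for url in urls:
        resp = requests.head(url, allow_redirects=False)
        if resp.status_code == 404:
            print("Broken link:", url)
        elif resp.status_code == 302:
            print("Temporary redirect (should it be a 301?):", url)
        elif resp.status_code == 301:
            print("Permanent redirect:", url, "->", resp.headers.get("Location"))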


Create a Custom 404 Error Page

When someone clicks a bad link or types in a wrong address on your website, a custom error page is displayed instead of the server's default error.
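
A minimal sketch of a custom 404 handler in Flask (the framework is my assumption; most servers and CMSs have an equivalent setting). The key detail is that the friendly page must still return the 404 status code, otherwise you create the soft-404 problem described earlier:

    from flask import Flask

    app = Flask(__name__)

    @app.errorhandler(404)
    def page_not_found(error):
        # Serve a helpful page, but keep the 404 status code so
        # search engines do not index the missing URL (no soft 404).
        body = "<h1>Page not found</h1><p>Try the <a href='/'>home page</a>.</p>"
        return body, 404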

Use Structured Data

Structured data markup works like this: you mark up your website content with additional bits of HTML code, and the search engines read these notes to learn what's what on your site. In other words, you provide some extra information to the search engines.
The biggest SEO benefit is that search results may display more relevant information from your site; those extra "rich snippets" of information appear below the title and description. Structured data vocabularies are available at Schema.org, including breadcrumb markup.
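
Structured data is commonly embedded as JSON-LD. The sketch below builds a hypothetical Schema.org BreadcrumbList in Python and prints the JSON you would place inside a <script type="application/ld+json"> tag in the page head:

    import json

    # Hypothetical breadcrumb trail for a page on www.example.com.
    breadcrumbs = {
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": 1, "name": "Books",
             "item": "http://www.example.com/books"},
            {"@type": "ListItem", "position": 2, "name": "Page B",
             "item": "http://www.example.com/page/b"},
        ],
    }

    # Paste the output into a <script type="application/ld+json"> tag.
    print(json.dumps(breadcrumbs, indent=2))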


Properly Use robots.txt

The robots.txt file allows or disallows crawling of your site by a particular engine. It can also block private directories from crawlers.
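
As an illustration, the sketch below defines minimal robots.txt rules that block a hypothetical private directory, then verifies them with Python's standard urllib.robotparser:

    from urllib.robotparser import RobotFileParser

    # Minimal robots.txt rules: block a hypothetical private directory
    # for all crawlers and point them at the XML sitemap.
    rules = [
        "User-agent: *",
        "Disallow: /private/",
        "Sitemap: http://www.example.com/sitemap.xml",
    ]

    parser = RobotFileParser()
    parser.parse(rules)

    # A well-behaved crawler must skip the private directory.
    print(parser.can_fetch("Googlebot", "http://www.example.com/private/page"))  # False
    print(parser.can_fetch("Googlebot", "http://www.example.com/page/a"))        # True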
