
What is On-Page SEO & How to Do It?

On-page SEO is the most fundamental and important part of Search Engine Optimization. It is done on our own website, so it can be fully controlled and edited. It covers the various guidelines laid down by search engines such as Google or Bing to help a website rank well on the search results.

Meta Title

It is the most important element in on-page SEO. Each page should have a unique title of fewer than 65 characters. Place one or two important keywords that you want to rank for in the title tag, preferably at the beginning of the title. Write an attractive title, relevant to your webpage content, that catches the eye and draws users in.

<title>Free Web Design, Web Development Company India | Pinkwhale </title>

 

Meta Description

This is more like a short summary of a webpage. It has to be unique for every page and should be related to your page title and content. It can have a maximum length of 160 characters, and the targeted keyword should be included in it.

Eg- <meta name="description" content="Description goes here…" />

Meta Keywords

In this tag you are supposed to put a list of keywords; 4-5 keywords are enough. This tag is no longer important, as it is not given any SEO value by major search engines like Google. You may or may not use it, since it no longer matters for ranking.

<meta name="keywords" content="web hosting, website hosting, Hosting, cheap web hosting" />

URL Structure

Although you can write a very long URL, make sure it is short and readable. A URL can be of two types: dynamic and static. A dynamic URL is a randomly generated URL made up of default characters, parameters, or random numbers. As you can see below, dynamic URLs are difficult for both users and search engines to read and understand.

http://iibmindia.in/course-detail.php?crId=14

The other type is the static URL, which should be used for on-page SEO as it is search engine friendly. Static URLs are easy for both search engines and humans to read and understand, and you can include a keyword in the URL for SEO benefit. Look at the static URL shown below.

http://iibmindia.in/courses/master-in-business-administration-mba

Heading Tag

For the headline of a page, use the H1 tag, because it carries some SEO value. Apart from the H1 tag, you can also use H2 and H3 tags on your pages where required to highlight sub-headings. Although there are six heading tags (H1 to H6), try to use at least an H1 and an H2 or H3 on every page if possible.
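For example, a course page might be marked up like this (the heading text below is only illustrative):

<h1>Master in Business Administration (MBA)</h1>
<h2>Course Eligibility</h2>
<h3>Admission Process</h3>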

Keyword Research

For on-page SEO it is very important to do keyword research. For this we use a tool like Google's Keyword Planner to find relevant keywords. We normally try to find keywords with high search volume, low competition, and long-tail variations. Use LSI keywords throughout your content, as they help improve the relevancy of the content on a webpage. LSI (Latent Semantic Indexing) keywords are synonyms and closely related terms for your target keywords. LSI Graph is one of the best tools for generating LSI keywords.

Types of keywords-

Short-tail Keyword – a 1-2 word keyword with very high competition and high search volumes.

Medium-tail Keyword – a 2-3 word keyword with medium competition and medium search volumes.

Long-tail Keyword – a 3-5 word keyword which normally has low search volumes and low competition.

Internal Linking

It is the interlinking of various pages within a website. It gives users a better experience and makes crawling easier for search engines, which is why it is one of the ranking factors.
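For example, a blog post can link to another page of the same site with an ordinary HTML anchor (the path below is only illustrative):

<a href="/courses/master-in-business-administration-mba">Read more about our MBA course</a>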

External Links

Use external links pointing to relevant external websites on your website's pages and blog posts. Using these links makes your webpage more relevant in the eyes of search engines.
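An external link is just an anchor pointing to another domain, for example (the URL below is only a placeholder):

<a href="https://www.example.com/seo-guide">a useful SEO guide</a>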

Sitemap

It is an XML file containing a list of all the pages in a website. This file lets search engines know about the various pages of the website. It is like a map of your website and is useful for search engines rather than users. There are various free online tools available for creating sitemaps, like XML-Sitemaps. After creating it, you will have to upload it to the server. Finally, you can submit it to Google Webmaster Tools under the Crawl section.

Ex- www.webcanny.co.nz/sitemap.xml
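A minimal sitemap.xml might look like the sketch below, with one <url> entry per page (only one page is shown here for illustration):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.webcanny.co.nz/</loc>
  </url>
</urlset>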

Robots.txt

It is a file kept in the root folder of the website which contains instructions for search engine robots. The robots.txt file is the first thing search engines visit on a site. In it you can define access to the various pages of your website; if you don't want bots to visit a particular page, image, or folder, you can block them with a few simple instructions.

 

User-agent: *   – the rules that follow apply to all robots.

Disallow: /   – robots are not allowed to visit any page of the site.
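For example, a robots.txt that blocks a particular folder and a particular page (the paths below are only hypothetical) could look like this:

User-agent: *
Disallow: /admin/
Disallow: /private-page.html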

Optimization of Images

Use a descriptive alt attribute for every image.

Use a keyword in the image file name.

Use compressed images on the website; it improves the page load time.
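Putting these points together, an image tag might look like this (the file name, alt text, and dimensions are only illustrative):

<img src="cheap-web-hosting.jpg" alt="Cheap web hosting plans" width="600" height="400" />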

Improving Page & Website Speed

Google considers website speed and page load time as one of the ranking factors, so it is very important to improve website speed. We can use tools like Google PageSpeed Insights to test the speed of our site. There are other factors, listed below, by which we can improve website speed.

  • Leveraging browser caching
  • Enabling Gzip compression by adding code to the .htaccess file on the web server (see the sample snippet after this list)
  • Serving properly scaled images
  • Compressing images
  • Specifying image dimensions
  • Adding Expires headers in the .htaccess file
  • Reducing the number of HTTP requests on pages
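As a rough sketch, the Gzip compression and Expires headers mentioned above can be enabled on an Apache server with rules like the following in the .htaccess file (whether these modules are available depends on your hosting):

# Enable Gzip compression for text-based resources (mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Tell browsers to cache static files (mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
</IfModule>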

W3C Code Validation

Search engines don't like websites full of HTML, JavaScript, and CSS errors, so it is very important to run the W3C validator, check for errors on the website, and rectify them. Since errors can affect search engine rankings, SEO experts should identify them and communicate them to the web developers so they can be corrected.

Keyword density

Don't spam by stuffing too many keywords into your content for SEO benefit, and do not use the same keyword again and again. A keyword density of around 2% is fine if you are using various variations of your keywords.
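Keyword density is simply the number of times a keyword appears divided by the total word count. For example, if a 1,000-word article uses the target keyword and its variations about 20 times in total, the density is (20 / 1,000) × 100 = 2%.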

Breadcrumb Navigation

Breadcrumb navigation is very user friendly and provides a good user experience, so we should use it on our website. It also makes it easy for search engines to crawl other pages of the site via the breadcrumb links. It is one of the many ranking factors, so it should be used on our website; you can ask your web designer or developer to add it.
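A typical breadcrumb trail is just a row of links from the home page down to the current page, for example (the page names below are only illustrative):

<a href="/">Home</a> » <a href="/courses">Courses</a> » Master in Business Administration (MBA)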

Broken Links

Search engines and users do not like broken links or 404 error pages, and they can affect the rankings of a website. You can use tools like deadlinkchecker.com or Google Webmaster Tools to find broken links and remove them, or redirect them to the home page or another relevant page.

Mobile Friendly Website

Google and other search engines strongly recommend making a mobile-friendly site, so this has become an important ranking factor and will help websites rank higher. Make your website mobile-friendly or responsive so that users have a better experience when they open it on mobiles and tablets. There are various tools, like mobiletest.me, to check whether your site is responsive.
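A responsive page normally also includes the viewport meta tag in its <head>, for example:

<meta name="viewport" content="width=device-width, initial-scale=1" />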

URL Canonicalization

It is used when we have to redirect the www URL to the non-www URL or vice versa. It is done to prevent duplication of content, because otherwise both URLs may appear in search results.

Example-    https://www.example.com and https://example.com may both get indexed on search engine results pages. Since both URLs have the same content, there will be a content duplication issue and Google may penalize the website, so to prevent this we do a 301 redirection from one URL to the other.
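As a sketch, on an Apache server the non-www to www redirect from the example above could be done with mod_rewrite rules like these in the .htaccess file:

<IfModule mod_rewrite.c>
  RewriteEngine On
  # 301 redirect example.com to www.example.com
  RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
  RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
</IfModule>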

Canonical Tag

This tag is used when there is an issue of content duplication. We can use it in the following situations.

  • When we have www and non-www versions of a website.
  • When multiple pages are created due to categories and tags on a CMS.
  • When there are both HTTP and HTTPS URLs for a website.
  • When the same content is used on different URLs, for example in e-commerce websites.
  • When syndicated content is shared on multiple different sites.

To solve the issue of duplicate content, we put the code below on the duplicate page, telling search engines that it is a duplicate of the original page.

<link rel="canonical" href="http://www.techmappers.com/index.html" />

.htaccess File

It is a plain-text file containing various codes or instructions, kept in the root folder of the server. Normally, codes related to Expires headers, Gzip compression, and 301 redirects are written in this file.

 

301 Redirect Usage

This is used when we have to redirect one page to another page, or one domain to another domain. It can be used in the following circumstances (a sample redirect rule is shown after the list).

  • When we are changing our company domain to a new domain (URL).
  • When we have to redirect the www URL to the non-www URL or vice versa.
  • When we want to redirect the index page to the home page.
  • When we wish to redirect a 404 error page to the home page.
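For instance, a single old page can be redirected to a new one with a one-line rule in the .htaccess file (the page names below are only placeholders):

Redirect 301 /old-page.html https://www.example.com/new-page.html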

 

Related Links- SEO Course in Bangalore

 

 

Prakash

I am a Digital Marketing Expert who helps small businesses grow online. I build brands by improving sales and conversions through online marketing channels like SEO, Social Media, AdWords, and Content Marketing. In my free time, I write blogs and share my knowledge with others.
