Can you imagine how many blogs there are in the world?
More than 500 million! And on average, more than 2 million blog posts are made every day.
Now you can see how difficult it is to get your blog onto the first page of a search engine while competing with so many blogs and blog posts.
There is no substitute for Search Engine Optimization (SEO) for this difficult task. That is why marketers spend a lot of money on SEO, and by the end of this article you will be inspired to do the same.
What is Search Engine Optimization (SEO)?
Certain rules and procedures must be followed to make a website visible to search engines and to get it onto their first page of results, and many tools must be used properly to achieve that rank. Search Engine Optimization, or SEO, is the process of ranking a website in search engines using these tools and methods.
Why is Search Engine Optimization (SEO) needed?
Your blog will generate revenue only when you optimize it; without optimization, you will not attract many visitors.
A large share of visitors comes from search results: 53.3% of a website’s total traffic comes from organic search (BrightEdge).
The traffic you get through SEO also dwarfs what you get from social media: SEO produces over 1,000% more traffic than social media.
If you can bring your website to the first page of Google, your visitors will increase a lot.
- The #1 result in Google gets approximately 32% of all clicks (Backlinko).
- 90.83% of pages receive no organic search traffic from Google.
Clearly, it is not possible to grow your audience without SEO.
White hat vs. black hat SEO
Everyone tries to rank high in search engines. Good results do not come immediately; you have to give it time. But some people try to gain rankings through spam or other illicit techniques.
They do not want to follow the Google Webmaster Guidelines or the Bing Webmaster Guidelines. This practice is called black hat SEO. Search engines penalize such sites, and they do not rank.
According to WordStream, black hat practitioners use techniques such as:
- Content Automation
- Doorway Pages
- Hidden Text or Links
- Keyword Stuffing
- Reporting a Competitor (or Negative SEO)
- Sneaky Redirects
- Link Schemes
- Guest Posting Networks
- Link Manipulation (including buying links)
- Article Spinning
- Link Farms, Link Wheels or Link Networks
- Rich Snippet Markup Spam
- Automated Queries to Google
- Creating pages, subdomains, or domains with duplicate content
- Pages with malicious behavior, such as phishing, viruses, trojans, and other malware
Those who follow the search engine guidelines and rank pages using legitimate tools and methods practice white hat SEO.
If you want to run a business for the long term, you must do white hat SEO.
In addition to black hat and white hat SEO, there is a third technique: gray hat SEO. It is neither fully white hat nor as clearly illegitimate as black hat SEO. Gray hat SEO will not usually earn you a search engine penalty, but Google does not endorse it.
How does a search engine work?
Search engines perform three major functions:
Crawling – The process by which a search engine discovers new and updated content and URLs on the Internet using one or more bots (often called spiders or crawlers).
Indexing – The information collected during crawling is organized and stored according to specific rules, so it can later be used to serve results for users’ searches.
Ranking – Ordering the indexed content so that the most relevant and accurate results appear first for a user’s search.
To check your site’s index status, type “site:yourdomain.com” into the Google search bar; Google will show you the site’s indexed pages.
If your site is not found by search engines, there may be several reasons:
- Your site is brand new and Google has not yet indexed it.
- Your site does not have any kind of backlinks.
- The bot could not crawl because of a problem with your menu / navigation.
- Your site may contain code that prevents search engine bots from crawling it.
- Your site has been penalized.
The robots.txt file is located in the root directory of the website (e.g. yourdomain.com/robots.txt). It tells search engines which parts of the site may be crawled and which may not.
How does Googlebot handle the robots.txt file?
- If Googlebot does not find a robots.txt file for a site, it crawls the site without any restrictions.
- If Googlebot finds a robots.txt file for a site, it follows its instructions while crawling the site.
- If Googlebot encounters an error while trying to access a site’s robots.txt file and cannot determine whether one exists, it will not crawl the site.
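As an illustration, a minimal robots.txt might look like this (the disallowed path is a typical WordPress default; treat all paths here as hypothetical and adjust them for your own site):

```text
# Apply these rules to all crawlers
User-agent: *
# Keep crawlers out of the admin area
Disallow: /wp-admin/
# But allow the AJAX endpoint many WordPress themes rely on
Allow: /wp-admin/admin-ajax.php

# Tell crawlers where to find the sitemap
Sitemap: https://yourdomain.com/sitemap.xml
```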
Google’s Webmaster Guidelines offer some basic principles:
- Make pages primarily for users, not for search engines.
- Don’t deceive your users.
- Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website that competes with you, or to a Google employee. Another useful test is to ask, “Does this help my users? Would I do this if search engines didn’t exist?”
- Think about what makes your website unique, valuable, or engaging. Make your website stand out from others in your field.
Avoid the following techniques:
- Automatically generated content
- Participating in link schemes
- Creating pages with little or no original content
- Sneaky redirects
- Hidden text or links
- Doorway pages
- Scraped content
- Participating in affiliate programs without adding sufficient value
- Loading pages with irrelevant keywords
- Creating pages with malicious behavior, such as phishing or installing viruses, trojans, or other badware
- Abusing structured data markup
- Sending automated queries to Google
HTML tags: ensure your HTML elements and alt attributes are descriptive, specific, and accurate.
- <TITLE> tags – Title of the page. Write descriptive and unique titles for each page of your website.
- <META name=”description”> – Brief summary and description of the webpage. This may appear as the page description in the search results. Write relevant descriptions, and you can use this added space to expand on the <title> tag in a meaningful way.
- <META name=”robots”> – Provides crawlers with instructions on how to crawl and index a specific page’s content; for example, it lets Bing know your snippet and content-preview preferences.
- <a href> tag – specifies the URL of a link to another page. To link to another part of the same page, use a #fragment in the URL.
- <img src> tag – specifies an image file to be displayed.
- alt attributes – use this attribute on <img> tags to describe the image. Use descriptive information rich context within alt attributes to provide context to the images.
- <H1> tag – when properly used, helps users understand the content of a page more clearly.
- <H1>-<H6> header tags – define the structure of your page and help Bing understand the content of each section.
- <P> tag – delineates paragraphs.
- <TABLE> tag – Use <TABLE><TH> etc. for data tables. Do not use <TABLE> for layout.
- Use HTML5 semantic elements, as they carry an intrinsic meaning for browsers, developers, and search engines. In particular, use the following HTML5 semantic elements: <article>, <aside>, <details>, <figcaption>, <figure>, <footer>, <header>, <main>, <mark>, <nav>, <section>, <summary>, <time>.
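To tie the tags above together, here is a minimal page skeleton using several of these semantic elements (the title reuses this article’s earlier example; all other content is placeholder):

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <title>What are the requirements to become an SEO expert?</title>
  <meta name="description" content="A short, relevant summary of the page.">
</head>
<body>
  <header>
    <!-- Site-wide navigation helps both users and crawlers -->
    <nav><a href="/">Home</a> <a href="/blog/">Blog</a></nav>
  </header>
  <main>
    <article>
      <h1>What are the requirements to become an SEO expert?</h1>
      <p>Introductory paragraph of the article…</p>
      <figure>
        <!-- Descriptive alt text gives crawlers context for the image -->
        <img src="seo-expert.jpg" alt="Checklist of skills an SEO expert needs">
        <figcaption>Skills every SEO expert should learn</figcaption>
      </figure>
    </article>
    <aside>Related posts…</aside>
  </main>
  <footer><p>&copy; yourdomain.com</p></footer>
</body>
</html>
```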
On-page Search Engine Optimization (SEO)
In order to rank a page, that page has to be optimized; if you optimize it properly, ranking becomes much easier. On-page SEO involves keyword research, image optimization, content optimization, link building, and writing good-quality articles.
To do on-page SEO, you have to work on the following elements of the page:
- Meta title
- Meta description
- URL setup
- Navigation setup
- Image optimization
- Content optimization
- Internal linking
- Site speed
- Webmaster tools submission
- Analytics setup
- robots.txt setup
- Sitemap creation and submission
- Structured data setup
- AMP setup
Each page title should be unique and accurate
The <title> tag of a page should give both users and search engines a good idea of the article at a glance. The title tag is placed inside the <head> element of the HTML.
See this example
<title>What are the requirements to become an SEO expert?</title>
Meta description tag
A page title alone does not give enough information about the content, so a meta description should also be provided. A meta description is one or two sentences, or a short paragraph, that gives search engines a summary of the page.
See this example
<title>What are the requirements to become an SEO expert?</title>
<meta name="description" content="An SEO expert is someone who knows how SEO works and how to apply it to increase a website's rankings in search engines.">
Search engines encourage bloggers to provide meta descriptions, and Google may use yours as the snippet for your page in search results.
Be sure to write meta descriptions that give search engines and readers a good idea of the content of your writing. This will help you rank well, and the number of visitors will also increase.
Add structured data markup
Structured data is code that you can add to your site’s pages. It describes your content to search engines, which can then use that information to display your content in search results in more attractive ways (rich results).
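For example, an article page could describe itself to search engines with schema.org’s Article type in a JSON-LD block placed in the page; all values below are hypothetical placeholders:

```html
<!-- JSON-LD structured data describing the page as an Article -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "What are the requirements to become an SEO expert?",
  "author": { "@type": "Person", "name": "Your Name" },
  "datePublished": "2021-01-01",
  "image": "https://yourdomain.com/images/seo-expert.jpg"
}
</script>
```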
Set up URLs
Each piece of content you create requires a unique URL. Search engine spiders use this URL to crawl your content and index your pages.
URLs are generally split into several distinct sections: the protocol, the domain, the path, and optional parameters (for example, https://yourdomain.com/blog/post?id=1).
Although Google treats the “www” and “non-www” versions of a domain as different, it recommends using https:// for all websites when possible.
So when you add your website to Google Search Console, you can add both versions.
Keep URLs short: the shorter the URL, the more user-friendly your page will be, and Google prefers short URLs.
Sometimes pages of a website show a “404 Not Found” error. A 404 is a Hypertext Transfer Protocol (HTTP) status code, and it occurs because of broken links or mistyped URLs.
If that happens, you may lose visitors. Therefore, add links (or redirects) so that visitors who land on these pages can reach the right content.
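On an Apache server, one common fix is a permanent (301) redirect from the broken URL to the correct one, for example in an .htaccess file (both paths here are hypothetical):

```text
# .htaccess: permanently redirect a moved post to its new URL
Redirect 301 /old-post/ https://yourdomain.com/new-post/
```

A 301 tells search engines the move is permanent, so they transfer the old page’s ranking signals to the new URL.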
Set up Navigation
Navigation is very important for any website: it is the structure of your site, and search engines give it great weight. Navigation gives visitors and search engines a basic idea of the site’s content, and lets visitors easily reach the content they need. Keep in mind that Google always values the convenience of its users; if your site is user-friendly, Google will rank it higher.
You can use breadcrumb lists. These allow users to quickly go back to the previous section or the main page, which improves your navigation system.
If your website is large, create a navigation page for the convenience of your users. This will allow them to browse your website more easily.
Create a sitemap for search engines
You need to create an XML sitemap so that search engines can easily find your new and updated pages. The sitemap lists all relevant URLs and works like a roadmap: with it, search engines can find out what content a website has and how to reach that content.
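A minimal XML sitemap following the sitemaps.org protocol might look like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/</loc>
    <lastmod>2021-06-01</lastmod>
  </url>
  <url>
    <loc>https://yourdomain.com/blog/what-is-seo/</loc>
    <lastmod>2021-06-05</lastmod>
  </url>
</urlset>
```

Submit the sitemap’s URL in Google Search Console (and reference it from robots.txt) so crawlers can find it.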
Optimize your content
- Create content for users, not for search engines. Write according to your readers’ demands.
- Create content that makes people want to read it. Users do not like unattractive topics or text, and you may lose rankings as a result.
- Keep your sentences short and your language simple so that users can easily understand them. The writing must be accurate.
- It is great if you can give a list of the topics covered at the beginning of the article.
- The writing must be unique and informative. Always provide up-to-date information, and write articles that fulfill users’ expectations.
Scraped content is the use of someone else’s writing without their permission. This is a serious offense, and it may earn you a search engine penalty.
Duplicate content is the use of the same content across different domains or subdomains. A writer may do this for legitimate reasons, which is why Google advises using the rel=canonical tag so that Google can identify the original version.
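For example, the duplicate page would carry this tag in its <head>, pointing to the original (the URL is a placeholder):

```html
<!-- Tells search engines which URL is the canonical (original) version -->
<link rel="canonical" href="https://yourdomain.com/original-post/">
```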
- Write regularly. If you do not publish regularly, you will not get good rankings in search engines.
- Post images and videos that are original and relevant, and give them descriptive titles. Always provide alt text, because search engines index images and videos based on it. Picture and video quality must be good.
- Page load time is very important for ranking, so pay special attention to it. The lower the page load time, the better your Search Engine Optimization (SEO) performance.
- Advertise on your blog, but do not place ads in a way that looks ugly or ruins the presentation of your content.
- Use links to enrich the article, but make sure they are relevant and well written.
- Do not create auto-generated content. Google dislikes it, and it can never produce a complete, polished article.
- Write articles in detail. Search engines tend to favor in-depth articles, so aim for at least 1,000 words to improve your chances of ranking well.
The next part is coming soon…