Search Engine Optimization #1: On-Page Optimization - Generating robots.txt and Sitemap Files


Hello friends! Welcome to our channel HIGH-TECHDROID, and this is BharathKrishna. Today we are going to look at what Search Engine Optimisation (SEO) is, its categories, and its uses. Let's get into the video.

Before getting into search engine optimisation, let's have a look at a website. A website comprises both a front end and a back end. For example, I've opened an application called Facebook. The part the user interacts with in the browser is called the front end. Now, if I give my username and password, the site looks them up and returns the result; this is handled by the supporting part called the back end. The front end is designed using mark-up and styling languages such as HTML and CSS, while back-end development is done in languages like PHP, C, C++, and Java, and recently even Python is being used.
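To picture the split, here is a tiny sketch; the /login endpoint is a made-up placeholder, not Facebook's real one. The form is the front end the user sees in the browser, and the server-side script behind /login that checks the credentials is the back end:

    <!-- Front end: the login form the browser renders -->
    <form action="/login" method="post">
      <input type="text" name="username" placeholder="Username">
      <input type="password" name="password" placeholder="Password">
      <button type="submit">Log in</button>
    </form>
    <!-- Back end: a server-side script behind /login (PHP, Java,
         Python, ...) receives these values and verifies them. -->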
Next, there is a third concept called digital marketing. Digital marketing simply means this: when we launch a new website, users (i.e., customers) do not have any idea about it, so we need to promote, i.e., advertise, the website. With the help of promotion, our user level increases, and as the user level increases, our website's views and ranking increase too. This is known as digital marketing.
Now there is a concept called search engine optimisation. Search engine optimisation is the process of getting traffic from the search engines. Let's take Google: if we create traffic to our website from Google, our viewer and user rates increase. This is search engine optimisation.

Search engine optimisation has two categories: 1. on-page optimisation and 2. off-page optimisation. On-page optimisation comprises all the activities we do inside our website to tune it. Whatever we do for our website outside the website itself is known as off-page optimisation. Under off-page optimisation falls another category called social media optimisation: we give the details of our website and promote it through image submission, video submission, and other such activities, publishing our website on social media. That is off-page optimisation in short; I will explain it in detail in the next video.
Now let's get into on-page optimisation. There are many strategies in on-page optimisation, and one of them is the robots file. Our website may contain private or confidential files that need to be protected from web crawlers. Web crawlers include Googlebot, Google's crawler; similarly, Bing, Yahoo, and others have their respective crawlers. When these crawlers visit our website, they first read the robots file. The robots.txt file states which pages are not allowed to be crawled, so the crawlers skip those pages and crawl the rest of the website. That is what the robots file is used for.
Now let me show you how the robots file looks, taking WordPress as an example. On opening wordpress.com/robots.txt, the robots.txt file of that particular site is displayed. In it, the user agent is given as *. The user agent refers to the search engines, and * denotes 'for all', meaning every search engine has to follow these rules; hence it is written as User-agent: *. Next, the Allow and Disallow rules let us select which pages the crawlers can and cannot crawl, respectively. In WordPress, pages like 'next' and 'activate' should not be crawled, so they are listed under Disallow in the robots file, and the crawlers skip those pages.
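To make this concrete, here is a minimal sketch of a robots.txt in that style; the paths are illustrative placeholders, not WordPress's actual rules:

    # Applies to all crawlers
    User-agent: *
    # Example pages the crawlers must not crawl
    Disallow: /next/
    Disallow: /activate/
    # Everything else may be crawled
    Allow: /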
To generate a robots.txt file, go to a search engine, type 'robots.txt file generator', and click on the first link. There is an option called 'Default - All Robots are'; do not change anything in it. Next, the search engines that will crawl the site are listed. Then paste the links of all your private and confidential files under 'Restricted Directories', so that these files are put under Disallow and the web crawlers do not crawl those pages. Next, click on 'Create robots.txt file', and our robots.txt file is generated. We can paste this into our site's root directory.
Next is the sitemap, our second concept. A sitemap provides a way for search engines like Google to access our website and its pages. To see the contents of a sitemap file, I go to the same application, WordPress, and look at its sitemap.xml. Our sitemap will always be in XML format; we create it as XML so that every page link of our site is listed in this one sitemap file.
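As a rough illustration of the format (the URLs here are placeholders, not WordPress's real sitemap), a minimal sitemap.xml looks like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/</loc>
        <lastmod>2020-01-01</lastmod>
      </url>
      <url>
        <loc>https://example.com/about/</loc>
      </url>
    </urlset>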
What happens is that Googlebot will find our sitemap page, crawl it, i.e., read it, read all the links on that page, understand them, and store them in its database. If you ask what happens by storing them: when anyone searches by our keywords, what Google does is show the links already stored in its database as the results. So our view rate increases, our website gets published to those users, and our website ranking also increases.
This sitemap file can be generated by going to a search engine and typing 'sitemap generator'; clicking the first link takes us to XML-SITEMAPS.COM. Here, by pasting our site link and simply pressing Start, our sitemap file is generated automatically. This is only suitable for websites with fewer than 500 pages; if a site has more than 500 pages, we can pay to create an account and generate a sitemap for it too. Now I am going to generate the sitemap for my blog. Just by pressing Start, it generates automatically. Once it is generated, when I click 'View sitemap details', all my website links are listed there. We have to download the file and paste it into our website's root directory. With that, we have successfully generated the sitemap for our website.
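If you prefer not to use an online generator, you can also write the file yourself. Here is a rough Python sketch, assuming a hand-maintained list of pages; the domain and paths are made-up placeholders:

    # Sketch: write a sitemap.xml for a hand-maintained list of pages.
    # Replace the base URL and paths with your own.
    from xml.sax.saxutils import escape

    pages = ["/", "/about/", "/blog/seo-basics/"]
    base = "https://example.com"

    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for path in pages:
        # escape() guards against &, <, > in URLs
        lines.append("  <url><loc>%s</loc></url>" % escape(base + path))
    lines.append("</urlset>")

    # Save the result where your web root expects it
    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write("\n".join(lines))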
When a web crawler visits our website, all our website links get stored in its database and suggested to the users who search for them. So today we have seen two strategies: the robots.txt file and the sitemap file. When we create a new application, i.e., a new site, we should not only create the sitemap file but also submit it in a webmaster tool, i.e., Google Webmaster Tools. What happens then is that Google automatically stores our pages in its database, so viewers will reach our website.
In case we haven't created a sitemap for our website, what happens? It will still get reach, but slowly. With a sitemap, the website gains reach faster than it would without one. So this is the use of a sitemap. And that brings us to the end of this video; we will see you in the next video. Thank you!
