
Search engine optimization (SEO)

Search engine optimization (SEO) is the process of increasing the quality and quantity of website traffic by increasing the visibility of a website or a web page to users of a web search engine.


Presentation Transcript


  1. Search Engine Optimization (SEO) SEO Tutorials

  2. Search Engine Optimization SEO stands for Search Engine Optimization, which is the practice of increasing the quantity and quality of traffic to your website through organic search engine results.

  3. What is a domain? • A domain name is a human-readable identity for one or more IP addresses, • i.e. the domain name (google.com) points to the IP address (74.125.127.147). • It is easier to remember a name than a long string of numbers.
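To see this name-to-address mapping in practice, the Python standard library can resolve a domain name to an address it currently points to. This is a minimal sketch, not part of the original slides; the address printed will differ from the 74.125.127.147 example above, since Google serves many rotating IPs.

```python
import socket

# Resolve a domain name to one of the IP addresses it currently points to.
# The result varies by location and time; the 74.125.127.147 value on the
# slide is only a historical example for google.com.
ip = socket.gethostbyname("google.com")
print("google.com ->", ip)
```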

  4. A domain name cannot have more than 63 characters, excluding .com, .net, .org, .edu, etc. • e.g. https://www.google.com • https - Protocol • www - Subdomain • google.com - Domain and domain suffix

  5. Uniform Resource Locator (URL) URL Structure • The URL contains the name of the protocol needed to access a resource, as well as a resource name. • The first part of a URL identifies what protocol to use as the primary access medium. • The second part identifies the IP address or domain name (and possibly subdomain) where the resource is located. URL protocols include HTTP (Hypertext Transfer Protocol) and HTTPS (HTTP Secure) for web resources, mailto for email addresses, ftp for files on a File Transfer Protocol (FTP) server, and telnet for a session to access remote computers. Most URL protocols are followed by a colon and two forward slashes; mailto is followed only by a colon.
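As a quick illustration of this structure (not part of the original slides), Python's standard urllib.parse module can split a URL into the parts described above; the example URL and its query string are made up for the demonstration.

```python
from urllib.parse import urlparse

# Hypothetical URL used only to illustrate the parts named on the slide.
parts = urlparse("https://www.google.com/search?q=seo")

print(parts.scheme)    # 'https'           -> the protocol
print(parts.hostname)  # 'www.google.com'  -> subdomain + domain + suffix
print(parts.path)      # '/search'         -> the resource name
print(parts.query)     # 'q=seo'           -> query string, if any
```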

  6. Top Level Domain • .com - Commercial websites • .in - Used in India • .pk - Pakistan • .int - International organizations • .edu - Educational institutions • .net - Network • .org - Any type of organization • .gov - Government websites • .biz - Business

  7. World Wide Web (WWW) • The World Wide Web is like a huge electronic book whose pages are stored on multiple servers across the world. These pages are connected by links called "hypertext". Unlike a book, where we move from one page to another in sequence, on the World Wide Web we follow a web of hypertext links to visit the desired page or information.

  8. What is a Search Engine? A search engine is a service that allows Internet users to search for content via the World Wide Web (WWW). A user enters keywords or key phrases into a search engine and receives a list of Web content results in the form of websites, images, videos or other online data. The list of content returned by a search engine to a user is known as a search engine results page (SERP).

  9. Top 10 Most Popular Search Engines • Google • Bing • Yahoo • Ask.com • AOL.com • Baidu • WolframAlpha • DuckDuckGo • Internet Archive • Yandex

  10. What is SEO? SEO stands for Search Engine Optimization. It is a process designed to optimize a website for search engines. It helps websites achieve a higher ranking in search engine results when people search for keywords related to their products and services. Thus, it increases the quantity and quality of traffic to a website through organic search engine results.

  11. Types of SEO • White Hat SEO • Black Hat SEO

  12. White Hat SEO • It refers to SEO techniques that are in accordance with the guidelines set by the search engines, i.e. it uses approved search engine optimization techniques to improve the ranking of a site on search engine results pages (SERP). • Unlike Black Hat SEO, it mainly focuses on the human audience as opposed to the search engine. People who are looking for a long-term investment in their websites rely on white hat SEO techniques.

  13. Black Hat SEO • It refers to SEO techniques that are not in accordance with the guidelines set by the search engines. These techniques exploit weaknesses in search engines to get higher rankings for websites on the search engine results pages (SERP). • It mainly focuses on search engines and not on the human audience. People who are looking for a quick financial return on their website rather than a long-term investment use black hat SEO techniques.

  14. White Hat SEO Techniques A list of 5 popular white hat SEO techniques is given below: Good content Unique, well-written content makes your website appear more trustworthy and valuable to search engines and human visitors. It optimizes your website for search engines, which helps you get a higher ranking in search engine listings, as search engines aim to offer the most appropriate websites to end users for their searches. Proper use of title, keywords and meta tags The information contained in the HTML code is known as metadata. It provides the crawler with information about the site for classification and indexing purposes. So, a proper title, keywords and meta tags should be incorporated in the metadata.
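As an illustration of what that metadata looks like to a crawler (this sketch is not part of the original slides, and the sample HTML is invented), Python's built-in html.parser can pull the title and meta tags out of a page, much as an indexer would:

```python
from html.parser import HTMLParser

class MetaChecker(HTMLParser):
    """Collect the <title> text and <meta> name/content pairs from a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            attrs = dict(attrs)
            if "name" in attrs and "content" in attrs:
                self.meta[attrs["name"]] = attrs["content"]

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Invented sample page, just to show the metadata a crawler reads.
page = """<html><head>
<title>SEO Tutorials</title>
<meta name="description" content="An introduction to search engine optimization.">
<meta name="keywords" content="seo, search engine optimization">
</head><body>...</body></html>"""

checker = MetaChecker()
checker.feed(page)
print(checker.title)                # SEO Tutorials
print(checker.meta["description"])  # An introduction to search engine optimization.
```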

  15. Ease of navigation Search engines also consider the ease of navigation while assessing the usefulness of a site, so avoid irrelevant links and use universally recognizable links. This is important not only for users but also for the crawlers that index the site. Site Performance Site and page performance is another factor search engines use to assess sites. Unavailable sites or pages cannot be indexed by search engine crawlers, and even a day or a week of a non-performing site or pages can adversely affect site traffic. So, make sure your site loads fast and is accessible all the time.

  16. Quality inbound links The site must have quality inbound links, as search engines regularly assess backlinks for their relevance. If a site is found to have irrelevant backlinks, it will be discounted or penalized by the search engine, e.g. a website about farming in India containing a number of links from European websites about technology will be degraded by the search engines.

  17. Black Hat SEO Techniques A list of the top 6 black hat SEO techniques is given below: Keyword Stuffing Search engines analyze the keywords and key phrases on web pages to index websites. To exploit this, some SEO practitioners increase keyword density to get a higher ranking, which is considered a black hat SEO technique. A keyword density between two and four percent is considered optimal; increasing keyword density beyond that will irritate your readers and hurt your ranking. Cloaking It refers to coding web pages in such a way that search engines see one set of content and visitors see another set of content, i.e. a user searching for "gold price" clicks on a search result "current gold price" and is greeted with a travel and tourism site. This practice is not in accordance with search engines' guidelines, which say to create content for users, not for search engines.
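A back-of-the-envelope keyword density check can be written in a few lines of Python. This is a rough sketch, not from the original slides; the sample text is invented, and real SEO tools count words and phrases more carefully.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword's share of all words in the text, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100 * hits / len(words)

# Invented example of a keyword-stuffed page.
page_text = "Buy gold. Gold price today. Best gold price for gold buyers of gold."
print(round(keyword_density(page_text, "gold"), 1))  # 38.5 - far above the 2-4% range
```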

  18. Hidden Text Text that search engines can see but readers can't is known as hidden text. This technique is used to incorporate irrelevant keywords and hide text or links in order to increase keyword density or improve internal link structure. Some of the ways to hide text are to set the font size to zero, use CSS to position text off-screen, create white text on a white background, etc. Doorway Pages Poorly written pages that are rich in keywords but contain no relevant information, and exist mainly to redirect users to an unrelated page, are called doorway pages. Black hat SEO professionals use these pages to pass user traffic on to unrelated sites.

  19. Article Spinning It involves rewriting a single article to produce several different copies in such a way that each copy looks like a new article. The content of such articles is repetitive, poorly written and of low value to visitors. In this technique, such articles are regularly uploaded to create the illusion of fresh content. Duplicate Content Content copied from one website and published on another website as original content is known as duplicate content. This black hat technique is also known as plagiarism.

  20. How Search Engines Work The work of a search engine is divided into three stages, i.e. crawling, indexing and retrieval. Crawling Search engines use web crawlers, or spiders, to perform crawling. The task of the crawler is to visit a web page, read it and follow the links to other web pages of the site. Each time the crawler visits a web page it makes a copy of the page and adds its URL to the index. After adding the URL, it revisits the site regularly, for example every month or two, to look for updates or changes. Indexing In this stage, the crawler builds the index of the search engine. The index is like a huge book which contains a copy of each web page found by the crawler. If any web page changes, the crawler updates the book with the new content. So, the index comprises the URLs of the web pages visited by the crawler along with the information the crawler collected. Search engines use this information to provide relevant answers to users' queries. If a page is not added to the index, it will not be available to users.
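To make the crawl-then-index pipeline concrete, here is a toy Python sketch, not part of the original slides: the three "pages", their links and the query are all invented, and a real crawler would fetch pages over HTTP instead of reading a dictionary.

```python
import re

# A toy in-memory "web": URL -> (page text, outgoing links). Purely illustrative.
PAGES = {
    "/home": ("SEO tutorials and search engine basics", ["/seo", "/url"]),
    "/seo":  ("White hat SEO and black hat SEO techniques", ["/home"]),
    "/url":  ("URL structure protocol domain and subdomain", ["/seo"]),
}

def crawl(start):
    """Crawling: visit a page, keep a copy of it, then follow its links."""
    seen, queue, copies = set(), [start], {}
    while queue:
        url = queue.pop(0)
        if url in seen or url not in PAGES:
            continue
        seen.add(url)
        text, links = PAGES[url]
        copies[url] = text      # the crawler keeps a copy of each page
        queue.extend(links)     # and follows its links to further pages
    return copies

def build_index(copies):
    """Indexing: map every word to the set of URLs it appears on (an inverted index)."""
    index = {}
    for url, text in copies.items():
        for word in re.findall(r"[a-z]+", text.lower()):
            index.setdefault(word, set()).add(url)
    return index

def search(index, query):
    """Retrieval: return the pages that contain every word of the query."""
    sets = [index.get(w, set()) for w in query.lower().split()]
    return set.intersection(*sets) if sets else set()

index = build_index(crawl("/home"))
print(search(index, "seo techniques"))  # {'/seo'}
```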

  21. Retrieval This is the final stage, in which the search engine provides the most useful and relevant answers in a particular order. Search engines use algorithms to improve the search results so that only genuine information reaches users, e.g. PageRank is a popular algorithm used by search engines. It sifts through the pages recorded in the index and shows on the first page of results the web pages it thinks are the best.
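For a flavour of how such an algorithm orders pages, below is a textbook-style simplification of PageRank in Python. It is not from the original slides and is not Google's actual implementation; the three-page link graph is hypothetical, and pages with no outgoing links are not handled.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute PageRank scores for a small link graph.
    `links` maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            # A page's score is built from a share of the score of every page linking to it.
            incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new_rank[p] = (1 - damping) / len(pages) + damping * incoming
        rank = new_rank
    return rank

# Hypothetical three-page site: most links point at /home, so it ranks highest.
graph = {"/home": ["/seo"], "/seo": ["/home", "/url"], "/url": ["/home"]}
for page, score in sorted(pagerank(graph).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
```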

  22. Google Algorithm Updates In the beginning, in the 1990s, search engines were not as effective as they are today; they mainly focused on keyword matching and backlinks. So, it was quite easy for low-quality websites to rank higher by targeting their exact keywords with lots of backlinks. To solve this problem, Google introduced an algorithm to filter the results so that it could clean up the web. Since then, Google has continuously updated its algorithm to maintain and improve the efficiency of its search engine.
