Technical SEO
Technical SEO is the practice of optimizing your website to help search engines find, crawl, understand, and index your pages. It helps increase visibility and rankings in search engines.
Page speed
Page speed (also called "load speed") measures how fast the content of a page loads. From an SEO standpoint, fast page speed is essential: faster loading times can contribute to higher rankings.

Common page speed metrics include the following:
- Time to First Byte (TTFB): how long it takes for the page to begin loading
- First Contentful Paint (FCP): how long it takes for the user to see the first element of a page (like an image)
- Onload time: how long it takes to fully load the content of a page
- Largest Contentful Paint (LCP): how long a webpage takes to load its largest element for a user
- First Input Delay (FID): how long it takes the page to react to a user's first interaction
- Cumulative Layout Shift (CLS): how much elements unexpectedly shift around as the page loads, such as content moving while users wait for new elements to load

To measure these, go to PageSpeed Insights and paste the URL you want to measure into the search bar. Note: you can only check one URL at a time, not a full site.
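The same metrics can be queried programmatically through Google's PageSpeed Insights API (the v5 runPagespeed endpoint). A minimal sketch that only builds the request URL; actually fetching it requires network access, and sustained use requires an API key:

```python
from urllib.parse import urlencode

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url, strategy="mobile"):
    """Build a PageSpeed Insights API request URL for one page.
    Like the web UI, the API measures a single URL at a time;
    strategy can be "mobile" or "desktop"."""
    params = urlencode({"url": page_url, "strategy": strategy})
    return f"{PSI_ENDPOINT}?{params}"

url = psi_request_url("https://example.com/")
```

The JSON response from this endpoint includes lab measurements (such as LCP and CLS) under its Lighthouse audit data.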
Responsive Design
Responsive design allows you to serve the same page to both mobile and desktop users. The server sends the same HTML code every time; CSS changes how the page renders based on the device.

This matters because 58% of all searches in Google are now done from a mobile device, and Google's mobile-first index ranks the search results based only on the mobile version of the page. And yes, this occurs even if you're searching from a desktop. Before this update, Google's index would use a mix of desktop and mobile results: if someone searched from an iPhone, Google would show them mobile results, and if someone searched from a desktop, they'd get "desktop results."

If your site is already perfectly optimized for mobile, you should be good. That means your site:
- Loads resources across all devices
- Doesn't hide content on mobile versions of your site
- Loads quickly, as mobile users expect
- Has working internal links and redirects
- Boasts a UX that's optimized for any device your visitors use
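The usual mechanism behind this is a CSS media query: one HTML document for every device, with the stylesheet adapting the layout to the viewport width. A minimal sketch, where the class name and the 600px breakpoint are made-up examples:

```html
<!-- Tell mobile browsers to use the device width, not a zoomed-out desktop view -->
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  .sidebar { float: right; width: 30%; }

  /* Below 600px (an arbitrary example breakpoint), stack the sidebar
     under the main content instead of floating it beside it. */
  @media (max-width: 600px) {
    .sidebar { float: none; width: 100%; }
  }
</style>
```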
SSL Certificate
HTTPS is a secure way to transfer data between a web server and a web browser, and it requires an SSL/TLS certificate. SSL/TLS certificates expire after 398 days and must be renewed on time; renewal is possible within 30 days of the expiration date.

Information that should always be secured with SSL certificates:
- Bank account details and credit card transactions
- Medical records
- Private personal information
- Credentials
- Proprietary information

You can tell a site is using HTTPS when the URL begins with https:// instead of http:// and a padlock icon appears next to the address bar. If you see the "Not secure" warning, the site is not using HTTPS.
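Because certificates expire, it helps to check how many days remain. A minimal sketch that parses the notAfter timestamp in the text form used by Python's ssl.getpeercert() (in practice you would read the value from a live TLS connection; the date below is a made-up example):

```python
from datetime import datetime

def days_until_expiry(not_after, now=None):
    """Return the number of days until a certificate's notAfter date.
    Assumes the OpenSSL text format ssl.getpeercert() returns,
    e.g. 'Jun  1 12:00:00 2025 GMT'."""
    expires = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    now = now or datetime.utcnow()
    return (expires - now).days

remaining = days_until_expiry("Jun  1 12:00:00 2025 GMT",
                              now=datetime(2025, 5, 2, 12, 0, 0))
```

With 30 or fewer days remaining, the certificate is inside the renewal window described above.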
Robots.txt optimization
A robots.txt file is a set of instructions used by websites to tell search engines which pages should and should not be crawled. Robots.txt files guide crawler access, but they should not be used to keep pages out of Google's index.

Why is robots.txt important? Google can usually find and index all of the important pages on your site without one, but a robots.txt file helps in three main situations:

Maximize crawl budget: Crawl budget refers to the number of pages Google will crawl on your site within a given time frame. The number can vary based on your site's size, health, and number of backlinks. If your website's number of pages exceeds your site's crawl budget, you could have unindexed pages on your site. Note: according to Google, most website owners don't need to worry too much about crawl budget; this is primarily a concern for larger sites with thousands of URLs.

Block non-public pages: Crawl bots don't need to sift through every page on your site, because not all of them were created to be served in the search engine results pages (SERPs).

Hide resources: Sometimes you want to exclude resources such as PDFs, videos, and images from search results, either to keep them private or to have Google focus on more important content. In either case, robots.txt keeps them from being crawled.

A robots.txt file consists of user-agent lines and directives. Everything that comes after "Disallow" is a page or section that you want to block:

User-agent: googlebot
Disallow: /images

You can also use an asterisk (*) to address any and all bots that stop by your website:

User-agent: *
Disallow: /images

Useful robots.txt rules
Here are some common useful robots.txt rules. Keep in mind that in some situations, URLs from the site may still be indexed even if they haven't been crawled.

Disallow crawling of the entire site
Note: this does not match the various AdsBot crawlers, which must be named explicitly.

User-agent: *
Disallow: /

Append a forward slash to the directory name to disallow crawling of a whole directory.
Disallow crawling of a directory and its contents
Caution: remember, don't use robots.txt to block access to private content; use proper authentication instead. URLs disallowed by the robots.txt file might still be indexed without being crawled, and the robots.txt file can be viewed by anyone, potentially disclosing the location of your private content.

User-agent: *
Disallow: /calendar/
Disallow: /junk/
Disallow: /books/fiction/contemporary/

Allow access to a single crawler
Only Googlebot-news may crawl the whole site:

User-agent: Googlebot-news
Allow: /

User-agent: *
Disallow: /

Allow access to all but a single crawler
Unnecessarybot may not crawl the site; all other bots may:

User-agent: Unnecessarybot
Disallow: /

User-agent: *
Allow: /

Disallow crawling of a single web page
For example, disallow the useless_file.html page located at https://example.com/useless_file.html, and other_useless_file.html in the junk directory:

User-agent: *
Disallow: /useless_file.html
Disallow: /junk/other_useless_file.html

Disallow crawling of the whole site except a subdirectory
Crawlers may only access the public subdirectory:

User-agent: *
Disallow: /
Allow: /public/

Block a specific image from Google Images
For example, disallow the dogs.jpg image:

User-agent: Googlebot-Image
Disallow: /images/dogs.jpg

Block all images on your site from Google Images
Google can't index images and videos without crawling them:

User-agent: Googlebot-Image
Disallow: /

Disallow crawling of files of a specific file type
For example, disallow crawling of all .gif files:

User-agent: Googlebot
Disallow: /*.gif$

Disallow crawling of an entire site, but allow Mediapartners-Google
This implementation hides your pages from search results, but the Mediapartners-Google web crawler can still analyze them to decide what ads to show visitors on your site:
User-agent: *
Disallow: /

User-agent: Mediapartners-Google
Allow: /

Use the * and $ wildcards to match URLs that end with a specific string
For example, disallow all .xls files:

User-agent: Googlebot
Disallow: /*.xls$
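Before deploying rules like these, you can verify what they actually block. Python's standard library ships a robots.txt parser; a minimal sketch, using a made-up robots.txt mirroring the /images example above:

```python
from urllib import robotparser

# A made-up robots.txt, mirroring the earlier /images example.
RULES = """\
User-agent: *
Disallow: /images
"""

parser = robotparser.RobotFileParser()
parser.parse(RULES.splitlines())

# can_fetch(user_agent, url) answers: may this bot crawl this URL?
blocked = parser.can_fetch("*", "https://example.com/images/dogs.jpg")  # False
allowed = parser.can_fetch("*", "https://example.com/blog/post")        # True
```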
Sitemap optimization
A sitemap is a file where you provide information about the pages, videos, and other files on your site, and the relationships between them. Search engines like Google read this file to crawl your site more efficiently. A sitemap tells search engines which pages and files you think are important on your site.

There are two types of sitemaps:
- XML sitemaps: sitemaps written in a specific format designed for search engine crawlers
- HTML sitemaps: sitemaps that look like regular pages and help users navigate the website
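The XML format is simple enough to generate yourself. A minimal sketch that emits a bare-bones XML sitemap (one loc entry per URL; the sitemaps.org protocol also allows optional fields like lastmod, which are omitted here):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap: one <url><loc> entry per page,
    inside the sitemaps.org <urlset> root element."""
    urlset = ET.Element(
        "urlset", {"xmlns": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    )
    for page_url in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = page_url
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap(["https://example.com/", "https://example.com/about"])
```

The resulting string would typically be saved as sitemap.xml at the site root and submitted in Google Search Console.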
Implement Structured Data
Structured data (also called schema markup) is code that helps Google better understand a page's content. By adding the right structured data, your pages can win rich snippets: more appealing search results with additional information appearing under the title and description.
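Structured data is commonly added as JSON-LD using the schema.org vocabulary. A minimal sketch that builds an Article object with Python's json module; the headline, author, and date are made-up example values:

```python
import json

# Hypothetical article metadata; @context and @type follow schema.org.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How to Do a Technical SEO Audit",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2024-01-15",
}

# This JSON would be embedded in the page inside a
# <script type="application/ld+json"> tag.
json_ld = json.dumps(article, indent=2)
```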
What are SEO-friendly URLs?
SEO-friendly URLs are URLs that are designed to meet the needs of users and help search engines understand what a web page is about. They are typically short and include relevant keywords. Along with your title tag, link anchor text, and the content itself, search engines use your webpage's URL to understand what your content is all about.

- Use a keyword: Your URL should contain a keyword that you want your page to rank for (preferably that page's target keyword).
- Use hyphens between words: Use hyphens as word separators in your URL. For example, in a slug like "seo-site-audit", the hyphen "-" lets search engines know that "SEO", "site", and "audit" are three separate words.
- Keep it short: Your URLs should be short and sweet, because long URLs confuse Google and other search engines.
- Avoid using dates: Back in the day, CMSs (like WordPress) automatically included dates in URLs.
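The guidelines above can be sketched as a small "slugify" helper: lowercase the title, turn runs of non-alphanumeric characters into hyphens, and trim the ends. A minimal sketch, not a production-grade slug generator (it ignores non-ASCII transliteration, for instance):

```python
import re

def slugify(title):
    """Turn a page title into a short, hyphen-separated URL slug."""
    slug = title.lower()
    # Collapse anything that isn't a letter or digit into a single hyphen.
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

slug = slugify("10 Tips for SEO Site Audits!")
```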
Internal Links
Internal links are hyperlinks that point to different pages on the same website. These differ from external links, which link to pages on other websites. Internal links are a crucial part of SEO for three main reasons:
- They help search engines understand your site's structure
- They pass authority
- They help users navigate your site

Page authority: Internal linking helps pass authority (or PageRank) to other pages on your site. PageRank is an algorithm Google uses to measure a webpage's importance when it comes to ranking.

There are several types of internal links:
- Navigational links: the most important internal links, because they live permanently on your main menu. They also make up your site's main navigational structure.
- Footer links: a type of navigational link. They appear on every page of your site, but at the bottom of the page instead of the top.
- Sidebar links: another type of navigational link that some sites use to direct users to related content.
- Contextual links (or in-text links): usually placed in the main body content of a page.
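The internal/external distinction is easy to check in code: a link is internal when it resolves to the same host as the page it sits on. A minimal sketch with Python's standard-library HTML parser (the sample HTML and domain are made-up):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Collect the href value of every <a> tag in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def internal_links(html, base_url):
    """Return absolute URLs of links that stay on base_url's host."""
    collector = LinkCollector()
    collector.feed(html)
    base_host = urlparse(base_url).netloc
    result = []
    for href in collector.links:
        absolute = urljoin(base_url, href)  # resolve relative hrefs
        if urlparse(absolute).netloc == base_host:
            result.append(absolute)
    return result
```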
Pagination SEO
Pagination is an SEO/design technique that involves breaking content up across multiple pages, allowing users to move through long articles or bulk product listings quickly.

- Improves user experience: Pagination lets users move through large amounts of content more efficiently. Plus, numbered pages tell users where they are in your content and how much content they have left to navigate.
- Faster page load: Your pages may load faster if you only display a partial amount of content, and it's essential your pages load fast, because page speed is a confirmed ranking factor for Google.
- Internal links: Pagination offers internal linking opportunities. Internal links (links on your website that point to other pages on the same website) help search engines understand your site's structure and pass authority to other pages, which can improve your rankings.
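Server-side, pagination usually comes down to slicing a long list into fixed-size pages. A minimal sketch of that arithmetic (the page-size value is an arbitrary example):

```python
import math

def paginate(items, page, per_page):
    """Split items into numbered pages. Returns the items for the
    requested 1-based page plus the total page count, so a template
    can render 'page X of Y' navigation links."""
    total_pages = max(1, math.ceil(len(items) / per_page))
    start = (page - 1) * per_page
    return items[start:start + per_page], total_pages

page_items, total = paginate(list(range(10)), page=2, per_page=4)
```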
Duplicate Content
In general, Google doesn't want to rank pages with duplicate content. In fact, Google states: "Google tries hard to index and show pages with distinct information."

Sometimes you'll find that your site creates a new URL for every different version of your product, which can result in thousands of duplicate content pages. You can check your indexed pages in Google Search Console.

Printer-friendly pages: If your content management system creates printer-friendly pages and you link to those from your article pages, Google will usually find them unless you specifically block them.

WWW vs. non-WWW: This is one of the oldest duplicate content issues in the book, but sometimes search engines still get it wrong: when both the WWW and non-WWW versions of your site are accessible, they duplicate each other. Another, less common situation I've seen is HTTP vs. HTTPS duplicate content, where the same content is served over both.

The solution is a canonical URL: a canonical tag is a way of telling search engines that a specific URL represents the master copy of a page.
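A canonical tag is a single link element in the page's head. A minimal sketch, where the URL is a placeholder for whichever version of the page you want treated as the master copy:

```html
<!-- Placed in the <head> of every duplicate or variant page.
     The href below is a placeholder URL, not a real site. -->
<link rel="canonical" href="https://www.example.com/product-name/" />
```

Both the WWW/non-WWW and HTTP/HTTPS cases above are resolved the same way: every variant points its canonical tag at the one preferred URL.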
Broken Links
One of the most common reasons a link breaks is that the target page is deleted or moved to a new URL without the link being updated. Other causes include domain name changes and server errors (such as 400 Bad Request or 502 Bad Gateway responses).

The impact of broken links: Broken links can impact SEO in several ways:
- Site quality: Google wants to recommend useful, up-to-date sites. Too many broken links can signal that your site isn't up to date or is poorly maintained.
- Crawl errors: Google's site crawlers, or "bots," crawl the web by following links between pages. When a bot hits a broken link, it creates a crawl error, meaning your page can't be fully crawled and indexed in search engines.
- Wasted link authority: Internal links pass authority (known as link equity) between connected pages on your site. But when Page A links to a broken Page B, that authority gets wasted rather than passed on.

Google Search Console: Google Search Console (GSC) is a free tool from Google that allows website owners to monitor their site's presence in Google Search results. The Page Indexing report in GSC can help you identify broken links.

404 errors: An HTTP 404 error occurs when the web server cannot find a resource (like a webpage) at a certain URL. Common causes:
- The user has used the wrong page URL, e.g., by making a typo in it
- The website owner has deleted the resource and it's no longer available on the server
- The website owner has changed the URL linking to the resource
- The website owner has misconfigured the website so that the server can't find the resource

Some URLs return a "page not found" message while still responding with a success status code. These become known as "soft" 404 errors: although they don't meet the technical definition of a 404 error, they lead users to think the pages at these URLs can't be found. In contrast, "hard" 404 errors are instances in which the browser gets back the actual 404 HTTP status code.
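The hard/soft distinction can be sketched in code. A hard 404 is identified by the status code alone; a soft 404 needs a content heuristic, because the server claims success. The phrase list below is a simplified made-up heuristic, not an exhaustive detector:

```python
def classify_response(status_code, body_text):
    """Rough classifier: hard 404 (real 404 status), soft 404
    (200 OK but 'not found' wording in the body), or ok."""
    if status_code == 404:
        return "hard 404"
    not_found_phrases = ("page not found", "page doesn't exist")
    if status_code == 200 and any(
        phrase in body_text.lower() for phrase in not_found_phrases
    ):
        return "soft 404"
    return "ok"
```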
301 & 302 Redirect
The main difference between a 301 and a 302 redirect is that 301 redirects are permanent and 302 redirects are temporary.

301 Redirect
A 301 is the most common type. It takes visitors to the new URL and tells search engines the redirect is permanent. For example, from https://www.website.com/old-page-name/ to https://www.website.com/new-page-name/. Common use cases:
1. Deleting a page: If, for any reason, you want to delete a page on your site, you should redirect it to another relevant page if at all possible. It creates a better user experience and gives people an alternative.
2. Migrating your site to a new domain: Say you're migrating from a .net to a .com, or maybe you've rebranded and need to move to a different domain name. A 301 redirect is the best way to do that. Also notify Google using Google Search Console's Change of Address tool.

302 Redirect
A 302 redirect, on the other hand, also takes visitors to the new URL but tells search engines the redirect is only temporary. Common use cases:
1. Website maintenance or redesign: You're working on a big update to your page at www.example.com/my-page and don't want anyone to see it before it's ready.
2. A/B testing: You're testing a new version of a landing page to see if it outperforms the existing page. You'll want to send a certain percentage of your traffic from the existing page (example.com/page-1) to the test version (example.com/page-2).

How to implement a redirect
Many plugins can create redirects for you. The Yoast SEO redirect manager allows you to quickly add or remove 301 and 302 redirects; you'll need a Yoast Premium subscription. You can also set up a domain redirect in cPanel:
1. Log into your cPanel.
2. Navigate to the Domains section and click on the Redirects option.
Note: you need to enter the protocol as well, e.g., http:// or https://.
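On an Apache server, redirects can also be written by hand in an .htaccess file using the Redirect directive from mod_alias. A minimal sketch reusing the example paths from this section:

```apache
# Permanent (301) redirect: the old page has moved for good
Redirect 301 /old-page-name/ https://www.website.com/new-page-name/

# Temporary (302) redirect: send visitors elsewhere during maintenance
Redirect 302 /my-page https://www.example.com/maintenance/
```

Other servers use their own syntax (Nginx, for example, configures redirects in its server blocks rather than .htaccess files).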
Image Optimization
Image optimization involves creating and delivering high-quality images in the ideal format, size, and resolution to increase user engagement. It also involves accurately labeling images with metadata so search engine crawlers can read them and understand page context. Tip: optimized images take up less storage space on your server, so site backups are completed more quickly.

Resize your images: Image size and file size are not the same thing. Image size refers to the dimensions of an image (e.g., 1024 pixels by 680 pixels). File size is the storage space it occupies (e.g., 350 kilobytes).

Use the right image file type: Using the right image file type (also commonly referred to as image format) is vital to make sure your images are displayed properly, not blurry, and clearly visible to users. Image formats that Google can index are: JPEG, PNG, WebP, SVG, BMP, and GIF.

Compress images for faster loading: Image compression reduces the size of image files, making them more suitable for quick loading and efficient storage on various devices. The file size of images directly affects the overall loading speed of a page, so if a page has images with large file sizes, users might have to wait longer than usual.

Write descriptive alt text: Alt text plays a crucial role in helping Google and other search engines comprehend the content and context of images, especially since they can't interpret them visually. It's also used by screen readers to describe images to visually impaired users, so it's vital for the accessibility of your pages and for enhancing your UX. Plus, browsers display the alt text for an image when the image itself can't be rendered.

Captions: A caption is text that appears directly below an image. It doesn't usually just describe the image but gives more context to it.
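These recommendations come together in the markup itself. A minimal sketch of a well-labeled image element; the file name, alt text, and caption are made-up examples:

```html
<!-- Descriptive file name, explicit dimensions (which also help the
     browser reserve layout space), alt text for crawlers and screen
     readers, and a caption that adds context below the image. -->
<figure>
  <img src="/images/golden-retriever-puppy.jpg"
       alt="Golden retriever puppy playing fetch in a park"
       width="1024" height="680">
  <figcaption>Daily play sessions help puppies burn off energy.</figcaption>
</figure>
```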
Site Architecture
An effective site structure organizes pages in a way that helps crawlers find your website content quickly and easily. So, when structuring your site, ensure all pages are just a few clicks away from your homepage.
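"A few clicks away" is measurable: treat internal links as a graph and compute each page's minimum click depth from the homepage with a breadth-first search. A minimal sketch over a made-up toy link graph:

```python
from collections import deque

def click_depths(link_graph, homepage):
    """Breadth-first search over an internal-link graph. Returns each
    reachable page's minimum number of clicks from the homepage.
    link_graph maps a page to the list of pages it links to."""
    depths = {homepage: 0}
    queue = deque([homepage])
    while queue:
        page = queue.popleft()
        for linked in link_graph.get(page, []):
            if linked not in depths:  # first visit = shortest path in BFS
                depths[linked] = depths[page] + 1
                queue.append(linked)
    return depths

# Made-up site structure for illustration.
graph = {
    "/": ["/blog", "/products"],
    "/blog": ["/blog/post-1"],
    "/products": [],
}
depths = click_depths(graph, "/")
```

Pages with a large depth value (or missing from the result entirely, meaning they're unreachable from the homepage) are the ones worth restructuring.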