On-Page Technical SEO

Do you want more traffic from search engines and social platforms? On-Page Technical SEO boosts traffic by enhancing integration with those platforms.

Enticing Visitors from Search Engines

Tuning pages for search engines promotes click-through rates from search results. Use it in combination with browser security and website performance.
Search bots scan sites to gather the information that is in source code but invisible to the user. Pages that lack search data delegate this critical marketing task to the search bot. The chart shows the percentage of websites using each of the fields for search engines.
Percent of Websites with Search Data.

Page Title and Description

The next diagram displays default search results containing the page title, URL, and description. These elements are the clickbait because they attract users. The URL segment seo is a pretty link because it is human-readable, lacks a file extension, and mirrors the page description.
Basic search engine result. Placing the code shown below into the HTML header creates the above result.
<title>SEO | Strategic Mind</title>
<meta name="description" content="Boost webpage ranking by making pages fast, easily indexed by search engines, and populated with Web 3.0 data.">
The title is part of the HTML standard and shows in the browser tab. WordPress and other Content Management Systems automatically insert it into the page. Its maximal width in search results is 600 pixels or 50 to 60 characters. The description is from 100 to 150 characters on mobile and goes up to 350 on desktops. So the leading part of the description needs to make sense by itself because the rest may not be there. Search engines can change the title or description to better match user requests.
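The display limits above can be checked with a short script. The helper below is an illustrative sketch, not part of any SEO tool; the thresholds come from the character counts discussed in this section.

```python
# Illustrative helper: flag titles and descriptions that risk truncation
# in search results (60-character title limit, ~100-character safe lead
# for the description on mobile, per the limits above).
def check_search_snippet(title: str, description: str) -> list[str]:
    warnings = []
    if len(title) > 60:
        warnings.append("title may be truncated in search results")
    if len(description) > 100:
        # Only the leading portion is guaranteed to show on mobile.
        warnings.append("lead the description with the key message")
    return warnings

print(check_search_snippet(
    "SEO | Strategic Mind",
    "Boost webpage ranking by making pages fast, easily indexed by "
    "search engines, and populated with Web 3.0 data."))
```

Running this against the example title and description above flags only the description, since it exceeds the mobile-safe length.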

Image Tag

The image tag has several attributes relevant to SEO. Search bots prefer descriptive file names with hyphens separating words because they provide more information. Unlike the page examples, the URL keeps its file extension. The alt field should always be present; its content displays when the image cannot load and in text-only browsers. Bots examine images, although mainly to filter inappropriate content. The title field is optional, and its text pops up when the mouse hovers over the image. All of these fields contribute to SEO.
Strategic Mind Small Logo. Strategic Mind Large Logo. The logo appears in many locations, including the browser tab, bookmarks, and history. Modern logo links are more functional than the original favicon.ico. They use modern file types, can be much larger, and allow different images for varying display sizes. The logo on the left is for small display areas such as the browser tab. The bigger one on the right is for larger display areas. The code below has the standard as well as the Apple logo links for a web page.
<link rel="icon" type="image/png" href="logo.png" sizes="32x32 96x96" />
<link rel="icon" type="image/png" href="logo-large.png" sizes="128x128 512x512 1024x1024" />
<link rel="apple-touch-icon" type="image/png" href="logo.png" sizes="32x32 96x96" />
<link rel="apple-touch-icon" type="image/png" href="logo-large.png" sizes="128x128 1024x1024" />

Structured Data

Structured data is a modern framework for encoding search data. It does not replace the older title and description clickbait; instead, it stores that information and allows for considerably more detail. The search result shown below is enhanced because it is more prominent and functional. In this case, there is a click-to-phone link.
An enhanced search result using structured data. There are multiple options for adding structured data to a page. In most scenarios, JSON-LD (JSON for Linking Data) is the best choice because of its straightforward syntax. The code below presents the company name, address, phone number, and related fields. It fits in just before the closing HTML tag. It can also hold pricing, product descriptions, and many other fields.
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Corporation",
  "name": "Strategic Mind",
  "url": "https://strategicmind.com",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "Hatchford Court",
    "addressLocality": "Toronto",
    "addressRegion": "ON",
    "postalCode": "M1T 3W8",
    "addressCountry": "CA"
  },
  "contactPoint": [{
    "@type": "ContactPoint",
    "telephone": "+16472439321",
    "contactType": "customer service"
  }],
  "sameAs": []
}
</script>
Using a structured-data tester confirms that bots will successfully parse the content.
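The first thing any tester checks is that the block parses as valid JSON at all, and that check can be approximated locally. The sketch below embeds a trimmed copy of the example's fields; it verifies syntax and the expected type, not schema.org semantics.

```python
import json

# A trimmed copy of the structured data above; a syntax error here is
# the first thing a structured-data tester would reject.
ld_json = """
{
  "@context": "https://schema.org",
  "@type": "Corporation",
  "name": "Strategic Mind",
  "url": "https://strategicmind.com"
}
"""

data = json.loads(ld_json)  # raises ValueError on malformed JSON
assert data["@type"] == "Corporation"
print("structured data parsed:", data["name"])
```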

Attracting Social Clients

The image below is a prepopulated Facebook post. The data comes from the URL in the message. Examples are at the bottom of this page, or you can reference the URL in a post.
Website Enhanced Social Post. The following chart presents the percentage of websites that include data for social posting. Facebook introduced the Open Graph standard used by most platforms, while Twitter has its own.
Percent of Websites with Social Metadata.

Open Graph Image

The Open Graph image provides the URL for a picture. It should be in landscape mode with an aspect ratio of 1.91:1. The maximal width is 1,200 pixels, although rendering tends to be 300 to 500 pixels. The image used by the social platform does not need to be visible on the page.

Twitter Card

The Twitter card specifies the picture for a page. It has the same requirements as the Open Graph image and can use the same URL.
Walking from the physical to the virtual.

Open Graph URL

The Open Graph URL is the canonical URL for the web page.

Facebook Admin

The Facebook admin links the website to a Facebook social page. It helps establish social proof of a page's popularity by recognizing its association with posts.

Twitter Site

The Twitter site field holds the company's Twitter handle.

Open Graph Title

The Open Graph title prepopulates the post title on some social platforms.

The code below is an example of the above settings and it goes into the HTML header.

<meta name="twitter:card" content="summary_large_image" />
<meta name="twitter:creator" content="@mindstrategic" />
<meta name="twitter:image:alt" content="Technical SEO" />
<meta name="twitter:site" content="@mindstrategic" />
<meta name="twitter:url" content="https://twitter.com/mindstrategic" />
<meta property="fb:admins" content="000000000000000" />
<meta property="og:description" content="Technical SEO" />
<meta property="og:image:alt" content="Technical SEO" />
<meta property="og:image" content="https://strategicmind.com/share/technical-seo.jpg" />
<meta property="og:image:width" content="1200" />
<meta property="og:image:height" content="630" />
<meta property="og:site_name" content="Strategic Mind" />
<meta property="og:title" content="Technical SEO" />
<meta property="og:type" content="website" />
<meta property="og:url" content="https://strategicmind.com/technical-seo" />
These are the website code validators for Facebook, LinkedIn, and Twitter.

Mobile Usability

Mobile devices consume over half the bandwidth on the internet, yet 38% of business websites do not support them. Google now indexes sites using mobile, not desktop, pages. A mobile-friendly display has no horizontal scrolling. Also, the page links are at least a finger width apart, which is 50 pixels. Users can test their site with the Google Mobile-Friendly Tester. Here is the code to make the website aware of the device width. It is a prerequisite for responsive coding, which provides instructions on how the page adapts to changing display areas.
<meta name="viewport" content="width=device-width, initial-scale=1" />
The following code alters the font size when the screen width is under 500 pixels. Designing a page to accommodate varying screen sizes and aspect ratios involves considerable design effort.
@media screen and (max-width:500px){
  body { font-size: 0.9rem; } /* example value */
}

Canonical URL

The canonical URL is the single authoritative address for a web page. It consolidates ranking signals when the same content is reachable through several URLs.

Here is the code to specify a canonical URL.

<link rel="canonical" href="https://strategicmind.com/technical-seo" />

Language Tag

The language tag helps search engines send users to the page that matches their preferences. Sites with a single language can still set the default, as shown below.
<link rel="alternate" hreflang="x-default" href="https://strategicmind.com/technical-seo" />

Boosting Page Ranking

Search Indexing

A successful HTTP return code is between 200 and 299. Even when the page looks flawless, an unsuccessful code prevents indexing. The robots file has to allow page scanning before bots can add, update, or delete index entries. Pages are removed from the index if they have the noindex tag. The search bot also checks for spamming, plagiarism, and other misleading content.

Backlinks

A backlink is an inbound referral from another website. Links from reputable sites have the most value, especially when they are on topic. For example, a Wiki site carries more authority than a local directory because it is on topic and has exceptional standing.

Website Security

Many bots scan websites and domains to evaluate their level of protection. That level contributes to page ranking, email delivery rates, and the likelihood of cyberattacks. Relevant factors include secure browser connections, infrastructure protection, and how the service provider secures its facilities.

Domain Creation Date

All else being equal, older domains are better because they are less likely to host malware and have had more time to hone their content. The following chart shows the distribution of creation dates across domains.
Distribution of Domain Creation Dates.

Website URL Management

Website URL management provides the site structure to search bots and other applications seeking to understand the content. It improves page ranking by minimizing SEO errors and supports automated testing of URL links. It involves creating a sitemap, removing broken links, eliminating redirects, and validating meta links, as shown in the next diagram.
The Steps in Website URL Management.

Sitemap

A sitemap identifies the URLs marketed to search engines and other applications. The following chart shows the distribution of websites by the number of indexable pages.
Distribution of Indexable Pages Across Websites.

Search bots find new web pages using backlinks, other search engines, and other techniques. However, sitemaps accelerate indexing and give site owners more control over the schedule. They deliver the most benefit for new domains, large numbers of URLs, and pages not reachable through HTML links from the root page.

The default name for the sitemap is sitemap.xml in the root directory. Administrators can submit the sitemap directly to the larger search consoles. Adding a reference to the /robots.txt file helps all bots find the map. Here is how to add the entry.

User-agent: *
Allow: /
Sitemap: https://strategicmind.com/sitemap.xml

The following code comes from the sitemap.xml file on this site. It includes pages, priority images on the page, and the index frequency requested by the owner.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://strategicmind.com/technical-seo</loc>
    <changefreq>monthly</changefreq>
    <image:image>
      <image:loc>https://strategicmind.com/share/technical-seo.jpg</image:loc>
      <image:title>Search Data</image:title>
    </image:image>
  </url>
</urlset>
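A small sitemap can also be built programmatically. The sketch below uses only Python's standard library; the page URL comes from this article, while the change frequency is a placeholder value.

```python
import xml.etree.ElementTree as ET

# Official sitemap protocol namespace; registering it with an empty
# prefix keeps the output elements unprefixed (<urlset>, <url>, <loc>).
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)

urlset = ET.Element(f"{{{NS}}}urlset")
url = ET.SubElement(urlset, f"{{{NS}}}url")
ET.SubElement(url, f"{{{NS}}}loc").text = "https://strategicmind.com/technical-seo"
ET.SubElement(url, f"{{{NS}}}changefreq").text = "monthly"  # placeholder frequency

xml = ET.tostring(urlset, encoding="unicode")
print(xml)
```

Writing the result to sitemap.xml in the web root makes it discoverable at the default location described above.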

Broken Links

Search bots detect broken links even when they are not apparent to the user. Broken links degrade domain authority and project a sloppy image. The following chart shows the distribution of broken internal links across websites. Internal links have the lowest error rate. Backlinks, meta links, search indices, and outbound links have far more errors. For example, 5.2% of the Facebook pointers are dead, which is 80 times worse than internal links. To thoroughly test connections, confirm the following scenarios return an HTTP success code between 200 and 299.
Distribution of Broken Internal Links Across Websites. The next commands check internal links using the open-source tools wget and grep. The first one lists the broken links, while the second identifies the source files. Replace missing.jpg with the results from your scan. Run the second command for each broken link. If necessary, download the website content to flat files.
wget --recursive --no-verbose https://strategicmind.com 2>&1 | grep -B 1 'Not Found'
grep -R 'missing.jpg' /var/www/html
Practical and automated testing for all link types requires custom coding to extract the URLs and then test each for a successful HTTP return code.
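That custom coding can be sketched with Python's standard library: pull the href and src values out of a page, then test each with an HTTP request. The sample HTML and helper names below are illustrative; the network check is defined but demonstrated only on the embedded fragment.

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collect href/src URLs so each can be tested for a 2xx response."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(value)

def link_is_ok(url: str) -> bool:
    # urlopen raises HTTPError for 4xx/5xx; success is 200-299.
    try:
        return 200 <= urlopen(url, timeout=10).status < 300
    except OSError:
        return False

# Illustrative fragment; a real run would fetch the live page HTML
# and resolve relative URLs before calling link_is_ok on each.
sample = '<a href="/technical-seo">SEO</a><img src="missing.jpg">'
parser = LinkExtractor()
parser.feed(sample)
print(parser.links)
```

Unlike the wget scan, this approach extends to meta links, sitemap entries, and outbound URLs by feeding it the relevant source.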

Redirection

Redirection maps one URL to another. It enhances website usability by rerouting users and search bots from similar URL names to the correct one. It also maintains page ranking, backlinks, and search indexes when URL names change. Those changes can result from usability work, domain transfers, name standardization, and implementing pretty links. The following chart shows the percentage of websites that duplicate all content across protocols and domain prefixes. Such duplication degrades domain ranking in many ways. It also complicates URL testing because it is unclear which variant holds the search ranking and backlinks. The following redirections map hundreds of URLs to a single one that matches the sitemap and canonical URL.
Duplicate Website Content across Protocols and Domain Prefixes.
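One way to implement that consolidation, assuming an Apache server with mod_rewrite (nginx and other servers have equivalent directives; these rules are an illustrative sketch, not this site's actual configuration):

```apache
# Send every protocol and prefix variant to the canonical domain with a
# permanent (301) redirect so ranking signals consolidate in one place.
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^strategicmind\.com$ [NC]
RewriteRule ^(.*)$ https://strategicmind.com/$1 [R=301,L]
```

The 301 status matters: it tells search bots the move is permanent, so the ranking transfers to the target URL.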

Domain Transfers

Websites can have many domain names. This happens during name migrations, to support multiple languages, and to prevent confusion from similar names. Minimizing ambiguity improves usability and reduces fraud. Mapping all of them back to a single place promotes domain authority and page ranking. Here is an example of domain redirection that comes back to this spot on the page.

Protocol

When users enter a URL without a protocol, the request goes to HTTP. However, most sites use HTTPS. Redirection from one to the other increases usability by reducing the number of error messages users receive. The next link demonstrates how it works.

Domain Prefix

The www in front of a domain name is the DNS prefix. Websites can redirect any prefix to a single location, including a blank prefix. The feature auto-corrects typos and helps clean up undesired search entries. Use wildcard DNS and SSL certificates to remap any prefix. For example, the DNS for this site includes strategicmind.com, *.strategicmind.com, and *.*.strategicmind.com. The following links were cleared out of the search index because they redirect back to this page.

Case Sensitivity

Case sensitivity varies based on the segment of the URL and the underlying technologies. For example, Linux file names are case sensitive while Windows file names are not. It can also change based on database configurations and whether the page is dynamic or static. Similar issues happen with search engine indices. The best practice is to use lower case. The next example redirects back to here.

Directory Listing

A directory listing has a trailing slash on the name. However, web servers interpret it in multiple ways, and the handling of the slash on a single website may vary between pages, server directories, and the root URL. Hiding directory listings keeps sites more secure. The best practice is to map directory listings back to the file name. Here is an example.

URL Spelling

Spelling correction is possible for any part of the URL. Previous sections corrected the protocols, domain names, directories, and character case. Also, it's possible to correct the spelling for the final part of the URL, as shown here.