Scotty@SomeCodeiWrote.com | (720) 854-5077 SomeCodeiWrote

Website Quality Assurance & Quality Control

  • Search: 100 / 100
  • User Experience: 100 / 100
  • Conversion: 100 / 100

Sitewide Items

Sitewide items make large, sweeping changes across an entire website. These high level items can deliver the largest bang for the buck. Every website must still be examined on a per page basis, but a base of overall integrity must come first.

Site Security 100 / 100

The goal of website security is to reduce potential entry points and create layers of protection. If someone gains access to a portion of your site (the admin, for example), there are ways to limit the damage they can cause.

Keeping everything up to date is key. Understanding where your code came from is massively important. Set up as many roadblocks as possible, using security redundancies. Always think outside the box because hackers exploit lazy or repetitive behavior.

Take charge and help protect your digital assets. Step one is to create a Google Webmaster Account.

I highly recommend everyone understand how hacking works. Learn more directly from Google.

  • SSL certificate
  • Padlock over HTTPS
  • Contact info matches WhoIs
  • Contact info matches Google Plus
  • Google Plus Integration
  • Server protection software
  • Server Signature off
  • Block Libwww-perl access

An SSL certificate and padlock over HTTPS help ensure everything is secure through encryption. Encrypting data is the first line of defense against hackers and other ne'er-do-wells.

Contact information matching WhoIs and Google Plus, as well as Google Plus integration, all help verify who owns the website. This cross checking of data helps reduce fraudulent content.

Server protection software further ensures hackers can not infiltrate your website and perform malicious acts. Many hosting companies offer this service, or you can purchase it from companies like SiteLock.
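Two checklist items above, the server signature and libwww-perl, are both handled in the .htaccess file. A minimal sketch, assuming an Apache server:

    # Hide the server version details from error pages and response headers
    ServerSignature Off

    # Return 403 Forbidden to the libwww-perl user agent, a common attack tool
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} libwww-perl [NC]
    RewriteRule .* - [F,L]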

WordPress Specific Security

WordPress is by far the most common content management system. As such, it is also the most attacked content management system. A few simple steps can greatly increase your website security.

For WordPress, I highly recommend All In One WP Security.

  • Remove WP generator
  • Captcha on login
  • Move login page
  • Limit login attempts
  • Change database prefix
  • Hide wp-config.php and .htaccess
  • Disable file editing
  • Additional Firewalls
  • Disable WP XML-RPC
  • Disable WP Json (REST API)

Out of the box, WordPress includes a generator tag in the meta data. This generator data details the exact version of WordPress your site is using. If a hacker knows the version of WordPress you are using, they may be able to exploit its known vulnerabilities.

Limiting login attempts, adding a captcha to the login page, and moving the location of the login page all help prevent brute force attacks.

The standard database prefix for any WordPress site is wp_. Hackers use this to their advantage, writing scripts that look for a table called wp_users or wp_options, etc. By changing the database prefix to anything other than wp_, you make it significantly harder for hackers to target your database tables, because they no longer know the exact table names.
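For a fresh install, the prefix is set in wp-config.php. A sketch (the value below is just an example; changing the prefix on an existing site also requires renaming the existing tables):

    // wp-config.php - use any prefix other than the default wp_
    $table_prefix = 'x7q3_';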

Wp-config.php and .htaccess are two powerful files used by WordPress. Hiding these files helps prevent hackers from gaining access to them. Similarly, be sure to check that your hosting configuration does not store a copy of either file. Depending on how the file is duplicated, it may become accessible to hackers, leaving your website in a very vulnerable place.
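Both files can be locked down from within .htaccess itself. A minimal sketch, using Apache 2.4 syntax:

    # Deny all direct web access to wp-config.php and .htaccess
    <FilesMatch "^(wp-config\.php|\.htaccess)$">
      Require all denied
    </FilesMatch>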

Disabling file editing removes the ability to change theme or plugin files from the dashboard. If a hacker has made their way into the WordPress dashboard, but not your server directly, they will not be able to do as much damage.
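Disabling file editing and the XML-RPC endpoint (two items from the list above) each take one line. A sketch, assuming you can edit wp-config.php and your theme's functions.php:

    // wp-config.php - remove the theme and plugin editors from the dashboard
    define( 'DISALLOW_FILE_EDIT', true );

    // functions.php - turn off the XML-RPC endpoint entirely
    add_filter( 'xmlrpc_enabled', '__return_false' );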

Duplicate Content 100 / 100

  • Force the use of HTTPS://
  • URL canonicalization
  • IP canonicalization
    • IP address: IP Address
  • Remove file extensions
  • Force /index redirect

Forcing the use of HTTPS and using URL and IP canonicalization limits the number of copies of a website. Search engines want to crawl one version of each website as opposed to numerous copies of every website.

Without proper settings, http://example.com, https://example.com, http://www.example.com and the raw IP address could all be valid URL variations of the same website. With proper canonicalization, all of these links end in the same place.

Removing file extensions (.html, .php or .aspx) makes the URL easier to read and can reduce duplicate content.

Forcing /index to redirect further reduces duplicate content.
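All four of these rules typically live in the .htaccess file. A minimal sketch for Apache (example.com and the .php extension are placeholders; adjust for your site):

    RewriteEngine On

    # Force HTTPS and the www version of the domain (URL canonicalization)
    RewriteCond %{HTTPS} off [OR]
    RewriteCond %{HTTP_HOST} !^www\. [NC]
    RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

    # Redirect /index.php or /index.html back to the bare directory URL
    RewriteCond %{THE_REQUEST} \s/+(.*/)?index\.(php|html)[\s?] [NC]
    RewriteRule ^ /%1 [R=301,L]

    # Quietly serve page.php when /page is requested, hiding the file extension
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteCond %{REQUEST_FILENAME}\.php -f
    RewriteRule ^(.*)$ $1.php [L]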

Required Files / Pages 100 / 100

  • Files
    • Favicon
      • Favicon.ico in root
    • Touch icons
    • Robots.txt
      • Robots.txt has location of sitemap.xml
    • Sitemap.xml
      • Sitemap.xml Index
  • Pages
    • HTML sitemap
    • 404 not found
      • Search box on 404
      • Sitemap on 404
      • Contact information on 404
      • Contact form on 404
    • Terms & conditions
    • Privacy policy
    • Linking policy

Files

A favicon is the small picture at the top of a browser tab. Favicons help users differentiate between multiple tabs.

The favicon is also used in other ways, and placing a favicon.ico file in the root folder ensures all programs can access it.

Touch icons are much like the favicon, but specific to browsers or devices.

A robots.txt file tells search bots which pages not to crawl or index. This file limits the number of pages a search bot has to find and index, which in turn means the website is indexed faster and more accurately.

Adding the location of the sitemap.xml to the robots.txt tells the search bots where to find the sitemap.xml. This again helps get your website indexed faster and more accurately.
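A minimal robots.txt sketch (the paths and domain are examples only):

    # robots.txt - keep bots out of the admin area, point them at the sitemap
    User-agent: *
    Disallow: /wp-admin/

    Sitemap: https://www.example.com/sitemap.xml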

A sitemap.xml is a list of all pages in a way that is easy for search bots to read.

Including an index for sitemap.xml helps search bots find and index the pages of large websites.

Pages

An HTML sitemap is a list of all pages in a way that is easy for humans to read. Search bots also use the HTML sitemap to help cross check the sitemap.xml.

A 404 page ("page not found") improves human user experience by getting them to a usable page. A good 404 page will include a HTML sitemap, a search tool and a message that makes it clear the page was not found. I also include a contact form and contact information.

A 404 page also indicates to search bots that the page was not found. When a search bot comes across a 404 page as opposed to the actual page, it triggers a notification in Google Webmaster Tools to help notify the webmaster of the issue.

A Terms & Conditions page can limit the website owners liability.

A Privacy Policy shows visitors you respect their privacy and will not use the information they provide in any malicious way. Keep in mind that just the use of a website offers up a host of information about the user in the form of analytics.

Having a Linking Policy is key. Although back-linking can increase organic search results, search engines are placing a high emphasis on sites only having relevant back-links. Large numbers of back-links are often the result of spamming sites and cause more work for the search engines.

If you have too many back-links from unrelated sites you will be harshly punished. Removing these “black hat SEO” back-links takes tremendous effort. By having a linking policy you are signaling to search engines that you’ll help police links to your site and that you will not engage in black hat link building. Furthermore, a linking policy may help you if you need to request that a bad back-link be ignored by search engines, which is an increasingly difficult thing to achieve.

Site Speed 100 / 100

Making websites faster all comes down to less is more. The less a computer has to read and process, the faster the website will be. Fast websites make for a better user experience. Google’s Page Speed Test is one of the best testing tools; GT Metrix and Pingdom both provide unique page speed insights as well.

  • Use the cleanest code possible. Write error free code and less of it.
  • Leverage caching to maximize the re-use of items.
  • Make the fewest requests possible (10-20 or less).
  • Create the smallest packet possible (size in MB).
  • Use a high powered server to deliver content – perhaps the use of a Content Delivery Network
  • Caching
    • Server side cache
    • Client side cache
    • Remove version from URL
  • Requests
    • Enable keep alive
    • Small number of requests
    • Prioritized requests
    • Use of sprites
    • Use of font based icons
  • Packet size
    • Packet Size under 1MB
    • Enable Gzip or Deflate
    • Minify HTML
    • Minify CSS
    • Minify JS
    • No use of inline styles
    • Serve scaled images
    • Serve optimized images

  • GT Metrix Page Speed Score: Score / 100
  • GT Metrix YSlow Score: Score / 100
  • Google Page Speed Score Desktop: Score / 100
  • Google Page Speed Score Mobile: Score / 100
  • HTTP Requests: 20
  • Packet Size: 0.655MB

Caching

Most modern websites use a language that runs on the server where the website is hosted. These server side languages allow developers to create more web pages in less time.

When someone requests a webpage, the server side language runs and assembles an HTML document to send to the client (the end user). The client’s device then renders the HTML document as a webpage.

Server side caching saves those HTML documents, so the server side language does not have to assemble it every time someone requests a webpage. Not processing the server side language to create an HTML document for each and every request saves a tremendous amount of server side processing power.

Using WordPress? Try Comet Cache for a simple and effective caching solution.

Browsers and devices also save items and re-use them in order to speed things up. If you have visited a web page (and client side caching is properly configured), the browser will store a copy of major website elements.

As an example - rather than downloading the logo every time you go to a new web page on a website, the computer can store that image and re-use it.

Client side caching is enabled and configured via the .htaccess file.
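A minimal sketch using Apache's mod_expires (the lifetimes are examples; tune them to how often your files actually change):

    # .htaccess - tell browsers how long to keep a local copy of each file type
    <IfModule mod_expires.c>
      ExpiresActive On
      ExpiresByType image/png "access plus 1 month"
      ExpiresByType image/jpeg "access plus 1 month"
      ExpiresByType text/css "access plus 1 week"
      ExpiresByType application/javascript "access plus 1 week"
    </IfModule>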

CSS and JS files often come in versions. Calling the resource with a version in the URL (/custom.js?ver=4.3) can effectively prevent that resource from being stored in a cache.
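Using WordPress? The version string can be stripped with a small filter. A sketch for your theme's functions.php (the function name is just an example):

    // functions.php - remove ?ver= query strings so assets can be cached
    function scw_remove_ver( $src ) {
        return remove_query_arg( 'ver', $src );
    }
    add_filter( 'script_loader_src', 'scw_remove_ver' );
    add_filter( 'style_loader_src', 'scw_remove_ver' );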

Requests

Without keep alive enabled, a new connection to the server may occur for each file requested. Connecting to the server can be a relatively time consuming process. Eliminating the need to re-connect for each file request increases the speed at which the website downloads.
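Keep alive is often on by default, but it can be requested explicitly in .htaccess. A sketch, assuming Apache with mod_headers available:

    # .htaccess - ask the server to reuse one connection for multiple files
    <IfModule mod_headers.c>
      Header set Connection keep-alive
    </IfModule>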

To further speed up a website, combine resources to reduce the number of requests required for a webpage. Compile CSS and JS into one file (1 CSS and 1 JS, two total), rather than loading multiple of each.

Use what you need on that specific page. If a particular script is only used on one page, only load it on that page, not every page of the website.

Using WordPress? Make use of the wp_dequeue_script and wp_dequeue_style functions.
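A sketch of the dequeue approach ('my-slider' and the function name are hypothetical; use the handles your plugins actually register):

    // functions.php - only load the slider script and style on the home page
    function scw_trim_assets() {
        if ( ! is_front_page() ) {
            wp_dequeue_script( 'my-slider' );
            wp_dequeue_style( 'my-slider' );
        }
    }
    add_action( 'wp_enqueue_scripts', 'scw_trim_assets', 100 );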

Once the resources have been effectively combined, load them in a prioritized manner. Depending on the resource and the browser, requesting a resource may cause all other resources to stop downloading. Load only the most essential items in the head, and load the rest at the bottom of the body.

Test speed, make changes, and check again in order to optimize the order and combination of resources.

A sprite combines many images into one image to save download time. One larger image downloads and processes faster than multiple smaller images.

Font based icons load faster than image based icons because a font file is much smaller than an image file. An additional bonus is that font based icons scale seamlessly from mobile to the living room.

Packet Size

Enabling Gzip or Deflate and minifying the HTML, CSS and JS files all make the files for a website smaller, and thus the website faster. They do, however, work differently, and both methods should be used: Gzip compresses duplicate strings, whereas minification removes whitespace and comments.

Gzip is enabled in the .htaccess file. Minification can be achieved dynamically via a server side script, or during the compiling process.
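A minimal mod_deflate sketch for .htaccess:

    # .htaccess - compress text based files before sending them to the browser
    <IfModule mod_deflate.c>
      AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>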

Using WordPress? Try the Autoptimize plugin to minify HTML, CSS, and JS.

Inline styles are inefficient by design and were replaced by CSS quite some time ago. Inline styles act on one item at a time, whereas CSS can style many items at once, which makes CSS much more efficient.

Serving scaled images means using smaller images, particularly on mobile devices. Rather than making the images look smaller through CSS, the website loads images that are actually smaller. Smaller images download and process faster than larger ones.
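HTML's srcset attribute lets the browser pick the smallest adequate file (the file names below are examples):

    <img src="hero-800.jpg"
         srcset="hero-400.jpg 400w, hero-800.jpg 800w, hero-1600.jpg 1600w"
         sizes="100vw"
         alt="Hero image of the product">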

Images are files and, just like HTML, CSS or JS, can have extra information attached to them. Just like removing white space and comments from your code, you can also optimize your images to shrink them. Try opening an image with a text editor and you will quickly realize how much information is stored in a simple image file.

Using WordPress? Try using WP Smush By WPMU DEV or EWWW Image Optimizer.

Site Structure 100 / 100

  • Silo structure
  • No underscore ( _ ), & or ? in URL
  • No numbered URLs (/example-2)
  • Menu style search result

  • Pages indexed by Google (approximate): 53
  • Content Management System: WordPress

Silo structure makes it easier for search bots to find and index the pages of a website. Silo structure also organizes the content of a website for a good human user experience.

The structure of a particular URL is very important. To search bots, an underscore in a URL means the two words are read as one word, while a hyphen means the two words are read as two words.

example.com/about_us is seen as page 'aboutus'
example.com/about-us is seen as page 'about us'

& and ? are ok for certain URLs, such as when a search is performed, but otherwise are an indicator of spam URLs.

Numbered URLs generally come from poor site planning, poor use of a content management system, or spam URLs.

Menu style search results (a list of pages below the main URL in a search result) are a result of good site structure as well as proper use of Google Webmaster Tools.

3rd Party Tools 100 / 100

  • Google analytics
  • Google webmaster tools
  • Bing webmaster tools
  • Other analytics tools

Google Analytics tracks website usage. Understanding where your traffic comes from, where visitors go onsite, and when visitors leave can be very helpful in optimizing your website experience.

Google Webmaster Tools helps a webmaster monitor and improve a website. Webmaster Tools allows you to add a sitemap to aid Google in finding your site. Webmaster Tools also warns you of crawl errors (typically missing or moved pages) and security risks. It's a big deal and you should absolutely use Google Webmaster Tools.

Google Analytics is not the only website tracking option. There are dozens of alternatives, any of which may be crucial to your website on a case by case basis.

Per Page Items

Once you have made sure all of your sitewide settings are correct, you must check each and every page. Building correctly from day one remains the key. I use this checklist to construct each page, but also as a post build quality control check.

Page Structure 100 / 100

The core HTML structure of webpages is often overlooked or misunderstood. That is a shame, because proper structure makes pages easier to use and easier for search bots to read and index.

Heading tags (H1, H2, H3 etc) are fundamental elements for the structure of a web page. Web pages should ideally be structured similar to an outline. Headings serve as markers within the page to help define blocks of content.

Unfortunately, many web developers use these structure elements for their associated styles. This style over structure methodology is wrong and should be avoided.

This is not to say the page should not be styled, quite the contrary. However, a properly structured page is much easier to style than a page created using structure elements for their associated styles.

  • 1 H1 tag
    • Text in H1 - not an image
  • 2-6 H2 tags
  • 1-3 H3 tags
  • H4 - H6 tags as applicable
  • Use of UL, OL and DL
  • Div Based Structure

Every web page should have exactly one H1 tag, no more and no less. The H1 tag should always include text, and not an image.

H2s - H6s, ULs (unordered lists), OLs (ordered lists) and DLs (description lists) should be used as applicable. The key here being "as applicable" and used correctly. Haphazardly using headings is not correct. The idea is to properly structure your document into a clear and intelligible thought process.

Proper heading structure dictates that headings be sequential and nested. Except for the H1, every heading should have a lower numbered heading above it.

Ex: If you use an H3 tag, there should be an H2 tag above it in the document.
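A simplified sketch of a properly outlined page, using this document's own sections:

    <h1>Website Quality Assurance &amp; Quality Control</h1>
      <h2>Site Security</h2>
        <h3>WordPress Specific Security</h3>
      <h2>Site Speed</h2>
        <h3>Caching</h3>
        <h3>Requests</h3>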

Div based structure is a flexible structure required for mobile design. The alternative structure is tables, which you do not want to use for layout. Much like inline styles, layout tables have long been obsolete.

Page Content 100 / 100

  • 500 word minimum
    • Word count: 650
  • Reading Level
    • Reading level (School Grade): 8.5
  • 50 links or less
    • Number of links: 25
  • Use of HTML (not XHTML)
  • Use of HTML5 elements
  • Images have good file names
  • Content above the fold
  • Structured Data
  • No use of Flash

500 words is a rough word count minimum for a page to "count". Pages below 500 words tend not to be indexed or highly rewarded in search results. This page clocks in at over 4,600 words.

Targeting a reading level of 6th - 9th grade strikes a balance between insulting your audience by talking down to them and talking over their head. Test the reading score of your site.

50 links or less is a guideline, not a hard set rule. Too many links may trigger a spam alert from search bots. Too many links may also present a poor human user experience resulting from too many options.

An alternative to HTML is XHTML, which is now outdated and obsolete. The syntax of XHTML is different from HTML. Computers can still read XHTML, but it is like reading Old English.

The use of HTML5 elements indicates to search bots that the website uses the most up to date code. HTML5 uses all of the elements of HTML4 and some new ones too. See the elements new to HTML5 on W3schools.com.

Flash (a type of programming) should be avoided because not all devices support Flash based code.
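Structured data, one item from the list above, labels your content so search engines can understand it. A minimal JSON-LD sketch using this site's own details:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Some Code i Wrote",
      "telephone": "(720) 854-5077"
    }
    </script>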

Page Errors 100 / 100

  • Spell check
  • No broken links
  • No broken images
  • Able to load all resources
  • No JS errors
  • No CSS errors
  • No HTML errors
    • Alternative text for all images
    • Natural width and height for all images
    • Correct use of HTML5 Elements
    • Correct structure and syntax
    • No typos / space errors
    • Escape all characters
    • No use of obsolete code
    • No duplicate ID

Total number of errors on page: 0

A lack of spelling errors not only shows professionalism, it tends to give a small bump in search. Consider that search bots also read and judge the reading level of the content. Try Respelt.com to quickly check any web page or website for spelling errors.

Broken links are a bad human experience, but also negatively affect search bots. Search engines want to find and index every page of the internet. A broken link is then a break in the chain that is the world wide web.

Broken images are most often the result of an incorrect source URL. When a browser can not load an image, it not only loads the page more slowly, but it causes larger usability issues.

Most often a browser can not load a resource because the resource is being called insecurely (over HTTP) on a secure (HTTPS) page, or because it is being called from a separate domain. Anytime a resource can not be loaded, the web page loads more slowly and delivers a poor experience.

JavaScript errors often cascade in the worst way possible. Not only can JavaScript errors cause on page functionality to deteriorate in a hurry, they often prevent other resources from loading. Bear in mind that loading a JavaScript resource halts other items from loading. This means the simplest of errors in a script can effectively break the entire page.

Pages with HTML errors take longer to load and often have a poor human experience. HTML errors are the main source of “broken” pages and poor cross browser performance. You can check any web page using the W3C Markup Validation Service.

A final thought on HTML errors. Webpages are HTML documents that are read and rendered from top to bottom. Small errors at the top of the document often cascade down the document and cause much larger issues further down the document (web page).

Alternative text on all images is required for valid HTML documents and is intended for the visually impaired.

Natural width and height on images make pages load faster and maintain the page structure if an image does not load.
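Both show up as attributes on the img tag (the file name, text and dimensions here are examples):

    <img src="logo.png" alt="Some Code i Wrote logo" width="200" height="60">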

Correct use of HTML5 elements is key. The incorrect use of HTML5 elements is far worse than not using HTML5 elements.

Incorrect syntax and structure is far more common than it should be. Putting items in the wrong place or order makes web pages load slowly and/or break, particularly in older browsers.

Typos or space errors can cause havoc. Something as small as a missing space can confuse the computer.

Not escaping all characters can also confuse the computer.

Most browsers will typically render obsolete code, but the name says it all. Replacing obsolete code is the bare minimum of keeping code up to date.

IDs by definition are unique objects on a page, and therefore can not appear more than once. If an ID appears on a page more than once, the browser is unable to determine which object the ID refers to.

Classes are applied when a type of object appears more than once on a page. Duplicate IDs are a very important and basic programming flaw that many developers overlook. Duplicate IDs should never, ever be allowed into production.
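A quick sketch of the difference:

    <!-- Wrong: the same ID appears twice -->
    <div id="card">First</div>
    <div id="card">Second</div>

    <!-- Right: a shared class for repeated items; any IDs stay unique -->
    <div class="card" id="pricing-card">First</div>
    <div class="card" id="contact-card">Second</div>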

Page Usability 100 / 100

  • High contrast between text and background
  • Solid background for text
  • Large size font
  • Good experience across all browsers
  • Fluid responsive grid
  • Large space between links on mobile
  • Pass Google's mobile friendly check
  • Appear in searches from mobile
  • Images have good descriptive alternative text
  • Breadcrumbs
  • 1st Link Skips Navigation
  • @media print Rules

  • Google User Experience Score: 98 / 100
  • Number of small tap targets (link space): 0

High contrast between text and background, a large font size and a solid background for all text make the text easier to read.

There are no bad web browsers. Developers who dislike a browser often fail to admit that trouble in a specific browser comes down to nothing but poor programming choices. A poor experience in a specific browser is most often created by plain and simple errors. Close behind errors, browser specific problems are caused by using code a browser can not read and render. The browser is not bad or at fault; it is simply reading and rendering what it was given.

Websites must render well across a full range of devices from the smallest of smart phones to the largest of smart TVs. A fluid responsive grid ensures that the website delivers a usable and attractive experience across all screen sizes.

The easiest way to ensure a smooth fluid grid? Use a mobile first front end framework like Bootstrap. Compare front end frameworks.

Large space between links on mobile prevents accidentally navigating to an incorrect page. You want large tap targets on mobile.

Some search engines are so concerned with a good mobile experience, they may not display your site in a search from mobile. Check to see if you pass Google's mobile check. Then confirm your site displays in a search from a mobile device.


[Screenshot: Google's mobile friendly test showing a passing result]
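Two checklist items above, the navigation skipping first link and @media print rules, deserve a quick sketch (the names are examples):

    <!-- First link on the page lets keyboard users jump past the navigation -->
    <a href="#main-content">Skip to main content</a>

    <main id="main-content">Page content here</main>

    /* CSS: strip navigation and other chrome when the page is printed */
    @media print {
      nav, .sidebar { display: none; }
    }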

Page Conversions 100 / 100

  • Phone number
  • Email
  • Address
  • Map
  • Contact form
  • Newsletter sign up
  • Lead capture
  • E-Commerce / online conversion
  • “Next steps” / no dead ends
  • Multiple paths to same conversion

Easy to find and easy to use on-page conversions are crucial to effective internet.

Because each user will want to engage with you in a different way, offering many different ways for the user to contact you is key.

E-Commerce in this context refers to any online, automated conversion fulfillment. Contact forms, book an appointment or make a reservation are all forms of E-Commerce. Think of it as E-Conversion as opposed to E-Commerce.

Next steps loosely refer to leading links at the bottom of pages, or any kind of leading next step. A method used to stop dead ends, the idea is to always have a next step for the customer to engage in.

One next step should always be a strong conversion, or ask. Another next step should include a logical next step for the customer along their decision making process.

Creating multiple paths to conversion reflects a more modern approach to sales. The old and antiquated sales funnel methodology has been replaced with a more realistic and flexible customer journey map. Consider that your customer may find your site at multiple points in their decision making process, and allow their journey to unfold as they desire.

Meta Data* 100 / 100

* Please note, most search engines no longer use meta data for search ranking.

Required

  • Title
    • Title length (less than 50 characters)
      • Title character count: 45
  • Viewport

Recommended

  • Description
    • Description length (less than 150 characters)
      • Meta description character count: 145
  • Author
  • Image_src
  • Keywords

The title is a required portion of all HTML documents. Although it varies by browser, the title often appears at the top of a browser tab. The title should be 50 characters or less.

The meta viewport is essential and all but required for responsive design. The meta viewport tells the device how big the webpage should be and how it should fit within the device.

A meta description is a short (approximately 150 characters or less) description of the web page. It may be the snippet that appears on a search result and often can affect click through rates from search.

Meta author is sometimes used in a referring link and indicates who wrote the web page.

Meta image_src is often used as the image when a referring link is added to a website.

Meta keywords were once the king of search but are now mostly overlooked. Although not used for search ranking, the meta keywords are still used by many web applications and should be included.
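Pulled together, the head of a page might look like this (all values are examples; image_src is technically set with a link tag):

    <head>
      <title>Website QA &amp; QC Checklist | Some Code i Wrote</title>
      <meta name="viewport" content="width=device-width, initial-scale=1">
      <meta name="description" content="A sitewide and per page checklist covering security, speed, structure and conversions.">
      <meta name="author" content="Scott Starkweather">
      <meta name="keywords" content="website quality assurance, quality control, SEO">
      <link rel="image_src" href="/images/share-card.png">
    </head>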

View a complete list of available meta tags here.

Only you can prevent bad internet.

Created by Scott Starkweather. © Some Code i Wrote