
Achieving the "Optimize" in SEO

Website architecture has been a focus for SEOs (Search Engine Optimizers) for a long time now, but over the past few years it has become even more important. That's because architecture is the foundation of the entire website. It affects both how visitors interact with your site and whether search engines can properly analyze your optimized content.

But website architecture shouldn't just be a concern for SEOs. It should be a concern for website developers as well. Whenever we hear someone is designing a new website, we are quick to recommend that they build a strong, search-engine-friendly website right from the start, even if they don't immediately hire an SEO. Ignoring the meta tags and content that support higher rankings can cost far more to fix later, after months of lost opportunities.

Descriptive Titles and Page Names

Your file names (or aliases, if you use a CMS) and page titles can be the simplest and most important search engine step you take toward getting indexed properly. Consider the page featuring your bio: it might be named bio.php and carry a title tag like "Meet John Doe, Columbia's Award Winning Dental Implant Specialist". A better filename would be john_doe_bio.php, which gets John Doe extra exposure on the search results page.
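As a minimal sketch using the hypothetical names from above, the head of that bio page might look like this:

```html
<!-- Hypothetical bio page, saved as john_doe_bio.php rather than bio.php -->
<head>
  <title>Meet John Doe, Columbia's Award Winning Dental Implant Specialist</title>
</head>
```

The descriptive filename reinforces the title tag, so the person's name appears in both the URL and the headline of the search result.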

You will hear me preach over and over again that "content is king," and another great example of that is how your rating is weighted by how well your title, description, and keyword tags actually match what is on your page. As SEO (Search Engine Optimization) has evolved over the years, many a marketing trickster has tried to outsmart the indexing systems. I once had a client who paid $500 to a company that rewrote his content to fit his keywords, trying to get a higher result in Google.

The content was horrible! It read like a bad yellow pages listing (the names have been changed for this example):

Call Bogus Plumbing Service, Plumbing, Drain Service, Denver, Colorado for your Plumbing, Drain Service needs. John Doe of Bogus Plumbing Service, Plumbing, Drain Service, Denver, Colorado has been a professional plumber, drain cleaning in Denver, Colorado for 40 years. John Doe of Bogus Plumbing Service, Plumbing, Drain Service, Denver, Colorado is a licensed and insured plumber in Denver, Colorado. ....

Yuck!

It really doesn't get much worse than that! The client could not understand why he was not getting any calls from this great placement. I finally dissuaded him from his obsession with that coveted and ever-elusive number-one spot on the search results and steered him toward attracting customers with well-written services and testimonial pages. His plumbing company did eventually find its way onto the first page of Google results for his preferred keywords.

Keywords Metas Can Be Misleading

Actually, keyword development is pretty simple. You find the most important words in the page's content and prioritize them down to the top 10 words or phrases. Then you make sure they appear in the meta tag labeled Keywords. Problems only begin when the desired keywords are not found in the content. This results in lower ratings, and sometimes your page will even be kicked from the index as spam content.
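A minimal sketch of that tag, using hypothetical keywords drawn from the plumbing example:

```html
<!-- Keywords meta tag: every term here should also appear in the page's visible content -->
<meta name="keywords" content="plumber, drain cleaning, kitchen drain clog, Denver">
```

Keep the list short and honest; as noted below, modern engines index your full content and give this tag little weight on its own.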

We generally ask our clients to tell us the keywords they think the public would use to find their company or service. Then we ask them to ask friends or family what they might enter in Google to find them. We do, however, steer them away from trying "John Doe, Bogus Plumbing, Denver", which assumes the searcher already knows what or who they're looking for. We want to understand what a stranger might enter to look for their type of service: "kitchen drain clog, plumber, Denver" might be more accurate and attract more visits.

Once we have a consensus list of search words or phrases, we examine how the content on the home, services, and other relevant pages can reflect them while still making sense. Years ago, the Keywords meta tag was critical to being found; now, with more sophisticated indexing algorithms, your entire content is indexed and ready for searching. The Description tag is now by far the more important of the metas.

A Search-Engine-Friendly Development Checklist

Make sure your hosting includes SSL. Currently, you must have SSL encryption installed on your domain. In the past, HTTPS/SSL security was reserved solely for the e-commerce sections of the website, to protect sensitive personal information such as credit card numbers. However, major browsers now recognize the importance of overall security, and without HTTPS they will flag your site as "Not Secure" and may warn visitors away. Some hosting companies charge additional annual fees to add SSL encryption.

Keep your security certificate current. Expired security certificates can wreak havoc for your visitors, giving them all kinds of nasty notices in their browser that are likely to scare them off. Keep an eye on your certificate renewals to stay ahead of this.

Allow indexing of site via the robots.txt. Every now and then when a new site rolls out, the developer forgets to change the robots.txt file to allow the search engines to crawl the pages. If your Web marketer doesn’t think about checking this file, you could spend months wondering why you’re not getting the traffic you expected. Double-check your robots.txt file to make sure it does not “disallow” search engines from crawling your site.
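The fix is a one-line check at the site root. A minimal robots.txt that allows all crawlers looks like this (an empty Disallow value means nothing is blocked):

```
# robots.txt at the site root -- permits all crawlers to index the whole site
User-agent: *
Disallow:
```

The dangerous leftover from development is `Disallow: /`, which blocks everything; that single slash is exactly what to look for before launch.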

Declare your document type. The page’s “doctype” tells the browsers how to translate each Web page. Without a properly declared doctype, the browser has to guess. For the most part, its guess will be correct, but some things simply may not translate properly. Search engines use this to make sure they are analyzing each part of your site correctly.
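For modern pages the declaration is a single line at the very top of the document:

```html
<!DOCTYPE html>
<html lang="en">
<head><title>Example Page</title></head>
<body></body>
</html>
```

Without that first line, browsers fall back to "quirks mode" rendering, which is where the guessing and mistranslation described above come from.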

Use valid CSS. While invalid CSS won’t necessarily affect your rankings, it is yet another thing that can cause your page to be translated incorrectly by the browser or the search engine. Proper translation of each page ensures everyone sees what you want them to see.

Make your CSS and JavaScript files accessible. Don’t hide your CSS and JavaScript files from search engines. This information is important to helping them render the pages correctly, so they know how to analyze each part appropriately. It’s possible that if the search engines are unable to tell how you’re treating different content, key components won’t be given the value they deserve.

Avoid using proprietary language. Free site builders and Microsoft builder apps often create pages using very app-specific coding and markup. While the result may display perfectly in Edge or Chrome, does it appear as expected in all browsers and on all platforms?

Add descriptive image alt attributes. Any image that is called for in the code of the page (rather than via CSS) should use an appropriately labeled alt attribute. This is a minor thing, but it’s generally just a good practice to remember as the images are being added.
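A quick sketch, with a hypothetical image name from the plumbing example:

```html
<!-- The alt text describes the image content for search engines and screen readers -->
<img src="/images/bogus-plumbing-van.jpg"
     alt="John Doe standing beside the Bogus Plumbing service van in Denver">
```

Describe what is actually in the picture rather than stuffing keywords; the alt text also appears if the image fails to load.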

Redirect old URLs. Inevitably, there will be some URL changes in any site redesign. Before you remove the old site, capture all the current URLs so you can 301 redirect any URLs that may have changed or are no longer valid. By 301 redirecting these URLs, you can capture most of the authority value any of those pages may have earned in the past and pass it to the corresponding new pages.
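On an Apache server, these mappings can be as simple as one line per moved page in the .htaccess file (the old and new paths here are hypothetical):

```
# .htaccess (Apache mod_alias) -- map each old URL to its new home
Redirect 301 /bio.php /john_doe_bio.php
Redirect 301 /old-services.html /services/
```

Each line tells browsers and search engines alike that the move is permanent, so the old page's accumulated value follows it to the new address.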

404 bad URLs. And just in case you missed any 301 redirects of old URLs, be sure that any invalid URL returns a 404 code along with a properly designed 404 page. A helpful 404 page can make the difference between a visitor navigating back to the correct page and leaving for a competitor.

Forget printer-friendly pages. Developers used to create “printer-friendly” pages that had their own URL. This is no longer necessary and is in fact bad practice. Use CSS to make sure any page on your site is printer-friendly, removing things that don’t make sense for the printed page and using formatting that is better suited for paper.
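A minimal print stylesheet does this with a single media query (the class names here are hypothetical):

```css
/* Print stylesheet: same URL, paper-friendly presentation */
@media print {
  nav, .sidebar, .ad-banner { display: none; }      /* drop screen-only chrome */
  body { font: 12pt/1.4 serif; color: #000; background: #fff; }
}
```

Because the rules live in CSS, every page on the site becomes printer-friendly at its normal URL, with no duplicate pages for search engines to stumble over.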

Emphasize clickable links. Underlined text is still the universal indicator of a hyperlink, but you can also use accent colors and hover effects to help visitors recognize links.

Include a search app. A search box allows visitors - and search engines - to easily find content that may not be directly reachable through your navigation.

16. Establish a proper page hierarchy. Page URLs should use an established hierarchical format that mimics the navigation of the website. Navigational categories and subcategories should be represented in all URLs.

17. Have a balanced directory structure. When developing the navigation/page hierarchy, strike a good balance between shallow and deep. You don’t want visitors to have to make too many clicks before finding the content they want. However, too many options on the home page generally prevent visitors from making a reasoned selection. Instead, they tend to click the most convenient link rather than searching for the right one.

18. Write unique title tags. Every page of the site should start with its own unique title tag. You don’t have to go all SEO on it if time doesn’t permit, but having a title that represents the content of the page is a must for rolling the site out. Keep each one between 35 and 55 characters.

19. Write unique meta descriptions. See above. A good description should be between 100 and 155 characters.

20. Use properly coded lists. Use proper HTML code (<ul>, <ol>, and <li>) for bulleted and numbered lists. This tells the browser and search engine that a piece of content is an actual list item, which can affect how that text is translated for search value.

21. Reduce code bloat. As development progresses and new features are added to a site, it’s easy for the code to become bloated. Many times, developers are looking for the easiest/quickest way to do something, but that is often the most bloated way as well. Code bloat slows down page speed, so it’s best to keep it to a minimum.

22. Reduce HTML table usage. Like frames, tables are on their way out of common usage, as there are much more streamlined ways to do the same thing. Unfortunately, it’s often easier to create and manage tables. Avoid using tables whenever possible, and use CSS instead for content that needs to have the table-style layout.

23. Use absolute links in navigation. Developers like to use relative links because it makes it easy to move a site from a development server to the live URL. However, relative links can lead to problems with interpretation and scraping. I recommend using absolute links whenever possible, but at the very least in the site navigation.

24. Implement non-spiderable shopping cart links. Any link into your shopping cart should not be spiderable by search engines. You don’t want search engines adding products to a cart just by following a link. Keep them out of all these areas so they stay focused on your content.

25. Disallow pages to keep search engines out. Use your robots.txt file to keep search engines from spidering pages they shouldn’t have access to. Disallowing these pages will keep the search engines from reading any content on the page; however, links to those pages can still end up in search results if the engines find other signals that give them an indication of the page’s value.

26. NoIndex pages to keep them out of SERPs. If you want to keep pages out of the search engine results pages (SERPs) completely, using the noindex meta tag is the better route to go. This tells the search engines not to index the page at all.
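The noindex route looks like this (note that the page must remain crawlable, since the engine has to fetch the page to see the tag):

```html
<!-- In the <head> of a page you want kept out of the SERPs entirely -->
<meta name="robots" content="noindex">
```

This is the key difference from a robots.txt Disallow: the Disallow blocks crawling but not necessarily listing, while noindex blocks listing but requires crawling.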

27. NoFollow links to keep them from passing value. If you don’t want any particular link to pass value to another page, use the nofollow attribute in the link code. Keep in mind that the link itself will cause a loss of link value from the page — it just won’t be passed to the page you are linking to.
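The attribute is added to the rel value of the anchor tag (the URL here is a placeholder):

```html
<!-- This link will not pass ranking value to its destination -->
<a href="https://example.com/untrusted-page" rel="nofollow">Example link</a>
```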

28. Check for broken links. Before you roll the site out, check for and fix any broken links. When crawling your site, you don’t want Google to find errors like this out of the gate, as that can diminish the site’s overall value score. You should do this again once the site is live, just to be sure something didn’t go wrong in the transfer.

29. Find ways to increase page load speed. There are always things you can do to improve site speed. Look for even the smallest of opportunities to make your pages load even faster.

30. Reduce the number of on-page links. Search engines recommend that any single page have no more than 100 links. But that doesn’t mean you have to approach that number before culling excessive links. Review your site navigation and key pages to ensure you haven’t used excessive linking.

31. Eliminate duplicate content. Do your best to prevent any duplicate content. This is especially important for e-commerce sites with multiple paths to similar information. Each page of content should have a single canonical URL. The rest should be eliminated. If you can’t eliminate all URLs that produce dupe content, use the canonical tag as a stop-gap measure.
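The canonical tag goes in the head of every duplicate variant, pointing at the one URL you want indexed (the product URL here is hypothetical):

```html
<!-- Every filtered, paginated, or tracking-parameter variant of this page points here -->
<link rel="canonical" href="https://www.example.com/products/blue-widget">
```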

32. Implement proper heading tag hierarchy. Each page should have one, and only one, H1 tag. The remaining top-level heading tags (H2-4) should be used for content areas only, reserving H5-6 for navigational headings.
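A sketch of that hierarchy on a hypothetical services page (indentation added only to show the nesting):

```html
<h1>Drain Cleaning Services in Denver</h1>   <!-- exactly one H1 per page -->
  <h2>Kitchen Drains</h2>
  <h2>Sewer Lines</h2>
    <h3>Camera Inspection</h3>
```

Headings should step down one level at a time; skipping from H1 straight to H4 muddies the outline that search engines read.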

33. Don’t use session IDs. This is another old technology that, perplexingly, is still being used today. There are far better means of tracking visitors through your site, so avoid using this method at all costs.

34. Use search-engine-friendly links. Make sure all your links (except those you deliberately want to keep away from search engines) use standard anchor tags with crawlable href attributes. The wrong link code, such as navigation that only works through JavaScript events, can inadvertently keep search engines away from very valuable content.

35. Implement structured data. Structured data is additional coding around key elements of content that helps the search engines understand the purpose or value of that content. This can affect how your site displays in the search results, as well as what information is presented to searchers altogether.
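As a sketch, a JSON-LD block for the article's fictional plumbing business might look like this (all values are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Plumber",
  "name": "Bogus Plumbing Service",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Denver",
    "addressRegion": "CO"
  },
  "telephone": "+1-555-555-0100"
}
</script>
```

JSON-LD is one of the markup formats search engines accept for structured data; it lives in the page head and can qualify the business for richer search listings such as local results.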

Final Thoughts

Implementing each of the suggestions above will push your site one step closer to being search-engine-friendly. My suggestion would be to pay attention to all of them, because rolling out a new site that isn’t completely search-engine-friendly can have disastrous results. If you wait until after the site rolls out, even if you fix problems quickly, you can still experience some negative long-term ramifications.

I suggest going through this list with your developer to make sure each item has been completed before approving the site to go live, even if that bumps the deadline a few weeks. Better to roll a site out slightly late than to push out a site that will tank your business and create more problems you have to dig yourself out of later.