As a frequent site auditor at iProspect and a WordPress developer, I see a lot of onsite SEO issues every day. Many web developers and designers still overlook important onsite SEO factors. Sometimes these are skipped to save time, but often it’s due to a lack of SEO knowledge. Either way, in the long run many of these issues will cost companies far more time to fix once digital performance becomes a priority.
1. HTML Source Code
When working with a CMS or building a website from scratch, a good website begins with clean, well-commented code. More often than not, I see websites that don’t use HTML and CSS as intended: HTML is meant to mark up content semantically, while CSS is for styling. In other words, don’t use <h1>, <h2>, <h3>, <strong>, <em>, <br> and other tags that happen to have a default visual treatment as basic design elements. It might be easier than creating CSS IDs and classes, but it’s just wrong.
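To make the distinction concrete, here is a minimal sketch (the class name and copy are invented for illustration):

```html
<!-- Wrong: a heading tag abused purely to get large, bold text -->
<h3>Free shipping on all orders!</h3>

<!-- Better: reserve headings for document structure and style with a class -->
<style>
  .promo-banner { font-size: 1.4em; font-weight: bold; }
</style>
<p class="promo-banner">Free shipping on all orders!</p>
```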
HTML is only becoming more semantic and more significant. It only takes a few seconds to grasp the semantic meaning of newer HTML5 tags such as <article>, <header>, and <footer>. While these tags may not yet carry weight as ranking signals, they highlight important content and should eventually play a part in Google’s algorithm similar to that of header tags, <strong> and <em>.
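For instance, a blog post marked up with these structural tags might look like this (a minimal sketch; the content is invented):

```html
<article>
  <header>
    <h1>Onsite SEO Checklist</h1>
  </header>
  <p>The body of the post goes here, marked up as ordinary content.</p>
  <footer>
    <p>Posted by the site’s author.</p>
  </footer>
</article>
```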
Another semantic improvement that will save you time as it starts to matter more is Schema.org markup. While not itself a ranking factor, it is quite simple to implement and will already make your entries in the SERPs much more visible. For example, many e-commerce websites use the markup to showcase positive reviews directly in search results.
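Here is a minimal sketch using Schema.org microdata for a product with review data (the product name and figures are invented):

```html
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5
    based on <span itemprop="reviewCount">87</span> reviews
  </div>
</div>
```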
2. Page Speed
A second issue that should be addressed during the development phase is page speed. A site that loads slowly or times out 10% of the time will not only annoy users but also be penalized by search engines. And it isn’t just server response time that comes into play: from minifying your resources to optimizing your CSS files, there are numerous factors to test and tweak to improve both your rankings and your user experience.
Thankfully, there’s a great tool to check the basic issues for you: Google PageSpeed Insights. There are also many free CDNs, code minifiers, and other tools to get you up to speed (pun intended).
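Beyond those tools, two quick server-side wins are compression and browser caching. Here is a minimal .htaccess sketch, assuming an Apache server with mod_deflate and mod_expires enabled:

```apache
# Compress text resources before sending them (mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Let browsers cache static assets (mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
</IfModule>
```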
3. Robots.txt
Robots.txt can be your best friend or your worst enemy. It can make the difference between your site being crawled and indexed or being ignored entirely by search engine bots. Check it twice before you launch a website, and make sure every page that should be indexed can actually be crawled.
All it takes is a typo for the robots.txt file to end up disallowing search engines from crawling your whole site. It’s equally important to make sure crawlers stay away from pages that should not be public, such as server caches, development sites, and confidential files.
For example, you may have pages or PDFs that require a user to perform an action, such as sharing or tweeting a gateway page, before they can be accessed. If such pages are crawled, users will be able to bypass your gateway with a simple Google search.
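To tie this together, here is a minimal robots.txt sketch; the directory names are hypothetical stand-ins for the kinds of paths mentioned above:

```text
# Applies to all crawlers
User-agent: *
Disallow: /cache/
Disallow: /dev/
Disallow: /downloads/gated/

# Beware: a single stray slash is all it takes.
# "Disallow: /" would block the entire site.
```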
4. Multilanguage & International Website Issues
Now let’s talk about multi-language and international websites.
First, it seems fairly straightforward that session IDs, cookies, or other mechanisms that do not create a unique URL should not be used to alter the content of a page. This is especially true when it comes to changing the language: without a unique URL per language, those versions simply cannot be indexed. Who would want a slew of pages left out of the index merely because they are in another language?
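For illustration (example.com is a placeholder), compare the two approaches:

```text
# Crawlable: each language version has its own unique URL
https://www.example.com/en/products/
https://www.example.com/fr/produits/

# Problematic: one URL whose content changes with a session cookie,
# so search bots will only ever see a single language version
https://www.example.com/products/   (language chosen by a "lang" cookie)
```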
Another issue I often come across when auditing multi-language/international websites is that hreflang isn’t being implemented nearly enough. Granted, the attribute is fairly new, but hreflang is the only way to tell search bots that two French pages with nearly identical content were created to target different dialects, such as European French and Québécois. With duplicate content drawing bigger penalties, hreflang is a must for international websites.
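A minimal sketch, with example.com standing in for a real domain; these tags go in the <head> of each language version, including a self-referencing entry:

```html
<link rel="alternate" hreflang="fr-fr" href="https://www.example.com/fr-fr/" />
<link rel="alternate" hreflang="fr-ca" href="https://www.example.com/fr-ca/" />
```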
5. Proper Tracking
Getting basic data from a website is very easy thanks to Google Analytics, but it takes a bit more work to get actionable data. It is essential to monitor more than just the overall landing-page bounce rate and the CTR of your SERP results. There is plenty of actionable data that can be used to build a solid content promotion or blogger outreach strategy, starting with which pages lead to conversions for which keywords. Be sure to also set up custom alerts, so you notice when organic traffic starts to grow or a landing page develops an overwhelmingly high bounce rate.
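For example, conversions that Google Analytics cannot see on its own can be recorded as events. A sketch assuming the standard Universal Analytics snippet has already created the global ga() function; the element ID and event names are invented:

```html
<script>
  // Record a click on a hypothetical signup button as a GA event
  document.getElementById('newsletter-signup')
    .addEventListener('click', function () {
      // Arguments: hit type, event category, event action, event label
      ga('send', 'event', 'Conversion', 'signup', 'blog-sidebar');
    });
</script>
```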
In conclusion, it’s important to look ahead to new and emerging onsite SEO practices from the moment you begin building a new website. Every new website should keep these points in check:
- Use semantically relevant HTML5 tags.
- Avoid using HTML tags for styling.
- Use Schema.org tags where appropriate.
- Test your site with Google PageSpeed Insights.
- Double-check and test your robots.txt.
- Use hreflang for multilingual websites.
- Study your analytics results.
This post was originally written by Philip Tomlinson.