
How much thought did you put into search engine optimization when developing your site? Isn’t SEO marketing’s job anyway?

Like performance, SEO is a team sport that requires company buy-in to see significant impact on your rankings.

Why should you care as a developer or designer? Research shows that a higher ranking on Google results in increased organic traffic to your site, higher conversion rates, and higher brand credibility.

With so much riding on search engine rankings, it is important to know who in your organization is responsible for SEO and why the best websites are optimized for both search engines and users.

Who is Responsible for SEO?

SEO is an ongoing process that is typically led by marketing but requires several other departments to execute effectively.

Of course, marketers play the biggest role by generating content, identifying keywords, and much more. Web developers also play a crucial role in SEO by designing a site that is easy for both users and search engines to use. If Google is not able to crawl your site, then all of marketing’s efforts are effectively useless.

The truth is, whether you’re working on a new website or redesigning an existing one, it is important to keep SEO best practices in mind. Marketing and developers working from the same playbook can multiply your SEO efforts.

Optimizing for Users and Search Engines

SEO and UX both focus on delivering great experiences to users. But while SEO focuses on driving customers to your website, UX focuses on engaging those customers once they get to your site.

But what is the point of having a lot of traffic on your site if your users are not engaging once they get there? And what is the point of having an amazing UX if there are no visitors coming to your site?

Since SEO and UX go hand-in-hand, it is important for developers to know how to optimize their site for both users and search engines.

Optimize for Users

One of the biggest factors affecting both SEO and UX is your site’s speed. We already know that higher load times usually mean higher bounce rates, lower average time on page, and lower conversion rates.

Keep in mind these three tips when optimizing your site for UX and SEO:

  1. Start by Baselining Your Site with Google’s PageSpeed Insights and Other Free Tools

    Google PageSpeed Insights is essentially Google’s set of guidelines for web performance best practices. It analyzes the front end of your website, suggests optimizations, and gives your site a performance score from 0 to 100. However, a perfect score doesn’t guarantee a fast site or a higher ranking.

    PageSpeed Insights checks to see if a page has applied common performance best practices, but it does not account for other factors (e.g. the type of device used to access the page, network speed, and cache reuse of shared resources) that can affect your page’s speed. So, while this score correlates with the speed of your page, it is not completely representative of the actual user experience.

    That is why it is also important to leverage other free tools to get a more accurate picture of how to optimize your site. Other popular, free tools for gauging your site’s performance include WebPageTest.org, Yahoo’s YSlow, and Rigor’s free performance report tool. (If you want to check your PageSpeed score programmatically, see the sketch after this list.)

  2. Decrease Site Bloat

    Web pages are getting more and more bloated. Shave off as much time as you can by optimizing images, compressing code, and loading fewer external scripts and ads. Content management systems (CMS), like WordPress or Magento, will cause your site to accumulate bloat unless carefully managed.

    Even if you don’t use a CMS, you still run this risk as you add content (e.g. images, new content pages, and so on) to your site over time. That is why it is important to continually optimize your images, combine your resources, and enable compression.

  3. Get on Board with Mobile-first Indexing

    Today, most Google searches (between 50 and 60 percent) happen on mobile devices. So, to make search results more useful to users, Google recently announced a “mobile-first” indexing system. For this reason, it is more important than ever to embrace responsive design.

    Instead of using resources to maintain a desktop website and a mobile app that are nearly identical, implement a single, responsive website. A single site that adapts to the user’s device (desktop, phone, or tablet) is easier to maintain than two digital assets and is a better use of your resources.

    The good news is that if you have a responsive site, you shouldn’t have to make any changes. But if you have a site configuration where the primary content and markup differ between mobile and desktop, you should consider making changes so your site is more responsive.

    Here is a great discussion on Stackoverflow.com about how to implement responsive design, and a blog post on auditing the performance of your responsive mobile site.

    Once these common performance issues are addressed, you can expect to see a faster load time on your site and increased traffic.
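
If you want to track that score over time rather than checking it by hand, PageSpeed Insights is also available as a public API. Below is a minimal Python sketch using the v5 endpoint; the target URL is a placeholder, the response shape should be verified against Google’s current documentation, and heavy use would require an API key.

```python
# Minimal sketch: query the PageSpeed Insights v5 API for a page's
# Lighthouse performance score. The target URL is a placeholder.
import json
import urllib.parse
import urllib.request

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def pagespeed_score(url: str, strategy: str = "mobile") -> float:
    """Return the Lighthouse performance score (0-100) for a URL."""
    query = urllib.parse.urlencode({"url": url, "strategy": strategy})
    with urllib.request.urlopen(f"{PSI_ENDPOINT}?{query}") as resp:
        report = json.load(resp)
    # Lighthouse reports the score as a fraction between 0 and 1.
    return report["lighthouseResult"]["categories"]["performance"]["score"] * 100

if __name__ == "__main__":
    print(pagespeed_score("https://www.example.com"))
```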

Optimize for Search Engines

When Google’s bots crawl your site, they don’t view it like a visitor would. These bots scan your site for meta content, keyword saturation, relevant content, and several other factors. Therefore, you need to know what search engines actually look for on your pages and how to optimize your site accordingly. 

In the spirit of collaboration, developers can support marketing’s SEO efforts by eliminating common website crawlability errors that restrict Google from indexing the site.

Start by Identifying Crawl Errors

First you must test to make sure Google is scanning all of your site’s pages.

Anyone (marketer or developer) can do this by typing “site:yoursite.com” into the Google search bar. If the number of results returned roughly matches the number of pages your site has (give or take a few), then you are in the clear. If you see a large gap between the number of results and the actual number of pages, Google might be having trouble crawling your site.

Next, look at your Google Search Console dashboard. If Google sees issues with your site, address those first; the dashboard will show you the error messages. The 404 HTTP status code is the most common error, but here is a full list of the most important status code errors for SEO.
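
It can also help to spot-check important URLs directly. Here is a minimal Python sketch using only the standard library; the URL list is a hypothetical placeholder for your own pages.

```python
# Minimal sketch: spot-check a few URLs for the HTTP status codes that
# matter most for SEO (404s and 5xx errors). The URLs are placeholders.
import urllib.error
import urllib.request

URLS = [
    "https://www.example.com/",
    "https://www.example.com/pricing",
    "https://www.example.com/blog",
]

def status_of(url: str) -> int:
    # HEAD avoids downloading the whole body; some servers disallow it,
    # in which case switch to a normal GET.
    request = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(request) as resp:
            return resp.getcode()
    except urllib.error.HTTPError as err:
        # 4xx/5xx responses arrive as HTTPError; the code is what we want.
        return err.code

for url in URLS:
    code = status_of(url)
    print(code, url, "" if code == 200 else "<-- investigate")
```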

Common Reasons Why Google Can’t Crawl Your Site

There are several different reasons search engines may not be indexing your site, but these are the most common:

  1. Crawler blocked by robots.txt or meta tags

    Pay close attention to the bot commands on your site, because some will prevent search engines from crawling part or all of your site. If you block a page from indexing through a robots meta tag, the search bot will not even start looking at your page’s content and will move directly to the next page.

    A nofollow robots meta tag will let the bot index your page’s content, but it will not follow the page’s links. And blocking pages from indexing through robots.txt can mean that all the website’s pages or key details are blocked from indexing. Google outlines everything you should know before using robots.txt here. (The first sketch after this list shows a quick way to verify that Googlebot can fetch your key pages.)

  2. Sitemap Errors

    Make sure there are no formatting errors and that you have included the correct pages in your sitemap. Here Google gives us the full list of format errors and instructions on testing and submitting your sitemap. (The second sketch after this list walks the URLs in a sitemap and checks each one’s status.)

  3. Broken Link Issues

    Broken links are a bad experience for both the user and the search engine. Not to mention, the bot will waste your crawl budget trying to index the broken links on your site and won’t reach relevant, quality pages. Make sure to check for URL errors, which are commonly caused by typos.

    It is also important to make sure you are not linking to old or outdated URLs. Lastly, mark pages with denied access as nofollow so that you are not wasting crawl budget on pages only accessible to specific users.

  4. Server Issues

    If you are seeing a 502 error, it may be a signal of a server problem. You should also make sure your site’s hosting server is not down, overloaded, or misconfigured. Check for issues in your server configuration that cause your site to inadvertently block Google. (The last sketch after this list shows one way to test whether your server treats Googlebot differently from a regular browser.)

    According to Google, “You might be blocking Google due to a system level issue, such as a DNS configuration issue, a misconfigured firewall or DoS protection system, or a content management system configuration.”

    Protection systems are usually configured to automatically block unusually high levels of server requests. Because Googlebot often makes more requests than a human user, it can trigger these protection systems and cause them to block Googlebot and prevent it from crawling your website.

    To fix these issues, identify which part of your website’s infrastructure is blocking Googlebot and remove the block.
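
Here is the first sketch: a minimal check, using only Python’s standard library, that Googlebot is allowed to fetch your key pages according to robots.txt. The domain and paths are hypothetical placeholders.

```python
# Minimal sketch: verify that Googlebot can fetch key pages per robots.txt.
# The domain and paths below are hypothetical placeholders.
import urllib.robotparser

parser = urllib.robotparser.RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()  # fetches and parses the robots.txt file

for path in ("/", "/products/", "/blog/"):
    url = "https://www.example.com" + path
    allowed = parser.can_fetch("Googlebot", url)
    print(("OK     " if allowed else "BLOCKED"), url)
```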
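
The second sketch pulls every <loc> entry out of an XML sitemap and confirms that each page responds successfully, which catches both sitemap errors and broken links. It assumes a standard XML sitemap at a hypothetical location.

```python
# Minimal sketch: fetch a sitemap, extract each <loc> URL, and check that
# it responds with a healthy status code. Assumes a standard XML sitemap.
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical location
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

for loc in tree.findall(".//sm:loc", NS):
    url = loc.text.strip()
    try:
        with urllib.request.urlopen(url) as page:
            print(page.getcode(), url)
    except urllib.error.HTTPError as err:
        print(err.code, url, "<-- broken link?")
```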
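
And the last sketch: requesting the same page with a browser-like User-Agent and with Googlebot’s, to spot firewall or server rules that treat the crawler differently. The target URL is again a placeholder, and note that a matching result does not rule out IP-based blocking.

```python
# Minimal sketch: fetch one page with two different User-Agent headers to
# see whether the server responds differently to Googlebot.
import urllib.error
import urllib.request

URL = "https://www.example.com/"  # hypothetical target
AGENTS = {
    "browser": "Mozilla/5.0",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                 "+http://www.google.com/bot.html)",
}

for name, agent in AGENTS.items():
    request = urllib.request.Request(URL, headers={"User-Agent": agent})
    try:
        with urllib.request.urlopen(request) as resp:
            print(name, resp.getcode())
    except urllib.error.HTTPError as err:
        print(name, err.code)
```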

Performance & SEO: Not Finishable Features

Just like the performance industry, SEO best practices change so quickly that they are often difficult to keep up with. Google changes its search engine’s ranking algorithm constantly (an estimated 500 to 600 times a year).

In a previous Rigor blog on performance as a feature, we quoted performance guru and then Principal Web Developer at BBC World News, Peter Chamberlin. He explains this concept well: “Performance isn’t a technical niggle that ‘we should get around to fixing.’ It is an opportunity to grow your audience, to grow your business, to convert more sales.”

The same opportunities are available in SEO.

Rather than viewing SEO or web performance management as a temporary project, you should encourage your team to view it as a feature that requires ongoing attention and maintenance.
