The usual developer fuckups in SEO

Ah, the age-old battle between SEO specialists and web developers. It's like explaining quantum physics to a cat (a stupid one; most cats are more intelligent than me) – challenging but not impossible.

Most web developers are unaware of SEO and tend to reduce it to the mere presence of a few meta tags or a handful of web performance metrics. Of course, how to deal with developers is a recurring topic on every decent SEO blog, so I'll try to keep this post short: a list of common developer mistakes, and I'll let you decide how to deal with them.

Keeping a Staging Domain Open

Developers often leave the staging domain open for Google or Bing to index, creating an indexable twin of your site at something like "beta.example.com." Sure, it might not directly hurt your rankings, but it's akin to leaving the front door of your house wide open while you're on vacation. Besides the security risk, it's a surefire way to mess up your analytics data and SEO dashboards.
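Prevention is cheap. Here's a minimal sketch, assuming an Express app; the STAGING_HOSTS list and the hard-coded credentials are placeholders, not a recommendation. It tags every staging response with noindex and keeps humans out with basic auth:

```ts
import express from "express";

const app = express();

// Hypothetical list of staging hostnames; adapt to your environment.
const STAGING_HOSTS = new Set(["beta.example.com", "staging.example.com"]);

app.use((req, res, next) => {
  if (STAGING_HOSTS.has(req.hostname)) {
    // Tell crawlers there is nothing to index here...
    res.setHeader("X-Robots-Tag", "noindex, nofollow");
    // ...and keep humans (and scrapers) out with HTTP basic auth.
    const expected =
      "Basic " + Buffer.from("staging:change-me").toString("base64");
    if (req.headers.authorization !== expected) {
      res.setHeader("WWW-Authenticate", 'Basic realm="staging"');
      return res.status(401).send("Staging is private.");
    }
  }
  next();
});
```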

The Stubborn "Disallow" in Robots.txt

Ah, the dreaded blanket "Disallow: /" in robots.txt, left in place long after launch. Once you've gone live, it's like keeping a "Do Not Disturb" sign on your door forever. It's self-explanatory why this is a no-no in the SEO realm, and it's trivial to fix. Still, in many corporate environments it will require creating a user story, explaining the business value, defining user acceptance criteria and the cost of not doing it, and please shoot me in the head.
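If fixing it takes a user story, at least automate the detection. Here's a minimal post-deploy check, assuming Node 18+ with built-in fetch and a hypothetical production URL, that fails the pipeline when robots.txt still blocks the entire site:

```ts
// Post-deploy smoke test: fail the pipeline if production robots.txt
// still carries the blanket "Disallow: /" from staging.
const PROD_ROBOTS = "https://www.example.com/robots.txt"; // hypothetical URL

async function checkRobots(): Promise<void> {
  const res = await fetch(PROD_ROBOTS);
  if (!res.ok) throw new Error(`robots.txt returned HTTP ${res.status}`);
  const body = await res.text();
  // Naive but effective: a lone "Disallow: /" blocks the whole site.
  const blocksEverything = body
    .split("\n")
    .some((line) => line.trim().toLowerCase() === "disallow: /");
  if (blocksEverything) {
    throw new Error("robots.txt still disallows the entire site!");
  }
  console.log("robots.txt looks fine.");
}

checkRobots().catch((err) => {
  console.error(err);
  process.exit(1);
});
```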

Client-Side Rendering for Public Pages

Developers might opt for client-side rendering because it's easier to ship, but it can be SEO's kryptonite: Googlebot queues JavaScript rendering and may see your content late, while less capable crawlers never execute your scripts at all. How do you make them understand that server-side rendering is the hero your website deserves, delivering complete HTML in the first response, with faster loading times and better rankings to show for it?
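To make the argument concrete, here's a minimal server-side rendering sketch for an Express app; getProduct is a hypothetical stand-in for your CMS or database. The crawler gets finished HTML in the first response, no JavaScript execution required:

```ts
import express from "express";

const app = express();

// Hypothetical data source; in real life this hits your CMS or database.
async function getProduct(id: string) {
  return { id, name: "Example widget", description: "A widget crawlers can read." };
}

// Server-side rendering: the full content is in the HTTP response itself.
app.get("/products/:id", async (req, res) => {
  const product = await getProduct(req.params.id);
  res.send(`<!doctype html>
<html>
  <head><title>${product.name}</title></head>
  <body>
    <h1>${product.name}</h1>
    <p>${product.description}</p>
  </body>
</html>`);
});

app.listen(3000);
```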

Forgetting Website Protection

Security isn't just about locking your doors; it's also about guarding your website against the most elementary attacks. Falling victim to a hack tarnishes your reputation and can fill your pages with injected spammy links. The problem is often not a group of Chinese hackers trying to get sensitive data (sometimes it is, though) but basic security hygiene nobody cares about.
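A little of that hygiene costs a few lines. Here's a sketch for an Express app using a handful of widely recommended response headers; treat it as a starting point, not a security review:

```ts
import express from "express";

const app = express();

// Express advertises itself by default; no need to tell attackers the stack.
app.disable("x-powered-by");

// A few low-effort security headers applied to every response.
app.use((_req, res, next) => {
  res.setHeader("Strict-Transport-Security", "max-age=31536000; includeSubDomains");
  res.setHeader("X-Content-Type-Options", "nosniff");
  res.setHeader("X-Frame-Options", "DENY");
  res.setHeader("Referrer-Policy", "strict-origin-when-cross-origin");
  next();
});
```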

Overusing Redirects

Redirects are like detours on the internet highway. While they're useful for fixing broken links or preserving rankings after a migration, chains of redirects waste precious crawl budget, add a round trip per hop, and slow down your website's performance.
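Before arguing about it, measure it. Here's a small tracer, assuming Node 18+ where fetch supports redirect: "manual", that follows a chain hop by hop; the start URL is a hypothetical example:

```ts
// Follows a URL's redirect chain hop by hop and reports its length.
async function traceRedirects(start: string, maxHops = 10): Promise<string[]> {
  const chain: string[] = [start];
  let current = start;
  for (let hop = 0; hop < maxHops; hop++) {
    const res = await fetch(current, { redirect: "manual" });
    const location = res.headers.get("location");
    if (res.status < 300 || res.status >= 400 || !location) break;
    current = new URL(location, current).toString(); // resolve relative Location
    chain.push(current);
  }
  return chain;
}

traceRedirects("http://example.com/old-page").then((chain) => {
  console.log(`${chain.length - 1} hop(s):`);
  console.log(chain.join("\n -> "));
});
```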

The Case of Relative URLs

Canonical tags, hreflang annotations, and sitemap entries call for absolute URLs, not the noncommittal relative ones. A relative URL resolves against whatever protocol and host happened to serve the page, so after an HTTPS migration the same markup can quietly point to http instead of https, or to your staging twin instead of production. It's like navigating with a GPS that only gives you vague directions like "Please turn left when you see a big tree and a red house."
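A quick illustration with the standard URL constructor; the hostnames are hypothetical, the resolution behavior is not:

```ts
// The same relative href resolves differently depending on which host
// served the page that contains it.
const href = "/pricing";

console.log(new URL(href, "https://www.example.com/features").toString());
// -> https://www.example.com/pricing

console.log(new URL(href, "https://beta.example.com/features").toString());
// -> https://beta.example.com/pricing  (your staging twin, now in your markup)
```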

Exposing Sensitive Information

Keeping sensitive information reachable through guessable URLs or in unprotected areas of the website is like leaving your Amazon delivery package on the front porch. I don't want to explain why this is bad; you are smart enough to figure out why having your customers' data or financial details publicly available is a bad thing.
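A crude smoke test can still catch the embarrassing cases. The paths and host below are hypothetical examples of things that should never answer with HTTP 200:

```ts
// Crude exposure check: none of these should ever be publicly readable.
const SENSITIVE_PATHS = [".env", ".git/config", "backup.sql", "wp-config.php.bak"];
const BASE = "https://www.example.com"; // hypothetical production host

async function probe(): Promise<void> {
  for (const path of SENSITIVE_PATHS) {
    const res = await fetch(`${BASE}/${path}`, { redirect: "manual" });
    if (res.status === 200) {
      console.error(`EXPOSED: ${BASE}/${path} answers with 200`);
    }
  }
}

probe();
```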

How to manage these situations

To avoid such issues, communication and collaboration are essential in this digital tango between SEO experts and web developers. Or at least, that is how most blog posts like this end. But no matter the amount of training, communication, and Jira tickets, bad things still happen, so be prepared to pick up the phone at 2 a.m. because something went wrong.
