Your website sucks because it's written for a robot

The most compelling website copy in the world won’t help you if nobody reads it. So you need a strategy for bringing people to your site.

Optimizing your content for search engine discovery is one such strategy, which plenty of businesses employ to great success.

And yet.

And yet, the highest search ranking in the world won’t help you if your copy reads like a pile of machine-generated gibberish.

Like when you use the phrase “foreclosed home” 36 times in a single blog post:

… or when you use the phrase “F-150” 7 times in a single paragraph:

Here's what you can do

First, consider whether keyword-optimized SEO content is even a valuable strategy for your site.

If you’re able to create content that people share voluntarily, or that you link to from pages that already get traffic, or if you’re using ads to drive visitors … you have my permission to write that content for a human audience.

If you do decide to go for an SEO-optimized content strategy, keep two things in mind.

Organic search visitors are going to convert at a lower rate

They came to your site to answer a question, not shop for your product or solution.

Some will convert, and others will return later to convert. But driving a ton of search traffic to an informative, keyword-laden blog post will actually lower your overall conversion rate.

Not all SEO practitioners are created equal

Shop around, and for each expert you talk to, check out the content they’ve created for other clients. Does it make your eyeballs bleed?

There’s a huge difference in quality between what you get from someone who simply ingests keywords and spits out blog posts vs. someone who’s thoughtful about solving your visitors’ problems.

With a bit of diligence, you can end up with content that satisfies robots and humans alike.