
Five Spooky SEO Scares and How to Avoid Them

With Halloween just around the corner, we’re all in for a few scares to close out the month. But certain things come as an unwelcome surprise, especially in the world of SEO. Here are some of the most petrifying problems you could face as an SEO and what you can do to stop them in their tracks.

Slow Site Speed/Poor Usability 

With Google’s mobile-first indexing in full effect, you can’t afford to have your site left in the dust. Site speed and usability are two major factors in determining how your site ranks in search engines, and a slow one spells disaster. Users need to be able to quickly navigate through the pages of your site and find the information they are looking for, or else they will bounce in frustration. And Google doesn’t like a site with a high bounce rate.

Some of the best ways to keep your site speed in check are regular tests and crawls of your website. Make sure to check both the desktop and mobile site speeds (we like Google’s mobile speed test and GTmetrix) and pay attention to the recommendations that come with the results. Quick wins like image compression and clean page code are easy ways to ensure that your content loads as fast as possible for users.
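Dedicated tools give the most detailed reports, but even a quick spot check can catch a regression early. Here’s a minimal sketch, using only Python’s standard library, of timing how long a page takes to download (the URL shown is a placeholder, and this measures raw download time, not full render time):

```python
import time
import urllib.request

def measure_load_time(url, timeout=10):
    """Fetch a page and return (seconds elapsed, bytes downloaded)."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as response:
        body = response.read()
    return time.monotonic() - start, len(body)

# Usage (substitute your own pages; anything consistently slow
# compared to a 1-2 second budget deserves a closer look):
#   elapsed, size = measure_load_time("https://www.example.com/")
#   print(f"{elapsed:.2f}s for {size:,} bytes")
```

Run a check like this from a cron job and you’ll notice a slowdown before your rankings do.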

Broken Links/404 Errors

Thinking you’ve found the answer to your search only to click on a broken link is one of the most frustrating issues a web user can run into. If your site’s internal link structure is plagued with 404 errors or bad redirects, you’re sure to scare away all your users!

A quick crawl of your site with any of several web crawling tools will give you a comprehensive, up-to-date list of all the broken links and 404 errors on your site. Make sure you’ve implemented the proper redirect rules or plugins to send users to the right page without losing them along the way. Taking a moment to correct your redirects or add new ones pays off in the long run, as users will be more inclined to click through your site when the navigation actually works!
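Under the hood, a link checker just extracts every link from a page and asks the server for its status code. Here’s a minimal sketch of that idea using only Python’s standard library (real crawlers also follow links site-wide, respect robots.txt, and throttle their requests):

```python
import urllib.error
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_broken_links(page_url, html):
    """Return (url, status) pairs for links that do not answer with 200."""
    parser = LinkExtractor()
    parser.feed(html)
    broken = []
    for href in parser.links:
        absolute = urljoin(page_url, href)  # resolve relative links
        try:
            with urllib.request.urlopen(absolute, timeout=10) as resp:
                status = resp.status
        except urllib.error.HTTPError as err:
            status = err.code       # e.g. 404, 410
        except urllib.error.URLError:
            status = None           # DNS failure, refused connection, etc.
        if status != 200:
            broken.append((absolute, status))
    return broken
```

Feed it a page’s URL and HTML, and anything it returns is a link worth fixing or redirecting.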

Incorrect Google My Business Info

A big piece of the push for local SEO is the optimization of Google My Business listings. Depending on your business, most of your impressions may come from mobile users looking for quick directions or a phone number to call for more information on a product or service. But watch out for every local business owner’s worst nightmare: customers who can’t find the store and can’t even get ahold of anyone to ask!

Having up-to-date Google My Business information is key to building a relationship with your customer base. If a user clicks to call and they reach a dead line or the wrong number, they can quickly go from optimistic to discouraged. And imagine driving across town only to find that the business has moved 20 minutes away! You can ensure that customers have everything they need to get in touch with you or visit your store by regularly checking the health of your Google My Business listing. Make sure that your location is properly claimed and verified in order to reap the full benefits of the most important listing service online!

Manual Actions in Google Search Console

Your rankings are dropping and your visitors are disappearing, but you have no idea why! You visit your site and everything is there, so why aren’t users finding it? Then you turn to Google Search Console, only to see the dreaded notice that a manual action has been placed on your site.

Unnatural links to your site, spammy behavior (intrusive pop-ups, deceptive markup, keyword stuffing, etc.), and poor-quality content are all possible culprits behind a manual action penalty. These violations of Google’s Webmaster Guidelines not only create a negative user experience for customers; Google will also demote or sometimes remove a site from its index as a result. The best ways to avoid being hit with a manual action are to regularly create and update your site with quality content, avoid black hat tactics like buying backlinks, and build a high-quality user experience overall.

Bad Robots.txt Files

Robots make for some great Halloween costumes, but they play a role in the SEO world all year long. Robots, or Googlebot in this case, can very easily become your greatest ally in reaching the top of search results, or your most powerful nemesis as they sink you on the SERP.

An up-to-date robots.txt file is key to ensuring that your site is crawled properly by Googlebot and other search engine crawlers. If you set the wrong parameters, or go so far as to block bots entirely, you can fall quickly in search results because your site won’t be indexed properly. When you perform regular checks of your site for errors, take a look at your robots.txt file and sitemap too. Update both with any changes to your site, and always make sure your robots.txt file allows bots to crawl the right pages.
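A robots.txt file is just a plain-text file at the root of your domain (e.g. yoursite.com/robots.txt). Here’s a minimal illustrative example; the blocked paths and sitemap URL are placeholders, so substitute your own:

```
# Rules for all crawlers.
User-agent: *

# Keep bots out of pages that shouldn't appear in search results.
# (Illustrative paths -- use the ones that apply to your site.)
Disallow: /wp-admin/
Disallow: /cart/

# Everything not listed above stays crawlable by default.

# Point crawlers at your sitemap so new and updated pages are found quickly.
Sitemap: https://www.example.com/sitemap.xml
```

One stray `Disallow: /` in a file like this is all it takes to block your entire site, which is exactly why it belongs in your regular error checks.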

Contact Forthea this Halloween to have our team of SEO experts help keep your site safe from these issues and more!


Sean Alder
Sean is a Junior SEO Specialist at Forthea. When he’s not in the office, you can usually find Sean with his identical twin brother enjoying the finer things of life like video games and Asian food. As a complement to his obviously very healthy lifestyle choices, he still savors trips to the park and a good dose of the great outdoors.
