A client I am consulting on a domain name change asked me an unrelated SEO question about H1 tags. He received an external website audit that recommended removing excessive H1 tags from his sites. The funny thing about SEO audits (or rather the sad truth about SEO audits) is that they often come purposely without much explanation, so that the recipient must hire more SEO services to get truly actionable recommendations. This is almost as dangerous as when people receive random SEO advice or follow a generic SEO to-do list of things that will supposedly optimize their website. SEO is something you have to see in context. Websites are complicated, and it's way too easy to put general SEO advice into practice the wrong way on your website. Continue reading
Update: This article was previously published on Yahoo Voices, but has been moved here to my blog because the Yahoo Contributor Network was closed as Yahoo re-organizes its business efforts. This SearchEngineLand article alludes to the closure being due to Google’s Panda updates, which target content farms and destroyed the network’s traffic. I can verify that my article on blog tagging previously ranked #1 or on page 1 in Google for a variety of relevant terms, but has dropped to page 2 since the last Panda update. Since traffic is key to a publisher’s business model, I would guess the Yahoo Contributor Network, a user-generated content platform that pays its contributors, must not have been making enough profit from ad revenue. This is likely one example of Marissa Mayer trimming the fat, as promised when she became Yahoo CEO.
Tagging Posts Without a Strategy Can Damage Your SEO
The mantra “content is king” has held true for as long as SEO has been practiced. A blog embodies this idea by providing a platform for publishing an ongoing stream of fresh content. However, approached the wrong way, a blog can bury rankings for an entire website.
Google’s Panda filter, which has been running since February 2011, is designed to lower the rankings of websites that use crafty black-hat spammer techniques to thinly spread a finite amount of content across many pages, creating the appearance of a large volume of content. For many years predating Panda, search engine webmaster guidelines warned about a very similar issue: the technical difficulty created by duplicate content, often generated by a CMS (content management system). A blog is a CMS, and Panda is essentially penalizing duplicate content. Continue reading
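One common mitigation for the archive-page duplication described above — a sketch, not a recommendation from the post itself — is to keep tag and category archive pages out of the index while still letting crawlers follow their links to the underlying posts:

```html
<!-- Hypothetical tag/category archive template: the page itself is kept
     out of the index, but crawlers may still follow links to the posts -->
<meta name="robots" content="noindex, follow">
```

This keeps the fresh post content indexable without filling the index with near-duplicate archive pages.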
SEO local listings fluctuated over the last couple of months and dropped across a few clients and our own site around the end of May. Listings often fluctuate, but this was a trend across a handful of clients at the same time. So naturally you ask: were there any known algorithm changes at that time? Google Panda 4.0 hit May 20, but did not take a full 10 days to roll out (per Matt Cutts’ tweet on Panda 4.0), and these ranking drops happened at the very end of the month, after Panda had already hit. Next question: were there any changes to the website or local listings? Yes, for BKV, but I believe it is a coincidence that listings fluctuated recently after publishing See Inside for the BKV Atlanta marketing agency, mainly because adding See Inside should boost your rankings due to the additional engagement it brings to the brand’s Google Plus property. Continue reading
Over the years, business and SEO people have asked for my take on rich snippets, SEO schema.org tagging, and structured data. These have been hot terms in advanced SEO and website optimization for several years. Now, with mobile voice search and semantic search on the rise, schema.org optimization is moving up many people’s lists of SEO things to figure out. Most of the people with whom I’ve spoken may have heard the terms or dabbled in schema tagging, but want to know…
- first, whether they should even use schema.org
- why they should implement microdata
- which parts of schema.org they should use
- what the benefit of having rich snippets is
- what schema.org has to do with semantic search and Google Hummingbird
It’s not just upper management or website owners who have questions about Schema.org. Chris Everett, an SEO consultant here in Atlanta with a high degree of technical knowledge, recently asked my opinion on schema.org. So I’m writing this post to address both the business question of whether to invest in Schema.org structured data and the SEO practitioner questions on keyword strategy. I’ll go through a four-step process to break the opportunity down into its basic building blocks, identify opportunities, assign measurable ROI value, and build a prioritized, tier-based budget. Continue reading
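To make the discussion concrete, here is a minimal illustration of what schema.org microdata tagging looks like in practice — the business and its values are hypothetical, not taken from the post:

```html
<!-- Hypothetical local-business page marked up with schema.org microdata -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example Marketing Agency</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="addressLocality">Atlanta</span>,
    <span itemprop="addressRegion">GA</span>
  </div>
  Phone: <span itemprop="telephone">(555) 555-0100</span>
</div>
```

Markup like this is what allows search engines to extract structured facts (name, address, phone) and surface them as rich snippets.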
By now you should have heard enough buzz about the increase in mobile device usage to give mobile SEO some attention in 2014. It is my prediction that during 2014 Google will begin algorithmically enforcing the mobile SEO guideline updates it published near the end of 2013. To cover the basics: aside from your SEO keyword strategy, your development team should be making sure your mobile site loads in one second or less and that all redirects work properly, so that links from your sitemaps, your other sites, or third-party sites do not mislead search engine crawlers or users. If your mobile site is a skeleton version of your desktop site, this may require some rethinking of what content is included on the mobile site, how you link to it, or how mobile devices are redirected.
Read my section #3, entitled Mobile SEO Will Require You to Feed the Google Experience, for more ideas on the impact of mobile SEO on your overall visibility in Google Search.
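For sites that serve a separate mobile site on its own URLs (an “m-dot” site), Google’s separate-URLs guidance can be sketched with bidirectional annotations — the URLs below are placeholders, not real pages:

```html
<!-- On the desktop page (www.example.com/page): point to its mobile equivalent -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page">

<!-- On the mobile page (m.example.com/page): point back to the desktop version -->
<link rel="canonical" href="http://www.example.com/page">
```

These annotations tell crawlers the two URLs are the same content in different formats, which helps prevent the skeleton mobile site from being treated as thin or duplicate content.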
I’m hearing confusing and false statements about Google Hummingbird, missing keyword data, the Knowledge Graph, and social media from people trying to sell SEO and social media services. Even those who read my previous article on Hummingbird find themselves confusing the issues, based on the misleading messaging on the subject all over the internet, even from Google’s own PR. Hummingbird seems to be the new great mystery from Google that everyone uses as a reason to sell any digital marketing service. Some say Hummingbird threw a wrench into things and turned the SEO business on its head. That’s not true. Issues are being confused, but I believe that’s exactly the way Google wants it. I’m here to clear up the confusion. Continue reading
The BKV website ranked as high as #3 for animated email, but then lost its rankings completely. I believe the issue is that Google honored a robots.txt disallow rule for the old URL before it had followed the 301 redirect. I worked with our web development team on migrating the platform, but the 301s in the .htaccess file were acting funny, not working the way we wanted them to. The site had changed hands among many different developers over the years, and people weren’t sure why certain things redirected and others did not. There seemed to be no way to address all the duplicate content using redirects, so we placed a disallow in the robots.txt to clean up any duplicate content that the 301s did not address. Continue reading
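The interaction described above can be sketched as follows — the paths are placeholders, not the actual BKV URLs. A 301 in .htaccess only transfers rankings if crawlers are allowed to request the old URL, because a robots.txt disallow prevents Google from ever fetching the old page and discovering the redirect:

```apache
# Hypothetical .htaccess rule: permanently redirect the old page to the new one
Redirect 301 /old/animated-email.html http://www.example.com/animated-email/

# Meanwhile, robots.txt must NOT contain a rule covering the old URL, such as:
#   Disallow: /old/
# A disallowed URL is never crawled, so the 301 above would never be seen.
```

In short, redirects and robots.txt disallows work against each other when applied to the same URLs: use the 301 to consolidate duplicates, and reserve the disallow for URLs that have no redirect target.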