Hit By Panda Update? Google Has 23 Questions To Ask Yourself

Here's what I don't understand: almost all of those 23 questions seem to require human judgement. How can a computer algorithm do that?

Without knowing that, it's next to impossible to figure out how to tweak a website to address those questions.
 
The algorithms don't use human judgement. AI in all its forms merely fakes human judgement to the best of its ability.

I'll try not to get terribly technical here about how you'd code something to detect duplicates, but the reality is that it's not nearly as complex as you'd think.

When the exact same non-proper-noun words are used in a four-word segment, there is a very high statistical likelihood that the content is duplicate when compared against other content. In particular, if you see multiple duplicate four-word segments, plus a similar page length and similar heading or title keywords, it is very likely duplicate content.

Think about what a computer can actually do. Computers can't read, but they can statistically analyse text. They have a very hard time judging content quality, but they can test grammar rules, duplicate phrase use, the size of image files, and the number of words per sentence and paragraph, and keep weighing those signals until duplicate content looks statistically likely or unlikely.
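To make that concrete, here's a minimal sketch of what shingle-style duplicate scoring can look like. This is my own illustration: the four-word window and the Jaccard-style overlap score are assumptions for the example, not Google's actual parameters.

# Rough sketch of duplicate detection via four-word "shingles".
# The window size and the similarity scoring are illustrative
# assumptions, not Google's actual parameters.

def shingles(text, size=4):
    """Return the set of every `size`-word segment in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def duplicate_likelihood(text_a, text_b, size=4):
    """Jaccard similarity of the two pages' shingle sets.
    Near 1.0 means almost certainly duplicate; near 0.0 means original."""
    a, b = shingles(text_a, size), shingles(text_b, size)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

page_a = "the quick brown fox jumps over the lazy dog near the old river"
page_b = "the quick brown fox jumps over the lazy dog near the old river bank"
print(duplicate_likelihood(page_a, page_b))  # ~0.91: almost certainly duplicate

Combine a score like that with page length and title similarity and you get a fairly reliable duplicate flag, with no human judgement anywhere in the loop.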

Oddly, it is easier to do so with longer content than with shorter, much in the way that statistics are more reliable when you have more data points.

SEO for Google is really a tightrope walk between figuring out what the computer is looking at and keeping your content engaging enough for humans that it can convert.

The stupidity of it is that some of the metrics defeat the actual intent.

Here's a great example. Let's say you want to know who won the Super Bowl in 2004, and someone has built a site that is just one page with the answer, very few words, and when you search, it is the top result.

You click it, immediately get the answer you wanted, and hit back to return to Google, satisfied with the answer you received.

That website just registered a bounce. Its quality score will drop over time and it will lose that position, because it gave you what you wanted too quickly.

At the same time, a page that made you click through to a second page to get the answer, making it harder to find, could keep you on the site longer. In Google's eyes that could earn a higher quality score over time because of the lower bounce rate, even though it obtained that lower bounce rate specifically by making the site less relevant to the search.

Make sense? Not really. But the computer doesn't have a good way to gauge a user's satisfaction with the page they visited. To some degree Google does look at the bounce rate combined with how often a second or third link is also clicked, but the common thinking is that giving users what they want too quickly actually hurts you.
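As a toy illustration of that feedback loop (the update rule and the learning rate here are invented for the example, not anything Google has published):

# Toy model of a quality score decaying under repeated bounces.
# The update rule and learning rate are invented for illustration.

def update_quality(score, bounced, learning_rate=0.05):
    """Nudge the page's score toward 0.0 on a bounce, toward 1.0 otherwise."""
    target = 0.0 if bounced else 1.0
    return score + learning_rate * (target - score)

score = 0.80
for bounced in [True, True, False, True, True]:  # mostly quick bounces
    score = update_quality(score, bounced)
print(round(score, 3))  # drifts downward, and the page slides down the rankings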
 
That's some very good insight. Thank you.

I'm curious... did you bring up these two examples because they're very important or are they just two minor examples of the overall algorithm that happened to pop into your head? The reason I ask is because it just so happens that the two examples you used (bounce rate and duplicate content) are things that I've pondered very recently about making changes to my own website.

My bounce rate is horrible: 73% over the last 30 days. I'm considering ways to improve on that. One suggestion I really like is adding video content to several of my pages. I was told that will help both with building trust (conversion rate) and with lowering my bounce rate at the same time. Do you think that's a good idea?

And as far as duplicate content goes, I write my own content. It may be mediocre or lousy, but it's my work. However, when some of my blog entries get many more hits than others, I tend to make each one a webpage of its own on my site (while still leaving it in my blog at the same time). I never thought about the fact that Google may see both and deem it duplicate content. Was doing this a mistake, or should I keep doing it?

Thanks
 
Someone smarter than me might correct this, but I don't think Google looks at duplicate content on the same domain/IP the same way it looks at duplicate content across multiple sites.

I don't think you'll get credit for it the way you do for brand-new original content, but at the same time you shouldn't be penalized for it. And if it is compelling copy that converts highly or holds users' attention, the benefit of having it on the site would be high.

As for the bounce rate question: yes, it appears to have a lot of impact. To know exactly what is happening you'd need a copy of the code Google is using, but here is what I imagine is actually going on.

Again, imagine you're a computer. How can you tell whether someone got what they were looking for from a site? One of the easiest ways for Google to find out would be to see if you did one of two things.

Either you left the site very quickly, THEN clicked on another search result from the same IP, OR you went back and searched again for the same topic.

If you see IP addresses doing one of those two things, you can spot a trend showing it is likely the person did not find what they were looking for on the site.

Also, if they come back way too fast, you can conclude with some degree of likelihood that the person couldn't find what they were looking for, though I firmly believe the particular method they use in that case is flawed, for the reasons I stated above.
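Here's a minimal sketch of what that detection might look like. The log format, the field names, and the 30-second cutoff are all my own assumptions for illustration, not Google's real pipeline.

# Hypothetical sketch of pogo-stick detection from a search click log.
from datetime import datetime, timedelta

QUICK_RETURN = timedelta(seconds=30)  # assumed "came back way too fast" cutoff

def looks_dissatisfied(events):
    """events: chronological (timestamp, ip, action, detail) tuples, where
    action is 'click' (detail = result URL) or 'search' (detail = query).
    Flags a click when the same IP quickly clicks another result
    or searches again."""
    flagged = []
    for i, (ts, ip, action, detail) in enumerate(events[:-1]):
        if action != "click":
            continue
        next_ts, next_ip, next_action, _ = events[i + 1]
        # Same IP came straight back and clicked or searched again.
        if next_ip == ip and next_ts - ts <= QUICK_RETURN and next_action in ("click", "search"):
            flagged.append((ip, detail))
    return flagged

log = [
    (datetime(2011, 5, 1, 12, 0, 0), "1.2.3.4", "search", "who won the super bowl in 2004"),
    (datetime(2011, 5, 1, 12, 0, 5), "1.2.3.4", "click", "example-site.com/answer"),
    (datetime(2011, 5, 1, 12, 0, 12), "1.2.3.4", "search", "super bowl 2004 winner"),
]
print(looks_dissatisfied(log))  # the answer page gets flagged despite satisfying the user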

When doing SEO, try to think about what data a computer can actually have about your browsing activity. If you're using Chrome, Google can see a LOT: what you clicked, where you came from, where you went, and the time you spent on each page. If you're using Google for search at all, they at the very least have your IP address, how long it has been since you clicked, what you searched for, what you searched for next, and whether the broad match of the immediate next search was relevant to the search before.

If the broad match of the next searched keyword is the same, the odds are that no matter how long the person spent on the page, it either did not answer their question or left them feeling they needed to go somewhere else.
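A crude version of that broad-match comparison might look like this; the stop-word list and the overlap threshold are illustrative guesses on my part.

# Crude sketch of "is the next search the same broad topic?"
STOP_WORDS = {"who", "what", "the", "a", "an", "in", "of", "for"}

def broad_terms(query):
    """Reduce a query to its meaningful lowercase terms."""
    return {w for w in query.lower().split() if w not in STOP_WORDS}

def same_broad_match(query_a, query_b, threshold=0.5):
    """True if the two searches share most of their meaningful terms."""
    a, b = broad_terms(query_a), broad_terms(query_b)
    if not a or not b:
        return False
    return len(a & b) / min(len(a), len(b)) >= threshold

print(same_broad_match("who won the super bowl in 2004",
                       "super bowl 2004 winner"))  # True: same broad topic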

The trick here is to compel your users to stay on your site and to feel that their questions are being, or will be, answered fully by your site: keep a narrow focus, supply a lot of information, and explain how you will help, when you will help, what you will do, and so on.

I think white-hat SEO, in its essence, is about doing a lot of that kind of stuff.

One of the simplest tricks in the world (free advice here, write this down): make your offsite links open in a new window, in addition to marking them rel='nofollow', if they don't go to a site you own.

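<!-- Offsite link: the new window keeps the visitor on your page; nofollow tells search engines not to follow it -->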
<a href='link.html' target='_blank' rel='nofollow'>link</a>

The new window opens over the old one, and the person can close it afterwards. It keeps them on the page they clicked the link from without navigating away, and so on.
 
What really sucks for insurance is that you need a tech guy who can handle this end, but it's very hard to tell whether any given tech guy knows what in the hell he's talking about, and there are a ton of utterly clueless people out there selling SEO for giant fees, because part of the whole thing is that you can't guarantee results.

It's a lucrative field: selling a service for massive money, with no guaranteed result or outcome, to someone who cannot prove or disprove that you did anything without the technical know-how to do the whole job themselves in the first place.
 
Just my opinion, but the true SEO gurus don't post very much on this forum. We all have our theories and opinions about SEO, but really, we are NOT the experts.

There are experts, and yes, they have a pretty firm grasp of things. Val, Alston and Dave are the three that come to mind. If I'm missing anybody, feel free to add them to the list.
 
My bounce rate is horrible: 73% over the last 30 days. I'm considering ways to improve on that. One suggestion I really like is adding video content to several of my pages. I was told that will help both with building trust (conversion rate) and with lowering my bounce rate at the same time. Do you think that's a good idea?

You all know how I feel about video...

That said, a 73% bounce rate definitely gives you room for improvement. A few ideas/questions to ask yourself:

  • Do you provide a single clear, compelling call-to-action on your page?
  • Have you applied the squint test to your page to make sure the most important things aren't getting lost?
  • Do you provide a good intro on what the site offers your visitors and why you're the right person to offer it to them?
  • Do you have visuals/graphics that 'soothe' your visitors into feeling comfortable?
  • Are you asking for the sale before building trust?
  • Are you using color to differentiate the things on your page?
 
This Panda update is bad for the insurance industry because Panda gives more credit to sites with articles that people actually follow and read. Insurance is not exactly a topic of interest for most people, and there is the big problem!



 
