Own your local area with multiple sites.

TheCleaningDoc

New member
Here is something for you all to ponder....

Most everyone has a web site, and most rank pretty well, but what about multiple sites?

I have a customer for whom, 2 years ago, we picked one service he offers and created 7 different sites around that service. We also set it up so that when he posts to the main site, the feed automatically posts to all 7 of the other sites. Not only does this help his main site; all the other sites rank too. We have done absolutely nothing to the sites since they were built AND we have done no link building. Today 4 of the 8 sites rank on page 1, in spots 1, 4, 7 and 10.

Last summer was the first summer, and the service we went for was deck staining. They went from 40 gallons of stain 2 years ago to over 2,000 gallons last year, and they are still going strong. His biggest problem was getting rid of the empty buckets... LOL. They even had some people call and say they did not call anyone else, because everywhere they looked, he was there. At one point last year, 7 of the 10 spots on the front page were his sites.

It is not really about ranking so much as bumping your competition off the front page. I am sure that with a little link building and some social exposure he could have 7 spots again.

Oh, and the city has a population of 253k, so some markets would be tougher and others easier.

Something to think about. We built his sites over the winter, and they were all ranking well by the time the spring season rolled around.
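For anyone wondering what the feed setup described in this post might look like in practice, here is a minimal sketch. It assumes every site runs WordPress with the REST API and Application Passwords enabled, and it simply reads the main site's RSS feed and re-posts each item to the satellite sites. All domains, usernames, and passwords below are made-up placeholders, not the actual sites from this thread.

```python
# Minimal feed-syndication sketch (assumptions: WordPress on every site,
# Application Passwords enabled, placeholder domains and credentials).
import feedparser
import requests

MAIN_FEED = "https://example-main-site.com/feed/"  # hypothetical main site feed

# Hypothetical satellite sites: (base URL, API user, application password)
SATELLITES = [
    ("https://example-deck-staining.com", "apiuser", "app-pass-1"),
    ("https://example-deck-sealing.com", "apiuser", "app-pass-2"),
]

def syndicate():
    feed = feedparser.parse(MAIN_FEED)
    for entry in feed.entries:
        # Field names depend on the feed; title/link/summary are the common ones.
        body = (entry.get("summary", "")
                + f'<p>Originally posted at <a href="{entry.link}">{entry.link}</a></p>')
        for base_url, user, app_password in SATELLITES:
            resp = requests.post(
                f"{base_url}/wp-json/wp/v2/posts",
                json={"title": entry.title, "content": body, "status": "publish"},
                auth=(user, app_password),
                timeout=30,
            )
            resp.raise_for_status()

if __name__ == "__main__":
    # A real setup would remember which entries were already pushed
    # so repeated runs don't create duplicate posts.
    syndicate()
```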
 
This sounds like a very interesting idea to try out.
 
Doesn't Google specifically state not to have separate sites for each service? How many different addresses does he use for Google Maps, or does he link each site to the same address? Won't the same feed going to each site be considered duplicate content? I would say it's all a bit risky. Not to say he can't or won't get away with it, but it's a risk.
 
He only has one Google Maps listing with one address. These sites are in the organic results, not Google Maps. The duplicate content issue has been misconstrued as to its actual meaning. Take a look at all the news sites that run the AP feeds, etc. Duplicate content is only an issue when you have the same content on the same domain, not on different domains. It is not risky at all, Ed. We are not talking about separate Google Places listings but separate domains for each service.
 
I can tell you from personal experience that succeeding with this model gets harder and harder with every Google update. I still have multiple sites ranking in about 8 cities, but not like they used to. It is still possible to have success, but that success is much more limited now because of the algorithm updates. I once had a remodeling site ranking for 25 services in 3 cities, all on page 1 or 2... Now it ranks for 1 service in a fourth city, and that is all.

So... it can be done to some extent, but it is not nearly as easy as it used to be!

Daniel Simmons
Commercial power washing contractor in Cypress, TX
 
Your sites have probably been static for too long. This strategy can work very well in competitive marketplaces. The thing is, you can't just launch a bunch of sites, do nothing to them for 3 years, and expect your rankings not to decline. You need to keep them updated and show the search engines that you are still alive and still in business. This strategy works because exact-match domains are always hard to outrank when they have other positive SEO factors in their favor.
 
A news site is a bad example. First, the high-ranking news sites don't depend on 2, 10, or 50 pages of content to rank; they have tens of thousands of pages. They also have thousands to millions of links. Google isn't going to de-index CNN for running an AP story.

A small business website is a totally different animal. If you could simply clone your site repeatedly and rank all the copies on page 1, people would do it all the time. Same for posts. There is a reason Copyscape exists and is popular, and there is a reason SEO companies that outsource content to article writers require that it pass Copyscape.

It is risky, Ed. If you don't believe me, ask Tom at Ace Painting. His hosting company was supposed to redirect his old domain to his new domain after they cloned all the posts and pages to the new site. They did it incorrectly, redirecting individual page and post links instead and leaving the old site live... and Google de-indexed him.

Duplicate content will not always get you de-indexed; sometimes it just gets you penalized. You might never realize that your rankings are handicapped. You can slip under the radar for a while, but as someone mentioned, Google is getting better at filtering for these tactics.

The only reason I can even see for duplicate posting is to save time. If you are so busy that you cannot write 500-600 words, pay someone to do it. You can find people all over the internet who will do it for 5-10 bucks.
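As a side note on the botched redirect described above: whether a hosting company set up a domain move correctly is easy to spot-check. The sketch below (the old and new domains are placeholders) asks the old domain for a few paths without following redirects and confirms each one answers with a domain-level 301 pointing at the new domain, rather than still serving the old page.

```python
# Spot-check that an old domain 301-redirects to the new one instead of
# staying live. Domains and paths here are placeholders.
import requests

OLD = "https://old-example-site.com"   # hypothetical old domain
NEW = "https://new-example-site.com"   # hypothetical new domain

def check_redirect(path):
    # Don't follow the redirect; we want to inspect the first response.
    resp = requests.get(OLD + path, allow_redirects=False, timeout=30)
    location = resp.headers.get("Location", "")
    if resp.status_code in (301, 308) and location.startswith(NEW):
        print(f"OK: {OLD}{path} -> {location}")
    else:
        print(f"PROBLEM: {OLD}{path} returned {resp.status_code} "
              f"(Location: {location or 'none'}); the old site may still be live")

for path in ["/", "/services/", "/blog/"]:   # sample paths, adjust to the site
    check_redirect(path)
```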
 
That is why you use an RSS feed and also pull in other feeds. I see people all the time complaining about someone using their content on another site, and what that someone is doing is using the RSS feed.

RSS = Really Simple Syndication

It is designed for this purpose, such as the AP feeds for news. It cannot be penalized as duplicate content, and when you pull in feeds from other sites, it makes each site unique. I love it when someone uses my RSS feed, because it just gives me more links, normally from a related site.

Google does not consult Copyscape when ranking sites, and I have had unique pages turn up "matches" in Copyscape because 3 words matched another site. If you have 3 words in a row that are the same as on another site, Copyscape will find a match, and a lot of the time the matching sites are totally unrelated.
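To make the "pull in other feeds" part concrete, here is a rough sketch of mixing a few third-party feeds into a page. The feed URLs are placeholders, and how the resulting HTML gets into a site's template is left out; the point is only that pulling syndicated items is trivial.

```python
# Pull the newest items from a few related feeds and build a simple
# HTML list a template could include. Feed URLs are placeholders.
import feedparser

RELATED_FEEDS = [
    "https://example-stain-manufacturer.com/blog/feed/",
    "https://example-deck-supplier.com/news/feed/",
]

items = []
for url in RELATED_FEEDS:
    feed = feedparser.parse(url)
    for entry in feed.entries[:5]:        # take the five newest from each feed
        items.append((entry.get("title", ""), entry.get("link", "")))

html = "\n".join(f'<li><a href="{link}">{title}</a></li>' for title, link in items)
print(f"<ul>\n{html}\n</ul>")
```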
 
The key is "other sites" and "links." If you own all the sites and they are not on the same IP, then sure, a link from a unique IP is valuable. But it's no more or less valuable than getting a relevant backlink any other way... a link is a link. The problem is that if you don't have 7 unique IPs for the 7 sites, you are leaving an easily traceable footprint for Google to track. The more posts that are similar across these 7 sites, the bigger the footprint. Eventually it will trigger an unnatural pattern.

Then there is the issue of cost... having 7 unique sites on 7 unique IPs is going to mean 7 hosting accounts. Otherwise, it's all worthless.

Fair or not, if Google senses that you are intentionally attempting to manipulate the SERPs, you will be penalized to some degree. This is why many top SEO companies spend so much money, time, and effort keeping their footprints hidden. That means IP diversity, hidden domain ownership, and unique content. Even when they own their own blog network, they are careful to limit a client's links to 1-2 "unique" articles on each domain.

Google knows we all create links... we still have to play the game and make them look "natural."
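The footprint point is easy to check for your own sites. This little sketch (the domain names are placeholders) resolves each domain and groups them by IP address; if several of the "independent" sites come back on the same IP, that is the kind of shared footprint being described. Keep in mind that shared hosting and CDNs mean a shared IP is not proof of common ownership by itself.

```python
# Group a list of domains by the IP they resolve to. Placeholder domains.
import socket
from collections import defaultdict

SITES = [
    "example-deck-staining.com",
    "example-deck-sealing.com",
    "example-fence-staining.com",
]

by_ip = defaultdict(list)
for domain in SITES:
    try:
        by_ip[socket.gethostbyname(domain)].append(domain)
    except socket.gaierror:
        print(f"could not resolve {domain}")

for ip, domains in by_ip.items():
    label = "shared" if len(domains) > 1 else "unique"
    print(f"{ip} ({label}): {', '.join(domains)}")
```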



Google doesn't need to use Copyscape; they have duplicate checking built into their own algorithms. And Copyscape isn't a yes/no tool; it reports the percentage of duplicated text. When I run a Copyscape check on content, I look for 60%+ unique. Nobody cares about 3 words matching in a 500-word article; you run it looking for high-percentage matches.
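Copyscape's internals aren't public, so this is only a rough stand-in, but a simple word-shingle comparison shows why a single 3-word match is noise while a high overall percentage is not: count how many of one page's 3-word runs also appear on the other page.

```python
# Toy duplicate-percentage check using 3-word shingles. Not how Copyscape
# or Google actually work; just an illustration of percentage-based matching.
import re

def shingles(text, n=3):
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def duplicate_percent(candidate, source, n=3):
    cand = shingles(candidate, n)
    if not cand:
        return 0.0
    return 100.0 * len(cand & shingles(source, n)) / len(cand)

page_a = "We offer professional deck staining for homes across the county."
page_b = "Our family business has offered professional deck staining since 1998 in the county."

# One shared 3-word run scores low (about 12% here);
# a cloned page would score close to 100%.
print(f"{duplicate_percent(page_a, page_b):.0f}% of page_a's shingles appear in page_b")
```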
 
Not this year, Pat... not this year. ;)
 
If what you are saying about footprints were true, it would have an effect on the main site, right? We were not trying to get the other sites to rank; they just did, and it has not adversely affected the main site either. It didn't help a lot, but it did not hurt it, it still has not, and that is because of the links from elsewhere. We are not talking about a network of hundreds of sites; we are talking about 7 sites.

But even with that, there is no rule that says you have to make the sites identical, and having multiple sites will increase your chances of ranking one or more of them. If you can get 2 on page one, then you only have 8 competitors. If you can get 2-3 videos to rank, then you have only 6 left. If your Yellow Pages or Angie's List listing ranks, that is one less. This is a strategy, as you well know.

Some may think it is impossible to rank a video, let alone more than one... Well, you can, and I have. There is one customer I can count 6 times on page one: 2 videos, 1 map listing and 3 different links to their site. BTW, the videos rank #1 and #2.
 
PageRank means absolutely nothing, Ron. All it does is give people the warm and fuzzies. Ranking in the SERPs is what counts in the end.

Oh, and YOU do not own the page either... LOL
 