I'm getting duplicate content on Google and it's affecting my rankings.

After setting up your website to work through a CDN Resource with CDN77, your content may be accessible both from your domain and from the CDN URL / CNAME. Google can then index the same pages under two different URLs, which creates duplicate content and can damage your SEO ranking.

You can prevent Google from crawling the duplicate links by following these two steps.

First, edit your server configuration like this:

Apache:

Add this rewrite to your .htaccess file:

# serve a different robots.txt to requests coming through CDN77's edge servers,
# so search bots are blocked from crawling your content via the CDN URL / CNAME
RewriteEngine On
RewriteCond %{HTTP:VIA} cdn77
RewriteRule ^robots\.txt$ robots_cdn77.txt [L]
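
If you manage the main Apache configuration rather than an .htaccess file, the same rules can go inside your virtual host instead. Below is a minimal sketch assuming a hypothetical domain example.com and document root /var/www/html; note the leading slash in the pattern, because in virtual-host context the rule matches the full URL path:

<VirtualHost *:80>
    ServerName example.com
    DocumentRoot /var/www/html

    RewriteEngine On
    # requests proxied through CDN77 carry "cdn77" in the Via header
    RewriteCond %{HTTP:VIA} cdn77
    # leading slash: in virtual-host context the pattern matches the full URL path
    RewriteRule ^/robots\.txt$ /robots_cdn77.txt [L]
</VirtualHost>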


Nginx:

Add this to your Nginx configuration file (usually /etc/nginx/nginx.conf), inside the relevant server block:

if ($http_via ~ cdn77) {
    rewrite ^/robots\.txt$ /robots_cdn77.txt last;
}
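
For reference, a minimal sketch of where the block sits, assuming a hypothetical domain example.com and web root /var/www/html:

server {
    listen 80;
    server_name example.com;
    root /var/www/html;

    # requests proxied through CDN77 carry "cdn77" in the Via header
    if ($http_via ~ cdn77) {
        rewrite ^/robots\.txt$ /robots_cdn77.txt last;
    }
}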


Then add another robots file on your origin:

Robots file:
Create a robots_cdn77.txt file next to your standard robots.txt file on your origin, with the following contents:

User-agent: *
Disallow: /
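
Your standard robots.txt on the origin stays as it is, so your own domain is still crawled normally. For contrast, a typical permissive robots.txt (keep whatever rules you already have) looks like this:

# an empty Disallow lets search bots crawl the whole site
User-agent: *
Disallow: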


If you want your content served from a CDN Storage to be inaccessible to Google as well, create a /www/robots.txt file with the same contents as above.

For more in-depth information, have a look at Google's documentation on duplicate content.