On 12/01/2016 08:30 AM, Jonathan Vanasco wrote:
On Nov 28, 2016, at 4:07 PM, Jeff Dyke wrote:
And you do get a small SEO boost for being https forward.
Not necessarily -- some search engines are now doing the opposite and
penalizing non-https sites. Google has also announced plans to start
labeling non-https pages as "insecure" in Chrome in 2017.
It's incredibly simple (and free) to set up SSL via LetsEncrypt on all
domains - so I would do that.
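For example, with certbot's nginx plugin installed (the domain here is
just a placeholder):

certbot --nginx -d example-a.com -d www.example-a.com

That obtains the certificate and updates the matching server block in
one step.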
The LetsEncrypt concept was corrupted from the start by its use by
hackers / malware sites. If you're serious about security then an
old-school $10 cert from Comodo is far better.
Sure, LE is a solution, but juggling multiple SSL cert providers gets a
bit complex really.
( plus LE have already been hacked themselves )
On Nov 28, 2016, at 2:37 PM, steve wrote:
It seems that search engines are probing https: even for sites that
don't offer it, just because it's available for others, with the end
result that pages are being attributed to the wrong site.
In terms of your current situation with SEO and attribution -- can the
original poster share any examples of the search engines and
domains/results? I'd honestly love to see some of what is going on,
because what you've described pretty much never happens. A search
engine might probe for data via https, but it won't attribute a
resource to a domain/protocol it didn't actually load it from. This
alleged search engine behavior is something I've never seen with
Google, Bing, or other "standard" engines, and I've managed SEO for a
handful of top publishers. Based on my experience, and the lack of
evidence so far, I have no reason to believe this is the actual problem.
Well, no, as I've fixed this. However, if a search engine probes for
site x on https: and no https server exists for it, then the default
https site for that IP address will be returned. Depending on
configuration, the content may still be attributed to the original
search domain. I don't understand why people keep trying to shoot me
down on this!
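To illustrate what I mean, here's a minimal sketch with hypothetical
domains sharing the same IP address. Only example-a.com has a 443
listener, so an https probe for example-b.com is answered by the
default server with example-a.com's certificate and content:

server {
    listen 443 ssl default_server;
    server_name example-a.com;
    ssl_certificate     /etc/ssl/example-a.com.crt;
    ssl_certificate_key /etc/ssl/example-a.com.key;
    root /var/www/example-a.com;
}

# example-b.com only listens on port 80, so a request for
# https://example-b.com/ falls through to the block above.
server {
    listen 80;
    server_name example-b.com;
    root /var/www/example-b.com;
}

A separate catch-all default_server on 443 (with a throwaway
certificate) that just returns 444 stops unknown hostnames from being
handed another site's content.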
Off the top of my head, there are a lot of possible issues that could
cause this.
The most likely cause is a misconfiguration in nginx, with three things
happening:
1. there exists a link to the "wrong domain" for the content somewhere
on the internet
2. nginx is serving a file on the "wrong domain"
3. the pages do not list a "canonical url"
Even with a thoroughly broken nginx installation that serves the
content on the wrong domain, almost every search engine will transfer
the resource's link equity to the canonical URL, provided one is
declared. They will only show the data on the wrong domain/scheme if
you allowed it to be served on the wrong domain/scheme and failed to
include a canonical.
Note: I host these sites, I do not write the sites in question. Adding
canonical headers is beyond my remit, although I suppose nginx could be
coerced into adding one. Interestingly, neither of the CMSes I
primarily work with ( Magento and WordPress ) seems to add canonical
headers either. I must research this further.
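For what it's worth, a canonical can be declared either as a
<link rel="canonical"> element in the page head or as an HTTP Link
header (Google, at least, supports the header form), and at a guess
nginx could send the latter with something like this -- an untested
sketch, with the domain and paths as placeholders:

server {
    listen 443 ssl;
    server_name example-a.com;
    ssl_certificate     /etc/ssl/example-a.com.crt;
    ssl_certificate_key /etc/ssl/example-a.com.key;

    location / {
        # Declare the canonical URL for every page served here.
        add_header Link '<https://example-a.com$request_uri>; rel="canonical"';
        root /var/www/example-a.com;
    }
}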
If you are dealing with a broken search engine/spider for some random
service -- there are lots of those -- and you want to address it, the
problem could be that the client doesn't handle SSL or SNI correctly,
so you might be able to do (sketch below):
A) single-certificate HTTPS on IP#1 + (SNI HTTPS & plain HTTP on IP#2)
B) single-certificate HTTPS on IP#1 + SNI HTTPS on IP#2 + plain HTTP
on IP#3
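A rough sketch of option A (the addresses are from the documentation
range and the site names are placeholders): the non-SNI-safe site gets
its own IP with a single certificate, while the SNI sites and plain
http share the second IP:

# IP#1: one site, one certificate -- safe for clients without SNI.
server {
    listen 192.0.2.1:443 ssl;
    server_name example-a.com;
    ssl_certificate     /etc/ssl/example-a.com.crt;
    ssl_certificate_key /etc/ssl/example-a.com.key;
    root /var/www/example-a.com;
}

# IP#2: SNI-selected https sites plus plain http.
server {
    listen 192.0.2.2:443 ssl;
    server_name example-b.com;
    ssl_certificate     /etc/ssl/example-b.com.crt;
    ssl_certificate_key /etc/ssl/example-b.com.key;
    root /var/www/example-b.com;
}

server {
    listen 192.0.2.2:80;
    server_name example-b.com;
    root /var/www/example-b.com;
}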
You could also just isolate the given spiders by their User-Agent
string and handle them with custom content or redirects, along the
lines of the sketch below.
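A minimal sketch of that (the spider name and domains are
hypothetical): match the offending User-Agent with a map, then redirect
those requests to the host the content belongs on:

map $http_user_agent $is_problem_spider {
    default            0;
    "~*ExampleSpider"  1;   # hypothetical spider User-Agent
}

server {
    listen 443 ssl;
    server_name example-b.com;
    ssl_certificate     /etc/ssl/example-b.com.crt;
    ssl_certificate_key /etc/ssl/example-b.com.key;

    # Send the misbehaving spider to the canonical host instead of
    # serving it this site's content.
    if ($is_problem_spider) {
        return 301 https://example-a.com$request_uri;
    }

    root /var/www/example-b.com;
}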
None of the major search engines work in the manner you suggest though.
The problem was with Google...
--
Steve Holdoway BSc(Hons) MIITP
http://www.greengecko.co.nz
Linkedin: http://www.linkedin.com/in/steveholdoway
Skype: sholdowa
_______________________________________________
nginx mailing list
nginx@nginx.org
http://mailman.nginx.org/mailman/listinfo/nginx