On 7/25/14 3:12 PM, [email protected] wrote:
Put simply, as far as I can see, a dereferenceable URI behind a robots.txt blacklist is no longer a dereferenceable URI ... at least for a respectful software agent. Linked Data behind a robots.txt blacklist is no longer Linked Data.
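A respectful agent in this sense can be sketched with Python's standard-library robots.txt parser. This is a minimal, hypothetical example (the robots.txt content, agent name, and URIs are illustrative, not from any real deployment): the agent refuses to dereference any URI that the publisher's robots.txt disallows.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blacklists the /data/ path for all agents.
robots_txt = """\
User-agent: *
Disallow: /data/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

def can_dereference(agent: str, uri: str) -> bool:
    """Return True only if robots.txt permits this agent to fetch the URI."""
    return rp.can_fetch(agent, uri)

# A Linked Data URI under the blacklisted path is, for this agent,
# effectively no longer dereferenceable:
print(can_dereference("MyLinkedDataBot", "http://example.org/data/person#this"))  # False
print(can_dereference("MyLinkedDataBot", "http://example.org/about"))             # True
```

In a real agent one would fetch the live robots.txt (e.g. via `RobotFileParser.set_url()` and `read()`) rather than parsing an inline string.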
When you know the identity of an agent and on whose behalf it is operating, you can use RDF-based Linked Data to construct and enforce usage policies.
[1] http://bit.ly/loosely-coupled-read-write-operation-at-web-scale -- this also applies to any Linked Data resource.
--
Regards,

Kingsley Idehen
Founder & CEO
OpenLink Software
Company Web: http://www.openlinksw.com
Personal Weblog 1: http://kidehen.blogspot.com
Personal Weblog 2: http://www.openlinksw.com/blog/~kidehen
Twitter Profile: https://twitter.com/kidehen
Google+ Profile: https://plus.google.com/+KingsleyIdehen/about
LinkedIn Profile: http://www.linkedin.com/in/kidehen
Personal WebID: http://kingsley.idehen.net/dataspace/person/kidehen#this
