Thanks Alan, this is indeed the sort of functionality I'm looking for. I have been using a subset of the functionality of the Linked Data API - for which implementations exist (ELDA, Puelia) - against Sesame and Triplify backends.
https://code.google.com/archive/p/linked-data-api/wikis/Specification.wiki

LDA works nicely enough, modulo a few issues. It would be better if:

1) the LDA spec was supported by an SDO
2) the implementation was supported by a larger developer community
3) it didn't mean a whole extra set of dependency wrangling for system requirements, imported components and licences - all of which would be solved if the equivalent functionality was in Marmotta :-)
4) it supported CONSTRUCT queries to transform statements and inject default and calculated values into the result
5) it supported querying over blank nodes (which we find all the time when there is an OWL model as part of the graph - and OWL is really useful for many things)
6) LDA itself could be factored into core and extensions

Given Marmotta's modular architecture, a module with this functionality seems a fairly good fit. The main hurdle I see is that Marmotta is keen to bind to a single repository, whereas I specifically want to combine data from triple stores and existing RDBMS environments. Decoupling APIs against separate SPARQL endpoints works fine for this. IMHO, if the Marmotta build configured APIs by default against its own SPARQL endpoint, but allowed additional APIs to be configured against alternative SPARQL endpoints, then all would be bliss.

Rob

On Wed, 17 Feb 2016 at 13:33 Robson, Alan <alan.rob...@viasat.com> wrote:

> This may be way off base…
>
> I wrote a small Java EE application that accepts a GET request with
> parameters; it then searches for an RDF record in LDP that matches the name
> of the query, validates the parameters according to the instructions it
> finds in the RDF, formulates a SPARQL query, then returns the
> results/errors.
>
> Here’s an example record representing an allowable query – I just made up
> the vocabulary…
>
> @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
> @prefix foaf: <http://xmlns.com/foaf/0.1/> .
> @prefix dlq: <http://unfortunate.name/2015/query#> .
>
> <> a dlq:sparqlQuery;
>    rdfs:label "Returns a list of names of individuals who have attained
>      the age of majority and hence are considered to be adults.";
>    dlq:param [ dlq:paramName "majority"; dlq:dataType dlq:unsignedInt;
>      dlq:maxValue "150"; dlq:defaultValue "18"; rdfs:label "age of majority" ];
>    dlq:sparql "PREFIX foaf: <http://xmlns.com/foaf/0.1/> SELECT ?name
>      WHERE {?i a foaf:Person . ?i foaf:name ?name . ?i foaf:age ?age .
>      FILTER (?age >= majority)}" .
>
> I post the above ttl file to Marmotta so that it can be found via LDP at:
>
> http://ldp:8080/marmotta/ldp/QUERIES/findAdults
>
> My SPARQL proxy sits on Tomcat alongside Marmotta, and when it gets a query
> like:
>
> http://ldp:8080/proxy/Services/findAdults?majority=21
>
> it looks up the above record in LDP by appending findAdults to the path,
> then it validates the “majority” parameter and builds a SPARQL query,
> substituting in the parameter where it is named, then executes it and
> returns the result (or, of course, an error message if anything is out of
> kilter).
>
> I also went ahead and added a few dummy records with people of various
> ages so that the SPARQL query had something to work with.
>
> Right now it only supports simple validations like integer ranges, but it
> could be extended to other types.
>
> I built it because while I don't mind users having SPARQL access in my dev
> environment, I want only canned queries in my production environment,
> because I have seen how unfettered queries can bog down other databases.
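For what it's worth, the validate-and-substitute step described above can be sketched in a few lines. This is only an illustration, not the actual proxy code: the class and method names are made up, the constraints are hard-coded here whereas the real proxy reads them from the dlq record in LDP.

```java
// Hypothetical sketch of the proxy's core step: apply the default, validate
// the parameter against the constraints from the dlq record, then substitute
// the value into the stored SPARQL template where the parameter is named.
public class CannedQuerySketch {

    static String expand(String template, String paramName, String value,
                         int maxValue, String defaultValue) {
        String v = (value == null) ? defaultValue : value;  // dlq:defaultValue
        int n = Integer.parseUnsignedInt(v);                // dlq:dataType dlq:unsignedInt
        if (n > maxValue)                                   // dlq:maxValue
            throw new IllegalArgumentException(paramName + " exceeds maximum " + maxValue);
        return template.replace(paramName, v);              // substitute where named
    }

    public static void main(String[] args) {
        String sparql = "SELECT ?name WHERE { ?i foaf:age ?age . "
                      + "FILTER (?age >= majority) }";
        // e.g. a request for .../findAdults?majority=21
        System.out.println(expand(sparql, "majority", "21", 150, "18"));
    }
}
```

A production version would also need to worry about quoting/escaping the substituted value so it cannot break out of the query (the same concern as SQL injection).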
> I was not so worried about security as performance.
>
> The only interfaces between Marmotta and the query proxy are Marmotta's
> standard REST APIs, so the proxy need not be hosted on the same machine as
> Marmotta.
>
> Is this along the lines of what you wanted?
>
> Alan
>
> *From:* Rob Atkinson [mailto:r...@metalinkage.com.au]
> *Sent:* Monday, February 15, 2016 4:55 AM
> *To:* users@marmotta.apache.org
> *Subject:* Support for RDF-shapes, Linked data API
>
> Hi
>
> I am looking at linked data applications to add value to data exposed via
> Web Services, to add the missing semantics about the exposed content needed
> to actually discover and use those services.
>
> I need to be able to traverse graphs composed of things like VoID,
> RDF Data Cube etc.
>
> The RDF-shapes scope covers this, and there are some elements of Marmotta,
> such as LDPath, that are relevant. I've previously built the functionality I
> needed using the LinkedDataAPI (https://github.com/UKGovLD/linked-data-api).
>
> Is there anything in Marmotta to support parameterised SPARQL queries
> accessed via URL-based APIs, and building a response by traversing paths
> from the query results? (I've looked but not found anything - any pointers
> to descriptions of the Marmotta core would be helpful!)
>
> Regards
>
> Rob Atkinson
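As an aside on the "traversing paths from the query results" part of the question: the idea can be illustrated with a toy sketch. The in-memory map below stands in for a real graph, and the property names are made up; an LDPath-style engine does the same walk against a live repository or SPARQL endpoint with a much richer path language.

```java
import java.util.*;

// Toy sketch of path traversal over a graph: starting from a set of
// subject nodes, follow a sequence of properties and collect the values
// reached. The nested-map "graph" is a stand-in for real RDF data.
public class PathTraversalSketch {

    static List<String> traverse(Map<String, Map<String, List<String>>> graph,
                                 Collection<String> start, String... path) {
        List<String> current = new ArrayList<>(start);
        for (String property : path) {
            List<String> next = new ArrayList<>();
            for (String node : current) {
                Map<String, List<String>> props = graph.get(node);
                if (props != null)
                    next.addAll(props.getOrDefault(property, List.of()));
            }
            current = next;  // nodes reached after this path step
        }
        return current;
    }

    public static void main(String[] args) {
        Map<String, Map<String, List<String>>> graph = Map.of(
            "ex:alice", Map.of("foaf:knows", List.of("ex:bob")),
            "ex:bob",   Map.of("foaf:name",  List.of("Bob")));
        // follow the path foaf:knows/foaf:name from ex:alice
        System.out.println(traverse(graph, List.of("ex:alice"),
                                    "foaf:knows", "foaf:name"));
    }
}
```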