> >> On Jan 12, 2015, at 12:49 PM, Tobias Oberstein
> >>
> >> But what is the "interface" between test cases from "twisted-benchmarks"
> >> to codespeed?
> >
> > Codespeed runs the benchmark, and they print out this stuff:
> > https://bazaar.launchpad.net/~twisted-dev/twisted-benchmarks/trunk/view/head:/benchlib.py#L12
> >
> > POSTing them via JSON would be nicer, structured data is great.
>
> Nothing parses that output. It's just for humans.
>
> The code you're looking for is:
>
> https://bazaar.launchpad.net/~twisted-dev/twisted-benchmarks/trunk/view/head:/speedcenter.py
>
> which POSTs structured data (though urlencoded, not JSON) to the
> codespeed server.
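So, roughly something like this? (Just a sketch for my own understanding, not
speedcenter.py itself -- the /result/add/ endpoint and field names are what I
believe codespeed expects, and the host and values below are made up.)

    # Sketch only: an urlencoded result submission to a codespeed server.
    from urllib.parse import urlencode
    from urllib.request import urlopen

    CODESPEED_URL = "http://speed.example.org"  # hypothetical codespeed host

    def post_result(benchmark, value, commitid):
        data = {
            "commitid": commitid,
            "branch": "default",
            "project": "Twisted",
            "executable": "CPython 2.7",
            "benchmark": benchmark,
            # must match an environment configured in the codespeed instance
            "environment": "bench-machine-1",
            "result_value": value,
        }
        body = urlencode(data).encode("ascii")
        with urlopen(CODESPEED_URL + "/result/add/", body) as resp:
            return resp.read()

    # e.g. post_result("tcp_throughput", 12345.6, "abcdef0")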
I see. And codespeed parses that, stores it in a database and produces
graphs?

It seems reporting the results via a WAMP RPC to a backend should be quite
easy to add in speedcenter.py.

How is the Twisted speedcenter orchestrated / triggered? I mean, does a new
commit to the Twisted repo trigger rerunning all the speed tests? If so, how
does that work?

/Tobias
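PS: What I have in mind for the WAMP reporting, very roughly. This is a sketch
using Autobahn|Python on Twisted; the router URL, realm and the procedure name
"com.example.benchmarks.submit" are placeholders I made up, nothing that exists
today.

    from autobahn.twisted.wamp import ApplicationSession, ApplicationRunner
    from twisted.internet.defer import inlineCallbacks

    class ReportSession(ApplicationSession):

        @inlineCallbacks
        def onJoin(self, details):
            # call a (hypothetical) backend procedure with one benchmark result
            yield self.call("com.example.benchmarks.submit",
                            benchmark="tcp_throughput",
                            value=12345.6,
                            commitid="abcdef0")
            yield self.leave()

    if __name__ == "__main__":
        runner = ApplicationRunner("ws://localhost:8080/ws", "realm1")
        runner.run(ReportSession)

The backend registering that procedure would then do the storing and graphing.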