I'd like to know what people are using to measure the quality of their Go 
code bases and why. Specifically, I'm asking whether:

   1. There is a way to score a code base on quality that works with 
   idiomatic Go
   2. Such a metric can be used in a CI system to fail the build if quality 
   drops
   3. Such a metric can be used by a team to detect trends over time and 
   hot spots where technical debt tends to accumulate

It strikes me that Go is quite different from the usual set of popular 
languages, and that even the most basic measures (cyclomatic complexity, 
for example) may not be a good fit.

For example:

   1. Adding proper error handling increases measured complexity, but is 
   actually preferable to not handling errors
   2. A switch statement where one case lists several values separated by 
   commas is actually more maintainable, but would probably receive the 
   same score as the equivalent separate case statements
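To make those two points concrete, here is a small sketch (the function 
names are my own, not from any particular tool). The first function handles 
every error explicitly, which adds branches that a cyclomatic-complexity 
counter will score against it; the second groups values with commas in one 
case, which most counters would score the same as separate case lines:

```go
package main

import (
	"fmt"
	"os"
)

// copyFile checks every error explicitly. Each `if err != nil` adds a
// branch, so a cyclomatic-complexity tool scores this higher than a
// version that ignores errors -- even though this is the idiomatic form.
func copyFile(dst, src string) error {
	data, err := os.ReadFile(src)
	if err != nil {
		return fmt.Errorf("read %s: %w", src, err)
	}
	if err := os.WriteFile(dst, data, 0o644); err != nil {
		return fmt.Errorf("write %s: %w", dst, err)
	}
	return nil
}

// describe groups related values with commas in a single case. This is
// easier to read than four separate case lines, but a metric that counts
// branches would likely score both forms the same.
func describe(n int) string {
	switch n {
	case 2, 3, 5, 7:
		return "small prime"
	case 0:
		return "zero"
	default:
		return "other"
	}
}

func main() {
	fmt.Println(describe(5))
	fmt.Println(describe(4))
}
```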

There are already a lot of tools that analyse Go code in several different 
ways - but which ones really hold up in the long term?

I'd like to ask the community which ones you've had good experiences with - 
a test of a good tool could be that a better score results in more 
idiomatic, maintainable code.

-- 
You received this message because you are subscribed to the Google Groups 
"golang-nuts" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/golang-nuts/510f2a19-7dc8-4d01-ad1e-65155ca85355%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
