Hi,

I have implemented LTR (LambdaRank) functionality, but in some search
cases relevance actually gets worse. I am trying to understand why
certain results are ranked above others, so naturally I am using a
debug query to see what is going on.

For example, here is the explain response for one of the documents:

"doc:en:/help/coder/index.html":
0.93952394 = (name=model,
  featureValues=[
    linkScore=1.7102735, hierScore=3.9314165, originalScore=0.029598212,
    tfidf_title=-0.3270329, tfidf_body=-0.6185444, tfidf_url=-0.8011434,
    tfidf_file_name=-0.37964302, tfidf_primary_header_en=-0.32059863,
    tfidf_secondary_header_en=0.36570454, tfidf_meta_description_en=-0.09497543,
    tfidf_inlink_text_en=-0.08638504, tfidf_indexed_not_highlighted_en=-0.2544066],
  layers=[(matrix=75x12,activation=relu),
          (matrix=1x75,activation=sigmoid)])
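
My current reading of this output, which may well be wrong, is that the
two layers describe a small feed-forward network applied to the feature
vector, roughly like the sketch below. The weights are placeholders
since I don't have the stored model in front of me, and I have left out
any bias terms because the explain output does not show them:

import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# The 12 feature values from the explain output above, in order.
features = np.array([
    1.7102735,    # linkScore
    3.9314165,    # hierScore
    0.029598212,  # originalScore
    -0.3270329,   # tfidf_title
    -0.6185444,   # tfidf_body
    -0.8011434,   # tfidf_url
    -0.37964302,  # tfidf_file_name
    -0.32059863,  # tfidf_primary_header_en
    0.36570454,   # tfidf_secondary_header_en
    -0.09497543,  # tfidf_meta_description_en
    -0.08638504,  # tfidf_inlink_text_en
    -0.2544066,   # tfidf_indexed_not_highlighted_en
])

# Placeholder weights; the real values live in the stored model definition.
W1 = np.zeros((75, 12))  # layer 1: matrix=75x12
W2 = np.zeros((1, 75))   # layer 2: matrix=1x75

hidden = relu(W1 @ features)     # layer 1, activation=relu
score = sigmoid(W2 @ hidden)[0]  # layer 2, activation=sigmoid
print(score)                     # is this how 0.93952394 comes about?

Is that roughly what happens, or is the score computed differently?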

Can somebody tell me how the final score of 0.93952394 is calculated
for this document? Also, how are the featureValues themselves
calculated? For example, the hierScore field value for this document is
actually 0.5, but it shows up here as 3.9314165.
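
My guess (and it is only a guess) is that the raw field values are run
through some feature normalization before they reach the model,
something along the lines of a standard-score normalization, but I
cannot see where the parameters would come from:

# Purely hypothetical standard-score normalization; mu and sigma would
# come from the feature/normalizer definition, which I don't know.
def normalize(raw_value, mu, sigma):
    return (raw_value - mu) / sigma

# For hierScore, normalize(0.5, mu, sigma) would have to come out
# as 3.9314165.

If that is roughly what happens, a pointer to where those normalizer
parameters are defined would already help a lot.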
