madrob commented on code in PR #912:
URL: https://github.com/apache/solr/pull/912#discussion_r902952991


##########
solr/core/src/java/org/apache/solr/core/SolrCores.java:
##########
@@ -39,8 +39,7 @@ class SolrCores {
 
   private final CoreContainer container;
 
-  private Set<String> currentlyLoadingCores =
-      Collections.newSetFromMap(new ConcurrentHashMap<String, Boolean>());
+  private Set<String> currentlyLoadingCores = Collections.newSetFromMap(new ConcurrentHashMap<>());

Review Comment:
   ```suggestion
     private Set<String> currentlyLoadingCores = ConcurrentHashMap.newKeySet();
   ```
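   For context, a minimal standalone sketch (not Solr code) showing that the two forms produce the same kind of concurrent set; `ConcurrentHashMap.newKeySet()` just states the intent more directly:
   ```java
   import java.util.Collections;
   import java.util.Set;
   import java.util.concurrent.ConcurrentHashMap;

   public class ConcurrentSetSketch {
     public static void main(String[] args) {
       // Current form: a Set view backed by a ConcurrentHashMap.
       Set<String> viaNewSetFromMap = Collections.newSetFromMap(new ConcurrentHashMap<>());
       // Suggested form: same backing structure, created directly.
       Set<String> viaNewKeySet = ConcurrentHashMap.newKeySet();

       viaNewSetFromMap.add("core1");
       viaNewKeySet.add("core1");
       System.out.println(viaNewSetFromMap.contains("core1") && viaNewKeySet.contains("core1")); // true
     }
   }
   ```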



##########
solr/core/src/java/org/apache/solr/filestore/PackageStoreAPI.java:
##########
@@ -212,12 +212,10 @@ public void upload(SolrQueryRequest req, SolrQueryResponse rsp) {
         } catch (IOException e) {
           throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, e);
         }
-      } catch (InterruptedException e) {
-        log.error("Unexpected error", e);
       } catch (KeeperException.NodeExistsException e) {
         throw new SolrException(
             SolrException.ErrorCode.SERVER_ERROR, "A write is already in process , try later");
-      } catch (KeeperException e) {
+      } catch (InterruptedException | KeeperException e) {

Review Comment:
   We should check for the interrupted status here, which we weren't doing before either, unfortunately.
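   For illustration, a hedged sketch of the usual idiom (hypothetical helper, not the actual PackageStoreAPI code): restore the interrupt flag before rethrowing, so callers further up the stack can still observe it.
   ```java
   import org.apache.solr.common.SolrException;
   import org.apache.zookeeper.KeeperException;

   // Hypothetical sketch, not the real upload() method.
   class UploadSketch {
     interface ZkWrite {
       void run() throws InterruptedException, KeeperException;
     }

     void upload(ZkWrite write) {
       try {
         write.run();
       } catch (InterruptedException | KeeperException e) {
         if (e instanceof InterruptedException) {
           // Re-set the interrupted status that catching the exception cleared.
           Thread.currentThread().interrupt();
         }
         throw new SolrException(SolrException.ErrorCode.SERVER_ERROR, e);
       }
     }
   }
   ```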



##########
solr/core/src/java/org/apache/solr/handler/admin/ZookeeperInfoHandler.java:
##########
@@ -676,10 +675,7 @@ boolean printTree(JSONWriter json, String path) throws IOException {
             }
             first = false;
           }
-        } catch (KeeperException e) {
-          writeError(500, e.toString());
-          return false;
-        } catch (InterruptedException e) {
+        } catch (KeeperException | InterruptedException e) {

Review Comment:
   Check the interrupted status here as well.



##########
solr/core/src/java/org/apache/solr/handler/component/FieldFacetStats.java:
##########
@@ -114,16 +114,11 @@ public boolean facetTermNum(int docID, int statsTermNum) throws IOException {
         key = topLevelSortedValues.lookupOrd(term).utf8ToString();
       }
       while (facetStatsTerms.size() <= statsTermNum) {
-        facetStatsTerms.add(new HashMap<String, Integer>());
+        facetStatsTerms.add(new HashMap<>());
       }
 
       final Map<String, Integer> statsTermCounts = facetStatsTerms.get(statsTermNum);
-      Integer statsTermCount = statsTermCounts.get(key);
-      if (statsTermCount == null) {
-        statsTermCounts.put(key, 1);
-      } else {
-        statsTermCounts.put(key, statsTermCount + 1);
-      }
+      statsTermCounts.merge(key, 1, Integer::sum);

Review Comment:
   I feel like there might be a better data structure to use for this, but that can probably be a separate issue.
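   For context, a small standalone sketch of the `Map.merge` counting idiom the diff switches to; whether a different (e.g. primitive-valued) map would serve better is the separate issue alluded to above.
   ```java
   import java.util.HashMap;
   import java.util.Map;

   public class MergeCountSketch {
     public static void main(String[] args) {
       Map<String, Integer> counts = new HashMap<>();
       for (String term : new String[] {"a", "b", "a", "a"}) {
         // Inserts 1 on first sight of the key, otherwise adds 1 to the existing count.
         counts.merge(term, 1, Integer::sum);
       }
       System.out.println(counts); // {a=3, b=1}
     }
   }
   ```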



##########
solr/core/src/java/org/apache/solr/handler/admin/ZookeeperInfoHandler.java:
##########
@@ -760,10 +756,7 @@ boolean printZnode(JSONWriter json, String path) throws IOException {
         }
 
         json.endObject();
-      } catch (KeeperException e) {
-        writeError(500, e.toString());
-        return false;
-      } catch (InterruptedException e) {
+      } catch (KeeperException | InterruptedException e) {

Review Comment:
   Check the interrupted status here as well.



##########
solr/core/src/java/org/apache/solr/core/SolrCore.java:
##########
@@ -1324,7 +1324,7 @@ public void initializeMetrics(SolrMetricsContext parentContext, String scope) {
         "coreName",
         Category.CORE.toString());
     parentContext.gauge(() -> startTime, true, "startTime", Category.CORE.toString());
-    parentContext.gauge(() -> getOpenCount(), true, "refCount", Category.CORE.toString());
+    parentContext.gauge(this::getOpenCount, true, "refCount", Category.CORE.toString());

Review Comment:
   I like the consistency with the lines above and below here; it feels easier to understand if they all use the same form.



##########
solr/core/src/java/org/apache/solr/cluster/placement/impl/AttributeFetcherImpl.java:
##########
@@ -185,20 +185,18 @@ public AttributeValues fetchAttributes() {
                                             .getReplicaMetricsBuilders()
                                             .computeIfAbsent(
                                                 replica.getName(),
-                                                n ->
-                                                    new CollectionMetricsBuilder
-                                                        .ReplicaMetricsBuilder(n));
+                                                CollectionMetricsBuilder.ReplicaMetricsBuilder
+                                                    ::new);

Review Comment:
   This line break is so awkward.
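   For comparison, a toy sketch (using a hypothetical ReplicaMetricsBuilder stand-in, not the real CollectionMetricsBuilder) of the two computeIfAbsent spellings; keeping the explicit lambda is one way to avoid splitting `::new` across lines when the qualified class name is long.
   ```java
   import java.util.HashMap;
   import java.util.Map;

   public class ComputeIfAbsentSketch {
     // Hypothetical stand-in for CollectionMetricsBuilder.ReplicaMetricsBuilder.
     static class ReplicaMetricsBuilder {
       final String name;
       ReplicaMetricsBuilder(String name) { this.name = name; }
     }

     public static void main(String[] args) {
       Map<String, ReplicaMetricsBuilder> builders = new HashMap<>();

       // Constructor reference: concise, but a long qualified name can force an awkward wrap.
       builders.computeIfAbsent("replica1", ReplicaMetricsBuilder::new);

       // Explicit lambda: slightly longer, yet formatters tend to keep it on one readable line.
       builders.computeIfAbsent("replica2", n -> new ReplicaMetricsBuilder(n));

       System.out.println(builders.size()); // 2
     }
   }
   ```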



##########
solr/core/src/java/org/apache/solr/handler/BlobHandler.java:
##########
@@ -207,24 +206,21 @@ public void handleRequestBody(final SolrQueryRequest req, SolrQueryResponse rsp)
           if (docs.totalHits.value > 0) {
             rsp.add(
                 ReplicationHandler.FILE_STREAM,
-                new SolrCore.RawWriter() {
-
-                  @Override
-                  public void write(OutputStream os) throws IOException {
-                    Document doc = req.getSearcher().doc(docs.scoreDocs[0].doc);
-                    IndexableField sf = doc.getField("blob");
-                    FieldType fieldType = req.getSchema().getField("blob").getType();
-                    ByteBuffer buf = (ByteBuffer) fieldType.toObject(sf);
-                    if (buf == null) {
-                      // should never happen unless a user wrote this document directly
-                      throw new SolrException(
-                          SolrException.ErrorCode.NOT_FOUND,
-                          "Invalid document . No field called blob");
-                    } else {
-                      os.write(buf.array(), buf.arrayOffset(), buf.limit());
-                    }
-                  }
-                });
+                (SolrCore.RawWriter)

Review Comment:
   This looks like it actually increases the code indent; I would revert it.



##########
solr/core/src/java/org/apache/solr/handler/admin/SegmentsInfoRequestHandler.java:
##########
@@ -256,18 +256,9 @@ private SimpleOrderedMap<Object> getSegmentInfo(
                       size = dir.fileLength(f);
                     } catch (IOException e) {
                     }
-                    return new Pair<String, Long>(f, size);
-                  })
-              .sorted(
-                  (p1, p2) -> {
-                    if (p1.second() > p2.second()) {
-                      return -1;
-                    } else if (p1.second() < p2.second()) {
-                      return 1;
-                    } else {
-                      return 0;
-                    }
+                    return new Pair<>(f, size);
                   })
+              .sorted((p1, p2) -> p2.second().compareTo(p1.second()))

Review Comment:
   Add a comment that this is a reverse sort.
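   For illustration, a self-contained sketch (with a stand-in record rather than Solr's Pair) showing the descending sort with the comment asked for; a `Comparator...reversed()` form is an arguably self-documenting alternative.
   ```java
   import java.util.Arrays;
   import java.util.List;
   import java.util.stream.Collectors;

   public class ReverseSortSketch {
     // Minimal stand-in for the Pair<String, Long> of file name and length.
     record FileSize(String name, Long size) {}

     public static void main(String[] args) {
       List<FileSize> files =
           Arrays.asList(new FileSize("_0.cfs", 10L), new FileSize("_1.fdt", 30L), new FileSize("_2.tim", 20L));

       List<FileSize> largestFirst =
           files.stream()
               // Reverse sort: compare p2 to p1 so the largest files come first.
               .sorted((p1, p2) -> p2.size().compareTo(p1.size()))
               .collect(Collectors.toList());

       // Equivalent alternative: .sorted(Comparator.comparing(FileSize::size).reversed())
       largestFirst.forEach(f -> System.out.println(f.name() + " " + f.size()));
     }
   }
   ```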



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@solr.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

