yangjiandan commented on code in PR #7527:
URL: https://github.com/apache/hadoop/pull/7527#discussion_r2020461751


##########
hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SecurityUtil.java:
##########
@@ -586,17 +594,62 @@ InetAddress getByName(String hostname) throws UnknownHostException {
       return hostResolver.getByName(hostname);
     }
   }
-  
+
   interface HostResolver {
-    InetAddress getByName(String host) throws UnknownHostException;    
+    InetAddress getByName(String host) throws UnknownHostException;
+  }
+
+  static abstract class CacheableHostResolver implements HostResolver {
+    private volatile LoadingCache<String, InetAddress> cache;
+
+    CacheableHostResolver(long expiryIntervalSecs) {
+      if (expiryIntervalSecs > 0) {
+        cache = CacheBuilder.newBuilder()
+            .expireAfterWrite(expiryIntervalSecs, TimeUnit.SECONDS)

Review Comment:
   Thanks for the suggestion!
   
   Since the number of nodes in a cluster is bounded, the cache will not grow unbounded in practice. To keep the configuration simple, I think it’s reasonable not to introduce an additional setting for the maximum number of entries at this point.
   
   Also, note that the cache in NodesListManager.CachedResolver doesn’t enforce a size limit either, and it seems to work fine under the same assumption.
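   
   For illustration only, here is a minimal sketch of what a bounded variant could look like if the node-count assumption ever changes. The class name `BoundedResolverCacheSketch` and the `MAX_ENTRIES` constant are hypothetical, not part of this PR, and the plain `com.google.common.cache` imports stand in for whichever (possibly shaded) Guava package SecurityUtil actually uses:
   
   ```java
   import java.net.InetAddress;
   import java.util.concurrent.TimeUnit;
   
   import com.google.common.cache.CacheBuilder;
   import com.google.common.cache.CacheLoader;
   import com.google.common.cache.LoadingCache;
   
   // Hypothetical sketch: same expiry behaviour as CacheableHostResolver,
   // plus an upper bound on the number of cached entries.
   public class BoundedResolverCacheSketch {
     // Illustrative bound; a real patch would make this configurable.
     private static final long MAX_ENTRIES = 4096;
   
     static LoadingCache<String, InetAddress> build(long expiryIntervalSecs) {
       return CacheBuilder.newBuilder()
           .expireAfterWrite(expiryIntervalSecs, TimeUnit.SECONDS)
           // Size-based eviction kicks in before expiry once the bound is hit.
           .maximumSize(MAX_ENTRIES)
           .build(new CacheLoader<String, InetAddress>() {
             @Override
             public InetAddress load(String host) throws Exception {
               // Stand-in resolution; the real resolver would delegate to
               // the concrete HostResolver implementation.
               return InetAddress.getByName(host);
             }
           });
     }
   }
   ```
   
   Since Guava applies size-based eviction cheaply, adding the bound later would be a small, localized change rather than something we need to configure up front.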



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

