dependabot[bot] opened a new pull request, #1590:
URL: https://github.com/apache/stormcrawler/pull/1590

   Bumps 
[com.github.crawler-commons:crawler-commons](https://github.com/crawler-commons/crawler-commons)
 from 1.4 to 1.5.
   <details>
   <summary>Release notes</summary>
   <p><em>Sourced from <a 
href="https://github.com/crawler-commons/crawler-commons/releases">com.github.crawler-commons:crawler-commons's
 releases</a>.</em></p>
   <blockquote>
   <h2>crawler-commons-1.5</h2>
   <h2>Important Changes</h2>
   <ul>
   <li>The robots.txt parser is now pedantic regarding the user-agent names 
passed to the <a 
href="https://crawler-commons.github.io/crawler-commons/1.5/crawlercommons/robots/SimpleRobotRulesParser.html#parseContent(java.lang.String,byte%5B%5D,java.lang.String,java.util.Collection)">parseContent()
 method</a>. The names in the <code>robotNames</code> parameter must be 
lower-case, and the wildcard agent name &quot;<code>*</code>&quot; must not be 
included. An exception is thrown if these conditions are not met. Please see 
the Javadoc and <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/453">#453</a>.</li>
   </ul>
   <h2>Full List of Changes</h2>
   <ul>
   <li>Migrate publishing from OSSRH to Central Portal (jnioche, sebastian-nagel, Richard Zowalla, aecio)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/510">#510</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/516">#516</a></li>
   <li>[Sitemaps] Add cross-submit feature (Avi Hayun, kkrugler, sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/85">#85</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/515">#515</a></li>
   <li>[Sitemaps] Complete sitemap extension attributes (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/513">#513</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/514">#514</a></li>
   <li>[Sitemaps] Allow partial extension metadata (adriabonetmrf, sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/456">#456</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/458">#458</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/512">#512</a></li>
   <li>[Domains] EffectiveTldFinder to also take shorter suffix matches into account (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/479">#479</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/505">#505</a></li>
   <li>Add package-info.java to all packages (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/432">#432</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/504">#504</a></li>
   <li>[Robots.txt] Extend API to allow checking java.net.URL objects (sebastian-nagel, aecio, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/502">#502</a></li>
   <li>[Robots.txt] Incorrect robots.txt result for uppercase user agents (teammakdi, sebastian-nagel, aecio, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/453">#453</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/500">#500</a></li>
   <li>Remove class utils.Strings (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/503">#503</a></li>
   <li>[BasicNormalizer] Complete normalization feature list of BasicURLNormalizer (sebastian-nagel, kkrugler)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/494">#494</a></li>
   <li>[Robots] Document that URLs not properly normalized may not be matched by the robots.txt parser (sebastian-nagel, kkrugler)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/492">#492</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/493">#493</a></li>
   <li>[Sitemaps] Added https variants of namespaces (jnioche)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/487">#487</a></li>
   <li>[Domains] Add version of public suffix list shipped with release packages (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/433">#433</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/484">#484</a></li>
   <li>[Domains] Improve representation of public suffix match results by class EffectiveTLD (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/478">#478</a></li>
   <li>Javadoc: fix links to Java core classes (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/417">#417</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/483">#483</a></li>
   <li>[Sitemaps] Improve logging done by SiteMapParser (Valery Yatsynovich, sebastian-nagel)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/457">#457</a></li>
   <li>[Sitemaps] Google Sitemap PageMap extensions (josepowera, sebastian-nagel, Richard Zowalla, jnioche)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/388">#388</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/442">#442</a></li>
   <li>[Domains] Installation of a gzip-compressed public suffix list from Maven cache breaks EffectiveTldFinder (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/441">#441</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/443">#443</a></li>
   <li>Upgrade dependencies (dependabot)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/437">#437</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/444">#444</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/448">#448</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/451">#451</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/473">#473</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/465">#465</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/466">#466</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/468">#468</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/488">#488</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/491">#491</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/506">#506</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/511">#511</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/517">#517</a></li>
   <li>Upgrade Maven plugins (dependabot)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/434">#434</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/438">#438</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/439">#439</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/449">#449</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/445">#445</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/452">#452</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/455">#455</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/459">#459</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/460">#460</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/464">#464</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/469">#469</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/467">#467</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/470">#470</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/471">#471</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/472">#472</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/474">#474</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/475">#475</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/476">#476</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/477">#477</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/480">#480</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/481">#481</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/482">#482</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/489">#489</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/490">#490</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/495">#495</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/496">#496</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/497">#497</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/498">#498</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/499">#499</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/508">#508</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/509">#509</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/518">#518</a></li>
   <li>Upgrade GitHub workflow actions v2 -&gt; v4 (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/501">#501</a></li>
   </ul>
   </blockquote>
   </details>
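   For downstream code that calls crawler-commons directly, the stricter
   user-agent handling above is the main thing to verify before merging. A
   minimal sketch of the new `parseContent()` contract (assuming
   crawler-commons 1.5 on the classpath; the robots.txt content and the agent
   name `mybot` are hypothetical):
   
   ```java
   import java.util.List;
   
   import crawlercommons.robots.BaseRobotRules;
   import crawlercommons.robots.SimpleRobotRulesParser;
   
   public class RobotsCheck {
       public static void main(String[] args) {
           SimpleRobotRulesParser parser = new SimpleRobotRulesParser();
           byte[] robotsTxt = String.join("\n",
                   "User-agent: mybot",
                   "Disallow: /private/",
                   "").getBytes();
   
           // Since 1.5, names in robotNames must be lower-case and must not
           // include the wildcard "*"; parseContent() throws otherwise.
           // Passing "MyBot" or "*" here would have worked in 1.4.
           BaseRobotRules rules = parser.parseContent(
                   "https://example.com/robots.txt",
                   robotsTxt,
                   "text/plain",
                   List.of("mybot"));
   
           // Matched by "Disallow: /private/" for agent "mybot"
           System.out.println(rules.isAllowed("https://example.com/private/x"));
           System.out.println(rules.isAllowed("https://example.com/public"));
       }
   }
   ```
   
   Callers that previously passed mixed-case or wildcard agent names should
   lower-case them before calling, rather than relying on the parser to do it.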
   <details>
   <summary>Changelog</summary>
   <p><em>Sourced from <a 
href="https://github.com/crawler-commons/crawler-commons/blob/master/CHANGES.txt">com.github.crawler-commons:crawler-commons's
 changelog</a>.</em></p>
   <blockquote>
   <p>Crawler-Commons Change Log</p>
   <p>Current Development 1.6-SNAPSHOT (yyyy-mm-dd)</p>
   <p>Release 1.5 (2025-06-27)</p>
   <ul>
   <li>Migrate publishing from OSSRH to Central Portal (jnioche, sebastian-nagel, Richard Zowalla, aecio)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/510">#510</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/516">#516</a></li>
   <li>[Sitemaps] Add cross-submit feature (Avi Hayun, kkrugler, sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/85">#85</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/515">#515</a></li>
   <li>[Sitemaps] Complete sitemap extension attributes (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/513">#513</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/514">#514</a></li>
   <li>[Sitemaps] Allow partial extension metadata (adriabonetmrf, sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/456">#456</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/458">#458</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/512">#512</a></li>
   <li>[Domains] EffectiveTldFinder to also take shorter suffix matches into account (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/479">#479</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/505">#505</a></li>
   <li>Add package-info.java to all packages (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/432">#432</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/504">#504</a></li>
   <li>[Robots.txt] Extend API to allow checking java.net.URL objects (sebastian-nagel, aecio, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/502">#502</a></li>
   <li>[Robots.txt] Incorrect robots.txt result for uppercase user agents (teammakdi, sebastian-nagel, aecio, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/453">#453</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/500">#500</a></li>
   <li>Remove class utils.Strings (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/503">#503</a></li>
   <li>[BasicNormalizer] Complete normalization feature list of BasicURLNormalizer (sebastian-nagel, kkrugler)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/494">#494</a></li>
   <li>[Robots] Document that URLs not properly normalized may not be matched by the robots.txt parser (sebastian-nagel, kkrugler)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/492">#492</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/493">#493</a></li>
   <li>[Sitemaps] Added https variants of namespaces (jnioche)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/487">#487</a></li>
   <li>[Domains] Add version of public suffix list shipped with release packages (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/433">#433</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/484">#484</a></li>
   <li>[Domains] Improve representation of public suffix match results by class EffectiveTLD (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/478">#478</a></li>
   <li>Javadoc: fix links to Java core classes (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/417">#417</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/483">#483</a></li>
   <li>[Sitemaps] Improve logging done by SiteMapParser (Valery Yatsynovich, sebastian-nagel)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/457">#457</a></li>
   <li>[Sitemaps] Google Sitemap PageMap extensions (josepowera, sebastian-nagel, Richard Zowalla, jnioche)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/388">#388</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/442">#442</a></li>
   <li>[Domains] Installation of a gzip-compressed public suffix list from Maven cache breaks EffectiveTldFinder (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/441">#441</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/443">#443</a></li>
   <li>Upgrade dependencies (dependabot)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/437">#437</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/444">#444</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/448">#448</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/451">#451</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/473">#473</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/465">#465</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/466">#466</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/468">#468</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/488">#488</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/491">#491</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/506">#506</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/511">#511</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/517">#517</a></li>
   <li>Upgrade Maven plugins (dependabot)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/434">#434</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/438">#438</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/439">#439</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/449">#449</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/445">#445</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/452">#452</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/455">#455</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/459">#459</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/460">#460</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/464">#464</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/469">#469</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/467">#467</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/470">#470</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/471">#471</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/472">#472</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/474">#474</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/475">#475</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/476">#476</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/477">#477</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/480">#480</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/481">#481</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/482">#482</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/489">#489</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/490">#490</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/495">#495</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/496">#496</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/497">#497</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/498">#498</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/499">#499</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/508">#508</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/509">#509</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/518">#518</a></li>
   <li>Upgrade GitHub workflow actions v2 -&gt; v4 (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/501">#501</a></li>
   </ul>
   <p>Release 1.4 (2023-07-13)</p>
   <ul>
   <li>[Robots.txt] Implement Robots Exclusion Protocol (REP) IETF Draft: port unit tests (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/245">#245</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/360">#360</a></li>
   <li>[Robots.txt] Close groups of rules as defined in RFC 9309 (kkrugler, garyillyes, jnioche, sebastian-nagel)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/114">#114</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/390">#390</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/430">#430</a></li>
   <li>[Robots.txt] Empty disallow statement not to clear other rules (sebastian-nagel, jnioche)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/422">#422</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/424">#424</a></li>
   <li>[Robots.txt] SimpleRobotRulesParser main() to follow five redirects (sebastian-nagel, jnioche)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/428">#428</a></li>
   <li>[Robots.txt] Add more spelling variants and typos of robots.txt directives (sebastian-nagel, jnioche)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/425">#425</a></li>
   <li>[Robots.txt] Document effect of rules merging in combination with multiple agent names (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/423">#423</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/426">#426</a></li>
   <li>[Robots.txt] Pass empty collection of agent names to select rules for any robot (wildcard user-agent name) (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/427">#427</a></li>
   <li>[Robots.txt] Rename default user-agent / robot name in unit tests (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/429">#429</a></li>
   <li>[Robots.txt] Add unit tests based on examples in RFC 9309 (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/420">#420</a></li>
   <li>[BasicNormalizer] Query parameters normalization in BasicURLNormalizer (aecio, sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/308">#308</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/421">#421</a></li>
   <li>[Robots.txt] Deduplicate robots rules before matching (sebastian-nagel, jnioche)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/416">#416</a></li>
   <li>[Robots.txt] SimpleRobotRulesParser main to use the new API method (sebastian-nagel, jnioche)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/413">#413</a></li>
   <li>Generate JaCoCo reports when testing (jnioche)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/409">#409</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/412">#412</a></li>
   <li>Push Code Coverage to Coveralls (Richard Zowalla, jnioche)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/414">#414</a></li>
   <li>[Robots.txt] Path analyse bug with url-decode if allow/disallow path contains escaped wild-card characters (tkalistratov, sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/195">#195</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/408">#408</a></li>
   <li>[Robots.txt] Handle allow/disallow directives containing unescaped Unicode characters (sebastian-nagel, Richard Zowalla, aecio)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/389">#389</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/401">#401</a></li>
   <li>[Robots.txt] Improve readability of robots.txt unit tests (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/383">#383</a></li>
   <li>Upgrade project to use Java 11 (Avi Hayun, Richard Zowalla, aecio, sebastian-nagel)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/320">#320</a>,
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/376">#376</a></li>
   <li>[Robots.txt] RFC compliance: matching user-agent names when selecting rule blocks (sebastian-nagel, Richard Zowalla)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/362">#362</a></li>
   <li>[Robots.txt] Matching user-agent names does not conform to robots.txt RFC (YossiTamari, sebastian-nagel)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/192">#192</a></li>
   <li>[Robots.txt] Improve robots check draft RFC compliance (Eduardo Jimenez)
   <a href="https://redirect.github.com/crawler-commons/crawler-commons/issues/351">#351</a></li>
   </ul>
   <!-- raw HTML omitted -->
   </blockquote>
   <p>... (truncated)</p>
   </details>
   <details>
   <summary>Commits</summary>
   <ul>
   <li><a href="https://github.com/crawler-commons/crawler-commons/commit/8bdd9e7740abec2554c2e0772a6fefbfd9ff6df7"><code>8bdd9e7</code></a>
 [maven-release-plugin] prepare release crawler-commons-1.5</li>
   <li><a href="https://github.com/crawler-commons/crawler-commons/commit/32a938e45af904369d8e1f5497fff2cd2ea221db"><code>32a938e</code></a>
 Reroll to 1.5-SNAPSHOT</li>
   <li><a href="https://github.com/crawler-commons/crawler-commons/commit/53ac0cef4c8ca422046f307c13f7a40afef97dd5"><code>53ac0ce</code></a>
 Update CHANGES.txt to include final changes for 1.5</li>
   <li><a href="https://github.com/crawler-commons/crawler-commons/commit/4f88e6dd8fbe6a49d53e22e4e1cc50fde1972c59"><code>4f88e6d</code></a>
 Bump org.sonatype.central:central-publishing-maven-plugin</li>
   <li><a href="https://github.com/crawler-commons/crawler-commons/commit/adcfb548a76ff9d118338cf9e17f0c66a6063ecb"><code>adcfb54</code></a>
 [maven-release-plugin] prepare release crawler-commons-1.5</li>
   <li><a href="https://github.com/crawler-commons/crawler-commons/commit/c6825ce7980e5671533d2268a12af27dcf3fc910"><code>c6825ce</code></a>
 Fix header in change log for release of 1.5</li>
   <li><a href="https://github.com/crawler-commons/crawler-commons/commit/d634ba2b61d4ced3ce35d90736acc3a90456e69a"><code>d634ba2</code></a>
 [maven-release-plugin] prepare for next development iteration</li>
   <li><a href="https://github.com/crawler-commons/crawler-commons/commit/14d9f8d818bdf11cf576ab0abdf64caac31394cc"><code>14d9f8d</code></a>
 [maven-release-plugin] prepare release crawler-commons-1.5</li>
   <li><a href="https://github.com/crawler-commons/crawler-commons/commit/e7a4619cbf0ab46a066645a1d35da22d061da1d7"><code>e7a4619</code></a>
 Format Javadoc per <code>mvn java-formatter:format</code>,</li>
   <li><a href="https://github.com/crawler-commons/crawler-commons/commit/43711bb68ace703837862c603aa3a0bd52606e5b"><code>43711bb</code></a>
 Update CHANGES.txt to include recent changes and updates</li>
   <li>Additional commits viewable in <a 
href="https://github.com/crawler-commons/crawler-commons/compare/crawler-commons-1.4...crawler-commons-1.5">compare
 view</a></li>
   </ul>
   </details>
   <br />
   
   
   [![Dependabot compatibility 
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=com.github.crawler-commons:crawler-commons&package-manager=maven&previous-version=1.4&new-version=1.5)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
   
   Dependabot will resolve any conflicts with this PR as long as you don't 
alter it yourself. You can also trigger a rebase manually by commenting 
`@dependabot rebase`.
   
   [//]: # (dependabot-automerge-start)
   [//]: # (dependabot-automerge-end)
   
   ---
   
   <details>
   <summary>Dependabot commands and options</summary>
   <br />
   
   You can trigger Dependabot actions by commenting on this PR:
   - `@dependabot rebase` will rebase this PR
   - `@dependabot recreate` will recreate this PR, overwriting any edits that 
have been made to it
   - `@dependabot merge` will merge this PR after your CI passes on it
   - `@dependabot squash and merge` will squash and merge this PR after your CI 
passes on it
   - `@dependabot cancel merge` will cancel a previously requested merge and 
block automerging
   - `@dependabot reopen` will reopen this PR if it is closed
   - `@dependabot close` will close this PR and stop Dependabot recreating it. 
You can achieve the same result by closing it manually
   - `@dependabot show <dependency name> ignore conditions` will show all of 
the ignore conditions of the specified dependency
   - `@dependabot ignore this major version` will close this PR and stop 
Dependabot creating any more for this major version (unless you reopen the PR 
or upgrade to it yourself)
   - `@dependabot ignore this minor version` will close this PR and stop 
Dependabot creating any more for this minor version (unless you reopen the PR 
or upgrade to it yourself)
   - `@dependabot ignore this dependency` will close this PR and stop 
Dependabot creating any more for this dependency (unless you reopen the PR or 
upgrade to it yourself)
   
   
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
