Hi, Uwe, Kelly and Berwin:
Thanks for the replies.
Checking my "cran-comments.md" file after Berwin's reply, I found
that I had documented two similar problems in 2022. One of these new
problems was present in 2022 but not flagged then. The other is new. I
documented them both in "cran-comments.md".
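For anyone facing the same NOTE, the entry I added to "cran-comments.md" is along these lines (wording illustrative, not the exact text):

  ## Notes on check results

  * Two NOTEs report (possibly) invalid URLs with status 403 Forbidden
    (https://bioguide.congress.gov/ and https://www.bls.gov/cps/).
    Both sites open normally in a browser but reject the automated
    header-only requests made by the URL check; neither site is under
    my control.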
Thanks again,
Spencer Graves
p.s. If my memory worked better, I might not have needed to bother
R-pkg-devel. (I don't think this memory problem is a symptom of dementia
or senility, as I remember not remembering well decades ago ;-)
On 11/10/24 07:35, Uwe Ligges wrote:
These can be ignored: the websites return a "Forbidden" (403) status
when the check script requests their headers to verify that the URLs are valid.
Not much you can do unless the websites are under your control.
Best,
Uwe Ligges
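For anyone who wants to reproduce what the check sees, here is a minimal sketch in R, assuming the 'curl' package is installed (the exact options used by the win-builder/CRAN URL checker may differ):

  library(curl)

  ## Header-only (HEAD-style) request, similar to what the URL check sends
  h <- new_handle(nobody = TRUE)
  res <- curl_fetch_memory("https://www.bls.gov/cps/", handle = h)
  res$status_code   ## sites that block such automated requests return 403

The same URL typically opens fine in a browser, which is why the NOTE can be ignored when the site is not under your control.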
On 10.11.2024 06:56, Spencer Graves wrote:
Hello:
I'm getting:
Found the following (possibly) invalid URLs:
URL: https://bioguide.congress.gov/
From: man/readDW_NOMINATE.Rd
Status: 403
Message: Forbidden
URL: https://www.bls.gov/cps/
From: inst/doc/UpdatingUSGDPpresidents.html
Status: 403
Message: Forbidden
These are in:
https://win-builder.r-project.org/9SsxyKVoV7n1/00check.log
Searching for "forbidden" in "Writing R Extensions" or in a web
search has given me nothing.
These are only NOTEs. Should I ignore them when submitting to CRAN?
Thanks,
Spencer Graves
______________________________________________
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel