Hello Mark, Dirk:
We will study your suggestions and write back in a few days with the approach
we decide on.
Thanks!
Ale
2018-06-29 8:36 GMT-03:00 Dirk Eddelbuettel:
>
> On 29 June 2018 at 09:15, Mark van der Loo wrote:
> | Hi Alejandro,
> |
> | Brooke Anderson gave a nice talk at useR!2017 addressing this exact issue.
> | [...]
Hi Alejandro,
Brooke Anderson gave a nice talk at useR!2017 addressing this exact issue. See
https://schd.ws/hosted_files/user2017/19/anderson-eddelbuettel-use_r_talk.pdf
for the slides. The basic idea is to use an external CRAN-like repository for
the data back-end. Brooke used 'drat' to set up [...]
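For reference, a rough sketch of what that drat route could look like; the
package, account and path names below are only placeholders, not the actual
setup:

## Maintainer side: drop the built data-only package into a local clone of a
## GitHub-hosted drat repository, then commit and push it.
drat::insertPackage("polyhedraData_0.1.0.tar.gz",    # placeholder data package
                    repodir = "~/git/drat")          # local clone of the drat repo

## User side: register the drat repository and install the data package from it.
drat::addRepo("some-account")                        # placeholder GitHub account
install.packages("polyhedraData")

The CRAN package itself can then stay small, list the data package under
Suggests: and point to the drat repository via the Additional_repositories:
field in DESCRIPTION.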
Hi Joris:
Thank you for your comments.
Of course, we are using https for additional downloads.
For the moment we do not need GitHub LFS, but it is an alternative we can
explore after this short step: our immediate goal is to make the package
lighter on CRAN. Now it's 35kb, so I think we made [...]
Hi Ale,
I'd personally use a more specific solution like GitHub LFS (Large File
Storage) for a versioned database. You should also check with CRAN itself,
as they keep high standards for everything that's not a standard install.
More specifically (from the CRAN policies):
"Downloads of additional software or data as part of package installation or
startup should only use secure download mechanisms (e.g. 'https')."
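To make that policy point concrete, a minimal sketch of a compliant first-use
download; the URL, directory and object names here are only placeholders:

## Fetch the extra polyhedra database over https on first use and cache it in
## a per-user directory, instead of shipping it inside the CRAN tarball.
db_url  <- "https://example.org/polyhedra/polyhedra-db.rds"       # placeholder URL
db_file <- file.path(rappdirs::user_data_dir("polyhedraExample"), # placeholder app name
                     "polyhedra-db.rds")
if (!file.exists(db_file)) {
  dir.create(dirname(db_file), recursive = TRUE, showWarnings = FALSE)
  utils::download.file(db_url, destfile = db_file, mode = "wb")
}
polyhedra_db <- readRDS(db_file)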
Right now, we are in that situation: about 150 polyhedra published, but more
than 800 ready to publish that we cannot include because of the package size.
It is not a problem on GitHub; it is a problem on CRAN with build timing (we
already fixed the testing time with simple sampling techniques). I would like
to hear more [...]
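As an illustration of the sampling idea mentioned above, a testthat sketch;
get_available_polyhedra() and build_polyhedron() are assumed helper names, not
necessarily the package's real API:

library(testthat)

test_that("a random sample of polyhedra can be built", {
  set.seed(42)                                  # keep the sample reproducible
  ids <- sample(get_available_polyhedra(), 10)  # assumed helper returning all ids
  for (id in ids) {
    ## expect_error(..., NA) asserts that building this polyhedron raises no error
    expect_error(build_polyhedron(id), NA)
  }
})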