2007/6/5, Michael Hanke <[EMAIL PROTECTED]>:
Hi,

I'm packaging some neuroimaging tools that come with datasets that are
required for those tools to work properly. The size of these datasets is
up to 400 MB (several others are well over 100 MB).

My question is: is it reasonable to provide this rather huge amount of
data as a package in the archive?

An alternative to a dedicated package would be to provide a
download/install script for the data (like the msttcorefonts package)
that is called from the package's postinst. Another alternative would be
a package like googleearth-package. This would have the advantage that
users could easily build packages that they can then distribute
themselves.

Arguments for download wrappers/package-makers:

- the datasets alone fill yet another CD
- only very few people actually benefit from this package (it is a
  rather special-interest package)
- datasets change infrequently
- saves a lot of disk space in the archive and on mirrors

Arguments for a package:

- much easier to handle for users (thinking of offline machines)
- if upstream goes offline, the relevant software packages in the
  archive are basically useless, as the required datasets are no longer
  distributed
- disk space is rather cheap, and bandwidth should be no problem as the
  number of downloads will remain relatively low

There was already a little discussion about this, starting here:
http://lists.debian.org/debian-devel/2007/05/msg00207.html

I'd like to hear your comments about this.
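For reference, the msttcorefonts-style approach mentioned above boils down to fetching the data at install time and verifying a known checksum before installing anything. A minimal sketch of that pattern follows; the paths and file names are placeholders, and the actual network fetch (wget/curl against the upstream URL) is simulated with a local copy so the sketch is self-contained:

```shell
#!/bin/sh
# Sketch of a download-at-install step as a postinst might do it
# (msttcorefonts-style). All names here are hypothetical; a real
# postinst would wget the dataset from the upstream URL instead of
# copying a local stand-in.
set -e

workdir=$(mktemp -d)
trap 'rm -rf "$workdir"' EXIT

# Simulate the upstream dataset locally so the sketch runs offline.
printf 'fake dataset\n' > "$workdir/upstream.tar.gz"
# The packager would ship this checksum inside the package.
expected=$(sha256sum "$workdir/upstream.tar.gz" | cut -d' ' -f1)

# "Download" step; stand-in for: wget -O "$workdir/data.tar.gz" "$URL"
cp "$workdir/upstream.tar.gz" "$workdir/data.tar.gz"

# Verify before unpacking, and fail the postinst on a mismatch so the
# package is left unconfigured rather than half-installed.
actual=$(sha256sum "$workdir/data.tar.gz" | cut -d' ' -f1)
if [ "$actual" = "$expected" ]; then
    echo "checksum OK, would unpack into /var/lib/<package> here"
else
    echo "checksum mismatch, aborting" >&2
    exit 1
fi
```

The checksum check is the important part: it is what keeps an upstream file change (or a corrupted download) from silently installing different data than the package was tested against.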
Some games, and their number is increasing, are in the same situation
that you describe. I don't have a proper solution for it, but I'm very
interested in possible ways forward, as the number of packages in this
situation will keep growing.

Greetings, Miry