Bug#728277: ITP: r-bioc-bsgenome -- BioConductor infrastructure for Biostrings-based genome data packages

2013-10-30 Thread Andreas Tille
Description: BioConductor infrastructure for Biostrings-based genome data packages. This BioConductor module provides basic infrastructure for Biostrings-based genome data packages. Remark: the package is maintained by the Debian Med team at svn://anonscm.debian.org/debian-med

Re: Library depending on -data packages

2011-03-23 Thread Goswin von Brederlow
Jonathan Nieder writes: > (dropped cc's; hopefully that's okay.) > Hi! > > Luca Capello wrote: > >> I see these situations as a misuse of Depends: where Recommends: would >> be perfectly fine, otherwise Recommends: are useless. But given that it >> seems no one agrees with me, is such a behavior

Re: Library depending on -data packages

2011-03-23 Thread Goswin von Brederlow
Simon McVittie writes: > For instance, openarena needs a corresponding version of openarena-data: > if you substitute a data-set in the same format (zipped Quake III-compatible > assets) with non-trivial modifications, it won't be network-compatible, and > might even crash if you don't make corre

Re: Library depending on -data packages

2011-03-21 Thread Josh Triplett
<http://bugs.debian.org/599643> > <http://bugs.debian.org/599666> > > When I found out about libm17n-0, I also found out that the change added > a circular dependency and thus commented on this new bug why I think a > library package should not depend on data packages: > >

Re: Library depending on -data packages

2011-03-21 Thread Jonathan Nieder
Simon McVittie wrote: > Or are you saying > that things with special::auto-inst-parts should never have even a weakened > dependency on the package of which they're an implementation detail? Yes. (Well, a Suggests is okay.) > In situations where the data and the engine have a many-to-many relat

Re: Library depending on -data packages

2011-03-21 Thread Simon McVittie
On Mon, 21 Mar 2011 at 12:10:16 -0500, Jonathan Nieder wrote: > Simon McVittie wrote: > > The existence of openarena-data is an implementation detail of openarena, > > so it has this relationship: > > > >/--->--- Depends -->---\ > > openarena openarena-data > >\---<--

Re: Library depending on -data packages

2011-03-21 Thread Olaf van der Spek
On Mon, Mar 21, 2011 at 5:42 PM, Simon McVittie wrote: > The data package typically just takes up space without doing anything > useful if you install it on its own, so it should have the > special::auto-inst-parts debtag and should usually Recommend the library or > executable. I don't agree. Ye

Re: Library depending on -data packages

2011-03-21 Thread Jonathan Nieder
Hi, Simon McVittie wrote: > Which way to break the circular dependency needs to be considered > case-by-case; > neither answer is universally right. Here (with this statement of the problem) I disagree --- using Depends to mean Enhances is _always_ wrong. For example: > The existence of opena

Re: Library depending on -data packages

2011-03-21 Thread Jonathan Nieder
(dropped cc's; hopefully that's okay.) Hi! Luca Capello wrote: > I see these situations as a misuse of Depends: where Recommends: would > be perfectly fine, otherwise Recommends: are useless. But given that it > seems no one agrees with me, is such a behavior documented somewhere? Checking poli

Re: Library depending on -data packages

2011-03-21 Thread Simon McVittie
On Mon, 21 Mar 2011 at 17:18:00 +0100, Luca Capello wrote: > When I found out about libm17n-0, I also found out that the change added > a circular dependency and thus commented on this new bug why I think a > library package should not depend on data packages Which way to break the

Library depending on -data packages

2011-03-21 Thread Luca Capello
change added a circular dependency and thus commented on this new bug why I think a library package should not depend on data packages: <http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=604926#20> And now, while looking again for that bug to be linked here, I found another occurrence o
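The pattern under discussion can be sketched as a hypothetical debian/control fragment (the package names and stanzas below are invented for illustration, not the real libm17n packaging; whether the first arrow should be Depends or Recommends is exactly the point of contention in this thread):

```
# Hypothetical library/data pair illustrating the thread's question.
Package: libfoo0
Architecture: any
# Luca's position: this should be Recommends: foo-data, so that the
# library remains installable alone and no dependency cycle is created.
Depends: ${shlibs:Depends}, ${misc:Depends}, foo-data
Description: foo processing library - runtime

Package: foo-data
Architecture: all
# The data package pointing back at its consumer is what closes the
# cycle when both arrows are hard dependencies.
Recommends: libfoo0
Description: foo processing library - architecture-independent data
```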

Bug#530004: ITP: r-cran-g.data -- GNU R package for delayed-data packages

2009-05-22 Thread Steffen Moeller
delayed-data g.data creates and maintains delayed-data packages (DDP's). Data stored in a DDP are available on demand, but do not take up memory until requested. You attach a DDP with g.data.attach(), then read from it and assign to it in a manner similar to S-Plus, except that you must

Re: Data packages.

2009-02-09 Thread Joerg Jaspert
> Also, I'd suggest (if we get it), we also use it for things like > videos of talks about Debian, that kind of things. No. -- bye, Joerg

Re: Data packages.

2009-02-09 Thread Cyril Brulebois
Lionel Elie Mamane (09/02/2009): > Also, I'd suggest (if we get it), we also use it for things like > videos of talks about Debian, that kind of things. FWIW, there's that already: http://meetings-archive.debian.net/pub/debian-meetings/ Mraw, KiBi.

Re: Data packages.

2009-02-09 Thread Lionel Elie Mamane
On Mon, Feb 09, 2009 at 08:39:58AM +0900, Charles Plessy wrote: > On Sun, Feb 08, 2009 at 10:03:09PM +0100, Joey Schulze wrote: >> The data archive will contain huge packages that cannot be distributed >> through the regular archive due to their sheer size. (...) > that sounds very, very inter

Re: Data packages. (with BitTorrent)

2009-02-08 Thread Charles Plessy
On Sun, Feb 08, 2009 at 07:22:10PM -0600, Lukasz Szybalski wrote: > > Is there a capability of creating a mirror that would use torrent technology? > > If you have these .deb packages that are big (could you list few, with > their sizes), I was under the assumption that having torrent mirror >

Re: Data packages.

2009-02-08 Thread Lukasz Szybalski
f they > are not official. > > - From within the Amazon system, build binary packages of bioinformatical > data, and distribute them on the Amazon Simple Storage system, if it is > possible to open to the inside (free transfer), but not to the outside > (costs > me money).

Re: Data packages.

2009-02-08 Thread Charles Plessy
orage system, if it is possible to open to the inside (free transfer), but not to the outside (costs me money). - See if people use the data packages in conjunction with the unofficial-but-gpg-signed Debian Med image that I intend to prepare. - Ask for sponsorship if it starts to cost too much :)

Re: Data packages.

2009-02-08 Thread Peter Palfrader
On Mon, 09 Feb 2009, Charles Plessy wrote: > On Sun, Feb 08, 2009 at 10:03:09PM +0100, Joey Schulze wrote: > > > > The data archive will contain huge packages that cannot be distributed > > through the regular archive due to their sheer size. The number of free > >

Data packages.

2009-02-08 Thread Charles Plessy
On Sun, Feb 08, 2009 at 10:03:09PM +0100, Joey Schulze wrote: > > The data archive will contain huge packages that cannot be distributed > through the regular archive due to their sheer size. The number of free > large data packages, such as medical and statistical data sets, and

Re: Large data packages in the archive

2008-05-31 Thread Joe Smith
"Joerg Jaspert" <[EMAIL PROTECTED]> wrote: [snip] c.) We can host an own archive for it under control of ftpmaster. [snip] So the way to go for us seems to be c.), hosting the archive ourself (somewhere below data.debian.org probably). [snip] A data.d.o would presumably be running on a debian

Re: Large data packages in the archive

2008-05-28 Thread Daniel Jacobowitz
On Tue, May 27, 2008 at 03:54:25PM -0500, Raphael Geissert wrote: > Daniel Jacobowitz wrote: > > > > FYI, the most recent CVS snapshots of GDB can read zlib-compressed > > debug info. If someone gets around to an objcopy patch to create it, > > then we can change debhelper to use it... > > > >
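The objcopy support asked for here later landed in binutils as the --compress-debug-sections option. A minimal sketch, assuming gcc and a reasonably recent binutils are installed:

```shell
# Compress DWARF debug sections in place with objcopy and compare sizes.
# Requires gcc and a binutils with --compress-debug-sections support.
cat > /tmp/demo.c <<'EOF'
int main(void) { return 0; }
EOF
gcc -g -o /tmp/demo /tmp/demo.c
cp /tmp/demo /tmp/demo.compressed
objcopy --compress-debug-sections /tmp/demo.compressed
ls -l /tmp/demo /tmp/demo.compressed
```

Raphael's runtime-cost question in the follow-up applies at the consumer end: the debugger has to inflate the sections when it reads them, trading load time for disk and archive space.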

Re: Large data packages in the archive

2008-05-28 Thread Mark Eichin
Ove Kaaven <[EMAIL PROTECTED]> writes: > Joerg Jaspert wrote: >> - Packages in main need to be installable and not cause their (indirect) >> reverse build-depends to FTBFS in the absence of data.debian.org. >> If the data is necessary for the package to work and there is a small >> datas

Re: Large data packages in the archive

2008-05-28 Thread Goswin von Brederlow
Joerg Jaspert <[EMAIL PROTECTED]> writes: > On 11397 March 1977, Goswin von Brederlow wrote: > >>> No, we have to distribute the source. DFSG#2, many licenses require it. >>> Relying on some external system to keep the source for us is also a >>> no-go. >> What about the hack to have the source bu

Re: Large data packages in the archive

2008-05-27 Thread Raphael Geissert
Daniel Jacobowitz wrote: > > FYI, the most recent CVS snapshots of GDB can read zlib-compressed > debug info. If someone gets around to an objcopy patch to create it, > then we can change debhelper to use it... > What's the runtime cost? Usually things get slower with such kind of changes. Che

Re: Large data packages in the archive

2008-05-27 Thread Florian Weimer
* Joerg Jaspert: > Any comments? In the long term, I'd like to see a better CDN, so that such considerations would magically disappear. > Timeframe for this? I expect it to be ready within 2 weeks. Oooh. For a production-quality CDN, 2 years seem more reasonable. I don't know the reason for t

Re: Large data packages in the archive

2008-05-27 Thread Mike Hommey
On Tue, May 27, 2008 at 02:05:02PM -0400, Daniel Jacobowitz wrote: > On Mon, May 26, 2008 at 05:52:13PM +0100, Darren Salt wrote: > > I demand that Alexander E. Patrakov may or may not have written... > > > > > Joerg Jaspert wrote: > > >> That already has a problem: How to define "large"? One way,

Re: Large data packages in the archive

2008-05-27 Thread Daniel Jacobowitz
On Mon, May 26, 2008 at 05:52:13PM +0100, Darren Salt wrote: > I demand that Alexander E. Patrakov may or may not have written... > > > Joerg Jaspert wrote: > >> That already has a problem: How to define "large"? One way, which we > >> chose for now, is simply "everything > 50MB". > > > Random th

Re: Large data packages in the archive

2008-05-25 Thread Alexander E. Patrakov
Joerg Jaspert wrote: That already has a problem: How to define "large"? One way, which we chose for now, is simply "everything > 50MB". Random thought: some architecture-dependent -dbg packages are also > 50 MB in size. Shouldn't they get some special treatment, too? -- Alexander E. Patrako

Re: Large data packages in the archive

2008-05-25 Thread Charles Plessy
On Mon, May 26, 2008 at 02:02:52AM +0200, Joerg Jaspert wrote: > On 11397 March 1977, Charles Plessy wrote: > > > I have a question about the sources: for big datasets, would it be > > acceptable that the source package does not contain the data itself but > > only a script to download it? Sinc

Re: Large data packages in the archive

2008-05-25 Thread Joerg Jaspert
On 11397 March 1977, Charles Plessy wrote: > I have a question about the sources: for big datasets, would it be > acceptable that the source package does not contain the data itself but > only a script to download it? Since the source packages are not to be > autobuilt and the binary packages only

Re: Large data packages in the archive

2008-05-25 Thread Joerg Jaspert
On 11396 March 1977, Raphael Geissert wrote: > What about going the 'b.)' way but define it as a RG (or even RC) with some > other changes to policy (like requiring big data package's source packages > to be arch-indep and not build anything else but the data packages).

Re: Large data packages in the archive

2008-05-25 Thread Charles Plessy
On Sun, May 25, 2008 at 08:18:01PM +0200, Joerg Jaspert wrote: > Basic Problem: "What to do with large data packages?" > > That already has a problem: How to define "large"? One way, which we > chose for now, is simply "everything > 50MB". (...)

Re: Large data packages in the archive

2008-05-25 Thread Goswin von Brederlow
be arch-indep and not build anything else but the data packages). >> >> That way the transition could be done gradually for lenny+1 so there's no >> bloating. >> >> And, mirror admins could then have plenty of time to decide whether to >> mirror the da

Re: Large data packages in the archive

2008-05-25 Thread Luk Claes
Raphael Geissert wrote: > Luk Claes wrote: >> Are you sure that the current sync scripts make that possible and won't >> sync everything unless explicitly stated differently and will keep >> working without intervention for the time being? Because otherwise it's >> like Joerg said not an option IM

Re: Large data packages in the archive

2008-05-25 Thread Raphael Geissert
nd achieves the same goal without having to setup another repository which would also require more integration work in the current tools (package.d.o, PTS, DDPO, DEHS, ). And if that change is defined as a lenny+1 RC goal, it would ensure that all data packages are in the data component for len

Re: Large data packages in the archive

2008-05-25 Thread Luk Claes
Raphael Geissert wrote: > Hi all, > > What about going the 'b.)' way but define it as a RG (or even RC) with some > other changes to policy (like requiring big data package's source packages > to be arch-indep and not build anything else but the data packages). >

Re: Large data packages in the archive

2008-05-25 Thread Raphael Geissert
Hi all, What about going the 'b.)' way but define it as a RG (or even RC) with some other changes to policy (like requiring big data package's source packages to be arch-indep and not build anything else but the data packages).

Re: Large data packages in the archive

2008-05-25 Thread Ove Kaaven
Joerg Jaspert wrote: - Packages in main need to be installable and not cause their (indirect) reverse build-depends to FTBFS in the absence of data.debian.org. If the data is necessary for the package to work and there is a small dataset (like 5 to 10 MB) that can be reasonably substitu

Re: Large data packages in the archive

2008-05-25 Thread Michael Hanke
Hi, On Sun, May 25, 2008 at 08:18:01PM +0200, Joerg Jaspert wrote: > So assume we go for solution c. (which is what happens unless someone > has a *very* strong reason not to, which I currently can't imagine) we > will setup a separate archive for this. This will work the same way as > our main a

Large data packages in the archive

2008-05-25 Thread Joerg Jaspert
see if we missed important points but we keep the right to have the last word how it gets done. :) Basic Problem: "What to do with large data packages?" That already has a problem: How to define "large"? One way, which we chose for now, is simply "everything > 50MB".
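The "everything > 50MB" threshold quoted here lends itself to a one-line check. A sketch with invented sample sizes (on a real system the size column would come from dpkg-query's ${Installed-Size} field, which is reported in KiB):

```shell
# Flag packages above the 50 MB threshold discussed in the thread.
# The sample data below is invented; a real run would instead pipe in
#   dpkg-query -Wf '${Installed-Size}\t${Package}\n'
printf '%s\n' \
  '204800 openarena-data' \
  '1024 openarena' \
  '614400 flightgear-data-base' |
awk '$1 > 50*1024 { print $2 }'
# prints: openarena-data
#         flightgear-data-base
```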

Re: to join wmakerconf and wmakerconf-data packages

2007-05-30 Thread Josip Rodin
On Wed, May 30, 2007 at 08:56:29AM -0300, Herbert P Fortes Neto wrote: > I think it is also a good idea to join the wmakerconf and > wmakerconf-data debian packages. I would like to know if there is > a reason to not join wmakerconf debian package and wmakerconf-data > debian package in only one d

to join wmakerconf and wmakerconf-data packages

2007-05-30 Thread Herbert P Fortes Neto
Hi, I am intending to adopt the wmakerconf and wmakerconf-data debian packages, and I am now the new maintainer of these source packages. As there is no longer a reason to have two source packages, I joined the wmakerconf and wmakerconf-data source packages into one source

Data-packages was: Re: Potato now stable

2000-08-16 Thread Bernhard R. Link
On Tue, 15 Aug 2000, Drake Diedrich wrote: >Under the Irix packaging system (quite nice UI except that it has to > handle Irix packages..) packages exist in a hierarchy, with lowest level > packages quite fine grained. For example: [...] >Many of our packages are already hierarchical ( x