Dear Lea and Andreas,
given the excellent resolution, it might be possible to check the densities to
decide if hydrolysis is the more likely cause. If the negative density starts
more or less sharply after the C-terminal carboxylate group of the cleaved
terminus (i.e. after the peptide nitrogen
Hi,
I think it's unlikely to be “just one” molecule; it's much more likely an
average over multiple states in the crystal (i.e. that site is disordered).
Therefore, IMHO, modelling anything more than a couple of waters is highly
misleading to anyone interpreting your model afterwards.
If you d
https://www.wehi.edu.au/postdoctoral-fellow-tham-laboratory
An opportunity exists for a Postdoctoral Fellow to join the Infection and
Immunity Division at the Walter and Eliza Hall Institute to study the molecular
events that occur during malaria parasite entry into human red blood cells and
to
Dear structural biology friends,
I would like to let you know that I have an open position in my lab for a
postdoctoral researcher.
Please find the vacancy attached.
Kind regards,
Joris
Prof. dr. Joris Messens | Group
On 15/01/2019 20:49, Markus Heckmann wrote:
I have a structure where CYS is bound to OCA (octanoic acid). I
started with JLigand and created a CYS-OCA covalent link. It then
output a CIF file (I verified it and it looks OK). I then added the LINK
record to the PDB file:
LINK SG CYS A 161 C1 OC
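For reference, a minimal sketch (not from Markus's post) of how such a
column-sensitive LINK record can be written out; the OCA chain ID and residue
number used below are hypothetical placeholders, since the record quoted above
is truncated.

def format_link(name1, res1, chain1, seq1,
                name2, res2, chain2, seq2, dist=1.80):
    """Return an 80-column PDB v3.3 LINK record."""
    rec = list(" " * 80)
    rec[0:6]   = "LINK  "              # record name, cols 1-6
    rec[12:16] = f"{name1:<4}"         # atom name 1, cols 13-16
    rec[17:20] = f"{res1:>3}"          # residue name 1, cols 18-20
    rec[21]    = chain1                # chain ID 1, col 22
    rec[22:26] = f"{seq1:>4}"          # residue number 1, cols 23-26
    rec[42:46] = f"{name2:<4}"         # atom name 2, cols 43-46
    rec[47:50] = f"{res2:>3}"          # residue name 2, cols 48-50
    rec[51]    = chain2                # chain ID 2, col 52
    rec[52:56] = f"{seq2:>4}"          # residue number 2, cols 53-56
    rec[59:65] = f"{'1555':>6}"        # symmetry operator 1, cols 60-65
    rec[66:72] = f"{'1555':>6}"        # symmetry operator 2, cols 67-72
    rec[73:78] = f"{dist:5.2f}"        # link distance, cols 74-78
    return "".join(rec).rstrip()

# CYS A 161 SG is from the post; the OCA numbering "A 301" is an invented
# placeholder. The leading space in the atom name puts the one-letter
# element symbol in column 14, as the PDB convention expects.
print(format_link(" SG", "CYS", "A", 161, " C1", "OCA", "A", 301))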
Dear CCP4ites,
Is anyone aware of online repositories that will store huge sets of raw data
(>100 GB)? I’m aware of Zenodo and SBGrid, but Zenodo’s limit is typically 50
GB and their absolute limit is 100 GB. SBGrid has yet to respond to my emails.
I could host them myself, but the involuntary
Hi Aaron
I would guess most places would start to want $$ for storing multiples of 100 GB.
Google, Amazon, and Microsoft all offer this kind of thing. Getting the data in and
out can be slow, and I would expect that as the data size gets large and the
storage time gets long, it would be comparab
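As a rough back-of-the-envelope check on how slow getting data in and out can
be, here is a small sketch; the bandwidth figures are assumptions, not numbers
from this thread.

# Estimated upload time for a 100 GB dataset at a few assumed bandwidths.
size_gb = 100
for label, mbit_per_s in [("home broadband upload", 10),
                          ("typical institute link", 100),
                          ("1 Gbit/s research network", 1000)]:
    seconds = size_gb * 8e9 / (mbit_per_s * 1e6)   # decimal GB -> bits -> s
    print(f"{label:>26}: {seconds / 3600:5.1f} h")

Even on a dedicated 1 Gbit/s link the transfer takes on the order of a quarter
of an hour; over an ordinary office connection it quickly runs into many hours.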
Hi Aaron,
can you slice your data and then link to the bits?
We're currently trying to find out what "unlimited Google Drive storage"
means by uploading pi in chunks of 70 GB or so.
All best.
Andreas
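A minimal sketch of what "slicing" could look like in practice, assuming the
raw data have first been packed into a single archive; the file name and the
50 GB default chunk size below are placeholders, not details of Andreas's
setup. The pieces can be rejoined by simple concatenation on the receiving end.

def split_file(path, chunk_bytes=50 * 10**9, buf_bytes=64 * 2**20):
    """Write path.000, path.001, ... each at most chunk_bytes long,
    streaming in 64 MiB buffers so memory use stays small."""
    index = 0
    with open(path, "rb") as src:
        while True:
            buf = src.read(min(buf_bytes, chunk_bytes))
            if not buf:
                break                      # source exhausted
            written = 0
            with open(f"{path}.{index:03d}", "wb") as dst:
                while buf:
                    dst.write(buf)
                    written += len(buf)
                    if written >= chunk_bytes:
                        break              # this chunk is full
                    buf = src.read(min(buf_bytes, chunk_bytes - written))
            index += 1
    return index

# split_file("raw_data.tar")   # hypothetical archive name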
On Fri, Jan 18, 2019 at 4:31 PM Aaron Finke wrote:
> Dear CCP4ites,
>
> Is anyone aware o
Hi,
> Is anyone aware of online repositories that will store huge sets of
> raw data (>100 GB)? I’m aware of Zenodo and SBGrid, but Zenodo’s
> limit is typically 50 GB and their absolute limit is 100 GB. SBGrid
> has yet to respond to my emails.
The Coherent X-ray Imaging Data Bank may be appropr
The Zenodo policies seem to be the most workable as a start. I would suggest
contacting them for the cases that go over 50 GB, but at worst splitting
into 50 GB chunks. -- Herbert
On Fri, Jan 18, 2019 at 10:49 AM Andreas Förster <
andreas.foers...@dectris.com> wrote:
> Hi Aaron,
>
> can you slice yo
This is what Zenodo emailed me: "By default, we provide a one-time quota
increase up to 100GB for a dataset that will be cited from a peer-reviewed
article. Zenodo is a free-to-use service, and in order to keep it this way, we
have to restrict the incoming data volume rate as very large datasets
Dear Aaron,
I think that the Zenodo limit is as their email to you states: per dataset
cited from an article, i.e. one dataset equals one DOI. I recall that at
International Data Week in Denver in 2016, I mentioned in open discussion at
the session on data repositories the Zenodo per-dataset limit of 5 Gbytes and