Re: [ccp4bb] water soluble protein that needs detergent to be stable

2007-08-29 Thread Daniel Picot
I have not dealt with this type of protein myself, but Alex McPherson
tested the detergent beta-octylglucoside (at a concentration of 1.5 %,
i.e. above the cmc) for the crystallisation of soluble proteins and tRNA
(1986, J. Biol. Chem. 261:1969-75). The detergent did not hamper
crystallisation, and in some cases even improved crystal quality. It is
possible to control the detergent concentration to some extent: if the cmc is
high enough (above about 1 mM) you can dialyse the detergent out;
otherwise you can bind the protein to a small affinity column and adjust
the detergent concentration as required. I will stop here, because I always
find it very difficult to extrapolate the effect of detergents from
membrane proteins to soluble proteins.
Daniel

Daniel Jin a écrit :
> Hi,
>  
> I am working on a 60 kDa C. elegans protein that is predicted to be
> mostly alpha-helical. It is over-expressed in E. coli and the yield is
> about 1 mg/L of cell culture. The CD spectrum at 4 °C showed a
> dominant alpha-helical signal. However, we don't have any functional
> assay to confirm that it is folded correctly.
>  
> It is over-expressed as a GST fusion. We noticed that after cleavage of
> the GST, it precipitates easily if moved to room temperature (the solution
> turns cloudy). Otherwise, it is OK at 4 °C. A CD thermal melting
> experiment showed a gradual change of signal; no sharp
> transition was observed. We later found that including some
> detergent in the buffer makes it stay soluble at room temperature,
> and it runs as a dimer on SEC (at 4 °C or RT). Glycerol at 10% helps too,
> but not as well as detergent.
>  
> My concerns are, first, that this protein might not be folded correctly,
> and second, that the probably high concentration of detergent in the final
> sample will harm crystallization, since the detergent will be
> co-concentrated with the protein. I am wondering whether anyone has dealt
> with proteins like this before, and what their experience has been in
> improving the biochemical behavior and, of course, the crystallization.
> Many thanks.
>  
> Best,
> chen


Re: [ccp4bb] The importance of USING our validation tools

2007-08-29 Thread James Stroud
Could such solvent ordering be a byproduct of unusually low solvent  
content as both of the structures seem to have (1h6w & 1ocy)? Perhaps  
cryoprotectant did not penetrate the solvent channels, but the  
crystal maintained its integrity nevertheless?



On Aug 28, 2007, at 1:14 AM, Kevin Cowtan wrote:


Wow! Those are two pretty amazing structures.

For those of you who haven't had a look, the ordered molecules are  
in layers with *huge* gaps in between, much greater than in 2hr0.


And yet both of these structures were solved with experimental  
phasing (SIRAS) unlike 2hr0, and the data is to higher resolution.


Mark J. van Raaij wrote:
With regards to our structures 1H6W (1.9A) and 1OCY (1.5A), rather  
than faith, I think the structure is held together by a real  
mechanism, which, however, I can't explain. As in the structure
Axel Brunger mentioned, there is appreciable diffuse scatter,
which imo deserves to be analysed by someone expert in the matter
(to whom, or anyone else, I would gladly supply the images, which I
should still have on a tape or CD in the cupboard...). For a low-res
version of one image see http://web.usc.es/~vanraaij/diff45kd.png

and http://web.usc.es/~vanraaij/diff45kdzoom.png
two possibilities I have been thinking about:


Re: [ccp4bb] Mean phase difference calculation

2007-08-29 Thread Winter, G (Graeme)
Hi All,
 
Thanks for your help with this - cphasematch is clearly what I was
looking for. For reference, here is the final script:
 
# compute PHI from refined structure - only necessary if these are the 
# "reference" phases

sfall XYZIN 1VRM.pdb HKLIN 13185_free.mtz HKLOUT 1VRM_HKLREF.mtz << eof
mode SFCALC XYZIN HKLIN
LABIN FP=F_INFL SIGFP=SIGF_INFL
labout FC=F_CALC PHIC=PHI_CALC
eof

# cad together

cad hklin1 resolve.mtz hklin2 1VRM_HKLREF.mtz hklout combined.mtz << eof
labin file 1 ALL
labin file 2 ALL
eof

# match origins and compute mean phi difference

cphasematch -mtzin combined.mtz -mtzout dummy.mtz \
-colin-fo "[*/*/F_INFL,*/*/SIGF_INFL]" \
-colin-phifom-1 "[*/*/PHIM,*/*/FOMM]" \
-colin-fc-2 "[*/*/F_CALC,*/*/PHI_CALC]"

Cheers,

Graeme 




From: CCP4 bulletin board [mailto:[EMAIL PROTECTED] On Behalf Of
Winter, G (Graeme)
Sent: 28 August 2007 16:21
To: CCP4BB@JISCMAIL.AC.UK
Subject: [ccp4bb] Mean phase difference calculation



Hi CCP4BB, 

Is there a straightforward way to compute the mean phase difference
between two phase columns in an MTZ file, allowing for the difference in
the origin? I am thinking something like:

1. Cad together the two reflection files - most likely renaming the PHIB column
2. Compute the phi offset - apply it to get the origins the same
3. Compute the r.m.s. phase difference in degrees, etc.
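The arithmetic behind these steps can be sketched in Python. This is a toy illustration only, not a substitute for cphasematch: a real program restricts the origin search to the allowed origins of the space group and weights phases by figure of merit, and the function names here are hypothetical:

```python
import numpy as np

def mean_phase_diff(hkl, phi1, phi2, shift):
    # Phase difference in degrees after applying a fractional origin shift:
    # moving the origin by t adds 360 * (h . t) degrees to each phase.
    dphi = phi1 - phi2 - 360.0 * (hkl @ np.asarray(shift, dtype=float))
    # Wrap to [-180, 180) before averaging the absolute differences.
    dphi = (dphi + 180.0) % 360.0 - 180.0
    return np.abs(dphi).mean()

def best_origin(hkl, phi1, phi2, n=8):
    # Brute-force search over an n x n x n grid of candidate origin shifts.
    grid = np.arange(n) / n
    shifts = [(tx, ty, tz) for tx in grid for ty in grid for tz in grid]
    best = min(shifts, key=lambda t: mean_phase_diff(hkl, phi1, phi2, t))
    return best, mean_phase_diff(hkl, phi1, phi2, best)
```

Once the origins are matched, the remaining mean absolute phase difference is the figure usually quoted.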

Also - is there a general feeling as to how useful this is for comparing
the results of phasing between different packages? I was thinking that
PHI_ref could be computed from the final refined structure, if this is
available, as a benchmark. Alternatively is there a "correct" way to be
doing things like this?

An example script would be really great... 

Thanks, 

Graeme 


Re: [ccp4bb] Mean phase difference calculation

2007-08-29 Thread Phil Evans

Kevin

Does this give the correlation between two (weighted) complex Fs,  
which is arguably (ref Bricogne) the best measure, as it corresponds  
to the map correlation but as a function of resolution?


Phil

On 28 Aug 2007, at 17:22, Kevin Cowtan wrote:


Yup. cphasematch

It gives unweighted and weighted mean phase errors and F and E  
correlations. It'll also try the opposite hand if appropriate.


It's in the GUI under Clipper utilities.

For an example script, run it in the GUI and then view the command  
file.


Kevin

Winter, G (Graeme) wrote:

Hi CCP4BB,
Is there a straightforward way to compute the mean phase  
difference between two phase columns in an MTZ file, allowing for  
the difference in the origin? I am thinking something like:
Cad together two reflection files - renaming the PHIB column, most  
likely

Compute phi offset - apply to get the origins the same
Compute the r.m.s. phase difference in degrees etc.
Also - is there a general feeling as to how useful this is for  
comparing the results of phasing between different packages? I was  
thinking that PHI_ref could be computed from the final refined
structure, if this is available, as a benchmark. Alternatively is  
there a "correct" way to be doing things like this?

An example script would be really great…
Thanks,
Graeme


[ccp4bb] permanent crystalliser job - AZ Boston

2007-08-29 Thread Pauptit, Richard A
AstraZeneca has a permanent position at our research site in Boston USA for 
someone interested in doing protein crystallization. The work will be 
crystallising bacterial proteins for use in antibiotic drug design. The 
successful candidate will be able to demonstrate a predominant interest and
track record in protein crystallisation.

To apply, please visit www.astrazeneca.com, click on jobs, US, career 
opportunities, and search for requisition number 1781568.

many thanks,

Prof Richard Pauptit
Associate Director and Principal Scientist
Global Structural Chemistry
AstraZeneca
50S36, Mereside, Alderley Park
Macclesfield SK10 4TG
Cheshire, UK.

email: [EMAIL PROTECTED]
tel: +44 1625 516135 


[ccp4bb] Why wwPDB and members are doing a poor job.

2007-08-29 Thread Joe Krahn
This is a reply to the below message posted under "[ccp4bb] The
importance of USING our validation tool", which is a rather long thread now.

This is part of why I claim that the wwPDB and its members are doing a bad
job. They have worked to systematically remove "general purpose"
information that does not fit their pre-defined schemes, which are
developed without much interaction with the user community. The problem
is that we are doing RESEARCH, which means that we will continue to
develop new methods over time. The sensible thing to do is to allow
unformatted user-defined information, and eventually work it into a
properly formatted, standard item if that information is seen as
generally useful by the user community.

I think that the lack of community involvement by the database
administrations should be a clear indication of why we should NOT switch
from PDB to mmCIF format for coordinate files. Instead, we should take
this opportunity of wwPDB members abandoning the PDB format to take over
management of the format ourselves. I was quite irate with them for
going against our wishes on several features of the PDB format, like
supporting the SegID. Instead, I think we should realize that "modern
database" management goals are different from experimentalist goals, and
that we should not rely on them to decide how our own data should be
represented.

I think that we should intentionally avoid mmCIF for coordinate files,
and stick to the PDB format. The wwPDB has absolutely no policy for user
involvement, and the RCSB has clearly dropped the previously established
PDB-format change policy. Their task was never to manage a public file
format standard. This is an opportunity to turn the PDB file format into
a public standard.

I have started a PDB Format Wiki, running on my home computer, at
http://pdb.homeunix.org. If it gains interest, I will see about moving
it to a proper Internet host.

Joe Krahn

Miller, Mitchell D. wrote:
> Hi Boaz,
>   We were informed by an RCSB annotator in April 2006 that the
> RCSB had suspended including REMARK 42 records in PDB files
> pending the review of the process by the wwPDB.
> 
>   In looking at the new annotation guidelines, it looks
> like the result of that review was to reject the REMARK 42 
> record and the listing of additional validation items.  
> See page 23 of the July 2007 "wwPDB Processing Procedures 
> and Policies Document"
> http://www.wwpdb.org/documentation/wwPDB-A-20070712.pdf
> 
> "REMARK 42 and use of other programs for validation Use of REMARK 42 is
> discontinued.
> 
> If authors wish to indicate presubmission validation and other programs used 
> before
> deposition, the programs may be listed in a new remark, REMARK 40. This 
> remark will
> list the software name, authors and function of the program. Results of the 
> software will
> not be listed. Use of this remark is voluntary."
> 
> It seems that the wwPDB only allows the inclusion of validation 
> statistics output by the refinement program but not from additional 
> validation programs. So for additional statistics to be included 
> in the PDB header, they will either need to be implemented by the 
> refinement package or the wwPDB annotators.
> 
> Regards,
> Mitch 


[ccp4bb] Rotation function calculation from 2 patterson maps

2007-08-29 Thread Raja Dey
Hi,
I have two Patterson maps calculated from two observed data sets of similar
structures. I want to rotate one map against the other and match them, to
obtain the corresponding rotation function. Does anyone have experience with
how I can do that?

Raja 
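The idea behind such a rotation function can be sketched directly in real space: rotate one map on its grid and score each trial rotation by the linear correlation of the overlap. The sketch below is a toy only (nearest-neighbour interpolation, rotation about a single axis, hypothetical function names); production programs such as MOLREP instead use fast rotation functions computed in reciprocal space:

```python
import numpy as np

def rotate_z(vol, angle_deg):
    # Nearest-neighbour rotation of a cubic volume about its z axis.
    n = vol.shape[0]
    c = (n - 1) / 2.0
    th = np.radians(angle_deg)
    i, j, k = np.indices(vol.shape)
    # For each target voxel, sample the source coordinate (inverse rotation).
    x, y = i - c, j - c
    xs = np.cos(th) * x + np.sin(th) * y + c
    ys = -np.sin(th) * x + np.cos(th) * y + c
    xs = np.clip(np.rint(xs).astype(int), 0, n - 1)
    ys = np.clip(np.rint(ys).astype(int), 0, n - 1)
    return vol[xs, ys, k]

def rotation_search(map1, map2, angles):
    # Score each trial rotation by the linear correlation of the two maps.
    def cc(a, b):
        a, b = a.ravel() - a.mean(), b.ravel() - b.mean()
        return (a @ b) / np.sqrt((a @ a) * (b @ b))
    return max((cc(map1, rotate_z(map2, ang)), ang) for ang in angles)
```

In practice one would of course sample all three rotation angles and work with Pattersons cut at a sensible integration radius around the origin.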


Re: [ccp4bb] The importance of USING our validation tools

2007-08-29 Thread Raji Edayathumangalam
I would like to mention some other issues now that Ajees et al. has stirred all 
sorts of
discussions. I hope I haven't opened Pandora's box.

From what I have learned around here, very often there seems to be little
time allowed or allocated to actually learn--a bit beyond the surface--some
of the crystallography, or what the crystallographic software is doing
during the structure solution process.

A good deal of the postdocs and students here are under incredible pressure to 
get the structure
DONE asap. For some of them, it is their first time solving a crystal 
structure. Yes, the same
heapful of reasons: because it's "hot", "competitive", grant deadline, PI 
tenure pressure etc. etc.
Learning takes a back seat, and this is total rubbish and very scary, in my
biased personal opinion.
Although I think it is the person's responsibility to take the time and
initiative to learn, I also see that the pressure is often insurmountable.
Often, the PI and/or an assigned "structure solver" in the lab pretty much
takes charge at some early stage of structure determination and solves the
structure with much less contribution from the scientist in training
(student/postdoc). All that slog to clone, purify, crystallize and optimize
diffraction, only to realize someone else will come along, process the data
and "finish up" the structure for you. Such 'training' (or lack thereof) is
a recipe for generating 'bad' structures in the future, and part of the
reason for this endless thread.

I think it is NOT as common for someone else to, say, run all the Western blots 
for you, maintain
your tissue cell lines for you, do your protein preps for you. Is it because it 
is much easier to
upload someone else's crystallographic data on one's machine and solve the 
structure (since this
does not demand the same kind of physical labor and effort and is also a lot of 
fun) that this
happens? I understand when the PI or "structure solver" does the above as part 
of a teamwork and
allows for the person in question to learn. But often, I see the person is 
somewhat left overwhelmed
and clueless in the end.

I bring this issue to the forum since I do not know if this phenomenon is 
ubiquitous. If this
practice is a rampant weed, can we as a crystallographic community place some 
measures to stanch
such practices?

How about ALL journals explicitly listing who did what during the 
crystallographic analysis? Is
there a practical solution?

I suspect that what I describe is not merely anecdotal. Any solutions?
Raji




--
Date:   Thu, 23 Aug 2007 16:17:23 -0700
From:   Dale Tronrud <[log in to unmask]>
Subject: Re: The importance of USING our validation tools

In the cases you list, it is clearly recognized that the fault lies with the 
investigator and not
the method. In most of the cases where serious problems have been identified in 
published models the
authors have stonewalled by saying that the method failed them.

"The methods of crystallography are so weak that we could not detect (for 
years) that our program
was swapping F+ and F-."

"The scattering of X-rays by bulk solvent is a contentious topic."

"We should have pointed out that the B factors of the peptide are higher then 
those of the protein."

It appears that the problems occurred because these authors were not following 
established
procedures in this field. They are, as near as I can tell, somehow immune from 
the consequences of
their errors. Usually the paper isn't even retracted, even when the model is
clearly wrong. They can dump
blame on the technique and escape personal responsibility. This is what upsets 
so many of us.

It would be so refreshing to read in one of these responses "We were under a 
great deal of pressure
to get our results out before our competitors and cut corners that we shouldn't 
have, and that
choice resulted in our failure to detect the obvious errors in our model."

If we did see papers retracted, if we did see nonrenewal of grants, if we did 
see people get fired,
if we did see prison time (when the line between carelessness and fraud is 
crossed), then we could
be comforted that there is practical incentive to perform quality work.

Dale Tronrud 


[ccp4bb] High solvent content crystals

2007-08-29 Thread Joe Krahn
This is a reply to the below message posted under "[ccp4bb] The
importance of USING our validation tool", which is a rather long thread now.

I worked with Lars Pedersen on a rather high (78%) solvent content
crystal, 1VKJ. The interesting thing is that there are 3 molecules in
the AU, with 2 well-defined, and one that is almost invisible. Without
the 3rd molecule, it looks like tubes of protein floating in space with
no contacts. Here is the unit cell:
http://joekrahn.homelinux.com/top_ABC.png

The third molecule (chain C) is yellow, and is poorly ordered. You can
get a good R-factor with just the A and B molecules, but the cell looks
like this:
http://joekrahn.homelinux.com/top_AB.png

Molecule C forms the triangular "girder" pattern seen here:
http://joekrahn.homelinux.com/side.png

Each brace across the girder pattern is two C molecules, which make
end-to-end contacts that are strong but give poor side-to-side stability.
This results in a fairly blurry, wobbly molecule that is still strong
length-wise, which is all that is required for a strong girder system.
However, only the very ends are well ordered, and the molecule needs NCS
restraints against the A and B chains to be refined. If this had been an
unknown molecule, we could have had a good result for molecules A and B, but
"floating" in space, with 86% solvent. I wonder if some other
high-solvent crystal is like this, where an unmodelled molecule can be
too poorly ordered to model, but still supplies enough strength along one
direction that it allows the crystal to be well ordered.

Joe Krahn


Jenny Martin wrote:
> I've been reading the contributions on this topic with much interest.
> It's been very timely in that I've been giving 3rd year u/g lectures on
> protein X-ray structures and their validation over the past week.
> As part of the preparation for the lectures, I searched the PDB for
> structures with high solvent content.
> To my surprise, I found 376 crystal structures with solvent content >75%
> (about 1% of all crystal structures) and 120 structures with solvent
> content > 80% (about 0.3% of all crystal structures)
> However, there were only 3 other structures that (like 2HR0) had >80%
> solvent AND Rcryst/Rfree less than 20%. All three structures were solved at
> better than 3 A resolution.
> One is from a weak data set from a virus crystal, the other two PDB
> files report very strong crystallographic data.
> The Rmerge values are more typical than for 2HR0 and none of the three
> appear to have the geometry or crystal contact problems of 2HR0.
> 
> My question is, how could crystals with 80% or more solvent diffract so
> well? The best of the three is 1.9A resolution with I/sigI 48 (top shell
> 2.5). My experience is that such crystals diffract very weakly.
> There are another 15 structures with solvent content 75-80% and
> Rcryst/Rfree < 20%. I didn't check them in any detail, just to see that
> the structure was consistent with a high solvent content.
> 
> Any thoughts?
> 
> Cheers,
> Jenny


[ccp4bb] exporting the "NCS found for Chain ... onto Chain ..." map

2007-08-29 Thread J . Sanchez-weatherby
Dear all,

I am trying to export the intermediate maps obtained in Coot when generating
NCS averaged maps, that is, the ones called "NCS found for Chain ... onto
Chain ..."

When I use the command (export-map imol "name") on Linux, or
export_map(imol, "name") on Windows, I manage to obtain a map output. The
problem is that this map is the original map, not the map I wanted, as if
Coot had forgotten the transformation to apply, or wasn't applying it before
writing the map out. This doesn't seem to be a problem if instead I try to
output the "NCS average of map ..."

Is it that the intermediate maps used to generate the NCS average map cannot be 
exported or am I
missing something?

Is this option available in a newer version? (The latest I've tried are
coot-0.3.2 on Linux and coot-0.3.3.1 on Windows.)

Alternatively, are these maps stored in some tmp file somewhere that I can find 
to be able to get
them?

Thank you for your time and help,

Juan


[ccp4bb] solving structure of which 70% is known

2007-08-29 Thread Raja Dey
Hello,
 I am trying to solve a multi-protein/DNA complex structure from a 3.6
A native data set. The target structure is a dimer (95 aa in each monomer) in
complex with DNA (15 base pairs), plus a second protein of 131 aa. The data
have been scaled in sp. gr. P6(1)22, and one copy of the target structure is
expected in the asymmetric unit, corresponding to 68% solvent content. 70% of
the target structure is known, in two parts (two different previously solved
PDB structures contributing 55% and 15% of the target structure). I tried
molrep and phaser with the first part (55%) as the search model, but no good
solution came out: all solutions clash with symmetry-related copies. If the
sp. gr. assignment is not correct, what else can I try? Any suggestion is
much appreciated.
Thanks...
Raja
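For reference, the 68% solvent figure quoted above is the standard Matthews-coefficient arithmetic. A minimal sketch, using the usual 1.23 Da per cubic Angstrom protein-density constant (Matthews, 1968); the function name is hypothetical:

```python
def solvent_fraction(cell_volume_a3, z, mol_weight_da):
    # Matthews coefficient Vm: unit-cell volume per Dalton of protein,
    # where z is the number of complexes in the unit cell
    # (copies per asymmetric unit x number of symmetry operators).
    vm = cell_volume_a3 / (z * mol_weight_da)
    # Matthews (1968): the protein itself occupies ~1.23/Vm of the cell.
    return 1.0 - 1.23 / vm
```

A Vm of about 2.5 A^3/Da corresponds to roughly 50% solvent, which is why values well above 3 A^3/Da flag a high-solvent crystal.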


[ccp4bb] Question about the percentages of occupancy for dual conformations.

2007-08-29 Thread Jian Wu
Dear all,
If I have a set of high-resolution data in which there is an important
residue with apparent dual conformations, my question is:
how or where can I test and calculate the occupancies of
the two conformations?
Any suggestion is welcome.

Yours,
Jian
-- 
Jian Wu

Ph.D. Student
Institute of Biochemistry and Cell Biology
Shanghai Institutes for Biological Sciences
Chinese Academy of Sciences (CAS)
Tel: 0086-21-54921117
Email: [EMAIL PROTECTED]
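One convention this question touches on: alternate conformations are encoded in PDB files via the altLoc column, with occupancies that should sum to 1 for each atom, and at high resolution refinement programs such as SHELXL (via free variables) or phenix.refine can refine those occupancies directly. A minimal Python sketch (hypothetical helper functions, fixed-column parsing per the PDB format) that collects altLoc occupancies so the sums can be checked:

```python
def altloc_occupancies(pdb_lines):
    # Collect occupancies of alternate conformations from fixed-column
    # PDB ATOM/HETATM records, keyed by (chain, residue number, atom name).
    occ = {}
    for line in pdb_lines:
        if not line.startswith(("ATOM", "HETATM")):
            continue
        alt = line[16]                     # altLoc indicator (column 17)
        if alt == " ":
            continue                       # single-conformation atom
        key = (line[21], line[22:26].strip(), line[12:16].strip())
        occ.setdefault(key, {})[alt] = float(line[54:60])  # occupancy field
    return occ

def check_sums(occ, tol=0.01):
    # Flag any atom whose alternate-conformation occupancies do not sum to 1.
    return {k: sum(v.values()) for k, v in occ.items()
            if abs(sum(v.values()) - 1.0) > tol}
```

The manual alternative is to set trial occupancies, refine, and inspect the difference density at each conformer until neither shows positive or negative peaks.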