Hi all, I was encouraged by Alan to respond to these PDF questions on-list, rather than redirecting the discussion to a PDF-only list, so my previous comments that such PDF discussions are not appropriate for the Rietveld list may have been overstated. It is all powder diffraction, we are all family! :-)
> Stimulated by the topics on pair distribution functions, can I ask some
> questions here about PDF analysis? What is the proper way of scaling in PDF
> calculation, so that one may obtain the actual structure factor and
> coordination numbers? I used both PDFGetN and PDFGetX to get the PDF, and
> also tried to do it by hand. What I found is that however I adjust the
> parameters in both programs, there is always a scale factor (not equal to
> 1) when one tries to refine the obtained G(r) in PDFFit. It seems the scale
> factor is determined by parameters such as the size of the sample and
> vanadium bar, packing ratios, etc., which may not be so accurate when one
> inputs them in PDFGetN. In order to obtain the right coordination numbers,
> the reduced PDF (G(r) = 4*pi*rho*r*[g(r) - 1], according to Egami and
> Billinge's definition) must be properly scaled. The scale factor may be
> found by fitting with a known model, but this may not be so clear for
> amorphous or unknown materials. Is this scale factor arbitrary, or are
> there some criteria, theoretical or empirical? In other words, how can one
> normalize the scattering intensity to get the actual structure factor?

Ideal normalization is something that people studying glasses care a great deal about and spend much effort pursuing. Whilst you can do very well using neutrons and normalizing against a totally incoherent scatterer such as a V/Nb alloy, or even pure vanadium, my personal view is that the resulting errors are somewhat large and somewhat uncontrolled. Without model fitting (this is a Rietveld list after all, go modelling!) I would hesitate to say that I can determine a coordination number from a PDF with better than 10% accuracy, and 20% is more likely to be correct. The difficulty is the uncontrolled errors: if you are unlucky, the occupancy can be off by much more. By model fitting you can refine an overall scale factor which takes care of this problem.
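As a toy illustration of that last point (not the algorithm of PDFfit or any other real program; the function name and data here are entirely made up), an overall scale factor between a measured and a calculated G(r) has a simple closed-form least-squares solution:

```python
import numpy as np

def refine_scale(g_obs, g_calc):
    """Least-squares overall scale s minimizing ||g_obs - s * g_calc||^2.

    Closed form: s = (g_calc . g_obs) / (g_calc . g_calc).
    """
    g_obs = np.asarray(g_obs, dtype=float)
    g_calc = np.asarray(g_calc, dtype=float)
    return float(np.dot(g_calc, g_obs) / np.dot(g_calc, g_calc))

# Fake example: "measured" data off by an unknown overall factor of 0.7.
r = np.linspace(0.5, 10.0, 500)
g_model = np.sin(2.0 * r) * np.exp(-0.1 * r)  # stand-in for a model G(r)
g_data = 0.7 * g_model                        # mis-normalized "data"
s = refine_scale(g_data, g_model)
print(round(s, 3))  # → 0.7
```

Because the scale enters linearly, it absorbs any uniform mis-normalization of the data without disturbing the relative peak intensities, which is why the other structural parameters remain well determined.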
The relative intensities between different PDF peaks are determined very accurately, so once the overall scale is determined, the other structural parameters can be refined with high accuracy, comparable to Rietveld.

In principle we normalize S(Q) by measuring to high Q, where the coherent scattering disappears (because of the Debye-Waller factor and, in the case of x-rays, additionally the x-ray form factor). At that point all the scattering from the sample is incoherent, so the intensity, normalized to the number of atoms (which is the normal convention), is given by the average scattering cross-section of the atoms, or <f>^2. We can simply multiply our measured intensity by whatever number it takes such that this incoherent part of the scattering lies on the theoretical <f>^2 line for our sample. When we Fourier transform the resulting intensity we can get perfect coordination numbers by using the correct (Faber-Ziman) expressions.

However, things are not as simple as they seem, and here is where the problem originates. Before we get to the "normalization" point we have to make some corrections to the data: we should subtract parasitic scattering (from containers etc.), Compton scattering, and multiple scattering, and correct for absorption and extinction, polarization, detector deadtime, the angular dependence of detector efficiencies, and so on. This we do with great care, but what you immediately realise is that some of the corrections are additive/subtractive and some are multiplicative. If we make any errors in these corrections (and many of them are quite approximate), we run the risk of compensating an additive correction with a multiplicative one when we do the normalization, resulting in an incorrect absolute scale on the data. In many cases the signal is much larger than any relevant backgrounds and corrections and we get rather good normalizations; in other cases, especially for weakly scattering samples, this is no longer true.
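The high-Q normalization step can be sketched as follows. This is only an illustration of the idea, not any real reduction code: the function name, the flat (neutron-like) <f>^2, the fake coherent signal, and the Q cutoff are all assumptions for the example.

```python
import numpy as np

def normalize_to_f2(q, i_meas, f_avg_sq, q_min_tail=20.0):
    """Scale a corrections-applied intensity so its high-Q tail lies on <f>^2.

    q          : momentum-transfer grid (1/Angstrom)
    i_meas     : measured per-atom intensity, arbitrary detector units
    f_avg_sq   : theoretical <f(Q)>^2 for the sample composition
    q_min_tail : Q above which the scattering is taken as purely incoherent
    """
    tail = q >= q_min_tail
    scale = np.mean(f_avg_sq[tail]) / np.mean(i_meas[tail])
    return scale * i_meas, scale

# Toy data: flat <f>^2 of 1.0 plus a Debye-Waller-damped coherent signal,
# recorded with an unknown overall detector factor of 0.42.
q = np.linspace(0.5, 30.0, 600)
f2 = np.ones_like(q)
i_true = f2 * (1.0 + np.exp(-0.05 * q**2) * np.sin(3.0 * q))
i_meas = 0.42 * i_true
i_norm, s = normalize_to_f2(q, i_meas, f2)
```

Note that this multiplicative rescaling is exactly where the danger lies: if an additive term (Compton, container, multiple scattering) was imperfectly subtracted beforehand, the tail-matching step silently folds that error into the absolute scale.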
My glass-studying colleagues will get mad at me for saying it, but in general, treat coordination numbers determined directly from PDF peak fitting with some degree of doubt. However, when you fit with models, we have shown [1] that even data that are far from ideally normalized (i.e., the normalization lim Q->infty S(Q) = 1 is good, but additive offsets were corrected with a multiplicative correction) give quantitatively identical refined parameters, so, as they say in Brooklyn, donworryaboudit.

[1] P. F. Peterson, E. S. Bozin, Th. Proffen and S. J. L. Billinge, Improved measures of quality for atomic pair distribution functions, J. Appl. Crystallogr. 36, 53 (2003).

Simon

PS: are you using PDFgetX or PDFgetX2 for your x-ray data? The latter is the version of the code that we are supporting and, if that was not a typo, I encourage you to switch.

--
Prof. Simon Billinge
Applied Physics & Applied Mathematics
Columbia University
500 West 120th Street
Room 200 Mudd, MC 4701
New York, NY 10027
Tel: (212)-854-2918

Condensed Matter and Materials Science
Brookhaven National Laboratory
P.O. Box 5000
Upton, NY 11973-5000
(631)-344-5387

email: sb2896 at columbia dot edu
home: http://nirt.pa.msu.edu/