Core dictionary (coreCIF) version 2.4.5
_diffrn_standards_decay_%
Name: '_diffrn_standards_decay_%'
Definition:
The percentage decrease in the mean intensity of the set of standard reflections measured at the start of the measurement process and at the finish. This value usually affords a measure of the overall decay in crystal quality during the diffraction measurement process. Negative values are used in exceptional instances where the final intensities are greater than the initial ones. If no measurable decay has occurred, the standard uncertainty should be quoted to indicate the maximum possible value the decay might have. A range of 3 standard uncertainties is considered possible. Thus 0.0(1) would indicate a decay of less than 0.3% or an enhancement of less than 0.3%.Examples:
0.5(1)   represents a decay of between 0.2% and 0.8%
-1(1)    the change in the standards lies between a decay of 2% and an increase of 4%
0.0(2)   the change in the standards lies between a decay of 0.6% and an increase of 0.6%
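The examples above combine the parenthesised standard-uncertainty notation with the 3-standard-uncertainty rule given in the definition. The following Python sketch is not part of the dictionary (the helper name parse_su_value is illustrative); it shows one way to split a value such as 0.5(1) into its estimate and standard uncertainty and to recover the quoted ranges.

# A minimal sketch, assuming CIF-style number(su) notation as in the examples.
import re

def parse_su_value(text: str) -> tuple[float, float]:
    """Parse a value like '0.5(1)' or '-1(1)' into (value, standard uncertainty)."""
    match = re.fullmatch(r"(-?\d*\.?\d+)\((\d+)\)", text.strip())
    if not match:
        raise ValueError(f"not a number(su) value: {text!r}")
    value_str, su_digits = match.groups()
    # The parenthesised digits apply to the last decimal place(s) of the value.
    decimals = len(value_str.split(".")[1]) if "." in value_str else 0
    return float(value_str), int(su_digits) * 10 ** -decimals

for example in ("0.5(1)", "-1(1)", "0.0(2)"):
    value, su = parse_su_value(example)
    low, high = value - 3 * su, value + 3 * su   # 3-s.u. range from the definition
    print(f"{example}: decay between {low:g}% and {high:g}%")
# 0.5(1): decay between 0.2% and 0.8%
# -1(1): decay between -4% and 2%    (a negative decay is an increase)
# 0.0(2): decay between -0.6% and 0.6%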
The permitted range is -infinity -> 100.
Type: numb
Category: diffrn_standards
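The quantity itself is the percentage decrease in the mean intensity of the standards between the start and the finish of data collection. The sketch below is not part of the dictionary and assumes the conventional expression 100*(I_start - I_finish)/I_start for "percentage decrease"; the function name standards_decay_percent is illustrative.

# A minimal sketch of the reported quantity, under the assumption above.
# Negative results correspond to the exceptional case where the final
# intensities exceed the initial ones; the value can never exceed 100
# (complete loss of intensity), consistent with the permitted range.
def standards_decay_percent(mean_intensity_start: float,
                            mean_intensity_finish: float) -> float:
    """Percentage decay of the standards over the course of the measurement."""
    return 100.0 * (mean_intensity_start - mean_intensity_finish) / mean_intensity_start

print(standards_decay_percent(1000.0, 995.0))   # 0.5  (a 0.5% decay)
print(standards_decay_percent(1000.0, 1010.0))  # -1.0 (a 1% increase in the standards)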