
Re: [IMP-dev] Failure rate of restraints

This is just basic physics - most restraints are extensive, in that they scale with the system size (e.g. excluded volume, stereochemistry). Any
CC-like (cross-correlation) restraint is intensive, and doesn't scale with system size.
I wouldn't say it has anything to do with physics, but yes, it is obvious once you look at it :-)
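To make the extensive/intensive distinction concrete, here is a minimal sketch in plain Python (not the IMP API; all names are illustrative): a pairwise excluded-volume penalty accumulates one term per overlapping pair and so grows with the number of particles, while a normalized cross-correlation stays bounded in [-1, 1] no matter how large the system is.

    import itertools
    import math

    def toy_excluded_volume(coords, radius=1.0):
        """Extensive: one term per overlapping pair, so the score grows
        with the number of particles."""
        score = 0.0
        for p, q in itertools.combinations(coords, 2):
            overlap = 2 * radius - math.dist(p, q)
            if overlap > 0:
                score += overlap ** 2  # harmonic penalty on the overlap
        return score

    def toy_cross_correlation(model_density, exp_density):
        """Intensive: a normalized correlation, bounded in [-1, 1] no
        matter how many particles produced model_density."""
        n = len(model_density)
        mean_m = sum(model_density) / n
        mean_e = sum(exp_density) / n
        cov = sum((a - mean_m) * (b - mean_e)
                  for a, b in zip(model_density, exp_density))
        var_m = sum((a - mean_m) ** 2 for a in model_density)
        var_e = sum((b - mean_e) ** 2 for b in exp_density)
        return cov / math.sqrt(var_m * var_e)

Summing the two directly means the extensive term dominates as the system grows, which is the combination problem discussed below.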

Obviously that doesn't work when you combine the two.
The challenge is either making sure people pay attention to the issue early on, or making it go away as a problem so IMP developers don't have to help them individually :-)


For EM we solved this years ago with a scaling factor. Ideally the scale
would simply be N^2 where N is the number of particles in the system.
Why quadratic rather than linear?
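For concreteness, a hedged sketch of the rescaling idea (the (1 - CC) form and the helper name are assumptions, not the actual IMP EM restraint): multiplying the intensive CC term by a power of N puts it on the same footing as the extensive terms; exponent 2 gives the N^2 scale suggested above, exponent 1 the linear alternative asked about.

    def weighted_em_term(cc, n_particles, exponent=2):
        """Rescale an intensive CC in [-1, 1] so it is comparable to the
        extensive restraints it is summed with.  exponent=2 gives the N^2
        scale suggested above; exponent=1 is the linear alternative."""
        weight = n_particles ** exponent
        return weight * (1.0 - cc)  # zero at a perfect fit, grows with misfit

    # Combined score: extensive terms plus the rescaled intensive term, e.g.
    # total = toy_excluded_volume(coords) + weighted_em_term(cc, len(coords))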

I don't much like the idea of scaling "regular" restraints by the number of atoms or similar, since that would break pretty much everything else
where the assumption is made that the sum of the restraints is a score
that can be safely minimized.
I don't see that being able to minimize the score cares about whether it increases as you add atoms. In most cases the number of atoms is constant or otherwise not interesting (i.e. if you are docking proteins you don't want a larger protein to automatically score worse), and if you do care about minimizing the number of atoms, you can always add that as a term in your scoring function.

(this assumes that the score does increase
as you add atoms, and restraints on multiple atoms should have more
weight than those on just pairs).
Currently the weight scales with the number of particles rather than the number of atoms. The convention I proposed would make it scale with the number of atoms.

A key invariant is that changing the resolution of the representation should not change the score too much.
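A small sketch of what that invariant looks like under the proposed atom-count weighting (the Bead class and restraint_weight are illustrative, not IMP types): coarsening ten one-atom particles into a single ten-atom bead leaves the weight, and hence the score scale, unchanged.

    from dataclasses import dataclass

    @dataclass
    class Bead:
        x: float
        y: float
        z: float
        n_atoms: int  # how many atoms this particle stands in for

    def restraint_weight(beads):
        """Scale with atoms rather than particles: merging ten 1-atom
        beads into one 10-atom bead leaves the weight unchanged."""
        return sum(b.n_atoms for b in beads)

    fine   = [Bead(0.0, 0.0, float(i), n_atoms=1) for i in range(10)]
    coarse = [Bead(0.0, 0.0, 4.5, n_atoms=10)]
    assert restraint_weight(fine) == restraint_weight(coarse) == 10

Weighting by particle count instead (the current behaviour) would give 10 for the fine representation and 1 for the coarse one, breaking the invariant.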