Expected behavior
Transforming scaled coordinates back to unscaled (real-space) coordinates should reproduce the unscaled coordinates written by LAMMPS to well within rounding error, not change them past some visible threshold.
Actual behavior
Currently, the coordinates produced by distances.transform_StoR() in the LAMMPSDumpReader agree with the unscaled values written by LAMMPS only to the 1st decimal place. It is not immediately clear if or where precision is being lost, but IMO 1 decimal place is not ideal, whatever the cause.
Off the top of my head, possible causes/explanations:
- We are not carrying enough precision in _coord_transform() in calc_distances.h.
- We disagree with the LAMMPS value because the dump file only carries 6 decimal places and 5 of them are being lost somewhere.
- The matrix multiplication we are trying to do is inherently poorly conditioned.
- Transforming from scaled to real space coordinates is inherently throwing away some ULPs.
- Some or all of the above.
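As a rough sanity check on the float32 hypothesis, the sketch below (my own illustration, not MDAnalysis code; the box matrix h and scaled coordinates s are made-up values) applies the scaled-to-real transform x = h @ s in both float64 and float32. For a box on the order of 100 Angstrom, float32 rounding alone introduces errors around 1e-5 Angstrom, orders of magnitude below the 0.1 Angstrom discrepancy observed here:

```python
import numpy as np

# Hypothetical upper-triangular triclinic box matrix (columns are the box
# vectors), in Angstrom -- not taken from the test file.
h = np.array([[100.0,   5.0,  2.0],
              [  0.0,  80.0,  3.0],
              [  0.0,   0.0, 60.0]])

# Scaled (fractional) coordinates with 6 decimal places, as in a dump file.
s = np.array([0.123456, 0.654321, 0.333333])

x64 = h @ s                                         # float64 reference
x32 = h.astype(np.float32) @ s.astype(np.float32)   # float32, as positions are stored

err = np.abs(x64 - x32.astype(np.float64)).max()
print(err)  # on the order of 1e-5 Angstrom, far below 0.1 Angstrom
```

If this holds for the actual box in LAMMPSDUMP_allcoords, single precision by itself would not explain the observed disagreement.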
Code to reproduce the behavior
import MDAnalysis as mda
from MDAnalysis.tests.datafiles import LAMMPSDUMP_allcoords
from numpy.testing import assert_almost_equal

u_unscaled = mda.Universe(LAMMPSDUMP_allcoords, lammps_coordinate_convention="unscaled")
# scaled -> unscaled transform automatically called when universe is created
u_scaled = mda.Universe(LAMMPSDUMP_allcoords, lammps_coordinate_convention="scaled")

for ts_u, ts_s in zip(u_unscaled.trajectory, u_scaled.trajectory):
    assert_almost_equal(ts_u.positions, ts_s.positions, decimal=1)  # passes
    assert_almost_equal(ts_u.positions, ts_s.positions, decimal=2)  # fails
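To bound cause 2 in isolation (the dump file carrying only 6 decimal places in the scaled coordinates), here is a NumPy-only sketch with an assumed 100 Angstrom box edge, not the actual box from LAMMPSDUMP_allcoords:

```python
import numpy as np

rng = np.random.default_rng(0)
L = 100.0                 # hypothetical box edge length, Angstrom
s = rng.random(10000)     # "exact" scaled coordinates in [0, 1)
s_dump = np.round(s, 6)   # what a dump file with 6 decimal places would store

err = np.abs(L * s - L * s_dump).max()
print(err)  # at most L * 5e-7 = 5e-5 Angstrom
```

So for a ~100 Angstrom box, 6-decimal-place truncation of the scaled coordinates contributes at most about 5e-5 Angstrom of error, which also cannot account for a 0.1 Angstrom gap on its own; the combination of effects (or something else entirely) still needs to be tracked down.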
Current version of MDAnalysis
- Which version are you using? 2.0.0dev0
- Which version of Python? 3.7
- Which operating system? Ubuntu Eoan Ermine