Continuing a FEP alchemical #96
On Mon, 23 Jan 2017 09:49:41 -0800 "David L. Mobley" ***@***.***> wrote:
While TI can handle this easily, alchemical-analysis isn't set up to
handle this case (partly because it places a heavy emphasis on BAR
and MBAR, and you don't have the data for these); alchemical-analysis
assumes the same number of lambda windows were used when running all
of the calculations. You *may* be able to handle this analysis with
GROMACS' built-in tools, otherwise you will be better off just
processing the `.xvg` files yourself and extracting the data for TI
analysis. It's not hard.
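As a rough illustration of the manual route suggested above, here is a minimal sketch of TI with the trapezoid rule, assuming the dH/dλ time series from each `.xvg` file has already been averaged into one value per window. The function name, window spacing, and the numbers below are placeholders, not real data or part of alchemical-analysis:

```python
# Sketch: thermodynamic integration from per-window <dH/dlambda> averages.
# Assumes the dH/dlambda column from each .xvg file has already been
# extracted and time-averaged; values below are placeholders.

def ti_trapezoid(lambdas, dhdl_means):
    """Integrate <dH/dlambda> over lambda with the trapezoid rule."""
    pairs = sorted(zip(lambdas, dhdl_means))
    dg = 0.0
    for (l0, y0), (l1, y1) in zip(pairs, pairs[1:]):
        dg += 0.5 * (y0 + y1) * (l1 - l0)
    return dg

# Placeholder example: 5 windows with arbitrary averages
lambdas = [0.0, 0.25, 0.5, 0.75, 1.0]
means = [10.0, 6.0, 3.0, 1.0, 0.0]
print(ti_trapezoid(lambdas, means))  # -> 3.75
```

Unlike BAR/MBAR, this only needs the dH/dλ column from each window, so windows added later pose no problem as long as their lambda values are known.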
Since it is not hard, then alchemical-analysis should be easily able to
handle it :-). Actually, I asked Alexander to report this here because
I see no technical reason why this should not be possible. If the user
explicitly asks not to use BAR/MBAR, then the parser should deal with
this by ignoring any BAR columns and deactivating any checks for those.
@halx - I agree this is something we should probably do eventually, yes. It's just not something I can fix in the near term (it will need to wait for larger architectural changes, I think). Or did you have a shorter-term fix in mind? Thanks.
I'm afraid I do not have an immediate short-term fix. The current design
seems to be based too much on assumptions about what a _typical_ Gromacs
output looks like.
On 23 January 2017 at 18:11, David L. Mobley ***@***.***> wrote:
Reopened #96.
Dear all,
I had a free energy simulation whose converged results were harvested by alchemical-analysis over 15 lambda windows in Gromacs. To increase the number of windows to 20, I simply added the additional lambda points to my lambda vector and ran simulations for those, intending to use the TI method without any problem. Now some of my .xvg files have 15 columns while others have 20 columns. I expected that the free energy change could still be calculated easily via the TI method (alchemical-analysis package), so I invoked the command below to disable every method except TI:
"alchemical_analysis -p prd. -u kcal -f 20 -o 100dis -s 100 -c -g -v -m '-ti_cubic-dexp-iexp-bar-mbar' -i 10".
However, it crashed exactly at the .xvg files (prd.15.xvg and afterward) in which the additional lambda points get involved:
```
Traceback (most recent call last):
  File "/usr/bin/alchemical_analysis", line 9, in <module>
    load_entry_point('alchemical-analysis==1.0.2.dev0', 'console_scripts', 'alchemical_analysis')()
  File "/usr/lib/python2.7/site-packages/alchemical_analysis-1.0.2.dev0-py2.7.egg/alchemical_analysis/alchemical_analysis.py", line 1207, in main
    nsnapshots, lv, dhdlt, u_klt = parser_gromacs.readDataGromacs(P)
  File "/usr/lib/python2.7/site-packages/alchemical_analysis-1.0.2.dev0-py2.7.egg/alchemical_analysis/parser_gromacs.py", line 308, in readDataGromacs
    nsnapshots[nf,nf] += unixlike.wcPy(f.filename) - f.skip_lines - 1*bLenConsistency
IndexError: index 15 is out of bounds for axis 1 with size 15
```
Any ideas to sort out the problem are highly appreciated.
Regards,
Alex