xplor + very large structure

Christoph Weber weber at scripps.edu
Thu Oct 16 16:38:10 EST 1997


Dear netters,

Can X-PLOR be reconfigured to use less memory in any way? I have tried
adjusting parameters in the prexplor.dim file, as well as the MAXNGR and
MNCSYM parameters in ncs.fcm, but every executable that allows the job
to run eventually grows to the same size.
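
For concreteness, the kind of edit I mean is sketched below. This is a
rough, from-memory excerpt, assuming the usual X-PLOR convention of
dimensioning static arrays from PARAMETER statements in the .fcm include
files; the actual file layout may differ between releases, and the values
shown are only placeholders (60 operators would correspond to icosahedral
NCS):

C     Hypothetical excerpt from ncs.fcm (layout may differ by release).
C     MAXNGR bounds the number of NCS groups and MNCSYM the number of
C     NCS symmetry operators; static arrays are dimensioned from these,
C     so the executables must be rebuilt after any change.
      INTEGER MAXNGR, MNCSYM
      PARAMETER (MAXNGR=1, MNCSYM=60)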

Also, is there a parallel version of X-PLOR that scales reasonably well?
I hear that the regular version, parallelized through pfa on SGI, does
not scale well, and in my hands parallelization with the newer MIPSpro
7.1 compilers and pfa failed altogether.
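
For context, pfa works by automatic loop-level parallelization,
effectively wrapping SGI multiprocessing directives around individual DO
loops. The fragment below is a hedged sketch of that style, not actual
X-PLOR source (the subroutine and array names are invented); each
parallel loop carries its own fork/join overhead, which may be one
reason this fine-grained approach scales poorly:

      SUBROUTINE DIFFS(NREF, FOBS, FCALC, RESID)
C     Hedged illustration of SGI C$DOACROSS loop parallelism, the form
C     pfa generates automatically. Not taken from X-PLOR.
      INTEGER NREF, I
      REAL FOBS(NREF), FCALC(NREF), RESID(NREF)
C$DOACROSS LOCAL(I), SHARE(FOBS, FCALC, RESID)
      DO 10 I = 1, NREF
         RESID(I) = FOBS(I) - FCALC(I)
   10 CONTINUE
      RETURN
      END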


Background:
We are trying to refine the structure of a very large virus with X-PLOR
(2,000,000 reflections; the particle is a size class larger than typical
viruses). The job has a total memory footprint of 480 MB, and about half
of the data needs to stay in memory for efficient CPU utilization. This
creates problems on our shared SGI servers, which happen to be the only
machines on campus with sufficient memory. We are looking at roughly six
CPU-weeks on an R8000 per simulated-annealing (SA) run.

Any hints about how we can tame this job would be greatly appreciated!
Thanks in advance,

Christoph
--
|  Dr. Christoph Weber              Sen. Research Associate
|  Research Computing, MB9          619-784-9869 (phone)
|  The Scripps Research Institute   619-784-9985 (FAX)
|  La Jolla  CA  92037-1027         weber at scripps.edu        
|  http://www.scripps.edu/~weber.html                   
        Support the anti-Spam amendment
        Join at http://www.cauce.org/


