# Quick test using conventional unit cell
supercell = '1x1x1'
Parallel calculations
Parallel VASP calculations support
These functions support asynchronously executed VASP calculations. The implementation should be fairly easy to adapt to any directory-based calculator. In-memory calculators are not supported and are probably not worth the effort, since the real gain comes from calculations run on a cluster.
Parallel width estimator
This implements a parallel version of the width-estimation routine. It runs all the calculations using a pool of nwork workers.
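The pool pattern can be sketched with plain asyncio. This is an illustrative sketch only, not the HECSS code: run_sample is a hypothetical stand-in for a single displaced-configuration calculation, and the squared-index "result" is a placeholder.

```python
import asyncio

async def run_sample(i, sem):
    # Hypothetical stand-in for one DFT run executed in its own directory;
    # the shared semaphore caps how many runs are in flight at once.
    async with sem:
        await asyncio.sleep(0.01)  # pretend this is the VASP calculation
        return i * i               # placeholder result

async def estimate_parallel(n_samples, nwork):
    # Launch all samples at once; the semaphore limits
    # actual concurrency to nwork workers.
    sem = asyncio.Semaphore(nwork)
    tasks = [asyncio.create_task(run_sample(i, sem))
             for i in range(n_samples)]
    return await asyncio.gather(*tasks)

results = asyncio.run(estimate_parallel(8, nwork=4))
```

With nwork=4 at most four "calculations" run concurrently, while gather preserves the submission order of the results.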
# Slow, more realistic test
supercell = '2x2x2'
# Directory in which our project resides
base_dir = f'example/VASP_3C-SiC_calculated/{supercell}/'
calc_dir = TemporaryDirectory(dir='TMP')
# Read the structure (previously calculated unit (super) cell)
# The command argument is specific to the cluster setup
calc = Vasp(label='cryst', directory=f'{base_dir}/sc/', restart=True)
# This just makes a copy of the atoms object
# Do not generate the supercell here - your atom ordering will be wrong!
cryst = calc.atoms.repeat(1)
print('Stress tensor: ', end='')
for ss in calc.get_stress()/un.GPa:
    print(f'{ss:.3f}', end=' ')
print('GPa')
Ep0 = calc.get_potential_energy()
Stress tensor: 0.017 0.017 0.017 0.000 0.000 0.000 GPa
# Setup the calculator - single point energy calculation
# The details will change here from case to case
# We are using run-calc.sh from the current directory!
calc.set(directory=f'{calc_dir.name}/sc')
calc.set(command=f'{os.getcwd()}/run-calc.sh "async"')
calc.set(nsw=0)
cryst.set_calculator(calc)
# Prepare space for the results.
# We use defaultdict to automatically
# initialize the items to empty list.
samples = defaultdict(lambda: [])
# Space for amplitude correction data
xsl = []
# Build the sampler
hecss = HECSS(cryst, calc,
              directory=calc_dir.name,
              w_search=True,
              pbar=True)
hecss.Ep0 = Ep0
Triggering parallel calculations
The parallel version of the estimate_width_scale method is triggered by setting the nwork parameter to the number of parallel workers to be used. With nwork=0 the number of workers equals the number of requested samples N.
N = 10
m, s, xscl = hecss.estimate_width_scale(1, Tmax=2000)
await hecss.__estimate_width_scale_aio(N//2, Tmax=2000, nwork=N//2)
m, s, xscl = hecss.estimate_width_scale(N, Tmax=2000, nwork=N//2)
m, s, xscl = hecss.estimate_width_scale(N//2, Tmax=2000, nwork=3)
m, s, xscl = hecss.estimate_width_scale(2*N, Tmax=2000, nwork=N)
m, s, xscl = hecss.estimate_width_scale(3*N, Tmax=2000, nwork=0)
# plt.semilogy()
wm = np.array(hecss._eta_list).T
y = np.sqrt((3*wm[1]*un.kB)/(2*wm[2]))
plt.plot(wm[1], y, '.');
x = np.linspace(0, 1.05*wm[1].max(), 2)
fit = np.polyfit(wm[1], y, 1)
plt.plot(x, np.polyval(fit, x), ':', label=f'{fit[1]:.4g} {fit[0]:+.4g} T')
plt.axhline(m, ls='--', label=f'{m:.4g}±{s:.4g}')
plt.axhspan(m-s, m+s, alpha=0.3)
plt.ylim(m-4*s, m+4*s)
# plt.ylim(0, m+4*s)
plt.xlabel('Target temperature (K)')
plt.ylabel('width scale ($\\AA/\\sqrt{K}$)')
plt.grid(); plt.legend()
wm = np.array(hecss._eta_list).T
y = np.sqrt((3*wm[1]*un.kB)/(2*wm[2]))
plt.plot(y, '.')
rm = np.array([y[:l].mean() for l in range(1, len(y))])
rv = np.array([y[:l].std() for l in range(1, len(y))])
plt.plot(rm, '-', label='$ (x_0 + ... + x_{n-1})/n$')
plt.plot(rm + rv, ':', lw=1, color='C1')
plt.plot(rm - rv, ':', lw=1, color='C1')
plt.axhline(m, ls='--', label=f'{m:.4g}±{s:.4g}')
plt.axhspan(m-s, m+s, alpha=0.3)
plt.xlabel('Sample number ($n$)')
plt.ylabel('width scale ($\\AA/\\sqrt{K}$)')
plt.grid(); plt.legend()
Parallel sampler implementation
Implementation of an async/parallel generator executing calculations in a pool of nwork workers. Implemented for VASP, but it should be fairly easy to port/extend to other directory/cluster-based calculators.
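As a rough sketch of the idea (not the actual HECSS implementation), such a generator can be built on asyncio.wait, yielding each result as soon as any worker finishes. Here worker is a hypothetical stand-in for a directory-based VASP run, and the doubled value is a placeholder "energy":

```python
import asyncio

async def sample_generator(configs, nwork):
    # Bounded pool: at most nwork "calculations" run concurrently;
    # results are yielded in completion order, not submission order.
    sem = asyncio.Semaphore(nwork)

    async def worker(c):
        async with sem:
            await asyncio.sleep(0.001)  # stand-in for a VASP run
            return c, 2 * c             # (config, placeholder energy)

    pending = {asyncio.create_task(worker(c)) for c in configs}
    while pending:
        done, pending = await asyncio.wait(
            pending, return_when=asyncio.FIRST_COMPLETED)
        for t in done:
            yield t.result()

async def collect():
    # Drain the async generator into a list
    return [r async for r in sample_generator(range(6), nwork=3)]

results = asyncio.run(collect())
```

Yielding on FIRST_COMPLETED is what lets a consumer start post-processing finished samples while slower calculations are still running.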
Directory clean-up routine
This is executed by default to clean up after the tests. If you want the directory cleaned up after running the notebook, change CLEANUP to True. The directory is always cleaned after a successful test run in command-line mode. The default False value skips the cleanup for manual runs, leaving the calculation directory for inspection.
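A minimal sketch of the pattern, assuming a CLEANUP flag like the notebook's (the notebook itself keeps its scratch space in a TMP subdirectory; here the current directory is used so the snippet is self-contained):

```python
import os
from tempfile import TemporaryDirectory

CLEANUP = False  # flip to True to remove the scratch directory

# Scratch space for the calculations
base = TemporaryDirectory(dir='.')
path = base.name

# ... calculations would run inside `path` ...

if CLEANUP:
    base.cleanup()  # remove the directory and everything in it
else:
    print(f'Leaving {path} for inspection')
```

Note that a TemporaryDirectory object still removes its directory when it is finalized at interpreter exit, so skipping cleanup() only preserves the directory for the lifetime of the session.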