Matplotlib created a temporary config/cache directory at /tmp/matplotlib-9re594ps because the default path (/var/www/.config/matplotlib) is not a writable directory; it is highly recommended to set the MPLCONFIGDIR environment variable to a writable directory, in particular to speed up the import of Matplotlib and to better support multiprocessing.
/usr/lib/python3/dist-packages/ase/io/cif.py:401: UserWarning: crystal system 'cubic' is not interpreted for space group Spacegroup(221, setting=1). This may result in wrong setting!
  warnings.warn(
INFO: Reading configuration file /var/www/html/data/material-modeling/LaB6.cif-QuantumEspresso-farhie-20250130-125114/config.toml.
INFO: Creating lattice/cell object from /var/www/html/data/material-modeling/LaB6.cif-QuantumEspresso-farhie-20250130-125114/LaB6.cif.
INFO: supercell = [2 2 2]
INFO: kpoints = [3 3 3]
INFO: kpts_density= 1512
INFO: directory = /var/www/html/data/material-modeling/LaB6.cif-QuantumEspresso-farhie-20250130-125114/
INFO: B6La Bravais lattice: CUB(a=4.157); Spacegroup: P m -3 m (221)
INFO: Creating calculator QUANTUMESPRESSO.
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. You may or may not see output from other processes, depending on exactly when Open MPI kills them.
--------------------------------------------------------------------------
[re-grades-01:789941] 31 more processes have sent help message help-mpi-api.txt / mpi-abort
[re-grades-01:789941] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 4 in communicator MPI_COMM_WORLD with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. You may or may not see output from other processes, depending on exactly when Open MPI kills them.
--------------------------------------------------------------------------
[re-grades-01:790246] 31 more processes have sent help message help-mpi-api.txt / mpi-abort
[re-grades-01:790246] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. You may or may not see output from other processes, depending on exactly when Open MPI kills them.
--------------------------------------------------------------------------
[re-grades-01:790547] 31 more processes have sent help message help-mpi-api.txt / mpi-abort
[re-grades-01:790547] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
INFO: Starting computations in file:///var/www/html/data/material-modeling/LaB6.cif-QuantumEspresso-farhie-20250130-125114/ [Thu Jan 30 12:51:18 2025]
INFO: Computing ground state.
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 8 in communicator MPI_COMM_WORLD with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. You may or may not see output from other processes, depending on exactly when Open MPI kills them.
--------------------------------------------------------------------------
[re-grades-01:790852] 31 more processes have sent help message help-mpi-api.txt / mpi-abort
[re-grades-01:790852] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
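The two warnings at the top are likely harmless on their own: Matplotlib only needs MPLCONFIGDIR pointed at a writable directory, and the ASE CIF reader's "crystal system" warning should not matter here, since the space group P m -3 m (221) is identified correctly a few lines later. A minimal sketch of the setup step these INFO lines describe, assuming only what the log reports (LaB6.cif, 2x2x2 supercell, 3x3x3 k-points); the pipeline's actual code is not part of this log:

    # Sketch only; file name, supercell and k-mesh are taken from the INFO lines above.
    import os
    os.environ.setdefault("MPLCONFIGDIR", "/tmp/mplconfig")  # writable dir, silences the Matplotlib warning

    from ase.io import read
    from ase.spacegroup import get_spacegroup

    atoms = read("LaB6.cif")             # B6La, cubic, a = 4.157 Angstrom
    print(get_spacegroup(atoms))         # expected: Spacegroup(221, setting=1), P m -3 m
    supercell = atoms.repeat((2, 2, 2))  # INFO: supercell = [2 2 2]
    kpts = (3, 3, 3)                     # INFO: kpoints = [3 3 3]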
INFO: Computing electronic density of states.
WARNING: Failed computing the eDOS. Ignoring.
  Calculator "espresso" failed with command "/usr/bin/mpirun -np 32 /usr/bin/pw.x -npool 3 -ndiag 100.0 -in B6La.pwi > B6La.pwo" failed in /mnt/home-re-grades-02/data/material_modeling/LaB6.cif-QuantumEspresso-farhie-20250130-125114 with error code 1
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 13 in communicator MPI_COMM_WORLD with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. You may or may not see output from other processes, depending on exactly when Open MPI kills them.
--------------------------------------------------------------------------
[re-grades-01:791151] 31 more processes have sent help message help-mpi-api.txt / mpi-abort
[re-grades-01:791151] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
INFO: Computing electronic band structure 1d.
WARNING: Failed computing the 1d electronic band structure along BZ path. Ignoring.
  Calculator "espresso" failed with command "/usr/bin/mpirun -np 32 /usr/bin/pw.x -npool 3 -ndiag 100.0 -in B6La.pwi > B6La.pwo" failed in /mnt/home-re-grades-02/data/material_modeling/LaB6.cif-QuantumEspresso-farhie-20250130-125114 with error code 1
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 16 in communicator MPI_COMM_WORLD with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. You may or may not see output from other processes, depending on exactly when Open MPI kills them.
--------------------------------------------------------------------------
[re-grades-01:791449] 31 more processes have sent help message help-mpi-api.txt / mpi-abort
[re-grades-01:791449] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
INFO: Computing electronic band structure 3d.
WARNING: Failed computing the 3d electronic band structure along BZ path. Ignoring.
  Calculator "espresso" failed with command "/usr/bin/mpirun -np 32 /usr/bin/pw.x -npool 3 -ndiag 100.0 -in B6La.pwi > B6La.pwo" failed in /mnt/home-re-grades-02/data/material_modeling/LaB6.cif-QuantumEspresso-farhie-20250130-125114 with error code 1
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 25 in communicator MPI_COMM_WORLD with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. You may or may not see output from other processes, depending on exactly when Open MPI kills them.
--------------------------------------------------------------------------
[re-grades-01:791754] 31 more processes have sent help message help-mpi-api.txt / mpi-abort
[re-grades-01:791754] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 16 in communicator MPI_COMM_WORLD with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. You may or may not see output from other processes, depending on exactly when Open MPI kills them.
--------------------------------------------------------------------------
[re-grades-01:792054] 31 more processes have sent help message help-mpi-api.txt / mpi-abort
[re-grades-01:792054] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
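Every step above fails with the same command, "/usr/bin/mpirun -np 32 /usr/bin/pw.x -npool 3 -ndiag 100.0 -in B6La.pwi > B6La.pwo", so the per-step warnings are all symptoms of a single pw.x failure. The "-ndiag 100.0" argument is worth checking first: pw.x takes an integer process count for its linear-algebra group, and a value larger than the 32 MPI ranks is inconsistent anyway, though the definitive message is in B6La.pwo. A minimal sketch of how the command might be set with an integer -ndiag, assuming an ASE version that reads the ASE_ESPRESSO_COMMAND environment variable (newer releases configure this through ase.calculators.espresso.EspressoProfile instead); the pseudopotential file names are placeholders, not values from this log:

    # Sketch under the stated assumptions; only kpts=(3, 3, 3) comes from the log.
    import os
    from ase.calculators.espresso import Espresso

    os.environ["ASE_ESPRESSO_COMMAND"] = (
        "/usr/bin/mpirun -np 32 /usr/bin/pw.x -npool 3 -ndiag 16 "  # integer value, no larger than -np
        "-in PREFIX.pwi > PREFIX.pwo"
    )

    calc = Espresso(
        pseudopotentials={"La": "La.UPF", "B": "B.UPF"},  # placeholder pseudopotential names
        kpts=(3, 3, 3),                                   # INFO: kpoints = [3 3 3]
    )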
INFO: Computing phonon density of states (small displacement method, Phonopy).
WARNING: Failed computing phonons with Phonopy. Trying with ASE.
  Calculator "espresso" failed with command "/usr/bin/mpirun -np 32 /usr/bin/pw.x -npool 3 -ndiag 100.0 -in B6La.pwi > B6La.pwo" failed in /mnt/home-re-grades-02/data/material_modeling/LaB6.cif-QuantumEspresso-farhie-20250130-125114 with error code 1
WARNING: Failed computing the phonon density of states (ASE). Ignoring.
  Calculator "espresso" failed with command "/usr/bin/mpirun -np 32 /usr/bin/pw.x -npool 3 -ndiag 100.0 -in B6La.pwi > B6La.pwo" failed in /mnt/home-re-grades-02/data/material_modeling/LaB6.cif-QuantumEspresso-farhie-20250130-125114 with error code 1
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. You may or may not see output from other processes, depending on exactly when Open MPI kills them.
--------------------------------------------------------------------------
[re-grades-01:792352] 31 more processes have sent help message help-mpi-api.txt / mpi-abort
[re-grades-01:792352] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
WARNING: Failed computing the phonon dispersion 1d (ASE). Ignoring.
  Calculator "espresso" failed with command "/usr/bin/mpirun -np 32 /usr/bin/pw.x -npool 3 -ndiag 100.0 -in B6La.pwi > B6La.pwo" failed in /mnt/home-re-grades-02/data/material_modeling/LaB6.cif-QuantumEspresso-farhie-20250130-125114 with error code 1
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 8 in communicator MPI_COMM_WORLD with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. You may or may not see output from other processes, depending on exactly when Open MPI kills them.
--------------------------------------------------------------------------
[re-grades-01:792655] 31 more processes have sent help message help-mpi-api.txt / mpi-abort
[re-grades-01:792655] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
WARNING: Failed computing the phonon dispersion 3d (ASE). Ignoring.
  Calculator "espresso" failed with command "/usr/bin/mpirun -np 32 /usr/bin/pw.x -npool 3 -ndiag 100.0 -in B6La.pwi > B6La.pwo" failed in /mnt/home-re-grades-02/data/material_modeling/LaB6.cif-QuantumEspresso-farhie-20250130-125114 with error code 1
INFO: Saving results into file:///var/www/html/data/material-modeling/LaB6.cif-QuantumEspresso-farhie-20250130-125114/ [Thu Jan 30 12:51:56 2025]
INFO: Report available at file:///var/www/html/data/material-modeling/LaB6.cif-QuantumEspresso-farhie-20250130-125114//README.html
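The run completes and writes a report, but every quantity is missing for the same reason, and the actual pw.x error text is not part of this log. It should be near the end of B6La.pwo, or in the CRASH file Quantum ESPRESSO writes next to it, inside the working directory quoted in the warnings. A small sketch for pulling out the error banner, assuming those standard Quantum ESPRESSO file names:

    # Sketch only; the directory and output file name are copied from the failing command above.
    from pathlib import Path

    workdir = Path("/mnt/home-re-grades-02/data/material_modeling/"
                   "LaB6.cif-QuantumEspresso-farhie-20250130-125114")
    lines = (workdir / "B6La.pwo").read_text(errors="replace").splitlines()
    for i, line in enumerate(lines):
        if line.lstrip().startswith("%%%%"):   # QE wraps fatal errors in %%%% banner lines
            print("\n".join(lines[i:i + 8]))   # print the banner and the message inside it
            break
    else:
        print("No error banner found; check the CRASH file or the mpirun stderr.")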