Configurations for SPARC-X-API

SPARC-X-API requires the following components to be fully functional:

  • A JSON schema parsed from SPARC LaTeX documentation

  • Pseudopotential files

  • SPARC C/C++ binary

Default configurations

SPARC-X-API is designed to automate the discovery of these configurations. The default configurations are:

  • JSON schema: <sparc-x-api-root>/sparc_json_api/parameters.json

  • Pseudopotential files: <sparc-x-api-root>/psp/*.psp8 (if installed via conda)

  • Command to run SPARC binary: mpirun sparc (if sparc is found in $PATH)

Custom configurations

You can configure the setup for SPARC-X-API using either of the following methods:

  1. (Recommended) use the ASE configuration file

  2. Use environment variables. Note that although environment variables have long been the standard approach for configuring ASE calculators, they may become obsolete in future releases of ASE.

Note

If both are set, the environment variables have higher priority than the equivalent fields in the configuration file.

Editing the configuration file

ASE will look for a configuration file at ~/.config/ase/config.ini for package-specific settings. The configuration file follows the INI format, where key-value pairs are grouped in sections. An example of the SPARC-specific section may look as follows:

[sparc]
; `command`: full shell command (include MPI directives) to run SPARC calculation
;            has the same effect as `ASE_SPARC_COMMAND`
command = srun -n 24 ~/bin/sparc

; `psp_path`: directory containing pseudopotential files
;            has the same effect as `SPARC_PSP_PATH`
psp_path = ~/dev_SPARC/psps

; `doc_path`: directory for SPARC LaTeX documentation
;             has the same effect as `SPARC_DOC_PATH`
doc_path = ~/dev_SPARC/doc/.LaTeX/

The available options in the configuration file are:

  1. SPARC command: either use command to set a full shell string to run the SPARC program, or use the combination of sparc_exe and mpi_prefix. See SPARC command configuration for more details.

  2. JSON schema settings (Optional): use either json_schema to define a custom JSON schema file, or doc_path for parsing the LaTeX documentation on-the-fly. See JSON schema configuration for more details.

  3. Pseudopotential settings (Optional): use psp_path for the location of pseudopotential files. See pseudopotential files settings for more details.

You can override the location of the configuration file with the environment variable ASE_CONFIG_PATH.
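
To quickly check which SPARC-related fields are being picked up, a minimal sketch using only the Python standard library (and the default location plus the ASE_CONFIG_PATH override described above) could look like this:

import configparser
import os
from pathlib import Path

# Resolve the configuration file location, honoring the ASE_CONFIG_PATH override
config_path = Path(
    os.environ.get("ASE_CONFIG_PATH", "~/.config/ase/config.ini")
).expanduser()

parser = configparser.ConfigParser()
parser.read(config_path)

if parser.has_section("sparc"):
    for key, value in parser.items("sparc"):
        print(f"{key} = {value}")
else:
    print(f"No [sparc] section found in {config_path}")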

JSON schema

Each version of SPARC-X-API ships with a JSON schema compatible with a dated version of the SPARC C/C++ code. You can find that version in the README badge, as shown below:

API badge

If that does not match your local SPARC version, you can configure the location of the JSON schema using one of the following methods:

Option 1. Parse LaTeX documentation on-the-fly

The environment variable SPARC_DOC_PATH (equivalent to the doc_path field in the configuration file) directs SPARC-X-API to a local directory containing the LaTeX documentation, which is parsed on-the-fly, for example:

export SPARC_DOC_PATH=<local-SPARC-dir>/doc/.LaTeX

or the equivalent configuration file setting:

doc_path: <local-SPARC-dir>/doc/.LaTeX

Option 2. Use your own parameters.json

In some cases an experimental feature may not yet be covered in the official documentation. You can create and edit your own parameters.json file to temporarily test a local version of SPARC:

First, parse the LaTeX files into parameters.json:

python -m sparc.docparser  <local-SPARC-dir>/doc/.LaTeX \
                           --output parameters.json \
                           --include-subdirs

Then add or edit the missing parameters in the parameters section of parameters.json; see the example entry below from the existing file:

"ACE_FLAG": {
   "symbol": "ACE_FLAG",
   "label": "ACE_FLAG",
   "type": "integer",
   "default": 1,
   "unit": "No unit",
   "example": "ACE_FLAG: 0",
   "description": "Use ACE operator to accelarte the hybrid calculation.",
   "remark": "Without ACE operator, the hybrid calculation will be way slower than with it on depending on the system size.",
   "allow_bool_input": true,
   "default_remark": "1",
   "description_raw": "Use ACE operator to accelarte the hybrid calculation.",
   "remark_raw": "Without ACE operator, the hybrid calculation will be way slower than with it on depending on the system size.",
   "category": "scf"
  },

Finally, set the json_schema field in the configuration file to the newly generated JSON file, for example:

[sparc]
; `json_schema`: custom schema file parsed from LaTeX documentation
json_schema: ~/SPARC/parameters.json

Warning

The json_schema and doc_path fields cannot both be present in the configuration file!
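
To sanity-check a hand-edited schema before using it, the short sketch below simply loads the file and looks up one entry; the ~/SPARC/parameters.json path, the top-level parameters section, and the ACE_FLAG key are taken from the examples above:

import json
from pathlib import Path

# Path to the custom schema; matches the json_schema example above
schema_path = Path("~/SPARC/parameters.json").expanduser()

with open(schema_path) as fd:
    schema = json.load(fd)

params = schema["parameters"]
print(f"{len(params)} parameters found in {schema_path}")

# Confirm that the (possibly hand-edited) entry is present
print(json.dumps(params.get("ACE_FLAG", "ACE_FLAG missing!"), indent=2))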

Pseudopotential files

To specify a custom path for your pseudopotential files (in Abinit psp8 format), you can either use the environment variable SPARC_PSP_PATH:

export SPARC_PSP_PATH=path/to/your/psp8/directory

or the equivalent keyword psp_path in the configuration file:

psp_path: path/to/your/psp8/directory

When installing SPARC via conda-forge, $SPARC_PSP_PATH is already set in the activation script of the conda environment.

To determine the location of the default psp8 files (as in a manual pip installation), run the following code:

python -c "from sparc.common import psp_dir; print(psp_dir)"

SPARC command configuration

The command to execute SPARC calculations is determined based on the following priority:

  1. The command argument provided directly to the sparc.SPARC calculator (see the sketch after this list).

  2. The command defined by the environment variable ASE_SPARC_COMMAND or by the command field in the configuration file.

  3. If neither of the above is defined, SPARC-X-API looks for a SPARC binary in the current $PATH and combines it with a suitable MPI command prefix.
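
For reference, a minimal sketch of option 1 (passing the command directly to the sparc.SPARC calculator) might look as follows; the MPI flags and binary path are placeholders to adapt to your own machine:

from ase.build import bulk

from sparc import SPARC

atoms = bulk("Al", cubic=True)
# The command given here takes precedence over ASE_SPARC_COMMAND and the
# configuration file; replace the placeholder path with your SPARC binary.
atoms.calc = SPARC(command="mpirun -n 8 /path/to/sparc")
print(atoms.get_potential_energy())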

Use full command

The variable ASE_SPARC_COMMAND (or the command field in the configuration file) contains the full command to run a SPARC calculation. Depending on the system, you may choose one of the following:

  1. Using mpirun (e.g. on a single test machine)

export ASE_SPARC_COMMAND="mpirun -n 8 -mca orte_abort_on_non_zero_status 1 /path/to/sparc -name PREFIX"

or in the configuration file:

command: mpirun -n 8 -mca orte_abort_on_non_zero_status 1 /path/to/sparc -name PREFIX

  2. Using srun (e.g. on SLURM-based HPC systems)

export ASE_SPARC_COMMAND="srun -n 8 --kill-on-bad-exit /path/to/sparc -name PREFIX"

or in the configuration file:

command: srun -n 8 --kill-on-bad-exit /path/to/sparc -name PREFIX

Note

  1. The -name PREFIX part is optional and will be automatically replaced by the sparc.SPARC calculator.

  2. We recommend adding kill switches to your MPI commands, as in the examples above, to avoid unexpected behavior with exit signals when running SPARC.

Specifying MPI binary location

It is also possible to construct the SPARC command from two separate fields in the configuration file: sparc_exe, the path to the SPARC binary, and mpi_prefix, the MPI launcher and its options. SPARC-X-API combines the two into the full command, so that, for example, mpi_prefix = srun -n 24 together with sparc_exe = ~/bin/sparc is equivalent to setting command = srun -n 24 ~/bin/sparc.

Post-installation check

We recommend running a simple test after installation and setup to make sure everything works:

python -m sparc.quicktest

A proper setup will display the following in the output’s Summary section:

--------------------------------------------------------------------------------
Summary
--------------------------------------------------------------------------------
Configuration
psp_dir: /home/pink/Dev/SPARC/psps
api_version: 2024.10.14
api_source: {'path': '/home/pink/Dev/SPARC-X-API/sparc/sparc_json_api/parameters.json', 'type': 'json'}
command: mpirun -n 2 /home/pink/Dev/dev_SPARC/lib/sparc
sparc_version: 2024.10.14
sparc_socket_compatibility: True
--------------------------------------------------------------------------------
Tests
Import:  PASS
Pseudopotential:  PASS
JSON API:  PASS
SPARC Command:  PASS
Calculation (File I/O):  PASS
Calculation (UNIX socket):  PASS
--------------------------------------------------------------------------------

If some tests did not pass, check the output for error messages and troubleshooting hints, such as in the example below with a mis-configured command.

--------------------------------------------------------------------------------
Summary
--------------------------------------------------------------------------------
Configuration
psp_dir: /home/pink/Dev/SPARC/psps
api_version: 2024.10.14
api_source: {'path': '/home/pink/Dev/dev_SPARC/doc/.LaTeX', 'type': 'latex'}
command: mpirun -n 4 /home/pink/Dev/dev_SPARC/lib/sparc
sparc_version: NaN
sparc_socket_compatibility: False
--------------------------------------------------------------------------------
Tests
Import:  PASS
Pseudopotential:  PASS
JSON API:  PASS
SPARC Command:  FAIL
Calculation (File I/O):  FAIL
Calculation (UNIX socket):  FAIL
--------------------------------------------------------------------------------
Some tests failed! Please check the following information.

SPARC Command:
Error detecting SPARC version
- The command prefix to run SPARC calculation should look like
  `<mpi instructions> <sparc binary>`
- Use $ASE_SPARC_COMMAND to set the command string
- Check HPC resources and compatibility (e.g. `srun` on a login node)


Calculation (File I/O):
Simple calculation in file I/O mode failed: 
SPARC failed with command mpirun -n 4 /home/pink/Dev/dev_SPARC/lib/sparc -name SPARCwith error code 1
- Check if settings for pseudopotential files are correct
- Check if SPARC binary exists and functional
- Check if specific HPC requirements are met:
  (module files, libraries, parallel settings, resources)


Calculation (UNIX socket):
Simple calculation in socket mode (UNIX socket) failed: 
Cannot find the sparc executable! Please make sure you have the correct setup
- The same as error handling in file I/O calculation test
- Check if SPARC binary supports socket


--------------------------------------------------------------------------------
Please check additional information from:
1. SPARC's documentation: https://github.com/SPARC-X/SPARC/blob/master/doc/Manual.pdf
2. Python API documentation: https://github.com/SPARC-X/SPARC-X-API/blob/master/README.md

Note

  1. When using SPARC-X-API to parse SPARC files, it’s essential that at least the “Import” and “JSON API” tests are successful.

  2. For running SPARC calculations, “SPARC Command” and “Calculation (File I/O)” must also succeed.

  3. “Calculation (UNIX socket)” ensures the SPARC binary is compatible with socket communication, see calculation in socket mode.

If you run into further problems, consult our troubleshooting guidelines or raise an issue.