
Plugins

What a plugin can do

  • Add a new class to AiiDA's :ref:`entry point groups <topics:plugins:entrypointgroups>`, including: calculations, parsers, workflows, data types, verdi commands, schedulers, transports and importers/exporters from external databases. This typically involves subclassing the respective base class that AiiDA provides for that purpose.
  • Install new commandline and/or GUI executables
  • Depend on, and build on top of any number of other plugins (as long as their requirements do not clash)

What a plugin should not do

An AiiDA plugin should not:

  • Change the database schema AiiDA uses
  • Use protected functions, methods or classes of AiiDA (those starting with an underscore _)
  • Monkey patch anything within the aiida namespace (or the namespace itself)

Failure to comply will likely prevent your plugin from being listed on the official AiiDA plugin registry.

If you find yourself in a situation where you feel like you need to do any of the above, please open an issue on the AiiDA repository and we can try to advise on how to proceed.

Guidelines for plugin design

CalcJob & Parser plugins

The following guidelines are useful to keep in mind when wrapping external codes:

What is an entry point?

The setuptools package (used by pip) has a feature called entry points, which allows associating a string (the entry point identifier) with any Python object defined inside a Python package. Entry points are defined in the pyproject.toml file, for example:

...
[project.entry-points."aiida.data"]
# entry point name = "path.to.python.object"
"mycode.mydata" = "aiida_mycode.data.mydata:MyData"
...

Here, we add a new entry point mycode.mydata to the entry point group aiida.data. The entry point identifier points to the MyData class inside the file mydata.py, which is part of the aiida_mycode package.
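To make the three parts of such a specification concrete, they map directly onto the fields of the standard library's importlib.metadata.EntryPoint record; the following sketch constructs one by hand (the names are the hypothetical ones from the example above, so do not call load() on it):

```python
from importlib.metadata import EntryPoint

# Build an EntryPoint record by hand to show the three parts of the
# specification: the entry point name, the "module:attribute" value,
# and the group it belongs to.
ep = EntryPoint(
    name='mycode.mydata',
    value='aiida_mycode.data.mydata:MyData',
    group='aiida.data',
)

# The value splits into the module path and the attribute within it.
print(ep.module)  # aiida_mycode.data.mydata
print(ep.attr)    # MyData
```

Calling ep.load() would import the module and return the MyData class, which is exactly what AiiDA's factories do under the hood.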

When installing a python package that defines entry points, the entry point specifications are written to a file inside the distribution's .egg-info folder. setuptools provides the pkg_resources package for querying these entry point specifications by distribution, by entry point group and/or by entry point name, and for loading the object to which a specification points.

Why entry points?

AiiDA defines a set of entry point groups (see :ref:`topics:plugins:entrypointgroups` below). By inspecting the entry points added to these groups by AiiDA plugins, AiiDA can offer uniform interfaces to interact with them. For example:

  • verdi plugin list aiida.workflows provides an overview of all workflows installed by AiiDA plugins. Users can inspect the inputs/outputs of each workflow using the same command without having to study the documentation of the plugin.
  • The DataFactory, CalculationFactory and WorkflowFactory methods allow instantiating new classes through a simple short string (e.g. quantumespresso.pw). Users don't need to remember exactly where in the plugin package the class resides, and plugins can be refactored without users having to re-learn the plugin's API.

AiiDA entry point groups

Below, we list the entry point groups defined and searched by AiiDA. You can get the same list as the output of verdi plugin list.

aiida.calculations

Entry points in this group are expected to be subclasses of :py:class:`aiida.engine.CalcJob <aiida.engine.processes.calcjobs.calcjob.CalcJob>`. This replaces the previous method of placing a python module with the class in question inside the aiida/orm/calculation/job subpackage.

Example entry point specification:

[project.entry-points."aiida.calculations"]
"mycode.mycode" = "aiida_mycode.calcs.mycode:MycodeCalculation"

aiida_mycode/calcs/mycode.py:

from aiida.engine import CalcJob
class MycodeCalculation(CalcJob):
   ...

This leads to the following usage:

from aiida.plugins import CalculationFactory
calc = CalculationFactory('mycode.mycode')

aiida.parsers

AiiDA expects a subclass of Parser. This replaces the previous approach of placing a parser module under aiida/parsers/plugins.

Example spec:

[project.entry-points."aiida.parsers"]
"mycode.myparser" = "aiida_mycode.parsers.mycode:MycodeParser"

aiida_mycode/parsers/mycode.py:

from aiida.parsers import Parser
class MycodeParser(Parser):
   ...

Usage:

from aiida.plugins import ParserFactory
parser = ParserFactory('mycode.myparser')

aiida.data

Group for :py:class:`~aiida.orm.nodes.data.data.Data` subclasses. Previously located in a subpackage of aiida/orm/data.

Spec:

[project.entry-points."aiida.data"]
"mycode.mydata" = "aiida_mycode.data.mydata:MyData"

aiida_mycode/data/mydata.py:

from aiida.orm import Data
class MyData(Data):
   ...

Usage:

from aiida.plugins import DataFactory
params = DataFactory('mycode.mydata')

aiida.workflows

Package AiiDA workflows as follows:

Spec:

[project.entry-points."aiida.workflows"]
"mycode.mywf" = "aiida_mycode.workflows.mywf:MyWorkflow"

aiida_mycode/workflows/mywf.py:

from aiida.engine import WorkChain
class MyWorkflow(WorkChain):
   ...

Usage:

from aiida.plugins import WorkflowFactory
wf = WorkflowFactory('mycode.mywf')

Note

For old-style workflows the entry point mechanism of the plugin system is not supported, so these workflows cannot be loaded with the WorkflowFactory. The only way to run them is to store their source code in the aiida/workflows/user directory and use normal Python imports to load the classes.

aiida.cmdline

verdi uses the click_ framework, which makes it possible to add new subcommands to existing verdi commands, such as verdi data mydata. AiiDA expects each entry point to be either a click.Command or click.Group. At present extra commands can be injected at the following levels:

Spec for verdi data:

[project.entry-points."aiida.cmdline.data"]
"mydata" = "aiida_mycode.commands.mydata:mydata"

aiida_mycode/commands/mydata.py:

import click

@click.group()
def mydata():
   """commandline help for mydata command"""

@mydata.command('animate')
@click.option('--format')
@click.argument('pk')
def create_fancy_animation(format, pk):
   """help"""
   ...

Usage:

verdi data mydata animate --format=Format PK

Spec for verdi data core.structure import:

[project.entry-points."aiida.cmdline.data.structure.import"]
"myformat" = "aiida_mycode.commands.myformat:myformat"

aiida_mycode/commands/myformat.py:

import click

@click.command()
@click.argument('filename', type=click.File('r'))
def myformat(filename):
   """commandline help for myformat import command"""
   ...

Usage:

verdi data core.structure import myformat a_file.myfmt

aiida.tools.dbexporters

If your plugin package adds support for exporting to an external database, use this entry point to have AiiDA find the module where you define the necessary functions.
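No example specification is shown for this group; following the pattern used for the other groups, a registration would look like the following (the package and module names are hypothetical, and per the description above the value names a module rather than a class):

```toml
[project.entry-points."aiida.tools.dbexporters"]
"mydb" = "aiida_mydb.dbexporter"
```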

aiida.tools.dbimporters

If your plugin package adds support for importing from an external database, use this entry point to have AiiDA find the module where you define the necessary functions.
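As for the exporters, no example specification is shown; a hypothetical registration following the pattern of the other groups would be:

```toml
[project.entry-points."aiida.tools.dbimporters"]
"mydb" = "aiida_mydb.dbimporter"
```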

aiida.schedulers

We recommend naming the plugin package after the scheduler (e.g. aiida-myscheduler), so that the entry point name can simply equal the name of the scheduler:

Spec:

[project.entry-points."aiida.schedulers"]
"myscheduler" = "aiida_myscheduler.myscheduler:MyScheduler"

aiida_myscheduler/myscheduler.py:

from aiida.schedulers import Scheduler
class MyScheduler(Scheduler):
   ...

Usage: The scheduler is used in the familiar way by entering 'myscheduler' as the scheduler option when setting up a computer.

aiida.transports

aiida-core ships with two modes of transporting files and folders to remote computers: core.ssh and core.local (a stub for when the remote computer is actually the same machine). We recommend naming the plugin package after the mode of transport (e.g. aiida-mytransport), so that the entry point name can simply equal the name of the transport:

Spec:

[project.entry-points."aiida.transports"]
"mytransport" = "aiida_mytransport.mytransport:MyTransport"

aiida_mytransport/mytransport.py:

from aiida.transports import Transport
class MyTransport(Transport):
   ...

Usage:

from aiida.plugins import TransportFactory
transport = TransportFactory('mytransport')

When setting up a new computer, specify mytransport as the transport mode.

Plugin test fixtures

When developing AiiDA plugin packages, it is recommended to use pytest as the unit test library, which is the de-facto standard in the Python ecosystem. It provides a number of fixtures that make it easy to set up and write tests. aiida-core also provides a number of fixtures that are specific to AiiDA and make it easy to test various sorts of plugins.

To make use of these fixtures, create a conftest.py file in your tests folder and add the following code:

pytest_plugins = ['aiida.manage.tests.pytest_fixtures']

Just by adding this line, the fixtures that are provided by the :mod:`~aiida.manage.tests.pytest_fixtures` module are automatically imported. The module provides the following fixtures:

aiida_manager

Return the global instance of the :class:`~aiida.manage.manager.Manager`. Can be used, for example, to retrieve the current :class:`~aiida.manage.configuration.config.Config` instance:

def test(aiida_manager):
   aiida_manager.get_config().get_option('logging.aiida_loglevel')

aiida_profile

This fixture ensures that an AiiDA profile is loaded with an initialized storage backend, such that data can be stored. The fixture is session-scoped and has autouse=True set, so it is automatically enabled for the test session.

By default, the fixture will generate a completely temporary independent AiiDA instance and test profile. This includes:

  • A temporary .aiida configuration folder with configuration files
  • A temporary PostgreSQL cluster
  • A temporary test profile complete with storage backend (creates a database in the temporary PostgreSQL cluster)

The temporary test instance and profile are automatically destroyed at the end of the test session. The fixture guarantees that no changes are made to the actual instance of AiiDA with its configuration and profiles.

The creation of the temporary instance and profile takes a few seconds to set up at the beginning of the test suite. It is possible to avoid this by creating a dedicated test profile once and telling the fixture to use that instead of generating one each time:

  • Create a profile by using verdi setup or verdi quicksetup, specifying the --test-profile flag
  • Set the AIIDA_TEST_PROFILE environment variable to the name of the test profile: export AIIDA_TEST_PROFILE=<test-profile-name>

Although the fixture is enabled automatically, so there is no need to explicitly pass it into a test function, requesting it explicitly can still be useful, as it can be used to clean the storage backend of all data:

def test(aiida_profile):
   from aiida.orm import Data, QueryBuilder

   Data().store()
   assert QueryBuilder().append(Data).count() != 0

   # The following call clears the storage backend, deleting all data, except for the default user.
   aiida_profile.clear_profile()

   assert QueryBuilder().append(Data).count() == 0

aiida_profile_clean

Provides a loaded test profile through aiida_profile but empties the storage before calling the test function. Note that a default user will be inserted into the database after cleaning it.

def test(aiida_profile_clean):
   """The profile storage is guaranteed to be emptied at the start of this test."""

This functionality can be useful if it is easier to set up and write the test when there is no pre-existing data. However, cleaning the storage may take a non-negligible amount of time, so only use it when really needed in order to keep tests running as fast as possible.

aiida_profile_clean_class

Provides the same as aiida_profile_clean but with scope=class. Should be used for a test class:

@pytest.mark.usefixtures('aiida_profile_clean_class')
class TestClass:

    def test(self):
        ...

The storage is cleaned once when the class is initialized.

aiida_profile_factory

Create a temporary profile, add it to the config of the loaded AiiDA instance and load the profile. Can be useful to create a test profile for a custom storage backend:

@pytest.fixture(scope='session')
def custom_storage_profile(aiida_profile_factory) -> Profile:
    """Return a test profile for a custom :class:`~aiida.orm.implementation.storage_backend.StorageBackend`"""
    from some_module import CustomStorage
    configuration = {
        'storage': {
            'backend': 'plugin_package.custom_storage',
            'config': {
                'username': 'joe',
                'api_key': 'super-secret-key'
            }
        }
    }
    yield aiida_profile_factory(configuration)

Note that the configuration above is not actually functional and the actual configuration depends on the storage implementation that is used.

aiida_instance

Return the :class:`~aiida.manage.configuration.config.Config` instance that is used for the test session.

def test(aiida_instance):
    aiida_instance.get_option('logging.aiida_loglevel')

config_psql_dos

Return a profile configuration for the :class:`~aiida.storage.psql_dos.backend.PsqlDosBackend`. This can be used in combination with the aiida_profile_factory fixture to create a test profile with customised database parameters:

@pytest.fixture(scope='session')
def psql_dos_profile(aiida_profile_factory, config_psql_dos) -> Profile:
    """Return a test profile configured for the :class:`~aiida.storage.psql_dos.PsqlDosStorage`."""
    configuration = config_psql_dos()
    configuration['storage']['config']['repository_uri'] = '/some/custom/path'
    yield aiida_profile_factory(configuration)

Note that this is only useful if the storage configuration needs to be customized. If any configuration works, simply use the aiida_profile fixture straight away, which uses the PsqlDosStorage storage backend by default.

postgres_cluster

Create a temporary and isolated PostgreSQL cluster using pgtest and clean it up after the yield.

@pytest.fixture()
def custom_postgres_cluster(postgres_cluster):
    yield postgres_cluster(
        database_name='some-database-name',
        database_username='guest',
        database_password='guest',
    )

aiida_localhost

This fixture is useful if a test requires a :class:`~aiida.orm.computers.Computer` instance. It returns a :class:`~aiida.orm.computers.Computer` that represents the localhost.

def test(aiida_localhost):
    aiida_localhost.get_minimum_job_poll_interval()

aiida_local_code_factory

This fixture is useful if a test requires an :class:`~aiida.orm.nodes.data.code.installed.InstalledCode` instance. For example:

def test(aiida_local_code_factory):
    code = aiida_local_code_factory(
        entry_point='core.arithmetic.add',
        executable='/usr/bin/bash'
    )

By default, it will use the localhost computer returned by the aiida_localhost fixture.

aiida_computer

This fixture should be used to create and configure a :class:`~aiida.orm.computers.Computer` instance. The fixture provides a factory that can be called without any arguments:

def test(aiida_computer):
    from aiida.orm import Computer
    computer = aiida_computer()
    assert isinstance(computer, Computer)

By default, the localhost is used for the hostname and a random label is generated.

def test(aiida_computer):
    custom_label = 'custom-label'
    computer = aiida_computer(label=custom_label)
    assert computer.label == custom_label

First, the database is queried to see whether a computer with the given label already exists. If found, the existing computer is returned; otherwise a new instance is created.

The returned computer is also configured for the current default user. The configuration can be customized through the configuration_kwargs dictionary:

def test(aiida_computer):
    configuration_kwargs = {'safe_interval': 0}
    computer = aiida_computer(configuration_kwargs=configuration_kwargs)
    assert computer.get_minimum_job_poll_interval() == 0

aiida_computer_local

This fixture is a shortcut for aiida_computer to set up the localhost with local transport:

def test(aiida_computer_local):
    localhost = aiida_computer_local()
    assert localhost.hostname == 'localhost'
    assert localhost.transport_type == 'core.local'

To leave a newly created computer unconfigured, pass configure=False:

def test(aiida_computer_local):
    localhost = aiida_computer_local(configure=False)
    assert not localhost.is_configured

Note that if the computer already exists and was configured before, it won't be unconfigured. If you need a guarantee that the computer is not configured, make sure to clean the database before the test or use a unique label:

def test(aiida_computer_local):
    import uuid
    localhost = aiida_computer_local(label=str(uuid.uuid4()), configure=False)
    assert not localhost.is_configured

aiida_computer_ssh

This fixture is a shortcut for aiida_computer to set up the localhost with SSH transport:

def test(aiida_computer_ssh):
    localhost = aiida_computer_ssh()
    assert localhost.hostname == 'localhost'
    assert localhost.transport_type == 'core.ssh'

This can be useful if the functionality that needs to be tested involves testing the SSH transport, but these use-cases should be rare outside of aiida-core. To leave a newly created computer unconfigured, pass configure=False:

def test(aiida_computer_ssh):
    localhost = aiida_computer_ssh(configure=False)
    assert not localhost.is_configured

Note that if the computer already exists and was configured before, it won't be unconfigured. If you need a guarantee that the computer is not configured, make sure to clean the database before the test or use a unique label:

def test(aiida_computer_ssh):
    import uuid
    localhost = aiida_computer_ssh(label=str(uuid.uuid4()), configure=False)
    assert not localhost.is_configured

submit_and_await

This fixture is useful when testing the submission of processes to the daemon. It submits the process to the daemon and waits until it has reached a certain state. By default it waits for the process to reach ProcessState.FINISHED:

def test(aiida_local_code_factory, submit_and_await):
    from aiida import orm

    code = aiida_local_code_factory('core.arithmetic.add', '/usr/bin/bash')
    builder = code.get_builder()
    builder.x = orm.Int(1)
    builder.y = orm.Int(1)
    node = submit_and_await(builder)
    assert node.is_finished_ok

Note that the fixture automatically depends on the started_daemon_client fixture to guarantee the daemon is running.

started_daemon_client

This fixture ensures that the daemon for the test profile is running and returns an instance of the :class:`~aiida.engine.daemon.client.DaemonClient` which can be used to control the daemon.

def test(started_daemon_client):
    assert started_daemon_client.is_daemon_running

stopped_daemon_client

This fixture ensures that the daemon for the test profile is stopped and returns an instance of the :class:`~aiida.engine.daemon.client.DaemonClient` which can be used to control the daemon.

def test(stopped_daemon_client):
    assert not stopped_daemon_client.is_daemon_running

daemon_client

Return a :class:`~aiida.engine.daemon.client.DaemonClient` instance that can be used to control the daemon:

def test(daemon_client):
    daemon_client.start_daemon()
    assert daemon_client.is_daemon_running
    daemon_client.stop_daemon(wait=True)

The fixture is session scoped. At the end of the test session, this fixture automatically shuts down the daemon if it is still running.

entry_points

Return a :class:`~aiida.manage.tests.pytest_fixtures.EntryPointManager` instance to add and remove entry points.

def test_parser(entry_points):
    """Test a custom ``Parser`` implementation."""
    from aiida.parsers import Parser
    from aiida.plugins import ParserFactory

    class CustomParser(Parser):
        """Parser implementation."""

    entry_points.add(CustomParser, 'custom.parser')

    assert ParserFactory('custom.parser') is CustomParser

Any entry point additions and removals are automatically undone at the end of the test.