
Hashing, caching and fast-forwarding #652

Merged
merged 122 commits on Feb 9, 2018
Commits
122 commits
ce539d8
#119 added first small test to check for node hashing
lekah Jun 1, 2017
7589afd
#119 Added small functions get_hash and get_same_node to Node
lekah Jun 1, 2017
13caff2
#119 Added functionality to store method (only for Django backend) th…
lekah Jun 1, 2017
ca5f047
#119 Try-except clause if hashing fails, and also check for None for …
lekah Jun 2, 2017
87e546f
#119 Fix in test that was failing because there is now an additional …
lekah Jun 2, 2017
39d378e
* Failing test for two small but unequal floats
greschd Jun 15, 2017
f7e6fe0
Merge pull request #591 from greschd/node-hashing
lekah Jun 16, 2017
28af8ed
Merge branch 'develop' of github.com:aiidateam/aiida_core into node-h…
greschd Jul 20, 2017
f12c3cb
Merge branch 'node-hashing' of github.com:aiidateam/aiida_core into n…
greschd Jul 20, 2017
219d53b
Give name to '_' in loop
greschd Jul 20, 2017
76acbb6
Update hashing algorithm
greschd Jul 20, 2017
2de9bc8
Fix get_hash for special case of ArrayData
greschd Jul 20, 2017
a4ae949
Fix ArrayData issue by hashing folder content for 'pathlib.Path'
greschd Jul 20, 2017
516c44e
Explicitly raise error when non-directory path is hashed
greschd Jul 20, 2017
dcb4a38
Add pathlib2 requirement
greschd Jul 20, 2017
bbe4c33
Print modulename to check failing Travis test
greschd Jul 20, 2017
fb182cc
Try moving ArrayData import inside the test
greschd Jul 20, 2017
dc93962
Move numpy import to test
greschd Jul 20, 2017
5fef7f2
Add comma after checksumdir requirement
greschd Jul 20, 2017
f04d3bb
Add find_same to store_all, add to SQLAlchemy
greschd Jul 20, 2017
1bc7c90
Simplify get_same_node
greschd Jul 20, 2017
c5df6bf
Set hash extra on SQLAlchemy nodes
greschd Jul 20, 2017
e5c72a1
Change how extra is stored on SQLAlchemy node
greschd Jul 20, 2017
a896eb1
Use Folder interface for hashing the repository folder
greschd Jul 20, 2017
d13f8dd
Add logic to ignore attributes
greschd Jul 20, 2017
533e968
Add tests for FolderData with empty files and folders.
greschd Jul 20, 2017
55d1ad0
Add functionality to set the default of caching True / False
greschd Jul 21, 2017
29f14ba
Merge branch 'develop' of github.com:aiidateam/aiida_core into node-h…
greschd Jul 21, 2017
7a0d4ef
Add contextmanager to enable / disable caching
greschd Jul 21, 2017
148770d
Change caching default in store methods to False.
greschd Jul 21, 2017
a3bc4b5
Add module.__version__ to hash of a Node
greschd Jul 21, 2017
bfad737
Add _is_valid_cache, to exclude failed calculations
greschd Aug 2, 2017
6f37ddd
Add debug statements in subcalss run_until_complete
greschd Aug 7, 2017
ffbdd73
Merge branch 'develop' of github.com:aiidateam/aiida_core into node-h…
greschd Aug 11, 2017
06acb5c
Implement fast-forwarding in _create_and_setup_db_record
greschd Aug 14, 2017
c4027ff
Add inputs hash for WorkCalculation Nodes
greschd Aug 14, 2017
6693152
Add tests for WorkChain fast-forwarding
greschd Aug 14, 2017
8e56a54
Add more tests to WorkChain forwarding
greschd Aug 15, 2017
e7493c7
Add Travis hack to install plum from my fork
greschd Aug 15, 2017
ae485ac
Add insert_data to fast-forwarding workchain test setUp method
greschd Aug 15, 2017
fb8367e
Call tearDownClass and setUpClass in fast-forwarding tearDown
greschd Aug 15, 2017
33e6d0d
Add _is_valid_cache method to exclude invalid Nodes (e.g. failed Calc…
greschd Aug 2, 2017
404a6f6
Remove unused _use_cache attribute
greschd Aug 15, 2017
62b8880
Merge branch 'develop' of github.com:aiidateam/aiida_core into node-h…
greschd Aug 16, 2017
471dcfa
Add 'ignore_errors' option to get_hash, raise errors instead of retur…
greschd Aug 16, 2017
f076bfe
Add fast-forwarding for JobCalculations
greschd Aug 16, 2017
ba052a5
Implement _is_valid_cache at AbstractCalculation level, add abstractm…
greschd Aug 16, 2017
3b02796
Add option to configure caching / fast-forwarding via config file
greschd Aug 17, 2017
509b7e8
Set Py2 requirements via environment marker
greschd Aug 17, 2017
eb0289f
Merge branch 'develop' of github.com:aiidateam/aiida_core into node-h…
greschd Aug 17, 2017
b79fe10
Add _get_objects_to_hash method, to de-duplicate get_hash code
greschd Aug 17, 2017
f989e92
Add 'expanduser' to getting the cache config
greschd Aug 17, 2017
d64add2
- Add **kwargs to make_hash
greschd Aug 20, 2017
f1d7171
Add re-hash command
greschd Aug 20, 2017
7b460ab
Add '.' for every 100 re-hashed nodes
greschd Aug 20, 2017
1df9128
Get only INPUT link in get_inputs_dict for objects to hash
greschd Aug 21, 2017
58b2195
Add 'CALL' to ignored inputs
greschd Aug 21, 2017
7df259f
Add option to specify PKs (or --all) in verdi rehash.
greschd Oct 19, 2017
4168fa0
Simplify verdi rehash code
greschd Oct 20, 2017
98e7aaf
Merge branch 'develop' of github.com:aiidateam/aiida_core into node-h…
greschd Oct 27, 2017
537cd89
Run pre-commit
greschd Oct 27, 2017
d809be6
Update plumpy requirement to 0.7.11
greschd Oct 30, 2017
5a63309
Install plumpy before AiiDA on travis build
greschd Oct 30, 2017
42cde39
Update RTD requirements
greschd Oct 30, 2017
e96cd6b
Ignore '_updatable_attributes' in addition to '_hash_ignored_attribut…
greschd Oct 30, 2017
afbb3c5
Enable use of the cache_config for use_cache
greschd Oct 31, 2017
28e4d60
Add tests for caching configuration.
greschd Oct 31, 2017
e8a7d2b
Merge branch 'develop' of github.com:aiidateam/aiida_core into node-h…
greschd Oct 31, 2017
e04b9de
Merge branch 'develop' into node-hashing
greschd Oct 31, 2017
39ac450
Merge branch 'node-hashing' of github.com:greschd/aiida_core into nod…
greschd Oct 31, 2017
18cf330
Move caching config tests to db tests
greschd Oct 31, 2017
050eb38
Implement cache lookup in general/node.py instead of db implementations
greschd Oct 31, 2017
74a358a
Use 'copy' when retrieving cached Node
greschd Nov 1, 2017
fd09763
Keep inputlink_cache when creating cached Node.
greschd Nov 1, 2017
5646de9
Add class-level '_cacheable' attribute
greschd Nov 6, 2017
2b72bff
Seal inline calculations
greschd Nov 6, 2017
9c6f90a
Fix _store_from_cache
greschd Nov 6, 2017
e4d101d
Simplify _store_from_cache code, add tests for InlineCalculation caching
greschd Nov 6, 2017
d902d69
Merge configuration for 'use_cache' and 'fast_forward'.
greschd Nov 6, 2017
644d349
Add enable_caching and disable_caching contextmanagers.
greschd Nov 6, 2017
389aa8d
Merge duplicate 'general' implementations of InlineCalculation.
greschd Nov 6, 2017
e4af12d
Remove fast-forwarding tests for WorkChain and workfunction
greschd Nov 6, 2017
01a9143
Set calculation state to PARSING before adding cached outputs, and ma…
greschd Nov 6, 2017
338f952
Add caching documentation
greschd Nov 6, 2017
62ba9d9
Update singledispatch requirement to 3.4.0.3
greschd Nov 6, 2017
7559721
Merge branch 'develop' of github.com:aiidateam/aiida_core into node-h…
greschd Nov 12, 2017
baea092
Merge branch 'develop' of github.com:aiidateam/aiida_core into node-h…
greschd Nov 13, 2017
20691dd
Add cache tests to test_daemon.py
greschd Nov 13, 2017
326b258
Add assert for 'cached_from' key in daemon test
greschd Nov 13, 2017
2a6f475
Test that the hash stays the same for completed calculations.
greschd Dec 10, 2017
8d76026
Add 'retrieve_temporary_list' to ignored attributes for calculations.
greschd Dec 10, 2017
fb5f5dd
Check for hash consistency only for cached calculations.
greschd Dec 10, 2017
fe4a5c2
Add has_failed and has_finished methods to AbstractCalculation.
greschd Dec 12, 2017
b54d840
Remove install for git plumpy version in Travis.
greschd Dec 12, 2017
dd3ca84
Add computer uuid to _objects_to_be_hashed
greschd Dec 12, 2017
da52523
Merge branch 'develop' of github.com:aiidateam/aiida_core into node-h…
greschd Dec 12, 2017
7e3a771
Remove 'self.is_sealed' in _is_valid_cache test -- is not fulfilled f…
greschd Dec 12, 2017
2fb2081
Add a more thorough discussion of links and caching
greschd Dec 12, 2017
87994f2
Validate caching of 'cached' calculations when also checking for expe…
greschd Dec 13, 2017
265d058
Merge branch 'develop' into node-hashing
giovannipizzi Dec 18, 2017
85c812f
Merge branch 'develop' of github.com:aiidateam/aiida_core into node-h…
greschd Jan 29, 2018
3aa6aa0
Merge branch 'node-hashing' of github.com:greschd/aiida_core into nod…
greschd Jan 29, 2018
9717235
* add 'get_all_same_nodes' function, returns a list of all matching n…
greschd Jan 29, 2018
be36c3a
Add '_iter_all_same_nodes' helper function, which returns an iterator
greschd Jan 29, 2018
c413145
Rename 'cached_from' and 'hash' extras by prepending '_aiida_'.
greschd Jan 29, 2018
acde1c3
Define constant for '_aiida_hash' key
greschd Jan 29, 2018
4101b15
Fix filter in '_iter_same_node': Rename hash to _aiida_hash
greschd Jan 29, 2018
107e444
Fix tests which compared extras and didn't expect the '_aiida_hash' e…
greschd Jan 29, 2018
1f5bf07
Fix TestDbExtrasDjango to take into account '_aiida_hash'.
greschd Jan 29, 2018
89c3626
Fix SQLA test_replacement_1 test to work with '_aiida_hash'
greschd Jan 29, 2018
7b4d164
Fix error message where 'fast-forwarding' is mentioned.
greschd Jan 31, 2018
e515f14
Remove 'retrieve_temporary_list' from ignored hash attributes.
greschd Jan 31, 2018
99bad1e
Add doc for what to do when caching is triggered in error, add clear_…
greschd Jan 31, 2018
fe1f5e8
Add max_memory_kb and priority to the attributes ignored in JobCalcul…
greschd Feb 1, 2018
b7c1e71
Add description of '_hash_ignored_inputs'
greschd Feb 1, 2018
1b4839e
Add code to show how to get the full class name
greschd Feb 1, 2018
01f9004
Remove workchain caching test, add caching section to dev guide
greschd Feb 1, 2018
eef22f0
Merge branch 'develop' into node-hashing
greschd Feb 1, 2018
5a2a840
Merge branch 'develop' of github.com:aiidateam/aiida_core into node-h…
greschd Feb 1, 2018
1c0e6bc
Merge branch 'node-hashing' of github.com:greschd/aiida_core into nod…
greschd Feb 1, 2018
8596f77
Add 'retrieve_temporary_list' to updatable attributes
greschd Feb 8, 2018
419b20a
Merge branch 'develop' into node-hashing
greschd Feb 8, 2018
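The commits above build the PR's core mechanism: a deterministic hash computed from a node's type, attributes, and inputs, stored in the `_aiida_hash` extra and later used to look up equivalent nodes. As a rough illustration of what "deterministic hashing of nested objects" means here, the sketch below hashes nested dicts and lists order-independently for dict keys. This is a hypothetical simplification, not AiiDA's actual `make_hash` (which additionally handles numpy arrays, folders, datetimes, and the ignored-attributes logic discussed in the commits):

```python
import hashlib


def make_hash(obj):
    """Recursively hash nested Python objects in a deterministic way.

    Simplified sketch: dict entries are sorted by key so that insertion
    order does not affect the result, and the type name is mixed in so
    that e.g. 1 and '1' produce different hashes.
    """
    if isinstance(obj, dict):
        content = ','.join(
            '{}:{}'.format(make_hash(k), make_hash(v))
            for k, v in sorted(obj.items())
        )
        payload = 'dict({})'.format(content)
    elif isinstance(obj, (list, tuple)):
        payload = 'list({})'.format(','.join(make_hash(x) for x in obj))
    else:
        payload = '{}:{}'.format(type(obj).__name__, obj)
    return hashlib.sha256(payload.encode('utf-8')).hexdigest()
```

With this property, two nodes that were set up with the same inputs in a different order still map to the same hash, which is what makes the cache lookup possible.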
116 changes: 78 additions & 38 deletions .travis-data/test_daemon.py
@@ -98,6 +98,65 @@ def validate_workchains(expected_results):

return valid

def validate_cached(cached_calcs):
"""
Check that the calculations created with caching are indeed cached.
"""
return all(
'_aiida_cached_from' in calc.extras() and
calc.get_hash() == calc.get_extra('_aiida_hash')
for calc in cached_calcs
)

def create_calculation(code, counter, inputval, use_cache=False):
parameters = ParameterData(dict={'value': inputval})
template = ParameterData(dict={
## The following line adds a significant sleep time.
## I set it to 1 second to speed up tests
## I keep it to a non-zero value because I want
## To test the case when AiiDA finds some calcs
## in a queued state
#'cmdline_params': ["{}".format(counter % 3)], # Sleep time
'cmdline_params': ["1"],
'input_file_template': "{value}", # File just contains the value to double
'input_file_name': 'value_to_double.txt',
'output_file_name': 'output.txt',
'retrieve_temporary_files': ['triple_value.tmp']
})
calc = code.new_calc()
calc.set_max_wallclock_seconds(5 * 60) # 5 min
calc.set_resources({"num_machines": 1})
calc.set_withmpi(False)
calc.set_parser_name('simpleplugins.templatereplacer.test.doubler')

calc.use_parameters(parameters)
calc.use_template(template)
calc.store_all(use_cache=use_cache)
expected_result = {
'value': 2 * inputval,
'retrieved_temporary_files': {
'triple_value.tmp': str(inputval * 3)
}
}
print "[{}] created calculation {}, pk={}".format(
counter, calc.uuid, calc.dbnode.pk)
return calc, expected_result

def submit_calculation(code, counter, inputval):
calc, expected_result = create_calculation(
code=code, counter=counter, inputval=inputval
)
calc.submit()
print "[{}] calculation submitted.".format(counter)
return calc, expected_result

def create_cache_calc(code, counter, inputval):
calc, expected_result = create_calculation(
code=code, counter=counter, inputval=inputval, use_cache=True
)
print "[{}] created cached calculation.".format(counter)
return calc, expected_result

def main():

# Submitting the Calculations
@@ -106,39 +165,10 @@ def main():
expected_results_calculations = {}
for counter in range(1, number_calculations + 1):
inputval = counter
parameters = ParameterData(dict={'value': inputval})
template = ParameterData(dict={
## The following line adds a significant sleep time.
## I set it to 1 second to speed up tests
## I keep it to a non-zero value because I want
## To test the case when AiiDA finds some calcs
## in a queued state
#'cmdline_params': ["{}".format(counter % 3)], # Sleep time
'cmdline_params': ["1"],
'input_file_template': "{value}", # File just contains the value to double
'input_file_name': 'value_to_double.txt',
'output_file_name': 'output.txt',
'retrieve_temporary_files': ['triple_value.tmp']
})
calc = code.new_calc()
calc.set_max_wallclock_seconds(5 * 60) # 5 min
calc.set_resources({"num_machines": 1})
calc.set_withmpi(False)
calc.set_parser_name('simpleplugins.templatereplacer.test.doubler')

calc.use_parameters(parameters)
calc.use_template(template)
calc.store_all()
print "[{}] created calculation {}, pk={}".format(
counter, calc.uuid, calc.dbnode.pk)
expected_results_calculations[calc.pk] = {
'value': inputval * 2,
'retrieved_temporary_files': {
'triple_value.tmp': str(inputval * 3)
}
}
calc.submit()
print "[{}] calculation submitted.".format(counter)
calc, expected_result = submit_calculation(
code=code, counter=counter, inputval=inputval
)
expected_results_calculations[calc.pk] = expected_result

# Submitting the Workchains
print "Submitting {} workchains to the daemon".format(number_workchains)
@@ -158,7 +188,7 @@ def main():
exited_with_timeout = True
while time.time() - start_time < timeout_secs:
time.sleep(15) # Wait a few seconds

# Print some debug info, both for debugging reasons and to avoid
# that the test machine is shut down because there is no output

@@ -168,7 +198,7 @@ def main():
print "Output of 'verdi calculation list -a':"
try:
print subprocess.check_output(
["verdi", "calculation", "list", "-a"],
["verdi", "calculation", "list", "-a"],
stderr=subprocess.STDOUT,
)
except subprocess.CalledProcessError as e:
@@ -177,7 +207,7 @@ def main():
print "Output of 'verdi work list':"
try:
print subprocess.check_output(
['verdi', 'work', 'list'],
['verdi', 'work', 'list'],
stderr=subprocess.STDOUT,
)
except subprocess.CalledProcessError as e:
@@ -186,7 +216,7 @@ def main():
print "Output of 'verdi daemon status':"
try:
print subprocess.check_output(
["verdi", "daemon", "status"],
["verdi", "daemon", "status"],
stderr=subprocess.STDOUT,
)
except subprocess.CalledProcessError as e:
@@ -204,8 +234,18 @@ def main():
timeout_secs)
sys.exit(2)
else:
# create cached calculations -- these should be FINISHED immediately
cached_calcs = []
for counter in range(1, number_calculations + 1):
inputval = counter
calc, expected_result = create_cache_calc(
code=code, counter=counter, inputval=inputval
)
cached_calcs.append(calc)
expected_results_calculations[calc.pk] = expected_result
if (validate_calculations(expected_results_calculations)
and validate_workchains(expected_results_workchains)):
and validate_workchains(expected_results_workchains)
and validate_cached(cached_calcs)):
print_daemon_log()
print ""
print "OK, all calculations have the expected parsed result"
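The `create_cache_calc` path above relies on `store_all(use_cache=True)`: instead of submitting to the daemon, the node's hash is computed, an existing valid node with the same hash is looked up, and its outputs are reused, which is why the cached calculations are expected to be finished immediately. The toy class below mimics that control flow with an in-memory store; the names (`FakeNodeStore`, `store`, `run`) are invented for illustration and are not AiiDA's API, though the `_aiida_hash` / `_aiida_cached_from` extras match the keys used in this PR:

```python
import hashlib
import json


class FakeNodeStore(object):
    """Toy stand-in showing the cache lookup done on store(use_cache=True)."""

    def __init__(self):
        self._by_hash = {}

    def _hash(self, inputs):
        # Deterministic hash of the inputs, analogous to get_hash()
        payload = json.dumps(inputs, sort_keys=True).encode('utf-8')
        return hashlib.sha256(payload).hexdigest()

    def store(self, inputs, run, use_cache=False):
        node_hash = self._hash(inputs)
        cached = self._by_hash.get(node_hash) if use_cache else None
        if cached is not None:
            # Cache hit: reuse outputs and record provenance in the extras
            return {'outputs': cached['outputs'],
                    'extras': {'_aiida_hash': node_hash,
                               '_aiida_cached_from': cached['uuid']}}
        # Cache miss: the expensive computation actually runs
        outputs = run(inputs)
        record = {'uuid': len(self._by_hash), 'outputs': outputs}
        self._by_hash[node_hash] = record
        return {'outputs': outputs, 'extras': {'_aiida_hash': node_hash}}
```

The `validate_cached` check in the diff above then amounts to verifying that every such node carries `_aiida_cached_from` and that its stored `_aiida_hash` still matches a freshly computed hash.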
3 changes: 2 additions & 1 deletion .travis.yml
Original file line number Diff line number Diff line change
Expand Up @@ -44,9 +44,10 @@ install:
# Install AiiDA with some optional dependencies
- pip install .[REST,docs,atomic_tools,testing,dev_precommit]


env:
## Build matrix to test both backends, and the docs
## I still let it create the test backend for django
## I still let it create the test backend for django
## also when building the docs
## because otherwise the code would complain. Also, I need latex.
- TEST_TYPE="pre-commit"
18 changes: 11 additions & 7 deletions aiida/backends/djsite/db/subtests/generic.py
@@ -133,18 +133,22 @@ def test_replacement_1(self):
DbExtra.set_value_for_node(n1.dbnode, "pippobis", [5, 6, 'c'])
DbExtra.set_value_for_node(n2.dbnode, "pippo2", [3, 4, 'b'])

self.assertEquals(n1.dbnode.extras, {'pippo': [1, 2, 'a'],
'pippobis': [5, 6, 'c']})
self.assertEquals(n2.dbnode.extras, {'pippo2': [3, 4, 'b']})
self.assertEquals(n1.get_extras(), {'pippo': [1, 2, 'a'],
'pippobis': [5, 6, 'c'],
'_aiida_hash': n1.get_hash()
})
self.assertEquals(n2.get_extras(), {'pippo2': [3, 4, 'b'],
'_aiida_hash': n2.get_hash()
})

new_attrs = {"newval1": "v", "newval2": [1, {"c": "d", "e": 2}]}

DbExtra.reset_values_for_node(n1.dbnode, attributes=new_attrs)
self.assertEquals(n1.dbnode.extras, new_attrs)
self.assertEquals(n2.dbnode.extras, {'pippo2': [3, 4, 'b']})
self.assertEquals(n1.get_extras(), new_attrs)
self.assertEquals(n2.get_extras(), {'pippo2': [3, 4, 'b'], '_aiida_hash': n2.get_hash()})

DbExtra.del_value_for_node(n1.dbnode, key='newval2')
del new_attrs['newval2']
self.assertEquals(n1.dbnode.extras, new_attrs)
self.assertEquals(n1.get_extras(), new_attrs)
# Also check that other nodes were not damaged
self.assertEquals(n2.dbnode.extras, {'pippo2': [3, 4, 'b']})
self.assertEquals(n2.get_extras(), {'pippo2': [3, 4, 'b'], '_aiida_hash': n2.get_hash()})
8 changes: 4 additions & 4 deletions aiida/backends/sqlalchemy/tests/generic.py
@@ -141,18 +141,18 @@ def test_replacement_1(self):

n2.set_extra("pippo2", [3, 4, u'b'])

self.assertEqual(n1.get_extras(),{'pippo': [1, 2, u'a'], 'pippobis': [5, 6, u'c']})
self.assertEqual(n1.get_extras(),{'pippo': [1, 2, u'a'], 'pippobis': [5, 6, u'c'], '_aiida_hash': n1.get_hash()})

self.assertEquals(n2.get_extras(), {'pippo2': [3, 4, 'b']})
self.assertEquals(n2.get_extras(), {'pippo2': [3, 4, 'b'], '_aiida_hash': n2.get_hash()})

new_attrs = {"newval1": "v", "newval2": [1, {"c": "d", "e": 2}]}

n1.reset_extras(new_attrs)
self.assertEquals(n1.get_extras(), new_attrs)
self.assertEquals(n2.get_extras(), {'pippo2': [3, 4, 'b']})
self.assertEquals(n2.get_extras(), {'pippo2': [3, 4, 'b'], '_aiida_hash': n2.get_hash()})

n1.del_extra('newval2')
del new_attrs['newval2']
self.assertEquals(n1.get_extras(), new_attrs)
# Also check that other nodes were not damaged
self.assertEquals(n2.get_extras(), {'pippo2': [3, 4, 'b']})
self.assertEquals(n2.get_extras(), {'pippo2': [3, 4, 'b'], '_aiida_hash': n2.get_hash()})
2 changes: 1 addition & 1 deletion aiida/backends/testbase.py
@@ -144,7 +144,7 @@ def run_aiida_db_tests(tests_to_run, verbose=False):

actually_run_tests = []
num_tests_expected = 0

# To avoid adding more than once the same test
# (e.g. if you type both db and db.xxx)
found_modulenames = set()
2 changes: 2 additions & 0 deletions aiida/backends/tests/__init__.py
@@ -66,6 +66,8 @@
'pluginloader': ['aiida.backends.tests.test_plugin_loader'],
'daemon': ['aiida.backends.tests.daemon'],
'verdi_commands': ['aiida.backends.tests.verdi_commands'],
'caching_config': ['aiida.backends.tests.test_caching_config'],
'inline_calculation': ['aiida.backends.tests.inline_calculation'],
}
}

55 changes: 55 additions & 0 deletions aiida/backends/tests/inline_calculation.py
@@ -0,0 +1,55 @@
# -*- coding: utf-8 -*-
###########################################################################
# Copyright (c), The AiiDA team. All rights reserved. #
# This file is part of the AiiDA code. #
# #
# The code is hosted on GitHub at https://github.com/aiidateam/aiida_core #
# For further information on the license, see the LICENSE.txt file #
# For further information please visit http://www.aiida.net #
###########################################################################
"""
Tests for inline calculations.
"""

from aiida.orm.data.base import Int
from aiida.common.caching import enable_caching
from aiida.orm.calculation.inline import make_inline, InlineCalculation
from aiida.backends.testbase import AiidaTestCase

class TestInlineCalculation(AiidaTestCase):
"""
Tests for the InlineCalculation calculations.
"""
def setUp(self):
@make_inline
def incr_inline(inp):
return {'res': Int(inp.value + 1)}

self.incr_inline = incr_inline

def test_incr(self):
"""
Simple test for the inline increment function.
"""
for i in [-4, 0, 3, 10]:
calc, res = self.incr_inline(inp=Int(i))
self.assertEqual(res['res'].value, i + 1)

def test_caching(self):
with enable_caching(InlineCalculation):
calc1, res1 = self.incr_inline(inp=Int(11))
calc2, res2 = self.incr_inline(inp=Int(11))
self.assertEquals(res1['res'].value, res2['res'].value)
self.assertEquals(res1['res'].value, 12)
self.assertEquals(calc1.get_extra('_aiida_cached_from', calc1.uuid), calc2.get_extra('_aiida_cached_from'))

def test_caching_change_code(self):
with enable_caching(InlineCalculation):
calc1, res1 = self.incr_inline(inp=Int(11))

@make_inline
def incr_inline(inp):
return {'res': Int(inp.value + 2)}

calc2, res2 = incr_inline(inp=Int(11))
self.assertNotEquals(res1['res'].value, res2['res'].value)
self.assertFalse('_aiida_cached_from' in calc2.extras())
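The `enable_caching(InlineCalculation)` context manager used in these tests temporarily overrides the caching default for one class and restores the previous setting on exit, even on error. A minimal sketch of that pattern (illustrative only; the real implementation lives in `aiida.common.caching` and also consults the cache config file):

```python
from contextlib import contextmanager

# Module-level override table, keyed by class name: a stand-in for the
# real cache configuration in aiida.common.caching.
_CACHE_DEFAULTS = {}


@contextmanager
def enable_caching(cls):
    """Temporarily enable caching for the given class."""
    previous = _CACHE_DEFAULTS.get(cls.__name__)
    _CACHE_DEFAULTS[cls.__name__] = True
    try:
        yield
    finally:
        # Restore the previous setting, even if the block raised
        if previous is None:
            del _CACHE_DEFAULTS[cls.__name__]
        else:
            _CACHE_DEFAULTS[cls.__name__] = previous


def get_use_cache(cls):
    """Return the caching default for a class (False unless overridden)."""
    return _CACHE_DEFAULTS.get(cls.__name__, False)
```

This per-class scoping is what lets `test_caching` turn caching on for `InlineCalculation` alone without touching the global default or other node types.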