
BENCH: skip benchmarks instead of hiding them when SCIPY_XSLOW=0 #12732

Merged 1 commit on Aug 19, 2020

Conversation

pv
Member

@pv pv commented Aug 16, 2020

The "xslow" benchmarks should not be hidden from asv when the
environment variable is not set, because then any results obtained for
them appear to be for non-existing benchmarks, and may be deleted and
will not be included in reporting. Instead, skip the benchmarks with the
usual mechanism.

The "xslow" benchmarks should not be hidden from asv when the
environment variable is not set, because then any results obtained for
them appear to be for non-existing benchmarks, and may be deleted and
will not be included in reporting. Instead, skip the benchmarks with the
usual mechanism.
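The "usual mechanism" referred to here is asv's convention that raising NotImplementedError from setup() marks a benchmark as skipped, so it still appears in results and reporting. A minimal sketch, assuming that convention (the class and helper below are hypothetical illustrations, not the actual SciPy benchmark code):

```python
import os

def is_xslow():
    # Read the SCIPY_XSLOW environment variable, treating anything
    # unset or non-numeric as "off".
    try:
        return int(os.environ.get('SCIPY_XSLOW', '0')) > 0
    except ValueError:
        return False

class XSlowBenchmark:
    """Hypothetical benchmark that is skipped, not hidden, without SCIPY_XSLOW."""

    def setup(self):
        if not is_xslow():
            # asv interprets NotImplementedError from setup() as a skip,
            # so the benchmark remains visible to reporting.
            raise NotImplementedError("skipped unless SCIPY_XSLOW=1")

    def time_something(self):
        sum(range(10000))
```

The contrast with "hiding" is that conditionally omitting the benchmark class or method entirely makes asv treat any previously recorded results as belonging to non-existent benchmarks.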
@pv pv changed the title BENCH: skip benchmarks instead of hiding them when SCIPY_XSLOW=1 BENCH: skip benchmarks instead of hiding them when SCIPY_XSLOW=0 Aug 16, 2020
@rgommers rgommers added the Benchmarks Running, verifying or documenting benchmarks for SciPy label Aug 16, 2020
@mdhaber
Contributor

mdhaber commented Aug 16, 2020

Thanks for fixing this.
The same Travis CI failures are occurring in other PRs, so I don't think that's a problem.
PR logic makes sense. I'm running benchmarks to double-check.

@mdhaber
Contributor

mdhaber commented Aug 16, 2020

TLDR: I have three problems running benchmarks on Windows. (They're not unique to this PR, so let me know if you'd like them posted somewhere else.)

  • conda-installed asv run fails with Fatal error in launcher: Unable to create process using '"d:\bld\asv_1589638519985\_h_env\python.exe"
  • pip-installed asv run succeeds only if I manually copy openblas.a into the environment's Lib folder
  • runtests.py or benchmarks/run.py fail with "TypeError: environment can only contain strings" for both conda- and pip-installed ASV

I installed ASV with conda (conda install -c conda-forge asv) and I'm working in an Anaconda Powershell Prompt.

>>> asv run --bench optimize_linprog
Fatal error in launcher: Unable to create process using '"d:\bld\asv_1589638519985\_h_env\python.exe"  "C:\ProgramData\Anaconda3\envs\scipydev\Scripts\asv.exe" run --bench optimize_linprog': The system cannot find the file specified.

>>> python run.py run --bench optimize_linprog
Traceback (most recent call last):
  File "run.py", line 94, in <module>
    sys.exit(main())
  File "run.py", line 35, in main
    sys.exit(run_asv(args.asv_command))
  File "run.py", line 73, in run_asv
    return subprocess.call(cmd, env=env, cwd=cwd)
  File "C:\ProgramData\Anaconda3\envs\scipydev\lib\subprocess.py", line 339, in call
    with Popen(*popenargs, **kwargs) as p:
  File "C:\ProgramData\Anaconda3\envs\scipydev\lib\subprocess.py", line 800, in __init__
    restore_signals, start_new_session)
  File "C:\ProgramData\Anaconda3\envs\scipydev\lib\subprocess.py", line 1207, in _execute_child
    startupinfo)
TypeError: environment can only contain strings

>>> python runtests.py --bench optimize_linprog

(same error as with python run.py).

I uninstalled ASV and reinstalled it with pip (in a conda environment - frowned upon, I know, but I saw a tip online that it's worth trying). I still get the TypeError when I use either of SciPy's scripts. asv run no longer fails immediately, but the build fails because it can't find BLAS. I manually copied openblas.a into the ASV environment's Lib folder, and it seems to work, but is there a better way?
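For reference, "TypeError: environment can only contain strings" is what subprocess raises on Windows when the env mapping passed to Popen contains a non-string key or value. A possible workaround is to coerce the environment before the call; this is a hypothetical sketch of how run.py's run_asv could be patched (function names are illustrative, not the actual script):

```python
import os
import subprocess

def sanitize_env(env):
    """Coerce all keys and values to str, as subprocess on Windows requires."""
    return {str(k): str(v) for k, v in env.items()}

def run_asv_patched(cmd, cwd=None):
    # Hypothetical fix: build the child environment from os.environ plus
    # benchmark settings, then sanitize it so a stray non-string value
    # (e.g. an int) cannot trigger the Windows-only TypeError.
    env = dict(os.environ)
    env['SCIPY_GLOBAL_BENCH_NUMTRIALS'] = 100  # non-string on purpose
    return subprocess.call(cmd, env=sanitize_env(env), cwd=cwd)
```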

@pv
Member Author

pv commented Aug 17, 2020 via email

@tylerjereddy tylerjereddy added this to the 1.6.0 milestone Aug 18, 2020
Contributor

@tylerjereddy tylerjereddy left a comment


asv check is passing in CI at least. The logic is a little complicated, but the idea makes sense to me. It's not really surprising that issues crop up when people try to run asv themselves; it can take a little tweaking on a given machine.

There are no Travis CI failures; the Azure failures are fixed in master after #12730

@mdhaber
Contributor

mdhaber commented Aug 18, 2020

Oops yes I meant Azure.
I ran without SCIPY_XSLOW and the right benchmarks ran. I'll check with SCIPY_XSLOW=1 and merge tomorrow if there are no objections.
Update: looks good for linprog. Checking BenchGlobal

@mdhaber mdhaber merged commit 55ba7c8 into scipy:master Aug 19, 2020
        try:
            self.numtrials = int(os.environ['SCIPY_GLOBAL_BENCH_NUMTRIALS'])
        except (KeyError, ValueError):
            self.numtrials = 100

        self.dump_fn = os.path.join(os.path.dirname(__file__), '..', 'global-bench-results.json')
        self.results = {}

    def setup(self, name, ret_value, solver):
        if not self.enabled:

@pv @andyfaff I see

    if not is_xslow():
        _enabled_functions = ['AMGM']

above, suggesting that AMGM is supposed to run even when SCIPY_XSLOW is not 1, yet:

self.enabled = is_xslow()
...
if not self.enabled:
    ...
    raise NotImplementedError()

so none of the global benchmarks run when SCIPY_XSLOW is not 1. I think I'm reading this correctly: no benchmarks run locally when SCIPY_XSLOW is not 1, and there are no results for GlobalBench on Pauli's server (AMGM is listed, but there is no data).

Is this intentional, or should a subset of the benchmarks run when SCIPY_XSLOW is not 1?
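What the `_enabled_functions` snippet above appears to be driving at is a per-function skip rather than an all-or-nothing one. A hedged sketch of that idea (helper names hypothetical; this is not the actual SciPy benchmark code):

```python
import os

def is_xslow():
    # Same convention as the helper discussed in this PR.
    try:
        return int(os.environ.get('SCIPY_XSLOW', '0')) > 0
    except ValueError:
        return False

def should_skip(name, enabled=None):
    """Skip `name` unless it is in the always-enabled subset.

    When SCIPY_XSLOW is set, the subset is empty, meaning "no restriction":
    every function runs. Otherwise only the small subset (AMGM in the
    snippet above) runs.
    """
    if enabled is None:
        enabled = [] if is_xslow() else ['AMGM']
    return bool(enabled) and name not in enabled
```

With this structure, setup() would raise NotImplementedError only for the functions outside the subset, so AMGM would still produce results without SCIPY_XSLOW=1.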

@pv
Member Author

pv commented Aug 19, 2020 via email

@mdhaber
Contributor

mdhaber commented Aug 19, 2020

OK, I can pick some out so that it takes a certain amount of time. Approximately how long should it take to run the global optimization benchmarks when SCIPY_XSLOW is not 1?
