
Belos/FEI/Ifpack2/NOX/Amesos2: Randomly hanging tests on ascisc1** breaking PR builds #11530

Closed
csiefer2 opened this issue Feb 1, 2023 · 38 comments
Assignees
Labels
CLOSED_DUE_TO_INACTIVITY Issue or PR has been closed by the GitHub Actions bot due to inactivity. impacting: tests The defect (bug) is primarily a test failure (vs. a build failure) MARKED_FOR_CLOSURE Issue or PR is marked for auto-closure by the GitHub Actions bot. PA: Discretizations Issues that fall under the Trilinos Discretizations Product Area PA: Framework Issues that fall under the Trilinos Framework Product Area PA: Linear Solvers Issues that fall under the Trilinos Linear Solvers Product Area PA: Nonlinear Solvers Issues that fall under the Trilinos Nonlinear Linear Solvers Product Area pkg: Belos pkg: FEI pkg: Ifpack2 pkg: NOX type: bug The primary issue is a bug in Trilinos code or tests

Comments

@csiefer2 (Member) commented Feb 1, 2023

Next Action Status

Description

As shown in this query (click "Show Matching Output" in the upper right) the tests:

  • Belos_Tpetra_tfqmr_hb_2_MPI_4
  • FEI_fei_ubase_MPI_3
  • Ifpack2_SGSMT_compare_with_Jacobi_MPI_4
  • NOX_1DfemStratimikos_MPI_4

in the unique GenConfig builds:

  • rhel7_sems-gnu-8.3.0-openmpi-1.10.7-openmp_release-debug_static_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables
  • rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables

started failing on testing day 2023-01-30.

The specific set of CDash builds impacted were:

  • PR-11484-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-openmp_release-debug_static_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-1717
  • PR-11516-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-openmp_release-debug_static_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-1702
  • PR-11516-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-openmp_release-debug_static_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-1708
  • PR-11516-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-228
  • PR-11516-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-230
  • PR-11523-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-241

<Add details about what is failing and what the failures look like. Make sure to include strings that are easy to match with GitHub Issue searches.>

Current Status on CDash

Run the above query, adjusting the "Begin" and "End" dates to match today or any other date range, or just click "CURRENT" in the top bar to see results for the current testing day.

Steps to Reproduce

See:

If you can't figure out what commands to run to reproduce the problem given this documentation, then please post a comment here and we will give you the exact minimal commands.

@csiefer2 csiefer2 added type: bug The primary issue is a bug in Trilinos code or tests pkg: Ifpack2 pkg: Belos pkg: NOX pkg: FEI impacting: tests The defect (bug) is primarily a test failure (vs. a build failure) PA: Framework Issues that fall under the Trilinos Framework Product Area PA: Linear Solvers Issues that fall under the Trilinos Linear Solvers Product Area PA: Nonlinear Solvers Issues that fall under the Trilinos Nonlinear Linear Solvers Product Area PA: Discretizations Issues that fall under the Trilinos Discretizations Product Area labels Feb 1, 2023
github-actions bot commented Feb 1, 2023

Automatic mention of the @trilinos/ifpack2 team

10 similar comments

@sebrowne (Contributor) commented Feb 2, 2023

Looking at them (I've been hit by two of them as well), I think that three of them exhibit the same behavior (they say they pass, but then time out). The Ifpack2 one actually shows a failure as part of the test, so it may have a different root cause. Should we just disable the other three for now? I'm loath to do that, but I suppose it depends on how long it will take to debug/fix them.

@sebrowne (Contributor) commented Feb 2, 2023

When we were switching to the new SEMS V2 modules, we did run nightly tests with the new configurations.
https://trilinos-cdash.sandia.gov/build/699959
https://trilinos-cdash.sandia.gov/build/704033
All of the tests passed there, but they were running on Haswell architecture machines. The posted query shows them failing on Broadwell and Skylake. Not sure that's helpful, or if it's a red herring, but figured I would mention it.

@sebrowne (Contributor) commented Feb 3, 2023

@csiefer2 alternatively, if this is too big a disruption right now, we could explore backing off the GCC toolchain change and go back to SEMS V1 modules temporarily. And maybe add a nightly build with those specific failing builds on the specific machines? Let me know your preference/thoughts. We (Framework and SEMS) really want to get off of the old modules stack regardless, so I don't want to lose momentum here, but I also don't want to adversely affect the ability of the autotester to get stuff into develop.

@csiefer2 (Member, Author) commented Feb 6, 2023

Disabling the tests and filing issues might be the way to go.

Lots of the "says passed but failed" tests are due to the output order of the PASSED message.

@rppawlo (Contributor) commented Feb 6, 2023

I could not reproduce this on my local workstation. I used the instructions for reproducing using genconfig. Will need some help reproducing.

@jhux2 (Member) commented Feb 7, 2023

#11466 is also getting tripped up by random failures.

@sebrowne (Contributor) commented Feb 8, 2023

@rppawlo what CPU architecture is your local workstation? The model can be grabbed with the 'lscpu' command.
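
For reference, something along these lines pulls the relevant fields out of the lscpu output:

# Print just the architecture and CPU model-name fields
lscpu | grep -E 'Architecture|Model name'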

@sebrowne (Contributor) commented Feb 8, 2023

I am able to reproduce the FEI failure on my workstation (Cascade Lake) by doing this:
#!/bin/bash
# Run the intermittently hanging test repeatedly with a short timeout.
for x in 1 2 3 4 5 6 7 8 9 10
do
  ctest --timeout 10 -R FEI_fei_ubase_MPI_3
done

It doesn't fail all of the time, but at least one in ten does seem to fail. I suspect this will be the case with the other ones that hang and time out as well. I'll try the Ifpack2 test next.
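
A slightly expanded sketch of the same loop that tallies how often the test fails or hangs (assuming it is run from an already-configured Trilinos build directory; the tallying is illustrative, not from the original comment):

#!/bin/bash
# Run the test 10 times with a short timeout and count nonzero ctest exits.
fails=0
for x in $(seq 1 10)
do
  if ! ctest --timeout 10 -R FEI_fei_ubase_MPI_3 > /dev/null 2>&1
  then
    fails=$((fails + 1))
  fi
done
echo "FEI_fei_ubase_MPI_3 failed or hung in ${fails}/10 runs"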

sebrowne added a commit to sebrowne/Trilinos that referenced this issue Feb 15, 2023
User Support Ticket(s) or Story Referenced: trilinos#11530
@sebrowne (Contributor) commented

If it's exhibiting the same behavior, I would say probably yes. I've added it to the list of disables in #11567, let's see if I can get the syntax right and see what PR testing has to say with those five tests disabled.

I'll defer to pretty much anybody else about how to handle the failures. I think @csiefer2 was in favor of disabling them and filing issues (this issue perhaps). My big concern is whether or not this is related to some kind of issue with the new SEMS toolchain (the OpenMPI 1.10.7 one), and if we're playing whack-a-mole with unstable tests only to discover that it's some deeper issue that is similar across all of them.
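
For reference, TriBITS supports disabling an individual test at configure time with a -D<fullTestName>_DISABLE=ON cache variable, so the disables would look roughly like the sketch below (illustrative only; the exact mechanism and variables used in #11567 may differ):

# Hypothetical configure fragment disabling the hanging tests by name
cmake \
  -D FEI_fei_ubase_MPI_3_DISABLE=ON \
  -D Belos_Tpetra_tfqmr_hb_2_MPI_4_DISABLE=ON \
  -D NOX_1DfemStratimikos_MPI_4_DISABLE=ON \
  "${TRILINOS_SOURCE_DIR}"  # placeholder for the Trilinos source path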

@rppawlo rppawlo self-assigned this Feb 16, 2023
sebrowne added a commit to sebrowne/Trilinos that referenced this issue Feb 16, 2023
Disable should be 'on' to disable....

User Support Ticket(s) or Story Referenced: trilinos#11530
@rppawlo (Contributor) commented Feb 16, 2023

For the nox tests, this looks like an mpi issue. I was able to replicate the failures after wiping the genconfig directory and then rerunning get_dependencies. Even though I specified openmpi 1.10.7 in the string to genconfig, it must not have had an updated submodule and somehow pulled in the 1.10.1 libraries. Thanks to @sebrowne for the ldd info.

I believe the issue is in the mpi TPL. The code runs fine and the application executables for each mpi rank exit cleanly. Then the run hangs inside the mpirun executable. Here's the gdb stack trace:

(gdb) bt
#0 0x00007f2249ffaddd in poll () from /lib64/libc.so.6
#1 0x00007f224b0aee45 in poll_dispatch (base=0x199e8b0, tv=0x0) at poll.c:165
#2 0x00007f224b0a6529 in opal_libevent2021_event_base_loop (base=0x199e8b0, flags=1) at event.c:1633
#3 0x0000000000405384 in orterun (argc=6, argv=0x7fff9240da98) at orterun.c:1133
#4 0x0000000000403c32 in main (argc=6, argv=0x7fff9240da98) at main.c:13
(gdb)

This hang causes a timeout in ctest that triggers the failure. Maybe something in the mpi install is machine specific? We might be able to fix this by moving to a newer version of mpi. In my local builds, mpi 1.10.1 ran fine without failure.
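
For completeness, a backtrace like the one above can be captured from the stuck launcher by attaching gdb after the test times out (a sketch; the pgrep pattern is illustrative):

# Attach to the lingering mpirun and print its backtrace non-interactively
pid=$(pgrep -f mpirun | head -n 1)
gdb -p "${pid}" -batch -ex bt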

@jjellio (Contributor) commented Feb 16, 2023

@rppawlo

This is probably from the system being updated but the MPI not being rebuilt on top of it. (Looking at the timestamps on the libraries from ldd and comparing them to the MPI libs' timestamps can give a hint whether a core system lib has changed after MPI was built.)
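
A rough sketch of that comparison (the test-binary path below is hypothetical):

# List every shared library a test binary links, with modification times,
# so a system library newer than the OpenMPI install stands out.
exe=./packages/fei/test_utils/fei_ubase_test   # hypothetical binary path
ldd "${exe}" | awk '$3 ~ /^\// {print $3}' | xargs -r ls -l --time-style=long-iso
ls -l --time-style=long-iso "$(which mpirun)"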

sebrowne added a commit to sebrowne/Trilinos that referenced this issue Feb 20, 2023
User Support Ticket(s) or Story Referenced: trilinos#11530
@sebrowne (Contributor) commented

This is now blocking development enough that I'm going to roll back the upgrade this morning and go back to the older SEMS modules for the GCC toolchains (once I discuss it with the team and make sure it will still work). Now that we have a reliable-ish reproducer, I've added a story to ensure that reproducer is fixed prior to re-deploying the SEMS module change.

@bartlettroscoe bartlettroscoe changed the title Belos/FEI/Ifpack2/NOX: Randomly Failing Tests on ascisc1** breaking PR builds Belos/FEI/Ifpack2/NOX/Amesos2: Randomly hanging tests on ascisc1** breaking PR builds Feb 20, 2023
@bartlettroscoe (Member) commented Feb 20, 2023

It looks like all of these failures show that the test passes on the root process (i.e., proc 0) but MPI says that other ranks failed. In fact, it looks like it is always 1 process that fails, showing:

End Result: TEST PASSED
test Eqns_unit.feiInitSlave only runs on 2 procs. returning.
test Eqns_unit.feiInitSlave only runs on 2 procs. returning.
-------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code.. Per user-direction, the job has been aborted.
-------------------------------------------------------

If the tests are using the Teuchos unit test harness correctly, that should be impossible. So this must be a problem with the system.

I will post the full extent of these errors and what tests and builds they involve in the next comment.

@sebrowne (Contributor) commented

Okay, I've reverted the configuration now. This should stop impacting people immediately (at least that's the hope), and we have work/testing to figure it out prior to merging it again in the future.

@bartlettroscoe (Member) commented

Broadening the query to look for tests that failed, have 'timeout', and show "Primary job terminated normally, but .* process returned" since 1/28/2023, we see the full scope of these failures ...

As shown in this query (click "Show Matching Output" in the upper right) the tests:

  • Amesos2_CrsMatrix_Adapter_Consistency_Tests_MPI_4
  • Belos_Tpetra_tfqmr_hb_2_MPI_4
  • FEI_fei_ubase_MPI_3
  • NOX_1DfemAztecSolverPrecRecompute_MPI_4
  • NOX_1DfemStratimikos_MPI_4

in the unique GenConfig builds:

  • rhel7_sems-clang-11.0.1-openmpi-1.10.7-serial_release-debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables
  • rhel7_sems-gnu-8.3.0-openmpi-1.10.7-openmp_release-debug_static_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables
  • rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables

started randomly hanging on testing day 2023-01-30.

The specific set of CDash builds impacted were:

  • PR-11437-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-openmp_release-debug_static_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-1741
  • PR-11466-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-269
  • PR-11480-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-openmp_release-debug_static_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-1755
  • PR-11480-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-310
  • PR-11480-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-314
  • PR-11484-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-openmp_release-debug_static_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-1717
  • PR-11484-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-243
  • PR-11516-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-openmp_release-debug_static_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-1702
  • PR-11516-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-228
  • PR-11516-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-230
  • PR-11518-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-284
  • PR-11518-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-306
  • PR-11523-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-241
  • PR-11524-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-openmp_release-debug_static_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-1716
  • PR-11524-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-311
  • PR-11528-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-293
  • PR-11533-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-258
  • PR-11533-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-261
  • PR-11533-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-264
  • PR-11544-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-273
  • PR-11546-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-329
  • PR-11554-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-298
  • PR-11554-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-301
  • PR-11565-test-rhel7_sems-clang-11.0.1-openmpi-1.10.7-serial_release-debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-315
  • PR-11565-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-322
  • PR-11567-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-317
  • PR-11567-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-serial_debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-323
  • PR-11570-test-rhel7_sems-clang-11.0.1-openmpi-1.10.7-serial_release-debug_shared_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-321
  • PR-11570-test-rhel7_sems-gnu-8.3.0-openmpi-1.10.7-openmp_release-debug_static_no-kokkos-arch_no-asan_no-complex_no-fpic_mpi_no-pt_no-rdc_no-uvm_deprecated-on_no-package-enables-1795

As you can see, this seems to only be impacting the openmpi-1.10.7 builds.

It is interesting that these failures are isolated to just these 5 tests over all that time. What is unique about these tests that differentiates them from other tests?

NOTE: I could not go back further than 1/28/2023 or CDash would just hang. (That may be a memory constraint with the CDash PHP setup on trilinos-cdash.sandia.gov that needs to be resolved.)

@bartlettroscoe (Member) commented

Okay, I've reverted the configuration now. This should stop impacting people immediately (at least that's the hope), and we have work/testing to figure it out prior to merging it again in the future.

@sebrowne, FYI, but @ndellingwood just reported the same error in the test Intrepid2_unit-test_MonolithicExecutable_Intrepid2_Tests_MPI_1 in a PR iteration yesterday, as shown in #11579 (comment).

@sebrowne (Contributor) commented Feb 23, 2023

Even though it's one of the same tests, the failure mode is different. That one failed when the test was starting, with a CUDA device-side assert. The issue here was specifically tests that hung at the end of their run (in MPI_Finalize(), perhaps).

With respect to that failure, I think it may be alleviated by #11391, which stops running a bunch of simultaneous tests on one GPU of the machine, potentially leading to resource issues. Or it may be a real bug in Intrepid2 that triggered the assert; I couldn't say for sure.

Also note that the CUDA line has been using OpenMPI 4.0.5 for a while (it was not part of this change), so I really don't think it's the same failure case. We've seen spurious CUDA failures off and on since I started on the team, which is part of the motivation for #11391 when I was looking into them.

@bartlettroscoe (Member) commented

Even though it's one of the same tests, the failure mode is different. That one failed when the test was starting with a CUDA device-side assert. The issue here was specifically tests that hung at the end of their run (MPI_finalize(), perhaps).

Okay, that is right. Thanks for clarifying.

But that means we still have a randomly failing test Intrepid2_unit-test_MonolithicExecutable_Intrepid2_Tests_MPI_1 taking down PR builds as shown in #11579 (comment).

@rppawlo (Contributor) commented May 31, 2023

@sebrowne - I believe all the tests mentioned in this ticket have now been fixed or disabled. You could try upgrading the compiler stack again.

github-actions bot commented Jun 1, 2024

This issue has had no activity for 365 days and is marked for closure. It will be closed after an additional 30 days of inactivity.
If you would like to keep this issue open please add a comment and/or remove the MARKED_FOR_CLOSURE label.
If this issue should be kept open even with no activity beyond the time limits you can add the label DO_NOT_AUTOCLOSE.
If it is ok for this issue to be closed, feel free to go ahead and close it. Please do not add any comments or change any labels or otherwise touch this issue unless your intention is to reset the inactivity counter for an additional year.

@github-actions github-actions bot added the MARKED_FOR_CLOSURE Issue or PR is marked for auto-closure by the GitHub Actions bot. label Jun 1, 2024

github-actions bot commented Jul 3, 2024

This issue was closed due to inactivity for 395 days.

@github-actions github-actions bot added the CLOSED_DUE_TO_INACTIVITY Issue or PR has been closed by the GitHub Actions bot due to inactivity. label Jul 3, 2024
@github-actions github-actions bot closed this as not planned Jul 3, 2024