Running error on TM_Subduction_example.jl #51

Open
tyszwh opened this issue Apr 20, 2024 · 19 comments
@tyszwh

tyszwh commented Apr 20, 2024

Hi everyone,

I just started using LaMEM.jl. I tried to run TM_Subduction_example.jl under scripts/, but I got the following error.
I'm also new to Julia, although I have quite a bit of experience with Python. I'm not sure what the problem is here, or how to fix it.
Thanks a lot.

MethodError: no method matching length(::Nothing)

Closest candidates are:
  length(!Matched::Base.AsyncGenerator)
   @ Base asyncmap.jl:390
  length(!Matched::DataStructures.IntSet)
   @ DataStructures C:\Users\tyszw\.julia\packages\DataStructures\b0JVf\src\int_set.jl:191
  length(!Matched::RegexMatch)
   @ Base regex.jl:285
  ...

Stacktrace:
 [1] run_lamem_save_grid(ParamFile::String, cores::Int64; verbose::Bool, directory::String)
   @ LaMEM.Run C:\Users\tyszw\.julia\packages\LaMEM\bw6yg\src\run_lamem_save_grid.jl:88
 [2] run_lamem_save_grid
   @ C:\Users\tyszw\.julia\packages\LaMEM\bw6yg\src\run_lamem_save_grid.jl:76 [inlined]
 [3] create_initialsetup(model::Model, cores::Int64, args::String; verbose::Bool)
   @ LaMEM.LaMEM_Model C:\Users\tyszw\.julia\packages\LaMEM\bw6yg\src\LaMEM_ModelGeneration\Model.jl:294
 [4] create_initialsetup
   @ C:\Users\tyszw\.julia\packages\LaMEM\bw6yg\src\LaMEM_ModelGeneration\Model.jl:273 [inlined]
 [5] run_lamem(model::Model, cores::Int64, args::String; wait::Bool)
   @ LaMEM.LaMEM_Model C:\Users\tyszw\.julia\packages\LaMEM\bw6yg\src\LaMEM_ModelGeneration\Model.jl:200
 [6] run_lamem
   @ C:\Users\tyszw\.julia\packages\LaMEM\bw6yg\src\LaMEM_ModelGeneration\Model.jl:195 [inlined]
 [7] run_lamem(model::Model, cores::Int64)
   @ LaMEM.LaMEM_Model C:\Users\tyszw\.julia\packages\LaMEM\bw6yg\src\LaMEM_ModelGeneration\Model.jl:195
 [8] top-level scope
   @ H:\MyResearch\example\LaMEM.jl\scripts\2DSubduction.ipynb:1
@boriskaus
Member

What was the line you typed just before this error message, and are you using the latest version of LaMEM? You can determine that with

julia> ]
pkg> status
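
If the reported version is behind the latest release, one way to update is from the command line (a sketch, assuming LaMEM.jl is installed in your default Julia environment; the same can be done interactively with `up LaMEM` in pkg mode):

```shell
# Update LaMEM.jl to the latest registered release, then print its version
julia -e 'using Pkg; Pkg.update("LaMEM"); Pkg.status("LaMEM")'
```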

@tyszwh
Author

tyszwh commented Apr 20, 2024

Dear Prof. Boris Kaus,

Thanks for your reply.

I'll send you the Julia status a little later, as I'm not on that laptop (Win11) right now. I am running the code by typing the command `julia TM_Subduction_example.jl`.

In the meantime I also tested it on another server (Ubuntu 20.04) and ran into a different problem, a PETSc error (see output 1 below).
I also tried compiling LaMEM manually against PETSc 3.19.4 (without Julia). Compilation succeeded, but running it still gives a PETSc error (see output 2 below).

(@v1.10) pkg> status
Status `~/.julia/environments/v1.10/Project.toml`
  [3700c31b] GeophysicalModelGenerator v0.7.1
  [2e889f3d] LaMEM v0.3.4
  [91a5bcdd] Plots v1.40.4

Error output 1:

Saved file: Model3D.vts
(Nprocx, Nprocy, Nprocz, xc, yc, zc, nNodeX, nNodeY, nNodeZ) = (4, 1, 2, [-2000.0, -1000.0, 0.0, 1000.0, 2000.0], [-2.5, 2.5], [-660.0, -310.0, 40.0], 513, 2, 129)
Writing LaMEM marker file -> ./markers/mdb.00000000.dat
Writing LaMEM marker file -> ./markers/mdb.00000001.dat
Writing LaMEM marker file -> ./markers/mdb.00000002.dat
Writing LaMEM marker file -> ./markers/mdb.00000003.dat
Writing LaMEM marker file -> ./markers/mdb.00000004.dat
Writing LaMEM marker file -> ./markers/mdb.00000005.dat
Writing LaMEM marker file -> ./markers/mdb.00000006.dat
Writing LaMEM marker file -> ./markers/mdb.00000007.dat
--------------------------------------------------------------------------
                   Lithosphere and Mantle Evolution Model
     Compiled: Date: Jan  1 1970 - Time: 00:00:00
     Version : 2.1.3
--------------------------------------------------------------------------
        STAGGERED-GRID FINITE DIFFERENCE CANONICAL IMPLEMENTATION
--------------------------------------------------------------------------
Parsing input file : output.dat
   Adding PETSc option: -snes_ksp_ew
   Adding PETSc option: -snes_ksp_ew_rtolmax 1e-4
   Adding PETSc option: -snes_rtol 5e-3
   Adding PETSc option: -snes_atol 1e-4
   Adding PETSc option: -snes_max_it 200
   Adding PETSc option: -snes_PicardSwitchToNewton_rtol 1e-3
   Adding PETSc option: -snes_NewtonSwitchToPicard_it 20
   Adding PETSc option: -js_ksp_type fgmres
   Adding PETSc option: -js_ksp_max_it 20
   Adding PETSc option: -js_ksp_atol 1e-8
   Adding PETSc option: -js_ksp_rtol 1e-4
   Adding PETSc option: -snes_linesearch_type l2
   Adding PETSc option: -snes_linesearch_maxstep 10
   Adding PETSc option: -da_refine_y 1
Finished parsing input file
--------------------------------------------------------------------------
Scaling parameters:
   Temperature : 1000. [C/K]
   Length      : 1000. [m]
   Viscosity   : 1e+20 [Pa*s]
   Stress      : 1e+09 [Pa]
--------------------------------------------------------------------------
Time stepping parameters:
   Simulation end time          : 2000. [Myr]
   Maximum number of steps      : 400
   Time step                    : 0.001 [Myr]
   Minimum time step            : 1e-06 [Myr]
   Maximum time step            : 0.1 [Myr]
   Time step increase factor    : 0.1
   CFL criterion                : 0.5
   CFLMAX (fixed time steps)    : 0.8
   Output every [n] steps       : 10
   Output [n] initial steps     : 1
--------------------------------------------------------------------------
Grid parameters:
   Total number of cpu                  : 8
   Processor grid  [nx, ny, nz]         : [4, 1, 2]
   Fine grid cells [nx, ny, nz]         : [512, 1, 128]
   Number of cells                      :  65536
   Number of faces                      :  262784
   Maximum cell aspect ratio            :  1.56250
   Lower coordinate bounds [bx, by, bz] : [-2000., -2.5, -660.]
   Upper coordinate bounds [ex, ey, ez] : [2000., 2.5, 40.]
--------------------------------------------------------------------------
Softening laws:
--------------------------------------------------------------------------
   SoftLaw [0] : A = 0.95, APS1 = 0.1, APS2 = 0.5
--------------------------------------------------------------------------
Material parameters:
--------------------------------------------------------------------------
- Melt factor mfc = 0.000000   Phase ID : 0     --   dryPeridotite
   diffusion creep profile  : Dry_Olivine_diff_creep-Hirth_Kohlstedt_2003
   dislocation creep profile: Dry_Olivine_disl_creep-Hirth_Kohlstedt_2003
   (dens)   : rho = 3300. [kg/m^3]
   (elast)  : G = 5e+10 [Pa]  Vs = 3892.49 [m/s]
   (diff)   : Bd = 1.5e-09 [1/Pa/s]  Ed = 375000. [J/mol]  Vd = 1.45e-05 [m^3/mol]
   (disl)   : Bn = 6.22254e-16 [1/Pa^n/s]  En = 530000. [J/mol]  Vn = 1.45e-05 [m^3/mol]  n = 3.5 [ ]
   (plast)  : ch = 3e+07 [Pa]  fr = 20. [deg]  frSoftID = 0 chSoftID = 0
   (temp)   : alpha = 3e-05 [1/K]  Cp = 1000. [J/kg/K]  k = 3. [W/m/k]  A = 6.6667e-12 [W/kg]

- Melt factor mfc = 0.000000   Phase ID : 1     --   oceanCrust
   dislocation creep profile: Plagioclase_An75-Ranalli_1995
   (dens)   : rho = 3300. [kg/m^3]
   (elast)  : G = 5e+10 [Pa]  Vs = 3892.49 [m/s]
   (disl)   : Bn = 1.04578e-22 [1/Pa^n/s]  En = 238000. [J/mol]  n = 3.2 [ ]
   (plast)  : ch = 5e+06 [Pa]
   (temp)   : alpha = 3e-05 [1/K]  Cp = 1000. [J/kg/K]  k = 3. [W/m/k]  A = 2.333e-10 [W/kg]

- Melt factor mfc = 0.000000   Phase ID : 2     --   oceanicLithosphere
   diffusion creep profile  : Dry_Olivine_diff_creep-Hirth_Kohlstedt_2003
   dislocation creep profile: Dry_Olivine_disl_creep-Hirth_Kohlstedt_2003
   (dens)   : rho = 3300. [kg/m^3]
   (elast)  : G = 5e+10 [Pa]  Vs = 3892.49 [m/s]
   (diff)   : Bd = 1.5e-09 [1/Pa/s]  Ed = 375000. [J/mol]  Vd = 1.45e-05 [m^3/mol]
   (disl)   : Bn = 6.22254e-16 [1/Pa^n/s]  En = 530000. [J/mol]  Vn = 1.45e-05 [m^3/mol]  n = 3.5 [ ]
   (plast)  : ch = 3e+07 [Pa]  fr = 20. [deg]  frSoftID = 0 chSoftID = 0
   (temp)   : alpha = 3e-05 [1/K]  Cp = 1000. [J/kg/K]  k = 3. [W/m/k]  A = 6.6667e-12 [W/kg]

- Melt factor mfc = 0.000000   Phase ID : 3     --   continentalCrust
   dislocation creep profile: Quarzite-Ranalli_1995
   (dens)   : rho = 2700. [kg/m^3]
   (elast)  : G = 5e+10 [Pa]  Vs = 4303.31 [m/s]
   (disl)   : Bn = 8.63279e-20 [1/Pa^n/s]  En = 156000. [J/mol]  n = 2.4 [ ]
   (plast)  : ch = 3e+07 [Pa]  fr = 20. [deg]  frSoftID = 0 chSoftID = 0
   (temp)   : alpha = 3e-05 [1/K]  Cp = 1000. [J/kg/K]  k = 3. [W/m/k]  A = 5.3571e-10 [W/kg]

- Melt factor mfc = 0.000000   Phase ID : 4     --   continentalLithosphere
   diffusion creep profile  : Dry_Olivine_diff_creep-Hirth_Kohlstedt_2003
   dislocation creep profile: Dry_Olivine_disl_creep-Hirth_Kohlstedt_2003
   (dens)   : rho = 3300. [kg/m^3]
   (elast)  : G = 5e+10 [Pa]  Vs = 3892.49 [m/s]
   (diff)   : Bd = 1.5e-09 [1/Pa/s]  Ed = 375000. [J/mol]  Vd = 1.45e-05 [m^3/mol]
   (disl)   : Bn = 6.22254e-16 [1/Pa^n/s]  En = 530000. [J/mol]  Vn = 1.45e-05 [m^3/mol]  n = 3.5 [ ]
   (plast)  : ch = 3e+07 [Pa]  fr = 20. [deg]  frSoftID = 0 chSoftID = 0
   (temp)   : alpha = 3e-05 [1/K]  Cp = 1000. [J/kg/K]  k = 3. [W/m/k]  A = 6.6667e-12 [W/kg]

- Melt factor mfc = 0.000000   Phase ID : 5     --   air
   (dens)   : rho = 50. [kg/m^3]
   (elast)  : G = 5e+10 [Pa]  Vs = 31622.8 [m/s]
   (diff)   : eta = 1e+19 [Pa*s]  Bd = 5e-20 [1/Pa/s]
   (plast)  : ch = 1e+07 [Pa]
   (temp)   : Cp = 1e+06 [J/kg/K]  k = 100. [W/m/k]

--------------------------------------------------------------------------
--------------------------------------------------------------------------
Free surface parameters:
   Sticky air phase ID       : 5
   Initial surface level     : 0. [km]
   Erosion model             : none
   Sedimentation model       : none
   Correct marker phases     @
   Maximum surface slope     : 40. [deg]
--------------------------------------------------------------------------
Boundary condition parameters:
   No-slip boundary mask [lt rt ft bk bm tp]  : 0 0 0 0 0 0
   Open top boundary                          @
   Top boundary temperature                   : 20. [C]
   Bottom boundary temperature                : 1565. [C]
--------------------------------------------------------------------------
Solution parameters & controls:
   Gravity [gx, gy, gz]                    : [0., 0., -9.81] [m/s^2]
   Surface stabilization (FSSA)            :  1.
   Adiabatic Heating Efficiency            :  1.
   Shear heating efficiency                :  1.
   Activate temperature diffusion          @
   Compute initial guess                   @
   Use lithostatic pressure for creep      @
   Enforce zero average pressure on top    @
   Limit pressure at first iteration       @
   Minimum viscosity                       : 5e+18 [Pa*s]
   Reference viscosity (initial guess)     : 1e+20 [Pa*s]
   Universal gas constant                  : 8.31446 [J/mol/K]
   Minimum cohesion                        : 1000. [Pa]
   Max. melt fraction (viscosity, density) : 0.15
   Rheology iteration number               : 25
   Rheology iteration tolerance            : 1e-06
   Ground water level type                 : none
--------------------------------------------------------------------------
Advection parameters:
   Advection scheme              : Runge-Kutta 2-nd order
   Periodic marker advection     : 0 0 0
   Marker setup scheme           : binary files (MATLAB)
   Velocity interpolation scheme : STAG (linear)
   Marker control type           : subgrid
   Markers per cell [nx, ny, nz] : [3, 3, 3]
   Marker distribution type      : random noise
--------------------------------------------------------------------------
Loading markers in parallel from file(s) <./markers/mdb> ... done (0.0478795 sec)
--------------------------------------------------------------------------
Output parameters:
   Output file name                        : output
   Write .pvd file                         : yes
   Phase                                   @
   Density                                 @
   Total effective viscosity               @
   Creep effective viscosity               @
   Velocity                                @
   Temperature                             @
   Deviatoric strain rate second invariant @
   Accumulated Plastic Strain (APS)        @
   Plastic dissipation                     @
--------------------------------------------------------------------------
Surface output parameters:
   Write .pvd file : yes
   Topography      @
--------------------------------------------------------------------------
Preconditioner parameters:
   Matrix type                   : monolithic
   Preconditioner type           : coupled Galerkin geometric multigrid
   Global coarse grid [nx,ny,nz] : [128, 1, 32]
   Local coarse grid  [nx,ny,nz] : [32, 1, 16]
   Number of multigrid levels    :  3
--------------------------------------------------------------------------
Solver parameters specified:
   Outermost Krylov solver       : fgmres
   Solver type                   : multigrid
   Multigrid refinement in x/z
   Multigrid smoother levels KSP : chebyshev
   Multigrid smoother levels PC  : sor
   Number of smoothening steps   : 10
   Coarse level KSP              : preonly
   Coarse level PC               : lu
   Coarse level solver package   : (null)
--------------------------------------------------------------------------
============================== INITIAL GUESS =============================
--------------------------------------------------------------------------
  0 SNES Function norm 7.682568809981e+00
  0 PICARD ||F||/||F0||=1.000000e+00
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: See https://petsc.org/release/overview/linear_solve_table/ for possible LU and Cholesky solvers
[0]PETSC ERROR: Could not locate solver type superlu_dist for factorization type LU and matrix type mpiaij. Perhaps you must ./configure with --download-superlu_dist
[0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!
[0]PETSC ERROR: Option left: name:-js_ksp_converged_reason (no value)
[0]PETSC ERROR: Option left: name:-js_ksp_min_it value: 1
[0]PETSC ERROR: Option left: name:-ParamFile value: output.dat
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.18.6, Mar 30, 2023
[0]PETSC ERROR: /home/tysz/.julia/artifacts/cdcdb6c042de853df8d1ba4d8e903049163372fc/bin/LaMEM on a  named ubuntu by Unknown Sat Apr 20 06:37:17 2024
[0]PETSC ERROR: Configure options --prefix=/workspace/destdir/lib/petsc/double_real_Int32 --CC=cc --FC=gfortran --CXX=c++ --COPTFLAGS=-O3 -g --CXXOPTFLAGS=-O3 -g --FOPTFLAGS=-O3 --with-blaslapack-lib=/workspace/destdir/lib/libopenblas.so --with-blaslapack-suffix= --CFLAGS="-fno-stack-protector " --FFLAGS= --LDFLAGS=-L/workspace/destdir/lib --with-64-bit-indices=0 --with-debugging=0 --with-batch --with-mpi=1 --with-mpi-lib="[/workspace/destdir/lib/libmpifort.so,/workspace/destdir/lib/libmpi.so]" --with-mpi-include=/workspace/destdir/include --known-mpi-int64_t=0 --with-sowing=0 --with-precision=double --with-scalar-type=real --with-pthread=0 --PETSC_ARCH=x86_64-linux-gnu_double_real_Int32 --with-superlu_dist=0 --with-mumps=1 --with-mumps-lib=/workspace/destdir/lib/libdmumpspar.so --with-scalapack-lib=/workspace/destdir/lib/libscalapack32.so --with-mumps-include=/workspace/destdir/include --with-scalapack-include=/workspace/destdir/include --SOSUFFIX=double_real_Int32 --with-shared-libraries=1 --with-clean=1
[0]PETSC ERROR: #1 MatGetFactor() at /workspace/srcdir/petsc-3.18.6/src/mat/interface/matrix.c:4751
[0]PETSC ERROR: #2 PCSetUp_LU() at /workspace/srcdir/petsc-3.18.6/src/ksp/pc/impls/factor/lu/lu.c:80
[0]PETSC ERROR: #3 PCSetUp() at /workspace/srcdir/petsc-3.18.6/src/ksp/pc/interface/precon.c:994
[0]PETSC ERROR: #4 PCApply() at /workspace/srcdir/petsc-3.18.6/src/ksp/pc/interface/precon.c:438
[0]PETSC ERROR: #5 KSP_PCApply() at /workspace/srcdir/petsc-3.18.6/include/petsc/private/kspimpl.h:380
[0]PETSC ERROR: #6 KSPSolve_PREONLY() at /workspace/srcdir/petsc-3.18.6/src/ksp/ksp/impls/preonly/preonly.c:21
[0]PETSC ERROR: #7 KSPSolve_Private() at /workspace/srcdir/petsc-3.18.6/src/ksp/ksp/interface/itfunc.c:899
[0]PETSC ERROR: #8 KSPSolve() at /workspace/srcdir/petsc-3.18.6/src/ksp/ksp/interface/itfunc.c:1071
[0]PETSC ERROR: #9 PCMGMCycle_Private() at /workspace/srcdir/petsc-3.18.6/src/ksp/pc/impls/mg/mg.c:28
[0]PETSC ERROR: #10 PCMGMCycle_Private() at /workspace/srcdir/petsc-3.18.6/src/ksp/pc/impls/mg/mg.c:84
[0]PETSC ERROR: #11 PCMGMCycle_Private() at /workspace/srcdir/petsc-3.18.6/src/ksp/pc/impls/mg/mg.c:84
[0]PETSC ERROR: #12 PCApply_MG_Internal() at /workspace/srcdir/petsc-3.18.6/src/ksp/pc/impls/mg/mg.c:611
[0]PETSC ERROR: #13 PCApply_MG() at /workspace/srcdir/petsc-3.18.6/src/ksp/pc/impls/mg/mg.c:633
[0]PETSC ERROR: #14 PCApply() at /workspace/srcdir/petsc-3.18.6/src/ksp/pc/interface/precon.c:441
[0]PETSC ERROR: #15 PCStokesMGApply() at lsolve.cpp:435
[0]PETSC ERROR: #16 MatMult_Shell() at /workspace/srcdir/petsc-3.18.6/src/mat/impls/shell/shell.c:1014
[0]PETSC ERROR: #17 MatMult() at /workspace/srcdir/petsc-3.18.6/src/mat/interface/matrix.c:2583
[0]PETSC ERROR: #18 PCApply_Mat() at /workspace/srcdir/petsc-3.18.6/src/ksp/pc/impls/mat/pcmat.c:7
[0]PETSC ERROR: #19 PCApply() at /workspace/srcdir/petsc-3.18.6/src/ksp/pc/interface/precon.c:441
[0]PETSC ERROR: #20 KSP_PCApply() at /workspace/srcdir/petsc-3.18.6/include/petsc/private/kspimpl.h:380
[0]PETSC ERROR: #21 KSPFGMRESCycle() at /workspace/srcdir/petsc-3.18.6/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:152
[0]PETSC ERROR: #22 KSPSolve_FGMRES() at /workspace/srcdir/petsc-3.18.6/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:273
[0]PETSC ERROR: #23 KSPSolve_Private() at /workspace/srcdir/petsc-3.18.6/src/ksp/ksp/interface/itfunc.c:899
[0]PETSC ERROR: #24 KSPSolve() at /workspace/srcdir/petsc-3.18.6/src/ksp/ksp/interface/itfunc.c:1071
[0]PETSC ERROR: #25 SNESSolve_NEWTONLS() at /workspace/srcdir/petsc-3.18.6/src/snes/impls/ls/ls.c:210
[0]PETSC ERROR: #26 SNESSolve() at /workspace/srcdir/petsc-3.18.6/src/snes/interface/snes.c:4692
[0]PETSC ERROR: #27 LaMEMLibInitGuess() at LaMEMLib.cpp:841
[0]PETSC ERROR: #28 LaMEMLibSolve() at LaMEMLib.cpp:631
[0]PETSC ERROR: #29 LaMEMLibMain() at LaMEMLib.cpp:133
[0]PETSC ERROR: #30 main() at LaMEM.cpp:53
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -crs_ksp_type preonly
[0]PETSC ERROR: -crs_pc_factor_mat_solver_type superlu_dist
[0]PETSC ERROR: -crs_pc_type lu
[0]PETSC ERROR: -da_refine_y 1
[0]PETSC ERROR: -gmg_mg_levels_ksp_max_it 10
[0]PETSC ERROR: -gmg_mg_levels_ksp_type chebyshev
[0]PETSC ERROR: -gmg_pc_mg_cycle_type v
[0]PETSC ERROR: -gmg_pc_mg_galerkin
[0]PETSC ERROR: -gmg_pc_mg_levels 3
[0]PETSC ERROR: -gmg_pc_mg_log
[0]PETSC ERROR: -gmg_pc_mg_type multiplicative
[0]PETSC ERROR: -gmg_pc_type mg
[0]PETSC ERROR: -jp_type mg
[0]PETSC ERROR: -js_ksp_atol 1e-8
[0]PETSC ERROR: -js_ksp_converged_reason
[0]PETSC ERROR: -js_ksp_max_it 20
[0]PETSC ERROR: -js_ksp_min_it 1
[0]PETSC ERROR: -js_ksp_rtol 1e-4
[0]PETSC ERROR: -js_ksp_type fgmres
[0]PETSC ERROR: -ParamFile output.dat
[0]PETSC ERROR: -pcmat_type mono
[0]PETSC ERROR: -snes_atol 1e-4
[0]PETSC ERROR: -snes_ksp_ew
[0]PETSC ERROR: -snes_ksp_ew_rtolmax 1e-4
[0]PETSC ERROR: -snes_linesearch_maxstep 10
[0]PETSC ERROR: -snes_linesearch_type l2
[0]PETSC ERROR: -snes_max_funcs 500000
[0]PETSC ERROR: -snes_max_it 200
[0]PETSC ERROR: -snes_max_linear_solve_fail 10000
[0]PETSC ERROR: -snes_monitor
[0]PETSC ERROR: -snes_NewtonSwitchToPicard_it 20
[0]PETSC ERROR: -snes_PicardSwitchToNewton_rtol 1e-3
[0]PETSC ERROR: -snes_rtol 5e-3
[0]PETSC ERROR: -snes_stol 1e-16
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------
application called MPI_Abort(MPI_COMM_SELF, 92) - process 0
ERROR: LoadError: failed process: Process(setenv(`/home/tysz/.julia/artifacts/0ed4137b58af5c5e3797cb0c400e60ed7c308bae/bin/mpiexec -n 8 /home/tysz/.julia/artifacts/cdcdb6c042de853df8d1ba4d8e903049163372fc/bin/LaMEM -ParamFile output.dat ''`,["LD_LIBRARY_PATH=/home/tysz/software/julia-1.10.2/bin/../lib/julia:/home/tysz/.julia/artifacts/93ddb84060b49f38ec59d4b04a3109fedc4577d2/lib:/home/tysz/.julia/artifacts/e5244f73466343373a87ed1345efac5e3440fa96/lib:/home/tysz/.julia/artifacts/e5244f73466343373a87ed1345efac5e3440fa96/lib/metis/metis_Int32_Real64/lib:/home/tysz/.julia/artifacts/e5244f73466343373a87ed1345efac5e3440fa96/lib/metis/metis_Int64_Real32/lib:/home/tysz/.julia/artifacts/e5244f73466343373a87ed1345efac5e3440fa96/lib/metis/metis_Int64_Real64/lib:/home/tysz/.julia/artifacts/0ed4137b58af5c5e3797cb0c400e60ed7c308bae/lib:/home/tysz/.julia/artifacts/4e429343cedc556ef6c6046f573d3b270b232265/lib:/home/tysz/.julia/artifacts/9c535f313f10a5cbe213d4135181424781e9111d/lib:/home/tysz/miniconda3/envs/geo3/lib:/home/tysz/.julia/artifacts/f839432e3d2904a5c847b217ef0c0f489377ecc5/lib:/home/tysz/.julia/artifacts/a47753eb13a8000f65dfa44a924c85c917753031/lib:/home/tysz/.julia/artifacts/f22bbf5ffbd75d093017c5c58208111d75a86ee8/lib:/home/tysz/.julia/artifacts/caffdc55b95700105f8a8466ec7e9a8484e0363d/lib:/home/tysz/.julia/artifacts/0a6a41be79ef85f32aa7d8529d4aebf9ef8ab030/lib:/home/tysz/.julia/artifacts/d2146cd9a2156c30259e6246e9778b18b916d453/lib:/home/tysz/.julia/artifacts/52352a3b6f679226b157304cebd3d4d81539b305/lib/petsc/double_real_Int64/lib:/home/tysz/.julia/artifacts/52352a3b6f679226b157304cebd3d4d81539b305/lib/petsc/single_complex_Int32/lib:/home/tysz/.julia/artifacts/52352a3b6f679226b157304cebd3d4d81539b305/lib/petsc/single_complex_Int64/lib:/home/tysz/.julia/artifacts/52352a3b6f679226b157304cebd3d4d81539b305/lib/petsc/single_real_Int32/lib:/home/tysz/.julia/artifacts/52352a3b6f679226b157304cebd3d4d81539b305/lib/petsc/single_real_Int64/lib:/home/tysz/.julia/artifacts/52352a3
b6f679226b157304cebd3d4d81539b305/lib/petsc/double_complex_Int32/lib:/home/tysz/.julia/artifacts/52352a3b6f679226b157304cebd3d4d81539b305/lib/petsc/double_complex_Int64/lib:/home/tysz/.julia/artifacts/52352a3b6f679226b157304cebd3d4d81539b305/lib/petsc/double_real_Int32/lib:/home/tysz/.julia/artifacts/52352a3b6f679226b157304cebd3d4d81539b305/lib/petsc/double_real_Int64_deb/lib:/home/tysz/.julia/artifacts/cdcdb6c042de853df8d1ba4d8e903049163372fc/lib:/home/tysz/software/julia-1.10.2/bin/../lib/julia:/home/tysz/software/julia-1.10.2/bin/../lib", "VECLIB_MAXIMUM_THREADS=1", "OMP_NUM_THREADS=1"]), ProcessExited(92)) [92]

Stacktrace:
 [1] pipeline_error
   @ ./process.jl:565 [inlined]
 [2] run(::Cmd; wait::Bool)
   @ Base ./process.jl:480
 [3] run
   @ ./process.jl:477 [inlined]
 [4] run_lamem(ParamFile::String, cores::Int64, args::String; wait::Bool, deactivate_multithreads::Bool)
   @ LaMEM.Run ~/.julia/packages/LaMEM/bw6yg/src/run_lamem.jl:67
 [5] run_lamem
   @ ~/.julia/packages/LaMEM/bw6yg/src/run_lamem.jl:43 [inlined]
 [6] run_lamem(model::Model, cores::Int64, args::String; wait::Bool)
   @ LaMEM.LaMEM_Model ~/.julia/packages/LaMEM/bw6yg/src/LaMEM_ModelGeneration/Model.jl:206
 [7] run_lamem
   @ ~/.julia/packages/LaMEM/bw6yg/src/LaMEM_ModelGeneration/Model.jl:195 [inlined]
 [8] run_lamem(model::Model, cores::Int64)
   @ LaMEM.LaMEM_Model ~/.julia/packages/LaMEM/bw6yg/src/LaMEM_ModelGeneration/Model.jl:195
 [9] top-level scope
   @ ~/software/boris/LaMEM.jl/scripts/TM_Subduction_example.jl:233
in expression starting at /home/tysz/software/boris/LaMEM.jl/scripts/TM_Subduction_example.jl:233

Error output 2:

LaMEM -ParamFile output.dat
--------------------------------------------------------------------------
                   Lithosphere and Mantle Evolution Model
     Compiled: Date: Apr 20 2024 - Time: 06:17:55
     Version : 2.1.4
--------------------------------------------------------------------------
        STAGGERED-GRID FINITE DIFFERENCE CANONICAL IMPLEMENTATION
--------------------------------------------------------------------------
Parsing input file : output.dat
   Adding PETSc option: -snes_ksp_ew
   Adding PETSc option: -snes_ksp_ew_rtolmax 1e-4
   Adding PETSc option: -snes_rtol 5e-3
   Adding PETSc option: -snes_atol 1e-4
   Adding PETSc option: -snes_max_it 200
   Adding PETSc option: -snes_PicardSwitchToNewton_rtol 1e-3
   Adding PETSc option: -snes_NewtonSwitchToPicard_it 20
   Adding PETSc option: -js_ksp_type fgmres
   Adding PETSc option: -js_ksp_max_it 20
   Adding PETSc option: -js_ksp_atol 1e-8
   Adding PETSc option: -js_ksp_rtol 1e-4
   Adding PETSc option: -snes_linesearch_type l2
   Adding PETSc option: -snes_linesearch_maxstep 10
   Adding PETSc option: -da_refine_y 1
Finished parsing input file
--------------------------------------------------------------------------
Scaling parameters:
   Temperature : 1000. [C/K]
   Length      : 1000. [m]
   Viscosity   : 1e+20 [Pa*s]
   Stress      : 1e+09 [Pa]
--------------------------------------------------------------------------
Time stepping parameters:
   Simulation end time          : 2000. [Myr]
   Maximum number of steps      : 400
   Time step                    : 0.001 [Myr]
   Minimum time step            : 1e-06 [Myr]
   Maximum time step            : 0.1 [Myr]
   Time step increase factor    : 0.1
   CFL criterion                : 0.5
   CFLMAX (fixed time steps)    : 0.8
   Output every [n] steps       : 10
   Output [n] initial steps     : 1
--------------------------------------------------------------------------
Grid parameters:
   Total number of cpu                  : 1
   Processor grid  [nx, ny, nz]         : [1, 1, 1]
   Fine grid cells [nx, ny, nz]         : [512, 1, 128]
   Number of cells                      :  65536
   Number of faces                      :  262784
   Maximum cell aspect ratio            :  1.56250
   Lower coordinate bounds [bx, by, bz] : [-2000., -2.5, -660.]
   Upper coordinate bounds [ex, ey, ez] : [2000., 2.5, 40.]
--------------------------------------------------------------------------
Softening laws:
--------------------------------------------------------------------------
   SoftLaw [0] : A = 0.95, APS1 = 0.1, APS2 = 0.5
--------------------------------------------------------------------------
Material parameters:
--------------------------------------------------------------------------
   Phase ID : 0     --   dryPeridotite
   diffusion creep profile  : Dry_Olivine_diff_creep-Hirth_Kohlstedt_2003
   dislocation creep profile: Dry_Olivine_disl_creep-Hirth_Kohlstedt_2003
   (dens)   : rho = 3300. [kg/m^3]
   (elast)  : G = 5e+10 [Pa]  Vs = 3892.49 [m/s]
   (diff)   : Bd = 1.5e-09 [1/Pa/s]  Ed = 375000. [J/mol]  Vd = 1.45e-05 [m^3/mol]
   (disl)   : Bn = 6.22254e-16 [1/Pa^n/s]  En = 530000. [J/mol]  Vn = 1.45e-05 [m^3/mol]  n = 3.5 [ ]
   (plast)  : ch = 3e+07 [Pa]  fr = 20. [deg]  frSoftID = 0 chSoftID = 0
   (temp)   : alpha = 3e-05 [1/K]  Cp = 1000. [J/kg/K]  k = 3. [W/m/k]  A = 6.6667e-12 [W/kg]

   Phase ID : 1     --   oceanCrust
   dislocation creep profile: Plagioclase_An75-Ranalli_1995
   (dens)   : rho = 3300. [kg/m^3]
   (elast)  : G = 5e+10 [Pa]  Vs = 3892.49 [m/s]
   (disl)   : Bn = 1.04578e-22 [1/Pa^n/s]  En = 238000. [J/mol]  n = 3.2 [ ]
   (plast)  : ch = 5e+06 [Pa]
   (temp)   : alpha = 3e-05 [1/K]  Cp = 1000. [J/kg/K]  k = 3. [W/m/k]  A = 2.333e-10 [W/kg]

   Phase ID : 2     --   oceanicLithosphere
   diffusion creep profile  : Dry_Olivine_diff_creep-Hirth_Kohlstedt_2003
   dislocation creep profile: Dry_Olivine_disl_creep-Hirth_Kohlstedt_2003
   (dens)   : rho = 3300. [kg/m^3]
   (elast)  : G = 5e+10 [Pa]  Vs = 3892.49 [m/s]
   (diff)   : Bd = 1.5e-09 [1/Pa/s]  Ed = 375000. [J/mol]  Vd = 1.45e-05 [m^3/mol]
   (disl)   : Bn = 6.22254e-16 [1/Pa^n/s]  En = 530000. [J/mol]  Vn = 1.45e-05 [m^3/mol]  n = 3.5 [ ]
   (plast)  : ch = 3e+07 [Pa]  fr = 20. [deg]  frSoftID = 0 chSoftID = 0
   (temp)   : alpha = 3e-05 [1/K]  Cp = 1000. [J/kg/K]  k = 3. [W/m/k]  A = 6.6667e-12 [W/kg]

   Phase ID : 3     --   continentalCrust
   dislocation creep profile: Quarzite-Ranalli_1995
   (dens)   : rho = 2700. [kg/m^3]
   (elast)  : G = 5e+10 [Pa]  Vs = 4303.31 [m/s]
   (disl)   : Bn = 8.63279e-20 [1/Pa^n/s]  En = 156000. [J/mol]  n = 2.4 [ ]
   (plast)  : ch = 3e+07 [Pa]  fr = 20. [deg]  frSoftID = 0 chSoftID = 0
   (temp)   : alpha = 3e-05 [1/K]  Cp = 1000. [J/kg/K]  k = 3. [W/m/k]  A = 5.3571e-10 [W/kg]

   Phase ID : 4     --   continentalLithosphere
   diffusion creep profile  : Dry_Olivine_diff_creep-Hirth_Kohlstedt_2003
   dislocation creep profile: Dry_Olivine_disl_creep-Hirth_Kohlstedt_2003
   (dens)   : rho = 3300. [kg/m^3]
   (elast)  : G = 5e+10 [Pa]  Vs = 3892.49 [m/s]
   (diff)   : Bd = 1.5e-09 [1/Pa/s]  Ed = 375000. [J/mol]  Vd = 1.45e-05 [m^3/mol]
   (disl)   : Bn = 6.22254e-16 [1/Pa^n/s]  En = 530000. [J/mol]  Vn = 1.45e-05 [m^3/mol]  n = 3.5 [ ]
   (plast)  : ch = 3e+07 [Pa]  fr = 20. [deg]  frSoftID = 0 chSoftID = 0
   (temp)   : alpha = 3e-05 [1/K]  Cp = 1000. [J/kg/K]  k = 3. [W/m/k]  A = 6.6667e-12 [W/kg]

   Phase ID : 5     --   air
   (dens)   : rho = 50. [kg/m^3]
   (elast)  : G = 5e+10 [Pa]  Vs = 31622.8 [m/s]
   (diff)   : eta = 1e+19 [Pa*s]  Bd = 5e-20 [1/Pa/s]
   (plast)  : ch = 1e+07 [Pa]
   (temp)   : Cp = 1e+06 [J/kg/K]  k = 100. [W/m/k]

--------------------------------------------------------------------------
--------------------------------------------------------------------------
Free surface parameters:
   Sticky air phase ID       : 5
   Initial surface level     : 0. [km]
   Erosion model             : none
   Sedimentation model       : none
   Correct marker phases     @
   Maximum surface slope     : 40. [deg]
--------------------------------------------------------------------------
Boundary condition parameters:
   No-slip boundary mask [lt rt ft bk bm tp]  : 0 0 0 0 0 0
   Open top boundary                          @
   Top boundary temperature                   : 20. [C]
   Bottom boundary temperature                : 1565. [C]
--------------------------------------------------------------------------
Solution parameters & controls:
   Gravity [gx, gy, gz]                    : [0., 0., -9.81] [m/s^2]
   Surface stabilization (FSSA)            :  1.
   Adiabatic Heating Efficiency            :  1.
   Shear heating efficiency                :  1.
   Activate temperature diffusion          @
   Compute initial guess                   @
   Use lithostatic pressure for creep      @
   Enforce zero average pressure on top    @
   Limit pressure at first iteration       @
   Minimum viscosity                       : 5e+18 [Pa*s]
   Reference viscosity (initial guess)     : 1e+20 [Pa*s]
   Universal gas constant                  : 8.31446 [J/mol/K]
   Minimum cohesion                        : 1000. [Pa]
   Max. melt fraction (viscosity, density) : 1.
   Rheology iteration number               : 25
   Rheology iteration tolerance            : 1e-06
   Ground water level type                 : none
--------------------------------------------------------------------------
Advection parameters:
   Advection scheme              : Runge-Kutta 2-nd order
   Periodic marker advection     : 0 0 0
   Marker setup scheme           : binary files (MATLAB)
   Velocity interpolation scheme : STAG (linear)
   Marker control type           : subgrid
   Markers per cell [nx, ny, nz] : [3, 3, 3]
   Marker distribution type      : random noise
--------------------------------------------------------------------------
Loading markers in parallel from file(s) <./markers/mdb> ... done (0.0394263 sec)
--------------------------------------------------------------------------
Number of exactly empty cells: 57344
Number of cells with incorrect number of markers (nmark_x*nmark_y*nmark_z): 57344
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Problems with initial marker distribution (see the above message)
[0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!
[0]PETSC ERROR:   Option left: name:-crs_ksp_type value: preonly source: code
[0]PETSC ERROR:   Option left: name:-crs_pc_factor_mat_solver_type value: superlu_dist source: code
[0]PETSC ERROR:   Option left: name:-crs_pc_type value: lu source: code
[0]PETSC ERROR:   Option left: name:-gmg_mg_levels_ksp_max_it value: 10 source: code
[0]PETSC ERROR:   Option left: name:-gmg_mg_levels_ksp_type value: chebyshev source: code
[0]PETSC ERROR:   Option left: name:-gmg_pc_mg_cycle_type value: v source: code
[0]PETSC ERROR:   Option left: name:-gmg_pc_mg_galerkin (no value) source: code
[0]PETSC ERROR:   Option left: name:-gmg_pc_mg_levels value: 3 source: code
[0]PETSC ERROR:   Option left: name:-gmg_pc_mg_log (no value) source: code
[0]PETSC ERROR:   Option left: name:-gmg_pc_mg_type value: multiplicative source: code
[0]PETSC ERROR:   Option left: name:-gmg_pc_type value: mg source: code
[0]PETSC ERROR:   Option left: name:-jp_type value: mg source: code
[0]PETSC ERROR:   Option left: name:-js_ksp_atol value: 1e-8 source: code
[0]PETSC ERROR:   Option left: name:-js_ksp_converged_reason (no value) source: code
[0]PETSC ERROR:   Option left: name:-js_ksp_max_it value: 20 source: code
[0]PETSC ERROR:   Option left: name:-js_ksp_min_it value: 1 source: code
[0]PETSC ERROR:   Option left: name:-js_ksp_rtol value: 1e-4 source: code
[0]PETSC ERROR:   Option left: name:-js_ksp_type value: fgmres source: code
[0]PETSC ERROR:   Option left: name:-ParamFile value: output.dat source: code
[0]PETSC ERROR:   Option left: name:-pcmat_type value: mono source: code
[0]PETSC ERROR:   Option left: name:-snes_atol value: 1e-4 source: code
[0]PETSC ERROR:   Option left: name:-snes_ksp_ew (no value) source: code
[0]PETSC ERROR:   Option left: name:-snes_ksp_ew_rtolmax value: 1e-4 source: code
[0]PETSC ERROR:   Option left: name:-snes_linesearch_maxstep value: 10 source: code
[0]PETSC ERROR:   Option left: name:-snes_linesearch_type value: l2 source: code
[0]PETSC ERROR:   Option left: name:-snes_max_funcs value: 500000 source: code
[0]PETSC ERROR:   Option left: name:-snes_max_it value: 200 source: code
[0]PETSC ERROR:   Option left: name:-snes_max_linear_solve_fail value: 10000 source: code
[0]PETSC ERROR:   Option left: name:-snes_monitor (no value) source: code
[0]PETSC ERROR:   Option left: name:-snes_NewtonSwitchToPicard_it value: 20 source: code
[0]PETSC ERROR:   Option left: name:-snes_PicardSwitchToNewton_rtol value: 1e-3 source: code
[0]PETSC ERROR:   Option left: name:-snes_rtol value: 5e-3 source: code
[0]PETSC ERROR:   Option left: name:-snes_stol value: 1e-16 source: code
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.19.4, unknown
[0]PETSC ERROR: LaMEM on a  named ubuntu by tysz Sat Apr 20 06:54:43 2024
[0]PETSC ERROR: Configure options --prefix=/home/tysz/miniconda3/envs/geo3 --with-debugging=yes --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --with-shared-libraries --with-cxx-dialect=C++11 --with-make-np=8 --download-mpich=yes --download-hdf5=yes --download-mumps=yes --download-parmetis=yes --download-metis=yes --download-superlu=yes --download-hypre=yes --download-superlu_dist=yes --download-scalapack=yes --download-cmake=yes --with-zlib
[0]PETSC ERROR: #1 ADVMarkCheckMarkers() at marker.cpp:451
[0]PETSC ERROR: #2 ADVCreate() at advect.cpp:243
[0]PETSC ERROR: #3 LaMEMLibCreate() at LaMEMLib.cpp:187
[0]PETSC ERROR: #4 LaMEMLibMain() at LaMEMLib.cpp:113
[0]PETSC ERROR: #5 main() at LaMEM.cpp:53
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -crs_ksp_type preonly (source: code)
[0]PETSC ERROR: -crs_pc_factor_mat_solver_type superlu_dist (source: code)
[0]PETSC ERROR: -crs_pc_type lu (source: code)
[0]PETSC ERROR: -da_refine_y 1 (source: code)
[0]PETSC ERROR: -gmg_mg_levels_ksp_max_it 10 (source: code)
[0]PETSC ERROR: -gmg_mg_levels_ksp_type chebyshev (source: code)
[0]PETSC ERROR: -gmg_pc_mg_cycle_type v (source: code)
[0]PETSC ERROR: -gmg_pc_mg_galerkin (source: code)
[0]PETSC ERROR: -gmg_pc_mg_levels 3 (source: code)
[0]PETSC ERROR: -gmg_pc_mg_log (source: code)
[0]PETSC ERROR: -gmg_pc_mg_type multiplicative (source: code)
[0]PETSC ERROR: -gmg_pc_type mg (source: code)
[0]PETSC ERROR: -jp_type mg (source: code)
[0]PETSC ERROR: -js_ksp_atol 1e-8 (source: code)
[0]PETSC ERROR: -js_ksp_converged_reason (source: code)
[0]PETSC ERROR: -js_ksp_max_it 20 (source: code)
[0]PETSC ERROR: -js_ksp_min_it 1 (source: code)
[0]PETSC ERROR: -js_ksp_rtol 1e-4 (source: code)
[0]PETSC ERROR: -js_ksp_type fgmres (source: code)
[0]PETSC ERROR: -ParamFile output.dat (source: code)
[0]PETSC ERROR: -pcmat_type mono (source: code)
[0]PETSC ERROR: -snes_atol 1e-4 (source: code)
[0]PETSC ERROR: -snes_ksp_ew (source: code)
[0]PETSC ERROR: -snes_ksp_ew_rtolmax 1e-4 (source: code)
[0]PETSC ERROR: -snes_linesearch_maxstep 10 (source: code)
[0]PETSC ERROR: -snes_linesearch_type l2 (source: code)
[0]PETSC ERROR: -snes_max_funcs 500000 (source: code)
[0]PETSC ERROR: -snes_max_it 200 (source: code)
[0]PETSC ERROR: -snes_max_linear_solve_fail 10000 (source: code)
[0]PETSC ERROR: -snes_monitor (source: code)
[0]PETSC ERROR: -snes_NewtonSwitchToPicard_it 20 (source: code)
[0]PETSC ERROR: -snes_PicardSwitchToNewton_rtol 1e-3 (source: code)
[0]PETSC ERROR: -snes_rtol 5e-3 (source: code)
[0]PETSC ERROR: -snes_stol 1e-16 (source: code)
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------
application called MPI_Abort(MPI_COMM_SELF, 83) - process 0
[unset]: PMIU_write error; fd=-1 buf=:cmd=abort exitcode=83 message=application called MPI_Abort(MPI_COMM_SELF, 83) - process 0
:
system msg for write_line failure : Bad file descriptor

tyszwh (Author) commented Apr 20, 2024

Hi, Prof Boris Kaus

Here is the Julia package status on my laptop (tested on Win11):

Status `C:\Users\tyszw\.julia\environments\v1.10\Project.toml`
⌃ [0c46a032] DifferentialEquations v7.12.0
  [5752ebe1] GMT v1.12.2
  [3700c31b] GeophysicalModelGenerator v0.7.1
  [98e50ef6] JuliaFormatter v1.0.56
  [2e889f3d] LaMEM v0.3.4
  [2b0e0bc5] LanguageServer v4.5.1
  [5fb14364] OhMyREPL v0.5.24
  [91a5bcdd] Plots v1.40.4
  [dc215faf] ReadVTK v0.2.0
  [295af30f] Revise v3.5.14
Info Packages marked with ⌃ have new versions available and may be upgradable.

boriskaus (Member) commented Apr 20, 2024

Ok, I finally managed to look at this:

  1. The example scripts/TM_Subduction_example.jl was indeed not working as the latest version of PETSc used in LaMEM.jl does not include superlu_dist, but uses mumps instead. I have pushed an update.
  2. There should be no need for you to compile PETSc yourself, unless you are planning to run this on a large high-performance computer. For smaller servers, LaMEM.jl should be all you need.
  3. Note that you can run the example with julia> include("TM_Subduction_example.jl"), provided that you are in the correct directory.
  4. In case you want to use this with your manually compiled version of PETSc + LaMEM (as in option 2 above) you have to save a *.dat file along with markers by using the function create_initialsetup instead of run_lamem. Note that you have to indicate the number of cores on which you want to run this and use the exact same number when running it.
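For point 4, the workflow could look roughly like this (a minimal sketch based on the LaMEM.jl example scripts; the grid parameters are illustrative, and the core count is the one you must reuse later):

```julia
using LaMEM

# Build the model as in TM_Subduction_example.jl (grid, phases, etc.)
model = Model(Grid(nel=(128, 1, 64), x=[-2000, 2000], z=[-660, 40]))

# Instead of run_lamem(model, cores), write the *.dat parameter file plus
# the binary marker files for a manually compiled LaMEM. The number of
# cores given here must match the number used later with mpiexec.
cores = 4
create_initialsetup(model, cores)

# Afterwards, run the manually compiled binary with the same core count:
#   mpiexec -n 4 /path/to/LaMEM -ParamFile output.dat
```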

tyszwh (Author) commented Apr 21, 2024

Hi, Prof Boris Kaus

Thanks for it.

Now LaMEM.jl works correctly on the Linux server (following point (1) above, using TM_Subduction_example.jl).

But on Windows, the example TM_Subduction_example.jl still gives the same error as below (installed via LaMEM.jl; the same error, ERROR: LoadError: MethodError: no method matching length(::Nothing), occurs on both Windows PCs). All tests pass correctly, so this should not be a PETSc error.

julia> include("TM_Subduction_example.jl")
Saved file: Model3D.vts
ERROR: LoadError: MethodError: no method matching length(::Nothing)

Closest candidates are:
  length(::Base.AsyncGenerator)
   @ Base asyncmap.jl:390
  length(::ReadVTK.PVTKFile)
   @ ReadVTK C:\Users\tyszw\.julia\packages\ReadVTK\XpdR0\src\ReadVTK.jl:355
  length(::RegexMatch)
   @ Base regex.jl:285
  ...

Stacktrace:
  [1] run_lamem_save_grid(ParamFile::String, cores::Int64; verbose::Bool, directory::String)
    @ LaMEM.Run C:\Users\tyszw\.julia\packages\LaMEM\bw6yg\src\run_lamem_save_grid.jl:88
  [2] run_lamem_save_grid
    @ C:\Users\tyszw\.julia\packages\LaMEM\bw6yg\src\run_lamem_save_grid.jl:76 [inlined]
  [3] create_initialsetup(model::Model, cores::Int64, args::String; verbose::Bool)
    @ LaMEM.LaMEM_Model C:\Users\tyszw\.julia\packages\LaMEM\bw6yg\src\LaMEM_ModelGeneration\Model.jl:294
  [4] create_initialsetup
    @ C:\Users\tyszw\.julia\packages\LaMEM\bw6yg\src\LaMEM_ModelGeneration\Model.jl:273 [inlined]
  [5] run_lamem(model::Model, cores::Int64, args::String; wait::Bool)
    @ LaMEM.LaMEM_Model C:\Users\tyszw\.julia\packages\LaMEM\bw6yg\src\LaMEM_ModelGeneration\Model.jl:200
  [6] run_lamem
    @ C:\Users\tyszw\.julia\packages\LaMEM\bw6yg\src\LaMEM_ModelGeneration\Model.jl:195 [inlined]
  [7] run_lamem(model::Model, cores::Int64)
    @ LaMEM.LaMEM_Model C:\Users\tyszw\.julia\packages\LaMEM\bw6yg\src\LaMEM_ModelGeneration\Model.jl:195
  [8] top-level scope
    @ H:\MyResearch\code\LaMEM.jl\scripts\TM_Subduction_example.jl:233
  [9] include(fname::String)
    @ Base.MainInclude .\client.jl:489
 [10] top-level scope
    @ REPL[1]:1
in expression starting at H:\MyResearch\code\LaMEM.jl\scripts\TM_Subduction_example.jl:233

BTW, what is the highest PETSc version currently supported? Is it 3.18.6?

boriskaus (Member)

Thanks for reporting. I was able to reproduce your error, which is caused by the fact that MPI is currently broken on Windows (for the version of PETSc_jll we are using here). As a result, the direct solver mumps also does not work and we can only run this on one processor.

I am working on an automatic fix. Until that is released you have two options:

  1. Install WSL (Windows Subsystem for Linux) on your Windows machine and install Julia and LaMEM.jl there. This gives you a Linux environment and will allow you to run simulations in parallel.
  2. Modify TM_Subduction_example.jl such that it uses one processor only by: a) setting MGCoarseSolver = "mumps" and b) run_lamem(model, 1)
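Option 2 would amount to something like the following in the script (a sketch; the `Solver` keywords follow the solver settings already used in TM_Subduction_example.jl, and the multigrid level count is illustrative):

```julia
# Single-processor variant of TM_Subduction_example.jl for Windows:
model.Solver = Solver(SolverType="multigrid", MGLevels=3,
                      MGCoarseSolver="mumps")  # a) coarse-grid solver that runs sequentially

run_lamem(model, 1)                            # b) one processor only
```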

boriskaus (Member)

BTW, what is the highest Petsc version currently supported for, is it 3.18.6?

If you want to compile LaMEM manually, you need to use PETSc version 3.18.x for the current version of LaMEM, as discussed in the manual. We believe it works with 3.19.x as well, but we have not tested that extensively.
PETSc tends not to be backwards compatible due to frequent renaming of routines.

tyszwh (Author) commented Apr 22, 2024

Hi, Prof Boris Kaus

Now I can run it correctly under Linux. Looking forward to the Windows fix.
I also did a test with manual compilation: PETSc 3.18.6 works fine, but PETSc 3.19.4 runs with an error.

Thanks for your reply, this solved my problem!

boriskaus (Member)

@tyszwh I have now updated PETSc_jll to 3.19.6 and released a new LaMEM_jll that is built against it. MPI on Windows remains broken, so you cannot use MUMPS or SuperLU_DIST on Windows and can only do 1-processor runs. Yet, if you install WSL, you'll be able to do parallel runs as well, as explained above.

tyszwh (Author) commented Sep 7, 2024

Hi, Prof Boris Kaus, thanks for your work; I just got back from the field. I tried under WSL and mumps worked correctly, but it seems superlu_dist still doesn't work.

--------------------------------------------------------------------------
============================== INITIAL GUESS =============================
--------------------------------------------------------------------------
  0 SNES Function norm 7.879913188472e+00
  0 PICARD ||F||/||F0||=1.000000e+00
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: See https://petsc.org/release/overview/linear_solve_table/ for possible LU and Cholesky solvers
[0]PETSC ERROR: Could not locate solver type superlu_dist for factorization type LU and matrix type mpiaij. Perhaps you must ./configure with --download-superlu_dist
[0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!
[0]PETSC ERROR:   Option left: name:-js_ksp_converged_reason (no value) source: code
[0]PETSC ERROR:   Option left: name:-js_ksp_min_it value: 1 source: code
[0]PETSC ERROR:   Option left: name:-ParamFile value: output.dat source: code
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.19.6, Sep 28, 2023
[0]PETSC ERROR: /home/tysz/.julia/artifacts/cd461744844630cc33fbd2e7a8e795b56651d039/bin/LaMEM on a  named tysz by Unknown Sat Sep  7 10:15:03 2024
[0]PETSC ERROR: Configure options --prefix=/workspace/destdir/lib/petsc/double_real_Int32 --CC=mpicc --FC=mpif90 --CXX=mpicxx --COPTFLAGS=-O3 -g --CXXOPTFLAGS=-O3 -g --FOPTFLAGS=-O3 --with-blaslapack-lib=/workspace/destdir/lib/libopenblas.so --with-blaslapack-suffix= --CFLAGS="-fno-stack-protector " --FFLAGS=" " --LDFLAGS=-L/workspace/destdir/lib --CC_LINKER_FLAGS= --with-64-bit-indices=0 --with-debugging=0 --with-batch --with-mpi=1 --with-mpi-lib="[/workspace/destdir/lib/libmpifort.so,/workspace/destdir/lib/libmpi.so]" --with-mpi-include=/workspace/destdir/include --with-sowing=0 --with-precision=double --with-scalar-type=real --with-pthread=0 --PETSC_ARCH=x86_64-linux-gnu_double_real_Int32 --download-superlu_dist=0 --download-superlu_dist-shared=0 --download-mumps=1 --download-mumps-shared=0 --with-scalapack-lib=/workspace/destdir/lib/libscalapack32.so --with-scalapack-include=/workspace/destdir/include --SOSUFFIX=double_real_Int32 --with-shared-libraries=1 --with-clean=1
[0]PETSC ERROR: #1 MatGetFactor() at /workspace/srcdir/petsc-3.19.6/src/mat/interface/matrix.c:4764
[0]PETSC ERROR: #2 PCSetUp_LU() at /workspace/srcdir/petsc-3.19.6/src/ksp/pc/impls/factor/lu/lu.c:80
[0]PETSC ERROR: #3 PCSetUp() at /workspace/srcdir/petsc-3.19.6/src/ksp/pc/interface/precon.c:994
[0]PETSC ERROR: #4 PCApply() at /workspace/srcdir/petsc-3.19.6/src/ksp/pc/interface/precon.c:438
[0]PETSC ERROR: #5 KSP_PCApply() at /workspace/srcdir/petsc-3.19.6/include/petsc/private/kspimpl.h:381
[0]PETSC ERROR: #6 KSPSolve_PREONLY() at /workspace/srcdir/petsc-3.19.6/src/ksp/ksp/impls/preonly/preonly.c:21
[0]PETSC ERROR: #7 KSPSolve_Private() at /workspace/srcdir/petsc-3.19.6/src/ksp/ksp/interface/itfunc.c:898
[0]PETSC ERROR: #8 KSPSolve() at /workspace/srcdir/petsc-3.19.6/src/ksp/ksp/interface/itfunc.c:1070
[0]PETSC ERROR: #9 PCMGMCycle_Private() at /workspace/srcdir/petsc-3.19.6/src/ksp/pc/impls/mg/mg.c:28
[0]PETSC ERROR: #10 PCMGMCycle_Private() at /workspace/srcdir/petsc-3.19.6/src/ksp/pc/impls/mg/mg.c:84
[0]PETSC ERROR: #11 PCMGMCycle_Private() at /workspace/srcdir/petsc-3.19.6/src/ksp/pc/impls/mg/mg.c:84
[0]PETSC ERROR: #12 PCApply_MG_Internal() at /workspace/srcdir/petsc-3.19.6/src/ksp/pc/impls/mg/mg.c:611
[0]PETSC ERROR: #13 PCApply_MG() at /workspace/srcdir/petsc-3.19.6/src/ksp/pc/impls/mg/mg.c:633
[0]PETSC ERROR: #14 PCApply() at /workspace/srcdir/petsc-3.19.6/src/ksp/pc/interface/precon.c:441
[0]PETSC ERROR: #15 PCStokesMGApply() at lsolve.cpp:435
[0]PETSC ERROR: #16 MatMult_Shell() at /workspace/srcdir/petsc-3.19.6/src/mat/impls/shell/shell.c:1014
[0]PETSC ERROR: #17 MatMult() at /workspace/srcdir/petsc-3.19.6/src/mat/interface/matrix.c:2599
[0]PETSC ERROR: #18 PCApply_Mat() at /workspace/srcdir/petsc-3.19.6/src/ksp/pc/impls/mat/pcmat.c:7
[0]PETSC ERROR: #19 PCApply() at /workspace/srcdir/petsc-3.19.6/src/ksp/pc/interface/precon.c:441
[0]PETSC ERROR: #20 KSP_PCApply() at /workspace/srcdir/petsc-3.19.6/include/petsc/private/kspimpl.h:381
[0]PETSC ERROR: #21 KSPFGMRESCycle() at /workspace/srcdir/petsc-3.19.6/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:152
[0]PETSC ERROR: #22 KSPSolve_FGMRES() at /workspace/srcdir/petsc-3.19.6/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:273
[0]PETSC ERROR: #23 KSPSolve_Private() at /workspace/srcdir/petsc-3.19.6/src/ksp/ksp/interface/itfunc.c:898
[0]PETSC ERROR: #24 KSPSolve() at /workspace/srcdir/petsc-3.19.6/src/ksp/ksp/interface/itfunc.c:1070
[0]PETSC ERROR: #25 SNESSolve_NEWTONLS() at /workspace/srcdir/petsc-3.19.6/src/snes/impls/ls/ls.c:219
[0]PETSC ERROR: #26 SNESSolve() at /workspace/srcdir/petsc-3.19.6/src/snes/interface/snes.c:4652
[0]PETSC ERROR: #27 LaMEMLibInitGuess() at LaMEMLib.cpp:841
[0]PETSC ERROR: #28 LaMEMLibSolve() at LaMEMLib.cpp:631
[0]PETSC ERROR: #29 LaMEMLibMain() at LaMEMLib.cpp:133
[0]PETSC ERROR: #30 main() at LaMEM.cpp:53
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -crs_ksp_type preonly (source: code)
[0]PETSC ERROR: -crs_pc_factor_mat_solver_type superlu_dist (source: code)
[0]PETSC ERROR: -crs_pc_type lu (source: code)
[0]PETSC ERROR: -da_refine_y 1 (source: code)
[0]PETSC ERROR: -gmg_mg_levels_ksp_max_it 10 (source: code)
[0]PETSC ERROR: -gmg_mg_levels_ksp_type chebyshev (source: code)
[0]PETSC ERROR: -gmg_pc_mg_cycle_type v (source: code)
[0]PETSC ERROR: -gmg_pc_mg_galerkin (source: code)
[0]PETSC ERROR: -gmg_pc_mg_levels 3 (source: code)
[0]PETSC ERROR: -gmg_pc_mg_log (source: code)
[0]PETSC ERROR: -gmg_pc_mg_type multiplicative (source: code)
[0]PETSC ERROR: -gmg_pc_type mg (source: code)
[0]PETSC ERROR: -jp_type mg (source: code)
[0]PETSC ERROR: -js_ksp_atol 1e-8 (source: code)
[0]PETSC ERROR: -js_ksp_converged_reason (source: code)
[0]PETSC ERROR: -js_ksp_max_it 20 (source: code)
[0]PETSC ERROR: -js_ksp_min_it 1 (source: code)
[0]PETSC ERROR: -js_ksp_rtol 1e-4 (source: code)
[0]PETSC ERROR: -js_ksp_type fgmres (source: code)
[0]PETSC ERROR: -ParamFile output.dat (source: code)
[0]PETSC ERROR: -pcmat_type mono (source: code)
[0]PETSC ERROR: -snes_atol 1e-4 (source: code)
[0]PETSC ERROR: -snes_ksp_ew (source: code)
[0]PETSC ERROR: -snes_ksp_ew_rtolmax 1e-4 (source: code)
[0]PETSC ERROR: -snes_linesearch_maxstep 10 (source: code)
[0]PETSC ERROR: -snes_linesearch_type l2 (source: code)
[0]PETSC ERROR: -snes_max_funcs 500000 (source: code)
[0]PETSC ERROR: -snes_max_it 200 (source: code)
[0]PETSC ERROR: -snes_max_linear_solve_fail 10000 (source: code)
[0]PETSC ERROR: -snes_monitor (source: code)
[0]PETSC ERROR: -snes_NewtonSwitchToPicard_it 20 (source: code)
[0]PETSC ERROR: -snes_PicardSwitchToNewton_rtol 1e-3 (source: code)
[0]PETSC ERROR: -snes_rtol 5e-3 (source: code)
[0]PETSC ERROR: -snes_stol 1e-16 (source: code)
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------
application called MPI_Abort(MPI_COMM_SELF, 92) - process 0
ERROR: LoadError: failed process: Process(setenv(`/home/tysz/.julia/artifacts/0ed4137b58af5c5e3797cb0c400e60ed7c308bae/bin/mpiexec -n 8 /home/tysz/.julia/artifacts/cd461744844630cc33fbd2e7a8e795b56651d039/bin/LaMEM -ParamFile output.dat ''`,["LD_LIBRARY_PATH=/home/tysz/software/julia-1.10.2/bin/../lib/julia:/home/tysz/.julia/artifacts/93ddb84060b49f38ec59d4b04a3109fedc4577d2/lib:/home/tysz/.julia/artifacts/0ed4137b58af5c5e3797cb0c400e60ed7c308bae/lib:/home/tysz/.julia/artifacts/0a6a41be79ef85f32aa7d8529d4aebf9ef8ab030/lib:/home/tysz/.julia/artifacts/3f1a00f1080275e347bfbcf5d11aac546fc87017/lib/petsc/double_real_Int64/lib:/home/tysz/.julia/artifacts/3f1a00f1080275e347bfbcf5d11aac546fc87017/lib/petsc/single_complex_Int32/lib:/home/tysz/.julia/artifacts/3f1a00f1080275e347bfbcf5d11aac546fc87017/lib/petsc/single_complex_Int64/lib:/home/tysz/.julia/artifacts/3f1a00f1080275e347bfbcf5d11aac546fc87017/lib/petsc/single_real_Int32/lib:/home/tysz/.julia/artifacts/3f1a00f1080275e347bfbcf5d11aac546fc87017/lib/petsc/single_real_Int64/lib:/home/tysz/.julia/artifacts/3f1a00f1080275e347bfbcf5d11aac546fc87017/lib/petsc/double_complex_Int32/lib:/home/tysz/.julia/artifacts/3f1a00f1080275e347bfbcf5d11aac546fc87017/lib/petsc/double_complex_Int64/lib:/home/tysz/.julia/artifacts/3f1a00f1080275e347bfbcf5d11aac546fc87017/lib/petsc/double_real_Int32/lib:/home/tysz/.julia/artifacts/3f1a00f1080275e347bfbcf5d11aac546fc87017/lib/petsc/double_real_Int64_deb/lib:/home/tysz/.julia/artifacts/cd461744844630cc33fbd2e7a8e795b56651d039/lib:/home/tysz/software/julia-1.10.2/bin/../lib/julia:/home/tysz/software/julia-1.10.2/bin/../lib", "VECLIB_MAXIMUM_THREADS=1", "OMP_NUM_THREADS=1"]), ProcessExited(92)) [92]

Stacktrace:
 [1] pipeline_error
   @ ./process.jl:565 [inlined]
 [2] run(::Cmd; wait::Bool)
   @ Base ./process.jl:480
 [3] run
   @ ./process.jl:477 [inlined]
 [4] run_lamem(ParamFile::String, cores::Int64, args::String; wait::Bool, deactivate_multithreads::Bool)
   @ LaMEM.Run ~/.julia/packages/LaMEM/6dfaH/src/run_lamem.jl:69
 [5] run_lamem
   @ ~/.julia/packages/LaMEM/6dfaH/src/run_lamem.jl:43 [inlined]
 [6] run_lamem(model::Model, cores::Int64, args::String; wait::Bool)
   @ LaMEM.LaMEM_Model ~/.julia/packages/LaMEM/6dfaH/src/LaMEM_ModelGeneration/Model.jl:206
 [7] run_lamem
   @ ~/.julia/packages/LaMEM/6dfaH/src/LaMEM_ModelGeneration/Model.jl:195 [inlined]
 [8] run_lamem(model::Model, cores::Int64)
   @ LaMEM.LaMEM_Model ~/.julia/packages/LaMEM/6dfaH/src/LaMEM_ModelGeneration/Model.jl:195
 [9] top-level scope
   @ ~/lamem_model/model_test/compression.jl:282
in expression starting at /home/tysz/lamem_model/model_test/compression.jl:278

caused by: failed process: Process(setenv(`/home/tysz/.julia/artifacts/0ed4137b58af5c5e3797cb0c400e60ed7c308bae/bin/mpiexec -n 8 /home/tysz/.julia/artifacts/cd461744844630cc33fbd2e7a8e795b56651d039/bin/LaMEM -ParamFile output.dat '-nstep_max 2 -nstep_out 1'`,["LD_LIBRARY_PATH=/home/tysz/software/julia-1.10.2/bin/../lib/julia:/home/tysz/.julia/artifacts/93ddb84060b49f38ec59d4b04a3109fedc4577d2/lib:/home/tysz/.julia/artifacts/0ed4137b58af5c5e3797cb0c400e60ed7c308bae/lib:/home/tysz/.julia/artifacts/0a6a41be79ef85f32aa7d8529d4aebf9ef8ab030/lib:/home/tysz/.julia/artifacts/3f1a00f1080275e347bfbcf5d11aac546fc87017/lib/petsc/double_real_Int64/lib:/home/tysz/.julia/artifacts/3f1a00f1080275e347bfbcf5d11aac546fc87017/lib/petsc/single_complex_Int32/lib:/home/tysz/.julia/artifacts/3f1a00f1080275e347bfbcf5d11aac546fc87017/lib/petsc/single_complex_Int64/lib:/home/tysz/.julia/artifacts/3f1a00f1080275e347bfbcf5d11aac546fc87017/lib/petsc/single_real_Int32/lib:/home/tysz/.julia/artifacts/3f1a00f1080275e347bfbcf5d11aac546fc87017/lib/petsc/single_real_Int64/lib:/home/tysz/.julia/artifacts/3f1a00f1080275e347bfbcf5d11aac546fc87017/lib/petsc/double_complex_Int32/lib:/home/tysz/.julia/artifacts/3f1a00f1080275e347bfbcf5d11aac546fc87017/lib/petsc/double_complex_Int64/lib:/home/tysz/.julia/artifacts/3f1a00f1080275e347bfbcf5d11aac546fc87017/lib/petsc/double_real_Int32/lib:/home/tysz/.julia/artifacts/3f1a00f1080275e347bfbcf5d11aac546fc87017/lib/petsc/double_real_Int64_deb/lib:/home/tysz/.julia/artifacts/cd461744844630cc33fbd2e7a8e795b56651d039/lib:/home/tysz/software/julia-1.10.2/bin/../lib/julia:/home/tysz/software/julia-1.10.2/bin/../lib", "VECLIB_MAXIMUM_THREADS=1", "OMP_NUM_THREADS=1"]), ProcessExited(92)) [92]

boriskaus (Member)

Which versions of LaMEM and PETSc are you using?

tyszwh (Author) commented Sep 7, 2024

Hi, Prof Boris Kaus, I installed LaMEM.jl manually; no other changes. Do I need to install PETSc_jll separately?

(@v1.10) pkg> status
Status `~/.julia/environments/v1.10/Project.toml`
  [5752ebe1] GMT v1.17.0
⌃ [3700c31b] GeophysicalModelGenerator v0.7.4
  [2e889f3d] LaMEM v0.4.0 `/home/tysz/.julia/registries/LaMEM.jl#main`
  [91a5bcdd] Plots v1.40.8
  [15d6fa20] LaMEM_jll v2.1.4+0
Info Packages marked with ⌃ have new versions available and may be upgradable.

boriskaus (Member)

Does `test LaMEM` work for you? It should list all packages, including the version of PETSc that is being used.
You should not have to install this manually.

tyszwh (Author) commented Sep 7, 2024

All tests pass.
I don't know why `status` doesn't show PETSc, but I can see it during the tests.

⌅ [8fa3689e] PETSc_jll v3.19.6+0
--------------------------------------------------------------------------
Test Summary:                   | Pass  Total     Time
LaMEM.jl                        |   36     36  2m41.9s
  Julia setup                   |    1      1    13.4s
  velocity box                  |    1      1     1.2s
  phase transitions             |    1      1     7.6s
  phase diagrams                |    1      1     2.8s
  build-in geometries           |    2      2     0.3s
  run LaMEM                     |    6      6    36.7s
  read LaMEM output             |    5      5     3.5s
  run lamem mode save grid test |    2      2     0.2s
  Test mesh refinement          |    1      1    20.5s
  Read logfile                  |    1      1     0.5s
  filesize compression          |    2      2     4.0s
  GeoParams 0D rheology         |    5      5     3.5s
  examples in /scripts          |    8      8  1m07.4s
     Testing LaMEM tests passed

boriskaus (Member)

Yes, that is correct.
It is somewhat puzzling that you have issues with superlu_dist, as I am pretty sure that this is being tested.
At the beginning of the tests it should show some flags indicating which external solvers are being tested. Are they all active?

tyszwh (Author) commented Sep 7, 2024

Hi, Prof Boris Kaus, I didn't find the test information for superlu_dist.
I would like to output the logs to a file when I test LaMEM; how should I do that? I'm not particularly familiar with Julia's package manager.

boriskaus (Member)

If you scroll through the online test results you’ll see it:
https://github.com/JuliaGeodynamics/LaMEM.jl/actions/runs/10715282647

tyszwh (Author) commented Sep 17, 2024

I reinstalled Julia and LaMEM under WSL and the problem (superlu_dist) persists.

boriskaus (Member)

Apologies for the delay; this puzzled me for some time, as PETSc_jll was working fine with SuperLU_DIST. I finally discovered the reason: I only included SuperLU_DIST in the 64-bit version of PETSc_jll, but LaMEM_jll was compiled against the 32-bit version.
It will take some time to fix this, which will be done in a future release of LaMEM_jll.
