
Bad optimization on INTEL/MIT datasets #87

Closed
EmilioOlivastri opened this issue Feb 6, 2023 · 7 comments

@EmilioOlivastri

Hello Yun Chang!

I am working on robust optimization for SLAM and was trying to use RPGO to compare against other methods on standard datasets (M3500, INTEL, CSAIL, MIT).

I used the snippet in the examples folder (RpgoReadG2o.cpp), but unfortunately I am unable to converge to a reasonable solution on any dataset except M3500 (the classic version, without additional noise).
Since I have not yet added any outliers to the datasets, I disabled outlier rejection as explained in #84 and used the same params for running GNC only.
(screenshots of the INTEL and MIT optimization results)

These are the results I obtained from the optimization. Do you have any idea why this happens and how to fix it?
The params I used are:

  • pcm_t = pcm_R = 10.0;
  • params.setGncInlierCostThresholdsAtProbability(0.9999);

Thanks in advance for your time,
EO.

@yunzc
Member

yunzc commented Feb 6, 2023

Could you paste the changes you made here?
If you set pcm_t and pcm_R to -1 and GNC to 0, it really should be just a call to the gtsam implementation of Gauss-Newton or Levenberg-Marquardt without any outlier rejection.

Note that, by default, it currently only optimizes when a loop closure is added (just to save computation), so be careful with that. Alternatively, call the forceUpdate function, which bypasses that condition and forces the optimization to trigger.

Best,
Y

@EmilioOlivastri
Author

EmilioOlivastri commented Feb 6, 2023

First of all, thanks for your quick response.

I applied minimal modifications to the test code in examples, as shown in the code snippet pasted here:

###############################

// Usage:
// ./RpgoReadG2o <some_2d_g2o_file> <pcm_t_simple_thresh> <pcm_R_simple_thresh> <output>

int main(int argc, char* argv[]) {
  gtsam::GraphAndValues graphNValues;
  std::string g2ofile = argv[1];
  double pcm_t = atof(argv[2]);
  double pcm_R = atof(argv[3]);

  std::string output_folder;
  if (argc > 4) output_folder = argv[4];

  RobustSolverParams params;
  Verbosity verbosity = Verbosity::VERBOSE;

  graphNValues = gtsam::load2D(g2ofile,
                               gtsam::SharedNoiseModel(),
                               0,
                               false,
                               true,
                               gtsam::NoiseFormatG2O);

  params.setPcmSimple2DParams(pcm_t, pcm_R, verbosity);
  params.setGncInlierCostThresholdsAtProbability(0.9999999);
  Simulate<gtsam::Pose2>(graphNValues, params, output_folder);

  return 0;
}

Best,
EO.

@yunzc
Member

yunzc commented Feb 7, 2023

Ah, I see. If you are using 2D g2o files, you should probably turn off diagonal damping: https://github.com/MIT-SPARK/Kimera-RPGO/blob/master/include/KimeraRPGO/SolverParams.h#L91

just add a line:

params.setLmDiagonalDamping(false);

Sorry we have mostly been testing with the 3D case and forgot that diagonal damping was set to true by default.

Updated example script in 8c5e163

@EmilioOlivastri
Author

Yes, I am first trying the 2D case, and then I am going to move to the 3D datasets.

Still, I added the line you suggested, but the results remain unchanged: it still works on M3500, but not on INTEL or MIT.
Also, I was wondering why disabling diagonal damping in LM should make it work in 2D but not in 3D.

After doing some research, the problem could be in the underlying GTSAM solver. Reading this issue, gtsam_issue: without using RPGO, but directly using GTSAM's implementation of GNC + GaussNewton with the params specified in the issue, it converges as expected, while GNC + LM never converges, no matter how many iterations. So the reason behind RPGO's poor convergence on these datasets could be that it uses GNC + LM.

Unfortunately, I am not able to test RPGO + GNC + GN because it gives me the following error:

#####################
PCM spin took 44 milliseconds. Detected 256 total loop closures with 256 inliers.
Running GN
setMaxIterations: changing the max nr of iters might lead to less accurate solutions and is not recommended!
Initial error: 2.57486e+06
terminate called after throwing an instance of 'gtsam::IndeterminantLinearSystemException'
what():
Indeterminant linear system detected while working near variable
923 (Symbol: 923).

Thrown when a linear system is ill-posed. The most common cause for this
error is having underconstrained variables. Mathematically, the system is
underdetermined. See the GTSAM Doxygen documentation at
http://borg.cc.gatech.edu/ on gtsam::IndeterminantLinearSystemException for
more information.

##########################

Best,
EO

@yunzc
Member

yunzc commented Feb 8, 2023

Re: diagonal damping:
Good question. I don't have a good explanation for this, but in my experience when testing on 2D datasets like INTEL and MIT, LM usually struggles a lot with diagonal damping for some initializations.

Initial error: 2.57486e+06, values: 1228
iter      cost      cost_change    lambda  success iter_time
   0  5.996935e+08   -5.97e+08    1.00e-05     1    8.76e-03
iter      cost      cost_change    lambda  success iter_time
   0  1.049264e+08   -1.02e+08    1.00e-04     1    7.88e-03
iter      cost      cost_change    lambda  success iter_time
   0  1.473695e+07   -1.22e+07    1.00e-03     1    7.03e-03
iter      cost      cost_change    lambda  success iter_time
   0      inf   -1.22e+07    1.00e-02     0    4.03e-03
iter      cost      cost_change    lambda  success iter_time
   0      inf   -1.22e+07    1.00e-01     0    3.39e-03
iter      cost      cost_change    lambda  success iter_time
   0      inf   -1.22e+07    1.00e+00     0    3.24e-03
iter      cost      cost_change    lambda  success iter_time
   0      inf   -1.22e+07    1.00e+01     0    3.12e-03
iter      cost      cost_change    lambda  success iter_time
   0      inf   -1.22e+07    1.00e+02     0    3.41e-03
iter      cost      cost_change    lambda  success iter_time
   0      inf   -1.22e+07    1.00e+03     0    3.12e-03
iter      cost      cost_change    lambda  success iter_time
   0      inf   -1.22e+07    1.00e+04     0    3.11e-03
Warning:  Levenberg-Marquardt giving up because cannot decrease error with maximum lambda

If you give it a different initialization though, for example doing this:

template <class T>
void Simulate(gtsam::GraphAndValues gv,
              RobustSolverParams params,
              std::string output_folder) {
  gtsam::NonlinearFactorGraph nfg = *gv.first;
  gtsam::Values values = *gv.second;
  gtsam::Values updated_values = values;

  for (const auto& factor : nfg) {
    // convert to between factor
    if (updated_values.exists(factor->front())) {
      const gtsam::BetweenFactor<T>& btwn =
          *boost::dynamic_pointer_cast<gtsam::BetweenFactor<T> >(factor);
      updated_values.update(
          factor->back(),
          updated_values.at<T>(factor->front()).compose(btwn.measured()));
    }
  }

  std::unique_ptr<RobustSolver> pgo =
      KimeraRPGO::make_unique<RobustSolver>(params);
  pgo->update(nfg, updated_values);
}

It works better (see the linked screenshots of the initial guess and the optimized result):
https://drive.google.com/file/d/1gQrj6hLvQLN8ai5Ssptp_90X7QdBBu5h/view?usp=share_link

Initial error: 2.99913e+10, values: 1228
iter      cost      cost_change    lambda  success iter_time
   0  2.352054e+10    6.47e+09    1.00e-05     1    7.86e-03
   1  6.636590e+08    2.29e+10    1.00e-06     1    6.92e-03
   2  1.382104e+08    5.25e+08    1.00e-07     1    6.97e-03
   3  1.867583e+07    1.20e+08    1.00e-08     1    7.08e-03
   4  5.292067e+05    1.81e+07    1.00e-09     1    6.62e-03
   5  5.417554e+07   -5.36e+07    1.00e-10     1    6.76e-03
   5  3.926288e+06   -3.40e+06    1.00e-09     1    6.79e-03
   5  1.080691e+05    4.21e+05    1.00e-08     1    6.55e-03
   6  2.046609e+06   -1.94e+06    1.00e-09     1    6.62e-03
   6  1.021105e+06   -9.13e+05    1.00e-08     1    6.52e-03
   6  1.934364e+04    8.87e+04    1.00e-07     1    6.48e-03
   7  1.017504e+06   -9.98e+05    1.00e-08     1    6.78e-03
   7  1.002376e+04    9.32e+03    1.00e-07     1    6.67e-03
   8  1.085832e+06   -1.08e+06    1.00e-08     1    6.44e-03
   8  9.708335e+03    3.15e+02    1.00e-07     1    7.15e-03
   9  1.167896e+06   -1.16e+06    1.00e-08     1    6.59e-03
   9  1.125452e+04   -1.55e+03    1.00e-07     1    6.36e-03
   9  1.185263e+03    8.52e+03    1.00e-06     1    6.61e-03
  10  8.202854e+03   -7.02e+03    1.00e-07     1    7.07e-03
  10  3.747139e+02    8.11e+02    1.00e-06     1    6.63e-03
  11  7.601835e+03   -7.23e+03    1.00e-07     1    6.35e-03
  11  2.579313e+02    1.17e+02    1.00e-06     1    6.57e-03
  12  7.555064e+03   -7.30e+03    1.00e-07     1    7.08e-03
  12  2.353084e+02    2.26e+01    1.00e-06     1    6.84e-03
  13  7.648129e+03   -7.41e+03    1.00e-07     1    6.79e-03
  13  2.315212e+02    3.79e+00    1.00e-06     1    6.60e-03
  14  7.777659e+03   -7.55e+03    1.00e-07     1    7.24e-03
  14  2.322781e+02   -7.57e-01    1.00e-06     1    6.99e-03
  14  1.330701e+02    9.85e+01    1.00e-05     1    6.30e-03
  15  1.975947e+02   -6.45e+01    1.00e-06     1    6.64e-03
  15  1.237218e+02    9.35e+00    1.00e-05     1    7.17e-03
  16  1.898009e+02   -6.61e+01    1.00e-06     1    7.05e-03
  16  1.223663e+02    1.36e+00    1.00e-05     1    6.42e-03
  17  1.879509e+02   -6.56e+01    1.00e-06     1    6.46e-03
  17  1.220828e+02    2.83e-01    1.00e-05     1    6.95e-03
  18  1.875774e+02   -6.55e+01    1.00e-06     1    6.92e-03
  18  1.220106e+02    7.22e-02    1.00e-05     1    6.46e-03
  19  1.875825e+02   -6.56e+01    1.00e-06     1    6.77e-03
  19  1.219887e+02    2.19e-02    1.00e-05     1    6.87e-03
  20  1.876854e+02   -6.57e+01    1.00e-06     1    6.94e-03
  20  1.219796e+02    9.13e-03    1.00e-05     1    7.07e-03
  21  1.878136e+02   -6.58e+01    1.00e-06     1    6.68e-03
  21  1.219737e+02    5.84e-03    1.00e-05     1    6.82e-03
  22  1.879485e+02   -6.60e+01    1.00e-06     1    6.98e-03
  22  1.219688e+02    4.99e-03    1.00e-05     1    6.70e-03
  23  1.880850e+02   -6.61e+01    1.00e-06     1    6.99e-03
  23  1.219640e+02    4.76e-03    1.00e-05     1    6.83e-03
  24  1.882221e+02   -6.63e+01    1.00e-06     1    6.94e-03
  24  1.219593e+02    4.70e-03    1.00e-05     1    6.67e-03
  25  1.883592e+02   -6.64e+01    1.00e-06     1    6.60e-03
  25  1.219546e+02    4.68e-03    1.00e-05     1    7.03e-03
  26  1.884965e+02   -6.65e+01    1.00e-06     1    6.89e-03
  26  1.219499e+02    4.67e-03    1.00e-05     1    6.72e-03
  27  1.886338e+02   -6.67e+01    1.00e-06     1    6.61e-03
  27  1.219453e+02    4.67e-03    1.00e-05     1    6.78e-03
  28  1.887711e+02   -6.68e+01    1.00e-06     1    6.87e-03
  28  1.219406e+02    4.66e-03    1.00e-05     1    6.70e-03
  29  1.889085e+02   -6.70e+01    1.00e-06     1    6.55e-03
  29  1.219359e+02    4.66e-03    1.00e-05     1    6.89e-03
  30  1.890459e+02   -6.71e+01    1.00e-06     1    6.97e-03
  30  1.219313e+02    4.65e-03    1.00e-05     1    6.42e-03
  31  1.891833e+02   -6.73e+01    1.00e-06     1    6.46e-03
  31  1.219266e+02    4.65e-03    1.00e-05     1    6.73e-03
  32  1.893208e+02   -6.74e+01    1.00e-06     1    6.84e-03
  32  1.219220e+02    4.64e-03    1.00e-05     1    6.45e-03
  33  1.894584e+02   -6.75e+01    1.00e-06     1    6.67e-03
  33  1.219174e+02    4.64e-03    1.00e-05     1    6.70e-03
  34  1.895960e+02   -6.77e+01    1.00e-06     1    6.68e-03
  34  1.219127e+02    4.63e-03    1.00e-05     1    6.69e-03
  35  1.897337e+02   -6.78e+01    1.00e-06     1    6.63e-03
  35  1.219081e+02    4.63e-03    1.00e-05     1    6.48e-03
  36  1.898714e+02   -6.80e+01    1.00e-06     1    6.50e-03
  36  1.219035e+02    4.62e-03    1.00e-05     1    7.14e-03
  37  1.900091e+02   -6.81e+01    1.00e-06     1    6.60e-03
  37  1.218989e+02    4.62e-03    1.00e-05     1    6.40e-03
  38  1.901470e+02   -6.82e+01    1.00e-06     1    6.82e-03
  38  1.218943e+02    4.61e-03    1.00e-05     1    6.58e-03
  39  1.902848e+02   -6.84e+01    1.00e-06     1    6.51e-03
  39  1.218897e+02    4.60e-03    1.00e-05     1    6.69e-03
  40  1.904228e+02   -6.85e+01    1.00e-06     1    6.99e-03
  40  1.218851e+02    4.60e-03    1.00e-05     1    6.63e-03
  41  1.905607e+02   -6.87e+01    1.00e-06     1    6.44e-03
  41  1.218805e+02    4.59e-03    1.00e-05     1    6.67e-03
  42  1.906988e+02   -6.88e+01    1.00e-06     1    7.03e-03
  42  1.218759e+02    4.59e-03    1.00e-05     1    6.99e-03
  43  1.908369e+02   -6.90e+01    1.00e-06     1    7.48e-03
  43  1.218713e+02    4.58e-03    1.00e-05     1    6.65e-03
  44  1.909750e+02   -6.91e+01    1.00e-06     1    7.09e-03
  44  1.218667e+02    4.58e-03    1.00e-05     1    6.84e-03
  45  1.911133e+02   -6.92e+01    1.00e-06     1    7.69e-03
  45  1.218621e+02    4.57e-03    1.00e-05     1    6.44e-03
  46  1.912515e+02   -6.94e+01    1.00e-06     1    7.92e-03
  46  1.218576e+02    4.57e-03    1.00e-05     1    6.74e-03
  47  1.913899e+02   -6.95e+01    1.00e-06     1    7.32e-03
  47  1.218530e+02    4.56e-03    1.00e-05     1    6.85e-03
  48  1.915283e+02   -6.97e+01    1.00e-06     1    6.73e-03
  48  1.218485e+02    4.56e-03    1.00e-05     1    7.45e-03
  49  1.916667e+02   -6.98e+01    1.00e-06     1    6.85e-03
  49  1.218439e+02    4.55e-03    1.00e-05     1    7.16e-03
  50  1.918053e+02   -7.00e+01    1.00e-06     1    6.78e-03
  50  1.218394e+02    4.54e-03    1.00e-05     1    6.87e-03
  51  1.919438e+02   -7.01e+01    1.00e-06     1    6.49e-03
  51  1.218348e+02    4.54e-03    1.00e-05     1    7.29e-03
  52  1.920825e+02   -7.02e+01    1.00e-06     1    7.04e-03
  52  1.218303e+02    4.53e-03    1.00e-05     1    6.78e-03
  53  1.922212e+02   -7.04e+01    1.00e-06     1    7.06e-03
  53  1.218258e+02    4.53e-03    1.00e-05     1    7.19e-03
  54  1.923599e+02   -7.05e+01    1.00e-06     1    6.93e-03
  54  1.218212e+02    4.52e-03    1.00e-05     1    6.89e-03
  55  1.924988e+02   -7.07e+01    1.00e-06     1    6.56e-03
  55  1.218167e+02    4.52e-03    1.00e-05     1    7.04e-03
  56  1.926377e+02   -7.08e+01    1.00e-06     1    7.00e-03
  56  1.218122e+02    4.51e-03    1.00e-05     1    7.06e-03
  57  1.927766e+02   -7.10e+01    1.00e-06     1    7.79e-03
  57  1.218077e+02    4.51e-03    1.00e-05     1    7.05e-03
  58  1.929156e+02   -7.11e+01    1.00e-06     1    6.58e-03
  58  1.218032e+02    4.50e-03    1.00e-05     1    7.43e-03
  59  1.930547e+02   -7.13e+01    1.00e-06     1    6.95e-03
  59  1.217987e+02    4.49e-03    1.00e-05     1    7.11e-03
  60  1.931938e+02   -7.14e+01    1.00e-06     1    7.24e-03
  60  1.217942e+02    4.49e-03    1.00e-05     1    6.73e-03
  61  1.933330e+02   -7.15e+01    1.00e-06     1    6.52e-03
  61  1.217897e+02    4.48e-03    1.00e-05     1    6.74e-03
  62  1.934723e+02   -7.17e+01    1.00e-06     1    6.97e-03
  62  1.217853e+02    4.48e-03    1.00e-05     1    6.85e-03
  63  1.936116e+02   -7.18e+01    1.00e-06     1    6.51e-03
  63  1.217808e+02    4.47e-03    1.00e-05     1    7.36e-03
  64  1.937510e+02   -7.20e+01    1.00e-06     1    6.26e-03
  64  1.217763e+02    4.47e-03    1.00e-05     1    6.54e-03
  65  1.938904e+02   -7.21e+01    1.00e-06     1    7.08e-03
  65  1.217719e+02    4.46e-03    1.00e-05     1    7.30e-03
  66  1.940299e+02   -7.23e+01    1.00e-06     1    6.72e-03
  66  1.217674e+02    4.46e-03    1.00e-05     1    7.04e-03
  67  1.941695e+02   -7.24e+01    1.00e-06     1    9.47e-03
  67  1.217630e+02    4.45e-03    1.00e-05     1    8.04e-03
  68  1.943091e+02   -7.25e+01    1.00e-06     1    8.22e-03
  68  1.217585e+02    4.45e-03    1.00e-05     1    7.23e-03
  69  1.944488e+02   -7.27e+01    1.00e-06     1    7.32e-03
  69  1.217541e+02    4.44e-03    1.00e-05     1    7.24e-03
  70  1.945885e+02   -7.28e+01    1.00e-06     1    7.02e-03
  70  1.217496e+02    4.43e-03    1.00e-05     1    7.61e-03
  71  1.947284e+02   -7.30e+01    1.00e-06     1    6.88e-03
  71  1.217452e+02    4.43e-03    1.00e-05     1    6.85e-03
  72  1.948682e+02   -7.31e+01    1.00e-06     1    7.46e-03
  72  1.217408e+02    4.42e-03    1.00e-05     1    7.11e-03
  73  1.950082e+02   -7.33e+01    1.00e-06     1    7.84e-03
  73  1.217364e+02    4.42e-03    1.00e-05     1    6.88e-03
  74  1.951482e+02   -7.34e+01    1.00e-06     1    6.58e-03
  74  1.217320e+02    4.41e-03    1.00e-05     1    6.92e-03
  75  1.952882e+02   -7.36e+01    1.00e-06     1    7.13e-03
  75  1.217275e+02    4.41e-03    1.00e-05     1    6.89e-03
  76  1.954283e+02   -7.37e+01    1.00e-06     1    6.50e-03
  76  1.217231e+02    4.40e-03    1.00e-05     1    6.80e-03
  77  1.955685e+02   -7.38e+01    1.00e-06     1    6.78e-03
  77  1.217187e+02    4.40e-03    1.00e-05     1    6.99e-03
  78  1.957088e+02   -7.40e+01    1.00e-06     1    6.67e-03
  78  1.217144e+02    4.39e-03    1.00e-05     1    7.02e-03
  79  1.958491e+02   -7.41e+01    1.00e-06     1    6.75e-03
  79  1.217100e+02    4.39e-03    1.00e-05     1    7.19e-03
  80  1.959894e+02   -7.43e+01    1.00e-06     1    6.59e-03
  80  1.217056e+02    4.38e-03    1.00e-05     1    7.00e-03
  81  1.961299e+02   -7.44e+01    1.00e-06     1    6.82e-03
  81  1.217012e+02    4.37e-03    1.00e-05     1    6.35e-03
  82  1.962703e+02   -7.46e+01    1.00e-06     1    6.60e-03
  82  1.216968e+02    4.37e-03    1.00e-05     1    7.11e-03
  83  1.964109e+02   -7.47e+01    1.00e-06     1    6.84e-03
  83  1.216925e+02    4.36e-03    1.00e-05     1    6.34e-03
  84  1.965515e+02   -7.49e+01    1.00e-06     1    6.46e-03
  84  1.216881e+02    4.36e-03    1.00e-05     1    7.11e-03
  85  1.966921e+02   -7.50e+01    1.00e-06     1    6.81e-03
  85  1.216838e+02    4.35e-03    1.00e-05     1    6.33e-03
  86  1.968329e+02   -7.51e+01    1.00e-06     1    6.41e-03
  86  1.216794e+02    4.35e-03    1.00e-05     1    7.08e-03
  87  1.969737e+02   -7.53e+01    1.00e-06     1    7.03e-03
  87  1.216751e+02    4.34e-03    1.00e-05     1    6.38e-03
  88  1.971145e+02   -7.54e+01    1.00e-06     1    6.44e-03
  88  1.216707e+02    4.34e-03    1.00e-05     1    6.93e-03
  89  1.972554e+02   -7.56e+01    1.00e-06     1    7.22e-03
  89  1.216664e+02    4.33e-03    1.00e-05     1    6.89e-03
  90  1.973964e+02   -7.57e+01    1.00e-06     1    6.49e-03
  90  1.216621e+02    4.33e-03    1.00e-05     1    6.93e-03
  91  1.975374e+02   -7.59e+01    1.00e-06     1    6.86e-03
  91  1.216578e+02    4.32e-03    1.00e-05     1    6.72e-03
  92  1.976785e+02   -7.60e+01    1.00e-06     1    6.75e-03
  92  1.216535e+02    4.32e-03    1.00e-05     1    6.90e-03
  93  1.978196e+02   -7.62e+01    1.00e-06     1    7.35e-03
  93  1.216491e+02    4.31e-03    1.00e-05     1    6.20e-03
  94  1.979608e+02   -7.63e+01    1.00e-06     1    6.62e-03
  94  1.216448e+02    4.30e-03    1.00e-05     1    6.54e-03
  95  1.981020e+02   -7.65e+01    1.00e-06     1    7.05e-03
  95  1.216405e+02    4.30e-03    1.00e-05     1    6.30e-03
  96  1.982433e+02   -7.66e+01    1.00e-06     1    6.53e-03
  96  1.216362e+02    4.29e-03    1.00e-05     1    6.45e-03
  97  1.983847e+02   -7.67e+01    1.00e-06     1    6.89e-03
  97  1.216320e+02    4.29e-03    1.00e-05     1    6.55e-03
  98  1.985261e+02   -7.69e+01    1.00e-06     1    6.62e-03
  98  1.216277e+02    4.28e-03    1.00e-05     1    6.70e-03
  99  1.986676e+02   -7.70e+01    1.00e-06     1    7.09e-03
  99  1.216234e+02    4.28e-03    1.00e-05     1    6.52e-03

@yunzc
Member

yunzc commented Feb 8, 2023

Re: gtsam issue.

I will follow this more closely.

@EmilioOlivastri
Author

Thanks, Yun Chang! Using the relative transformations from the odometry measurements, instead of the initial estimate given by the vertices, produces the expected convergence behavior.
This is probably because the initial guess obtained by composing the relative measurements is much closer to the optimal solution than the initial guess stored in the dataset's vertex info.

I think the issue can be closed, since it is more related to a problem with GTSAM than to RPGO itself.
Thanks a lot for your quick responses,

Kind Regards,
EO.

@yunzc yunzc closed this as completed Feb 8, 2023