
Fix for random state passing #269

Open · wants to merge 6 commits into main
Conversation

nicl-nno
Collaborator

FEDOT-related fix for optimiser reproducibility (currently, the seed is only taken into account by runs started from the GOLEM() class).
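For context, a minimal sketch of the kind of seed handling this PR is about (class and function names here are illustrative, not GOLEM's actual API): a seed accepted by the optimiser's constructor resets the global RNG, so a run is reproducible regardless of which entry point created the optimiser.

```python
import random


def set_random_seed(seed):
    """Reset the global RNG so subsequent random draws are deterministic."""
    if seed is not None:
        random.seed(seed)


class Optimiser:
    """Toy optimiser whose result depends on RNG state: it keeps the best
    of several randomly generated candidate scores."""

    def __init__(self, seed=None):
        # Seeding at construction time makes every run reproducible,
        # not only runs launched through one particular facade class.
        set_random_seed(seed)

    def optimise(self, n_candidates=10):
        return max(random.random() for _ in range(n_candidates))
```

With this arrangement, two optimisers constructed with the same seed produce identical results.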

@nicl-nno nicl-nno requested review from Nunkyl and maypink March 28, 2024 11:48
@pep8speaks

pep8speaks commented Mar 28, 2024

Hello @nicl-nno! Thanks for updating this PR. We checked the lines you've touched for PEP 8 issues, and found:

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2024-04-08 11:36:06 UTC

@codecov-commenter

codecov-commenter commented Mar 28, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 71.44%. Comparing base (68706be) to head (3799d8c).

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #269      +/-   ##
==========================================
- Coverage   72.88%   71.44%   -1.45%     
==========================================
  Files         140      140              
  Lines        8338     8341       +3     
==========================================
- Hits         6077     5959     -118     
- Misses       2261     2382     +121     


@@ -174,6 +174,23 @@ def all_historical_quality(self, metric_position: int = 0) -> List[float]:
     def show(self):
         return OptHistoryVisualizer(self)
 
+    # def analyze_online(self, url='https://fedot.onti.actcognitive.org'):
Contributor

Should this be kept?

@@ -112,8 +112,7 @@ def __init__(self,
                  # TODO: rename params to avoid confusion
                  requirements: Optional[OptimizationParameters] = None,
                  graph_generation_params: Optional[GraphGenerationParams] = None,
-                 graph_optimizer_params: Optional[AlgorithmParameters] = None,
-                 **custom_optimizer_params):
+                 graph_optimizer_params: Optional[AlgorithmParameters] = None):
Collaborator

Doesn't this break custom optimizers inherited from this class?
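To illustrate the reviewer's concern (a hypothetical sketch, not GOLEM's actual class hierarchy): once the `**custom_optimizer_params` catch-all is removed from the base `__init__`, any subclass or caller that still forwards extra keyword arguments to the base constructor gets a `TypeError`, so subclasses must consume their own extra parameters.

```python
class BaseOptimizer:
    # After the change: no **custom_optimizer_params catch-all in the signature.
    def __init__(self, graph_optimizer_params=None):
        self.graph_optimizer_params = graph_optimizer_params


class CustomOptimizer(BaseOptimizer):
    # A subclass must now declare and consume its extra parameters itself
    # instead of relying on the base class to swallow them.
    def __init__(self, graph_optimizer_params=None, my_extra_param=None):
        super().__init__(graph_optimizer_params=graph_optimizer_params)
        self.my_extra_param = my_extra_param


# Previously a call like BaseOptimizer(some_custom_flag=True) was silently
# accepted via **custom_optimizer_params; now it is rejected:
try:
    BaseOptimizer(some_custom_flag=True)
except TypeError:
    print("unexpected keyword argument is rejected")
```

Whether existing custom optimizers break therefore depends on whether they forwarded unknown kwargs to the base constructor.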

6 participants