
[chg] avoid dealing with encoding and provide backwards compatibility #1

Merged
merged 1 commit into from
Dec 29, 2019

Conversation

raw-data
Contributor

While running CAPEv2 with the VirtualBox machinery, the system enters a loop: no match is found when the detected VM state is checked against the predefined set (e.g. POWEROFF, ABORTED, etc.). Setting universal_newlines=True returns the output as text instead of binary, which does the trick.
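A minimal sketch of why this loops (the state names and variable names here are illustrative, not CAPEv2's actual code): by default `subprocess` returns bytes, and in Python 3 a `bytes` value never compares equal to a `str` state name, so the machinery never sees a recognized state.

```python
import subprocess  # noqa: F401  # the real code reads VBoxManage output via subprocess

# Hypothetical set of recognized VM states (str, as in the machinery config)
states = {"poweroff", "aborted", "saved"}

raw = b"poweroff\n"   # what check_output returns by default (binary)
text = "poweroff\n"   # what it returns with universal_newlines=True (text)

# bytes vs str: no match, so the state check loops forever
assert raw.strip() not in states

# text mode: the comparison succeeds and the loop exits
assert text.strip() in states
```

Passing `universal_newlines=True` (or `text=True` on Python 3.7+) makes `subprocess` decode the output to `str` up front, so no call-site decoding is needed.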

@doomedraven
Collaborator

Amazing, thanks a lot! Since we only use KVM, it's the most tested hypervisor, so we really appreciate your collaboration. Sorry for the delay in merging.

@doomedraven doomedraven merged commit 6f51074 into kevoreilly:master Dec 29, 2019
@raw-data
Contributor Author

raw-data commented Jan 1, 2020

No stress! Keep up the great work!

doomedraven pushed a commit that referenced this pull request Jan 28, 2020
klingerko pushed a commit to klingerko/CAPEv2 that referenced this pull request Oct 14, 2021
nbargnesi added a commit to nbargnesi/CAPEv2 that referenced this pull request Feb 9, 2023
Ideally key collisions across multiple configs could combine data, so
multiple values could be generated across configs.

For example, config kevoreilly#1 generates the following:
    {"cfg1": {"key": "value1"}}

While config kevoreilly#2 generates:
    {"cfg1": {"key": "value2"}}

And their combined config becomes:
    {"cfg1": {"key": ["value1", "value2"]}}

The existing behavior is to let the last config win on key
collisions, resulting instead in:
    {"cfg1": {"key": "value2"}}

So reflect that in the new test coverage, and simplify the config update
method by forwarding the cape_name.
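The two merge behaviors the commit message contrasts can be sketched as follows (the function names `last_wins` and `combine` are hypothetical, not CAPE's `update_cape_configs` API):

```python
def last_wins(configs):
    """Existing behavior: on key collision, the last config silently wins."""
    merged = {}
    for cfg in configs:
        merged.update(cfg)  # later keys overwrite earlier ones
    return merged

def combine(configs):
    """Ideal behavior: colliding keys accumulate their values into a list."""
    merged = {}
    for cfg in configs:
        for key, value in cfg.items():
            if key in merged:
                prev = merged[key]
                # promote a scalar to a list on first collision, then append
                merged[key] = (prev if isinstance(prev, list) else [prev]) + [value]
            else:
                merged[key] = value
    return merged

cfg1 = {"key": "value1"}
cfg2 = {"key": "value2"}
assert last_wins([cfg1, cfg2]) == {"key": "value2"}
assert combine([cfg1, cfg2]) == {"key": ["value1", "value2"]}
```

The last-wins variant is what the tests below pin down, with a log message flagging the potential data loss on collision.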
nbargnesi added a commit to nbargnesi/CAPEv2 that referenced this pull request Feb 9, 2023
doomedraven pushed a commit that referenced this pull request Feb 9, 2023
…ial for data loss (#1357)

* add config update test coverage, part 1

These tests are known to pass based on the current implementation of
update_cape_configs.

* add config update test coverage, part 2

This test is known to fail based on the current implementation of
update_cape_configs.

* run python-package workflow on all branches

* simplify update_cape_configs and update tests

Ideally key collisions across multiple configs could combine data, so
multiple values could be generated across configs.

For example, config #1 generates the following:
    {"cfg1": {"key": "value1"}}

While config #2 generates:
    {"cfg1": {"key": "value2"}}

And their combined config becomes:
    {"cfg1": {"key": ["value1", "value2"]}}

The existing behavior is to let the last config win on key
collisions, resulting instead in:
    {"cfg1": {"key": "value2"}}

So reflect that in the new test coverage, and simplify the config update
method by forwarding the cape_name.

* log the potential for data loss

* less ruff now

* Revert "run python-package workflow on all branches"

This reverts commit 4868ffa.

* link test coverage to #1357

---------

Co-authored-by: Nick Bargnesi
enzok pushed a commit to enzok/CAPEv2 that referenced this pull request Apr 13, 2023
doomedraven pushed a commit that referenced this pull request Apr 16, 2023
@para0x0dise para0x0dise mentioned this pull request Jul 1, 2023
@YungBinary YungBinary mentioned this pull request Nov 11, 2024