Release 0.3.0b4 (#1030)
# Description

Please describe the change you have made.

## Checklist

- [ ] Tests added/updated.
- [ ] Run Demo Job Locally.
- [ ] Documentation updated.
- [ ] Changelogs updated in [CHANGELOG.cdf-tk.md](https://github.com/cognitedata/toolkit/blob/main/CHANGELOG.cdf-tk.md).
- [ ] Template changelogs updated in [CHANGELOG.templates.md](https://github.com/cognitedata/toolkit/blob/main/CHANGELOG.templates.md).
- [ ] Version bumped in [_version.py](https://github.com/cognitedata/toolkit/blob/main/cognite/cognite_toolkit/_version.py) and [pyproject.toml](https://github.com/cognitedata/toolkit/blob/main/pyproject.toml) per [semantic versioning](https://semver.org/).
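The version bump follows pre-release ordering (the toolkit uses PEP 440-style tags like `0.3.0b4` rather than SemVer's `0.3.0-beta.4`). A minimal sketch of why `0.3.0b4` sorts after `0.3.0b3` — the `parse` helper below is hypothetical, not part of the toolkit:

```python
import re

def parse(version: str) -> tuple:
    """Parse 'MAJOR.MINOR.PATCH[bN]' into a sortable tuple (hypothetical helper)."""
    m = re.fullmatch(r"(\d+)\.(\d+)\.(\d+)(?:b(\d+))?", version)
    if m is None:
        raise ValueError(f"unrecognized version: {version!r}")
    major, minor, patch, beta = m.groups()
    # A final release (no beta tag) sorts after every beta of the same version.
    return (int(major), int(minor), int(patch), beta is None, int(beta or 0))

assert parse("0.3.0b3") < parse("0.3.0b4")  # this release bumps the beta number
assert parse("0.3.0b4") < parse("0.3.0")    # and still precedes the final 0.3.0
```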
ronpal authored Sep 30, 2024
2 parents 431b193 + 72537fc commit 8cf9a82
Showing 35 changed files with 427 additions and 51 deletions.
4 changes: 2 additions & 2 deletions .pre-commit-config.yaml
@@ -12,10 +12,10 @@ repos:
- --fixable=E,W,F,I,T,RUF,TID,UP
- --target-version=py39
- id: ruff-format
- rev: v0.6.7
+ rev: v0.6.8

- repo: https://github.com/igorshubovych/markdownlint-cli
- rev: v0.41.0
+ rev: v0.42.0
hooks:
- id: markdownlint

6 changes: 6 additions & 0 deletions CHANGELOG.cdf-tk.md
@@ -15,6 +15,12 @@ Changes are grouped as follows:
- `Fixed` for any bug fixes.
- `Security` in case of vulnerabilities.

## [0.3.0b4] - 2024-09-30

### Fixed

- Upgrading the `cognite-sdk` to `6.62.5` no longer raises an `ImportError` when running the `cdf` commands.

## [0.3.0b3] - 2024-09-25

### Added
6 changes: 6 additions & 0 deletions CHANGELOG.templates.md
@@ -15,6 +15,12 @@ Changes are grouped as follows:
- `Fixed` for any bug fixes.
- `Security` in case of vulnerabilities.

## [0.3.0b4] - 2024-09-30

### Fixed

- In the `bootcamp` module, fixed an error in the calculation of `quality` and `performance` in the `oee_timeseries` function.

## [0.3.0b3] - 2024-09-25

No changes to templates.
Binary file added branding/toolkit.png
5 changes: 5 additions & 0 deletions branding/toolkit.svg
8 changes: 8 additions & 0 deletions branding/toolkit.txt
@@ -0,0 +1,8 @@
,,,,,,, ********
,,,,,,,,,,,,**********
,,,,,,,,,,,,,,,,,*********
,,,,,,,,,,,,,,,,,,,**********
,,,,,,,,,,,,,,,,,,,**********
,,,,,,,,,,,,,,,,,*********
,,,,,,,,,,,,**********
,,,,,,, ********
2 changes: 1 addition & 1 deletion cdf.toml
@@ -18,4 +18,4 @@ dump = true
[modules]
# This is the version of the modules. It should not be changed manually.
# It will be updated by the 'cdf module upgrade' command.
- version = "0.3.0b3"
+ version = "0.3.0b4"
@@ -86,8 +86,8 @@ def process_site(client, data_set_id, lookback_minutes, site):

# Calculate the components of OEE
dps_df[f"{asset}:off_spec"] = count_dps - good_dps
- dps_df[f"{asset}:quality"] = good_dps / total_items
- dps_df[f"{asset}:performance"] = (total_items / status_dps) / (60.0 / 3.0)
+ dps_df[f"{asset}:quality"] = good_dps / count_dps
+ dps_df[f"{asset}:performance"] = (count_dps / status_dps) / (60.0 / 3.0)
dps_df[f"{asset}:availability"] = status_dps / planned_status_dps

dps_df[f"{asset}:oee"] = dps_df[f"{asset}:quality"] * dps_df[f"{asset}:performance"] * dps_df[f"{asset}:availability"]
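The corrected arithmetic can be sketched in isolation (plain floats instead of the dataframe columns; the `60.0 / 3.0` divisor is taken verbatim from the diff and appears to normalize counts to an ideal rate — an assumption, since the surrounding code is not shown):

```python
def oee_components(good_dps: float, count_dps: float,
                   status_dps: float, planned_status_dps: float) -> float:
    """Compute OEE with the fixed formulas: quality and performance now use
    the produced count (count_dps) rather than total_items as before the fix."""
    quality = good_dps / count_dps
    performance = (count_dps / status_dps) / (60.0 / 3.0)
    availability = status_dps / planned_status_dps
    return quality * performance * availability

# Example: 90 good of 100 produced, running 50 of 60 planned intervals.
# quality = 0.9, performance = (100 / 50) / 20 = 0.1, availability = 50 / 60,
# so OEE = 0.9 * 0.1 * (50 / 60) = 0.075.
```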
2 changes: 1 addition & 1 deletion cognite_toolkit/_builtin_modules/cdf.toml
@@ -4,7 +4,7 @@ default_env = "<DEFAULT_ENV_PLACEHOLDER>"
[modules]
# This is the version of the modules. It should not be changed manually.
# It will be updated by the 'cdf module upgrade' command.
- version = "0.3.0b3"
+ version = "0.3.0b4"


[plugins]
@@ -0,0 +1,27 @@
## location filter
location_external_id : <not_set>
location_name : <not_set>
location_parent_id : <not_set>
location_description : <not_set>
location_data_model_external_id : CogniteCore
location_data_model_space : sp_core_data_modecdf_cdm
location_data_model_version : v1
location_instance-space : <not_set>
## view
location_view_external_id : <not_set>
location_view_space : <not_set>
location_view_version : <not_set>
location_view_represents_entity : <not_set>
## scene
#location_scene_external_id : <not_set>
#location_scene_space : <not_set>
## optional to use with asset hierarchy
#location_asset_dataset_external_id : <not_set>
#location_asset_subtree_external_id : <not_set>
#location_event_dataset_external_id : <not_set>
#location_file_dataset_external_id : <not_set>
#location_timeseries_dataset_external_id : <not_set>
#location_timeseries_prefix : <not_set>
#location_sequences_prefix : <not_set>
#location_dataset_external_id : <not_set>
#location_general_external_id_prefix : <not_set>
@@ -0,0 +1,43 @@
externalId: '{{ location_external_id }}'
name: '{{ location_name }}'
parentId: '{{ location_parent_id }}'
description: '{{ location_description }}'
scene:
  externalId: '{{ location_scene_external_id }}'
  space: '{{ location_scene_space }}'
assetCentric:
  assets:
    dataSetExternalIds:
      - '{{ location_asset_dataset_external_id }}'
    assetSubtreeIds:
      - externalId: '{{ location_asset_subtree_external_id }}'
    externalIdPrefix: asset-prefix
  events:
    dataSetExternalIds:
      - '{{ location_event_dataset_external_id }}'
    assetSubtreeIds:
      - externalId: '{{ location_asset_subtree_external_id }}'
    externalIdPrefix: event-prefix
  files:
    dataSetExternalIds:
      - '{{ location_file_dataset_external_id }}'
    assetSubtreeIds:
      - externalId: '{{ location_asset_subtree_external_id }}'
    externalIdPrefix: file-prefix
  timeseries:
    dataSetExternalIds:
      - '{{ location_timeseries_dataset_external_id }}'
    assetSubtreeIds:
      - externalId: '{{ location_asset_subtree_external_id }}'
    externalIdPrefix: '{{ location_timeseries_prefix }}'
  sequences:
    dataSetExternalIds:
      - '{{ location_timeseries_dataset_external_id }}'
    assetSubtreeIds:
      - externalId: '{{ location_asset_subtree_external_id }}'
    externalIdPrefix: '{{ location_sequences_prefix }}'
  dataSetExternalIds:
    - '{{ location_dataset_external_id }}'
  assetSubtreeIds:
    - externalId: '{{ location_asset_subtree_external_id }}'
  externalIdPrefix: '{{ location_general_external_id_prefix }}'
@@ -0,0 +1,18 @@
externalId: '{{ location_external_id }}'
name: '{{ location_name }}'
parentId: '{{ location_parent_id }}'
description: '{{ location_description }}'
dataModels:
  - externalId: '{{ location_data_model_external_id }}'
    space: '{{ location_data_model_space }}'
    version: '{{ location_data_model_version }}'
instanceSpaces:
  - '{{ location_instance-space}}'
scene:
  externalId: '{{ location_scene_external_id }}'
  space: '{{ location_scene_space }}'
views:
  externalId: '{{ location_view_external_id }}'
  space: '{{ location_view_space }}'
  version: '{{ location_view_version }}'
  representsEntity: '{{ location_view_represents_entity }}'
@@ -0,0 +1,7 @@
[module]
title = "Example location filter"

[packages]
tags = [
"examples",
]
1 change: 0 additions & 1 deletion cognite_toolkit/_cdf_tk/commands/modules.py
@@ -306,7 +306,6 @@ def _select_packages(self, packages: Packages, existing_module_names: list[str]
questionary.Choice(
title=selectable_module.title,
value=selectable_module,
- checked=True,
)
for selectable_module in package.modules
],
@@ -24,9 +24,9 @@
Workflow,
WorkflowList,
WorkflowTrigger,
- WorkflowTriggerCreate,
- WorkflowTriggerCreateList,
WorkflowTriggerList,
+ WorkflowTriggerUpsert,
+ WorkflowTriggerUpsertList,
WorkflowUpsert,
WorkflowUpsertList,
WorkflowVersion,
@@ -265,18 +265,18 @@ def get_write_cls_parameter_spec(cls) -> ParameterSpecSet:

@final
class WorkflowTriggerLoader(
- ResourceLoader[str, WorkflowTriggerCreate, WorkflowTrigger, WorkflowTriggerCreateList, WorkflowTriggerList]
+ ResourceLoader[str, WorkflowTriggerUpsert, WorkflowTrigger, WorkflowTriggerUpsertList, WorkflowTriggerList]
):
folder_name = "workflows"
filename_pattern = r"^.*WorkflowTrigger$"
resource_cls = WorkflowTrigger
- resource_write_cls = WorkflowTriggerCreate
+ resource_write_cls = WorkflowTriggerUpsert
list_cls = WorkflowTriggerList
- list_write_cls = WorkflowTriggerCreateList
+ list_write_cls = WorkflowTriggerUpsertList
kind = "WorkflowTrigger"
dependencies = frozenset({WorkflowLoader, WorkflowVersionLoader})

- _doc_url = "Workflow-triggers/operation/createTriggers"
+ _doc_url = "Workflow-triggers/operation/CreateOrUpdateTriggers"

def __init__(self, client: ToolkitClient, build_dir: Path | None):
super().__init__(client, build_dir)
@@ -287,7 +287,7 @@ def display_name(self) -> str:
return "workflow.triggers"

@classmethod
- def get_id(cls, item: WorkflowTriggerCreate | WorkflowTrigger | dict) -> str:
+ def get_id(cls, item: WorkflowTriggerUpsert | WorkflowTrigger | dict) -> str:
if isinstance(item, dict):
return item["externalId"]
return item.external_id
@@ -297,15 +297,15 @@ def dump_id(cls, id: str) -> dict[str, Any]:
return {"externalId": id}

@classmethod
- def get_required_capability(cls, items: WorkflowTriggerCreateList | None) -> Capability | list[Capability]:
+ def get_required_capability(cls, items: WorkflowTriggerUpsertList | None) -> Capability | list[Capability]:
if not items and items is not None:
return []
return WorkflowOrchestrationAcl(
[WorkflowOrchestrationAcl.Action.Read, WorkflowOrchestrationAcl.Action.Write],
WorkflowOrchestrationAcl.Scope.All(),
)

def create(self, items: WorkflowTriggerCreateList) -> WorkflowTriggerList:
def create(self, items: WorkflowTriggerUpsertList) -> WorkflowTriggerList:
created = WorkflowTriggerList([])
for item in items:
credentials = self._authentication_by_id.get(item.external_id)
@@ -317,7 +317,7 @@ def retrieve(self, ids: SequenceNotStr[str]) -> WorkflowTriggerList:
lookup = set(ids)
return WorkflowTriggerList([trigger for trigger in all_triggers if trigger.external_id in lookup])

def update(self, items: WorkflowTriggerCreateList) -> WorkflowTriggerList:
def update(self, items: WorkflowTriggerUpsertList) -> WorkflowTriggerList:
existing = self.client.workflows.triggers.get_triggers(limit=-1)
existing_lookup = {trigger.external_id: trigger for trigger in existing}
updated = WorkflowTriggerList([])
@@ -370,15 +370,15 @@ def get_dependent_items(cls, item: dict) -> Iterable[tuple[type[ResourceLoader],

def load_resource(
self, filepath: Path, ToolGlobals: CDFToolConfig, skip_validation: bool
- ) -> WorkflowTriggerCreateList:
+ ) -> WorkflowTriggerUpsertList:
raw_yaml = load_yaml_inject_variables(filepath, ToolGlobals.environment_variables())
raw_list = raw_yaml if isinstance(raw_yaml, list) else [raw_yaml]
- loaded = WorkflowTriggerCreateList([])
+ loaded = WorkflowTriggerUpsertList([])
for item in raw_list:
if "data" in item and isinstance(item["data"], dict):
item["data"] = json.dumps(item["data"])
if "authentication" in item:
raw_auth = item.pop("authentication")
self._authentication_by_id[self.get_id(item)] = ClientCredentials._load(raw_auth)
- loaded.append(WorkflowTriggerCreate.load(item))
+ loaded.append(WorkflowTriggerUpsert.load(item))
return loaded
2 changes: 1 addition & 1 deletion cognite_toolkit/_version.py
@@ -1 +1 @@
- __version__ = "0.3.0b3"
+ __version__ = "0.3.0b4"
12 changes: 12 additions & 0 deletions needle/config.dev.yaml
@@ -0,0 +1,12 @@
environment:
  name: dev
  project: needle
  type: dev
  selected:
    - modules/

variables:
  modules:
    common:
      cdf_auth_readwrite_all:
        readwrite_source_id: da20b44a-236a-4e6e-9f46-68f741df2315
34 changes: 34 additions & 0 deletions needle/modules/common/cdf_auth_readwrite_all/README.md
@@ -0,0 +1,34 @@
# Module: cdf_auth_readwrite_all

This is a foundational module used by the `cdf-tk` tool as its default auth module. It grants
read-write access to all CDF resources for the tool itself (an admin or a CI/CD pipeline),
as well as default read-only access for admins in the UI.

The structure is based on the principle that ONLY the tool has write access to the entities
controlled by the templates. Everybody else should have either no access or read-only access.

## Managed resources

This module manages the following resources:

1. A group with read-write access (`gp_cicd_all_read_write`) to everything in a CDF project (for `cdf-tk` as an admin
   tool or through a CI/CD pipeline).
2. A group with read-only access (`gp_cicd_all_read_only`) for viewing configurations from the UI.

## Variables

The following variables are required and defined in this module:

| Variable | Description |
|----------|-------------|
| readwrite_source_id | The source ID of the group that should be granted read-write access to all resources in the project. |
| readonly_source_id | The source ID of the group that should be granted read-only access to all resources in the project. |

## Usage

The `gp_cicd_all_read_write` group is used by default by the `cdf-tk auth verify` command to verify correct access to
resources in a project. The groups are by default part of several packages created by the `cdf-tk` tool.

If you have different needs for the read-write and read-only groups, you can copy this module into `custom_modules`, rename
it (remove the `cdf_` prefix), and change which modules are deployed in your `environments.yaml` file. You can also
use the `cdf-tk verify --group-file=/path/to/group.yaml` command to swap out the default group file with your own.