Bump to v0.0.28 (#39)
* More refactoring in FEM.abaqus. Add ability to convert Primitives to FEM models using Shell elements.

* Minor changes related to refactoring

* Fix face numbering for HEX elements. Minor refactoring

* Minor fix related to fem_to_concept_objects method not adding parents to materials.

* Begin work on improving support for reading IFC files

* minor changes to wall and wall inserts

* further small changes to wall and wall inserts

* Further work on ifc reading

* further work on proper IFCBeam reading

* Proper IFC reading of beams. Next up -> Cardinality

* add functionality to read badly created IFC files (lack of name/tag on elements).

* Add test workflow for docker builds and azure acr

* Further work on masses in FEM

* Further work on improving FEM code stability and decoupling.

* Add option to write physical objects to fem directly without having to define assemblies and parts (creates dummy objects instead).

* Further work on packaging theory. Structuring into local and conda tests.

* prepare testing of snapshot versions of gmsh and pythonocc-core

* fix errors caused by missing references

* fix win/linux mistake

* Minor fix to version name

* Fix bug in surface set referencing

* Add support for Interface nodes updating Csys object upon merge

* Fix bug where default field and history outputs were being set globally

* Fix sesam reader not importing masses due to updated mass element handling

* WIP: Further development on treatment of mass elements.

* Add sesam test file

* Update meta.yaml

* minor fixes and improvements to FEM

* minor update to test assertions

* Write something to start using Ifc instancing

* Start work on ifcmapped repr

* Bump gmsh dep

* Add example ifc file

* Begin work on visualization module

* Working sample of Instancing using MappedItem [WIP]

* Simplify method of turning off/on property exports to IFC

* Further work on revolved IFC beam and exporting to visualization formats such as threejs

* Fix custom json export

* Further work on revolved IFC beam

* WIP export to json and instances export

* Further work on instanced visualization objects

* Further work on IFC

* Add option for exporting to custom json using multithreading

* Set default color to white if trying to normalize something without color

* Prep for FEM viz export to json

* Fix minor bugs in calculix and code aster fem writers.

Continue work on custom json exporter for visualization

Calculix postprocessing is currently suffering from dll errors related to the vtk package. Should consider skipping the dependency altogether (if possible).

* Minor bugfixes in FEM tests

* Further work on conversion of OCC to visualization mesh

* Fix bug in primitive shape units conversion

* Further work on adding features to FEM class and various FEM objects.

Changed FEM container of constraints from list to dict. Seems more user-friendly

* Changing typing for consistency, and added functionality for merging and splitting beams

* Minor fixes in abaqus reading

* Made changes to make tests pass

* FEM: Fix bug in reading/writing abaqus orientations

* Various improvements to fem module

* FEM: Minor improvements to usability of orientations, vector rotations and readability.

* Fix bug in point rotation transform.

* Fix formatting issue caused by outdated black version

* Try simply swapping the channel priority order from krande first to conda-forge first

* fix formatting

* Change gmsh package dep to python-gmsh

* start on fixing code aster FEM analysis for static

* FEM: Reduce load in static fem test. Edit Code Aster load writer to not multiply with negative 1

* Further work on debugging dependencies

* Experiment with reduction in dependencies (#37)

* An attempt to reduce package dependency complexity.

Certain deps are only included due to a single function. Should revise

* Add to devops

* Fix failing tests.

* test occt >= 7.6.0 as dependency

* Further work on visualization module and added support for editing section properties and updating the section props calculations

* Separate installing local adapy and pytest

* chore: test using conda build scripts and minor improvements to json export for visualization

* chore: Add conditional use of dev label on conda for testing experimental upstream packages

* chore: Use a conditional to set the env variable as opposed to copying entire statements

* chore: remove no longer used conditionals

* chore: slight edit of conda build command

* chore: further work on conda compilation using fewer dependencies.

* fix: added support for penetration of piping objects using opencascade.

* further work on resolving dependencies

* fix: add support for visualizing joints and exporting it to STEP.

* fix: path makedir prior to bump

* fix: make live file if not exists

* chore: remove python version from name given that it is a noarch package

* chore: Use noarch path for exported package from condabuild

* chore: Try using newly created noarch packages of pytexit and pyquaternion

* chore: do not skip existing of pytexit and pyquaternion

* chore: bump

* chore: fix failing tests for linux

* chore: fix the last failing tests for linux

* chore: add test for reading STEP files.

* chore: add conditional use of native_pointer for importing occ geometry into gmsh

* chore: add minor user options to open and view model in gmsh when using the to_fem_obj method.

* fix: Add handling of pipe elements for new FEM mesh generation using native pointer

* chore: minor improvements in Beam initialization

* chore: minor improvement in exporting custom json related to visualization

* chore: Add option to return a file-like object in addition to writing to file (see the sketch after this list)

* feature: add ability to create custom json file-like object in addition to writing to json file.

* chore: change all tests that write IFC files to disk, in order to reduce IO and testing time

* chore: reformat visualize module

* fix: merging of polygons for custom json now works

* fix: correct normals. Minor reorganization of code. Created a PolyModel object

* chore: minor renaming related to PolyModel

* fix: skip objects not able to convert to polymodel

* chore: lint

* chore: further work on custom json exports

* fix: import colours properly from IFC files. Also fix normalization of colours in colour_norm property. Further work on visualization module

* fix: bug in PolyModel merging fixed

* chore: further work on simplifying generating objects for visualization

* chore: WIP more work on exporting geometries for visualization

* chore: WIP further work visualization export

* chore: WIP visualization export and linting

* fix: WIP viz export and linting

* chore: WIP ifc guid creation

* chore: WIP mesh class AssemblyMesh will now be the core container of model objects designed for visualization only.

* chore: WIP - add example of export to binary + json visualization file set

* chore: Add ability to restart json conversion and skip already converted files

* fix: to_assembly_mesh exported twice the number of geometries due to error in get_physical_objects method

chore: rename to_assembly_mesh to to_vis_mesh

* fix: No longer export all physical objects within parts with a multilevel hierarchy multiple times

* fix: Multiprocessing now works. Translation of models happens after conversion step and no longer needs to be done before mp starts

* fix: merging by colours is now fixed

* chore: fix step export not exporting all subelements (including subparts).

* chore: change default behaviour of get_physical_objects to find all subelements in sublevels

* fix: regression in viz object output due to get_physical_object default change

* chore: add minor improvements in allowable arguments and defaults

* chore: WIP binary support

* chore: add more filtering options to get_list_of_files

* chore: test pre-commit linter service in gh action

* fix: edit spelling mistake

* chore: do not allow returning None, and log an error whenever a world has no parts.

* fix: when exporting to binjson, first remove all local files in temp dir

* fix: set int32 as export format for binary numbers AND do not use pre-commit linting (yet)

* add filter functions

* More robust modifications of attributes on Beam instances, and generally enhanced readability (#38)

* Updated type hints for consistency

* Updated sections

* Updated some vector utils

* Some updates regarding beam and node

* Updates on node and containers for concepts

* Updating node refs

* Updating refs

* Updating section and taper

* Updating Beam

* Adding functions for sorting nodes

* add extra type hint for beam nodes and fix the failing test

* lint using black, isort and flake8

* chore: minor improvements to type hints and default config for viz exports

* fix: pass owner history to lower-level ifc write functions as opposed to using ifcopenshell to get it. This improves speed significantly. TODO: Should move to an Ifc<Type>Exporter class system as opposed to always having to pass variables between functions

* chore: change default to always create zip-files during binary export

* chore: minor changes in defaults in PartMesh

* chore: start using a module-specific logger as opposed to logging to root

* fix: noticed that built packages are not tested against the correct python version. Trying with a conda_build_config.yaml file now

* fix: remove cache

* chore: experiment with meta.yaml

* chore: further experiment with meta.yaml

* chore: Do not build noarch. Use regular OS-specific packaging

* chore: skip python version in conda config and remove python jinja in meta.yaml

* chore: fix meta mistake

* chore: fix repr trying to print un-initialized attributes

* found incompatible packages for python 3.10

* fix: python 3.10 dev release for adapy on linux and windows

* chore: cleanup of vis export code

* feature: add STL export (requires trimesh installed)

* add method for reading already converted data

* fix: correct failing test on osx

* fix: linting

* feature: add support for gltf export

* add support for trimesh Scene export containing the correct color and name of objects (see the sketch after this list)

* add bumpversion as versioning mechanism

* Bump version: 0.0.27 → 0.0.28

* stick to occt=7.5.3 for now
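
Two of the items above refer to patterns that are easier to see in code. First, the trimesh-based STL/glTF export: a minimal sketch of how a scene with per-object names and colors is typically exported through trimesh, assuming the vertices and faces have already been extracted from the model (the object name and color below are illustrative, not taken from adapy's API):

```python
# Minimal sketch of STL/glTF export via trimesh, assuming vertex/face arrays
# are already extracted from the model. Object name and color are illustrative.
import numpy as np
import trimesh

vertices = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]], dtype=float)
faces = np.array([[0, 1, 2], [0, 2, 3]])

mesh = trimesh.Trimesh(vertices=vertices, faces=faces)
mesh.visual.face_colors = [200, 200, 200, 255]  # RGBA, broadcast to all faces

scene = trimesh.Scene()
scene.add_geometry(mesh, node_name="Bm1", geom_name="Bm1")  # name is kept in the scene graph

mesh.export("model.stl")   # STL keeps geometry only
scene.export("model.glb")  # binary glTF keeps node names and colors
```

Second, the "return a file-like object instead of writing to disk" option: a sketch of the general pattern, using hypothetical function and parameter names rather than adapy's actual signature:

```python
# Sketch of the "write to file or return a file-like object" pattern.
# Function and parameter names are hypothetical.
import io
import json

def to_custom_json(data: dict, output_file: str = None, return_file_obj: bool = False):
    """Serialize data to JSON, either to disk or to an in-memory file-like object."""
    if return_file_obj:
        buf = io.StringIO()
        json.dump(data, buf)
        buf.seek(0)
        return buf  # caller can forward this without touching the filesystem
    with open(output_file, "w") as f:
        json.dump(data, f)
```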
Krande authored Apr 23, 2022
1 parent e9cdbce commit 37168ea
Showing 250 changed files with 33,827 additions and 3,181 deletions.
80 changes: 30 additions & 50 deletions .github/workflows/ci.yml
@@ -1,5 +1,5 @@
name: ci-ada-main

# bump 2
on:
push:
paths:
@@ -11,6 +11,7 @@ on:
branches:
- main
- dev
- reduce_dependencies
pull_request:
paths-ignore:
- setup.py
@@ -28,88 +29,67 @@ jobs:
needs: activate
runs-on: ubuntu-latest
steps:
- uses: actions/setup-python@v2
with:
python-version: "3.x"
- uses: actions/checkout@v2
- name: Install lint packages
run: pip install isort flake8 black
- name: Lint with isort
run: isort --check .
- name: Lint with flake8
run: flake8 .
- name: Lint with black
run: black --config pyproject.toml --check .
- uses: actions/setup-python@v2
with:
python-version: "3.x"
- uses: actions/checkout@v2
- name: Install lint packages
run: pip install isort flake8 black
- name: Lint with isort
run: isort --check .
- name: Lint with flake8
run: flake8 .
- name: Lint with black
run: black --config pyproject.toml --check .
test:
needs: lint
name: ${{ matrix.platform.name }}-${{ matrix.pyver.name }}
runs-on: ${{ matrix.platform.distver }}
defaults:
run:
shell: bash -l {0}
env:
TWINE_PASSWORD: ${{ secrets.PYPI_API_TOKEN }}
CONDAENV: base
PKG_VERSION: nothing
CONDAROOT: nothing
CONDALABEL: krande
CONDALABEL2: ''
strategy:
fail-fast: false
matrix:
pyver: [ { name: py38, distver: '3.8' }, { name: py39, distver: '3.9'}]
pyver: [ { name: py39, distver: '3.9.10' }, { name: py310, distver: '3.10.2'}]
platform: [
{ name: Windows, distver: windows-latest, short: 'win-64' },
{ name: Linux, distver: ubuntu-latest, short: 'linux-64' },
{ name: macOS, distver: macos-latest, short: 'osx-64' }
]
steps:
- uses: actions/checkout@v2
- name: Cache conda
uses: actions/cache@v2
env:
# Increase this value to reset cache if etc/example-environment.yml has not changed
CACHE_NUMBER: 0
with:
path: ~/conda_pkgs_dir
key:
${{ runner.os }}-conda-${{ env.CACHE_NUMBER }}-${{
hashFiles('conda/environment.yml') }}
- uses: conda-incubator/setup-miniconda@v2 # https://github.com/conda-incubator/setup-miniconda
with:
activate-environment: ${{ env.CONDAENV }}
python-version: ${{ matrix.pyver.distver }}
channel-priority: strict
environment-file: conda/environment.yml
auto-update-conda: true
use-only-tar-bz2: true # IMPORTANT: This needs to be set for caching to work properly!
- name: build
shell: bash -l {0}
run: |
conda activate ${{ env.CONDAENV }}
conda-build -c krande -c conda-forge conda --python=${{ matrix.pyver.distver }} --override-channels --keep-old-work --dirty
- name: install
shell: bash -l {0}
- name: edit conda channel label if not on main
if: github.event_name == 'push' && github.ref != 'refs/heads/main'
run: |
conda activate ${{ env.CONDAENV }}
conda create -n testenv -c local -c krande -c conda-forge ada-py pytest pytest-cov --strict-channel-priority
- name: get package version into env variable
shell: bash -l {0}
run: |
echo "CONDAROOT=$CONDA_PREFIX" >> $GITHUB_ENV
conda activate testenv
python conda/getversion.py
cat version.txt >> $GITHUB_ENV
- name: test
shell: bash -l {0}
run: |
conda activate testenv
pytest tests
- name: upload to conda -c krande
if: github.event_name == 'push' && github.ref == 'refs/heads/main'
shell: bash -l {0}
echo "CONDALABEL=krande/label/dev" >> $GITHUB_ENV
echo "CONDALABEL2= --label dev" >> $GITHUB_ENV
- name: build
run: |
conda activate ${{ env.CONDAENV }}
anaconda -t=${{ secrets.ANACONDA_TOKEN }} upload ${{ env.CONDAROOT }}/conda-bld/${{ matrix.platform.short }}/ada-py-${{ env.PKG_VERSION }}-${{ matrix.pyver.name }}_0.tar.bz2 --user krande --skip-existing
cd conda
conda-build -c ${{env.CONDALABEL}} -c conda-forge . --python=${{ matrix.pyver.distver }} --user krande${{env.CONDALABEL2}} --token=${{ secrets.ANACONDA_TOKEN }}
pypi:
if: github.event_name == 'push' && github.ref == 'refs/heads/main'
needs: test
name: Publish to PYPI
defaults:
run:
shell: bash -l {0}
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
23 changes: 17 additions & 6 deletions Makefile
@@ -3,20 +3,26 @@ cmd_test=cd /home/tests/fem && pytest && python build_verification_report.py
mount=--mount type=bind,source="$(CURDIR)/temp/report",target=/home/tests/fem/temp \
--mount type=bind,source="$(CURDIR)/temp/scratch",target=/home/adauser/scratch
build_dirs=mkdir -p "temp/report" && mkdir -p "temp/scratch"
build_dirs_win=mkdir -p "temp/report" && mkdir -p "temp/scratch"

install:
conda env create -f environment.yml

update:
conda env update --name work --file environment.yml --prune

format:
black . && isort . && flake8 .

bump:
bumpversion patch setup.py

build:
docker build -t ada/base:latest .

run:
docker run --rm -p 8888:8888 ada/base:latest

format:
black . && isort . && flake8 .

install:
pip install .

test:
cd tests && pytest --cov=ada --cov-report=xml --cov-report=html .

@@ -25,6 +31,11 @@ dtest:
docker build -t ada/testing . && \
docker run --name ada-report --rm $(mount) ada/testing bash -c "$(cmd_pre) && $(cmd_test)"

dtest-local:
$(build_dirs_win) && \
docker build -t ada/testing . && \
docker run --name ada-report --rm $(mount) ada/testing bash -c "$(cmd_pre) && $(cmd_test)"

dtest-b:
$(build_dirs) && docker build -t ada/testing .

8 changes: 7 additions & 1 deletion README.md
@@ -9,12 +9,18 @@ A python library for working with structural analysis and design. `Ada-py` deliv
CAD/BIM/FEM modelling, interoperability and Finite Elements (FE) post-processing.


To install the ada-py package into an existing conda environment
To install the latest "stable" ada-py package into an existing conda environment

```
conda install -c krande -c conda-forge ada-py
```

or, if you wish to download the latest build from any branch passing unit tests, you can do

```
conda install -c krande/label/dev -c conda-forge ada-py
```

**Alternatively** create a new isolated environment for the installation like so:

```
11 changes: 11 additions & 0 deletions conda/Dockerfile
@@ -0,0 +1,11 @@
FROM continuumio/miniconda3

# Create the environment:
RUN conda create -n condabuild -y -c conda-forge conda-build conda-verify anaconda-client git
RUN apt-get -y update && apt -y install patch
# Make RUN commands use the new environment:
SHELL ["conda", "run", "-n", "condabuild", "/bin/bash", "-c"]

COPY . .

#RUN conda-build -c krande/label/dev -c conda-forge . --keep-old-work --python 3.9.10
19 changes: 16 additions & 3 deletions conda/Makefile
@@ -1,5 +1,12 @@
build:
conda-build -c krande -c conda-forge . --keep-old-work
compile:
conda activate condabuild && conda-build -c krande/label/dev -c conda-forge . --keep-old-work --python 3.10.2

compile-docker-build:
docker build -t ada/condabuild -f Dockerfile ../ && \
docker run --name ada-condabuild --rm ada/condabuild

compile-docker:
docker exec -it ada-condabuild "conda-build -c krande/label/dev -c conda-forge . --keep-old-work --python 3.9.10"

index:
conda index ${CONDA_PREFIX}/conda-bld --channel-name local
@@ -15,4 +22,10 @@ upload:
conda-build -c krande -c conda-forge . --user krande

show:
conda config --show channels
conda config --show channels

pre:
conda create -y -n condabuild -c conda-forge conda-build conda-verify anaconda-client

py310:
conda create -n py310 -c krande/label/dev -c conda-forge ifcopenshell h5py python==3.10.2
43 changes: 29 additions & 14 deletions conda/meta.yaml
@@ -1,5 +1,6 @@
{% set data = load_setup_py_data() %}


package:
name: ada-py
version: {{ data.get('version') }}
@@ -9,33 +10,47 @@ source:

build:
number: 0
script: "{{ PYTHON }} -m pip install . -vv"
script: python -m pip install --no-deps --ignore-installed .

requirements:
build:
- python
run:
- python
- numpy
- ifcopenshell
- pythonocc-core 7.5.1
- occt 7.5.1
- vtk
- meshio[all]
- toolz
- lmfit
- gmsh
- ifcopenshell >=0.7.0
- pythonocc-core >=7.5.3
- occt ==7.5.3
- python-gmsh >=4.9.3
- pyquaternion
- ccx2paraview
- trimesh
- pytexit
- jupyterlab
- pythreejs
- pyparsing
- h5py
- plotly
- python-kaleido
- ipygany
- pydantic
- pyvista
- meshio

# Dependencies that are currently left out
# - numpy
# - vtk
# - toolz
# - lmfit
# - ccx2paraview
# - ipygany
# - pydantic
# - pyvista
test:
source_files:
- tests
- files
requires:
- pytest
- pytest-cov
imports:
- ada

about:
home: https://github.com/krande/adapy
@@ -46,4 +61,4 @@ about:

extra:
recipe-maintainers:
- Krande
- Krande
1 change: 1 addition & 0 deletions conda/run_test.bat
@@ -0,0 +1 @@
pytest tests
3 changes: 3 additions & 0 deletions conda/run_test.sh
@@ -0,0 +1,3 @@
#!/bin/bash

pytest tests
2 changes: 1 addition & 1 deletion environment.yml
@@ -4,4 +4,4 @@ channels:
- conda-forge
dependencies:
- ada-py
- pydantic
- paradoc
