Generate public API in keras/api folder. #19510

Closed · wants to merge 1 commit
24 changes: 17 additions & 7 deletions .github/workflows/actions.yml
@@ -24,13 +24,13 @@ jobs:
KERAS_HOME: .github/workflows/config/${{ matrix.backend }}
steps:
- uses: actions/checkout@v4
- - name: Check for changes in keras/applications
+ - name: Check for changes in keras/src/applications
uses: dorny/paths-filter@v3
id: filter
with:
filters: |
applications:
- - 'keras/applications/**'
+ - 'keras/src/applications/**'
- name: Set up Python
uses: actions/setup-python@v5
with:
@@ -49,13 +49,13 @@ jobs:
run: |
pip install -r requirements.txt --progress-bar off --upgrade
pip uninstall -y keras keras-nightly
- pip install tf_keras==2.16.0rc0 --progress-bar off --upgrade
+ pip install tf_keras==2.16.0 --progress-bar off --upgrade
pip install -e "." --progress-bar off --upgrade
- name: Test applications with pytest
if: ${{ steps.filter.outputs.applications == 'true' }}
run: |
- pytest keras/applications --cov=keras/applications
- coverage xml --include='keras/applications/*' -o apps-coverage.xml
+ pytest keras/src/applications --cov=keras/src/applications
+ coverage xml --include='keras/src/applications/*' -o apps-coverage.xml
- name: Codecov keras.applications
if: ${{ steps.filter.outputs.applications == 'true' }}
uses: codecov/codecov-action@v4
@@ -80,8 +80,8 @@ jobs:
pytest integration_tests/torch_workflow_test.py
- name: Test with pytest
run: |
- pytest keras --ignore keras/applications --cov=keras
- coverage xml --omit='keras/applications/*' -o core-coverage.xml
+ pytest keras --ignore keras/src/applications --cov=keras
+ coverage xml --omit='keras/src/applications/*,keras/api' -o core-coverage.xml
- name: Codecov keras
uses: codecov/codecov-action@v4
with:
@@ -115,5 +115,15 @@ jobs:
pip install -r requirements.txt --progress-bar off --upgrade
pip uninstall -y keras keras-nightly
pip install -e "." --progress-bar off --upgrade
+ - name: Check for API changes
+   run: |
+     bash shell/api_gen.sh
+     git status
+     clean=$(git status | grep "nothing to commit")
+     if [ -z "$clean" ]; then
+       echo "There are uncommitted changes."
+       echo 'Run "bash shell/api_gen.sh" in your commit to capture public API changes.'
+       exit 1
+     fi
- name: Lint
run: bash shell/lint.sh
12 changes: 11 additions & 1 deletion .github/workflows/nightly.yml
@@ -55,7 +55,7 @@ jobs:
pytest integration_tests/torch_workflow_test.py
- name: Test with pytest
run: |
- pytest keras --ignore keras/applications --cov=keras
+ pytest keras --ignore keras/src/applications --cov=keras

format:
name: Check the code format
@@ -81,6 +81,16 @@ jobs:
pip install -r requirements.txt --progress-bar off --upgrade
pip uninstall -y keras keras-nightly
pip install -e "." --progress-bar off --upgrade
+ - name: Check for API changes
+   run: |
+     bash shell/api_gen.sh
+     git status
+     clean=$(git status | grep "nothing to commit")
+     if [ -z "$clean" ]; then
+       echo "There are uncommitted changes."
+       echo 'Run "bash shell/api_gen.sh" in your commit to capture public API changes.'
+       exit 1
+     fi
- name: Lint
run: bash shell/lint.sh

16 changes: 8 additions & 8 deletions .kokoro/github/ubuntu/gpu/build.sh
@@ -34,8 +34,8 @@ then
python3 -c 'import tensorflow as tf;assert len(tf.config.list_physical_devices("GPU")) > 0'

# TODO: keras/layers/merging/merging_test.py::MergingLayersTest::test_sparse_dot_2d Fatal Python error: Aborted
- pytest keras --ignore keras/applications \
-   --ignore keras/layers/merging/merging_test.py \
+ pytest keras --ignore keras/src/applications \
+   --ignore keras/src/layers/merging/merging_test.py \
--cov=keras
fi

@@ -51,11 +51,11 @@ then
# TODO: keras/layers/merging/merging_test.py::MergingLayersTest::test_sparse_dot_2d Fatal Python error: Aborted
# TODO: keras/trainers/data_adapters/py_dataset_adapter_test.py::PyDatasetAdapterTest::test_basic_flow0 Fatal Python error: Aborted
# keras/backend/jax/distribution_lib_test.py is configured for CPU test for now.
- pytest keras --ignore keras/applications \
-   --ignore keras/layers/merging/merging_test.py \
-   --ignore keras/trainers/data_adapters/py_dataset_adapter_test.py \
-   --ignore keras/backend/jax/distribution_lib_test.py \
-   --ignore keras/distribution/distribution_lib_test.py \
+ pytest keras --ignore keras/src/applications \
+   --ignore keras/src/layers/merging/merging_test.py \
+   --ignore keras/src/trainers/data_adapters/py_dataset_adapter_test.py \
+   --ignore keras/src/backend/jax/distribution_lib_test.py \
+   --ignore keras/src/distribution/distribution_lib_test.py \
--cov=keras
fi

@@ -68,6 +68,6 @@ then
# Raise error if GPU is not detected.
python3 -c 'import torch;assert torch.cuda.is_available()'

- pytest keras --ignore keras/applications \
+ pytest keras --ignore keras/src/applications \
--cov=keras
fi
8 changes: 8 additions & 0 deletions README.md
@@ -48,6 +48,14 @@ pip install -r requirements.txt

```
python pip_build.py --install
# Or install the package in editable mode
pip install -e .
```

3. Run the API generation script when creating PRs:

```
./shell/api_gen.sh
```

#### Adding GPU support
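`api_gen.py` in this PR rewrites `keras/__init__.py` so that the internal `src` and `api` modules never show up in autocomplete (`__dir__`) or in `from keras import *` (`__all__`). A minimal self-contained sketch of that pattern — the module names below are stand-ins, not the real keras namespace, and `make_public_dir` is a hypothetical helper:

```python
def make_public_dir(namespace):
    # dict.fromkeys keeps insertion order while deduplicating,
    # then the internal entries are dropped before dir() reports them.
    keys = dict.fromkeys(namespace)
    keys.pop("src", None)
    keys.pop("api", None)
    return list(keys)

names = ["Model", "layers", "src", "api", "__version__"]
print(make_public_dir(names))  # ['Model', 'layers', '__version__']

# The matching __all__ filter: drop underscored names and internals,
# so star-imports expose only the public API surface.
public = [n for n in names if not (n.startswith("_") or n in ("src", "api"))]
print(public)  # ['Model', 'layers']
```

In the real `keras/__init__.py` this runs against `globals()`, so anything generated into `keras/api` is importable while the generation machinery stays invisible.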
172 changes: 172 additions & 0 deletions api_gen.py
@@ -0,0 +1,172 @@
"""Script to generate keras public API in `keras/api` directory.

Usage:

Run via `./shell/api_gen.sh`, which also formats the generated API files.
"""

import os
import shutil

import namex

package = "keras"


def ignore_files(_, filenames):
return [f for f in filenames if f.endswith("_test.py")]


def create_legacy_directory():
API_DIR = os.path.join(package, "api")
# Make keras/_tf_keras/ by copying keras/
tf_keras_dirpath_parent = os.path.join(API_DIR, "_tf_keras")
tf_keras_dirpath = os.path.join(tf_keras_dirpath_parent, "keras")
os.makedirs(tf_keras_dirpath, exist_ok=True)
with open(os.path.join(tf_keras_dirpath_parent, "__init__.py"), "w") as f:
f.write("from keras.api._tf_keras import keras\n")
with open(os.path.join(API_DIR, "__init__.py")) as f:
init_file = f.read()
init_file = init_file.replace(
"from keras.api import _legacy",
"from keras.api import _tf_keras",
)
with open(os.path.join(API_DIR, "__init__.py"), "w") as f:
f.write(init_file)
with open(os.path.join(tf_keras_dirpath, "__init__.py"), "w") as f:
f.write(init_file)
for dirname in os.listdir(API_DIR):
dirpath = os.path.join(API_DIR, dirname)
if os.path.isdir(dirpath) and dirname not in (
"_legacy",
"_tf_keras",
"src",
):
destpath = os.path.join(tf_keras_dirpath, dirname)
if os.path.exists(destpath):
shutil.rmtree(destpath)
shutil.copytree(
dirpath,
destpath,
ignore=ignore_files,
)

# Copy keras/_legacy/ file contents to keras/_tf_keras/keras
legacy_submodules = [
path[:-3]
for path in os.listdir(os.path.join(package, "src", "legacy"))
if path.endswith(".py")
]
legacy_submodules += [
path
for path in os.listdir(os.path.join(package, "src", "legacy"))
if os.path.isdir(os.path.join(package, "src", "legacy", path))
]

for root, _, fnames in os.walk(os.path.join(package, "_legacy")):
for fname in fnames:
if fname.endswith(".py"):
legacy_fpath = os.path.join(root, fname)
tf_keras_root = root.replace("/_legacy", "/_tf_keras/keras")
core_api_fpath = os.path.join(
root.replace("/_legacy", ""), fname
)
if not os.path.exists(tf_keras_root):
os.makedirs(tf_keras_root)
tf_keras_fpath = os.path.join(tf_keras_root, fname)
with open(legacy_fpath) as f:
legacy_contents = f.read()
legacy_contents = legacy_contents.replace(
"keras.api._legacy", "keras.api._tf_keras.keras"
)
if os.path.exists(core_api_fpath):
with open(core_api_fpath) as f:
core_api_contents = f.read()
core_api_contents = core_api_contents.replace(
"from keras.api import _tf_keras\n", ""
)
for legacy_submodule in legacy_submodules:
core_api_contents = core_api_contents.replace(
f"from keras.api import {legacy_submodule}\n",
"",
)
core_api_contents = core_api_contents.replace(
f"keras.api.{legacy_submodule}",
f"keras.api._tf_keras.keras.{legacy_submodule}",
)
legacy_contents = core_api_contents + "\n" + legacy_contents
with open(tf_keras_fpath, "w") as f:
f.write(legacy_contents)

# Delete keras/api/_legacy/
shutil.rmtree(os.path.join(API_DIR, "_legacy"))


def export_version_string():
API_INIT = os.path.join(package, "api", "__init__.py")
with open(API_INIT) as f:
contents = f.read()
with open(API_INIT, "w") as f:
contents += "from keras.src.version import __version__\n"
f.write(contents)


def update_package_init():
contents = """
# Import everything from /api/ into keras.
from keras.api import * # noqa: F403
from keras.api import __version__ # Import * ignores names start with "_".

import os

# Add everything in /api/ to the module search path.
__path__.append(os.path.join(os.path.dirname(__file__), "api")) # noqa: F405

# Don't pollute namespace.
del os

# Never autocomplete `.src` or `.api` on an imported keras object.
def __dir__():
keys = dict.fromkeys((globals().keys()))
keys.pop("src")
keys.pop("api")
return list(keys)


# Don't import `.src` or `.api` during `from keras import *`.
__all__ = [
name
for name in globals().keys()
if not (name.startswith("_") or name in ("src", "api"))
]"""
with open(os.path.join(package, "__init__.py")) as f:
init_contents = f.read()
with open(os.path.join(package, "__init__.py"), "w") as f:
f.write(init_contents.replace("\nfrom keras import api", contents))


if __name__ == "__main__":
# Backup the `keras/__init__.py` and restore it on error in api gen.
os.makedirs(os.path.join(package, "api"), exist_ok=True)
init_fname = os.path.join(package, "__init__.py")
backup_init_fname = os.path.join(package, "__init__.py.bak")
try:
if os.path.exists(init_fname):
shutil.move(init_fname, backup_init_fname)
# Generates `keras/api` directory.
namex.generate_api_files(
"keras", code_directory="src", target_directory="api"
)
# Creates `keras/__init__.py` importing from `keras/api`
update_package_init()
except Exception as e:
if os.path.exists(backup_init_fname):
shutil.move(backup_init_fname, init_fname)
raise e
finally:
if os.path.exists(backup_init_fname):
os.remove(backup_init_fname)
# Add __version__ to keras package
export_version_string()
# Creates `_tf_keras` with full keras API
create_legacy_directory()
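The `ignore_files` helper above is a standard `shutil.copytree` ignore callback: it receives a directory and the names inside it, and returns the subset to skip. A minimal self-contained sketch of the same pattern, using throwaway temp directories (the file names here are hypothetical):

```python
import os
import shutil
import tempfile

def ignore_files(_, filenames):
    # Skip test modules when copying an API tree (same filter as api_gen.py).
    return [f for f in filenames if f.endswith("_test.py")]

# Build a throwaway source tree to demonstrate the filter.
src = tempfile.mkdtemp()
dst = os.path.join(tempfile.mkdtemp(), "api")
open(os.path.join(src, "layers.py"), "w").close()
open(os.path.join(src, "layers_test.py"), "w").close()

shutil.copytree(src, dst, ignore=ignore_files)
print(sorted(os.listdir(dst)))  # ['layers.py'] — the test file is skipped
```

`copytree` calls the callback once per directory visited, so the filter applies recursively without any extra bookkeeping.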
2 changes: 1 addition & 1 deletion integration_tests/basic_full_flow.py
@@ -6,7 +6,7 @@
from keras import losses
from keras import metrics
from keras import optimizers
- from keras import testing
+ from keras.src import testing


class MyModel(keras.Model):
2 changes: 1 addition & 1 deletion integration_tests/dataset_tests/boston_housing_test.py
@@ -1,5 +1,5 @@
- from keras import testing
from keras.datasets import boston_housing
+ from keras.src import testing


class BostonHousingTest(testing.TestCase):
2 changes: 1 addition & 1 deletion integration_tests/dataset_tests/california_housing_test.py
@@ -1,5 +1,5 @@
- from keras import testing
from keras.datasets import california_housing
+ from keras.src import testing


class CaliforniaHousingTest(testing.TestCase):
2 changes: 1 addition & 1 deletion integration_tests/dataset_tests/cifar100_test.py
@@ -1,7 +1,7 @@
import numpy as np

- from keras import testing
from keras.datasets import cifar100
+ from keras.src import testing


class Cifar100LoadDataTest(testing.TestCase):
2 changes: 1 addition & 1 deletion integration_tests/dataset_tests/cifar10_test.py
@@ -1,7 +1,7 @@
import numpy as np

- from keras import testing
from keras.datasets import cifar10
+ from keras.src import testing


class Cifar10LoadDataTest(testing.TestCase):
2 changes: 1 addition & 1 deletion integration_tests/dataset_tests/fashion_mnist_test.py
@@ -1,7 +1,7 @@
import numpy as np

- from keras import testing
from keras.datasets import fashion_mnist
+ from keras.src import testing


class FashionMnistLoadDataTest(testing.TestCase):
2 changes: 1 addition & 1 deletion integration_tests/dataset_tests/imdb_test.py
@@ -1,7 +1,7 @@
import numpy as np

- from keras import testing
from keras.datasets import imdb
+ from keras.src import testing


class ImdbLoadDataTest(testing.TestCase):
2 changes: 1 addition & 1 deletion integration_tests/dataset_tests/mnist_test.py
@@ -1,7 +1,7 @@
import numpy as np

- from keras import testing
from keras.datasets import mnist
+ from keras.src import testing


class MnistLoadDataTest(testing.TestCase):
2 changes: 1 addition & 1 deletion integration_tests/dataset_tests/reuters_test.py
@@ -1,7 +1,7 @@
import numpy as np

- from keras import testing
from keras.datasets import reuters
+ from keras.src import testing


class ReutersLoadDataTest(testing.TestCase):