# Adding support for a new binding type
Adding support for a new Azure Functions binding type to Python involves two subtasks:

- Designing and documenting high-level rich Python types that will represent the binding data for the function user.
- Implementing input/output converters that transform inbound `TypedData` into instances of the rich type and vice versa.
The types exposed to the Functions user are organized in the `azure.functions` package. The definition is usually split into an `abc.ABC` abstract interface and a concrete implementation that is either public or private, depending on whether the user should be able to construct instances of the binding type directly.

As an example, let's consider a Python type for the Blob Storage binding, which is declared in `azure.functions._abc`:
```python
import abc
import typing


class InputStream(abc.ABC):
    """File-like object representing an input blob."""

    @abc.abstractmethod
    def read(self, size=-1) -> bytes:
        """Read and return up to *size* bytes.

        :param int size:
            The number of bytes to read.  If the argument is omitted,
            ``None``, or negative, data is read and returned until
            EOF is reached.

        :return:
            Bytes read from the input stream.
        """
        pass

    @property
    @abc.abstractmethod
    def name(self) -> typing.Optional[str]:
        """The name of the blob."""
        pass

    @property
    @abc.abstractmethod
    def length(self) -> typing.Optional[int]:
        """The size of the blob in bytes."""
        pass

    @property
    @abc.abstractmethod
    def uri(self) -> typing.Optional[str]:
        """The blob's primary location URI."""
        pass
```
The above is a pure abstract base class which exposes the documented functionality of the binding to the Functions user.
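To make this concrete, here is a minimal sketch of how a function user might consume such a type; the function signature, blob binding name `obj`, and the `function.json` wiring are hypothetical:

```python
import azure.functions as func


def main(req: func.HttpRequest, obj: func.InputStream) -> str:
    # *obj* is the worker-supplied concrete InputStream implementation,
    # bound to whatever blob is configured in function.json.
    data = obj.read()
    return f'{obj.name}: {len(data)} bytes'
```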
To fully implement support for a binding, a data converter must be implemented for each supported binding direction. Binding converters typically go into the `azure.worker.bindings` package.
For input bindings the following class declaration is necessary:
```python
class MyBindingInputConverter(azure.functions.meta.InConverter,
                              binding='mybinding'):

    @classmethod
    def check_input_type_annotation(cls, pytype: type) -> bool:
        # Check that the type annotation for the binding argument
        # in the function source is correct, i.e. one of the supported
        # Python types you've declared for this binding.
        ...

    @classmethod
    def from_proto(cls, data: protos.TypedData, *,
                   pytype: typing.Optional[type],
                   trigger_metadata: typing.Optional[dict]) -> typing.Any:
        # A function that takes a gRPC TypedData instance
        # and returns an instance of the high-level Python type
        # for the binding.
        # *trigger_metadata* will be passed if this is a trigger
        # binding converter.
        ...
```
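To illustrate how these methods might be filled in, here is a rough sketch for a hypothetical binding whose payload arrives as string or JSON `TypedData`. The `MyBindingData` type, the accepted annotations, and the `protos` import path are assumptions made for the example, not part of the worker's API:

```python
import typing

from azure.functions import meta
from azure.worker import protos  # assumed location of the gRPC protos module


class MyBindingData:
    """Hypothetical rich type returned to the user for 'mybinding'."""

    def __init__(self, raw: str) -> None:
        self.raw = raw


class MyBindingInputConverter(meta.InConverter, binding='mybinding'):

    @classmethod
    def check_input_type_annotation(cls, pytype: type) -> bool:
        # Only the rich type (or plain str) is a valid annotation here.
        return issubclass(pytype, (MyBindingData, str))

    @classmethod
    def from_proto(cls, data: protos.TypedData, *,
                   pytype: typing.Optional[type],
                   trigger_metadata: typing.Optional[dict]) -> typing.Any:
        data_type = data.WhichOneof('data')
        if data_type == 'string':
            raw = data.string
        elif data_type == 'json':
            raw = data.json
        else:
            raise NotImplementedError(
                f'unsupported TypedData kind: {data_type!r}')

        if pytype is str:
            return raw
        return MyBindingData(raw)
```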
Trigger binding converters should implement the same interface, and are often a simple subclass of the input binding converter:

```python
class MyBindingTriggerConverter(MyBindingInputConverter,
                                binding='mybindingTrigger', trigger=True):
    pass
```
For output bindings the following class declaration is necessary:

```python
class MyBindingOutputConverter(azure.functions.meta.OutConverter,
                               binding='mybinding'):

    @classmethod
    def check_output_type_annotation(cls, pytype: type) -> bool:
        # Check that the type annotation for the binding `Out` argument
        # or the function return type annotation in the function source
        # is correct, i.e. one of the supported Python types that this
        # converter is able to convert into valid protocol TypedData.
        ...

    @classmethod
    def to_proto(cls, obj: typing.Any, *,
                 pytype: typing.Optional[type]) -> protos.TypedData:
        # A function that takes the function output and converts
        # it into a ``TypedData`` instance.
        ...
```
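A matching sketch for the output direction, reusing the hypothetical `MyBindingData` type from the input example above and assuming both converters live in the same module (the `protos` import path is again an assumption):

```python
import typing

from azure.functions import meta
from azure.worker import protos  # assumed location of the gRPC protos module


class MyBindingOutputConverter(meta.OutConverter, binding='mybinding'):

    @classmethod
    def check_output_type_annotation(cls, pytype: type) -> bool:
        # str and the rich type can both be converted back into TypedData.
        return issubclass(pytype, (str, MyBindingData))

    @classmethod
    def to_proto(cls, obj: typing.Any, *,
                 pytype: typing.Optional[type]) -> protos.TypedData:
        if isinstance(obj, str):
            return protos.TypedData(string=obj)
        if isinstance(obj, MyBindingData):
            return protos.TypedData(string=obj.raw)
        raise NotImplementedError(
            f'cannot convert {type(obj).__name__} to TypedData')
```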
If your user-facing Python type is abstract, it is necessary to subclass it and implement the abstract methods and properties. Such a concrete implementation is normally located in the same package as the data converters and is considered private.
See the Blob binding implementation for an example.
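As a rough sketch of what such a private implementation could look like for the `InputStream` interface above, assuming the abstract class is importable as `azure.functions.InputStream` (the class name is made up, and the real Blob implementation in the worker is more involved):

```python
import io
import typing

import azure.functions as func


class BlobInputStream(func.InputStream):
    """Hypothetical concrete implementation backed by an in-memory buffer."""

    def __init__(self, *, data: bytes,
                 name: typing.Optional[str] = None,
                 uri: typing.Optional[str] = None) -> None:
        self._io = io.BytesIO(data)
        self._name = name
        self._uri = uri
        self._length = len(data)

    def read(self, size=-1) -> bytes:
        return self._io.read(size)

    @property
    def name(self) -> typing.Optional[str]:
        return self._name

    @property
    def length(self) -> typing.Optional[int]:
        return self._length

    @property
    def uri(self) -> typing.Optional[str]:
        return self._uri
```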
All new bindings must have full test coverage in the form of actual functions. Functions must be placed in a directory under `tests/`, which should also contain the `host.json` file and the special `ping` function. It is easiest to copy these from an existing test subdirectory.
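A test subdirectory might end up looking roughly like this (the directory and function names are illustrative; copy the actual `host.json` and `ping` function from an existing test directory, as noted above):

```
tests/
  mybinding_functions/
    host.json
    ping/
      function.json
      main.py
    my_binding_test_function/
      function.json
      main.py
```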
Ideally, end-to-end tests are written which use a live function app. In this case, the test case should inherit from `azure.worker.testutils.WebHostTestCase`:
```python
class TestBlobFunctions(testutils.WebHostTestCase):

    @classmethod
    def get_script_dir(cls):
        # Return the name of the subdirectory in tests/ containing
        # the test functions.
        return 'mybinding_functions'
```
`WebHostTestCase` test cases have the `webhost` property, which is a `requests` connection to a real Functions host instance. The tests then make HTTP requests to test functions and examine the output. For example:
```python
def test_mybinding_input(self):
    r = self.webhost.request('POST', 'my_binding_test_function',
                             ...)
    ...
```
If your binding implementation is supposed to be shipped in a separate Python package (i.e. not as part of the worker), it is possible to tell the worker to automatically load and register your binding by adding an `entry_points` value to the `setup()` call in `setup.py`:
```python
setup(
    name='My Binding',
    ...,
    entry_points={
        'azure.functions.bindings': ['<binding>=<module-name>:<ClassName>']
    }
)
```
Here, `<binding>` is a string that uniquely identifies the binding type, `<module-name>` is the full name of the binding module (e.g. `'my.azure.binding'`), and `<ClassName>` is the name of the binding data converter class.
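For instance, the hypothetical 'mybinding' converter from the sketches above might be registered like this (the package name, version, and module path are made up for the example):

```python
from setuptools import find_packages, setup

setup(
    name='my-azure-binding',
    version='0.1.0',
    packages=find_packages(),
    entry_points={
        'azure.functions.bindings': [
            'mybinding=my.azure.binding:MyBindingInputConverter',
        ],
    },
)
```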