The :ref:`C data interface <c-data-interface>` and
:ref:`C stream interface <c-stream-interface>` allow moving Arrow data between
different implementations of Arrow. However, these interfaces don't specify how
Python libraries should expose these structs to other libraries. Prior to this,
many libraries simply provided export to PyArrow data structures, using the
``_import_from_c`` and ``_export_to_c`` methods. However, this approach always
required PyArrow to be installed.
This interface allows any library to export Arrow data structures to other libraries that understand the same protocol.
This interface standardizes:

- The PyCapsule objects that represent ``ArrowSchema``, ``ArrowArray``, and
  ``ArrowArrayStream``.
- The methods used to get these PyCapsules, so any Python object that supports
  export via Arrow can be recognized by other libraries.

It does not standardize which public APIs should be used for import; this is
left up to individual libraries.
A PyCapsule can carry a name, which lets consumers verify that a capsule
contains the expected kind of data. To make sure Arrow structs are recognized,
the following names should be used:
==================== ========================
C Interface Type     PyCapsule Name
==================== ========================
ArrowSchema          ``arrow_schema``
ArrowArray           ``arrow_array``
ArrowArrayStream     ``arrow_array_stream``
==================== ========================
The exported PyCapsules should have a destructor that calls the :ref:`release callback <c-data-interface-released>` of the Arrow struct, if it is not already null. This prevents a memory leak in case the capsule was never passed to another consumer.
If the capsule has been passed to a consumer, the consumer should have moved the data and marked the release callback as null, so there isn't a risk of releasing data the consumer is using. Read more in the C Data Interface specification.
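That rule can be illustrated with a Python sketch using ``ctypes``. This is
illustrative only: the ``ArrowSchema`` struct is truncated to its ``release``
field, and the destructor is shown as a plain function rather than being
registered on a capsule through the C API:

```python
import ctypes

# Illustrative only: the ArrowSchema struct is reduced to its `release`
# callback, the only field this sketch needs.
RELEASE_FUNC = ctypes.CFUNCTYPE(None, ctypes.c_void_p)

class ArrowSchema(ctypes.Structure):
    _fields_ = [("release", RELEASE_FUNC)]

released = []

@RELEASE_FUNC
def release_impl(ptr):
    # A real release callback frees the struct's buffers and then sets
    # schema.release to NULL; here we only record the call.
    released.append(ptr)

def capsule_destructor_logic(schema: ArrowSchema) -> None:
    # What a capsule destructor should do: call release only if the struct
    # was never moved (a consumer that moved it set release to NULL).
    if schema.release:
        schema.release(ctypes.addressof(schema))

never_consumed = ArrowSchema(release=release_impl)
capsule_destructor_logic(never_consumed)   # release is called once
already_moved = ArrowSchema()              # release is NULL (zero-initialized)
capsule_destructor_logic(already_moved)    # no-op: no double free
assert len(released) == 1
```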
The interface consists of three separate protocols:

- ``ArrowSchemaExportable``, which defines the ``__arrow_c_schema__`` method.
- ``ArrowArrayExportable``, which defines the ``__arrow_c_array__`` method. It
  extends ``ArrowSchemaExportable``, so it must also define
  ``__arrow_c_schema__``.
- ``ArrowStreamExportable``, which defines the ``__arrow_c_stream__`` method.
  It extends ``ArrowSchemaExportable``, so it must also define
  ``__arrow_c_schema__``.
The protocols are defined below in terms of ``typing.Protocol``. These may be
copied into a library for the purposes of static type checking, but this is not
required to implement the protocol.
.. code-block:: python

    from typing import Tuple, Protocol

    from typing_extensions import Self


    class ArrowSchemaExportable(Protocol):
        def __arrow_c_schema__(self) -> object:
            """
            Get a PyCapsule containing a C ArrowSchema representation of the object.

            The capsule will have a name of "arrow_schema".
            """
            ...


    class ArrowArrayExportable(ArrowSchemaExportable, Protocol):
        def __arrow_c_array__(
            self,
            requested_schema: object | None = None
        ) -> Tuple[object, object]:
            """
            Get a PyCapsule containing a C ArrowArray representation of the object.

            The capsule will have a name of "arrow_array".

            If requested_schema is passed, the callee should attempt to provide the
            data in the requested schema. However, this is best-effort, and the
            callee may return a PyCapsule containing an ArrowArray with a different
            schema. This parameter is useful for cases where the underlying data
            could be represented in multiple ways, and the caller has a preference
            for how it is represented. For example, some systems have a single
            integer type, but Arrow has multiple integer types with different
            sizes and signedness.

            Parameters
            ----------
            requested_schema : PyCapsule | None
                A PyCapsule containing a C ArrowSchema representation of a
                requested schema. Conversion to this schema is best-effort.

            Returns
            -------
            Tuple[PyCapsule, PyCapsule]
                A pair of PyCapsules containing a C ArrowSchema and ArrowArray,
                respectively.
            """
            ...


    class ArrowStreamExportable(ArrowSchemaExportable, Protocol):
        def __arrow_c_stream__(self, requested_schema: object | None = None) -> object:
            """
            Get a PyCapsule containing a C ArrowArrayStream representation of the
            object.

            The capsule will have a name of "arrow_array_stream".

            If requested_schema is passed, the callee should attempt to provide the
            data in the requested schema. However, this is best-effort, and the
            callee may return a PyCapsule containing an ArrowArrayStream with a
            different schema. This parameter is useful for cases where the
            underlying data could be represented in multiple ways, and the caller
            has a preference for how it is represented. For example, some systems
            have a single integer type, but Arrow has multiple integer types with
            different sizes and signedness.

            Parameters
            ----------
            requested_schema : PyCapsule | None
                A PyCapsule containing a C ArrowSchema representation of a
                requested schema. Conversion to this schema is best-effort.

            Returns
            -------
            PyCapsule
                A PyCapsule containing a C ArrowArrayStream representation of the
                object.
            """
            ...
In some cases, there might be multiple possible Arrow representations of the same data. For example, a library might have a single integer type, but Arrow has multiple integer types with different sizes and sign. As another example, Arrow has several possible encodings for an array of strings: 32-bit offsets, 64-bit offsets, string view, and dictionary-encoded. A sequence of strings could export to any one of these Arrow representations.
In order to allow the caller to request a specific representation, the
``__arrow_c_array__`` and ``__arrow_c_stream__`` methods may take an optional
``requested_schema`` parameter. This parameter is a PyCapsule containing an
``ArrowSchema``.
The callee should attempt to provide the data in the requested schema. However,
if the callee cannot provide the data in the requested schema, they may return
with the schema as provided by the ``__arrow_c_schema__`` method. Similarly, if
no schema is requested, the callee *must* return with the schema as provided by
the ``__arrow_c_schema__`` method.
If the caller requests a schema that is not compatible with the data, for example a schema with a different number of fields, the callee should raise an exception. The requested-schema mechanism is only meant to negotiate between different representations of the same data, not to allow arbitrary schema transformations.
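The negotiation described above can be sketched with a hypothetical exporter.
Schemas are represented as plain strings here to keep the example runnable; a
real implementation would parse the ``ArrowSchema`` capsule, and the choice of
exception type is up to the library:

```python
class IntColumn:
    """Hypothetical exporter whose native storage is 64-bit integers."""

    def __init__(self, values):
        self.values = list(values)

    def __arrow_c_schema__(self):
        return "int64"  # default representation

    def __arrow_c_array__(self, requested_schema=None):
        schema = self.__arrow_c_schema__()
        if requested_schema == "int32":
            # Best effort: honor the request only if every value fits;
            # otherwise fall back to the default schema.
            if all(-2**31 <= v < 2**31 for v in self.values):
                schema = "int32"
        elif requested_schema not in (None, "int64"):
            # Incompatible request (not an integer schema at all): raise.
            raise ValueError(f"cannot provide data as {requested_schema!r}")
        return (schema, list(self.values))

# Caller prefers 32-bit integers; the callee honors it when possible.
assert IntColumn([1, 2, 3]).__arrow_c_array__("int32")[0] == "int32"
# Values don't fit in 32 bits: the callee falls back to its default schema.
assert IntColumn([2**40]).__arrow_c_array__("int32")[0] == "int64"
```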