- Install Poetry:

  ```bash
  pip install poetry
  ```

- Activate the virtual environment:

  ```bash
  poetry shell
  ```

- Install dependencies:

  ```bash
  poetry install --all-extras
  ```
To test the pipeline with a message queue, you can use Docker Compose to set up Kafka, Redpanda, or RabbitMQ locally.
### Kafka

```bash
docker compose --file ./scripts/queues/kafka-compose.yml up
```

In your YAML configuration, set the queue configuration to Kafka under `engine`:

```yaml
engine:
  queue:
    type: kafka
    config:
      queue:
        bootstrap_server: localhost:9092
```
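Indentation is significant in these snippets. To double-check that a block like the one above nests the way you expect, you can parse it with PyYAML (a third-party package assumed here for illustration; it is not necessarily one of bizon's own dependencies) and inspect the resulting mapping:

```python
import yaml

# The Kafka queue configuration from above, as a string.
snippet = """
engine:
  queue:
    type: kafka
    config:
      queue:
        bootstrap_server: localhost:9092
"""

config = yaml.safe_load(snippet)

# The nesting resolves to a plain dict-of-dicts:
queue = config["engine"]["queue"]
print(queue["type"])                                 # kafka
print(queue["config"]["queue"]["bootstrap_server"])  # localhost:9092
```

If the `bootstrap_server` line were indented one level too shallow, the lookup would fail with a `KeyError`, which makes this a quick sanity check before running the pipeline.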
### Redpanda

```bash
docker compose --file ./scripts/queues/redpanda-compose.yml up
```

In your YAML configuration, set the queue configuration to Redpanda under `engine`:

```yaml
engine:
  queue:
    type: kafka
    config:
      queue:
        bootstrap_server: localhost:19092
```
### RabbitMQ

```bash
docker compose --file ./scripts/queues/rabbitmq-compose.yml up
```

In your YAML configuration, set the queue configuration to RabbitMQ under `engine`:

```yaml
engine:
  queue:
    type: rabbitmq
    config:
      queue:
        host: localhost
```
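The three queue blocks differ only in `type` and in the connection key: Kafka and Redpanda both use `type: kafka` with a `bootstrap_server` (ports 9092 and 19092 respectively), while RabbitMQ uses `type: rabbitmq` with a `host`. A minimal sketch of a shape check for these blocks (the `validate_queue` helper is hypothetical, not part of bizon):

```python
# Connection key expected for each queue type, per the snippets above.
REQUIRED_KEY = {
    "kafka": "bootstrap_server",  # used by both Kafka and Redpanda
    "rabbitmq": "host",
}

def validate_queue(queue_cfg: dict) -> str:
    """Hypothetical helper: check a `queue` block's shape and return its type."""
    qtype = queue_cfg["type"]
    key = REQUIRED_KEY[qtype]
    settings = queue_cfg["config"]["queue"]
    if key not in settings:
        raise ValueError(f"'{qtype}' queue config requires '{key}'")
    return qtype

# The three configurations from this section, as nested dicts:
kafka = {"type": "kafka", "config": {"queue": {"bootstrap_server": "localhost:9092"}}}
redpanda = {"type": "kafka", "config": {"queue": {"bootstrap_server": "localhost:19092"}}}
rabbitmq = {"type": "rabbitmq", "config": {"queue": {"host": "localhost"}}}

for cfg in (kafka, redpanda, rabbitmq):
    print(validate_queue(cfg))  # kafka, kafka, rabbitmq
```

Note that Redpanda is Kafka-compatible, which is why it reuses `type: kafka` and only the port changes.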
If you don't start a queue backend, the tests fall back on the bizon default backend, which is the destination warehouse.
Create a `.env` file in the `/tests` folder:

```bash
BIGQUERY_PROJECT_ID=<YOUR_PROJECT_ID>
BIGQUERY_DATASET_ID=bizon_test
```
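The `.env` file is plain `KEY=VALUE` lines. How the test suite loads it is not shown here, so the parser below is only an illustration of the file format, not bizon's actual loader; the project ID value is a placeholder for the demo:

```python
import os
import tempfile

def load_dotenv(path: str) -> dict:
    """Minimal KEY=VALUE parser for a .env file (illustration only)."""
    values = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            # Skip blanks, comments, and lines without an assignment.
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    return values

# Demo with the two variables from the snippet above:
with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, ".env")
    with open(path, "w") as fh:
        fh.write("BIGQUERY_PROJECT_ID=my-project\n")
        fh.write("BIGQUERY_DATASET_ID=bizon_test\n")
    env = load_dotenv(path)
    print(env["BIGQUERY_DATASET_ID"])  # bizon_test
```

After parsing, the values can be exported with `os.environ.update(env)` so they are visible to the current process.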