docs: Add docs for all Django settings #126

Merged (3 commits) on Feb 8, 2023
6 changes: 6 additions & 0 deletions CHANGELOG.rst
@@ -14,6 +14,12 @@ Change Log
Unreleased
**********

[3.9.2] - 2023-02-07
********************
Fixed
=====
* Added documentation for all Django settings used in the consumer and producer

[3.9.1] - 2023-02-07
********************
There was no version 3.9.0, due to a release issue. (Ignore any ``v3.9.0`` tag.)
2 changes: 1 addition & 1 deletion edx_event_bus_kafka/__init__.py
@@ -9,4 +9,4 @@
from edx_event_bus_kafka.internal.consumer import KafkaEventConsumer
from edx_event_bus_kafka.internal.producer import KafkaEventProducer, create_producer

__version__ = '3.9.1'
__version__ = '3.9.2'
32 changes: 32 additions & 0 deletions edx_event_bus_kafka/internal/config.py
@@ -38,12 +38,29 @@ def get_schema_registry_client():
warnings.warn('Library confluent-kafka not available. Cannot create schema registry client.')
return None

# .. setting_name: EVENT_BUS_KAFKA_SCHEMA_REGISTRY_URL
# .. setting_default: None
# .. setting_description: URL of the Avro schema registry, used to manage the evolution of
#   event bus data and to ensure that consumers can decode the events that are produced.
#   Required for both producers and consumers; must point to an instance of Confluent
#   Schema Registry: https://docs.confluent.io/platform/current/schema-registry/index.html
# If needed, auth information must be added to ``EVENT_BUS_KAFKA_SCHEMA_REGISTRY_API_KEY``
# and ``EVENT_BUS_KAFKA_SCHEMA_REGISTRY_API_SECRET``.
url = getattr(settings, 'EVENT_BUS_KAFKA_SCHEMA_REGISTRY_URL', None)
if url is None:
warnings.warn("Cannot configure event-bus-kafka: Missing setting EVENT_BUS_KAFKA_SCHEMA_REGISTRY_URL")
return None

# .. setting_name: EVENT_BUS_KAFKA_SCHEMA_REGISTRY_API_KEY
# .. setting_default: ''
# .. setting_description: API key for talking to the Avro schema registry specified in

Contributor:
This key and secret are optional as well (figure we should probably be consistent with how we describe the Kafka API key)

Contributor (author):
Oh! Yes, I was wondering about this and forgot to leave myself a note. The Kafka API key/secret are definitely optional because if either one is omitted, we skip adding both of them to the config. But with these schema registry values, we always use them, even if they're None or empty. So for instance we could end up with a basic.auth.user.info of :, user:, :pass, or user:pass. Not all of those look particularly sensible. I'll go check the schema registry docs again...

(Also, I see that I've documented them as defaulting to None but they're actually ''. Oops.)
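
To make the asymmetry described above concrete, here is a minimal sketch (the helper names are made up, and the SASL config keys in the second function are an assumption, since the corresponding code is collapsed in this diff):

from django.conf import settings

def sketch_schema_registry_auth():
    # The schema-registry key/secret are always joined into basic.auth.user.info,
    # so missing values can produce ':', 'user:', or ':pass'.
    key = getattr(settings, 'EVENT_BUS_KAFKA_SCHEMA_REGISTRY_API_KEY', '')
    secret = getattr(settings, 'EVENT_BUS_KAFKA_SCHEMA_REGISTRY_API_SECRET', '')
    return {'basic.auth.user.info': f"{key}:{secret}"}

def sketch_kafka_auth():
    # The Kafka key/secret are only added when *both* are present; a partial
    # pair results in no auth configuration at all.
    config = {}
    key = getattr(settings, 'EVENT_BUS_KAFKA_API_KEY', None)
    secret = getattr(settings, 'EVENT_BUS_KAFKA_API_SECRET', None)
    if key and secret:
        # Assumed librdkafka-style keys; the actual keys used by the library
        # are not visible in this diff.
        config['sasl.username'] = key
        config['sasl.password'] = secret
    return config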

Contributor (author):
Fixed the documentation of the defaults. OK, here's what I've got so far:

I guess what I'm unsure of is this: What happens if the schema-registry server is configured not to require auth, and the client uses the defaults, i.e. ':'? Is that going to cause an issue?

Contributor:
It looks like it will check the URL first for auth information and then error if there is also basic.auth.user.info. I vote we punt on what happens if the schema registry server doesn't expect any auth, and just put in the docs somewhere that users should use the settings provided instead of putting the user and password in the URL.
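
For example, a sketch of the shape this guidance suggests for Django settings (placeholder values; the point is that credentials belong in the dedicated settings, not embedded in the URL):

# settings.py (illustrative placeholder values)
EVENT_BUS_KAFKA_SCHEMA_REGISTRY_URL = "https://schema-registry.example.com"
# Preferred over "https://user:pass@schema-registry.example.com":
EVENT_BUS_KAFKA_SCHEMA_REGISTRY_API_KEY = "registry-key"
EVENT_BUS_KAFKA_SCHEMA_REGISTRY_API_SECRET = "registry-secret"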

Contributor (author):
Yeah, we should probably document that auth goes in the auth settings and not in the URL. But what to do about required/optional? I don't want to make claims in the docstrings that we can't back up, and so I'm tempted to just... not mention required/optional either way, for those two.

Contributor:
I think "Optional" still makes sense because we do know for sure that you don't have to always have to set them.

Contributor (author):
Hmm. I still don't know what happens if the schema registry isn't configured for auth and we pass an auth header containing ':' anyway. Maybe nothing? I guess if it causes any issues then we can put in a bugfix when we find out.

# ``EVENT_BUS_KAFKA_SCHEMA_REGISTRY_URL``. Optional.
key = getattr(settings, 'EVENT_BUS_KAFKA_SCHEMA_REGISTRY_API_KEY', '')
# .. setting_name: EVENT_BUS_KAFKA_SCHEMA_REGISTRY_API_SECRET
# .. setting_default: ''
# .. setting_description: API secret for talking to the Avro schema registry specified in
# ``EVENT_BUS_KAFKA_SCHEMA_REGISTRY_URL``. Optional.
secret = getattr(settings, 'EVENT_BUS_KAFKA_SCHEMA_REGISTRY_API_SECRET', '')

return SchemaRegistryClient({
@@ -58,6 +75,11 @@ def load_common_settings() -> Optional[dict]:

Warns and returns None if essential settings are missing.
"""
# .. setting_name: EVENT_BUS_KAFKA_BOOTSTRAP_SERVERS
# .. setting_default: None
# .. setting_description: Comma-separated list of one or more Kafka bootstrap servers
#   (hosts with optional ports). Required for both producers and consumers.
# See https://kafka.apache.org/documentation/ for more info.
bootstrap_servers = getattr(settings, 'EVENT_BUS_KAFKA_BOOTSTRAP_SERVERS', None)
if bootstrap_servers is None:
warnings.warn("Cannot configure event-bus-kafka: Missing setting EVENT_BUS_KAFKA_BOOTSTRAP_SERVERS")
@@ -67,7 +89,17 @@ def load_common_settings() -> Optional[dict]:
'bootstrap.servers': bootstrap_servers,
}

# .. setting_name: EVENT_BUS_KAFKA_API_KEY
# .. setting_default: None
# .. setting_description: Optional API key for connecting to the Kafka cluster
#   via SASL/PLAIN over TLS. Used as the SASL username. If not specified, no
#   authentication will be attempted.
key = getattr(settings, 'EVENT_BUS_KAFKA_API_KEY', None)
# .. setting_name: EVENT_BUS_KAFKA_API_SECRET
# .. setting_default: None
# .. setting_description: Optional API secret for connecting to the Kafka cluster
#   via SASL/PLAIN over TLS. Used as the SASL password. If not specified, no
#   authentication will be attempted.
secret = getattr(settings, 'EVENT_BUS_KAFKA_API_SECRET', None)

if key and secret:
4 changes: 4 additions & 0 deletions edx_event_bus_kafka/internal/consumer.py
@@ -44,6 +44,10 @@
# .. toggle_creation_date: 2022-01-31
KAFKA_CONSUMERS_ENABLED = SettingToggle('EVENT_BUS_KAFKA_CONSUMERS_ENABLED', default=True)

# .. setting_name: EVENT_BUS_KAFKA_CONSUMER_POLL_TIMEOUT
# .. setting_default: 1.0
# .. setting_description: How long the consumer should wait, in seconds, for the Kafka broker
#   to respond to a poll() call.
CONSUMER_POLL_TIMEOUT = getattr(settings, 'EVENT_BUS_KAFKA_CONSUMER_POLL_TIMEOUT', 1.0)

# .. setting_name: EVENT_BUS_KAFKA_CONSUMER_POLL_FAILURE_SLEEP
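
Taken together, a deployment using the settings documented in this PR might look roughly like the following (a hedged sketch with placeholder values, not a definitive or recommended configuration):

# settings.py (illustrative placeholder values)
EVENT_BUS_KAFKA_BOOTSTRAP_SERVERS = "kafka-1.example.com:9092,kafka-2.example.com:9092"
EVENT_BUS_KAFKA_API_KEY = "kafka-api-key"        # optional; auth is only attempted if both key and secret are set
EVENT_BUS_KAFKA_API_SECRET = "kafka-api-secret"
EVENT_BUS_KAFKA_SCHEMA_REGISTRY_URL = "https://schema-registry.example.com"
EVENT_BUS_KAFKA_SCHEMA_REGISTRY_API_KEY = "registry-key"        # optional
EVENT_BUS_KAFKA_SCHEMA_REGISTRY_API_SECRET = "registry-secret"
EVENT_BUS_KAFKA_CONSUMER_POLL_TIMEOUT = 1.0        # seconds; matches the documented default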