diff --git a/.readthedocs.yml b/.readthedocs.yml
new file mode 100644
index 00000000..2739d0f6
--- /dev/null
+++ b/.readthedocs.yml
@@ -0,0 +1,12 @@
+version: 2
+
+python:
+ version: 3.5
+ install:
+ - requirements: docs/requirements.txt
+ - requirements: requirements.txt
+
+sphinx:
+ builder: html
+ configuration: docs/conf.py
+ fail_on_warning: true
diff --git a/README.md b/README.md
index 61e67050..6013179f 100644
--- a/README.md
+++ b/README.md
@@ -81,20 +81,22 @@ The SDK is tested with the most recent patch releases of Python 2.7, 3.3, 3.4, 3
Database integrations
---------------------
-Feature flag data can be kept in a persistent store using Consul, DynamoDB, or Redis. These adapters are implemented in the `Consul`, `DynamoDB` and `Redis` classes in `ldclient.integrations`; to use them, call the `new_feature_store` method in the appropriate class, and put the returned object in the `feature_store` property of your client configuration. See [`ldclient.integrations`](https://github.com/launchdarkly/python-client-private/blob/master/ldclient/integrations.py) and the [SDK reference guide](https://docs.launchdarkly.com/v2.0/docs/using-a-persistent-feature-store) for more information.
+Feature flag data can be kept in a persistent store using Consul, DynamoDB, or Redis. These adapters are implemented in the `Consul`, `DynamoDB` and `Redis` classes in `ldclient.integrations`; to use them, call the `new_feature_store` method in the appropriate class, and put the returned object in the `feature_store` property of your client configuration. See [`ldclient.integrations`](https://launchdarkly-python-sdk.readthedocs.io/en/latest/api-integrations.html#module-ldclient.integrations) and the [SDK reference guide](https://docs.launchdarkly.com/v2.0/docs/using-a-persistent-feature-store) for more information.
Note that Consul is not supported in Python 3.3 or 3.4.
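+
+For example, a Redis-backed store can be set up like this (a minimal sketch; the default Redis URL and caching options are used, and the SDK key shown is a placeholder):
+
+```python
+from ldclient.config import Config
+from ldclient.integrations import Redis
+
+store = Redis.new_feature_store()
+config = Config(sdk_key="YOUR_SDK_KEY", feature_store=store)
+```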
Using flag data from a file
---------------------------
-For testing purposes, the SDK can be made to read feature flag state from a file or files instead of connecting to LaunchDarkly. See [`file_data_source.py`](https://github.com/launchdarkly/python-client/blob/master/ldclient/file_data_source.py) and the [SDK reference guide](https://docs.launchdarkly.com/v2.0/docs/reading-flags-from-a-file) for more details.
+For testing purposes, the SDK can be made to read feature flag state from a file or files instead of connecting to LaunchDarkly. See [`ldclient.integrations.Files`](https://launchdarkly-python-sdk.readthedocs.io/en/latest/api-integrations.html#ldclient.integrations.Files) and the [SDK reference guide](https://docs.launchdarkly.com/v2.0/docs/reading-flags-from-a-file) for more details.
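+
+A minimal sketch (this assumes the factory returned by `Files.new_data_source` is passed as the `update_processor_class` of your configuration; the file path is a placeholder):
+
+```python
+from ldclient.config import Config
+from ldclient.integrations import Files
+
+data_source = Files.new_data_source(paths=["./flags.json"], auto_update=True)
+config = Config(update_processor_class=data_source)
+```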
Learn more
------------
+----------
Check out our [documentation](http://docs.launchdarkly.com) for in-depth instructions on configuring and using LaunchDarkly. You can also head straight to the [complete reference guide for this SDK](http://docs.launchdarkly.com/docs/python-sdk-reference).
+Generated API documentation is on [readthedocs.io](https://launchdarkly-python-sdk.readthedocs.io/en/latest/).
+
Testing
-------
@@ -116,16 +118,18 @@ About LaunchDarkly
* Turn off a feature that you realize is causing performance problems in production, without needing to re-deploy, or even restart the application with a changed configuration file.
* Grant access to certain features based on user attributes, like payment plan (eg: users on the ‘gold’ plan get access to more features than users in the ‘silver’ plan). Disable parts of your application to facilitate maintenance, without taking everything offline.
* LaunchDarkly provides feature flag SDKs for
- * [Java](http://docs.launchdarkly.com/docs/java-sdk-reference "Java SDK")
+ * [Java](http://docs.launchdarkly.com/docs/java-sdk-reference "LaunchDarkly Java SDK")
* [JavaScript](http://docs.launchdarkly.com/docs/js-sdk-reference "LaunchDarkly JavaScript SDK")
* [PHP](http://docs.launchdarkly.com/docs/php-sdk-reference "LaunchDarkly PHP SDK")
* [Python](http://docs.launchdarkly.com/docs/python-sdk-reference "LaunchDarkly Python SDK")
* [Go](http://docs.launchdarkly.com/docs/go-sdk-reference "LaunchDarkly Go SDK")
* [Node.JS](http://docs.launchdarkly.com/docs/node-sdk-reference "LaunchDarkly Node SDK")
+ * [Electron](http://docs.launchdarkly.com/docs/electron-sdk-reference "LaunchDarkly Electron SDK")
* [.NET](http://docs.launchdarkly.com/docs/dotnet-sdk-reference "LaunchDarkly .Net SDK")
* [Ruby](http://docs.launchdarkly.com/docs/ruby-sdk-reference "LaunchDarkly Ruby SDK")
* [iOS](http://docs.launchdarkly.com/docs/ios-sdk-reference "LaunchDarkly iOS SDK")
* [Android](http://docs.launchdarkly.com/docs/android-sdk-reference "LaunchDarkly Android SDK")
+ * [C/C++](http://docs.launchdarkly.com/docs/c-sdk-reference "LaunchDarkly C/C++ SDK")
* Explore LaunchDarkly
* [launchdarkly.com](http://www.launchdarkly.com/ "LaunchDarkly Main Website") for more information
* [docs.launchdarkly.com](http://docs.launchdarkly.com/ "LaunchDarkly Documentation") for our documentation and SDKs
diff --git a/azure-pipelines.yml b/azure-pipelines.yml
new file mode 100644
index 00000000..b7f19ff3
--- /dev/null
+++ b/azure-pipelines.yml
@@ -0,0 +1,52 @@
+jobs:
+ - job: build
+ pool:
+ vmImage: 'vs2017-win2016'
+ steps:
+ - task: PowerShell@2
+ displayName: 'Setup Dynamo'
+ inputs:
+ targetType: inline
+ workingDirectory: $(System.DefaultWorkingDirectory)
+ script: |
+ iwr -outf dynamo.zip https://s3-us-west-2.amazonaws.com/dynamodb-local/dynamodb_local_latest.zip
+ mkdir dynamo
+ Expand-Archive -Path dynamo.zip -DestinationPath dynamo
+ cd dynamo
+ javaw -D"java.library.path=./DynamoDBLocal_lib" -jar DynamoDBLocal.jar
+ - task: PowerShell@2
+ displayName: 'Setup Consul'
+ inputs:
+ targetType: inline
+ workingDirectory: $(System.DefaultWorkingDirectory)
+ script: |
+ iwr -outf consul.zip https://releases.hashicorp.com/consul/1.4.2/consul_1.4.2_windows_amd64.zip
+ mkdir consul
+ Expand-Archive -Path consul.zip -DestinationPath consul
+ cd consul
+ sc.exe create "Consul" binPath="$(System.DefaultWorkingDirectory)/consul/consul.exe agent -dev"
+ sc.exe start "Consul"
+ - task: PowerShell@2
+ displayName: 'Setup Redis'
+ inputs:
+ targetType: inline
+ workingDirectory: $(System.DefaultWorkingDirectory)
+ script: |
+ iwr -outf redis.zip https://github.com/MicrosoftArchive/redis/releases/download/win-3.0.504/Redis-x64-3.0.504.zip
+ mkdir redis
+ Expand-Archive -Path redis.zip -DestinationPath redis
+ cd redis
+ ./redis-server --service-install
+ ./redis-server --service-start
+ - task: PowerShell@2
+ displayName: 'Setup SDK and Test'
+ inputs:
+ targetType: inline
+ workingDirectory: $(System.DefaultWorkingDirectory)
+ script: |
+ python --version
+ pip install -r test-requirements.txt
+ pip install -r consul-requirements.txt
+ python setup.py install
+ mkdir test-reports
+ pytest -s --junitxml=test-reports/junit.xml testing;
diff --git a/docs/Makefile b/docs/Makefile
new file mode 100644
index 00000000..ebce0c0b
--- /dev/null
+++ b/docs/Makefile
@@ -0,0 +1,19 @@
+# Minimal makefile for Sphinx documentation
+#
+
+.PHONY: help install html
+
+SPHINXOPTS =
+SPHINXBUILD = sphinx-build
+SPHINXPROJ = ldclient-py
+SOURCEDIR = .
+BUILDDIR = build
+
+help:
+ @$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
+
+install:
+ pip install -r requirements.txt
+
+html: install
+ @$(SPHINXBUILD) -M html "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
diff --git a/docs/README.md b/docs/README.md
new file mode 100644
index 00000000..fee89947
--- /dev/null
+++ b/docs/README.md
@@ -0,0 +1,29 @@
+# How the Python SDK documentation works
+
+The generated API documentation is built with [Sphinx](http://www.sphinx-doc.org/en/master/), and is hosted on [Read the Docs](https://readthedocs.org/).
+
+It uses the following:
+
+* Docstrings within the code. Docstrings can use any of the markup supported by Sphinx.
+* The `.rst` files in the `docs` directory. These provide the overall page structure.
+* The `conf.py` file containing Sphinx settings.
+
+## What to document
+
+Every public class, method, and module should have a docstring. Classes and methods with no docstring will not be included in the API docs.
+
+"Public" here means things that we want third-party developers to use. The SDK also contains many modules and classes that are not actually private (i.e. they aren't prefixed with `_`), but are for internal use only and aren't supported for any other use (we would like to reduce the amount of these in future).
+
+To add an undocumented class or method in an existing module to the docs, just give it a docstring.
+
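+For example, a docstring using Sphinx field markup might look like this (a hypothetical method, shown only to illustrate the format):
+
+```python
+def greeting(self, name):
+    """Returns a greeting for the given name.
+
+    :param string name: the name to greet
+    :rtype: string
+    """
+    return "Hello, " + name
+```
+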
+To add a new module to the docs, give it a docstring and then add a link to it in the appropriate `api-*.rst` file, in the same format as the existing links.
+
+## Undocumented things
+
+Modules that contain only implementation details are omitted from the docs by simply not including links to them in the `.rst` files.
+
+Internal classes in a documented module will be omitted from the docs if they do not have any docstrings, unless they inherit from another class that has docstrings. In the latter case, the way to omit them from the docs is to edit the `.rst` file that contains the link to that module, and add a `:members:` directive under the module that specifically lists all the classes that _should_ be shown.
+
+## Testing
+
+In the `docs` directory, run `make html` to build all the docs. Then view `docs/build/html/index.html`.
diff --git a/docs/api-extending.rst b/docs/api-extending.rst
new file mode 100644
index 00000000..4f668ce0
--- /dev/null
+++ b/docs/api-extending.rst
@@ -0,0 +1,25 @@
+Extending the SDK
+=================
+
+ldclient.interfaces module
+--------------------------
+
+.. automodule:: ldclient.interfaces
+ :members:
+ :special-members: __init__
+ :show-inheritance:
+
+ldclient.feature_store_helpers module
+-------------------------------------
+
+.. automodule:: ldclient.feature_store_helpers
+ :members:
+ :special-members: __init__
+ :show-inheritance:
+
+ldclient.versioned_data_kind module
+-----------------------------------
+
+.. automodule:: ldclient.versioned_data_kind
+ :members:
+ :show-inheritance:
diff --git a/docs/api-integrations.rst b/docs/api-integrations.rst
new file mode 100644
index 00000000..8d8146ff
--- /dev/null
+++ b/docs/api-integrations.rst
@@ -0,0 +1,10 @@
+Integrating with other services
+===============================
+
+ldclient.integrations module
+----------------------------
+
+.. automodule:: ldclient.integrations
+ :members:
+ :special-members: __init__
+ :show-inheritance:
diff --git a/docs/api-main.rst b/docs/api-main.rst
new file mode 100644
index 00000000..56417ea5
--- /dev/null
+++ b/docs/api-main.rst
@@ -0,0 +1,40 @@
+Core API
+========
+
+ldclient module
+---------------
+
+.. automodule:: ldclient
+ :members: get,set_config,set_sdk_key
+ :show-inheritance:
+
+ldclient.client module
+----------------------
+
+.. automodule:: ldclient.client
+ :members: LDClient
+ :special-members: __init__
+ :show-inheritance:
+
+ldclient.config module
+----------------------
+
+.. automodule:: ldclient.config
+ :members:
+ :special-members: __init__
+ :show-inheritance:
+
+ldclient.flag module
+--------------------
+
+.. automodule:: ldclient.flag
+ :members: EvaluationDetail
+ :special-members: __init__
+ :show-inheritance:
+
+ldclient.flags_state module
+---------------------------
+
+.. automodule:: ldclient.flags_state
+ :members:
+ :show-inheritance:
diff --git a/docs/conf.py b/docs/conf.py
new file mode 100644
index 00000000..f1dc322b
--- /dev/null
+++ b/docs/conf.py
@@ -0,0 +1,174 @@
+# -*- coding: utf-8 -*-
+#
+# Configuration file for the Sphinx documentation builder.
+#
+# This file does only contain a selection of the most common options. For a
+# full list see the documentation:
+# http://www.sphinx-doc.org/en/master/config
+
+# -- Path setup --------------------------------------------------------------
+
+# If extensions (or modules to document with autodoc) are in another directory,
+# add these directories to sys.path here. If the directory is relative to the
+# documentation root, use os.path.abspath to make it absolute, like shown here.
+#
+# import os
+# import sys
+# sys.path.insert(0, os.path.abspath('.'))
+
+import os
+import sys
+
+sys.path.insert(0, os.path.abspath('..'))
+
+import ldclient
+
+# -- Project information -----------------------------------------------------
+
+project = u'ldclient-py'
+copyright = u'2019, LaunchDarkly'
+author = u'LaunchDarkly'
+
+# The short X.Y version.
+version = ldclient.__version__
+# The full version, including alpha/beta/rc tags.
+release = ldclient.__version__
+
+
+# -- General configuration ---------------------------------------------------
+
+# If your documentation needs a minimal Sphinx version, state it here.
+#
+# needs_sphinx = '1.0'
+
+# Add any Sphinx extension module names here, as strings. They can be
+# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
+# ones.
+extensions = [
+ 'sphinx.ext.autodoc',
+ 'sphinx.ext.coverage',
+ 'sphinx.ext.viewcode',
+]
+
+# Add any paths that contain templates here, relative to this directory.
+templates_path = ['_templates']
+
+# The suffix(es) of source filenames.
+# You can specify multiple suffix as a list of string:
+#
+# source_suffix = ['.rst', '.md']
+source_suffix = '.rst'
+
+# The master toctree document.
+master_doc = 'index'
+
+# The language for content autogenerated by Sphinx. Refer to documentation
+# for a list of supported languages.
+#
+# This is also used if you do content translation via gettext catalogs.
+# Usually you set "language" from the command line for these cases.
+language = None
+
+# List of patterns, relative to source directory, that match files and
+# directories to ignore when looking for source files.
+# This pattern also affects html_static_path and html_extra_path .
+exclude_patterns = ['build']
+
+# The name of the Pygments (syntax highlighting) style to use.
+pygments_style = 'sphinx'
+
+
+# -- Options for HTML output -------------------------------------------------
+
+# The theme to use for HTML and HTML Help pages. See the documentation for
+# a list of builtin themes.
+#
+html_theme = 'sphinx_rtd_theme'
+
+# Theme options are theme-specific and customize the look and feel of a theme
+# further. For a list of options available for each theme, see the
+# documentation.
+#
+# html_theme_options = {}
+
+# Add any paths that contain custom static files (such as style sheets) here,
+# relative to this directory. They are copied after the builtin static files,
+# so a file named "default.css" will overwrite the builtin "default.css".
+html_static_path = ['_static']
+
+# Custom sidebar templates, must be a dictionary that maps document names
+# to template names.
+#
+# The default sidebars (for documents that don't match any pattern) are
+# defined by theme itself. Builtin themes are using these templates by
+# default: ``['localtoc.html', 'relations.html', 'sourcelink.html',
+# 'searchbox.html']``.
+#
+# html_sidebars = {}
+
+
+# -- Options for HTMLHelp output ---------------------------------------------
+
+# Output file base name for HTML help builder.
+htmlhelp_basename = 'ldclient-pydoc'
+
+
+# -- Options for LaTeX output ------------------------------------------------
+
+latex_elements = {
+ # The paper size ('letterpaper' or 'a4paper').
+ #
+ # 'papersize': 'letterpaper',
+
+ # The font size ('10pt', '11pt' or '12pt').
+ #
+ # 'pointsize': '10pt',
+
+ # Additional stuff for the LaTeX preamble.
+ #
+ # 'preamble': '',
+
+ # Latex figure (float) alignment
+ #
+ # 'figure_align': 'htbp',
+}
+
+# Grouping the document tree into LaTeX files. List of tuples
+# (source start file, target name, title,
+# author, documentclass [howto, manual, or own class]).
+latex_documents = [
+ (master_doc, 'ldclient-py.tex', u'ldclient-py Documentation',
+ u'LaunchDarkly', 'manual'),
+]
+
+
+# -- Options for manual page output ------------------------------------------
+
+# One entry per manual page. List of tuples
+# (source start file, name, description, authors, manual section).
+man_pages = [
+ (master_doc, 'ldclient-py', u'ldclient-py Documentation',
+ [author], 1)
+]
+
+
+# -- Options for Texinfo output ----------------------------------------------
+
+# Grouping the document tree into Texinfo files. List of tuples
+# (source start file, target name, title, author,
+# dir menu entry, description, category)
+texinfo_documents = [
+ (master_doc, 'ldclient-py', u'ldclient-py Documentation',
+ author, 'ldclient-py', 'One line description of project.',
+ 'Miscellaneous'),
+]
+
+
+# -- Extension configuration -------------------------------------------------
+
+autodoc_default_options = {
+ 'members': None,
+ 'show-inheritance': None,
+ 'special-members': None,
+ 'undoc-members': None
+}
diff --git a/docs/index.rst b/docs/index.rst
new file mode 100644
index 00000000..7a9d2c73
--- /dev/null
+++ b/docs/index.rst
@@ -0,0 +1,21 @@
+.. ldclient-py documentation master file, created by
+ sphinx-quickstart on Mon Feb 4 13:16:49 2019.
+ You can adapt this file completely to your liking, but it should at least
+ contain the root `toctree` directive.
+
+LaunchDarkly Python SDK
+=======================
+
+This is the API reference for the `LaunchDarkly <http://www.launchdarkly.com/>`_ SDK for Python.
+
+The latest version of the SDK can be found on `PyPI <https://pypi.org/project/ldclient-py/>`_, and the source code is on `GitHub <https://github.com/launchdarkly/python-client>`_.
+
+For more information, see LaunchDarkly's `Quickstart <http://docs.launchdarkly.com/docs/getting-started>`_ and `SDK Reference Guide <http://docs.launchdarkly.com/docs/python-sdk-reference>`_.
+
+.. toctree::
+ :maxdepth: 2
+ :caption: Contents:
+
+ api-main
+ api-integrations
+ api-extending
diff --git a/docs/requirements.txt b/docs/requirements.txt
new file mode 100644
index 00000000..f6c80357
--- /dev/null
+++ b/docs/requirements.txt
@@ -0,0 +1,11 @@
+sphinx<2.0
+sphinx_rtd_theme
+
+backoff>=1.4.3
+certifi>=2018.4.16
+expiringdict>=1.1.4
+six>=1.10.0
+pyRFC3339>=1.0
+jsonpickle==0.9.3
+semver>=2.7.9
+urllib3>=1.22.0
diff --git a/ldclient/__init__.py b/ldclient/__init__.py
index f693d989..d75b6b61 100644
--- a/ldclient/__init__.py
+++ b/ldclient/__init__.py
@@ -1,3 +1,7 @@
+"""
+The ldclient module contains the most common top-level entry points for the SDK.
+"""
+
import logging
from ldclient.rwlock import ReadWriteLock
@@ -20,12 +24,16 @@
__lock = ReadWriteLock()
-# 2 Use Cases:
-# 1. Initial setup: sets the config for the uninitialized client
-# 2. Allows on-the-fly changing of the config. When this function is called after the client has been initialized
-# the client will get re-initialized with the new config. In order for this to work, the return value of
-# ldclient.get() should never be assigned
def set_config(config):
+ """Sets the configuration for the shared SDK client instance.
+
+ If this is called prior to :func:`ldclient.get()`, it stores the configuration that will be used when the
+ client is initialized. If it is called after the client has already been initialized, the client will be
+ re-initialized with the new configuration (this will result in the next call to :func:`ldclient.get()`
+ returning a new client instance).
+
+ :param ldclient.config.Config config: the client configuration
+ """
global __config
global __client
global __lock
@@ -42,12 +50,18 @@ def set_config(config):
__lock.unlock()
-# 2 Use Cases:
-# 1. Initial setup: sets the sdk key for the uninitialized client
-# 2. Allows on-the-fly changing of the sdk key. When this function is called after the client has been initialized
-# the client will get re-initialized with the new sdk key. In order for this to work, the return value of
-# ldclient.get() should never be assigned
def set_sdk_key(sdk_key):
+ """Sets the SDK key for the shared SDK client instance.
+
+ If this is called prior to :func:`ldclient.get()`, it stores the SDK key that will be used when the client is
+ initialized. If it is called after the client has already been initialized, the client will be
+ re-initialized with the new SDK key (this will result in the next call to :func:`ldclient.get()` returning a
+ new client instance).
+
+ If you need to set any configuration options other than the SDK key, use :func:`ldclient.set_config()` instead.
+
+ :param string sdk_key: the new SDK key
+ """
global __config
global __client
global __lock
@@ -76,6 +90,18 @@ def set_sdk_key(sdk_key):
def get():
+ """Returns the shared SDK client instance, using the current global configuration.
+
+ To use the SDK as a singleton, first make sure you have called :func:`ldclient.set_sdk_key()` or
+ :func:`ldclient.set_config()` at startup time. Then ``get()`` will return the same shared
+ :class:`ldclient.client.LDClient` instance each time. The client will be initialized if it has
+ not been already.
+
+    If you need to create multiple client instances with different configurations, you can call the
+    :class:`ldclient.client.LDClient` constructor directly instead of using this singleton approach.
+
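+    A minimal singleton usage sketch (the SDK key, flag key, and user shown here are placeholders)::
+
+        import ldclient
+        from ldclient.config import Config
+
+        ldclient.set_config(Config(sdk_key="YOUR_SDK_KEY"))
+        client = ldclient.get()
+        show_feature = client.variation("my-flag-key", {"key": "user@example.com"}, False)
+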
+ :rtype: ldclient.client.LDClient
+ """
global __config
global __client
global __lock
@@ -96,8 +122,15 @@ def get():
__lock.unlock()
-# Add a NullHandler for Python < 2.7 compatibility
+# currently hidden from documentation - see docs/README.md
class NullHandler(logging.Handler):
+ """A :class:`logging.Handler` implementation that does nothing.
+
+ .. deprecated:: 6.0.0
+ You should not need to use this class. It was originally used in order to support Python 2.6,
+ which requires that at least one logging handler must always be configured. However, the SDK
+        which requires that at least one logging handler always be configured. However, the SDK
+ """
def emit(self, record):
pass
diff --git a/ldclient/client.py b/ldclient/client.py
index ff96475b..d1759f6f 100644
--- a/ldclient/client.py
+++ b/ldclient/client.py
@@ -1,3 +1,7 @@
+"""
+This submodule contains the client class that provides most of the SDK functionality.
+"""
+
import hashlib
import hmac
import threading
@@ -55,15 +59,20 @@ def initialized(self):
class LDClient(object):
+ """The LaunchDarkly SDK client object.
+
+ Applications should configure the client at startup time and continue to use it throughout the lifetime
+ of the application, rather than creating instances on the fly. The best way to do this is with the
+ singleton methods :func:`ldclient.set_sdk_key()`, :func:`ldclient.set_config()`, and :func:`ldclient.get()`.
+ However, you may also call the constructor directly if you need to maintain multiple instances.
+
+ Client instances are thread-safe.
+ """
def __init__(self, sdk_key=None, config=None, start_wait=5):
"""Constructs a new LDClient instance.
- Rather than calling this constructor directly, you can call the `ldclient.set_sdk_key`,
- `ldclient.set_config`, and `ldclient.get` functions to configure and use a singleton
- client instance.
-
:param string sdk_key: the SDK key for your LaunchDarkly environment
- :param Config config: optional custom configuration
+ :param ldclient.config.Config config: optional custom configuration
:param float start_wait: the number of seconds to wait for a successful connection to LaunchDarkly
"""
check_uwsgi()
@@ -157,9 +166,13 @@ def _send_event(self, event):
def track(self, event_name, user, data=None):
"""Tracks that a user performed an event.
- :param string event_name: The name of the event.
- :param dict user: The attributes of the user.
- :param data: Optional additional data associated with the event.
+ LaunchDarkly automatically tracks pageviews and clicks that are specified in the Goals
+ section of the dashboard. This can be used to track custom goals or other events that do
+ not currently have goals.
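+
+        For example (assuming ``client`` is an initialized client; the event name and data are placeholders)::
+
+            client.track("completed-checkout", {"key": "user@example.com"}, data={"price": 320})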
+
+ :param string event_name: the name of the event, which may correspond to a goal in A/B tests
+ :param dict user: the attributes of the user
+ :param data: optional additional data associated with the event
"""
self._sanitize_user(user)
if user is None or user.get('key') is None:
@@ -169,6 +182,10 @@ def track(self, event_name, user, data=None):
def identify(self, user):
"""Registers the user.
+ This simply creates an analytics event that will transmit the given user properties to
+ LaunchDarkly, so that the user will be visible on your dashboard even if you have not
+ evaluated any flags for that user. It has no other effect.
+
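+        For example (assuming ``client`` is an initialized client; the attributes shown are placeholders)::
+
+            client.identify({"key": "user@example.com", "name": "Example User"})
+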
:param dict user: attributes of the user to register
"""
self._sanitize_user(user)
@@ -186,19 +203,31 @@ def is_offline(self):
def is_initialized(self):
"""Returns true if the client has successfully connected to LaunchDarkly.
- :rype: bool
+ If this returns false, it means that the client has not yet successfully connected to LaunchDarkly.
+ It might still be in the process of starting up, or it might be attempting to reconnect after an
+ unsuccessful attempt, or it might have received an unrecoverable error (such as an invalid SDK key)
+ and given up.
+
+ :rtype: bool
"""
return self.is_offline() or self._config.use_ldd or self._update_processor.initialized()
def flush(self):
- """Flushes all pending events.
+ """Flushes all pending analytics events.
+
+ Normally, batches of events are delivered in the background at intervals determined by the
+ ``flush_interval`` property of :class:`ldclient.config.Config`. Calling ``flush()``
+ schedules the next event delivery to be as soon as possible; however, the delivery still
+ happens asynchronously on a worker thread, so this method will return immediately.
"""
if self._config.offline:
return
return self._event_processor.flush()
def toggle(self, key, user, default):
- """Deprecated synonym for `variation`.
+ """Deprecated synonym for :func:`variation()`.
+
+ .. deprecated:: 2.0.0
"""
log.warn("Deprecated method: toggle() called. Use variation() instead.")
return self.variation(key, user, default)
@@ -215,27 +244,18 @@ def variation(self, key, user, default):
return self._evaluate_internal(key, user, default, False).value
def variation_detail(self, key, user, default):
- """Determines the variation of a feature flag for a user, like `variation`, but also
- provides additional information about how this value was calculated.
-
- The return value is an EvaluationDetail object, which has three properties:
-
- `value`: the value that was calculated for this user (same as the return value
- of `variation`)
-
- `variation_index`: the positional index of this value in the flag, e.g. 0 for the
- first variation - or `None` if the default value was returned
-
- `reason`: a hash describing the main reason why this value was selected.
+ """Determines the variation of a feature flag for a user, like :func:`variation()`, but also
+ provides additional information about how this value was calculated, in the form of an
+ :class:`ldclient.flag.EvaluationDetail` object.
- The `reason` will also be included in analytics events, if you are capturing
- detailed event data for this flag.
+ Calling this method also causes the "reason" data to be included in analytics events,
+ if you are capturing detailed event data for this flag.
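+
+        For example (assuming ``client`` is an initialized client and ``user`` is a user dictionary)::
+
+            detail = client.variation_detail("my-flag-key", user, False)
+            print(detail.value, detail.variation_index, detail.reason)
+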
:param string key: the unique key for the feature flag
:param dict user: a dictionary containing parameters for the end user requesting the flag
:param object default: the default value of the flag, to be used if the value is not
available from LaunchDarkly
- :return: an EvaluationDetail object describing the result
+ :return: an object describing the result
:rtype: EvaluationDetail
"""
return self._evaluate_internal(key, user, default, True)
@@ -307,8 +327,8 @@ def send_event(value, variation=None, flag=None, reason=None):
def all_flags(self, user):
"""Returns all feature flag values for the given user.
- This method is deprecated - please use `all_flags_state` instead. Current versions of the
- client-side SDK will not generate analytics events correctly if you pass the result of `all_flags`.
+ This method is deprecated - please use :func:`all_flags_state()` instead. Current versions of the
+ client-side SDK will not generate analytics events correctly if you pass the result of ``all_flags``.
:param dict user: the end user requesting the feature flags
:return: a dictionary of feature flag keys to values; returns None if the client is offline,
@@ -322,19 +342,27 @@ def all_flags(self, user):
def all_flags_state(self, user, **kwargs):
"""Returns an object that encapsulates the state of all feature flags for a given user,
- including the flag values and also metadata that can be used on the front end.
+ including the flag values and also metadata that can be used on the front end. See the
+ JavaScript SDK Reference Guide on
+        `Bootstrapping <https://docs.launchdarkly.com/docs/js-sdk-reference#section-bootstrapping>`_.
This method does not send analytics events back to LaunchDarkly.
:param dict user: the end user requesting the feature flags
- :param kwargs: optional parameters affecting how the state is computed: set
- `client_side_only=True` to limit it to only flags that are marked for use with the
- client-side SDK (by default, all flags are included); set `with_reasons=True` to
- include evaluation reasons in the state (see `variation_detail`); set
- `details_only_for_tracked_flags=True` to omit any metadata that is normally only
- used for event generation, such as flag versions and evaluation reasons, unless
- the flag has event tracking or debugging turned on
- :return: a FeatureFlagsState object (will never be None; its 'valid' property will be False
+ :param kwargs: optional parameters affecting how the state is computed - see below
+
+ :Keyword Arguments:
+ * **client_side_only** (*boolean*) --
+ set to True to limit it to only flags that are marked for use with the client-side SDK
+ (by default, all flags are included)
+ * **with_reasons** (*boolean*) --
+ set to True to include evaluation reasons in the state (see :func:`variation_detail()`)
+ * **details_only_for_tracked_flags** (*boolean*) --
+ set to True to omit any metadata that is normally only used for event generation, such
+ as flag versions and evaluation reasons, unless the flag has event tracking or debugging
+ turned on
+
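+        For example (assuming ``client`` is an initialized client and ``user`` is a user dictionary)::
+
+            state = client.all_flags_state(user, client_side_only=True)
+            bootstrap_data = state.to_json_dict()
+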
+ :return: a FeatureFlagsState object (will never be None; its ``valid`` property will be False
if the client is offline, has not been initialized, or the user is None or has no key)
:rtype: FeatureFlagsState
"""
@@ -381,9 +409,11 @@ def all_flags_state(self, user, **kwargs):
return state
def secure_mode_hash(self, user):
- """Generates a hash value for a user.
+ """Computes an HMAC signature of a user signed with the client's SDK key,
+ for use with the JavaScript SDK.
- For more info: https://github.com/launchdarkly/js-client#secure-mode
+ For more information, see the JavaScript SDK Reference Guide on
+        `Secure mode <https://docs.launchdarkly.com/docs/js-sdk-reference#section-secure-mode>`_.
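+
+        For example (assuming ``client`` is an initialized client; the user key is a placeholder)::
+
+            signed_key = client.secure_mode_hash({"key": "user@example.com"})
+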
:param dict user: the attributes of the user
:return: a hash string that can be passed to the front end
diff --git a/ldclient/config.py b/ldclient/config.py
index 35af5110..f8ef61d0 100644
--- a/ldclient/config.py
+++ b/ldclient/config.py
@@ -1,3 +1,9 @@
+"""
+This submodule contains the :class:`Config` class for custom configuration of the SDK client.
+
+Note that the same class can also be imported from the ``ldclient.client`` submodule.
+"""
+
from ldclient.event_processor import DefaultEventProcessor
from ldclient.feature_store import InMemoryFeatureStore
from ldclient.util import log
@@ -7,6 +13,11 @@
class Config(object):
+ """Advanced configuration options for the SDK client.
+
+ To use these options, create an instance of ``Config`` and pass it to either :func:`ldclient.set_config()`
+ if you are using the singleton client, or the :class:`ldclient.client.LDClient` constructor otherwise.
+ """
def __init__(self,
sdk_key=None,
base_uri='https://app.launchdarkly.com',
@@ -59,7 +70,7 @@ def __init__(self,
:param bool offline: Whether the client should be initialized in offline mode. In offline mode,
default values are returned for all flags and no remote network requests are made. By default,
this is false.
- :type update_processor_class: (str, Config, FeatureStore) -> UpdateProcessor
+ :type update_processor_class: (str, ldclient.config.Config, FeatureStore) -> UpdateProcessor
:param float poll_interval: The number of seconds between polls for flag updates if streaming is off.
:param bool use_ldd: Whether you are using the LaunchDarkly relay proxy in daemon mode. In this
configuration, the client will not use a streaming connection to listen for updates, but instead
@@ -79,9 +90,9 @@ def __init__(self,
By default, events will only include the user key, except for one "index" event that provides the
full details for the user.
:param feature_requester_class: A factory for a FeatureRequester implementation taking the sdk key and config
- :type feature_requester_class: (str, Config, FeatureStore) -> FeatureRequester
+ :type feature_requester_class: (str, ldclient.config.Config, FeatureStore) -> FeatureRequester
:param event_processor_class: A factory for an EventProcessor implementation taking the config
- :type event_processor_class: (Config) -> EventProcessor
+ :type event_processor_class: (ldclient.config.Config) -> EventProcessor
:param update_processor_class: A factory for an UpdateProcessor implementation taking the sdk key,
config, and FeatureStore implementation
"""
@@ -118,9 +129,18 @@ def __init__(self,
@classmethod
def default(cls):
+ """Returns a ``Config`` instance with default values for all properties.
+
+ :rtype: ldclient.config.Config
+ """
return cls()
def copy_with_new_sdk_key(self, new_sdk_key):
+ """Returns a new ``Config`` instance that is the same as this one, except for having a different SDK key.
+
+ :param string new_sdk_key: the new SDK key
+ :rtype: ldclient.config.Config
+ """
return Config(sdk_key=new_sdk_key,
base_uri=self.__base_uri,
events_uri=self.__events_uri,
@@ -146,6 +166,7 @@ def copy_with_new_sdk_key(self, new_sdk_key):
user_keys_flush_interval=self.__user_keys_flush_interval,
inline_users_in_events=self.__inline_users_in_events)
+ # for internal use only - probably should be part of the client logic
def get_default(self, key, default):
return default if key not in self.__defaults else self.__defaults[key]
@@ -157,18 +178,22 @@ def sdk_key(self):
def base_uri(self):
return self.__base_uri
+ # for internal use only - also no longer used, will remove
@property
def get_latest_flags_uri(self):
return self.__base_uri + GET_LATEST_FEATURES_PATH
+ # for internal use only - should construct the URL path in the events code, not here
@property
def events_uri(self):
return self.__events_uri + '/bulk'
+ # for internal use only
@property
def stream_base_uri(self):
return self.__stream_uri
+ # for internal use only - should construct the URL path in the streaming code, not here
@property
def stream_uri(self):
return self.__stream_uri + STREAM_FLAGS_PATH
diff --git a/ldclient/event_processor.py b/ldclient/event_processor.py
index 9a0cae83..30619298 100644
--- a/ldclient/event_processor.py
+++ b/ldclient/event_processor.py
@@ -1,7 +1,12 @@
+"""
+Implementation details of the analytics event delivery component.
+"""
+# currently excluded from documentation - see docs/README.md
+
from collections import namedtuple
from email.utils import parsedate
import errno
-import jsonpickle
+import json
from threading import Event, Lock, Thread
import six
import time
@@ -163,7 +168,7 @@ def run(self):
def _do_send(self, output_events):
# noinspection PyBroadException
try:
- json_body = jsonpickle.encode(output_events, unpicklable=False)
+ json_body = json.dumps(output_events)
log.debug('Sending events payload: ' + json_body)
hdrs = _headers(self._config.sdk_key)
hdrs['X-LaunchDarkly-Event-Schema'] = str(__CURRENT_EVENT_SCHEMA__)
diff --git a/ldclient/event_summarizer.py b/ldclient/event_summarizer.py
index 5a9f19ea..c0aa5aeb 100644
--- a/ldclient/event_summarizer.py
+++ b/ldclient/event_summarizer.py
@@ -1,3 +1,8 @@
+"""
+Implementation details of the analytics event delivery component.
+"""
+# currently excluded from documentation - see docs/README.md
+
from collections import namedtuple
diff --git a/ldclient/feature_requester.py b/ldclient/feature_requester.py
index 046c594f..51aee6a0 100644
--- a/ldclient/feature_requester.py
+++ b/ldclient/feature_requester.py
@@ -1,3 +1,8 @@
+"""
+Default implementation of feature flag polling requests.
+"""
+# currently excluded from documentation - see docs/README.md
+
from collections import namedtuple
import json
import urllib3
diff --git a/ldclient/feature_store.py b/ldclient/feature_store.py
index fccef5b5..efabe82e 100644
--- a/ldclient/feature_store.py
+++ b/ldclient/feature_store.py
@@ -1,3 +1,11 @@
+"""
+This submodule contains basic classes related to the feature store.
+
+The feature store is the SDK component that holds the last known state of all feature flags, as
+received from LaunchDarkly. This submodule does not include specific integrations with external
+storage systems; those are in :mod:`ldclient.integrations`.
+"""
+
from collections import OrderedDict, defaultdict
from ldclient.util import log
from ldclient.interfaces import FeatureStore
@@ -16,10 +24,11 @@ def __init__(self,
expiration = DEFAULT_EXPIRATION,
capacity = DEFAULT_CAPACITY):
"""Constructs an instance of CacheConfig.
- :param float expiration: The cache TTL, in seconds. Items will be evicted from the cache after
+
+ :param float expiration: the cache TTL, in seconds. Items will be evicted from the cache after
this amount of time from the time when they were originally cached. If the time is less than or
equal to zero, caching is disabled.
- :param int capacity: The maximum number of items that can be in the cache at a time.
+ :param int capacity: the maximum number of items that can be in the cache at a time
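+
+        A brief sketch (the TTL and capacity shown are arbitrary; any of the store integrations in
+        ``ldclient.integrations`` accept this object)::
+
+            from ldclient.integrations import Redis
+
+            caching = CacheConfig(expiration=30, capacity=500)
+            store = Redis.new_feature_store(caching=caching)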
"""
self._expiration = expiration
self._capacity = capacity
@@ -28,41 +37,58 @@ def __init__(self,
def default():
"""Returns an instance of CacheConfig with default properties. By default, caching is enabled.
This is the same as calling the constructor with no parameters.
- :rtype: CacheConfig
+
+ :rtype: ldclient.feature_store.CacheConfig
"""
return CacheConfig()
@staticmethod
def disabled():
"""Returns an instance of CacheConfig specifying that caching should be disabled.
- :rtype: CacheConfig
+
+ :rtype: ldclient.feature_store.CacheConfig
"""
return CacheConfig(expiration = 0)
@property
def enabled(self):
+ """Returns True if caching is enabled in this configuration.
+
+ :rtype: bool
+ """
return self._expiration > 0
@property
def expiration(self):
+ """Returns the configured cache TTL, in seconds.
+
+ :rtype: float
+ """
return self._expiration
@property
def capacity(self):
+ """Returns the configured maximum number of cacheable items.
+
+ :rtype: int
+ """
return self._capacity
class InMemoryFeatureStore(FeatureStore):
- """
- In-memory implementation of a store that holds feature flags and related data received from the streaming API.
+ """The default feature store implementation, which holds all data in a thread-safe data structure in memory.
"""
def __init__(self):
+ """Constructs an instance of InMemoryFeatureStore.
+ """
self._lock = ReadWriteLock()
self._initialized = False
self._items = defaultdict(dict)
def get(self, kind, key, callback):
+ """
+ """
try:
self._lock.rlock()
itemsOfKind = self._items[kind]
@@ -78,6 +104,8 @@ def get(self, kind, key, callback):
self._lock.runlock()
def all(self, kind, callback):
+ """
+ """
try:
self._lock.rlock()
itemsOfKind = self._items[kind]
@@ -86,6 +114,8 @@ def all(self, kind, callback):
self._lock.runlock()
def init(self, all_data):
+ """
+ """
try:
self._lock.rlock()
self._items.clear()
@@ -98,6 +128,8 @@ def init(self, all_data):
# noinspection PyShadowingNames
def delete(self, kind, key, version):
+ """
+ """
try:
self._lock.rlock()
itemsOfKind = self._items[kind]
@@ -109,6 +141,8 @@ def delete(self, kind, key, version):
self._lock.runlock()
def upsert(self, kind, item):
+ """
+ """
key = item['key']
try:
self._lock.rlock()
@@ -122,6 +156,8 @@ def upsert(self, kind, item):
@property
def initialized(self):
+ """
+ """
try:
self._lock.rlock()
return self._initialized
diff --git a/ldclient/feature_store_helpers.py b/ldclient/feature_store_helpers.py
index 2ba83713..58f9a848 100644
--- a/ldclient/feature_store_helpers.py
+++ b/ldclient/feature_store_helpers.py
@@ -1,18 +1,28 @@
+"""
+This submodule contains support code for writing feature store implementations.
+"""
+
from expiringdict import ExpiringDict
from ldclient.interfaces import FeatureStore
class CachingStoreWrapper(FeatureStore):
- """CachingStoreWrapper is a partial implementation of :class:ldclient.interfaces.FeatureStore that
- delegates the basic functionality to an implementation of :class:ldclient.interfaces.FeatureStoreCore -
- while adding optional caching behavior and other logic that would otherwise be repeated in every
- feature store implementation. This makes it easier to create new database integrations by implementing
- only the database-specific logic.
+ """A partial implementation of :class:`ldclient.interfaces.FeatureStore`.
+
+ This class delegates the basic functionality to an implementation of
+    :class:`ldclient.interfaces.FeatureStoreCore`, while adding optional caching behavior and other logic
+ that would otherwise be repeated in every feature store implementation. This makes it easier to create
+ new database integrations by implementing only the database-specific logic.
"""
__INITED_CACHE_KEY__ = "$inited"
def __init__(self, core, cache_config):
+ """Constructs an instance by wrapping a core implementation object.
+
+ :param FeatureStoreCore core: the implementation object
+ :param ldclient.feature_store.CacheConfig cache_config: the caching parameters
+ """
self._core = core
if cache_config.enabled:
self._cache = ExpiringDict(max_len=cache_config.capacity, max_age_seconds=cache_config.expiration)
@@ -21,6 +31,8 @@ def __init__(self, core, cache_config):
self._inited = False
def init(self, all_data):
+ """
+ """
self._core.init_internal(all_data)
if self._cache is not None:
self._cache.clear()
@@ -31,6 +43,8 @@ def init(self, all_data):
self._inited = True
def get(self, kind, key, callback=lambda x: x):
+ """
+ """
if self._cache is not None:
cache_key = self._item_cache_key(kind, key)
cached_item = self._cache.get(cache_key)
@@ -43,6 +57,8 @@ def get(self, kind, key, callback=lambda x: x):
return callback(self._item_if_not_deleted(item))
def all(self, kind, callback=lambda x: x):
+ """
+ """
if self._cache is not None:
cache_key = self._all_cache_key(kind)
cached_items = self._cache.get(cache_key)
@@ -54,10 +70,14 @@ def all(self, kind, callback=lambda x: x):
return callback(items)
def delete(self, kind, key, version):
+ """
+ """
deleted_item = { "key": key, "version": version, "deleted": True }
self.upsert(kind, deleted_item)
def upsert(self, kind, item):
+ """
+ """
new_state = self._core.upsert_internal(kind, item)
if self._cache is not None:
self._cache[self._item_cache_key(kind, item.get('key'))] = [new_state]
@@ -65,6 +85,8 @@ def upsert(self, kind, item):
@property
def initialized(self):
+ """
+ """
if self._inited:
return True
if self._cache is None:
diff --git a/ldclient/file_data_source.py b/ldclient/file_data_source.py
index 61088d50..56da8de8 100644
--- a/ldclient/file_data_source.py
+++ b/ldclient/file_data_source.py
@@ -1,31 +1,21 @@
+"""
+Deprecated entry point for a component that has been moved.
+"""
+# currently excluded from documentation - see docs/README.md
+
from ldclient.impl.integrations.files.file_data_source import _FileDataSource
+from ldclient.interfaces import UpdateProcessor
class FileDataSource(UpdateProcessor):
@classmethod
def factory(cls, **kwargs):
- """Provides a way to use local files as a source of feature flag state. This would typically be
- used in a test environment, to operate using a predetermined feature flag state without an
- actual LaunchDarkly connection.
-
- This module and this implementation class are deprecated and may be changed or removed in the future.
- Please use :func:`ldclient.integrations.Files.new_data_source()`.
+ """Provides a way to use local files as a source of feature flag state.
- :param kwargs:
- See below
-
- :Keyword arguments:
- * **paths** (array): The paths of the source files for loading flag data. These may be absolute paths
- or relative to the current working directory. Files will be parsed as JSON unless the 'pyyaml'
- package is installed, in which case YAML is also allowed.
- * **auto_update** (boolean): True if the data source should watch for changes to the source file(s)
- and reload flags whenever there is a change. The default implementation of this feature is based on
- polling the filesystem, which may not perform well; if you install the 'watchdog' package (not
- included by default, to avoid adding unwanted dependencies to the SDK), its native file watching
- mechanism will be used instead. Note that auto-updating will only work if all of the files you
- specified have valid directory paths at startup time.
- * **poll_interval** (float): The minimum interval, in seconds, between checks for file modifications -
- used only if auto_update is true, and if the native file-watching mechanism from 'watchdog' is not
- being used. The default value is 1 second.
+ .. deprecated:: 6.8.0
+ This module and this implementation class are deprecated and may be changed or removed in the future.
+ Please use :func:`ldclient.integrations.Files.new_data_source()`.
+
+ The keyword arguments are the same as the arguments to :func:`ldclient.integrations.Files.new_data_source()`.
"""
return lambda config, store, ready : _FileDataSource(store, ready,
diff --git a/ldclient/fixed_thread_pool.py b/ldclient/fixed_thread_pool.py
index a3c769e4..27fca13d 100644
--- a/ldclient/fixed_thread_pool.py
+++ b/ldclient/fixed_thread_pool.py
@@ -1,3 +1,8 @@
+"""
+Internal helper class for thread management.
+"""
+# currently excluded from documentation - see docs/README.md
+
from threading import Event, Lock, Thread
# noinspection PyBroadException
diff --git a/ldclient/flag.py b/ldclient/flag.py
index d4fcbdf3..88739ba0 100644
--- a/ldclient/flag.py
+++ b/ldclient/flag.py
@@ -1,3 +1,7 @@
+"""
+This submodule contains a helper class for feature flag evaluation, as well as some implementation details.
+"""
+
from collections import namedtuple
import hashlib
import logging
@@ -18,10 +22,12 @@
class EvaluationDetail(object):
"""
- The return type of LDClient.variation_detail, combining the result of a flag evaluation
- with information about how it was calculated.
+ The return type of :func:`ldclient.client.LDClient.variation_detail()`, combining the result of a
+ flag evaluation with information about how it was calculated.
"""
def __init__(self, value, variation_index, reason):
+ """Constructs an instance.
+ """
self.__value = value
self.__variation_index = variation_index
self.__reason = reason
@@ -29,14 +35,17 @@ def __init__(self, value, variation_index, reason):
@property
def value(self):
"""The result of the flag evaluation. This will be either one of the flag's
- variations or the default value that was passed to the variation() method.
+ variations or the default value that was passed to the
+ :func:`ldclient.client.LDClient.variation_detail()` method.
"""
return self.__value
@property
def variation_index(self):
"""The index of the returned value within the flag's list of variations, e.g.
- 0 for the first variation - or None if the default value was returned.
+ 0 for the first variation -- or None if the default value was returned.
+
+ :rtype: int or None
"""
return self.__variation_index
@@ -45,28 +54,34 @@ def reason(self):
"""A dictionary describing the main factor that influenced the flag evaluation value.
It contains the following properties:
- 'kind': The general category of reason, as follows: 'OFF' - the flag was off;
- 'FALLTHROUGH' - the flag was on but the user did not match any targets or rules;
- 'TARGET_MATCH' - the user was specifically targeted for this flag; 'RULE_MATCH' -
- the user matched one of the flag's rules; 'PREREQUISITE_FAILED' - the flag was
- considered off because it had at least one prerequisite flag that did not return
- the desired variation; 'ERROR' - the flag could not be evaluated due to an
- unexpected error.
+ * ``kind``: The general category of reason, as follows:
+
+          * ``"OFF"`` -- the flag was off
+          * ``"FALLTHROUGH"`` -- the flag was on but the user did not match any targets or rules
+          * ``"TARGET_MATCH"`` -- the user was specifically targeted for this flag
+          * ``"RULE_MATCH"`` -- the user matched one of the flag's rules
+          * ``"PREREQUISITE_FAILED"`` -- the flag was considered off because it had at least one
+            prerequisite flag that did not return the desired variation
+          * ``"ERROR"`` -- the flag could not be evaluated due to an unexpected error
- 'ruleIndex', 'ruleId': The positional index and unique identifier of the matched
- rule, if the kind was 'RULE_MATCH'
+ * ``ruleIndex``, ``ruleId``: The positional index and unique identifier of the matched
+ rule, if the kind was ``RULE_MATCH``
- 'prerequisiteKey': The flag key of the prerequisite that failed, if the kind was
- 'PREREQUISITE_FAILED'
+ * ``prerequisiteKey``: The flag key of the prerequisite that failed, if the kind was
+ ``PREREQUISITE_FAILED``
- 'errorKind': further describes the nature of the error if the kind was 'ERROR',
- e.g. 'FLAG_NOT_FOUND'
+ * ``errorKind``: further describes the nature of the error if the kind was ``ERROR``,
+ e.g. ``"FLAG_NOT_FOUND"``
+
+ :rtype: dict
"""
return self.__reason
def is_default_value(self):
"""Returns True if the flag evaluated to the default value rather than one of its
variations.
+
+ :rtype: bool
"""
return self.__variation_index is None
diff --git a/ldclient/flags_state.py b/ldclient/flags_state.py
index c5a8ab41..2f611aa6 100644
--- a/ldclient/flags_state.py
+++ b/ldclient/flags_state.py
@@ -1,20 +1,25 @@
+"""
+This submodule contains a helper class for feature flag evaluation.
+"""
+
import json
import time
class FeatureFlagsState(object):
"""
A snapshot of the state of all feature flags with regard to a specific user, generated by
- calling the client's all_flags_state method. Serializing this object to JSON, using the
- to_json_dict method or jsonpickle, will produce the appropriate data structure for
- bootstrapping the LaunchDarkly JavaScript client.
+ calling the :func:`ldclient.client.LDClient.all_flags_state()` method. Serializing this
+ object to JSON, using the :func:`to_json_dict` method or ``jsonpickle``, will produce the
+ appropriate data structure for bootstrapping the LaunchDarkly JavaScript client. See the
+    JavaScript SDK Reference Guide on `Bootstrapping <https://docs.launchdarkly.com/docs/js-sdk-reference#section-bootstrapping>`_.
"""
def __init__(self, valid):
self.__flag_values = {}
self.__flag_metadata = {}
self.__valid = valid
+ # Used internally to build the state map
def add_flag(self, flag, value, variation, reason, details_only_if_tracked):
- """Used internally to build the state map."""
key = flag['key']
self.__flag_values[key] = value
meta = {}
@@ -39,11 +44,14 @@ def add_flag(self, flag, value, variation, reason, details_only_if_tracked):
def valid(self):
"""True if this object contains a valid snapshot of feature flag state, or False if the
state could not be computed (for instance, because the client was offline or there was no user).
+
+ :rtype: bool
"""
return self.__valid
def get_flag_value(self, key):
"""Returns the value of an individual feature flag at the time the state was recorded.
+
:param string key: the feature flag key
:return: the flag's value; None if the flag returned the default value, or if there was no such flag
"""
@@ -51,9 +59,11 @@ def get_flag_value(self, key):
def get_flag_reason(self, key):
"""Returns the evaluation reason for an individual feature flag at the time the state was recorded.
+
:param string key: the feature flag key
:return: a dictionary describing the reason; None if reasons were not recorded, or if there was no
such flag
+ :rtype: dict or None
"""
meta = self.__flag_metadata.get(key)
return None if meta is None else meta.get('reason')
@@ -63,7 +73,9 @@ def to_values_map(self):
default value, its value will be None.
Do not use this method if you are passing data to the front end to "bootstrap" the JavaScript client.
- Instead, use to_json_dict.
+ Instead, use :func:`to_json_dict()`.
+
+ :rtype: dict
"""
return self.__flag_values
@@ -71,6 +83,8 @@ def to_json_dict(self):
"""Returns a dictionary suitable for passing as JSON, in the format used by the LaunchDarkly
JavaScript SDK. Use this method if you are passing data to the front end in order to
"bootstrap" the JavaScript client.
+
+ :rtype: dict
"""
ret = self.__flag_values.copy()
ret['$flagsState'] = self.__flag_metadata
@@ -79,6 +93,8 @@ def to_json_dict(self):
def to_json_string(self):
"""Same as to_json_dict, but serializes the JSON structure into a string.
+
+ :rtype: string
"""
return json.dumps(self.to_json_dict())
diff --git a/ldclient/integrations.py b/ldclient/integrations.py
index fcc89abc..a1e9d2f8 100644
--- a/ldclient/integrations.py
+++ b/ldclient/integrations.py
@@ -1,3 +1,8 @@
+"""
+This submodule contains factory/configuration methods for integrating the SDK with services
+other than LaunchDarkly.
+"""
+
from ldclient.feature_store import CacheConfig
from ldclient.feature_store_helpers import CachingStoreWrapper
from ldclient.impl.integrations.consul.consul_feature_store import _ConsulFeatureStoreCore
@@ -19,25 +24,30 @@ def new_feature_store(host=None,
prefix=None,
consul_opts=None,
caching=CacheConfig.default()):
- """Creates a Consul-backed implementation of `:class:ldclient.feature_store.FeatureStore`.
+ """Creates a Consul-backed implementation of :class:`ldclient.interfaces.FeatureStore`.
For more details about how and why you can use a persistent feature store, see the
- SDK reference guide: https://docs.launchdarkly.com/v2.0/docs/using-a-persistent-feature-store
+        `SDK reference guide <https://docs.launchdarkly.com/v2.0/docs/using-a-persistent-feature-store>`_.
+
+ To use this method, you must first install the ``python-consul`` package. Then, put the object
+ returned by this method into the ``feature_store`` property of your client configuration
+ (:class:`ldclient.config.Config`).
+ ::
- To use this method, you must first install the `python-consul` package. Then, put the object
- returned by this method into the `feature_store` property of your client configuration
- (:class:ldclient.config.Config).
+ from ldclient.integrations import Consul
+ store = Consul.new_feature_store()
+ config = Config(feature_store=store)
- Note that `python-consul` is not available for Python 3.3 or 3.4, so this feature cannot be
+ Note that ``python-consul`` is not available for Python 3.3 or 3.4, so this feature cannot be
used in those Python versions.
- :param string host: Hostname of the Consul server (uses "localhost" if omitted)
- :param int port: Port of the Consul server (uses 8500 if omitted)
- :param string prefix: A namespace prefix to be prepended to all Consul keys
- :param dict consul_opts: Optional parameters for configuring the Consul client, if you need
- to set any of them besides host and port, as defined in the python-consul API; see
- https://python-consul.readthedocs.io/en/latest/#consul
- :param CacheConfig caching: Specifies whether local caching should be enabled and if so,
- sets the cache properties; defaults to `CacheConfig.default()`
+ :param string host: hostname of the Consul server (uses ``localhost`` if omitted)
+ :param int port: port of the Consul server (uses 8500 if omitted)
+ :param string prefix: a namespace prefix to be prepended to all Consul keys
+ :param dict consul_opts: optional parameters for configuring the Consul client, if you need
+ to set any of them besides host and port, as defined in the
+          `python-consul API <https://python-consul.readthedocs.io/en/latest/#consul>`_
+ :param CacheConfig caching: specifies whether local caching should be enabled and if so,
+ sets the cache properties; defaults to :func:`ldclient.feature_store.CacheConfig.default()`
"""
core = _ConsulFeatureStoreCore(host, port, prefix, consul_opts)
return CachingStoreWrapper(core, caching)
@@ -52,13 +62,18 @@ def new_feature_store(table_name,
prefix=None,
dynamodb_opts={},
caching=CacheConfig.default()):
- """Creates a DynamoDB-backed implementation of `:class:ldclient.feature_store.FeatureStore`.
+ """Creates a DynamoDB-backed implementation of :class:`ldclient.interfaces.FeatureStore`.
For more details about how and why you can use a persistent feature store, see the
- SDK reference guide: https://docs.launchdarkly.com/v2.0/docs/using-a-persistent-feature-store
+        `SDK reference guide <https://docs.launchdarkly.com/v2.0/docs/using-a-persistent-feature-store>`_.
- To use this method, you must first install the `boto3` package containing the AWS SDK gems.
- Then, put the object returned by this method into the `feature_store` property of your
- client configuration (:class:ldclient.config.Config).
+        To use this method, you must first install the ``boto3`` package, which contains the AWS SDK for Python.
+ Then, put the object returned by this method into the ``feature_store`` property of your
+ client configuration (:class:`ldclient.config.Config`).
+ ::
+
+ from ldclient.integrations import DynamoDB
+ store = DynamoDB.new_feature_store("my-table-name")
+ config = Config(feature_store=store)
Note that the DynamoDB table must already exist; the LaunchDarkly SDK does not create the table
automatically, because it has no way of knowing what additional properties (such as permissions
@@ -67,14 +82,14 @@ def new_feature_store(table_name,
By default, the DynamoDB client will try to get your AWS credentials and region name from
environment variables and/or local configuration files, as described in the AWS SDK documentation.
- You may also pass configuration settings in `dynamodb_opts`.
-
- :param string table_name: The name of an existing DynamoDB table
- :param string prefix: An optional namespace prefix to be prepended to all DynamoDB keys
- :param dict dynamodb_opts: Optional parameters for configuring the DynamoDB client, as defined in
- the boto3 API; see https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html#boto3.session.Session.client
- :param CacheConfig caching: Specifies whether local caching should be enabled and if so,
- sets the cache properties; defaults to `CacheConfig.default()`
+ You may also pass configuration settings in ``dynamodb_opts``.
+
+ :param string table_name: the name of an existing DynamoDB table
+ :param string prefix: an optional namespace prefix to be prepended to all DynamoDB keys
+ :param dict dynamodb_opts: optional parameters for configuring the DynamoDB client, as defined in
+ the `boto3 API <https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html#boto3.session.Session.client>`_
+ :param CacheConfig caching: specifies whether local caching should be enabled and if so,
+ sets the cache properties; defaults to :func:`ldclient.feature_store.CacheConfig.default()`
"""
core = _DynamoDBFeatureStoreCore(table_name, prefix, dynamodb_opts)
return CachingStoreWrapper(core, caching)
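# Illustrative sketch, not part of this patch: using an existing table and passing extra
# boto3 client options through dynamodb_opts. The table name and endpoint URL are
# hypothetical; endpoint_url is a standard boto3 client argument.
from ldclient.config import Config
from ldclient.feature_store import CacheConfig
from ldclient.integrations import DynamoDB

store = DynamoDB.new_feature_store('my-launchdarkly-table',
                                   prefix='ld',
                                   dynamodb_opts={'endpoint_url': 'http://localhost:8000'},
                                   caching=CacheConfig.default())
config = Config(feature_store=store)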
@@ -92,21 +107,26 @@ def new_feature_store(url='redis://localhost:6379/0',
prefix='launchdarkly',
max_connections=16,
caching=CacheConfig.default()):
- """Creates a Redis-backed implementation of `:class:ldclient.feature_store.FeatureStore`.
+ """Creates a Redis-backed implementation of :class:`ldclient.interfaces.FeatureStore`.
For more details about how and why you can use a persistent feature store, see the
- SDK reference guide: https://docs.launchdarkly.com/v2.0/docs/using-a-persistent-feature-store
-
- To use this method, you must first install the `redis` package. Then, put the object
- returned by this method into the `feature_store` property of your client configuration
- (:class:ldclient.config.Config).
-
- :param string url: The URL of the Redis host; defaults to `DEFAULT_URL`
- :param string prefix: A namespace prefix to be prepended to all Redis keys; defaults to
- `DEFAULT_PREFIX`
- :param int max_connections: The maximum number of Redis connections to keep in the
- connection pool; defaults to `DEFAULT_MAX_CONNECTIONS`
- :param CacheConfig caching: Specifies whether local caching should be enabled and if so,
- sets the cache properties; defaults to `CacheConfig.default()`
+ `SDK reference guide <https://docs.launchdarkly.com/v2.0/docs/using-a-persistent-feature-store>`_.
+
+ To use this method, you must first install the ``redis`` package. Then, put the object
+ returned by this method into the ``feature_store`` property of your client configuration
+ (:class:`ldclient.config.Config`).
+ ::
+
+ from ldclient.integrations import Redis
+ store = Redis.new_feature_store()
+ config = Config(feature_store=store)
+
+ :param string url: the URL of the Redis host; defaults to ``DEFAULT_URL``
+ :param string prefix: a namespace prefix to be prepended to all Redis keys; defaults to
+ ``DEFAULT_PREFIX``
+ :param int max_connections: the maximum number of Redis connections to keep in the
+ connection pool; defaults to ``DEFAULT_MAX_CONNECTIONS``
+ :param CacheConfig caching: specifies whether local caching should be enabled and if so,
+ sets the cache properties; defaults to :func:`ldclient.feature_store.CacheConfig.default()`
"""
core = _RedisFeatureStoreCore(url, prefix, max_connections)
wrapper = CachingStoreWrapper(core, caching)
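# Illustrative sketch, not part of this patch: a Redis store with non-default connection
# settings. The URL and prefix are hypothetical; the parameter names and defaults are
# documented above.
from ldclient.config import Config
from ldclient.integrations import Redis

store = Redis.new_feature_store(url='redis://redis.internal.example.com:6379/0',
                                prefix='launchdarkly-test',
                                max_connections=32)
config = Config(feature_store=store)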
@@ -124,95 +144,40 @@ def new_data_source(paths, auto_update=False, poll_interval=1, force_polling=Fal
used in a test environment, to operate using a predetermined feature flag state without an
actual LaunchDarkly connection.
- To use this component, call `new_data_source`, specifying the file path(s) of your data file(s)
- in the `path` parameter; then put the value returned by this method into the `update_processor_class`
- property of your LaunchDarkly client configuration (:class:ldclient.config.Config).
+ To use this component, call ``new_data_source``, specifying the file path(s) of your data file(s)
+ in the ``paths`` parameter; then put the value returned by this method into the ``update_processor_class``
+ property of your LaunchDarkly client configuration (:class:`ldclient.config.Config`).
::
- data_source = LaunchDarkly::Integrations::Files.new_data_source(paths=[ myFilePath ])
+ from ldclient.integrations import Files
+ data_source = Files.new_data_source(paths=[ myFilePath ])
config = Config(update_processor_class=data_source)
This will cause the client not to connect to LaunchDarkly to get feature flags. The
client may still make network connections to send analytics events, unless you have disabled
- this with Config.send_events or Config.offline.
-
- Flag data files can be either JSON or YAML (in order to use YAML, you must install the 'pyyaml'
- package). They contain an object with three possible properties:
-
- * "flags": Feature flag definitions.
- * "flagValues": Simplified feature flags that contain only a value.
- * "segments": User segment definitions.
-
- The format of the data in "flags" and "segments" is defined by the LaunchDarkly application
- and is subject to change. Rather than trying to construct these objects yourself, it is simpler
- to request existing flags directly from the LaunchDarkly server in JSON format, and use this
- output as the starting point for your file. In Linux you would do this:
- ::
-
- curl -H "Authorization: {your sdk key}" https://app.launchdarkly.com/sdk/latest-all
-
- The output will look something like this (but with many more properties):
- ::
-
- {
- "flags": {
- "flag-key-1": {
- "key": "flag-key-1",
- "on": true,
- "variations": [ "a", "b" ]
- }
- },
- "segments": {
- "segment-key-1": {
- "key": "segment-key-1",
- "includes": [ "user-key-1" ]
- }
- }
- }
-
- Data in this format allows the SDK to exactly duplicate all the kinds of flag behavior supported
- by LaunchDarkly. However, in many cases you will not need this complexity, but will just want to
- set specific flag keys to specific values. For that, you can use a much simpler format:
- ::
-
- {
- "flagValues": {
- "my-string-flag-key": "value-1",
- "my-boolean-flag-key": true,
- "my-integer-flag-key": 3
- }
- }
-
- Or, in YAML:
- ::
-
- flagValues:
- my-string-flag-key: "value-1"
- my-boolean-flag-key: true
- my-integer-flag-key: 1
+ this in your configuration with ``send_events`` or ``offline``.
- It is also possible to specify both "flags" and "flagValues", if you want some flags
- to have simple values and others to have complex behavior. However, it is an error to use the
- same flag key or segment key more than once, either in a single file or across multiple files.
+ The format of the data files is described in the SDK Reference Guide on
+ `Reading flags from a file <https://docs.launchdarkly.com/v2.0/docs/reading-flags-from-a-file>`_.
+ Note that in order to use YAML, you will need to install the ``pyyaml`` package.
If the data source encounters any error in any file-- malformed content, a missing file, or a
duplicate key-- it will not load flags from any of the files.
- :param array paths: The paths of the source files for loading flag data. These may be absolute paths
- or relative to the current working directory. Files will be parsed as JSON unless the 'pyyaml'
+ :param array paths: the paths of the source files for loading flag data. These may be absolute paths
+ or relative to the current working directory. Files will be parsed as JSON unless the ``pyyaml``
package is installed, in which case YAML is also allowed.
:param bool auto_update: (default: false) True if the data source should watch for changes to the source file(s)
and reload flags whenever there is a change. The default implementation of this feature is based on
- polling the filesystem, which may not perform well; if you install the 'watchdog' package (not
- included by default, to avoid adding unwanted dependencies to the SDK), its native file watching
- mechanism will be used instead. Note that auto-updating will only work if all of the files you
- specified have valid directory paths at startup time.
- :param float poll_interval: (default: 1) The minimum interval, in seconds, between checks for file
- modifications-- used only if `auto_update` is true, and if the native file-watching mechanism from
- `watchdog` is not being used.
+ polling the filesystem, which may not perform well; if you install the ``watchdog`` package, its
+ native file watching mechanism will be used instead. Note that auto-updating will only work if all
+ of the files you specified have valid directory paths at startup time.
+ :param float poll_interval: (default: 1) the minimum interval, in seconds, between checks for file
+ modifications-- used only if ``auto_update`` is true, and if the native file-watching mechanism from
+ ``watchdog`` is not being used.
:param bool force_polling: (default: false) True if the data source should implement auto-update via
polling the filesystem even if a native mechanism is available. This is mainly for SDK testing.
- :return: an object (actually a lambda) to be stored in the `update_processor_class` configuration property
+ :return: an object (actually a lambda) to be stored in the ``update_processor_class`` configuration property
"""
return lambda config, store, ready : _FileDataSource(store, ready, paths, auto_update, poll_interval, force_polling)
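# Illustrative sketch, not part of this patch: loading flags from a local file and turning
# off analytics events, per the docstring above. "flags.json" is a hypothetical path whose
# contents would follow the documented file format.
from ldclient.config import Config
from ldclient.integrations import Files

data_source = Files.new_data_source(paths=['flags.json'], auto_update=True)
config = Config(update_processor_class=data_source, send_events=False)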
diff --git a/ldclient/interfaces.py b/ldclient/interfaces.py
index 9556bdfc..48c517b8 100644
--- a/ldclient/interfaces.py
+++ b/ldclient/interfaces.py
@@ -1,16 +1,22 @@
+"""
+This submodule contains interfaces for various components of the SDK.
+
+They may be useful in writing new implementations of these components, or for testing.
+"""
+
from abc import ABCMeta, abstractmethod, abstractproperty
class FeatureStore(object):
"""
- A versioned store for feature flags and related objects received from LaunchDarkly.
+ Interface for a versioned store for feature flags and related objects received from LaunchDarkly.
Implementations should permit concurrent access and updates.
- An "object", for `FeatureStore`, is simply a dict of arbitrary data which must have at least
- three properties: "key" (its unique key), "version" (the version number provided by
- LaunchDarkly), and "deleted" (True if this is a placeholder for a deleted object).
+ An "object", for ``FeatureStore``, is simply a dict of arbitrary data which must have at least
+ three properties: ``key`` (its unique key), ``version`` (the version number provided by
+ LaunchDarkly), and ``deleted`` (True if this is a placeholder for a deleted object).
- Delete and upsert requests are versioned-- if the version number in the request is less than
+ Delete and upsert requests are versioned: if the version number in the request is less than
the currently stored version of the object, the request should be ignored.
These semantics support the primary use case for the store, which synchronizes a collection
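# Illustrative sketch, not part of this patch: the versioning rule described above, as it
# might look in a hypothetical in-memory store. "_InMemoryExample" and its "_items" dict
# are invented for illustration; only the version comparison mirrors the documented semantics.
class _InMemoryExample(object):
    def __init__(self):
        self._items = {}  # maps kind.namespace -> {key: item dict}

    def upsert(self, kind, item):
        collection = self._items.setdefault(kind.namespace, {})
        existing = collection.get(item['key'])
        # A request carrying a lower version than the stored object is ignored
        if existing is not None and item['version'] < existing['version']:
            return
        collection[item['key']] = item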
@@ -22,7 +28,7 @@ class FeatureStore(object):
def get(self, kind, key, callback=lambda x: x):
"""
Retrieves the object to which the specified key is mapped, or None if the key is not found
- or the associated object has a "deleted" property of True. The retrieved object, if any (a
+ or the associated object has a ``deleted`` property of True. The retrieved object, if any (a
dict) can be transformed by the specified callback.
:param kind: The kind of object to get
@@ -97,11 +103,11 @@ def initialized(self):
class FeatureStoreCore(object):
"""
- `FeatureStoreCore` is an interface for a simplified subset of the functionality of :class:`FeatureStore`,
- to be used in conjunction with :class:`feature_store_helpers.CachingStoreWrapper`. This allows developers
- developers of custom `FeatureStore` implementations to avoid repeating logic that would
+ Interface for a simplified subset of the functionality of :class:`FeatureStore`, to be used
+ in conjunction with :class:`ldclient.feature_store_helpers.CachingStoreWrapper`. This allows
+ developers of custom ``FeatureStore`` implementations to avoid repeating logic that would
commonly be needed in any such implementation, such as caching. Instead, they can implement
- only `FeatureStoreCore` and then create a `CachingStoreWrapper`.
+ only ``FeatureStoreCore`` and then create a ``CachingStoreWrapper``.
"""
__metaclass__ = ABCMeta
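# Illustrative sketch, not part of this patch: implementing only the core methods and
# letting CachingStoreWrapper supply the caching layer, as the built-in integrations do.
# "MyStoreCore" is a hypothetical class; its FeatureStoreCore methods are omitted here.
from ldclient.config import Config
from ldclient.feature_store import CacheConfig
from ldclient.feature_store_helpers import CachingStoreWrapper

class MyStoreCore(object):
    pass  # the *_internal methods (e.g. initialized_internal) would be implemented here

store = CachingStoreWrapper(MyStoreCore(), CacheConfig.default())
config = Config(feature_store=store)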
@@ -174,10 +180,8 @@ def initialized_internal(self):
"""
+# Internal use only. Common methods for components that perform a task in the background.
class BackgroundOperation(object):
- """
- Performs a task in the background
- """
# noinspection PyMethodMayBeStatic
def start(self):
@@ -203,20 +207,24 @@ def is_alive(self):
class UpdateProcessor(BackgroundOperation):
"""
- Responsible for retrieving Feature Flag updates from LaunchDarkly and saving them to the feature store
+ Interface for the component that obtains feature flag data in some way and passes it to a
+ :class:`FeatureStore`. The built-in implementations of this are the client's standard streaming
+ or polling behavior. For testing purposes, there is also :func:`ldclient.integrations.Files.new_data_source()`.
"""
__metaclass__ = ABCMeta
def initialized(self):
"""
Returns whether the update processor has received feature flags and has initialized its feature store.
+
:rtype: bool
"""
class EventProcessor(object):
"""
- Buffers analytics events and sends them to LaunchDarkly
+ Interface for the component that buffers analytics events and sends them to LaunchDarkly.
+ The default implementation can be replaced for testing purposes.
"""
__metaclass__ = ABCMeta
@@ -231,7 +239,7 @@ def flush(self):
"""
Specifies that any buffered events should be sent as soon as possible, rather than waiting
for the next flush interval. This method is asynchronous, so events still may not be sent
- until a later time. However, calling stop() will synchronously deliver any events that were
+ until a later time. However, calling ``stop()`` will synchronously deliver any events that were
not yet delivered prior to shutting down.
"""
@@ -244,7 +252,8 @@ def stop(self):
class FeatureRequester(object):
"""
- Requests features.
+ Interface for the component that acquires feature flag data in polling mode. The default
+ implementation can be replaced for testing purposes.
"""
__metaclass__ = ABCMeta
@@ -254,7 +263,7 @@ def get_all(self):
"""
pass
- def get_one(self, key):
+ def get_one(self, kind, key):
"""
Gets one Feature flag
:return:
diff --git a/ldclient/lru_cache.py b/ldclient/lru_cache.py
index 53cbf5d2..f8f18e37 100644
--- a/ldclient/lru_cache.py
+++ b/ldclient/lru_cache.py
@@ -1,13 +1,13 @@
-'''
-A dictionary-based cache that removes the oldest entries when its limit is exceeded.
-Values are only refreshed by writing, not by reading. Not thread-safe.
-'''
+"""
+Internal helper class for caching.
+"""
+# currently excluded from documentation - see docs/README.md
from collections import OrderedDict
# Backport of Python 3.2 move_to_end method which doesn't exist in 2.7
-class OrderedDictWithReordering(OrderedDict):
+class _OrderedDictWithReordering(OrderedDict):
if not hasattr(OrderedDict, 'move_to_end'):
# backport of Python 3.2 logic
def move_to_end(self, key, last=True):
@@ -28,9 +28,12 @@ def move_to_end(self, key, last=True):
class SimpleLRUCache(object):
+ """A dictionary-based cache that removes the oldest entries when its limit is exceeded.
+ Values are only refreshed by writing, not by reading. Not thread-safe.
+ """
def __init__(self, capacity):
self.capacity = capacity
- self.cache = OrderedDictWithReordering()
+ self.cache = _OrderedDictWithReordering()
def get(self, key):
return self.cache.get(key)
diff --git a/ldclient/memoized_value.py b/ldclient/memoized_value.py
index b2c38fea..7abc944f 100644
--- a/ldclient/memoized_value.py
+++ b/ldclient/memoized_value.py
@@ -1,12 +1,17 @@
-'''
-Simple implementation of a thread-safe memoized value whose generator function will never be
-run more than once, and whose value can be overridden by explicit assignment.
-'''
+"""
+Internal helper class for caching. No longer used.
+"""
+# currently excluded from documentation - see docs/README.md
from threading import RLock
class MemoizedValue(object):
+ """Simple implementation of a thread-safe memoized value whose generator function will never be
+ run more than once, and whose value can be overridden by explicit assignment.
+
+ .. deprecated:: 6.7.0
+ No longer used. Retained here only in case third parties were using it for another purpose.
+ """
def __init__(self, generator):
self.generator = generator
self.inited = False
diff --git a/ldclient/operators.py b/ldclient/operators.py
index 88a76cd1..253e8a8b 100644
--- a/ldclient/operators.py
+++ b/ldclient/operators.py
@@ -1,3 +1,8 @@
+"""
+Implementation details of feature flag evaluation.
+"""
+# currently excluded from documentation - see docs/README.md
+
import logging
import re
import semver
diff --git a/ldclient/polling.py b/ldclient/polling.py
index 19ed0a7d..59803a30 100644
--- a/ldclient/polling.py
+++ b/ldclient/polling.py
@@ -1,3 +1,8 @@
+"""
+Default implementation of the polling component.
+"""
+# currently excluded from documentation - see docs/README.md
+
from threading import Thread
from ldclient.interfaces import UpdateProcessor
diff --git a/ldclient/redis_feature_store.py b/ldclient/redis_feature_store.py
index ff93c402..1e49d9ee 100644
--- a/ldclient/redis_feature_store.py
+++ b/ldclient/redis_feature_store.py
@@ -11,10 +11,11 @@
# will migrate away from exposing these concrete classes and use only the factory methods.
class RedisFeatureStore(FeatureStore):
- """A Redis-backed implementation of :class:`ldclient.feature_store.FeatureStore`.
+ """A Redis-backed implementation of :class:`ldclient.interfaces.FeatureStore`.
- This module and this implementation class are deprecated and may be changed or removed in the future.
- Please use :func:`ldclient.integrations.Redis.new_feature_store()`.
+ .. deprecated:: 6.7.0
+ This module and this implementation class are deprecated and may be changed or removed in the future.
+ Please use :func:`ldclient.integrations.Redis.new_feature_store()`.
"""
def __init__(self,
url='redis://localhost:6379/0',
diff --git a/ldclient/repeating_timer.py b/ldclient/repeating_timer.py
index 956cfbcd..eb8aa771 100644
--- a/ldclient/repeating_timer.py
+++ b/ldclient/repeating_timer.py
@@ -1,3 +1,8 @@
+"""
+Internal helper class for repeating tasks.
+"""
+# currently excluded from documentation - see docs/README.md
+
from threading import Event, Thread
class RepeatingTimer(object):
diff --git a/ldclient/rwlock.py b/ldclient/rwlock.py
index 8416a35c..251d5eb4 100644
--- a/ldclient/rwlock.py
+++ b/ldclient/rwlock.py
@@ -1,3 +1,8 @@
+"""
+Internal helper class for locking.
+"""
+# currently excluded from documentation - see docs/README.md
+
import threading
diff --git a/ldclient/sse_client.py b/ldclient/sse_client.py
index 5b41413b..49d853c7 100644
--- a/ldclient/sse_client.py
+++ b/ldclient/sse_client.py
@@ -1,3 +1,10 @@
+"""
+Server-Sent Events implementation for streaming.
+
+Based on: https://bitbucket.org/btubbs/sseclient/src/a47a380a3d7182a205c0f1d5eb470013ce796b4d/sseclient.py?at=default&fileviewer=file-view-default
+"""
+# currently excluded from documentation - see docs/README.md
+
import re
import time
import warnings
@@ -9,8 +16,6 @@
from ldclient.util import create_http_pool_manager
from ldclient.util import throw_if_unsuccessful_response
-# Inspired by: https://bitbucket.org/btubbs/sseclient/src/a47a380a3d7182a205c0f1d5eb470013ce796b4d/sseclient.py?at=default&fileviewer=file-view-default
-
# Technically, we should support streams that mix line endings. This regex,
# however, assumes that a system will provide consistent line endings.
end_of_field = re.compile(r'\r\n\r\n|\r\r|\n\n')
diff --git a/ldclient/streaming.py b/ldclient/streaming.py
index 20599eb1..43e815a4 100644
--- a/ldclient/streaming.py
+++ b/ldclient/streaming.py
@@ -1,3 +1,8 @@
+"""
+Default implementation of the streaming component.
+"""
+# currently excluded from documentation - see docs/README.md
+
from collections import namedtuple
import json
diff --git a/ldclient/user_filter.py b/ldclient/user_filter.py
index d48ab23f..fe5baa39 100644
--- a/ldclient/user_filter.py
+++ b/ldclient/user_filter.py
@@ -1,4 +1,8 @@
-import jsonpickle
+"""
+Internal helper class for filtering out private attributes.
+"""
+# currently excluded from documentation - see docs/README.md
+
import six
diff --git a/ldclient/util.py b/ldclient/util.py
index fbb2f11d..b1d533a2 100644
--- a/ldclient/util.py
+++ b/ldclient/util.py
@@ -1,3 +1,8 @@
+"""
+General internal helper functions.
+"""
+# currently excluded from documentation - see docs/README.md
+
import certifi
import logging
import sys
diff --git a/ldclient/versioned_data_kind.py b/ldclient/versioned_data_kind.py
index 04acce43..37504394 100644
--- a/ldclient/versioned_data_kind.py
+++ b/ldclient/versioned_data_kind.py
@@ -1,17 +1,24 @@
-from collections import namedtuple
-
"""
-These objects denote the types of data that can be stored in the feature store and
-referenced in the API. If we add another storable data type in the future, as long as it
-follows the same pattern (having "key", "version", and "deleted" properties), we only need
-to add a corresponding constant here and the existing store should be able to handle it.
+This submodule is used only by the internals of the feature flag storage mechanism.
+
+If you are writing your own implementation of :class:`ldclient.interfaces.FeatureStore`, the
+:class:`VersionedDataKind` tuple type will be passed to the ``kind`` parameter of the feature
+store methods; its ``namespace`` property tells the feature store which collection of objects is
+being referenced ("features", "segments", etc.). The intention is for the feature store to treat
+storable objects as completely generic JSON dictionaries, rather than having any special logic
+for features or segments.
"""
+from collections import namedtuple
+
# Note that VersionedDataKind without the extra attributes is no longer used in the SDK,
# but it's preserved here for backward compatibility just in case someone else used it
VersionedDataKind = namedtuple('VersionedDataKind',
['namespace', 'request_api_path', 'stream_api_path'])
+# Note, feature store implementors really don't need to know about this class so we could just
+# not document it at all, but apparently namedtuple() creates its own docstrings so it's going
+# to show up in any case.
VersionedDataKindWithOrdering = namedtuple('VersionedDataKindWithOrdering',
['namespace', 'request_api_path', 'stream_api_path', 'priority', 'get_dependency_keys'])
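# Illustrative sketch, not part of this patch: a feature store treating stored objects as
# generic dicts and selecting a collection by kind.namespace ("features", "segments", ...),
# as the docstring above describes. The class, method, and attribute names are invented.
class _ExampleStore(object):
    def __init__(self):
        self._data = {}  # maps kind.namespace -> {key: item dict}

    def get_item(self, kind, key):
        collection = self._data.get(kind.namespace, {})
        return collection.get(key)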
diff --git a/requirements.txt b/requirements.txt
index f86f3039..2e3cba6f 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -3,6 +3,5 @@ certifi>=2018.4.16
expiringdict>=1.1.4
six>=1.10.0
pyRFC3339>=1.0
-jsonpickle==0.9.3
semver>=2.7.9
urllib3>=1.22.0
diff --git a/test-requirements.txt b/test-requirements.txt
index 88cbbc2e..3bc09d90 100644
--- a/test-requirements.txt
+++ b/test-requirements.txt
@@ -3,6 +3,7 @@ pytest>=2.8
redis>=2.10.5
boto3>=1.9.71
coverage>=4.4
+jsonpickle==0.9.3
pytest-capturelog>=0.7
pytest-cov>=2.4.0
codeclimate-test-reporter>=0.2.1
diff --git a/testing/test_flags_state.py b/testing/test_flags_state.py
index 45ea6404..f8e6d464 100644
--- a/testing/test_flags_state.py
+++ b/testing/test_flags_state.py
@@ -58,6 +58,8 @@ def test_can_convert_to_json_string():
str = state.to_json_string()
assert json.loads(str) == obj
+# We don't actually use jsonpickle in the SDK, but FeatureFlagsState has a magic method that makes it
+# behave correctly in case the application uses jsonpickle to serialize it.
def test_can_serialize_with_jsonpickle():
state = FeatureFlagsState(True)
flag1 = { 'key': 'key1', 'version': 100, 'offVariation': 0, 'variations': [ 'value1' ], 'trackEvents': False }