
Polished headers and titles for DL landing page #8

Merged
merged 10 commits into from
Nov 13, 2021
2 changes: 1 addition & 1 deletion algorithms.rst
Original file line number Diff line number Diff line change
@@ -4,7 +4,7 @@ Algorithms and Application Libraries
Following is the list of algorithms and applications implemented using Lava:

.. toctree::
:maxdepth: 4
:maxdepth: 3

dl
dnf
1 change: 1 addition & 0 deletions build.py
@@ -11,6 +11,7 @@
url = "https://lava-nc.org"
license = "BSD-3-Clause"


@init
def set_properties(project):
project.set_property("dir_source_main_python", "../lava")
5 changes: 0 additions & 5 deletions conf.py
@@ -1,7 +1,3 @@
import os
import sys
import sphinx_rtd_theme

project = "Lava"
copyright = "2021, Intel Corporation"
author = "Intel Corporation"
@@ -17,7 +13,6 @@
"nbsphinx",
]

# napoleon_google_docstring = True
napoleon_numpy_docstring = True
napoleon_include_init_with_doc = False
napoleon_include_private_with_doc = False
34 changes: 20 additions & 14 deletions dl.rst
@@ -1,7 +1,10 @@
Lava DL
=======
Deep Learning
=============

``lava-dl`` is a library of deep learning tools within Lava that
Introduction
------------

Lava-DL (``lava-dl``) is a library of deep learning tools within Lava that
support offline training, online training and inference methods for
various Deep Event-Based Networks.

@@ -69,10 +72,10 @@ Getting Started

* `Dynamics and Neurons <lava-lib-dl/slayer/notebooks/neuron_dynamics/dynamics.html>`__

``lava.lib.dl.slayer``
----------------------
SLAYER 2.0
----------

``lava.lib.dl.slayer`` is an enhanced version of
SLAYER 2.0 (`lava.lib.dl.slayer`) is an enhanced version of
`SLAYER <https://github.com/bamsumit/slayerPytorch>`__. Most noteworthy
enhancements are: support for *recurrent network structures*, a wider
variety of *neuron models* and *synaptic connections* (a complete list
@@ -81,7 +84,7 @@ of features is
This version of SLAYER is built on top of the
`PyTorch <https://pytorch.org/>`__ deep learning framework, similar to
its predecessor. For smooth integration with Lava,
``lava.lib.dl.slayer`` supports exporting trained models using the
`lava.lib.dl.slayer` supports exporting trained models using the
platform independent **hdf5 network exchange** format.

In future versions, SLAYER will get completely integrated into Lava to
@@ -151,15 +154,15 @@ Example Code

net.export_hdf5('network.net')

``lava.lib.dl.bootstrap``
-------------------------
Bootstrap
---------

In general, ANN-to-SNN conversion methods for rate-based SNNs result in high inference latency. This is because the rate interpretation of a spiking neuron using a ReLU activation unit breaks down for short inference times. As a result, the network requires many time steps per sample to achieve adequate inference results.
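To see why short inference windows break the rate interpretation, here is a small illustrative sketch in plain Python (not lava-dl code): a ReLU activation in [0, 1] encoded as a spike rate over T time steps can only take the values k/T, so the quantization error grows as T shrinks.

```python
def rate_code_error(activation, num_steps):
    """Absolute error of representing `activation` as a spike rate
    over `num_steps` time steps (spike counts must be integers)."""
    spike_count = round(activation * num_steps)  # nearest integer spike count
    decoded = spike_count / num_steps            # rate seen by the next layer
    return abs(decoded - activation)

# Few time steps: only 5 representable rate levels, so the error is large.
print(rate_code_error(0.37, 4))
# Many time steps: the rate interpretation becomes accurate.
print(rate_code_error(0.37, 256))
```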

``lava.lib.dl.bootstrap`` enables rapid training of rate based SNNs by translating them to an equivalent dynamic ANN representation which leads to SNN performance close to the equivalent ANN and low latency inference. More details `here <lava-lib-dl/bootstrap/bootstrap.html>`__. It also supports *hybrid training*
Bootstrap (`lava.lib.dl.bootstrap`) enables rapid training of rate-based SNNs by translating them to an equivalent dynamic ANN representation, which leads to SNN performance close to that of the equivalent ANN and low-latency inference. More details `here <lava-lib-dl/bootstrap/bootstrap.html>`__. It also supports *hybrid training* of
a mixed ANN-SNN network to minimize the ANN-to-SNN performance gap. This method is independent of the SNN model being used.

It has similar API as ``lava.lib.dl.slayer`` and supports exporting
It has a similar API to `lava.lib.dl.slayer` and supports exporting
trained models using the platform-independent **hdf5 network exchange**
format.
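As a conceptual sketch of the hybrid idea (plain Python with assumed names, not the actual ``lava.lib.dl.bootstrap`` API): during training, part of the network can run in SNN mode while the rest runs as its ANN equivalent, and the SNN fraction is grown over training to close the gap.

```python
def layer_modes(num_layers, snn_fraction):
    """Assign each layer a mode for this epoch: the first layers run as
    SNN, the rest as their ANN equivalent (assumed scheduling policy)."""
    num_snn = round(num_layers * snn_fraction)
    return ['snn'] * num_snn + ['ann'] * (num_layers - num_snn)

# Gradually increase the SNN fraction over training to close the gap.
for fraction in (0.0, 0.5, 1.0):
    print(layer_modes(4, fraction))
```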

@@ -231,10 +234,10 @@ Example Code

net.export_hdf5('network.net')

``lava.lib.dl.netx`` 
--------------------
Network Exchange (NetX) Library 
-------------------------------

For inference using Lava, ``lava.lib.dl.netx`` provides an
For inference using Lava, the Network Exchange library (`lava.lib.dl.netx`) provides an
automated API for loading SLAYER-trained models as Lava Processes, which
can be directly run on a desired backend. ``lava.lib.dl.netx`` imports
models saved via SLAYER using the hdf5 network exchange format. The
@@ -286,6 +289,9 @@ Example Code

net.run(condition=rcnd.RunSteps(total_run_time), run_cfg=rcfg.Loihi1SimCfg())

Detailed Description
--------------------

.. toctree::
:maxdepth: 1
:caption: Detailed description:
4 changes: 2 additions & 2 deletions lava-lib-dl/netx/index.rst
@@ -1,5 +1,5 @@
Network Exchange (Netx)
=======================
Lava-DL NetX
============

.. Note::
Coming soon.
4 changes: 2 additions & 2 deletions lava-lib-dl/netx/netx.rst
@@ -1,7 +1,7 @@
Lava-dl-netx
Lava-DL NetX
============

``lava.lib.dl.netx`` automates deep learning network model exchange
The Lava-DL NetX library (``lava.lib.dl.netx``) automates deep learning network model exchange
between Lava and other frameworks. At the moment, we support a
simple platform-independent hdf5 network description protocol. In
future, we will extend network exchange support to other neural network
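As a conceptual illustration of such a network description protocol, here is a plain JSON stand-in with an assumed layer schema; the real ``lava.lib.dl.netx`` protocol stores an equivalent layer-by-layer description in an hdf5 file.

```python
import json

# Assumed, simplified layer schema for illustration only.
description = {
    'layer': [
        {'type': 'dense', 'shape': [128], 'neuron': {'threshold': 64}},
        {'type': 'dense', 'shape': [10], 'neuron': {'threshold': 64}},
    ],
}

# Export on the training side, import on the inference side.
serialized = json.dumps(description)
restored = json.loads(serialized)

# Any framework that understands the schema can rebuild the layers.
shapes = [tuple(layer['shape']) for layer in restored['layer']]
print(shapes)
```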
107 changes: 72 additions & 35 deletions sync_notebook.py
@@ -2,21 +2,43 @@
import glob
from distutils.dir_util import copy_tree

import lava
import lava.lib.dl.slayer as slayer
import lava.lib.dl.bootstrap as bootstrap
module_dict = {}

try:
    import lava
    module_dict.update({'lava': lava})
except ModuleNotFoundError:
    print("Failed importing lava. Its dependencies will be excluded.")

try:
    import lava.lib.dl.slayer as slayer
    module_dict.update({'slayer': slayer})
except ModuleNotFoundError:
    print("Failed importing slayer. Its dependencies will be excluded.")

try:
    import lava.lib.dl.bootstrap as bootstrap
    module_dict.update({'bootstrap': bootstrap})
except ModuleNotFoundError:
    print("Failed importing bootstrap. Its dependencies will be excluded.")

try:
    import lava.lib.optimization as optim
    module_dict.update({'optim': optim})
except ModuleNotFoundError:
    print("Failed importing optim. Its dependencies will be excluded.")


tutorial_list = [ # list of all notebooks to sync
{
'module': lava,
'module': 'lava',
'dst': 'lava/notebooks/',
'tutorials': {
'end_to_end': 'End to end tutorials',
},
},
{
'module': slayer,
'module': 'slayer',
'dst': 'lava-lib-dl/slayer/notebooks/',
'tutorials': {
'oxford': 'Oxford spike train regression',
@@ -26,12 +48,19 @@
},
},
{
'module': bootstrap,
'module': 'bootstrap',
'dst': 'lava-lib-dl/bootstrap/notebooks/',
'tutorials': {
'mnist': 'MNIST digit classification',
},
},
{
'module': 'optim',
'dst': 'lava/notebooks/',
'tutorials': {
'end_to_end': 'End to end tutorials',
},
},
]


@@ -65,40 +94,48 @@ def create_nb_rst(folder_path, rst_name, header, ignore=[]):
continue
rst_text += f' {nb_name}{os.linesep}'


print('Creating ' + folder_path + '/' + rst_name)
with open(folder_path + '/' + rst_name, 'wt') as f:
f.write(rst_text)


if __name__ == '__main__':
for tutorials in tutorial_list:
module = tutorials['module']
module_path = module.__path__[0]
module_path = module_path.split('src/lava')[0]
if module_path[-1] != '/': # this is temp before lava dir restructure
module_path += '/'
dst = tutorials['dst']
if 'ignore' in tutorials.keys():
ignore = tutorials['ignore']
else:
ignore = []
os.makedirs(dst, exist_ok=True)
for tutorial, header in tutorials['tutorials'].items():
src_path = glob.glob(
f'{module_path}tutorials/**/{tutorial}',
recursive=True,
)
dst_path = dst + tutorial
if len(src_path) == 1:
src_path = src_path[0]
print(f'copying from {src_path} to {dst_path}')
copy_tree(src_path, dst_path)
create_nb_rst(src_path, tutorial+'.rst', header, ignore)
key = tutorials['module']
if key in module_dict.keys():
module = module_dict[key]
module_path = module.__path__[0]
module_path = module_path.split('src/lava')[0]
if module_path[-1] != '/':
# this is temp before lava dir restructure
module_path += '/'
dst = tutorials['dst']
if 'ignore' in tutorials.keys():
ignore = tutorials['ignore']
else:
if len(src_path) == 0:
print(f'search path: {module_path}tutorials/**/{tutorial}')
raise Exception('Module not found! Check your config')
ignore = []
os.makedirs(dst, exist_ok=True)
for tutorial, header in tutorials['tutorials'].items():
src_path = glob.glob(
f'{module_path}tutorials/**/{tutorial}',
recursive=True,
)
dst_path = dst + tutorial
if len(src_path) == 1:
src_path = src_path[0]
print(f'copying from {src_path} to {dst_path}')
copy_tree(src_path, dst_path)
create_nb_rst(dst_path, tutorial+'.rst', header, ignore)
else:
raise Exception(
f'Multiple modules found: '
f'{src_path}'
)
if len(src_path) == 0:
print(
f'search path: '
f'{module_path}tutorials/**/{tutorial}'
)
raise Exception('Module not found! Check your config')
else:
raise Exception(
f'Multiple modules found: '
f'{src_path}'
)