Commit

auto-generating sphinx docs

pytorchbot committed Feb 3, 2025
1 parent 5e98153 commit 14dfdd5
Showing 48 changed files with 1,694 additions and 624 deletions.
Binary file modified main/_downloads/150528e38f6816824f1e81ed67476a9f/export.zip
@@ -0,0 +1,8 @@
.. currentmodule:: tensordict.nn


ProbabilisticTensorDictSequential
=================================

.. autoclass:: ProbabilisticTensorDictSequential
:members:
@@ -0,0 +1,6 @@
tensordict.nn.rand\_one\_hot
============================

.. currentmodule:: tensordict.nn

.. autofunction:: rand_one_hot
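The hunk above adds an autodoc stub for ``tensordict.nn.rand_one_hot``. As a conceptual illustration only, the following stdlib-only sketch shows the idea of drawing a one-hot vector from a probability vector; it is not the tensordict implementation, which operates on torch tensors:

```python
import random

def rand_one_hot_sketch(probs):
    """Sample an index according to `probs` and return it as a one-hot list.

    Stdlib-only conceptual sketch, NOT tensordict.nn.rand_one_hot itself.
    """
    # random.choices performs a weighted draw over the indices 0..len(probs)-1
    idx = random.choices(range(len(probs)), weights=probs, k=1)[0]
    return [1 if i == idx else 0 for i in range(len(probs))]

vec = rand_one_hot_sketch([0.1, 0.7, 0.2])
```

The returned list always contains exactly one ``1``, at a position distributed according to the input weights.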

This file was deleted.

7 changes: 6 additions & 1 deletion main/_sources/reference/nn.rst.txt
@@ -194,6 +194,7 @@ to build distributions from network outputs and get summary statistics or sample
TensorDictModuleBase
TensorDictModule
ProbabilisticTensorDictModule
ProbabilisticTensorDictSequential
TensorDictSequential
TensorDictModuleWrapper
CudaGraphModule
@@ -257,6 +258,10 @@ Distributions
NormalParamExtractor
OneHotCategorical
TruncatedNormal
InteractionType
set_interaction_type
add_custom_mapping
mappings


Utils
@@ -270,8 +275,8 @@ Utils

make_tensordict
dispatch
set_interaction_type
inv_softplus
biased_softplus
set_skip_existing
skip_existing
rand_one_hot
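This hunk moves ``set_interaction_type`` out of Utils and lists it next to ``InteractionType`` under Distributions. Conceptually, ``set_interaction_type`` is a context manager that temporarily selects how probabilistic modules sample. The following stdlib-only sketch shows that context-manager pattern under assumed enum names (MODE, RANDOM, MEAN); it is not the tensordict implementation:

```python
import contextlib
import enum

class InteractionType(enum.Enum):
    # Names assumed for illustration; see tensordict.nn for the real enum.
    MODE = "mode"
    RANDOM = "random"
    MEAN = "mean"

_INTERACTION_TYPE = InteractionType.RANDOM  # module-level default

@contextlib.contextmanager
def set_interaction_type(itype):
    """Temporarily switch the global interaction type, restoring it on exit."""
    global _INTERACTION_TYPE
    previous = _INTERACTION_TYPE
    _INTERACTION_TYPE = itype
    try:
        yield
    finally:
        _INTERACTION_TYPE = previous

with set_interaction_type(InteractionType.MODE):
    inside = _INTERACTION_TYPE  # MODE while the context is active
after = _INTERACTION_TYPE      # restored to the previous value
```

The try/finally guarantees the previous setting is restored even if the body raises, which is why this pattern suits a process-wide sampling switch.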
12 changes: 6 additions & 6 deletions main/_sources/sg_execution_times.rst.txt
@@ -6,7 +6,7 @@

Computation times
=================
**02:26.885** total execution time for 11 files **from all galleries**:
**02:26.490** total execution time for 11 files **from all galleries**:

.. container::

@@ -33,22 +33,22 @@ Computation times
- Time
- Mem (MB)
* - :ref:`sphx_glr_tutorials_tensorclass_fashion.py` (``reference/generated/tutorials/tensorclass_fashion.py``)
- 01:00.697
- 01:00.417
- 0.0
* - :ref:`sphx_glr_tutorials_data_fashion.py` (``reference/generated/tutorials/data_fashion.py``)
- 00:55.185
- 00:55.141
- 0.0
* - :ref:`sphx_glr_tutorials_tensordict_module.py` (``reference/generated/tutorials/tensordict_module.py``)
- 00:16.816
- 00:16.763
- 0.0
* - :ref:`sphx_glr_tutorials_streamed_tensordict.py` (``reference/generated/tutorials/streamed_tensordict.py``)
- 00:11.021
- 0.0
* - :ref:`sphx_glr_tutorials_tensorclass_imagenet.py` (``reference/generated/tutorials/tensorclass_imagenet.py``)
- 00:01.663
- 00:01.648
- 0.0
* - :ref:`sphx_glr_tutorials_export.py` (``reference/generated/tutorials/export.py``)
- 00:01.474
- 00:01.471
- 0.0
* - :ref:`sphx_glr_tutorials_tensordict_keys.py` (``reference/generated/tutorials/tensordict_keys.py``)
- 00:00.009
226 changes: 113 additions & 113 deletions main/_sources/tutorials/data_fashion.rst.txt
@@ -423,164 +423,164 @@ adjust how we unpack the data to the more explicit key-based retrieval offered b
is_shared=False)
Epoch 1
-------------------------
loss: 2.316633 [ 0/60000]
loss: 2.307225 [ 6400/60000]
loss: 2.285488 [12800/60000]
loss: 2.274556 [19200/60000]
loss: 2.267080 [25600/60000]
loss: 2.228321 [32000/60000]
loss: 2.242993 [38400/60000]
loss: 2.208107 [44800/60000]
loss: 2.199419 [51200/60000]
loss: 2.172009 [57600/60000]
loss: 2.292876 [ 0/60000]
loss: 2.283355 [ 6400/60000]
loss: 2.263936 [12800/60000]
loss: 2.273287 [19200/60000]
loss: 2.236377 [25600/60000]
loss: 2.214329 [32000/60000]
loss: 2.223937 [38400/60000]
loss: 2.185483 [44800/60000]
loss: 2.176149 [51200/60000]
loss: 2.157408 [57600/60000]
Test Error:
Accuracy: 27.5%, Avg loss: 2.168821
Accuracy: 47.8%, Avg loss: 2.149705
Epoch 2
-------------------------
loss: 2.185052 [ 0/60000]
loss: 2.180913 [ 6400/60000]
loss: 2.122231 [12800/60000]
loss: 2.135427 [19200/60000]
loss: 2.095721 [25600/60000]
loss: 2.026379 [32000/60000]
loss: 2.065620 [38400/60000]
loss: 1.988877 [44800/60000]
loss: 1.983771 [51200/60000]
loss: 1.921114 [57600/60000]
loss: 2.151449 [ 0/60000]
loss: 2.149319 [ 6400/60000]
loss: 2.089481 [12800/60000]
loss: 2.119855 [19200/60000]
loss: 2.046867 [25600/60000]
loss: 1.995784 [32000/60000]
loss: 2.019445 [38400/60000]
loss: 1.938274 [44800/60000]
loss: 1.927196 [51200/60000]
loss: 1.870499 [57600/60000]
Test Error:
Accuracy: 57.6%, Avg loss: 1.919113
Accuracy: 60.4%, Avg loss: 1.868008
Epoch 3
-------------------------
loss: 1.955521 [ 0/60000]
loss: 1.929771 [ 6400/60000]
loss: 1.815067 [12800/60000]
loss: 1.856079 [19200/60000]
loss: 1.743889 [25600/60000]
loss: 1.686423 [32000/60000]
loss: 1.723525 [38400/60000]
loss: 1.620473 [44800/60000]
loss: 1.631592 [51200/60000]
loss: 1.530928 [57600/60000]
loss: 1.893928 [ 0/60000]
loss: 1.873756 [ 6400/60000]
loss: 1.753070 [12800/60000]
loss: 1.804454 [19200/60000]
loss: 1.670851 [25600/60000]
loss: 1.636076 [32000/60000]
loss: 1.651654 [38400/60000]
loss: 1.554687 [44800/60000]
loss: 1.565008 [51200/60000]
loss: 1.473971 [57600/60000]
Test Error:
Accuracy: 61.8%, Avg loss: 1.546323
Accuracy: 61.9%, Avg loss: 1.491119
Epoch 4
-------------------------
loss: 1.619225 [ 0/60000]
loss: 1.582403 [ 6400/60000]
loss: 1.432203 [12800/60000]
loss: 1.501907 [19200/60000]
loss: 1.371325 [25600/60000]
loss: 1.368950 [32000/60000]
loss: 1.392523 [38400/60000]
loss: 1.314770 [44800/60000]
loss: 1.336692 [51200/60000]
loss: 1.240931 [57600/60000]
loss: 1.556790 [ 0/60000]
loss: 1.530334 [ 6400/60000]
loss: 1.377608 [12800/60000]
loss: 1.457017 [19200/60000]
loss: 1.323445 [25600/60000]
loss: 1.330096 [32000/60000]
loss: 1.343076 [38400/60000]
loss: 1.268062 [44800/60000]
loss: 1.295799 [51200/60000]
loss: 1.211058 [57600/60000]
Test Error:
Accuracy: 62.7%, Avg loss: 1.263385
Accuracy: 63.9%, Avg loss: 1.231329
Epoch 5
-------------------------
loss: 1.352172 [ 0/60000]
loss: 1.330544 [ 6400/60000]
loss: 1.164969 [12800/60000]
loss: 1.265779 [19200/60000]
loss: 1.133228 [25600/60000]
loss: 1.163435 [32000/60000]
loss: 1.189454 [38400/60000]
loss: 1.129025 [44800/60000]
loss: 1.154566 [51200/60000]
loss: 1.074657 [57600/60000]
loss: 1.308637 [ 0/60000]
loss: 1.296877 [ 6400/60000]
loss: 1.127608 [12800/60000]
loss: 1.239060 [19200/60000]
loss: 1.105856 [25600/60000]
loss: 1.137060 [32000/60000]
loss: 1.160426 [38400/60000]
loss: 1.095086 [44800/60000]
loss: 1.129022 [51200/60000]
loss: 1.059775 [57600/60000]
Test Error:
Accuracy: 64.0%, Avg loss: 1.091707
Accuracy: 65.1%, Avg loss: 1.072701
TensorDict training done! time: 8.6239 s
TensorDict training done! time: 8.5516 s
Epoch 1
-------------------------
loss: 2.303011 [ 0/60000]
loss: 2.287436 [ 6400/60000]
loss: 2.274800 [12800/60000]
loss: 2.269296 [19200/60000]
loss: 2.256683 [25600/60000]
loss: 2.223480 [32000/60000]
loss: 2.236421 [38400/60000]
loss: 2.202974 [44800/60000]
loss: 2.200173 [51200/60000]
loss: 2.161506 [57600/60000]
loss: 2.309895 [ 0/60000]
loss: 2.290487 [ 6400/60000]
loss: 2.263228 [12800/60000]
loss: 2.248584 [19200/60000]
loss: 2.248923 [25600/60000]
loss: 2.215442 [32000/60000]
loss: 2.225330 [38400/60000]
loss: 2.191122 [44800/60000]
loss: 2.185660 [51200/60000]
loss: 2.156927 [57600/60000]
Test Error:
Accuracy: 46.0%, Avg loss: 2.161499
Accuracy: 35.7%, Avg loss: 2.146915
Epoch 2
-------------------------
loss: 2.178989 [ 0/60000]
loss: 2.161377 [ 6400/60000]
loss: 2.110821 [12800/60000]
loss: 2.120728 [19200/60000]
loss: 2.080936 [25600/60000]
loss: 2.013035 [32000/60000]
loss: 2.048943 [38400/60000]
loss: 1.974167 [44800/60000]
loss: 1.977030 [51200/60000]
loss: 1.885004 [57600/60000]
loss: 2.162492 [ 0/60000]
loss: 2.153710 [ 6400/60000]
loss: 2.087680 [12800/60000]
loss: 2.101584 [19200/60000]
loss: 2.071230 [25600/60000]
loss: 2.003889 [32000/60000]
loss: 2.038064 [38400/60000]
loss: 1.957708 [44800/60000]
loss: 1.964049 [51200/60000]
loss: 1.902878 [57600/60000]
Test Error:
Accuracy: 57.7%, Avg loss: 1.898815
Accuracy: 57.2%, Avg loss: 1.891697
Epoch 3
-------------------------
loss: 1.948882 [ 0/60000]
loss: 1.905324 [ 6400/60000]
loss: 1.794196 [12800/60000]
loss: 1.819826 [19200/60000]
loss: 1.724633 [25600/60000]
loss: 1.666429 [32000/60000]
loss: 1.690138 [38400/60000]
loss: 1.601758 [44800/60000]
loss: 1.617406 [51200/60000]
loss: 1.492366 [57600/60000]
loss: 1.927803 [ 0/60000]
loss: 1.901445 [ 6400/60000]
loss: 1.777262 [12800/60000]
loss: 1.814932 [19200/60000]
loss: 1.728945 [25600/60000]
loss: 1.678206 [32000/60000]
loss: 1.701154 [38400/60000]
loss: 1.600237 [44800/60000]
loss: 1.627035 [51200/60000]
loss: 1.526987 [57600/60000]
Test Error:
Accuracy: 61.0%, Avg loss: 1.526873
Accuracy: 60.2%, Avg loss: 1.535037
Epoch 4
-------------------------
loss: 1.613432 [ 0/60000]
loss: 1.562069 [ 6400/60000]
loss: 1.418200 [12800/60000]
loss: 1.472821 [19200/60000]
loss: 1.363791 [25600/60000]
loss: 1.349146 [32000/60000]
loss: 1.369352 [38400/60000]
loss: 1.302855 [44800/60000]
loss: 1.325800 [51200/60000]
loss: 1.219427 [57600/60000]
loss: 1.604841 [ 0/60000]
loss: 1.572237 [ 6400/60000]
loss: 1.413552 [12800/60000]
loss: 1.477304 [19200/60000]
loss: 1.375301 [25600/60000]
loss: 1.372313 [32000/60000]
loss: 1.381226 [38400/60000]
loss: 1.304628 [44800/60000]
loss: 1.339577 [51200/60000]
loss: 1.242289 [57600/60000]
Test Error:
Accuracy: 63.9%, Avg loss: 1.253359
Accuracy: 62.6%, Avg loss: 1.262859
Epoch 5
-------------------------
loss: 1.345231 [ 0/60000]
loss: 1.315563 [ 6400/60000]
loss: 1.153394 [12800/60000]
loss: 1.247195 [19200/60000]
loss: 1.128013 [25600/60000]
loss: 1.140230 [32000/60000]
loss: 1.172961 [38400/60000]
loss: 1.116331 [44800/60000]
loss: 1.143755 [51200/60000]
loss: 1.060270 [57600/60000]
loss: 1.340297 [ 0/60000]
loss: 1.329028 [ 6400/60000]
loss: 1.153737 [12800/60000]
loss: 1.252457 [19200/60000]
loss: 1.134847 [25600/60000]
loss: 1.167557 [32000/60000]
loss: 1.183226 [38400/60000]
loss: 1.120279 [44800/60000]
loss: 1.156876 [51200/60000]
loss: 1.079154 [57600/60000]
Test Error:
Accuracy: 65.0%, Avg loss: 1.085642
Accuracy: 64.4%, Avg loss: 1.093753
Training done! time: 34.3338 s
Training done! time: 33.9425 s
.. rst-class:: sphx-glr-timing

**Total running time of the script:** (0 minutes 55.185 seconds)
**Total running time of the script:** (0 minutes 55.141 seconds)


.. _sphx_glr_download_tutorials_data_fashion.py: