Recalculating D in contrastive divergence #4

Open
alexminnaar opened this issue Oct 1, 2015 · 0 comments

In __contrastive_divergence_rsm__ you are passing in D and then using D as the number of trials for the multinomial sample in

for i in xrange(len(vis)):
    neg_vis[i] = random.multinomial(D[i], softmax_value[i], size=1)

Then you are recomputing D in

D = sum(neg_vis, axis=1)

This sums the results of the multinomial trials across events, which will always equal the number of trials. So D never actually changes, and there does not appear to be any need to recompute it in __contrastive_divergence_rsm__.
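
For illustration, here is a minimal standalone sketch (using made-up shapes and values, not the actual variables from the repo) showing that summing a multinomial sample across its events just recovers the trial count, so the recomputed D equals the D that was passed in:

    import numpy as np

    # Hypothetical stand-ins for the variables in the RSM code: D[i] plays the
    # role of the document length used as the trial count, and softmax_value[i]
    # is a probability distribution over the vocabulary.
    D = np.array([5, 12, 7])
    softmax_value = np.random.dirichlet(np.ones(4), size=3)

    neg_vis = np.zeros((3, 4), dtype=int)
    for i in range(len(D)):
        neg_vis[i] = np.random.multinomial(D[i], softmax_value[i])

    # Each row of counts sums back to its trial count, so the "recomputed" D
    # is identical to the D that was passed in.
    print(neg_vis.sum(axis=1))                     # [ 5 12  7]
    print(np.array_equal(neg_vis.sum(axis=1), D))  # True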
