From faeca00fb1821059006033107d0e979d3f08ee5a Mon Sep 17 00:00:00 2001
From: Sam Griesemer
Date: Sat, 20 Jul 2019 15:23:30 -0500
Subject: [PATCH] simple typo fix, "distribution" -> "distributions"

---
 index.html | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/index.html b/index.html
index 6a8e8ff..82c7198 100644
--- a/index.html
+++ b/index.html
@@ -371,7 +371,7 @@

MDN-RNN (M) Model

RNN with a Mixture Density Network output layer. The MDN outputs the parameters of a mixture of Gaussian distribution used to sample a prediction of the next latent vector z.
-In our approach, we approximate p(z) as a mixture of Gaussian distribution, and train the RNN to output the probability distribution of the next latent vector z_{t+1} given the current and past information made available to it.
+In our approach, we approximate p(z) as a mixture of Gaussian distributions, and train the RNN to output the probability distribution of the next latent vector z_{t+1} given the current and past information made available to it.
More specifically, the RNN will model P(z_{t+1} \; | \; a_t, z_t, h_t), where a_t is the action taken at time t and h_t is the hidden state of the RNN at time t. During sampling, we can adjust a temperature parameter \tau to control model uncertainty, as done in -- we will find adjusting \tau to be useful for training our controller later on.

@@ -2120,4 +2120,4 @@

DoomRNN

\ No newline at end of file
+
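
The paragraph edited in the first hunk describes how the M model is used at sampling time: the RNN emits the parameters of a mixture of Gaussians for each latent dimension, and z_{t+1} is drawn from that mixture, with the temperature \tau controlling how uncertain the draw is. Below is a minimal NumPy sketch of that sampling step, assuming the RNN has already produced per-dimension mixture logits, means, and log standard deviations; the names, shapes, and the particular way \tau is applied (dividing the logits and widening each Gaussian) are illustrative, not taken from this repository's code.

import numpy as np

def sample_mdn(logpi, mu, logsigma, temperature=1.0, rng=None):
    # Sample one latent vector z_{t+1} from an MDN output.
    # logpi, mu, logsigma: (z_dim, n_mix) arrays produced by the RNN
    # for a single time step (illustrative shapes).
    rng = np.random.default_rng() if rng is None else rng
    z_dim, n_mix = mu.shape

    # One common temperature scheme: flatten/sharpen the mixture weights
    # with tau and widen each Gaussian by sqrt(tau).
    scaled = logpi / temperature
    pi = np.exp(scaled - scaled.max(axis=1, keepdims=True))
    pi /= pi.sum(axis=1, keepdims=True)          # softmax over mixture components

    z = np.empty(z_dim)
    for d in range(z_dim):
        k = rng.choice(n_mix, p=pi[d])           # pick a component for this dimension
        sigma = np.exp(logsigma[d, k]) * np.sqrt(temperature)
        z[d] = rng.normal(mu[d, k], sigma)       # draw z_{t+1, d} from N(mu, sigma^2)
    return z

# Example with a 32-dimensional latent and 5 mixture components per dimension.
rng = np.random.default_rng(0)
logpi = rng.normal(size=(32, 5))
mu = rng.normal(size=(32, 5))
logsigma = 0.1 * rng.normal(size=(32, 5))
z_next = sample_mdn(logpi, mu, logsigma, temperature=1.15, rng=rng)
print(z_next.shape)  # (32,)

With \tau near zero the draw collapses toward the mean of the most likely component, while larger \tau spreads probability across components and widens each Gaussian, which is the adjustment the text says will be useful when training the controller later on.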