Deep Learning: Foundations and Concepts, by Chris Bishop


Exercises


19.4 (⋆⋆) Evaluate the Kullback–Leibler divergence term in (19.14). Hence, show how the gradients of this term with respect to $\mathbf{w}$ and $\boldsymbol{\phi}$ can be evaluated for training the encoder and decoder networks.

19.5 (⋆) We have seen that the ELBO given by (19.11) can be written in the form (19.14). Show that it can also be written as
$$
\mathcal{L}_n(\mathbf{w}, \boldsymbol{\phi})
= \int q(\mathbf{z}_n|\mathbf{x}_n, \boldsymbol{\phi}) \ln \left\{ p(\mathbf{x}_n|\mathbf{z}_n, \mathbf{w})\, p(\mathbf{z}_n) \right\} \mathrm{d}\mathbf{z}_n
- \int q(\mathbf{z}_n|\mathbf{x}_n, \boldsymbol{\phi}) \ln q(\mathbf{z}_n|\mathbf{x}_n, \boldsymbol{\phi})\, \mathrm{d}\mathbf{z}_n.
\tag{19.24}
$$

19.6 (⋆) Show that the ELBO given by (19.11) can be written in the form
$$
\mathcal{L}_n(\mathbf{w}, \boldsymbol{\phi})
= \int q(\mathbf{z}_n|\mathbf{x}_n, \boldsymbol{\phi}) \ln p(\mathbf{z}_n)\, \mathrm{d}\mathbf{z}_n
+ \int q(\mathbf{z}_n|\mathbf{x}_n, \boldsymbol{\phi}) \ln \frac{p(\mathbf{x}_n|\mathbf{z}_n, \mathbf{w})}{q(\mathbf{z}_n|\mathbf{x}_n, \boldsymbol{\phi})}\, \mathrm{d}\mathbf{z}_n.
\tag{19.25}
$$
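The alternative forms of the ELBO in these exercises are algebraic rearrangements of the same expectation under $q(\mathbf{z}_n|\mathbf{x}_n, \boldsymbol{\phi})$, so they can be checked numerically. The sketch below (not part of the book) uses a one-dimensional Gaussian model with illustrative, assumed parameter values: a standard normal prior, a linear-Gaussian likelihood, and a Gaussian approximate posterior. It verifies that the two decompositions agree on the same Monte Carlo samples, and, in the spirit of Exercise 19.4, compares the well-known closed-form KL divergence between a Gaussian and a standard normal with its Monte Carlo estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical one-dimensional model (all parameter values are illustrative):
#   prior       p(z)        = N(z | 0, 1)
#   likelihood  p(x|z, w)   = N(x | a*z + b, s2),  w = (a, b, s2)
#   encoder     q(z|x, phi) = N(z | mu, sig2)
a, b, s2 = 1.5, -0.3, 0.4
mu, sig2 = 0.7, 0.25
x = 1.2

def log_norm(v, mean, var):
    """Log density of a univariate Gaussian N(v | mean, var)."""
    return -0.5 * (np.log(2.0 * np.pi * var) + (v - mean) ** 2 / var)

# Draw Monte Carlo samples from q(z|x, phi).
z = mu + np.sqrt(sig2) * rng.standard_normal(100_000)
log_q = log_norm(z, mu, sig2)
log_pz = log_norm(z, 0.0, 1.0)
log_px_z = log_norm(x, a * z + b, s2)

# ELBO in the form (19.24): E_q[ln{p(x|z,w) p(z)}] - E_q[ln q].
elbo_24 = np.mean(log_px_z + log_pz) - np.mean(log_q)

# ELBO in the form (19.25): E_q[ln p(z)] + E_q[ln{p(x|z,w) / q}].
elbo_25 = np.mean(log_pz) + np.mean(log_px_z - log_q)

# The two forms are rearrangements of the same expectation, so they
# agree on the same samples up to floating-point rounding.
assert abs(elbo_24 - elbo_25) < 1e-9

# Exercise 19.4: for Gaussian q and an N(0,1) prior, the KL term has the
# closed form 0.5*(mu^2 + sig2 - 1 - ln sig2); the Monte Carlo estimate
# of E_q[ln q - ln p(z)] should approach it.
kl_closed = 0.5 * (mu ** 2 + sig2 - 1.0 - np.log(sig2))
kl_mc = np.mean(log_q - log_pz)
print(f"closed-form KL = {kl_closed:.4f}, Monte Carlo KL = {kl_mc:.4f}")
```

With $10^5$ samples the Monte Carlo KL estimate typically matches the closed form to two decimal places, while the two ELBO forms agree to machine precision because they differ only in how the same log terms are grouped.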

