Enhancing controllability of text generation


dc.contributor.author Shcherbyna, Anton
dc.date.accessioned 2020-06-17T23:16:08Z
dc.date.available 2020-06-17T23:16:08Z
dc.date.issued 2020
dc.identifier.citation Shcherbyna, Anton. Enhancing controllability of text generation : Master Thesis : manuscript rights / Anton Shcherbyna ; Supervisor: Kostiantyn Omelianchuk ; Ukrainian Catholic University, Department of Computer Sciences. – Lviv : [s.n.], 2020. – 30 p. : ill. uk
dc.identifier.uri http://er.ucu.edu.ua/handle/1/2240
dc.language.iso en uk
dc.subject text generation uk
dc.subject controllability uk
dc.subject text modeling uk
dc.title Enhancing controllability of text generation uk
dc.type Preprint uk
dc.status Published for the first time uk
dc.description.abstract Many models can generate text conditioned on some context, but these approaches do not give us the ability to control various aspects of the generated text (e.g., sentiment). To address this problem, Variational Autoencoders are typically used, because they allow manipulation in the latent space and, in this way, control over text generation. However, it has been shown that VAEs with strong autoregressive decoders, which are used for text modeling, suffer from the posterior collapse problem. We think that one of the reasons this problem occurs is the restrictive Gaussian assumption made about the approximate posterior. In this work, we apply well-known approaches based on Normalizing Flows to improve the approximate posterior for text modeling and check whether this helps avoid posterior collapse. en
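
The idea described in the abstract, enriching the Gaussian approximate posterior of a text VAE with Normalizing Flows, can be sketched as follows. This is a minimal illustration and not the thesis implementation: the planar flow follows Rezende & Mohamed (2015), and all class names, layer sizes, and the encoder interface are assumptions made only for this example.

# Minimal sketch (assumed names and sizes): a diagonal-Gaussian posterior
# q(z0|x) transformed by a stack of planar flows into a richer posterior q(zK|x).
import torch
import torch.nn as nn

class PlanarFlow(nn.Module):
    """One planar transform: f(z) = z + u * tanh(w^T z + b)."""
    def __init__(self, dim):
        super().__init__()
        self.u = nn.Parameter(torch.randn(dim) * 0.01)
        self.w = nn.Parameter(torch.randn(dim) * 0.01)
        self.b = nn.Parameter(torch.zeros(1))
        # Note: the constraint w^T u >= -1 that guarantees invertibility
        # is omitted here for brevity.

    def forward(self, z):
        # z: (batch, dim)
        lin = z @ self.w + self.b                              # (batch,)
        f_z = z + self.u * torch.tanh(lin).unsqueeze(-1)       # (batch, dim)
        # log|det df/dz| = log|1 + u^T psi|, psi = (1 - tanh^2(w^T z + b)) * w
        psi = (1 - torch.tanh(lin) ** 2).unsqueeze(-1) * self.w
        log_det = torch.log(torch.abs(1 + psi @ self.u) + 1e-8)
        return f_z, log_det

class FlowPosterior(nn.Module):
    """Diagonal Gaussian q(z0|x) followed by K planar flows, producing zK."""
    def __init__(self, hidden_dim, latent_dim, n_flows=4):
        super().__init__()
        self.mu = nn.Linear(hidden_dim, latent_dim)
        self.log_var = nn.Linear(hidden_dim, latent_dim)
        self.flows = nn.ModuleList(PlanarFlow(latent_dim) for _ in range(n_flows))

    def forward(self, h):
        # h: encoder summary of the text, e.g. the final hidden state of an LSTM.
        mu, log_var = self.mu(h), self.log_var(h)
        z = mu + torch.randn_like(mu) * (0.5 * log_var).exp()  # reparameterized z0
        sum_log_det = torch.zeros(z.size(0), device=z.device)
        for flow in self.flows:
            z, log_det = flow(z)
            sum_log_det = sum_log_det + log_det
        # The ELBO's KL term then uses log q(z0|x) - sum_log_det - log p(zK).
        return z, mu, log_var, sum_log_det

# Usage with assumed dimensions: an encoder state of size 256 and a 32-dim latent.
posterior = FlowPosterior(hidden_dim=256, latent_dim=32)
h = torch.randn(8, 256)
z_k, mu, log_var, log_det = posterior(h)

The sum of log-determinants returned by the flows is what distinguishes this posterior from the plain Gaussian case: it enters the ELBO's KL term and lets the autoregressive decoder be paired with a more flexible q(z|x).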

