Course detail
FIT-ZPD, Academic year 2024/2025
Foundations of natural language processing: a historical perspective, statistical NLP, and the modern era dominated by machine learning and, specifically, deep neural networks. Meaning of individual words, lexicology and lexicography, word senses and neural architectures for computing word embeddings, word sense classification and inference. Constituency and dependency parsing, syntactic ambiguity, neural dependency parsers. Language modeling and its applications in general architectures. Machine translation: a historical perspective on the statistical approach, neural translation and evaluation scores. End-to-end models, attention mechanisms, limits of current seq2seq models. Question answering based on neural models, information extraction components, text understanding challenges, learning by reading and machine comprehension. Text classification and its modern applications, convolutional neural networks for sentence classification. Language-independent representations, non-standard texts from social networks, representing parts of words, subword models. Contextual representations and pretraining for context-dependent language modules. Transformers and self-attention for generative models. Conversational agents and natural language generation. Coreference resolution and its interconnection with other text understanding components.
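The annotation above only names the topics covered. Purely as an illustrative aside (not part of the official course materials), the sketch below shows one of them, the scaled dot-product self-attention used in Transformer models, in plain NumPy; the matrix names (X, Wq, Wk, Wv) and dimensions are assumptions chosen for the example.

import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over a token sequence.

    X          : (seq_len, d_model) input token embeddings (illustrative)
    Wq, Wk, Wv : (d_model, d_k) projection matrices (illustrative)
    Returns (seq_len, d_k) contextualised token representations.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # (seq_len, seq_len) similarities
    weights = softmax(scores, axis=-1)        # attention distribution per token
    return weights @ V

# toy example: 4 tokens with 8-dimensional embeddings, d_k = 4
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 4)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)    # (4, 4)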
Language of instruction
Mode of study
Guarantor
Department
Entry knowledge
Rules for evaluation and completion of the course
Aims
Study aids
Prerequisites and corequisites
Basic literature
Recommended reading
Classification of course in study plans
branch DVI4, 0 year of study, winter semester, elective
Lecture
Teacher / Lecturer
Syllabus
Guided consultation in combined form of studies