Course detail

Natural Language Processing (in English)

FIT-ZPJa
Acad. year: 2023/2024

Foundations of natural language processing, a historical perspective, statistical NLP, and the modern era dominated by machine learning and, specifically, deep neural networks. Meaning of individual words, lexicology and lexicography, word senses and neural architectures for computing word embeddings, word sense classification and inference. Constituency and dependency parsing, syntactic ambiguity, neural dependency parsers. Language modeling and its applications in general architectures. Machine translation, a historical perspective on the statistical approach, neural translation and evaluation scores. End-to-end models, attention mechanisms, limits of current seq2seq models. Question answering based on neural models, information extraction components, text understanding challenges, learning by reading and machine comprehension. Text classification and its modern applications, convolutional neural networks for sentence classification. Language-independent representations, non-standard texts from social networks, representing parts of words, subword models. Contextual representations and pretraining for context-dependent language modules. Transformers and self-attention for generative models. Communication agents and natural language generation. Coreference resolution and its interconnection with other text understanding components.
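To give a flavour of one of the listed topics, the following is a minimal NumPy sketch of scaled dot-product self-attention, the core operation of the Transformer architecture mentioned above. The variable names, dimensions, and toy data are purely illustrative and are not taken from the course materials.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over one sequence.

    X          : (seq_len, d_model) token embeddings
    Wq, Wk, Wv : (d_model, d_k) learned projection matrices
    Returns      (seq_len, d_k) context vectors.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # (seq_len, seq_len)
    weights = softmax(scores, axis=-1)       # each row sums to 1
    return weights @ V

# Toy example: 4 tokens, 8-dim embeddings, 4-dim projections.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 4)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 4): one context vector per token
```

Each output row is a weighted mixture of the value vectors, with weights determined by query-key similarity — the mechanism behind the "Transformers and self-attention" lectures.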

Language of instruction

English

Number of ECTS credits

5

Mode of study

Not applicable.

Offered to foreign students

Of all faculties

Entry knowledge

Knowledge of Python programming and fundamental elements of calculus.

Rules for evaluation and completion of the course

  • Mid-term test - up to 9 points
  • Individual project - up to 40 points
  • Written final exam - up to 51 points

The evaluation consists of a mid-term test, an individual project, and the final exam. The mid-term test has no correction option; the final exam has two possible correction runs.

Aims

To understand natural language processing and to learn how to apply modern machine learning methods in this field. To get acquainted with advanced deep learning architectures that have proved successful in various NLP tasks. To understand the use of neural networks for sequential language modelling, to understand their use as conditional language models for transduction tasks, and to get acquainted with approaches employing these techniques in combination with other mechanisms for advanced applications.

The students will get acquainted with natural language processing and will understand a range of neural network models that are commonly applied in the field. They will also grasp the basics of neural implementations of attention mechanisms and sequence embedding models, and how these modular components can be combined to build state-of-the-art NLP systems. They will be able to implement and evaluate common neural network models for various NLP applications.
Students will improve their programming skills and gain knowledge and practical experience with tools for deep learning as well as with general processing of textual data.

Study aids

Not applicable.

Prerequisites and corequisites

Not applicable.

Basic literature

Not applicable.

Recommended reading

Géron, Aurélien. Hands-On Machine Learning with Scikit-Learn and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems. O'Reilly Media, 2017. (EN)

Elearning

Classification of course in study plans

  • Programme IT-MSC-2 Master's

    branch MGMe , 0 year of study, winter semester, elective

  • Programme IT-MSC-2 Master's

    branch MBI , 0 year of study, winter semester, compulsory-optional
    branch MBS , 0 year of study, winter semester, elective
    branch MPV , 0 year of study, winter semester, elective
    branch MIS , 0 year of study, winter semester, elective
    branch MIN , 0 year of study, winter semester, elective
    branch MGM , 0 year of study, winter semester, elective
    branch MSK , 0 year of study, winter semester, elective
    branch MMM , 0 year of study, winter semester, elective

  • Programme MIT-EN Master's 0 year of study, winter semester, elective

  • Programme MITAI Master's

    specialization NSPE , 0 year of study, winter semester, compulsory
    specialization NBIO , 0 year of study, winter semester, elective
    specialization NSEN , 0 year of study, winter semester, elective
    specialization NVIZ , 0 year of study, winter semester, elective
    specialization NGRI , 0 year of study, winter semester, elective
    specialization NADE , 0 year of study, winter semester, elective
    specialization NISD , 0 year of study, winter semester, elective
    specialization NMAT , 0 year of study, winter semester, elective
    specialization NSEC , 0 year of study, winter semester, elective
    specialization NISY up to 2020/21 , 0 year of study, winter semester, elective
    specialization NCPS , 0 year of study, winter semester, elective
    specialization NHPC , 0 year of study, winter semester, elective
    specialization NNET , 0 year of study, winter semester, elective
    specialization NMAL , 0 year of study, winter semester, elective
    specialization NVER , 0 year of study, winter semester, elective
    specialization NIDE , 0 year of study, winter semester, elective
    specialization NEMB , 0 year of study, winter semester, elective
    specialization NISY , 0 year of study, winter semester, elective
    specialization NEMB up to 2021/22 , 0 year of study, winter semester, elective

  • Programme IT-MGR-1H Master's

    specialization MGH , 0 year of study, winter semester, recommended course

Type of course unit

 

Lecture

26 hrs, optional

Teacher / Lecturer

Syllabus

  1. Introduction, history of NLP, and modern approaches based on deep learning
  2. Word senses and word vectors
  3. Dependency parsing
  4. Language models
  5. Machine translation
  6. Seq2seq models and attention
  7. Question answering
  8. Convolutional neural networks for sentence classification
  9. Information from parts of words: Subword models
  10. Modeling contexts of use: Contextual representations and pretraining
  11. Transformers and self-attention for generative models 
  12. Natural language generation 
  13. Coreference resolution
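As a taste of the "Language models" lecture above, here is a minimal maximum-likelihood bigram language model trained on a toy corpus. The corpus and the resulting probabilities are illustrative only and do not come from the course.

```python
from collections import Counter

# Toy whitespace-tokenized training corpus.
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Count bigrams (w_prev, w) and the contexts w_prev they condition on.
bigrams = Counter(zip(corpus, corpus[1:]))
contexts = Counter(corpus[:-1])

def p(word, prev):
    # Maximum-likelihood estimate of P(word | prev).
    return bigrams[(prev, word)] / contexts[prev]

print(p("sat", "cat"))  # 1.0  : "cat" is always followed by "sat"
print(p("cat", "the"))  # 0.25 : "the" precedes cat/mat/dog/rug once each
```

Real language models smooth these counts (or replace them with a neural network) to assign non-zero probability to unseen bigrams; this sketch shows only the raw counting idea.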

Project

26 hrs, compulsory

Teacher / Lecturer

Syllabus

  • Individually assigned project
