Module overview
Deep learning and differentiable programming have revolutionised numerous fields in recent years. We've witnessed improvements in everything from computer vision and speech analysis to natural language processing, driven by the advent of cheap GPGPU compute coupled with large datasets and some neat algorithms. This module looks at how deep learning works, from the theoretical foundations of differentiable programming right through to practical implementation.
Linked modules
Prerequisites: COMP3223 or COMP6245
Aims and Objectives
Learning Outcomes
Subject Specific Practical Skills
Having successfully completed this module you will be able to:
- Apply existing deep learning models to real datasets
- Work effectively with differentiable programming and deep learning libraries to create and evaluate differentiable programs and deep networks
Subject Specific Intellectual and Research Skills
Having successfully completed this module you will be able to:
- Critically appraise recent scientific literature in deep learning
- Critically appraise the merits and shortcomings of model architectures on specific problems
Knowledge and Understanding
Having successfully completed this module, you will be able to demonstrate knowledge and understanding of:
- The key factors that have made deep learning successful for various applications
- Underlying mathematical and algorithmic principles of deep learning
Syllabus
Historical Developments
- MLPs
- CNNs, LeNet and the ImageNet competition
- RNNs
- Attention mechanisms and transformers
Mechanics
- Backpropagation
- Automatic differentiation (see the sketch below)
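To give a flavour of what automatic differentiation involves in practice, here is a minimal sketch using PyTorch (an assumed library choice for illustration, not prescribed module code):

```python
# Minimal reverse-mode automatic differentiation sketch (PyTorch assumed).
import torch

x = torch.tensor(2.0, requires_grad=True)  # leaf variable we differentiate with respect to
y = x ** 3 + 2 * x                         # forward pass builds the computation graph
y.backward()                               # backpropagation: reverse-mode AD from y
print(x.grad)                              # dy/dx = 3*x**2 + 2 = 14 at x = 2
```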
Learning Algorithms
- Initialisation
- SGD, Momentum, etc. (see the sketch below)
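As an illustration of the kind of update rule covered here, the following sketch implements plain SGD with momentum (function names and hyperparameter values are assumptions for illustration only):

```python
# Sketch of the SGD-with-momentum update rule (illustrative values only).
import numpy as np

def sgd_momentum_step(w, grad, velocity, lr=0.01, beta=0.9):
    """One update: v <- beta * v - lr * grad;  w <- w + v."""
    velocity = beta * velocity - lr * grad
    return w + velocity, velocity

w = np.array([1.0, -2.0])       # current parameters
v = np.zeros_like(w)            # momentum buffer, initialised to zero
g = np.array([0.5, -0.5])       # gradient of the loss at w (made-up numbers)
w, v = sgd_momentum_step(w, g, v)
print(w)                        # parameters after one update: [0.995, -1.995]
```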
Auto-encoders
- Variational
- Denoising (see the sketch below)
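As a rough illustration of the denoising variant, the sketch below corrupts the input and trains the network to reconstruct the clean signal (PyTorch is assumed; the architecture and sizes are illustrative, not the module's lab code):

```python
# Minimal denoising auto-encoder sketch (PyTorch assumed, sizes illustrative).
import torch
import torch.nn as nn

class DenoisingAE(nn.Module):
    def __init__(self, in_dim=784, hidden_dim=64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())
        self.decoder = nn.Sequential(nn.Linear(hidden_dim, in_dim), nn.Sigmoid())

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = DenoisingAE()
x = torch.rand(32, 784)                        # a batch of clean inputs
noisy = x + 0.1 * torch.randn_like(x)          # corrupt the input
loss = nn.functional.mse_loss(model(noisy), x) # reconstruct the clean signal
loss.backward()                                # gradients ready for an optimiser step
```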
CNNs
- Architectures
- Region Proposals
- Semantic Segmentation
Sequence Modelling
- Linear Embeddings
- RNNs
  - LSTMs (see the sketch below)
  - GRUs
  - Back-prop through time
- Transformers
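By way of illustration, the following minimal sketch embeds a token sequence and runs it through an LSTM (PyTorch assumed; vocabulary and layer sizes are arbitrary choices for the example):

```python
# Minimal sequence-modelling sketch: learned embeddings feeding an LSTM (PyTorch assumed).
import torch
import torch.nn as nn

vocab_size, embed_dim, hidden_dim = 1000, 32, 64
embed = nn.Embedding(vocab_size, embed_dim)       # learned linear embedding of tokens
lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)

tokens = torch.randint(0, vocab_size, (8, 20))    # batch of 8 sequences of length 20
outputs, (h_n, c_n) = lstm(embed(tokens))         # per-step outputs and final hidden/cell state
print(outputs.shape)                              # torch.Size([8, 20, 64])
```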
Deep Learning Applications
- Computer Vision
- Natural Language Processing & Generation
- Speech
Differentiable Relaxations
- Straight-through operator (see the sketch below)
- Sampling
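To indicate what a straight-through relaxation looks like in code, here is a minimal sketch of a straight-through estimator for rounding, written as a custom autograd function in PyTorch (assumed library; not prescribed module code):

```python
# Straight-through estimator: non-differentiable op forward, identity gradient backward.
import torch

class RoundSTE(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        return torch.round(x)      # discrete, non-differentiable operation

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output         # treat the op as the identity when backpropagating

x = torch.tensor([0.2, 0.7, 1.4], requires_grad=True)
y = RoundSTE.apply(x).sum()
y.backward()
print(x.grad)                      # tensor([1., 1., 1.]): gradient passed straight through
```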
Current research challenges
Learning and Teaching
Teaching and learning methods
Lectures, labs and guided self-study
| Type | Hours |
| --- | --- |
| Wider reading or practice | 46 |
| Specialist Laboratory | 20 |
| Completion of assessment task | 60 |
| Lecture | 24 |
| Total study time | 150 |
Resources & Reading list
Textbooks
Ian Goodfellow, Yoshua Bengio and Aaron Courville (2016). Deep Learning. MIT Press.
Assessment
Summative
This is how we’ll formally assess what you have learned in this module.
| Method | Percentage contribution |
| --- | --- |
| Laboratory Report | 40% |
| Group project report | 40% |
| Blackboard quizzes | 20% |
Referral
This is how we’ll assess you if you don’t meet the criteria to pass this module.
| Method | Percentage contribution |
| --- | --- |
| Set Task | 100% |
Repeat
An internal repeat is where you take all of your modules again, including any you passed. An external repeat is where you only re-take the modules you failed.
| Method | Percentage contribution |
| --- | --- |
| Set Task | 100% |
Repeat Information
Repeat type: Internal & External