Introduction to Machine Learning, Spring 2022
GWU Computer Science
For this homework, your group will write a short essay (diagrams may be used in addition to text) describing how gradient descent, feed-forward, and back-propagation work in a neural network with two hidden layers (that is, an input layer, hidden layer one, hidden layer two, and an output layer). You must explain what happens to the input as it moves through all four layers of this network, with a specific focus on back-propagation. Your answer should include all of the mathematical formulas up through slide 11 of the Neural Nets slide deck.
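To make the four-layer structure concrete, the following is a minimal NumPy sketch of one feed-forward pass, one back-propagation pass, and one gradient-descent update. The specifics here (sigmoid activations, squared-error loss, and the layer sizes) are assumptions for illustration only and are not the notation from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Assumed layer sizes: 3 inputs -> 4 hidden -> 4 hidden -> 2 outputs.
W1 = rng.normal(size=(4, 3)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 4)); b2 = np.zeros(4)
W3 = rng.normal(size=(2, 4)); b3 = np.zeros(2)

x = rng.normal(size=3)       # one example input
y = np.array([0.0, 1.0])     # its target output
lr = 0.1                     # gradient-descent step size

# Feed-forward: each layer's output becomes the next layer's input.
a1 = sigmoid(W1 @ x + b1)    # hidden layer one
a2 = sigmoid(W2 @ a1 + b2)   # hidden layer two
a3 = sigmoid(W3 @ a2 + b3)   # output layer
loss = 0.5 * np.sum((a3 - y) ** 2)

# Back-propagation: push the error derivative backwards, layer by layer.
d3 = (a3 - y) * a3 * (1 - a3)      # dLoss/d(pre-activation) at the output layer
d2 = (W3.T @ d3) * a2 * (1 - a2)   # ... at hidden layer two
d1 = (W2.T @ d2) * a1 * (1 - a1)   # ... at hidden layer one

# Gradient descent: each weight's gradient is (its layer's delta) times (its input).
W3 -= lr * np.outer(d3, a2); b3 -= lr * d3
W2 -= lr * np.outer(d2, a1); b2 -= lr * d2
W1 -= lr * np.outer(d1, x);  b1 -= lr * d1
```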
You may NOT use the images or slides from class (you need to create your own from scratch, so you may not copy
examples from the internet either).
| Criterion | Points |
| --- | --- |
| Correct high-level explanation of what feed-forward is | 5 |
| Correct high-level explanation of what back-propagation is | 5 |
| Correct explanation of which inputs, outputs, and weights must be considered in the back-propagation calculation for a specific weight of a specific node | 5 |
| Correct diagram (drawn from scratch) with labels for all items in the row above | 5 |
| Correct explanation of the diagram above, including formulas for the derivatives (a sketch of such a formula follows this rubric) | 5 |
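As a reference for the derivative rows above, here is one way to write the chain-rule decomposition for a single weight connecting hidden layer two to the output layer. The notation is generic and assumed here (sigmoid activation, squared-error loss E); it may differ from the symbols on the course slides. The only quantities involved are that weight's input (the sending node's output), the receiving node's output, and the target for that node.

```latex
% Chain rule for one weight w_{jk} from hidden-layer-two node k to output node j,
% assuming sigmoid activation a_j = \sigma(z_j) and squared-error loss E.
\[
\frac{\partial E}{\partial w_{jk}}
  = \frac{\partial E}{\partial a_j}
    \cdot \frac{\partial a_j}{\partial z_j}
    \cdot \frac{\partial z_j}{\partial w_{jk}}
  = (a_j - y_j)\, a_j (1 - a_j)\, a_k,
\qquad z_j = \sum_k w_{jk}\, a_k + b_j .
\]
```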