Computer Vision and Image Processing (CoSc4113)
Lecture 5: Deep Learning Models
Outline
• CNN: Convolutional Neural Networks
• RNN: Recurrent Neural Networks
• GAN: Generative Adversarial Networks
CNN
• CNNs are a subset of machine learning models and are at the heart of deep learning algorithms.
• They are distinguished from other neural networks by their superior performance on images and on speech and audio signals.
CNN – Applications
• Classification
• Recognition
• Feature extraction
• Detection
• Segmentation
CNN – Conventional/Traditional
[Diagram: a traditional neural network, showing its weights and activation function]
CNN – Recent (Deep)
• A deep CNN has many more layers than a traditional NN and performs abstract, hierarchical feature extraction.
• A CNN has many layers with various functions. They can be categorized into main layers and supportive tricks. The main layers are (see the sketch after this list):
- Convolutional layer
- Pooling layer
- Fully-connected (FC) layer
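To make the three main layer types concrete, here is a minimal sketch in PyTorch. The 1-channel 28x28 input, 8 filters, and 10 output classes are illustrative assumptions, not choices from the slides:

```python
import torch
import torch.nn as nn

class SimpleCNN(nn.Module):
    """Minimal CNN built from the three main layer types."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.conv = nn.Conv2d(1, 8, kernel_size=3, padding=1)  # convolutional layer
        self.pool = nn.MaxPool2d(kernel_size=2)                # pooling layer: 28x28 -> 14x14
        self.fc = nn.Linear(8 * 14 * 14, num_classes)          # fully-connected (FC) layer

    def forward(self, x):
        x = torch.relu(self.conv(x))   # convolution + activation
        x = self.pool(x)               # down-sample the feature maps
        x = x.flatten(start_dim=1)     # flatten feature maps for the FC layer
        return self.fc(x)              # class scores

# One batch of 4 grayscale 28x28 images (random, for illustration)
scores = SimpleCNN()(torch.randn(4, 1, 28, 28))
print(scores.shape)  # torch.Size([4, 10])
```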
CNN – Components
• Convolution: convolving the input with fixed-size weights (filters).
• Feature map(s): the output of a convolution. The number of feature maps depends on the number of filters, e.g., 10 filters (W) produce 10 feature maps, as in the sketch below.
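A small sketch of the filters-to-feature-maps relationship (PyTorch; the RGB 32x32 input and the 5x5 kernel are illustrative assumptions): a layer with 10 filters yields 10 feature maps.

```python
import torch
import torch.nn as nn

x = torch.randn(1, 3, 32, 32)      # one RGB image, 32x32 (illustrative size)
conv = nn.Conv2d(in_channels=3,
                 out_channels=10,  # 10 filters (weights W)
                 kernel_size=5)
feature_maps = conv(x)
print(feature_maps.shape)          # torch.Size([1, 10, 28, 28]) -> 10 feature maps
```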
CNN – Components
• Pooling: used to down-sample the feature maps.
• Two pooling methods are commonly used (compared in the sketch below):
- Average pooling
- Max pooling
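A short sketch contrasting the two pooling methods on the same feature map (PyTorch; the 4x4 input is illustrative):

```python
import torch
import torch.nn as nn

fm = torch.arange(16, dtype=torch.float32).reshape(1, 1, 4, 4)  # one 4x4 feature map

max_pooled = nn.MaxPool2d(kernel_size=2)(fm)  # keeps the maximum of each 2x2 window
avg_pooled = nn.AvgPool2d(kernel_size=2)(fm)  # keeps the average of each 2x2 window

print(max_pooled.shape)      # torch.Size([1, 1, 2, 2]) -> down-sampled
print(max_pooled.squeeze())  # [[ 5.,  7.], [13., 15.]]
print(avg_pooled.squeeze())  # [[ 2.5,  4.5], [10.5, 12.5]]
```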
CNN – Components
• Padding: appending (extending) zeros at the border of the input.
• Used when we want to retain the original size of the input, or when we want to focus only on a region of interest (ROI).
CNN – Components
• Stride: determines the step size of the filter's movement across the input (horizontally and vertically).
• Weights/kernels (filters) and biases:
- the parameters we need to calculate or estimate;
- initially, weights and biases are initialized from Gaussian random numbers;
- the number of weights and biases must be specified and follows from the layer configuration;
- weights and biases are unique to each layer.
• Activation function
• Batch normalization
• Up/down sampling
(The effect of stride and padding, and the parameter counts, are sketched below.)
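The output size of a convolution follows out = floor((in + 2*padding - kernel) / stride) + 1. A sketch of the stride/padding effect and of the parameter counts (PyTorch; the layer sizes are illustrative assumptions):

```python
import torch
import torch.nn as nn

x = torch.randn(1, 1, 32, 32)

# padding=1 with a 3x3 kernel retains the original 32x32 size
same = nn.Conv2d(1, 4, kernel_size=3, stride=1, padding=1)
# stride=2 halves the spatial size: floor((32 + 2*1 - 3) / 2) + 1 = 16
strided = nn.Conv2d(1, 4, kernel_size=3, stride=2, padding=1)

print(same(x).shape)      # torch.Size([1, 4, 32, 32])
print(strided(x).shape)   # torch.Size([1, 4, 16, 16])

# The number of weights and biases follows from the layer configuration:
# weights = out_channels * in_channels * k * k, biases = out_channels
print(same.weight.shape)  # torch.Size([4, 1, 3, 3])
print(same.bias.shape)    # torch.Size([4])

# Re-initializing the parameters from Gaussian random numbers:
nn.init.normal_(same.weight, mean=0.0, std=0.01)
nn.init.zeros_(same.bias)
```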
CNN – Components
• Concatenation: concatenating features from different convolutional layers (sketched below).
• Up/down sampling: encoding and decoding.
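A sketch of concatenating feature maps from two different convolutional layers along the channel dimension, followed by down-sampling (encoding) and up-sampling (decoding); the layer sizes are illustrative assumptions (PyTorch):

```python
import torch
import torch.nn as nn

x = torch.randn(1, 3, 32, 32)
conv_a = nn.Conv2d(3, 8, kernel_size=3, padding=1)
conv_b = nn.Conv2d(8, 16, kernel_size=3, padding=1)

fa = conv_a(x)    # 8 feature maps, 32x32
fb = conv_b(fa)   # 16 feature maps, 32x32

# Concatenate the features from the two layers along the channel dimension
combined = torch.cat([fa, fb], dim=1)
print(combined.shape)  # torch.Size([1, 24, 32, 32])

# Down-sampling (encoding) and up-sampling (decoding) of the combined features
down = nn.MaxPool2d(kernel_size=2)(combined)            # 16x16
up = nn.Upsample(scale_factor=2, mode='nearest')(down)  # back to 32x32
print(down.shape, up.shape)
```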
CNN – Training
• Training a CNN:
- estimating the optimal weights and biases;
- calculating the cost function.
- In training a CNN, given data-label pairs, the network learns to generalize the relationship between the data and the labels.
CNN – Training
• After training, the network is tested or validated on a set of data it has never seen before (i.e., data that is not part of the training set).
• This validation accuracy shows how well the network has learned to generalize through training.
CNN – Training
• Training of a CNN is done through the backpropagation (BP) algorithm.
• Backpropagation:
- calculates the weights and biases that best fit the model;
- is the method of updating the weights and biases of the network to minimize the error during training;
- computes the error gradient using the chain rule of derivatives (a minimal training loop is sketched below).
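A minimal sketch of one backpropagation update in PyTorch; the model, dummy data, optimizer, and learning rate are illustrative assumptions:

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
                      nn.MaxPool2d(2), nn.Flatten(),
                      nn.Linear(8 * 14 * 14, 10))
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

images = torch.randn(4, 1, 28, 28)  # dummy data-label pairs
labels = torch.randint(0, 10, (4,))

outputs = model(images)             # forward pass
loss = loss_fn(outputs, labels)     # error for this batch
optimizer.zero_grad()
loss.backward()                     # backpropagation: gradients via the chain rule
optimizer.step()                    # update weights and biases to reduce the error
print(loss.item())
```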
CNN – Training
• Problem with BP: the vanishing gradient problem.
• When we reach deeper networks, we see that the layers begin to train slower and slower, quickly rendering them useless.
• Why is this the case?
• We know that once a sigmoid node saturates, its derivative drops off toward zero very quickly.
CNN – Training
• As these small derivatives are multiplied together during backpropagation, the gradients reaching the earlier layers become lower and lower, so those layers learn much more slowly. This means we cannot make the network too deep without countermeasures.
CNN – Training
• Tackling the vanishing gradient problem:
1. Activation function
2. Batch normalization
CNN – Training
• Activation function:
- One way to mitigate the vanishing gradient problem is to modify the activation function, which changes the derivative (e.g., replacing sigmoid with ReLU, as sketched below).
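A short sketch of both remedies: the sigmoid derivative vanishes for saturated inputs, ReLU keeps a derivative of 1 for positive inputs, and batch normalization keeps layer inputs in a well-scaled range. The block sizes are illustrative assumptions (PyTorch):

```python
import torch
import torch.nn as nn

# Sigmoid saturates: its derivative is near zero for inputs of large magnitude
x = torch.tensor([-10.0, 0.0, 10.0], requires_grad=True)
torch.sigmoid(x).sum().backward()
print(x.grad)  # roughly [0.00005, 0.25, 0.00005] -> gradient vanishes at the extremes

# A Conv + BatchNorm + ReLU block as commonly used inside deep CNNs
block = nn.Sequential(
    nn.Conv2d(8, 8, kernel_size=3, padding=1),
    nn.BatchNorm2d(8),  # normalizes each feature map over the batch
    nn.ReLU(),          # derivative is 1 for positive inputs, so gradients do not shrink
)
print(block(torch.randn(4, 8, 16, 16)).shape)  # torch.Size([4, 8, 16, 16])
```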
CNN – Training
• Loss function: the error for a single training example (a single CNN iteration).
• Example 1: Regression loss functions
- Squared error loss (L2)
- Absolute error loss (L1)
- Huber loss
CNN – Training
• Example 2: Binary classification loss functions
- Binary cross-entropy loss
- Hinge loss
CNN – Training
• Example 3: Multi-class classification loss functions
- Multi-class cross-entropy loss
- KL-divergence
(All of these losses are sketched below.)
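A minimal sketch of the named losses in PyTorch; the tensor values and shapes are illustrative assumptions:

```python
import torch
import torch.nn as nn

# Regression losses
pred = torch.tensor([2.5, 0.0])
target = torch.tensor([3.0, -0.5])
print(nn.MSELoss()(pred, target))    # squared error loss (L2)
print(nn.L1Loss()(pred, target))     # absolute error loss (L1)
print(nn.HuberLoss()(pred, target))  # Huber loss

# Binary classification losses
logit = torch.tensor([0.8])
label = torch.tensor([1.0])
print(nn.BCEWithLogitsLoss()(logit, label))      # binary cross-entropy (on a raw logit)
y = torch.tensor([1.0])                          # label in {-1, +1}
print(torch.clamp(1 - y * logit, min=0).mean())  # hinge loss: max(0, 1 - y*f(x))

# Multi-class classification losses
logits = torch.randn(4, 10)
classes = torch.randint(0, 10, (4,))
print(nn.CrossEntropyLoss()(logits, classes))    # multi-class cross-entropy
log_p = torch.log_softmax(logits, dim=1)
q = torch.softmax(torch.randn(4, 10), dim=1)
print(nn.KLDivLoss(reduction='batchmean')(log_p, q))  # KL-divergence
```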
CNN – Training
• Cost function:
- For the squared error loss, the cost is the mean of these squared errors (MSE).
- In general, the cost function is the average loss over the entire training dataset.
- The optimization strategies aim at minimizing the cost function (written out below).
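Written out (the symbols N, L, y_i, and ŷ_i are my notation, not from the slides), the cost is the average of the per-example losses; with the squared error loss it becomes the MSE:

```latex
J(\theta) = \frac{1}{N}\sum_{i=1}^{N} L\left(y_i, \hat{y}_i\right),
\qquad
\text{MSE: } J(\theta) = \frac{1}{N}\sum_{i=1}^{N}\left(y_i - \hat{y}_i\right)^{2}
```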
CNN – Recent Architectures
[Tab. 1 to Tab. 4: tables of recent CNN architectures; table contents not recoverable from the slides]
Thank You