
Deep Learning Specialization

Coursera


These tasks were created as part of Coursera's Deep Learning Specialization course.


The Deep Learning Specialization is a foundational program that teaches the capabilities, challenges, and consequences of deep learning, and prepares learners to participate in the development of leading-edge AI technology. In this Specialization, I built and trained neural network architectures such as Convolutional Neural Networks, Recurrent Neural Networks, LSTMs, and Transformers, and learned how to improve them with strategies such as Dropout, BatchNorm, and Xavier/He initialization. Theoretical concepts and their industry applications are taught using Python and TensorFlow. The assignments tackle real-world cases such as speech recognition, music synthesis, chatbots, machine translation, and other natural language processing tasks.

Assignments

Building a Deep Neural Network Step by Step

notebook | py file

Build a deep neural network from scratch: implement all the functions required to build a deep neural network, use non-linear units like ReLU to improve the model, build a deeper network (with more than one hidden layer), and wrap everything in an easy-to-use neural network class.
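As a sketch, the forward pass of such a network can be written in a few lines of numpy (function and parameter names here are illustrative, not the assignment's exact API):

```python
import numpy as np

def relu(z):
    """Element-wise ReLU activation."""
    return np.maximum(0, z)

def sigmoid(z):
    """Element-wise sigmoid, used in the output layer for binary classification."""
    return 1 / (1 + np.exp(-z))

def forward(X, params):
    """Forward pass through an L-layer network:
    [LINEAR -> RELU] x (L-1) -> LINEAR -> SIGMOID.
    params holds W1, b1, ..., WL, bL; X has shape (n_features, n_examples)."""
    A = X
    L = len(params) // 2
    for l in range(1, L):
        A = relu(params[f"W{l}"] @ A + params[f"b{l}"])
    return sigmoid(params[f"W{L}"] @ A + params[f"b{L}"])
```

The same loop structure carries over to backpropagation, which walks the layers in reverse using values cached during this pass.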

Gradient Checking
notebook | py file

Implement and use gradient checking to verify that the backpropagation implementation is correct.
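Gradient checking compares the analytic gradient with a centered finite-difference approximation of the cost; a minimal numpy sketch:

```python
import numpy as np

def grad_check(f, grad_f, theta, eps=1e-7):
    """Compare an analytic gradient grad_f(theta) against the centered
    difference (f(theta + eps) - f(theta - eps)) / (2 * eps), one
    component at a time. Returns the relative difference; values below
    roughly 1e-7 suggest the backprop implementation is correct."""
    approx = np.zeros_like(theta)
    for i in range(theta.size):
        plus = theta.copy();  plus[i] += eps
        minus = theta.copy(); minus[i] -= eps
        approx[i] = (f(plus) - f(minus)) / (2 * eps)
    grad = grad_f(theta)
    return np.linalg.norm(grad - approx) / (np.linalg.norm(grad) + np.linalg.norm(approx))
```

Because it evaluates the cost twice per parameter, gradient checking is far too slow for training; it is used only as a one-off correctness test.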

Deep Neural Network for Image Classification Application
notebook | py file

Build a deep neural network and apply it to a supervised learning problem: cat vs. non-cat image classification.

Optimization Methods
notebook | py file

More advanced optimization methods can speed up learning and perhaps even reach a better final value of the cost function; a good optimization algorithm can be the difference between waiting days and just a few hours for a good result. Covers Stochastic Gradient Descent, Mini-Batch Gradient Descent, Momentum, and Adam.
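As an illustration, a single Adam update, which combines a momentum-style first-moment average with an RMS second-moment average, can be sketched in numpy (the function signature is illustrative):

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a single parameter array.
    m, v are the running first and second moments; t is the step count
    (starting at 1), used for bias correction."""
    m = beta1 * m + (1 - beta1) * grad          # momentum-style average of gradients
    v = beta2 * v + (1 - beta2) * grad ** 2     # average of squared gradients
    m_hat = m / (1 - beta1 ** t)                # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)                # bias-corrected second moment
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v
```

Momentum alone is the special case of keeping only the `m` average and stepping with `lr * m_hat`.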

Regularization
notebook | py file

Deep learning models have so much flexibility and capacity that overfitting can be a serious problem if the training dataset is not big enough: the learned network may do well on the training set but fail to generalize to new examples it has never seen. Covers regularization methods for deep learning models: L2 regularization and Dropout.
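Both methods fit in a few lines of numpy (names are illustrative): inverted dropout zeroes units at random and rescales the survivors, and L2 regularization adds a weight penalty to the cost.

```python
import numpy as np

def dropout_forward(A, keep_prob, rng):
    """Inverted dropout: zero each unit with probability 1 - keep_prob,
    then divide by keep_prob so the expected activation is unchanged."""
    mask = (rng.random(A.shape) < keep_prob).astype(A.dtype)
    return A * mask / keep_prob, mask  # mask is reused in backprop

def l2_cost_term(params, lambd, m):
    """L2 penalty added to the cross-entropy cost:
    (lambda / 2m) * sum of squared weights over all W matrices."""
    return (lambd / (2 * m)) * sum(
        np.sum(W ** 2) for k, W in params.items() if k.startswith("W"))
```

At test time dropout is switched off entirely; thanks to the `keep_prob` rescaling, no further correction is needed.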

Autonomous Driving Application - Car Detection using YOLO
notebook | py file

Object detection with the powerful YOLO model: apply it to a car detection dataset and work with bounding boxes.

TensorFlow Tutorial
notebook | py file

Working with TensorFlow: initialize variables, start your own session, train algorithms, and implement a neural network.

Art Generation with Neural Style Transfer
notebook | py file

Implement the Neural Style Transfer algorithm, introduced by Gatys et al. (2015), and generate novel artistic images. Most algorithms optimize a cost function to get a set of parameter values; in Neural Style Transfer, the algorithm optimizes a cost function to get pixel values.
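The style part of that cost function is built from Gram matrices of layer activations; a minimal numpy sketch of the per-layer style cost, assuming activations have already been unrolled to shape (channels, positions):

```python
import numpy as np

def gram_matrix(A):
    """Gram matrix of unrolled activations A (n_C x n_H*n_W): captures
    correlations between feature maps, which encode the image's style."""
    return A @ A.T

def style_layer_cost(a_S, a_G):
    """Style cost for one layer: squared Frobenius distance between the
    Gram matrices of the style image (a_S) and generated image (a_G),
    normalized by (2 * n_C * n_H * n_W)^2."""
    n_C, n_HW = a_S.shape
    GS, GG = gram_matrix(a_S), gram_matrix(a_G)
    return np.sum((GS - GG) ** 2) / (2 * n_C * n_HW) ** 2
```

The total cost adds a content term (a plain squared distance between activations) and weights the style cost across several layers.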


Character Level Language Model - Dinosaurus Names, Writing like Shakespeare
notebook | py file

Build a character-level language model to generate new dinosaur names: the model learns the patterns in existing names and randomly generates new ones. Store text data for processing with an RNN, synthesize data by sampling predictions at each time step and passing them to the next RNN cell, and build a character-level text-generation Recurrent Neural Network.
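The sampling step is what makes the generated names vary: instead of always taking the argmax, the model draws each next character from the softmax distribution. A minimal numpy sketch (names are illustrative):

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

def sample_char(probs, rng):
    """Draw the next character index from the model's output
    distribution, so repeated runs produce different names."""
    return int(rng.choice(len(probs), p=probs))
```

The sampled index is converted back to a character, fed in as the next input, and the loop stops at a newline (the end-of-name token).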

Face Recognition, Face Verification and Triplet Loss Function
notebook | py file

A face recognition system. Face recognition problems commonly fall into two categories:

  • Face Verification - a 1:1 matching problem: "Is this the claimed person?" For example, a mobile phone that unlocks using your face is performing face verification.

  • Face Recognition - a 1:K matching problem: "Who is this person?"

Implement the triplet loss function, use a pretrained model to map face images into 128-dimensional encodings, and use these encodings to perform face verification and face recognition.
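The triplet loss itself is short enough to sketch in numpy (encodings are rows; alpha is the margin):

```python
import numpy as np

def triplet_loss(anchor, positive, negative, alpha=0.2):
    """Triplet loss over batches of encodings: pull each anchor toward
    its positive (same person) and push it at least alpha farther from
    its negative (different person).
    loss = sum over triplets of max(||a - p||^2 - ||a - n||^2 + alpha, 0)."""
    pos_dist = np.sum((anchor - positive) ** 2, axis=-1)
    neg_dist = np.sum((anchor - negative) ** 2, axis=-1)
    return float(np.sum(np.maximum(pos_dist - neg_dist + alpha, 0.0)))
```

Once trained, verification is a single distance-threshold test between two encodings, and recognition is a nearest-neighbor search over a database of K encodings.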

Building a Recurrent Neural Network Step by Step
notebook | py file

Implement key components of a Recurrent Neural Network in numpy. Recurrent Neural Networks (RNN) are very effective for Natural Language Processing and other sequence tasks because they have "memory". They can read inputs x⟨t⟩ (such as words) one at a time, and remember some information/context through the hidden layer activations that get passed from one time-step to the next. This allows a unidirectional RNN to take information from the past to process later inputs. A bidirectional RNN can take context from both the past and the future.
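A single RNN time step can be sketched in numpy (the weight names follow the usual a/x/y convention: Wax maps input to hidden, Waa hidden to hidden, Wya hidden to output):

```python
import numpy as np

def softmax(z):
    """Column-wise softmax over output scores."""
    e = np.exp(z - z.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

def rnn_cell_step(xt, a_prev, Wax, Waa, Wya, ba, by):
    """One RNN time step: the new hidden state a_next mixes the current
    input xt with the previous state a_prev (the network's 'memory'),
    and y_pred is the output distribution at this step."""
    a_next = np.tanh(Wax @ xt + Waa @ a_prev + ba)
    y_pred = softmax(Wya @ a_next + by)
    return a_next, y_pred
```

Running a full forward pass is just this step in a loop over t, threading `a_next` from one call into the next as `a_prev`.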

Convolutional Model Application - SIGNS Dataset - Classifying Hand-Sign Images as Numbers
notebook | py file

Implement a fully functioning ConvNet using TensorFlow: build and train a ConvNet for a classification problem. The model recognizes sign language; the SIGNS dataset is a collection of six signs representing the numbers 0 to 5.


Emojify - Word Vector Representations
notebook | py file

Implement a model that inputs a sentence (such as "Let's go see the baseball game tonight!") and finds the most appropriate emoji for it (⚾️). Start with a baseline model (Emojifier-V1) using word embeddings, then implement a more sophisticated model (Emojifier-V2) that further incorporates an LSTM. For example, rather than writing "Congratulations on the promotion! Let's get coffee and talk. Love you!", the emojifier can automatically turn this into "Congratulations on the promotion! 👍 Let's get coffee and talk. ☕️ Love you! ❤️".

Operations on Word Vectors - Word Embedding
notebook | py file

Load pre-trained word vectors, and measure similarity using cosine similarity. Use word embeddings to solve word analogy problems such as "Man is to Woman as King is to ___". Modify word embeddings to reduce their gender bias.
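Cosine similarity and the analogy task can be sketched in numpy; the tiny embedding dictionary in the test is made up for illustration, not from the assignment's GloVe vectors:

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine of the angle between two word vectors:
    1 for parallel, 0 for orthogonal, -1 for opposite directions."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def analogy(a, b, c, embeddings):
    """Solve 'a is to b as c is to ___' by finding the word whose
    embedding is most similar to e_b - e_a + e_c."""
    target = embeddings[b] - embeddings[a] + embeddings[c]
    return max((w for w in embeddings if w not in (a, b, c)),
               key=lambda w: cosine_similarity(embeddings[w], target))
```

Debiasing works in the same vector space: project an embedding onto a gender-direction vector and subtract (or equalize) that component.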

Improvise a Jazz Solo with an LSTM Network
notebook | py file

Implement a model that uses an LSTM to generate music, and apply it to produce novel jazz solos with deep learning.

Neural Machine Translation with Attention
notebook | py file

Build a Neural Machine Translation (NMT) model to translate human-readable dates ("25th of June, 2009") into machine-readable dates ("2009-06-25") using an attention model, one of the most sophisticated sequence-to-sequence models.
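The core of the attention mechanism, turning alignment energies into a context vector for one output step, can be sketched in numpy (names are illustrative; in the assignment the energies come from a small learned network over the decoder state and encoder activations):

```python
import numpy as np

def attention_context(energies, a):
    """One attention step: softmax the alignment energies into weights
    alphas (how much to 'attend' to each input position), then take the
    weighted sum of encoder activations a (shape: positions x features)."""
    e = np.exp(energies - energies.max())
    alphas = e / e.sum()
    context = (alphas[:, None] * a).sum(axis=0)
    return context, alphas
```

The decoder consumes one such context vector per output character, which lets it focus on the relevant part of the input date at each step.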
