##### Sigmoid & Sigmoid derivative

The sigmoid is a logistic function with an ‘S’-shaped curve that maps any real-valued number to a value between 0 and 1.
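As a quick sketch, the sigmoid and its derivative can be written in a few lines of NumPy (the function names here are just illustrative):

```python
import numpy as np

def sigmoid(z):
    # Maps any real value (or array of values) into (0, 1).
    return 1 / (1 + np.exp(-z))

def sigmoid_derivative(z):
    # The derivative simplifies to s * (1 - s), which is
    # what makes sigmoid convenient in backpropagation.
    s = sigmoid(z)
    return s * (1 - s)
```

For example, `sigmoid(0)` is exactly 0.5, and the derivative peaks there at 0.25.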

##### Reshaping, Normalization, Softmax

In this tutorial we'll learn how to reshape arrays, normalize rows, and apply softmax, and what broadcasting is. All of these techniques are extremely useful in machine learning.
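A minimal sketch of row normalization and softmax in NumPy; the `keepdims=True` argument is what lets broadcasting stretch the per-row results back across the columns:

```python
import numpy as np

def normalize_rows(x):
    # Divide each row by its L2 norm; norms has shape (n, 1),
    # so broadcasting applies it across every column of x.
    norms = np.linalg.norm(x, axis=1, keepdims=True)
    return x / norms

def softmax(x):
    # Row-wise softmax; subtracting the row maximum first
    # keeps np.exp from overflowing on large inputs.
    e = np.exp(x - np.max(x, axis=1, keepdims=True))
    return e / np.sum(e, axis=1, keepdims=True)
```

After `normalize_rows`, every row has unit length; after `softmax`, every row sums to 1.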

##### Vectorization with python

When writing machine learning functions, we must make sure our code is computationally efficient, so we always use vectorization.
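To illustrate the idea, here is the same dot product computed with an explicit Python loop and with a single vectorized NumPy call; the results agree, but the vectorized version runs in optimized C rather than one interpreted iteration per element:

```python
import numpy as np

a = np.random.rand(1_000_000)
b = np.random.rand(1_000_000)

# Loop version: one interpreted multiply-add per element.
dot_loop = 0.0
for i in range(len(a)):
    dot_loop += a[i] * b[i]

# Vectorized version: one call into optimized native code.
dot_vec = np.dot(a, b)
```

On arrays of this size the vectorized call is typically orders of magnitude faster.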

##### Data preparation

Logistic regression is a binary classification method. In this full tutorial, we will start writing an algorithm that can predict the correct animal in a given picture.
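A common preparation step for image data is to flatten each image into a single column and scale the pixel values. As a sketch with a hypothetical batch of `m` RGB images of size 64×64 (the shapes here are assumptions, not fixed by the tutorial):

```python
import numpy as np

# Hypothetical batch: m images, 64x64 pixels, 3 color channels.
m = 10
images = np.random.randint(0, 256, size=(m, 64, 64, 3))

# Flatten each image into one column vector and scale pixels to [0, 1].
X = images.reshape(m, -1).T / 255.0
# X now has shape (64*64*3, m) = (12288, m): one column per example.
```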

##### Architecture of the learning algorithm

In this part we'll build Logistic Regression using a Neural Network mindset. We'll see that it is actually a very simple Neural Network model.
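Seen as a network, logistic regression is a single neuron: a linear step followed by a sigmoid activation. A minimal sketch of that architecture (function names are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def initialize(dim):
    # One "neuron": a weight vector and a scalar bias, both zero.
    return np.zeros((dim, 1)), 0.0

def forward(w, b, X):
    # A = sigmoid(w^T X + b): the activation of the single output unit,
    # computed for all m examples (columns of X) at once.
    return sigmoid(np.dot(w.T, X) + b)
```

With zero-initialized parameters, every activation starts at 0.5, i.e. the model begins with no preference for either class.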

##### Cost optimization function

In this tutorial we will learn how to update the learning parameters using gradient descent, with the gradients computed by forward and backward propagation.
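A sketch of one possible implementation: `propagate` runs the forward pass to get the cost and the backward pass to get the gradients, and `optimize` repeatedly applies the gradient descent update rule (the function names are assumptions for illustration):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def propagate(w, b, X, Y):
    m = X.shape[1]
    A = sigmoid(np.dot(w.T, X) + b)                           # forward pass
    cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))  # cross-entropy
    dw = np.dot(X, (A - Y).T) / m                             # backward pass
    db = np.mean(A - Y)
    return dw, db, cost

def optimize(w, b, X, Y, num_iterations, learning_rate):
    for _ in range(num_iterations):
        dw, db, cost = propagate(w, b, X, Y)
        w = w - learning_rate * dw          # gradient descent update
        b = b - learning_rate * db
    return w, b
```

Each iteration moves the parameters a small step against the gradient, so the cost should decrease over time.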

##### Prediction function

In this tutorial we will implement the predict() function, using the learned parameters w and b to predict the labels for our dataset X.
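One way to sketch it: compute the activations and threshold them at 0.5 to turn probabilities into hard 0/1 labels:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def predict(w, b, X):
    # Probabilities for each example, then a hard 0/1 decision
    # at the 0.5 threshold.
    A = sigmoid(np.dot(w.T, X) + b)
    return (A > 0.5).astype(int)
```

For instance, with `w = [[1.0]]` and `b = 0`, an input of -2 yields a probability near 0.12 (label 0) and an input of 2 yields a probability near 0.88 (label 1).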

##### Final cats vs dogs model

In this tutorial you will see how the overall model is structured by putting all the building blocks together in the right order.
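As a compact end-to-end sketch (not the tutorial's exact code), the building blocks combine in the order initialize → optimize → predict:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def model(X_train, Y_train, num_iterations=2000, learning_rate=0.5):
    # Initialize: one weight per feature, scalar bias.
    w = np.zeros((X_train.shape[0], 1))
    b = 0.0
    m = X_train.shape[1]
    # Optimize: repeat forward pass, backward pass, parameter update.
    for _ in range(num_iterations):
        A = sigmoid(np.dot(w.T, X_train) + b)        # forward
        dw = np.dot(X_train, (A - Y_train).T) / m    # backward
        db = np.mean(A - Y_train)
        w -= learning_rate * dw                      # update
        b -= learning_rate * db
    # Predict: threshold the trained activations at 0.5.
    predictions = (sigmoid(np.dot(w.T, X_train) + b) > 0.5).astype(int)
    return w, b, predictions
```

On a small linearly separable toy dataset, this model quickly learns to classify every training example correctly.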

##### Best choice of learning rate

In order for Gradient Descent to work, we must choose the learning rate wisely. In this part you will see how the learning rate determines how rapidly we update the parameters.
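The effect is easy to demonstrate: train the same tiny logistic model with different learning rates and compare the final cost. This is an illustrative sketch on made-up data, not the tutorial's exact experiment:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def final_cost(learning_rate, num_iterations=200):
    # Train a one-feature logistic model on a toy dataset and
    # report the final cross-entropy cost.
    X = np.array([[-2.0, -1.0, 1.0, 2.0]])
    Y = np.array([[0, 0, 1, 1]])
    w, b, m = np.zeros((1, 1)), 0.0, X.shape[1]
    for _ in range(num_iterations):
        A = sigmoid(np.dot(w.T, X) + b)
        w -= learning_rate * np.dot(X, (A - Y).T) / m
        b -= learning_rate * np.mean(A - Y)
    A = sigmoid(np.dot(w.T, X) + b)
    return float(-np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A)))
```

With a very small learning rate the cost barely moves from its starting value, while a well-chosen rate drives it much lower in the same number of iterations; a rate that is too large can overshoot and diverge instead.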