**A hyperparameter is a parameter or variable that we need to set before applying a machine learning algorithm to a dataset.**

These parameters express "high-level" properties of the model, such as its complexity or how fast it should learn. Hyperparameters are usually fixed before the actual training process begins.

#### Hyperparameters can be divided into two categories:

#### 1. Optimizer Hyperparameters

These are the variables or parameters related more to the optimization and training process than to the model itself. These parameters let you tune the training process before it actually starts, so that it begins from the right place.

- **Learning Rate:** a hyperparameter that controls how much we adjust the weights of our neural network with respect to the gradient.
- **Mini-Batch Size:** a hyperparameter that affects the resource requirements of training and also impacts training speed and the number of iterations.
- **Number of Training Iterations or Epochs:** an epoch is one complete pass through the training data; this hyperparameter sets how many such passes are made.
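To make these three hyperparameters concrete, here is a minimal sketch of mini-batch gradient descent fitting a one-parameter line. The dataset, hyperparameter values, and variable names are all illustrative assumptions, not from the post:

```python
import random

# Toy dataset: y = 3x exactly, so the true slope is 3.0.
data = [(x, 3.0 * x) for x in range(1, 21)]

# Optimizer hyperparameters (illustrative values):
learning_rate = 0.005   # how far each gradient step moves the weight
batch_size = 4          # examples used per gradient estimate
epochs = 30             # complete passes over the training data

w = 0.0                 # the single model parameter we are learning
random.seed(0)
for epoch in range(epochs):
    random.shuffle(data)
    # One epoch = one full pass, split into mini-batches.
    for i in range(0, len(data), batch_size):
        batch = data[i:i + batch_size]
        # Gradient of mean squared error 0.5*(w*x - y)^2 w.r.t. w
        grad = sum((w * x - y) * x for x, y in batch) / len(batch)
        w -= learning_rate * grad

print(round(w, 2))  # w should approach the true slope 3.0
```

Raising the learning rate too far makes the updates overshoot and diverge, while a tiny learning rate needs many more epochs, which is exactly the trade-off these hyperparameters control.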

#### 2. Model Hyperparameters

These are the variables more involved in the architecture or structure of the model. They let you define your model's complexity on the basis of:

- **Number of Layers:** a **layer** is a general term for a collection of 'nodes' operating together at a specific depth within a **neural network**, such as the input layer, hidden layers, and output layer.
- **Hidden Units:** the number of nodes in each hidden layer. The more **hidden layers** and hidden units a model has, the more complex it is and the more learning capacity it gains.
- **Model-specific parameters for architectures like RNNs:** choosing a cell type (LSTM cells, vanilla RNN cells, or GRU cells) and how deep the model is.
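As a minimal sketch of how these model hyperparameters shape an architecture, the snippet below builds a tiny feed-forward network from a layer count and a hidden-unit count, using only plain Python. All names and values here are illustrative assumptions:

```python
import random

# Model hyperparameters (illustrative values):
input_size = 4         # features per input example
hidden_units = 8       # nodes in each hidden layer
num_hidden_layers = 2  # depth: more layers => more capacity/complexity
output_size = 1

random.seed(0)

def make_layer(n_in, n_out):
    """One layer: a weight matrix (n_out x n_in) plus a bias per node."""
    weights = [[random.uniform(-0.5, 0.5) for _ in range(n_in)]
               for _ in range(n_out)]
    return weights, [0.0] * n_out

# The layer stack follows directly from the hyperparameters above.
sizes = [input_size] + [hidden_units] * num_hidden_layers + [output_size]
layers = [make_layer(sizes[i], sizes[i + 1]) for i in range(len(sizes) - 1)]

def forward(x):
    """Pass an input vector through every layer, with ReLU in between."""
    for idx, (weights, biases) in enumerate(layers):
        x = [sum(w * v for w, v in zip(row, x)) + b
             for row, b in zip(weights, biases)]
        if idx < len(layers) - 1:           # ReLU on hidden layers only
            x = [max(0.0, v) for v in x]
    return x

print(len(layers))                          # → 3 weight layers (2 hidden + output)
print(len(forward([1.0, 2.0, 3.0, 4.0])))   # → 1 output value
```

Changing `num_hidden_layers` or `hidden_units` changes only these two numbers, which is why they are set before training rather than learned from the data.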

I hope this gave you an idea of hyperparameters in machine learning, and if you found this post informative, do share it with the community.

To become a part of the MLAIT community and get started, see this post and join us here.

Thank you, everyone, and stay connected!