Artificial Intelligence and Data Science Enthusiast. Updating Neural Network parameters since 2002.

A Decision Tree is a machine learning algorithm that can be used for both classification and regression (*in the regression case, they are called Regression Trees*). This blog concentrates on decision trees for classification.

A Decision Tree is similar to a tree in computer science: it has a hierarchical structure of nodes connected by edges. A decision tree classifies data by asking a question at each node (*in a typical setup, if the answer is yes, go to the right child; if not, go to the left child*).
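As a minimal sketch of such a classifier (assuming scikit-learn is available; the dataset and tree depth are illustrative choices):

```python
# A minimal decision-tree classification sketch (assumes scikit-learn).
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Each internal node of the fitted tree asks a yes/no question
# about one feature (e.g. "is petal length <= 2.45 cm?").
clf = DecisionTreeClassifier(max_depth=3, random_state=0)
clf.fit(X, y)

print(clf.predict(X[:2]))  # predicted class labels for the first two samples
```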

Feature Scaling is a pre-processing technique used to bring all the columns or features of the data to the same scale. This is done for various reasons.

It is done for algorithms that involve gradient descent, and also for distance-based algorithms like K-Means clustering and K-Nearest Neighbors.

Let’s consider an example of Linear Regression to understand why feature scaling is required for algorithms that use gradient descent.
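The standardization step itself can be sketched as follows (the values are made up for illustration; libraries like scikit-learn provide `StandardScaler` for exactly this):

```python
# Standardization sketch: rescale each feature to zero mean and unit variance.
import numpy as np

X = np.array([[1.0, 2000.0],
              [2.0, 3000.0],
              [3.0, 4000.0]])  # two features on very different scales

# Subtract each column's mean and divide by its standard deviation.
X_scaled = (X - X.mean(axis=0)) / X.std(axis=0)

print(X_scaled.mean(axis=0))  # ~[0, 0]
print(X_scaled.std(axis=0))   # [1, 1]
```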

In general, Supervised Learning consists of 2 types of problem settings.

**Regression**: It is the type of problem where the data scientist models the relationship between the independent variables and the **continuous dependent variable** using a suitable model, and uses that model to give accurate predictions for future input data.

For example,

Predicting the sales of Ice cream on a given day, given the temperature.

Here, Sales of Ice Cream is a continuous variable, which means it can take any value (*e.g. 500, 100, 10,000, 52,123, 931, etc.*).

**Classification:** It is a type of problem where the data scientist has…
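The ice-cream example above can be sketched numerically (all figures here are hypothetical, purely for illustration):

```python
# Regression sketch: predict ice-cream sales (continuous) from temperature.
import numpy as np

temp = np.array([20.0, 25.0, 30.0, 35.0])       # independent variable (deg C)
sales = np.array([200.0, 350.0, 500.0, 650.0])  # continuous dependent variable

# Fit a line  sales ~ slope * temp + intercept  by least squares.
slope, intercept = np.polyfit(temp, sales, 1)

print(slope * 28.0 + intercept)  # predicted sales at 28 deg C (~440)
```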

In a Supervised Machine Learning problem, we usually train the model on a dataset and use the trained model to predict the target, given new predictor values.

But how do we know if the model we have trained on the dataset will produce effective and accurate results on new input data?

We cannot conclude that the model has performed well based on error rates or certain statistical measures *(such as the R-squared statistic)* computed on the very dataset on which the model was trained.

The main problem is that there is no way of knowing if the model…
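One common remedy for this is to hold out part of the data and score the model only on that unseen portion. A minimal sketch (assuming scikit-learn; the model and split ratio are illustrative choices):

```python
# Hold-out evaluation sketch: score the model on data it was NOT trained on,
# rather than on the training set itself.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(model.score(X_test, y_test))  # accuracy on unseen data
```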

The Synergy Effect, or Interaction Effect, is a phenomenon that arises in the multiple linear regression setting in machine learning, when an increase in the value of one independent variable increases the impact of another independent variable on the dependent variable.

It’s okay if the above statement is not easily understandable. Let’s look at an example. *Note: you should be familiar with multiple linear regression to understand this.*

Let’s consider a dataset with a company’s advertising budget for different categories, along with the company’s sales. So, the 3 columns of the dataset are **Radio Advertising**, **Newspaper Advertising** and **Sales**. Let’s…
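The idea can be sketched with synthetic data (the coefficients below are invented for illustration): if sales truly depend on the *product* of the two budgets, adding that product as an extra feature lets plain linear regression capture the interaction.

```python
# Interaction-effect sketch with synthetic data: sales depend not only on the
# radio and newspaper budgets individually, but also on their product.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
radio = rng.uniform(0, 10, 200)
news = rng.uniform(0, 10, 200)
sales = 2.0 * radio + 3.0 * news + 0.5 * radio * news  # true model w/ interaction

# Include the interaction term radio*news as a third feature.
X = np.column_stack([radio, news, radio * news])
model = LinearRegression().fit(X, sales)

print(model.coef_)  # recovers ~[2.0, 3.0, 0.5]
```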

When it comes to predictive modeling, a single algorithmic model might not be enough to make the most optimal predictions.

One of the most effective methodologies in machine learning is **Ensemble Modeling** or **Ensembles**.

Ensemble Modeling is the combination of multiple machine learning models, following the same or different algorithms, to make better overall predictions.
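As a minimal sketch of the idea (assuming scikit-learn; the three base models and the dataset are illustrative choices), several different algorithms can simply vote on the final prediction:

```python
# Minimal ensemble sketch: combine three different algorithms
# and let them vote on the final class label.
from sklearn.datasets import load_iris
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

ensemble = VotingClassifier(estimators=[
    ("lr", LogisticRegression(max_iter=1000)),
    ("tree", DecisionTreeClassifier(random_state=0)),
    ("knn", KNeighborsClassifier()),
])
ensemble.fit(X, y)
print(ensemble.score(X, y))  # accuracy of the majority vote
```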

It is usually these types of models that win the machine learning competitions conducted by *Netflix* or *Kaggle*.

Ensemble modeling methods can be split into various categories.

They are:

**Sequential**

**Probability Distribution:** A probability distribution shows the list of probabilities associated with each value, or each range of values, of a discrete or continuous random variable.

Based on the type of random variable, probability distributions can be split into **Continuous Probability Distributions** and **Discrete Probability Distributions**.
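A discrete example can be built from first principles with only the standard library (here the Binomial(n=4, p=0.5) distribution, chosen purely for illustration):

```python
# Discrete probability distribution sketch: the Binomial(n=4, p=0.5) pmf,
# i.e. P(k heads in 4 fair coin flips) for each possible k.
from math import comb

n, p = 4, 0.5
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

print(pmf)                # probability associated with each value k = 0..4
print(sum(pmf.values()))  # -> 1.0 (probabilities always sum to 1)
```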

A Convolutional Neural Network is a type of neural network that is mostly used for image datasets. The shared-parameters feature of a convolutional network reduces the number of parameters in the model, and also makes it efficient at detecting patterns such as **edges**. This blog post contains the implementation of Convolutional Neural Networks with TensorFlow 2.0 on the Cats vs Dogs dataset.
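To see concretely what parameter sharing and edge detection mean, here is a hand-rolled convolution on a toy 5×5 image (the image and kernel values are illustrative): the *same* 3×3 kernel slides over every position.

```python
# Sketch of how one shared convolution kernel detects vertical edges.
import numpy as np

image = np.zeros((5, 5))
image[:, 2:] = 1.0  # dark left half, bright right half -> one vertical edge

kernel = np.array([[1.0, 0.0, -1.0],
                   [1.0, 0.0, -1.0],
                   [1.0, 0.0, -1.0]])  # classic vertical edge-detector weights

# Slide the SAME 9 weights over every 3x3 patch (parameter sharing).
out = np.zeros((3, 3))
for i in range(3):
    for j in range(3):
        out[i, j] = np.sum(image[i:i+3, j:j+3] * kernel)

print(out)  # strong responses only in patches that contain the edge
```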

The Cats vs Dogs dataset is one of the most famous datasets used in the field of machine learning; it was developed through a partnership between **Petfinder.com** and **Microsoft**.

*The Dataset is available at: https://www.microsoft.com/en-us/download/details.aspx?id=54765*

*Note: The code files will be available at: https://github.com/ashwinhprasad/Tensorflow-2.0/blob/master/TF2-(4)-ANN/TF2-ANNs.ipynb*
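A minimal tf.keras model for this kind of binary cats-vs-dogs task could look like the sketch below; the input size and layer widths are illustrative assumptions, not the notebook's exact architecture.

```python
# Minimal CNN sketch with TensorFlow 2 (tf.keras) for binary classification.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(128, 128, 3)),            # assumed image size
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # cat vs dog probability
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
```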

Artificial Neural Networks are the traditional feed-forward neural networks, meaning there is at least one hidden layer between the input layer and the output layer. This allows the model to adapt to non-linearity and form complex functions, which makes it useful in real-life problems.

This blog post covers only the implementation of a feed-forward (artificial) neural network with TensorFlow 2, and not the theory behind artificial neural networks.

*Note: The program files for TensorFlow 2 can be found at: https://github.com/ashwinhprasad/Tensorflow-2.0*
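Such a network, with one hidden layer between input and output, can be sketched in a few lines of tf.keras; the layer sizes here are illustrative assumptions, not the repository's exact code.

```python
# Minimal feed-forward (fully connected) network sketch with TensorFlow 2.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),                        # assumed 4 input features
    tf.keras.layers.Dense(32, activation="relu"),      # hidden layer (non-linearity)
    tf.keras.layers.Dense(3, activation="softmax"),    # output layer, 3 classes
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```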

Logistic Regression is used for classification tasks, and this blog will take you through its implementation using TensorFlow 2. This blog post won’t be covering the theory of logistic regression; the theory is a pre-requisite.

The dataset used in this example is the iris dataset from the sklearn library.

We import the dataset and store it in the form of a pandas DataFrame:

```python
# importing the libraries
import numpy as np
import tensorflow as tf
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as…
```
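The loading step described above could be sketched like this, using scikit-learn's built-in copy of the iris dataset (the column handling is an illustrative choice):

```python
# Load the iris dataset and store it as a pandas DataFrame.
import pandas as pd
from sklearn.datasets import load_iris

iris = load_iris()
df = pd.DataFrame(iris.data, columns=iris.feature_names)
df["target"] = iris.target  # 0, 1, 2 -> the three iris species

print(df.shape)  # (150, 5): 150 flowers, 4 features + target
print(df.head())
```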