The Math Behind Random Forest

EliteAI
2 min read · May 10, 2021


A step towards statistical analysis.

Figure: (a) Decision Tree-1, (b) Decision Tree-2, (c) Forest

Ensemble means a collection or group of things, and ensemble models in machine learning operate on the same idea: ensemble learning is a technique that combines several base models to produce one optimal predictive model.

Bagging, or bootstrap aggregation, is a technique for reducing the variance of an estimated prediction function. It works especially well for high-variance, low-bias procedures such as trees.

A random forest is a substantial modification of bagging that builds a large collection of de-correlated trees and then averages their predictions.

ALGORITHM:

Notation: B is the number of trees, T(b) is the b-th tree, and C(b) is the class predicted by the b-th tree. For each b, draw a bootstrap sample of the training data and grow T(b) on it, choosing the best split at each node from a random subset of the features. The forest's prediction is the majority vote over C(1), …, C(B).
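This loop can be sketched from scratch. The Iris data, B = 25, and scikit-learn's DecisionTreeClassifier standing in for the base learner T(b) are illustrative assumptions, not part of the original algorithm statement:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)

B = 25  # number of trees
trees = []
for b in range(B):
    # Draw a bootstrap sample: n points sampled with replacement.
    idx = rng.integers(0, len(X), size=len(X))
    # Grow tree T(b); max_features="sqrt" restricts each split to a
    # random feature subset, which de-correlates the trees.
    tree = DecisionTreeClassifier(max_features="sqrt", random_state=b)
    trees.append(tree.fit(X[idx], y[idx]))

def forest_predict(x):
    # Each tree casts a class vote C(b); the forest returns the majority.
    votes = np.array([t.predict(x.reshape(1, -1))[0] for t in trees])
    return np.bincount(votes).argmax()
```

Averaging many such bootstrapped, feature-randomized trees is exactly the variance-reduction step that distinguishes random forests from a single decision tree.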

Implementation:

A) Prepare Dataset:
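A minimal way to prepare the data; scikit-learn's Iris dataset and a 70/30 train-test split are illustrative assumptions here:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

# Load features and labels, then hold out 30% of the rows for testing.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)
```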

B) Decision Tree Classifier:
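A single fully grown tree makes a useful baseline, since it is exactly the high-variance, low-bias learner that bagging is meant to stabilise. This sketch assumes the Iris split from the previous step:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X_train, X_test, y_train, y_test = train_test_split(
    *load_iris(return_X_y=True), test_size=0.3, random_state=42)

# One unpruned tree: it fits the training sample closely,
# so its predictions vary a lot from sample to sample.
tree = DecisionTreeClassifier(random_state=42)
tree.fit(X_train, y_train)
tree_acc = tree.score(X_test, y_test)
```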

C) Random Forest Classifier:
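With scikit-learn the whole B-tree ensemble is one estimator; the parameter values below (100 trees, sqrt feature subsets) are illustrative defaults, not prescribed by the article:

```python
from sklearn.ensemble import RandomForestClassifier

# n_estimators is B, the number of trees; max_features="sqrt" sets the
# size of the random feature subset tried at each split.
forest = RandomForestClassifier(n_estimators=100, max_features="sqrt",
                                random_state=42)
```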

D) Train Model:
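Training is a single `fit` call; internally it draws a bootstrap sample per tree and grows all of them. The Iris split and parameters repeat the earlier illustrative assumptions:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X_train, X_test, y_train, y_test = train_test_split(
    *load_iris(return_X_y=True), test_size=0.3, random_state=42)

forest = RandomForestClassifier(n_estimators=100, random_state=42)
# fit() bootstraps the training rows for each tree and grows all 100.
forest.fit(X_train, y_train)
```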

E) Predict and Test:
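Prediction aggregates the per-tree votes, and held-out accuracy checks the result; again this sketch assumes the Iris setup above:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X_train, X_test, y_train, y_test = train_test_split(
    *load_iris(return_X_y=True), test_size=0.3, random_state=42)

forest = RandomForestClassifier(n_estimators=100, random_state=42)
forest.fit(X_train, y_train)

# predict() returns the majority-vote class for each test row.
y_pred = forest.predict(X_test)
acc = accuracy_score(y_test, y_pred)
```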

F) Results:

Thanks for reading. If you have any feedback, please feel free to reach out by commenting on this post.

Check out our website! https://eliteaihub.com/

We keep adding blogs, tools, and videos to help you understand the math behind ML and AI!

References:

https://www.math.mcgill.ca/yyang/resources/doc/randomforest.pdf



Written by EliteAI

Helping you build something Innovative…
