Tag Archives: feature engineering

Data Science Coding | H2O in Python with Grid Search Cross Validation | IRIS Dataset

H2O.ai is an open-source platform that provides a wide range of machine learning algorithms and tools for building, deploying, and managing models. It is written in Java and has APIs for several programming languages, including Python. Grid Search Cross …

SKLEARN Gradient Boosting Classifier with Grid Search Cross Validation

Gradient Boosting Classifier is a machine learning technique used to classify items into different categories. It is an ensemble method that combines the predictions of multiple weak models, such as decision trees, to make a final prediction. The technique uses an iterative process where …
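A minimal sketch of the combination described above, using scikit-learn's `GradientBoostingClassifier` inside `GridSearchCV` on the IRIS dataset (the grid values below are illustrative choices, not taken from the original post):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# Load the IRIS dataset and hold out a test split
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# Candidate hyper-parameters for the grid search (illustrative values)
param_grid = {
    "n_estimators": [50, 100],
    "learning_rate": [0.05, 0.1],
    "max_depth": [2, 3],
}

# 5-fold cross-validated grid search over the boosting parameters
grid = GridSearchCV(GradientBoostingClassifier(random_state=42), param_grid, cv=5)
grid.fit(X_train, y_train)

print("Best parameters:", grid.best_params_)
print("Test accuracy:", grid.best_estimator_.score(X_test, y_test))
```

`best_estimator_` is the model refit on the full training split with the winning parameter combination, so it can be evaluated directly on the held-out data.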

IRIS Flower Classification using SKLEARN DecisionTree Classifier with Monte Carlo Cross Validation

The IRIS flower is a popular example in the field of machine learning. It is a flower with several distinct species, such as setosa, virginica, and versicolor. In this blog, we will be discussing how to classify the …
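Monte Carlo cross-validation repeatedly draws random train/test splits rather than fixed folds; in scikit-learn this is expressed with `ShuffleSplit`. A small sketch with a decision tree on IRIS (split count and sizes are illustrative):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import ShuffleSplit, cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Monte Carlo cross-validation: 20 independent random 70/30 splits
mc_cv = ShuffleSplit(n_splits=20, test_size=0.3, random_state=0)
scores = cross_val_score(DecisionTreeClassifier(random_state=0), X, y, cv=mc_cv)

print("Mean accuracy over 20 random splits:", scores.mean())
```

Because the splits are drawn independently, the same sample can land in the test set of several iterations, which distinguishes this scheme from k-fold cross-validation.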

IRIS Flower Classification using SKLEARN DecisionTree Classifier with Grid Search Cross Validation

The IRIS flower is a popular example in the field of machine learning. It is a flower with several distinct species, such as setosa, virginica, and versicolor. In this blog, we will be discussing how to classify …
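For the grid-search variant of the same task, a decision tree's own hyper-parameters (depth, split criterion) can be searched with `GridSearchCV`; the grid below is an illustrative choice, not from the original post:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Grid of tree hyper-parameters to search (illustrative values)
param_grid = {"max_depth": [2, 3, 4, None], "criterion": ["gini", "entropy"]}

# Exhaustive search over the grid with 5-fold cross-validation
grid = GridSearchCV(DecisionTreeClassifier(random_state=0), param_grid, cv=5)
grid.fit(X, y)

print("Best parameters:", grid.best_params_)
print("Best cross-validated accuracy:", grid.best_score_)
```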

How to do Feature Selection – recursive feature elimination in R

Recursive feature elimination (RFE) is a feature selection technique that recursively removes the least important features from the dataset. The goal of RFE is to select a subset of features that are most informative and relevant to the target variable, while reducing …

How to create a pipeline that extracts features from the data and creates a model

Creating a pipeline that extracts features from the data and creates a model is a common task in machine learning. A pipeline is a sequence of steps that are executed in order to accomplish a certain task. In this …
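One way to sketch such a pipeline in scikit-learn is to combine two feature-extraction steps with `FeatureUnion` and feed the result to a classifier; the particular extractors and component counts here are illustrative assumptions:

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import FeatureUnion, Pipeline

X, y = load_iris(return_X_y=True)

# Combine two feature-extraction steps: PCA components and univariate selection
features = FeatureUnion([
    ("pca", PCA(n_components=2)),
    ("kbest", SelectKBest(f_classif, k=1)),
])

# Pipeline: extract features first, then fit a classifier on them
pipe = Pipeline([("features", features), ("clf", LogisticRegression(max_iter=1000))])
scores = cross_val_score(pipe, X, y, cv=5)

print("Mean CV accuracy:", scores.mean())
```

Because the extraction steps live inside the pipeline, they are refit on each cross-validation fold's training data, avoiding leakage from the test folds.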

How to tune hyper-parameters using GridSearchCV in Python

When building a machine learning model, it’s important to optimize the parameters of the model for the best performance. One way to do this is by tuning the hyper-parameters using GridSearchCV. GridSearchCV is a method that allows you to search for the best combination of hyper-parameters, …
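A compact sketch of this search with a support vector classifier (the estimator and grid values are illustrative assumptions, not from the original post):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Hyper-parameter grid for the SVM (illustrative values)
param_grid = {"C": [0.1, 1, 10], "kernel": ["linear", "rbf"]}

# GridSearchCV fits every combination with 5-fold cross-validation
grid = GridSearchCV(SVC(), param_grid, cv=5)
grid.fit(X, y)

print("Best combination:", grid.best_params_)
print("Best CV score:", grid.best_score_)
```

The full results for every tried combination are available afterwards in `grid.cv_results_`.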

How to do variance thresholding in Python for feature selection

When working with large datasets, it is often important to select the most important features that contribute to the prediction of a model. One technique for doing this is called variance thresholding. In Python, variance thresholding can be performed using the library scikit-learn. The …
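A minimal sketch with scikit-learn's `VarianceThreshold` on toy data (the data and threshold are illustrative):

```python
import numpy as np
from sklearn.feature_selection import VarianceThreshold

# Toy data: the middle column is constant, so its variance is zero
X = np.array([[1.0, 5.0, 0.1],
              [2.0, 5.0, 0.4],
              [3.0, 5.0, 0.2],
              [4.0, 5.0, 0.3]])

# Remove features whose variance falls below the threshold
selector = VarianceThreshold(threshold=0.01)
X_reduced = selector.fit_transform(X)

print("Kept feature indices:", selector.get_support(indices=True))
print("Reduced shape:", X_reduced.shape)  # the constant column is gone
```

Note that variance thresholding looks only at the features themselves, not at the target, so it is an unsupervised filter and is typically a first pass before supervised selection methods.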

How to do recursive feature elimination in Python using DecisionTreeRegressor

Recursive feature elimination (RFE) is a technique used in machine learning to determine the most important features in a dataset. This is done by iteratively removing the least important feature until a certain number of features is reached. In Python, one can use the …
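A sketch of RFE wrapped around a `DecisionTreeRegressor`, on synthetic regression data (the dataset shape and target feature count are illustrative assumptions):

```python
from sklearn.datasets import make_regression
from sklearn.feature_selection import RFE
from sklearn.tree import DecisionTreeRegressor

# Synthetic regression data: 8 features, only 3 of which are informative
X, y = make_regression(n_samples=200, n_features=8, n_informative=3, random_state=0)

# RFE repeatedly fits the tree and drops the least important feature
# until only n_features_to_select remain
rfe = RFE(DecisionTreeRegressor(random_state=0), n_features_to_select=3)
rfe.fit(X, y)

print("Selected feature mask:", rfe.support_)
print("Feature ranking (1 = selected):", rfe.ranking_)
```

The tree's `feature_importances_` attribute drives the elimination order at each step.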

How to do recursive feature elimination in Python

Recursive feature elimination (RFE) is a technique used in machine learning to determine the most important features in a dataset. It can be used to improve the accuracy and efficiency of a model by removing unnecessary features that do not contribute to the prediction. In Python, …
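For the classification case, the same `RFE` wrapper works with any estimator that exposes coefficients or importances; a sketch with logistic regression on IRIS (the number of features kept is an illustrative choice):

```python
from sklearn.datasets import load_iris
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)

# Keep the 2 features the model's coefficients rank as most important
rfe = RFE(LogisticRegression(max_iter=1000), n_features_to_select=2)
X_selected = rfe.fit_transform(X, y)

print("Selected feature mask:", rfe.support_)
print("New shape:", X_selected.shape)
```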

How to drop highly correlated features in Python

In machine learning, correlated features can cause problems because they can provide redundant information to the model. Having too many correlated features can also increase the risk of overfitting. One way to deal with correlated features is to drop some of them. This process is …
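One common sketch of this idea in pandas: compute the absolute correlation matrix, look only at its upper triangle so each pair is checked once, and drop every column that correlates too strongly with an earlier one (the data and the 0.9 cutoff are illustrative assumptions):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
df = pd.DataFrame({"a": rng.normal(size=100)})
df["b"] = df["a"] * 2 + rng.normal(scale=0.01, size=100)  # nearly a copy of "a"
df["c"] = rng.normal(size=100)                            # independent feature

# Absolute correlation matrix; keep only the upper triangle so each
# pair is flagged once (and the diagonal of 1s is ignored)
corr = df.corr().abs()
upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))

# Drop every column that correlates above 0.9 with an earlier column
to_drop = [col for col in upper.columns if (upper[col] > 0.9).any()]
df_reduced = df.drop(columns=to_drop)

print("Dropped:", to_drop)
print("Remaining columns:", list(df_reduced.columns))
```

Dropping from the upper triangle keeps the first column of each correlated pair, which is an arbitrary but common convention; a more careful variant would keep whichever feature correlates more strongly with the target.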