Articles related to "model"


Machine Learning Basics: Polynomial Regression

  • In this article, we will go through the program for building a Polynomial Regression model on non-linear data.
  • In the previous examples of Linear Regression, when the data was plotted, there was a linear relationship between the dependent and independent variables.
  • In this problem, we have to train a Polynomial Regression model on the data to understand the correlation between the Level and Salary of the company's employees, and to predict the salary of a new employee from it.
  • The class “LinearRegression” is also imported and assigned to the variable “lin_reg”, which is fitted on X_poly and y to build the model.
  • In this step, we predict Salary values with the Polynomial Regression model we built; a minimal sketch of the full workflow follows this list.
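To make the steps above concrete, here is a minimal sketch using scikit-learn's PolynomialFeatures together with LinearRegression; the Level/Salary numbers are invented placeholders standing in for the article's employee dataset.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Hypothetical Level/Salary data standing in for the article's employee dataset.
X = np.arange(1, 11).reshape(-1, 1)                                   # employee Level, 1..10
y = np.array([45, 50, 60, 80, 110, 150, 200, 300, 500, 1000]) * 1000  # Salary

# Expand Level into polynomial terms, then fit a plain LinearRegression on them.
poly = PolynomialFeatures(degree=4)
X_poly = poly.fit_transform(X)

lin_reg = LinearRegression()
lin_reg.fit(X_poly, y)

# Predict the salary of a new employee at an intermediate level.
print(lin_reg.predict(poly.transform([[6.5]])))
```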


What is a Full Stack Data Scientist?

  • The scope of a full stack data scientist covers every component of a data science initiative, from identifying business problems to training and deploying machine learning models that benefit stakeholders.
  • A full stack data scientist must be able to identify and understand business problems (or opportunities) that can be solved with the data science toolkit.
  • This skill is essential because useful machine learning models cannot be built without an understanding of the data.
  • Lastly, a full stack data scientist must be able to deploy model pipelines to production.
  • Two key underlying skills of the full stack data scientist are the ability to design a system or process and the ability to pick up new technologies quickly.
  • These two elements are the keys to any organization extracting value from data science: solving the right problems and making the solutions accessible to the end user.


Model with TensorFlow and Serve on Google Cloud Platform

  • In this guide, we learn how to develop a TensorFlow model and serve it on the Google Cloud Platform (GCP).
  • We consider a regression problem of predicting the earnings of products using a three-layer neural network implemented with the TensorFlow and Keras APIs. In this guide, we will use TensorFlow 2.1.0 and the Google Colab runtime environment.
  • Google Colab offers free GPU and TPU runtimes for training machine learning models.
  • To define the deep learning model, we use the Keras API shipped with TensorFlow 2.1.0. We start with a basic model with two intermediate ReLU layers (see the sketch after this list).
  • Let us read the sample input data, test the model's predictions, and save the model locally for future use.
  • In this guide, we have learned about training deep learning models with TensorFlow 2.1.0 and deploying the trained model on the Google Cloud Platform.
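A minimal sketch of the kind of model described above, assuming TensorFlow 2.x with the bundled Keras API; the feature count and the random training data are placeholders for the guide's actual earnings dataset, and the export path is hypothetical.

```python
import numpy as np
import tensorflow as tf  # assumes TensorFlow 2.x, as in the guide

# Placeholder product features and earnings targets standing in for the guide's data.
X_train = np.random.rand(500, 9).astype("float32")
y_train = np.random.rand(500, 1).astype("float32")

# A basic regression network with two intermediate ReLU layers, as described above.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(9,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),  # linear output for the predicted earnings
])
model.compile(optimizer="adam", loss="mse")
model.fit(X_train, y_train, epochs=10, batch_size=32, verbose=0)

# Save in the SavedModel format, which can later be uploaded to GCP for serving.
model.save("exported_earnings_model")
```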


My 10 favorite resources for learning data science online

  • That is why in this article, I want to share my 10 favorite online data science resources, which I frequently use for learning and for keeping up with current developments.
  • In the talks you can find a mix of general Python best practices, examples of real-life cases the data scientists worked on (for example, how they model churn or what tools they use to generate an uplift in their marketing campaigns), and introductions to some new libraries.
  • I started my data science journey with R, and even after switching my main programming language to Python I still follow R-bloggers.
  • The list of people to follow will depend heavily on the scope of your interests, for example, whether you focus on deep learning for computer vision or on NLP.


The Data Science ABCs: A Whirlwind Tour of the Field

  • For instance, Earth is home to an estimated one trillion species; it is infeasible to label-encode, one-hot encode, or even apply a specialized categorical encoding method discussed above, simply because the sheer number of classes makes it impossible to differentiate between unique values.
  • An important factor differentiating N-grams from other forms of data fed into natural language processing models, such as one-hot encoded data, is that N-grams retain the sequential aspect of the data.
  • Whereas naïve encoding methods give the model no information about the order of words in a phrase, training a model on N-grams yields more satisfactory results and can take into account complex aspects of language such as sarcasm, idioms, and words whose meaning depends on context (a small sketch of N-gram extraction follows this list).
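A minimal sketch of N-gram extraction with scikit-learn's CountVectorizer (assuming scikit-learn ≥ 1.0 for get_feature_names_out); the toy corpus is invented for illustration.

```python
from sklearn.feature_extraction.text import CountVectorizer

# Toy corpus standing in for real training text.
corpus = ["the model learned the data", "the data fit the model"]

# ngram_range=(1, 2) keeps unigrams and bigrams, so some word-order
# information survives that a pure bag-of-words encoding would discard.
vectorizer = CountVectorizer(ngram_range=(1, 2))
X = vectorizer.fit_transform(corpus)

print(vectorizer.get_feature_names_out())  # e.g. 'the model', 'model learned', ...
print(X.toarray())                         # per-document N-gram counts
```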


XRL: eXplainable Reinforcement Learning

  • The interpretability of the framework comes from the fact that each task (for instance, stacking a cobblestone block) is described by a human instruction, and the trained agents can only access learnt skills through these human descriptions, making the agent’s policies and decisions human-interpretable.
  • The resulting framework exhibited higher learning efficiency, generalized well to new environments, and was inherently interpretable, as it needs only weak human supervision to give the agent instructions for learning new skills.
  • In Explainable Reinforcement Learning through a Causal Lens, action influence models are introduced for RL agents in Markov Decision Processes (MDPs), extending structural causal models (SCMs) with the addition of actions.
  • Explanation generation requires the following steps: 1) defining the action influence model; 2) learning the structural equations during reinforcement learning; and 3) generating the explanans for a given explanandum (see the sketch after this list).
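A purely illustrative sketch, not the paper's implementation: a toy action influence model in which each causal edge carries a structural equation fitted from logged experience, and an explanation traces the fitted edge for a chosen action. All variable names, action names, and numbers here are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Step 1: define the action influence model as (cause, effect, action) edges,
# each carrying its own structural equation (here a simple linear model).
edges = {
    ("workers", "gold", "collect_gold"): LinearRegression(),
    ("gold", "barracks", "build_barracks"): LinearRegression(),
}

# Step 2: learn the structural equations from (cause value, effect value) pairs
# logged while the agent interacts with the environment during RL.
experience = {
    ("workers", "gold", "collect_gold"): (np.array([[1], [2], [3]]), np.array([10, 21, 29])),
    ("gold", "barracks", "build_barracks"): (np.array([[10], [20], [30]]), np.array([0, 1, 2])),
}
for edge, (cause_values, effect_values) in experience.items():
    edges[edge].fit(cause_values, effect_values)

# Step 3: generate a minimal "why" explanation (explanans) for a chosen action
# (the explanandum) by reading off the fitted causal edge.
def explain(action, state):
    for (cause, effect, act), equation in edges.items():
        if act == action:
            expected = equation.predict(np.array([[state[cause]]]))[0]
            print(f"'{action}' was chosen because {cause}={state[cause]}, "
                  f"which is expected to raise {effect} to about {expected:.1f}.")

explain("collect_gold", {"workers": 2, "gold": 10})
```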


NGBoost algorithm: solving probabilistic prediction problems

  • The problem we are tackling here is that almost all regression algorithms do not return the distribution of the target variable given the predictors, P(y|X), but only an expectation of the target variable, E(y|X), i.e. a point estimate.
  • For example, in scikit-learn the classifiers have the method predict_proba(), which returns the probability of each class.
  • Yes, if your model returns a conditional probability distribution of the target variable given the predictors.
  • Once you have picked your conditional distribution, the problem reduces to learning the parameter vector θ of the distribution given the input variables.
  • Predicting a conditional distribution of a continuous variable is hard; it is much easier to predict a point estimate.
  • It is different in classification problems, where more algorithms are able to predict class probabilities.
  • The NGBoost algorithm makes it easy to predict the parameters of the conditional distribution of the target variable given the predictors, as the sketch after this list illustrates.
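A minimal sketch using the ngboost package, assuming its NGBRegressor and pred_dist API with a Normal conditional distribution (so θ consists of loc and scale); the diabetes dataset is only a stand-in for real data, and exact attribute names may vary by package version.

```python
from sklearn.datasets import load_diabetes
from sklearn.model_selection import train_test_split
from ngboost import NGBRegressor
from ngboost.distns import Normal

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# NGBoost fits the parameters of a chosen conditional distribution P(y|X);
# with Normal, the parameter vector theta is (loc, scale) for each sample.
ngb = NGBRegressor(Dist=Normal, n_estimators=300, verbose=False)
ngb.fit(X_train, y_train)

point_estimates = ngb.predict(X_test)   # E(y|X), like an ordinary regressor
pred_dist = ngb.pred_dist(X_test)       # full conditional distributions
print(point_estimates[:3])
print(pred_dist.params["loc"][:3], pred_dist.params["scale"][:3])
```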
