
Articles related to "ml"


How to effectively employ an AI strategy in your business

  • This means that even though companies need experts to create models and algorithms, they also need people with different technical abilities who can discover useful insights from data before passing them on to the experts.
  • To come up with relevant questions, companies need people who are creative, have an analytical mindset, and back their solutions with data rather than gut feeling.
  • Another important reason for connecting to the community is that most of the data scientists and researchers today want to collaborate with others.
  • The technologies in the AI space are advancing at a rapid pace, and by connecting, people can ask the right questions, share their work, participate in the community, and learn from everyone.
  • Many established pioneers in machine learning and AI regularly open-source their technologies, which can act as a good starting point for others.



7 Ways To Make Software Developers Happy

  • As a compromise, to improve team communication, core hours when everyone is online can be agreed (e.g. 11am to 3pm); beyond that, each developer can manage their own time, within reason obviously!
  • This is a fact! And although the delivery manager is on top of the planning and the looming milestones, sometimes there is no way around the fact that everyone on the team has to put in long hours to get things done and release the new feature to production.
  • Autonomy at work is a really important aspect for me, and the more my career grew, the less willing I was to compromise on having a degree of freedom to complete my tasks or to make decisions.
  • Furthermore, it is a good idea to give your dev team time and space to skill up before any new project starts.



AI, Machine Learning, and the Future of Marketing

  • Artificial intelligence can help with that through its ability to collate data and decide which content is most applicable to an individual based on things like historical data, location, and past behavior.
  • Voice search technology is also a great AI tool that can help digital marketers get faster results.
  • AI can help to personalize a customer experience based on their past behavior.
  • This means that businesses will be able to provide the best experience possible for their customers and identify the right tools to increase success in their industry.
  • As you read this, AI software is being created that will allow P.R. and marketing agencies to ‘meet’ with potential clients, ‘sense’ their needs, explore their narratives and culture and identify key target audiences.
  • Success lies in using AI to both research and deliver personal experiences that regularly communicate, engage, and delight customers across multiple channels, formats, and device types.



Can you crawl data from LinkedIn?

  • For the first project, we would need a script that logs into LinkedIn, visits the University of Mannheim alumni page, scrolls down automatically to load new profiles, and then creates a list of the URLs of all alumni LinkedIn profiles.
  • Right now the script is able to visit LinkedIn, log in, request the Mannheim alumni page, and then start scraping the names of the alumni (a minimal sketch of this flow follows the list).
  • Comparing the two biggest employers in close proximity to our university indicates that SAP seems to have increased its hiring efforts over the last few years while BASF stayed more or less stable, at least regarding Mannheim graduates with profiles on LinkedIn. Some employers, like Deutsche Bank, either seem to be reducing hiring or have stopped being attractive to Mannheim alumni compared to previous years.
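A minimal sketch of that flow, assuming Selenium with Chrome; the credentials, page URL, CSS selectors, and scroll count below are placeholders, not taken from the article:

```python
# Hypothetical sketch: log into LinkedIn with Selenium, open an alumni page,
# scroll to trigger lazy loading, and collect profile URLs.
import time
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://www.linkedin.com/login")
driver.find_element(By.ID, "username").send_keys("you@example.com")    # placeholder
driver.find_element(By.ID, "password").send_keys("your-password")      # placeholder
driver.find_element(By.CSS_SELECTOR, "button[type='submit']").click()

driver.get("https://www.linkedin.com/school/university-of-mannheim/people/")
for _ in range(20):  # scroll down repeatedly so new profiles load
    driver.execute_script("window.scrollTo(0, document.body.scrollHeight);")
    time.sleep(2)

# every alumni card links to a /in/ profile URL; deduplicate and strip query params
links = driver.find_elements(By.CSS_SELECTOR, "a[href*='/in/']")
profile_urls = sorted({a.get_attribute("href").split("?")[0] for a in links})
driver.quit()
```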



Twitter data collection tutorial using Python

  • In this tutorial, we’ll learn how to use Twitter’s API and some Python libraries to collect Twitter data.
  • Executing the code block above will prompt you to follow a URL to authenticate your account, and allow data streaming between Google Drive and Colab.
  • If the amount of data collected within the last 15 minutes exceeds the API limits, a tweepy.RateLimitError exception is raised, in which case the code will wait for 15 minutes.
  • Note that the retrieved User objects contain two keys, _api and _json, so we simply extract the data we care about using the list comprehension [x._json for x in followers_list].
  • You can see that this is exactly like our get_followers() function, except that we use api.friends to define our Cursor object, so we can retrieve the data of the users we’re following (a short sketch of the Cursor and rate-limit pattern follows this list).
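A minimal sketch of that pattern, assuming tweepy 3.x (the version whose tweepy.RateLimitError the tutorial refers to); the credentials and the follower cap are placeholders:

```python
# Hypothetical sketch: page through followers with a Cursor, sleeping 15 minutes
# whenever the rate limit is hit, and keep only each User object's raw _json.
import time
import tweepy

auth = tweepy.OAuthHandler("API_KEY", "API_SECRET")        # placeholders
auth.set_access_token("ACCESS_TOKEN", "ACCESS_SECRET")     # placeholders
api = tweepy.API(auth)

def get_followers(screen_name, max_users=1000):
    followers_list = []
    cursor = tweepy.Cursor(api.followers, screen_name=screen_name).items(max_users)
    while True:
        try:
            followers_list.append(next(cursor))
        except tweepy.RateLimitError:
            time.sleep(15 * 60)   # wait out the 15-minute rate-limit window
        except StopIteration:
            break
    return [x._json for x in followers_list]
```

Swapping api.followers for api.friends gives the analogous function for the accounts we are following.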



Mind Blowing Secrets On How Artificial Intelligence Will Change Humans

  • In another virtual experiment, we divided 4,000 human subjects into groups of about 20 and assigned each individual “friends” within the group; these friendships formed a social network.
  • Both of these studies demonstrate that in what I call “hybrid systems” — where people and robots interact socially — the right kind of AI can improve the way humans relate to one another.
  • In this experiment, we found that by adding just a few bots (posing as human players) that behaved in a selfish, free-riding way, we could drive the group to behave similarly.
  • By taking advantage of humans’ cooperative nature and our interest in teaching one another — both features of the social suite — the bots affected even humans with whom they did not interact directly, helping to polarize the country’s electorate.



The Most Undervalued Data Science Course

  • You all know of Coursera’s machine learning course and Andrew Ng’s deep learning specialization.
  • These are all excellent resources to learn data science, but I want to make you aware of a lesser-known, yet superb, set of courses with which you can augment your knowledge in only a few hours.
  • You most likely have even participated in a challenge and maybe uploaded some kernels, but did you know Kaggle also provides data science education?
  • You’d love to take Andrew Ng’s deep learning course, but you can’t seem to find the time right now in your career.
  • For example, I don’t know of any other platform which offers a course on machine learning explainability or gets you querying SQL databases in only a few hours.
  • These micro-courses are an excellent way to continue building your data science abilities fast.



Machine learning model deployment with C++

  • Learn the actual principal component analysis (PCA) algorithm. Now that we have created the data table with the preprocessed face images, learning the PCA model is usually smooth.
  • Having saved the model, the inference layer involves loading the saved model, whose values are stored in the .yml file, again via OpenCV, and combining it with the necessary data preprocessing modules to form an inference engine, which is finally bundled as a .so file for deployment on a Linux-based environment.
  • Load a PCA model. For inference, we let OpenCV load the existing model from the saved .yml file, after which we feed the eigenvalues, eigenvectors, and mean to a new PCA object, on which we can then call pca->project to create a new image’s projection.
  • Create the new-image preprocessing and prediction stage. During inference of a machine learning model, it is important that the incoming image passes through the same preprocessing as the training dataset (a small sketch of the save/load/project flow follows this list).
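The article implements this in C++; the sketch below shows the same OpenCV save/load/project flow through the Python bindings (assuming OpenCV 4.x) purely to illustrate the steps, with the file name, image size, and component count chosen arbitrarily:

```python
# Hypothetical sketch of the PCA train / save / load / project cycle with OpenCV.
import cv2
import numpy as np

# training: each row of `faces` is one flattened, preprocessed face image
faces = np.random.rand(100, 64 * 64).astype(np.float32)          # stand-in data
mean, eigenvectors, eigenvalues = cv2.PCACompute2(faces, mean=None, maxComponents=16)

# persist the learned model to a .yml file via OpenCV's FileStorage
fs = cv2.FileStorage("pca_model.yml", cv2.FILE_STORAGE_WRITE)
fs.write("mean", mean)
fs.write("eigenvectors", eigenvectors)
fs.write("eigenvalues", eigenvalues)
fs.release()

# inference: reload the model and project a new, identically preprocessed image
fs = cv2.FileStorage("pca_model.yml", cv2.FILE_STORAGE_READ)
mean = fs.getNode("mean").mat()
eigenvectors = fs.getNode("eigenvectors").mat()
fs.release()

new_face = np.random.rand(1, 64 * 64).astype(np.float32)         # stand-in input
projection = cv2.PCAProject(new_face, mean, eigenvectors)        # plays the role of pca->project
```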



Autoencoder Neural Network for Anomaly Detection with Unlabeled Dataset

  • Yes, the job of an autoencoder neural network is to encode the data into a small code (compression) and decode it back to reproduce the input (decompression).
  • In our case, since the dataset consists of 99% normal data and only 1% anomalies, what happens during training is that the model largely ignores that small proportion and fits the remaining 99% of the data, so the MSE on normal samples ends up very small.
  • What we need to do is calculate the MSE of the output compared to the input; to properly separate the anomalies, we then set a threshold on the MSE, chosen by inspecting the outputs, so that the model predicts with good precision and recall (see the sketch after this list).
  • All we need to do now is stack these models so that, in real-time prediction, the samples flagged as anomalies by the high-recall model (the autoencoder neural network) are sent through the false-positive reduction model (an artificial neural network).
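A minimal sketch of the reconstruction-error thresholding idea, assuming a small Keras autoencoder on synthetic stand-in data; the layer sizes and the 99th-percentile cutoff are illustrative choices, not the article's exact model:

```python
# Hypothetical sketch: train an autoencoder on (mostly normal) data, then flag
# samples whose reconstruction MSE exceeds a chosen threshold.
import numpy as np
from tensorflow import keras

n_features = 30                                                   # assumed feature count
x_train = np.random.rand(10000, n_features).astype("float32")     # stand-in data

autoencoder = keras.Sequential([
    keras.Input(shape=(n_features,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(4, activation="relu"),                     # the small "code"
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(n_features, activation="linear"),
])
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(x_train, x_train, epochs=10, batch_size=256, verbose=0)

# reconstruction error per sample
reconstructed = autoencoder.predict(x_train, verbose=0)
mse = np.mean((x_train - reconstructed) ** 2, axis=1)

# set the threshold from the error distribution (e.g. the 99th percentile,
# matching a ~1% anomaly rate), then tune it for the precision/recall you need
threshold = np.quantile(mse, 0.99)
is_anomaly = mse > threshold
```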



Deploy your RShiny App Locally with Docker

  • My favorite way to deploy RShiny locally is to simply package it into a Docker image and run it.
  • Running any application in Docker makes it easily transportable and is a generally accepted way of distributing applications in 2019.
  • Now that we’ve covered some housekeeping let’s get started building your docker image.
  • This way it is accessible to the docker build process.
  • This is a very simple example with a single R file that serves our RShiny app, app.R. The Dockerfile has our build instructions, and the environment.yml lists our conda packages, namely r-shiny and r-devtools.
  • I personally like to install all of my scientific software with conda.
  • What matters is that your dependencies are installed and ready to run your RShiny app.
  • The Dockerfile has our Docker build instructions (a minimal sketch of one follows this list).
  • Now that we have our files all ready, let’s build and run our Docker container!
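A minimal Dockerfile sketch of that setup, assuming a conda base image and the app.R / environment.yml layout described above; the base image, paths, and port are assumptions, not the article's exact files:

```dockerfile
# Hypothetical sketch: conda-based image that installs the environment.yml
# packages (e.g. r-shiny, r-devtools) and serves app.R with Shiny.
FROM continuumio/miniconda3

# install the conda packages listed in environment.yml
COPY environment.yml /tmp/environment.yml
RUN conda env update -n base -f /tmp/environment.yml && conda clean -afy

# copy the app into the image so it is available to the container at runtime
COPY app.R /srv/shiny/app.R
WORKDIR /srv/shiny

EXPOSE 3838
CMD ["R", "-e", "shiny::runApp('/srv/shiny/app.R', host='0.0.0.0', port=3838)"]
```

With those files next to the Dockerfile, something like docker build -t rshiny-app . followed by docker run -p 3838:3838 rshiny-app builds the image and serves the app on localhost:3838.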
