In this article we are going to cover a simple version of gradient descent. It is important to note that this version uses the sum of squared errors as the cost function to minimize, and the implementation is vectorized. Let's start off with…
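To give a rough sense of what that looks like, here is a minimal R sketch of vectorized gradient descent for linear regression; the function name, alpha, and iterations are illustrative placeholders, not the article's actual code:

```r
# A minimal sketch of vectorized gradient descent for linear regression
# with a sum-of-squares cost; X, y, alpha, and iterations are
# illustrative placeholders, not the article's actual code.
gradient_descent <- function(X, y, alpha = 0.01, iterations = 1000) {
  m <- length(y)                       # number of training examples
  theta <- rep(0, ncol(X))             # start all parameters at zero
  for (i in seq_len(iterations)) {
    error <- X %*% theta - y           # residuals for every example at once
    gradient <- t(X) %*% error / m     # vectorized gradient of the cost
    theta <- theta - alpha * gradient  # simultaneous update of all thetas
  }
  as.vector(theta)
}
```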
This is another recap from our study group covering the Andrew Ng course on Coursera. Let's start with a quick summary of the two weeks. Week 1 was all about an introduction to linear regression and gradient descent; there were no assignments due. Week 2 was all about multivariate linear regression, normalization, and a few other topics, with both a coding assignment and a quiz due.
If you are practicing machine learning, you are likely going to run into feature scaling at some point. The reason we use it is to help our algorithms converge faster and more reliably. Let's begin with a standard theta update equation to better understand the problem.
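As a quick taste of the technique itself, here is a minimal R sketch of feature scaling via mean normalization; the function and variable names are placeholders, not the article's code:

```r
# A minimal sketch of feature scaling via mean normalization, assuming a
# numeric feature matrix X; the names here are illustrative placeholders.
feature_scale <- function(X) {
  mu    <- colMeans(X)                       # per-feature mean
  sigma <- apply(X, 2, sd)                   # per-feature standard deviation
  X_scaled <- sweep(X, 2, mu, "-")           # subtract each column's mean
  X_scaled <- sweep(X_scaled, 2, sigma, "/") # divide by its std deviation
  list(X = X_scaled, mu = mu, sigma = sigma) # keep mu/sigma for new data
}
```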
I wrote an earlier article, “Linear Regression From Scratch”. Many folks have pointed out that the approach there is in fact not optimal, so, being a perfectionist, I decided to re-implement it. Not to mention, the new version works great in my own libraries. The following article discusses converting the original code into code that uses linear algebra. Beyond this, it still works in a PCL for Xamarin. Hoo-rah, Xamarin!
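The article's own code targets C#/Xamarin, but the loop-to-linear-algebra idea is easiest to show compactly; here is a rough R illustration of the conversion, with X, y, and theta as made-up placeholders:

```r
# A rough illustration (in R, not the article's Xamarin code) of
# converting an element-wise loop into a single linear algebra operation.
predict_loop <- function(X, theta) {
  preds <- numeric(nrow(X))
  for (i in seq_len(nrow(X))) {   # one dot product per example
    preds[i] <- sum(X[i, ] * theta)
  }
  preds
}

predict_vectorized <- function(X, theta) {
  as.vector(X %*% theta)          # one matrix multiply does it all
}
```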
This is a video tutorial on building beautiful data visualizations in R. You will learn what data viz is, get acquainted with basic charting libraries, and finally see a full walkthrough of how I built the Miami Jail interactive graphic you see in this article.
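For a flavor of what basic R charting looks like before you watch, here is a minimal ggplot2 example using R's built-in mtcars data set; it is purely illustrative and not the Miami Jail graphic from the video:

```r
library(ggplot2)

# A minimal ggplot2 example on the built-in mtcars data set --
# purely illustrative, not the Miami Jail interactive graphic.
ggplot(mtcars, aes(x = wt, y = mpg)) +
  geom_point() +
  labs(x = "Weight (1000 lbs)", y = "Miles per gallon",
       title = "Basic scatter plot with ggplot2")
```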
This article is meant to provide some guidance on which programming language to use. Note that it is specifically geared toward delivering code where intelligence and information are the soul of the product. In this day and age, that should be every product.
I want to preface this article with a few things:
This is an excerpt from a paper I wrote, of my own volition, for internal use. Because of that, I was able to remove all confidential information and publish my findings.
I only analyzed F#, C#, R, and Python. I know there are more languages out there, but I picked the top dogs; F# had some special circumstances that made me feel it belonged.
Here in South Florida we have a strong machine learning and data science community, and therefore it is easy to get a study group together. This article is a recap of the first meeting of our study group. Note that this first meeting was held the week before the class started, so this article serves as a great introduction to machine learning, languages, commitments, and other generally applicable questions and concerns.
Today we will do a quick conversion from the mathematical notation of linear algebra into a real, executable algorithm. Note that we will not be covering gradient descent, but rather only cost functions and errors, and how to execute them, to provide the framework for gradient descent. Gradient descent has so many flavors that it deserves its own article.
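To give a taste of that conversion, here is a minimal R sketch translating the sum-of-squared-errors cost function into runnable code; X, y, and theta are illustrative placeholders, not the article's actual code:

```r
# A minimal sketch translating the sum-of-squared-errors cost,
# J(theta) = (1 / (2m)) * sum of squared errors, into runnable R;
# X, y, and theta are illustrative placeholders.
compute_cost <- function(X, theta, y) {
  m <- length(y)              # number of training examples
  errors <- X %*% theta - y   # prediction errors for all examples at once
  sum(errors^2) / (2 * m)     # averaged, halved sum of squares
}
```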
I've been working on building some interesting visualizations with open data, and today I get to show off a really interesting one. Not only will we discuss the visualization in depth, but we will also dive into how I built it. Here it is: the top 10 bookings in Miami, with the legend in descending order of the most common bookings overall.
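As a rough sketch of the kind of aggregation behind such a graphic, here is a small R example; the stand-in data frame, its charge column, and the function names are assumptions for illustration, not the actual code or data behind the graphic:

```r
library(dplyr)

# Stand-in data: the real graphic uses Miami open booking data.
bookings <- data.frame(
  charge = sample(c("Battery", "Theft", "DUI", "Trespass"),
                  200, replace = TRUE)
)

top_charges <- bookings %>%
  count(charge, sort = TRUE) %>%  # bookings per charge, most common first
  slice_head(n = 10)              # keep up to the ten most common

# Ordering the factor by count keeps a plot legend in descending order.
top_charges$charge <- factor(top_charges$charge,
                             levels = top_charges$charge)
```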
Here is a recorded version of an in-person training I have been doing. Enjoy. I find myself coming back to it for reference.
This episode is all about performing data manipulation in the R programming language to derive raw insights from your data. Data manipulation is at the core of anything and everything you do in business intelligence and machine learning. This episode sets the foundation for all R-based intelligence sessions from here on out.
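For a flavor of the kind of manipulation covered, here is a minimal dplyr sketch on R's built-in mtcars data; it is illustrative only, and the episode's actual examples may differ:

```r
library(dplyr)

# Filter, derive a column, group, and summarize -- the bread and butter
# of data manipulation; uses the built-in mtcars data for illustration.
mtcars %>%
  filter(hp > 100) %>%                  # keep cars over 100 horsepower
  mutate(power_to_weight = hp / wt) %>% # derive a new feature
  group_by(cyl) %>%                     # group by cylinder count
  summarize(avg_mpg = mean(mpg),        # average fuel economy per group
            avg_ptw = mean(power_to_weight))
```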