# Sigmoid for Classifiers Decoded

Hello World,

Sigmoid really isn’t that complicated (once you understand it, of course).  Some background knowledge, in case you are coming at this totally fresh: the sigmoid function is used in machine learning primarily as a hypothesis function for classifiers.  What is interesting is that this same function is used for binary classifiers and multi-class classifiers, and is the backbone of modern neural networks.

Here is the sigmoid function:    $\sigma(z) = \frac{ 1 }{ 1 + e^{-z}}$
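To make the formula concrete, here is a minimal sketch of the sigmoid in Python (using NumPy; the example values are just illustrative):

```python
import numpy as np

def sigmoid(z):
    """Squash any real-valued input into the (0, 1) range."""
    return 1.0 / (1.0 + np.exp(-z))

# Large negative inputs approach 0, large positive inputs approach 1,
# and z = 0 lands exactly on 0.5 -- the usual decision boundary.
print(sigmoid(0))    # 0.5
print(sigmoid(10))   # close to 1
print(sigmoid(-10))  # close to 0
```

Because the output is always between 0 and 1, it can be read as the probability that an example belongs to the positive class, which is what makes it so useful for classifiers.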

# Merging Data Sets in Python

Hello World,

So this article is inspired by a customer doing financial analysis who can only grab a certain amount of data at a time from the data steward’s stores, in chunks based on time windows. As time is constantly moving, what happens is that occasionally you get duplicate data in each request. If you attempt to grab exactly on the edges, you have a chance of missing something, so it’s best to have a bit of an overlap and just deal with that overlap.
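A minimal sketch of that overlap-handling idea using pandas (the chunk contents, column names, and timestamps below are made up for illustration):

```python
import pandas as pd

# Two hypothetical chunks pulled with overlapping time windows --
# the 10:00 and 10:05 rows appear in both requests.
chunk_a = pd.DataFrame({
    "timestamp": pd.to_datetime(
        ["2016-01-01 09:55", "2016-01-01 10:00", "2016-01-01 10:05"]),
    "price": [100.0, 101.5, 102.0],
})
chunk_b = pd.DataFrame({
    "timestamp": pd.to_datetime(
        ["2016-01-01 10:00", "2016-01-01 10:05", "2016-01-01 10:10"]),
    "price": [101.5, 102.0, 103.0],
})

# Concatenate everything, then drop the duplicated rows from the overlap.
merged = (pd.concat([chunk_a, chunk_b])
            .drop_duplicates(subset="timestamp")
            .sort_values("timestamp")
            .reset_index(drop=True))

print(len(merged))  # 4 unique timestamps survive
```

The overlap means you never miss a row at the window edges, and the de-duplication step makes the overlap harmless.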

# Machine Learning Study Group – Week 2 & 3 Recap

Hello World,

So this is another recap from our study group covering the Andrew Ng course on Coursera. Let’s start with a quick summary of the two weeks. Week 2 was all about an introduction to linear regression and gradient descent. There were no assignments due. Week 3 was all about multi-variate linear regression, normalization and a few other topics. There was a coding assignment as well as a quiz due for week 3.

# Feature Scaling & Machine Learning

Hello World!

If you are practicing machine learning, you are likely going to run into this at some point.  Basically, the reason we use feature scaling is to help our algorithms train faster and better.  Let’s begin by taking a standard theta optimization equation to help better understand the problem.
$\theta_j := \theta_j - \alpha \cdot \frac{ \sum_{i=1}^{m} \left(h_{\theta}\left(x^{(i)}\right) - y^{(i)}\right) \cdot x_j^{(i)} } { m }$
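Because every $\theta_j$ update is multiplied by its feature $x_j$, features on wildly different scales make gradient descent take lopsided steps. A minimal sketch of the usual fix, standardization (the house-size numbers below are made up for illustration):

```python
import numpy as np

def standardize(X):
    """Scale each feature column: subtract its mean, divide by its std."""
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)
    return (X - mu) / sigma, mu, sigma

# Hypothetical features on very different scales:
# house size in square feet vs. number of bedrooms.
X = np.array([[2104.0, 3.0],
              [1600.0, 3.0],
              [2400.0, 4.0],
              [1416.0, 2.0]])

X_scaled, mu, sigma = standardize(X)

# After scaling, every column has mean ~0 and std ~1, so gradient
# descent takes comparably sized steps along each theta_j.
print(X_scaled.mean(axis=0))  # approximately [0, 0]
print(X_scaled.std(axis=0))   # approximately [1, 1]
```

Remember to save `mu` and `sigma`: new examples must be scaled with the same values before prediction.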

# Linear Regression from Scratch using Linear Algebra

Hello World!

So I wrote an article earlier, “Linear Regression From Scratch”.  Many folks have pointed out that the approach there is in fact not optimal.  Being the perfectionist, I decided to re-implement it.  Not to mention, it works great in my own libraries.  The following article discusses converting the original code into code that uses linear algebra.  Beyond this, it still works in PCL for Xamarin.  Hoo-Rah Xamarin!
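To show the flavor of the linear-algebra approach, here is a minimal sketch in Python/NumPy (not the article's own code, which targets Xamarin): fitting linear regression in closed form via the normal equation, on a tiny made-up data set.

```python
import numpy as np

def fit_normal_equation(X, y):
    """Solve theta in closed form: (X^T X) theta = X^T y."""
    return np.linalg.solve(X.T @ X, X.T @ y)

# Tiny made-up data set that lies exactly on y = 1 + 2x.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 1.0 + 2.0 * x
X = np.column_stack([np.ones_like(x), x])  # prepend the bias column

theta = fit_normal_equation(X, y)
print(theta)  # approximately [1.0, 2.0] -- intercept and slope
```

One matrix solve replaces the element-by-element loops of the original code, which is exactly the kind of conversion the article walks through.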

# R Jump Start – Beautiful Data Visualizations

Hello World!

This is a video tutorial for building beautiful data visualizations in R.  You will learn what data viz is, explore basic charting libraries, and finally get a full walkthrough of how I built the Miami Jail interactive graphic you see in this article.

Part 1: Introduction to Microsoft R Open.

Part 2: Introduction to R Data Structures

Part 3: Data Manipulation with R

Part 4: Beautiful Visualizations with R

# Battle of the Programming Languages

Hello World!

1. This is an excerpt from a paper I wrote for internal use of my own volition.  As this is the case, I was able to remove all confidential information and publish my findings.
2. I only analyzed F#, C#, R and Python.  I know there are more languages, but I picked the top dogs; F# had some special circumstances that made me feel it belonged.

# Machine Learning Study Group Recap – Week 1

Hello World!

So many of you who are here are probably part of the study group.  For those who are not or are perhaps referencing this at a later time, this is in regards to the following course on Coursera. If you would like to join our study group, please see one of the following meetup pages: Fort Lauderdale Machine Learning or Florida Dot Net.

Here in South Florida we have a strong machine learning and data science community, and therefore it is easy to get a study group together.  This article is a recap of the first meeting of our study group.  Note that this first meeting took place the week before the class started.  Therefore, this article is a great introduction to machine learning, languages, commitments and more generally applicable questions and concerns.

# Linear Regression from Scratch

Hello World,

So today we will do a quick conversion from the mathematical notation of algebra into a real algorithm that can be executed.  Note we will not be covering gradient descent, but rather only cost functions, errors and the execution of these, to provide the framework for gradient descent.  Gradient descent has so many flavors that it deserves its own article.

So, on to the mathematical representation.
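As a preview of the cost-function piece described above, here is a minimal sketch in Python (a squared-error cost on a made-up data set; the names and data are purely illustrative):

```python
import numpy as np

def hypothesis(theta, X):
    """h_theta(x): a linear prediction, where X carries a bias column."""
    return X @ theta

def cost(theta, X, y):
    """Squared-error cost J(theta) = (1/2m) * sum of squared errors."""
    m = len(y)
    errors = hypothesis(theta, X) - y
    return (errors @ errors) / (2 * m)

# Made-up data lying exactly on the line y = 2x.
x = np.array([1.0, 2.0, 3.0])
y = 2.0 * x
X = np.column_stack([np.ones_like(x), x])  # prepend the bias column

print(cost(np.array([0.0, 2.0]), X, y))  # 0.0 -- a perfect fit
print(cost(np.array([0.0, 0.0]), X, y))  # larger: the errors are y itself
```

Gradient descent's whole job is to nudge `theta` until this cost is as small as possible, which is why the cost and error machinery forms the framework for it.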