So I had a life-changing event this past Sunday at 8:55am on 5/24/2015. My first child was born! Both child and wife are healthy and happy. Everything is good in life. Like many couples, though, my wife and I struggled to find the right name for our child. We didn’t want something too common, or an old-person name, or something so rare and funky that nobody could spell it. We also realized we had a general lack of knowledge about what names were out there. So after much debate and discussion over what to name her, I started doing a bit of analysis using some census data. I want to thank Jamie Dixon for providing the data that he found for use in his Dinner Nerds article. The data itself can be found here. This article will discuss the code used to go through all of the data and provide insights into child names.
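To give a flavor of the kind of analysis involved, here is a minimal F# sketch. It assumes the census name data comes as one "Name,Gender,Count" record per line (the common SSA baby-names layout); the file path, column order, and the idea of ranking girls' names by count are illustrative assumptions, not the article's actual code.

```fsharp
open System.IO

// Parse one "Name,Gender,Count" record; skip anything malformed.
let parseLine (line : string) =
    match line.Split(',') with
    | [| name; gender; count |] -> Some (name, gender, int count)
    | _ -> None

// Rank the n most common girls' names in one file of census data.
let topGirlNames (path : string) (n : int) =
    File.ReadLines(path)
    |> Seq.choose parseLine
    |> Seq.filter (fun (_, gender, _) -> gender = "F")
    |> Seq.sortByDescending (fun (_, _, count) -> count)
    |> Seq.truncate n
    |> Seq.map (fun (name, _, count) -> name, count)
    |> List.ofSeq
```

From a ranking like this you can spot both the "too common" names to avoid and the obscure ones nobody could spell.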
The answers to these questions are pretty much all the same. Step 1: learn about it, and build one piece of software focused on that goal. Step 2: go for it; just do it. That said, Microsoft has a fantastic resource, Microsoft Virtual Academy, which provides free training on various topics from entry level to advanced. This article focuses on a learning plan with MVA to attain the goal of becoming an Analytics Developer.
I have recently been informed that many of my articles may be a bit advanced for folks, so I am going to kick off a series of C# articles dedicated to the absolute beginner to programming. I have no idea how long this series is going to be; I’ll just keep adding to it as requests come in for various topics. This series is meant to take the absolute beginner to a level at which they can derive value from my other articles. Those of you who do Jiu Jitsu with me know you have to shrimp before you can roll, so this is sort of like shrimping.
This should prove to be an interesting series of posts coming up, as I am working on a new project that is very unique and interesting. The idea is to use incoming data from Arduinos, Raspberry Pis, Galileos, Edisons, and an assortment of other IoT devices connected to oil and gas pipelines to determine if a leak is currently in progress, and also to predict whether a leak is likely to occur in the future based on current and trending conditions.
My part in the project is all back-end analytics, and I have very little to do with the actual telemetry and hardware. The telemetry will be posted using Azure Event Hubs, and thus my portion of the project begins with mocking that real-time data at a large enough geo-dispersed scale that I can develop a system to handle it, then switching my configuration to consume from the production event hubs. Since I am no longer a consultant working on projects with trade secrets, and everything these days is about the elevation of skills in the community, I have posted everything on GitHub for you to download and peruse at your leisure. Please note that this is a work in progress, so the GitHub source may not necessarily work when you look. I’ll try to enforce a standard of commenting “working – comment” on pushes to the repository. The git repository is located here: https://github.com/drcrook1/OilGas
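As a rough sketch of what mocking that telemetry might look like with the Microsoft.ServiceBus.Messaging SDK of the time: the connection string, hub name, sensor record shape, and value ranges below are all placeholder assumptions on my part; the real schema lives in the OilGas repository.

```fsharp
open System
open System.Text
open Microsoft.ServiceBus.Messaging

// A hypothetical pipeline sensor reading; the real project's schema will differ.
type SensorReading =
    { SensorId : string
      Psi      : float
      FlowRate : float
      Utc      : DateTime }

let rng = Random()

// Generate a plausible-looking mock reading for one sensor.
let mockReading sensorId =
    { SensorId = sensorId
      Psi      = 800.0 + rng.NextDouble() * 400.0
      FlowRate = 2.0 + rng.NextDouble() * 3.0
      Utc      = DateTime.UtcNow }

// Serialize to JSON by hand (a sketch; a real system would use a serializer)
// and send, partitioned by sensor id so one sensor's events stay ordered.
let send (client : EventHubClient) (r : SensorReading) =
    let json =
        sprintf """{"sensorId":"%s","psi":%f,"flowRate":%f,"utc":"%O"}"""
            r.SensorId r.Psi r.FlowRate r.Utc
    let data = new EventData(Encoding.UTF8.GetBytes(json), PartitionKey = r.SensorId)
    client.Send(data)

let run () =
    // "<connection string>" and the hub name are placeholders.
    let client = EventHubClient.CreateFromConnectionString("<connection string>", "oilgas-telemetry")
    [ "sensor-001"; "sensor-002" ]
    |> List.iter (fun id -> mockReading id |> send client)
```

Partitioning by sensor id matters here: ordering within a partition is what lets a downstream consumer reason about trending conditions per sensor.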
You can see from my blog that I have been increasing my adoption of F# and my love of the language is growing. I find the language very conducive to cloud computing. F# at its core seems to be built for distributed computing. If you program with F# in an idiomatic way, you are setting yourself up for success on the cloud, as the cloud is a variably sized distributed system that you are deploying production code to. That is what F# excels at.
As much as I have loved F#, it has not been all roses and sunshine. There have been issues. This article, at its core, is really an ask of the community to support me in building the case and evangelizing F# in such a way that it can be recognized and adopted not only by the community, but by those who make the decisions to expend resources on additional tooling and support. I must reiterate that the contents of this article are in no way the opinion of Microsoft, but merely my own observations. From these observations I have developed a strategy and action items that we, the F# community, must pursue to achieve these goals.
First, we must step outside of our normal mindset as F# developers and begin looking at the language from the outside perspective. We must also look at some trends.
This article is one of those that is going to help remind me how to do this deployment, as it can be a bit tricky. If you are working with F# for web jobs, as I have started doing, there are a few steps:
Create a new console application
Add the proper NuGet packages
Manually add a .dll reference and copy said .dll to output
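Once the steps above are done, the console application's entry point can be as small as this sketch. It assumes the 1.x Microsoft.Azure.WebJobs SDK was the package added in step 2 (and that the storage connection strings are configured elsewhere); treat it as a starting point, not the full deployment.

```fsharp
module Program

open Microsoft.Azure.WebJobs

[<EntryPoint>]
let main argv =
    // The parameterless JobHost picks up its configuration
    // (storage connection strings, etc.) from app settings.
    let host = new JobHost()
    host.RunAndBlock()   // block so the web job stays alive as a continuous job
    0
```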
So you are going to notice a slight shift in this blog: it will start incorporating not only video game development, but also hardcore data analytics. As part of that shift, I am going to start incorporating F# into my standard set of languages, as it is the language of hardcore data analytics if you roll with the .NET stack.
This particular article is about building a console-based blob manager in F# instead of C#. The very first thing I noticed about using F# to manage my blobs, as opposed to C#, is the sheer reduction in lines of code. The code presented here is a port of the C# article located here. This code will eventually make its way into a production system that is part of a big data solution I am building. New data sets that we acquire will be uploaded into blob storage, with an entry containing a link to the data set stored in a queue. Once a job is prepared to run, the data will be moved to Hadoop for processing and then stored in its final location. So step 1 is: store data in blob storage.
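That upload-then-enqueue flow can be condensed into a short F# sketch using the era's Microsoft.WindowsAzure.Storage client library. The container name, queue name, and function name are my own illustrative choices, not the production system's:

```fsharp
open Microsoft.WindowsAzure.Storage
open Microsoft.WindowsAzure.Storage.Queue

// Upload a local data set to blob storage, then queue a link to it
// so a later job can move it into Hadoop for processing.
let stageDataSet (connectionString : string) (localPath : string) (blobName : string) =
    let account = CloudStorageAccount.Parse(connectionString)

    // Step 1: store the data set in blob storage.
    let container = account.CreateCloudBlobClient().GetContainerReference("datasets")
    container.CreateIfNotExists() |> ignore
    let blob = container.GetBlockBlobReference(blobName)
    use stream = System.IO.File.OpenRead(localPath)
    blob.UploadFromStream(stream)

    // Step 2: queue an entry with a link to the data set.
    let queue = account.CreateCloudQueueClient().GetQueueReference("pending-datasets")
    queue.CreateIfNotExists() |> ignore
    queue.AddMessage(CloudQueueMessage(blob.Uri.ToString()))
```

Note how the whole pipeline stage fits in one function with no ceremony, which is exactly the line-count reduction over the C# version that prompted the port.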
Welcome to Part 2! We will be discussing Binary Classification. So I hope many of you have started using AzureML. If not, you should definitely check it out. Here is the link to the dev center for it. This article series will focus on a few key points.
Understanding the Evaluation of each Model Type.
Understanding the published Web Service of each Model.
If you are looking for a simple how-to to get started, check out this article.
The series will be broken down into three parts.