For this portion of the HoL, we will be bypassing the Raspberry Pi portion altogether and going straight to provisioning resources and using a local app to simulate the telemetry data. You can still use a Raspberry Pi as this session was originally intended, but in the interest of time, that step can be skipped and the app below used to simulate data for a live dashboard.
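If you just need something local emitting readings, a simulator can stand in for the device. This is only a rough sketch, not the lab's actual app: the field names (`deviceId`, `temperature`), the value range, and the one-second cadence are all assumptions you would swap for whatever your dashboard expects.

```python
import json
import random
from datetime import datetime, timezone

def make_reading(device_id="simulated-pi"):
    """Build one fake telemetry message (field names are illustrative)."""
    return {
        "deviceId": device_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "temperature": round(random.uniform(20.0, 30.0), 2),
    }

def stream_readings(count):
    """Yield `count` simulated readings."""
    for _ in range(count):
        yield make_reading()

# Example: print a few readings as JSON lines, ready to post to a dashboard.
for reading in stream_readings(3):
    print(json.dumps(reading))
```

In the real lab you would add a short `time.sleep()` between readings and send each JSON payload to your provisioned endpoint instead of printing it.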
This article is for a hands-on lab in IoT that I am running. You will find full documentation at the link provided. Also, you may run into a scenario in which the MCP3008 code does not work. If so, you are likely using an older version of the NuGet package that contains it, or the new version has not been pushed yet. You can find the code for the MCP3008 below.
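For reference while you wait on the package, the MCP3008's SPI exchange itself is simple enough to sketch. The command/decode helpers below follow the chip's datasheet framing (start bit, single-ended flag plus channel, 10-bit result); the `spidev` wiring shown in the trailing comment is an assumption about a typical Pi setup and is not exercised here.

```python
def mcp3008_command(channel):
    """Build the 3-byte SPI frame for a single-ended read of `channel` (0-7)."""
    if not 0 <= channel <= 7:
        raise ValueError("channel must be 0-7")
    # Byte 1: start bit. Byte 2: single-ended flag (bit 3) plus the channel,
    # shifted into the top nibble. Byte 3: don't-care clocks for the reply.
    return [0x01, (0x08 | channel) << 4, 0x00]

def mcp3008_decode(response):
    """Extract the 10-bit reading from the 3 bytes clocked back by the chip."""
    return ((response[1] & 0x03) << 8) | response[2]

# On a Pi with SPI enabled, these plug into spidev roughly like this:
#   import spidev
#   spi = spidev.SpiDev(); spi.open(0, 0); spi.max_speed_hz = 1_350_000
#   value = mcp3008_decode(spi.xfer2(mcp3008_command(0)))  # 0..1023
```

Splitting the framing out of the hardware call also makes the protocol logic testable without a Pi on your desk.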
Here is the link: https://onedrive.live.com/redir?resid=BA8DC4B28555902A!3406&authkey=!AGawE2hfolHvC8s&ithint=file%2cpdf
These days I need to make videos instead of written articles, so I am going to post a few of those here.
In this video we will do an initial exploratory analysis on a water-flow data set that came from a prototype that I built. The prototype consists of a water pump, a valve, and a flow meter. The data set lives in SQL Azure. We will use R and RStudio on an Azure virtual machine to perform the analysis.
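The video itself works in R, but the shape of that first exploratory pass translates to any environment. Here is a rough Python/pandas equivalent; the column names and the tiny inline sample are invented stand-ins for the SQL Azure table.

```python
import pandas as pd

# Invented stand-in for the water-flow table pulled from SQL Azure.
df = pd.DataFrame({
    "valve_open": [True, True, True, False, False, False],
    "flow_lpm":   [12.1, 11.8, 12.4, 0.2, 0.0, 0.1],
})

# First-pass exploration: how does flow behave with the valve open vs. closed?
summary = df.groupby("valve_open")["flow_lpm"].agg(["mean", "std", "min", "max"])
print(summary)
```

With the real data you would point pandas (or R) at the SQL Azure connection instead of an inline frame, but the grouping-and-summarizing step is the same.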
Today is a freaking cool day. Why, do you ask? Because today I am writing an article on how to use two of the coolest freaking big data/data science tools out there together to do epic shit! Let's start with HBase. HBase gives you a big data solution with query performance at an interactive level, so many folks are starting to simply dump data into HBase. In the Project Teddy solution, we are dumping tweets, dialogue, and dialogue annotations to power our open-domain conversational API. For us, there really is no easier way to do this.
The second part of Project Teddy is to predict, based on an incoming conversational component, what sort of response the speaker is attempting to elicit from the teddy bear. Powering our teddy bear with predictive analytics and big data would be perfect, and what better platform to do this quickly and easily than AzureML?
As Azure becomes more and more popular and I encounter more startups, I find myself walking through this tutorial and explaining it all the time. Therefore, I have decided to write a blog article, with pictures, to make my life (and yours) easier.
What is BizSpark?
BizSpark is the best thing since sliced bread for a startup. It is literally every development tool and license Microsoft has to offer, free for commercial purposes for 3 years. Not only that, but you also get $150/month (as of this writing) in Azure for 3 years. As if it couldn't get any better, you get access to reduced pricing on various products from Microsoft partners, and with Microsoft being such a giant of a company, there are a TON of partners you get special pricing from. BizSpark also includes product licensing: plain Windows licenses, Visual Studio licenses, and even SQL Server licenses. It's everything! Usually at this point I get the question: so what's the catch? There is no catch! Microsoft wants you to use its tools and be successful with them, so that when you become a giant company you are using its tools and not a competitor's. Therefore Microsoft gives these tools to high-potential startups for free! If you think you qualify, apply or come to an event I attend (usually found on the events tab). You can also ping me on Twitter @DavidCrook1988.
Many folks may know that the South Florida evangelism team is undertaking a task that many think is impossible. Well, in that statement all I hear is "there is still a chance!" The end goal is to create a teddy bear that can have a conversation about anything. So step one is to collect as much dialogue as possible from as many sources as possible and annotate it. What better place to power an association engine for word and phrase relevance than something that forces you down to 140 characters to get your message across?
So, like any normal developer, I decided to start by looking for samples already out there. MSDN has a great starter for writing tweets and doing sentiment analysis with HBase and C#. The only issue with the sample is that it is very poorly written and difficult to understand, with no separation of concerns. So I want to go through simplifying the solution and separating a few concerns out.
As many of you may know at this point, I am relocating to South Florida. The final location is to be determined, but I will probably be renting around Pompano Beach or Fort Lauderdale while working out of Venture Hive and the Microsoft Fort Lauderdale offices. So what does this have to do with Zillow? Well, it has EVERYTHING to do with Zillow. What I've found while searching for homes is that between realtors, Zillow, and Trulia, there really just isn't a predictive analytics solution that works for me. So I decided to give AzureML a shot at mashing together a few datasets to send me notifications more to my liking than what is currently being sent. So step 1 in this plan is to data-mine Zillow. Luckily, Zillow has an API for that. Or, if you are feeling particularly frisky, Zillow gets its data from ArcGIS (example for Raleigh). So let's get cracking…
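As a taste of that mining step, the request plumbing can be sketched like so. The `GetSearchResults` endpoint and its `zws-id`/`address`/`citystatezip` parameters come from Zillow's published web-services docs of the time; treat the exact names as assumptions against the current API, and note that nothing here actually makes a network call.

```python
from urllib.parse import urlencode

ZWS_BASE = "https://www.zillow.com/webservice/GetSearchResults.htm"

def search_results_url(zws_id, address, citystatezip):
    """Compose a GetSearchResults request URL (parameter names per the old ZWS docs)."""
    params = {"zws-id": zws_id, "address": address, "citystatezip": citystatezip}
    return ZWS_BASE + "?" + urlencode(params)

# Example (DEMO-KEY is a placeholder, not a real key):
url = search_results_url("DEMO-KEY", "123 Main St", "Pompano Beach, FL")
print(url)
```

From there it is one `GET` per candidate address, parse the XML response, and land the fields you care about in your own store for AzureML to chew on.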
So I thought it would be beneficial to discuss Angular, Web API, and Azure in some depth, as well as provide an entire set of functioning code. I will start by addressing a few questions: what is Angular, what is Web API, and what is Azure? That will be followed by the code and an explanation of it. The code itself provides a simple website with RESTful routing, which accepts requests for processing and lists out data from a database populated by said processing.
The answer to these questions is pretty much always the same. Step 1: learn about it and build one piece of software focused on that goal. Step 2: go for it; just do it. That said, Microsoft has a fantastic resource, Microsoft Virtual Academy, which provides free training on various topics from entry level to advanced. This article lays out a learning plan with MVA to attain the goal of becoming an analytics developer.