How to Make Use of Azure Cognitive Services

Azure Cognitive Services is a service available from your Azure portal that allows you to set up accounts enabling the use of Microsoft’s pre-built algorithms and models for common advanced analytics functions such as Text Analytics, recommendation engines, speech and vision recognition, and many more service-oriented APIs. These models have already been trained and are ready for you to consume in your solutions, such as Power BI business intelligence solutions.

There is a very common use case emerging in the Cloud Analytics space that relates specifically to a portion of the Text Analytics function: a number of solutions can take advantage of the Sentiment Analysis capabilities that Azure Cognitive Services provides. By sending a text string to the API service, you can leverage the existing Microsoft sentiment engine to determine whether the message you sent is positive, negative or neutral based upon a score.
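Under the covers this is just a REST call. Below is a minimal C# sketch of that round trip, assuming the same westus v2.0 sentiment endpoint and Ocp-Apim-Subscription-Key header used by the Power Query function later in this post; the document id and text are placeholder values.

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class SentimentDemo
{
    static async Task Main()
    {
        // Same endpoint and subscription-key header as the Power Query function later in this post
        const string endpoint = "https://westus.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment";

        // Text Analytics expects a "documents" array of id/text pairs
        const string body = @"{""documents"": [{""id"": ""1"", ""text"": ""I love this product!""}]}";

        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "your api key here");
            var content = new StringContent(body, Encoding.UTF8, "application/json");
            HttpResponseMessage response = await client.PostAsync(endpoint, content);

            // The response echoes each document id with a score between 0 (negative) and 1 (positive)
            Console.WriteLine(await response.Content.ReadAsStringAsync());
        }
    }
}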

The best way to demonstrate this is with Power BI. I am going to briefly touch on two of the most common social media sources to pull real-time data from and score the messages for sentiment analysis: Twitter & Facebook.

First, the easiest with Power BI (PBI) is Facebook. There is an out-of-the-box connector for Facebook included with PBI, so you can use Get Data to bring data directly into a Power BI model from Facebook very easily:

[Screenshot: Power BI Get Data with the Facebook connector]

Now, to hook into the Microsoft Sentiment Analysis, follow the method described very well on this PBI community page and in the associated PASS Virtual Chapter presentation on this very topic by Gil from DataChant: http://community.powerbi.com/t5/Community-Blog/Sentiment-Analysis-in-Power-BI/ba-p/55898. His full working sample is awesome and you can play around with it here: https://app.powerbi.com/view?r=eyJrIjoiMjU5NWM3MTMtMTg0NS00YzE2LTk0YTYtMmZkZWM4NjYwMmFjIiwidCI6IjIyNzNjNDFiLWI4ZDAtNDVhZi1iZWU2LWUwODQ5NmFlNjcxOCIsImMiOjN9.

To download and use a similar sample in the PBI Desktop, just download the PBIX from my GitHub. I recommend going through Gil’s detailed PBI Community link above; it rocks and gives you the background you need to customize this for yourself, with step-by-step instructions on how this was made.

For the demo above that I uploaded to Github, I included Gil’s Trump & Clinton Facebook election sentiment analysis data on Tab 2 of the report view on PBI. Tab 1 has Microsoft BI Facebook sentiment data where I used the Microsoft BI Facebook page as the data source. Each of those datasets is run through the Cognitive Services API for Sentiment Analysis, part of the Text Analytics suite.

To make it work for you, follow Gil’s instructions on getting an API key from Cognitive Services and plug your API key into the GetSentimentResults function in the Power Query / Edit Queries tool. You can use the free limited service for Sentiment Analysis or go to the Azure Portal and create a full-use Cognitive Services account. The code for calling out to the API and saving the sentiment score in the PBI model is in a Power Query function in the PBIX and looks like this:

// Power Query (M) function: posts a table of documents to the Text Analytics
// Sentiment API and returns the parsed JSON response
(Source as table) as any =>
let
    // Serialize the input table as the JSON "documents" array the API expects
    JsonRecords = Text.FromBinary(Json.FromValue(Source)),
    JsonRequest = "{""documents"": " & JsonRecords & "}",

    JsonContent = Text.ToBinary(JsonRequest, TextEncoding.Ascii),
    Response =
        Web.Contents("https://westus.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment?",
        [
            Headers = [#"Ocp-Apim-Subscription-Key" = "your api key here",
                       #"Content-Type" = "application/json", Accept = "application/json"],
            Content = JsonContent
        ]),
    JsonResponse = Json.Document(Response, 1252)
in
    JsonResponse
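The function takes a table of id/text records (one row per message) and returns the parsed response; the documents list in that response carries a 0-to-1 sentiment score per id, which gets expanded back into a score column on the messages query (Gil’s post walks through that wiring step by step).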

As you can tell from my sample Sentiment Facebook dashboard below, I did not put as much time into the report as Gil did. So take the PBIX and add your own visuals to make it better!

[Screenshot: sample Facebook sentiment dashboard in Power BI]

Now on to the second example, using Twitter data. There is no out-of-the-box connector directly into PBI for live Twitter feeds, so I used the Solution Template for Twitter (Marketing use case) from the Power BI Solution Template site: https://bpsolutiontemplates.com/.

That site provides you with a “wizard” that walks through the set-up of an Azure SQL Database, your Cognitive Services API as I mentioned above (free or the Azure account) and the Twitter handles/keywords that you want to use as a filter. This creates a much more robust solution for grabbing the Twitter data, in that the solution template will spin up Logic Apps, an App Service and API connectors to Twitter and Sentiment Analysis on the back end, instead of calling the APIs directly from Power BI as in the first example with Facebook above.

So, essentially, Power BI will just connect to the Azure SQL DB that you’ve chosen from the template, and that SQL DB will get populated by the Azure app that is built automatically for you. This separates the data from the business logic and from the presentation layer in PBI. The code that calls Twitter and Cognitive Services is all in the “middle tier” on the App Service in Azure, which you can open directly from your Azure Portal. Modify the API calls, strings and sentiment bins in that code:

// Sentiment analysis - Cognitive APIs
// The API returns a score between 0 and 1; rescale it to the -1..1 range
string sentiment = await MakeSentimentRequest(tweet);
sentiment = (double.Parse(sentiment) * 2 - 1).ToString(CultureInfo.InvariantCulture);
// Bucket the rescaled score into 0.1-wide bins for reporting
string sentimentBin = (Math.Floor(double.Parse(sentiment) * 10) / 10).ToString(CultureInfo.InvariantCulture);
string sentimentPosNeg = String.Empty;
if (double.Parse(sentimentBin) > 0)
{
    sentimentPosNeg = "Positive";
}
else if (double.Parse(sentimentBin) < 0)
{
    sentimentPosNeg = "Negative";
}
else
{
    sentimentPosNeg = "Neutral";
}
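To trace the math: a raw API score of 0.8 rescales to 0.8 * 2 - 1 = 0.6, floors into the 0.6 bin and is labeled Positive, while a raw score of 0.5 rescales to exactly 0 and lands in Neutral.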

The out-of-the-box reports from these Power BI templates are quite good with a social graph visualization and a number of report tabs for different pivots of the Twitter data and sentiment scores.

[Screenshots: out-of-the-box report pages from the Twitter solution template]

Azure Marketplace Solutions

Yes, it’s been a while and I’m just now getting back into the swing of things, keeping MS-SQL Dude running after my 2-year hiatus in the Open Source world.

To kick things off, I’ll share with you a series of videos that I put together with the Microsoft Azure Partner team around the database, analytics and big data offerings from other software vendors that are available as solution template images on the Azure Marketplace:

We first started at the fundamental data level with my old friends at DataStax and their Cassandra database product

We then moved up the stack into batch analytics, streaming and distributed execution engines on Hadoop with my friends at Hortonworks

Lastly, we finished up with the end-user data visualization and BI layer from Looker

All of these data solutions are available from the Azure Marketplace link above.

You can also find several of the presentations from this series at my Slideshare site here.

SQL Server 2012 Database Backups in Azure

SQL Server 2012 includes new & updated features in SQL Server Management Studio (SSMS) that provide out-of-the-box integration with the Windows Azure platform-as-a-service cloud from Microsoft. It goes beyond just SQL Azure integration: you can now also connect to Windows Azure storage accounts where you can store files and blobs, including DACPAC & BACPAC files, essentially providing DBAs with out-of-the-box backup-to-cloud capabilities.

From a DBA’s perspective, this can be very beneficial because it allows you to take your SQL Server backups and post them into the Azure cloud, where files are automatically protected and replicated for high availability. It also eliminates the need to maintain backup infrastructure locally on your site. You would utilize Azure Storage for maintenance, retrieval and disaster recovery of your database backups. Here is a link with more details on Azure Storage.

Here are the steps to backup from SSMS 2012 to Windows Azure:

  1. First, note that you will need to sign up for a Windows Azure account and configure a storage container. You can click here for a free trial.
  2. In SSMS, choose a database to back up. But instead of the normal Backup task, select “Export Data-Tier Application”. This walks you through the process of exporting the schema with the data as a “BACPAC” file.
  3. On the next screen in the wizard, select the Azure storage account and container where SQL Server will store the export. Note that it will first back up the database (schema & data) to a local file and then upload it to the cloud for you.
  4. Once the process is complete, you will see your exported backup as a BACPAC in your storage container. To restore a BACPAC, right-click the file in your container and select “Import Data-Tier Application”.

BTW, this process is identical to the way that you export & import databases in SQL Azure. You can also now easily move your databases (schema and/or data) between SQL Server and SQL Azure with these BACPACs. This is not a full-service TLOG and data file backup like SQL Server native backups; it is more of a database export/import mechanism. But that is pretty much as interactive as a DBA will get with SQL Azure anyway, because you do not perform any TLOG maintenance on SQL Azure databases. Microsoft takes care of that maintenance for you.
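If you would rather script the export than click through the wizard, the same operation is exposed programmatically through the DacFx client library. Here is a minimal C# sketch, assuming the Microsoft.SqlServer.Dac assembly that ships with the SQL Server 2012 data tools; the connection string, file path and database name are placeholders, and uploading the resulting file to your storage container is left out.

using Microsoft.SqlServer.Dac;

class BacpacExportDemo
{
    static void Main()
    {
        // Point DacFx at the source SQL Server instance
        var services = new DacServices(@"Data Source=.;Integrated Security=true");

        // Export schema + data to a BACPAC file; this is the same operation
        // that the "Export Data-Tier Application" wizard performs
        services.ExportBacpac(@"C:\backups\AdventureWorks.bacpac", "AdventureWorks");
    }
}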

Nice New SQL Azure Integrations in SQL Server 2012

Have you downloaded the newest RTM (release to manufacturing) version of SQL Server 2012 yet? If not, download it here.

And if you are a big Cloud Computing enthusiast like I am, you will definitely want this latest version of SQL Server. The SQL Server 2012 RC0 is missing some of the newly-released integrations to Azure that I am going to highlight below. These are only available in the new RTM version of SQL Server 2012:

  1. Connect directly to your Azure Storage from the Azure Storage connector in the SSMS Object Explorer. Here you can use Azure blob storage for your database backups, eliminating the need for tape libraries or local backup storage. Azure Storage is automatically redundant, with three copies of your data kept safe in Microsoft data centers.
  2. It is now super-easy to move database schemas from on-premises to the Azure cloud in SQL Azure using the updated Data-Tier Application (DACPAC) capabilities. This function is also much less sensitive to database objects than the first version of DAC in SQL Server 2008 R2, and it adds the ability to deploy your database to SQL Azure with a single click.
  3. SQL Azure has introduced a new built-in programmatic capability to design scale-out OLTP SQL Server solutions using a sharding technology known as Federations. These are now manageable and visible natively in SSMS.

New MSSQLDUDE Videos: SQL Inventory with MAPS and WP7 Cloud BI

If you would like a short 10-minute intro on how to get started using the Microsoft Assessment and Planning Toolkit (MAPS) to inventory, discover and record your SQL Server footprint, versions, utilization, users, licenses, etc., then you can have a look at the video that I just posted on our Microsoft YouTube channel for the SQL Server data warehouse & BI group here in the East Coast U.S. field organization: http://www.youtube.com/watch?v=VdHs8Ee1iZI. BTW, the MAPS tool is a FREE download from Microsoft.com, see it here. This is version 6.5 of the tool, and if you have any version prior to it, I suggest that you upgrade and run another SQL Server assessment on your network. This version of MAPS now has the awesome feature of virtualization discovery, proposals and recommendations built in. This makes MAPS the premier SQL Server tool when consolidating your SQL Server environment.

And if you have a Windows Phone 7 and would like to see what a WP7 Mobile Cloud BI app would look like, take a look at this video that I just posted for our Microsoft field national DW & BI team: http://www.youtube.com/watch?feature=player_embedded&v=P1akSjiZasU#!. This describes the improved V2 version of the WP7 app for Microsoft Cloud BI on a mobile device, which I described in my slide presentation here.

These are some of the BI data visualizations that you get out of the box with this WP7 app on your phone:

SQL Azure Federations

On the latest update to the SQL Azure Database Management Tool, I see that (a) the updated UI looks a lot like the preview of the Project Barcelona SQL Server lineage tool: http://www.sqlmag.com/blog/sql-server-bi-blog-17/business-intelligence/sql-server-denali-dependencies-amp-lineage-140558. And (b) SQL Azure Federations are enabled!

What are SQL Azure Federations? Think about it as the ability to scale out SQL Server databases: http://blogs.msdn.com/b/usisvde/archive/2011/10/18/sql-azure-roadmap-simplifies-scale-out-needs-with-bigger-data-sql-azure-federation.aspx. I know a lot of SQL Server DBAs and developers that have wanted scale-out for SQL Server transactional systems for YEARS. Well, what is nice is that you can now build your own scale-out solutions with this feature built directly into SQL Azure, meant exactly for scale-out.

However, this is not yet available for traditional on-premises SQL Server databases, which still require you to manually partition and shard your database applications; in SQL Azure you can do so using the built-in Federations capability. Check out this Cloud Ninja sample on Codeplex: http://shard.codeplex.com/. And the coding for SQL Azure Federations looks to be quite nice & easy:

-- Connect to the federation, routing to the member that contains this customer id
USE FEDERATION Orders_Federation (CustId = '00000000-0000-0000-0000-000000000000') WITH RESET, FILTERING = OFF
GO

-- Create the Customers table, federated on the customer id
CREATE TABLE Customers(
   CustomerID uniqueidentifier NOT NULL,
   CompanyName nvarchar(50) NOT NULL,
   FirstName nvarchar(50),
   LastName nvarchar(50),
   PRIMARY KEY (CustomerId)
) FEDERATED ON (CustId = CustomerID)
GO
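From application code, that same routing statement runs over a regular ADO.NET connection before your queries. Here is a minimal C# sketch against the Orders_Federation defined above; the connection string is a placeholder, and note that USE FEDERATION takes its key value as a literal, so it is inlined rather than parameterized.

using System;
using System.Data.SqlClient;

class FederationDemo
{
    static void Main()
    {
        Guid custId = Guid.Empty; // federation key value that selects the member

        using (var conn = new SqlConnection(
            "Server=tcp:yourserver.database.windows.net;Database=OrdersDb;User ID=user;Password=pass;"))
        {
            conn.Open();

            // Route this connection to the federation member holding custId;
            // USE FEDERATION expects a literal key value, not a parameter
            var use = new SqlCommand(string.Format(
                "USE FEDERATION Orders_Federation (CustId = '{0}') WITH RESET, FILTERING = OFF",
                custId), conn);
            use.ExecuteNonQuery();

            // Subsequent commands on this connection run against that member only
            var query = new SqlCommand(
                "SELECT FirstName, LastName FROM Customers WHERE CustomerID = @id", conn);
            query.Parameters.AddWithValue("@id", custId);
            using (var reader = query.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine("{0} {1}", reader[0], reader[1]);
            }
        }
    }
}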

And there are screens available to you in the SQL Azure Management Tool to monitor and manage your Federations (which is especially helpful since you won’t see this in SSMS, as on-premises SQL Server does not have Federations):

That’s just my sample. When you actually start using a sharded database application, this GUI becomes very helpful to watch the Federations SPLIT and grow:

Very, very nice. But for scale-out SQL Server (non-Azure), you will need either PDW or sharding OLTP SQL Server databases with distributed partitioned views or other application-layer mechanisms. For now, that is, anyway.

SQL Azure: Scale-Out and Big Data

Perhaps the NoSQL / Big Data trend in high-performance computing, made popular by Hadoop and MapReduce, will end up being the “killer app”, or at least “killer capability”, for cloud databases?

I find it to be an interesting thought because the currently available Microsoft cloud database, SQL Azure, is a complete SQL Server-based transactional database, complete with support for SQL Server tools, T-SQL, ODBC, referential integrity, etc. The current maximum size of a single stand-alone database is 50 GB.

But Microsoft has recently shown a lot of interest in providing support for scaled-out large database workloads, first with SQL Server Parallel Data Warehouse (PDW) and then with the recent announcement of PDW support for Hadoop. Scale-out applications built on traditional SQL Server have been around for some time. The typical mechanisms used to do this are based on partitioning or “sharding” the data to fan out the data and queries across SQL Servers using MS-DTC, replication or Service Broker.

SQL Azure is coming out with a built-in capability to enable this style of “sharded” partitioning called Database Federations. This is a link to a terrific write-up of using these concepts in a Big-Data application, written by Roger Jennings for Visual Studio Magazine.

Note that this capability is not yet available even in CTP (beta) for SQL Azure at the time that I am writing this blog post. I like the fact that these capabilities are being surfaced as built-in T-SQL commands. There will be accompanying ADO.NET library changes with APIs and methods to manipulate the distributed data and to query it appropriately as well.

Very interesting, exciting ways that SQL Azure can be used. Once I get access to the CTPs, I’ll start building out distributed apps using that capability and blog my results here for you. In the meantime, that article link above gives you some code samples to start thinking about your Big Data architectures.