Introducing Azure IoT Hub device management

April 29, 2016, by jason

Azure IoT Hub device management was recently announced at //build, and I'm very excited to follow up with the details now that it is available in preview. These standards-based device management features in Azure IoT Hub enable IoT solutions to organize, query, configure, and update devices in Azure-based IoT solutions.

Developers can now extend their Azure IoT Hub-based solutions using the following device management features:

Device twin – The device twin is the representation of your physical device in your Azure based IoT solution. The device twin contains properties that describe the status of the device such as firmware version, battery level, and connected state. You can use tags to organize your set of device twins, which can later be found through device queries. The device twin is accessed by the existing device registry management API.

Device queries – Device queries enable you to find device twins and generate aggregate views of device twins. For example, you can query for devices that have firmware version ‘1.2’, or for devices tagged as being part of building 43. A device query is structured as a JSON object that can provide a filter, a projection, an aggregate, and a list of sort expressions.
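As an illustration of the shape such a query takes, the following JSON sketches a query for devices running firmware version ‘1.2’. The field names here (filter, project, aggregate, sort, and the comparison structure inside the filter) are illustrative of the preview's query document and should be checked against the device management documentation linked below:

```json
{
  "filter": {
    "type": "comparison",
    "property": { "name": "FirmwareVersion", "type": "service" },
    "comparisonOperator": "eq",
    "value": "1.2"
  },
  "project": null,
  "aggregate": null,
  "sort": []
}
```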

Device jobs – Device jobs enable you to run actions on devices such as firmware update, reboot, or factory reset. For example, the firmware update device job coordinates multiple devices as they download the firmware package, apply the firmware, reboot, and finally reconnect to IoT Hub using the new firmware. The device job ensures the device follows the necessary steps independent of its connection state or constraints in the network used by the device. On the device side, the device takes action based on simple requests and notifies the service upon completion.

These new capabilities are now available in preview as part of the Azure IoT SDK. We plan to support the full breadth of languages in the Azure IoT service SDK, starting with support for C# and Node.js (through an underlying HTTP REST endpoint).

We have also expanded the open source Azure IoT device SDK to include a device management library that enables you to make your devices manageable, using the LWM2M protocol, with Azure IoT Hub. We plan to support the breadth of languages for the Azure IoT device SDK, starting with support for ANSI C. Through support for ANSI C, a large number of OS-based (including Windows and Linux) and microcontroller firmware-based devices can use the client library.

Device management is also a key part of the Azure IoT Gateway SDK, as IoT gateways and the devices behind them can be managed through Azure IoT Hub using these features.

To learn more and get started using the device management features of Azure IoT Hub and the Azure IoT SDK, please visit the following locations:

  1. Azure IoT Hub device management documentation
  2. Device management service-side sample code on GitHub
  3. Device management device-side sample code on GitHub

We’re very excited to learn more about how you will use these new device management features with your devices and cloud-based IoT solution. We’ve been working with a number of developers who have provided great feedback. We are currently expanding the capabilities of Azure IoT Hub to include device groups and custom models. We would appreciate more of your feedback as we work on the next set of exciting device management features. Please join us on the Azure IoT Suite forum or on the Yammer IoT Advisors group.

Source: Microsoft Azure News

Introducing the Azure IoT Gateway SDK Beta

April 29, 2016, by jason

We recently announced the Azure IoT Gateway SDK beta, which enables developers and ISVs to easily build and deploy gateway intelligence tailored to their specific scenario. Our SDK achieves this by providing code that reduces the necessary work required for the development of gateway logic. Any reduction in time to market is a huge advantage in an industry as competitive as IoT.

The Azure IoT Gateway SDK does not place restrictions on the type of gateway solution that can be created. Our IoT philosophy is to support the many different types of operating systems and hardware developers use, so we’ve built the gateway with ANSI C on top of a platform abstraction layer, providing a high performance gateway framework with a portable and cross-platform library that runs on both Windows and Linux. We’ve also integrated standards like LWM2M for device management and posted the entire gateway codebase on GitHub as an open source project as part of our ongoing support for open source SDKs. In the Azure IoT ecosystem, developers are free to extend and contribute functionality as needed.

Today a developer can use the Azure IoT Gateway SDK beta to build a gateway that solves many of the canonical gateway scenarios we will explore below: supporting legacy devices, running edge analytics, minimizing latency, conserving network bandwidth, supporting reliable operations, and addressing security concerns. These solutions are a combination of developer-written logic and SDK code. At launch, the Azure IoT Gateway SDK beta contains:

  • A framework for creating extensible gateway solutions
  • Code that sends data from multiple devices over a gateway-to-cloud connection
  • An example of how to connect a non-TCP/IP-enabled device to the cloud via protocol translation on the gateway
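To make the framework-plus-protocol-translation idea concrete, here is a toy sketch (in Python, purely illustrative; the real SDK is ANSI C and its module API differs) of the module/broker pattern a gateway uses: modules exchange messages over a shared broker, and a translation module turns a raw non-TCP/IP sensor frame into a cloud-ready message:

```python
class Broker:
    """Delivers each published message to every attached module except the sender."""
    def __init__(self):
        self.modules = []

    def attach(self, module):
        self.modules.append(module)

    def publish(self, source, message):
        for m in self.modules:
            if m is not source:
                m.receive(self, message)

class ProtocolTranslator:
    """Pretend BLE-to-cloud translation: wrap a raw frame as a structured message."""
    def receive(self, broker, message):
        pass  # this module only produces messages

    def on_sensor_frame(self, broker, raw):
        broker.publish(self, {"deviceId": "sensor-1", "payload": raw.hex()})

class CloudUplink:
    """Stands in for the module that forwards messages to IoT Hub."""
    def __init__(self):
        self.sent = []

    def receive(self, broker, message):
        self.sent.append(message)

broker = Broker()
translator = ProtocolTranslator()
uplink = CloudUplink()
broker.attach(translator)
broker.attach(uplink)
translator.on_sensor_frame(broker, b"\x01\x02")  # uplink now holds one translated message
```

The design point this mirrors is that the gateway framework, not your module, owns message routing, so adding a new analytics or logging module does not change the translator.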

As we continue development, we plan to implement additional gateway features which:

  • Enable developers to use high level languages (e.g. Node, Java, C#) when creating their custom gateway solution
  • Increase the productivity of developers using the SDK
  • Make more complex scenarios like cloud management of gateways a reality

Why IoT gateway devices are important

While many developers are already benefitting from connecting IoT devices directly to the cloud, some scenarios require edge intelligence to get the most out of an IoT solution. Buildings, for example, often have sensors that are not TCP/IP-enabled and cannot connect to the cloud on their own. The Azure IoT Gateway SDK, available in beta today, eases the development process for any solution where on-premise computation is necessary to augment functionality provided by cloud services. Canonical examples of IoT gateway scenarios include:

Support for legacy devices: There are millions of sensors and actuators already deployed that are not capable of sending data directly to the cloud. These devices either use a protocol that is not suited for the internet, do not implement encryption, or can’t store identity certificates. Using an IoT gateway reduces the burden and cost of connecting these devices.

Run edge analytics: There is a plethora of operations that can be done locally to ease cloud communication. Examples include data filtering, batching, and compression. Additionally, some computations, like data cleansing or scoring a machine learning model with real-time data, may be better suited to being done on premises.
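The filtering-and-batching idea above can be sketched in a few lines. This is an illustrative snippet, not SDK code; the threshold, batch size, and reading format are arbitrary:

```python
def filter_and_batch(readings, threshold=0.5, batch_size=3):
    """Drop readings below the threshold, then group survivors into batches."""
    filtered = [r for r in readings if r["value"] >= threshold]
    return [filtered[i:i + batch_size] for i in range(0, len(filtered), batch_size)]

readings = [{"id": n, "value": v} for n, v in enumerate([0.1, 0.9, 0.7, 0.2, 0.8, 0.6])]
batches = filter_and_batch(readings)
# Each batch would then be compressed and sent to the cloud as a single message,
# trading a little latency for far fewer (and smaller) uploads.
```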

Minimize latency: Milliseconds matter when you are trying to prevent manufacturing line shutdowns or restore electrical service. Analyzing data close to the device that collected the data can make the difference between averting disaster and a cascading system failure.

Conserve network bandwidth: A typical offshore oil platform generates between 1TB and 2TB of data each day. A Boeing 787 creates half a terabyte of data per flight. It is not practical to transport vast amounts of data from thousands or hundreds of thousands of edge devices to the cloud. Nor is it necessary, because many critical analyses do not require cloud-scale processing and storage.

Operate reliably: IoT data is increasingly used for decisions affecting citizen safety and critical infrastructure. The integrity and availability of the infrastructure and data cannot be compromised by intermittent cloud connections. Capabilities like store and forward, collecting and acting upon data locally and then sending it up to the cloud when appropriate, help developers create reliable solutions.
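The store-and-forward capability mentioned above can be sketched as a small buffer that drains when connectivity returns. This is a minimal illustration, not the SDK's implementation; the `send` callable and message strings are placeholders:

```python
from collections import deque

class StoreAndForward:
    """Buffer messages locally while offline; flush the backlog in order when the link is up."""
    def __init__(self, send):
        self._send = send          # callable that uploads one message to the cloud
        self._buffer = deque()

    def publish(self, msg, connected):
        if connected:
            self.flush()           # drain older messages first to preserve ordering
            self._send(msg)
        else:
            self._buffer.append(msg)

    def flush(self):
        while self._buffer:
            self._send(self._buffer.popleft())

sent = []
sf = StoreAndForward(sent.append)
sf.publish("t1", connected=False)  # link down: buffered locally
sf.publish("t2", connected=False)
sf.publish("t3", connected=True)   # link restored: backlog drains, then t3 goes out
```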

Address security concerns: IoT devices, and the data they produce, need to be protected at a number of levels. Gateways can add a layer of security by isolating devices from the open internet, provide encryption and identity for devices that are not capable of doing it themselves, secure data that is locally buffered or stored, and much more.

Get started today

Creating a gateway solution that solves your specific end-to-end scenario, runs on the operating system of your choice, and is powered by the exact configuration of hardware demanded by your environment is only half the battle. Deploying your gateway solution, configuring it, monitoring it, and maintaining it can be just as much work, if not more work, than creating it in the first place.

For instance, take the example of creating a gateway solution to be run on an oil rig in the middle of the ocean. Writing the logic to talk to the correct machines in a protocol they understand, doing some analysis on the data, and sending it to the cloud is a lot of work. That logic, and the hardware on which it runs, must be shipped out to the oil rig. Ideally it would work like a charm from the first time it’s booted up, but that doesn’t happen in the real world. The software must be tweaked during initial deployment and probably updated as time goes on. In the best case, this maintenance happens remotely via SSH and copying files down to the gateway hardware. In the worst case, a human must be sent out to modify the software as necessary.

Imagine the SDK providing a cloud interface where you can configure the software that will power a gateway, test it before sending it to the physical machine, deploy it from the cloud, monitor the health of the software, and even update it through the same cloud portal.

The Azure IoT Gateway SDK has been tested during a private preview and enters public Beta today. We invite you to go check it out on GitHub. Inside the repo you will find documentation to get you started with the SDK, and samples to get your own gateway up and running in no time. We look forward to hearing what you think about the SDK: how does it work for you, what are the things you would like to see added to the SDK? Please join us on the Azure IoT Suite forum or on the Yammer IoT Advisors group.

Source: Microsoft Azure News

Introducing Azure Cool Blob Storage

April 28, 2016, by jason

Data in the cloud is growing at an exponential pace, and we have been working on ways to help you manage the cost of storing this data. An important aspect of managing storage costs is tiering your data based on attributes like frequency of access, retention period, etc. A common tier of customer data is cool data which is infrequently accessed but requires similar latency and performance to hot data.

Today, we are excited to announce the general availability of Cool Blob Storage – low cost storage for cool object data. Example use cases for cool storage include backups, media content, scientific data, compliance and archival data. In general, any data which lives for a longer period of time and is accessed less than once a month is a perfect candidate for cool storage.

With the new Blob storage accounts, you will be able to choose between Hot and Cool access tiers to store object data based on its access pattern. Capabilities of Blob storage accounts include:

  • Cost effective: You can now store your less frequently accessed data in the Cool access tier at a low storage cost (as low as $0.01 per GB in some regions), and your more frequently accessed data in the Hot access tier at a lower access cost. For more details on regional pricing, see​ Azure Storage Pricing.
  • Compatibility: We have designed Blob storage accounts to be 100% API compatible with our existing Blob storage offering which allows you to make use of the new storage accounts in existing applications seamlessly.
  • Performance: Data in both access tiers has a similar performance profile in terms of latency and throughput.
  • Availability: The Hot access tier guarantees high availability of 99.9% while the Cool access tier offers a slightly lower availability of 99%. With the RA-GRS redundancy option, we provide a higher read SLA of 99.99% for the Hot access tier and 99.9% for the Cool access tier.
  • Durability: Both access tiers provide the same high durability that you have come to expect from Azure Storage and the same data replication options that you use today.
  • Scalability and Security: Blob storage accounts provide the same scalability and security features as our existing offering.
  • Global reach: Blob storage accounts are available for use starting today in most Azure regions with additional regions coming soon. You can find the updated list of available regions on the Azure Services by Regions page.

For more details on how to start using this feature, please see our getting started documentation.
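As a sketch of how such an account might be declared, here is an abbreviated Azure Resource Manager template resource for a Blob storage account with the Cool access tier. The account name, region, SKU, and API version below are placeholders and should be checked against the current template reference:

```json
{
  "type": "Microsoft.Storage/storageAccounts",
  "apiVersion": "2016-05-01",
  "name": "mycoolblobaccount",
  "location": "westus",
  "kind": "BlobStorage",
  "sku": { "name": "Standard_LRS" },
  "properties": { "accessTier": "Cool" }
}
```

Because the tier is a property of the account, existing Blob APIs continue to work unchanged against it, which is what makes the compatibility point above possible.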

Many of you use Azure Storage via partner solutions as part of your existing data infrastructure. Here are updates from some of our partners on their support for Cool storage:

  • Commvault: Commvault’s Windows/Azure Centric “Commvault Integrated Solutions Portfolio” software solution enables a single solution for enterprise data management. Commvault’s native support for Azure has been a key benefit for customers considering a move to Azure and Commvault remains committed to continuing our integration and compatibility efforts with Microsoft, befitting a close relationship between the companies that has existed for over 17 years. With this new Cool Storage offering, Microsoft again makes significant enhancements to their Azure offering and we expect that this service will be an important driver of new opportunities for both Commvault and Microsoft.
  • Veritas: Market leader Veritas NetBackup™ protects enterprise data on a global scale in both management and performance – for any workload, on any storage device, located anywhere. The proven global enterprise capabilities in NetBackup converge on- and off-premises data protection with scalable, cloud-ready solutions to cover any use case. In concert with the Microsoft announcement of Azure Cool storage, Veritas is announcing beta availability of the integrated Azure Cloud Connector in NetBackup 8.0 Beta, which enables customers to test and experience the ease of use, manageability, and performance of leveraging Azure Storage as a key component of their enterprise hybrid cloud data protection strategy. Click here to go to the NetBackup 8.0 Beta registration and download website.
  • SoftNAS: SoftNAS™ will soon support Azure Cool storage. SoftNAS Cloud® NAS customers will get a virtually bottomless storage pool for applications and workloads that need standard file protocols like NFS, CIFS/SMB, and iSCSI. By summer 2016, customers can leverage SoftNAS Cloud NAS with Azure Cool storage as an economical alternative to increasing storage costs. SoftNAS helps customers make the cloud move without changing applications while providing enterprise-class NAS features like de-duplication, compression, directory integration, encryption, snapshotting, and much more. The SoftNAS StorageCenter™ console will provide a central means to choose the optimal file storage location, ranging from hot (block-backed) to cool (Blob object-backed), and enables content movement to where it makes sense over the data lifecycle.
  • Cohesity: Cohesity delivers the world’s first hyper-converged storage system for enterprise data.  Cohesity consolidates fragmented, inefficient islands of secondary storage into an infinitely expandable and limitless storage platform that can run both on-premises and in the public cloud.  Designed with the latest web-scale distributed systems technology, Cohesity radically simplifies existing backup, file shares, object, and dev/test storage silos by creating a unified, instantly-accessible storage pool.  The Cohesity platform seamlessly interoperates with Azure Cool storage for three primary use cases:  long-term data retention and archival, tiering of infrequently-accessed data into the cloud, and replication to provide disaster recovery. Azure Cool storage can be easily registered and assigned via Cohesity’s policy-based administration portal to any data protection workload running on the Cohesity platform.
  • CloudBerry Lab: CloudBerry Backup for Microsoft Azure is designed to automate data backup to Microsoft Azure cloud storage. It is capable of compressing and encrypting the data with a user-controlled password before the data leaves the computer. It then securely transfers it to the cloud either on schedule or in real time. CloudBerry Backup also comes with file-system and image-based backup, SQL Server and MS Exchange support, as well as flexible retention policies and incremental backup. CloudBerry Backup now supports Azure Blob storage accounts for storing backup data.

The list of partners integrating with cool storage will continue to grow in the coming months.

As always, we look forward to your feedback and suggestions.


The Azure Storage Team.

Source: Microsoft Azure News

Azure Stream Analytics is now available in Germany

April 28, 2016, by jason

This week we find ourselves in Hannover, Germany for this year’s edition of Hannover Messe, one of the industry’s premier conferences for industrial technology since 1961. This conference highlights innovative solutions in the Internet of Things (IoT) space. At Hannover Messe, attendees receive a complete picture of the industrial value-add chain, from providers to partners and more, under one roof.

From this hive of industrial activity and innovation, we are happy to announce the expanded regional availability of Azure Stream Analytics via the Microsoft Azure Deutschland cloud, a Microsoft cloud operated with T-Systems as German data trustee. Select customers can now create Stream Analytics jobs in-region while remaining compliant with local German regulations. Azure Stream Analytics is now available in every region where Azure is available.

Azure Stream Analytics is a fully managed cloud solution in Azure that lets you rapidly develop and deploy low-cost solutions to gain real-time insights from devices, sensors, infrastructure, and applications. Stream Analytics is part of the Azure IoT Suite, which brings IoT to life and lets you easily connect your devices, analyze previously-untapped data, and integrate business systems.

This is why Stream Analytics and Hannover Messe go together like, well, like the internet and your connected things, and why we are so excited to be in Europe for this conference as we expand the availability of Stream Analytics into another key region. Stream Analytics is already helping Sweden-based customer Aerocrine deliver more reliable devices for asthma sufferers worldwide, to diagnose and treat their disease effectively. Read the full customer story here and see how you too can leverage the power of the Azure cloud to unlock real-time insights from your IoT solutions.

We will continue to invest in our regional expansion plans to ensure we are where you need us to be, giving you the flexibility and choice you need when building your innovative modern big data applications.

As always, we welcome you to provide feedback on our User Voice page. This is where you, our valuable customers, get to let us know what is important to you and what we should be working on next.

If you are new to either Microsoft Azure or Stream Analytics, we invite you to try it out by signing up for a free Azure trial account and create your first Stream Analytics job.

If you need any help or have questions, please reach out to us through the MSDN or Stackoverflow forums, or reach out to the product team directly by email.

To stay up-to-date on the latest news and features follow @AzureStreaming on Twitter.

Source: Microsoft Azure News

Azure DevTest Labs now available in six more regions

April 27, 2016, by jason

We are delighted to announce Azure DevTest Labs is now live in all 15 public regions that support the required Azure resources used by the Labs, including Australia, Brazil and more regions in the United States, Europe and Asia. Customers now have the option to create a DevTest lab in any of the below locations.

  • NEW – Brazil South
  • NEW – Australia East
  • NEW – Australia Southeast
  • NEW – North Central US
  • NEW – North Europe
  • NEW – Southeast Asia
  • Central US
  • East US
  • East US 2
  • South Central US
  • West US
  • West Europe
  • East Asia
  • Japan East
  • Japan West

Azure DevTest Labs is a service that helps developers and testers quickly create environments in Azure while minimizing waste and controlling cost. You can test the latest version of your application by quickly provisioning Windows and Linux environments using reusable templates and artifacts. Easily integrate your deployment pipeline with DevTest Labs to provision on-demand environments. Scale up your load testing by provisioning multiple test agents, and create pre-provisioned environments for training and demos.

For more information, please visit the DevTest Labs page to see the features, intro videos and documentation or to learn how to get started. Check out Azure DevTest Team blog, where we share our thoughts behind new features. If you have any questions or need assistance, just go to our MSDN Forum. If you have any suggestions or feature requests, please add them to our Feedback Forum or vote for one already there.


Source: Microsoft Azure News

Azure Services SSL/TLS cipher suite update and removal of RC4

April 27, 2016, by jason

In the blog post, Protecting customer data from government snooping, Brad Smith, general counsel and executive vice president of legal and corporate affairs at Microsoft, announced Microsoft’s commitment to increase the security of our customers’ data.

We continue to execute on that commitment by announcing additional enhancements to encryption-in-transit security. To date, this has included use of best-in-class, industry-standard cryptography, including Perfect Forward Secrecy (PFS), 2048-bit key lengths, and updates to operating system cipher suite settings.

In addition to these cryptographic changes, the default Transport Layer Security (TLS)/Secure Socket Layer (SSL) cipher suite configuration has been enhanced and includes changes such as removal of SSLv3 support and mitigation of issues such as POODLE.

Today, we are announcing the removal of RC4 from the supported list of negotiable ciphers on our service endpoints in Microsoft Azure. This encryption work builds on the existing protection already extant in many of our products and services, such as Microsoft Office 365, Skype and OneDrive.
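Clients can mirror this server-side hardening. As a small illustration (using Python's standard `ssl` module, not an Azure SDK), an application can build a TLS context whose cipher list explicitly excludes RC4, so a handshake can never negotiate it even against an older endpoint:

```python
import ssl

# Start from the library's secure defaults, then explicitly rule out RC4.
ctx = ssl.create_default_context()
ctx.set_ciphers("DEFAULT:!RC4")  # OpenSSL cipher-string syntax: defaults minus RC4

# The cipher list the client would offer in a handshake now contains no RC4 suite.
offered = [c["name"] for c in ctx.get_ciphers()]
```

Modern OpenSSL builds already exclude RC4 from their defaults; the explicit `!RC4` simply makes the policy visible and robust against older builds.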

In addition, the TLS/SSL cipher suite enhancements are being made available to customers, by default, in the May 2016 Azure Guest OS release for Cloud Services. Learn more about Azure Guest OS releases here.

For more detailed information regarding these security enhancements, please see the article Updates to Microsoft Azure SSL/TLS Cipher Suites.

Source: Microsoft Azure News

Announcing Azure support upgrade for enterprises

April 26, 2016, by jason

We are excited to announce an Azure Support upgrade for our most committed enterprise customers. From May 1, 2016, through June 30, 2017, eligible Microsoft Azure customers will receive a one-year support upgrade. This upgrade is designed to give you an additional level of support on your journey to the cloud, and it comes at no charge.

Eligible customers who don’t choose to purchase support will automatically get Azure Standard support which offers unlimited technical assistance for subscription management and break-fix issues. Customers who purchase Azure Standard support will be upgraded to Professional Direct. Professional Direct support offers faster initial response and escalation management for high priority issues, proactive monitoring of business critical support issues and account management. And if you purchase Professional Direct support, you get even more. You can learn more about the support upgrade here.

The Azure team is committed to helping customers achieve more with the power of the cloud. With an unparalleled end-to-end enterprise cloud experience, from support, to value, to the most comprehensive offerings spanning SaaS, IaaS, PaaS, on-premises datacenters and hybrid environments, we remain laser focused on meeting your needs.

Frequently asked questions

Q. Who is eligible for the support upgrade?

A. Customers who have or buy Azure Services on an Enterprise Agreement (EA) from May 1, 2016, through June 30, 2017.

Q. How do I get this upgrade?

A. No action is needed. Microsoft will enable support for all eligible customers.  

Q. When do I get this upgrade?

A. New, eligible customers will be entitled to the upgraded support experience once they activate their Azure subscription. Existing eligible customers will be upgraded before September 2016.

Q. Where can I get more information?

A. Please look here for more information about the upgrade. 

Source: Microsoft Azure News

Migrate your Application Insights mobile app to HockeyApp today

April 25, 2016, by jason

Early in March we announced we are focusing our efforts for mobile and desktop customers on HockeyApp. HockeyApp is a key component of our Mobile DevOps solution. You can now migrate your application from Application Insights to HockeyApp via a single click.

Transition to HockeyApp is now live

We are happy to announce that as of today you can transfer your application from Application Insights to HockeyApp with just a single click. Simply log into your app and notice the transition status bar at the top. A single button click will transfer your existing application over to HockeyApp. There is no need to change any SDKs or your workflow (your Application Insights experience will not be affected until it reaches its sunset date on June 15th).

Transfer to HockeyApp Status Bar

To best support our existing customers in this transition we are also happy to announce that applications that are moved from Application Insights to HockeyApp will be supported on HockeyApp for free. Apps that are moved during this time period via the transfer will not count against your HockeyApp subscription app limits.

How does it work?

The app will be transferred into a HockeyApp account tied directly to your same Azure account credentials (MSA or AAD). If you don’t have a HockeyApp account, one will be created automatically for you. We will then automatically provision a new application and link it to your existing Application Insights app. This will enable data to flow to both experiences. In addition, your HockeyApp app will have access to historical data that has been sent to your Application Insights resource. We hope that this seamless experience will aid in the transition until Application Insights reaches its sunset date on June 15. For more details on the transition, please refer to our knowledge base documentation.


We hope that this transition experience and the free app transfer will help to mitigate your transition costs. Our goal is to provide you the best mobile DevOps experience to distribute, test, monitor and manage your apps via HockeyApp. If you have any trouble during this transition process, please feel free to contact us.

Source: Microsoft Azure News

Azure SQL Database now supports powerful geo-replication features for all service tiers

April 25, 2016, by jason

Built-in active geo-replication has been generally available for Premium databases for over a year. As a growing number of SaaS customers adopt the multi-tenant design pattern, including the use of elastic pools, we have received feedback that a consistent geo-replication capability, independent of database type, would benefit these applications. Today, we are happy to announce that we have extended active geo-replication to all service tiers. The capabilities now available to all databases include:

  • Ability to create the secondary database in any region in the world
  • Ability to connect to the secondary database with read-only access
  • Ability to create up to four parallel secondary databases in the same or different region
  • Ability to configure performance level of the secondary database
  • When failover is activated to one of the secondary databases, other secondaries are automatically linked to the new primary

The secondary database is priced at 1x the price of the performance level you choose. For example, if your application uses a P2 primary database, your secondary databases can be configured with the P1 performance level, and you will be charged the P1 price for each secondary. Choosing a lower performance level for the secondary is not recommended for applications with high database write activity, as it may result in increased replication lag and therefore a higher risk of data loss after failover. Note that the secondary database must be in the same service tier as the primary; since S0 and P1 are the lowest performance levels in their tiers, if the primary uses S0 or P1 you cannot configure a secondary with a lower performance level.

To take advantage of this additional functionality, you can create one or several readable secondaries of any Basic or Standard database on a V12 server as illustrated below. You will not be required to upgrade to Premium.

In addition to the Azure portal, you can configure a readable secondary using T-SQL, PowerShell, or the REST API.
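For example, the T-SQL path is a single statement run in the master database of the primary server. The database, server, and performance-level names below are placeholders for your own:

```sql
-- Create a readable secondary of [mydb] on partner server [secondaryserver],
-- at the P1 performance level (same service tier as the primary).
ALTER DATABASE [mydb]
    ADD SECONDARY ON SERVER [secondaryserver]
    WITH (ALLOW_CONNECTIONS = ALL, SERVICE_OBJECTIVE = 'P1');
```

`ALLOW_CONNECTIONS = ALL` is what makes the secondary readable, matching the read-only access capability listed above.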

Deprecation of standard geo-replication

Because active geo-replication is a superset of standard geo-replication and is available to all service tiers, the standard geo-replication configuration option will be retired in 12 months. Until then, the feature will remain fully supported.

Source: Microsoft Azure News

Announcing Content Moderator on Azure Media Analytics

April 21, 2016, by jason

Note: The following blog post describes a component of Azure Media Analytics. In order to learn more, please read the introductory blog post on Azure Media Analytics.

With the exponential growth of user generated content, it is even more important for businesses to monitor the content that is distributed to their users. In the past, businesses needed to employ human moderators to look at each piece of content, or worse yet, relied on their customers to alert them of content that violated their policies. Both of these methods have their costs. The former has a monetary cost that grows linearly as more content is added. The latter can devalue the organization’s brand and erode customer confidence.

To solve this problem, we created the Content Moderator service. Content Moderator enables customers to apply their unique set of contextual rules and policies and automatically detect questionable content in images and text.

Last week we announced our expansion into video moderation with a private preview of our new capability, the Content Moderator Media Processor! The Adult Classifier is the initial feature of this media processor, and additional features, including updates to this first classifier, will be coming soon. With this new capability, video monitoring can be automated, enabling the proactive detection of adult content in your video files. This can significantly lower the cost of your moderation efforts. Furthermore, gaining this type of early insight can enable businesses to make better decisions and create safer experiences for their customers.

Sign up to get access to the private preview of the Content Moderator Media Processor.

Getting started

Once you are granted access to the private preview, you can easily integrate the Content Moderator technology into your web or app offering using our REST API/SDKs, or with Azure Media Services Explorer.

The following is a sample C# program that will get you started with your first Content Moderator job. This code requires both the Azure Media Services C# SDK and SDK Extensions packages (available on NuGet).

Sample program

using System;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.MediaServices.Client;

namespace ContentModeratorAmsSample
{
    class Program
    {
        // declare constants and globals
        private static CloudMediaContext _context = null;
        private static readonly string _accountName = "{ ACCOUNT_NAME }";
        private static readonly string _accountKey = "{ ACCOUNT_KEY }";
        private const string _mpName = "Azure Media Content Moderator";
        private static readonly string _inputFile = "{ INPUT_FILE_PATH }";
        private static readonly string _outputFolder = "{ OUTPUT_FOLDER_PATH }";
        // True if videos on the server should be scaled down, false if not.
        // Scaling the video down has better performance but could have some impact on the scoring.
        private const bool _moderatorConfiguration = false;

        static void Main(string[] args)
        {
            _context = new CloudMediaContext(_accountName, _accountKey);
            RunContentModeratorJob(_inputFile, _outputFolder, _moderatorConfiguration);
        }

        static void RunContentModeratorJob(string inputFilePath, string output, bool scaleVideo)
        {
            // create asset with input file
            IAsset asset = _context.Assets.CreateFromFile(inputFilePath, AssetCreationOptions.None);

            // grab instance of Azure Media Content Moderator MP
            IMediaProcessor mp = _context.MediaProcessors.GetLatestMediaProcessorByName(_mpName);

            // create Job with Content Moderator task
            IJob job = _context.Jobs.Create(String.Format("Content Moderator {0}",
                Path.GetFileName(inputFilePath) + "_" + Guid.NewGuid()));

            ITask contentModeratorTask = job.Tasks.AddNew("Adult classifier task",
                mp,
                scaleVideo ? "sdv=true" : "sdv=false",
                TaskOptions.None);
            contentModeratorTask.InputAssets.Add(asset);
            contentModeratorTask.OutputAssets.AddNew("Adult classifier output",
                AssetCreationOptions.None);

            job.StateChanged += new EventHandler<JobStateChangedEventArgs>(StateChanged);
            job.Submit();

            // Create progress printing and querying tasks
            Task progressPrintTask = new Task(() =>
            {
                IJob jobQuery = null;
                do
                {
                    var progressContext = _context;
                    jobQuery = progressContext.Jobs
                        .Where(j => j.Id == job.Id)
                        .First();
                    Console.WriteLine(string.Format("{0}\t{1}",
                        DateTime.Now, jobQuery.State));
                    Thread.Sleep(10000);
                }
                while (jobQuery.State != JobState.Finished &&
                       jobQuery.State != JobState.Error &&
                       jobQuery.State != JobState.Canceled);
            });
            progressPrintTask.Start();

            Task progressJobTask = job.GetExecutionProgressTask(CancellationToken.None);
            progressJobTask.Wait();

            // If job state is Error, the event handling 
            // method for job progress should log errors.  Here we check 
            // for error state and exit if needed.
            if (job.State == JobState.Error)
            {
                ErrorDetail error = job.Tasks.First().ErrorDetails.First();
                Console.WriteLine(string.Format("Error: {0}. {1}",
                    error.Code, error.Message));
                return;
            }

            DownloadAsset(job.OutputMediaAssets.First(), output);
        }

        static void DownloadAsset(IAsset asset, string outputDirectory)
        {
            foreach (IAssetFile file in asset.AssetFiles)
            {
                file.Download(Path.Combine(outputDirectory, file.Name));
            }
        }

        // event handler for Job State
        static void StateChanged(object sender, JobStateChangedEventArgs e)
        {
            Console.WriteLine("Job state changed event:");
            Console.WriteLine("  Previous state: " + e.PreviousState);
            Console.WriteLine("  Current state: " + e.CurrentState);
            switch (e.CurrentState)
            {
                case JobState.Finished:
                    Console.WriteLine("Job finished.");
                    break;
                case JobState.Canceling:
                case JobState.Queued:
                case JobState.Scheduled:
                case JobState.Processing:
                    Console.WriteLine("Please wait...\n");
                    break;
                case JobState.Canceled:
                    Console.WriteLine("Job is canceled.\n");
                    break;
                case JobState.Error:
                    Console.WriteLine("Job failed.\n");
                    break;
            }
        }
    }
}

Output XML

InputFileName: The name of the file, as saved on the input asset in Azure Media Services.

FileSize: The size of the input file in bytes.

AdultClassifierValue: Classifier score. The score is in a range of [0-1.0] where 0 is least likely to be adult and 1.0 is most likely.

TimeTakenMs: The time taken to execute the classification on the server, in milliseconds.

Resized: Flag indicating whether or not the resize operation was performed.
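To illustrate how these fields fit together, a moderation result might look something like the sketch below. The element names and values here are illustrative only; check the actual output file produced by your job for the exact schema.

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Illustrative output for a single processed video file -->
<ContentModeration>
  <InputFileName>example-video.mp4</InputFileName>
  <FileSize>10485760</FileSize>
  <AdultClassifierValue>0.92</AdultClassifierValue>
  <TimeTakenMs>4200</TimeTakenMs>
  <Resized>true</Resized>
</ContentModeration>
```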


Keep up with the Azure Media Services blog posts for more updates on the Content Moderator Media Processor and the Media Analytics initiative!

If you have questions about any of the Media Analytics products, send an email to

Source: Microsoft Azure News