Using the Application Insights SDK in an Aspect-Oriented Programming (AOP) style with .NET

In my previous post I talked about using Custom Attributes to avoid repetition and code pollution, and illustrated at a very high level how they can be used with the Application Insights API for custom events.

C# Attributes: An attribute is a declarative tag that is used to convey information to the runtime about the behavior of various elements such as classes, methods, structures, enumerators and assemblies in your program. You can add declarative information to a program by using an attribute.
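As a quick refresher, a built-in attribute such as [Obsolete] works the same way: the attribute itself is just declarative metadata that the compiler and runtime can act on.

```csharp
using System;

public class LegacyCalculator
{
    // [Obsolete] is purely declarative; the compiler reads the metadata
    // and raises a warning at every call site of this method.
    [Obsolete("Use AddNumbers instead.")]
    public int Add(int a, int b) { return a + b; }

    public int AddNumbers(int a, int b) { return a + b; }
}
```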

The problem I am trying to solve here is to separate the Application Insights SDK code, hide it from the developer, and simplify adding the functionality to classes, methods and other dependencies that need to be tracked for performance, logging and auditing. I ran my thought process by my friends and fellow workers Alex and Sujit, and got some great advice from both to look further into the Visual Studio Profiler and aspect-based development, AOP (Aspect-oriented programming). I also looked into PostSharp, a popular framework for AOP. I tried out a few scenarios and all my thoughts started to come together. VERY COOL. During my googling (wait.. binging!!) I educated myself more and more on AOP (very interesting). I found many articles and landed on this one, which gave me great ideas on using pure .NET (the RealProxy class) to implement a solution. Digging more into RealProxy and attributes, and going through many posts and techniques, certainly helped. To make a long story short, I was able to assemble an approach that uses AOP and attributes for profiling methods and classes for performance, logging and auditing, and that can simplify injecting the Application Insights API into any application. I am sure it can be greatly enhanced for specific implementations, but this can definitely get you started.

The pattern is simple; here are the steps:

Create your Custom Attribute.

[Code screenshot: the custom attribute (Attr-1)]
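As a rough sketch of that first step (the attribute and property names here are my own stand-ins for the code in the screenshot), the attribute itself carries no logic; it only marks the classes and methods whose calls we want to track:

```csharp
using System;

// Sketch only: marker attribute for classes/methods to be tracked.
[AttributeUsage(AttributeTargets.Class | AttributeTargets.Method)]
public class TrackPerformanceAttribute : Attribute
{
    public string EventName { get; set; }

    public TrackPerformanceAttribute(string eventName = null)
    {
        EventName = eventName;
    }
}
```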
The next step is to build the proxy for this Custom Attribute, derived from the RealProxy class in the .NET Framework. The RealProxy class provides a transparent proxy for an object; the transparent proxy creates the illusion that the actual object resides in the client’s space.

To support the performance tracking I implemented a stopwatch and a Write Telemetry method: the stopwatch kicks in at the beginning of the Invoke method carrying our Custom Attribute, and at the end, telemetry is written with the elapsed time.

[Code screenshots: the proxy class derived from RealProxy and its Invoke method (Attr2, attr3)]
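Since the actual code lives in the screenshots, here is a minimal sketch of the same idea. The names (PerformanceProxy, MethodPerformance, ElapsedMs) are illustrative, and for brevity this version times every call on the proxied object; the real code first checks that the method or class carries the custom attribute before writing telemetry.

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Reflection;
using System.Runtime.Remoting.Messaging;
using System.Runtime.Remoting.Proxies;
using Microsoft.ApplicationInsights;

// Sketch of a RealProxy-based interceptor for timing method calls.
public class PerformanceProxy<T> : RealProxy
{
    private readonly T _decorated;
    private static readonly TelemetryClient Telemetry = new TelemetryClient();

    private PerformanceProxy(T decorated) : base(typeof(T))
    {
        _decorated = decorated;
    }

    // Wraps the real instance in a transparent proxy that looks like T.
    public static T Create(T decorated)
    {
        return (T)new PerformanceProxy<T>(decorated).GetTransparentProxy();
    }

    public override IMessage Invoke(IMessage msg)
    {
        var methodCall = (IMethodCallMessage)msg;
        var method = (MethodInfo)methodCall.MethodBase;

        var stopwatch = Stopwatch.StartNew();   // stopwatch kicks in here
        try
        {
            var result = method.Invoke(_decorated, methodCall.InArgs);
            return new ReturnMessage(result, null, 0, methodCall.LogicalCallContext, methodCall);
        }
        catch (TargetInvocationException ex)
        {
            Telemetry.TrackException(ex.InnerException ?? ex);
            return new ReturnMessage(ex.InnerException ?? ex, methodCall);
        }
        finally
        {
            stopwatch.Stop();
            // Write Telemetry: a custom event carrying the elapsed time.
            Telemetry.TrackEvent("MethodPerformance",
                new Dictionary<string, string> { { "Method", method.Name } },
                new Dictionary<string, double> { { "ElapsedMs", stopwatch.ElapsedMilliseconds } });
        }
    }
}
```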

This blog article explains in detail the use of the RealProxy class for AOP and provides samples for implementing your own proxy classes derived from RealProxy.

The next step is to put the new Custom Attribute on the methods of the class that need to be tracked for performance. It is that simple, but you do have to build interfaces for your classes, which is good OOP practice anyway. This is illustrated below.

[Code screenshots: the interface and the class decorated with the custom attribute (Attr4, Attr5)]
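For reference, the usage looks roughly like this (again a sketch; IRepository, Repository and the artificial delays stand in for the demo code in the screenshots):

```csharp
using System.Threading;

// The class is used through an interface so the transparent proxy can stand in for it.
public interface IRepository
{
    void Add(string item);
    void Update(string item);
    void Delete(string item);
}

[TrackPerformance]
public class Repository : IRepository
{
    // Artificial delays so the recorded timings are easy to spot in the portal.
    public void Add(string item)    { Thread.Sleep(7000); }
    public void Update(string item) { Thread.Sleep(3000); }
    public void Delete(string item) { Thread.Sleep(5000); }
}

// In the calling code, the object is created through the proxy factory:
// IRepository repo = PerformanceProxy<IRepository>.Create(new Repository());
// repo.Add("x");      // each call is timed and sent as a custom event
// repo.Update("x");
// repo.Delete("x");
```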

Now when Add, Update and Delete are called, their performance is recorded through Application Insights custom events. Our new Custom Attribute can be applied to any number of methods.

This concludes this post. A great use of this pattern is also to build a centralized logging framework, which I will illustrate in the next post.

[Screenshot: the recorded custom events in the Application Insights portal (Attrib6)]

3 seconds for Update, 5 seconds for Delete and 7 seconds for Add, as expected based on the demo delay parameters.

Happy monitoring.

 

 

 


Avoiding Code Pollution when using Application Insights SDK

“Insert a few lines of code in your application to find out what users are doing with it, or to help diagnose issues. You can send telemetry from device and desktop apps, web clients, and web servers. Use the Azure Application Insights core telemetry API to send custom events and metrics, and your own versions of standard telemetry. This API is the same API that the standard Application Insights data collectors use.” … copied from here.

The above statement is very true, and my experience of using Application Insights for custom events has been great. It is extremely powerful and easy to use. The challenge comes when you have to instrument a large number of classes and methods throughout your code to enable telemetry collection. Although just including the SDK gives you some great out-of-the-box functionality for tracking page load times, external dependencies (database/web service calls) and code exceptions, there are lots of scenarios where you must add code and use custom events.

I worked on a simple pattern that can simplify this to some degree by using Custom Attributes, a feature that exists in .NET. I will illustrate my work here with a very simple implementation of a Custom Attribute that can be applied to any class or method in your solution, simply by decorating that class or method. The first code fragment below shows the Custom Attribute, and the next code fragments show the usage and results.

[Code screenshot: the custom attribute (CustomAttr)]
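Since the attribute itself is in the screenshot, here is a rough sketch of the shape it takes; the names (TrackUsageAttribute, UserLevel, UsageTracker) are mine, and the wiring shown is only one possible way to reflect over the attribute and send the custom event.

```csharp
using System;
using System.Collections.Generic;
using System.Reflection;
using Microsoft.ApplicationInsights;

// Sketch only: an attribute that can decorate a class or a method and carries
// an event name plus a user account level (defaulting to "User").
[AttributeUsage(AttributeTargets.Class | AttributeTargets.Method)]
public class TrackUsageAttribute : Attribute
{
    public string EventName { get; private set; }
    public string UserLevel { get; private set; }

    public TrackUsageAttribute(string eventName, string userLevel = "User")
    {
        EventName = eventName;
        UserLevel = userLevel;
    }
}

// One possible way to wire it up: reflect over the decorated class or method
// and send the custom event with the user level as a property.
public static class UsageTracker
{
    private static readonly TelemetryClient Telemetry = new TelemetryClient();

    public static void Track(MemberInfo classOrMethod)
    {
        var attr = classOrMethod.GetCustomAttribute<TrackUsageAttribute>();
        if (attr == null) return;

        Telemetry.TrackEvent(attr.EventName,
            new Dictionary<string, string> { { "user-id", attr.UserLevel } });
    }
}
```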

As highlighted, this custom attribute can be targeted at classes or methods, and a name and a user account type can be passed as parameters. Sample usage is illustrated below.

[Code screenshots: the attribute applied at class level (Attrib1, Attrib2)]

In the two examples above the Custom Attribute is applied to the classes; a user level of “Admin” is passed in the first example, and the default user level of “User” is accepted in the second.

In the code below the Custom Attribute is applied not to the class but to a method, with a user level of “Co-Admin”.

[Code screenshot: the attribute applied at method level (Attrib3)]
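In terms of the sketch above, the three usages would look something like this (the class and method names are made up; the user levels match the examples):

```csharp
// Class-level usage: "Admin" passed explicitly, then the default "User".
[TrackUsage("OrdersPageLoaded", "Admin")]
public class OrdersPage { /* ... */ }

[TrackUsage("ReportsPageLoaded")]            // user level defaults to "User"
public class ReportsPage { /* ... */ }

// Method-level usage with a user level of "Co-Admin".
public class AdminTasks
{
    [TrackUsage("NightlyCleanup", "Co-Admin")]
    public void RunCleanup() { /* ... */ }
}
```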

When we run this sample and instantiate these classes and methods we see the Application Insights telemetry information sent to the portal.

[Screenshots: the custom events and their properties in the Application Insights portal]

So we got three custom events with the user-id property that we set.

This illustrates the SDK use in its simplest form. In the next posts I will dive into some more use cases for Application Insights custom events and try to build some more Custom Attributes for tracking user response time between selections on a page, execution time of a method call inside your application, async API calls, .NET Core dependency calls, and so on.

Happy monitoring with Application Insights. Stay tuned.

 

Move To Cloud – ADAL.JS and Token Services

At this point I have pretty much stripped down the ADAL.JS library to the minimal code needed to acquire tokens based on the user’s identity. The stripped file is only about 180 lines of code and 7 KB in size, and it’s not even compressed.

Token Services is an Azure website; I have hosted it at https://tokenservices.azurewebsites.net/adaltoken.html. The page adaltoken.html takes a query string parameter redirect_url=‘the page to receive the acquired token’ and, upon successful authentication, puts the acquired token in another query string parameter called adal_token=‘token‘. It also puts the issue time for the token in a query string parameter called adal_token_issued=‘issue time’. This page can be called from anywhere an access token is needed. When required, it prompts the user to enter their WAAD credentials; the token provided by adal.js has a lifetime of about an hour.
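On the receiving end, if the page registered as redirect_url happens to be an ASP.NET page, picking the token up is just a matter of reading the query string. This is only a sketch of a hypothetical receiving page, not part of the token services code:

```csharp
using System;
using System.Web.UI;

// Hypothetical receiving page registered as redirect_url.
public partial class TokenReceiver : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // adaltoken.html appends these parameters after a successful sign-in.
        string token = Request.QueryString["adal_token"];
        string issuedAt = Request.QueryString["adal_token_issued"];

        if (string.IsNullOrEmpty(token))
        {
            // No token yet: bounce through the token services page, asking it
            // to come back to this page.
            string redirect = "https://tokenservices.azurewebsites.net/adaltoken.html" +
                              "?redirect_url=" + Server.UrlEncode(Request.Url.AbsoluteUri);
            Response.Redirect(redirect);
        }
    }
}
```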

 

 

[Screenshots: adaltoken1, adaltoken2]

 

Move To Cloud – Platform Website ADAL.JS and CORS

The Authentication Team at Microsoft is really keeping up with ADAL (Active Directory Authentication Library), and Vittorio’s excellent whiteboard illustrations of the new ADAL for JavaScript made me redo the authentication mechanism from my previous post, now using ADAL.JS instead. Our platform sites will use ADAL for authentication, and we will add OWIN and Katana support in our Web APIs. The assumption is that ASP.NET is shrinking down to just the core .NET framework components, with no WCF and probably no IIS in the future.

In this first sample I will provide a simple API controller that is called by presenting a Bearer token, previously acquired using a single HTML page with adal.js. We will host that page in a different domain, so I will also add CORS support to the Web API.

The two Azure sites:
1 – https://tokenservices.azurewebsites.net (hosts the token issuer using adal.js)
2 – https://platform-core.azurewebsites.net (core APIs with a demo controller, api/id)

Steps:
In the Azure Portal, register the application in Active Directory; the ADAL.js repository on GitHub lists all the steps needed for that.
Create the Azure websites.
Publish your solution.

[Screenshots: Platform1 through Platform5]
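Since the details are in the screenshots, here is a rough sketch of what the Web API side can look like, assuming the Microsoft.Owin.Security.ActiveDirectory, Microsoft.AspNet.WebApi.Owin and Microsoft.AspNet.WebApi.Cors packages; the tenant, audience and controller below are placeholders, not the actual configuration.

```csharp
using System.Security.Claims;
using System.Web.Http;
using System.Web.Http.Cors;
using Microsoft.Owin.Security.ActiveDirectory;
using Owin;

public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        // Validate the Bearer tokens issued by Azure AD (values are placeholders;
        // in newer package versions the audience is set via TokenValidationParameters).
        app.UseWindowsAzureActiveDirectoryBearerAuthentication(
            new WindowsAzureActiveDirectoryBearerAuthenticationOptions
            {
                Tenant = "yourtenant.onmicrosoft.com",
                Audience = "https://platform-core.azurewebsites.net"
            });

        var config = new HttpConfiguration();
        // Allow the token-issuing site (a different domain) to call these APIs.
        config.EnableCors(new EnableCorsAttribute(
            "https://tokenservices.azurewebsites.net", "*", "*"));
        config.MapHttpAttributeRoutes();
        app.UseWebApi(config);
    }
}

[Authorize]
public class IdController : ApiController
{
    // GET api/id – returns the caller's identity from the validated token.
    [Route("api/id")]
    public string Get()
    {
        var upn = ClaimsPrincipal.Current.FindFirst("upn");
        return upn != null ? upn.Value : ClaimsPrincipal.Current.Identity.Name;
    }
}
```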

Now that we have Web APIs protected with Azure AD, in the next post I will enhance the core project to provide some capabilities around the Azure Management APIs for accessing Azure Websites and the Azure Scheduler programmatically. I will also build an Office 365 SharePoint Online app that consumes the api/id controller developed here; I will cover that in a separate post related to SharePoint here.

 

Move To Cloud – Platform Website

In the previous post we summed up the components for the Cloud Platform. In this post let’s get started with building an Azure Website that will host our Cloud Platform pages and services. This site will also provide tokens to clients for authenticated access to hosted services and resources. To keep it really simple I will make it a plain web application with aspx pages and Ajax-enabled REST web service endpoints. I will not use the MVC pattern or Web API, as I do not want to add all the extras that get included with the MVC templates.

So, first things first: we need to go to the Azure Management Portal (https://manage.windowsazure.net), provision a website, and add two applications in Azure Active Directory, as follows.

[Screenshot: blogproj01]

Our website, named “platformservices”, at https://platformservices.azurewebsites.net

[Screenshot: blogproj02]

Add two applications in our Active Directory

[Screenshot: blogproj05]

1 – Platform Services (Enabling Sign On at: https://platformservices.azurewebsites.net/ourpage.aspx).
2 – Platform Toolkit Client (Enabling Azure Management Services)

[Screenshot: blogproj04]

With the above setup in place we are ready to build the web application that will be hosted on this site and will provide an aspx page enabling single sign-on; in our case we will just get the user’s identity. Visual Studio 2012 used to provide an Identity and Access add-on for enabling single sign-on in a web application; this has changed in Visual Studio 2013. We will use Visual Studio 2013 to create a web.config file with the markup required to enable federated authentication.

[Screenshots: blogproj06, blogproj07]

By downloading the publish profile from the Azure Management Portal for the Platform Services Azure website that we put together earlier, we can now publish our Visual Studio solution. This solution only has a page called identitytoken.aspx. The page provides authenticated access using Azure Active Directory user credentials: if the user is logged into Office 365 or Azure, it will print the user name; if not, the user will be redirected to the login page.

[Screenshot: blogproj08]
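For reference, a minimal sketch of what the code-behind for such a page might look like once the federation markup is in web.config; the class name is illustrative, not the actual identitytoken.aspx code.

```csharp
using System;
using System.Security.Claims;
using System.Web.UI;

public partial class IdentityToken : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // With federated authentication configured in web.config, an
        // unauthenticated request is redirected to the Azure AD login page;
        // by the time we get here, User carries the signed-in identity.
        var user = User as ClaimsPrincipal;
        if (user != null && user.Identity.IsAuthenticated)
        {
            Response.Write("Signed in as: " + user.Identity.Name);
        }
    }
}
```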

After publishing the solution and navigating to https://platformservices.azurewebsites.net/identitytoken.aspx we will get a login screen like this, and if we are already signed in, the user name will be displayed.

[Screenshot: blogproj09]

Great progress – we now have a web site that can be used to add Platform Services that can be securely accessed by client applications. In the next post we will put together some use cases for this platform. We will build a REST service endpoint and an app that communicate with each other, exchanging tokens for authentication and authorization.

Good luck.. Stay tuned. Tomorrow is Sunday so I may get some time to move this further.

 

Move To Cloud – Cloud Platform

In the last post we narrowed down what is in our bag for building a cloud platform. We identified Office 365 and Microsoft Azure as the base for the Cloud Platform. This platform will provide patterns for implementing mid-tier web services, jobs, cache, storage and, of course, cloud-identity-based authentication and authorization – in short, everything that an enterprise development team will use as a framework to build custom solutions. We will be making use of the following:

  • Azure Active Directory
  • Azure Web Sites
  • Azure Redis Cache
  • Azure SQL Storage
  • Azure Job Scheduler

Additionally, I will be using Office 365 – SharePoint and Exchange Online – to build the following:

  • SharePoint Apps
  • Email Notifications

We will break this framework down as follows:

  • Cloud Platform Toolkit (.net dll)
  • Cloud Platform Services (WCF REST Services)
  • Cloud Platform App (SharePoint App)

Cloud Platform Toolkit:
The toolkit will provide .NET developers with a collection of classes and methods to access SharePoint and Azure securely from WCF service endpoints. This approach also enforces the abstraction of sensitive credential information such as service accounts, connection strings, etc. To be clear, we are not building a composite cloud app; instead we will be building atomic REST web services that can be consumed by apps for SharePoint and by mobile devices.

[Diagram: summary of the classes the toolkit will provide (cloudplatform1)]

The above image highlights a summary of the classes that the toolkit will provide. In the next post I will talk about the first aspect of the Cloud Platform: storing credentials (service accounts). We will store them as connection strings in Azure Websites, with encryption and expiration, which is something almost all enterprise security officers ask for. We will be tapping into the Azure Management APIs and will provide some REST endpoints to achieve this functionality.
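As a small taste of the connection-string side of this (a sketch; the setting name "PlatformStorage" is made up): connection strings entered in the Azure Websites portal override the values in web.config at runtime, so the code reads them through the normal configuration API and the real secret never has to live in source control.

```csharp
using System.Configuration;
using System.Data.SqlClient;

public static class PlatformDb
{
    public static SqlConnection Open()
    {
        // "PlatformStorage" is a placeholder name. The real value is set under
        // Connection Strings in the Azure Websites portal and overrides the
        // dummy entry in web.config at runtime.
        var setting = ConfigurationManager.ConnectionStrings["PlatformStorage"];

        var connection = new SqlConnection(setting.ConnectionString);
        connection.Open();
        return connection;
    }
}
```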

Link to Azure Management Services for Azure websites.

There are also a handful of NuGet packages wrapping these REST services in a .NET API; they would be easy to use, but I am not sure how flexible they are, so I will try to use the REST APIs directly with HTTP GET/PUT/POST.

 

 

Move To Cloud – What is in the Bag

I am starting this blog series on Move To Cloud. The posts will not address why an enterprise should choose to move, but rather dive into the software architecture, platform, patterns and tools needed for a successful cloud implementation. We will discuss using the Microsoft cloud offering to build solutions and services that meet an enterprise’s internal demand, security and governance requirements by making the most of what a “subscription based” model has to offer.

So for starters let’s see what is in the bag. A typical subscription to Office 365, along with the Office suite (Word, Excel, etc.), will give you the SharePoint and Azure portals: SharePoint Online for all the enterprise collaboration and social needs, and Azure services for enterprise solutions. In typical enterprise application development, a team is responsible for delivering and supporting custom or semi-custom solutions for the user base. A well designed cloud platform provides agile development, on-time delivery and superior support for in-house applications, thus justifying the ultimate ROI for the stakeholders.

Let’s get started. You may already be subscribing to Office 365; if not, there are 30-day trials available. A minimum one-developer subscription is 3 dollars for Office 365, and adding Exchange Online is another 4 dollars, so 7 dollars per month is a DEAL to get started. In a nutshell you will have the following.

Office 365 – SharePoint and Exchange Online and Azure Portals

[Screenshots: the Office 365 and Azure subscription portals]

You will also need Visual Studio; the Express and online editions will work too, though they may have some limitations. I will be using Visual Studio 2013 Update 3. We will be building a combination of:

  1. SharePoint Apps – SharePoint-hosted and provider-hosted, Office 365
  2. WCF REST web services in the cloud
  3. Toolkit .NET DLLs
  4. HTML/JavaScript/jQuery
  5. Mobile (iOS and Android)

In the next post(s) we will be diving into the above topics to build a platform, based on the cloud services offered by Office 365 and Azure, for delivering enterprise solutions that run securely in the cloud. We will also touch on the hybrid approach, as we all know that some data, or access to some data, cannot simply move to the cloud; we will develop simple, secure on-premises listener services to accomplish that goal.

Link to next post (coming soon).