Azure WebJobs - lightweight web worker roles

Sunday, May 18, 2014

Azure Websites WebJobs are currently in preview, and they are a nice way to run some lightweight background processing for your websites. You might have faced scenarios where you had to spin up threads to perform background tasks in your website. That's not an ideal approach, so the usual alternative was a PaaS worker role; but do we really need one for small tasks such as data aggregation, reporting or log clean-up?

This is where Azure WebJobs come to the rescue. They provide background processing for your websites; in other words, we can call them "lightweight web worker roles".

There are different ways of scheduling Azure WebJobs: we can schedule jobs on a daily or weekly basis, or have them running constantly. A constantly running job, for example, can watch for some activity in the website and react to it, a bit like a trigger for websites. This is, basically, a continuously running task or service.

The different running modes that WebJobs provide:

  1. Triggered (tasks)
    1. Invoked by a user, or by a helper service (such as the Azure Scheduler), which basically just triggers the job.
    2. These are HTTPS endpoints protected by your deployment credentials. For example, a task that cleans up logs can be triggered by visiting a particular url protected by those credentials. In most cases you will not want to visit that url manually every time, so a helper service can do it for you.
    3. The instance used is determined by the software load balancer (configurable).
  2. Continuous (service)
    1. A background service monitors the running state and invokes the job when needed.
    2. AlwaysOn is a helper service that does for a website what the continuous-job monitor does for a job: it monitors the website and, if it finds the site is down for some reason, restarts it.
    3. Runs on all available instances; configurable to be a singleton.

A WebJob is basically a lightweight web worker role, so we shouldn't run heavy tasks in it, as it runs on top of the website itself. Equally, there's no point in spinning up a full worker role just to run a 10-second job such as data aggregation; this is where WebJobs are helpful. WebJobs are managed, so everything is much easier to do: if you want to stand up a small background application, you can do it within a minute using WebJobs.

Websites are basically a container for WebJobs. WebJobs run outside of your website's code, in parallel with the w3wp process; each job is a separate process running on the VM.
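To make the triggered/continuous distinction concrete, here is a minimal sketch of a continuous WebJob built with the WebJobs SDK that reacts to storage-queue messages. The queue name "orders" and the storage connection setup are assumptions for illustration, not something from this post:

```csharp
using System;
using Microsoft.Azure.WebJobs;

public class Functions
{
    // Runs whenever a new message appears on the "orders" storage queue
    // (the queue name is hypothetical, for illustration only).
    public static void ProcessOrder([QueueTrigger("orders")] string message)
    {
        Console.WriteLine("Processing: " + message);
    }
}

public class Program
{
    public static void Main()
    {
        // In a continuous WebJob, RunAndBlock keeps the process alive
        // while the SDK polls for triggers.
        var host = new JobHost();
        host.RunAndBlock();
    }
}
```

Deployed as a continuous job, this sits beside the website and reacts to activity, which is exactly the "trigger in websites" behaviour described above.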

Moving a database connection string to the Azure service configuration (cscfg)

Wednesday, October 10, 2012

While working with ASP.NET web sites/projects we normally keep our database connection string in Web.config. However, when working in Azure it's a good idea to keep this setting in the service configuration instead, as it is easier to change the connection string after you have deployed your Azure service, and this avoids the need to redeploy.

Also, keeping the database configuration in the service configuration file means you will not need to keep changing the database server name or credentials when switching between working locally and deploying to Azure.

Because we can have different service configuration files for different environments, such as local, cloud, or even test and staging, we can simply declare a key in the service definition file (csdef) and give it a value in each environment's service configuration file (cscfg).
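As a sketch of how the pieces fit together (the setting name matches the example later in this post; the local value shown is illustrative):

```xml
<!-- ServiceDefinition.csdef: declare the setting once -->
<ConfigurationSettings>
  <Setting name="MyApplicationDB" />
</ConfigurationSettings>

<!-- ServiceConfiguration.Local.cscfg: value for the local environment -->
<ConfigurationSettings>
  <Setting name="MyApplicationDB"
           value="Data Source=.\sqlexpress;Initial Catalog=Customers;Integrated Security=True" />
</ConfigurationSettings>
```

Each environment's cscfg supplies its own value for the same declared key, which is what makes the no-redeployment switch possible.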

This is even more helpful when you are using a membership provider for forms authentication, authorization and/or session management, such as the ASP.NET Universal Membership Provider for SQL Azure, as it requires the connection string to be present in web.config; by default it is named DefaultConnection.

To achieve this, you can remove the connection string section from web.config, add a key for your database connection in your csdef file, and supply the values in the cscfg. Then it is simply a matter of adding the section to the web.config configuration at run time, in the Application_Start event, while reading the value from the cscfg file.

So, first add a key in your csdef file and its setting in the cscfg file, for example:

<Setting name="MyApplicationDB" value="Data Source=.\sqlexpress;Initial Catalog=Customers;Integrated Security=False;User ID=App.Web;Password=%#$##@;MultipleActiveResultSets=True;" />

Then, in your Global.asax.cs file's Application_Start event, include the following code to add the required section to web.config at run time:

// requires: using System.Reflection; using System.Configuration;
// using System.Web.Configuration; using Microsoft.WindowsAzure.ServiceRuntime;
string connectionString = RoleEnvironment.GetConfigurationSettingValue("MyApplicationDB");

// Obtain the internal RuntimeConfig type and its singleton instance via reflection.
Type runtimeConfig = Type.GetType("System.Web.Configuration.RuntimeConfig, System.Web");
var runtimeConfigInstance = runtimeConfig.GetMethod("GetAppConfig", BindingFlags.NonPublic | BindingFlags.Static).Invoke(null, null);

// Get the ConnectionStrings section and make its (normally read-only) collection writable.
var connectionStringSection = runtimeConfig.GetProperty("ConnectionStrings", BindingFlags.NonPublic | BindingFlags.Instance).GetValue(runtimeConfigInstance, null);
var connectionStrings = connectionStringSection.GetType().GetProperty("ConnectionStrings", BindingFlags.Public | BindingFlags.Instance).GetValue(connectionStringSection, null);
typeof(ConfigurationElementCollection).GetField("bReadOnly", BindingFlags.NonPublic | BindingFlags.Instance).SetValue(connectionStrings, false);

// Add the connection string read from the service configuration.
((ConnectionStringsSection)connectionStringSection).ConnectionStrings.Add(new ConnectionStringSettings("DefaultConnection", connectionString));

An ASP.NET Picasa Image Gallery

Monday, October 1, 2012

A few days back I was thinking of creating an image gallery for my collection of photos. Although there are multiple options available on the internet that you can download and get running straight away, most of them involve saving the images on your own server. What I wanted was just a display-only image gallery on my web site to showcase my photos to the world. And I wanted to make use of one of my social network accounts (Facebook, Google+, Twitter or Flickr) to host the images.

All the major social network sites provide an API to get the photos from an album. First I tried to make use of Facebook, but its access token expires within a few hours and you need to get a new one. This normally requires a user to log in to your application first, whereas I wanted a permanent access token, or a way to refresh the access token without the user's intervention. Although I think this is possible to achieve somehow, I tried my hand at Picasa (Google+) instead and it was much easier.

Using the Picasa Web Albums Data API, you can query any public album and get its photos. You need two things for this: the album id and the user name. In case you need to show photos from one of your private albums, you also need to authenticate with the API using your Google account credentials.
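Under the hood the API exposes each album as an Atom feed; the query uri that the client library builds for you follows this pattern, where userName and albumId are the two values just described:

```
https://picasaweb.google.com/data/feed/api/user/{userName}/albumid/{albumId}
```

Knowing this shape makes it easier to debug queries in a browser before wiring up any code.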

Let's go step by step and see how to achieve this. I am using the Google Data API for .NET and the Galleriffic jQuery plugin for the image gallery.

Step 1: Design your gallery.
The first step is to create your image gallery base. I am using the Galleriffic jQuery plugin, and for this a few style sheet (css) and javascript files need to be included in the project.

First, download the plugin and copy its css and js folders into your ASP.NET website/application project. The plugin's site showcases 5 examples, and I am using the second one (Thumbnail rollover effects and slideshow crossfades), as it is closest to the image gallery look and feel I wanted.

The CSS file used for example 2 is galleriffic-2.css. You can exclude the other example-specific css files, such as galleriffic-1.css and galleriffic-3.css, but you will need the remaining css and image files.

After you are done copying the css and js folders, the next step is to reference them in your page. In this example I am using Default.aspx as my image gallery page, but it can be anything for you, such as Gallery.aspx. In the head tag, include the following:

    <link rel="stylesheet" href="css/basic.css" type="text/css" />
    <link rel="stylesheet" href="css/galleriffic-2.css" type="text/css" />
    <script src="//" type="text/javascript"></script>
    <script type="text/javascript" src="js/jquery.galleriffic.js"></script>
    <script type="text/javascript" src="js/jquery.opacityrollover.js"></script>

Next, include the following script in your page:

    <script type="text/javascript">
        jQuery(document).ready(function ($) {
            // We only want these styles applied when javascript is enabled
            $('div.navigation').css({ 'width': '300px', 'float': 'left' });
            $('div.content').css('display', 'block');

            // Initially set opacity on thumbs and add
            // additional styling for hover effect on thumbs
            var onMouseOutOpacity = 0.67;
            $('#thumbs ul.thumbs li').opacityrollover({
                mouseOutOpacity: onMouseOutOpacity,
                mouseOverOpacity: 1.0,
                fadeSpeed: 'fast',
                exemptionSelector: '.selected'
            });

            // Initialize Advanced Galleriffic Gallery
            var gallery = $('#thumbs').galleriffic({
                delay: 2500,
                numThumbs: 15,
                preloadAhead: 10,
                enableTopPager: true,
                enableBottomPager: true,
                maxPagesToShow: 7,
                imageContainerSel: '#slideshow',
                controlsContainerSel: '#controls',
                captionContainerSel: '#caption',
                loadingContainerSel: '#loading',
                renderSSControls: true,
                renderNavControls: true,
                playLinkText: 'Play Slideshow',
                pauseLinkText: 'Pause Slideshow',
                prevLinkText: '‹ Previous Photo',
                nextLinkText: 'Next Photo ›',
                nextPageLinkText: 'Next ›',
                prevPageLinkText: '‹ Prev',
                enableHistory: false,
                autoStart: false,
                syncTransitions: true,
                defaultTransitionDuration: 900,
                onSlideChange: function (prevIndex, nextIndex) {
                    // 'this' refers to the gallery, which is an extension of $('#thumbs')
                    this.find('ul.thumbs').children()
                        .eq(prevIndex).fadeTo('fast', onMouseOutOpacity).end()
                        .eq(nextIndex).fadeTo('fast', 1.0);
                },
                onPageTransitionOut: function (callback) {
                    this.fadeTo('fast', 0.0, callback);
                },
                onPageTransitionIn: function () {
                    this.fadeTo('fast', 1.0);
                }
            });
        });
    </script>
The next step is to add the necessary div tags as detailed in the Galleriffic documentation. The galleriffic plugin shows each image's thumbnail as a list item in an HTML unordered list (ul tag). As we will be fetching our images at run time via a call to the Picasa API, we need the li items to be generated from code. Therefore, I have added a div (divSlider) and marked it runat="server" so that I can assign the constructed html to its InnerHtml.

Within the form tag in your aspx, include the following:

 <div id="page">
     <div id="container">
         <h1><a href="#">My Website</a></h1>
         <div id="gallery" class="content">
             <div id="controls" class="controls"></div>
             <div class="slideshow-container">
                 <div id="loading" class="loader"></div>
                 <div id="slideshow" class="slideshow"></div>
             </div>
             <div id="caption" class="caption-container"></div>
         </div>
         <div id="thumbs" class="navigation">
             <div id="divSlider" runat="server"></div>
         </div>
         <div style="clear: both;"></div>
     </div>
 </div>


Next, we will call the Picasa API, get the images of a specific album, construct the unordered-list html and assign it to divSlider.

Step 2: Query the API

Download and install the Google Data API SDK for .NET. Once installed, add references to the following dlls in your ASP.NET project:

  • Google.GData.Client.dll
  • Google.GData.Extensions.dll
  • Google.GData.Photos.dll

After you have installed the SDK, the default location for these dlls is C:\Program Files (x86)\Google\Google Data API SDK\Samples on a 64-bit system and C:\Program Files\Google\Google Data API SDK\Samples on x86.

Next, include the following namespaces in the aspx.cs file:

using Google.GData.Photos;
using Google.GData.Client;
using Google.GData.Extensions;
using Google.GData.Extensions.Location;

There are two scenarios you can use to display the images:
  1. Displaying any public album, which can belong to you or to any other user. For this you will need the Google account username (yours, or that of the person whose album you are accessing) and the album id.
  2. Accessing your own private album. Here you will need to provide your username, password and album id, authenticate with the API using your credentials, and then access the album.

In either case, it's a good idea to keep the username and album id in Web.config or in a constants class:

    <add key="albumid" value="5792668263385651889"/>
    <add key="user" value=""/>
    <add key="password" value=""/>

To get the album id for the album that contains the photos you want to display, simply browse to the album; the long number at the end of the url in the address bar is the album id, for example 5792668263385651889.

Now, include the following in your Page_Load:

string userName = ConfigurationManager.AppSettings["user"];
string password = ConfigurationManager.AppSettings["password"];

PicasaService service = new PicasaService("freak.roach-sample");
//service.setUserCredentials(userName, password);  // needed only when you show albums with private visibility

PhotoQuery query = new PhotoQuery(PicasaQuery.CreatePicasaUri(userName, ConfigurationManager.AppSettings["albumid"]));
PicasaFeed feed = service.Query(query);

StringBuilder html = new StringBuilder();
html.Append("<ul class=\"thumbs noscript\">");

foreach (PicasaEntry entry in feed.Entries)
{
    // Strip the file extension from the photo title for display.
    string title = entry.Title.Text.Substring(0, entry.Title.Text.LastIndexOf("."));

    html.Append(String.Format("<li><a class=\"thumb\" name=\"{0}\" href=\"{1}\" title=\"{2}\"><img src=\"{3}\" alt=\"{4}\" /></a>",
        title, entry.Media.Content.Url, title, entry.Media.Thumbnails[0].Url, title));
    html.Append(String.Format("<div class=\"caption\"><div class=\"image-title\">{0}</div><div class=\"image-desc\">{1}</div></div></li>",
        title, entry.Summary.Text));
}

html.Append("</ul>");

divSlider.InnerHtml = html.ToString();

That's it. Now when you run this project you will be able to view a nice image gallery.

Next Steps
The above approach can be extended to show images from Twitter, Flickr, Facebook, Photobucket, etc. as well.

Source Code
Source code is available at the MSDN Samples Gallery, and you can use it as is; it is ready to go after you change the configuration key values.


Displaying the list of uploaded Azure VM Role vhd images

Thursday, May 19, 2011

I just posted a sample on MSDN that displays the list of base vhd images uploaded to the Azure Portal for the Virtual Machine (VM) Role.

You can get the list of uploaded vhds simply by running the following command at a command prompt:

csupload.exe Get-VMImage -Connection "SubscriptionId=xxxxxx-xxxxxx-xxxxxx-xxxxxx;CertificateThumbprint=xxxxxxxxxxxxxxxxxxx"

Thus, using the Get-VMImage switch of csupload, you can get the complete details, but the output is not presentable; it looks something like this:

csupload get-vmimage output

To make this presentable, we simply launch this process from our C# code, parse the output, save the properties into a list of objects and bind that list to a GridView, so it looks like the following:

csupload get-vmimage formatted output
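For reference, launching the process and capturing its output from C# can be sketched as below; the connection values are placeholders and the parsing itself is left out (the full details are in the MSDN sample):

```csharp
using System.Diagnostics;

// Launch csupload and capture its console output for parsing
// (the connection string values are placeholders).
var psi = new ProcessStartInfo
{
    FileName = "csupload.exe",
    Arguments = "Get-VMImage -Connection \"SubscriptionId=...;CertificateThumbprint=...\"",
    UseShellExecute = false,
    RedirectStandardOutput = true,
    CreateNoWindow = true
};

using (Process proc = Process.Start(psi))
{
    string output = proc.StandardOutput.ReadToEnd();
    proc.WaitForExit();
    // Parse 'output' into a list of objects here,
    // then bind that list to the GridView.
}
```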

Follow the MSDN sample for details on how to achieve the same.

csupload issue on 32-bit machines: Azure SDK 1.4 fix

Thursday, March 10, 2011

Many of you might have encountered an issue while uploading your VM Role base image via csupload from a 32-bit machine.

The issue has now been resolved with the release of Azure SDK 1.4; csupload can now be used on x86 platforms too.

There are many more additions in Azure SDK 1.4, mainly including Windows Azure Connect and the Content Delivery Network (CDN).

Installing Tomcat in Windows Azure

Wednesday, March 2, 2011

There are a few options already available for installing Tomcat on Windows Azure, which involve running scripts that create a package and definition file that you can then deploy to Windows Azure. However, I personally feel there is a much easier solution for installing Tomcat.

The solution that I am discussing here makes use of startup tasks with elevated privileges, which were introduced in Azure SDK 1.3:
  1. Download and install the jre on your local system. Zip the jre folder and upload it to blob storage.
  2. Download tomcat on your local system.
  3. Edit tomcat's server.xml to configure specific ports and enable SSL.
  4. If you need to deploy a java .war file in your tomcat, copy this war file into tomcat's webapps folder.
  5. Zip the tomcat folder and upload it to blob storage.
  6. Now create a worker role and enable the TCP ports configured for tomcat.
  7. Add a startup task in this worker role that does the following:
    • Unpack the jre zip file to a local drive on azure.
    • Unpack our customized tomcat zip file to a local drive on azure.
    • Set the JRE_HOME environment variable to the path where the jre was unpacked.
    • Set the CATALINA_HOME environment variable to the path where tomcat was unpacked.
    • Start tomcat.
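In the worker role's service definition, such an elevated startup task is declared as below (the script name startup.cmd matches the one created later in this post):

```xml
<Startup>
  <Task commandLine="startup.cmd" executionContext="elevated" taskType="simple" />
</Startup>
```

The elevated execution context is what allows the script to set machine-level state such as environment variables.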

It shouldn't be difficult for you to implement the above steps yourself; however, I seem to have plenty of time today, so let me explain these steps too.

Prepare Java
It shouldn't be difficult for you to download java from the oracle/sun site. Just download and install it; it will create a folder on your local machine, e.g. jre6. You need to zip this folder and upload it to your blob storage. You can obviously host it anywhere else too, but since we are discussing Azure, let's keep it that way.

Prepare and configure Tomcat
Now download your desired version of tomcat and unzip it on your local machine.
  • Setup Ports in Server.xml
    You will have to select the ports you want your tomcat to run on; say, port 80 for http and port 443 for https.

    Go to the conf folder inside your tomcat folder and open the file server.xml.

    Search for the following line and replace 8080 with 80 and 8443 with 443:

     <Connector port="8080" protocol="HTTP/1.1" connectionTimeout="20000" redirectPort="8443" />

    After changing the ports, this line will look like:

    <Connector port="80" protocol="HTTP/1.1" connectionTimeout="20000" redirectPort="443" />

  • Enable SSL
    Again open the server.xml file and search for the following line:

    <!-- <Connector port="8443" protocol="HTTP/1.1" SSLEnabled="true" maxThreads="150" scheme="https" secure="true" clientAuth="false" sslProtocol="TLS" /> -->

    This connector is commented out by default. Uncomment it and change the port to 443, so that it looks as below:

    <Connector port="443" protocol="HTTP/1.1" SSLEnabled="true" maxThreads="150" scheme="https" secure="true" clientAuth="false" sslProtocol="TLS" />

    For SSL we use a .pfx file with tomcat instead of the normal keystore file. I will assume here that you already have a .pfx (PKCS12) certificate. This certificate needs to be added to your cloud service in the azure portal. Copy this .pfx file to a folder inside your tomcat folder, say under webapps, and make the following changes to the above line in server.xml:

    <Connector port="443" protocol="HTTP/1.1" SSLEnabled="true" maxThreads="150" scheme="https" secure="true" clientAuth="false" sslProtocol="TLS" keystoreFile="\webapps\" keystorePass="" keystoreType="PKCS12" />

  • Deploy war files if required
    If you have war files that need to be deployed with tomcat, just copy them under tomcat's webapps folder. When tomcat is started, it will install these applications.
  • Upload tomcat
    Once done, zip your customised tomcat folder and upload it to blob storage.

Worker Role 
Now create a new Cloud Service project in Visual Studio and add a new worker role to it.

  • Add Certificate
    Upload the certificate that you used for tomcat's SSL to the portal, or include it in your worker role project.
  • Enable TCP Ports
    You need to enable, in your worker role, the TCP endpoints for the ports you configured for your tomcat; in our case these are ports 80 and 443.

    public override bool OnStart()
    {
        // Set the maximum number of concurrent connections
        ServicePointManager.DefaultConnectionLimit = 12;

        // Create listeners bound to the input endpoints defined for this role.
        TcpListener port80Listener = new TcpListener(RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["Tcp80"].IPEndpoint);
        TcpListener sslListener = new TcpListener(RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["TcpSSL"].IPEndpoint);

        return base.OnStart();
    }
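The endpoint names used above ("Tcp80" and "TcpSSL") must match input endpoints declared in the worker role's service definition; a sketch of that declaration, using the same names, would be:

```xml
<Endpoints>
  <InputEndpoint name="Tcp80" protocol="tcp" port="80" />
  <InputEndpoint name="TcpSSL" protocol="tcp" port="443" />
</Endpoints>
```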

  • Startup Tasks
    Then, as we usually do for startup tasks, create a startup.cmd file in your worker role, set its Build Action property to Content and its Copy to Output Directory property to Copy Always.

    Your startup.cmd file will have the following commands:
    • Download and unzip tomcat
      We will use GetFiles.exe, developed by us, for downloading the tomcat zip. It is just a console app whose first argument is a blob url and whose second argument is the path where the file is saved. Then, using another customised utility, ZipUtility.exe, we unzip it on the C: drive into a folder named tomcat.

      start /w ZipUtility.exe C:\ C:\

    • Download and unzip jre
      Similarly, download and unzip the jre.

      start /w ZipUtility.exe C:\ C:\

    • Set Environment variables
      We need to set up two environment variables.
      JRE_HOME needs to be set to your jre folder, which in our case is C:\jre6

      SET JRE_HOME=C:\jre6

      CATALINA_HOME needs to be set to your tomcat folder, which in our case is C:\tomcat

      SET CATALINA_HOME=C:\tomcat

      Note: you can also set environment variables in your service definition file, using the Runtime section, as below:

              <Runtime>
                <Environment>
                  <Variable name="CATALINA_HOME" value="C:\tomcat" />
                </Environment>
              </Runtime>

    • Start tomcat
      Finally, the script starts tomcat, e.g. via the startup script in tomcat's bin folder.


So startup.cmd will look like below:
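Putting the pieces above together, a sketch of the full script (the blob urls are placeholders you would replace with your own, and the final command assumes tomcat's standard bin\startup.bat):

```
REM Download and unzip tomcat and jre (blob urls are placeholders)
start /w GetFiles.exe http://<youraccount>.blob.core.windows.net/binaries/tomcat.zip C:\tomcat.zip
start /w ZipUtility.exe C:\tomcat.zip C:\tomcat
start /w GetFiles.exe http://<youraccount>.blob.core.windows.net/binaries/jre.zip C:\jre.zip
start /w ZipUtility.exe C:\jre.zip C:\jre6

REM Environment variables for tomcat
SET JRE_HOME=C:\jre6
SET CATALINA_HOME=C:\tomcat

REM Start tomcat
start /w %CATALINA_HOME%\bin\startup.bat
```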

GetFiles.exe and ZipUtility.exe are custom console apps. They are also added to the worker role project, with their Build Action property set to Content and Copy to Output Directory set to Copy Always.

Final Step
That's it; just deploy your package to the cloud. Make sure your hosted service has the certificate that you used for tomcat's SSL.


Changing Drive Letter of an Azure Drive (aka X-drive)

Monday, February 7, 2011

Sometimes you might want your Azure drive to always be mounted with a fixed drive letter. Consider a scenario with an Azure VM Role where you need to mount an azure drive for data persistence and your VM demands the same letter for your azure drive every time, e.g. you installed SQL Server on your VM Role and specified a path on the azure drive for the mdf files so as to make the data persist.

But, as we know, Azure drives are mounted on random drive letters. To always have a fixed letter, you can change the drive letter to a fixed value after the drive is mounted, using diskpart: from within the windows service you use to mount the drive in a VM Role, or from another part of your code if you are not working with a VM Role. Check this post to learn how to mount an Azure Drive in a VM Role.

To get a basic idea of how to change a drive letter using diskpart, see the Microsoft support article on the topic.
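For reference, the generated diskpart script contains just two commands; assuming the drive is currently mounted as F: and we want it fixed at X: (both letters illustrative), it would be:

```
select volume F
assign letter=X
```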

To change the drive letter of the mounted Azure Drive using diskpart, we will create a temporary file in local resource storage. This temp file stores the diskpart commands constructed from the current and target drive letters. The following code can be used to achieve this:


//create the temporary diskpart script file in the drive's cache path
string diskpartFile = drive.CachePath + "\\diskpart.txt";

if (File.Exists(diskpartFile))
    File.Delete(diskpartFile);

string driveLetter = drive.DriveLetter;

// write the diskpart commands (the target letter X is illustrative)
File.WriteAllText(diskpartFile,
    "select volume " + driveLetter.Substring(0, 1) + Environment.NewLine + "assign letter=X");

//start the diskpart process with the script file
using (Process changeletter = new Process())
{
    changeletter.StartInfo.FileName = System.Environment.GetEnvironmentVariable("WINDIR") + "\\System32\\diskpart.exe";
    changeletter.StartInfo.Arguments = "/s" + " " + diskpartFile;
    changeletter.Start();
    changeletter.WaitForExit();
}




2009 ·Techy Freak by TNB