Measuring SQL Azure network latency

February 26, 2012

SQL Azure is a multi-tenant service that synchronously replicates every write to two additional copies; it is designed with data integrity and safety first in mind, so it will not scale the same way an on-premise SQL Server does.  In my project I needed to run millions of queries within the cloud (worker role -> SQL Azure), and there was a severe performance issue.  It was not possible to rewrite every query into a batch; certain queries had to be executed one at a time.  So I developed this exe to help support engineers identify the bottleneck.

To use the exe, browse to your SQL file.  The file can contain one or more SQL statements separated by GO statements.  From the UI, fill in the connection credentials: SERVERNAME, DATABASE (if the database does not exist, the exe will create it and delete it again at the end), LOGIN, and PWD.  Note that if LOGIN and PWD are empty, the connection string switches to Windows authentication (this works locally but not for SQL Azure, since the cloud database requires SQL authentication).
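The credential fallback described above can be sketched like this (an illustrative Python sketch; the actual utility is a .NET exe, and the function and field names here are assumptions, not its real code):

```python
def build_connection_string(server, database, login="", pwd=""):
    """Build a SQL Server connection string, falling back to Windows
    authentication when no login/password is supplied (local use only;
    SQL Azure requires SQL authentication)."""
    base = "Server={0};Database={1};".format(server, database)
    if login and pwd:
        # SQL authentication -- required for SQL Azure.
        return base + "User ID={0};Password={1};".format(login, pwd)
    # Empty credentials: switch to integrated (Windows) authentication.
    return base + "Integrated Security=SSPI;"
```

Against a local server the empty-credential form yields integrated security; against SQL Azure you must pass LOGIN and PWD.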

Once you click “Run”, the exe parses your SQL file and runs all the queries.  It creates a TestLog file that looks like this:

2/25/2012 10:29:28 PM: Creating database my1…
Query: 1, at time:2/25/2012 10:29:28 PM, local: 820.2274, remote: 0, network: 0
Query: 1, at time:2/25/2012 10:29:31 PM, local: 163.5987, remote: 120, network: 43.5987
Query: 2, at time:2/25/2012 10:29:32 PM, local: 181.745, remote: 37, network: 144.745
Query: 3, at time:2/25/2012 10:29:32 PM, local: 49.9357, remote: 7, network: 42.9357
Query: 4, at time:2/25/2012 10:29:32 PM, local: 88.3577, remote: 40, network: 48.3577
Query: 5, at time:2/25/2012 10:29:33 PM, local: 78.0235, remote: 37, network: 41.0235
Totals: local: 00:00:01.3818880, remote: 00:00:00.2410000, network: 00:00:00.3206606

Here is how the time spent on each query in your file breaks down:

LOCAL = remote (time spent on SQL Azure) + network (latency)
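That breakdown can be checked directly against the log: the client measures total elapsed time (local), the server reports its own execution time (remote), and the remainder is attributed to the network. A small illustrative sketch (not the tool's actual parsing code):

```python
import re

def parse_log_line(line):
    """Extract (local, remote, network) milliseconds from one
    'Query: N, at time:..., local: X, remote: Y, network: Z' log line."""
    m = re.search(r"local:\s*([\d.]+),\s*remote:\s*([\d.]+),\s*network:\s*([\d.]+)", line)
    local, remote, network = (float(x) for x in m.groups())
    return local, remote, network

# Query 2 from the log above: local time = remote time + network latency.
line = "Query: 2, at time:2/25/2012 10:29:32 PM, local: 181.745, remote: 37, network: 144.745"
local, remote, network = parse_log_line(line)
assert abs(local - (remote + network)) < 1e-6
```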

To download the source code for this utility, click here.


NHDN- Cloud computing [Azure Traffic Manager]

February 17, 2012

NHDN-CloudComputing, February 15th, 6.00 PM – 8.15 PM

Presenter: Udaiappa Ramachandran

Topic: Windows Azure – Traffic Manager & Advanced Web role settings

Abstract: Windows Azure Traffic Manager enables you to manage and distribute incoming traffic to your Windows Azure hosted services, whether they are deployed in the same data center or in different data centers across the world. In this talk we will explore the different load-balancing policies available in Traffic Manager and how we can use them to enhance performance, increase availability, and balance traffic to your hosted services.

Place: Daniel Webster College (DWC), Eaton Richmond Center (Room #122), 20 University Drive, Nashua, NH 03063.



Azure Application Drive size limit and Running Multiple WebSites in the cloud

January 13, 2012

The first thing I tried was deploying multiple websites (with minimal templates and config) in one package, and then having a background task download all the dynamic templates (*.aspx, *.ascx) from the corresponding blob container and copy them to the site root.  I got an exception because this exceeded the application drive size limit of 1 GB.  In my project every site consumed about 300 MB of dynamic templates, including some static files.  I had 10+ sites and therefore needed at least 3 GB, but by default Azure provides only 1 GB of application drive (either E: or F:), and unfortunately this cannot be changed.  Then I tried the following workaround proposed by a Microsoft support person.  It seems to be working for me, but I have yet to run the performance tests…

  1. Create empty or minimal files in the package for all the sites so that your package deploys.
  2. Move the rest of the dynamic templates and static files to blob storage.
  3. Run a background task (you must configure this in your package):
    1. Use Local Storage in your code and download all the data into specific folders in local storage.
    2. Windows Server 2008 supports symbolic links, so even though your files are stored in Local Storage, they still appear to be inside the application drive.
    3. After you download the blobs from Azure Storage, create a symbolic link from your application folder to local storage.
    4. Use the mklink command to create the symbolic link to the folder, for example:

mklink /D <link> <target>

Ex., MKLINK /D E:\Sitesroot\help C:\Resource\help

MKLINK /D E:\Sitesroot\1\help C:\Resource\help

MKLINK /D E:\Sitesroot\2\help C:\Resource\help
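The same download-then-link pattern can be sketched on a POSIX box like this (an illustrative Python sketch, with os.symlink standing in for mklink /D; the folder layout mimics C:\Resource and E:\Sitesroot but the paths here are hypothetical temp directories):

```python
import os
import tempfile

def link_templates(local_storage_root, site_root, folder="help"):
    """After the background task has downloaded blobs into local storage,
    expose them inside the application folder via a symbolic link --
    the equivalent of: mklink /D E:\\Sitesroot\\help C:\\Resource\\help."""
    source = os.path.join(local_storage_root, folder)  # lives in local storage
    link = os.path.join(site_root, folder)             # appears under the app drive
    os.makedirs(source, exist_ok=True)
    if not os.path.lexists(link):
        os.symlink(source, link)
    return link

# Hypothetical layout standing in for C:\Resource and E:\Sitesroot:
root = tempfile.mkdtemp()
local_storage = os.path.join(root, "Resource")
site = os.path.join(root, "Sitesroot")
os.makedirs(site)
link = link_templates(local_storage, site)
```

Files written under the local-storage folder are then reachable through the application-folder path, which is the whole point of the workaround.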

You can find related documents from the following url:


Sequencing in SQL Server

August 23, 2011

I design large enterprise systems.  One challenge I have always had is increasing concurrency while maintaining integrity, and sequencing solves this problem.  Our database records use BIGINT as the identifier, with the id space distributed across servers; the scheme can accommodate up to 2,147,483,647 servers.  Every server gets a range of Int32.MaxValue ids, and servers can be synchronized with each other in a peer-to-peer or hub-spoke model using the Sync Framework.

An identity column is not a good solution because it can only cover two servers (one taking negative ids, the other positive); beyond that we would need a lot of long-running blocking and locking to make it work.

A custom stored procedure backed by a table works, but it ran into concurrency issues.

Solution: Sequencing is the fastest and most efficient solution I have found.  Here is how it works.



create table dbo.MyTable
(
    someid bigint not null,
    somevalue varchar(40) not null
)

CREATE SEQUENCE [dbo].[MyTableSequence]
    AS [bigint]
    START WITH 1
    INCREMENT BY 1
    MAXVALUE 2147483648

declare @id bigint

select @id = NEXT VALUE FOR [dbo].[MyTableSequence]  -- getting the next sequence value

insert into mytable(someid, somevalue) values (@id, convert(varchar(40), newid()))


-- There are situations where I need to get multiple ids at once; in that case we can use the sp_sequence_get_range procedure.

DECLARE @fv sql_variant, @lv sql_variant;

EXEC sys.sp_sequence_get_range
    @sequence_name = N'[dbo].[MyTableSequence]',
    @range_size = 10,
    @range_first_value = @fv OUTPUT,
    @range_last_value = @lv OUTPUT;

--SELECT fv = CONVERT(bigint, @fv), lv = CONVERT(bigint, @lv), next = NEXT VALUE FOR dbo.[MyTableSequence];

-- Here is how I use this query:

declare @fv1 bigint, @lv1 bigint

set @fv1 = convert(bigint, @fv)
set @lv1 = convert(bigint, @lv)

while (@fv1 <= @lv1)
begin
    insert into mytable(someid, somevalue) values (@fv1, convert(varchar(40), newid()))
    set @fv1 = @fv1 + 1
end

Summary: I have seen that sequencing gives better performance than identity columns and fits easily into my existing design.
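On the client side, each server can reserve a block of ids via sp_sequence_get_range and hand them out locally without going back to the database for every row. A minimal Python sketch of that allocation pattern (the reserve callable here simulates the stored procedure call; names are illustrative):

```python
class RangeAllocator:
    """Hands out ids from ranges reserved in bulk, the way a server
    consumes an sp_sequence_get_range block before asking for another."""

    def __init__(self, reserve, range_size=10):
        self._reserve = reserve        # callable returning (first, last), e.g. a DB call
        self._range_size = range_size
        self._next = None
        self._last = None

    def next_id(self):
        # Reserve a fresh block when the current one is exhausted.
        if self._next is None or self._next > self._last:
            self._next, self._last = self._reserve(self._range_size)
        value = self._next
        self._next += 1
        return value

# Simulated sp_sequence_get_range: each call returns the next contiguous block.
state = {"next": 1}
def fake_reserve(size):
    first = state["next"]
    state["next"] += size
    return first, first + size - 1

alloc = RangeAllocator(fake_reserve, range_size=10)
ids = [alloc.next_id() for _ in range(25)]  # spans three reserved blocks
```

Only three round-trips are simulated for 25 ids, which is why the range variant scales better under concurrency than per-row id generation.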

You can download the source code here

Hybrid cloud using Service Bus – NHDN-Nashua talk

July 21, 2011

July 20th: Nashua User group meeting.

Today I presented on hybrid cloud using Service Bus.  The talk covered a complete walkthrough and presentation of how to connect clients and services via SOAP and REST over the Service Bus using the Windows Azure platform AppFabric SDK, and how to enable automatic service activation by connecting a WCF service in IIS 7.5 to the Service Bus.

You can download source code here.

Udai @ WPC, Los Angeles

July 18, 2011

I attended WPC 2011 in Los Angeles, July 10–14, 2011, and presented Ektron in Azure at the Azure Theater ISV booth between 11.30 AM and 12.30 PM on July 11th.  The demo included synchronizing data between on-premises and cloud, both data (SQL Server to SQL Azure) and blobs (file system to Azure blob storage), AppFabric caching, using the FileSystem API to write data to files and blobs, etc.  A public press release is also available.

NH Code Camp 3 (Azure AppFabric Cache)

June 12, 2011

Code Camp: NHDN Code Camp3, June 4, 2011

I organized the speakers for this code camp.  I also presented an evening session about Azure AppFabric cache and attended “25 things about C#”.  You can download my presentation and sample code from

Case-sensitive Azure BLOB Urls-Fix

March 25, 2011

Blob URLs are case sensitive.  If you have ever stored a file or image in a Windows Azure blob container, you need to access the URL with the exact casing; otherwise you will get a blob-not-found error.  To resolve this issue:

While writing a blob, always use lower case.

While reading a blob, convert your URL to lowercase and redirect to the blob.

For example, you can write an HttpHandler or HttpModule to perform the redirect.  Here is a simple code snippet.

using System;
using System.Web;
using Microsoft.WindowsAzure.ServiceRuntime;

public class BlobModule : IHttpModule
{
    public void Dispose()
    {
    }

    public void Init(HttpApplication context)
    {
        context.BeginRequest += new EventHandler(context_BeginRequest);
    }

    void context_BeginRequest(object sender, EventArgs e)
    {
        // Blob URLs are case sensitive, so compare and redirect in lower case.
        string contextUrl = HttpContext.Current.Request.Url.LocalPath.ToLower();
        if (contextUrl.Contains("myfile/"))
        {
            string blobUrl = RoleEnvironment.GetConfigurationSettingValue("BlobOrCdnUrl")
                + RoleEnvironment.GetConfigurationSettingValue("ContainerName")
                + contextUrl;
            HttpContext.Current.Response.Redirect(blobUrl, true);
        }
    }
}

And then in your web.config you can add the following snippet under the modules section.

<modules runAllManagedModulesForAllRequests="true">
  <add name="myname" type="mynamespace.blobmodule,myassembly" preCondition="integratedMode" />
</modules>


That’s it.  Now your blob can be accessed using a mixed-case URL.