Archive for the ‘Udai knows’ Category

Updating Content Approval Status using SharePoint Designer 2013 Workflow

November 20, 2015

I recently changed Akumina's InterChange to work with SharePoint 2013 Designer workflows, but ran into an issue: I could not update the content approval status (the _ModerationStatus field) from a Designer workflow using the built-in Update List Item activity.  Here are step-by-step instructions for updating the content approval status from a SharePoint Designer workflow using the Call HTTP Web Service activity.

  1. Activate "Workflows can use app permissions" from Site Collection –> Site Settings –> Site Actions –> Manage Site Features.
  2. Go to Site Collection –> Site Settings –> Users and Permissions, click Site app permissions, and copy the GUID of the workflow app.
  3. Go to the site collection URL ~/_layouts/15/appInv.aspx and complete the following values:
    1. AppId: the value copied in the step above
    2. Title: workflow
    3. AppDomain: yourdomain (or machinename)
    4. Redirect URL: http://YourDomain (or http://<machinename>)
    5. Permission Request XML:

      <AppPermissionRequests><AppPermissionRequest Scope="http://sharepoint/content/sitecollection/web" Right="FullControl" /></AppPermissionRequests>

    6. Click Create.  You will be prompted with a consent dialog; click Trust It.
  4. Launch SharePoint Designer and create a List Workflow (the same applies to a Reusable or Site Workflow).
  5. From the Designer actions, drop the Build Dictionary activity, name it "ApprovalRequestHeader", and add the following fields:
    Name Type Value
    accept String application/json;odata=verbose
    content-type String application/json;odata=verbose
    X-HTTP-Method String MERGE
    IF-MATCH String *
  6. From the Designer actions, drop the Build Dictionary activity, name it "RequestApprovalMetadata", and add the following field.  Note that the Approval Required setting must be turned on for this value to appear.
    Name Type Value
    type String SP.Data.<LISTNAME>ListItem

    You can get this value by browsing to http://<yoursite>/_api/web/lists

    The value should come from ListItemEntityTypeFullName.

  7. From the Designer actions, drop the Build Dictionary activity, name it "RequestApprovalParam", and add the following fields:
    Name Type Value
    __metadata Dictionary Variable: RequestApprovalMetadata

    This refers to the "RequestApprovalMetadata" dictionary created in the previous step.

    OData__ModerationStatus Integer 0

    You can get the field type/name by browsing http://<yoursite>/_api/web/lists/getbyTitle('<LISTNAME>')/items


  8. From the Designer actions, drop the Call HTTP Web Service activity and set its properties.
  9. Publish the workflow
  10. Package and deploy as needed.


Nashua .NET/Cloud Computing talk, Sep 19, 2012

September 20, 2012

At today's meeting we discussed ASP.NET Web API.  The talk included an introduction, Web API routing, the Web API pipeline, dependency injection and resolvers, model binding and media formats, self-hosting, security, and deploying Web API to the cloud.

Click here to download the ppt
Click here to download the source code
Video of this presentation coming soon

Windows Azure SDK 1.7 and new features

June 21, 2012

NHDN Cloud Computing talk Jun 20th, 2012.

We discussed all the new features from the June 7th announcement and SDK 1.7: how to utilize the cloud services, cloud storage, virtual machines, and web sites.

click here to download the presentation

click here to download the sample code on caching and service bus (before running the sample, replace the values for the appSettings keys)

Cloud Storage (Azure Blob and Amazon S3)

April 19, 2012

Today at the NHDN Cloud Computing user group meeting at Daniel Webster College (DWC) in Nashua, we talked all about cloud storage.  The talk included an overview of cloud storage, covering Azure Blob and Amazon S3: blob storage, Azure Drive, and S3 via REST and the APIs.

You can download the presentation here

download the sample source here

Measuring SQL Azure network latency

February 26, 2012

SQL Azure uses a multi-tenancy model with synchronous replication to two additional copies, and is designed with data integrity and safety first in mind.  SQL Azure won't scale like on-premises SQL Server.  In my project I needed to run millions of queries within the cloud (worker role -> SQL Azure), and there was a severe performance issue.  It is not possible to rewrite every query into a batch; certain queries have to be executed one at a time.  So I developed this exe to help support engineers identify the bottleneck.

To use the exe, browse to your SQL file.  The file can contain one or more SQL statements separated by GO statements.  From the UI, fill in the connection credentials: SERVERNAME, DATABASE (if the database does not exist, the exe creates it and deletes it at the end), LOGIN, and PWD.  Note that if LOGIN and PWD are empty, the connection string switches to Windows authentication (which works locally but not for SQL Azure, since the cloud database requires SQL authentication).

Once you click "Run", the exe parses your SQL file and runs all the queries.  It creates a TestLog file that looks like this:

2/25/2012 10:29:28 PM: Creating database my1…
Query: 1, at time:2/25/2012 10:29:28 PM, local: 820.2274, remote: 0, network: 0
Query: 1, at time:2/25/2012 10:29:31 PM, local: 163.5987, remote: 120, network: 43.5987
Query: 2, at time:2/25/2012 10:29:32 PM, local: 181.745, remote: 37, network: 144.745
Query: 3, at time:2/25/2012 10:29:32 PM, local: 49.9357, remote: 7, network: 42.9357
Query: 4, at time:2/25/2012 10:29:32 PM, local: 88.3577, remote: 40, network: 48.3577
Query: 5, at time:2/25/2012 10:29:33 PM, local: 78.0235, remote: 37, network: 41.0235
Totals: local: 00:00:01.3818880, remote: 00:00:00.2410000, network: 00:00:00.3206606

Here is how the time is spent on each query in your file:

LOCAL = remote (time spent on SQL Azure) + network (latency)
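In other words, given the wall-clock time measured at the client (local) and the execution time reported by the server (remote), the network share is just the difference.  A tiny illustration, with numbers taken from the log above:

```python
def network_latency_ms(local_ms: float, remote_ms: float) -> float:
    """LOCAL = remote (time spent on SQL Azure) + network (latency),
    so the network component is local minus remote."""
    return local_ms - remote_ms

# Query 1, second run in the log: local 163.5987 ms, remote 120 ms -> network 43.5987 ms
print(network_latency_ms(163.5987, 120))
```

Note that the first run (remote: 0) includes connection setup, which is why its local time is so much higher.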

To download the source code for this utility, click here.

NHDN- Cloud computing [Azure Traffic Manager]

February 17, 2012

NHDN-CloudComputing, February 15th, 6.00 PM – 8.15 PM

Presenter: Udaiappa Ramachandran

Topic: Windows Azure – Traffic Manager & Advanced Web role settings

Abstract: Windows Azure Traffic Manager enables you to manage and distribute incoming traffic to your Windows Azure hosted services, whether they are deployed in the same data center or in different centers across the world. In this talk we will explore the different load-balancing policies available in Traffic Manager and how we can use them to enhance performance, increase availability, and balance traffic to your hosted services.

Place: Daniel Webster College (DWC), Eaton Richmond Center (Room #122), 20 University Drive, Nashua, NH 03063.



Azure Application Drive size limit and Running Multiple WebSites in the cloud

January 13, 2012

The first thing I tried was deploying multiple websites (with minimal templates and config) in one package.  Then, from a background task, I downloaded all the dynamic templates (*.aspx, *.ascx) from the corresponding blob and copied them to the site root.  I got an exception because it exceeded the 1 GB size limit.  In my project every site consumed 300 MB of dynamic templates plus some static files.  I had 10+ sites and hence needed at least 3 GB of application drive space, but by default Azure provides only a 1 GB application drive (E or F), and unfortunately this cannot be changed.  I then tried the workaround below, proposed by a Microsoft support engineer.  It seems to work for me, but I have yet to run the performance tests.

  1. Create the package with empty or minimal files for all the sites so that the package deploys.
  2. Move the rest of the dynamic templates and static files to blob storage.
  3. Run a background task (you must configure this in your package):
    1. Use Local Storage in your code and download all the data into specific folders in local storage.
    2. Windows Server 2008 supports symbolic links, so even though your files are stored in Local Storage, they still appear to be inside the application drive.
    3. After you download the blobs from Azure Storage, create a symbolic link from your application folder to local storage.
    4. Use the mklink command to create the symbolic links to the folders, e.g.:

MKLINK /D <link> <target>

Ex.: MKLINK /d E:\Sitesroot\help C:\Resource\help

MKLINK /d E:\Sitesroot\1\help C:\Resource\help

MKLINK /d E:\Sitesroot\2\help C:\Resource\help
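Inside the background task, the linking step can also be sketched programmatically.  This is a minimal illustration only, using os.symlink in place of mklink, with hypothetical folder names rather than the actual deployment code:

```python
import os

def link_local_storage(siteroot: str, local_storage: str, folders):
    """For each folder downloaded into local storage, create a symbolic link
    under the application (sitesroot) drive -- the programmatic equivalent of
    MKLINK /D <siteroot>\\<folder> <local_storage>\\<folder>."""
    links = []
    for name in folders:
        target = os.path.join(local_storage, name)
        link = os.path.join(siteroot, name)
        os.makedirs(target, exist_ok=True)  # folder filled by the blob download
        if not os.path.lexists(link):
            os.symlink(target, link, target_is_directory=True)
        links.append(link)
    return links
```

On Windows, creating symbolic links requires an elevated process, the same privilege mklink itself needs.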

You can find related documents from the following url:


August 23, 2011

I design large enterprise systems.  One challenge I have always faced is increasing concurrency while maintaining integrity.  Sequencing solves this problem.  Our database records are designed to use BIGINT as the identifier and are distributed across servers; the scheme can support up to 2,147,483,647 servers.  Every server gets a range of at most Int32.MaxValue IDs, and servers can be synchronized with each other in a peer-to-peer or hub-and-spoke model using the Sync Framework.
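The ID partitioning described above can be sketched as follows; the `id_range` helper and the zero-based `server_index` are my illustration of the scheme, not the author's code:

```python
IDS_PER_SERVER = 2_147_483_647  # Int32.MaxValue IDs per server

def id_range(server_index: int) -> tuple:
    """Inclusive BIGINT ID range owned by a server: server 0 gets
    1..Int32.MaxValue, server 1 the next block, and so on.  The BIGINT
    space leaves room for 2,147,483,647 such blocks."""
    first = server_index * IDS_PER_SERVER + 1
    last = (server_index + 1) * IDS_PER_SERVER
    return first, last
```

Because the ranges never overlap, each server can hand out IDs locally with no cross-server coordination.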

An identity column is not a good solution because it can only scale to two servers (negative IDs and positive IDs); beyond that, we would need lots of long-running blocking and locking to make it work.

A custom stored procedure backed by a table works, but runs into concurrency issues.

Solution: Sequencing is the fastest and most efficient solution I have found.  Here is how it works.



CREATE TABLE dbo.MyTable
(
    someid BIGINT NOT NULL PRIMARY KEY,
    somevalue VARCHAR(40) NOT NULL
)

CREATE SEQUENCE [dbo].[MyTableSequence]
    AS [bigint]
    START WITH 1
    INCREMENT BY 1
    MAXVALUE 2147483648

DECLARE @id BIGINT

SELECT @id = NEXT VALUE FOR [dbo].[MyTableSequence]  -- getting the next sequence value

INSERT INTO dbo.MyTable (someid, somevalue) VALUES (@id, CONVERT(VARCHAR(40), NEWID()))


-- There are situations where I need multiple IDs at once; in that case we can use the sp_sequence_get_range procedure

DECLARE @fv sql_variant, @lv sql_variant;

EXEC sys.sp_sequence_get_range @sequence_name = '[dbo].[MyTableSequence]', @range_size = 10, @range_first_value = @fv OUTPUT, @range_last_value = @lv OUTPUT;

-- SELECT fv = CONVERT(bigint, @fv), lv = CONVERT(bigint, @lv), next = NEXT VALUE FOR dbo.[MyTableSequence];

-- Here is how I use this query

DECLARE @fv1 BIGINT, @lv1 BIGINT

SET @fv1 = CONVERT(BIGINT, @fv)

SET @lv1 = CONVERT(BIGINT, @lv)

WHILE (@fv1 <= @lv1)
BEGIN
    INSERT INTO dbo.MyTable (someid, somevalue) VALUES (@fv1, CONVERT(VARCHAR(40), NEWID()))
    SET @fv1 = @fv1 + 1
END



Summary: I find that sequencing gives better performance than identity columns and is easy to fit into my existing design.

You can download the source code here

Hybrid cloud using Service Bus – NHDN-Nashua talk

July 21, 2011

July 20th: Nashua User group meeting.

Today I presented on hybrid cloud using the Service Bus.  The talk covered a complete walkthrough and presentation of how to connect clients and services via SOAP and REST over the Service Bus using the Windows Azure platform AppFabric SDK, and how to enable automatic service activation by connecting a WCF service in IIS 7.5 to the Service Bus.

You can download source code here.

NH Code Camp 3 (Azure AppFabric Cache)

June 12, 2011

Code Camp: NHDN Code Camp3, June 4, 2011

I organized the speakers for this code camp.  I also presented an evening session about the Azure AppFabric Cache and attended "25 Things About C#".  You can download my presentation and sample code from