Stopping execution of SSIS package in SSMS

Sometimes it is required to execute packages from SSMS after deploying them to the SSISDB catalog in SQL Server. But what if we want to interrupt an execution midway, or the package stops making progress or hangs? Recently one of my colleagues faced this issue, and we couldn't find any quick way to stop the running execution of the package. We even tried restarting Integration Services and other SQL Server services (which was kind of silly).

Then I figured out that it's related to SSISDB, since we're executing the package from the catalog. Every execution running in the catalog launches an ISServerExec.exe process. To see all running packages on the SQL Server instance, follow the steps below:

1. Go to the SSISDB catalog in SQL Server Management Studio, right-click it and select "Active Operations". The SSISDB catalog can be found under Integration Services Catalogs, as shown in the image below:



A window will appear showing the currently running packages. Select the package you want to kill and click Stop.
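If you prefer T-SQL over the dialog, the same information is exposed through the SSISDB catalog views, and a running execution can be stopped with catalog.stop_operation (a sketch; substitute your own execution_id for the example value):

```sql
-- List currently running executions (status 2 = running)
SELECT execution_id, folder_name, project_name, package_name, status
FROM SSISDB.catalog.executions
WHERE status = 2;

-- Stop a specific execution by its operation/execution id
EXEC SSISDB.catalog.stop_operation @operation_id = 12345;
```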


Cheers!!! :)

Using NLog for smart logging

NLog is a free logging platform for .NET, Silverlight and Windows Phone with rich log routing and management capabilities. It makes it easy to produce and manage high-quality logs for your application regardless of its size or complexity. When you don't want to worry about archiving or log formats, NLog is your friend: just a few configuration settings enable everything for you. So let's get started.
First, download the appropriate version of NLog for your application. Once you have the NLog library, add a reference to it in your project. The next thing you need to do is put some configuration lines in the config file.
Register the NLog section in config -
    <section name="nlog" type="NLog.Config.ConfigSectionHandler, NLog"/>

Note: make sure the configSections element is the first child of the configuration file, otherwise you'll get an exception along the lines of "Only one <configSections> element allowed per config file and if present must be the first child of the root <configuration> element".
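For reference, a minimal app.config skeleton with the section registered in the right place looks like this:

```xml
<?xml version="1.0"?>
<configuration>
  <!-- configSections must be the first child of <configuration> -->
  <configSections>
    <section name="nlog" type="NLog.Config.ConfigSectionHandler, NLog"/>
  </configSections>

  <!-- the <nlog> section itself goes here, after configSections -->

</configuration>
```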
Next, you need to define the nlog section itself. Below is a sample NLog configuration that enables logging in your application (the connection string, file path and table columns are examples; adjust them for your environment).
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      throwExceptions="true"
      internalLogFile="c:\temp\nLog_internal_log_file.txt"
      internalLogLevel="Trace" internalLogToConsole="true">
  <targets async="true">
    <target name="file" xsi:type="File" keepFileOpen="false"
            fileName="${basedir}/logs/application.log"
            layout="${newline}${newline}${level}: ${message} ${exception:format=Message,StackTrace}" />
    <target xsi:type="Database" name="database">
      <connectionString>data source=LABPC02;initial catalog=SampleDatabase;Integrated Security=SSPI;</connectionString>
      <commandText>
        INSERT INTO Diagnostics (Severity, Message, StackTrace, [User], MachineName)
        VALUES (@severity, @message, @stacktrace, @User, @machinename)
      </commandText>
      <parameter name="@severity" layout="${level}"/>
      <parameter name="@message" layout="${message}"/>
      <parameter name="@machinename" layout="${machinename}"/>
      <parameter name="@User" layout="${windows-identity:domain=true}"/>
      <parameter name="@stacktrace" layout="${stacktrace}"/>
    </target>
  </targets>
  <rules>
    <logger name="*" writeTo="file"/>
    <logger name="*" minlevel="Off" appendTo="database"/>
  </rules>
</nlog>

There are a couple of things in this section that need explaining.

InternalLogFile – this is the file that NLog uses for its own logging. It helps you troubleshoot any issue with the NLog configuration in your project. It's optional; skip it if you don't want to log the status of NLog's internals.

InternalLogToConsole – you can turn this on or off if your application can run as a console application.

Targets – this is the section where you define the targets, i.e. whether your logging goes to a file, a database, etc. The attribute async="true" makes NLog write logs asynchronously, so you don't need to worry about the cost of logging statements. This is the feature I liked most, because I've seen applications spend considerable time on logging statements. In the sample configuration above, both file and database logging are enabled in the targets section.

Layout renderers – whatever is written inside the ${} placeholders in a target's layout is called a layout renderer. The ones I used are templates predefined by NLog, but you can also create your own. NLog already provides quite a good list of layout renderers; check them out in the documentation.

That's pretty much all that's required in the configuration area. Now you need to write a wrapper class that you'll use in your project.

Here’s the complete code for the wrapper:
using System;
using NLog;

namespace MyLogger
{
    public static class Logger
    {
        private static readonly NLog.Logger _logger; // NLog logger
        private const string _DEFAULTLOGGER = "CustomLogger";

        static Logger()
        {
            _logger = LogManager.GetLogger(_DEFAULTLOGGER) ?? LogManager.GetCurrentClassLogger();
        }

        #region Public Methods
        /// <summary>
        /// This method writes the Debug information to the trace target
        /// </summary>
        /// <param name="message">The message.</param>
        public static void Debug(String message)
        {
            if (!_logger.IsDebugEnabled) return;
            _logger.Debug(message);
        }

        /// <summary>
        /// This method writes the Information to the trace target
        /// </summary>
        /// <param name="message">The message.</param>
        public static void Info(String message)
        {
            if (!_logger.IsInfoEnabled) return;
            _logger.Info(message);
        }

        /// <summary>
        /// This method writes the Warning information to the trace target
        /// </summary>
        /// <param name="message">The message.</param>
        public static void Warn(String message)
        {
            if (!_logger.IsWarnEnabled) return;
            _logger.Warn(message);
        }

        /// <summary>
        /// This method writes the Error information to the trace target
        /// </summary>
        /// <param name="error">The error.</param>
        /// <param name="exception">The exception.</param>
        public static void Error(String error, Exception exception = null)
        {
            if (!_logger.IsErrorEnabled) return;
            _logger.ErrorException(error, exception);
        }

        /// <summary>
        /// This method writes the Fatal exception information to the trace target
        /// </summary>
        /// <param name="message">The message.</param>
        public static void Fatal(String message)
        {
            if (!_logger.IsFatalEnabled) return;
            _logger.Fatal(message);
        }

        /// <summary>
        /// This method writes the trace information to the trace target
        /// </summary>
        /// <param name="message">The message.</param>
        public static void Trace(String message)
        {
            if (!_logger.IsTraceEnabled) return;
            _logger.Trace(message);
        }
        #endregion
    }
}
Using this wrapper is very simple; you only need to write lines like this in your code:
Logger.Debug("Time elapsed: " + timeElapsed);
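The Error method also takes the exception, so a typical call site looks like this (a sketch; ProcessOrder and order are hypothetical names standing in for your own code):

```csharp
try
{
    ProcessOrder(order); // hypothetical business call
}
catch (Exception ex)
{
    // Logs the message plus the exception's message and stack trace
    Logger.Error("Failed to process order", ex);
    throw;
}
```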

And you're done setting up a new-generation logger for your application. If you need to turn off logging for any target, use minlevel="Off" on that target's rule.

For example:

<logger name="*" minlevel="Off" appendTo="database"/>

Telerik rad grid sorting with groupby issue [solved]

Telerik's RadGrid is an awesome, good-to-have control when you are in a Rapid Application Development environment. Its built-in features and client-side support are really helpful when building a feature-rich grid.

Grouping records is one such feature; it provides a view of your records similar to your Outlook inbox, and with its default properties it works nicely. But if you have a default sort applied on your data source and then group records in the RadGrid, the group-by command simply overrides the data source's sort with its own default, i.e. sorting by the grouped column.

That means if you group by a column, say "Category", you'll lose the default sort, which was, let's say, by Name.

To keep your previous sort together with the grouped column, just follow the steps below:

1. Write the code below in the OnNeedDataSource event of the RadGrid.

protected void RadGrid1_OnNeedDataSource(object source, GridNeedDataSourceEventArgs e)
{
    // Removing the default sorter from group-by to apply the data source sorting
    if (RadGrid1.MasterTableView.GroupByExpressions.Count > 0)
    {
        RadGrid1.MasterTableView.GroupByExpressions[0].GroupByFields[0].SortOrder =
            GridSortOrder.None;
    }
}

2. To avoid messing with the original sort command, handle the SortCommand event and add the snippet below:

protected void RadGrid1_SortCommand(object source, GridSortCommandEventArgs e)
{
    var sortField = e.SortExpression;
    // Adding the sorting back if the user sorts the grid by the grouped field itself
    if (RadGrid1.MasterTableView.GroupByExpressions.Count > 0)
    {
        var groupBySortExpression = RadGrid1.MasterTableView.GroupByExpressions[0].GroupByFields[0].FieldName;
        if (groupBySortExpression == sortField)
            RadGrid1.MasterTableView.GroupByExpressions[0].GroupByFields[0].SortOrder = e.NewSortOrder;
    }
}

Now you have both features working: RadGrid grouping and your default sort within the grouped records.

Note: this approach will not work if you have multi-column sorting enabled.


Consuming OData service in Telerik Rad Grid

A few days back I wrote an article on how to create an OData service using WCF Data Services. In this article we'll see how to consume that OData service with one of its clients, the Telerik RadGrid. It supports JSON from the OData service as a data source, and you don't even have to write any code-behind; it takes only pure declarative markup on the aspx page.

Try the Telerik AJAX Rad controls today to work with this demo.


Once you have the Telerik controls setup installed, you can start creating a consumer application. It's just a quick set of steps.

Launch Visual Studio and create a new project, selecting Website as your project template.


Add a new WebForm to the web project, or use the Default.aspx already added by the template.

Open the .aspx file in the designer by double-clicking it. Go to the Toolbox and select "Telerik AJAX Data Components" –> RadGrid.


Drag and drop it on the .aspx file.

Next, add a script manager to the page.

<telerik:RadScriptManager ID="RadScriptManager1" runat="server" />

Now design your grid based on the data source items you are going to bind. In this demo we'll use the same service that we created in the earlier article.

So the Grid will look like this -


<telerik:RadGrid ID="RadGrid3" runat="server" AllowPaging="true" AllowSorting="true"
    AllowFilteringByColumn="true" PageSize="5">
    <MasterTableView ClientDataKeyNames="Id">
        <Columns>
            <telerik:GridBoundColumn DataField="Id" HeaderText="Post ID"
                UniqueName="Id" DataType="System.Int32" />
            <telerik:GridBoundColumn DataField="Title" HeaderText="Title"
                UniqueName="Title" DataType="System.String" />
            <telerik:GridBoundColumn DataField="Body" HeaderText="Description"
                UniqueName="Body" DataType="System.String" />
            <telerik:GridNumericColumn DataField="ViewCount" HeaderText="View Count"
                UniqueName="ViewCount" DataType="System.Int32" />
            <telerik:GridNumericColumn DataField="AnswerCount" HeaderText="Answer Count"
                UniqueName="AnswerCount" DataType="System.Int32" />
            <telerik:GridNumericColumn DataField="CommentCount" HeaderText="Comment Count"
                UniqueName="CommentCount" DataType="System.Int32" />
            <telerik:GridDateTimeColumn DataField="CreationDate" HeaderText="Creation Date"
                UniqueName="CreationDate" DataType="System.DateTime" />
        </Columns>
    </MasterTableView>
    <ClientSettings>
        <DataBinding Location="http://localhost:64552/Service.svc/" ResponseType="JSONP">
            <DataService TableName="Posts" Type="OData" />
        </DataBinding>
    </ClientSettings>
</telerik:RadGrid>

Other than designing the grid, all it takes to bind it in the markup above is the ClientSettings section: the DataBinding element points at the service URL with ResponseType="JSONP", and the DataService element names the table and sets the type to OData. This setting enables the grid to use the OData service as its data source.

Now run the project and see how the grid looks:


.. and you’re done with creating a simple OData consumer application.

Auto attach debugger to any process in visual studio

This is especially for SharePoint developers who are frustrated with attaching the IIS process to the debugger, or anyone who debugs a web service hosted in IIS. This post will help speed up your development work, saving at least a few minutes/seconds each time.

Did you know that any process you repeat again and again can be automated in Visual Studio? How? Have you heard of macros? Yes, this is the key to winning this game and getting a step ahead of your co-developers.


Follow these steps to automate attaching a process to the debugger.

1. Go to Tools –> Macros –> Macro Explorer.

2. Add a new item in MyMacros (a new module file). Name it as you wish.

3. Paste the code below into it. Make sure to change the module name to match your new macro file name.

Option Strict Off
Option Explicit Off
Imports System
Imports EnvDTE
Imports EnvDTE80
Imports EnvDTE90
Imports System.Diagnostics

Public Module IISDebugAttach

    Sub AttachDebuggerToIIS()
        Dim processToAttachTo As String = "w3wp.exe"

        If Not AttachToProcess(processToAttachTo) Then
            MsgBox(processToAttachTo & " is not running")
        End If
    End Sub

    Function AttachToProcess(ByVal processName As String) As Boolean

        Dim proc As EnvDTE.Process
        Dim attached As Boolean
        For Each proc In DTE.Debugger.LocalProcesses
            If (Right(proc.Name, Len(processName)) = processName) Then
                proc.Attach() ' attach the debugger to the matching process
                attached = True
            End If
        Next

        Return attached
    End Function
End Module

4. Build it (optional).

5. Go to Tools –> Customize again.

6. Click the Keyboard button at the bottom.

7. Type your macro's name; it will be shown in the filtered list. Select it.

8. Assign a shortcut key to it and apply.

Now forget the headache of attaching the IIS process every time; just press your shortcut key. (Make sure you're running Visual Studio with administrator privileges.)

Have fun. Cheers!!

Signing a third party library with ildasm and ilasm

We all use third-party libraries during development because we don't need to implement everything ourselves. Sometimes you don't have the source code for a library, only the assembly file you use in your project.
What if you have strong-named your project assembly and the third-party library is not signed (i.e. doesn't have a strong name)? You'll be in trouble. Recently I ran into this problem and found a solution using ildasm and ilasm, in two steps.

Step I:
Launch the Visual Studio command prompt and run the command below for some third-party library, let's say Test.dll:
c:\> ILdasm /all /out:Test.IL Test.dll

Step II:
c:\> ilasm /dll /key:YourKey.snk Test.IL

And you’re done.
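Note: the /key switch expects an existing strong-name key pair. If you don't have one yet, you can generate it with the sn.exe tool from the same command prompt, and verify the signed assembly afterwards (the file names are just examples):

```
c:\> sn -k YourKey.snk
c:\> sn -v Test.dll
```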

But again, there can be trouble.

Case: let's say you have .NET 4.0 installed on your system, your project targets .NET 3.5, and the library you're using is a 2.0 assembly. When you do the signing with the steps above, you'll end up changing the third-party library's runtime version to 4.0.

See How to check the version of your assembly?
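One quick way to check which runtime version an assembly was built against is Assembly.ImageRuntimeVersion (a minimal sketch; the path is an example):

```csharp
using System;
using System.Reflection;

class AssemblyVersionCheck
{
    static void Main()
    {
        // ReflectionOnlyLoadFrom inspects the assembly without executing any of its code
        var asm = Assembly.ReflectionOnlyLoadFrom(@"c:\libs\Test.dll");

        // CLR version the assembly targets, e.g. "v2.0.50727" or "v4.0.30319"
        Console.WriteLine(asm.ImageRuntimeVersion);
    }
}
```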

To avoid this problem, you need to take care when re-assembling Test.IL, i.e. the IL of your third-party library generated in step I.

Use the version 2.0 ilasm.exe utility to achieve this; just invoke the 2.0 ilasm.exe explicitly, like this -

c:\> c:\Windows\Microsoft.NET\Framework\v2.0.50727\ilasm.exe /dll /key:YourKey.snk Test.IL

Now check the generated Test.dll again; it should be v2.0.

Problem solved.. :) Cheers!!!

RadCombobox sticking on page issue solved

Recently, I was working with Telerik AJAX Rad controls on a SharePoint site and faced an issue: on a long page with a scroll bar, when you open the combo box and scroll the page with the mouse wheel, the dropdown list sticks in place and moves out of position.
It looks something like this.
After trying CSS and other tricks, I solved it by using the DOM onScroll event. The simple code below works if your scroll bar is on the main window of the browser.
    <script type="text/javascript">
        $(document).ready(function () {
            $(window).scroll(function () {
                var comboBox = $find("RadComboBox1");
                if (comboBox) {
                    comboBox.hideDropDown(); // close the popup so it doesn't stick
                }
            });
        });
    </script>

But I ran into another issue: the code above wasn't working in my case. I soon realized that the scroll bar wasn't on the window; it was on the content div of the SharePoint site.

So I changed the scroll event from the window to the div -

<script type="text/javascript">
    $(document).ready(function () {
        $("#div-content").scroll(function () {
            var comboBox = $find("RadComboBox1");
            if (comboBox) {
                comboBox.hideDropDown(); // close the popup so it doesn't stick
            }
        });
    });
</script>

And it works like a charm. So I'm sharing it in case it helps you out of trouble.

Creating OData service using WCF DataService

What is OData? This should be the first question if you're new to the term. Here is a brief description:

“The Open Data Protocol (OData) is a Web protocol for querying and updating data that provides a way to unlock your data and free it from silos that exist in applications today. OData does this by applying and building upon Web technologies such as HTTP, Atom Publishing Protocol (AtomPub) and JSON to provide access to information from a variety of applications, services, and stores. The protocol emerged from experiences implementing AtomPub clients and servers in a variety of products over the past several years. OData is being used to expose and access information from a variety of sources including, but not limited to, relational databases, file systems, content management systems and traditional Web sites.”

Reference – www.odata.org

Want to know more about the what, why and how? Visit the OData site.
Once you go through the OData specification, you'll know where it can fit into your requirements when designing a service architecture.

Let's not dig deep into these decision-making arguments and get back to business: how can we create an OData service using WCF Data Services?

So go ahead and launch Visual Studio 2010 (or 2012 if you have it installed; it's available as a free RC as I write this article).
Create a web project, or rather add a class library project first (let's keep a little layering and separation of concerns).

Below is a simple, step-by-step walkthrough, keeping in mind that some beginner (new to VS) might have trouble with written instructions alone.

Name it Demo.Model, as we're creating a sample service and this project will act as the model (data) for our service.

Now go ahead and Add New Item –> ADO.NET Entity Data Model.
Name it DemoModel.edmx. When you click Add, a popup will wait for your input: you can either generate your entities from a database or create an empty model, which can later be used to generate your database. Make your choice.

We're simply going to use an existing database, as I already have a nice test database from the Stack Overflow data dumps.

Click Next, get your connection string from New Connection, and you're ready to click Next again.
Select the database objects you want to include in your entities.
For creating the OData service we only need the tables, so after clicking Finish your entity diagram should be complete. Here's the important note: the relationships between the entities should be well defined, because OData will be searching/locating your related entities based on your query (see URI queries in the OData documentation).
We're close; more than 50% of the work needed to create the service is done. Forget about the days when we used to define, create and use a client proxy and other configuration settings, including writing methods for every type of data required.
(Note: this post only shows how to get data from an OData service. I may write some other time about Create, Delete and Edit using PUT, DELETE and POST HTTP requests.)
Now go ahead and add a new web project (choose the empty website template).
Add a reference to the Demo.Model project in the newly added website.
Then add a new item to the web project: a WCF Data Service.
Open the code-behind file of your service and write this much code:
[JSONPSupportBehavior]
//[System.ServiceModel.ServiceBehavior(IncludeExceptionDetailInFaults = true)]   //- for debugging
public class Service : DataService<StackOverflow_DumpEntities>
{
    // This method is called only once to initialize service-wide policies.
    public static void InitializeService(DataServiceConfiguration config)
    {
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
        // Set a reasonable page size
        config.SetEntitySetPageSize("*", 25);
        //config.UseVerboseErrors = true;  //- for debugging
        config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V3;
    }

    /// <summary>
    /// Called when [start processing request]. Just to add some caching.
    /// </summary>
    /// <param name="args">The args.</param>
    protected override void OnStartProcessingRequest(ProcessRequestArgs args)
    {
        base.OnStartProcessingRequest(args);
        // Cache for a minute based on the query string
        HttpContext context = HttpContext.Current;
        HttpCachePolicy c = HttpContext.Current.Response.Cache;
        c.SetCacheability(HttpCacheability.ServerAndPrivate);
        c.SetExpires(HttpContext.Current.Timestamp.AddSeconds(60));
        c.VaryByHeaders["Accept"] = true;
        c.VaryByHeaders["Accept-Charset"] = true;
        c.VaryByHeaders["Accept-Encoding"] = true;
        c.VaryByParams["*"] = true;
    }

    /// <summary>
    /// Sample custom method, which OData also supports
    /// </summary>
    /// <returns></returns>
    [WebGet]
    public IQueryable<Post> GetPopularPosts()
    {
        var popularPosts =
            (from p in this.CurrentDataSource.Posts orderby p.ViewCount descending select p).Take(20);

        return popularPosts;
    }
}

Here we added some default configuration settings in the InitializeService() method. Note that I'm using the DataServiceProtocolVersion.V3 version. This is the latest version of the OData protocol in .NET; you can get it by installing WCF Data Services 5.0 for OData V3 and replacing the references to System.Data.Services and System.Data.Services.Client with Microsoft.Data.Services and Microsoft.Data.Services.Client.

Otherwise, it's not a big deal; you can still use DataServiceProtocolVersion.V2 and it works just fine.

Next, in the snippet above, I added some cache support by overriding the OnStartProcessingRequest() method.

And I'm not forgetting the JSONPSupportBehavior attribute on the class. This is an open source library that enables your data service to support JSONP.

First of all, you should know why JSONP and what it adds over JSON. Keeping it simple and brief:

i) Cross domain support

ii) Embedding the JSON inside the callback function call.

And we're done creating the service. Surprised? Don't be. Now let's run the project and see what we've got. If everything goes fine, you'll see it running.


Now here’s the interesting part.

Play with this service and see the power of OData. Notice that we don't have any custom method implementation here; we'll work directly with our entities through the URL and get results in Atom/JSON format. :) Sounds cool?

If you have LINQPad installed, launch it; otherwise download and install it.

Did you know LINQPad supports OData services? We'll add a new connection and select WCF Data Services as our data source.


Now click Next and enter your service URL. (Make sure the service we just created is up and running.)


The moment you click OK, you'll see all your entities in the connection panel.


Select your service in the editor and start typing your LINQ queries.


I need the top 20 posts ordered by view count. So here is my LINQ query:
(from p in Posts
 orderby p.ViewCount descending
 select p).Take(20)

Now run this in LINQPad and see the results.


Similarly, you can write any nested, complex LINQ query against your OData service.
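Under the hood, LINQPad translates such a query into an OData request URI. Against the service above, the top-20-by-view-count query would look something like this (host and port are from the earlier example):

```
http://localhost:64552/Service.svc/Posts?$orderby=ViewCount desc&$top=20
```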

Now here is the key to all of this, the root of the whole post: the OData URI conventions. These are what let you filter and order your data from the entities. By the end of this article, you've essentially created an API like Netflix, Twitter or any other public API.

Moreover, since we used JSONPSupportBehavior, you can add one more parameter to the URL above, enter it in the browser, and go.
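For example, appending $format=json asks for a JSON response, and the $callback parameter (handled by JSONPSupportBehavior) wraps it in a callback function call; the callback name here is just an example:

```
http://localhost:64552/Service.svc/Posts?$top=5&$format=json&$callback=showPosts
```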


In the next post we'll see how to consume this service in a plain HTML page using JavaScript. Keep an eye on the feeds and stay in touch.

HTML5 Day conference presented OData with jQuery

28 July 2012.
IIT Delhi Seminar Hall, Delhi NCR.
Title – HTML5 Day
This was my first public speaking engagement. I presented on the OData protocol, .NET support for creating OData, and consuming an OData service in an HTML5 application using jQuery. It was only a 45-minute session, but I still managed to talk about datajs, Knockout.js, Modernizr and jQuery templates.
It was a really nice experience addressing the Delhi tech crowd and meeting a few of them in person. I'll look forward to more opportunities like this.
I managed to collect a few snaps from there and am sharing them right here…
Hope you'll join me in upcoming events…