2015-03-25

Send Outlook Calendar Requests in Email

Just a few days ago, I received a request to build a procedure that sends a calendar item to the customer when an appointment has been booked.

I did a quick search and found two ways to send calendar items.

iCalendar

The first is to use sp_send_dbmail. The basic concept is to generate an iCalendar item with a SQL statement and then send it as an attachment. If you want to understand the iCalendar format itself, RFC 2445 and RFC 2446 are the places to start.

One thing that confused me is the “PRODID” property. RFC 2445 and RFC 2446 state that PRODID is required, but in my tests I could leave it empty or even omit it entirely. Let me know if you know the reason.
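
For reference, a minimal .ics file, roughly what the query below generates, looks like the following. I have added a PRODID line just to show where it sits; the identifier string is only a placeholder.

 BEGIN:VCALENDAR
 VERSION:2.0
 PRODID:-//My Company//Calendar Demo//EN
 METHOD:PUBLISH
 BEGIN:VEVENT
 CLASS:PUBLIC
 SUMMARY:iCalendar
 DESCRIPTION:This is the iCalendar example
 DTSTART:20150325T110000
 DTEND:20150325T120000
 LOCATION:My Office
 END:VEVENT
 END:VCALENDAR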
 
 
The code below demonstrates how to send the iCalendar item through sp_send_dbmail.
 
 EXEC msdb.dbo.sp_send_dbmail
      @recipients = '<your email address>'
    , @subject = 'iCalendar Example'
    -- the query builds the whole .ics content as a single value
    , @query = 'SET NOCOUNT ON;
        SELECT ''BEGIN:VCALENDAR'' + CHAR(13) + ''VERSION:2.0'' + CHAR(13) + ''METHOD:PUBLISH'' + CHAR(13)
             + ''BEGIN:VEVENT'' + CHAR(13) + ''CLASS:PUBLIC'' + CHAR(13)
             + ''DESCRIPTION:This is the iCalendar example. Most client applications support it.'' + CHAR(13)
             + ''SUMMARY:iCalendar'' + CHAR(13)
             + ''DTSTART:20150325T110000'' + CHAR(13) + ''DTEND:20150325T120000'' + CHAR(13)
             + ''LOCATION:My Office'' + CHAR(13)
             + ''END:VEVENT'' + CHAR(13) + ''END:VCALENDAR'''
    , @attach_query_result_as_file = 1
    , @query_result_header = 0
    , @exclude_query_output = 1
    -- attach the result as an .ics file so mail clients recognise it as a calendar item
    , @query_attachment_filename = '123.ics'
 

Exchange Web Service

Sending a calendar item to someone through Exchange Web Services is quite straightforward: create a CalendarItemType, set its properties, then send it in a CreateItem request. Below is what I did.

First, we need to add a web reference to the Exchange Web Service; the WSDL can usually be found at https://webmail.<company>.com.au/ews/Services.wsdl
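
If you are not using Visual Studio's Add Web Reference dialog, the proxy classes can also be generated from the command line with wsdl.exe; a rough sketch (the namespace and output file name are just examples):

 wsdl.exe /namespace:ExchangeWebService /out:ExchangeServiceBinding.cs https://webmail.<company>.com.au/ews/Services.wsdl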

Now I create a helper method to build the ExchangeServiceBinding object. It is a generic method I use across many of my Exchange Web Service projects.

 // build an ExchangeServiceBinding with the given credentials and EWS endpoint
 private ExchangeServiceBinding GetExchangeServiceBinding(string username, string password, string domain, string url)
 {
     System.Net.NetworkCredential NC = new System.Net.NetworkCredential(username, password, domain);
     ExchangeServiceBinding ESB = new ExchangeServiceBinding();
     ESB.Url = url;
     ESB.Credentials = NC;
     return ESB;
 }

 Finally, in my main code:

 
       string UserName = "your ad name";
       string Password = "your password";
       string Domain = "your domain";
       string EWS_URL = @"https://webmail.<company>.com.au/ews/Exchange.asmx";
       ExchangeServiceBinding esb = GetExchangeServiceBinding(UserName, Password, Domain, EWS_URL);

       // build the appointment
       CalendarItemType appointment = new CalendarItemType();
       appointment.Body = new BodyType();
       appointment.Body.BodyType1 = BodyTypeType.Text;
       appointment.Body.Value = "this is request body";
       appointment.Subject = "Appointment created from esb";
       appointment.Start = new DateTime(2015, 3, 24, 16, 30, 0);
       appointment.StartSpecified = true;
       appointment.End = new DateTime(2015, 3, 24, 17, 0, 0);
       appointment.EndSpecified = true;

       // add the required attendees
       AttendeeType attendee = new AttendeeType();
       attendee.Mailbox = new EmailAddressType();
       attendee.Mailbox.EmailAddress = "attendee email address";
       attendee.Mailbox.RoutingType = "SMTP";
       AttendeeType attendee2 = new AttendeeType();
       attendee2.Mailbox = new EmailAddressType();
       attendee2.Mailbox.EmailAddress = "2nd recipient email address";
       attendee2.Mailbox.RoutingType = "SMTP";
       appointment.RequiredAttendees = new AttendeeType[] { attendee, attendee2 };

       // create the item in the calendar folder and send invitations to all attendees
       CreateItemType CreateRequest = new CreateItemType();
       CreateRequest.SendMeetingInvitations = CalendarItemCreateOrDeleteOperationType.SendToAllAndSaveCopy;
       CreateRequest.SendMeetingInvitationsSpecified = true;
       CreateRequest.SavedItemFolderId = new TargetFolderIdType { Item = new DistinguishedFolderIdType { Id = DistinguishedFolderIdNameType.calendar } };
       CreateRequest.Items = new NonEmptyArrayOfAllItemsType();
       CreateRequest.Items.Items = new ItemType[] { appointment };
       CreateItemResponseType response = esb.CreateItem(CreateRequest);
       var responsemessage = response.ResponseMessages.Items[0];
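
To confirm the appointment was actually created, it is worth checking the response; a minimal check using the generated proxy types could look like this:

       // check whether the CreateItem call succeeded
       if (responsemessage.ResponseClass == ResponseClassType.Success)
       {
           Console.WriteLine("Appointment created and invitations sent.");
       }
       else
       {
           // MessageText contains the error returned by Exchange
           Console.WriteLine("CreateItem failed: " + responsemessage.MessageText);
       }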

Quite simple, isn’t it?

I tested both methods, and here are my thoughts:

Because iCalendar is a standard, the item can be accepted by most applications. The Exchange Web Service method, unfortunately, needs some extra work to save the item as an iCalendar attachment before other applications can recognise it (if you know any other tricks, please let me know).

However, within an MS Outlook and Exchange environment, the Exchange Web Service approach is much cleaner: the request is added to the Outlook calendar directly (for required attendees), so there is no need to open the iCalendar item and save it.

Let me know if you have any ideas about this procedure :D

2015-03-19

Report Migration - Phantom Connection String in Catalog table

As mentioned in my prior post, I did a SQL Server migration back in 2014. When I migrated the reports from server A to server B, I found something worth recording here.

Let's start from the beginning. What I have is:

The old SQL Server is 2008 R2, with Reporting Services installed using default settings
  • Reporting Services URL: http://<SS2008R2>/reportserver
  • Report catalog DB name: ReportServer
The new SQL Server is 2012, with Reporting Services installed using default settings
  • Reporting Services URL: http://<SS2012>/reportserver
  • Report catalog DB name: ReportServer

What I need to do is to migrate hundreds of reports from SS2008R2 to SS2012, and update all connection strings from SS2008R2 to SS2012.

There are quite a few ways to do this task: the RS utility, the report migration tool published by Microsoft (http://www.microsoft.com/en-au/download/details.aspx?id=29560), and so on. But because the connection strings inside the reports need to be updated as well, I finally decided to write some short C# code to do the migration. The code will:
  • copy the reports from SS2008R2
  • deploy the reports to SS2012
  • update the data source to SS2012

The code is quite simple. First, web references to the ReportingService2010 web service on both SS2008R2 and SS2012 need to be added; the endpoint is usually at http://<ReportServer>/reportserver/reportservice2010.asmx

Now I initialize both the source and the target server proxies:

 // proxy for the source (SS2008R2) report server
 RSSource2010 = new RSSource.ReportingService2010();
 RSSource2010.Credentials = System.Net.CredentialCache.DefaultCredentials;
 RSSource2010.Url = @"http://" + SourceServer + "/reportserver/reportservice2010.asmx";
 // proxy for the target (SS2012) report server
 RSTarget2010 = new RSTarget.ReportingService2010();
 RSTarget2010.Credentials = System.Net.CredentialCache.DefaultCredentials;
 RSTarget2010.Url = @"http://" + TargetServer + "/reportserver/reportservice2010.asmx";
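
Since there are hundreds of reports, I also need a way to enumerate them on the source server. A rough sketch using ListChildren could look like the following; it assumes all reports sit under the root folder "/" and that MigrateReport is a hypothetical wrapper around the migration code shown in the next snippet:

 // enumerate all reports on the source server, then migrate each one
 RSSource.CatalogItem[] SourceItems = RSSource2010.ListChildren("/", true);
 foreach (RSSource.CatalogItem item in SourceItems)
 {
     if (item.TypeName == "Report")
     {
         // item.Path becomes the ReportPath variable used in the migration code below
         // MigrateReport is a hypothetical helper wrapping that code
         MigrateReport(item.Path);
     }
 }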

The code below shows how to migrate a report and change its data source connection string. I have simplified it a bit; in a production environment it could be far more complicated.

 //to make it simple, assume the report will be migrated to the root folder, "/"
 string TargetPath = "/";
 //get the source report definition; the variable ReportPath is the full path of the source report
 //we can get it from the Catalog table (not suggested by MS) or from the web service
 byte[] ReportContents = RSSource2010.GetItemDefinition(ReportPath);
 RSTarget.Warning[] warnings = null;
 try
 {
      //create the target item
      RSTarget.CatalogItem TargetItem = RSTarget2010.CreateCatalogItem("Report", "Migrated_Report", TargetPath, true, ReportContents, null, out warnings);
      //the data source of the target item needs to point to the new server
      //to make it simple, assume the data source connection string is embedded in the report
      //otherwise it could be an expression or a reference to a shared data source
      RSTarget.DataSource[] DSs = RSTarget2010.GetItemDataSources(TargetItem.Path);
      RSTarget.DataSource[] NewDSs = new RSTarget.DataSource[DSs.Length];
      for (int i = 0; i < DSs.Length; i++)
      {
           //copy the data source name
           RSTarget.DataSource NewDS = new RSTarget.DataSource();
           NewDS.Name = DSs[i].Name;
           //read the source report data source
           var t = DSs[i].Item.GetType();
           if (t.Name == "DataSourceDefinition")
           {
                RSTarget.DataSourceDefinition TargetDSDef = new RSTarget.DataSourceDefinition();
                RSTarget.DataSourceDefinition SourceDSDef = (RSTarget.DataSourceDefinition)DSs[i].Item;
                //point the connection string at the new server
                //and copy the remaining properties from the source definition
                TargetDSDef.ConnectString = SourceDSDef.ConnectString.ToUpper().Replace(SourceServer.ToUpper(), TargetServer.ToUpper());
                TargetDSDef.Enabled = true;
                TargetDSDef.Extension = SourceDSDef.Extension;
                TargetDSDef.CredentialRetrieval = RSTarget.CredentialRetrievalEnum.Integrated;
                TargetDSDef.WindowsCredentials = SourceDSDef.WindowsCredentials;
                NewDS.Item = TargetDSDef;
                NewDSs[i] = NewDS;
           }
      }
      //apply the new data sources to the migrated report
      RSTarget2010.SetItemDataSources(TargetItem.Path, NewDSs);
 }
 catch (System.Web.Services.Protocols.SoapException ex)
 {
      //the report server returns errors as SOAP exceptions
      Console.WriteLine(ex.Message);
 }

The code works. If you run the migrated report, it executes correctly and the data comes from SS2012.

BUT

If you open the report in Report Builder, the connection string has been updated and shows as

DATA SOURCE=<SS2012>;INITIAL CATALOG=REPORTSERVER

However, if you query the Catalog table with the statement below, you can see that the data source stored in the Catalog table still points to SS2008R2:

 ;WITH XMLNAMESPACES ( DEFAULT 'http://schemas.microsoft.com/sqlserver/reporting/2010/01/reportdefinition',
                       'http://schemas.microsoft.com/SQLServer/reporting/reportdesigner' AS rd )
 SELECT ReportName       = a.Name
      , DataSourceName   = x.value('(@Name)[1]', 'VARCHAR(250)')
      , DataProvider     = x.value('(ConnectionProperties/DataProvider)[1]', 'VARCHAR(250)')
      , ConnectionString = x.value('(ConnectionProperties/ConnectString)[1]', 'VARCHAR(250)')
 FROM ( SELECT C.Name, CONVERT(XML, CONVERT(VARBINARY(MAX), C.Content)) AS reportXML
        FROM ReportServer.dbo.Catalog C
        WHERE C.Content IS NOT NULL
          AND C.Name LIKE '%the report migrated%'
          AND C.Type = 2
      ) a
 CROSS APPLY a.reportXML.nodes('/Report/DataSources/DataSource') r ( x )
 ORDER BY a.Name;

I tried this procedure on my VM and got the same result, so I am not quite sure what causes this discrepancy, even though the migrated report works. If you happen to be reading this post and know what is going on, please let me know :D

2015-03-04

SSIS Design Pattern - Handling flat file with variable column numbers

In this post I want to show you how I deal with a flat file in which rows can have a different number of columns.

Brief Background

When working on ETL design, it is quite common that data comes from various sources, such as SFTP, email attachments, an online web portal, etc. Usually in this case I design one package per data source and use a parent package to orchestrate the overall project.

However, there are cases where the client does this consolidation themselves: the client extracts data from the various sources and combines everything into one single file. It sounds good, but sometimes it leads to trouble; for example, there is probably no assurance about the data quality.

In one of my projects, I was asked to create a customer pool for marketing campaign purposes. The client extracted data from various sources (the billing system, the sales & marketing department, a 3rd-party data broker) and generated one big flat file on a weekly basis.

The nightmare is data quality: some rows could have 4 columns, some rows could have 6 columns... So I built the package below to solve the problem.

Actual Design

First, let's look at the problem.

I have an SSIS package open, and within the package a Flat File Connection Manager that points to a sample file.


Now I set up the columns as normal.



And now I add a Data Flow Task, and within it I add a few components as shown below.

Quite straightforward, isn't it? Now I run it and get an error:


Opening the sample file in Excel, we can see that the varying number of columns is the actual cause of the error.



Well, there are plenty of ways to handle this type of issue:

  • Ask the client to fix the data problem and control the data quality. As everyone knows, the best place to fix data quality is at the source; unfortunately, in my case, that was too good to be true.
  • Use some .NET code in a Script Task to handle it. It works and should not be a problem (a rough sketch follows this list).
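
For completeness, here is a rough sketch of that Script Task idea, written as a simple pre-processing step that splits the incoming file into "good" rows and rows that need review before the data flow runs. The file paths, the pipe delimiter and the expected column count of 4 are assumptions for illustration only.

 // hypothetical pre-processing for a Script Task: split the incoming file
 // into rows with the expected column count and rows that need manual review
 string inputFile = @"C:\Data\customer_pool.txt";         // assumed path
 string goodFile = @"C:\Data\customer_pool_good.txt";     // assumed path
 string reviewFile = @"C:\Data\customer_pool_review.txt"; // assumed path
 int expectedColumns = 4;                                  // assumed column count

 using (var good = new System.IO.StreamWriter(goodFile))
 using (var review = new System.IO.StreamWriter(reviewFile))
 {
     foreach (string line in System.IO.File.ReadLines(inputFile))
     {
         // splitting on the pipe delimiter tells us how many columns the row has
         if (line.Split('|').Length == expectedColumns)
             good.WriteLine(line);
         else
             review.WriteLine(line);
     }
 }
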
Here is another way to do it: I treat the file as a single-column file, use a Derived Column to count the delimiters in each row, and finally use a Conditional Split Component to route the stream.

First, let's change the settings in the Flat File Connection Manager. Because we want the file to be treated as a single-column file, I pick a column delimiter that never appears in the data, such as "$$$".


Now, to avoid truncation errors, I change the column length to 8000.



Then the Data Flow Task layout looks like the one below. The Derived Column Component counts the delimiters in each row, and the Conditional Split Component directs the stream based on this newly created derived column.

In the Derived Column Component, I write two expressions. The first one counts the number of tokens using the TOKENCOUNT function introduced in SSIS 2012. If you are using SSIS 2008, the second expression will work for you; it counts the number of delimiters instead, which is always one less than the number of tokens, so adjust the split conditions accordingly.

SSIS 2012: TOKENCOUNT([Name|LogDate|Phone|OptIn],"|")
SSIS 2008: LEN([Name|LogDate|Phone|OptIn]) - LEN(REPLACE([Name|LogDate|Phone|OptIn],"|",""))


In the Conditional Split Component, I define three outputs: rows with the expected column count, which can be loaded directly; rows that need to be handled by a dynamic reader; and rows that need to be reviewed manually.
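
As an example, if the file is supposed to have 4 columns (so TOKENCOUNT returns 4 for a good row), the split conditions could look like the following. The derived column name TokenCount and the threshold of 4 are only assumptions for illustration:

 Output "Correct"  : TokenCount == 4    (load directly)
 Output "Dynamic"  : TokenCount > 4     (handle with a dynamic reader)
 Default output    : everything else    (review manually)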


Now run the package again, and it does exactly what I need :D