Friday, March 13, 2009

WCF auto generated client proxy classes - instantiation gotcha

WCF auto-generated client proxy classes all implement IDisposable, which makes them look like natural candidates for instantiation via a using statement, for example:

using (MyServiceBindingClient client = new MyServiceBindingClient())
{
    client.Open();
}

This approach should not be taken when working with WCF client proxy classes, because if an exception is thrown inside the using block above, you will never get to see the true exception.

The problem is that if the client.Open() call (or any subsequent call placed on the client) throws an exception, we leave the using block, which causes the client’s Dispose() method to be called. Dispose() on a WCF proxy calls Close(), and Close() has a bad tendency to throw another exception at that point, because the channel has already faulted. So the exception that eventually reaches any surrounding try/catch block is the one thrown by closing a faulted client. It is not the original exception, and the original exception is not accessible via the InnerException property.

Microsoft’s recommended way of avoiding this situation is to not instantiate WCF clients via using statements. Instead, instantiate and use them like any non-IDisposable object, wrap the calls in appropriate exception handling, and close or abort the client explicitly.
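
A minimal sketch of that recommended pattern, reusing the hypothetical MyServiceBindingClient from above: Close() the client on the success path, and Abort() it when anything goes wrong, so that the original exception is the one that propagates:

MyServiceBindingClient client = new MyServiceBindingClient();
try
{
    client.Open();
    // ... place calls on the client here ...
    client.Close();
}
catch (System.ServiceModel.CommunicationException)
{
    // The channel is faulted - calling Close() would throw, so Abort() instead
    client.Abort();
    throw;
}
catch (TimeoutException)
{
    client.Abort();
    throw;
}
catch (Exception)
{
    client.Abort();
    throw;
}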

Thursday, February 26, 2009

Back porting SQL Server 2005 to 2000

Background

We had a SQL Server 2005 DotNetNuke database full of proprietary data that we needed to back-port to SQL Server 2000, due to a limitation on the eventual destination hosting site.


The Problem

SQL Server 2000 will not natively restore databases from backups made on SQL Server 2005, and will not attach SQL Server 2005 data/log files. So we needed to find an alternative way to move our data and schema objects from the SQL 2005 instance back to 2000. We wanted something like the Redgate tools, but perhaps a little cheaper and simpler if possible.


The Solution

Microsoft offers a free download called the SQL Server 2005 Database Publishing Wizard. This tool presents a wizard-style interface that allows you to script out your database in its entirety (all data and objects) into a single SQL script. This script can be targeted at either SQL Server 2000 or 2005.

With just a few mouse clicks we had generated a single SQL script that contained our entire database. The script ran perfectly the first time under SQL Server 2000, giving us the desired result: our DotNetNuke database back-ported successfully to SQL Server 2000.

The tool can be found at:

http://www.microsoft.com/downloads/details.aspx?familyid=56E5B1C5-BF17-42E0-A410-371A838E570A&displaylang=en

Monday, December 8, 2008

C# enums and aliases - weird behaviour

Jon has been playing with enum values, and found some apparently inconsistent behaviour from the compiler.

Notes from this example:

  • If you’re going to explicitly set the value of any enum member, explicitly set all of them.

  • If the values align to a table or to values used in the database, always set all of the values in the enum.

Also, notice how TestA becomes an alias of TestB, yet TestE becomes an alias of TestD. The direction of the aliasing appears to be reversed between the two pairs.

I was quite surprised by this, but it seems to come down to TestB being explicitly assigned before TestE is. In fact, when two enum members share the same underlying value, which of the names you get back when the value is converted to a string is not guaranteed, so code should never rely on it.



using System;
namespace TestConsoleApp
{
   class Program
   {
      enum TestType
      {
         TestA, // TestA is an alias of TestB
         TestB = 0,
         TestC,
         TestD,
         TestE = 2, // TestE is an alias of TestD
         TestF,
         TestG = 6,
         TestH
      }
 
      static void Main(string[] args)
      {
         TestType dc1 = TestType.TestB;
 
         if (dc1 == TestType.TestA)
         {
            // since TestA is just an alias of TestB, this evaluates to true.
            Console.WriteLine("No compile warning is bad.");
         }
 
         int a = (int)TestType.TestA; // alias of TestB
         int b = (int)TestType.TestB;
         int c = (int)TestType.TestC;
         int d = (int)TestType.TestD;
         int e = (int)TestType.TestE; // alias of TestD
         int f = (int)TestType.TestF;
         int g = (int)TestType.TestG;
         int h = (int)TestType.TestH;
 
         Console.WriteLine("{0} - {1}"TestType.TestA, a); // Prints 'TestB – 0'
         Console.WriteLine("{0} - {1}"TestType.TestB, b); // Prints 'TestB – 0'
         Console.WriteLine("{0} - {1}"TestType.TestC, c); // Prints 'TestC – 1'
         Console.WriteLine("{0} - {1}"TestType.TestD, d); // Prints 'TestD – 2'
         Console.WriteLine("{0} - {1}"TestType.TestE, e); // Prints 'TestD – 2'
         Console.WriteLine("{0} - {1}"TestType.TestF, f); // Prints 'TestF – 3'
         Console.WriteLine("{0} - {1}"TestType.TestG, g); // Prints 'TestG – 6'
         Console.WriteLine("{0} - {1}"TestType.TestH, h); // Prints 'TestH – 7'
         Console.WriteLine();
 
         TestType type = (TestType)Enum.Parse(typeof(TestType), "TestA");
         Console.WriteLine("{0} - {1}", type, (int)type); // Prints 'TestB – 0'

 
         Console.ReadKey();
      }
   }
}
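
Given the bullet points above, the safest declaration is one where every member is explicitly assigned a unique value, so nothing can silently become an alias of another member. A quick sketch with hypothetical member names:

enum OrderStatus
{
   Unknown = 0,   // every member explicitly and uniquely valued,
   Pending = 1,   // so no member becomes an alias and database
   Shipped = 2,   // mappings stay stable if members are reordered
   Delivered = 3,
   Cancelled = 4
}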




Friday, October 24, 2008

SQL Server - Date Time accuracy

This does not work as expected:

SET @day1159pm = DATEADD(millisecond,-1,DATEADD(day,1,@dayMidnight))

If @dayMidnight was 1/1/01 then @day1159pm will equal 2/1/01. SQL Server does not store DATETIME with enough precision to subtract a single millisecond: 23:59:59.999 is simply rounded back up to midnight of the following day. Use this instead, which lands on 23:59:59.997, a value DATETIME can represent exactly:

SET @day1159pm = DATEADD(millisecond,-3,DATEADD(day,1,@dayMidnight))

---


The reason for this is provided in the Transact-SQL reference:

Date and time data from January 1, 1753 through December 31, 9999, to an accuracy of one three-hundredth of a second (equivalent to 3.33 milliseconds or 0.00333 seconds). Values are rounded to increments of .000, .003, or .007 seconds, as shown in the table.

Example                                       Rounded example
01/01/98 23:59:59.999                         1998-01-02 00:00:00.000
01/01/98 23:59:59.995, .996, .997, or .998    1998-01-01 23:59:59.997
01/01/98 23:59:59.992, .993, or .994          1998-01-01 23:59:59.993
01/01/98 23:59:59.990 or .991                 1998-01-01 23:59:59.990



Wednesday, August 6, 2008

Memory leak in FAXCOMEXLib

Yuan has been working on a fax / e-mail / SMS server that we use internally and also give to our clients, and has come across some strange behaviour in Microsoft's COM wrapper to the fax console.


Recently I have been working on our communications toolset, which manages our SMTP, FTP, SMS and Fax sending.


The application has two major components: an ASP.NET application to view, search and manage messages and message batches being sent, and a Windows service which periodically queries the database to send any queued messages.


Each of the message types runs in its own thread, and the SMTP, FTP and SMS threads have been working for years with no problems.


However, a few weeks after deploying the new Fax component, we began to receive “Out of memory” log traces.


After a little investigation, I narrowed the problem down to a FAXCOMEXLib memory leak. Read below for the details:


To queue a fax, we use FAXCOMEXLib, which is a Microsoft COM wrapper around the Windows fax console. Once the server sends a fax to the console, the fax thread polls to check whether the fax job terminated successfully. The code we use to poll the fax queue is shown below:


         Try
            faxServer.Folders.OutgoingArchive.Refresh()
            iterator = faxServer.Folders.OutgoingArchive.GetMessages(FAX_QUEUE_DEFAULT_PREFETCH) ' This routine exhibits the memory leak
            iterator.MoveFirst()
            For i As Integer = 1 To filesCount
               iterator.MoveNext()
               If Not iterator.Message Is Nothing Then
                  Dim FaxRecipient As Recipient = GetRecipientForFax(iterator.Message.Id)
                  '
                  '     Recipient will be null if the fax job was added to the fax server
                  '     by anything other than this service.
                  '
                  If Not FaxRecipient Is Nothing AndAlso FaxRecipient.Status <> FaxRecipient.Statuses.sent.ToString() Then
                     ' Acknowledge fax has been sent
                     FaxRecipient.ActMarkAsSent()
                     FaxRecipient.save()
                     Dim LogMessage As String = String.Format("CR_ID: {0} Fax sent successfully", FaxRecipient.ID)
                     Log.write(MaxSoft.Common.Log.LevelEnum.INFO, Me, LOG_AREA, LogMessage)
                  End If
                  Marshal.ReleaseComObject(iterator.Message)
               End If
            Next
         Finally
            faxServer.Disconnect()
            Marshal.ReleaseComObject(faxServer.Folders.OutgoingArchive)
            Marshal.ReleaseComObject(faxServer)
            If Not iterator Is Nothing Then
               Marshal.ReleaseComObject(iterator)
            End If
            iterator = Nothing
            faxServer = Nothing
         End Try


As you can see in the code above, we iterate through the files in the OutgoingArchive folder of the fax console and match the id with the id stored in our message queue. Unfortunately, every time we call faxServer.Folders.OutgoingArchive.GetMessages, the Microsoft COM DLL leaks memory. With a little more research, we determined that if the archive folder was empty there was no memory leak, and that the size of the leak was directly proportional to the number of files in the archive folder.


Once we had tracked it down, we tried a number of options to dispose of the object correctly, including Marshal.ReleaseComObject, but with no luck. It appears that the problem is not in the disposal of the FAXCOMEXLib object, but in one of the underlying private objects or code used by FAXCOMEXLib. Since these objects are not exposed via COM, we cannot code around the memory leak.


After briefly considering restarting the Windows service periodically, I went and had a cup of coffee, and the solution came to me: don't use FAXCOMEXLib.


Since I know that FAXCOMEXLib persists the completed faxes as physical files in a known location, and that the filenames contain the message id, we can get to the completed faxes by going through the physical filesystem rather than through the FAXCOMEXLib queue.

Here is the refactored code with no leaking:


         Try
            Dim unsentFaxRecipientTrackings As New RecipientDeliveryTrackingList
            unsentFaxRecipientTrackings.loadFromSQL("SELECT tr.* FROM tblCommRecipientDeliveryTracking tr JOIN tblCommRecipient cr ON tr.CR_Id = cr.CR_Id WHERE cr.CR_Status = 'sending'", Nothing, CommandType.Text, -1)
            If unsentFaxRecipientTrackings.Count > 0 Then
               Dim archiveFolder As String = faxServer.Folders.OutgoingArchive.ArchiveFolder
               Dim allFiles As New Hashtable
               ' Load files into hashtable
               For Each fileName As String In System.IO.Directory.GetFiles(archiveFolder)
                  ' The file name format is LoginUserID$MessageID.tif
                  If fileName.Split("$"c).Length > 1 Then
                     fileName = fileName.Split("$"c)(1).Replace(".tif", "")
                     allFiles.Add(fileName, fileName)
                  End If
               Next
               For Each FaxTracking As RecipientDeliveryTracking In unsentFaxRecipientTrackings
                  If allFiles.ContainsKey(FaxTracking.FaxDocId) Then
                     Dim recipient As New recipient
                     recipient.loadFromSQL("SELECT * FROM tblCommRecipient WHERE CR_Id = " & FaxTracking.CR_Id.ToString(), Nothing, CommandType.Text, -1)
                     ' There is a chance that someone removed the record from the database manually
                     If Not recipient.isNew Then
                        recipient.ActMarkAsSent()
                        recipient.save()
                        Dim LogMessage As String = String.Format("CR_ID: {0} Fax sent successfully", FaxTracking.CR_Id)
                        Log.write(MaxSoft.Common.Log.LevelEnum.INFO, Me, LOG_AREA, LogMessage)
                     End If
                  End If
               Next
            End If
         Finally
            faxServer.Disconnect()
            Marshal.ReleaseComObject(faxServer)
         End Try


So, rather than ask FAXCOMEXLib to return a list of messages, I go to the file system and get them myself. Helpfully, FAXCOMEXLib still aids with this approach by providing the physical path of the archive via Folders.OutgoingArchive.ArchiveFolder.


Since failed messages do not go into the OutgoingArchive.ArchiveFolder, we do not need any additional fax metadata.


Result: After running for 1 week, no memory leak found.

Tools: .NET Memory Profiler

Friday, July 4, 2008

QUT adds up the benefits of SmartaPay

QUT announced today that Queensland-based company SmartaPay had completed installation of the SmartaPay solution. For the first time, students will be able to pay ALL their University and Student fees through a new custom-designed payment solution built by SmartaPay specifically for QUT.

QUT and SmartaPay have spent the past 18 months designing and building the payment solution to ensure that the functionality of SmartaPay’s solution will provide all QUT businesses with:
• Shopping Cart
• Online Catalogue
• Pay Now and Pay Later
• Invoice Generation and Payment
• Reconciliation

“We made a decision two years ago to replace our previous student payment option with a more sophisticated and integrated payment solution" said Terry Leighton, Director of Corporate Finance at Queensland University of Technology. "This is just another step in ensuring our University stays at the forefront of Technology advancements as we provide the optimum solutions for our current students and our future students.”

QUT, ‘the University for the Real World’, is one of Australia’s largest universities servicing in excess of 35,000 students across four campuses. QUT aims to strengthen its distinctive national and international reputation by combining academic strength and practical engagement with the world of the professions, industry, government, and the broader community.

Tony Irvine, Chief Operating Officer of SmartaPay, said: “We’re excited to be chosen by QUT as a partner in building the University’s new Payment Solution. This solution will be a market leader in the education arena and an achievement that both QUT and SmartaPay will be justly proud of.”

SmartaPay is a payment facilitator for Universities and Private Schools Australia wide. SmartaPay’s solution will cater for all types of payers and payment channels, from online internet payments to paying fees over the counter or by cheque, and deliver a single interface to QUT.

“SmartaPay’s business is built on a foundation of providing unique and fully integrated end to end billing and payment solutions” explains Dorian Borin, Chief Information Officer of SmartaPay. “We have a proven track record in providing customized solutions to all our SmartaPay clients”.

For more information please contact SmartaPay on 07 5575 7422 or email info@smartapay.com.au

Tuesday, June 17, 2008

Simple dynamic compiling to mock a Console Application

Toby has written a follow-up to ‘Spawning a console application and tracking its Standard Output’ found here.

This article deals with unit testing an application that relies on interaction with a console application.

The problem is making the console application behave in a specific manner for the unit tests. Simulating network issues, file access issues, strange multiple-file issues and so on can be a right pain in the neck. Throw in other factors, such as the speed of the unit tests while a large console application is running, trying to automate the set-up of an entire staged environment, and invoking a complex, production-only and sometimes cranky console application, and this becomes a challenge I'm keen to avoid.

A simpler approach is to mock the external console application, so that the mocked application returns exactly what you need to test the behaviour of your own application.

We have a few options to solve this, along with the negatives of the approach:
  • Making a simple batch file to accept input, and spit out the correct text on demand
    - Limited interaction potential, validation etc
    - No access to .NET goodness
  • Writing a bunch of .NET console apps which are compiled as part of the solution
    - Too many projects to maintain
    - Hard coded and not flexible for different environments
  • Dynamically generate console apps on the fly with environmental changes as needed
    - Sounds like a solution to me !!!

Writing the mocked console application

The following test code sample shows messages being written to Standard Output and files being created. You will need a separate mock application for each test that you need to perform.

using System;
using System.IO;

namespace TestApplication
{
   class Program
   {
      static void Main(string[] args)
      {
         Console.WriteLine(@"Connected to www.externalserver.com made.");
         // put download files here
         StreamWriter sw = new StreamWriter(@"C:\DownloadFiles\File1");
         Console.WriteLine("Downloading file File1");
         sw.Write("Text Content of File 1");
         sw.Close();

         sw = new StreamWriter(@"C:\DownloadFiles\File2");
         Console.WriteLine("Downloading file File2");
         sw.Write("Text Content of File 2");
         sw.Close();

         Console.WriteLine(@"Closed Connection.");
      }
   }
}


Dynamically generating mock console applications on the fly

The source code for each test application is included in my solution as a string resource called TestScript. The source code is then compiled on the fly to an executable file. The code to compile the resource is:

Microsoft.CSharp.CSharpCodeProvider cscp = new Microsoft.CSharp.CSharpCodeProvider();
System.CodeDom.Compiler.CompilerParameters param = new System.CodeDom.Compiler.CompilerParameters();
param.GenerateExecutable = true;
param.OutputAssembly = @"c:\App.EXE";

foreach (System.Reflection.Assembly assembly in System.AppDomain.CurrentDomain.GetAssemblies())
{
    param.ReferencedAssemblies.Add(assembly.Location);
}
System.CodeDom.Compiler.CompilerResults results = cscp.CompileAssemblyFromSource(param, TestScript.Test1);


Because this works from text, there is also the opportunity to use string.Replace to manage paths specific to the testing environment.
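
For example, a quick sketch (testDownloadFolder is a hypothetical per-test path, and TestScript.Test1 is the string resource mentioned above):

// Point the mock at a per-test download folder instead of a hard-coded path,
// then compile the adjusted source exactly as before.
string source = TestScript.Test1.Replace(@"C:\DownloadFiles", testDownloadFolder);
System.CodeDom.Compiler.CompilerResults results = cscp.CompileAssemblyFromSource(param, source);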

Setting param.GenerateExecutable to true causes an executable to be generated instead of a DLL. Put the path of the file to generate in param.OutputAssembly.

The above code assumes that the assemblies to be referenced are the same as those loaded by the currently executing application. This is achieved via the foreach loop. This can be altered by adding specific assemblies to param.ReferencedAssemblies in addition to, or instead of, this loop.

If you are in an experimenting mood, any compiler switches available via the build options in Visual Studio are also available to you. Have a play with System.CodeDom.Compiler.CompilerParameters.CompilerOptions.
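
To round the approach off, here is a rough, hedged sketch of the test side once the compile has run: check the compiler results for errors, then spawn the generated executable and capture its Standard Output. The path c:\App.EXE matches param.OutputAssembly above; the rest is hypothetical test code.

if (results.Errors.HasErrors)
{
    foreach (System.CodeDom.Compiler.CompilerError error in results.Errors)
    {
        Console.WriteLine(error.ErrorText);
    }
    throw new InvalidOperationException("The mock console application failed to compile.");
}

// Run the freshly compiled mock and capture everything it writes to Standard Output.
System.Diagnostics.ProcessStartInfo startInfo = new System.Diagnostics.ProcessStartInfo(@"c:\App.EXE");
startInfo.UseShellExecute = false;
startInfo.RedirectStandardOutput = true;

using (System.Diagnostics.Process process = System.Diagnostics.Process.Start(startInfo))
{
    string output = process.StandardOutput.ReadToEnd();
    process.WaitForExit();

    // The application under test would normally parse this output; here we
    // simply check that the mock reported its fake connection.
    Console.WriteLine(output.Contains("Connected to www.externalserver.com made."));
}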