LINQ to SQL tips and tricks #3

Another set of useful and lesser-known LINQ to SQL techniques.

Lazy loading with stored procedures

As you probably know, LINQ to SQL supports stored procedures for retrieving entities and for insert, update and delete operations. What is less well known is that you can also use them to lazy-load navigation properties.

Let’s take the example of a bi-directional relationship between a Post and a Comment. We have the two stored procedures shown below; bring them into the DBML by dragging them from Server Explorer onto the LINQ to SQL designer surface and set the Return Type property of each to the expected entity (Post and Comment, respectively).

CREATE PROCEDURE LoadPost (@PostID int) AS SELECT * FROM Posts WHERE ID = @PostID
CREATE PROCEDURE LoadComments(@PostID int) AS SELECT * FROM Comments WHERE Parent_Post_ID = @PostID

This generates two method stubs named LoadPost and LoadComments that we can use to programmatically retrieve entities:

var post = dataContext.LoadPost(1).First();
Console.WriteLine("{0}", post.Title);

Now to replace LINQ to SQL’s lazy-loading query generation we add methods with a specific signature to our data context’s partial class.

partial class DataClasses1DataContext {
  protected IEnumerable<Comment> LoadComments(Post post) {
    return this.LoadComments(post.ID);
  }

  protected Post LoadParentPost(Comment comment) {
    return this.LoadPost(comment.Post_ID).First();
  }
}

To get the method signatures right:

  1. Visibility can be anything (protected or private recommended)
  2. Return type must be the type of the other side of the association (wrapped in IEnumerable when that side can be many)
  3. Method name must start with the word “Load”
  4. Method name must then continue with the name of the navigation property you want to intercept
  5. Parameter type must be the type that has the named navigation property (step 4)
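
With those methods in place, lazy loading works exactly as before except that LINQ to SQL now calls your stored procedures instead of generating its own queries. A minimal sketch, assuming the Post/Comment model above (the Body property is purely illustrative):

using (var dataContext = new DataClasses1DataContext()) {
  // Loads the post via the LoadPost stored procedure
  var post = dataContext.LoadPost(1).First();

  // Touching the Comments navigation property now routes through
  // LoadComments(Post) and therefore the LoadComments stored procedure
  foreach (var comment in post.Comments)
    Console.WriteLine(comment.Body);
}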

Storing and retrieving binary files

LINQ to SQL supports SQL Server’s varbinary type, but storing something practical like a file in there isn’t clearly documented. Map the varbinary(max) column from your table into your entity and the column is exposed using the System.Data.Linq.Binary type (effectively a wrapper around a byte array, but with better change tracking).

File to database

To store a file in the database, read the bytes in and assign them to the property (Binary knows how to create itself from a byte array automatically), e.g.

var readPath = @"c:\test.jpg";
var storedFile = new StoredFile();
storedFile.Binary = File.ReadAllBytes(readPath);
storedFile.FileName = Path.GetFileName(readPath);
data.StoredFiles.InsertOnSubmit(storedFile);
data.SubmitChanges();

I recommend storing the file name as well as the binary contents for two reasons. Firstly, writing the file back to disk or streaming it to a browser requires you to know the file type (e.g. .jpg or image/jpeg) and, secondly, nobody likes downloading a file called 'download' or '1' :)

Database to file

Writing the file back to disk is just as easy, although you have to use the ToArray() method of System.Data.Linq.Binary to turn it back into a byte array.

var writePath = @"c:\temp";
var storedFile = data.StoredFiles.First();
File.WriteAllBytes(Path.Combine(writePath, storedFile.FileName), storedFile.Binary.ToArray());

Always ensure that file names are sanitized when writing to the file system based on stored data! You don’t want users overwriting critical files on your system.
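
One simple way to do that (a minimal sketch, not the only approach) is to strip any path information and invalid characters from the stored name before combining it with your output directory:

static string SanitizeFileName(string fileName) {
  // Drop any directory component that may have been stored
  var name = Path.GetFileName(fileName);
  // Remove characters the file system won't accept
  foreach (var invalid in Path.GetInvalidFileNameChars())
    name = name.Replace(invalid.ToString(), "");
  return name;
}

var safePath = Path.Combine(writePath, SanitizeFileName(storedFile.FileName));
File.WriteAllBytes(safePath, storedFile.Binary.ToArray());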

Multiple databases with a single context

Contrary to popular belief, you can in fact access entities from multiple databases with a single data context - provided they live on the same server. This is unsupported, but I’ve used it on my own projects without issue :)

The tricky part is getting the definitions of your entities into your DBML. You have two options here:

Create a temporary view

If you have the rights, you can temporarily create views in your primary database for each table in your non-primary database.

CREATE VIEW MyOtherTable AS SELECT * FROM MyOtherDatabase.dbo.MyOtherTable

Once the views are created, add them to your DBML by dragging them from Server Explorer onto the LINQ to SQL designer surface, then delete the views you created from the database.

Create a temporary DBML

If you can’t or don’t want to create temporary views, then add a second (temporary) LINQ to SQL classes file (DBML) to your project. Use Server Explorer to find your secondary database and drag all the tables you want to access to the LINQ to SQL designer surface.

Now save and close any open files, then right-click each DBML file, choose Open With… and select XML Editor for both your original DBML and the new temporary one. Head to the Window menu and select New Vertical Tab Group to make the next step easier.

Looking through the DBML you will see that each entity has a <Table> element inside the <Database> element. Select all the Table elements and their children (but not Database or Connection) and copy/paste them into your existing DBML file. Then close the files and check everything still looks right in the designer.

If it does, delete the temporary DBML file you created. If not, go back and check the DBML file for duplicate names, mismatched XML etc.

Finally, the easy bit

Open the designer and, for each table that came from the other database, select it and change the Source property in the Properties window from dbo.MyOtherTable to MyOtherDatabase.dbo.MyOtherTable.
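
With that done, entities from both databases can be queried - and even joined - through the one context, since LINQ to SQL simply emits the three-part names in the SQL it generates. A quick sketch with made-up table and column names:

var combined = from customer in data.Customers          // primary database
               join other in data.MyOtherTables         // MyOtherDatabase.dbo.MyOtherTable
                 on customer.ID equals other.Customer_ID
               select new { customer.Name, other.Value };

foreach (var row in combined)
  Console.WriteLine("{0}: {1}", row.Name, row.Value);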

Hit play and run!

Check out part 1 of LINQ to SQL tips

[)amien

SQL Server query plan cache – what is it and why should you care?

What is a query plan?

SQL Server, like all databases, goes through a number of steps when it receives a command. Besides parsing and validating the command text and parameters, it looks at the database schema, statistics and indexes to come up with a plan to efficiently query or change your data.

You can view the plan SQL Server comes up with for a given query in SQL Management Studio by selecting Include Actual Execution Plan from the Query menu before running your query.

A query plan in SQL Management Studio

Show me the cache!

Query plans are cached so subsequent identical operations can reuse them for further performance gains. You can see the query plans in use on your server with the following SQL:

SELECT objtype, p.size_in_bytes, t.[text], usecounts
   FROM sys.dm_exec_cached_plans p
   OUTER APPLY sys.dm_exec_sql_text (p.plan_handle) t
   WHERE objtype IN ('Prepared', 'Adhoc')
   ORDER BY usecounts DESC

Hitting the cache

DBAs know the value of hitting the query plan cache often, and this is one of the reasons they like stored procedures. You can, however, achieve the same thing with parameterized queries, provided the query text and the parameter definitions are identical, so that you can execute the same statement over and over again just with different parameter values.

If your ORM uses parameterized queries then it too can take advantage of the cache, but it is important to remember that the query text and parameter definitions need to be identical for this to happen.
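
For example, with plain ADO.NET the two executions below share a single cached plan because the command text and the parameter definition (including its declared length) are identical - only the value changes. A minimal sketch, assuming a Customers table, your own connection string and the System.Data/System.Data.SqlClient namespaces:

using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("SELECT ID FROM Customers WHERE Name = @name", connection)) {
  // The explicit length keeps the parameter definition identical
  // between executions regardless of the value supplied
  command.Parameters.Add("@name", SqlDbType.NVarChar, 50);
  connection.Open();

  command.Parameters["@name"].Value = "Alice";
  var firstId = command.ExecuteScalar();

  command.Parameters["@name"].Value = "Bob";   // same text and definitions, so the plan is reused
  var secondId = command.ExecuteScalar();
}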

How this applies to ORMs

In .NET 3.5 SP1, neither LINQ to SQL nor Entity Framework set the length of variable-length parameters (varchar, nvarchar, text, ntext and varbinary), so SqlClient sets it to the actual content length. This means the cache is often missed and is instead populated with plans that differ only in their parameter lengths.

In .NET 4.0, variable-length parameters honour the defined length in both LINQ to SQL and Entity Framework where possible, falling back to the maximum length when the actual content doesn’t fit within the defined length.

[)amien

Multiple outputs from T4 made easy – revisited

My multiple outputs from T4 made easy post contained a class that makes it easy to produce multiple files from Visual Studio’s text templating engine (T4).

While useful, it had a few issues:

  • Getting start/end blocks mixed up resulted in unpredictable behavior
  • Files were rewritten even when content did not change
  • Did not play well with source control
  • Files not always deleted in VS
  • Failed in Visual Studio’s project-less Web Sites

This helper class forms the basis of multiple-file output for the Entity Framework templates in .NET 4.0 and the LINQ to SQL templates on CodePlex, so we (Jeff Reed, Andrew Peters and I) made the following changes.

Improvements

Simpler block handling

The header, footer and file blocks can now all be completed with EndBlock (EndHeader and EndFooter are gone), although the manager will automatically end the previous block when it hits a new one or the final Process method.

Skip unchanged files

Files are now only written to disk if their contents have changed, with the exception of the original T4 output file (we can’t stop that, sorry).

There is additional overhead in reading and comparing the existing files, but we believe that unmodified files keeping their dates and source control status is worth it.

Automatic checkout

When the template detects it is running in Visual Studio and the file it needs to write to is under source control but not checked out, it will check the file out for you.

Predictable clean-up

All files that were not part of the generation process but are nested under the project item will now be deleted when running inside Visual Studio.

Outside of Visual Studio, files are no longer deleted – that was destructive, and the manager couldn’t know which files it generated on a previous run in order to clean up correctly anyway.

Website projects fall back to single file generation

Visual Studio has both web sites and web applications; the former are project-less, which makes multi-file generation very messy, so the manager falls back to single-file generation there.

Internal improvements

The source is now simpler to read and understand, with less public visibility, and VS interop is faster and more robust: file additions and deletions are batched into a single invoke at the end to avoid conflicts with other add-ins that might be triggered by the changes.

Usage

Initialization

You’ll need to get the code into your template – either copy it in or reference it with an include directive. Then declare an instance of the Manager class, passing in some environmental options such as the desired default output path. (For Visual Studio 2010 remove the v3.5 portion from the language attribute.)

<#@ template language="C#v3.5" hostspecific="True"
#><#@include file="Manager.ttinclude"
#><# var manager = Manager.Create(Host, GenerationEnvironment); #>

File blocks

Then add one line before and one line after each block that could be split out into its own file, passing in what the file name should be if it is split. The EndBlock is optional if you want the block to carry through to the next one :)

<# manager.StartNewFile("Employee.generated.cs"); #>
public class Employee {  }
<# manager.EndBlock(); #>

Headers & footers

Many templates need to share a common header/footer for such things as comments or using/import statements or turning on/off warnings. Simply use StartHeader and StartFooter and the blocks will be emitted to the start and end of all split files as well as being left in the original output file.

<# manager.StartHeader(); #>
// Code generated by a template
using System;

<# manager.EndBlock(); #>

Process

At the end of the template call Process to handle splitting the files (true) or not (false). Anything not included in a specific StartNewFile block will remain in the original output file.

<# manager.Process(true); #>

Revised Manager class

Latest source available at GitHub
<#@ assembly name="System.Core"
#><#@ assembly name="System.Data.Linq"
#><#@ assembly name="EnvDTE"
#><#@ assembly name="System.Xml"
#><#@ assembly name="System.Xml.Linq"
#><#@ import namespace="System"
#><#@ import namespace="System.CodeDom"
#><#@ import namespace="System.CodeDom.Compiler"
#><#@ import namespace="System.Collections.Generic"
#><#@ import namespace="System.Data.Linq"
#><#@ import namespace="System.Data.Linq.Mapping"
#><#@ import namespace="System.IO"
#><#@ import namespace="System.Linq"
#><#@ import namespace="System.Reflection"
#><#@ import namespace="System.Text"
#><#@ import namespace="System.Xml.Linq"
#><#@ import namespace="Microsoft.VisualStudio.TextTemplating"
#><#+

// Manager class records the various blocks so it can split them up
class Manager {
    private class Block {
        public String Name;
        public int Start, Length;
    }

    private Block currentBlock;
    private List<Block> files = new List<Block>();
    private Block footer = new Block();
    private Block header = new Block();
    private ITextTemplatingEngineHost host;
    private StringBuilder template;
    protected List<String> generatedFileNames = new List<String>();

    public static Manager Create(ITextTemplatingEngineHost host, StringBuilder template) {
        return (host is IServiceProvider) ? new VSManager(host, template) : new Manager(host, template);
    }

    public void StartNewFile(String name) {
        if (name == null)
            throw new ArgumentNullException("name");
        CurrentBlock = new Block { Name = name };
    }

    public void StartFooter() {
        CurrentBlock = footer;
    }

    public void StartHeader() {
        CurrentBlock = header;
    }

    public void EndBlock() {
        if (CurrentBlock == null)
            return;
        CurrentBlock.Length = template.Length - CurrentBlock.Start;
        if (CurrentBlock != header && CurrentBlock != footer)
            files.Add(CurrentBlock);
        currentBlock = null;
    }

    public virtual void Process(bool split) {
        if (split) {
            EndBlock();
            String headerText = template.ToString(header.Start, header.Length);
            String footerText = template.ToString(footer.Start, footer.Length);
            String outputPath = Path.GetDirectoryName(host.TemplateFile);
            files.Reverse();
            foreach(Block block in files) {
                String fileName = Path.Combine(outputPath, block.Name);
                String content = headerText + template.ToString(block.Start, block.Length) + footerText;
                generatedFileNames.Add(fileName);
                CreateFile(fileName, content);
                template.Remove(block.Start, block.Length);
            }
        }
    }

    protected virtual void CreateFile(String fileName, String content) {
        if (IsFileContentDifferent(fileName, content))
            File.WriteAllText(fileName, content);
    }

    public virtual String GetCustomToolNamespace(String fileName) {
        return null;
    }

    public virtual String DefaultProjectNamespace {
        get { return null; }
    }

    protected bool IsFileContentDifferent(String fileName, String newContent) {
        return !(File.Exists(fileName) && File.ReadAllText(fileName) == newContent);
    }

    private Manager(ITextTemplatingEngineHost host, StringBuilder template) {
        this.host = host;
        this.template = template;
    }

    private Block CurrentBlock {
        get { return currentBlock; }
        set {
            if (CurrentBlock != null)
                EndBlock();
            if (value != null)
                value.Start = template.Length;
            currentBlock = value;
        }
    }

    private class VSManager: Manager {
        private EnvDTE.ProjectItem templateProjectItem;
        private EnvDTE.DTE dte;
        private Action<String> checkOutAction;
        private Action<IEnumerable<String>> projectSyncAction;

        public override String DefaultProjectNamespace {
            get {
                return templateProjectItem.ContainingProject.Properties.Item("DefaultNamespace").Value.ToString();
            }
        }

        public override String GetCustomToolNamespace(string fileName) {
            return dte.Solution.FindProjectItem(fileName).Properties.Item("CustomToolNamespace").Value.ToString();
        }

        public override void Process(bool split) {
            if (templateProjectItem.ProjectItems == null)
                return;
            base.Process(split);
            projectSyncAction.EndInvoke(projectSyncAction.BeginInvoke(generatedFileNames, null, null));
        }

        protected override void CreateFile(String fileName, String content) {
            if (IsFileContentDifferent(fileName, content)) {
                CheckoutFileIfRequired(fileName);
                File.WriteAllText(fileName, content);
            }
        }

        internal VSManager(ITextTemplatingEngineHost host, StringBuilder template)
            : base(host, template) {
            var hostServiceProvider = host as IServiceProvider;
            if (hostServiceProvider == null)
                throw new ArgumentNullException("Could not obtain IServiceProvider");
            dte = (EnvDTE.DTE) hostServiceProvider.GetService(typeof(EnvDTE.DTE));
            if (dte == null)
                throw new ArgumentNullException("Could not obtain DTE from host");
            templateProjectItem = dte.Solution.FindProjectItem(host.TemplateFile);
            checkOutAction = (String fileName) => dte.SourceControl.CheckOutItem(fileName);
            projectSyncAction = (IEnumerable<String> keepFileNames) => ProjectSync(templateProjectItem, keepFileNames);
        }

        private static void ProjectSync(EnvDTE.ProjectItem templateProjectItem, IEnumerable<String> keepFileNames) {
            var keepFileNameSet = new HashSet<String>(keepFileNames);
            var projectFiles = new Dictionary<String, EnvDTE.ProjectItem>();
            var originalFilePrefix = Path.GetFileNameWithoutExtension(templateProjectItem.get_FileNames(0)) + ".";
            foreach(EnvDTE.ProjectItem projectItem in templateProjectItem.ProjectItems)
                projectFiles.Add(projectItem.get_FileNames(0), projectItem);

            // Remove unused items from the project
            foreach(var pair in projectFiles)
                if (!keepFileNames.Contains(pair.Key) && !(Path.GetFileNameWithoutExtension(pair.Key) + ".").StartsWith(originalFilePrefix))
                    pair.Value.Delete();

            // Add missing files to the project
            foreach(String fileName in keepFileNameSet)
                if (!projectFiles.ContainsKey(fileName))
                    templateProjectItem.ProjectItems.AddFromFile(fileName);
        }

        private void CheckoutFileIfRequired(String fileName) {
            var sc = dte.SourceControl;
            if (sc != null && sc.IsItemUnderSCC(fileName) && !sc.IsItemCheckedOut(fileName))
                checkOutAction.EndInvoke(checkOutAction.BeginInvoke(fileName, null, null));
        }
    }
} #>

[)amien

When an object-relational mapper is too much, DataReader too little

I fired up Visual Studio this evening to write a proof-of-concept app and found myself wanting strongly typed domain objects from a database, but without the overhead of an object-relational mapper (the application is read-only).

One solution is to write the methods by hand; another is to code-generate them. But it would be nice to be able to just do:

var customers = new SqlCommand("SELECT ID, Name FROM Customer", connection)
  .As(r => new Customer { CustomerID = r.GetInt32(0), Name = r.GetString(1) }).ToList();

So for any DbCommand object you can turn the results into strongly typed objects simply by specifying the mapping with an object-initializer expression.

The tiny helper class to achieve this is:

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.Common;

public static class DataHelpers {
  public static List<T> ToList<T>(this IEnumerable<T> enumerable) {
    return new List<T>(enumerable);
  }

  public static IEnumerable<T> As<T>(this DbCommand command, Func<IDataRecord, T> map) {
    // Deferred execution - the command only runs once enumeration begins
    using (var reader = command.ExecuteReader())
      while (reader.Read())
        yield return map(reader);
  }
}
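
Note that because As uses deferred execution, the command only runs when you start enumerating, so the connection must be open by then. A usage sketch, assuming a connection string and the Customer class from the earlier snippet:

using (var connection = new SqlConnection(connectionString)) {
  connection.Open();
  var customers = new SqlCommand("SELECT ID, Name FROM Customer", connection)
    .As(r => new Customer { CustomerID = r.GetInt32(0), Name = r.GetString(1) })
    .ToList();
  Console.WriteLine("Loaded {0} customers", customers.Count);
}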

It might even be possible to do some cool caching/materialization. I should look into that :)

[)amien