Thursday, August 20, 2015

SQL Server Compact & SQLite Toolbox 4.3 – Visual Guide of new features

After more than 415,000 downloads, version 4.3 of my SQL Server Compact & SQLite Toolbox extension for Visual Studio 2010 and later is now available for download, and also via Tools/Extensions and Updates in Visual Studio. This blog post is a visual guide to the new features, improvements and bug fixes included in this release, many of them suggested by users of the tool via the CodePlex issue tracker.

This new version was released in July, at the time Visual Studio 2015 went live, and is of course tested with Visual Studio 2015 – keep in mind that all extensions now work with the free Visual Studio Community edition. This release includes a couple of new features that improve on the SQLite support originally added in version 4.0.

Scan solution for SQLite database files

[screenshot]

The “Add Connections from Solution” feature has been enhanced to scan all projects in the current solution for any SQL Server Compact and SQLite files. If any are found (based on the file patterns defined in Options), the Toolbox will attempt to add them as new connections.

The current file patterns are defined as shown below, but you can change them to anything you like:

[screenshot]

Rename Connection name

Connection names currently default to the file name of the database file. This becomes an issue if you have connections to several identically named files in different folders. Therefore, you can now rename the connection name:

[screenshot]

SQLite-net DataAccess.cs improvements

The SQLite-net code generation feature has received a couple of small updates based on user feedback:

[screenshot]

The generated classes will now also contain any views defined in the SQLite database, and nullable columns will be mapped to nullable types (for example int?).
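As an illustration, a generated class for a table or view with a nullable INTEGER column might look roughly like this (the names and the exact set of sqlite-net attributes are made up for the example; the actual generated code can differ):

public class Order
{
    [PrimaryKey]
    public long Id { get; set; }

    public string Description { get; set; }

    // A nullable SQLite column is mapped to a nullable CLR type
    public int? Quantity { get; set; }
}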

NULL value display

[screenshot]

NULL values are now displayed as 'NULL' in the query results text and grid. Remember that you can see query results both as text and in a grid; the default is text for performance reasons, but this can easily be changed in Options:

[screenshot]

Also notice that if you prefer the previous behaviour, you can revert it here via the ShowNullValuesAsNULL option.

Other improvements

Unique indexes are now always scripted as UNIQUE constraints (a long-standing omission, which hit me while implementing reverse engineering for the SQL Server Compact providers for Entity Framework 7).

As usual, any feedback and comments are welcome, and if you like my free tools, I am always very grateful for a review on Visual Studio Gallery.

Monday, July 27, 2015

Getting started with the SQL Server Compact and other providers for Entity Framework 7 beta 6

In connection with the recent release of Entity Framework 7 (EF7) beta 6 on NuGet, I have contributed two SQL Server Compact providers that work with this beta release (in close collaboration with, and with great support from, the EF7 team).

Notice that the general “EntityFramework” package for EF7 is no longer available. Going forward, you always install the desired provider packages as needed – for example EntityFramework.SqlServer.
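From the Package Manager Console that could look like this (the -Pre switch is needed while the packages are still prerelease):

Install-Package EntityFramework.SqlServer -Pre
# or, for the SQL Server Compact 4.0 provider described below:
Install-Package EntityFramework.SqlServerCompact40 -Pre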

The SQL Server Compact providers

The SQL Server Compact providers are fully featured providers that support all required Core APIs (maybe with the exception of provider-specific metadata). This includes support for returning IDENTITY values after INSERTs, Migrations, database object creation, and modelling via both the fluent API and Data Annotations. It also includes support for the Reverse-Engineer command, allowing you to create an EF7 model from an existing database.

For a walkthrough of how to get started with the SQL Server Compact providers, see the wiki article here. Initial Entity Framework 7 documentation is available here (a work in progress). As always, feedback is very much appreciated; I suggest you use the GitHub issue tracker for bug reports, feature suggestions, or even pull request discussions and submissions!
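As a rough sketch of what a minimal model and context could look like with the Compact 4.0 provider – the UseSqlCe extension method and the DbContextOptionsBuilder type are assumptions based on the beta-era API, so follow the wiki walkthrough above for the authoritative, version-specific code:

using Microsoft.Data.Entity; // EF7 beta-era root namespace

public class Blog
{
    public int BlogId { get; set; }
    public string Url { get; set; }
}

public class BloggingContext : DbContext
{
    public DbSet<Blog> Blogs { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
    {
        // Assumed provider entry point; the connection string is just an example
        optionsBuilder.UseSqlCe(@"Data Source=C:\data\Blogging.sdf");
    }
}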

Other providers

With the current EF7 beta, the following providers are available:

EntityFramework.SqlServer Provider for SQL Server/Azure SQL Database - supplied by Microsoft. This provider will be able to take advantage of the .NET Core support for System.Data.SqlClient on Unix and Mac. Currently only SQL Server 2012 and later is supported, but support for SQL Server 2008 is planned

EntityFramework.Sqlite Provider for SQLite, mainly for devices (tablet and phone) - supplied by Microsoft. Will also work with Xamarin (Android and iOS)

EntityFramework.InMemory In-memory provider for use with testing, supplied by Microsoft

All the providers above are hosted here on Github.

EntityFramework.Npgsql Provider for PostgreSQL - also works on Mono. Supplied by Shay Rojansky and others (expected to be released with beta7) – Github

EntityFramework.SqlServerCompact40 Provider for SQL Server Compact 4.0, supplied by yours truly – Github

EntityFramework.SqlServerCompact35 Provider for SQL Server Compact 3.5 SP2, also supplied by yours truly – Github

The current plan is to release EF7 at the same time as ASP.NET 5, which is currently scheduled (no guarantees) for the beginning of next year.

Monday, July 13, 2015

SQL Server Compact ADO.NET data access performance – part 3: SELECTs

In the third instalment of this series, I will demonstrate how to implement SELECTs using the two available APIs in the SQL Server Compact ADO.NET provider. Keep in mind that the “alternative” method of SELECTing requires a corresponding index, so it will only work for simple scenarios where you are looking up rows on the primary key, for example – luckily, this is quite a common scenario.

I will implement the following interface method, which finds a CacheElement by its key:

CacheElement FindElementByKey(Guid key);

Let’s first look at the implementation of the classic T-SQL SELECT based method:

public CacheElement FindElementByKey(Guid key)
{
    using (var command = _connection.CreateCommand())
    {
        command.Parameters.Add("@key", SqlDbType.UniqueIdentifier).Value = key;
        command.CommandText = @"SELECT [Tag], [Value], [CreatedAt], [ExpirationAt]
            FROM CacheElement
            WHERE [Key] = @key;";
        using (var reader = command.ExecuteReader())
        {
            if (reader.Read())
            {
                var result = new CacheElement();
                result.Key = key;
                result.Tag = reader.GetValue(0) == DBNull.Value ? null : reader.GetString(0);
                result.Value = (byte[])reader.GetValue(1);
                result.CreatedAt = reader.GetDateTime(2);
                result.ExpirationAt = reader.GetDateTime(3);
                return result;
            }
            else
            {
                return null;
            }
        }
    }
}

This implementation uses a standard T-SQL based parameterized SELECT query. Nothing new here.

Now let us have a closer look at the implementation using TableDirect mode:

public CacheElement FindElementByKey(Guid key)
{
    using (var command = _connection.CreateCommand())
    {
        command.CommandType = CommandType.TableDirect;
        command.CommandText = "CacheElement";
        command.IndexName = "PK_CacheElement";

        using (var reader = command.ExecuteReader())
        {
            reader.Seek(DbSeekOptions.FirstEqual, key);
            if (reader.Read())
            {
                var element = new CacheElement();
                element.Key = key;
                element.Tag = reader.GetValue(1) == DBNull.Value ? null : reader.GetString(1);
                element.Value = (byte[])reader.GetValue(2);
                element.CreatedAt = reader.GetDateTime(3);
                element.ExpirationAt = reader.GetDateTime(4);
                return element;
            }
        }
        return null;
    }
}

Notice the required differences: the CommandType is set to TableDirect, the CommandText is set to the name of the table we want to query, and the IndexName property is set to the name of the index to be used. The Seek call positions the reader at the first row matching the key value, and the following Read returns it. Notice that the key parameter is a params array, allowing you to use multi-column indexes.
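For example, with a hypothetical two-column index the call could look like this (the index name and key values are made up, purely to illustrate passing several values to Seek):

command.IndexName = "IX_CacheElement_Tag_CreatedAt"; // hypothetical two-column index
reader.Seek(DbSeekOptions.FirstEqual, "some tag", new DateTime(2015, 1, 1));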

In terms of performance, the classic SELECT appears to be 3 times slower than the TableDirect Seek, but of course your mileage may vary.

You can download the sample code from here.

Wednesday, April 29, 2015

SQL Server Compact ADO.NET data access performance – part 2: INSERTs

In this second part of this series, I will show by examples how to use the two available ADO.NET APIs in the SQL Server Compact ADO.NET provider for INSERTing data. I will also show some informal performance measurements, but keep in mind that your scenario may give different results.

In the sample solution, I will create a data access library for maintaining the following table, which you could imagine being used in a caching library:

CREATE TABLE CacheElement (
    [Key] uniqueidentifier NOT NULL,
    Tag NVARCHAR(4000) NULL,
    Value image NOT NULL,
    CreatedAt DATETIME NOT NULL,
    ExpirationAt DATETIME NOT NULL);

ALTER TABLE [CacheElement]
    ADD CONSTRAINT [PK_CacheElement] PRIMARY KEY ([Key]);

This table will represent the following class:

public class CacheElement
{
    public Guid Key { get; set; }
    public string Tag { get; set; }
    public Byte[] Value { get; set; }
    public DateTime CreatedAt { get; set; }
    public DateTime ExpirationAt { get; set; }
}

I will implement the following interface methods in the two possible ways, and then compare implementation and timings, by calling the library from a Unit test project:

    public interface IDataAccess
    {
        void InsertElement(CacheElement element);
        void InsertElements(List<CacheElement> elements);
    }

(I will add more methods and implementations in the upcoming blog posts.)

First, let us look at the implementation of the T-SQL based INSERT, using best practices with explicit parameters (in the ClassicDal class):

public void InsertElement(CacheElement element)
{
    using (var command = _connection.CreateCommand())
    {
        command.Parameters.Add("Key", SqlDbType.UniqueIdentifier).Value = element.Key;
        command.Parameters.Add("Value", SqlDbType.Image).Value = element.Value;
        command.Parameters.Add("CreatedAt", SqlDbType.DateTime).Value = element.CreatedAt;
        command.Parameters.Add("ExpirationAt", SqlDbType.DateTime).Value = element.ExpirationAt;
        if (String.IsNullOrWhiteSpace(element.Tag))
        {
            command.Parameters.Add("Tag", SqlDbType.NVarChar, 4000).Value = DBNull.Value;
        }
        else
        {
            command.Parameters.Add("Tag", SqlDbType.NVarChar, 4000).Value = element.Tag;
        }

        command.CommandText = @"INSERT INTO CacheElement
            ([Key],Tag,Value,CreatedAt,ExpirationAt)
            VALUES   
            (@Key, @Tag, @Value, @CreatedAt, @ExpirationAt)";
        command.ExecuteNonQuery();
    }
}

For all tests, I am using best practices for SQL Compact connection handling, and passing an already open connection. This avoids measuring the overhead of loading the SQL Compact engine DLL files and opening the database file. Notice that for NULLable values, special handling is required.
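A minimal sketch of that connection handling pattern could look like this (that the ClassicDal constructor takes the open connection is an assumption about the sample code):

using (var connection = new SqlCeConnection(@"Data Source=C:\data\Cache.sdf"))
{
    // Open once and reuse the connection for all data access calls
    connection.Open();
    var dal = new ClassicDal(connection); // assumed constructor signature
    dal.InsertElement(new CacheElement
    {
        Key = Guid.NewGuid(),
        Value = new byte[] { 1, 2, 3 },
        CreatedAt = DateTime.UtcNow,
        ExpirationAt = DateTime.UtcNow.AddMinutes(30)
    });
}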

Now let us look at the implementation that uses the “raw” APIs for INSERTs (in the RawDAL class):

public void InsertElement(CacheElement element)
{
    using (var command = _connection.CreateCommand())
    {
        command.CommandType = CommandType.TableDirect;
        command.CommandText = "CacheElement";
        using (var resultSet = command.ExecuteResultSet(ResultSetOptions.Updatable))
        {
            SqlCeUpdatableRecord record = resultSet.CreateRecord();
            record.SetGuid(0, element.Key);
            if (String.IsNullOrWhiteSpace(element.Tag))
            {
                record.SetValue(1, DBNull.Value);
            }
            else
            {
                record.SetString(1, element.Tag);
            }
            record.SetBytes(2, 0, element.Value, 0, element.Value.Length);
            record.SetDateTime(3, element.CreatedAt);
            record.SetDateTime(4, element.ExpirationAt);
            resultSet.Insert(record);
        }
    }
}

Notice the following special pattern: the CommandType is set to TableDirect, the CommandText is the table name, and we use the CreateRecord() method to get a SqlCeUpdatableRecord with “slots” that match our table columns. You have to know the exact ordinal position of your columns; you can get that by scripting a CREATE TABLE statement with my Toolbox, or by inspecting the INFORMATION_SCHEMA.COLUMNS view. There are typed SetXxx methods that must match the data type of your columns. Finally, call the Insert method to add the “record”.

In the sample code, I have also implemented methods to insert many rows in a single method invocation, thus saving the overhead of recreating a number of objects for each INSERT. This is similar to the way I insert data with my SqlCeBulkCopy library.
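A rough sketch of what such a multi-row insert could look like with the raw API – it mirrors, but is not necessarily identical to, the method in the sample code, and assumes the record object can be reused across rows:

public void InsertElements(List<CacheElement> elements)
{
    using (var command = _connection.CreateCommand())
    {
        command.CommandType = CommandType.TableDirect;
        command.CommandText = "CacheElement";
        using (var resultSet = command.ExecuteResultSet(ResultSetOptions.Updatable))
        {
            // Create a single updatable record and reuse it for every row
            var record = resultSet.CreateRecord();
            foreach (var element in elements)
            {
                record.SetGuid(0, element.Key);
                if (String.IsNullOrWhiteSpace(element.Tag))
                {
                    record.SetValue(1, DBNull.Value);
                }
                else
                {
                    record.SetString(1, element.Tag);
                }
                record.SetBytes(2, 0, element.Value, 0, element.Value.Length);
                record.SetDateTime(3, element.CreatedAt);
                record.SetDateTime(4, element.ExpirationAt);
                resultSet.Insert(record);
            }
        }
    }
}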

First the timings for a single INSERT: Raw: 31 ms, Classic: 29 ms – hardly any difference. Then 1000 single INSERTs with minimal object reuse: Raw: 2548 ms, Classic: 1936 ms – in this case no advantage to the raw API. Then finally 1000 INSERTs with maximum object reuse: Raw: 73 ms, Classic: 354 ms. A significant difference, if inserting many rows in a single call is a pattern you have in your application.

You can download the code sample from here, and stay tuned for the next installment: SELECT (using SetRange and Seek).

Monday, April 20, 2015

SQL Server Compact ADO.NET data access performance – part 1: Overview

In this five part series on SQL Server Compact performance, I will demonstrate and measure the performance of the two available data access APIs provided by the SQL Server Compact ADO.NET provider (System.Data.SqlServerCe.dll). The information in this blog post applies to both version 3.5 and 4.0. I will not cover usage of the unmanaged OLEDB interfaces (using C and C++).

As mentioned, the ADO.NET provider provides two APIs for data access (CRUD) operations:

One: The T-SQL standards-based API – SqlCeCommand with ExecuteReader, ExecuteNonQuery and ExecuteScalar – which allows you to use text-based SQL statements, including parameterized SQL. Most ORM implementations that support SQL Server Compact, like LINQ to SQL and Entity Framework, use this API, with the exception of the highly performant OpenNETCF.ORM Framework, which is also able to take advantage of the “table direct” APIs described below.

Two: The “table direct” APIs that allow you to bypass the T-SQL parsing and interpretation process, which alone provides an obvious saving in processing. These include:

SqlCeResultSet (which inherits from SqlCeDataReader) - in particular when used with a CommandType of TableDirect

SqlCeUpdateableRecord

SqlCeCommand.ExecuteResultSet

SqlCeDataReader.Seek

SqlCeCommand.SetRange

As you can see, none of these is available with the SQL Server ADO.NET objects, and they are particular to SQL Server Compact. This blog post series will demonstrate how to use these as alternatives to the “standard” methods.

The diagram from Laxmi’s blog post here explains the advantage graphically:

[diagram]

The “table direct” APIs bypass the Query Processor layer, and instead use an ISAM-like access pattern.
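As a quick preview of the pattern the rest of the series demonstrates in full (the table name, index name and column ordinal here are placeholders):

public static object ReadFirstMatch(SqlCeConnection connection, object keyValue)
{
    using (var command = connection.CreateCommand())
    {
        command.CommandType = CommandType.TableDirect;
        command.CommandText = "MyTable";   // a table name instead of a SQL statement
        command.IndexName = "PK_MyTable";  // the index used by Seek/SetRange
        using (var reader = command.ExecuteReader())
        {
            // Seek positions the reader via the index, bypassing the query processor
            reader.Seek(DbSeekOptions.FirstEqual, keyValue);
            if (reader.Read())
            {
                return reader.GetValue(1); // read column values by ordinal
            }
            return null;
        }
    }
}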

In the next blog post in this series, I will compare a T-SQL INSERT with SqlCeResultSet.Insert(SqlCeUpdatableRecord), and demonstrate how to use the “table direct” APIs for INSERTs.

Monday, March 30, 2015

SQL Server Compact & SQLite Toolbox 4.2 – Visual Guide of new features

After more than 370,000 downloads, version 4.2 of my SQL Server Compact & SQLite Toolbox extension for Visual Studio 2010 and later is now available for download, and also via Tools/Extensions and Updates in Visual Studio. This blog post is a visual guide to the new features, improvements and bug fixes included in this release, many of them suggested by users of the tool via the CodePlex issue tracker.

The main features in this release are improved integration with SQL Server and SQLite, and a number of improvements to the SQL editor.

This version is a recommended update for anyone using this tool.

Improved SQL Server integration

I have previously used this old tip from Jon Galloway to prompt users for a connection to a SQL Server/SQL LocalDB/SQL Azure database, but this caused a number of issues when connecting to LocalDB and SQL Azure. For this release, you instead use Server Explorer to create SQL Server connections, and the Toolbox then refers to these.
So to interact with SQL Server from the Toolbox (for example for scripting, generating DGML files or exporting a full database from SQL Server to SQLite), first connect to the SQL Server via Server Explorer:

[screenshot]

Then launch the Toolbox (you can do that directly from Server Explorer), and select the relevant Server Explorer connection from the dropdown list:

[screenshot]

Export SQLite to SQL Server (and LocalDB/Express)

It is now possible to migrate a SQLite database to SQL Server without creating any intermediate scripts. Simply right click your SQLite database in the Toolbox, and select the “Migrate to SQL Server” option:

[screenshot]

Select your Server Explorer SQL Server connection (as described above), and you can follow the progress of the export via the Visual Studio status bar:

[screenshot]

Many thanks to @thughesit for help with improving this feature.

As SQLite is much more loosely typed than SQL Server, a new option has been added that will truncate strings (causing data loss) if a column has been defined with a length like nvarchar(255) and the data stored in SQLite actually exceeds that length. This option is disabled by default. Any truncations will be logged to %temp%\SQLiteTruncates.log
[screenshot]

You can now also migrate the other way, from SQL Server to SQLite, without any intermediate scripts. Right click the root “Data Connections” node of the Toolbox and select “Export SQL Server to SQLite (beta)”

[screenshot]

Choose the Server from the dropdown list:

[screenshot]

Choose the tables to Export:

[screenshot]

Finally, specify the name of the new SQLite database file; you can then follow the progress of the export via the Visual Studio status bar.

SQL editor improvements

[screenshot]

The query editor has received a number of highly requested improvements:

Improved document handling:
There are now “Open” (1), “Save” and “Save As” (2) buttons, and each editor window is aware of its underlying document. If a document is open, it is indicated in the window caption, and a * symbol indicates whether the document has unsaved changes:

[screenshot]

Keyboard shortcuts: Keyboard shortcuts have been added for Execute query (F5), Save (Ctrl+S) and Open (Ctrl+O) – a long-standing request finally implemented.
Export results as CSV: A new button has been added that will export the first result set as a CSV (Excel) file, saving you from having to copy and paste the results to Excel.

SQLite PRAGMAs: It is now possible to execute PRAGMA commands with SQLite

Other Improvements

Migrating a SQL Compact or SQLite database to SQL Server no longer blocks the UI.
Exporting a database from SQL Server no longer blocks the UI.
No more Error Reporting dialog, I am now using Exceptionless for error reporting instead.
Improved formatting of DataAccess.cs sqlite-net code
Improved initial directory when adding SQLite and SQL Compact connections
Improved error information when SQLite datetime parsing errors occur (SQLite allows you to put anything in a column defined as datetime, apparently!)
Improved error information when SQL Server script execution errors occur (would mainly happen when exporting from SQLite to SQL Server)
Closing the Edit Data grid now closes the connection to the database, meaning that the Toolbox will have no open connection to the database file when all Edit Grids are closed.
Updated to version 1.0.96 of the SQLite ADO.NET provider

Bug fixes

Migrate to SQL Server feature was broken
Merge Replication Subscription creation was broken
Edit column should not show "Primary Key" column, and not allow editing of column name
Opening the SQL Editor sometimes crashed VS
FormatException in Explorer window

Wednesday, March 25, 2015

Entity Framework 6 (and SQL Server Compact) (5)–Entity Framework 6 extensions

This is a list of some of the nice Entity Framework 6 extensions out there that expand the functionality of the Entity Framework 6 runtime. Code generator tools and frameworks using Entity Framework are not included here, only libraries that extend DbContext or similar.

The EF team has lists of Tools and 3rd party EF providers here: http://msdn.microsoft.com/en-us/data/ee712907

Please let me know if I have missed anything, and I will add it to the list.


Store Functions for EntityFramework CodeFirst
 

Why: This project uses the basic building blocks to build an end-to-end experience allowing the use of TVFs in LINQ queries and invoking stored procedures without having to drop down to SQL. (By an EF Team member)
Project: https://codefirstfunctions.codeplex.com/
NuGet: Coming soon


Interactive Pre-Generated Views for Entity Framework 6

Why: An alternative is a solution where views are created dynamically only when needed (i.e. views don't exist or are outdated) and then persisted and made available for use by other instances of the same application (be it running in parallel or the same application after a restart). This is exactly the problem the Interactive Pre-Generated Views project is trying to solve. (By an EF Team member)
Project: https://efinteractiveviews.codeplex.com/
NuGet: https://www.nuget.org/packages/EFInteractiveViews

Second Level Cache for Entity Framework 6.1

Why: Entity Framework does not currently support caching of query results. A sample EF Caching provider is available for Entity Framework version 5 and earlier but due to changes to the provider model this sample provider does not work with Entity Framework 6 and newer. This project is filling the gap by enabling caching of query results for Entity Framework 6.1 applications.  (By EF Team member)
Project: https://efcache.codeplex.com/
NuGet: http://www.nuget.org/packages/EntityFramework.Cache


Entity Framework 6 Contrib

Why: https://ef6contrib.codeplex.com/documentation
Project: https://ef6contrib.codeplex.com/
NuGet: https://www.nuget.org/packages/EF6.Contrib/


Entity Framework Extended Library

Why: Batch Update and Delete, Future Queries, Query Result Cache, Audit Log

Nuget: http://www.nuget.org/packages/EntityFramework61.Extended (forked)
and http://www.nuget.org/packages/EntityFramework6.Extended (forked)
and https://www.nuget.org/packages/EntityFramework.Extended/
Project: https://github.com/loresoft/EntityFramework.Extended and https://github.com/AdvancedREI/EntityFramework.Extended


EntityFramework.Utilities

Why: EntityFramework.Utilities provides some batch operations for using EF that the EF team hasn't yet added for us. (CUD)
Project: https://github.com/MikaelEliasson/EntityFramework.Utilities
NuGet: https://www.nuget.org/packages/EFUtilities/


EntityFramework.BulkInsert

Why: Bulk insert extension for EntityFramework. Insert large amount of data over 20 times faster than regular insert. Supports DB first and code first.
Project: https://efbulkinsert.codeplex.com/
NuGet: https://www.nuget.org/packages/EntityFramework.BulkInsert-ef6


Trackable Entities

Why: Visual Studio 2012 and 2013 project templates and NuGet packages for client-side entity change-tracking and server-side persistence with Entity Framework extensions.
Project: http://trackableentities.github.io/
NuGet: https://www.nuget.org/packages/TrackableEntities.EF.6/

EntityFramework.MappingAPI

Why: EntityFramework mapping API gives you ability to know the database info behind the context.
Project: https://efmappingapi.codeplex.com/
NuGet: https://www.nuget.org/packages/EntityFramework.MappingAPI

EntityFramework.Filters

Why: Create and configures filters applied to queries across a DbContext
Project: https://github.com/jbogard/EntityFramework.Filters
NuGet: https://www.nuget.org/packages/EntityFramework.Filters/

EntityFramework.DynamicFilters

Why: Allows you to create global and scoped filters for Entity Framework queries. The filters are automatically included in every query and can be used to support use cases such as Multi-Tenancy, Soft Deletes, Active/Inactive, etc.
Project: https://github.com/jcachat/EntityFramework.DynamicFilters
NuGet: https://www.nuget.org/packages/EntityFramework.DynamicFilters/


EntityFramework.Seeder

Why: Using EntityFramework.Seeder, you can seed your database from CSV files.
Project: https://github.com/dpaquette/EntityFramework.Seeder/
NuGet: https://www.nuget.org/packages/EntityFramework.Seeder.EF6/

EntityFramework.Triggers

Why: Add triggers to your entities with insert, update, and delete events. There are three events for each: before, after, and upon failure.
Project: https://github.com/NickStrupat/EntityFramework.Triggers
NuGet: https://www.nuget.org/packages/EntityFramework.Triggers/

EntityFramework.IndexingExtensions

Why: Adds fluent API to model indexes
Project: https://github.com/mj1856/EntityFramework.IndexingExtensions
NuGet: https://www.nuget.org/packages/EntityFramework.IndexingExtensions/

Z.EntityFramework.Plus.EF6

Why: Entity Framework Bulk Operations & Utilities (EF Bulk SaveChanges, Insert, Update, Delete, Merge | LINQ Query Cache, Deferred, Filter, IncludeFilter, IncludeOptimize | Audit)
Project: https://github.com/zzzprojects/EntityFramework-Plus
NuGet: https://www.nuget.org/packages/Z.EntityFramework.Plus.EF6/

Monday, March 23, 2015

Using Exceptionless.com for error logging and feature usage tracking with a Visual Studio add-in

In the next release of the SQL Server Compact & SQLite Toolbox, which is currently available as a Release Candidate, I will start using Exceptionless.com for error reporting and feature usage tracking. In this blog post, I will describe the few steps required to integrate the Exceptionless client with a Visual Studio extension.

I have used RedGate SmartAssembly until now for this, but will stop doing so for a few reasons:

- The build process is cumbersome with SmartAssembly, as it is mainly an obfuscation tool, and therefore modifies the add-in assembly itself during build (Link to my blog post)

- The desktop client is clunky, and only runs from a single PC

- The underlying web service is slow if you have a large number of error and feature usage reports

Exceptionless.com is a hosted web service that allows you to collect error, feature usage and log reports online from a variety of clients. Simply sign up, get your API key, install the NuGet package for your client and you are ready to go. All your exceptions and logs are then available on the portal. The project is open source, and you can host it on your own servers, on Azure or on premises. There are both free and paid plans available.

[screenshot]

Invoking Error Reporting and Feature Tracking in Code

As the Toolbox is an add-in, I prefer not to catch any unhandled Visual Studio exceptions, but I would still like to be able to report any errors occurring in the Toolbox, in order to improve it. In addition, I would like to track “feature usage” information, in order to know which features are most popular, and also track statistical information like the version of Visual Studio in use (the Toolbox runs under Visual Studio 2010, 2012, 2013 and 2015). The Exceptionless client easily allows you to do this.

First sign up at the portal to get your API key; you can start with the free plan to try out the service, and later upgrade to a paid plan.

Install the signed WPF package in your extension project (you must use the signed package with a Visual Studio extension)

Install-Package Exceptionless.Wpf.Signed

Alternatively, download the NuGet package, unzip it, extract the following three files, include them in your lib folder and reference them from your project (I did that to avoid NuGet packages in my project):

Exceptionless.Extras.dll, Exceptionless.Portable.dll, Exceptionless.Wpf.dll

In your overridden package Initialize method, register your API key:

using Exceptionless;

ExceptionlessClient.Default.Configuration.ApiKey = "your api key";
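In context, inside the package class it could look roughly like this (a minimal sketch – only the API key line comes from the Exceptionless documentation, the rest is the standard VSPackage pattern):

using Exceptionless;
using Microsoft.VisualStudio.Shell;

public sealed class SqlCeToolboxPackage : Package
{
    protected override void Initialize()
    {
        base.Initialize();
        // Register the API key once, before any error or feature usage reporting
        ExceptionlessClient.Default.Configuration.ApiKey = "your api key";
    }
}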

Then, to log handled exceptions, use the ToExceptionless() extension method:

using Exceptionless;

var ver = SqlCeToolboxPackage.VisualStudioVersion.ToString(2);
ex.ToExceptionless()
    .AddObject(ver, "VsVersion")
    .Submit();

Notice the fluent API that allows you to add custom information and objects to the error report.

In addition, you can log feature usage as follows:

using Exceptionless;

ExceptionlessClient.Default.SubmitFeatureUsage(feature);

Where feature is the name (key) you have assigned to the feature, see screenshot above.

All in all a very simple and painless integration. You can now monitor exceptions, and potentially fix them almost before your users report them, and use feature counts to prioritize future feature work.

Happy exception monitoring!

Monday, March 9, 2015

SQL Server Data Tools for Visual Studio – versions, downloads and compatibility

In Visual Studio, SQL Server offers four different project categories under the umbrella of “SQL Server Data Tools”, targeting various components of the SQL Server product. There is a lot of confusing advice out there and no single overview of the various versions and downloads available; this blog post hopes to remedy that.

Project types

The four project types are:

SQL Server Analysis Services (SSAS) for creation of various types of Analysis Services models

SQL Server Reporting Services (SSRS) for creation of reports and related artifacts

SQL Server Integration Services (SSIS) for creation of data integration and workflow solutions, including ETL (extract, transform, load) workflows

The three project types above are distributed together under the name: SQL Server Data Tools – Business Intelligence (SSDT-BI)

SQL Server Data Tools (SSDT) database projects and SQL Server Object Explorer (SSOX) provide an integrated environment for database developers to carry out all their database design work for any SQL Server platform (both on and off premises) within Visual Studio. Database developers can use SSOX in VS to easily create or edit database objects and data, or execute queries.

Releases

An overview of the current releases and download links is available on the MSDN page here.

The following packages are available:
SSDT-BI for Visual Studio 2012 (SSDT-BI 2012)
SSDT-BI for Visual Studio 2013 (SSDT-BI 2013)

SSDT for Visual Studio 2012 (SSDT 2012)
SSDT for Visual Studio 2013 (SSDT 2013)

Version support

Once you have determined what kind of project is relevant for your task at hand, you need to decide which download and Visual Studio version to use, based on the SQL Server version that you target:

Tool/Project | SQL Server 2005 / 2008 | SQL Server 2012 | SQL Server 2014 | Azure SQL DB
--- | --- | --- | --- | ---
Reporting | | | |
SSDT-BI 2012 (SSRS) | No | Yes | No | N/A
SSDT-BI 2013 (SSRS) | No | Yes | Yes | N/A
Analysis | | | |
SSDT-BI 2012 (SSAS) | No | Yes | No | N/A
SSDT-BI 2013 (SSAS) | No | Yes | Yes | N/A
Integration | | | |
SSDT-BI 2012 (SSIS) | No | Yes | No | N/A
SSDT-BI 2013 (SSIS) | No | No | Yes | N/A
SQL DB Projects and SSOX | | | |
SSDT 2012 | Yes | Yes | Yes | Yes
SSDT 2013 | Yes | Yes | Yes | Yes

NOTE: For SQL Server 2008 based SSRS, SSAS and SSIS projects, you use Business Intelligence Development Studio, which is installed from the SQL Server installation disk.

NOTE: If you open an SSIS project created in SSDT-BI 2012 with SSDT-BI 2014, a one-way upgrade will occur, so be careful and use source control. More info here.

Hope this helps

Wednesday, March 4, 2015

Using SQLite with the Entity Framework 6.1.x designer - a troubleshooting guide

Many users would like to be able to use the SQLite database with the Entity Framework 6 EDM Wizard and Designer, but judging by Stack Overflow, they often face problems when attempting to do so. This blog post describes the required steps, and gives some troubleshooting advice and tips.

Install the SQLite DDEX provider

UPDATE: This blog post describes how to generate and use an EDMX file with SQLite. I have not tested “Code First from Database”, and “Generate Database from Model” does not work with SQLite. In addition, Code First Migrations are not supported with SQLite. If you want these additional capabilities in an embedded database, use the SQL Server Compact provider. If you encounter issues with the SQLite provider, I suggest you post an issue here.

In order to use the Entity Data Model Wizard to generate an EDMX model from an existing database, you must be able to connect to the database from Server Explorer in Visual Studio. To do that, a so-called DDEX provider for the database and the Visual Studio version in use must be installed.

For SQLite you must download the provider from http://system.data.sqlite.org/index.html/doc/trunk/www/downloads.wiki - this page is an unreadable mess, sadly.

For Visual Studio 2013, the download you need is named sqlite-netFx451-setup-bundle-x86-2013-1.0.96.0.exe for the current (1.0.96) version of the ADO.NET provider – 1.0.96 will change to 1.0.97 or higher in the future.

[screenshot]

It is important that the DDEX provider version matches the version of the SQLite EF6 NuGet package that you use in your project!

Make sure to enable Visual Studio integration and GAC registration during installation.

You can check which version of the DDEX provider you have installed in “Add or Remove Programs”, and also check that no older versions of the DDEX provider are present:

[screenshot]

Once installed, you can add a connection to your SQLite database from Server Explorer:

[screenshots]


Add the correct SQLite EF6 NuGet package to your project

To demonstrate using the SQLite EF6 provider, let's go through creating a console app in Visual Studio 2013 that uses the Chinook sample database we connected to above.

To add the SQLite EF6 provider to your project, select the System.Data.SQLite.EF6 package:

[screenshot]

This package will install all other required dependencies, including Entity Framework 6.1.2.
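If you prefer the Package Manager Console over the NuGet dialog, the equivalent command is simply:

Install-Package System.Data.SQLite.EF6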

After installing this package in a project that targets .NET 4.5, your packages.config should look like this (notice that as of version 1.0.95 of the SQLite provider, only a single package is required):

[screenshot]

And the system.data section of your app.config file should be modified to look like this (notice the added <remove> and <add> elements):


<remove invariant="System.Data.SQLite" />

<add name="SQLite Data Provider"
                     invariant="System.Data.SQLite"
                     description=".NET Framework Data Provider for SQLite"
                     type="System.Data.SQLite.SQLiteFactory, System.Data.SQLite" />

Important: Now BUILD your project!

You should now be able to add an EDMX-based model to your project, using Add New Item, Data, ADO.NET Entity Data Model:

[screenshot]

Add some code to the Main method to test that the application also works at runtime:

            using (var db = new ChinookEntities())
            {
                var albums = db.Albums.ToList();
                foreach (var item in albums)
                {
                    Console.WriteLine(item.Title);
                }
            }
            Console.ReadKey();

If you are unable to see the connection to the SQLite database we made earlier, here are a couple of tips:

Make sure to build the project; also, changing the build configuration to x86 has worked for me.

Installing the latest Entity Framework 6.1.2 Visual Studio tools has worked very well for me (the version 6.1.1 tools are included in Visual Studio 2013 Update 4) – download link in the blog post here. I think this is due to this fix in the Tools.

Hope this helps! 

Monday, January 12, 2015

“SQL Server Compact & SQLite Toolbox” related news

This blog post contains a few bits of news that all relate to my SQL Server Compact / SQLite Toolbox Visual Studio add-in.

Visual Studio 2013 Community Edition

Microsoft recently released Visual Studio 2013 Community, a free, full-featured edition of Visual Studio. It has essentially all the features found in VS Professional, except some Office/SharePoint project templates, and is free for many scenarios (please check the licensing requirements). Previously, users of the SQL Server Compact Toolbox with VS Express had to use the standalone edition of the Toolbox. If you replace your Express edition with Community, this is no longer the case, as Community supports all VSIX extensions from the VS Gallery / Extensions and Updates dialog in Visual Studio. A very bold move from Microsoft, and a major boost for open source/hobby developers. If you have any questions about this VS edition, start by reading the Q&A here.

[screenshot]

Visual Studio Auto Updater

Mads Kristensen, a Microsoft Program Manager (and fellow Dane), recently released the “one extension to rule them all”, the Visual Studio Auto Updater extension. It allows you to specify which of your Visual Studio extensions you want to automatically update when a new version is released. I have submitted a pull request to have the SQL Server Compact / SQLite Toolbox included in the list of extensions that are always updated. Highly recommended – it ensures that your Visual Studio installation is always fresh.

[screenshot]

SQLite Toolbox on the Channel 9 “Visual Studio Toolbox” show

In early November last year I had the opportunity to appear on the popular Channel 9 show “Visual Studio Toolbox” hosted by Robert Green. In the show, I present the new SQLite support in the Toolbox. A good intro if you want to get started using the Toolbox.

[screenshot]

Version 4.2 preview

The next release of the Toolbox, version 4.2, is currently available in preview from CodePlex. The main focus for this release is a number of (overdue) improvements to the SQL query editor:

[screenshot]

1: The editor now has proper “file” handling, with Save and Save As buttons, and the saved file name appearing in the tab caption.

2: Keyboard shortcuts have been enabled, allowing you to use F5 to execute queries, Ctrl+O to open scripts, and Ctrl+S to save scripts.

3: A button to export the current result as CSV (Excel) format has been added.

Try out the preview, and let me know if you have any suggestions or find any issues via the CodePlex issue tracker.