I’m constantly amazed at the rate at which blockers to adoption of Azure SQL Database are being removed. With the advent of V12 servers we got the same core services as on-premises SQL Server. Since then we’ve got full text search, row-level security, transparent data encryption and more.

Last week Torsten Grabs blogged about the next big thing, which not only enables cross-database queries but also some other great elastic scale features.

You can read about the details here: https://azure.microsoft.com/en-us/blog/querying-remote-databases-in-azure-sql-db/
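If you're curious what that looks like, here is a minimal sketch of the elastic query pattern the post describes; all server, database, credential and table names are illustrative placeholders:

-- Needed once per database to protect the credential
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

CREATE DATABASE SCOPED CREDENTIAL RemoteCred
WITH IDENTITY = 'remoteUser', SECRET = '<password>';

-- Points at the remote database that actually holds the table
CREATE EXTERNAL DATA SOURCE RemoteOrdersDb WITH
(
    TYPE = RDBMS,
    LOCATION = 'myserver.database.windows.net',
    DATABASE_NAME = 'OrdersDb',
    CREDENTIAL = RemoteCred
);

-- A local shell that maps onto dbo.Orders in the remote database
CREATE EXTERNAL TABLE dbo.Orders
(
    OrderId int NOT NULL,
    CustomerId int NOT NULL,
    OrderDate datetime2 NOT NULL
)
WITH (DATA_SOURCE = RemoteOrdersDb);

-- This now runs as a cross-database query
SELECT TOP 10 * FROM dbo.Orders;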

We’re working on some posts on how to easily move your on-premises DBs to Azure SQL Database and utilise these new features.

Posted by simonsabin | with no comments

Many people are fortunate enough to have laptops or desktops with high resolution displays. If you try and run SQL Server Management Studio on those displays you will see everything is minute and unreadable.



The solution is to use scaling to increase the size of fonts and other components so you can read the fonts and push the buttons.

I use 150% on my Surface Pro 3 and it works well for most apps.

Unfortunately, for this to work properly your app needs to handle the scaling correctly. In many apps the font sizes are increased but not the sizes of the dialogs or the icons, resulting in an unusable dialog.

This is definitely the case for SQL Server Management Studio.

Luckily for you, I was pointed towards a post by an Italian SQL guy, Gianluca Sartori, that shows you how you can enable SQL Server Management Studio (SSMS) to work with high resolution monitors and DPI scaling.



Happy viewing

Posted by simonsabin | with no comments

If you are eager to get Office 2016, and I would say there are some really nice features, you might be wondering how.

There is no automated update (that’s coming next year), and if you go to Office 365 and download software you will just get 2013 (the latest??? go figure).


What “latest” actually means is the latest on the branch of Office you are configured for.

If you want the bleeding edge then you need to say so in the service settings of Office 365.

To do this:

  1. Go to the admin section of office 365.
  2. Select Service Settings
  3. Select Updates

and you will get a screen where you can choose whether you want all users or individual users on the Standard Release or the First Release (aka early adopters).

To get Office 2016 at the moment you need to be on the First Release.

Once you do that you will get another section in the download software page https://portal.office.com/OLS/mysoftware.aspx that gives you the option to “try the next version of Office”

From here you can install the latest 32 or 64 bit version of Office.

If you are a company and need to do a bit more planning then have a look at https://technet.microsoft.com/en-us/library/mt422981.aspx

Posted by simonsabin | 1 comment(s)

I recently upgraded to Excel 2016 (read this on how to get it with Office 365) and couldn’t for the life of me find Power View.

Power View is included as an add-in for Excel 2016, so I assumed that, like Power Query, its menu had been consumed into another menu. But no, I couldn’t find it.

I checked add-ins and as expected the add-in wasn’t enabled and so I enabled it, but I still couldn’t find it.

As it turned out, the menu item to insert a Power View sheet is turned off by default, so it’s not just a case of enabling a menu (like you do for the Developer menu); you actually have to add the “Insert Power View Sheet” command into the ribbon.

The steps to do this can be found here



Posted by simonsabin | with no comments

My heart was filled with a little joy today because someone, somewhere in the world updated an item on UserVoice.

I voted on this UserVoice item some time ago, and today it was updated to say it was started.


Now, historically MS and SQL have taken in feedback using Connect, and some would regard it as the black hole of feedback. The reasons being that:

1. There was little focus on small improvements to make users’ lives better.

2. There wasn’t a good ship cycle: 2-3 years to wait for an item.

3. Items might never get updated.

4. There was no indication something was being worked on or when it would be shipped.

5. If I hit a problem I would have to implement a workaround that would last for a number of years, because I knew that was the ship cycle.


The difference now is that:

1. I’m getting feedback that an item I care about is being worked on. Joy.

2. Importantly, I know that if it’s being worked on, then it will likely ship in the near future, because I know there is a continuous cadence of delivery. Joy.


If there wasn’t a continuous process then even if an item was flagged as being worked on, I’d think: what difference does it make? I’m not getting it for another x years.

I am also super excited about the prospect of features being delivered for the core SQL tooling now they have decided to ship Management Studio at a different cadence to that of the engine. http://blogs.msdn.com/b/sqlreleaseservices/archive/2015/06/24/sql-server-management-studio-june-2015-release.aspx Yippee, at long last.

The feedback cycle is one area that many companies don’t focus on, but it is core to the wider DevOps practice that companies are now adopting.

Posted by simonsabin | with no comments

I've got an opportunity for a SQL Developer for a project based in London, but with most of the work being done at home.

If you've got the drive to work independently with some input from me and have core skills in SSDT and SSIS then this could be for you.

I'm looking for someone who has worked in an automated CI/CD build environment, understands the benefits of this, and is able to set it up for SSDT and SSIS.

Ideally you'll have C# and PowerShell skills as well.

The work will be partly in London but also home working.

And the best part is you get to work with me, some of the time :) (not sure what's worse, working with me some of the time or all of the time).

If you're interested then tweet me @simon_sabin or drop me a line at simon [ at ] sabin.io.

Posted by simonsabin | with no comments

I started off learning Oracle many moons ago and laughed at this SQL Server thing when I attended a training course and found it didn't do this and that. But the one thing that stood out was tooling.

In the Oracle world everything was command line, or something that I can't remember the name of that was Java-based, slow and generally awful.

SQL Server had Query Analyser, an awesome query tool; it was so lightweight. And Enterprise Manager, an MMC plugin (OK, this wasn't amazing, but it was good).

It also had SQL Profiler. This was just amazing, and looking at the world now, with the focus on insights into application performance, error logs etc., this really was way ahead of its time.

With it you could see all the queries hitting the database, their duration, and the amount of IO and CPU they used. It was awesome and has been the number one tool in any SQL developer's toolkit ever since.

Well, with SQL Server 2016 you get a new tool that not only tells you what queries have hit the database but, importantly, how they've been compiled differently.

Probably the most common problem that I encounter that is not down to human error is that of SQL using a bad query plan.

I did talk about this back at SQLBits V (http://sqlbits.com/Sessions/Event5/When_a_query_plan_goes_wrong); it's largely down to the optimiser estimating something wrong. The analogy is estimating the distance from your house to the other side of the world, coming up with 10m, and deciding to walk because using the car would be madness. It's going to take you a long time to get there. That's what happens: SQL can pick a plan that is super optimal and super quick for a few rows, but if the estimate is wrong that super plan becomes super slow.

The issue is that the optimiser may suddenly flip from one plan to another without warning. This largely happens when a plan is evicted from the cache and a new plan is compiled based on estimates that are wrong.
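As a hypothetical illustration of that flip (the table and row counts are invented for the example), consider a procedure whose plan is compiled for whatever parameter value happens to arrive first after eviction:

CREATE PROCEDURE dbo.GetOrdersForCustomer @CustomerId int
AS
SELECT OrderId, OrderDate
FROM dbo.Orders
WHERE CustomerId = @CustomerId;
GO

-- First call after eviction: 3 rows, so a "few rows" plan is compiled and cached
EXEC dbo.GetOrdersForCustomer @CustomerId = 1;

-- Later call: 5 million rows, but the cached "few rows" plan is reused and crawls
EXEC dbo.GetOrdersForCustomer @CustomerId = 42;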

The Query Store will show you, for a specific query, the plans that have been used and the difference in performance. And not only will it show you, it will let you say: always use this plan, that's the safe plan.
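As a rough sketch of what that looks like in T-SQL (the database name and the query/plan id values are placeholders; the catalog views and procedure are the Query Store's documented interface):

-- Turn the Query Store on for a database
ALTER DATABASE MyDb SET QUERY_STORE = ON;

-- See the plans captured for each query and how they performed
SELECT q.query_id, p.plan_id, rs.count_executions, rs.avg_duration
FROM sys.query_store_query q
JOIN sys.query_store_plan p ON p.query_id = q.query_id
JOIN sys.query_store_runtime_stats rs ON rs.plan_id = p.plan_id
ORDER BY rs.avg_duration DESC;

-- Pin the known-good plan so the optimiser can't flip away from it
EXEC sp_query_store_force_plan @query_id = 42, @plan_id = 7;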

So make sure that you go and try the trial, as this will solve many problems that you will encounter.


Posted by simonsabin | with no comments

Update 12 Jun 2016

I do love the capability that a SaaS offering gives you as a provider and a consumer. As a provider you can monitor your service and deliver a fix easily without deploying any new code to the customer. As a customer you get fixes at a much quicker cadence and without the hassles of upgrading software.

I especially love the attitude of the VSO team to getting features and bug fixes released. The bug detailed below is now fixed; that's super quick, well done.

I'm really excited about the changes here and coming to Visual Studio Online (VSO) with regards to build and deployment. Once completed, the process will be much more akin to the friendly process that you get with other tools such as TeamCity.

As part of this the team have blogged about the ability to run tests as part of your CI build process. http://blogs.msdn.com/b/visualstudioalm/archive/2015/05/29/testing-in-continuous-integration-and-continuous-deployment-workflows.aspx

Full-on release pipelines with approvals etc. aren't available yet, but you can do a huge amount with the current vNext build process in VSO.

The part that is highlighted in the blog post is the ability to run tests on machines, à la Lab Management in MTM. To get this to work you have to add machines to your project that will be the target of the testing.

Unfortunately, if you have your VSO account connected to Azure Active Directory and you log in with a Microsoft account, not an Azure Active Directory account, then the machines dialog may not work.

To get it to work you need to be logged in as an Azure Active Directory account.

You can check to see if your VSO account is connected to an AAD by going to https://&lt;account&gt;.visualstudio.com/_admin/_home/settings (change &lt;account&gt; to your account name).

At the bottom it will say whether the account is connected.

I've been told there is a fix in the pipeline, which judging by past performance means it shouldn't be too far away.

Posted by simonsabin | with no comments

I love Power BI; actually, I love Power Query. It's a great way to combine data from around your business, across the web, from anywhere. What I really like is that very little is done automatically, i.e. it doesn't do the nice data type detection you get with Excel that screws up your imports if the data type in a column differs from the first few rows to the rest of the file.

Does that make it difficult to use? No, it's not. The nice thing is that it's very easy to add additional columns, change data types, use expressions, combine datasets, and do much, much more.

If you've ever used SSIS you will often find that the UI gets in the way of the experience. SSIS is really focused on a high performance pipeline, and this bubbles far too much to the surface with the buffers and column mappings.

Let's walk through a simple example of splitting a column that contains a filename into the file and the extension.


  1. Add a derived column component to your data flow
  2. Delete the existing connection between two data flow components
  3. Connect the upstream component to the derived column component
  4. Connect the downstream component to the derived column component
  5. Correct any mapping you had defined on the downstream component
  6. Open the derived column component
  7. Add a row for the new column
  8. Try and enter your expression in the single text box (or, if you have downloaded BIDS Helper, use its expression editor) to find the filename
  9. Make sure the length for the new field is going to be big enough
  10. Do the same to get the file extension
  11. Get some sample data
  12. Put in a data viewer
  13. Run the data flow to check the split is working correctly
  14. Click ok

How about in Power Query

The wizard way

  1. Click on the column you want to split
  2. Click the split column in the toolbar
  3. Specify the delimiter and how you want it split
  4. Click ok
  5. View the split is doing what you want

The wizard way (2)

  1. Click Add Custom Column
  2. Put the following as the function (Column1 is the source column, Source is the source table, and Column1.1 and Column1.2 are the new column names):

Table.SplitColumn(Source,"Column1",Splitter.SplitTextByEachDelimiter({"."}, null, true),{"Column1.1", "Column1.2"})

  3. Click OK

The non-wizard way

  1. Open the Advanced editor for the query
  2. Add a new line where you want to add the column in the steps
  3. Add

SourceWithSplitColumn = Table.SplitColumn(Source,"Column1",Splitter.SplitTextByEachDelimiter({"."}, null, true),{"Column1.1", "Column1.2"})

  4. Change the line that was referencing the old variable to point to the new variable
  5. Click OK and view the changes


The real power of Power Query is that everything is just code; you don't have to deal with some bad UIs, as the fallback is always a line of code. With SSIS, if something was awful in the UI you had to resort to the advanced editor, and that only worked if the component hooked everything up properly.

I was amazed at the library of functions available: JSON parsing, XML parsing, downloading from the web, from SQL and much, much more. Discussing this with Matt Masson at Ignite this month, it all made sense: ALL components are built using the M language. If they need to do something that they can't do with the language, they consider extending the language, adding more functions, which means it's then available to all.

Not seen Power Query? Where do you start?

Well, you can just jump in, create some queries, use the wizards and see what's generated. Quickly you'll want to be jumping into the advanced editor (well, I was), and with no IntelliSense you will stumble.

So once you've found your way around, maybe try watching some videos and looking at samples.


You really need to understand the basics of the language. I'd read the first few sections of the language specification (first item below); this explains the syntax and is really useful. Then there are all the functions you will want; they are in the next two.

Happy reading



Posted by simonsabin | 1 comment(s)

I was very proud to be asked to speak at Ignite at the start of May. http://ignite.microsoft.com/speaker/speakermoreinfo/?speakerid=f0dfadd0-2498-e411-b87f-00155d5066d7.

I encounter much scepticism about continuous delivery for databases; moreover, many talks about continuous delivery invariably stumble when databases are mentioned. This often leads developers to consider schema-less solutions.

The talk goes through the reasons why anyone building software in today’s world should be using CI and CD. What’s more, I focus on two of the common misconceptions about CD.

First, that of security and compliance: how can you have an automated process for deploying your database and meet your security and compliance needs? I show how the opposite is probably more the case: how can you possibly have a compliant and secure database deployment process that isn’t continuous from source through to deployment?

Secondly, that of it not being possible. Many turn around and say that automation of database deployments isn’t possible. Many reasons are touted: the tools don’t support feature x, it’s too slow, we need control over X, we need the upgrade to not take the database offline. Well, in the talk:

1. I’m using the encryption functionality in Azure SQL Database that was only announced at Ignite, and thus the tooling doesn’t support it.

2. I do a large refactor that replaces a table with a view and a trigger, along with population of new columns (a sketch of the general pattern follows this list).

3. All is done whilst an application is hitting the database and it continues with no problems.
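To give a flavour of that refactor, here is a hedged sketch of the general pattern; the names are illustrative, not the actual demo code. The table is renamed away, a view takes its old name and adds the new column, and an INSTEAD OF trigger keeps the application's writes working unchanged:

-- 1. Move the real table out of the way
EXEC sp_rename 'dbo.Customer', 'Customer_Base';
GO

-- 2. Re-create the old name as a view that also exposes the new column
CREATE VIEW dbo.Customer
AS
SELECT CustomerId, FirstName, LastName,
       FirstName + ' ' + LastName AS FullName  -- the newly populated column
FROM dbo.Customer_Base;
GO

-- 3. An INSTEAD OF trigger so the application's INSERTs still work
CREATE TRIGGER dbo.Customer_Insert ON dbo.Customer
INSTEAD OF INSERT
AS
BEGIN
    INSERT dbo.Customer_Base (CustomerId, FirstName, LastName)
    SELECT CustomerId, FirstName, LastName
    FROM inserted;
END;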

As well as the Azure preview features, the talk is built around Visual Studio Online and Release Management, which provide the source control, review, build, test and deployment mechanisms, and a workflow for the whole process along with approval at each stage.

In the session I do 3 full deployments to dev and production through the full lifecycle, from check-in through to deployment. Given that, plus the use of Azure and the use of preview features, I was very glad when all the demos worked.

If you are in any doubt that CD is possible for databases then watch the video.


You can find the PowerPoint by registering on Yammer and going to https://www.yammer.com/microsoftignite/#/Threads/show?threadId=537321578

