December 2010 - Posts

I recently asked a few questions about some features I was surprised weren’t in Reporting Services. The response I got from the team was, “raise them on Connect and get them voted for”. So that’s what I’ve done.

Here is a list of the items I’ve raised, as well as existing items I’ve found that I think should be done.

If you agree with any of these features please vote. It does make a difference.

 

Controlling functionality in report builder

Controlling what a user can do in Report Builder, i.e. not change parameters, data sets or data sources; just change the layout.

These are the related Connect items: option-to-create-only-dataset-if-user-is-only-entitled-to-create-dataset-in-report-builder, report-builder-further-show-hide-by-role

Provide a find feature in report builder for controls

Report Builder reports errors against the relevant objects (e.g. textbox34), but there is no way to find that object. In BIDS there is a drop down you can use to select an object. enable-controls-to-be-found-by-name-in-report-builder

Enable report parts to be propagated to reports

Currently the only way to update a report part in a report is to open the report in Report Builder. Not very efficient. Vote on the Connect item here: enable-report-part-updates-to-be-propogated-automatically

Report parts can only be developed in report builder

You can’t develop or use report parts in BIDS. If you agree, here is the Connect item: Report Parts only available for Report Builder consumption? That SUCKS! As the guy says, that sucks IMHO.

Report templates

Wouldn’t it be nice to develop a report template that could be used across reports? Here are the Connect items: SSRS Report Templates, Add Templates for Report Builder 2.0, Add style or stylesheet or template to Reporting Services reports

CSS for reports

On a related note, having the ability to style reports with CSS would be nice: Add support for CSS and branding in report builder

Intellisense in the expression editor

It would be nice to know my code is right before I run it. Here is a Connect item that includes that: Report Builder Improvements

Embedding RB for ISVs

If you are developing apps, Report Builder is good but doesn’t cut it when you want to embed it. SQL Server Reporting Services Report Builder enhancement

Allowing users to save reports to My Folder

Currently you have to develop your own My Folder: report-builder-access-to-my-reports-folder

Parameterised report models

Support for table valued functions/SPs in the report model would be great, so that you aren’t forced to return all the data to Reporting Services. The most common use is to encapsulate some security logic so users can’t access data they shouldn’t. SQL 2012: report model accept parameterised inputs (reporting services)

Sorting of parameters

In today’s age the parameter options are, I think, pretty poor, which is why I prefer to roll my own. The fact you can’t even sort the parameters is a bit of a joke: Sorting parameters for reports developed with report builder

If you want something changed then make a suggestion at http://connect.microsoft.com/sql

 

Posted by simonsabin | 4 comment(s)

Azure uses certificates to manage secure access to the management side of things. I’ve started looking into Azure for some SQLBits work, and in doing so I needed to configure certificates to connect. VS will nicely create a cert for you and copies a path for you to use to upload to Azure. I went to the Azure portal (https://windows.azure.com/Default.aspx), found the certificate bit and added the certificate.

image

However after doing this I couldn’t connect.

After trying numerous fruitless things, including looking round the portal for other places to put certs, I realised I must be missing something.

So I went back to the portal and had another look, and what did I find but a “Management Certificates” option.

image

I added the certificate generated by Visual Studio (the connection wizard that pops up when you try to deploy stuff allows you to generate a cert) and now I can connect.

So there is a page on the Home section of the portal that takes you to http://msdn.microsoft.com/en-us/library/gg441575.aspx. What’s odd is that you would expect the Management Certificates section to be the first option, as that is what is required to manage things.

Posted by simonsabin | 2 comment(s)

Something I’m very conscious of when delivering a BI solution is being able to show the facts in a concise way but also not hide what’s going on. I was reminded of this when I looked at the weather today. Everywhere they are reporting weather warnings for the south east, so I thought I’d check on the BBC website http://news.bbc.co.uk/weather/forecast/4281?area=AL5

image

Looking at that I thought we were going to miss the worst of it, just like a few weeks ago. However, from previous experience with the BBC site I thought I’d look at the hourly breakdown, because the way they summarise the day’s events can be interesting, and is something I don’t agree with.

 


image

If you look at the hourly breakdown you will see that from 15:00 to possibly 21:00 there will be heavy snow. Is it therefore valid to summarise a day that has possibly 6 hours of heavy snow as “Light snow”?

In their defence, it’s not like revenue per hour where you can just do an average; there isn’t an average function for weather. But whilst that might sound silly, it’s not too far off from producing a summary for a report in business. How do you represent the health of your business in one single value, picture or chart? It’s very difficult. You have to find some way of representing financial status, employee status, growth, etc. Doing a simple average isn’t possible, and that’s why the development of KPIs for a business is important.

What’s also important, and is highlighted by the weather example here, is making sure important information isn’t lost when you summarise. Does the summary of “light snow” mean that on average over the day it’s going to be “light snow”, or is that the worst case, or is it that for the majority of the day it’s going to be “light snow”? Those astute amongst you will recognise the first and last as the mean and mode averages you learnt in school. They are different ways of trying to summarise without hiding detail in a single value. I’m a fan of using standard deviation (but calling it something simpler for the users) as that indicates how accurate an average is.

Consider profit. You might have a new area of the business that has a profit of 20% but is only 1% of the business, while other areas have 5% profit. If you just display an average of profit growth it might be 1%, but this average has hidden the fact that your new area is actually now making a loss. Because of the distribution of the business, the weighting in the average has hidden some important information. This is the same as the weather situation: “Light snow” might be acceptable to go out in, but heavy snow means you need to stay indoors.
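To put some purely illustrative numbers on that (these figures are made up, not taken from the example above): suppose the established part of the business is 99% of revenue at a 5% margin, and the new area is 1% of revenue but has swung to a -20% margin. The weighted average is 0.99 × 5% + 0.01 × (-20%) ≈ 4.75%, a figure that still looks perfectly healthy and gives no hint that part of the business is losing money.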

What’s great about the tools today is that it’s very easy to use multiple visualisation tricks so important detail isn’t lost. Bubble charts can display more than one value, and the use of graphics allows multiple pieces of information to be displayed.

Interestingly, if you look at the weather summary they have decided not to display an average temperature but rather a best and worst case; odd, given that an average of temperature is much easier than an average of weather. It would be much better IMHO for the weather summary to have something that displayed the best and worst, maybe a sun, a rain drop and a snowflake where the size indicates the intensity, with a figure indicating the duration.

This is one area that I feel the Microsoft BI tools don’t excel in, and I hope that with the introduction of Crescent that will change.

Whatever the situation it looks like I’ll be building a snow man sometime this weekend.

Posted by simonsabin | 2 comment(s)

I’ve blogged before about Getting more than 1024 columns on a table, which is done by using sparse columns. Whilst this is potentially useful for people with insane table designs, sparse columns aren’t just for this. My experience over the past few years has shown that sparse columns are useful in almost all databases where you have columns that are largely null, i.e. sparse.

A recent client was able to reduce the size of a table by 60% by changing columns to sparse. The saving comes from the storage engine not using any space, even for fixed size columns, when a column is null, although on the flip side a little more space is used when a value is stored. The trick is to look at the ratio of null values to non-null values. There is a break-even point where using sparse doesn’t pay off, but in my experience every database has some tables with int or datetime columns that are largely null.

So have a look at your tables and, where you see lots of nulls, consider using sparse columns.
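As a rough first pass you can count how many rows are null for a candidate column. This is just a sketch; dbo.SomeTable and LastLoginDate are placeholder names, not anything from a real system.

-- what proportion of rows are null for this column?
select TotalRows   = count(*)
      ,NullRows    = sum(case when LastLoginDate is null then 1 else 0 end)
      ,NullPercent = 100.0 * sum(case when LastLoginDate is null then 1 else 0 end) / count(*)
  from dbo.SomeTable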

To change a column to sparse just issue an alter table statement.

ALTER TABLE <table> ALTER COLUMN <column> <datatype> SPARSE

However, be careful: this causes considerable data movement, and it may be quicker to create a new table with the correct column definitions and then do an insert into…select statement. Once copied you can rename the tables and add the relevant constraints. This is likely to be quicker as the copy can be a minimally logged operation (if the recovery model is simple or bulk logged). A sketch of that approach is below.
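Something like the following sketch, where dbo.MyTable, Id and LastLogin are placeholder names; the TABLOCK hint is what allows the insert into a heap to be minimally logged.

-- create the new table with the mostly-null column declared sparse up front
create table dbo.MyTable_New (
   Id        int      not null
 , LastLogin datetime sparse null
  )

-- copy the data; with TABLOCK (and simple/bulk logged recovery) this can be minimally logged
insert into dbo.MyTable_New with (tablock) (Id, LastLogin)
select Id, LastLogin
  from dbo.MyTable

-- once the copy is verified, swap the names and re-create constraints and indexes
exec sp_rename 'dbo.MyTable', 'MyTable_Old'
exec sp_rename 'dbo.MyTable_New', 'MyTable'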

One more note: this huge space saving only applies to fixed width data types, such as int, bigint, datetime, numeric, decimal and float.

Following an email on a DL I decided to look at implementing a logon restriction system to prevent users from logging on at certain times of the day. The poster had a solution but wanted to add auditing. I immediately thought of my post on logging messages during a transaction, because I knew that part of the logon trigger functionality is that you roll back the connection. I therefore assumed you had to do the logging as I describe in that post (otherwise the logging wouldn’t persist beyond the rollback).

So I jumped in and developed a logon trigger solution to test the theory. Oh, how wrong can I be; this was definitely an RTFM situation. If only I had done that first: what I found out from reading BOL is that after you issue the rollback in the logon trigger you can then issue any insert/update/delete and the data will be persisted. Logon Trigger Execution Guidelines explains that,

“The current trigger continues to execute any remaining statements that appear after the ROLLBACK statement. If any of these statements modify data, the modifications are not rolled back.”

So with that resolved I still thought it useful to post the trigger code that allows you to prevent users from logging in.

First we need a table to define our restrictions

create table LoginHours (
   Login varchar(100)
 , LoginStart time
 , LoginEnd time
 , Days int
  )

I made the decision that the table would also allow different days to have different times; however, to make it easier to maintain, multiple days can share the same time restriction by using a bitmask for the days. So the Days column can be any combination of 1, 2, 4, 8, 16, 32 and 64 for Monday through to Sunday respectively. To compare against this just use the following logic

 days & @DayOfWeek = @DayOfWeek

where @DayOfWeek represents the current day and can be calculated as follows

Power(2,datepart(dw,getdate())-1)
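As a quick worked example (the value 31 used here is just a hypothetical Monday-to-Friday mask, not a value from the table above):

-- with datefirst 1 (as the trigger later sets) Monday = 1, so e.g. Wednesday gives @DayOfWeek = 4
set datefirst 1
declare @DayOfWeek int = Power(2,datepart(dw,getdate())-1)

-- a Days value of 31 (1+2+4+8+16) covers Monday to Friday,
-- so if today is Wednesday then 31 & 4 = 4 and the restriction row matches
select MatchesToday = case when 31 & @DayOfWeek = @DayOfWeek then 1 else 0 end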

The next step is to put some data in.

One tip when testing logon triggers: it is always advisable to have at least two sysadmin accounts, one of which you never put any restrictions on, and to make sure you always have a few open query windows connected as a sysadmin, in case you get your boolean logic wrong and block everyone from connecting.

insert into LoginHours (Login, LoginStart, LoginEnd, Days)
  values ('<domain>\Simon','9:00','10:00',1+2+4+8+16)

This allows my user account to log in only between 9:00 and 10:00, Monday to Friday. It would be nice to have enum support in SQL but alas no. There is a Connect item “Allow literals and read only table data to be represented as enums” and I’ve written an article about it before for SQL Server Central, “Enums in SQL Server”.

Now we need something to tell us if someone can log in or not. We could embed this logic in the trigger; however, that makes debugging difficult. It’s much better to use a function which can be called outside of the trigger. In this case I’ve created a base function and two other functions (effectively overloads).

create function udf_GetLoginRestrictionCountForLoginSecure (@UserName nvarchar(128), @DayOfWeek int)
returns table
return (
select HasLoginRestrictions = count(1)
      ,IsAllowedAtTheseHoursCount = sum(
          case when LoginStart <= cast(getdate() as time)
                 and LoginEnd >= cast(getdate() as time)
                 and Days & @DayOfWeek = @DayOfWeek then 1
               else 0
           end)
 from dbo.LoginHours
where @UserName like Login )
go

 

create function udf_GetLoginRestrictionCountForCurrentLoginAndTime ()
returns table
return (
select HasLoginRestrictions
      ,IsAllowedAtTheseHoursCount
 from dbo.udf_GetLoginRestrictionCountForLoginSecure(suser_sname(), Power(2,datepart(dw,getdate())-1)))
go

 

create function udf_GetLoginRestrictionCountForTime (@username nvarchar(128))
returns table
return (
select HasLoginRestrictions
      ,IsAllowedAtTheseHoursCount
 from dbo.udf_GetLoginRestrictionCountForLoginSecure(@username, Power(2,datepart(dw,getdate())-1)))

These allow you to test whether someone has a valid record or not. I’ve also taken a step to simplify the system so users with no restrictions don’t have to be put in the table: no record indicates no restriction. The Days column makes this work because, to prevent a user from accessing the system at all, you put a 0 in the Days column. This won’t match any day, and because a record exists the user will be restricted from logging in.

The other feature I added was to use LIKE. This means you can put a wildcard in the table to restrict, for instance, all users with James in their login. It is probably more useful for restricting whole domains, as sketched below.
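For example, something like this (the domain name is purely hypothetical) would refuse every login from an old domain, because a Days value of 0 never matches any day:

insert into LoginHours (Login, LoginStart, LoginEnd, Days)
  values ('OLDDOMAIN\%', '00:00', '23:59', 0)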

You could make this as complex as you like, for example always allowing sysadmins to connect.
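That isn’t in the trigger below, but one minimal way to do it would be to bail out near the top of the trigger, before the restriction check, with something like:

-- let members of the sysadmin role straight in, regardless of any LoginHours rows
if is_srvrolemember('sysadmin', original_login()) = 1
    return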

To test the function, call it as follows. It returns two columns: the first, HasLoginRestrictions, indicates whether any restrictions are in place, and the second indicates whether any records exist that allow access at this time.

select *
from udf_GetLoginRestrictionCountForLoginSecure('ajoes', Power(2,datepart(dw,getdate())-1))

We need a table to audit our failed logins. Some information will be put in the error log, but not much. You could raise severity 21 (or higher) errors to put additional information in the error log, but that’s not advised as it will bloat your error log with far too many messages for just one failed login (2 entries for the failed login and 2 for each severity 21 error raised).

create table LoginFailedAudit (LoginFailedAuditId int identity(1,1)
                              ,Audit xml)

Now we have the data and the audit table, we need the trigger.

This does a number of things. Firstly, it is specified to EXECUTE AS SELF, which means it executes as the person that created the trigger. This is essential to allow use of the functions; if you don’t do that you need to grant SELECT on the functions to PUBLIC, which is a security risk. Secondly, the SET DATEFIRST ensures that Monday is always the first day of the week. Thirdly, note the use of original_login(): this is used to get the login that made the connection; if we didn’t use this we would get the person that created the trigger, because we are executing the trigger as SELF. Finally, we use XML to combine all the data from the sessions and connections DMVs and put it in the audit table.

NOTE: the rollback is done very early so that anything that occurs afterwards is not rolled back.

create trigger tr_loginAudit
on all server
with execute as self
for logon
as
set datefirst 1 -- ensures consistency for Monday being 1 in the lookup table

declare @HasLoginRestrictions int
       ,@IsAllowedAtTheseHoursCount int

select @HasLoginRestrictions = HasLoginRestrictions
      ,@IsAllowedAtTheseHoursCount = IsAllowedAtTheseHoursCount
from udf_GetLoginRestrictionCountForTime(original_login())

if (@HasLoginRestrictions >= 1 -- one or more restriction rows exist for this login
   and @IsAllowedAtTheseHoursCount = 0)
    begin
    --This indicates to the server that the login is not allowed
    rollback

    --build a message
    declare @message xml
    set @message = (select
                           (select session_id              [@session_id]
                                  ,most_recent_session_id  [@most_recent_session_id]
                                  ,connect_time            [@connect_time]
                                  ,net_transport           [@net_transport]
                                  ,protocol_type           [@protocol_type]
                                  ,protocol_version        [@protocol_version]
                                  ,endpoint_id             [@endpoint_id]
                                  ,encrypt_option          [@encrypt_option]
                                  ,auth_scheme             [@auth_scheme]
                                  ,node_affinity           [@node_affinity]
                                  ,num_reads               [@num_reads]
                                  ,num_writes              [@num_writes]
                                  ,last_read               [@last_read]
                                  ,last_write              [@last_write]
                                  ,net_packet_size         [@net_packet_size]
                                  ,client_net_address      [@client_net_address]
                                  ,client_tcp_port         [@client_tcp_port]
                                  ,local_net_address       [@local_net_address]
                                  ,local_tcp_port          [@local_tcp_port]
                                  ,connection_id           [@connection_id]
                                  ,parent_connection_id    [@parent_connection_id]
                                  ,most_recent_sql_handle  [@most_recent_sql_handle]
                             from sys.dm_exec_connections
                            where session_id = @@spid
                            for xml path('connection'), type)
                          ,(select  original_login_name   [@login]
                                  , host_process_id       [@host_id]
                                  , host_name             [@host]
                                  , program_name          [@program]
                                  , session_id            [@spid]
                                  , login_time            [@time]
                                  , client_interface_name [@client_interface_name]
                            from sys.dm_exec_sessions s
                            where session_id = @@spid
                            for xml path('session'), type)
                    for xml path('LoginAudit'), type)

    insert into LoginFailedAudit(audit) values (@message)
    end

I hope you find this useful and I will be interested to hear any comments.

The log viewer in Management Studio is really neat; however, if you have large log files or are accessing a remote server over a slow connection it can take a long time to load the log records.

Generally you only need the last x records, so you don’t need to load all the records.

It would be great to have a cancel button to allow us to cancel the loading of the log records in SQL Server Management Studio.

As an aside, one of the best features in SQL 2005 was the ability to cancel your connection attempt when starting a new connection in Management Studio. In SQL 2000 Query Analyzer and Enterprise Manager you had to wait 30 seconds if you put in the wrong password or server name; from SQL 2005 onwards you can click cancel straight away.

If you agree it would be useful to cancel the loading of log records then please vote

https://connect.microsoft.com/SQLServer/feedback/details/627731/allow-the-user-to-cancel-the-reading-of-a-log-in-log-viewer

Posted by simonsabin | 2 comment(s)

There has been so much news coming out of MS lately, and one item that does seem to have gone by with very little noise is Federation in SQL Azure:

http://player.microsoftpdc.com/Session/591d586f-3732-4bff-8ee2-857f27d74df4

This is a fascinating feature that enables you to spread a database across multiple nodes. Sharding is another term for this, and it is one of the main reasons people like the NoSQL movement.

It will be fascinating to see whether this federation will start to appear in the main SQL Server product.

Posted by simonsabin | 1 comment(s)