January 2007 - Posts

Chuck Boyce, the "guy with the camera" for those who attended PASS, has just published 2 videos that he took at PASS.

The first is a short discussion between Ted Malone and Gert Drapers on agile development.

http://www.youtube.com/watch?v=AfhchcfzhKA



-
Posted by simonsabin | with no comments

One of the great aspects of Excel 2007 is the new functions. I have used COUNTIF and SUMIF numerous times and often found I wanted to use multiple criteria, i.e. sum if the revenue is for this month and for this store. To achieve this before 2007 you had to jump through numerous hoops and it all became very complex.

The great news is that the SUMIF and COUNTIF functions have been extended to take more than one criterion. These are implemented as SUMIFS and COUNTIFS, both of which can take up to 256 criteria (more than I can imagine anyone ever needing).

You can now produce very complex summary sheets very easily.
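
For example (a hypothetical layout, just to show the syntax), with order dates in column A, store names in column B and revenue in column C, a month-and-store total becomes a single formula:

=SUMIFS(C2:C1000, A2:A1000, ">="&DATE(2007,1,1), A2:A1000, "<"&DATE(2007,2,1), B2:B1000, "Store 1")

Note that, unlike SUMIF, the range to sum comes first. Every extra condition is just another range/criteria pair added on the end, and COUNTIFS works the same way without the sum range.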



-
Posted by simonsabin | with no comments

My raw file reader has now been updated. The following are the changes from the previous version:

  • Column headers can be included if more than one column is selected
  • Context menu to copy data from the grid
  • Can scan a package for raw files and will load each one of them
  • Will remember the last 10 packages used
  • Option to change the file extensions associated with raw files (thanks Julian)
  • Fixed some spelling mistakes
  • Application is now an MDI application allowing multiple raw files to be opened at once
  • Raw file detail windows can be cascaded or tiled as with all MDI windows
  • Enabled the application for ClickOnce deployment
I am working on the following:

  • Allow a folder to be monitored
  • Display of meta data for the columns, type, length etc.
  • Editing of data (not sure if possible yet)
  • Complete ClickOnce deployment
  • Reader explorer treeview to list open files, folders and packages
You can get the new version from here:

    http://sqlblogcasts.com/files/folders/ssis_tools/entry1528.aspx



    -

    Following feedback I have issued another version of the taskpad report. This now works with databases that have been created by users who have access to the SQL Server via group membership.

    You can download the report from http://sqlblogcasts.com/files/10/default.aspx



    -

    If you've used SSIS in anger you will have come across raw files. Unfortunately, when you store data in a raw file it is inaccessible. There is no tool available to view the data.

    Until now.

    I recently put together a tool that allows you to read the contents of the file. I like to think I've included some neat features, including the automatic reloading of the file when it changes. This means you can run your package, flip to the raw file reader, and the contents of your raw file will be there waiting for you.
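
    (As an aside, the usual way to get that kind of change notification in .NET is a FileSystemWatcher. The sketch below is purely illustrative, with a made-up path, and is not necessarily how the tool does it.)

    using System.IO;

    // Raise an event whenever the watched raw file is rewritten by a package run
    FileSystemWatcher watcher = new FileSystemWatcher(@"C:\RawFiles", "output.raw");
    watcher.NotifyFilter = NotifyFilters.LastWrite | NotifyFilters.Size;
    watcher.Changed += delegate(object sender, FileSystemEventArgs e)
    {
        Console.WriteLine("Raw file changed, reloading: " + e.FullPath);
    };
    watcher.EnableRaisingEvents = true;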

    The current feature set is:

    • Read any SSIS Raw file
    • Results displayed in a grid
    • Columns automatically resized
    • Order of columns can be changed
    • The results can be ordered by clicking on the column headings
    • You can choose which columns to display (will improve performance for large files)
    • Data can be copied from the results grid using CTRL+C
    • Any set of cells can be copied from the results grid by using the CTRL key
    • Column headers can be included or excluded from the copied data
    • Will remember the last 10 files used
    • Reading of data can be automatic when the columns are read or the data can be read manually once you have selected the columns you want
    • Column selection mechanism is similar to that in SSIS components
    • Notification when the currently loaded raw file has changed, i.e. the package is rerun
    • Option to automatically reload the file when it changes
    • Sizeable window to maximise viewing of data in the grid.

    To download the tool go to http://sqlblogcasts.com/files/folders/ssis_tools/default.aspx

    This is a version 1 product so I am looking for some feedback on what to do next. I have some ideas but would like to know what others want.



    -

    My team and I have recently been reviewing more CVs than you can shake a stick at for the roles we are trying to fill. This is a list of the top 10 things that make your CV work for you:

    1.      Don't make it too long.
    3 pages is a good size. I get bored easily.

    2.      Don't try to get to point 1 by using 5pt font.
    I don't want to have to use a magnifying glass.

    3.      Don't spend too much time on what the company does.
    I want to know about you, not the company.

    4.      Don't use terms (TLAs etc) that are specific to the company.
    I won't know what they mean.

    5.      Cover your employment in reverse order, i.e. current job first.
    I get bored easily; I might not make it to the bottom.

    6.      Highlight key terms.
    This allows me to focus on the areas of the CV that would interest me.

    7.      Have more than one CV if going for different roles.
    This makes sure that I think you are what I am after, and not a jack of all trades.

    8.      Sell yourself.
    Always have a blurb that makes you the best person for the job you want. Don't forget to do point 6.

    9.      Don't lie.
    It's not worth it.

    10.  Avoid lists of technologies and years of experience.
    It's a bugger to maintain and doesn't add a huge amount of value.
     

    The bottom line is: make it readable and easy to scan. Which is easier to read: a large article in a broadsheet newspaper that has many paragraphs, or a small article in a tabloid that covers one column? The latter requires little commitment and concentration; you can read it and don't have a chance to get bored.

    The same applies to a CV. If I have to read every line of a CV to find out what you have been doing then I will get bored and probably reject your CV.

    Other posts on CVs can be found here: How to make your CV stand out and CVs for IT jobs



    -
    Posted by simonsabin | 4 comment(s)

    I have put together a little utility to read raw files. Before I finish it off I would like some feedback.

    The current feature set is:

    • Read any SSIS Raw file
    • Results displayed in a grid
    • Columns automatically resized
    • Order of columns can be changed
    • The results can be ordered by clicking on the column headings
    • You can choose which columns to display (will improve performance for large files)
    • Data can be copied from the results grid using CTRL+C
    • Any set of cells can be copied from the results grid by using the CTRL key
    • Column headers can be included or excluded from the copied data
    • Will remember the last 10 files used
    • Reading of data can be automatic when the columns are read or the data can be read manually once you have selected the columns you want
    • Column selection mechanism is similar to that in SSIS components
    • Notification when the currently loaded raw file has changed, i.e. the package is rerun
    • Option to automatically reload the file when it changes
    • Sizeable window to maximise viewing of data in the grid.

    You can download it here: SQL Server 2005 Integration Services Raw file reader



    -

    This is a classic gotcha that caught me out and, thankfully, others on the newsgroups too.

    The object model for SSIS has an object called Package which represents a package.

    One would assume that to load a package from a file you should use the LoadFromXML method on this Package object.

    If that was the million dollar question and you agreed, you wouldn't be very rich.

    To load a Package from a dtsx file stored on the file system, you need to use the LoadPackage method of the Application object.

    You use the LoadFromXML method on the Package if your package is stored in memory, i.e. as a string or a stream.

    So be warned.
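
    A minimal sketch of the two approaches (the file path and variable names below are hypothetical, and error handling is omitted):

    using Microsoft.SqlServer.Dts.Runtime;

    Application app = new Application();

    // Package stored on disk as a .dtsx file: use Application.LoadPackage
    Package pkgFromFile = app.LoadPackage(@"C:\Packages\MyPackage.dtsx", null);

    // Package already held in memory as an XML string: use Package.LoadFromXML
    string packageXml = System.IO.File.ReadAllText(@"C:\Packages\MyPackage.dtsx");
    Package pkgFromString = new Package();
    pkgFromString.LoadFromXML(packageXml, null);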

     



    -

    Are you wanting to develop Integration Services packages using code? If so, then you will need to know the identifiers to use for each component.

    You have three ways to set the ComponentClassID:

    Directly, if you can remember them, i.e. myuComp.ComponentClassID = "DTSTransform.Merge.1";

    or by using the friendly name, i.e. myuComp.ComponentClassID = new Application().PipelineComponentInfos["Merge"].CreationName;

    or by referencing the assembly the component is in and doing

    myuComp.ComponentClassID = typeof(Microsoft.SqlServer.Dts.Pipeline.DataReaderSourceAdapter).AssemblyQualifiedName;
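
    To put those lines in context, here is a minimal sketch (SSIS 2005 object model; the variable names are just for illustration) that adds a Data Flow task to a package and creates a component by its ComponentClassID:

    using Microsoft.SqlServer.Dts.Runtime;
    using Microsoft.SqlServer.Dts.Pipeline.Wrapper;

    Package package = new Package();

    // Add a Data Flow task and get hold of its pipeline
    TaskHost taskHost = (TaskHost)package.Executables.Add("STOCK:PipelineTask");
    MainPipe pipeline = (MainPipe)taskHost.InnerObject;

    // Create a component and identify it by its ComponentClassID
    IDTSComponentMetaData90 myuComp = pipeline.ComponentMetaDataCollection.New();
    myuComp.ComponentClassID = "DTSTransform.Merge.1";

    // Let the underlying component set up its inputs, outputs and properties
    myuComp.Instantiate().ProvideComponentProperties();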

    For your reference I've dumped the list of names and ComponentClassIDs from PipelineComponentInfos here.

    Name ComponentClassId Description
    Merge DTSTransform.Merge.1 Merges two sorted datasets.
    Merge Join DTSTransform.MergeJoin.1 Merges data from two datasets by using a join.
    Lookup DTSTransform.Lookup.1 Looks up values in a reference dataset by using exact matching.
    Audit DTSTransform.Lineage.1 Adds audit information to rows in a dataset.
    Row Sampling DTSTransform.RowSampling.1 Creates a sample dataset by extracting a number of rows from a dataset.
    Dimension Processing MSMDPP.PXPipelineProcessDimension.1 Dimension Processing
    Term Extraction DTSTransform.TermExtraction.1 Extracts terms from data in a column.
    Data Mining Model Training MSMDPP.PXPipelineProcessDM.1 Data Mining Model Training
    OLE DB Source DTSAdapter.OLEDBSource.1 Extracts data from a relational database by using an OLE DB provider.
    Character Map DTSTransform.CharacterMap.1 Applies string operations to character data.
    Fuzzy Grouping DTSTransform.GroupDups.1 Groups rows in a dataset that contain similar values.
    Percentage Sampling DTSTransform.PctSampling.1 Creates a sample dataset by extracting a percentage of rows from a dataset.
    Excel Destination DTSAdapter.ExcelDestination.1 Loads data into an Excel workbook.
    Pivot DTSTransform.Pivot.1 Pivots a dataset to create a less normalized representation of the data.
    Data Mining Query MSMDPP.PXPipelineDMQuery.1 Data Mining Query
    SQL Server Destination DTSAdapter.SQLServerDestination.1 Loads data into a SQL Server database.
    Conditional Split DTSTransform.ConditionalSplit.1 Evaluates and directs rows in a dataset.
    Slowly Changing Dimension DTSTransform.SCD.1 Updates a slowly changing dimension.
    Flat File Source DTSAdapter.FlatFileSource.1 Extracts data from flat files.
    Term Lookup DTSTransform.TermLookup.1 Counts the frequency that terms in a reference table appear in a dataset.
    Copy Column DTSTransform.CopyMap.1 Copies columns.
    Derived Column DTSTransform.DerivedColumn.1 Updates column values using expressions
    Fuzzy Lookup DTSTransform.BestMatch.1 Looks up values in a reference dataset by using fuzzy matching.
    Flat File Destination DTSAdapter.FlatFileDestination.1 Loads data into a flat file.
    Partition Processing MSMDPP.PXPipelineProcessPartition.1 Partition Processing
    Import Column TxFileInserter.Inserter.1 Imports data from files to rows in a dataset.
    Excel Source DTSAdapter.ExcelSource.1 Extracts data from an Excel workbook.
    Data Conversion DTSTransform.DataConvert.1 Converts columns to different data types and adds the columns to the dataset.
    Union All DTSTransform.UnionAll.1 Merges multiple datasets.
    OLE DB Command DTSTransform.OLEDBCommand.1 Executes an SQL command for each row in a dataset.
    Unpivot DTSTransform.UnPivot.1 Creates a more normalized representation of a dataset.
    Export Column TxFileExtractor.Extractor.1 Exports column values from rows in a dataset to files.
    Row Count DTSTransform.RowCount.1 Counts the rows in a dataset.
    OLE DB Destination DTSAdapter.OLEDBDestination.1 Loads data into a relational database by using an OLE DB provider.
    Raw File Source DTSAdapter.RawSource.1 Extracts data from a raw file.
    Raw File Destination DTSAdapter.RawDestination.1 Indicates what option is used to write to the file.
    Multicast DTSTransform.Multicast.1 Creates copies of a dataset.
    Sort DTSTransform.Sort.1 Sorts data.
    Aggregate DTSTransform.Aggregate.1 Aggregates and groups values in a dataset.
    Recordset Destination DTSAdapter.RecordsetDestination.1 Creates and populates an in-memory ADO recordset.
    DataReader Source Microsoft.SqlServer.Dts.Pipeline.DataReaderSourceAdapter, Microsoft.SqlServer.ADONETSrc, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91 Extracts data from a relational database by using a .NET provider.
    DataReader Destination Microsoft.SqlServer.Dts.Pipeline.DataReaderDestinationAdapter, Microsoft.SqlServer.DataReaderDest, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91 Creates and populates an ADO.NET in-memory dataset.
    SQL Server Compact Edition Destination Microsoft.SqlServer.Dts.Pipeline.SqlCEDestinationAdapter, Microsoft.SqlServer.SqlCEDest, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91 Loads data into a SQL Server Compact Edition database.
    Script Component Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost, Microsoft.SqlServer.TxScript, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91 Executes a custom script.
    XML Source Microsoft.SqlServer.Dts.Pipeline.XmlSourceAdapter, Microsoft.SqlServer.XmlSrc, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91 Extracts data from an XML data file.

    You can do the same with the following code.

    using Microsoft.SqlServer.Dts.Runtime;

    // Dump the creation name, description and name of every registered pipeline component
    foreach (PipelineComponentInfo p in new Application().PipelineComponentInfos)
    {
        Console.WriteLine("{0};{1};{2}", p.CreationName, p.Description, p.Name);
    }

    For some more information have a look at this page http://msdn2.microsoft.com/en-us/library/ms135932.aspx



    -

    Did you know that, whilst the samples are included in the SQL Server install, as with Books Online the most up-to-date versions can be found online?

    Download details: SQL Server 2005 Samples and Sample Databases

    Some of these samples include the source code for programs. These often require a key file to function. The file is not included in the install and so you have to create one. This can be done simply by issuing the following at the command line.

    sn -k "C:\Program Files\Microsoft SQL Server\90\Samples\samplekey.snk"

    (assuming that you have installed the samples to the default location)

    Here is some more information you need to know about the samples:

    http://download.microsoft.com/download/d/8/6/d865cf0c-c44b-401b-b426-b3bf5c628112/GettingStartedWithSQLSamples.htm



    -
    Posted by simonsabin | with no comments