Hi All,

Another infrequent blog post here.

I have had a problem with trying to get an SSIS package running as a scheduled job on SQL Server. Everything appeared well - even down to executing the package from within SSIS on the server - no errors (well, once the developers had sorted some issues out, as per normal). But on scheduling the package I was receiving:

Date  17/06/2009 16:58:08
Log  Job History (testjob)

Step ID  1
Server  Servername

Job Name  testjob
Step Name  test
Duration  00:00:02
Sql Severity  0
Sql Message ID  0
Operator Emailed  
Operator Net sent  
Operator Paged  
Retries Attempted  0

Executed as user: INTERNAL\MSSQLService. ...0.4035.00 for 64-bit  Copyright (C) Microsoft Corp 1984-2005. All rights reserved.    Started:  16:58:08  Error: 2009-06-17 16:58:10.05     Code: 0xC0202009     Source: OA_DG5Admin_CreateEPIFILE_integration Connection manager "RELATEPI"     Description: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80040154.  An OLE DB record is available.  Source: "Microsoft OLE DB Service Components"  Hresult: 0x80040154  Description: "Class not registered".  End Error  Error: 2009-06-17 16:58:10.06     Code: 0xC020801C     Source: Data Flow Task OLE DB Destination [5716]     Description: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER.  The AcquireConnection method call to the connection manager "RELATEPI" failed with error code 0xC0202009.  There may be error messages posted before this with more information on why the AcquireConnection method call failed.  End Error  Error: 2...  The package execution fa...  The step failed.

I theorised that this could be down to an interoperability issue between the package and the 64-bit environment. The package was interfacing with flat files and an Access database, and I remembered having an issue a while back on something similar (developing on the server) when trying to interface with an Excel spreadsheet.

I found patches of the solution scattered around, mainly in forum posts, but nothing in-depth. So I thought I would post it up here and hopefully help some other people if they have this problem.

 First off - an overview

Scheduling an SSIS package in a job on 64-bit SQL Server 2005 causes the 64-bit DTEXEC.EXE to be instantiated to execute the package. The parameters passed can be seen if you connect to SSIS in SSMS, right-click the package and select Run Package.


You will then be presented with the "Execute Package Utility" - select the last option, "Command Line" - this shows the parameters that will be passed to a SQL job used to run the package when selecting an SSIS job step.
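The exact string depends on your package, but the generated command line looks something like the following - the server and package names here are made up for illustration, so expect your own values instead:

```
/DTS "\MSDB\MyPackage" /SERVER MyServerName /MAXCONCURRENT " -1 " /CHECKPOINTING OFF /REPORTING V
```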


This information will be used later.

If you are running 64-bit - and if you are reading this I suppose you are - I would suggest taking a look at your file system. You will find two SQL Server directories under your install location - one looks standard, the other has (x86) in the path. The (x86) directory is where the 32-bit versions of the SQL Server files are located; the other directory holds the 64-bit versions and is the one used by default.
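Assuming a default installation, the two DTEXEC locations look like this - the first is the 64-bit version that jobs call by default, the second is the 32-bit version we want:

```
C:\Program Files\Microsoft SQL Server\90\DTS\Binn\DTExec.exe
C:\Program Files (x86)\Microsoft SQL Server\90\DTS\Binn\DTExec.exe
```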

So - now to set up a job.

The secret to getting this working is, when creating the job, not to use the "SQL Server Integration Services Package" step type but the CMDEXEC step type.

It is here that you can force the SQL Agent to call the 32-bit version of DTEXEC.EXE.

To get the executable path, just use the location of DTEXEC within the SQL Server (x86) directories (usually \90\DTS\Binn\DTExec.exe), and to get the parameters, just copy them from the Command Line option of the Execute Package Utility (as shown above).
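Putting the two halves together, the command for the CMDEXEC step ends up looking something like this - the install path, server name and package name are illustrative, so substitute your own:

```
"C:\Program Files (x86)\Microsoft SQL Server\90\DTS\Binn\DTExec.exe" /DTS "\MSDB\MyPackage" /SERVER MyServerName /MAXCONCURRENT " -1 " /CHECKPOINTING OFF /REPORTING V
```

Because this runs as an operating system command rather than an SSIS job step, the SQL Agent has no say in which DTEXEC gets used - it simply runs whatever executable you point it at, which is the 32-bit one.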


Oh - and one last bit of advice - be careful what your mind types - I put DTSEXEC.EXE (note the S) and encountered another problem that had me banging my head for another 10 minutes.

Enjoy!


It was a post by GrumpyDBA that caused me to post my first and only blog on here a long time ago. Now I am posting this entry because I am feeling like a grumpy DBA.

My current client - a government agency who takes money off everybody (that should identify them - I have a mortgage to pay) - has a lab for testing new analysis software. They wanted some manual processes automated, including moving file data around for processing by various applications. Well, I hadn't previously looked at SSIS in great depth as I hadn't really had the requirement to, so I thought here was my chance to get some experience. Well, all I can say is: come back DTS - all is forgiven.

I will give it its due: some things are good, but there is a lot of bad. In fact, every way I turn at the moment I am encountering bugs, "features" and "by design". The latest being this: https://connect.microsoft.com/SQLServer/feedback/ViewFeedback.aspx?FeedbackID=252471

So I thought I would work round it by excluding the two tables that had this issue (scripting them separately outside the wizard as indicated) - and then another table failed that does not meet that criteria in any shape or form. So on to SSIS, using the Transfer SQL Server Objects task. Well, failure after failure about creating an account that already exists - even though I hadn't selected the security options.

I have also been hit by issues of file permissions after detaching databases - the link to the secure FTP site and the password for the hotfix are sitting in my inbox, although I have worked round it using SUBINACL.
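For what it's worth, the kind of SUBINACL usage that takes ownership back on detached database files looks like this - the file paths and account are made-up examples, so substitute your own:

```
subinacl /file "D:\Data\MyDatabase.mdf" /setowner=DOMAIN\DBAUser
subinacl /file "D:\Data\MyDatabase_log.ldf" /setowner=DOMAIN\DBAUser
```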

I feel really guilty - my next gig starts next week, and hopefully the main piece of work will be completed by then.

Hi All,

Well, here I am - my first blog. It was a post from GrumpyOldDBA that prompted me to join and start blogging.

The link to it is here


It is not the whole document that is of interest, but the comment right at the bottom in relation to SAN configuration.

I am a contract DBA and earn my crust on several different types of contract work, from pure support (pays the bills) through to project work. I also get assignments to go to customers' sites to carry out performance tuning on their systems.

What I have found repeatedly is the number of companies that have swallowed the salesman's words, invested in a SAN and then loaded all their systems onto it. More often than not, the disks are allocated to a RAID 5 set, with logical disks carved out for each system. They look at me funny when I start by issuing a disclaimer saying that, whatever enhancements and recommendations I make, the SAN will be an issue unless they are prepared to rip it apart and start again. Funny, nobody has taken me up on that offer yet!

There are other issues with using SANs. One previous customer had really bought into the SAN setup, running web servers, databases, mail - in fact, everything in the server room was connected to the SAN. Now, fortunately for them, their databases were not heavily used, with a low volume of transactions (just to clarify, this assignment was just providing backfill DBA cover), and they were not experiencing any performance issues in the area of the databases.

But what they were suffering from was availability issues. I started this section by stating that they had bought into SANs in a big way - even going as far as having all the machines boot from the SAN. This is what I call "putting all your eggs into one basket". It ran lovely, but then, all of a sudden, everything disappeared. Everything went down! This then continued to happen 3-4 times a week. They had no way of troubleshooting the problem because all their servers booted from the SAN, so the logs and everything else were on the SAN, and this had disappeared too!

When my contract was over, they were still having this issue and were testing updated drivers with the manufacturer of the SAN.

I hope they learnt their lesson from this. As I said above, I call this "putting all your eggs in one basket" - not because of what it appears to do, but because, when that basket is dropped, all your eggs get smashed.