DBunn
Starting Member
14 Posts
Posted - 2002-08-02 : 16:19:05
I've been trying to process some info in the following manner:

1) Transfer records from table A to table B (different structures), transforming the data in each field with various CONVERTs, concatenations, date format changes, etc. The only common thread is that I'm transforming every field to text. I'm using a standard Transform Data Task for this step. Pretty straightforward and successful.

2) After the transfer, I need to update a field in the original table for each record that I moved, as verification that the new records really are in table B. Here's where I'm having the problem. I'm trying to do this with a Data Driven Query Task and have been unsuccessful. I've tried many different methods of using the DDQT to specify the records to update, but I guess I'm just not hitting the right combination.

Unfortunately, I don't have a key field to make this easy. I'm picking this up from another developer's original design :-( so I'm actually having to match up various fields through the same transformations I originally used. Yes, it's quite a mess, but it has to be done.

I'm working from SQL Server 2000 table to SQL Server 2000 table right now for testing, but ultimately I'll be moving from a SQL Server 2000 table to an AS/400 file.

Anyone out there got any advice?
Garth
SQLTeam Author
119 Posts
Posted - 2002-08-02 : 19:12:01
Could you use one SQL Task to mark the records as "in-process" before the transfer and another to mark them as "processed" after the transfer?

Garth
www.SQLBook.com
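For what it's worth, Garth's two-task pattern might look something like this sketch. The table and column names (TableA, xfer_status) are hypothetical; the thread's real source table would first need a status column added:

```sql
-- One-time change: add a hypothetical status column to the source table
ALTER TABLE TableA ADD xfer_status char(1) NULL
GO

-- Execute SQL task #1 (before the transfer): mark the batch as in-process
UPDATE TableA SET xfer_status = 'I' WHERE xfer_status IS NULL

-- ...the Transform Data Task then copies rows WHERE xfer_status = 'I'...

-- Execute SQL task #2 (after the transfer, behind an On Success
-- workflow step): mark the same batch as processed
UPDATE TableA SET xfer_status = 'P' WHERE xfer_status = 'I'
```

Because the second task sits behind an On Success constraint, rows only get flagged 'P' when the transfer step actually completed.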
DBunn
Starting Member
14 Posts
Posted - 2002-08-03 : 09:52:58
Before they are processed, the field that I need to update is empty, which is the criterion for processing the record. After they are moved to the new structure, I would like to verify that they arrived and update the field with an 'M' for Moved.

My problem lies in the details of using the Data Driven Query task to update a field in one table based on matching all of the fields of the original 'destination' table (since I have no specific key). When it all boils down, I'm having problems figuring out the details of the DDQT, such as: which table is the Source and which is the Binding? Do I match the fields in the Source query, in the transformations with the Binding table, or in the update query? The online and book examples I've found either don't address my issue, or maybe I'm just being dense about it.

I'm getting ready to chuck it all and set up an ADO recordset to walk through and just check each row myself, but isn't that the whole reason for the DDQT? Frustrating.
jasper_smith
SQL Server MVP & SQLTeam MVY
846 Posts
Posted - 2002-08-03 : 11:56:20
I wouldn't use a DDQT for this - for one thing, it's pretty slow, and it can be rather confusing (as you are finding). If this is a DTS package to export data from a table to a file, then you can simply use a SQL task to update your source table, using OPENROWSET to connect to the produced file and an inner join to update the table. Another option is to use the multiphase Insert Success stage of the transform to run an ActiveX script that updates the table after each row is written to the file, but again, this will slow down the export.

Here's an example of what I'm talking about using the Northwind database:

use Northwind
go
alter table Customers
add written_to_file char(1) NULL
go
/*
Create a DTS package to export the Customers table
to c:\dts\customers.txt, then add an Execute SQL task
that follows the export with the following statement
*/
UPDATE cust
SET cust.written_to_file = 'Y'
FROM Northwind.dbo.Customers cust
JOIN
(SELECT F1 AS 'CustomerID' FROM
OPENROWSET('Microsoft.Jet.OLEDB.4.0',
'Text;Database=c:\dts','select F1 from customers#txt')) as txt
ON cust.CustomerID = txt.CustomerID

HTH
Jasper Smith

Having said all that - the file creation will work or it won't as a whole, so if all you're doing is selecting from a table (or tables), you can have an On Success workflow step after the file creation that updates the table(s) with the same SELECT statement you used to get the data.

Edited by - jasper_smith on 08/04/2002 11:46:37
DBunn
Starting Member
14 Posts
Posted - 2002-08-05 : 10:33:37
Thanks, Jasper. I'm getting away from the DDQT and am really intrigued by the Insert Success phase direction. My package will run every 15 minutes and grabs a small number of records each time, so I'm not too concerned about speed.

I have a related question. Is it possible to open an ADO connection based on an existing DTS connection? I've been looking at executing an UPDATE SQL statement against the source table, and the examples use a static, hard-coded ADO connection string.
jasper_smith
SQL Server MVP & SQLTeam MVY
846 Posts
Posted - 2002-08-05 : 15:39:43
That's a very interesting question. However, off the top of my head, I'd say you can't mix and match your DTS Package object and ADO stuff. You can certainly make the package dynamic - using global variables, the Dynamic Properties task, INI files, lookup tables, etc. for connection string information if you don't want to hard-code it. Still, it has inspired me to have a play with the idea.

HTH
Jasper Smith
DBunn
Starting Member
14 Posts
Posted - 2002-08-06 : 08:35:42
Maybe it's a good thing, maybe it's a curse, but I'm bound and determined to figure out a challenge. Here's what I came up with, given an existing DTS connection named 'MyConn':
_______________________________________________

dim objADOConn
dim strID

Function PreSourceMain()
    dim objPkg
    dim objConn
    dim strADOConn
    set objPkg = DTSGlobalVariables.Parent
    set objConn = objPkg.Connections("MyConn")
    strADOConn = "DRIVER={SQL SERVER};SERVER=" & objConn.DataSource _
        & ";DATABASE=" & objConn.Catalog & ";UID=" & objConn.UserID
    set objADOConn = CreateObject("ADODB.Connection")
    objADOConn.Open strADOConn
    set objConn = nothing
    PreSourceMain = DTSTransformStat_OK
End Function

Function Main()
    '***Do transformations and other brilliant things,
    '   but grab the ID of the row for later***
    strID = DTSSource("MyID")
    Main = DTSTransformStat_OK
End Function

Function InsertSuccessMain()
    dim strSQL
    strSQL = "Update MyTable Set MyPostFld = 'M' Where " _
        & "MyID = '" & strID & "'"
    objADOConn.Execute strSQL
    InsertSuccessMain = DTSTransformStat_OK
End Function

Function PumpCompleteMain()
    objADOConn.Close
    set objADOConn = nothing
    PumpCompleteMain = DTSTransformStat_OK
End Function
_________________________________________________

Hope this is useful. - DB
jasper_smith
SQL Server MVP & SQLTeam MVY
846 Posts
Posted - 2002-08-06 : 08:44:34
Good stuff - that's kind of where I got as well. Great minds...

HTH
Jasper Smith
DBunn
Starting Member
14 Posts
Posted - 2002-08-06 : 10:28:02
Spoke too soon. The code above works great... if you don't have to use a password for the connection. I moved this to my prototype production environment and could not grab the connection password. I guess it's write-only, which makes sense from a security perspective.

Anyway, back to the drawing board.
jasper_smith
SQL Server MVP & SQLTeam MVY
846 Posts
Posted - 2002-08-06 : 11:24:06
Any reason you are not using NT Authentication? This package will be run by a job locally on the server, right?

HTH
Jasper Smith
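If NT Authentication is an option, the connection-string problem in the earlier ActiveX script goes away entirely: the SQL Server ODBC driver accepts Trusted_Connection=Yes in place of UID/PWD. A sketch, reusing the hypothetical objects from the 'MyConn' script above:

```vbscript
' Build a trusted-connection string - no password needed
strADOConn = "DRIVER={SQL SERVER};SERVER=" & objConn.DataSource _
    & ";DATABASE=" & objConn.Catalog & ";Trusted_Connection=Yes"
set objADOConn = CreateObject("ADODB.Connection")
objADOConn.Open strADOConn
```

The caveat is that the package then runs under the SQL Server Agent job's account, which must have rights to the databases involved.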
DBunn
Starting Member
14 Posts
Posted - 2002-08-07 : 06:04:54
Unfortunately, the package will be run on the production server on a schedule. To be able to create ADO connections, an ID/password combo is required. As described on www.sqldts.com, a connection's password was changed in SQL 2000 to be write-only, so without access to a password from another source, no joy.

I've tried a lot of workarounds, including switching gears and trying to load the transferred record IDs into a temp table so I can run a SQL task after the transfer, but I find I need a password to create the temp table. I've also been able to store the IDs in an array and plop that into a global variable, but I can't figure out how to later join against the array to update the original records.

At this point, I've already spent a large number of precious hours on this, so I've convinced the project manager that we need a source for our ID/password info so I can use the ADO connection. Other developers are saying they need the info as well, so it all works out in the end.

Jasper, thanks for your help.
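For reference, the staging-table workaround described above (had the permissions allowed it) would reduce to something like this. TransferredIDs, TableA, MyID, and MyPostFld are hypothetical names:

```sql
-- Permanent staging table, created once in advance
-- (avoids needing rights to create temp objects at run time)
CREATE TABLE TransferredIDs (MyID varchar(20) NOT NULL)
GO

-- Execute SQL task after the transfer: flag every source row whose ID
-- was written to the staging table during the transfer
UPDATE a
SET a.MyPostFld = 'M'
FROM TableA a
JOIN TransferredIDs t ON a.MyID = t.MyID

-- Empty the staging table for the next 15-minute run
DELETE FROM TransferredIDs
```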
rman107
Starting Member
1 Post
Posted - 2002-09-12 : 02:37:34
My 2c... For your username/password problem, you could store the username/password in the registry and then use global variables to retrieve them into DTS. You'll have to write or use a COM component to make the registry calls and set that in DTS. The only caveat is that clear-text passwords will be stored in the registry, which may not be an option for you; in that case, you might have to use an encrypt/decrypt routine at the moment you retrieve the password from the registry.
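One note on the COM-component suggestion: plain VBScript in an ActiveX Script task can already read the registry through WScript.Shell, so a custom component may not be needed. A sketch - the key path and value names here are made up, and the clear-text caveat above still applies:

```vbscript
' Read stored credentials from a hypothetical registry key
dim objShell, strUID, strPWD
set objShell = CreateObject("WScript.Shell")
strUID = objShell.RegRead("HKLM\Software\MyApp\DTS\UserID")
strPWD = objShell.RegRead("HKLM\Software\MyApp\DTS\Password")
set objShell = nothing
```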
Tim
Starting Member
392 Posts
Posted - 2002-09-22 : 22:16:24
You can get a disconnected ADO recordset from a DTS connection. Use an Execute SQL task with an existing DTS connection and pass back the entire result set in a global variable. This global variable is actually a disconnected ADO recordset. You can then loop through it in an ActiveX task and do whatever's needed for each record.

From BOL:

quote: You can populate multiple global variables with data in one pass with the Execute SQL task. Entire rowsets also can be saved in a single global variable and accessed as a disconnected ActiveX Data Objects (ADO) recordset. For more information, see Execute SQL Task.

The help topic for the Execute SQL task has an example of setting up the global variable as a result set, plus the VBScript to access it, e.g.:

dim countr
dim RS
set RS = DTSGlobalVariables("Authors").value
for countr = 1 to RS.RecordCount
    MsgBox "The author ID is " & RS.Fields("au_id").value
    RS.MoveNext
Next
Main = DTSTaskExecResult_Success

Hope it helps

----
Nancy Davolio: Best looking chick at Northwind 1992-2000