solution for what you want to do: copy the contents of one table to
another table while performing data transformations. You can use a VBScript
or JScript task to perform the complex transformation. The DTS package
is a standard SQL Server approach; it can be contained within the database
and can be scheduled via a job.
"nick" <nick@.discussions.microsoft.com> wrote in message
news:0B329B5A-860D-475B-BC24-DF284BEEE547@.microsoft.com...
> In fact, my question is:
> a Transact-SQL "DECLARE CURSOR" (fast forward) script vs. client-side ADO
> code with a fast forward server cursor.
> Which one is better for a very large number of rows?
>
> "nick" wrote:
Thanks. In fact, each row of the very big table provides the parameters for a
complex C++ function written by others. The source code is available, but I
had better not touch the C++ code since the author may upgrade it. The C++
code will return quite a few values for each row, and these values need to be
written to a SQL table. The process is simple but the data amount is big.
Maybe use JScript in DTS to call an executable? But need to parse the
input/output...
"JT" wrote:
> After thinking about it more, I believe that DTS would be the most appropriate
> solution for what you want to do: copy the contents of one table to
> another table while performing data transformations. You can use a VBScript
> or JScript task to perform the complex transformation. The DTS package
> is a standard SQL Server approach; it can be contained within the database
> and can be scheduled via a job.
> "nick" <nick@.discussions.microsoft.com> wrote in message
> news:0B329B5A-860D-475B-BC24-DF284BEEE547@.microsoft.com...
>
What you just described is not really 'data transformation', but some type
of data entry that needs to be automated using a large list of parameters
supplied by a 3rd party. In that case, rather than DTS, I would suggest
writing a client-side application that:
#1 Opens a forward-only / read-only recordset from the source table
#2 Steps through the recordset from top to bottom (not really looping)
#3 For each record, calls the C++ executable using parameters from the record
#4 Writes the result of each call to a tab-delimited text file
#5 Once the last record has been processed, closes the recordset and uses
the bulk copy utility (BCP.EXE) to load the tab-delimited file into the
destination table.
#6 Truncates the source table?
I would not recommend running the executable on the database server, because
it would provide only a marginal performance benefit but result in security,
deployment, and resource utilization issues.
Read up on the 'bcp utility' in SQL Server Books Online.
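As a rough sketch of the loading step: once the client application has written the results to a tab-delimited file, it could be loaded with the bcp command line or, from T-SQL, with a BULK INSERT statement along these lines (the file path and destination table name here are hypothetical placeholders, not from the thread):

```sql
-- Hypothetical names: dbo.DestinationTable and C:\results.txt
-- Load the tab-delimited output file produced by the client application
BULK INSERT dbo.DestinationTable
FROM 'C:\results.txt'
WITH (
    FIELDTERMINATOR = '\t',  -- columns separated by tabs
    ROWTERMINATOR = '\n'     -- one record per line
);
```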
The following article describes bulk loading with an emphasis on
optimizing performance, but may provide more detail than you need.
SQL Server 2000 Incremental Bulk Load Case Study
[url]http://www.microsoft.com/technet/prodtechnol/sql/2000/maintain/incbulkload.mspx[/url]
"nick" <nick@.discussions.microsoft.com> wrote in message
news:BCA25342-6927-4416-9F0F-F9CCA2E76FA5@.microsoft.com...
> Thanks. In fact, each row of the very big table provides the parameters
> for a complex C++ function written by others. The source code is available,
> but I had better not touch the C++ code since the author may upgrade it.
> The C++ code will return quite a few values for each row, and these values
> need to be written to a SQL table. The process is simple but the data
> amount is big.
> Maybe use JScript in DTS to call an executable? But need to parse the
> input/output...
> "JT" wrote:
Thanks. The existing solution is:
#1 A "wrapper" (extended stored procedure) to call the C++ routines.
#2 A stored procedure which declares a fast forward cursor over the large
table, fetches every row, and calls the extended stored procedure with the
fetched values as parameters.
I am not sure about the C++ code and worry about memory leaks, etc.; that's
the reason I am s

both 3rd party code and extended SP) is good, what's the issue of the existing
approach? Performance/resource usage?
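For reference, the existing approach described above might look roughly like the following T-SQL sketch (the table, columns, and extended stored procedure name are invented for illustration; only the FAST_FORWARD cursor pattern itself is from the thread):

```sql
-- Sketch of the existing approach; dbo.BigTable, columns a/b, and
-- xp_CallCpp are hypothetical placeholders.
DECLARE @a INT, @b INT;

DECLARE cur CURSOR FAST_FORWARD FOR
    SELECT a, b FROM dbo.BigTable;

OPEN cur;
FETCH NEXT FROM cur INTO @a, @b;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Extended stored procedure wrapping the 3rd-party C++ routine
    EXEC master.dbo.xp_CallCpp @a, @b;
    FETCH NEXT FROM cur INTO @a, @b;
END;
CLOSE cur;
DEALLOCATE cur;
```

Because the extended stored procedure runs in-process with SQL Server, a memory leak or crash in the C++ code affects the server itself, which is one reason the client-side alternative above was suggested.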
"JT" wrote:
> What you just described is not really 'data transformation', but some type
> of data entry that needs to be automated using a large list of parameters
> supplied by a 3rd party. In that case, rather than DTS, I would suggest
> writing a client-side application that:
> #1 Opens a forward-only / read-only recordset from the source table
> #2 Steps through the recordset from top to bottom (not really looping)
> #3 For each record, calls the C++ executable using parameters from the record
> #4 Writes the result of each call to a tab-delimited text file
> #5 Once the last record has been processed, closes the recordset and uses
> the bulk copy utility (BCP.EXE) to load the tab-delimited file into the
> destination table.
> #6 Truncates the source table?
> I would not recommend running the executable on the database server, because
> it would provide only a marginal performance benefit but result in security,
> deployment, and resource utilization issues.
> Read up on the 'bcp utility' in SQL Server Books Online.
> The following article describes bulk loading with an emphasis on
> optimizing performance, but may provide more detail than you need.
> SQL Server 2000 Incremental Bulk Load Case Study
> [url]http://www.microsoft.com/technet/prodtechnol/sql/2000/maintain/incbulkload.mspx[/url]
> "nick" <nick@.discussions.microsoft.com> wrote in message
> news:BCA25342-6927-4416-9F0F-F9CCA2E76FA5@.microsoft.com...