> We are using a stored procedure to do "if exists, update, else insert"
> processing for each record.
Consider loading the data in batches into a temporary table and then
using a single INSERT statement to insert the new records and a single
UPDATE statement to update the existing ones. This way, you avoid
calling the stored procedure once per record.
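A minimal sketch of that temp-table approach, assuming pyodbc and an
invented customers(cust_id, name, balance) table; the connection string,
table, and column names are illustrative only, not from the thread:

import pyodbc  # assumption: any SQL Server DB-API module would work the same way

conn = pyodbc.connect("DRIVER={SQL Server};SERVER=myserver;"
                      "DATABASE=mydb;Trusted_Connection=yes")
cur = conn.cursor()

# 1. Stage the whole batch in a temporary table.
cur.execute("CREATE TABLE #staging (cust_id INT PRIMARY KEY, "
            "name VARCHAR(50), balance MONEY)")
batch = [(1, "Smith", 10.0), (2, "Jones", 20.0)]   # rows parsed from the binary file
cur.executemany("INSERT INTO #staging (cust_id, name, balance) VALUES (?, ?, ?)",
                batch)

# 2. One UPDATE for the rows that already exist in the target table.
cur.execute("""
UPDATE c
SET    c.name = s.name, c.balance = s.balance
FROM   customers c
JOIN   #staging  s ON s.cust_id = c.cust_id
""")

# 3. One INSERT for the rows that do not exist yet.
cur.execute("""
INSERT INTO customers (cust_id, name, balance)
SELECT s.cust_id, s.name, s.balance
FROM   #staging s
WHERE  NOT EXISTS (SELECT 1 FROM customers c WHERE c.cust_id = s.cust_id)
""")

conn.commit()

The point is that the server does the existence check once, set-wise,
instead of once per row.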
The utility is designed to run in the background and maintain/update a
parallel copy of a production system database. We are using a stored
procedure to do "if exists, update, else insert" processing for each
record.
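A minimal sketch of that per-record "if exists, update, else insert"
pattern, assuming pyodbc; the procedure, table, and column names are
placeholders for illustration, not the real schema:

import pyodbc  # assumption: any SQL Server DB-API module works similarly

conn = pyodbc.connect("DRIVER={SQL Server};SERVER=myserver;"
                      "DATABASE=mydb;Trusted_Connection=yes")
cur = conn.cursor()

# Hypothetical upsert procedure: update the row if it exists, insert it otherwise.
cur.execute("""
CREATE PROCEDURE upsert_customer @cust_id INT, @name VARCHAR(50), @balance MONEY AS
IF EXISTS (SELECT 1 FROM customers WHERE cust_id = @cust_id)
    UPDATE customers SET name = @name, balance = @balance WHERE cust_id = @cust_id
ELSE
    INSERT INTO customers (cust_id, name, balance) VALUES (@cust_id, @name, @balance)
""")

# One procedure call, and therefore one round trip, per record read from the source.
for cust_id, name, balance in read_source_records():   # hypothetical reader function
    cur.execute("{CALL upsert_customer (?, ?, ?)}", (cust_id, name, balance))
conn.commit()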
The originating database is a series of keyed ISAM files. So we need
to read
[EMAIL PROTECTED]
> I have a program that reads records from a binary file and loads them
> into an MS-SQL Server database. It is using a stored proc, passing the
> parameters.
[snip]
> So my question is
> Is there a "faster" method I can use to connect to the SQL server ?
> Or does anyone have any "optimization" tips they can offer ?
[EMAIL PROTECTED] wrote:
> Is there a "faster" method I can use to connect to the SQL server ?
> Or does anyone have any "optimization" tips they can offer ?
This has nothing to do with Python, but the fastest way to load a large
amount of data into an MS SQL Server database is a DTS import from a
flat file.
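DTS packages are normally designed in Enterprise Manager or run with
dtsrun rather than driven from Python, but the same flat-file bulk path
can be sketched in script with BULK INSERT (bcp is the command-line
equivalent). Everything below (module, share path, table name) is
illustrative, not from the post:

import csv
import pyodbc  # assumption: any SQL Server DB-API module would do

# 1. Write the parsed records to a delimited flat file the server can reach
#    (BULK INSERT reads the file from the server's point of view, hence the UNC path).
rows = [(1, "Smith", 10.0), (2, "Jones", 20.0)]   # rows parsed from the binary file
with open(r"\\myserver\share\customers.csv", "w", newline="") as f:
    csv.writer(f, delimiter="|").writerows(rows)

# 2. Load the whole file in a single bulk operation.
conn = pyodbc.connect("DRIVER={SQL Server};SERVER=myserver;"
                      "DATABASE=mydb;Trusted_Connection=yes")
cur = conn.cursor()
cur.execute(r"""
BULK INSERT customers_staging
FROM '\\myserver\share\customers.csv'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n')
""")
conn.commit()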