Hi, I have been experiencing a very weird problem.
I have SQL Server 2012 SP1 running on an IBM server and another instance running on a PC; both machines run Windows Server 2008 R2. The IBM server is the more powerful machine.
I developed a program in C++ using ADO to insert tens of millions of rows into one table. The weird problem is that the program runs dramatically faster on the PC than on the IBM server: inserting ten thousand records takes 8 seconds on the PC but 60 seconds on the IBM server. SELECT queries are also faster on the PC than on the IBM server, but not by much.
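For reference, here is a simplified sketch of the kind of per-row ADO insert my program issues (the connection string, table, and values below are placeholders, not the real code):

#include <comdef.h>
// Adjust the path to msado15.dll for your machine.
#import "C:\Program Files\Common Files\System\ado\msado15.dll" rename("EOF", "adoEOF")

int main()
{
    ::CoInitialize(NULL);
    try
    {
        ADODB::_ConnectionPtr conn(__uuidof(ADODB::Connection));
        conn->Open(L"Provider=SQLNCLI11;Data Source=MYSERVER;"
                   L"Initial Catalog=MyDb;Integrated Security=SSPI;",
                   L"", L"", ADODB::adConnectUnspecified);

        // One round trip per row; each INSERT is its own auto-committed transaction.
        for (int i = 0; i < 10000; ++i)
        {
            conn->Execute(L"INSERT INTO employee (name, age) VALUES ('John', 30)",
                          NULL, ADODB::adCmdText | ADODB::adExecuteNoRecords);
        }
        conn->Close();
    }
    catch (const _com_error& e)
    {
        // e.Description() holds the provider error text.
    }
    ::CoUninitialize();
    return 0;
}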
I do believe the IBM server has better performance than the PC: when I use a SQL script in SSMS to insert ten thousand records, the IBM server takes only 1 second, which is much faster, while the PC takes 4.5 seconds. The script is very simple, something like this:
DECLARE @i INT = 0, @name VARCHAR(50) = 'John', @age INT = 30;  -- placeholder values
WHILE (@i < 10000)
BEGIN
    INSERT INTO employee (name, age)
    VALUES (@name, @age);
    SET @i = @i + 1;
END
All configurations on both machines are the same and are left at their defaults.
When I monitor the inserts in SQL Server Profiler, everything is the same except the Duration: it is 5 on the IBM server and 0 on the PC. Writes are 0 and Reads are 20 in both cases. CPU usage is low on both machines, around 24%. While the program is running, the data transfer rate (I/O) is 600 KB to 1 MB per second on the IBM server and 2 MB to 4 MB per second on the PC.
Why is the insert slow when I run it from my program, but fast when I use a script in SSMS?
Has anyone run into the same issue? Any suggestions?