Discussion: Query regarding Bulk Import
Hi, I'm a new user of Postgres. My database was previously in SQL Server, and I now want to perform a BULK IMPORT of the data from SQL Server to Postgres. Can you please suggest which tools might be used for this purpose? Note that the data in SQL Server is more than 10 GB in size.

Thanks & Regards,
Dheeman Dutta
Application Developer
Web Spiders (India) Pvt. Ltd.
51B, Justice Chandra Madhav Road
Calcutta 700 020. India
Phone: 91.33.2474.3581 to 85
Fax: 91.33.2474.3500
http://www.webspiders.com/
(ISO 9001:2000 and IC 9700 Certified. Microsoft Gold Certified Partner)
Calcutta (India), Birmingham (UK), Toronto (Canada), San Francisco (USA)

Disclaimer: The opinions expressed within this message are those of the author and not necessarily those of the firm. The information contained in this message is intended only for the recipient, may be privileged and confidential and protected from disclosure. If the reader of this message is not the intended recipient, or an employee or agent responsible for delivering this message to the intended recipient, please be aware that any dissemination or copying of this communication is strictly prohibited. If you have received this communication in error, please immediately notify us by replying to the message and deleting it from your computer or any other device.

"We look at things not as they are, but as we are"
I'd use DTS.

Thank You
Sim Zacks
CIO
CompuLab
04-829-0145 - Office
04-832-5251 - Fax
Hi Sim / Dheeman

DTS is designed for scripts that batch more complex tasks. It is best not to do any transformation if your aim is a straight bulk copy. In my experience, T-SQL's BULK INSERT gives the best speed, at least for a few hundred megabytes of tab-delimited file, and I always copy the file locally to the server first. It shouldn't really matter where you trigger the BULK INSERT from.

Other tricks to consider are whether you can compress or abbreviate any verbose fields before dumping them, and then recalculate the uncompressed version after upload. Be careful to be explicit about date formats when dumping to text. I tend to go with floats printed in the default format to save space rather than fixing the decimal places - some rounding differences will be unavoidable.

Best Regards,
James

P.S. Reference - an excerpt from Microsoft's site about DTS:

Using Bulk Insert and bcp
=========================
The Bulk Insert task creates and executes the Transact-SQL BULK INSERT statement. BULK INSERT, supported by the Microsoft OLE DB Provider for SQL Server, is significantly faster than bcp or the data pump for performing text file import operations. Therefore, if transformations are not used, use the Bulk Insert task and achieve faster throughput.

See http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dtssql/dts_usage_7703.asp

-----Original Message-----
From: pgadmin-support-owner@postgresql.org [mailto:pgadmin-support-owner@postgresql.org] On Behalf Of Sim Zacks
Sent: 18 April 2005 09:26
To: Dheeman - Web Spiders India
Cc: pgadmin-support@postgresql.org
Subject: Re: [pgadmin-support] Query regarding Bulk Import

---------------------------(end of broadcast)---------------------------
TIP 2: you can get off all lists at once with the unregister command
(send "unregister YourEmailAddressHere" to majordomo@postgresql.org)
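To illustrate James's point about being explicit with dates (and leaving floats in their default format), a minimal T-SQL sketch of a dump query - the table and column names here are hypothetical, not from the thread:

```sql
-- Hypothetical dump query; 'orders', 'order_id', 'order_date', 'amount'
-- are illustrative names only.
-- CONVERT style 120 produces 'yyyy-mm-dd hh:mi:ss', an unambiguous
-- ISO-like format that Postgres can parse directly on COPY.
SELECT
    order_id,
    CONVERT(varchar(19), order_date, 120) AS order_date,  -- explicit date format
    amount                                                -- float, default format
FROM orders;
```

The result set can then be written to a tab-delimited file (via bcp or an ADODB recordset, as discussed below) and loaded into Postgres with COPY.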
James,

I was unaware that you could bcp data from SQL Server to PostgreSQL. I will have to look into that for the future.

Thank You
Sim Zacks
CIO
CompuLab
04-829-0145 - Office
04-832-5251 - Fax
Hi James/Sim,

Thanks for the immediate support regarding the BULK IMPORT.

Thanks & Regards,
Dheeman Dutta
Application Developer
Web Spiders (India) Pvt. Ltd.

"Success is not about winning every battle, it is about winning the war"

-----Original Message-----
From: James Prichard [mailto:james.prichard@markit.com]
Sent: Monday, April 18, 2005 6:14 PM
To: pgadmin-support@postgresql.org
Cc: Sim Zacks; dheeman.dutta@webspiders.com
Subject: RE: [pgadmin-support] Query regarding Bulk Import
The Postgres equivalent of the Microsoft command:

Microsoft T-SQL
===============
BULK INSERT 'database_name.owner.table_name'
FROM 'data_file'
WITH (FIRSTROW=2)

(I like to store the column header names in row 1.)

Postgres
========
COPY tablename [ ( column [, ...] ) ]
FROM 'filename'

Both of these work with tab-delimited text flat files. Dropping indices and avoiding locking and logging overheads is very important to getting data in fast for Microsoft SQL Server, and presumably for Postgres. I've not looked at DTS for dumping text from Microsoft. For my smaller database, an ADODB SELECT query and writing the recordset to a text file was quick enough.
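Putting the two commands together, a minimal end-to-end sketch - the database, table, column names, and file paths are hypothetical, and it assumes the target table already exists on each side:

```sql
-- SQL Server side: load from a tab-delimited file, skipping the header
-- row stored in row 1. FIELDTERMINATOR and TABLOCK are optional extras;
-- TABLOCK reduces locking overhead during the bulk load.
BULK INSERT mydb.dbo.customers
FROM 'C:\dumps\customers.txt'
WITH (FIRSTROW = 2, FIELDTERMINATOR = '\t', TABLOCK);

-- Postgres side: server-side COPY reads the file on the server machine.
-- COPY has no FIRSTROW equivalent for tab-delimited files, so the header
-- row must be stripped from the file before loading.
COPY customers (customer_id, name, created_at)
FROM '/data/customers.txt'
WITH DELIMITER '\t' NULL AS '';
```

Note that psql's client-side \copy variant accepts the same options but reads the file from the client machine, which avoids having to place the dump file on the database server.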