kirs149 Posted November 5, 2015: Guys, can the experienced folks here lend a hand? I have a problem where I potentially have to insert hundreds of thousands of rows of data at a time into as many as 10 or 12 tables. My concern is how best to set this up as a set-based operation while minimizing the risk of the process failing midstream. Since I need to generate a lot of contrived keys, I am worried that if the process fails I will not be able to recover or restore, and will lose a lot of potential primary keys. I was thinking there may be a way to partition the inserts to the multiple tables and commit more frequently, to limit the damage from a mid-run failure while building in restartability, so the load can pick up where it left off once the failure is corrected. Help me out, folks...
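For illustration, the batch-and-commit-with-restartability idea the poster describes could look something like the following T-SQL sketch. All of the names here (dbo.Staging, dbo.Target, dbo.LoadControl, StagingId) are hypothetical placeholders, not anything from the original post; it assumes the staging data has an ascending key that can serve as a restart watermark.

    -- Assumes: dbo.Staging (source) with an ascending key StagingId, dbo.Target,
    -- and a watermark table dbo.LoadControl(TableName, LastLoadedId). All hypothetical.
    DECLARE @BatchSize int = 50000, @LastId bigint, @MaxId bigint;

    -- Pick up where the previous run left off.
    SELECT @LastId = COALESCE(MAX(LastLoadedId), 0)
    FROM dbo.LoadControl WHERE TableName = 'Target';

    WHILE 1 = 1
    BEGIN
        -- Find the upper key boundary of the next batch.
        SELECT @MaxId = MAX(StagingId)
        FROM (SELECT TOP (@BatchSize) StagingId
              FROM dbo.Staging
              WHERE StagingId > @LastId
              ORDER BY StagingId) AS nextBatch;

        IF @MaxId IS NULL BREAK;  -- nothing left to load

        BEGIN TRANSACTION;

        INSERT INTO dbo.Target (Col1, Col2)
        SELECT Col1, Col2
        FROM dbo.Staging
        WHERE StagingId > @LastId AND StagingId <= @MaxId;

        UPDATE dbo.LoadControl
        SET LastLoadedId = @MaxId
        WHERE TableName = 'Target';

        COMMIT TRANSACTION;  -- rows and watermark commit together, so a rerun resumes safely

        SET @LastId = @MaxId;
    END;

Because each batch's rows and its watermark update commit in the same transaction, a failure mid-run loses at most one uncommitted batch, and rerunning the script resumes from the last committed boundary.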
kirs149 Posted November 5, 2015 (Author): Don't just throw your hands up, folks; give my brain something fresh to work with.
andhravodu Posted November 5, 2015: [quoting kirs149's original post above] If it's Oracle, you'd write it with BULK COLLECT into collections and trap the errors with SAVE EXCEPTIONS. Or just do direct-path inserts using the APPEND hint. Hundreds of thousands of rows are nothing for Oracle; it will load them in a couple of minutes. SQL Server, no idea.
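A minimal PL/SQL sketch of the BULK COLLECT / SAVE EXCEPTIONS pattern mentioned here, assuming hypothetical staging_table and target_table with matching columns (neither name is from the thread):

    DECLARE
      TYPE t_rows IS TABLE OF staging_table%ROWTYPE;  -- staging_table is a placeholder source
      v_rows    t_rows;
      bulk_errs EXCEPTION;
      PRAGMA EXCEPTION_INIT(bulk_errs, -24381);       -- ORA-24381: errors from FORALL ... SAVE EXCEPTIONS
      CURSOR c_src IS SELECT * FROM staging_table;
    BEGIN
      OPEN c_src;
      LOOP
        FETCH c_src BULK COLLECT INTO v_rows LIMIT 10000;  -- fetch in 10k-row chunks
        EXIT WHEN v_rows.COUNT = 0;
        BEGIN
          FORALL i IN 1 .. v_rows.COUNT SAVE EXCEPTIONS
            INSERT INTO target_table VALUES v_rows(i);
        EXCEPTION
          WHEN bulk_errs THEN
            -- Log the failed rows; the rest of the batch still went in.
            FOR j IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
              DBMS_OUTPUT.PUT_LINE('Row ' || SQL%BULK_EXCEPTIONS(j).ERROR_INDEX ||
                                   ' failed: ' || SQLERRM(-SQL%BULK_EXCEPTIONS(j).ERROR_CODE));
            END LOOP;
        END;
        COMMIT;  -- commit each chunk so a failure loses at most one batch
      END LOOP;
      CLOSE c_src;
    END;
    /

The APPEND-hint alternative applies to the set-based form rather than FORALL, e.g. INSERT /*+ APPEND */ INTO target_table SELECT * FROM staging_table; which uses direct-path loading.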
kirs149 Posted November 5, 2015 (Author): [quoting andhravodu's reply above] You're a god, sir.. but can't you give a small tip for SQL Server too?
andhravodu Posted November 5, 2015: [quoting kirs149's reply above] Never mind the flattery. Going by what you've said, you get one dataset; do you need to load it into the 10-12 tables simultaneously, or will you insert into one table with joins/ETL transformations, move from there into another table, and so on until all the tables are complete? I'm guessing it's the second method. If so, and it's under 1 million rows, just load everything with your ETL transforms/joins, and don't issue any commits until you complete the final step. SQL Server is an industry-standard product after all; it shouldn't fail for such a small volume. For example, with a 300k-row transformation, the most you'd hold is 300k rows times 12 tables, about 3.6 million rows in flight after transformations, and it should be able to cope with that.
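A minimal T-SQL sketch of this "no commits until the final step" approach; the table and column names are invented for illustration, not anything specified in the thread:

    SET XACT_ABORT ON;  -- any runtime error aborts and rolls back the whole batch
    BEGIN TRY
        BEGIN TRANSACTION;

        INSERT INTO dbo.Table1 (Id, Val)
        SELECT s.Id, s.Val FROM dbo.Staging AS s;

        INSERT INTO dbo.Table2 (Id, Derived)
        SELECT t1.Id, t1.Val * 2 FROM dbo.Table1 AS t1;

        -- ... repeat for the remaining tables ...

        COMMIT TRANSACTION;  -- the single commit: nothing persists unless every step succeeded
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
        THROW;  -- re-raise so the caller or job sees the failure
    END CATCH;

The trade-off versus frequent commits is that a failure anywhere rolls everything back cleanly, at the cost of holding the transaction (and its log space) open for the whole load.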
Srujana21 Posted November 5, 2015: [quoting kirs149's original post above] Ltt
150bryant Posted November 5, 2015: Without a look at the keys, the columns, and the data... how can we write the SQL, bro?
kirs149 Posted November 5, 2015 (Author): [quoting 150bryant's reply above] No need for code; a suggestion along the lines of andhravodu's is enough...
kirs149 Posted November 5, 2015 (Author): [quoting andhravodu's reply above] I have to load them simultaneously.
andhravodu Posted November 5, 2015: [quoting kirs149's reply above] Use a scheduler, man. Write each table's load with its own commits and call all of them simultaneously. If any load fails, run a truncate-tables job and rerun. If TRUNCATE is too much, just do a DELETE instead. And if you don't want performance to fall over, stats collection is compulsory.
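A rough T-SQL sketch of the truncate-and-reload cleanup plus stats refresh suggested here; table names are placeholders, and this would be the step a scheduler runs when any of the parallel loads fails:

    -- Cleanup before rerunning all the load jobs.
    TRUNCATE TABLE dbo.Table1;   -- fast and minimally logged; blocked by foreign keys,
    TRUNCATE TABLE dbo.Table2;   -- so fall back to DELETE if constraints get in the way
    -- DELETE FROM dbo.Table1;   -- the heavier alternative mentioned above

    -- After a successful reload, refresh optimizer statistics so query plans don't degrade.
    UPDATE STATISTICS dbo.Table1 WITH FULLSCAN;
    UPDATE STATISTICS dbo.Table2 WITH FULLSCAN;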
Sambadu Posted November 5, 2015: Do you run this during business hours?? Is there a downtime window or anything like that?