
Recommended Posts

Posted

Guys,

 

Big shots, help me out a little, will you..

 

I have a problem I'm trying to solve where I have to potentially insert hundreds of thousands of rows of data at a time into up to 10 or 12 tables. My concern is how best to set it up as a set-based operation while minimizing the risk of the process failing midstream. Knowing that I need to generate a lot of contrived keys, I am concerned that if the process fails I will not be able to recover or restore, and will lose a lot of potential primary keys.

 

I was thinking there may be a way to partition the inserts across the multiple tables and commit more frequently to minimize the damage if the process fails, while at the same time building in restartability so it can pick up where it left off once the failure is corrected (roughly the pattern sketched below).

 

Help me out, man...
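
Roughly the pattern I have in mind, as a T-SQL-flavored sketch; every object name here (load_checkpoint, staging, target_one, batch_id) is made up:

-- Records the last committed slice per target table, so a failed run
-- can be re-run and resume after the last good batch.
CREATE TABLE dbo.load_checkpoint (
    table_name    sysname NOT NULL PRIMARY KEY,
    last_batch_id int     NOT NULL DEFAULT 0
);
INSERT INTO dbo.load_checkpoint (table_name) VALUES ('target_one');

DECLARE @batch int =
    (SELECT last_batch_id FROM dbo.load_checkpoint WHERE table_name = 'target_one');

WHILE EXISTS (SELECT 1 FROM dbo.staging WHERE batch_id > @batch)
BEGIN
    SET @batch += 1;

    BEGIN TRAN;

    INSERT INTO dbo.target_one (pk_id, col_a)
    SELECT s.pk_id, s.col_a
    FROM dbo.staging AS s
    WHERE s.batch_id = @batch;        -- one slice of rows per transaction

    UPDATE dbo.load_checkpoint        -- progress commits atomically with the data,
    SET last_batch_id = @batch        -- so a re-run resumes at the next slice
    WHERE table_name = 'target_one';

    COMMIT;                           -- small, frequent commits
END;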

Posted


 

If it's Oracle, you'd write it with BULK COLLECT into collections and SAVE EXCEPTIONS to catch the bad rows. Or just do direct inserts with the APPEND hint. Hundreds of thousands of rows are nothing to Oracle; it'll have them loaded in a couple of minutes.
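
Roughly like this in PL/SQL; just a sketch, with staging_rows and target_table as placeholder names:

DECLARE
    CURSOR src_cur IS
        SELECT pk_id, payload FROM staging_rows;
    TYPE row_tab IS TABLE OF src_cur%ROWTYPE;
    l_rows      row_tab;
    bulk_errors EXCEPTION;
    PRAGMA EXCEPTION_INIT(bulk_errors, -24381);  -- ORA-24381: error(s) in array DML
BEGIN
    OPEN src_cur;
    LOOP
        FETCH src_cur BULK COLLECT INTO l_rows LIMIT 10000;  -- work in slices
        EXIT WHEN l_rows.COUNT = 0;
        BEGIN
            FORALL i IN 1 .. l_rows.COUNT SAVE EXCEPTIONS
                INSERT INTO target_table VALUES l_rows(i);
        EXCEPTION
            WHEN bulk_errors THEN
                -- bad rows are skipped rather than killing the batch; log them
                FOR j IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
                    DBMS_OUTPUT.PUT_LINE(
                        'row ' || SQL%BULK_EXCEPTIONS(j).ERROR_INDEX || ' failed: '
                        || SQLERRM(-SQL%BULK_EXCEPTIONS(j).ERROR_CODE));
                END LOOP;
        END;
        COMMIT;  -- a failure costs at most one slice
    END LOOP;
    CLOSE src_cur;
END;
/

Or the direct-path route, one set-based statement:

INSERT /*+ APPEND */ INTO target_table
SELECT pk_id, payload FROM staging_rows;
COMMIT;  -- direct-path loaded blocks can't be read back until you commit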

 

No idea about SQL Server, though.

Posted


 

You're a god, sir.. you could spare a small suggestion for SQL Server too, couldn't you?

Posted


 

Never mind words like that; going by what you've said:

 

You're getting a dataset in; do you need to load it into the 10-12 tables simultaneously? Or will you be inserting into one table with joins/ETL transformations, then moving from there into another table, and so on, until all the tables are done?

 

I'm thinking it's the second method. If so, and you're under 1 million rows, just load everything with your ETL transforms/joins and hold off on commits until you complete the final step. SQL Server is industrial-standard too; it shouldn't fail for such a small number. E.g., if the transformation covers 300k rows, the most you'll hold is 300k rows × 12 tables = 3.6 million rows in flight after transformations; it should be able to cope with that.
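
In T-SQL, that "hold the commit until the final step" shape is roughly this; a sketch only, with invented object names:

BEGIN TRY
    BEGIN TRAN;

    INSERT INTO dbo.table_one (pk_id, col_a)
    SELECT s.pk_id, s.col_a
    FROM dbo.staging AS s;

    INSERT INTO dbo.table_two (pk_id, derived_val)
    SELECT t1.pk_id, t1.col_a * 2       -- stand-in for the real transformation
    FROM dbo.table_one AS t1;

    -- ... repeat for the remaining tables in dependency order ...

    COMMIT;  -- the only commit: nothing is visible until the last step succeeds
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0 ROLLBACK;  -- a midstream failure undoes the whole chain
    THROW;                        -- re-raise so the caller sees the real error
END CATCH;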
 

Posted



Ltt
Posted

Without a look at the keys, the columns, and the data... how are we supposed to write SQL, bro?


No need for code; a suggestion along the lines of andhra vayya's will do...
Posted



I need to load them simultaneously.
Posted


Use a scheduler, man. Write the commits into each table's load and kick all of them off simultaneously.

If any load fails, run a job that truncates the tables.

If truncating is too drastic, just delete instead. And if you want performance to hold up across re-runs, stats collection is compulsory.
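
The cleanup step could look roughly like this; a sketch, with invented names:

-- Reset a target after a failed run, then refresh statistics before re-running.
TRUNCATE TABLE dbo.table_one;  -- fastest full reset

-- If TRUNCATE is too drastic, delete just the failed run's rows instead,
-- assuming each row carries a run identifier (hypothetical column):
-- DELETE FROM dbo.table_one WHERE load_run_id = @failed_run;

UPDATE STATISTICS dbo.table_one;  -- fresh stats so the re-run doesn't plan badly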