<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Performance issue while working with large files. in Application Development and Automation Discussions</title>
    <link>https://community.sap.com/t5/application-development-and-automation-discussions/performance-issue-while-working-with-large-files/m-p/10168028#M1821904</link>
    <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;I would suggest splitting the file and then processing each set.&lt;/P&gt;&lt;P&gt;Lock the table to ensure it stays available the whole time.&lt;/P&gt;&lt;P&gt;After each set, do a commit and then proceed. This ensures that a break in the middle does not force you to start again after first deleting the already-processed entries from the files.&lt;/P&gt;&lt;P&gt;Also make use of sorted tables and keys when deleting from or updating the database.&lt;/P&gt;&lt;P&gt;When a delete involves multiple entries, using an internal table can be tricky, as some records may be deleted successfully and some may not.&lt;/P&gt;&lt;P&gt;To make sure, first get the count of database records that match internal table set 1, then do the delete from the database with internal table set 1, and finally check the count again and verify that it is zero.&lt;/P&gt;&lt;P&gt;This makes sure all the records are deleted, though the extra checks add some overhead, and the goal here is to reduce the execution time.&lt;/P&gt;&lt;P&gt;Gurus may have a better idea.&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;Sree&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
    <pubDate>Tue, 25 Mar 2014 07:07:44 GMT</pubDate>
    <dc:creator>Former Member</dc:creator>
    <dc:date>2014-03-25T07:07:44Z</dc:date>
    <item>
      <title>Performance issue while working with large files.</title>
      <link>https://community.sap.com/t5/application-development-and-automation-discussions/performance-issue-while-working-with-large-files/m-p/10168025#M1821901</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;&lt;SPAN class="L0S52"&gt;Hello Gurus,&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN class="L0S52"&gt;I have to upload about 1 million keys from a CSV file on the application server and then delete the matching entries from a database table containing 18 million entries. This is causing performance problems and my program is very slow. Which approach is better?&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN class="L0S52"&gt;1. First read all the data from the CSV and then use a single DELETE statement?&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN class="L0S52"&gt;2. Or delete each line directly after reading the key from the file?&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN class="L0S52"&gt;Another program has to update about 2 million entries in a database table containing 20 million entries. Here I also have very big performance problems (the program has been running for more than 14 hours). What is the best way to work with such a large amount of data?&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN class="L0S52"&gt;I tried to rewrite the program so that it runs in parallel, but since this program will only run &lt;SPAN style="text-decoration: underline;"&gt;once&lt;/SPAN&gt;, the cost of implementing an aRFC parallelization is too high. Please help; maybe someone doing a migration is good at this.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN style="color: #575757;"&gt;Regards,&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN style="color: #575757;"&gt;Ioan.&lt;/SPAN&gt;&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Tue, 25 Mar 2014 06:21:57 GMT</pubDate>
      <guid>https://community.sap.com/t5/application-development-and-automation-discussions/performance-issue-while-working-with-large-files/m-p/10168025#M1821901</guid>
      <dc:creator>former_member205645</dc:creator>
      <dc:date>2014-03-25T06:21:57Z</dc:date>
    </item>
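A middle ground between the two options in the question is to read the CSV in packages and issue one mass DELETE per package. A rough ABAP sketch, with made-up names (ZBIG_TABLE for the 18-million-row table with key field KEYFIELD, P_FILE for the application-server file holding one key per line):

```abap
* Assumption: ZBIG_TABLE's primary key is the single field in the CSV.
DATA: ls_row  TYPE zbig_table,
      lt_rows TYPE STANDARD TABLE OF zbig_table.

OPEN DATASET p_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
DO.
  READ DATASET p_file INTO ls_row-keyfield.
  IF sy-subrc NE 0. EXIT. ENDIF.
  APPEND ls_row TO lt_rows.
* Flush in packages of 10,000 keys instead of one DELETE per line
* or one huge DELETE for all 1 million keys.
  IF lines( lt_rows ) GE 10000.
    DELETE zbig_table FROM TABLE lt_rows.  " only key fields are evaluated
    COMMIT WORK.
    CLEAR lt_rows.
  ENDIF.
ENDDO.
IF lt_rows IS NOT INITIAL.
  DELETE zbig_table FROM TABLE lt_rows.
  COMMIT WORK.
ENDIF.
CLOSE DATASET p_file.
```

The per-package COMMIT WORK keeps each database transaction small and makes the run restartable, at the cost of losing all-or-nothing semantics.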
    <item>
      <title>Re: Performance issue while working with large files.</title>
      <link>https://community.sap.com/t5/application-development-and-automation-discussions/performance-issue-while-working-with-large-files/m-p/10168026#M1821902</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Hi Ioan,&lt;/P&gt;&lt;P&gt;Schedule it in the background.&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;Yogendra Bhaskar&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Tue, 25 Mar 2014 06:51:44 GMT</pubDate>
      <guid>https://community.sap.com/t5/application-development-and-automation-discussions/performance-issue-while-working-with-large-files/m-p/10168026#M1821902</guid>
      <dc:creator>yogendra_bhaskar</dc:creator>
      <dc:date>2014-03-25T06:51:44Z</dc:date>
    </item>
    <item>
      <title>Re: Performance issue while working with large files.</title>
      <link>https://community.sap.com/t5/application-development-and-automation-discussions/performance-issue-while-working-with-large-files/m-p/10168027#M1821903</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;I have just tested the first program.&lt;/P&gt;&lt;P&gt;Deleting &lt;STRONG style="text-decoration: underline;"&gt;each line&lt;/STRONG&gt; directly &lt;STRONG style="text-decoration: underline;"&gt;after reading the key from the file&lt;/STRONG&gt; is definitely much faster.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Tue, 25 Mar 2014 07:05:48 GMT</pubDate>
      <guid>https://community.sap.com/t5/application-development-and-automation-discussions/performance-issue-while-working-with-large-files/m-p/10168027#M1821903</guid>
      <dc:creator>former_member205645</dc:creator>
      <dc:date>2014-03-25T07:05:48Z</dc:date>
    </item>
    <item>
      <title>Re: Performance issue while working with large files.</title>
      <link>https://community.sap.com/t5/application-development-and-automation-discussions/performance-issue-while-working-with-large-files/m-p/10168028#M1821904</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;I would suggest splitting the file and then processing each set.&lt;/P&gt;&lt;P&gt;Lock the table to ensure it stays available the whole time.&lt;/P&gt;&lt;P&gt;After each set, do a commit and then proceed. This ensures that a break in the middle does not force you to start again after first deleting the already-processed entries from the files.&lt;/P&gt;&lt;P&gt;Also make use of sorted tables and keys when deleting from or updating the database.&lt;/P&gt;&lt;P&gt;When a delete involves multiple entries, using an internal table can be tricky, as some records may be deleted successfully and some may not.&lt;/P&gt;&lt;P&gt;To make sure, first get the count of database records that match internal table set 1, then do the delete from the database with internal table set 1, and finally check the count again and verify that it is zero.&lt;/P&gt;&lt;P&gt;This makes sure all the records are deleted, though the extra checks add some overhead, and the goal here is to reduce the execution time.&lt;/P&gt;&lt;P&gt;Gurus may have a better idea.&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;Sree&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Tue, 25 Mar 2014 07:07:44 GMT</pubDate>
      <guid>https://community.sap.com/t5/application-development-and-automation-discussions/performance-issue-while-working-with-large-files/m-p/10168028#M1821904</guid>
      <dc:creator>Former Member</dc:creator>
      <dc:date>2014-03-25T07:07:44Z</dc:date>
    </item>
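The count / delete / re-count check described above could look roughly like this in ABAP (ZBIG_TABLE, KEYFIELD and LT_SET1 are placeholder names):

```abap
DATA lt_found TYPE STANDARD TABLE OF zbig_table.

* FOR ALL ENTRIES selects ALL rows when the driver table is empty,
* so guard against an empty set first.
CHECK lt_set1 IS NOT INITIAL.

* 1) Records currently matching internal table set 1.
SELECT * FROM zbig_table INTO TABLE lt_found
  FOR ALL ENTRIES IN lt_set1
  WHERE keyfield EQ lt_set1-keyfield.

* 2) Delete using set 1 (only the key fields are evaluated).
DELETE zbig_table FROM TABLE lt_set1.

* 3) Re-check: the same SELECT should now come back empty.
SELECT * FROM zbig_table INTO TABLE lt_found
  FOR ALL ENTRIES IN lt_set1
  WHERE keyfield EQ lt_set1-keyfield.
IF lt_found IS INITIAL.
  COMMIT WORK.
ELSE.
  ROLLBACK WORK.
ENDIF.
```

As the post notes, the two extra SELECTs cost time; checking SY-DBCNT right after the DELETE, which reports how many rows were removed, may be a cheaper alternative.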
    <item>
      <title>Re: Performance issue while working with large files.</title>
      <link>https://community.sap.com/t5/application-development-and-automation-discussions/performance-issue-while-working-with-large-files/m-p/10168029#M1821905</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Hello Ioan,&lt;/P&gt;&lt;P&gt;Query 1 -&lt;/P&gt;&lt;P&gt;Try forming a range table from the million uploaded entries; to build the range table you will need a loop. After that, use a single DELETE statement.&lt;/P&gt;&lt;P&gt;Also, splitting the million entries into smaller files and running parallel SE38 programs will be advantageous. Implement key-based locks so that you can run multiple SE38 programs simultaneously on the same table.&lt;/P&gt;&lt;P&gt;Query 2 -&lt;/P&gt;&lt;P&gt;Follow the same advice: split the entries into smaller files, run parallel SE38 programs, and use key-based locks.&lt;/P&gt;&lt;P&gt;BR.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Tue, 25 Mar 2014 13:01:39 GMT</pubDate>
      <guid>https://community.sap.com/t5/application-development-and-automation-discussions/performance-issue-while-working-with-large-files/m-p/10168029#M1821905</guid>
      <dc:creator>Former Member</dc:creator>
      <dc:date>2014-03-25T13:01:39Z</dc:date>
    </item>
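The range-table idea from Query 1 could be sketched like this (placeholder names again; LT_KEYS holds the keys uploaded from the CSV, and the key is assumed to be a MATNR-like field):

```abap
DATA: lt_keys  TYPE STANDARD TABLE OF matnr,
      lv_key   TYPE matnr,
      lt_range TYPE RANGE OF matnr,
      ls_range LIKE LINE OF lt_range.

* Build one 'I EQ' row per uploaded key.
LOOP AT lt_keys INTO lv_key.
  ls_range-sign   = 'I'.
  ls_range-option = 'EQ'.
  ls_range-low    = lv_key.
  APPEND ls_range TO lt_range.
ENDLOOP.

* One DELETE for the whole range.
DELETE FROM zbig_table WHERE keyfield IN lt_range.
COMMIT WORK.
```

One caveat: a range table with around a million single-value rows is translated into an enormous WHERE clause, which can exceed database interface limits, so in practice the range itself would have to be processed in chunks.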
    <item>
      <title>Re: Performance issue while working with large files.</title>
      <link>https://community.sap.com/t5/application-development-and-automation-discussions/performance-issue-while-working-with-large-files/m-p/10168030#M1821906</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;Unfortunately I cannot build ranges.&lt;/P&gt;&lt;P&gt;It seems that processing each line rather than using "for all entries" is much faster, so I will stick to that for now.&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;Ioan.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Wed, 26 Mar 2014 07:03:54 GMT</pubDate>
      <guid>https://community.sap.com/t5/application-development-and-automation-discussions/performance-issue-while-working-with-large-files/m-p/10168030#M1821906</guid>
      <dc:creator>former_member205645</dc:creator>
      <dc:date>2014-03-26T07:03:54Z</dc:date>
    </item>
  </channel>
</rss>

