on 2019 Aug 26 1:21 PM
After a few years in production, the database has grown quite large and has started to affect performance:
| Table | Size (MB) |
|---|---|
| itemsynctimestamps | 1486.84 |
| variantprodlp | 971.00 |
| variantprod | 880.78 |
| savedvalues | 676.19 |
| productfeatures | 644.55 |
| cat2prodrel | 543.11 |
| product | 456.13 |
| productlp | 453.00 |
| orderentries | 440.23 |
| medias | 414.05 |
| abstrordrentrtocontrbs | 403.89 |
| savedvalueentry | 353.50 |
| articlelocalinfo | 242.44 |
| categories | 226.59 |
| pricerows | 224.98 |
| addresses | 210.95 |
| cat2princrel | 206.50 |
| orders | 190.98 |
Above is an ordered list of the biggest tables in our project database. My question concerns `itemsynctimestamps`: it sits at the top with a size of ~1.5 GB and almost 4.7 million records.
I can't find any documentation about cleaning up this table, and I am wondering whether it is possible to reduce its size. It clearly stores data needed to maintain the synchronization state over time, but its size is troubling.
I have also listed some other tables that you might recommend cleaning up.
I will be grateful for any help.
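For context, a list like the one above can be produced with a data-dictionary query along the following lines. This is only a sketch, assuming an Oracle database (the reply below points at the Oracle-specific DBMS_REDEFINITION package); `FETCH FIRST` needs 12c or later, and on older releases you would filter with `ROWNUM` instead.

```sql
-- Largest table segments in the current schema, in MB.
-- USER_SEGMENTS covers only the connected schema; with DBA rights,
-- query DBA_SEGMENTS and filter on OWNER instead.
SELECT segment_name                       AS table_name,
       ROUND(SUM(bytes) / 1024 / 1024, 2) AS size_mb
FROM   user_segments
WHERE  segment_type = 'TABLE'
GROUP  BY segment_name
ORDER  BY size_mb DESC
FETCH FIRST 20 ROWS ONLY;  -- Oracle 12c+; use ROWNUM on older releases
```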
Did you try to reorganize this table with DBMS_REDEFINITION? Note that this only reclaims space if the table is fragmented or contains migrated/chained rows.
Besides this, you should check which application is using this table. Usually the corresponding application provides a clean-up job that helps to reduce the content in a way that is consistent from the application's perspective.
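As a rough sketch of what that could look like (assuming an Oracle database; the table name and the interim table `ITEMSYNCTIMESTAMPS_TMP` are only examples), you could first compare the allocated segment size with the space the rows actually need, based on fresh optimizer statistics:

```sql
-- Rough fragmentation check: compare the allocated segment size with the
-- space the rows themselves occupy (NUM_ROWS * AVG_ROW_LEN). Requires
-- up-to-date statistics, e.g. via DBMS_STATS.GATHER_TABLE_STATS.
SELECT t.table_name,
       ROUND(s.bytes / 1024 / 1024, 2)                    AS allocated_mb,
       ROUND(t.num_rows * t.avg_row_len / 1024 / 1024, 2) AS row_data_mb
FROM   user_tables t
       JOIN user_segments s ON s.segment_name = t.table_name
WHERE  t.table_name = 'ITEMSYNCTIMESTAMPS';
```

If the gap is large, an online reorganization could reclaim the space. A minimal DBMS_REDEFINITION sketch might look like this (in a real run you would also call `DBMS_REDEFINITION.COPY_TABLE_DEPENDENTS` between start and finish to carry over indexes, constraints and triggers):

```sql
-- Illustrative interim table with the same column structure but no rows.
CREATE TABLE itemsynctimestamps_tmp AS
  SELECT * FROM itemsynctimestamps WHERE 1 = 0;

BEGIN
  -- CONS_USE_PK is preferable if the table has a primary key constraint.
  DBMS_REDEFINITION.CAN_REDEF_TABLE(
    uname        => USER,
    tname        => 'ITEMSYNCTIMESTAMPS',
    options_flag => DBMS_REDEFINITION.CONS_USE_ROWID);

  DBMS_REDEFINITION.START_REDEF_TABLE(
    uname        => USER,
    orig_table   => 'ITEMSYNCTIMESTAMPS',
    int_table    => 'ITEMSYNCTIMESTAMPS_TMP',
    options_flag => DBMS_REDEFINITION.CONS_USE_ROWID);

  DBMS_REDEFINITION.FINISH_REDEF_TABLE(
    uname      => USER,
    orig_table => 'ITEMSYNCTIMESTAMPS',
    int_table  => 'ITEMSYNCTIMESTAMPS_TMP');
END;
/
```

After `FINISH_REDEF_TABLE` the original name points at the freshly built table, and the interim table (now holding the old segment) can be dropped.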