is the subject of today's answer-thon question. I would first categorize this as a sizing question, with a heavy tendency to give the standard consultant reply of "it depends." But that's not the helpful way of the answer-thon SAP Champion spirit. If we don't have an answer, at least we could offer a hint or a clue.
It is a two-part question, as they say at press conferences:
Do data stores have a limit on their size?
How many entries can I store within them?
The supplied "similar questions" don't help. They are good questions in themselves, just unrelated to this one. The first speaks to checking status prior to committing, while the second is about individual record limits, not total volumes.
I turned to Google, entered "cpi data store limit", and found this one from 2017:
Two replies, both indeterminate, suggesting no "*published*" limit. There may be practical limits even in the absence of theoretical maximums, so I did not credit these as, um, credible answers. Both suggested testing, which seems fine until a production system grows beyond anything the test run covered.
"Is there any limitations for the data store configurations or system configurations"
Before trying to quantify what a practical limit might be, I'd need to be certain of the platform or environment. Cloud? With "CPI" in the subject, possibly. If there is an "in-memory" limit, those values would be in the sizing contracts or specifications.
I spotted a tag in the question: data store configuration, then searched on that:
Singleton. Discard. Would have been nice. (There could be 2 later, if I include that tag on this post).
Then, I took the search values over to ServerFault and StackOverflow. Nil on the former, and odd results on the latter.
First, this subject has "HCP PI". Could be of some interest if the design layers have any similarity to the question at hand. I read this requirement as building a set of transactions to apply sequentially, so as not to overload a receiver. A tuning topic, not so much a sizing one.
The next one has "successfactors". That's SAP, right, or is that the other SF? I lose track of the players without my scorecard. Interesting topic on pushing data through an API. Probably of interest to high-volatility systems, like traffic flow, order booking, etc. But a different slice than the total volumes the OP is on about. Main idea: "upserts the results to a SAP HANA database every hour" (upserts being insert if new, update if not).
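As an aside, that upsert pattern is easy to sketch generically. A minimal illustration using SQLite syntax (HANA has its own UPSERT statement, and none of this is the poster's actual code; the table and column names here are made up):

```python
import sqlite3

# Illustrative upsert: insert if the key is new, update the row if it exists.
# SQLite's ON CONFLICT clause; SAP HANA would use its UPSERT statement instead.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE results (id INTEGER PRIMARY KEY, value TEXT)")

def upsert(row_id, value):
    conn.execute(
        "INSERT INTO results (id, value) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET value = excluded.value",
        (row_id, value),
    )

upsert(1, "first")   # new key: row is inserted
upsert(1, "second")  # existing key: row is updated in place
```

Either way, the table ends with one row per key, which is exactly why hourly upserts don't balloon total volume the way plain inserts would.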
Last here, a question on request building that fails a sanity check. Interesting, but not a total size topic, more of a checksum verification for data quality assurance. Looks at SMTP and XML attachments, a rabbit-hole of another type altogether.
What I would say is, first go through a data volume exercise to estimate worst-case scenarios (or best-case, if you look at it like "how much data can we have all at once?"). Then check with server and database teams, including Basis, if such a team exists these days, to get their expertise. Maybe even open a help ticket with SAP?
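That volume exercise can be plain back-of-envelope arithmetic. A hypothetical sketch, where every number is an assumption you would replace with your own measurements, not anything published by SAP:

```python
# Hypothetical worst-case data store volume estimate.
# All three inputs below are illustrative assumptions, not CPI figures.
avg_entry_bytes = 2 * 1024   # assumed average payload per data store entry
entries_per_day = 50_000     # assumed peak daily writes
retention_days = 30          # assumed retention before entries expire

worst_case_entries = entries_per_day * retention_days
worst_case_bytes = worst_case_entries * avg_entry_bytes

print(f"entries: {worst_case_entries:,}")
print(f"size: {worst_case_bytes / 1024**3:.2f} GiB")
```

Even a rough figure like this gives the server and database teams something concrete to react to, which beats asking them about limits in the abstract.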