System Limits - How does the pushbot act?

Hey Community,
is it true that reaching a system limit in a bot only affects the bot's speed?
For example, will the bot stop the action 'Save spreadsheet to table' if the Excel file has more than 100,000 rows? Or does it stop writing rows to a table once the table already contains 100,000 rows?
And if the bot stops, do I receive a message?
Thank you for your help :smile:


  • Thomas_937381 Posts: 196
    edited January 20

    @Christina_108366 I just wanted to let you know I'm checking on this for you. Do you have a dataset where you expect to exceed 100,000 rows, and where the data change relatively infrequently, but to which you would regularly need to add rows? If you could describe your technical requirements a bit more I'm sure someone can make a recommendation for you.

  • Hi @Thomas_937381, thank you for your answer.
    I have a data table that is filled daily, and one day it will reach the limit of 100,000 rows.
    I have not yet created a bot to delete older rows from this table.
    Would it be possible for one bot to delete rows from the table while, at the same time, another bot writes data into it?

  • Hi @Christina_108366,

    Is your thought to copy rows from one table to another? Do you have an estimate as to how many rows / at what interval? Is it 100 / daily, for example?

    If you are handling a really large amount of data, you may want to consider some alternate pattern (leveraging .csv functionality, for example), or a reference table, containing a column with table IDs.

    To answer your other question, it is indeed possible that two processes could touch the same imported table at the same time. If you have concerns about 'crossing the wires,' I might suggest building a status column into the dataset, or leveraging a row timestamp. Does your process run on a scheduled trigger? If yes, you could schedule removal of rows at a time when you know another process is not adding new rows.

    Hope this helps!
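    The status-column guard suggested above can be sketched in Python, using SQLite as a stand-in for the bot's imported table (the table and column names here are hypothetical, not from any specific product). The idea is that the cleanup step first claims rows by status and then deletes only the claimed rows, so a row added mid-cleanup is never removed:

    ```python
    import sqlite3

    # Hypothetical schema: a 'records' table with a 'status' column used as a
    # guard so a cleanup process and a writer process never touch the same rows.
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE records (id INTEGER PRIMARY KEY, payload TEXT, status TEXT)"
    )

    def add_row(payload):
        # New rows start in 'new' status; the cleanup job ignores them.
        conn.execute(
            "INSERT INTO records (payload, status) VALUES (?, 'new')", (payload,)
        )
        conn.commit()

    def cleanup():
        # Claim finished rows first, then delete only the claimed ones.
        conn.execute("UPDATE records SET status = 'deleting' WHERE status = 'processed'")
        conn.execute("DELETE FROM records WHERE status = 'deleting'")
        conn.commit()

    add_row("a")
    # Simulate the main process finishing with row 'a'.
    conn.execute("UPDATE records SET status = 'processed' WHERE payload = 'a'")
    add_row("b")  # a writer adds a fresh row while cleanup is pending
    cleanup()
    remaining = [r[0] for r in conn.execute("SELECT payload FROM records")]
    # Only the freshly added row survives; the processed row was cleaned up.
    ```

    The same two-phase pattern works with a row timestamp instead of a status value: claim by cutoff time, then delete what was claimed.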

  • Hi @Thomas_937381,
    the number of rows varies from day to day. The table could be written to at any minute of the day, 24/7.
    To avoid reaching the table limit, I created a delete function. The process is started by a daily scheduled trigger, and every row older than 30 days is deleted. This works because I have a 'Tracking date' column.
    Do you think I am safe with this solution, and that it is possible to add a new row while deleting another one?
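    The daily cleanup described here could be sketched as follows (again a minimal Python/SQLite sketch; the `data` table and `tracking_date` column are stand-ins for whatever names the real bot uses). Rows whose tracking date is more than 30 days before the run date are deleted; newer rows are left alone:

    ```python
    import sqlite3
    from datetime import date, timedelta

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE data (id INTEGER PRIMARY KEY, tracking_date TEXT)")

    def daily_cleanup(today):
        # ISO date strings (YYYY-MM-DD) compare correctly as text,
        # so a plain '<' comparison finds everything older than the cutoff.
        cutoff = (today - timedelta(days=30)).isoformat()
        conn.execute("DELETE FROM data WHERE tracking_date < ?", (cutoff,))
        conn.commit()

    today = date(2021, 3, 1)
    for age_days in (45, 5):  # one stale row, one recent row
        conn.execute(
            "INSERT INTO data (tracking_date) VALUES (?)",
            ((today - timedelta(days=age_days)).isoformat(),),
        )
    daily_cleanup(today)
    count = conn.execute("SELECT COUNT(*) FROM data").fetchone()[0]
    # The 45-day-old row is gone; the 5-day-old row remains.
    ```

    Scheduling this once a day keeps the table well under the limit as long as fewer than about 100,000 rows accumulate in any 30-day window.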

  • Thomas_937381Thomas_937381 Posts: 196

    @Christina_212086 Sorry I didn't see this response sooner! I do think that, in principle, this approach should work for the problem you described. If you have additional questions, you're welcome to email me: [email protected]
