Hi, I would like to suggest some necessary nodes below.
Delete record.
My case: after I ran an import from an Excel file, over 170k empty rows were imported. Now I need to export the database, but I don't have a clear way to do it: the .bika export is restricted to under 50k rows, and the Excel export doesn't do anything. I tried asking Copilot to delete all the empty rows from that database; it even responded and started doing something, but no luck. Then I tried to set up an automation so that the output of Round Robin would be put through a filter, only to find out that there is no Remove record node. I believe it is essential to have one.
Another solution would be the Export View functionality, which I suggested previously. I also need a clear answer to my immediate problem: how do I clean 170k empty rows from the database?
Run automation node. In the current era of multi-agent systems, we really need at least the ability to run one automation from another. Yes, this can be achieved with a POST request to the next automation's webhook, but it could still be simplified so that the data is also transferred there (or not transferred, if we choose not to).
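For reference, the webhook workaround described above can be sketched roughly as follows. This is a hedged illustration, not the documented Bika.ai webhook format: the URL and the `{"records": [...]}` payload shape are assumptions, and the `post` callable is injected so the payload logic can be exercised without a live webhook.

```python
import json
import urllib.request
from typing import Callable, Optional, Sequence

def chain_automation(webhook_url: str,
                     records: Sequence[dict],
                     post: Optional[Callable[[str, bytes], int]] = None) -> int:
    """POST this automation's records to the next automation's webhook.

    The payload shape ({"records": [...]}) is an assumption; adapt it to
    whatever the receiving automation expects. Returns the HTTP status code.
    """
    body = json.dumps({"records": list(records)}).encode("utf-8")
    if post is None:
        def post(url: str, data: bytes) -> int:
            req = urllib.request.Request(
                url, data=data,
                headers={"Content-Type": "application/json"},
                method="POST",
            )
            with urllib.request.urlopen(req) as resp:
                return resp.status
    return post(webhook_url, body)
```

A built-in Run automation node would essentially do this handoff for you, with the "transfer data or not" choice as a toggle.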
Parallelization/multithreading. This would be great to have. For example, we could set the number of threads for a Loop node.
Please raise the limit for the Get records node to at least 1,000, instead of the current 100.
Currently, Copilot does not support the ability to delete records. Automation also does not have a standalone “delete record” action at the moment. However, you can use the “Send HTTP Request” action along with the API (API documentation: Bika OpenAPI Reference | Bika.ai) to delete records manually.
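To illustrate the "Send HTTP Request" route: a minimal sketch of building a DELETE call for a single record. The base URL, endpoint path, and auth header here are assumptions for illustration only; take the real ones from the Bika OpenAPI Reference before using this.

```python
import urllib.request

# ASSUMED base URL and path layout -- verify against the Bika OpenAPI Reference.
API_BASE = "https://bika.ai/api/openapi/bika/v1"

def build_delete_request(space_id: str, database_id: str, record_id: str,
                         token: str) -> urllib.request.Request:
    """Build (but do not send) a DELETE request for one record.

    The path segments below are illustrative; the auth scheme is assumed
    to be a bearer token.
    """
    url = (f"{API_BASE}/spaces/{space_id}/resources/databases/"
           f"{database_id}/records/{record_id}")
    return urllib.request.Request(
        url,
        headers={"Authorization": f"Bearer {token}"},
        method="DELETE",
    )

# To actually send it:
#   with urllib.request.urlopen(build_delete_request(...)) as resp:
#       ok = resp.status in (200, 204)
```

Inside an automation you would put the equivalent URL and method into the "Send HTTP Request" action rather than running Python.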
Another option is to delete the entire table in the Space Station and create a new one with the same structure, then re-import the data using incremental import.
Thank you for your suggestion — I’ll forward it to the product team for evaluation. Could you kindly describe your use case for setting the number of threads in more detail? This will help us better understand your scenario
How many records would you like the “Get Records” action to support? What do you plan to do after retrieving the records? If you could describe your specific business scenario, it would help us provide more accurate feedback. Thank you!
@pengjin Thank you for your reply.
Well, since the database contained sensitive data before the import, I can't just create a new one. So what would be a realistic approach for deleting 173k rows? Get records → Remove record API request?
This won't do, because Get records returns only 100 records, so I'd need a way to run this automation about 1.7k times.
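The repeated runs collapse into a single loop if the fetch step can be filtered to return only empty rows (e.g. via a filtered view). A sketch under that assumption: `fetch_empty_page` and `delete_record` are injected stand-ins for the real Get-records and HTTP-delete calls, which are not shown in this thread.

```python
from typing import Callable, Sequence

def purge_empty_records(fetch_empty_page: Callable[[int], Sequence[dict]],
                        delete_record: Callable[[str], None],
                        page_size: int = 100) -> int:
    """Delete empty rows page by page until the filtered fetch is exhausted.

    fetch_empty_page(page_size) stands in for a Get-records call that
    returns only empty rows; delete_record(record_id) stands in for the
    per-record HTTP DELETE. With a 100-record page limit, 173k empty rows
    means roughly 1,730 iterations of this loop.
    """
    deleted = 0
    while True:
        page = fetch_empty_page(page_size)
        if not page:
            return deleted
        for rec in page:
            delete_record(rec["id"])
            deleted += 1
```

This is exactly the pattern a built-in Remove record node plus a real loop would make unnecessary.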
How exactly does Round Robin work? Would this workflow clean all the empty rows from the database: Round Robin (entire db) → filter (only empty rows) → Remove record API request?
UPD: Round Robin doesn't work as a loop, and there is no GOTO node or conditional logic yet, so this won't work.
On top of all this, I can't stress enough that there MUST be some way to export databases, even ones with millions of rows. Our plans have such high row limits, yet we still can't export anything over 50k rows, which sometimes makes them unusable.
Or at least an option to export a DB View, which would also make sense.
Regarding the multithreading:
Right now I have a workflow that gets records from a database and then runs a loop over them, which requires multiple API calls. As it stands, it processes only 100 records (the limit of the Get Records node) in a single thread.
But I usually need over 10k records processed daily. Currently I have to run 10-12 such automations simultaneously, i.e. roughly 10-12 threads with 100 records each, and then repeat them again and again after they finish. So the best-case scenario would be Get Records retrieving at least 5-10k records and Loop supporting, say, 10 threads; that would simplify things enormously.
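For what it's worth, the requested behavior maps cleanly onto a fixed-size thread pool. A minimal client-side sketch using Python's `concurrent.futures`; `handler` is a stand-in for whatever per-record API call the loop body performs.

```python
from concurrent.futures import ThreadPoolExecutor
from typing import Callable, Iterable, List, TypeVar

T = TypeVar("T")
R = TypeVar("R")

def process_in_threads(records: Iterable[T],
                       handler: Callable[[T], R],
                       workers: int = 10) -> List[R]:
    """Run handler over records with a fixed-size thread pool.

    Results come back in input order; each worker processes one record at
    a time, so workers=10 emulates running ten single-threaded loops in
    parallel, the way the 10-12 simultaneous automations do today.
    """
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(handler, records))
```

A Loop node with a "threads" setting would presumably do something equivalent server-side, which is why the load implications mentioned below matter.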
Of course, that would change the load on your end, so it requires careful consideration.
Hi @anp
Sorry for the late reply. Since the number of records you want to delete is quite large, deleting them via API or using the Round Robin method may be rather cumbersome.
The quickest approach for now is as follows:
First, export your current table data to an Excel file. Then, delete the 173,000 records directly within Excel.
Next, create a new table with the same field structure as your existing table, and re-import the cleaned Excel file using incremental import.
This way, both the structure and the data in the new table will exactly match those of your current table (minus the deleted records).
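If deleting 173,000 rows by hand in Excel is impractical, the clean-up step can be scripted. A sketch under the assumption that the exported sheet is first saved as CSV (Excel's "Save As" can do this); for a true .xlsx file you would use a spreadsheet library such as openpyxl instead.

```python
import csv

def drop_blank_rows(in_path: str, out_path: str) -> tuple:
    """Copy a CSV to out_path, skipping rows whose cells are all blank.

    A cell counts as blank if it is empty or whitespace-only.
    Returns a (kept, dropped) pair of row counts.
    """
    kept = dropped = 0
    with open(in_path, newline="", encoding="utf-8") as src, \
         open(out_path, "w", newline="", encoding="utf-8") as dst:
        writer = csv.writer(dst)
        for row in csv.reader(src):
            if any(cell.strip() for cell in row):
                writer.writerow(row)
                kept += 1
            else:
                dropped += 1
    return kept, dropped
```

The cleaned file can then be re-imported into the new table via incremental import as described above.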
Note:
Because you're exporting a large amount of data, it's recommended to open your browser's developer console before exporting. In the console you can monitor the export progress: when you see a request like database.exportToExcel?batch=1 return status 200, the file has been successfully exported.
Below are the screenshots showing how to open the browser console and export the file.
Hi, my problem is that I simply cannot export to Excel, as I mentioned previously; nothing happens visually after I click. Now, with the console open, I can see a 504 Timeout:
It looks like the export has timed out. Could you please refresh the page and try again?
Would you mind inviting me to your Space Station? After the invitation, just let me know which table it is, and I can help you delete the records — would that work for you?
Hi @anp
I’ve already joined your Space Station, but I’m unable to perform any actions. Could you grant me “Admin Permission” for the current folder? That way, I’ll be able to help you delete the blank record rows