First, thank you so much for creating this amazing repo!

While using Load.py, I noticed that data is generally inserted row by row, which can be really slow for some of the bigger tables (although they are not that big, after all). I am using a low-tier AWS RDS instance with the MySQL engine, and it takes hours to insert certain tables, such as invNames. I put together some quick local modifications to switch to bulk insertion via the sqlalchemy library, and it speeds things up tremendously: from hours down to a few seconds for the invNames table, for example. I just wanted to bring this up and gauge interest in the enhancement. If people believe it would be useful, I can clean up my temporary workaround and turn it into a production-quality change. Please let me know. Thanks!
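For reference, a minimal sketch of what I mean by bulk insertion: passing a list of row dicts to a single `execute()` call lets SQLAlchemy use the DBAPI's executemany path, so the whole batch goes in one round trip instead of one per row. This is not my actual patch to Load.py; the table definition and data are illustrative stand-ins (an in-memory SQLite database instead of RDS/MySQL, and a fake `invNames` schema):

```python
# Illustrative sketch only: bulk insert via SQLAlchemy's executemany path.
# SQLite in-memory stands in for the real RDS/MySQL target; the invNames
# columns here are made up for the example.
from sqlalchemy import (
    Column, Integer, MetaData, String, Table,
    create_engine, func, insert, select,
)

engine = create_engine("sqlite://")  # swap for the real mysql:// URL
metadata = MetaData()

inv_names = Table(
    "invNames", metadata,
    Column("itemID", Integer, primary_key=True),
    Column("itemName", String(100)),
)
metadata.create_all(engine)

rows = [{"itemID": i, "itemName": f"item-{i}"} for i in range(1000)]

# One execute() with a list of dicts -> one executemany, not 1000 inserts.
with engine.begin() as conn:
    conn.execute(insert(inv_names), rows)

with engine.connect() as conn:
    count = conn.execute(
        select(func.count()).select_from(inv_names)
    ).scalar_one()
print(count)
```

For very large tables you would probably also want to chunk `rows` into batches of a few thousand to keep packet sizes reasonable on MySQL, but even the naive version above is orders of magnitude faster than per-row inserts.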