
Neat tool.

Couldn't find anything in the docs on mapping file sources to resource needs on the host. How much data is too much to dump into the tool on a single workstation?



Thanks!

It depends on the number of rows/columns and the types of the values, but the application displays a dialog asking whether you want to stop the import before completion when it detects that resources are close to being exhausted.

The software was specifically developed to handle as much data as possible while remaining responsive, so the workstation's resources will likely be the bottleneck here.

On my 32GB development machine, I can easily load tens of millions of rows with tens of columns.
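For anyone curious what that kind of check can look like, here's a minimal sketch (not the actual implementation — it assumes Python with psutil, a plain CSV source, an illustrative 10% free-RAM threshold, and a console prompt standing in for the dialog) of an importer that pauses between batches when memory runs low:

    import csv
    import psutil

    # Hypothetical threshold: prompt once less than 10% of RAM remains free.
    LOW_MEMORY_FRACTION = 0.10

    def load_csv_with_memory_check(path, batch_size=50_000):
        """Load a CSV in batches, asking whether to stop when memory runs low."""
        rows = []
        with open(path, newline="") as f:
            reader = csv.reader(f)
            header = next(reader, None)
            batch = []
            for row in reader:
                batch.append(row)
                if len(batch) >= batch_size:
                    rows.extend(batch)
                    batch = []
                    mem = psutil.virtual_memory()
                    if mem.available / mem.total < LOW_MEMORY_FRACTION:
                        # A console prompt stands in for the GUI dialog here.
                        answer = input(f"Loaded {len(rows):,} rows; memory is low. Continue? [y/N] ")
                        if answer.strip().lower() != "y":
                            return header, rows  # keep what was imported so far
            rows.extend(batch)
        return header, rows

Checking between batches rather than per row keeps the overhead of the memory query negligible while still catching exhaustion before the machine starts swapping.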



