Do you want to: Report a bug
OS version: Windows 11
App version: 9.1.0 64-bit (NOT 32-bit)
Downloaded from: ONLYOFFICE website
I have noticed that the spreadsheet program can have difficulty importing large CSV files: it either reports a lack of memory or simply shows a generic “error”.
For example, I have a particular CSV file that is 57 MB, ~730,000 lines long, with 21 columns. The only data inside are alphanumeric characters, “@”, “.” and “-” (i.e. only ANSI characters). When I try to open the file, the program brings up the dialog about charset encoding and delimiter. Clicking OK eventually causes the program to crash with the error: “Oops! Something went wrong. We lost access to your file due to a lack of memory or some other reason. Please don’t worry and try reopening the file. Close this tab to continue.”
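In case it helps with reproduction, here is a minimal sketch that generates a synthetic CSV of roughly the same shape (the column count, row count, and character set follow my description above; the file name and field length are arbitrary):

```python
import csv
import random
import string

# Characters matching the description: alphanumeric plus "@", "." and "-"
ALPHABET = string.ascii_letters + string.digits + "@.-"

def make_csv(path, rows=730_000, cols=21, field_len=8):
    """Write a synthetic CSV of roughly the described shape."""
    with open(path, "w", newline="", encoding="ascii") as f:
        writer = csv.writer(f)
        for _ in range(rows):
            writer.writerow(
                "".join(random.choices(ALPHABET, k=field_len))
                for _ in range(cols)
            )

make_csv("repro.csv", rows=1_000)  # small test run; raise rows to ~730_000 to match my file
```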
Picking a charset other than UTF-8 (e.g. Western European) didn’t make a difference. Changing the line endings (EOL) of the text file from Windows to Unix also made no difference.
I was eventually able to open the file in ONLYOFFICE by first manually editing the CSV file in a text editor to reduce the number of rows. After cutting the file from ~730k rows down to 484,189 rows, with a resultant file size of 36.7 MB (38,541,782 bytes, as shown in Windows Explorer properties), the program opened it successfully. At 484,190 rows, it still failed.
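The row-reduction step can be scripted rather than done by hand in a text editor; a minimal sketch (file names are placeholders):

```python
def truncate_csv(src, dst, max_rows):
    """Copy the first max_rows lines of src to dst (a header line counts as a row)."""
    with open(src, "r", encoding="utf-8", newline="") as fin, \
         open(dst, "w", encoding="utf-8", newline="") as fout:
        for i, line in enumerate(fin):
            if i >= max_rows:
                break
            fout.write(line)

# e.g. truncate_csv("big.csv", "small.csv", 484_189)
```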
In another instance, an even larger CSV file produced a different error. That file was, per Windows Explorer properties, 118 MB (124,561,326 bytes), with 23 columns and ~893,000 lines. Trying to open it resulted in the error: “An error has occurred while opening the file. Press “OK” to close the editor.”
Something to note is that this larger CSV file contained special characters beyond the ANSI charset (e.g. Unicode accented characters), though no binary data or anything like that. Reducing the file to 450,000 rows (~64 MB) did not help and produced the same “An error has occurred” message. Further reducing it to 225,000 rows (~32 MB) allowed the file to open successfully.
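For anyone who wants to verify whether their own failing file also contains characters outside plain ASCII, a quick sketch (the function name is mine, not from any tool):

```python
def find_non_ascii(path, limit=5):
    """Return the first few (line_number, character) pairs outside ASCII."""
    hits = []
    with open(path, encoding="utf-8") as f:
        for lineno, line in enumerate(f, start=1):
            for ch in line:
                if ord(ch) > 127:
                    hits.append((lineno, ch))
                    if len(hits) >= limit:
                        return hits
    return hits
```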
I noted a forum post from a user with the same “lack of memory” error that was solved by switching from the 32-bit to the 64-bit version, but I am already using the 64-bit version.
In terms of my machine running out of memory: while I don’t have a brand-new computer, I don’t think it is particularly dated either. It has a 13th Generation Intel Core i7 with an integrated Intel Iris Xe GPU, 32 GB of RAM, and a 1 TB SSD. The tests above were conducted with Task Manager reporting 22.6 GB of RAM free.
Also for reference, I could open all of these CSV files in LibreOffice without issue.
Perhaps it is something to do with the number of rows in addition to the file size: I have been able to open some other large CSV files in the 40 MB range without issue, with even more columns than the two examples above, and some even containing binary data. However, and maybe this is one of the issues, those successfully imported CSV files had far fewer rows (e.g. fewer than 80,000).