
When trying to load a CSV file (791 MB; the data is 3,118,643 rows x 15 columns) into TOPCAT, I get the following error:

java.io.IOException: Map failed
at sun.nio.ch.FileChannelImpl.map(Unknown Source)
at uk.ac.starlink.table.storage.FileByteStore.toByteBuffers(FileByteStore.java:181)
at uk.ac.starlink.table.storage.FileByteStore.toByteBuffers(FileByteStore.java:146)
at uk.ac.starlink.table.storage.AdaptiveByteStore.toByteBuffers(AdaptiveByteStore.java:127)
at uk.ac.starlink.table.storage.ByteStoreRowStore.endRows(ByteStoreRowStore.java:135)
at uk.ac.starlink.table.storage.MonitorStoragePolicy$TeeRowStore.endRows(MonitorStoragePolicy.java:126)
at uk.ac.starlink.table.storage.MonitorStoragePolicy$TeeRowStore.endRows(MonitorStoragePolicy.java:126)
at uk.ac.starlink.table.StoragePolicy.copyTable(StoragePolicy.java:190)
at uk.ac.starlink.table.StoragePolicy.randomTable(StoragePolicy.java:165)
at uk.ac.starlink.table.StarTableFactory.randomTable(StarTableFactory.java:521)
at uk.ac.starlink.table.StarTableFactory.prepareTable(StarTableFactory.java:1270)
at uk.ac.starlink.table.StarTableFactory.makeStarTables(StarTableFactory.java:800)
at uk.ac.starlink.table.gui.SystemBrowser$1.loadTables(SystemBrowser.java:114)
at uk.ac.starlink.table.gui.TableLoadWorker.run(TableLoadWorker.java:118)
Caused by: java.lang.OutOfMemoryError: Map failed
at sun.nio.ch.FileChannelImpl.map0(Native Method)
... 14 more

I'm using:

Java Version 8 Update 333 (build 1.8.0_333-b02)

TOPCAT Version 4.8-4

STIL Version 4.1

Starjava revision: a12183c9e (2022-04-05)

What's going wrong?

asked Jul 5 by anonymous

1 Answer

Hi, thanks for reaching out.

I assume you are trying to load a CSV file that you generated with Data Lab? (If not, please note that we are not the developers of TOPCAT.)

The last message in the error stack ("java.lang.OutOfMemoryError: Map failed") indicates that the Java VM ran out of memory, or address space, while memory-mapping the temporary file it uses to load the CSV data. Although the CSV file itself is under 800 MB, TOPCAT can need two or even three times that much free memory, since the text values are converted to typed binary columns and the JVM adds its own overhead. I would suggest you try closing other running programs to free up some RAM, then try again.
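One thing worth checking first (an assumption on our side, since your report doesn't show it): a "Map failed" OutOfMemoryError is especially common on 32-bit JVMs, which can only address a few GB no matter how much RAM the machine has. You can see which JVM you are running with:

```shell
# Print the Java version; a 64-bit JVM mentions "64-Bit" in the VM line, e.g.
#   Java HotSpot(TM) 64-Bit Server VM (build 25.333-b02, mixed mode)
java -version
```

If the output does not mention 64-Bit, installing a 64-bit Java 8 should make a big difference for tables of this size.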

If the problem persists, and you can't find anything wrong with the CSV file itself, may I suggest that you contact the developer of TOPCAT directly (he may well be happy to receive a bug report -- if it is indeed a bug).

answered Jul 5 by datalab (17,680 points)
See also the TOPCAT tips for using large tables:  http://www.star.bris.ac.uk/~mbt/topcat/sun253/largeTables.html

Increasing the amount of memory available to the JVM with the -Xmx switch, as suggested there, may allow you to load larger tables.
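For example (the 4 GB heap size and the file name below are just placeholders; adjust them to your machine and data):

```shell
# Give the JVM a 4 GB maximum heap when starting TOPCAT via the startup script:
topcat -Xmx4G yourtable.csv

# ...or, if you launch the jar file directly:
java -Xmx4G -jar topcat-full.jar yourtable.csv
```

Don't set -Xmx higher than the physical RAM you actually have free, or the machine will start swapping and everything will slow down.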
