evelyn · 4.9 years ago
I am trying to import three very large files (each >100 GB) into RStudio, but it complains about memory. I am using this version:
platform x86_64-apple-darwin15.6.0
arch x86_64
os darwin15.6.0
system x86_64, darwin15.6.0
status
major 3
minor 6.0
year 2019
month 04
day 26
svn rev 76424
language R
version.string R version 3.6.0 (2019-04-26)
I tried increasing the memory limit via `.Renviron` from the terminal, but it gives the same error.
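For reference, on macOS builds of R (3.5+) the vector-memory cap is controlled by the `R_MAX_VSIZE` environment variable, which can be set in `~/.Renviron`. A minimal example entry (the `100Gb` value is illustrative) might be:

```
# ~/.Renviron -- raise R's vector allocation cap on macOS
# This only lifts the allocator limit; it cannot provide more memory
# than the machine's RAM plus swap actually offer.
R_MAX_VSIZE=100Gb
```

Restart R after editing the file for the change to take effect.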
How much physical memory does your Mac have? Unfortunately, there is no substitute for having enough RAM.
My Mac has 16 GB of memory.
I don't think this is going to work as is.

Edit: @zx5784's solution of using virtual memory on disk may work. With such a large data set, you may need patience in equally large amounts.
What kind of data is this? Alternatives: see the CRAN Task View for High-Performance Computing, section "Large memory and out-of-memory data".
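One approach in the spirit of that Task View is to stream the file in chunks rather than load it whole. A hedged sketch using Bioconductor's `VariantAnnotation` (assumes the file has been bgzip-compressed and tabix-indexed; the file path and genome label are placeholders):

```r
library(VariantAnnotation)

# Placeholder path; yieldSize requires a bgzipped, tabix-indexed VCF
vcf_file <- VcfFile("big.vcf.gz", yieldSize = 100000)

open(vcf_file)
repeat {
  chunk <- readVcf(vcf_file, genome = "hg19")  # 100k records at a time
  if (nrow(chunk) == 0) break
  # ...process/summarise this chunk, keeping only what you need...
}
close(vcf_file)
```

This keeps peak memory at one chunk's worth instead of the full 100 GB.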
These are `vcf` files.

Do you need all the data at once? Maybe slim it down first: filter on samples or variants, or split by chromosome, etc. See `bcftools` for manipulating VCFs. What operation are you trying to do with them? Perhaps there are command-line alternatives that can be used instead of R. Are these files compressed or not?
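To illustrate the slimming-down idea, a hedged `bcftools` sketch (file, sample, and region names are placeholders; compress and index first so region queries work):

```shell
# Compress and index the hypothetical input
bgzip big.vcf
bcftools index big.vcf.gz

# Keep only selected samples
bcftools view -s SAMPLE1,SAMPLE2 -Oz -o subset.vcf.gz big.vcf.gz

# Or split out a single chromosome
bcftools view -r chr1 -Oz -o chr1.vcf.gz big.vcf.gz

# Or keep only PASS variants
bcftools view -f PASS -Oz -o pass.vcf.gz big.vcf.gz
```

Each of these outputs is far smaller than the original and much easier to load into R.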
I am trying to use them for an `UpSetR` plot, and they are not compressed.

Is it to show overlapping variants or samples? Again, you can get that info using `bcftools`; then the file will be manageable for R.
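If the goal is overlapping variants across the three files, `bcftools isec` can compute the set membership outside R; a hedged sketch (placeholder file names; the inputs must be bgzipped and indexed):

```shell
# Write intersection results to a directory
bcftools isec -p isec_out a.vcf.gz b.vcf.gz c.vcf.gz

# isec_out/sites.txt then has one line per variant with a 0/1 flag per
# input file (e.g. "110" = present in a and b, absent from c); that
# small table is easy to read into R and pass to UpSetR::upset()
```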