GATK SplitNCigarReads memory issue
7.3 years ago
user31888 ▴ 130

Hi,

When running SplitNCigarReads on a 15 GB mRNA .bam file, the program gets stuck at the same point in its progress no matter how much memory I supply.

Command line used:

java -jar GenomeAnalysisTK.jar \
-T SplitNCigarReads \
-R 1KGenome_chr37.fasta \
-I input_dedupe_1.bam \
-o output_dedupe_split_1.bam \
-rf ReassignOneMappingQuality \
-RMQF 255 \
-RMQT 60 \
-U ALLOW_N_CIGAR_READS

I tried different resource setups:

- Test #1: procs=1, mem=100GB (used the -Xmx95G java argument in the command line above)
- Test #2: procs=1, mem=300GB (-Xmx295G)
- Test #3: procs=1, mem=500GB (-Xmx495G)
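For reference, the -Xmx flag is a JVM argument and must be placed before -jar, not among the GATK arguments. A sketch of the Test #1 invocation with that placement, assuming the same file names as above:

```
java -Xmx95G -jar GenomeAnalysisTK.jar \
    -T SplitNCigarReads \
    -R 1KGenome_chr37.fasta \
    -I input_dedupe_1.bam \
    -o output_dedupe_split_1.bam \
    -rf ReassignOneMappingQuality \
    -RMQF 255 \
    -RMQT 60 \
    -U ALLOW_N_CIGAR_READS
```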

The log file always looks the same regardless of the resources used:

INFO 18:10:43,185 ProgressMeter - [INITIALIZATION COMPLETE; STARTING PROCESSING] 
INFO 18:10:43,185 ProgressMeter - | processed | time | per 1M | | total | remaining 
INFO 18:10:43,186 ProgressMeter - Location | reads | elapsed | reads | completed | runtime | runtime 
INFO 18:10:43,211 ReadShardBalancer$1 - Loading BAM index data 
INFO 18:10:43,213 ReadShardBalancer$1 - Done loading BAM index data 
DEBUG 2017-01-14 18:10:54 BlockCompressedOutputStream Using deflater: Deflater
INFO 18:11:13,194 ProgressMeter - 1:791345 605855.0 30.0 s 49.0 s 0.0% 32.7 h 32.7 h 
INFO 18:12:17,273 ProgressMeter - 1:3354759 1205997.0 94.0 s 78.0 s 0.1% 24.1 h 24.1 h 
INFO 18:13:17,276 ProgressMeter - 1:14745310 2006043.0 2.6 m 76.0 s 0.5% 9.0 h 9.0 h 
INFO 18:13:50,268 ProgressMeter - 1:19454173 2406174.0 3.1 m 77.0 s 0.6% 8.3 h 8.2 h 
INFO 18:14:20,947 ProgressMeter - 1:22329073 2906481.0 3.6 m 74.0 s 0.7% 8.4 h 8.3 h 
INFO 18:14:57,330 ProgressMeter - 1:24448921 3306573.0 4.2 m 76.0 s 0.8% 9.0 h 8.9 h 
[...]
[...]
[...]
INFO 10:09:59,257 ProgressMeter - 12:48265263 1.50625431E8 2.6 h 61.0 s 64.5% 4.0 h 85.2 m 
INFO 10:10:29,258 ProgressMeter - 12:50558820 1.51125541E8 2.6 h 61.0 s 64.5% 4.0 h 85.2 m 
INFO 10:10:59,259 ProgressMeter - 12:54069839 1.51526231E8 2.6 h 61.0 s 64.6% 4.0 h 85.0 m 
INFO 10:11:30,112 ProgressMeter - 12:56120031 1.5192652E8 2.6 h 61.0 s 64.7% 4.0 h 85.0 m 
INFO 10:12:00,115 ProgressMeter - 12:57033077 1.52326798E8 2.6 h 61.0 s 64.7% 4.0 h 85.2 m 
INFO 10:12:30,116 ProgressMeter - 12:58022667 1.5272684E8 2.6 h 61.0 s 64.8% 4.0 h 85.4 m 
INFO 10:13:00,117 ProgressMeter - 12:66451463 1.5312693E8 2.6 h 61.0 s 65.0% 4.0 h 84.6 m 
INFO 10:13:42,923 ProgressMeter - 12:66451469 1.5312693E8 2.6 h 61.0 s 65.0% 4.1 h 85.0 m 
INFO 10:15:27,504 ProgressMeter - 12:66451469 1.5312693E8 2.7 h 62.0 s 65.0% 4.1 h 85.9 m 
INFO 10:17:39,795 ProgressMeter - 12:66451469 1.5312693E8 2.7 h 63.0 s 65.0% 4.2 h 87.1 m 
INFO 10:38:46,245 ProgressMeter - 12:66451469 1.5312693E8 3.1 h 71.0 s 65.0% 4.7 h 98.5 m 
INFO 10:44:48,171 ProgressMeter - 12:66451469 1.5312693E8 3.2 h 74.0 s 65.0% 4.8 h 101.7 m 
INFO 10:48:22,821 ProgressMeter - 12:66451469 1.5312693E8 3.2 h 75.0 s 65.0% 4.9 h 103.6 m 
INFO 10:55:17,229 ProgressMeter - 12:66451469 1.5312693E8 3.3 h 78.0 s 65.0% 5.1 h 107.3 m 
INFO 11:07:09,208 ProgressMeter - 12:66451469 1.5312693E8 3.5 h 82.0 s 65.0% 5.4 h 113.7 m 
INFO 11:07:41,434 ProgressMeter - 12:66451469 1.5312693E8 3.5 h 83.0 s 65.0% 5.4 h 114.0 m 
INFO 11:08:11,844 ProgressMeter - 12:66451469 1.5312693E8 3.5 h 83.0 s 65.0% 5.4 h 114.3 m

Although the process keeps running (and uses 100% of the available RAM), it stays stuck at 65% progress (I left it running for up to 48 h before having to kill the job). After 48 hours of no progress, the program finally output:

Exception: java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in thread "ProgressMeterDaemon"
####ERROR ------------------------------------------------------------------------------------------
####ERROR A USER ERROR has occurred (version 3.6-0-g89b7209):
####ERROR
####ERROR This means that one or more arguments or inputs in your command are incorrect.
####ERROR The error message below tells you what is the problem.
####ERROR
####ERROR If the problem is an invalid argument, please check the online documentation guide
####ERROR (or rerun your command with --help) to view allowable command-line arguments for this tool.
####ERROR
####ERROR Visit our website and forum for extensive documentation and answers to
####ERROR commonly asked questions https://www.broadinstitute.org/gatk
####ERROR
####ERROR Please do NOT post this error to the GATK forum unless you have really tried to fix it yourself.
####ERROR
####ERROR MESSAGE: An error occurred because you did not provide enough memory to run this program. You can use the -Xmx argument (before the -jar argument) to adjust the maximum heap size provided to Java. Note that this is a JVM argument, not a GATK argument.
####ERROR ------------------------------------------------------------------------------------------

Weird facts:

(1) When running the exact same command on a file twice as big as this one, everything works fine. (Note: I have also encountered a problem using the same file with MarkDuplicates; see http://gatkforums.broadinstitute.org/gatk/discussion/8740/markduplicates-avoid-excessive-duplicate-set-size#latest)

(2) I checked the .bam file at the first occurrence of this genome location; the reads there don't even contain an N operator in the CIGAR field.

(3) The GATK user guide mentions that SplitNCigarReads is usually run with 4GB of memory. That definitely does not work for me.
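As a double-check on fact (2), here is a minimal stdlib-only Python sketch that tests whether a CIGAR string contains an N (skipped-region) operator — the function name and the regex-based parsing are my own illustration, not part of the original pipeline:

```python
import re

# CIGAR operators per the SAM specification; N denotes a skipped
# region in the reference (typically an intron in spliced RNA-seq reads).
CIGAR_OP = re.compile(r"(\d+)([MIDNSHP=X])")

def has_n_operator(cigar: str) -> bool:
    """Return True if the CIGAR string contains at least one N operation."""
    return any(op == "N" for _, op in CIGAR_OP.findall(cigar))

# Spliced RNA-seq read spanning a 2 kb intron:
print(has_n_operator("36M2000N40M"))  # True
# Ordinary contiguous alignment, no splice:
print(has_n_operator("76M"))          # False
```

In practice you could feed this the 6th column of `samtools view` output for the region where the log stalls, to confirm whether any N-containing reads are present there.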

Questions:

(1) Could my .bam file be malformed (even though the upstream GATK RNA-seq pipeline gives me no warnings or errors)? Is it possible to downsample the .bam file only at this specific location?

(2) Could it be caused by the setup of our cluster?

(3) Any possible alternatives to SplitNCigarReads?
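Regarding question (1), one way to isolate and downsample only the problem region (around the locus 12:66451469 where the log stalls) is with samtools. A sketch, assuming a coordinate-sorted and indexed BAM; the region boundaries are illustrative:

```
# Extract only the stalled region for inspection:
samtools view -b input_dedupe_1.bam 12:66400000-66500000 > region.bam

# Downsample that region to ~10% of reads (-s SEED.FRACTION, here seed 42, fraction 0.10):
samtools view -b -s 42.10 region.bam > region_10pct.bam
samtools index region_10pct.bam
```

This would at least let you test whether SplitNCigarReads chokes on that locus in isolation, or whether extreme read depth there is what exhausts the heap.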

Thank you for your help!

GATK SplitNCigarReads • 4.1k views
I am asking here mainly about the third question (an alternative program). Thanks!
