cn.mops: negative widths error in referencecn.mops
7.4 years ago
wes3985 ▴ 10

I have been analysing samples for copy number variation using cn.mops. My samples are matched tumour/normal pairs, and several of them run fine, but one sample throws an error indicating that it has a negative width.

> ref_analysis_18 <- referencecn.mops(X[,4], X[,3],
+ norm=1,
+ I = c(0.025, 0.5, 1, 1.5, 2, 2.5, 3, 3.5, 4, 8, 16, 32, 64),
+ classes = paste0("CN", c(0:8, 16, 32, 64, 128)),
+ segAlgorithm="DNAcopy",
+ minReadCount = 10)

Normalizing...
Starting local modeling, please be patient...
Reference sequence:  chr1
Reference sequence:  chr2
Reference sequence:  chr3
Reference sequence:  chr4
Reference sequence:  chr5
Reference sequence:  chr6
Reference sequence:  chr7
Reference sequence:  chr8
Reference sequence:  chr9
Reference sequence:  chr10
Reference sequence:  chr11
Reference sequence:  chr12
Reference sequence:  chr13
Reference sequence:  chr14
Reference sequence:  chr15
Reference sequence:  chr16
Reference sequence:  chr17
Reference sequence:  chr18
Reference sequence:  chr19
Reference sequence:  chr20
Reference sequence:  chr21
Reference sequence:  chr22
Reference sequence:  chrM
Reference sequence:  chrX
Reference sequence:  chrY
Starting segmentation algorithm...
Using "DNAcopy" for segmentation.
Analyzing: Sample.1
Error in .Call2("solve_user_SEW0", start, end, width, PACKAGE = "IRanges") :
  solving row 4: negative widths are not allowed
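
For reference, `solve_user_SEW0` is an internal IRanges routine that resolves start/end/width when ranges are constructed, and it fires whenever an end coordinate precedes its start. A minimal sketch (with made-up coordinates, not from my data) that should trigger the same class of error:

```r
library(IRanges)

# Constructing a range whose end precedes its start implies a negative
# width, which IRanges rejects with a "negative widths are not allowed"
# error like the one above.
try(IRanges(start = 10, end = 5))
```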

It isn't clear what this means or how to solve it. Has anyone come across this before, and how did you fix it? What would be the best way to scan through the objects for any regions that may have negative widths? The objects look like this:

> head(X)
GRanges object with 6 ranges and 6 metadata columns:
      seqnames         ranges strand | g14_picard_dedup.sorted.bam
         <Rle>      <IRanges>  <Rle> |                   <numeric>
  [1]     chr1 [12098, 12258]      * |            391.994613612645
  [2]     chr1 [12553, 12721]      * |            541.874907052774
  [3]     chr1 [13331, 13701]      * |            1444.38485862918
  [4]     chr1 [30334, 30503]      * |            267.017507390261
  [5]     chr1 [35045, 35544]      * |            2000.09486969182
  [6]     chr1 [35618, 35778]      * |             488.37917154799
      fc14_picard_dedup.sorted.bam g18_picard_dedup.sorted.bam
                         <numeric>                   <numeric>
  [1]               408.3622635555            490.307259410992
  [2]             616.982115589288            554.925218146243
  [3]             1988.54667470504            1593.25195965544
  [4]             386.572182298586            292.507248320642
  [5]             2281.09869158028            1861.58913180793
  [6]             755.793003596296             439.00750591125
      fc18_picard_dedup.sorted.bam g21_picard_dedup.sorted.bam
                         <numeric>                   <numeric>
  [1]             359.585470088326            438.333507147116
  [2]             530.101430105641             596.92442096138
  [3]             1277.54444655459            1624.18970354608
  [4]             357.376714129553            328.960463137279
  [5]             1835.47620174078            1676.35223222769
  [6]             519.499401503528            421.506884991757
      fc21_picard_dedup.sorted.bam
                         <numeric>
  [1]             310.340399232444
  [2]             487.214990026718
  [3]              1204.1077911764
  [4]             342.411066794043
  [5]              1787.8587300249
  [6]             569.497308821124

Where X[,1] and X[,2] are a pair, X[,3] and X[,4] are a pair, etc. Only the middle pair throws this error. If anyone could post some code that would allow me to scan through a column to find a negative range, that would be a helpful start. Many thanks.
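
In case it helps, here is the kind of scan I had in mind (a sketch only; the toy `X` below stands in for the real GRanges object shown above). Note that IRanges normally refuses to *store* negative widths at construction, so the input is more likely to contain zero-length or unsorted windows that only trip things up later, during segmentation:

```r
library(GenomicRanges)

# Toy stand-in for the real X above (coordinates copied from the
# head(X) printout in the question).
X <- GRanges("chr1", IRanges(start = c(12098, 12553, 13331),
                             end   = c(12258, 12721, 13701)))

# Flag suspicious ranges: widths <= 0 can never actually be stored,
# but zero-length or single-base windows are worth inspecting before
# handing the object to a segmentation algorithm.
suspect <- which(width(X) <= 1)
if (length(suspect)) print(X[suspect])

# DNAcopy expects positions ordered within each chromosome, so also
# check sortedness and sort if needed.
if (is.unsorted(X)) X <- sort(X)
```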

next-gen cnv cn.mops
