Normalized Bigwig Files
4
14 months ago
vj • 390
UK

I am trying to normalise bigwig files (starting from BAM files) for a large number of ChIP-Seq datasets. Is there a well-agreed method for doing this? I know of normalising to RPM. Any suggestions?

bigwig • 11k views
9
13 months ago
Ian 5.4k
University of Manchester, UK

If you use bedtools genomecov you can use a scaling factor.

bedtools genomecov -ibam input.bam -bg -scale X -g genome.chrom.sizes > normalised.bg

where X is the scaling factor. For each sample, X could be 1,000,000 / mapped reads (RPM), or the mean mapped-read count across all samples / that sample's mapped reads (scaling every sample to the mean depth).

You can then use:

wigToBigWig -clip normalised.bg genome.chrom.sizes normalised.bw
0

Thanks. This seems to open up a lot of options.

2
2.4 years ago
Ryan Dale 4.8k
Bethesda, MD

pybedtools has a function that scales your BAM by million mapped reads (the scaling used by many ENCODE data sets) and creates a bigWig file all in one shot:

from pybedtools.contrib.bigwig import bam_to_bigwig
bam_to_bigwig(bam='path/to/bam', genome='hg19', output='path/to/bigwig')

More details in this answer: http://www.biostars.org/p/64495/#64680

0
15 months ago
sztankatt • 0

As of today, you can use deepTools for exactly this kind of task: https://deeptools.readthedocs.io/en/latest/index.html
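For example, deepTools' bamCoverage tool goes from BAM to a normalised bigWig in one step. The command below is a sketch based on the deepTools 3.x documentation; file names are placeholders:

```shell
# bamCoverage: BAM -> normalised bigWig in a single step.
# --normalizeUsing accepts RPKM, CPM, BPM or RPGC (deepTools 3.x).
bamCoverage -b input.bam -o normalised.bw --normalizeUsing CPM --binSize 10
```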
