Question: Normalized Bigwig Files

I am trying to normalise bigWig files (starting from BAM files) for a large number of ChIP-seq datasets. Is there a well-agreed method for doing this? I know of RPM normalisation. Any suggestions?

— 6.9 years ago by vj • 390 • updated 10 months ago by sztankatt • 0

If you use bedtools genomecov, you can apply a scaling factor:

bedtools genomecov -ibam input.bam -bg -scale X -g genome.chrom.sizes > output.bedGraph

where X is the scaling factor. For each sample, X could be 1,000,000 / mapped reads (RPM), or the mean mapped-read count across all samples divided by that sample's mapped reads.
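The two scaling schemes above can be sketched in a few lines of Python (the helper names are mine, not from the thread, and the mean-based scheme assumes the intended ratio is mean library size over each sample's library size, so deeper libraries are scaled down):

```python
def rpm_scale(mapped_reads):
    """Reads-per-million scale factor: 1,000,000 / mapped reads."""
    return 1_000_000 / mapped_reads

def mean_ratio_scale(mapped_reads, all_mapped):
    """Scale a sample toward the mean library size across samples.

    Assumes 'mean mapped reads / this sample's mapped reads' is the
    intended ratio (an assumption; the thread's wording is ambiguous).
    """
    mean = sum(all_mapped) / len(all_mapped)
    return mean / mapped_reads

# Example: three libraries with 10M, 20M and 30M mapped reads.
libs = [10_000_000, 20_000_000, 30_000_000]
print(rpm_scale(libs[1]))               # 0.05
print(mean_ratio_scale(libs[1], libs))  # 1.0
```

Either value can then be passed to bedtools genomecov via -scale.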

You can then use:

wigToBigWig -clip output.bedGraph genome.chrom.sizes output.bw
— 2.5 years ago by Ian 5.4k

Thanks. This seems to open up a lot of options.

— 6.9 years ago by vj • 390

pybedtools has a function that will scale your BAM coverage by million mapped reads (the scaling used by many ENCODE datasets) and create a bigWig file all in one shot:

from pybedtools.contrib.bigwig import bam_to_bigwig
bam_to_bigwig(bam='path/to/bam', genome='hg19', output='path/to/bigwig')

More details in this answer:

— 6.9 years ago by Ryan Dale 4.8k

As of today, you can use deepTools for exactly this kind of task:
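As a sketch, deepTools' bamCoverage goes from BAM to a normalised bigWig in one step (flag names assume deepTools ≥ 3; check bamCoverage --help for your version):

```shell
# BAM -> normalised bigWig in one command; CPM is counts per million
# mapped reads, equivalent to the RPM scaling discussed above.
bamCoverage --bam input.bam \
    --outFileName output.bw \
    --binSize 10 \
    --normalizeUsing CPM
```

Other --normalizeUsing choices (RPKM, BPM, RPGC) cover the common alternatives.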

— 10 months ago by sztankatt • 0

