I suppose the pivotal moment was the bold decision of ABI (then PE Biosystems) to sequence the human genome on their own using capillary sequencing machines, turning the Human Genome Project into a race. As I understand it, this was almost entirely due to the efforts of Hunkapiller and Venter. The Human Genome Project really floundered until that moment. We would still be waiting for a first draft of the human genome today had competition not been a factor, and there would have been no incentive to develop the new sequencers that followed. So I would attribute the advances to ego.
In my opinion, here are some factors that contributed to decreasing sequencing costs (some were already mentioned). Note that not every point applies to every technology.
Parallelization of sequencing reactions. Platforms such as Illumina, IonTorrent or Nanopore use approaches that can easily be parallelized (clusters of oligos, microchips and nanopores, respectively). This makes it possible to run a very large number of sequencing reactions in parallel in a relatively small instrument. With Sanger sequencing, parallelizing reactions requires a huge number of individual sequencers.
Less expensive chemistry. With most of the newer technologies (e.g. IonTorrent, Nanopore, 454), sequencing libraries do not rely on modified nucleotides (ddNTPs), which are expensive. I don't have any numbers to provide you with, but I guess this can be found online. This also (sometimes) means shorter library preparation protocols, which reduces the labor cost. Some technologies, such as Oxford Nanopore Technologies (platforms not yet commercialized), do not even need much expensive chemistry, since they mostly rely on the electric signature of nucleotides interacting with the nanopore. In general, doing without ddNTPs and/or dNTPs considerably reduces the cost of sequencing.
Increase in throughput. The decrease in sequencing cost is tightly linked to the increase in throughput of the "next generation" sequencers. In a single run (which can still take days or weeks), one machine can sequence several gigabases of DNA, while previous technologies (i.e. Sanger-based methods) were more in the order of megabases.
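To make the throughput point concrete, here is a back-of-the-envelope sketch. All numbers below are invented, order-of-magnitude placeholders purely for illustration (not actual platform prices or yields); the point is only that per-base cost scales with cost-per-run divided by bases-per-run, so a run can be far more expensive yet still much cheaper per base.

```python
# Hypothetical, order-of-magnitude numbers purely for illustration.
runs = {
    # platform label: (assumed cost per run in $, assumed throughput in Mb per run)
    "Sanger-style capillary run": (100, 0.1),
    "Short-read NGS flow cell": (10_000, 10_000),
}

for name, (cost_usd, throughput_mb) in runs.items():
    per_mb = cost_usd / throughput_mb  # dollars per megabase
    print(f"{name}: ${per_mb:,.2f} per megabase")
```

Under these made-up figures the NGS run costs 100x more but delivers 100,000x the bases, so the per-megabase cost drops by three orders of magnitude.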
Absence of fluorescent signal detection. Sanger, Illumina and 454 technologies need to detect a fluorophore signal to know which nucleotide was incorporated during the sequencing reaction. Lasers and scanners that meet the accuracy requirements are very expensive. That, in part, explains the price gap between an Illumina machine (fluorophore-based detection, ~$500k) and platforms such as IonTorrent (from ~$50k per machine; a microchip detects the incorporated nucleotides) or ONT platforms (announced price of ~$30k per cluster; nucleotide detection likewise relies on an electric signal).
I would say a lot of factors contributed to the decline in costs. Since we are talking about sequencing, these would mostly be experimental and chemical advances, along with the engineering side of things. Computational advances may have helped, but I would define sequencing only in terms of the molecular/physical strategies used to identify nucleotides. It would be hard to quantify each factor, since they vary across platforms and methodologies. Another factor, outside the science realm, is the law of supply and demand: now that more companies are working in this space, there is tight competition to bring out the next sequencing technology that scientists will find appealing and invest their money in.
I would say automation. The robotics and workflows that just keep pumping new input into the system. Obviously each subsequent increase in the performance of sequencing tech added on, and will keep doing so. But primarily, I'd say automation.
I wonder how many people here ever loaded a sequencing gel... and sat in the back of the lab with a colleague reading off the bases. Good times. And a giant time sink.