Unfortunately there are many variables even in just downloading and updating files, such as:
Many small files take longer to download and update than fewer large files of the same total size, because every file carries its own per-request and filesystem overhead. This alone makes the "total bytes downloaded vs total bytes to download" calculation inaccurate.
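To make that concrete, here's a toy model (my own illustration, not anything from a real downloader; the bandwidth and per-file overhead figures are assumptions) showing how the same byte count can mean very different transfer times:

```python
# Toy model: per-file overhead dominates when there are many small files.
# bandwidth (bytes/s) and per_file_overhead (s) are illustrative assumptions.

def transfer_time(total_bytes, file_count, bandwidth=10_000_000, per_file_overhead=0.05):
    """Rough time estimate: raw transfer time plus a fixed cost per file
    (request round-trip, open/close, filesystem metadata)."""
    return total_bytes / bandwidth + file_count * per_file_overhead

one_big = transfer_time(100_000_000, 1)          # one 100 MB file
many_small = transfer_time(100_000_000, 10_000)  # 10,000 x 10 KB files

print(f"one large file  : {one_big:.1f}s")
print(f"many small files: {many_small:.1f}s")
# Same byte count, wildly different wall-clock time, so
# bytes_done / bytes_total is a poor proxy for time remaining.
```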
In order to optimise the delivery of small files, they are sometimes packaged up into one larger file. This larger file then needs to be unpackaged / decompressed to extract the smaller files, and the speed of this depends on the contents, any encryption, and finally the creation of the many smaller files on disk.
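As a rough sketch of that unpackaging step, assuming the bundle arrives as an ordinary .zip archive (the paths here are hypothetical): extraction progress has to be tracked per file, because its speed depends on the archive contents rather than on bytes already downloaded.

```python
import zipfile

def extract_with_progress(archive_path, dest_dir):
    """Unpack a downloaded bundle, reporting per-file progress."""
    with zipfile.ZipFile(archive_path) as zf:
        members = zf.infolist()
        for i, member in enumerate(members, start=1):
            # Cost per file varies: compressed size, compression method,
            # and the expense of creating many small files on disk.
            zf.extract(member, dest_dir)
            print(f"extracted {i}/{len(members)}: {member.filename}")

# extract_with_progress("update_bundle.zip", "install_dir")  # hypothetical paths
```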
Many download systems support on-the-fly compression. Compressing files in this way means the total bytes downloaded can vary wildly from the total bytes expected once decompressed, and some files just don't compress very well anyway.
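A quick sketch of that mismatch, assuming a gzip-compressed payload already on disk (the file names are placeholders): counting compressed bytes in against decompressed bytes out shows how far the two figures drift.

```python
import zlib

def stream_decompress(gzip_path, out_path, chunk_size=64 * 1024):
    """Stream-decompress a gzip file, tracking both byte counters."""
    # wbits = MAX_WBITS | 16 tells zlib to expect gzip framing
    decomp = zlib.decompressobj(wbits=zlib.MAX_WBITS | 16)
    bytes_in = bytes_out = 0
    with open(gzip_path, "rb") as src, open(out_path, "wb") as dst:
        while chunk := src.read(chunk_size):
            bytes_in += len(chunk)      # what the progress bar sees downloaded
            data = decomp.decompress(chunk)
            bytes_out += len(data)      # what actually lands on disk
            dst.write(data)
        tail = decomp.flush()
        bytes_out += len(tail)
        dst.write(tail)
    print(f"{bytes_in} compressed bytes in -> {bytes_out} bytes on disk")
```

The ratio between those two counters is different for every file, which is exactly why "bytes downloaded" and "bytes to write" refuse to line up.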
This is all before the problem of actually updating and checking files comes into play. For example, a good downloader will run a CRC check of some form on each downloaded file against the CRC it expected, and double-check the CRC as well just for good measure. Again, the CRC check time varies depending on the number and size of the files involved; with too many files, the CRC check files themselves become a notable download issue all on their own.
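Something like the following is all the CRC step amounts to, assuming the updater ships an expected CRC-32 per file (the manifest format is hypothetical); the point is that verification re-reads every byte, so it costs real time per file:

```python
import zlib

def crc32_of(path, chunk_size=64 * 1024):
    """Compute the CRC-32 of a file by streaming it from disk."""
    crc = 0
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            crc = zlib.crc32(chunk, crc)  # every byte gets re-read here
    return crc & 0xFFFFFFFF

def verify(path, expected_crc):
    """Raise if the downloaded file doesn't match its expected CRC."""
    actual = crc32_of(path)
    if actual != expected_crc:
        raise ValueError(f"{path}: got {actual:#010x}, expected {expected_crc:#010x}")
```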
And this is before the actual patching of files takes place. Given the ball-ache of anything of a .NET / COM / ActiveX nature, the scanning, registry mangling and horrible version-control attempts add an utterly unpredictable amount of time to the process, particularly when many libraries are inter-dependent and a full transactional update is required, because a single atomic update just won't do the job.
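One common way to approximate a transactional update is to stage everything in a side directory and swap at the end; this is only a sketch under that assumption (the paths and rollback handling are illustrative), and it still doesn't cover locked or registered libraries:

```python
import os
import shutil

def transactional_update(app_dir, staged_dir):
    """Replace app_dir with staged_dir in an all-or-nothing fashion."""
    backup = app_dir + ".old"
    os.rename(app_dir, backup)          # step 1: park the current version
    try:
        os.rename(staged_dir, app_dir)  # step 2: promote the staged version
    except OSError:
        os.rename(backup, app_dir)      # roll back if the swap fails
        raise
    shutil.rmtree(backup)               # commit: discard the old version
```

Even then, the two renames aren't truly atomic, which is exactly why real installers end up wrapped in such horrible machinery.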
I'm so glad I don't have to calculate the install times... :)