jpo3136b Posted February 25, 2010 <p>Ever see those notices that tell you how long some task will take? [4 seconds, 497 hours, 12 minutes] They can fluctuate quite a bit, to the point that the time-to-completion predictions are really inaccurate.</p> <p>I suppose the computers are projecting based on what has happened in the recent past.</p> <p>What do they count as the recent past when making these time-to-completion predictions?<br> What is that frame of reference called?</p>
leicaglow Posted February 25, 2010 <p>John, what are you talking about in particular? If it's something like a download or software install, it's just simple math, and it can vary depending on the vendor's calculation.</p>
Matt Laur Posted February 25, 2010 <p>Some tasks - especially those that involve the compression and transfer of data - can defy accurate time-to-complete prediction because there are unknowables (such as the nature of the rest of the data, and how compressible it all is ... or what sort of bandwidth bottlenecks will come and go over time). The farther along a given process gets, the more accurate the estimate tends to become, which is why it can swing wildly early on.<br /><br />When you're copying a folder full of files - say, 10,000 of them - and all of them are small but for a couple of huge ones, odd things can happen. The system can evaluate all of that up front (so that it can make very accurate predictions), but that also adds time to the actual task of copying the files. Compression routines have the same problem. Time spent in advance to predict the size of the job is time not spent getting the job done. So programmers decide how much of a guess to make, in a balancing act that is sometimes frustrating to end users.</p>
sknowles Posted February 25, 2010 <p>It's usually, but not always, a short-term moving average, recalculated as the task runs. Downloads are relatively easy to estimate in real time from the transfer speed and the size of the file(s). Processing times are harder to estimate, since they depend on how many other applications are running and how much CPU and memory is available.</p>
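To illustrate the short-term moving average idea, here's a minimal sketch in Python. The function name, window size, and units are my own choices for illustration, not taken from any particular OS or download manager:

```python
from collections import deque

def eta_moving_average(speed_samples, remaining_bytes, window=5):
    """Estimate seconds remaining from the last `window` speed samples
    (in bytes/sec). A short window tracks recent throughput closely;
    a longer one smooths out momentary spikes and lulls."""
    recent = deque(speed_samples, maxlen=window)  # keep only the newest samples
    if not recent:
        return None  # nothing measured yet
    avg_speed = sum(recent) / len(recent)
    if avg_speed <= 0:
        return float("inf")  # stalled: no meaningful estimate
    return remaining_bytes / avg_speed
```

The window size is exactly the "frame of reference" being asked about: how far into the recent past the average looks before recalculating.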
py-photography Posted February 25, 2010 <p>I don't think there is a frame of reference ... it's a WAG</p>
hal_b Posted February 25, 2010 <p>The worst estimates are based on the instantaneous speed only. If there is a sudden lull in transfer (or processing), the speed drops to near zero. When you divide the remaining portion of the task by a speed near zero, the remaining time approaches infinity. You see this often while downloading. If you are downloading a 5GB file and the speed hangs at 0 kB/s, your time will increase drastically, sometimes into the years (i.e. 2 years, 6 months remaining) if your download manager can handle it without timing out.</p> <p>A smarter way to calculate time remaining is to record the time elapsed so far and assume the same average rate over the remainder of the task. Certain programs/tasks use this logic. The effect is that if you are performing a 20 minute task, and you pause it halfway through for, say, 30 minutes, when you resume the time remaining will be 40 minutes. The logic assumes that the second half will take the same amount of time as the first half. The end result is a slight overestimate, instead of a drastic overestimate.</p> <p>The difference is really a matter of the size of your sample. As the sample pool increases, so does the accuracy. If you only take a single, instantaneous sample, your resultant estimate only has meaning for the immediate future. If you take an average of 100% of all past samples, you get the best possible indicator of long-term future performance, if not immediate future performance. The effectiveness of the method really depends on the scale of the task. Larger tasks benefit more from thorough sampling, as the task is not about to be completed regardless of the instantaneous speed. A very short task, however, could easily be completed in seconds if the speed is high enough, so the only effective estimate might be the instantaneous speed. With a short task, you don't have the luxury of extensive time sampling.</p> <p>This suggests that the programs dictating the time estimation for download tasks are geared toward small files and quick download rates. The estimates become meaningless as the file size increases and as the transfer rate fluctuates.</p>
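The two estimators described above can be sketched side by side. This is a toy illustration with hypothetical function names, not code from any real download manager:

```python
def eta_instantaneous(current_speed, remaining):
    """Divide remaining work by the latest speed sample alone.
    When the transfer stalls (speed near zero), this blows up toward
    infinity - the '2 years, 6 months remaining' effect."""
    if current_speed <= 0:
        return float("inf")
    return remaining / current_speed

def eta_overall_average(elapsed_seconds, fraction_done, fraction_remaining):
    """Assume the rest of the task proceeds at the average rate so far:
    time_remaining = elapsed * (remaining / done). Pauses inflate the
    estimate, but only modestly, because they are averaged in."""
    if fraction_done <= 0:
        return None  # no progress yet; rate is undefined
    return elapsed_seconds * fraction_remaining / fraction_done
```

The pause example from the post falls out directly: a 20-minute task paused for 30 minutes at the halfway point has 40 minutes elapsed with half the work done, so the overall-average estimator reports 40 minutes remaining.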
carbon_dragon Posted February 25, 2010 <p>There's a cartoon I saw, something like the writer of the Windows progress dialog calling his wife to say when he'd be home for dinner. "Hello Marge, yeah I'll be there in half an hour ... err... no 10 minutes ... no 5 hours ... no 3 days ... no 4 minutes..."</p>
photomark Posted February 25, 2010 I got this one a few months ago. I think that's about a quarter million years<div>[ATTACH=full]451279[/ATTACH]</div>
jpo3136b Posted February 26, 2010 Author <p>So, the sample size and the interval for taking the samples used to project the time to completion is not some standard setup. I would have thought it'd be something like: based on what's been done in the last five seconds of each minute, how many minutes remain, going by the completed percentage or whatever.</p> <p>Now that I think about it, I've never seen anyone disclose the terms under which they've made the projections in those boxes. It may be a feel-good placebo. I suppose it's done when it's done.</p> <p>It could be whatever they set the math up to do, then.</p> <p>Thanks for "short-term moving average." I wouldn't have known how to describe this.</p> <p>Sorry for the dumb questions, but if I can't figure out what something is called, it's tough to use the internet to help myself. Thanks.</p>
alec_myers Posted February 26, 2010 <p>It's relatively straightforward to do an exponentially weighted rolling average. However, I've noticed that when copying files from one drive to another Windows actually uses its random number generator instead.</p>
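An exponentially weighted rolling average needs no sample history at all - just one running value that each new sample nudges. A minimal sketch (the class name and the default smoothing factor are my own choices for illustration):

```python
class EwmaEta:
    """Exponentially weighted moving average of transfer speed.
    alpha near 1 reacts quickly (jumpy estimates); alpha near 0
    smooths heavily (slow to notice genuine rate changes)."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.speed = None  # running average, in bytes/sec

    def update(self, sample):
        """Fold one new speed sample into the running average."""
        if self.speed is None:
            self.speed = sample  # seed with the first observation
        else:
            self.speed = self.alpha * sample + (1 - self.alpha) * self.speed
        return self.speed

    def eta(self, remaining_bytes):
        """Seconds remaining at the current smoothed rate."""
        if not self.speed:
            return float("inf")
        return remaining_bytes / self.speed
```

Unlike a windowed moving average, this stores a single number regardless of how long the transfer runs, which is part of why it's a popular choice for progress estimators.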