It struck me this afternoon.
How much uptime do you need?
It bothers me that our “always on” (or should that be ‘always online’?) society gets a bit obsessed with the degree of uptime you need. That is, the amount of time you can access your applications, email, gizmos etc., usually measured as a percentage. After all, when we’re all asleep, does it matter that we can’t access our email, apps and so on?
Anyway, it struck me that in the cloud computing part of the IT industry the talk is all about 99.99% uptime. But what does that really mean? And with that box in the corner (remember the server?), what uptime did we get before? What are the stats on that? So after a drive home from work pondering such huge issues, I spent a bit of time researching these items and here is what I came up with…
99.9% = your service is down for 0.73 of an hour… so that’s about 44 minutes in every month of operation, assuming a 24-hour service. As my Dad says about computers: ‘you want the PC to take either 500 msecs to respond or 15 minutes… ’cos you can’t make a cup of tea in 500 msecs…’ Wise words indeed.
99.99% = your service is down for 0.073 of an hour… so that’s about 4.4 minutes every month… quandary… can you make a cuppa in that time? But then, do you need a cuppa at 3am when the service is down?
So assuming that we only want our service between 7am and (say) 7pm on weekdays, there are far fewer service hours in a month (roughly 260 instead of 730), and at 99.99% uptime we’re down to more like 1.5–2 minutes of downtime. But then you can’t really predict when the downtime will fall in any 24-hour period.
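For anyone who wants to play with the arithmetic above, here’s a quick sketch in Python. The function name is mine, and the hour counts are the rough assumptions used in this post (about 730 hours in a 24x7 month, about 260 hours for a weekday 7am–7pm service):

```python
def downtime_minutes_per_month(uptime_percent, service_hours_per_month):
    """Allowed downtime in minutes for a given uptime percentage."""
    downtime_fraction = (100.0 - uptime_percent) / 100.0
    return downtime_fraction * service_hours_per_month * 60.0

# 24x7 service: roughly 730 hours in an average month
print(round(downtime_minutes_per_month(99.9, 730), 1))   # ~43.8 minutes
print(round(downtime_minutes_per_month(99.99, 730), 1))  # ~4.4 minutes

# Weekdays 7am-7pm only: 12 hours x ~21.7 weekdays, roughly 260 hours
print(round(downtime_minutes_per_month(99.99, 260), 1))  # ~1.6 minutes
```

Which confirms the cuppa problem: at 99.99% on a 24-hour service you get barely four and a half minutes a month to boil the kettle.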
Now this is where I came unstuck… trying to find any sort of figures on the web to give a comparison against Local Area Network file servers. I did a lot of Google searching and couldn’t find anything that was of any use for comparing figures.