
Tuesday, December 28, 2010

Tim Browning: A Review of the Cloud Computing Article "Optimal Density of Workload Placement"


Bottom line: a cloud computing resource is really a data center with virtualized components. A GUI front end to an outsourcing arrangement.

Maybe the only true “cloud computing” takes place in aircraft. Although even that is debatable.





The Cloud Hype in the paper:

The author proclaims that cloud computing “is not simply the re-branding and re-packaging of virtualization”… then proceeds to show that it is just that. He also states that capacity planning’s use of “trend-and-threshold” analytics is not useful in cloud infrastructure, yet he defines ‘strategic optimization’ as “proactive, long-term placement of resources based on detailed analysis of supply and demand (compacting)”. Am I to assume he does not understand that ‘supply’ is a threshold – we only have a finite amount of it – and that ‘long-term’ implies a trend?

He also states:

“Rather than the trend-and-threshold model of planning that is typically employed in legacy physical environments, this new form of planning [my emphasis] is based on discrete growth models (at the VM and/or workload level) and the use of permutations and combinations to determine when to rebalance, when to add or remove capacity, and how the environment will respond to different growth, risk and change scenarios.” 

So, I ask myself, what’s new about ‘discrete growth models’? Where does he get the “growth, risk and change scenarios” (wait, don’t tell me… from trend-and-threshold thinking)? Maybe he is being discreet about the discrete models (thus avoiding being discreetly indiscrete)?

Permutations and combinations say nothing about end-state solutions relative to (long- or short-term) time-series load patterns. They are time-static, so ‘when to add or remove’ is not part of those computational functions. Perhaps what he means is which arrangement is ‘best’? Or perhaps he is thinking of ‘on demand’ capacity, wherein capacity planning is replaced by ‘instant’ capacity in response to ‘change’? Which is to say, there is no planning… just rapid and efficient deployment of some kind of limitless, unseen capacity?

What is ‘new’ about combinations and permutations? The newest development I know of in this area is perhaps combinatorial optimization, which consists of finding the optimal solution to a mathematical problem in which each solution is associated with a numerical “cost”. It operates on optimization problems in which the set of feasible solutions is discrete, or can be reduced to a discrete set (in contrast to continuous), and in which the goal is to find the best (lowest-cost) solution. (These techniques grew out of linear and integer programming in operations research in the early ’70s, and the lowest-cost criterion is analogous to the root-mean-square-error criterion used to evaluate competing forecast models, whether neural-network or statistical.)
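For the record, here is a minimal sketch of what “combinatorics as placement” actually buys you: a brute-force search over every possible assignment of workloads to hosts, scoring each candidate with a cost function. The workload demands, host capacities and cost weights below are invented purely for illustration.

```python
from itertools import product

# Hypothetical peak CPU demands ("units") for four workloads, and the
# capacities of two hosts -- illustrative numbers only.
demands = {"web": 30, "db": 55, "batch": 25, "etl": 40}
capacity = {"host1": 80, "host2": 80}

def cost(assignment):
    """Penalize overcommitted hosts heavily, then prefer balanced loads."""
    load = {h: 0 for h in capacity}
    for wl, host in assignment.items():
        load[host] += demands[wl]
    overcommit = sum(max(0, load[h] - capacity[h]) for h in capacity)
    imbalance = max(load.values()) - min(load.values())
    return 1000 * overcommit + imbalance

# Exhaustive search over every workload-to-host assignment (2^4 of them here).
best = min((dict(zip(demands, combo))
            for combo in product(capacity, repeat=len(demands))),
           key=cost)
print(best, cost(best))
```

Note that the search is a static snapshot of a single point in time; “when to rebalance” only enters the picture if you feed it forecast demands – which is precisely the trend part the author disowns.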

So, knowing how many ways you can combine 887 disks on the same I/O path (combinatorics) tells me when to add or remove some, if referenced to a discrete growth model? Wow… yes, that is so NEW… well, for 1968, maybe.
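To put a number on that (the 887 disks being my own hypothetical above, and the subset size of 10 an arbitrary choice):

```python
from math import comb

# Ways to place 10 of 887 disks on one I/O path: an enormous, time-static count
# that says nothing about *when* capacity should be added or removed.
print(comb(887, 10))
```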

Subsequently, he states:

 “the natural changes in utilization over time caused by organic growth will tend to push the limits on the configured capacity. Furthermore, the ability to configure capacity is relatively new to IT, and there are typically no existing processes in place to catch misallocation situations.” 

Perhaps the ability to “configure capacity” is new to him; it is in no way new to enterprise IT. So trend – a legacy term – is not, per the author, ‘changes in utilization over time’, and ‘configured capacity’ is not a threshold? There are ‘no existing processes in place’ to catch misallocation situations? What? None? I suppose by ‘misallocation situation’ he means that a capacity shortfall isn’t a capacity issue, it’s an “allocation issue”. Somewhere – over the rainbow – there is capacity going to waste, but it’s not available for some reason. It’s just been ‘misallocated’. Sort of… misplaced. We must go find it. Instantly.
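Since ‘changes in utilization over time’ measured against ‘configured capacity’ is apparently not trend-and-threshold analysis, here is a minimal sketch of that supposedly useless legacy technique: a linear trend fitted to an invented monthly utilization series and projected forward to a made-up capacity threshold.

```python
import numpy as np

# Hypothetical monthly average CPU utilization (%) and a configured capacity threshold.
util = np.array([42, 44, 47, 49, 52, 55, 57, 60, 63, 65, 68, 71])
months = np.arange(len(util))
threshold = 85.0

slope, intercept = np.polyfit(months, util, 1)        # the "trend" part
months_to_threshold = (threshold - util[-1]) / slope  # the "threshold" part

print(f"growth ~{slope:.1f} points/month; "
      f"configured capacity reached in ~{months_to_threshold:.0f} months")
```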

OK… So, do I like anything about this paper?

Some ideas in the paper I DO like:

Workload density – the degree of consolidation of work into one image (of the OS) – is a cool concept where ‘contention for resources’ is a boundary condition for ‘workload placement’. How is this done? “Contention probability analysis”, which involves analyzing the operational patterns and statistical characteristics of running workloads in order to determine the risk of workloads contending for resources. The author uses the phrase “patterns and statistical characteristics”. So, in effect, ‘contention probability analysis’ is a ‘trend-and-threshold’ technique (although he thinks it isn’t). I am surprised he didn’t also rebrand ‘statistical cluster analysis’ as something new and revolutionary, hot from the computer science labs – yet another form of blessed combinatorial optimization. Where this idea has been usefully applied at KC: SAP batch workload time density – the degree of consolidation of batch work into the same time intervals. In this case a boundary condition for workload ‘time placement’ would examine workload (demand) leveling and distribution to avoid unnecessary spikes for time-movable workloads.
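For what it’s worth, ‘contention probability analysis’ reduces to something like the following: take historical utilization samples for two workloads and estimate how often their combined demand would exceed a shared host’s capacity. The sample data and capacity below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical hourly CPU demand samples (%) for two workloads over one month.
wl_a = rng.normal(35, 10, 720).clip(0, 100)
wl_b = rng.normal(40, 15, 720).clip(0, 100)
host_capacity = 90.0

# Empirical probability that co-located demand exceeds the host's capacity --
# i.e. the "risk of workloads contending for resources".
p_contention = np.mean(wl_a + wl_b > host_capacity)
print(f"estimated contention probability: {p_contention:.1%}")
```

Which is, of course, a threshold (the capacity) applied to statistical patterns over time (the samples).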

Another idea I like:

He suggests that workloads are best characterized by their statistical properties, rather than by “up front descriptions of their demand characteristics”. Thus workloads are ‘placed’ using segmentation of the resource demand profiles (to avoid imbalance, etc.). Which is to say, workloads are aggregations of activity with common ‘demand characteristics’. In queuing theory, classifying incoming transactions into resource-based profiles, and using those profiles for priority dispatching against an array of appropriately resource-mapped servers, will always produce a better process model – in terms of throughput and average response times – than a queuing network where transactions are not classified by resource requirements. This was the basis for batch initiator job class definitions in the mainframe world of the 1970s. It worked then. It will work for ‘clouds’ too.
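The queuing-theory point is easy to illustrate with textbook formulas. Compare two identical servers fed by a random 50/50 split of mixed traffic against the same two servers with class-based routing (light transactions to one, heavy to the other). The arrival rates and service times below are invented, and the arithmetic is the standard Pollaczek–Khinchine (M/G/1) and M/M/1 response-time formulas – not anything taken from the paper.

```python
# Two transaction classes with exponential service times within each class.
# (a) random 50/50 routing of all traffic -> each server is an M/G/1 queue
# (b) class-based routing: light -> server 1, heavy -> server 2 -> each is M/M/1
lam_light, s_light = 8.0, 0.05   # arrivals/sec, mean service time (sec)
lam_heavy, s_heavy = 2.0, 0.30

def mg1_response(lam, es, es2):
    """Mean response time of an M/G/1 FCFS queue (Pollaczek-Khinchine)."""
    rho = lam * es
    return lam * es2 / (2 * (1 - rho)) + es

# (a) mixed: each server sees half of each class
lam_mix = (lam_light + lam_heavy) / 2
p_light = lam_light / (lam_light + lam_heavy)
es_mix  = p_light * s_light + (1 - p_light) * s_heavy
es2_mix = p_light * 2 * s_light**2 + (1 - p_light) * 2 * s_heavy**2
w_mixed = mg1_response(lam_mix, es_mix, es2_mix)

# (b) classified: M/M/1 is the exponential special case (E[S^2] = 2 E[S]^2)
w_light = mg1_response(lam_light, s_light, 2 * s_light**2)
w_heavy = mg1_response(lam_heavy, s_heavy, 2 * s_heavy**2)
w_classified = (lam_light * w_light + lam_heavy * w_heavy) / (lam_light + lam_heavy)

print(f"mixed routing:       {w_mixed * 1000:.0f} ms average response")
print(f"class-based routing: {w_classified * 1000:.0f} ms average response")
```

With these made-up numbers, class-based routing cuts the overall average response time by roughly a quarter, because short transactions no longer queue behind long ones.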

The only ‘up front descriptions of demand characteristics’ that I know of would be the results of demand/performance modeling and/or LoadRunner-type benchmarking. This is still useful for ‘start-state’ sizing of the target landscape.

So… bottom line: interesting concepts, or ‘new ways of conceptualizing’ the functional parametric states of virtualized landscapes. Suggestions (but no concrete explanations) that combinatorial optimization techniques can be used for capacity planning (implying they are not being used now). Interesting and useful applications of event densities and statistical profiling.

It seems so important, especially to vendors, to reinvent the wheel – a legacy object – with their services or products, and to suggest that they have superior knowledge of all things new and different, and that these new and different things are not ‘legacy’. After all, in vendor gadget technology what isn’t ‘new’ is ‘bad’, and ‘if it works, it’s out of date’. Thus, legacy means ‘bad’ because it’s not ‘new’ (even if it uses new components) and, most importantly, it’s not what they are selling.

Just because “2 + 2 = 4” is legacy math, i.e. old and thus bad, it doesn’t mean that it’s no longer true in cloud math. It is still true, but it needs to be repackaged.

So, in the interest of actionable market relevance, here is a new, fresh, cloud-hyped-up version of “2 + 2 = 4”:

“It has been newly (re)discovered that ‘2 + 2 is optimally 4’ and exceptionally relevant for business purposes. The scope of this process is enhanced for sufficiently configured integer values of {2, 4} in a dynamic, web-enabled, high-definition virtual presence wherein it has locality of reference within the set of all integer number segments of the arithmetic cloud infrastructure. This will provide a competitive edge to your business, as newly revealed by the appropriate cloud-centric data mining tools (c1, c2, … cn) – with price guarantees, if you act now! – at current release, version and maintenance levels in dynamic optimal adaptive combination. This fabulous offering is expertly administered under the guidance of cloud-certified analysts, at an attractive hourly rate, who are not now, nor ever have been, legacy experts and are thus ‘new’ and ‘fresh’, with exciting social networking added-value potential. (Please join us on the Facebook group “I like integer addition with cloud computing”.)”
                                   

Of course, I might be preaching to the choir (rather than to the clouds) on this one. It seems, nevertheless, that corporate IT vendors demonstrate a kind of ‘math neurosis’:

A math-psychotic does NOT believe that 2+2=4.
A math-neurotic knows that 2+2=4 is true, but hates it. It must be repackaged for resale and aggressively marketed with a customer-focused strategy.

If mathematics is the art of giving the same name to different things (J. H. Poincaré), then IT marketing is the art of giving a new name to the same things and using pretty charts.

  
THE theologically orthodox axiom for information technology services/product vendors:

"Absolutum Obsoletum"
(Tim-Latin, translated: “If it works, it’s out of date”).









How to make 3 mice out of 2 mice by making 2 = 1:

[image]
(Posted with Tim Browning’s permission)

2 comments:

  1. Here is my first response to Tim's review (sent via e-mail):

    "That is brilliant!

    I wish he could critique one of my papers... "

    Then I shared my experience with cloud computing, which is here:
    http://itrubin.blogspot.com/2010/12/cloud-computing-capacity-management.htm

  2. Thanks Igor! You are so kind. That was actually an internal paper for a few co-workers and managers where I work...thus, I felt a need to make it more entertaining and informal than a conference or journal paper. Some of the managers were drooling over the 'cloud' computing hype and asked that I review and respond.
