Every IT organization (and every IT vendor, for that matter) has one of those performance geeks who answers every question with "It depends." When I designed storage systems, I tried to justify implementation choices with our performance team. I would ask our performance guru (Malcolm) specific questions about I/O workloads, but I could never get a straight answer out of him, so I stopped asking!

Dave Vellante and his Wikibon team are publishing interesting research on the use of the Amazon public cloud in the enterprise. One article in particular discusses speed and agility using the AWS approach. The conclusion: 'However, for complex applications with significant interdependencies, this ratio will decline substantially and in some cases flip in the negative direction. For customers the reality is, "It depends."'

As customers build hybrid data centers that mix private and public cloud implementations, what percentage of their data center will be public versus private? What are the rules for determining which parts of the infrastructure are public and which are private? The answer, of course, is that it depends. Wikibon has also published a fascinating use case that highlights the volatility of the private/public phenomenon: Zynga moved from a 100% private implementation to a public (AWS) approach, and then modified the public approach by re-introducing private infrastructure. I encourage you to take a deeper look at the current (and future) Wikibon research in this area.

I also encourage you to consider the data analytics questions that arise from constantly shifting public/private assets: Do you think companies like Zynga disabled their ability to run analytics as they shifted from private to public and back again? Did they have to rewrite their analytic models to fit each new architecture? The beauty of the hybrid approach is that (in theory) it should improve "analytic correctness." Correctness is correlated with data volume and variety: the enterprise that can run analytics across private data AND massive amounts of public (or partner) data will win.

A Pivotal Decision

Providing a platform that allows the enterprise to abstract away the hybrid cloud balance is the answer. Analytic models should be written without regard for data center plumbing or service provider choice (a rough sketch of this idea follows below). CIOs can derive much greater value and more innovative output if their programmers can develop such a framework. They will be presented with a variety of architectural options for such a framework, and there are two main concerns that need to be overcome:
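As a purely illustrative sketch (not a description of any Pivotal design), the snippet below shows one way an analytic model might be written against a cloud-agnostic data interface, so the same code runs whether the records live in a private warehouse or a public object store. The DataSource interface, the two backends, and the avg_session_length model are hypothetical names invented for this example.

```python
from abc import ABC, abstractmethod
from typing import Dict, Iterable


class DataSource(ABC):
    """Cloud-agnostic read interface; the analytic code never sees the plumbing."""

    @abstractmethod
    def read(self, dataset: str) -> Iterable[Dict]:
        """Yield records from the named dataset."""


class PrivateWarehouseSource(DataSource):
    """Hypothetical backend for an on-premises store (e.g. a SQL warehouse)."""

    def read(self, dataset: str) -> Iterable[Dict]:
        # A real implementation would query the private warehouse;
        # canned rows keep the sketch self-contained and runnable.
        yield {"user": "a", "session_seconds": 310}
        yield {"user": "b", "session_seconds": 95}


class PublicObjectStoreSource(DataSource):
    """Hypothetical backend for a public cloud object store."""

    def read(self, dataset: str) -> Iterable[Dict]:
        yield {"user": "c", "session_seconds": 220}


def avg_session_length(source: DataSource, dataset: str) -> float:
    """The analytic model: identical regardless of where the data lives."""
    total = count = 0
    for row in source.read(dataset):
        total += row["session_seconds"]
        count += 1
    return total / count if count else 0.0


if __name__ == "__main__":
    # Swapping public for private (or back again) is a one-line change here,
    # not a rewrite of the model.
    for backend in (PrivateWarehouseSource(), PublicObjectStoreSource()):
        print(type(backend).__name__, avg_session_length(backend, "sessions"))
```

The point of the sketch is simply that the model function never mentions a provider; moving data between private and public backends changes which DataSource is constructed, not the analytics written on top of it.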
The recently announced Pivotal Initiative is building a hybrid analytic framework using the following building blocks:
Each of these categories represents an area that I hope to explore in more detail in future posts. Balancing analytics on top of shifting hybrid assets will certainly be an area of focus for CIOs in 2013. The Pivotal portfolio has the right technologies (and people) to navigate this shifting landscape.

Steve
Twitter: @SteveTodd
EMC Fellow
