October 09, 2006
Is IT Virtualization Enough?


by Jeff Rowe - Contributing Editor
Each MCAD Weekly Review delivers to its readers news concerning the latest developments in the MCAD industry, MCAD product and company news, featured downloads, customer wins, and coming events, along with a selection of other articles that we feel you might find interesting. Brought to you by MCADCafe.com. If we miss a story or subject that you feel deserves to be included, or you just want to suggest a future topic, please contact us!

The scale and complexity of a growing number of enterprise IT infrastructures now equals or exceeds that of many national telecommunications carriers not that many decades ago, and the breadth of services offered to enterprise "clients" exceeds those offered by carriers to their subscribers — even in the recent past — by an order of magnitude. So said Udi Paret, a Silicon Valley enterprise software executive, at the Dow Jones Datacenter Ventures conference held recently in San Jose, CA. The challenge, then, said Paret, is to come up with a way of managing these massive infrastructures that concentrates on end-to-end service delivery, based upon streamlined, high-level models, rather than today's focus on individual resources — virtualized or real — on a domain-by-domain basis.


"Today's IT departments are drowning in a sea of details, as they try to cope with the explosive growth of their data centers, and the need for their services to be always available," said Paret, who is president of Fusion Dynamic, Inc. "The recent shift towards virtualization — virtual servers, virtual networks and virtualized storage — is only compounding the problem. Although a needed level of flexibility and dynamism results from being able to represent, re-partition, or combine physical resources into abstract ones, the total number of entities to be managed continues to skyrocket. The focus is simply too tactical; the level of abstraction is too low."


What Paret means is that a business unit manager planning a new e-commerce program, for example, does not think in terms of hard drives, routers, or firewalls — low-level abstractions — but in terms of the end-to-end service that needs to be delivered — an abstraction of the highest level. How it is implemented is of no interest. But what service levels are attained, how the service scales and adapts to changing business conditions, what levels of security are attainable — issues like these are crucial.


"Such a professional would like to be able to tell the IT manager: 'Double my capability for daily online sales transactions, I'll sacrifice service levels in self-guided support…' and have the entire data center architecture automatically rework itself," said Paret. "And there's the difficulty. Data centers, even cutting-edge dynamic data centers with virtualized resources throughout, simply do not have this type of service-oriented infrastructure."
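Paret's scenario is essentially a declarative, intent-based interface to the data center: the manager states a business objective, and the infrastructure reworks its resource allocations to match. As a purely illustrative sketch — the function names and the proportional-allocation policy here are my own invention, not anything Fusion Dynamic has described — trading capacity between services within a fixed server pool might look like this:

```python
# Hypothetical sketch: rebalancing a fixed pool of servers among services
# according to business-level demand, not per-device configuration.

def rebalance(allocations, total_servers, demands):
    """Scale each service's share of the server pool in proportion to
    its demanded capacity, keeping the total pool size fixed."""
    total_demand = sum(demands.values())
    new_alloc = {}
    for service, demand in demands.items():
        new_alloc[service] = round(total_servers * demand / total_demand)
    # Hand any rounding remainder to the highest-demand service.
    top = max(demands, key=demands.get)
    new_alloc[top] += total_servers - sum(new_alloc.values())
    return new_alloc

# "Double my capability for daily online sales transactions;
#  I'll sacrifice service levels in self-guided support."
current = {"online_sales": 40, "self_guided_support": 40, "other": 20}
demands = {"online_sales": 80, "self_guided_support": 20, "other": 20}
print(rebalance(current, total_servers=100, demands=demands))
```

The point of the sketch is the level of abstraction: the caller expresses relative business priorities, and the low-level question of which servers move between services is decided entirely behind the interface.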


Missing: A Data-Center-Wide Operating Environment

What is missing, said Paret, is a sort of "operating system for data centers" that can automatically reallocate hardware, software, systems, connections, and so on, in terms of high-level, service-driven objectives, as business conditions change or in response to problems or catastrophes, to keep service at guaranteed levels. Such an automated operating environment would feature dramatically simplified ways of provisioning and managing the data center, with the focus being on processes rather than on specific components and topologies. Behind the scenes, the operating environment would adroitly manage those components and topologies dynamically, demanding attention only when something had failed, and even then handling the failure transparently, thanks to the environment's self-healing capabilities.
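The self-healing behavior described above can be reduced to a simple idea: services are bound to abstract roles, and the operating environment remaps roles to healthy hardware when a component fails. The following is a minimal sketch under that assumption — the role names and the spare-pool mechanism are hypothetical, not Fusion Dynamic's actual design:

```python
# Hypothetical sketch of transparent failover: the service-level view
# (roles) never changes; only the role-to-server binding is repaired.

def heal(role_map, healthy, spares):
    """Remap any role bound to a failed server onto a spare server.
    Returns the repaired mapping; raises if the spare pool is empty."""
    repaired = dict(role_map)
    for role, server in role_map.items():
        if server not in healthy:
            if not spares:
                raise RuntimeError(f"no spare available for role {role!r}")
            repaired[role] = spares.pop()
    return repaired

roles = {"web_frontend": "srv01", "database": "srv02", "cache": "srv03"}
healthy = {"srv01", "srv03"}          # srv02 has failed
spares = ["srv10", "srv11"]

print(heal(roles, healthy, spares))   # database is remapped to a spare
```

From the business manager's perspective nothing happened: the "database" role still exists and still serves requests; only the hidden binding to physical hardware changed.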


The crucial difference between Paret's concept of an operating environment and the historical operating systems — virtual or physical — that control servers, routers, and other components in a data center is that the operating environment crosses all of those domains and controls them. "The only way you will achieve a true service-oriented infrastructure that delivers substantially improved operating economies; flexibility in design, provisioning, and operations; uncompromising resiliency; and an almost completely hands-off running model," said Paret, "is with a software infrastructure that is aware of — and can essentially control — all of the relevant resources in a data center, be they applications, servers, storage, or networks. This capability uniquely defines the data center 'operating environment.' If you additionally provide it with a high-level, abstract control structure that is based upon service delivery, then you have achieved a true service-oriented infrastructure."


Paret's company, Fusion Dynamic, recently introduced what it calls the world's first dynamic operating environment for data centers, one with the potential to transform a classical, resource-focused data center into a true, service-oriented dynamic data center capable of meeting the most demanding service-level goals.



Commentary

by Jeffrey Rowe, Editor


The concepts put forth by Mr. Paret remind me of something that was destined to be the "next big thing" just a few years ago: the application service provider, or ASP. A number of software companies and service providers got into the fray, but I was always confused about exactly what an ASP was: a service, a product, or a business model? Based on ASPs' relative lack of success, meaning customers, I was not exactly alone in my confusion.


As a quick refresher, ASPs are third-party entities that manage and distribute software-based services, including application software, to customers across a wide area network from a central data center. In essence, ASPs are a way for companies to outsource some or virtually all aspects of their IT needs.


In other words, an ASP is a company that offers access over the Internet to applications and related services that would otherwise have to be located on personal or enterprise computers. The "beauty" of this delivery model was to speed implementation, minimize the expenses and risks incurred across the application life cycle, and overcome the need for qualified in-house technical personnel.


An ASP can also be viewed as a company whose business model specializes in hosting software and/or hardware platforms and systems, and then making them available on a rental or lease basis — basically a subscription. An ASP arrangement allows for set-up and deployment without requiring the "user" to invest in the underlying technological infrastructure. Not a bad idea, but a lot more complicated than many of the companies that tried it cared to admit. The Web browser acted as a universal client interface and was intended to fuel the "on-demand software" market, a market that never really materialized. I think one of the biggest factors in ASPs' relative lack of acceptance was the "ownership" issue: who owns what, for how long, and for what purposes.


Admittedly, there are still ASPs around today, including companies offering MCAD applications. They're just not as widespread as they were forecast to be a few years ago.


OK, let's move ahead to Fusion Dynamic …


Based in Silicon Valley and Israel, Fusion Dynamic is something of a pioneer in a new type of ASP — the service-oriented infrastructure for data centers. The implementation is a software-based operating environment that dynamically links an organization's hardware and software resources into a flexible, adaptive, fault-tolerant, and self-healing infrastructure. This infrastructure permits data centers to be provisioned, managed, and maintained from a high-level, abstract, "application and service delivery" perspective, rather than from the low-level, physical-resource perspective that consumes huge amounts of personnel and capital in most data centers. The Fusion Dynamic infrastructure promises improved service delivery, reduced hardware requirements, lower operational complexity, and a commensurate reduction in capital and operational expenses. Remember, like the ASPs of old, this is all theoretical, but properly implemented, it could be quite feasible and cost-effective.


Fusion Dynamic's flagship product, the DynamicOE operating environment, is what provides the service-oriented infrastructure; the company says DynamicOE enables it for a broad range of servers, network appliances, and storage.


Will Fusion Dynamic succeed where most other ASP-based software and service provider companies failed? That remains to be seen, but the company's core concept seems solid. So, going back to the title of this story, "Is IT Virtualization Enough?", the answer is that virtualization by itself is not enough, but it is a critical piece of an increasingly complex IT puzzle.



The Week's Top 5


At MCADCafé we track many things, including the stories that have attracted the most interest from our subscribers. Below are the news items that were most viewed during the past week.



Dassault Systemes announced new enhancements to the ENOVIA MatrixOne Matrix 10 PLM platform. Matrix 10 updates include enhanced support for environmental/material compliance; improved industry-specific solutions for the apparel, medical device, and automotive industries; solutions enabling concurrent printed circuit board development; and solutions for distributing product development data throughout the extended enterprise, including non-traditional PLM users.



Majenta Solutions Ltd. announced the formation of a new company. Its mission: to make the benefits of PLM software solutions available, at an affordable price, to the small-to-medium business sector in the UK. Known as Majenta PLM, the new company will focus exclusively on the CAD/CAM/CAE and PDM/PLM software portfolio of UGS. This includes the Velocity Series, a packaged family of products comprising Solid Edge, Femap for pre- and post-processing finite element analysis, NX CAM Express for numerical control (NC) machine tool programming, and Teamcenter Express for collaborative product data management. The portfolio also includes NX, UGS' Teamcenter for enterprise-wide product data and lifecycle management, and the Tecnomatix suite of manufacturing planning and management software.


You can find the full MCADCafe event calendar on our site.



-- Jeff Rowe, MCADCafe.com Contributing Editor.

