October 09, 2006
Is IT Virtualization Enough?
Please note that contributed articles, blog entries, and comments posted on MCADcafe.com are the views and opinion of the author and do not necessarily represent the views and opinions of the management and staff of Internet Business Systems and its subsidiary websites.


by Jeff Rowe - Contributing Editor
Each MCAD Weekly Review delivers to its readers news concerning the latest developments in the MCAD industry, MCAD product and company news, featured downloads, customer wins, and coming events, along with a selection of other articles that we feel you might find interesting. Brought to you by MCADCafe.com. If we miss a story or subject that you feel deserves to be included, or you just want to suggest a future topic, please contact us! Questions? Feedback? Click here. Thank you!

The scale and complexity of a growing number of enterprise IT infrastructures now equals or exceeds that of many national telecommunications carriers not that many decades ago, and the breadth of services offered to enterprise "clients" exceeds those offered by carriers to their subscribers — even in the recent past — by an order of magnitude. So said Udi Paret, a Silicon Valley enterprise software executive, at the Dow Jones Datacenter Ventures conference held recently in San Jose, CA. The challenge, then, said Paret, is to come up with a way of managing these massive infrastructures that concentrates on end-to-end service delivery, based upon streamlined, high-level models, rather than today's focus on individual resources — virtualized or real — on a domain-by-domain basis.


"Today's IT departments are drowning in a sea of details, as they try to cope with the explosive growth of their data centers, and the need for their services to be always available," said Paret, who is president of Fusion Dynamic, Inc. "The recent shift towards virtualization — virtual servers, virtual networks and virtualized storage — is only compounding the problem. Although a needed level of flexibility and dynamism results from being able to represent, re-partition, or combine physical resources into abstract ones, the total number of entities to be managed continues to skyrocket. The focus is simply too tactical; the level of abstraction is too low."


What Paret means is that a business unit manager planning a new e-commerce program, for example, does not think in terms of hard drives, routers, or firewalls — low-level abstractions — but in terms of the end-to-end service that needs to be delivered — an abstraction of the highest level. How it is implemented is of no interest. But what service levels are attained, how the service scales and adapts to changing business conditions, what levels of security are attainable — issues like these are crucial.


"Such a professional would like to be able to tell the IT manager: 'Double my capability for daily online sales transactions, I'll sacrifice service levels in self-guided support…' and have the entire data center architecture automatically rework itself," said Paret. "And there's the difficulty. Data centers, even cutting-edge dynamic data centers with virtualized resources throughout, simply do not have this type of service-oriented infrastructure."


Missing: A Data-Center-Wide Operating Environment

What is missing, said Paret, is a sort of "operating system for data centers" that can automatically reallocate hardware, software, systems, connections, and so on, in terms of high-level, service-driven objectives, as business conditions change or in response to problems or catastrophes, to keep service at guaranteed levels. Such an automated operating environment would feature dramatically simplified ways of provisioning and managing the data center, with the focus on processes rather than on specific components and topologies. Behind the scenes, the operating environment would adroitly manage those components and topologies dynamically, asking for attention only when something had failed — and even then, thanks to the environment's self-healing capabilities, most failures would be handled transparently.
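
To make Paret's idea a bit more concrete, here is a minimal, purely hypothetical sketch (in Python; none of the names come from Fusion Dynamic or any shipping product) of what declaring services by objective rather than by component might look like. The service model captures only the business-level request (capacity, availability, priority), and a deliberately naive control loop hands out abstract capacity units accordingly:

    # Hypothetical illustration only -- not Fusion Dynamic's API.
    # A service is declared by the objectives it must meet, not by the
    # servers, routers, or firewalls that happen to implement it.
    from dataclasses import dataclass

    @dataclass
    class ServiceObjective:
        name: str                       # e.g. "online-sales"
        peak_transactions_per_sec: int  # capacity the business unit asks for
        availability_target: float      # e.g. 0.999
        priority: int                   # lower number = served first when resources are scarce

    def provision(objectives, capacity_units):
        """One naive pass of a service-driven control loop: rank services by
        priority and hand out abstract capacity units until the pool runs dry.
        A real operating environment would also remap networks, storage, and
        security domains, and would re-run whenever conditions change or a
        resource fails."""
        plan, remaining = {}, capacity_units
        for obj in sorted(objectives, key=lambda o: o.priority):
            needed = obj.peak_transactions_per_sec // 100  # assume 100 tps per capacity unit
            granted = min(needed, remaining)
            plan[obj.name] = granted
            remaining -= granted
        return plan

    # "Double my capability for daily online sales transactions, I'll
    # sacrifice service levels in self-guided support..."
    services = [
        ServiceObjective("online-sales", peak_transactions_per_sec=4000,
                         availability_target=0.999, priority=1),
        ServiceObjective("self-guided-support", peak_transactions_per_sec=1000,
                         availability_target=0.99, priority=2),
    ]
    print(provision(services, capacity_units=45))  # sales is filled first; support absorbs the shortfall

The point is not the arithmetic, which is trivial here, but where the decision lives: the business unit states the objective, and the infrastructure works out, and keeps reworking, the allocation.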


The crucial difference between Paret's concept of an operating environment and the historical operating systems — virtual or physical — that control servers, routers, and other components in a data center is that the operating environment crosses all of the domains and controls them. "The only way you will achieve a true service-oriented infrastructure that delivers substantially improved operating economies; flexibility in design, provisioning, and operations; uncompromising resiliency; and an almost completely hands-off running model," says Paret, "is with a software infrastructure that is aware of — and can essentially control — all of the relevant resources in a data center, be they applications, servers, storage, or networks. This capability uniquely defines the data center 'operating environment.' If you additionally provide it with a high-level, abstract control structure that is based upon service delivery, then you have achieved a true service-oriented infrastructure."
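
As a second, equally hypothetical sketch (again Python, and again not Fusion Dynamic's actual interface), "crossing all of the domains" could be pictured as every resource, whether a server pool, a storage volume, a network segment, or an application, being exposed to a single control plane through one uniform interface, with a reconciliation loop pushing each resource toward the state the service model calls for:

    # Hypothetical sketch of a cross-domain control plane; all names are invented.
    from abc import ABC, abstractmethod

    class ManagedResource(ABC):
        """Anything the operating environment can observe and reconfigure,
        regardless of which domain (compute, storage, network) it belongs to."""

        @abstractmethod
        def health(self) -> bool: ...

        @abstractmethod
        def apply(self, desired_state: dict) -> None: ...

    class VirtualServerPool(ManagedResource):
        def __init__(self, instances=4):
            self.instances = instances
        def health(self) -> bool:
            return self.instances > 0
        def apply(self, desired_state: dict) -> None:
            self.instances = desired_state.get("instances", self.instances)

    class StorageVolume(ManagedResource):
        def __init__(self, capacity_gb=500):
            self.capacity_gb = capacity_gb
        def health(self) -> bool:
            return True
        def apply(self, desired_state: dict) -> None:
            self.capacity_gb = desired_state.get("capacity_gb", self.capacity_gb)

    def reconcile(resources, desired):
        """One pass of the control loop: push every resource, whatever its
        domain, toward the state the high-level service model calls for."""
        for name, resource in resources.items():
            if name in desired or not resource.health():
                resource.apply(desired.get(name, {}))

    datacenter = {"web-tier": VirtualServerPool(), "orders-db": StorageVolume()}
    reconcile(datacenter, {"web-tier": {"instances": 8}})  # scale compute without touching storage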


Paret's company, Fusion Dynamic, has recently introduced the world's first dynamic operating environment for data centers, one that has the potential to transform a classical, resource-focused data center into a true, service-oriented dynamic data center capable of meeting the most demanding service-level goals.



Commentary

by Jeffrey Rowe, Editor


The concepts put forth by Mr. Paret remind me of something that was destined to be the "next big thing" just a few years ago - the application service provider, or ASP. A number of software companies and service providers got into the fray, but I was always confused about exactly what an ASP was - a service, a product, or a business model? Based on ASPs' relative lack of success (meaning customers), I was not exactly alone in my confusion.


As a quick refresher, ASPs are third-party entities that manage and distribute software-based services, including application software, to customers across a wide area network from a central data center. In essence, ASPs are a way for companies to outsource some or virtually all aspects of their IT needs.


In other words, an ASP is a company that offers access over the Internet to applications and related services that would otherwise have to be located on personal or enterprise computers. The "beauty" of this delivery model was to speed implementation, minimize the expenses and risks incurred across the application life cycle, and overcome the need for qualified in-house technical personnel.


An ASP can also be viewed as a company whose business model specializes in hosting software and/or hardware platforms and systems and then making them available on a rental or lease basis — basically a subscription. An ASP arrangement allows for set-up and deployment without requiring the "user" to invest in the underlying technological infrastructure. Not a bad idea, but a lot more complicated than many of the companies that tried it cared to admit. The Web browser acted as a universal client interface and was intended to fuel the "on-demand software" market, but that market never really materialized. I think one of the biggest factors in ASPs' relative lack of acceptance was the "ownership" issue - who owns what, for how long, and for what purposes.


Admittedly, there are still ASPs around today, including companies offering MCAD applications. They're just not as widespread as they were forecast to be a few years ago.


OK, let's move ahead to Fusion Dynamic …


Based in Silicon Valley and Israel, Fusion Dynamic is something of a pioneer in a new type of ASP — the service-oriented infrastructure for data centers. The implementation is a software-based operating environment that dynamically links an organization's hardware and software resources into a flexible, adaptive, fault-tolerant, and self-healing infrastructure. This infrastructure permits data centers to be provisioned, managed, and maintained from a high-level, abstract, "application and service delivery" perspective, rather than from the low-level, physical-resource perspective that consumes huge amounts of personnel and capital in most data centers. The Fusion Dynamic infrastructure promises improved service delivery, reduced hardware requirements, lower operational complexity, and a commensurate reduction in capital and operational expenses. Remember, like the ASPs of old, this is all theoretical, but properly implemented, it could be quite feasible and cost-effective.


Fusion Dynamic's flagship DynamicOE operating environment is what provides the service-oriented infrastructure to data centers; the company says DynamicOE enables it across a broad range of servers, network appliances, and storage.


Will Fusion Dynamic succeed where most other ASP-based software and service provider companies failed? That remains to be seen, but the Fusion Dynamic core concept seems solid. So, going back to the title of this story, "Is IT Virtualization Enough?", the answer is that by itself it is not enough, but it is a critical piece of the increasingly complex IT puzzle.



The Week's Top 5


At MCADCafé we track many things, including the stories that have attracted the most interest from our subscribers. Below are the five news items that were most viewed during the past week.


Dassault Systemes Extends ENOVIA MatrixOne PLM Solution

Dassault Systemes announced new enhancements to the ENOVIA MatrixOne Matrix10 PLM platform. Matrix10 updates include enhanced support for environmental/material compliance; improved industry-specific solutions for the apparel, medical device, and automotive industries; solutions enabling concurrent printed circuit board development; and capabilities for distributing product development data throughout the extended enterprise, including to non-traditional PLM users.


New Company To Bring UGS' PLM To SMBs

Majenta Solutions Ltd. announced the formation of a new company. Its mission: to make the benefits of PLM software solutions available, at an affordable price, to the small-to-medium business sector in the UK. Known as Majenta PLM, the new company will focus exclusively on the CAD/CAM/CAE and PDM/PLM software portfolio of UGS. This includes the Velocity Series - a packaged family of products including Solid Edge, Femap for finite element analysis pre- and post-processing, NX CAM Express for numerical control (NC) machine tool programming, and Teamcenter Express for collaborative product data management. The portfolio also includes NX, UGS' Teamcenter for enterprise-wide product data and lifecycle management, and the Tecnomatix suite of manufacturing planning and management software.


Dassault Systemes Launches CATIA PLM Express

Dassault Systemes announced CATIA PLM Express, a scalable solution which provides CATIA design capabilities at an affordable price. CATIA Team PLM is the core configuration of CATIA PLM Express, giving customers the CATIA modeler for product design, knowledge capture and re-use in a collaborative environment. It also delivers core collaborative PDM functionality via ENOVIA SmarTeam, providing optimized CATIA design management and collaboration, and a foundation for PLM solution scaling. Existing customers can move to this new offering from their existing CATIA seats.


3DQuickForm Version 2 launched at SolidWorks User Group Meeting

3D QuickTools Ltd. announced 3DQuickForm Version 2 for the metal stamping market. There has been demand in the metal stamping industry for sophisticated blank development software to simulate the complex metal forming process for die making. 3DQuickForm V2 enables tooling designers to use an integrated CAD/CAE environment to develop the forming process for complex formed parts. 3DQuickForm V2 is powered by ESI technology, with features for partial and complete blank development, all within the SolidWorks interface.


SKODA AUTO (Volkswagen Group) selects Kineo CAM for Dynamic Digital Mockup Analysis

Kineo C.A.M. announced that SKODA AUTO has selected Kineo Path Planner, a standalone dynamic collision checking solution whose ability to automatically find and generate mounting and dismounting trajectories has proven to complement SKODA AUTO's existing DMU analysis methods and processes. This new methodology will enable SKODA AUTO's engineering teams to benefit from cost savings and efficiencies by ensuring that aspects such as serviceability and manufacturability are analyzed earlier and verified at each and every step of virtual vehicle development. Kineo Path Planner can find and generate collision-free trajectories automatically within a highly constrained 3D environment.



Jeffrey Rowe is the editor and publisher of MCADCafé and MCAD Weekly Review. He can be reached here or at 408.850.9230.


This Week


Lead Story
  • Virtualization Not Enough, Says Enterprise Software Executive
Product and Company News
  • UGS Offering Free Solid Edge 2D Drafting Software
  • Dialog Imaging Systems Selects Dassault Systemes PLM Solutions
  • ActiveSolid 2.0 Released
  • Tobermory Accelerates Product Development With SolidWorks And COSMOSWorks
  • Kyocera Mita Switches to CoCreate
  • New VariCAD 2007 Just Released
  • QuadriSpace Now Supports SolidWorks 2007
  • Arena Solutions Honored By START-IT Magazine
  • M2 Technologies “Manufactures” New Website to Help Clients Increase Productivity and Maximize e-Learning Opportunities
  • Delmia reports first implementations of DMU / DPM Path Planner in the US and Japan
  • CoCreate Teams with Microsoft to Bring Technology Directly to the Doorstep of Colorado Front Range Businesses
  • Avatech Solutions Makes Deloitte Fast 50 Technology List
  • Blue Ridge Numerics Named To Technology Fast 50
  • SolidCAM Expands U.S. Organization
  • RuleStream and Razorleaf Sign Strategic Partnership Agreement
  • GiveMePower Corporation Achieves Microsoft Certified Partner Status
  • Miniature Precision Uses EFD.Lab Flow Simulation To Optimize Hydrocarbon Trap
  • Forming Technologies releases Metalforming Solutions Suite for CATIA R17, empowering a 50% reduction in Engineering Changes and Engineering Rework.
  • Delcam's PowerINSPECT Provides Inspection Confidence
  • Moldflow Launches European Plastics Technology Days
  • Missler Software and AutoForm Engineering
  • ProSoft Releases AutoCAD 2007 Courseware
  • SigmaQuest Named As A Top Emerging Manufacturing Software Vendor
Related MCAD News
  • IBM Announces 5-Year Plan For Mainframe Simplification
  • Knowledge-Based Engineering Technology Introduced By Adroitec
  • UP Aerospace Releases Initial Analysis Of Its Rocket Launch
  • IBM, ASTD Study Reveals Critical Gaps In Changing Workforce Demographics As Baby Boomers Retire
  • KUKA Robotics Selected By AMF Automation Technologies
  • Incuity & NORPAC to Host Workshop On Optimizing Manufacturing Productivity, Profitability
  • Kinaxis Honored As Emerging Software Vendor
  • Gaining Global Competitiveness Through Collaborative Manufacturing Strategies
  • Manufacturing Business Technology Magazine Names Freeflow
  • IBM Strengthens Power Architecture With New Low-Power Processors
  • ATI Launches Enterprise Stream Computing Initiatives
  • CCAT Provides Awards To New Robotics Technologies
Corporate Moves
  • Teksoft Announces Appointments of New Vice President of Worldwide Sales and Director of Marketing
Industry Events
  • Interoperability & 3D Collaboration: Productivity In Global Manufacturing 2007 Call For Speakers
  • David Prawel to Keynote Boeing's Product Data Exchange Conference
  • Hankook Delcam Holds User Meeting In Korea
  • Design Visionaries’ Seminar Details PLM for Medical Product Design
  • DASSAULT SYSTEMES Schedules Q3 2006 Conference Call Webcast


    You can find the full MCADCafe event calendar here.


    To read more news, click here.



    -- Jeff Rowe, MCADCafe.com Contributing Editor.

