Archive for the ‘Enterprise Architecture’ Category

InfoWorld: Graham Lovell Talking About Sun & Virtualization

October 5, 2006

According to a recent press release from Sun Microsystems, one company in particular is combining the well-planned architecture of Sun’s Sun Fire X4200 Server with the power of virtualization to perform a 22-to-1 server consolidation, reducing power consumption and heat output by up to 84 percent. NewEnergy is replacing its entire Houston data center, comprised of 22 Intel processor-based servers, with two Sun Fire X4200 servers powered by the Dual-Core AMD Opteron processor and running the Solaris 10 OS. NewEnergy’s Houston data center performs CPU-intensive Grid computing simulations for its customers nationwide, which mirror real-world electric grids in order to plan for potential disasters. Trial results showed the Sun Fire X4200 servers to be much faster than other servers, which is partially credited to the Solaris 10 OS’s efficiency with memory-intensive applications compared with the Windows OS.

Sun and VMware have combined efforts to provide innovation and deliver proven virtual infrastructure solutions for enterprise computing. Leveraging the power of VMware Infrastructure 3, Solaris 10 and the Sun Fire series of servers, customers can maximize performance and reduce overall cost of ownership via server consolidation, business continuity, and test or development solutions. By combining these products, IT managers get a complete solution to help increase server utilization, improve performance and reduce costs while making better use of data center resources such as space, cooling and power.

I recently had the pleasure of speaking with Graham Lovell, Senior Director of the Systems Group for Sun Microsystems. I wanted to find out more about Sun and to get his take on the whole virtualization scene, specifically software licensing, emerging trends, and customer needs in the virtualization space.


David Marshall: In your own words, what is Sun’s strategy towards virtualization?

Graham Lovell: The first thing we need to establish is what we mean by virtualization and how we communicate it. We need to define it with customers in different circumstances.

Customers generally look to improve the utilization of their servers. They want to run multiple applications and different operating systems. The idea is to snapshot what they have on a piece of hardware and then run it on another system in a virtualized way.

They can see the benefits of running virtualized environments, but they have to support it. They need management tools to run it well.

It is important that suppliers such as Sun can provide a range of options across multiple operating systems. We have SPARC and x86 product lines. With Solaris 10, we have containers, which let you run isolated environments where each one thinks it’s running on a dedicated system. Unlike with Xen or VMware, you aren’t running multiple copies of the operating system.

This has been popular with customers running Solaris and SPARC and Solaris on x86 platforms.

The next choice is that customers can select VMware. VMware has a number of new products, but people think of it as a single solution. When we talk to customers about their experience with VMware, some of them may have only just heard of it. That is when we can talk about the different styles of implementation.

Customers are also seeing the benefit of how they can use VMware to pool resources in the data center, so that workloads can be spread across several servers. This makes it easier to move applications around and helps with capacity planning. Virtualization can help you establish that pooled behavior.


David: Are you finding that people are using Solaris containers to do the same thing as VMware, such as for development and test or support? Or are they strictly using them for server consolidation?

Graham: Customers look at virtualization to test and debug applications across a range of application systems. That is where the customer can be more sophisticated in their choice of VMware or Xen. With VMware, you have more choice today. Xen is up and coming; it is being embedded in a number of operating systems, and it has interesting new budget tools. I think Xen will have an interesting future in the virtualization stack as well. Containers are typically rolled out in an application environment.
David: How does virtualization impact software licensing?

Graham: The software industry is reeling from the question of how to price multiple cores per processor. Microsoft has strong policies around pricing cores. Virtualization software subdivides a processor into pieces of CPU, and vendors then face the argument: why pay for the whole piece of software when only a fraction of the processor is being used? Value-based pricing is a more reasonable way to charge for software. I think Microsoft is one of the first to come out with policies around virtualized environments.

David: I agree with you. Software licensing will have to change. People are using virtual machines for things such as disaster recovery, and software companies will have to adapt.

Graham: Without flexibility in licensing, customers may find themselves paying more for the software once they move it from a 2-core system to an 8-core system. Virtualized environments run on bigger engines, and customers need to make sure they don’t fall afoul of software restrictions. Customers need to go back to their ISVs and ask: is it okay if I move from two cores to four? Then you have a starting point for negotiation.

Sun has an enterprise licensing model where you charge by the number of employees in the company. It doesn’t matter how much hardware you run; it’s a site-based license with lots of flexibility.


David: What do you think is driving the demand for virtualization today?

Graham: A customer says: I’ve got this Windows NT application, and the problem is I can no longer get hardware that will run that operating system natively. You can’t buy new hardware that will run this old software. Legacy support is one of the key drivers for virtualization.

Server sprawl also generates too much heat and uses too much power. If I consolidate those servers, I can improve the use of space, heat and power in the data center.

When customers think about disaster planning, they need to be able to migrate applications easily across platforms. If one data center has a problem, it’s easier to migrate in a virtual environment than in a non-virtual one.

Virtualization also offers more flexibility. When a new requirement comes along and the IT department needs to respond quickly to business needs, virtualization can ramp things up.
David: I’ve seen problems using VMware and Xen with patch management. Since the containers approach is based on one operating system, would that solve part of the patch management problem? It seems like instead of having to patch multiple areas, you just have to patch one.

Graham: The flip side is that everything runs the same kernel code, so it is all consistent, but you can apply different patches in user space. You can’t have multiple kernels; if you make a kernel change, it is reflected across all the containers. Alternatively, you may want to run VMware with several instances of Solaris, and then each instance of Solaris has its own patch level.

David: Can you leave us with a good customer example?

Graham: The one that gets my juices going is NewEnergy Associates. Neal Tisdale, Vice President of Software Development at NewEnergy Associates, consolidated 22 Dell servers down to 2 Sun servers. He cut down not just the number of systems, but also the heat, power and physical space, and he now manages a consolidated server environment. That is the low-hanging fruit for customers: they can do better with modern technology and achieve huge energy cost savings. Computing resources are underutilized by customers. There are significant benefits to making that change and to pushing people to experiment.

Network World – The Server Strategy – Virtualization

October 5, 2006

IT execs who have delayed virtualizing their x86-based servers for fear the technology is still unproven should put that project at the top of their to-do lists for 2006, as the market for virtualizing these low-end systems heats up.

It’s a combination of factors – the increasing power and stability of the x86 platform, the maturing of virtualization software and a growing choice of software vendors – that is driving adoption at a surprisingly fast clip, analysts say.

“In 2005 I saw a lot of enterprises dabbling with virtualization in test and development environments, particularly for server consolidation and cost savings,” says Scott Donahue, an analyst at Tier 1 Research. “What has surprised me more recently when I’ve talked to enterprise clients is the speed at which virtualization has actually moved into production environments.”

IDC describes the shift to x86-based server virtualization as well underway and expects widespread adoption to take place during the next couple of years, without “a five- to 10-year gradual market shift as in other technology areas.” Companies lacking a virtualization strategy for low-end systems will pay more in the long run, in hardware costs and management headaches, analysts say.

Gartner, for example, estimates that most x86-based servers running a single application – the traditional deployment for these low-end boxes – operate at about a 10 percent average utilization rate. Using virtualization to consolidate workloads into a single box should increase utilization significantly.
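
To make the arithmetic behind that claim concrete, here is a minimal illustrative sketch in Python. Only the roughly 10 percent utilization figure comes from the Gartner estimate above; the number of servers being consolidated is an assumption chosen for the example.

```python
# Illustrative back-of-the-envelope consolidation math. The ~10% average
# utilization figure is Gartner's estimate; the 8-to-1 ratio is assumed.
servers_before = 8             # assumed: standalone x86 boxes, one app each
avg_utilization_before = 0.10  # Gartner's estimated average utilization

# Total work expressed in "fully busy server" units.
total_load = servers_before * avg_utilization_before  # 0.8 of one server

# Consolidate onto a single host (ignoring hypervisor overhead and headroom).
servers_after = 1
utilization_after = total_load / servers_after

print(f"Before: {servers_before} servers at {avg_utilization_before:.0%} each")
print(f"After:  {servers_after} server at roughly {utilization_after:.0%}")
# In practice you would leave headroom for load peaks, so real-world
# consolidation targets sit below this idealized figure.
```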

In addition, as the x86 platform itself becomes more powerful, customers should find a growing list of applications appropriate for a virtualized environment. In the last couple of years, systems vendors stepped up the performance of their low-end systems with dual-core processors and 64-bit support. This year will bring servers with virtualization technology built into the silicon, a huge step for the x86 platform, which today can only be virtualized with some fancy – and performance-draining – footwork from software vendors such as VMware and Microsoft.

Having virtualization capabilities hard-wired into the chip means end users will get better performance out of virtual servers, software files that contain an operating system and applications. It also means that VMware and its competitors likely will shift their focus to management tools, resulting in more advanced management capabilities down the road.

Today’s management tools enable end users to easily move and copy virtual servers, providing a simple approach to disaster recovery and high availability. But advanced capabilities – such as a faster and more seamless migration of virtual servers among physical systems – are likely to come in the months ahead. Analysts recommend that customers take a close look at management strategies when they choose a virtualization partner.

“In the next year and a half to two years, the market will be flipping on its head completely. . . . It will shift from the hypervisor [low-level virtualization technology] to management,” says Tom Bittman, a Gartner vice president and Fellow. “So the focus should be on choosing management tools and automation, not on choosing a hypervisor. That will be a commodity.”

Another development that makes 2006 a key year for deploying x86 server virtualization is movement among the independent software vendors to make licensing in a virtual environment more user-friendly. Microsoft, for example, late last year announced a new virtualization-licensing model that stands to slash costs for end users. Though analysts note that this is a small first step in an evolving discussion, it’s encouraging to see Microsoft make an early move, industry experts agree.

Those still unsure if server virtualization on x86 systems has moved beyond hype should consider that open source is getting in on the game, with XenSource announcing its first commercial product designed to make it easier for customers to deploy and manage the open source Xen VM technology in corporate networks.

Although VMware has held a nearly uncontested leadership position since 2001, when it introduced the industry’s first virtualization software for x86-based servers, 2006 will bring end users more options in virtualizing low-end systems. That’s good news from both a price and a performance standpoint.

Software from Microsoft, SWsoft and start-ups such as Virtual Iron and XenSource offer interesting alternatives. With the underlying virtualization technology becoming available in hardware, management tools from companies such as PlateSpin, Leostream and Platform Computing deserve a closer look. Analysts also expect systems vendors such as Dell and HP to intensify their focus on this area.

Ulrich Seif, CIO at National Semiconductor in Santa Clara, Calif., says Intel’s and AMD’s plans to incorporate virtualization into their processors, and the maturing of virtualization software’s features, make slicing and dicing x86 servers a smart move, regardless of the vendor.

Seif brought in VMware last year to consolidate an increasing number of Windows servers and says he already has seen a 33 percent savings and now has an architecture that is flexible and easier to manage. “Almost more importantly, [with server virtualization] you are positioning yourself for future [architectures] that will come natural[ly] with virtualization: true grid computing (with solid management tools); ultimate virus and intrusion detection (the host scanning guest memory for patterns); and software and configuration management,” he says.

Gartner – 2006 Emerging Technologies Hype Cycle

September 27, 2006

Gartner, Inc., today announced its 2006 Emerging Technologies Hype Cycle, which assesses the maturity, impact and adoption speed of 36 key technologies and trends over the next ten years. This year’s hype cycle highlights three major themes that are experiencing significant activity and that include new or heavily hyped technologies, where organisations may be uncertain as to which will have the most impact on their business.

The three key technology themes identified by Gartner, and the corresponding technologies for enterprises to examine closely within them, are:

1. Web 2.0

Web 2.0 represents a broad collection of recent trends in Internet technologies and business models.  Particular focus has been given to user-created content, lightweight technology, service-based access and shared revenue models.  Technologies rated by Gartner as having transformational, high or moderate impact include:

Social Network Analysis (SNA) is rated as high impact (definition: enables new ways of performing vertical applications that will result in significantly increased revenue or cost savings for an enterprise) and capable of reaching maturity in less than two years. SNA is the use of information and knowledge from many people and their personal networks. It involves collecting massive amounts of data from multiple sources, analyzing the data to identify relationships and mining it for new information. Gartner said that SNA can successfully impact a business by being used to identify target markets, create successful project teams and serendipitously identify unvoiced conclusions.
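
As a rough illustration of the kind of relationship analysis Gartner is describing, the sketch below builds a tiny interaction graph and ranks people by how central they are to it. It uses the Python networkx library; the names and the interaction data are invented purely for the example.

```python
# Minimal social-network-analysis sketch: who sits at the center of a web
# of interactions? (Toy data; a real SNA effort would mine e-mail, CRM,
# project and directory systems for these edges.)
import networkx as nx

g = nx.Graph()
interactions = [
    ("alice", "bob"), ("alice", "carol"), ("bob", "carol"),
    ("carol", "dave"), ("dave", "erin"), ("erin", "frank"),
]
g.add_edges_from(interactions)

# Betweenness centrality highlights people who bridge otherwise separate
# groups -- one way to spot informal influencers or hidden bottlenecks.
centrality = nx.betweenness_centrality(g)
for person, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{person:8s} {score:.2f}")
```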

Ajax is also rated as high impact and capable of reaching maturity in less than two years. Ajax is a collection of techniques that Web developers use to deliver an enhanced, more-responsive user experience in the confines of a modern browser (for example, recent versions of Internet Explorer, Firefox, Mozilla, Safari or Opera). A narrow-scope use of Ajax can have a limited impact in terms of making a difficult-to-use Web application somewhat less difficult. However, Gartner said, even this limited impact is worth it, and users will appreciate incremental improvements in the usability of applications. High levels of impact and business value can only be achieved when the development process encompasses innovations in usability and reliance on complementary server-side processing (as is done in Google Maps).

Collective intelligence, rated as transformational (definition: enables new ways of doing business across industries that will result in major shifts in industry dynamics) is expected to reach mainstream adoption in five to ten years. Collective intelligence is an approach to producing intellectual content (such as code, documents, indexing and decisions) that results from individuals working together with no centralized authority. This is seen as a more cost-efficient way of producing content, metadata, software and certain services.

Mashup is rated as moderate on the Hype Cycle (definition: provides incremental improvements to established processes that will result in increased revenue or cost savings for an enterprise), but is expected to hit mainstream adoption in less than two years. A “mashup” is a lightweight tactical integration of multi-sourced applications or content into a single offering. Because mashups leverage data and services from public Web sites and Web applications, they’re lightweight in implementation and built with a minimal amount of code. Their primary business benefit is that they can quickly meet tactical needs with reduced development costs and improved user satisfaction. Gartner warns that because they combine data and logic from multiple sources, they’re vulnerable to failures in any one of those sources.
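
A hedged sketch of the idea in Python: two independent data sources, stubbed out here as plain dictionaries because the real feeds in a mashup would be public web APIs, are joined into a single view with only a few lines of glue code. All field names and values are invented for illustration.

```python
# Toy "mashup": join listings from one source with ratings from another
# into a single combined view. The data and field names are made up; in a
# real mashup these would come from separate public web APIs or feeds.
listings = [  # pretend this came from a property-listing service
    {"id": "h1", "address": "12 Elm St", "price": 250_000},
    {"id": "h2", "address": "9 Oak Ave", "price": 310_000},
]
ratings = {   # pretend this came from a neighborhood-rating service
    "12 Elm St": 4.2,
    "9 Oak Ave": 3.8,
}

# The integration logic is deliberately thin -- that is the point of a mashup.
combined = [
    {**home, "neighborhood_rating": ratings.get(home["address"])}
    for home in listings
]

for entry in combined:
    print(entry)
```

The fragility Gartner warns about is visible even here: if the rating source changes its address format or goes away, the combined view silently degrades.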

2. Real World Web 

Increasingly, real-world objects will not only contain local processing capabilities—due to the falling size and cost of microprocessors—but they will also be able to interact with their surroundings through sensing and networking capabilities. The emergence of this Real World Web will bring the power of the Web, which today is perceived as a “separate” virtual place, to the user’s point of need of information or transaction. Technologies rated as having particularly high impact include:

Location-aware technologies should hit maturity in less than two years. Location-aware technology is the use of GPS (global positioning system), assisted GPS (A-GPS), Enhanced Observed Time Difference (EOTD), enhanced GPS (E-GPS) and other technologies in the cellular network and handset to locate a mobile user. Users should evaluate the potential benefits to their business processes of location-enabled products such as personal navigation devices (for example, TomTom or Garmin) or Bluetooth-enabled GPS receivers, as well as WLAN location equipment that may help automate complex processes, such as logistics and maintenance. As the market consolidates around a smaller number of high-accuracy technologies, the location-service ecosystem will benefit from a number of standardized application interfaces for deploying location services and applications across a wide range of wireless devices.

Location-aware applications will hit mainstream adoption in the next two to five years. An increasing number of organizations have deployed location-aware mobile business applications, mostly based on GPS-enabled devices, to support key business processes and activities, such as field force management, fleet management, logistics and goods transportation. The market is in an early adoption phase, and Europe is slightly ahead of the United States, due to the higher maturity of mobile networks, their availability and standardization.

Sensor Mesh Networks are ad hoc networks formed by dynamic meshes of peer nodes, each of which includes simple networking, computing and sensing capabilities. Some implementations offer low-power operation and multi-year battery life. Technologically aggressive organizations looking for low-cost sensing and robust self-organizing networks with small data transmission volumes should explore sensor networking. The market is still immature and fragmented, and there are few standards, so suppliers will evolve and equipment could become obsolete relatively rapidly. Therefore, this area should be seen as a tactical investment, as mainstream adoption is not expected for more than ten years.

3. Applications Architecture 

The software infrastructure that provides the foundation for modern business applications continues to mirror business requirements more directly. The modularity and agility offered by service oriented architecture at the technology level and business process management at the business level will continue to evolve through high impact shifts such as model-driven and event-driven architectures, and corporate semantic Web. Technologies rated as having particularly high impact include:

Event-driven Architecture (EDA) is an architectural style for distributed applications, in which certain discrete functions are packaged into modular, encapsulated, shareable components, some of which are triggered by the arrival of one or more event objects. Event objects may be generated directly by an application, or they may be generated by an adapter or agent that operates non-invasively (for example, by examining message headers and message contents). EDA has an impact on every industry. Although mainstream adoption of all forms of EDA is still five to ten years away, complex-event processing EDA is now being used in financial trading, energy trading, supply chain, fraud detection, homeland security, telecommunications, customer contact center management, logistics and sensor networks, such as those based on RFID.
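
The sketch below is a minimal Python illustration of the pattern, not any particular product: components register interest in an event type and are triggered only when a matching event object arrives, so the producer never needs to know who is listening. The event names and handlers are invented for the example.

```python
# Minimal event-driven sketch: a producer publishes event objects, and
# decoupled handlers subscribed to that event type are invoked on arrival.
from collections import defaultdict
from typing import Callable, Dict, List

class EventBus:
    def __init__(self) -> None:
        self._handlers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, event_type: str, handler: Callable[[dict], None]) -> None:
        self._handlers[event_type].append(handler)

    def publish(self, event_type: str, payload: dict) -> None:
        for handler in self._handlers[event_type]:
            handler(payload)

bus = EventBus()
# A fraud-detection component reacts to trade events without the trading
# component knowing it exists -- the decoupling that EDA is meant to provide.
bus.subscribe("trade.executed", lambda e: print("audit log:", e))
bus.subscribe("trade.executed",
              lambda e: print("fraud check:",
                              "flag" if e["amount"] > 1_000_000 else "ok"))

bus.publish("trade.executed", {"symbol": "XYZ", "amount": 2_500_000})
```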

Model-driven Architecture is a registered trademark of the Object Management Group (OMG). It describes OMG’s proposed approach to separating business-level functionality from the technical nuances of its implementation. The premise behind OMG’s Model-Driven Architecture and the broader family of model-driven approaches (MDAs) is to enable business-level functionality to be modeled by standards, such as Unified Modeling Language (UML) in OMG’s case; allow the models to exist independently of platform-induced constraints and requirements; and then instantiate those models into specific runtime implementations, based on the target platform of choice. MDAs reinforce the focus on business first and technology second. The concepts focus attention on modeling the business: business rules, business roles, business interactions and so on. The instantiation of these business models in specific software applications or components flows from the business model. By reinforcing the business-level focus and coupling MDAs with SOA concepts, you end up with a system that is inherently more flexible and adaptable.

Corporate Semantic Web applies semantic Web technologies, aka semantic markup languages (for example, Resource Description Framework, Web Ontology Language and topic maps), to corporate Web content. Although mainstream adoption is still five to ten years away, many corporate IT areas are starting to engage in semantic Web technologies. Early adopters are in the areas of enterprise information integration, content management, life sciences and government. Corporate Semantic Web will reduce costs and improve the quality of content management, information access, system interoperability, database integration and data quality.
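
As a small illustration of what semantic markup looks like in practice, the sketch below records a few RDF statements about a piece of corporate content and then queries them. It uses the Python rdflib library; the example.com namespace and the facts themselves are made up for the example.

```python
# Tiny corporate-semantic-web sketch using RDF triples of the form
# (subject, predicate, object). The ex: namespace and facts are invented.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

EX = Namespace("http://example.com/corp/")

g = Graph()
doc = URIRef(EX["doc/quarterly-report"])
g.add((doc, RDF.type, EX.Document))
g.add((doc, EX.author, EX["person/jsmith"]))
g.add((doc, EX.topic, Literal("revenue forecast")))

# Because the metadata is machine-readable, other systems can query it
# without knowing how the underlying content store is structured.
results = g.query(
    "SELECT ?doc WHERE { ?doc <http://example.com/corp/topic> ?t . "
    'FILTER(CONTAINS(STR(?t), "revenue")) }'
)
for row in results:
    print(row.doc)
```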

“The emerging technologies hype cycle covers the entire IT spectrum but we aim to highlight technologies that are worth adopting early because of their potentially high business impact,” said Jackie Fenn, Gartner Fellow and inventor of the first hype cycle. One of the features highlighted in the 2006 Hype Cycle is the growing consumerisation of IT. “Many of the Web 2.0 phenomena have already reshaped the Web in the consumer world”, said Ms Fenn. “Companies need to establish how to incorporate consumer technologies in a secure and effective manner for employee productivity, and also how to transform them into business value for the enterprise”.

The benefit of a particular technology varies significantly across industries, so planners must determine which opportunities relate most closely to their organisational requirements. To make this easier, a new feature in Gartner’s 2006 hype cycle is a ‘priority matrix’ which clarifies a technology’s potential impact – from transformational to low – and the number of years it will take before it reaches mainstream adoption. “The pairing of each Hype Cycle with a Priority Matrix will help organisations to better determine the importance and timing of potential investments based on benefit rather than just hype,” said Ms Fenn.