Archive for October, 2006

eWeek – How CIOs Can Make Their Businesses More Competitive

October 10, 2006


When Mark McDonald, head of Gartner’s Executive Programs, took the stage at the Gartner Symposium today in Orlando, the packed house knew immediately what they were in for: a pep rally.

Maybe it’s the fact that McDonald looks like a linebacker. Maybe it’s the fact that he works up a sweat bellowing out his CIO directives. Whatever it is, the guy is downright inspirational. And he had one single message for the IT leaders in the room: Make your company more competitive.

McDonald’s impassioned speech was long on motivation, short on details. But here’s a little factual information that backs up his belief that in 2007 CIOs must advance their companies’ competitive stance or look for other work. The early returns from Gartner’s 2007 CIO survey are starting to come in, and the order of business priorities is as follows:

1.)    Improve business processes

2.)    Reduce operating costs

3.)    Attract and grow customer base

4.)    Support competitive advantage

5.)    Improve enterprise competitiveness

6.)    Grow revenue

7.)    Improve information intelligence

8.)    Deploy business capabilities

9.)    Improve bottom line profitability

10.)   Security and data protection

Just look at those results for a minute. This is what the business is expecting of IT. Grow customer base? Grow revenue? Since when is this stuff IT’s job? Just look at where traditional IT responsibilities fall on this list. Security is dead last.

To this, McDonald had this to say: “You’ve won. Oliver Stone could not have come up with a better conspiracy theory. First you automate their transactions, then you start automating their processes, then you push technology out to the edge of the network. You’ve achieved Borgdom. You’ve won. And that means that competitive advantage is now an IT issue.”

I could have done without yet another Star Trek reference at a technology conference (haven’t we moved beyond that yet?), but the point is well made. Now that IT has become so integral to all aspects of business operations, it’s time for CIOs to make like business people.

McDonald says the key to this is to stop thinking about IT as a bunch of layers in an enterprise. He uses a cake as an analogy: see the whole cake, not the individual layers. And before you begin any project, ask yourself what a customer wants. “Who’s hungry, how do we find them, how do we get the cake to them, how do we charge them, and when they are done, how will they get another piece?” he said.

Sounds like a piece of cake, right? Sorry. Bad joke.

Anyway, McDonald reminded CIOs that they must be the idea generators. CEOs and line-of-business managers don’t understand what technology can do. CIOs must bring ideas to the table. And constantly ask themselves: What will be tangibly different about the business when I am finished [with this project]?

Well, if you believe McDonald, IT has bulled its way to a seat at the corporate table. Now it’s time to prove it really belongs.


InfoWorld: Graham Lovell Talking About Sun & Virtualization

October 5, 2006

According to a recent press release from Sun Microsystems, one company in particular is combining the well-planned architecture of Sun’s Sun Fire X4200 server with the power of virtualization to perform a 22-to-1 server consolidation, reducing power consumption and heat output by up to 84 percent. NewEnergy is replacing its entire Houston data center, comprising 22 Intel processor-based servers, with two Sun Fire X4200 servers powered by the Dual-Core AMD Opteron processor and running the Solaris 10 OS. NewEnergy’s Houston data center performs CPU-intensive grid computing simulations for its customers nationwide, which mirror real-world electric grids in order to plan for potential disasters. Trial results showed the Sun Fire X4200 servers to be much faster than the servers they replaced, which is credited in part to the Solaris 10 OS’s efficiency with memory-intensive applications compared with the Windows OS.

Sun and VMware have combined efforts to deliver proven virtual infrastructure solutions for enterprise computing. Leveraging the power of VMware Infrastructure 3, Solaris 10 and the Sun Fire series of servers, customers can maximize performance and reduce total cost of ownership via server consolidation, business continuity, and test and development solutions. Combined, these products give IT managers a complete solution to help increase server utilization, improve performance and reduce costs while making better use of data center resources such as space, cooling and power.

I recently had the pleasure of speaking with Graham Lovell, Senior Director of the Systems Group at Sun Microsystems. I wanted to find out more about Sun and to get his take on the whole virtualization scene, specifically software licensing, emerging trends, and customer needs in the virtualization space.
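As a rough illustration of how a headline figure like that 84 percent reduction might be derived, here is a back-of-the-envelope sketch in Python. The per-server wattages are assumptions chosen for illustration; the press release gives only the final percentage.

```python
# Back-of-the-envelope consolidation savings.
# The per-server wattages below are ASSUMED for illustration only;
# the article reports just the 84% headline figure.

OLD_SERVERS = 22
NEW_SERVERS = 2
OLD_WATTS_EACH = 450   # assumed draw of an older Intel-based server
NEW_WATTS_EACH = 800   # assumed draw of a loaded Sun Fire X4200

old_total = OLD_SERVERS * OLD_WATTS_EACH   # total draw before consolidation
new_total = NEW_SERVERS * NEW_WATTS_EACH   # total draw after consolidation
savings = 1 - new_total / old_total

print(f"Consolidation ratio: {OLD_SERVERS}:{NEW_SERVERS}")
print(f"Power reduction: {savings:.0%}")
```

With these assumed numbers the reduction works out to roughly 84 percent; the real saving depends entirely on the actual hardware involved.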

David Marshall: In your own words, what is Sun’s strategy toward virtualization?

Graham Lovell: The first thing we need to establish is what we mean by virtualization and how we communicate it. We need to define it with customers in different circumstances.

Customers generally look to improve the utilization of their servers. They want to run multiple applications and different operating systems. The idea is to snapshot what they have on a piece of hardware and then run it on another system in a virtualized way. They can see the benefits of running virtualized environments, but they have to support it. They need management tools to run it well.

It is important that suppliers such as Sun can provide a range of options across multiple operating systems. We have SPARC and x86 product lines. With Solaris 10, we have containers. Containers let you run isolated environments where each one thinks it’s running on a dedicated system, but unlike with Xen or VMware, you aren’t running multiple copies of the operating system.

This has been popular with customers running Solaris and SPARC and Solaris on x86 platforms.

The next choice is that customers can select VMware. VMware has a number of products, but people think of it as a single solution. When we talk to customers about their experience with VMware, some of them may have only just heard of it. That is when we can talk about different styles of implementation.

Customers are also seeing the benefits of how they can deploy VMware. They talk about pooling resources in the data center so workloads can be spread across several servers. This makes it easier to move applications around and helps with capacity planning. Virtualization can help you establish that pooling behavior.

David: Are you finding that people are using Solaris containers to do the same thing as VMware, such as for development and test or support? Or are they strictly using them for server consolidation?

Graham: Customers look at virtualization to test and debug applications across a range of application systems. That is where the customer can be more sophisticated in their choice of VMware or Xen. With VMware, you have more choice today. Xen is up and coming. It is embedded in a number of operating systems, and it has interesting new tools. I think Xen will have an interesting future in the virtualization stack as well. Containers are typically rolled out in an application environment.
David: How does virtualization impact software licensing?

Graham: The software industry is reeling from pricing multiple cores per processor. Microsoft has strong policies around pricing cores. Virtualization software subdivides a processor into pieces of CPU. Customers then ask: why pay for the whole software license when they only use a fraction of the processor? Value-based pricing is a more reasonable way to charge for software. I think Microsoft is one of the first to come out with policies around virtualized environments.

David: I agree with you. Software licensing will have to change. People are using virtual machines for things such as disaster recovery, and software companies will have to adapt.

Graham: Without flexibility in licensing, customers may find themselves paying more for the software. Say they moved the software from a 2-core system to an 8-core system. Virtualized environments have bigger engines, and customers need to make sure they don’t fall afoul of software restrictions. Customers need to go back to their ISVs and ask: is it OK if I move from 2 cores to 4? Then you have a starting point for negotiation. Sun has an enterprise model where you are charged by the number of employees in the company. It doesn’t matter how much hardware you run; it’s a site-based license with lots of flexibility.
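To make Lovell’s point concrete, here is a hypothetical Python sketch comparing a per-core license with a per-employee site license of the kind Sun describes. All prices, headcounts, and helper names are invented for illustration, not taken from any vendor’s actual price list.

```python
# Hypothetical comparison of per-core licensing vs. a per-employee
# site license. All figures are INVENTED for illustration.

PER_CORE_PRICE = 3_000     # assumed annual price per licensed core
PER_EMPLOYEE_PRICE = 120   # assumed annual price per employee
EMPLOYEES = 150

def per_core_cost(cores):
    # Cost scales with the hardware, so a 2-core -> 8-core move quadruples it.
    return cores * PER_CORE_PRICE

def per_employee_cost(employees):
    # Hardware changes don't affect this model at all.
    return employees * PER_EMPLOYEE_PRICE

before = per_core_cost(2)   # workload on a 2-core system
after = per_core_cost(8)    # same workload moved to an 8-core system
print(f"Per-core model: {before} -> {after}")
print(f"Employee model: {per_employee_cost(EMPLOYEES)} either way")
```

The point of the sketch is only the shape of the curves: under per-core pricing the bill tracks the hardware, while under a site license the hardware is free to change.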

David: What do you think is driving the demand for virtualization today?

Graham: Say I have a Windows NT application. The problem is I can no longer get hardware that will run that operating system natively; you can’t buy new hardware that will run this old software. Legacy support is one of the key drivers for virtualization.

Server sprawl also generates too much heat and uses too much power. If I consolidate those servers, I can improve the use of space, heat and power in the data center.

When customers think of disaster planning, they need to easily migrate applications across platforms. If one data center has a problem, it’s easier to migrate in a virtual environment than in a non-virtual one.

Virtualization also offers more flexibility. When a business need comes along and the IT department has to respond quickly, virtualization can ramp things up.
David: I’ve seen problems using VMware and Xen with patch management. Since the containers approach is based on one operating system, would that solve part of the patch management problem? It seems like instead of having to patch multiple areas, you just have to patch one.

Graham: The flip side is that every container runs the same kernel code, so it is all consistent. You can apply different patches in user space, but you can’t have multiple kernels: if you change the kernel, it is reflected across all the containers. If you need different kernel patch levels, you may want to run VMware with several implementations of Solaris, each instance at its own patch level.

David: Can you leave us with a good customer example?

Graham: The one that gets my juices going is NewEnergy Associates. Neal Tisdale, Vice President of Software Development at NewEnergy Associates, consolidated 22 Dell servers down to 2 Sun servers. He cut down not just the number of systems, but also the heat, power and physical space, and ended up with a managed, consolidated environment. That is the low-hanging fruit for customers: they can do better with modern technology and realize huge energy cost savings. Computing capacity is underutilized by customers. There are significant benefits to making that change and to pushing people to experiment.

Network World – The Server Strategy – Virtualization

October 5, 2006

IT execs who have delayed virtualizing their x86-based servers for fear the technology is still unproven should put that project at the top of their to-do lists for 2006, as the market for virtualizing the low-volume systems heats up.

It’s a combination of factors – the increasing power and stability of the x86 platform, the maturing of virtualization software and a growing choice of software vendors – that is driving adoption at a surprisingly fast clip, analysts say.

“In 2005 I saw a lot of enterprises dabbling with virtualization in test and development environments, particularly for server consolidation and cost savings,” says Scott Donahue, an analyst at Tier 1 Research. “What has surprised me more recently when I’ve talked to enterprise clients is the speed at which virtualization has actually moved into production environments.”

IDC describes the shift to x86-based server virtualization as well underway and expects widespread adoption to take place during the next couple of years, without “a five- to 10-year gradual market shift as in other technology areas.” Companies lacking a virtualization strategy for low-end systems will pay more in the long run, in hardware costs and management headaches, analysts say.

Gartner, for example, estimates that most x86-based servers running a single application – the traditional deployment for these low-end boxes – operate at about a 10 percent average utilization rate. Using virtualization to consolidate workloads into a single box should increase utilization significantly.
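A quick sketch of the arithmetic behind that estimate: given Gartner’s 10 percent average utilization, a simple capacity model shows how many such workloads a single consolidated host can absorb. The target utilization ceiling and per-guest overhead below are assumed planning figures, not from Gartner.

```python
# How many 10%-utilized single-app servers can one host absorb?
# TARGET_UTILIZATION and OVERHEAD_PER_VM are ASSUMED planning
# figures for illustration; only the 10% average comes from Gartner.

AVG_UTILIZATION = 0.10     # Gartner's estimate for single-app x86 boxes
TARGET_UTILIZATION = 0.70  # assumed safe ceiling for the consolidated host
OVERHEAD_PER_VM = 0.02     # assumed hypervisor overhead per guest

def guests_per_host(avg, target, overhead):
    # Each guest costs its own load plus a slice of hypervisor overhead.
    per_guest = avg + overhead
    return int(target // per_guest)

n = guests_per_host(AVG_UTILIZATION, TARGET_UTILIZATION, OVERHEAD_PER_VM)
print(f"Roughly {n} guests per host at a {TARGET_UTILIZATION:.0%} target load")
```

Even with conservative assumptions the model suggests several-to-one consolidation, which is why analysts call single-app deployment the low-hanging fruit.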

In addition, as the x86 platform itself becomes more powerful, customers should find a growing list of applications appropriate for a virtualized environment. In the last couple of years, systems vendors stepped up the performance of their low-end systems with dual-core processors and 64-bit support. This year will bring servers with virtualization technology built into the silicon, a huge step for the x86 platform, which today can only be virtualized with some fancy – and performance-draining – footwork from software vendors such as VMware and Microsoft.

Having virtualization capabilities hard-wired into the chip means end users will get better performance out of virtual servers, software files that contain an operating system and applications. It also means that VMware and its competitors likely will shift their focus to management tools, resulting in more advanced management capabilities down the road.

Today’s management tools enable end users to easily move and copy virtual servers, providing a simple approach to disaster recovery and high availability. But advanced capabilities – such as a faster and more seamless migration of virtual servers among physical systems – are likely to come in the months ahead. Analysts recommend that customers take a close look at management strategies when they choose a virtualization partner.

“In the next year and a half to two years, the market will be flipping on its head completely. . . . It will shift from the hypervisor [low-level virtualization technology] to management,” says Tom Bittman, a Gartner vice president and Fellow. “So the focus should be on choosing management tools and automation, not on choosing a hypervisor. That will be a commodity.”

Another development that makes 2006 a key year for deploying x86 server virtualization is movement among the independent software vendors to make licensing in a virtual environment more user-friendly. Microsoft, for example, late last year announced a new virtualization-licensing model that stands to slash costs for end users. Though analysts note that this is a small first step in an evolving discussion, it’s encouraging to see Microsoft make an early move, industry experts agree.

Those still unsure if server virtualization on x86 systems has moved beyond hype should consider that open source is getting in on the game, with XenSource announcing its first commercial product designed to make it easier for customers to deploy and manage the open source Xen VM technology in corporate networks.

Although VMware has held a nearly uncontested leadership position since 2001, when it introduced the industry’s first virtualization software for x86-based servers, 2006 will bring end users more options in virtualizing low-end systems. That’s good news from both a price and a performance standpoint.

Software from Microsoft, SWsoft and start-ups such as Virtual Iron and XenSource offer interesting alternatives. With the underlying virtualization technology becoming available in hardware, management tools from companies such as PlateSpin, Leostream and Platform Computing deserve a closer look. Analysts also expect systems vendors such as Dell and HP to intensify their focus on this area.

Ulrich Seif, CIO at National Semiconductor in Santa Clara, Calif., says Intel’s and AMD’s plans to incorporate virtualization into their processors, and the maturing of virtualization software’s features, make slicing and dicing x86 servers a smart move, regardless of the vendor.

Seif brought in VMware last year to consolidate an increasing number of Windows servers and says he already has seen a 33 percent savings and now has an architecture that is flexible and easier to manage. “Almost more importantly, [with server virtualization] you are positioning yourself for future [architectures] that will come natural[ly] with virtualization: true grid computing (with solid management tools); ultimate virus and intrusion detection (the host scanning guest memory for patterns); and software and configuration management,” he says.