Monday, March 30, 2009

How many computers does the world need?

There’s been a lot of buzz lately about this question, this question, and Nicholas Carr’s corollary assertion, “The coming of the mega computer”:

The original quote, from the FT Techblog:

According to Microsoft research chief Rick Rashid, around 20 per cent of all the servers sold around the world each year are now being bought by a small handful of internet companies - he named Microsoft, Google, Yahoo and Amazon. That is an amazing statistic, and certainly not one I’d heard before. And this is before cloud computing has really caught on in a big way.

Having recently been working with one of the vendors of the high density servers designed for this market, I’ve been reviewing the articles and comments with great interest. Our work is almost always on contract for a major IT vendor, so we tend to see the universe through the distorting glass of our clients’ needs. In this case, the outer realities reflect the inner space fairly well.

The market for high density servers is very different from the traditional approach of x86 server vendors. Servers have usually been designed to provide generalized compute capabilities within a form factor such as towers, multiple rack-unit sizes and blades. The new high density servers are much less generalized. Not only do they allow for more discrete units of physical servers within the space required by other form factors – the high density part – but they’re also focused on providing mission-specific computing capabilities through mass customization. Even, or perhaps especially, vendor giants such as IBM are producing server product lines for which the concept of a server model might be an oxymoron. Yes, there are SKUs, but in reality the systems these vendors produce are custom built for the customer.

The economies of scale of mass production fit here because typical orders are for thousands of servers, all integrated into racks redesigned to hold more server units and incorporating power efficiency and cooling schemes that not only allow more servers per data center square foot, but also require less electricity to run the servers and keep them cool. (Cooling electricity costs are generally equal to operational electricity costs.)
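
To put rough numbers on the power claim, here’s a minimal sketch in Python. Every figure in it is an assumption of mine for illustration only (300 W per server, $0.10 per kWh, and cooling overhead equal to the compute load versus a more efficient design):

    # Back-of-the-envelope power math. All numbers are illustrative assumptions,
    # not vendor figures: 300 W per server, $0.10/kWh, and cooling overhead
    # roughly equal to IT load (PUE ~2.0) versus a more efficient design (~1.4).
    def annual_power_cost(servers, watts_per_server, dollars_per_kwh, pue):
        """Yearly electricity bill: IT load, times PUE, times the utility rate."""
        hours_per_year = 24 * 365
        kwh = servers * watts_per_server / 1000 * hours_per_year * pue
        return kwh * dollars_per_kwh

    fleet = 10_000
    baseline = annual_power_cost(fleet, 300, 0.10, 2.0)   # cooling ~= compute
    efficient = annual_power_cost(fleet, 300, 0.10, 1.4)  # better cooling scheme
    print(f"baseline:  ${baseline:,.0f}/yr")
    print(f"efficient: ${efficient:,.0f}/yr")
    print(f"savings:   ${baseline - efficient:,.0f}/yr")

At these made-up rates, a 10,000-server fleet saves on the order of $1.5 million a year from the better cooling design alone, which is why the vendors lead with it.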

The scale of this business is also very different. Where Dell, HP, IBM and the other server vendors compete to sell hundreds of servers to tens of thousands of companies in the traditional markets, in this space there are only a few customers; perhaps a few hundred worldwide. But the math still works – 10,000 servers a month to one account is a lot of servers. And as anyone who’s sold computers knows, five deals to sell 50,000 units is much less expensive than 10,000 deals to sell the same number of units.
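
For what it’s worth, that sales math is easy to sketch. The cost-per-deal figure below is entirely made up; the point is only how the selling cost per unit collapses when a handful of accounts buy in bulk:

    # Toy comparison: selling 50,000 servers via 5 big deals vs. 10,000 small ones.
    # The $50,000 fully loaded cost to close a deal is an invented assumption.
    cost_per_deal = 50_000
    units = 50_000

    few_big_deals = 5 * cost_per_deal
    many_small_deals = 10_000 * cost_per_deal

    print(f"5 deals:      ${few_big_deals / units:,.2f} selling cost per unit")
    print(f"10,000 deals: ${many_small_deals / units:,.2f} selling cost per unit")

That works out to $5 a unit versus $10,000 a unit at the same assumed cost per deal.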

If the future is being able to use compute-like services from devices anywhere on the planet at any time, then Nicholas Carr’s label of the mega computer makes sense. Or, as Scott McNealy said so long ago, “The network is the computer.” Now they’re building it.

The question is whether the vendors can wait while the economy sorts itself out. Perhaps the economies of scale work again here. That same 5 x 10,000 metric applies here too: selling 50,000 servers can pay a lot of bills. Even for IBM.

Tuesday, February 3, 2009

Dell adds Xsigo's I/O virtualization to its storage and server products - Network World

From Deni Conner at Network World…

One of the coolest deals happened this week in the storage arena. I/O virtualization vendor Xsigo teamed up with Dell to expand Dell's data center offerings. In addition to reselling Xsigo's I/O Director, Dell will also collaborate with the company on technology roadmaps and channel resources.

The I/O Director will be paired with the full line of Dell products – PowerEdge servers, PowerVault storage, EqualLogic iSCSI storage and Dell/EMC branded storage

I/O virtualization is the next frontier in data center virtualization and the vendors are scrambling to come up with answers. Xsigo has a unique external approach. HP is doing it in the BladeSystem with Virtual Connect.

I haven’t had the time to figure out what the other vendors (IBM, Sun) are doing.

One thing seems obvious to me with this announcement though: Dell-Xsigo-EMC. Anyone else see a synergy for a single competitor to HP and IBM?

Wednesday, January 28, 2009

Rackable puts desktop CPUs in low-cost servers - Network World

How’s this for a new approach to application serving:

Rackable Systems has turned to low-cost desktop components for a new server design that aims to provide a cheaper alternative for companies running busy Web applications, the company announced Wednesday. The design uses Athlon and Phenom desktop CPUs from Advanced Micro Devices and allows for highly dense servers that can be priced under US$500 because they are based on commodity PC parts, said Saeed Atashie, director of server products at Rackable.

Rackable puts desktop CPUs in low-cost servers - Network World

People have been using desktops as servers for years. Hell, my first publicly accessible web server ran on a Mac SE/30. And it ran commercial sites driven by an Oracle database that ran on a PC!

But that was a one-off test bed, not a large-scale web farm. It was also 1994.

Rackable makes high density x86 servers. Until now they utilized Xeon and Opteron processors. The systems were designed so that as many as 84 servers could fit into a standard server cabinet. At about $3K apiece, they were not cheap, but they were in the same ballpark as the much lower density 1U servers from everyone else.

This offering fulfills the RAIC paradigm: a redundant array of inexpensive computers. While the reliability won’t be as high as with server-grade parts, that doesn’t matter much in this use model. Quickly pulling out a failed server and replacing it is the story here, and dumping a $500 box is much easier to justify to the CFO than a $3,500 one, even if you could make the case that the $3,500 one was cheap too.
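
As a quick sanity check on that argument, here’s a sketch using the figures above (84 servers per cabinet, roughly $3,500 versus $500 per box). The annual failure rates are my own invented assumptions, just to show the shape of the trade-off:

    # Cost of filling one 84-server cabinet with server-grade boxes versus
    # desktop-parts boxes, plus a crude replacement-cost comparison. Prices
    # come from the post above; the annual failure rates are assumptions.
    servers_per_cabinet = 84
    server_grade_price = 3_500   # roughly the old Xeon/Opteron boxes
    desktop_parts_price = 500    # the new commodity-parts design

    print(f"cabinet, server grade:  ${servers_per_cabinet * server_grade_price:,}")
    print(f"cabinet, desktop parts: ${servers_per_cabinet * desktop_parts_price:,}")

    # Even if the cheap boxes fail three times as often, swapping and scrapping
    # them is still the cheaper path at these (assumed) rates.
    afr_server_grade, afr_desktop = 0.03, 0.09
    print(f"yearly replacements, server grade:  ${servers_per_cabinet * afr_server_grade * server_grade_price:,.0f}")
    print(f"yearly replacements, desktop parts: ${servers_per_cabinet * afr_desktop * desktop_parts_price:,.0f}")

That’s $294,000 versus $42,000 per cabinet up front, and the replacement bill still favors the cheap boxes even at triple the failure rate.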

Very interesting.

The shifting tides of heterogeneous storage virtualization :: Wikibon

A few months ago I’d embarked on research concerning external virtualization, here called heterogeneous storage virtualization. I like that term better, and the TLA, HSV, works pretty well too.

The article points to the same drivers I’d identified in my research:

    1. Tiered storage - in an effort to create a default tier 2 strategy and avoid expensive tier 1 platforms;
    2. Migration capability - especially for customers facing a rolling financially-forced lease conversion every year or those with particularly frequent migrations due to acquisition strategies;
    3. Storage consolidation - in an effort to pool heterogeneous storage assets;
    4. VMware and server virtualization - to support backend storage virtualization for virtualized server environments.

So far, my research has focused on the main players: IBM with SVC, EMC with Invista, Hitachi with USVP (and HP, which sells Hitachi-built storage customized to HP specs but still using Hitachi technologies), and LSI with Storage Virtualization Manager. In the past few weeks, HP has brought to market its own version of the LSI product, renaming it SVSP and adding some features and capabilities.

It’s very hard to compare how these products differ in delivering their functionality as the costs are fairly high, but Edison hopes to begin some testing in the next several months. For now, it looks like the market will have to depend upon what the sales folks say. But in the future, perhaps, Edison will have some nuts and bolts information to help organizations make more educated decisions.

Sunday, January 25, 2009

Vista doesn't like to update

I've been having a problem with one of my computers that's running Vista. (I know, I know: why am I running Vista? Probably so I can go through this crap and understand what the fuss is about.)

Anyhow, a few months ago, Windows Update started having problems. It would still update automatically, but I couldn't access the tool to change its settings, and when I wanted to update other Microsoft products, I couldn't access the Microsoft Update site, nor install or run Microsoft Update, or even MS Office auto updates.

I discovered that this might not be an unusual problem - there was a link on the error page to get support from Microsoft. I clicked it.

Microsoft support people started a long e-mail discussion of things to try. I've tried them all over the past many weeks. Install this, run that command line utility, download that other thing, do an upgrade installation (which failed because I'd installed SP1 and the Vista DVD was pre-SP1; MS kindly sent new software media, but it too was pre-SP1 and also came on CD, not DVD. Oh, and it didn't work).

Additional research on the Microsoft forums has shown that this problem is common enough to have several threads about it.

Everyone seems to go through the same steps. Sometimes it works and sometimes it doesn't. I know that Microsoft is at a disadvantage compared to Apple in this - every PC has different hardware and software installed, and not all of those products are made to the same high standards. And updating live software is always a challenge. The biggest advantage Windows (and earlier MS OSes) has over the Macintosh OSes is that it runs on almost any Intel-compatible hardware. Its biggest disadvantage is that it also may not run on almost any Intel-compatible hardware.

The fruit of this conflicting agenda is the failure of Vista to gain market acceptance and the ongoing concerns about next generation operating systems.

Sad.

Tuesday, December 2, 2008

Evangelism, Word-of-Mouth and Passion is the Next Evolution of Advertising Research

Jack Myers, a marketing and advertising expert, has written an interesting set of articles on the “irreversible progression of advertising messaging away from the 100-year tradition of mass reach and building awareness and toward an emphasis on achieving and maintaining consumer trust and passion.” While my company, Edison Group, is not engaged in advertising, the themes are very relevant to what Edison tries to do for our clients. I think these themes should be referenced in our marketing and underlie much of our work.

The following is the meat of the article:

Rather than position these as self-contained and differentiated approaches to advertising value, it is more appropriate to consider established and emerging principles as redundant and overlapping.

Awareness = Relevance
Consumers must judge an advertised product or service to be relevant in order for that product/service to register in their consciousness, aka awareness -- a measure that researchers have defined through old-fashioned recall research.

Interest Requires Differentiation
For consumers to become actively interested in purchasing or even learning more about a product/service, its messaging will need to clearly differentiate it from its competitors. Whether that differentiation is based on price, quality, geography or other considerations, advertising will need to clearly communicate a differentiated positioning strategy.

Retention Requires that Consumers Clearly Perceive the Product/Service
Retention has also been measured in old-fashioned ways: how long do consumers remember a message after it has been communicated – and how recently does the message need to be repeated for it to have a meaningful impact. In the future, new measures will focus, as explained by BBDO president Andrew Robertson, on whether consumers can perceive the product or services fitting into the patterns and rituals they maintain throughout their lives.

Trust is Essential to Convince Wary Consumers
Once advertising establishes the unique points of differentiation and the relevance of the product/service to the lives of target consumers, a wary public will consider whether or not the advertiser can be trusted and if they can safely make a purchase decision with confidence the promise of the advertising will be fulfilled. Convincing and persuading consumers requires that consumer trust be established; continued purchase requires that trust be maintained.

The Ultimate Goal for Marketing is Passion and Active Evangelism
A brand is defined by differentiation, relevance, effectively communicated messages and fulfilled expectations. Brand equity is lost when there is a loss of trust. But the final goal of advertising – which in the past has been solely based on the actual purchase – now has an even more fully evolved goal. Consumer advocacy – the desire to be a public proponent and evangelist for a product or service – is defined by the passion consumers have for products and services. Passion goes beyond actual purchase decisions and reflects a proactive decision to seek to convince others to consume the product as well.

Marketers, creative agencies and all those involved in the creative process across all media and entertainment must begin thinking beyond traditional research metrics and traditional measures of success. They must begin evolving their strategies to move beyond reach and even beyond direct consumer actions – whether those actions be clicks or purchases. The next generation of communications will focus on passion – defined by word-of-mouth, blogs, evangelism, conversational marketing, and other forms of advocacy.

While it should be obvious, I’ll illustrate how each of these principles applies to our work and how we bring value to our customers.

Relevance – this is key to matching a technology product to real end user needs. Not only must the products themselves be useful to our clients’ customers, but the work we do should be relevant to those customers' real needs. For example, our work on management ease is not only a TCO issue, even though that’s how we’re presenting the concept. Ease of use is also relevant to many readers who are used to struggling with products that are just too complex or hard to use. The relevance comes from that personal connection and the comfort a user feels when a product is truly easy to use. Another example: Apple

Differentiation – this is precisely what we do. I generally call it validation of positioning: providing demonstrable proof that differentiating factors for our clients’ products are true within the product’s own context and versus competitors. By providing validation, we not only support the assertions of our clients, but enhance our demonstrations of relevance to the customer’s needs and perceptions.

Clear Perception Enables Retention – by providing a relevant, clear and validated differentiation of our clients’ products, we provide conceptual hooks by which customers can retain the impressions imparted by our work product. Whether it’s the belief, current in the customer base, that HP EVA is easy to use (something supported by our work and remembered as such by customers), or a silver bullet in a sales guide that is relevant to a salesperson’s daily effort (she made the sale because of something we wrote, and she remembers that we wrote it), the best outcomes from our projects come from our clear writing style and approach to our work.

Trust – this is a core value we bring to the table. Not only trust in our research, but also, by extension, in our clients’ products. A reader won’t retain what we’ve produced if they don’t think what we’ve written is true. They won’t believe the differentiations if they don’t trust our methodologies and independence, and they can’t perceive relevance if they don’t trust that we understand their issues.

Passion and Active Evangelism – these are even more critical for our customer base. The enterprise technology products we write about are boring: they rarely inspire passion in their users other than hatred when the products don’t work as advertised. Overcoming negative passions and inspiring word of mouth endorsements can go a long way in helping our clients sell more. Helping end user advocates close their internal sell for these expensive solutions is an important goal for most of our public documents. Our white papers don’t necessarily change the reader’s mind: instead the reader uses what they have learned to prove to their boss the efficacy of the solutions we’re supporting.

Monday, June 30, 2008

The Future of OS?

It would seem that I was on the right track over the past decade. The following excerpts from a ZDNet blog by Mary Jo Foley, and from the referenced Microsoft research sites, describe research into OS architecture I was writing about years ago when Microsoft was first getting legally slammed for monopolistic practices and delivering crummy software.

My points over the years have been that operating systems were being made to do things for which they were not designed, and that the use of legacy code and approaches was hobbling functionality and crippling performance, security, and innovative ways of using ever-improving microprocessor designs and features.

I'd felt that starting over from the basics might solve lots of problems that are the result of renovating and building new additions to an old architecture. This should be obvious, but it's very good to see a willingness on the part of the owners of that rickety Rube Goldberg building to start over from scratch.

The first two quotes are about a "pure" research project at Microsoft: Singularity. The third quote is about a Microsoft spin-off from that research: Midori.

From ZDNet:

“The Singularity project started in 2003 to re-examine the design decisions and increasingly obvious shortcomings of existing systems and software stacks. These shortcomings include: widespread security vulnerabilities; unexpected interactions among applications; failures caused by errant extensions, plug-ins, and drivers, and a perceived lack of robustness. We believe that many of these problems are attributable to systems that have not evolved far beyond the computer architectures and programming languages of the 1960’s and 1970’s. The computing environment of that period was very different from today….”

Some more detail from Microsoft Research:

The status quo that confronted them (the Microsoft Research team) was the decades-long tradition of designing operating systems and development tools. Contemporary operating systems—including Microsoft Windows, MacOS X, Linux, and UNIX—all trace their lineage back to an operating system called Multics that originated in the mid-1960s. As a result, the researchers reasoned, current systems still are being designed using criteria from 40 years ago, when the world of computing looked much different than it does today.

“We asked ourselves: If we were going to start over, how could we make systems more reliable and robust?” Larus says. “We weren’t under the illusion that we’d make them perfect, but we wanted them to behave more predictably and remain operating longer, and we wanted people to experience fewer interruptions when using them.”

From the same ZDNet story on Midori:

“There’s a seemingly related (related to Singularity) project under development at Microsoft which has been hush-hush. That project, codenamed ‘Midori,’ is a new Microsoft operating-system platform that supposedly supersedes Windows. Midori is in incubation, which means it is a little closer to market than most Microsoft Research projects, but not yet close enough to be available in any kind of early preview form.”

There's not much information on Midori, but that's not too important. What's important is the possibility that, as we transition from the PC paradigm to something very different (and beyond the mobile device model too), there is a willingness to consider a whole new way of utilizing what we call computing technology. Without knowing more about what Microsoft is up to, and with no inside knowledge about what the other interested companies might be doing, I'd like to pose an idea:

Move away from our current hardware architectures. Find alternatives to buses and other limiting structures. Look at hardware design from the same new perspective as Microsoft is looking at OS design. Start over from scratch. And start over by scratching the needs of the folks who might actually find this new paradigm useful.

Computers started as tools for breaking WWII ciphers and calculating ballistics. It was only in 1951 that the first business applications arrived. The first graphical computer game arrived a year later. All of these inventions were built upon engineering principles and the limitations of vacuum tubes and early electronics. We're now at a point where those limitations can be transcended by a bit of imagination.

I look forward to seeing the imagination in action.