Tuesday, December 2, 2008

Evangelism, Word-of-Mouth and Passion Are the Next Evolution of Advertising Research

Jack Myers, a marketing and advertising expert, has written an interesting set of articles on the “irreversible progression of advertising messaging away from the 100-year tradition of mass reach and building awareness and toward an emphasis on achieving and maintaining consumer trust and passion.” While my company Edison Group is not engaged in advertising, the themes are very relevant to what Edison tries to do for our clients. I think these themes should be referenced in our marketing and underlie much of our work.

The following is the meat of the article:

Rather than position these as self-contained and differentiated approaches to advertising value, it is more appropriate to consider established and emerging principles as redundant and overlapping.

Awareness = Relevance
Consumers must judge an advertised product or service to be relevant in order for that product/service to register in their conscious, aka awareness -- a measure that researchers have defined through old-fashioned recall research.

Interest Requires Differentiation
For consumers to become actively interested in purchasing or even learning more about a product/service, its messaging will need to clearly differentiate it from its competitors. Whether that differentiation is based on price, quality, geography or other considerations, advertising will need to clearly communicate a differentiated positioning strategy.

Retention Requires that Consumers Clearly Perceive the Product/Service
Retention has also been measured in old-fashioned ways: how long do consumers remember a message after it has been communicated – and how recently does the message need to be repeated for it to have a meaningful impact. In the future, new measures will focus, as explained by BBDO president Andrew Robertson, on whether consumers can perceive the product or services fitting into the patterns and rituals they maintain throughout their lives.

Trust is Essential to Convince Wary Consumers
Once advertising establishes the unique points of differentiation and the relevance of the product/service to the lives of target consumers, a wary public will consider whether or not the advertiser can be trusted and if they can safely make a purchase decision with confidence the promise of the advertising will be fulfilled. Convincing and persuading consumers requires that consumer trust be established; continued purchase requires that trust be maintained.

The Ultimate Goal for Marketing is Passion and Active Evangelism
A brand is defined by differentiation, relevance, effectively communicated messages and fulfilled expectations. Brand equity is lost when there is a loss of trust. But the final goal of advertising – which in the past has been solely based on the actual purchase – now has an even more fully evolved goal. Consumer advocacy – the desire to be a public proponent and evangelist for a product or service – is defined by the passion consumers have for products and services. Passion goes beyond actual purchase decisions and reflects a proactive decision to seek to convince others to consume the product as well.

Marketers, creative agencies and all those involved in the creative process across all media and entertainment must begin thinking beyond traditional research metrics and traditional measures of success. They must begin evolving their strategies to move beyond reach and even beyond direct consumer actions – whether those actions be clicks or purchases. The next generation of communications will focus on passion – defined by word-of-mouth, blogs, evangelism, conversational marketing, and other forms of advocacy.

While it should be obvious, I’ll illustrate how each of these principles applies to our work and how we bring value to our customers.

Relevance – this is key to matching a technology product to real end user needs. Not only must the products themselves be useful to our clients’ customers, but the work we do should be relevant to those customers' real needs. For example, our work on management ease is not only a TCO issue, even though that’s how we’re presenting the concept. Ease of use is also relevant to many readers who are used to struggling with products that are just too complex or hard to use. The relevance comes from that personal connection and the comfort a user feels when a product is truly easy to use. Another example: Apple

Differentiation – this is precisely what we do. I generally call it validation of positioning: providing demonstrable proof that differentiating factors for our clients’ products are true within the product’s own context and versus competitors. By providing validation, we not only support the assertions of our clients, but enhance our demonstrations of relevance to the customer’s needs and perceptions.

Clear Perception Enables Retention – by providing a relevant, clear and validated differentiation of our clients’ products, we provide conceptual hooks by which customers can retain the impressions imparted by our work product. Whether it’s the belief, current in the customer base, that HP EVA is easy to use — something supported by our work and remembered as such by customers — or a silver bullet in a sales guide that is relevant to a salesperson’s daily effort — she made the sale because of something we wrote and she remembers that we wrote it — the best outcomes from our projects come from our clear writing style and approach to our work.

Trust – is a core value we bring to the table. Not only trust in our research, but also, by extension, in our clients’ products. A reader won’t retain what we’ve produced if they don’t think what we’ve written is true. They won’t believe the differentiations if they don’t trust our methodologies and independence, and they can’t perceive relevance if they don’t trust that we understand their issues.

Passion and Active Evangelism – these are even more critical for our customer base. The enterprise technology products we write about are boring: they rarely inspire passion in their users other than hatred when the products don’t work as advertised. Overcoming negative passions and inspiring word-of-mouth endorsements can go a long way toward helping our clients sell more. Helping end-user advocates close the internal sale for these expensive solutions is an important goal for most of our public documents. Our white papers don’t necessarily change the reader’s mind: instead, the reader uses what they have learned to prove to their boss the efficacy of the solutions we’re supporting.

Monday, June 30, 2008

The Future of OS?

It would seem that I was on the right track over the past decade. The following excerpts, from a ZDNet blog by Mary Jo Foley and the Microsoft Research sites she references, describe research into OS architecture that I was writing about years ago, when Microsoft was first getting legally slammed for monopolistic practices and delivering crummy software.

My points over the years have been that operating systems were being made to do things for which they were not designed, and that the use of legacy code and approaches was hobbling functionality and crippling performance, security, and innovative ways of using ever-improving microprocessor designs and features.

I'd felt that starting over from the basics might solve lots of problems that are the result of renovating and building new additions to an old architecture. This should be obvious, but it's very good to see a willingness on the part of the owners of that rickety Rube Goldberg building to start over from scratch.

The first two quotes are about a "pure" research project at Microsoft: Singularity. The third quote is about a Microsoft spin-off from that research: Midori.

From ZDNet:

“The Singularity project started in 2003 to re-examine the design decisions and increasingly obvious shortcomings of existing systems and software stacks. These shortcomings include: widespread security vulnerabilities; unexpected interactions among applications; failures caused by errant extensions, plug-ins, and drivers, and a perceived lack of robustness. We believe that many of these problems are attributable to systems that have not evolved far beyond the computer architectures and programming languages of the 1960’s and 1970’s. The computing environment of that period was very different from today….”

Some more detail from Microsoft Research:

The status quo that confronted them (the Microsoft Research team) was the decades-long tradition of designing operating systems and development tools. Contemporary operating systems—including Microsoft Windows, MacOS X, Linux, and UNIX—all trace their lineage back to an operating system called Multics that originated in the mid-1960s. As a result, the researchers reasoned, current systems still are being designed using criteria from 40 years ago, when the world of computing looked much different than it does today.

“We asked ourselves: If we were going to start over, how could we make systems more reliable and robust?” Larus says. “We weren’t under the illusion that we’d make them perfect, but we wanted them to behave more predictably and remain operating longer, and we wanted people to experience fewer interruptions when using them.”

From the same ZDNet story on Midori:

“There’s a seemingly related (related to Singularity) project under development at Microsoft which has been hush-hush. That project, codenamed ‘Midori,’ is a new Microsoft operating-system platform that supposedly supersedes Windows. Midori is in incubation, which means it is a little closer to market than most Microsoft Research projects, but not yet close enough to be available in any kind of early preview form.”

There's not much information on Midori, but that's not too important. What's important, as we transition from the PC paradigm to something very different (and beyond the mobile device model too), is the willingness to consider a whole new way of utilizing what we call computing technology. Without knowing more about what Microsoft is up to, and with no inside knowledge about what the other interested companies might be doing, I'd like to pose an idea:

Move away from our current hardware architectures. Find alternatives to buses and other limiting structures. Look at hardware design from the same new perspective as Microsoft is looking at OS design. Start over from scratch. And start over by addressing the needs of the folks who might actually find this new paradigm useful.

Computers started as tools for breaking WWII ciphers and calculating ballistics. It was only in 1951 that the first business applications arrived. The first graphical computer game arrived a year later. All of these inventions were built upon engineering principles shaped by the limitations of vacuum tubes and early electronics. We're now at a point where those limitations can be transcended with a bit of imagination.

I look forward to seeing the imagination in action.

Wednesday, June 25, 2008

Tiered, Schmeared, Weird: Enterprise Storage Management Issues and Technologies

We've been doing research in the area of enterprise storage management, with a focus on what is now called Tiered Storage and what EMC calls Information Lifecycle Management (ILM). The technology was also called, back in the day, Hierarchical Storage Management (HSM).

The basic idea is pretty obvious. Store the most critical, performance-sensitive data on devices that are the most reliable and highest performing (and most expensive). Store less critical data on less expensive devices, and the least critical data on the least expensive and possibly offline devices.

I'll probably address several aspects of these technologies in multiple postings over the next several weeks. For now, I'll just define some terms and establish a baseline context for the future. For these discussions, most of the storage solutions mentioned will be shared. Access may be over a Fibre Channel SAN, NAS, or iSCSI, but in every case the devices themselves are shared.

First of all, let's describe the nature of the hardware typical of the different storage tiers. As mentioned previously, the logic behind tiered storage is that the most performance-sensitive, mission-critical data should be stored on devices that offer the greatest reliability and performance. This generally means Fibre Channel Storage Area Networks with the largest arrays and the fastest backplanes and connections. Products in this class include the IBM DS6000 and DS8000, the HP XP series, the Sun StorageTek 9900 series, EMC Symmetrix, the Hitachi Universal Storage Platform, Pillar Axiom, and a few others.

The second storage tier, at least for purposes of this discussion, consists of systems that are not as powerful as the top-of-the-line products. They are intended for servers and applications where maximum performance is not the key driver. The hardware can be the same as previously listed, perhaps with lesser components, or older models. Depending on the scale of the organization, these devices could also be the same vendor's mid-range systems, for example HP EVA or EMC CLARiiON. This tier also opens the door for additional vendors and connection protocols, for example NetApp filers. (I know that NetApp makes high-end devices and could be included in the previous list, but this discussion is already very complicated.)

The next tiers bring us to another aspect of the story: online, near-line and offline storage. Quickly: online storage is connected to and immediately available to the devices needing the data. Near-line storage is connected, but may not be immediately available for use, such as a Magstar tape used in legacy mainframe systems. Offline storage holds data that needs to be brought online before it can be accessed; this can be as sophisticated as a DVD stored in an automated library or as simple as a tape stored in a vault. The tiers being discussed here can consist of any combination of devices that fit into these three states, depending upon need and design.
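
To make the policy side of this concrete, here's a minimal sketch of how a tiering decision might be automated. It's purely illustrative: the tier names, data-set attributes and thresholds are hypothetical and don't represent any particular vendor's ILM or HSM engine.

    # Illustrative only: a toy tiering policy, not any vendor's actual ILM/HSM engine.
    # Tier names and thresholds below are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class DataSet:
        name: str
        mission_critical: bool
        days_since_last_access: int

    def assign_tier(ds: DataSet) -> str:
        """Map a data set to a tier and availability state (online/near-line/offline)."""
        if ds.mission_critical:
            return "tier 1: online, high-end FC array"
        if ds.days_since_last_access <= 30:
            return "tier 2: online, mid-range array or NAS filer"
        if ds.days_since_last_access <= 365:
            return "tier 3: near-line, automated tape or DVD library"
        return "tier 4: offline, vaulted tape"

    if __name__ == "__main__":
        for ds in (DataSet("orders-db", True, 0),
                   DataSet("last-quarter-reports", False, 45),
                   DataSet("fy2001-archives", False, 2400)):
            print(f"{ds.name}: {assign_tier(ds)}")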

My next post will discuss some of the ways vendors are attempting to provide tiered storage solutions. In the future, I'll be looking at solutions from vendors who purport to provide alternatives that greatly simplify this architecture, such as XIV Nextra, and perhaps drill down into some of the technical details. I'll also be discussing how the various vendors go to market with their interpretations of these themes. Not to belabor the obvious, but each vendor's strategy exploits its history, market and product strengths.

Tuesday, May 20, 2008

The Server Virtualization Battle

Citrix's Xen won't cede to VMware or Hyper-V

It seems like only yesterday that I was wondering whether the price war in the hypervisor market should be worrying VMware. Wait, it was yesterday. Today I read that Citrix won't concede in the price war or on the technology.

From the TechTarget web site:

In December 2006, two startups, XenSource Inc. (now owned by Citrix) and Virtual Iron Software Inc., kicked off a virtualization price war, offering virtualization for as little as one-sixth the cost of VMware. Last fall with the entrance of Oracle Corp., Novell Inc., Red Hat Inc. and others into the battle, the price competition intensified, and then this spring, rivalry flared when Citrix cut prices again and initiated flat pricing for servers with up to four sockets. Citrix's efforts have met some success as well. Now all the players have geared up for Microsoft's August 2008 launch of Hyper-V, which is extremely low cost at the price of $28 per server. Citrix has a special mission in this new landscape. Sandwiched in between feature-rich VMware and lower-cost Hyper-V, Citrix's Xen has the daunting task of remaining price-competitive yet fully featured enough to compete.

Citrix's Xen won't cede to VMware or Hyper-V

Yesterday was about VMware not lowering prices because they "…have the right product packages with prices that allow the customer to do whatever it is they need to do with our technology…".

Today is about how "…Citrix may ultimately focus on the desktop virtualization. First, Microsoft hasn't targeted its virtualization efforts to virtual desktops, leaving an opportunity for others to gain advantage."

And that Citrix "…has an open storage interface that integrates innovations from storage vendors into the hypervisor with plug-in drivers."

The rest of the article combines some market-speak with lots of assertions about feature comparisons between Citrix XenSource, VMware and even Microsoft Hyper-V (though to be fair, they state clearly that since Hyper-V isn't shipping, feature comparisons are invalid at this time). Lots of on-the-one-hand vs. on-the-other-hand comparisons.

Actually comparing these products, and either publishing the results for all to see or sharing them privately with our vendor clients, is what my company does best. VMware: hear those footsteps behind you? Novell: wonder why you're not in the top tier? Virtual Iron: the reviewers like you, but don't you need help getting some real market share? Microsoft: we know you're going to deliver something functional eventually; wouldn't it be good to have some support for your product claims and debunk the critics?

Why don't you give me a call? 

Monday, May 19, 2008

VMware won't lower price, says exec. Does pride go before a fall?

The link will take you to the whole interview, but the point is made in the headline.

VMware won't lower price, says exec

The executive, senior director of product marketing Bogomil Balkansky, goes on to explain:

"...The objective is to make sure that we have the right products at the right prices for all types of customers. I don't think that principle is going to change.

I would argue that right now, we have the right product packages with prices that allow the customer to do whatever it is they need to do with our technology. Whether it's something simple like partitioning a server for test and development, a single server consolidation, production server consolidation for a larger environment or if they want to reap the other benefits of virtualization, such as business continuity, disaster recovery and dynamic resource management, they can do that at a price that affords a tremendous ROI. Virtualization is one of those technologies where the ROI is quick, undeniable and easy to see. "

The interview then goes on to address the competitive challenges that VMware is facing, denying that the competitors will have any immediate impact on VMware and asserting that, though Microsoft will become a formidable competitor, it is years away from being even close to VMware in delivering a server virtualization platform with the management functionality enterprises expect.

Mr. Balkansky makes very good points about the strengths VMware brings to the table, and is essentially correct in his assessment of the current server virtualization landscape. The viewpoint is in most essentials that offered by any market leader who has also created the market. In marketing, being first to market with a successful product and sticking with it until the product gets accepted and becomes a de facto standard is the place to be.

Our analysis of server virtualization, including the early work we did in developing tile-based virtualization capacity benchmarks, has all centered on VMware. Our research on competing hypervisors and virtualization management products not only has looked to VMware as the leader, but has recognized the overall superiority of VMware's products.

There's just this one thing: market-leader arrogance. As I said, the viewpoint expressed in the interview is that offered by other market leaders. I'm concerned, though, about the arrogant tone: no one out there is a real competitor, so we're not really worried about them. We'll watch Microsoft, but Citrix? Xen? Non-issues.

Nothing wrong with the assertions. The facts bear them out. But Citrix's release of XenDesktop will have an impact, and just because Microsoft hasn't delivered a full management and deployment model for Server 2008 with Hyper-V doesn't mean the imminent release won't be important in the marketplace.

Mr. Balkansky may not be willing to express concern in an interview (though I'd be surprised if the filings for the stock market are so sanguine), but he should be looking over his shoulder. A free, fairly strong offering from Microsoft will have tremendous impact, especially in the smaller businesses where the costs for ESX and Infrastructure 3 are perceived as prohibitive. Sure, a company buying two or three or four VMware licenses is not too important to VMware. Most vendors don't think about these small sales too often. But there are tens of thousands of companies with a half dozen or so underutilized x86 servers looking at virtualization. Microsoft's offering will be very attractive to these folks. The lack of a full suite of management tools will probably not matter too much, since these companies are probably not using much in the way of sophisticated management platforms anyway.

Hopefully, VMware is investing in following the success of its competitors and looking closely at how their products actually deliver rather than only at the market numbers. Otherwise, a year or two from now, they might be surprised to see their market share shrinking. And shrinking in their existing customer base, where it hurts.  

Friday, March 7, 2008

Apple and Business — Is it 25 Years Too Late?

Jason Perlow, writing on Larry Dignan's blog at ZDNet.com, isn't impressed. (I think it's Jason, as that's what it says on the blog, but I may be misinterpreting the page design.) The lede:

Hey, Apple’s releasing the Insanely Great iPhone SDK so we can all write enterprise iPhone applications! They’re going to make the iPhone compete with the BlackBerry and hook it up to corporate Exchange email servers! Whoopee! Apple has a Business strategy!

Apple and Business — Is it 25 Years Too Late? | Between the Lines | ZDNet.com

I think he protests too much.

Most of the points made are not in error, though obviously intended to incite fanboy rage. I've had similar thoughts myself for years, and expressed them to Apple executives and other industry folk. I have only a few issues with the entry.

First of all, it's obviously written from an "anti-Apple" perspective. I've probably been reading this column for years, but I'm not a fan of anything and don't recall whether this blogger (either Larry or Jason) is pro-Apple, anti-Apple or just the sort of writer who likes to put down people and organizations because the negativity excites readers and builds traffic. Perhaps they're positioning themselves as the Rush Limbaugh of computing? My issue - enough with the Apple and Steve Jobs bashing. It seems you're jealous or something. Get over it.

What's more important and much more annoying is the assumption that Apple wants to have an enterprise computing strategy that makes their products just like everyone else's.

Some quotes to illustrate:

For Apple to have a real Enterprise strategy, they are going to have to do better than iPhones and sex appeal.

Much of this enterprise cluelessness stems from a 25-year history of Steve Jobs and his legacy at Apple of not really caring about what businesses actually want.

Mac OS X technology is elegant, and there is nothing wrong with it from a pure software architecture perspective. The problem is that Steve Jobs and Apple doesn’t really give a damn about how to apply the technology to business and make it attractive to enterprises in order to mass adopt it.

The last sentence is the important one. If my interpretation is true - that the author, whoever it may be, believes that Apple's enterprise computing strategy must be to conform to the mold set by IBM, Dell, HP and so forth, and that mass adoption is the ultimate goal for any computer technology company selling to business - then the author is missing the point. Apple doesn't need to be HP or Dell. It needs to be Apple.

Not all business users need to have absolute conformity. Or don't you actually recall the point of that groundbreaking 1984 ad?

All Apple needs to do is be a good enough corporate citizen that enterprise computing groups don't reject it out of hand, and enable those who would benefit from the enjoyment of an iPhone or the good looks, stability, security and reliability of an iMac, MacBook Pro or whatever to participate as network peers.

Maybe I'm just an old hippy, but conformity is not the only, or even the best corporate culture to emulate. Even IBM doesn't have the corporate culture that that 1984 ad satirized.

I agree with some of the technical arguments and would love to see Mac OS servers running on enterprise-class hardware, at least as VMs on x86 boxes from various vendors on any virtualization architecture (VMware, XenSource, Hyper-V).

It would be even more important if Apple collaborated with the BMCs, Tivolis and CAs of the world to make sure that enterprise management platforms can easily administer Macs without the absolute need to use Apple's own tools, excellent though those may be.

But go for dominance? Why? That's Microsoft's thing - and all that's done is get them in trouble they didn't need. If Apple can beat their 1% smartphone target with the iPhone this year because of corporate acceptance, that's great. If they can get even 5% of corporate laptops to be Macs - that's even better. And if they get secretarial and executive desktops to even 5%, it would be a huge jump in sales for them. Why do they need to strive for bigger things right now?

Wednesday, March 5, 2008

Quote for the day

 

There are grammatical errors even in his silence.
  - Stanislaw J. Lec

Tuesday, February 26, 2008

A Tragic Flaw

Watching the debate tonight...

There is a tragic flaw in our precious Constitution, and I don't know what can be done to fix it. This is it: Only nut cases want to be president.
  - Kurt Vonnegut

Thursday, February 14, 2008

Installing Oracle on Linux is too difficult

A blogger at Tech Republic, Rex Baldazo, says that Oracle installation on Linux isn't as easy as it could be.

I still think Oracle could stand to spend more time making its Linux installation smoother.

Installing Oracle on Linux is too difficult | Programming and Development | TechRepublic.com

The author goes on to say that his problems may stem from being more familiar and comfortable with Windows:

"I’m neither a Linux admin nor an Oracle DBA, which partly explains why I’m having so much trouble getting Oracle installed properly on a Linux machine. But the installation is just too complicated, especially compared to the same installation on Windows."

The company I work for, Edison Group, Inc., has performed installations of Oracle on Windows in comparison tests for several years now. One of the things we've discovered is that Oracle's database was easier to install than even Microsoft SQL Server, especially SQL Server 2005. It made sense for us to install on Windows for these projects, since comparing how SQL Server installs on Windows with how Oracle installs on Linux made no sense. We also compared Oracle Database 10g with several versions of IBM DB2 UDB. We used Windows for these comparisons too, to maintain the same context as the MS SQL Server comparisons.

Maybe our premise was too limited for today's datacenter realities. It seems that when installing on Linux, the Oracle installer doesn't address pre-installation steps directly. According to Rex:

"Even when I thought I had the pre-install steps completed properly, the Universal Installer would complain about one thing or another. And, of course, the darned Installer doesn’t have any ability to fix the problem — you just have to quit the installation"

He compares this to what happens with SQL Server:

"...Microsoft’s SQL Server install, which can handle much of its own pre-requisite software if necessary.

...even if you follow all the pre-install steps properly and the Universal Installer runs without problems, you still don’t end up with what I would call a working installation. You still have to do post-installation steps manually to get the database to where it will automatically restart when the OS is rebooted."
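
The gap Rex describes, an installer that flags prerequisite problems but can't fix them, is exactly the sort of thing Linux admins end up scripting around themselves. Here's a minimal sketch of such a pre-flight check. The kernel parameters and the oracle/oinstall/dba names are the familiar ones from Oracle's Linux installation guides, but the minimum values shown are placeholders of mine, not Oracle's documented requirements.

    # Sketch of a pre-install check a site might keep alongside the Universal Installer.
    # Parameter names are the usual Oracle-on-Linux suspects; the minimums are
    # illustrative placeholders, not Oracle's documented requirements.
    import grp
    import pwd
    from pathlib import Path

    KERNEL_MINIMUMS = {          # illustrative thresholds only
        "kernel/shmmax": 536870912,
        "kernel/shmmni": 4096,
        "fs/file-max": 65536,
    }

    def check_kernel_params():
        problems = []
        for param, minimum in KERNEL_MINIMUMS.items():
            value = int(Path("/proc/sys", param).read_text().split()[0])
            if value < minimum:
                problems.append(f"{param} = {value}, want >= {minimum}")
        return problems

    def check_os_accounts(user="oracle", groups=("oinstall", "dba")):
        problems = []
        try:
            pwd.getpwnam(user)
        except KeyError:
            problems.append(f"OS user '{user}' does not exist")
        for g in groups:
            try:
                grp.getgrnam(g)
            except KeyError:
                problems.append(f"OS group '{g}' does not exist")
        return problems

    if __name__ == "__main__":
        issues = check_kernel_params() + check_os_accounts()
        print("\n".join(issues) if issues else "pre-install checks look OK")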

It seems to me that Oracle has a challenge here as great as the one that instigated the original research we did for them: can it maintain the ease-of-use advantage that Edison has been measuring over the years on other platforms? After all, Oracle seems to use the same installer for all the platforms. Why isn't it as thorough on Linux as on Windows? Or does the installer work better on Oracle's own Red Hat-based distribution, where Oracle controls the bundled packages and perhaps has access to RPM tools specifically for that distro? It wouldn't matter if the difference weren't that great, since everyone in IT knows that installations on Linux are usually more complex than on Windows. Oracle has been pushing Linux as the preferred OS for its eponymous database server for several years now. I would think that they should be making every possible effort to ensure that their product offers the same ease-of-use capabilities on Linux as on Windows.

Edison has been negotiating with Oracle to update our research to reflect new products from all the vendors. They have been reluctant to fund the additional research required to update the procedures, which have been in use since 2000, to reflect the changes in the products. The issue raised here adds some additional areas of concern that Oracle should have Edison address. I'm not hopeful that Oracle will be willing to have Edison both update the procedures and look at Linux when comparing Oracle to IBM DB2 UDB, let alone compare Oracle on Linux to anything on Windows.

To me, that comparison, even if Oracle on Linux is not quite as easy as on Windows, would be a powerful competitive message, one that would leverage the ease-of-use message on top of the Unbreakable Linux message.

Friday, January 25, 2008

John Dvorak says "Promises of Productivity Are Often BS"

Or at least that's the headline his editor came up with.

The last paragraph in the article:

I'm almost tempted to do a book on the whole notion of productivity, since I find it to be fraudulent in too many instances. Exactly how do you measure worth in today's white-collar workforce? It's a total crapshoot.

Promises of Productivity Are Often BS - Columns by PC Magazine

The opening paragraph:

I have forever been amused by sales pitches that a product or service will pay for itself within so many weeks, months, or years. Generally speaking, if "pay for itself" means the product or service will actually increase cash flow and sales to an extreme, then I'm in. But if "pay for itself" means an increase in productivity, then the red light on top of my BS meter immediately goes off.

In between, he talks about experiences with devices that are supposed to enhance productivity, and about the faulty metric of time savings: because people don't actually work 100% of the time, a few minutes saved are as likely to be absorbed by IM as by getting more work done.

Now, I've been touting these themes in TCO analyses for a while, so the article hit home. But the fallacy in the discussion isn't that people don't work 100% of the time, but the misapplication of productivity metrics to the work done. For me, being more productive means workers get more done in the same or even less time than before. While a business might gain by increasing the total volume of work product through productivity gains, as John says, these gains are mostly manifest in manufacturing and other repetitive-task industries.

An office worker may or may not produce more total work product through the use of higher-productivity tools, but other factors will probably have enough effect to limit those gains. OTOH, a systems administrator will definitely be able to handle more devices through productivity-enhancing tools, as his or her repetitive tasks will be simplified or otherwise made more efficient.

Ultimately, productivity claims are BS. Usually, as I've stated elsewhere, the problem is claiming a productivity gain because fewer people are producing the same work product. They just work longer hours. Often much longer hours. That's not a true productivity gain. It's a labor rip-off by the employer. A true productivity gain is when the same number of employees can produce more work product in the same number of hours.

Now John, in his professional curmudgeonly way, is right about the nature of office work and in picking on the marketing around productivity. But the biggest productivity-enhancing office tool has to be the networked PC/word processor/laser printer combination. The amount of time saved by the use of these three tools is almost incalculable.

Personally, if a better tool enables me to get some kinds of work done so I can write this blog, how bad can it be?

Wednesday, January 23, 2008

Geeks Unite!

According to AP, IBM is playing fast and loose with employee pay:

BOSTON (AP) -- Even as IBM Corp. reports record profits, thousands of its U.S. employees are staring at pay cuts.

It's the result of IBM's response to a lawsuit in which the company was accused of illegally withholding overtime pay from some technical employees. IBM settled the case for $65 million in 2006 and has now decided that it needs to reclassify 7,600 technical-support workers as eligible for overtime.

But their underlying salary - the base pay they earn for their first 40 hours of work each week - will be cut 15 percent to compensate.

Wired News - AP News

Basically, IBM is reclassifying people who were on salary as hourly workers to whom overtime applies. It's then cutting their base salary by 15 percent to compensate for the overtime it supposedly expects all of them to work. That's five hours per week. Only some of the 7,600 will actually get that overtime, and it seems that the majority expect their time to be cut back to 40 hours regardless.

Perhaps IBM managers will follow in Walmart's footsteps and force these workers to clock out and then work the overtime for no extra pay. Seems reasonable. After all, profits need to go up, and these workers are competing with folks in India and China who are paid far fewer dollars per hour.

I've been saying that technology workers are horribly exploited and that they've been glad to be so, as they're under the delusion that sweat equity is worth money. It's not. It's worth a shower. Though their employers will take them to the baths.

If you work on salary for a technology company, what's your hourly pay? Making $120,000 for a 40-hour week comes to about $57 an hour. That's pretty damn good. But if you work 60 hours, it comes to about $38 per hour. If you were an hourly employee making the same base rate - $57/hr - and got that $120,000 for a 40-hour week (besides being paid better than almost everyone on the planet - except lawyers), and you worked an extra 20 hours every week, you'd make an extra $60,000 per year. So regardless of your salary, you're taking a major hit compared to an hourly worker.

Obviously, the hourly worker is unlikely to be paid $57/hr. A more likely sum would be $30 to $35 per hour. But that person, working 60 hours, would make an extra $56,000 per year by being paid overtime. That would be more than the salaried worker's income, even if the hourly person didn't get all that overtime.
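
For the record, here's the back-of-the-envelope arithmetic behind those numbers. It assumes a 52-week year and, for the hourly worker, time-and-a-half past 40 hours; both are my assumptions, not figures from the AP story.

    # Back-of-the-envelope version of the comparison above. Assumes a 52-week year
    # and time-and-a-half overtime past 40 hours; my assumptions, not the AP story's.
    WEEKS_PER_YEAR = 52

    def salaried_effective_rate(annual_salary, hours_per_week):
        """Effective hourly rate for a salaried worker who sees no overtime pay."""
        return annual_salary / (hours_per_week * WEEKS_PER_YEAR)

    def hourly_annual_pay(rate, hours_per_week, ot_multiplier=1.5):
        """Annual pay for an hourly worker with overtime past 40 hours per week."""
        regular = min(hours_per_week, 40) * rate
        overtime = max(hours_per_week - 40, 0) * rate * ot_multiplier
        return (regular + overtime) * WEEKS_PER_YEAR

    if __name__ == "__main__":
        print(f"$120,000 salary at 40 hrs/week: ${salaried_effective_rate(120_000, 40):.2f}/hr")
        print(f"$120,000 salary at 60 hrs/week: ${salaried_effective_rate(120_000, 60):.2f}/hr")
        print(f"$35/hr hourly at 60 hrs/week:   ${hourly_annual_pay(35, 60):,.0f}/yr")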

The point I'm trying to make is that IT folks work far more hours than they should for the money they get. The problem is they think they're doing great, but in fact they're being ripped off. Several times - not only are they being exploited for the time they work, but they're constantly under the threat of being outsourced to Asia.

It's a shame.

Thursday, January 17, 2008

Mac in the Enterprise

Back in the day, when I was selling Apple Macintosh computers into corporate accounts, Apple actually had a corporate or enterprise computing message. This was when built-in Ethernet was a revelation in a personal computer. Back then (1990-91), a Mac was actually a better network citizen than a PC with Windows. PCs usually didn't have enough memory space to support the operating system, the Windows shell, the expanded memory drivers, the NIC drivers, the network login client and so forth, and still be able to do useful work. Especially if you wanted to run a client/server application at the same time as Lotus 1-2-3 and WordPerfect.

For those of you who weren't around then, PCs ran in the lower 640K of RAM (that's kilobytes, not megabytes or gigabytes), and a typical business PC was lucky to have even one megabyte of RAM installed.

For those not totally locked into Microsoft, or more likely Novell NetWare back then, the Mac wasn't that bad a business computer. In '91, you could get a Mac IIci with a 12" (13"?) monitor, extended keyboard, up to 4 MB RAM, built-in Ethernet and more for about $3000. A Compaq DeskPro 386s with the same features cost a bit less, but was much harder to set up and connect to the LAN (with Windows), and had only 1 MB RAM.

Selling a Mac to a company that was open to the possibility was essentially a TCO argument.

Since those days, Apple has all but abandoned efforts to sell their products as enterprise computing tools. Sure, Macs are in offices all over, but they've mostly stayed in the content creator niche.

But Apple has been making inroads over the years through stealth. I think it started with the first OS X laptops. When these appeared, the PowerBook was the first commercially available and supported, reasonably priced laptop computer running UNIX. And it ran UNIX while also supporting Microsoft Office. This immediately made these laptops popular with UNIX administrators who needed mobility and the ability to create reports and so forth using Microsoft Office file formats.

Just becoming comfortable, indeed happy, with OS X made system administrators more open to the Mac.

Then came the iMac, especially the LCD-screen models. Any company that wanted its publicly visible offices to look cool, modern and up-to-date had to have them. That the machines also took up less desk space and could run Office apps made the iMac a much less risky choice.

(I don't know what the current stats are, but a year after OS X was released, more Apple computers running UNIX had been sold than all desktop UNIX sales for all versions of UNIX combined.)

Next came the iPod. Not  a business device? Maybe. But almost everyone had one. So Apple was now a friendly face to even the most belligerent Apple hater. Even Bill owned one!

Now comes the latest assault on staid enterprise computing. To quote ZDNet:

What’s great about the MacBook Air is that this machine appears to be a new twist in Apple’s stealth campaign into the enterprise. The MacBook Air is all about switchers.

Who will be customers of this classy machine? Captains of enterprise and commerce. Traditionally, these customers have been Windows users. But now they will buy Apple’s new ultralight and join the ranks of switchers.

Read related story: Do switchers now rule the Mac?

These executives are helping to drive the adoption of the Mac in the enterprise and mid-market companies.

» Why does the MacBook Air make so many so dumb? | The Apple Core | ZDNet.com

 

If you don't think this is the driving force, just visit places like Gizmodo. For instance, there's this thread about what Sony has said about the MacBook Air. Besides all the stuff about how the Air isn't a power user's computer and doesn't have this or that - the usual geek parade of complaints - the same point about stealth adoption is made by several folks.

BY CHINHSTER AT 01/16/08 11:49 PM

@nachobel:

Before the MBA was announced, I told my boss what the rumored ultra portable was supposed to be like and that if the rumors were true, it would be too expensive for most people. He travels first class and told me he has observed that there's no shortage of people who would pay for something like that in business/first class regardless of cost. The MBA would be perfect for him because he never uses the optical drive or a wired network, travels often, and doesn't use much local storage space preferring online storage instead, and carries his laptop everywhere he goes.

After the keynote, I got a text from him saying he ordered the MBA... the $3100 one.

This is the real purpose of the MacBook Air. When you start seeing the fashionable folks with them everywhere, including in Vogue, on TV and in the Movies, people are going to want them. A lot.

You don't think turning the logo upside down on Mac laptop covers was an accident, do you?

Tuesday, January 8, 2008

Does the Net Bring Freedom or Control?

 

Nicholas Carr quotes Alexander Galloway:

“the founding principle of the Net is control, not freedom - control has existed from the beginning.”

Rough Type: Nicholas Carr's Blog

Mr. Carr is driving at the fact that it's no longer the World Wide Web, it's the World Wide Computer. I agree with Carr's history - the personal computer started as a tool for working outside the control of centralized mainframe computing, but computing was re-centralized with LANs and networks. Then the WWW came and broke it open again, but control is back.

I've spent my career at the fulcrum of this issue. I'm strongly committed to the personal computing paradigm, but I'm a control freak, especially when it comes to how business data is created, used, managed, searched and analyzed. I've therefore specialized in applications that enable collaboration in the creation of content, empowering users to do their own thing while working within tightly managed bounds of policy or other structures.

We're now entering a new phase in the constantly shifting personal-vs.-control seesaw. Cloud computing, to use one popular label, leverages controlled, centralized servers with end-user (or at least IT-provided) gadgets that enable work to be performed anywhere, anytime, on any device. The interesting paradox is that to enable maximum end-user freedom, very strong control over resources is required. These resources may be decentralized and come from disparate, even competing, sources, but each "server" is itself tightly controlled.

I don't disagree with Carr's assessment, but I'm more hopeful about how it will pan out.

Friday, January 4, 2008

Using Collaboration Tools Wisely?

Analyst firm claims that collaboration overload costs $588B a year! Among other points made in the article:

Beyond the interruptions and competitive pressure, the different modes of collaboration have created more locations through which people can store data. This makes it harder for users to find information, prompting users to "reinvent the wheel because information cannot be found," Basex said.

Basex's conclusion is that the more information we have, the more we generate, making it harder to manage.

Claims Collaboration Overload Costs U.S. $588B a Year

 

We've all experienced this. Whether it's being unsure of the whereabouts or even existence of a "master" document that's being collaboratively developed, or having to spend the morning just going through e-mails, several of which merely contain an acknowledgement of a previous message, the tools we have for collaboration, especially the ones focused on in this article (e-mail and IM), are definitely costly distractions.

It would seem that the very act of working with others - or at least the tools for communications - can get in the way of actually working.

The article points out:

… Basex proposed several steps to mitigate information overload. With e-mail as the biggest offender, Basex said users can save time by not e-mailing someone, and then following up with a phone call or an instant message two seconds later (a no-brainer perhaps, but a trap many of us fall into).

Basex also said users must not combine multiple topics or requests in a single e-mail; make sure the subject clearly reflects the topic and urgency of the message; read their e-mails before sending to make sure they make sense; and will not hit reply-all unless necessary or reply with one-word e-mails such as "thanks."

I would add a couple of points:

  • If someone sends an e-mail responding to a previous request, there's no reason to reply unless further clarification is needed. Sending an e-mail saying "OK" is a major waste of time, for you and the recipient.
  • Multiple topics are definitely a bad practice, but if you can write a subject line that reflects the multiple topics, perhaps that's OK. The problem is replying:
    • Always endeavor to write clear subjects that describe, as much as possible, the contents and the source. An e-mail subject such as "Tuesday's Meeting" may make sense to you when you send it, but without knowing which Tuesday (past or future), the meeting's subject matter, and possibly some project or client identification, addressing the message in a timely and appropriate manner is difficult and may also waste time.

It's also important to change the subject of a message thread if the topic changes. Don't keep using the same subject when the contents no longer relate to the topic at hand. You should also delete all the thread content that's no longer important to the new subject.

To quote the article:

For all communication, Basex wants to remind workers to be as explicit as possible because their readers are not mind readers. While the statement may seem like an obvious mantra, it is also easily forgotten.

When discussing the choice of medium, the article points out:

Basex also urged users to choose the proper communication medium at the proper time. The researchers suggested instant messaging is better than the phone when multiple parties need to be on and do the talking, or there are a number of many-to-many conversations taking place.

Instant messaging is better than e-mail when an issue demands an immediate response, or trivial, such as lunch plans. E-mail trumps instant messaging when a note must be blasted out to multiple people and when a message must be archived.

One thing the article doesn't point out here is that a collaborative environment such as SharePoint can also be an excellent tool for maintaining knowledge threads, simply by making sure that e-mails are copied to a project document library and that all discussions pertinent to a collaborative project are available to all participants.

Regardless of the choices made, it would seem that there are no easy solutions. My personal recommendation is to go beyond the e-mail/IM paradigm and use threaded discussion sites, whether on a SharePoint-like portal, a wiki, or even a group blog, whenever possible so that the content of collaborative efforts is available. The ideal solution would support the delivery of content created via e-mail, IM, word processing and even phone conversations (transcripts or recordings) to a central repository for future reference. That way the creation tools could be chosen for their appropriateness to the situation, while the results of the effort can be shared for use in the collaboration.