Friday, December 21, 2012

ARM-ed and Ready (and should you be?)

Since this blog tends to focus on topics relevant to technophiles and technology leaders, I figured it might be time to get back to some "nuts and bolts".  Over the years, one of the most vexing problems in the realm of IT infrastructure has been how to properly design and manage data centers.  Other than power and space, cooling has always been the hardest to manage.

Contrary to a lot of assumptions, servers don't need to be kept frigid.  Current ASHRAE guidance puts the recommended operating range at roughly 64-81 degrees Fahrenheit.  Anything cooler risks condensation (and wastes cooling dollars).  Anything warmer and all kinds of bad things happen:
  • Processors overheat and sometimes melt down
  • Hard drive lubricant starts to vaporize
  • Electrons actually start to "jump" across circuits rather than following their designed pathway (not good)
  • Motherboards get too hot and begin to warp and bend
One solution, of course, has been to over-design data centers to allow for more cooling.  Other approaches have involved elaborate designs, early warning systems, and automated responses. 
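The "early warning" idea above can be sketched in a few lines.  Here is a minimal example in Python that classifies a temperature reading against a recommended band — the default thresholds and the function name are my own illustrative choices, not any vendor's API, so tune them to your own facility:

```python
def classify_temp(temp_f: float, low: float = 64.0, high: float = 80.0) -> str:
    """Classify a data center temperature reading against a recommended band.

    The 64-80 F defaults are illustrative; adjust them to your
    facility's own thresholds.
    """
    if temp_f < low:
        return "too-cold"   # condensation risk
    if temp_f > high:
        return "too-hot"    # overheating risk
    return "ok"
```

A real deployment would feed this from sensor polling and wire the "too-hot" case to paging and automated shutdown, but the classification step is this simple.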

More recently, hardware vendors have been turning to a technology first introduced in the 1980s called ARM (originally Acorn RISC Machine, later Advanced RISC Machines) chips.  For a mind-numbingly dry explanation of the technical specs behind ARM technology, go here:

The main difference between a regular CPU chip and an ARM chip comes down to design philosophy.  Companies like Intel and AMD pack billions of transistors onto each piece of silicon, with elaborate instruction sets and deep pipelines — and all of that circuitry burns power.  ARM chips follow the RISC (Reduced Instruction Set Computing) approach: a smaller, simpler set of instructions executed very efficiently.  The very first ARM chip, back in 1985, got by with only about 25,000 transistors.  Modern ARM chips have vastly more than that, but the lean design carries through — they are not as powerful as their "regular" counterparts in raw terms, yet the way they work in concert with the motherboard and core software delivers far more performance per watt than one might ordinarily expect.

(A Standard 64-bit "Regular" Chip - billions of transistors)

The ARM chips work wonderfully in mobile devices where the conservation of power is a huge consideration.  In fact, over 95% of mobile phones and tablets use ARM chips.  Whenever you hear about the new Apple A4, A5, or A6 chips, those are built on the ARM architecture.

(An ARM Chip - Small and compact, just as you would expect)

(Picture courtesy of Wikipedia)

Given the success and hence proliferation of ARM chips into the mobile market, innovators have been looking into whether enterprise servers could use the same technology.  If viable data-center-class servers could be built with ARM chips, it would ease many of the power and cooling constraints that exist today.  All of that could add up to a reduced need for robust data centers and greatly lessen monthly power bills.
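To see why the math is attractive, here is a back-of-envelope calculator.  Every number in it — the 400 W and 120 W per-server draws, the 1.6 PUE, the $0.10/kWh rate — is a hypothetical placeholder for illustration, not a measured figure:

```python
def annual_power_cost(n_servers: int, watts_per_server: float,
                      pue: float = 1.6, usd_per_kwh: float = 0.10) -> float:
    """Rough annual electricity cost for a fleet of servers.

    PUE (power usage effectiveness) folds cooling and facility
    overhead into the estimate; 1.6 is a middling figure for a
    traditional data center.
    """
    kw = n_servers * watts_per_server / 1000.0
    return kw * pue * 24 * 365 * usd_per_kwh

# Hypothetical draws: a 400 W x86 box vs. a 120 W ARM-based box.
x86_cost = annual_power_cost(100, 400)
arm_cost = annual_power_cost(100, 120)
```

Under those (made-up) assumptions, 100 ARM servers would cost well under a third as much to power and cool — the catch, as discussed below, being whether they can carry the same workload.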

Recent tests have shown some promise for the use of ARM technology in the data center, especially as the chips have been made faster and more efficient through newer generations.  However, the ARM chip still loses out to regular chips when raw processing power is a primary consideration. 

If you have a data center full of mid-size servers, you might very well be able to take advantage of the cost savings that ARM-equipped servers are being designed to provide.  Be careful, though: just as with supercharged or turbocharged engines, there is a ceiling on how much processing power an ARM chip can deliver.

Finally, don't forget that the arguments over which chips to use may be a moot point.  These days, smart CIOs are looking to get out of the business of owning/managing data centers altogether.  Rather than the buy-and-own model of old IT practices, more and more CIOs are looking to move as much into the "Cloud" as possible.  In other words, we are looking for ways to leverage the data centers of companies who do only that.  That way, we can focus on the services we are providing rather than having to worry about a physical data center going down in the middle of the night, being swept away in a tsunami, or swallowed up in an earthquake.

For me, the less hardware I have in my portfolio the better I sleep at night...

Monday, December 17, 2012

Ego - The Center of the Talent Triangle

If you read the previous post entitled "The Talent Triangle", you saw this diagram:

We explored the importance and relevance of Capability, Values, and Motivations already.  So what happens when you find the "perfect" candidate according to those criteria and then they fail, or "bomb out"?  In many cases it is because a specific, intangible quality invalidated the feedback on the three other categories.  That quality has been formally researched and goes by several names, from the technical (the SCTi-MAP assessment) to the more simplistic "ego assessment".

In my experience, ego is a modifier that can come in three varieties:
  • Negative - the ego becomes a limiting factor on the performance of the individual across all three areas
  • Neutral - the ego is aligned adequately to the role and thus has no great modifying effect
  • Positive - the ego elevates the individual's performance, often manifesting as a higher perceived level of Capability than is truly there

There are so many historical examples of people that fit into each category that it isn't worth filling this post with them.  There are even a few rare examples of heroic historical and (possibly) fictional beings possessing both positive and negative modifiers at the same time - Alexander the Great and Achilles being two such.

The work on assessing the level of ego development was greatly advanced by a brilliant Harvard-educated psychologist named Susanne Cook-Greuter.  While I won't try to explain exactly how it works, her test does an excellent job of placing people on a spectrum that ranges from one to six.  In general, most people fall in the middle of a bell curve, somewhere between a level two (diplomat) and a low level four (achiever).  The best takeaway is that people in the 2-3 range tend to view the world from a 2nd-person perspective.  Consequently, they are more apt to enter into "win-lose" situations.  People at a level 4 or higher begin to view the world and their interactions from a 3rd, 4th, and even 5th-person perspective.  In other words, they can better see how the ramifications of their actions play out across society through time.  These types of people are more likely to consistently generate "win-win" situations.

(I'll insert a quick apology at this point to Dr. Cook-Greuter and Beena Sharma for trying to simplify such a complex subject.)

There is no one perfect level, nor are higher levels necessarily better than lower levels.  It all depends on the role.  You wouldn't want a heart surgeon to be thinking much beyond a 2nd-person perspective when they have you on the operating room table.  But conversely, you'd want your CEO to be considering multiple, long-term perspectives when crafting strategies that affect hundreds or thousands of people over a twenty-year time span.  In other words, you'd hope that your CEO could operate at level four (Achiever/Individualist) or level five (Individualist/Strategist).  I've combed the web to find some examples that might give further context on what these levels mean.  Check out this blog:

Alternatively, you could go through the assessment yourself on Cook-Greuter's website; your choice.  Below is an example of how your results might be assessed and mapped.  This particular graphic shows the results of a late-stage Individualist/early Strategist.

Be careful not to overlook the maturity of ego when selecting a candidate.  If you just look at Capability, Values, or Motivation, who you think is the next Peyton Manning may indeed be the next JaMarcus Russell.

I will end this post with a diagram that links ego development with how a person's outlook can vary depending on their relative level.  The colors represented link to a book called "Spiral Dynamics" by Beck and Cowan.  Advanced but still very fascinating information.  If you can master the Talent Triangle, you will be well on your way to becoming a Level 5 leader.

Thursday, December 6, 2012

On the Surface, things look bad

Earlier in the year I blogged about my hope that the (then) yet to be released tablet by Microsoft would be a worthy competitor to the iPad.  Not that I would instantly go out and buy one, mind you, but at least the competition would be good for improvement in the overall landscape.  If you're an Apple user like I am (unless forced to use a PC), you haven't seen much "revolutionary" product coming out of Cupertino as of late.  Rather, all we have been seeing are "evolutionary" developments in existing products.
  • The iPhone 5 gives us a thinner, bigger, faster phone that finally has 4G.  Nice, but Samsung had that in 2011.
  • The new iPads ("3" and "4") give us 4G as well as some screen improvements.  iOS and Maps took a step backwards, though.
That's about it folks, which is pretty disappointing considering what we saw from Apple late last decade.  What about improvements in user interface (virtual keyboards?) or some new cool type of form factor change?  Well, there is the iPad mini but we'll get back to that in a minute.

I've had some time to look at the Surface and Windows 8.  All I can think to say is, "Really?"  I can't tell if the Surface is a laptop, a "laptab", or some new mutation.  Windows 8 is so confusing that I'm not sure I'll have the patience to learn it.  From a corporate perspective, I can see the possibility of putting W8 devices on the shop floor because of their touch screen capabilities.  But I can tell you as a CIO that I'm going to be really, REALLY cautious about trying to introduce W8 into the office environment.  Not only will I have to train employees on a whole new interface, I'll also have to make a renewed commitment to high-priced Wintel equipment.  Right now I am very sold on the new Citrix XenDesktop virtualization solution and how it will free me from specific hardware platforms altogether (bad news for Dell, HP, and Lenovo).  Since the Surface tablets are going to be at least as expensive as the iPads, I have little incentive from a cost perspective to adopt them.

Forgive me Microsoft and all you Ballmeristas.  Your products, with the exception of Xbox and maybe Office 2010, are really not getting the job done.  Earlier in the year I tried to make a major commitment to your new Cloud-based Exchange service delivered through the "Office 365" brand.  Boy did I take a beating on that decision.  It wasn't until I had migrated half an enterprise to O365 that I discovered your non-published throttling process.  In lay terms: Microsoft put my email on the same servers as other companies and then limited how many messages I could send and receive in a given time span, I'm guessing per hour.  My reward for committing to O365 was having to back out of the solution, in shame, because I could not tolerate "throttling" that delayed my email service.  And it didn't help that the highest levels of Microsoft support, the vaunted Tier 3 escalation, could not or would not help me.  Folks, believe me when I say that email is the most important application in your whole company.  If it goes down or even gets "throttled", you will hear about it *instantly*.
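For what it's worth, the standard client-side defense when a provider throttles you is retry with exponential backoff.  Here is a sketch in Python — the `send` callable stands in for whatever your mail library exposes; none of these names are an actual Office 365 API:

```python
import random
import time


def send_with_backoff(send, message, max_tries=5, base_delay=1.0):
    """Retry a send callable with exponential backoff plus jitter.

    `send` is any function that raises an exception when the provider
    throttles the request; names here are illustrative placeholders,
    not a real mail-provider API.
    """
    for attempt in range(max_tries):
        try:
            return send(message)
        except Exception:
            if attempt == max_tries - 1:
                raise  # out of retries; surface the failure
            # Back off 1s, 2s, 4s, ... with a little jitter so a bulk
            # migration doesn't hammer the server in lockstep.
            time.sleep(base_delay * 2 ** attempt + random.uniform(0, base_delay))
```

Backoff smooths over transient throttling, but as the experience above shows, it cannot rescue you from limits that are fundamentally below your sending volume — that requires a different plan (or a different provider).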

No, the Surface tablet, if that's truly what it is, fails to impress.  Even the commercials where a bunch of trendy-looking people are swapping keyboards are irritating.  Didn't you learn your lesson with the Seinfeld bits a while back?  Of all the people in my company that I've given a choice of either a Surface or an iPad, over 95% of them have backed away from the Surface and Windows 8.

I really hoped that you would get it right this time, Microsoft.  But you didn't - in fact, a recent article by Computerworld says you'll be lucky to get to a 10% market share by the end of the decade.

Both Apple and Amazon are now taking the next step with miniaturization (iPad Mini and Kindle Fire).  How are you going to shrink the Surface and still run Windows 8?  I'm not sure how you will, but I am sure of one thing.  The Surface is not going to penetrate the corporate arena.  Why couldn't you have at least made it a cheaper alternative?  That means my dreams of competition driving greater innovation will stay just that - dreams.  Darn it.