Monday, October 31, 2011

Are Macs invading the enterprise?

"Better watch out, better not cry Macs are gonna make your IT guy sigh..."
Set to "Santa Claus is coming to town"

Usually I ignore the daily LinkedIn updates in my inbox informing me of the goings-on of people who may know people who were once in the same state as a guy I sat next to in Burger King 10 years ago. Wow, that whole six degrees of separation thing must be true...

So one of those strangers in my inbox was recommending a link to a story on Business Insider about the increase of Macs in the workplace. The CliffsNotes version goes something like this: rich professional people are really creative, like Macs more than PCs, and because of it want to use them at work.

I've actually seen evidence of this in action at my last employer. The entire organization was run on PC platforms, but there were a few Macs floating around as well as some Mac "servers," which were glorified 1U server chassis running Snow Leopard. AKA, not a server.

True to the assertion of the article, our Mac users were in the executive suites and generally didn't want to do more than get their email and browse the web. Anything else required running terminal server sessions via Parallels just to edit a Word document.

I remember on my first day I got called to the office of the regional manager, whose only complaint was that the terminal server session and desktop weren't like his Mac desktop. Great first impression I made that day.

By the time I came along, the Mac users had already accepted that we could never be a pure Mac shop, mostly because everyone else had to do lots of boring, uncreative stuff that didn't work on Macs.

I've written other articles about Macs in business environments, so I won't belabor the point here. Suffice it to say that as long as Apple treats all its products as consumer devices (even if it says "Pro" on the box) with no regard for business process, there will always be resistance from IT departments. In this case, resistance is not futile, because the bottom line is that Macs don't play well with most enterprise networks and applications.

This isn't meant to be derisive; it's a simple statement of fact. Remember that the sandbox that is Apple rejects conformity. 99% of enterprise networks are running non-Apple hardware and operating systems. They conform to the evil IBM model because they have to; there isn't a good alternative. Linux is still somewhere around the level of Windows 98 for the desktop, and Macs have to use Microsoft Office because they still don't have a good productivity suite of their own.

If you live in a sandbox, sometimes you gotta order out...

I'm sure the Mac enthusiast is thinking, "Well, the enterprise needs to change then." Yes, maybe it does, but right now it hasn't, and honestly it can't. As long as the bulk of corporate America doesn't produce anything more creative than a suggestive photo at the Christmas party, nothing will change.

Until corporate America finally drops its 19th-century labor model and realizes that people don't have to be under your nose to be productive, nothing can change. Journalists, consultants and others not dependent on a corporate cubicle have figured out how to excel without the chains of corporate IT conformity.

That works out fine for them, but if you go to work in a cubicle decorated with pictures of places you'd rather be, don't expect the revolution any time soon.

Macs, by their very design, are non-conformist. From the ambivalence of the file system organization to the lack of support for common enterprise applications, Macs are meant to accommodate the user, not vice versa. That's the way Steve Jobs wanted it, so don't expect it to change.

With the advent of cloud services, Google Docs and online meeting options, it's possible that someday we may not have to waste years of our lives in pointless commutes to some dreary office building. This is where Macs can become a viable option. To make a Mac work for business, you have to get it out of the office and give it an iCloud account.

Apple is all about creativity and connectivity. Everything from the sandbox is meant to work with everything else with an Apple logo. Online experiences are supposed to be ABOUT the content, not the process of getting TO the content.

Enterprise IT doesn't work that way. Enterprise IT has to control all the channels, if for no other reason than to protect its information assets. It's not about WANTING to control everything, it's about HAVING to. IT departments don't have a choice in the matter. Given the choice without repercussion, most IT guys would let the free-for-all happen, if for no other reason than to be hated a little less.

But we all know the corporate network would be in flames in 20 minutes. It's human nature to be freeform, which flies in the face of any IT organization trying to secure and provide reliable resources.

So it is, so shall it be.

Macs are to the enterprise as smartphones are to the BlackBerry.

Similar function, different methodology. 

There's nothing wrong with using a Mac if it fits your style of work, but its very design is guided by the premise to NOT be like a PC. That's why they never seem to fit well into enterprise IT architectures.

I'm not anti-Apple; I just know it doesn't work well in the prevailing IT construct.

If the world decides to throw away the construct and do it Apple's way, however, then it's conceivable that Apple could become a catalyst for finally abandoning an outdated work methodology that says work only happens in an office.

Article first published as Are Macs Invading the Enterprise? on Technorati.

Tuesday, October 18, 2011

Shiny Objects, Dull Minds....

I hold two beliefs: one is that technology will never stop advancing, and the second is that human beings will always gravitate toward shiny objects.
Crows like shiny objects too. It's been suggested that they collect them to attract a mate. Hmm, maybe that's why all the geeks feel the need to get a new smartphone every six months.

I watch a lot of technology podcasts where all the uber-geeks and tech pundits get misty-eyed over the latest bit of techno-kitsch. It never fails. They anxiously await the latest whatever, and when they get it in their hands they fawn over it for about 15 minutes: playing with every button, adjusting every setting and trying out every new feature.
Then the facade starts to crack. It could be a change in how a feature works, or even the removal of it entirely. It doesn't really matter; you can always tell by the look on their faces. It goes from a happy kid on Christmas morning to a blank stare.

The end is always the same. Unless the thing catches fire in their hands, there'll be allowances made. Phrases like "They'll fix that in an update" or "This is an early production model."

We're supposed to dismiss the deficiency and focus instead on the promise of this great new thing even if it's to our own detriment.  

Smartphones are a perfect example. It's not enough for your phone to make calls anymore. It has to be able to surf the net, update Facebook and entertain you with a game or a movie. It's almost as if there's some grand plan to give the world Attention Deficit Disorder.
I still find it amazing that people went so nuts over the iPhone when it had so many issues, like being chained for two years to a horrible data network, high cost, call quality and usability problems. Still, even if you had a bad experience with the phone, it managed to check off all the items on our shiny object list.

Like the MP3 that's largely replaced the CD, we tend to tolerate a lesser experience for greater convenience or just the chance to look cool. There may be a more insidious penalty than that, however.
Technology can be the catalyst for inspiration, but it can also be a debilitating crutch. In his book "The Shallows: What the Internet Is Doing to Our Brains," Nicholas Carr suggests that we may in fact be gradually becoming dumber because of our addiction to connectivity.

It's not so far-fetched an idea. We don't even care if a phone can be relied on to make a call anymore, so long as our Netflix stream doesn't buffer too much. Oh yes, and we must be sure that Foursquare knows where we had lunch. I'm sorry, but nobody has the right to know that much about my habits, even if I didn't notice your 30-page irrevocable EULA.
I guess it's too bad if your favorite sushi restaurant is next door to an S&M shop. If Google Maps can't pinpoint my house accurately, I suppose I should forget about any hopes of public office. There are people who believe the president of the United States has a fake birth certificate. What hope do I have if my favorite sushi restaurant is suspiciously located?
It seems we'd rather not use our long-term memory either. It's simpler to just Google whatever it is we're too lazy to remember. Google's a godsend, then, protecting us from having to spend more than 30 seconds on any stray thought.

Any forum discussion on the topic invariably degenerates into a shouting match ending in a flurry of hyperlinks supporting each side's point of view. That's sad. We're so addicted to the internet that we can't even have a debate without using it. Are we so enamored with our connectivity that we're becoming incapable of independent thought?
If a Pulitzer Prize finalist believes it's possible, then I have to believe there's some truth to it. But then, I found out about it on the Internet.

Thursday, October 6, 2011

IT and Steve Jobs

As most of you know by now, Steve Jobs passed away yesterday, 10/5/2011, after a long battle with health problems. The tech industry may have seen it coming, but it's a blow felt no less sharply.

I will say this: no technology CEO, including Bill Gates, has ever been a more visible leader of a technology company. He was a marketing genius and had a public persona that inspired an almost cult following.

Here's where I lose the Cult of Mac....

His passion was undeniable and his vision unquestionable. Still, his most valuable trait was that he was a brilliant opportunist.

Consider the following observation.

I've often said to colleagues over the past twenty years that IBM created nothing. IBM chose instead to obtain innovation and mold it to its vision. Steve Jobs at Apple was not so different in method, but his motivation couldn't have been more different.

In Jobs we had a conquering hero beating back the stagnation and rigidity of faceless corporations concerned more with quarterly profits than usability. He could not only mold someone else's idea to his vision but convince you it was in your best interest to come along for the ride. He knew how to speak to the consumer and convince him that his offerings were superior, simply because he'd offered what you wanted before you knew you wanted it.

As far as IT goes, however, I can't be as upbeat about Apple in the enterprise. Apple products are designed from the outset to be consumer devices. That makes sense. Under Jobs' direction, that was what technology was meant to be. Computing devices were meant to be enablers of a creative process, not some turbocharged calculator.

I've worked with Apple products in organizations, and it's blatantly obvious that they were never meant to be part of a typical IT enterprise. Most of the time I find Macs running Parallels just to properly interoperate with other network applications and resources. That's a good thing, since replicating that functionality is difficult, and in some cases impossible, without such software.

I hear the wailing now, but when you get past the fanboy devotion, you find the real argument is that Mac folks believe Steve got it right and everyone else got it wrong. You'll never convince anyone on either side of the argument of the other's view, so don't bother to try. In a way, it speaks to how similar Apple's "different" argument really is to the other guys'.

In fact, it would go against the core belief of "Think Different" to insist that, in spite of its failings, your way was the only way. In that case you're no better than the evil empires you spurn.

If Jobs weren't the brilliant marketer that he was, Apple would have had no hope of survival. Over most of its history, Apple products have carried a price premium over and above what was considered tolerable by the market. Jobs was able to combat that by convincing the consumer that he offered a premium experience not offered by his competition. Whether it was worth it was entirely subjective.

Under Jobs Apple undoubtedly had great triumphs but also great missteps. One of the greatest in my view is the closed sandbox that remains central to Apple to this day. To promote change you must get your product into the hands of the masses. From a strictly business standpoint that's a difficult proposition if you're the only one manufacturing it.

Back in 2006, when Windows Vista was released, it disappointed a lot of people. So much so that Microsoft finally admitted, after a year of denials, that it was indeed a poor replacement for the now-stable Windows XP, and virtually all press turned from Vista to an upcoming release we now know as Windows 7.

In my view, it was during this time that Apple missed a golden opportunity. At that time Mac OS X was at version 10.4 and capable of running on PowerPC or Intel platforms. A handful of companies sprang up offering "Hackintosh" computers, which were basically PC architectures running Apple's OS. Since Mac OS X 10.4 was less than $100, it was an opportunity for thousands of users to get the Apple experience for far less than the 30 to 40 percent price premium for the same hardware in an Apple-branded case.

It was a missed opportunity because millions of dissatisfied Windows users were looking for alternatives to Vista. Many explored Linux, some jumped into Macs with both feet while most were forced to stand pat with Windows XP until Microsoft came up with a better option.

It's my belief that Apple could have grabbed at least another 15% of the market had it allowed OS X to run on non-Apple hardware. It would have allowed a lower cost of entry into the realm of Apple and accelerated Apple's entry into the enterprise. It would also, finally, have offered a viable alternative to Windows.

Instead, at the direction of Jobs, the Hackintosh movement was crushed and an opportunity lost. The message being, "Buy it all from us or go away."

I understand the advantages of operating in a sandbox. Your support costs are lower since you only have to support one platform. You minimize your vulnerability to the ills of the Windows platform by offering a smaller attack surface for malware. Finally, you can exercise complete control over anything associated with the brand.

It's this last point that has Jobs written all over it. Jobs was fearful of Apple suffering the same fate that befell IBM with the PC and the clones that followed.
What he missed is that while IBM allowed clones of its architecture, the premium PC always had an IBM brand on it. If you wanted innovation on the platform, you first looked toward IBM. That was especially true for business customers, who wanted and would pay dearly for the support of IBM's vast resources.

But then, IBM was the evil empire and Apple was going to be different, even if it meant being the same in the end. I think Steve misread his customer base. Sure, you may have had a whole lot of new Mac users without an Apple logo on their case, but you would have also had a lot of peer pressure.

I have no doubt that the Mac forums would have been filled with flaming statements like "When are you going to get a real Mac?" or "Apple hardware doesn't have that kind of issue," and eventually the "MacHacks" would have come fully into the fold.

In the meantime, the Apple OS could have matured to become more business-friendly without losing its identity. After all, I'm talking about an alternative, not an emulation.

Such are the failings of our heroes. There's no doubt that Apple owes its existence to the vision of Steve Jobs. I'm hoping that it is his spirit of innovation and not his execution that lives after him.

Saturday, October 1, 2011

The upgrade mill, The IT version...

I maintain this and another blog.

That blog allows me to express my opinions and my passion about all things important to a middle-aged gamer. One of my first posts was called The Upgrade Mill.

In that post I bemoaned the evils of the gaming industry driving consumers toward upgrades that don’t necessarily make their lives any better. I’m sorry, but if you can’t do better than a 5 FPS improvement in my favorite game after I’ve spent the better part of $1,000 U.S., I’m going to feel a bit cheated.

So it goes with IT.

As an independent consultant as well as a full-time IT professional, I see the same formula applied, and it’s almost criminal.

Let’s take the example of Windows Server 2008. Now, I personally have nothing against Server 2008. It’s a fine server OS, and on a new server there’s no question that it’s the right choice for 90% of enterprises.

That’s not what the marketing guys would have you believe, however.

Well, at least not 12 months from now, when Server 8, or whatever it’s going to be called, ships. (I agree with Paul Thurrott, by the way, in hoping they don’t get too ambitious with the naming.) You see, when Server 2008 came out, most Microsoft-centric enterprises on Server 2003 were encouraged to upgrade.
Now, aside from being against any dot-zero Microsoft release, I saw no compelling reason to upgrade the OS on legacy hardware, but Microsoft would have you think otherwise.

The reasoning came straight from the marketing department: better support for 64-bit processors, better memory management, better support for virtualization. It would be more secure, easier to administer and more cost-effective...


Let’s get one thing straight. Other than the inevitable phaseout of security updates, there was no compelling reason to move from Server 2003 to Server 2008. None, zip, zero, nada. Any argument to the contrary is, to me, nothing more than marketing brainwashing.

Let’s take a little trip back to 2006…

Microsoft dropped the ball with Windows Vista and desperately needed a Hail Mary play to fix the situation. Enter Windows 7, or should I say Vista Service Pack 3. Since Microsoft had unified the code base between its server and client OSes, anyone with a modicum of insight into the way Microsoft does things might have had some doubts about a new server OS based on the same kernel as Windows Vista. But because Server 2008 showed up two years after the Vista disaster, it was far enough removed from the stench to be identified more with Windows 7 than with Windows Vista.

OK, since server OSes are notoriously picky about hardware, it’s accepted that most of your old stuff won’t work. Tolerable in a server OS, not so much in a client OS.

Server OSes are generally optimized (read: stripped down) to do a few things very well without a lot of frills. I’ve worked in organizations that have standardized on Server 2008, and there are usually a few old Server 2003 boxes hanging around running legacy apps. Admittedly, that’s the fault of unpatched legacy apps and not the OS.

So what’s the big difference between Server 2003 and Server 2008? Mainly interface tweaks and an annoying trend with every new version of the OS to needlessly overcomplicate simple administrative functions. Oh yeah, and PowerShell: an add-on in Server 2003 and a centerpiece in Server 2008.

The only real reason to upgrade a Windows Server OS, when alternatives can run for years without one, boils down to our old friend the marketing department. Oh, and that nasty end-of-life thing that makes a serviceable platform obsolete with an ad in eWeek...

Server 8 (or whatever they’ll call it) is part of a new family based on a new kernel, with PowerShell even more integrated into core OS administration. Oh yeah, and a bit of the Metro interface for administration tasks. Great, just what I need: a tile to add a new user...

Microsoft says that Server 8 is designed to be administered remotely and isn’t meant to be as “console friendly” as previous versions. Good luck with your iSCSI client and RAID controller installs...

The “upgrade mill” parallel here is actually a continuation of a theme. The only thing that drives an upgrade in a Microsoft environment is the marketing department, unless the previous version was a complete failure.

With each successive version of Windows Server, Microsoft has moved toward making administration less intuitive and more reliant on scripting and Microsoft VARs to accomplish what used to take a few clicks and 30 seconds. Some will say scripting is a godsend and exactly what Windows has needed. I say it’s a giant step backward. Why have a GUI if all the important stuff is accomplished at a command line?
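To illustrate what I mean: creating a user account in Server 2003 was a right-click and a dialog box in Active Directory Users and Computers. In the PowerShell world it looks more like the sketch below. This is a rough sketch, not a recipe; the account name, password prompt and OU path are all made up, and it assumes the ActiveDirectory module from Microsoft’s RSAT tools is installed.

```powershell
# Hypothetical example -- the user, OU and domain below are invented for illustration
Import-Module ActiveDirectory

# What used to be a right-click and a "New User" dialog:
New-ADUser -Name "Jane Doe" `
           -SamAccountName "jdoe" `
           -Path "OU=Staff,DC=example,DC=com" `
           -AccountPassword (Read-Host -AsSecureString "Initial password") `
           -Enabled $true
```

Powerful once you learn it, sure, but it’s exactly the kind of thing that used to take 30 seconds and no manual.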

Let’s face it, there’s nothing sexy about a server OS. Linux knows that, which is why most server distros don’t give you a GUI by default. In UNIX-land, server OSes are supposed to be ugly, cryptic and intuitive only to the geekiest of the uber-geeks. It’s an ego thing...

So why does a server product called “Windows” need to rely more on command-line scripting and less intuitive interfaces? Is it ’Nix envy? Well, I can get that for free without confusing license schemes. No, it’s something more insidious. The more needlessly complex you make a server OS, the more you rely on VARs to make it run right. That means more money spent on training, consultants and every other related product or service in the food chain.

It’s a money mill with minimal return.

I’m not against upgrades, I’m just against waste: waste of time, resources and effort for little to no return on the investment.