A year or so ago, we bought an application for managing help desk requests. One of our requirements was Active Directory integration, and the vendor we selected claimed its product integrated with AD. The project was a bit rushed, so we didn't conduct the due diligence we should have; we just bought the application.
For this application, "AD integration" actually meant that the application would periodically (every 24 hours by default) query AD, pull the current user base, and replicate that information in its own database. In other words, the application didn't authenticate through AD; it maintained its own permissions database.
Perhaps there are good reasons for designing the application this way, but to me it just seems unnecessary. The whole point of AD is to manage user access to network resources. Why does it make sense to create a redundant permissions database? It creates more overhead for system administrators, and it adds complexity to troubleshooting because the application's home-brewed authentication is another layer that has to be worked through. Whenever I see solutions like this, my initial reaction is: these guys don't know how to query AD in code.
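Querying AD really is a small amount of code. As a minimal sketch (the domain, group DN, and logon names below are hypothetical), an application can look a user up with an LDAP search filter and verify a password with a simple bind against the domain controller, so AD itself does the authenticating and the app stores nothing:

```python
# Sketch: authenticating against AD via LDAP instead of replicating it.
# All names (jdoe, the group DN) are hypothetical examples.

def user_search_filter(sam_account_name):
    """LDAP filter that finds a single AD user by logon name."""
    return f"(&(objectClass=user)(sAMAccountName={sam_account_name}))"

def group_membership_filter(sam_account_name, group_dn):
    """Same lookup, but also requiring membership in a security group."""
    return (f"(&(objectClass=user)(sAMAccountName={sam_account_name})"
            f"(memberOf={group_dn}))")

# With any LDAP client library (e.g. the ldap3 package), authentication is
# just a simple bind as the user -- the domain controller checks the
# password, and the application never keeps its own credential store:
#
#   conn = Connection(server, user="CORP\\jdoe", password=entered_password)
#   if conn.bind():
#       ...  # credentials are valid; authorize via group membership

print(user_search_filter("jdoe"))
```

The bind step is the key design point: the application delegates credential checking to AD entirely, instead of synchronizing a copy of the user base every 24 hours.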
Yet in direct contradiction to my theory is Microsoft's ISA Server 2004. Internet Security & Acceleration Server manages user access to network and internet protocols. ISA uses various kinds of object definitions to accomplish this: protocols, applications, ports, users, domains, URLs, etc. You'd think the User objects would be AD objects, because it's a Microsoft application.
But no. You can't write access rules directly against AD objects like users or security groups. You have to create an ISA group, populate it with AD users or security groups, and then use the ISA User object to define who is affected by a rule. That is not AD integration.
SQL Server has a similarly schizophrenic authentication model. You can run SQL Server in mixed mode, which allows Windows authentication alongside SQL authentication. With SQL 2005 and later releases of some of their applications (e.g. CRM 3.0), it seems Microsoft is moving away from SQL authentication toward Windows credentials. This is as it should be.
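The difference between the two modes is visible right in the connection string. A quick sketch (server and database names are hypothetical): Windows authentication passes the caller's AD credentials through, while SQL authentication carries a separate login that SQL Server stores itself, which is the same redundant-credential pattern as the help desk application above.

```python
# Sketch: SQL Server's two authentication modes as connection strings.
# Server/database/login names are hypothetical.

def windows_auth_conn(server, database):
    # Windows authentication: the caller's AD identity is passed through
    # (Integrated Security); no password lives in the application's config.
    return (f"Server={server};Database={database};"
            "Integrated Security=SSPI;")

def sql_auth_conn(server, database, user, password):
    # SQL authentication: a login stored by SQL Server itself, separate
    # from AD -- a second permissions database to administer.
    return (f"Server={server};Database={database};"
            f"User ID={user};Password={password};")

print(windows_auth_conn("SQL01", "HelpDesk"))
print(sql_auth_conn("SQL01", "HelpDesk", "helpdesk_app", "s3cret"))
```

Note that only the SQL-authentication string has a password to leak, rotate, and troubleshoot; the Windows-authentication string has nothing for an administrator to maintain.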
My point in this post is to be aware of what technical terms mean, because important tech terms often have two definitions: the marketing meaning and the technical meaning. My mantra is to always distrust the brochure and the sales guy. Don't assume that important technical terms mean what you think they ought to mean. Always clarify the definition, because salespeople tend to sell to marketing definitions, not technical ones.
Tuesday, May 02, 2006
I have argued before on this blog that, for technology, ROI and TCO calcs are accounting gibberish that companies use to justify the cost of acquiring new technology, but which are largely useless because they don't lead to meaningful information. My argument against ROI and TCO calculations for technology buys has several components:
1. The actual dollar values of benefit gains are wrapped around assumptions with varying probabilities of accuracy, where "accuracy" means how closely the assumptions map to the actual values of costs and benefits (and often the total cost and value picture isn't known when the ROI and TCO calcs are made). Further, these calculations tend to look only at benefits. They ignore, for example, the productivity lost during implementation and during the users' learning curve, as well as other factors that can erode the net benefit of the project.
2. ROI and TCO calcs are most accurate when direct costs are used. They are less accurate when indirect costs are applied. For costs like software licensing fees and hardware acquisition, direct costs are pretty solid. But there are other costs like hourly consulting fees that can easily go over budget. Further, these calcs can be hampered by the opportunity cost of the support provided by existing staff to the technology project in question: when staff is supporting the project, they cannot do their regular jobs. For disciplined companies, the cost of validating the ROI after the payoff period has expired adds costs to the project and diminishes the expected return (see item 3).
3. Most companies do not bother to measure the actual ROI of a project after implementation and after the projected payoff period. This implies that the ROI and TCO calculations are merely exercises to mollify the budget-keepers. If a company truly valued its capital and the need to satisfy a capital hurdle rate (ours is 12%), then they would also have the discipline to validate the ROI after the payoff period has passed. To me, the fact that most companies don't do this means that the ROI and TCO calcs are either acts of obeisance or they are simply traditional, yet vacuous corporate exercises.
4. Individuals who want the project approved adjust their assumptions not to reflect reality but to get the numbers needed to get project approval. It simply involves working the numbers backwards.
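Item 4 is easy to demonstrate with the arithmetic itself. In this sketch (the $150,000 project cost is a hypothetical figure; the 12% rate is our hurdle rate mentioned above), instead of estimating the benefit and then computing NPV, you invert the annuity factor to solve for the annual-benefit assumption that makes the NPV just clear zero, and then pitch a number slightly above it:

```python
# Sketch of "working the numbers backwards": solve for the benefit
# assumption that gets the project over the hurdle rate.
# The $150,000 cost is hypothetical; 12% is the hurdle rate from the post.

def npv(annual_benefit, cost, years, rate):
    """Net present value of a level annual benefit against an up-front cost."""
    return sum(annual_benefit / (1 + rate) ** t
               for t in range(1, years + 1)) - cost

def benefit_needed(cost, years, rate):
    """Annual benefit at which NPV is exactly zero (annuity factor inverted)."""
    annuity_factor = sum(1 / (1 + rate) ** t for t in range(1, years + 1))
    return cost / annuity_factor

required = benefit_needed(cost=150_000, years=3, rate=0.12)
# Pitch the project with a "productivity gain" assumption just above
# `required`, and the spreadsheet dutifully approves it.
print(f"Annual benefit needed to break even: ${required:,.0f}")
```

Nothing in the spreadsheet distinguishes this back-solved assumption from an honestly estimated one, which is exactly why the calculation mollifies the budget-keepers without informing anyone.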
Check out this four-minute CNET video on performing the TCO calculation. It's a wonderful illustration of my objections to these common exercises. There are two key problems with the claims made in this particular presentation.
Common metric for decisions on all potential projects: She asserts that the value of the ROI calc is that it allows a business to evaluate all potential cash outlays by using the same metric: namely, the value of expected benefit in relation to the up-front costs.
This is not true in all cases, and it is particularly sketchy with technology purchases, because the benefits of technology, and the dollar value of those benefits, are difficult to measure. In contrast, the ROI of a forklift is more tangible because the dollar value of its benefits is easier to measure (e.g. it increases the ability to move inventory to assembly, which helps increase inventory turns).
She completely glosses over the value of benefits with a straight face: In her example, she says something like, "We are going to experience an increase in productivity that will drive revenue gains of $80,000/yr for three years."
This is exactly the kind of statement that completely undermines the value of ROI calcs. She doesn't describe how to go about calculating the value of the expected benefits. Specifically, how will the expected benefits translate into measurable dollars? The dollar value of benefits magically appears in ROI spreadsheets as a function of assumptions about those benefits, and no one ever challenges the ROI calculation, even though everyone knows the assumptions are usually inaccurate and incomplete.
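To see how much apparent precision a single unexamined assumption generates, you can discount her $80,000/yr-for-three-years claim at a 12% hurdle rate (the rate mentioned earlier in this post; the video doesn't state one):

```python
# Her example: "revenue gains of $80,000/yr for three years."
# Discounted at a 12% hurdle rate (the rate used earlier in this post):

HURDLE_RATE = 0.12
present_value = sum(80_000 / (1 + HURDLE_RATE) ** t for t in (1, 2, 3))
print(f"Present value of the claimed benefit: ${present_value:,.2f}")
# About $192,147 of "value" to two decimal places -- all of it resting on
# a productivity assumption that was never measured or challenged.
```

The spreadsheet produces a number to the penny, but the entire figure is a function of the $80,000 assumption that appeared from nowhere.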
The reality is that most companies do not have a clear idea of the dollar value of the benefits of technology acquisitions because tech buys are largely intangible. Further, the broader the scope of a tech buy, the more systemic its effects will be. This means that implementing an ERP application for a whole company will touch more business processes, with more potential for broad benefit or catastrophic disruption, than, say, implementing AutoCAD in the engineering department. The intangibility of tech buys can be made more concrete by baselining the costs of business processes and redesigning those processes to derive benefit from the tech buy. Of course, most companies don't have the discipline for this kind of validation.
Some tech spends have variable benefit value based on whether the technology is fully utilized. For example, if your company uses Office primarily for memos, localized number crunching (i.e. individuals running their calculations without regard to other departments) and databases, there isn't likely a whole lot of value in upgrading from Office 2003 to Office 2007. But if you take Office 2007 and integrate it with SQL 2005, SharePoint and Microsoft Business Scorecard Manager to distribute key performance indicator metrics to the company, you will experience incredible value for the cost of upgrading to Office 2007.
The bottom line is this: for tech spends, it is difficult to accurately measure the dollar value of expected benefits. ROI calculations are hampered by this difficulty. TCO calcs are hampered by the presence of direct and indirect costs, as well as costs that are ignored for various reasons. For example, cold room electrical consumption may be viewed as an overhead cost, which makes it difficult to evaluate the cost of electricity as a decision-making factor in choosing one server or another. This is why Sun's marketing strategy of selling SunFires on the basis of lower power consumption and lower heat throw is not likely to connect with a lot of companies. I address this issue toward the end of the article.
The bottom line is that in many, if not most cases, decision-makers buy software solutions because they have an intuitive or hopeful sense that the software spend will generate real process and dollar benefit. I have seen decision-makers tweak assumptions known to be erroneous simply to get the ROI over the hurdle rate so they can get their project approved. In other words, they adjust the numbers at the back end of the calculation in order to justify their desire and hope.
Risk analysis then is also intuitive and is measured by comparing the risk of failure with the comfort level among decision-makers that they have made a good selection and that the project will, in the end, be successful at delivering value.
Technology is bought because of emotion. It's not a purely financial decision.
As much as accountants would like it to be otherwise, technology spends are primarily emotional. Technology is bought because of intuition, desire, covetousness, a need to appear significant, and a deep hope that something can make business pain go away.
Posted by Dave at 10:29 AM