Category Archive for ‘Advisory’
Software Asset Management and Truffaut’s Stagecoach
by ps
In Truffaut’s film “Day for Night”, the director Ferrand says: “Shooting a film is like crossing the Wild West by stagecoach. You set out hoping for a nice trip, but soon you wonder if you’ll ever reach your destination.”
With our design approach, we protect our clients from such a “stagecoach ride” through the Wild West of Software Asset Management (SAM) from the very beginning. We do not rely on a given tool, but on a design developed specifically for each client. This approach saves clients time and money: successes become visible more quickly and are more sustainable, even if the planning phase takes a little longer.
We believe that efficient and effective SAM can only be achieved with a profound design of the future organization. Only then can SAM fully unfold its positive effects and play an integral part in IT management.
We design the set-up or remodeling of SAM in our clients’ organizations, ranging from the initial concept to the ongoing “object support” of the client’s SAM. We use techniques developed in architecture and industrial design, e.g. design thinking, to ensure that each SAM we deliver is tailored to the customer. We do not believe in the one-size-fits-all concept that dominates the SAM mainstream, which often builds organizations around a license management tool. A license management tool is a handy thing, but in our experience a SAM tool alone has rarely improved anything sustainably.
IBM License Metric Tool: To Flex(Net) Or Not To Flex(Net)?
by ps
Since Flexera announced that its SAM tool is accepted by IBM as an alternative to ILMT (IBM License Metric Tool) for sub-capacity licensing, there has been a lot of excitement in the Software Asset Management community. The news sounded tempting, especially to those who, on a daily basis, have to undertake the Sisyphean labor of managing ILMT’s software inventories, signing audit reports and reporting PVU consumption. What agitated the masses was, and still is, the question: Can Flexera’s SAM tool be used instead of IBM ILMT, and will IBM actually accept it legally?
The answer is quite straightforward: FlexNet users should not simply install the FlexNet Manager Platform, run with it and assume that IBM will accept it. Instead, they should first read the IBM Terms on Flexera’s website (1) and then consider carefully whether it is worth replacing ILMT’s discovery and inventory capabilities with Flexera’s.
The bottom line is that IBM sub-capacity reporting requires FlexNet Manager for IBM and FlexNet Manager Platform. These two products must be installed together, at version 2015 R2 or later, and it is the FlexNet user who is fully responsible
…that FlexNet Manager for IBM is functionally equivalent to the most current version of ILMT and provides equivalent reporting (including, without limitation, a minimum scan frequency of 30 minutes),
…that FlexNet Manager for IBM is properly installed, configured and maintained at all times,
…for working with Flexera to ensure that such conditions are met, and
…for procuring additional Flexera Software products, if required, to meet these conditions.
Even if you can guarantee to fulfill all of the obligations stated above, IBM still reserves the right to verify your FlexNet Manager for IBM installation with the assistance of a third party.
IBM’s authorization is furthermore limited to FlexNet Manager for IBM performing discovery and reporting of processor usage either natively or integrated with IBM License Metric Tool (ILMT), Tivoli Asset Discovery for Distributed (TADd) or IBM BigFix Inventory. The authorization does not apply to Flexera acting as an aggregator/dashboard and reporting processor usage discovered by other third party tools.
As the terms have to be accepted by IBM, FlexNet users have to undergo a formal approval process:
(1) They have to register with IBM and formally submit a request via a Flexera representative.
(2) IBM will review the request and provide its position.
(3) If IBM agrees, the Terms also require FlexNet users to sign a Passport Advantage amendment with IBM containing the related terms.
As stated in a post in the developerWorks forum on July 21st, 2016 (2), IBM neither certifies nor validates the Flexera solution. According to the Terms, it is solely the customer’s responsibility to ensure functional equivalence with the latest ILMT releases and updates, and the customer bears the full risk of doing so.
So running the FlexNet Manager Platform and FlexNet Manager for IBM for sub-capacity licensing definitely comes with a price tag. The number on it depends on the ability of the tool set to remain functionally and technically equivalent to its IBM counterpart(s), not only at rollout but on a day-to-day basis, without much input from the customer side. Contractually spreading risks and responsibilities between customer and tool vendor to fulfill IBM’s terms can consequently prevent unexpected surprises. The result: the initial euphoria has since given way to a sense of sobriety.
(1) http://www.flexerasoftware.com/enterprise/products/software-license-management/flexnet-ibm-license-management/, 2016-09-07
(2) https://www.ibm.com/developerworks/community/forums/html/topic?id=ba2bdee6-8a2b-478b-a64c-836574900bd2&ps=25, 2016-09-07
Zombie Servers – IT Asset Management and the Undead Peril
by ps
Who is afraid of zombies? No one? Not quite. At least IT Asset Managers around the world get quite agitated when it comes to zombies in their IT infrastructure. Why is that? Because comatose servers, so-called zombie servers, do nothing but burn money. According to an article in the Wall Street Journal published in 2015, tens of billions of dollars are wasted worldwide simply because no one pulls the plug (1). Setting up a server is cool, keeping servers up and running is OK, but decommissioning servers is risky: nobody wants to be held responsible for switching off a machine that later turns out to be used for a critical but rare job. Hence, for IT Asset Managers of all types, sorting out which servers should be turned off is often a Sisyphean labor. Fighting zombie servers may not be sexy, glorious or attractive. But it is lucrative, saving energy, software and hardware costs, and therefore worth doing. Since it is mainly a management problem:
Team Up. Get involved and work cross-functionally in request/change processes as early as possible.
Get to one version of the truth. Get and ensure accurate inventories at any point in time, focusing on collection of data prior to deployment rather than reactively.
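A first technical step toward "one version of the truth" is triaging utilization data. Here is a minimal sketch, assuming you can export per-server CPU utilization samples from your monitoring system; the threshold and server names are illustrative, not prescriptive:

```python
from statistics import mean

def find_zombie_candidates(cpu_samples: dict[str, list[float]],
                           threshold: float = 5.0) -> list[str]:
    """Flag servers whose mean CPU utilization stays below `threshold` percent.

    A flagged server is a candidate for cross-functional review,
    not for automatic shutdown.
    """
    return sorted(
        name for name, samples in cpu_samples.items()
        if samples and mean(samples) < threshold
    )

# Example: srv-legacy barely registers any load over the sampling window.
candidates = find_zombie_candidates({
    "srv-legacy": [0.5, 1.2, 0.8],
    "srv-erp": [55.0, 62.3, 48.9],
})
print(candidates)  # ['srv-legacy']
```

The point of the review step is exactly the risk described above: a low-utilization server may still run a critical but rare job, so the list is an agenda for the teams involved, not a kill list.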
(1) http://www.wsj.com/articles/zombie-servers-theyre-here-and-doing-nothing-but-burning-energy-1442197727, Sept 13, 2015, 10:28 pm ET
SAM Mantra
by ckb
I do not install what I do not use.
I do not install what I have not licensed.
I do not use what I have not licensed.
I do not think that the SAM manager is an idiot.
I do not license what is not in use.
I do not license what is not installed.
I do not think that the Admin is an idiot.
Data Master or Master of Disaster?
by ps
Data management is at the core of any Software Asset Management. The equation is clear: high-end data ensures high-end decisions, not only in SAM. If SAM data does not satisfy basic data quality criteria, decision making in SAM resembles a heuristic approach to problem solving: practical and sufficient to ease the uncertainty of not knowing and to speed up the decision-finding process, but nevertheless no more than an educated guess.
Even if data quality is often an unnoticed or even unloved issue, there is no way around it. It is the foundation you build your “house of compliance” on: the better it is built, the more stable it stands. Yet data about software and hardware assets has long been a neglected component of everyday IT management, as IT itself was for a long time expected merely to “serve and deliver”. Nowadays, as information has grown into a strategic asset and IT costs have risen considerably, information about IT assets is often scattered across data silos in disparate organizational departments and is hard to interconnect and manage. So what to do?
First things first: “You cannot manage what you do not measure.” Most organizations make no effort to measure the quality of their SAM data, either objectively or quantitatively, because they have no realistic estimate of how much it affects targeted benefits. This is not SAM-specific but a general truth: research estimates that 40% of the anticipated value of all business initiatives is never achieved (1). Introducing a metrics-based approach to assessing data quality in SAM helps replace perceptions and gut feelings and builds a business case for formal data quality improvement efforts. Once the business value is tangible, start organizing your SAM data management both strategically and tactically, which means doing more than just implementing a tool (2). Last but not least: reap the benefits!
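One way to start such a metrics-based assessment is to quantify the completeness of SAM records. A minimal sketch; the required field names are hypothetical and depend on your own data model:

```python
from typing import Iterable, Mapping

# Hypothetical required fields; adapt to your own SAM data model.
REQUIRED_FIELDS = ("hostname", "product", "version", "license_id")

def completeness(records: Iterable[Mapping[str, object]]) -> dict[str, float]:
    """Share of records carrying a non-empty value, per required field."""
    records = list(records)
    if not records:
        return {field: 0.0 for field in REQUIRED_FIELDS}
    return {
        field: sum(1 for r in records if r.get(field)) / len(records)
        for field in REQUIRED_FIELDS
    }

inventory = [
    {"hostname": "srv01", "product": "DB Server", "version": "11.2", "license_id": "L-123"},
    {"hostname": "srv02", "product": "DB Server", "version": "", "license_id": ""},
]
print(completeness(inventory))
# {'hostname': 1.0, 'product': 1.0, 'version': 0.5, 'license_id': 0.5}
```

A single number per field is enough to replace gut feeling with a trend you can report on, and completeness can later be complemented by accuracy and timeliness metrics in the same fashion.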
(1) Friedman T., Smith M.: Measuring the Business Value of Data Quality, Gartner 2011.
(2) Goetz, M.: Are Data Governance Tools Ready for Data Governance? Michele Goetz’ Blog, 2014
SAM Principles To Keep In Mind
by ps
Installed + used + licensed = good
Installed + not used + licensed = bad
Installed + used + not licensed = ugly
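The three principles translate directly into a compliance check. A minimal sketch; the fourth label is an addition of mine for combinations the principles above do not cover:

```python
def classify(installed: bool, used: bool, licensed: bool) -> str:
    """Map an installation's state to the good/bad/ugly principles above."""
    if installed and used and licensed:
        return "good"    # paid for and productive
    if installed and not used and licensed:
        return "bad"     # paid for, but shelfware
    if installed and used and not licensed:
        return "ugly"    # productive, but a compliance risk
    return "review"      # anything else deserves a closer look

print(classify(installed=True, used=True, licensed=False))  # ugly
```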
Migrating Windows Server 2003 Can Save Your IT Security
by ps
Ten years are enough. On July 14th, 2015, we have to wave Windows Server 2003 good-bye. The reason: end of Extended Support. As the Windows XP of the server world, the farewell is still hard for millions of users. But having no migration plan yet puts your IT security seriously at risk: no support means no more updates, and affected systems will become a welcome entry point for (cyber) intruders of all kinds. Custom Support may help, but seems far too expensive to be a sound, long-term solution to security holes in your IT landscape. Proactively planning your migration now can save you some insomnia.
So what to do? Change to another operating system like Unix or Linux? In theory, Unix is a handy idea (less expensive and more secure than Windows); in practice it is often not feasible, because some of the most important applications may run only on Windows-based systems. That leaves the upgrade option for the bulk of users. Windows Server 2012 R2 is Microsoft’s latest server operating system. Unfortunately, it cannot execute 16-bit applications at all, and 32-bit applications only via an emulator. Sounds as if compatibility issues may ruin your day? Right. Hence, switch straight to Windows Server 2012 R2 only if you were already using the 64-bit edition of Windows Server 2003. Otherwise, update to Windows Server 2008, but keep in mind that extended support for this product ends in 2020, so the migration to-dos are only postponed, not dealt with. Likewise, you can rethink your attitude towards cloud services; but as putting (sensitive) enterprise data in a public cloud is not to everybody’s taste, a hybrid solution may be a better alternative if migrating workloads is more of an issue than server hardware and software upgrades.
Given the complex mixture of technical, commercial and legal issues arising from the migration requirements, it is wise to start executing your migration plans today, even though completing all the tasks involved by July of this year is not very realistic. Bridging security gaps by falling back on EMET (Enhanced Mitigation Experience Toolkit) and intrusion prevention systems should therefore be a matter of course, as should the ability to isolate attacked systems.
Even if high costs are involved in developing new versions of old applications, acquiring 64-bit versions or alternatives and/or upgrading hardware, and budgets are tight, it is difficult to understand why so many organizations still seem reluctant to migrate, as legacy software is in the long term an even costlier venture.
From Data Orc to Business Wizard
by ps
It is high time that IT moves on from data orc to business wizard. What do I mean by that? Data keeps accumulating in server rooms and clouds while we ask ourselves every day what we really know about our organizations, our business environments or our customers. IT is rarely more than a convenient tech delivery service, and we are still wondering whether all the applications we run really improve our productivity or drive our business. Clearly, we need a change of (IT) perspective.
Gathering, storing and securing data and caring for properly working software and hardware is (and will always be) important, no doubt. But sitting on a huge pile of data does not support decision-making. All the buzz around Big Data and all-time-high IT budgets seems to underline that we need to move beyond bits-and-pieces thinking and embrace the vast and complex side of technology, information and knowledge, residing not only in our data, in order to create new products and services (or at least improve old ones).
The tech-savvy geek tuning and controlling machines and apps like a sort of business orc will soon belong to the past, as IT switches to a more business-focused and therefore more strategic role, asking how it can grow the business and add business value. This requires a lot of knowledge about what is going on inside and outside a company; hence, IT staff will need more business and social skills. With the growing number of service providers in IT, there is no longer a need to do everything on one’s own. In-depth technical knowledge is fine, but knowing the right sources is even handier when it comes to promoting innovation and spurring business development. Architectures that allow flexible and agile technical change, likewise quick and secure delivery management, and effective and efficient data and vendor management are already gaining momentum in IT departments as they awaken to the requirements of the creative age.
Mind the gap #Windows Server 2012 R2
by ps
For changes in your production environment, the difference in licensing Windows Server 2012 R2 can be bluntly boiled down to this: mind the gap between the number of physical processors and the number of virtual machines your servers run.
Especially in heavily virtualized environments, the costly Datacenter Edition may pay off, as it allows running an unlimited number of virtual machines on a server while each license covers two physical processors. In contrast, a Standard Edition license also covers two physical processors but allows only two virtual machines. For example: if you have a server with two processors and thirteen virtual machines, you need only one Datacenter license as opposed to seven Standard licenses. If you move to four physical processors per server and still run thirteen virtual machines, two Datacenter Editions suffice, while under the Standard Edition every additional pair of virtual machines requires re-licensing all four processors, so you would have to buy fourteen Standard licenses to be on the safe side.
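The counting rule can be written down. A minimal sketch, assuming the 2012 R2 rules as described here: each license of either edition covers two physical processors, Datacenter allows unlimited VMs, and Standard grants two VMs per fully licensed processor stack:

```python
import math

def datacenter_licenses(processors: int) -> int:
    """One Datacenter license per pair of physical processors; VMs unlimited."""
    return math.ceil(processors / 2)

def standard_licenses(processors: int, vms: int) -> int:
    """Standard also covers two processors per license but only two VMs;
    running more VMs means licensing the whole processor stack again."""
    per_stack = math.ceil(processors / 2)   # licenses to cover all processors once
    stacks = max(1, math.ceil(vms / 2))     # each full stack grants two VM rights
    return per_stack * stacks

# Two processors, thirteen VMs: one Datacenter license vs. seven Standard licenses.
print(datacenter_licenses(2), standard_licenses(2, 13))  # 1 7
```

Under these assumptions, the break-even point between the editions depends almost entirely on VM density per host, which is exactly the “gap” to mind.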
Hence, small changes can make a big difference when it comes to your IT costs. So a little counting and calculating against the background of your migration plans or other change requirements may avoid the common pitfall of “lived cheap is paid dearly”.