Author Archive

Software Asset Management and Truffaut’s Stagecoach
by ps

In Truffaut’s film “Day for Night”, the director Ferrand says: “Shooting a film is like crossing the Wild West by stagecoach. You set out hoping for a nice trip, but soon you wonder if you’ll ever reach your destination.”

Our design approach protects our clients from such a “stagecoach ride” through the Wild West of Software Asset Management (SAM) right from the start. We do not rely on a given tool, but on a design developed specifically for each client. This procedure saves clients time and money: successes become visible more quickly and are more sustainable, even if the planning phase takes a little longer.

We believe that efficient and effective SAM can only be achieved with a thorough design of the future organization. Only then can SAM fully unfold its positive effects and play an integral part in IT management.

We design the set-up or remodeling of SAM for our clients’ organizations, ranging from the concept to the “object support” of the client’s SAM. We use techniques borrowed from architecture and industrial design, e.g. design thinking, to ensure that each SAM we deliver is perfectly tailored to the customer. We do not believe in the one-size-fits-all concept mainly pursued in the SAM mainstream, which often builds organizations around a license management tool. A license management tool is a handy thing, but in our experience a SAM tool alone has rarely improved anything sustainably.

IBM License Metric Tool: To Flex(Net) Or Not to Flex(Net)?
by ps

Since Flexera announced that its SAM tool is accepted by IBM as an alternative to ILMT (IBM License Metric Tool) with regard to sub-capacity licensing, there has been a lot of excitement in the Software Asset Management community. The news sounded tempting, especially to those who, on a daily basis, have to undertake the Sisyphean labor of managing ILMT’s software inventories, signing audit reports and reporting PVU consumption. What agitated the masses was, and still is, the question: Can Flexera’s SAM tool be used instead of IBM ILMT, and will IBM actually accept it legally?

The answer is quite straightforward: FlexNet users should not simply install the FlexNet Manager Platform, run with it and assume that IBM will accept it. Instead, they should first read the IBM Terms on Flexera’s website (1) and then ponder carefully whether it is worth making the change and replacing ILMT’s discovery and inventory capabilities with Flexera’s.

It is important to note that, at the bottom line, IBM sub-capacity reporting requires FlexNet Manager for IBM and FlexNet Manager Platform. These two products must be installed together, at version 2015 R2 or later, and it is the FlexNet user who is fully responsible

…for ensuring that FlexNet Manager for IBM is functionally equivalent to the most current version of ILMT and provides equivalent reporting (including, without limitation, a minimum scan frequency of 30 minutes),

…for ensuring that FlexNet Manager for IBM is properly installed, configured and maintained at all times,

…for working with Flexera to ensure that these conditions are met, and

…for procuring additional Flexera Software products, if required, to meet these conditions.

Even if you can guarantee to fulfill all of the obligations stated above, IBM still reserves the right to verify your FlexNet Manager for IBM installation with the assistance of a third party.

IBM’s authorization is furthermore limited to FlexNet Manager for IBM performing discovery and reporting of processor usage, either natively or integrated with IBM License Metric Tool (ILMT), Tivoli Asset Discovery for Distributed (TADd) or IBM BigFix Inventory. The authorization does not apply to Flexera acting as an aggregator/dashboard that reports processor usage discovered by other third-party tools.
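To make tangible what sub-capacity “discovery and reporting of processor usage” boils down to, here is a minimal sketch in Python. All field names are hypothetical, and the real rules (per-model PVU tables, eligibility criteria, the 30-minute scan cadence) are considerably more involved; the core idea is that each scan charges the lesser of virtual and physical capacity, and the peak over the reporting period is what gets reported.

```python
from dataclasses import dataclass

@dataclass
class Scan:
    """One discovery snapshot of a partition (hypothetical schema)."""
    host: str
    virtual_cores: int   # cores assigned to the VM/LPAR at scan time
    physical_cores: int  # cores on the underlying physical machine
    pvu_per_core: int    # PVU rating of the processor model

def subcap_pvu(scans: list[Scan]) -> dict[str, int]:
    """Peak sub-capacity PVU per host over the reporting period.

    Sub-capacity licensing charges the lesser of virtual and physical
    capacity; the highest value seen across all scans is reported.
    """
    peaks: dict[str, int] = {}
    for s in scans:
        pvu = min(s.virtual_cores, s.physical_cores) * s.pvu_per_core
        peaks[s.host] = max(peaks.get(s.host, 0), pvu)
    return peaks

# Example: a partition grows from 4 to 8 vCPUs on a 16-core host
# rated at 70 PVU per core -> the 8-vCPU peak (560 PVU) is reported.
scans = [Scan("lpar01", 4, 16, 70), Scan("lpar01", 8, 16, 70)]
print(subcap_pvu(scans))  # {'lpar01': 560}
```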

As the terms have to be accepted by IBM, FlexNet users have to undergo a formal approval process:

(1) They have to register with IBM and formally submit a request via a Flexera representative.

(2) IBM will review the request and provide its position.

(3) If IBM agrees, FlexNet users are, according to the Terms, also required to sign a Passport Advantage amendment with IBM containing the related terms.

As stated in a post in the developerWorks forum on July 21st, 2016 (2), IBM neither certifies nor validates the Flexera solution; according to the Terms, it is solely the customer’s responsibility to ensure functional equivalence with the latest ILMT releases and updates, and doing so is entirely at the user’s own risk.

So running the FlexNet Manager Platform and FlexNet Manager for IBM for sub-capacity licensing definitely comes with a price tag. The number on it depends on the tool-set’s ability to remain functionally and technically equivalent to its IBM counterpart(s), not only on delivery but on a day-to-day basis, without much input from the customer side. Contractually spreading risk and responsibilities between customer and tool vendor to fulfill IBM’s terms can consequently prevent unexpected surprises. The consequence: the initial euphoria has now given way to a sense of sobriety.


(1) http://www.flexerasoftware.com/enterprise/products/software-license-management/flexnet-ibm-license-management/, 2016-09-07

(2) https://www.ibm.com/developerworks/community/forums/html/topic?id=ba2bdee6-8a2b-478b-a64c-836574900bd2&ps=25, 2016-09-07

Zombie Servers – IT Asset Management and the Undead Peril
by ps

Who is afraid of zombies? No one? Not quite. At least IT Asset Managers around the world get quite agitated when it comes to zombies in their IT infrastructure. Why is that? Because comatose servers, so-called zombie servers, do nothing but burn money. According to an article in the Wall Street Journal published in 2015, tens of billions of dollars are wasted worldwide simply because no one pulls the plug (1). Setting up a server is cool, keeping servers up and running is OK, but decommissioning servers is risky. Nobody wants to be held responsible for switching off a machine that later turns out to be used for a critical but rare job. Hence, for IT Asset Managers of all types, sorting out which servers should be turned off is often a Sisyphean labor. Fighting zombie servers may not be sexy, glorious or attractive. But it is lucrative, saving energy, software and hardware costs, and therefore worth doing. It is mainly a management problem:

Team Up. Get involved and work cross-functionally in request/change processes as early as possible.

Get to one version of the truth. Ensure accurate inventories at any point in time, focusing on collecting data prior to deployment rather than reactively.
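On the technical side, shortlisting candidates can be partly automated. The sketch below uses hypothetical inventory fields and assumes activity data has already been consolidated from monitoring sources; it merely flags servers with no recorded activity within a grace period, and every candidate still needs human review before anyone pulls the plug.

```python
from datetime import datetime, timedelta

# Hypothetical inventory records; in practice these would be joined
# from CMDB, monitoring and network-flow data.
servers = [
    {"name": "srv-web-01", "last_activity": datetime(2016, 9, 1), "owner": "ecommerce"},
    {"name": "srv-old-42", "last_activity": datetime(2015, 2, 14), "owner": None},
]

def zombie_candidates(servers, as_of, grace_days=90):
    """Flag servers with no recorded activity within the grace period.

    A missing owner makes a server a stronger candidate, since nobody
    can vouch for a critical-but-rare job still running on it.
    """
    cutoff = as_of - timedelta(days=grace_days)
    return [s for s in servers if s["last_activity"] < cutoff]

for s in zombie_candidates(servers, as_of=datetime(2016, 9, 7)):
    owner = s["owner"] or "unowned!"
    print(f"candidate: {s['name']} ({owner})")  # candidate: srv-old-42 (unowned!)
```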

(1) http://www.wsj.com/articles/zombie-servers-theyre-here-and-doing-nothing-but-burning-energy-1442197727, Sept 13, 2015 10:28 pm ET

Data Master or Master of Disaster?
by ps

Data management is at the core of any software asset management. The equation is clear: high-end data ensures high-end decisions, not only in SAM. If SAM data does not satisfy certain data quality criteria, decision making in SAM resembles a heuristic approach to problem solving: practical and sufficient to ease the uncertainty of not knowing and to speed up decision finding, but nevertheless no more than an educated guess.

Even if data quality is often an unnoticed or even unloved issue, there is no way around it. It is the foundation you build your “house of compliance” on: the better it is built, the more stable the house. But data about soft- and hardware assets has long been a neglected component of everyday IT management, as IT itself was long expected simply to “serve and deliver”. Nowadays, as information has grown into a strategic asset and IT costs have risen considerably, information about IT assets themselves is often scattered across data silos in disparate organizational departments and is hard to interconnect and manage. So what to do?

First things first: “You cannot manage what you do not measure.” Most organizations make no effort to measure the quality of their SAM data, either objectively or quantitatively, because they have no realistic estimate of how much it affects targeted benefits. This is not SAM-specific but a general truth: research estimates that 40% of the anticipated value of all business initiatives is never achieved (1). Introducing a metrics-based approach to assessing data quality in SAM therefore helps replace perceptions and gut feelings and builds a business case for formal data quality improvement efforts. Once the business value is tangible, start organizing your SAM data management both strategically and tactically, which means doing more than just implementing a tool (2). Last but not least: reap the benefits!
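What a metrics-based approach can mean in its simplest form (field names are made up for illustration): even two plain measures, the completeness of mandatory fields and the duplicate rate, replace gut feeling with a number that a business case can be built on.

```python
records = [
    {"host": "srv01", "product": "DB Server", "version": "11.2", "owner": "finance"},
    {"host": "srv02", "product": "DB Server", "version": None,   "owner": None},
    {"host": "srv01", "product": "DB Server", "version": "11.2", "owner": "finance"},  # duplicate
]

MANDATORY = ("host", "product", "version", "owner")

def completeness(records, fields=MANDATORY):
    """Share of mandatory fields that are actually filled."""
    filled = sum(1 for r in records for f in fields if r.get(f))
    return filled / (len(records) * len(fields))

def duplicate_rate(records, key=("host", "product", "version")):
    """Share of records that repeat an already-seen key."""
    seen, dupes = set(), 0
    for r in records:
        k = tuple(r.get(f) for f in key)
        dupes += k in seen
        seen.add(k)
    return dupes / len(records)

print(f"completeness:   {completeness(records):.0%}")    # 83%
print(f"duplicate rate: {duplicate_rate(records):.0%}")  # 33%
```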

(1) Friedman, T.; Smith, M.: Measuring the Business Value of Data Quality. Gartner, 2011.

(2) Goetz, M.: Are Data Governance Tools Ready for Data Governance? Michele Goetz’s Blog, 2014.


SAM Principles To Keep In Mind
by ps

Installed + used + licensed = good

Installed + not used + licensed = bad

Installed + used + not licensed = ugly
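The same rule of thumb as a toy classifier (a sketch; real entitlement matching is far messier than three booleans):

```python
def classify(installed: bool, used: bool, licensed: bool) -> str:
    """The good/bad/ugly rule of thumb for a single installation."""
    if not installed:
        return "n/a"  # nothing deployed, nothing to judge
    if not licensed:
        return "ugly"  # compliance risk: unlicensed installation
    return "good" if used else "bad"  # licensed shelfware wastes money

print(classify(installed=True, used=False, licensed=True))  # bad
```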

Software Asset Data Management – A Handy Glossary For Informed Software Asset Managers
by ps

A continuously growing summary of important terms and definitions in software asset data management.

C

Cooked Data: SAM raw data that has undergone some form of processing – potentially ending up in a SAM database.

D

Dirty Data: A SAM database record that contains errors. Dirty SAM data can be caused by duplicate SAM records, incomplete or outdated SAM data and the improper parsing of record fields from disparate systems. Errors can be induced at any stage as data is entered, stored and managed.

Data Governance (DG): The overall management of the availability, usability, integrity and security of the data used in a SAM organization. It includes a governing body, a set of procedures and a plan for executing them. It comprises the specification of who owns SAM data assets and who is accountable for various aspects of the data; a definition of how the data is to be stored, archived, backed up and protected from mishaps, theft or attack; the development of standards and processes for how the data is to be used by authorized personnel; and the specification of auditing and controlling procedures that allow for ongoing compliance with external and internal regulations.

Data Hygiene: The collective efforts conducted to ensure the cleanness of SAM data, where clean means relatively free of errors.

Data Life Cycle Management (DLM): A policy-based approach to managing the flow of a SAM information system’s data throughout its life cycle, from creation and initial storage to the time when it becomes obsolete and is deleted.

Data Management: The development and execution of architectures, policies, practices and procedures in order to manage the SAM information life cycle needs of a SAM organization in an effective manner.

Data Profiling: Also called data archeology. The statistical analysis and assessment of SAM data values within a SAM data set for consistency, uniqueness and logic.
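A minimal illustration of profiling (column names invented for the example): counting distinct values and nulls per field quickly surfaces consistency and uniqueness problems.

```python
from collections import Counter

rows = [
    {"vendor": "IBM",   "metric": "PVU"},
    {"vendor": "ibm",   "metric": "PVU"},  # same vendor, inconsistent casing
    {"vendor": "Adobe", "metric": None},
]

def profile(rows, column):
    """Value frequencies and null count for one column."""
    values = [r.get(column) for r in rows]
    return {
        "distinct": Counter(v for v in values if v is not None),
        "nulls": values.count(None),
    }

print(profile(rows, "vendor"))  # 'IBM' vs 'ibm' reveals a consistency issue
print(profile(rows, "metric"))  # one null reveals a completeness issue
```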

Data Quality: The reliability and application efficiency of SAM data; a perception or an assessment of SAM data’s fitness to serve its purpose in a SAM context. Aspects of SAM data quality are: accuracy, completeness, update status, relevance, consistency, reliability, appropriate presentation and accessibility. Within SAM organizations, acceptable SAM data quality is crucial to operational and transactional processes as well as to SAM analytics and reporting. SAM data quality is affected by the way data is entered, stored and managed. Maintaining SAM data quality requires going through the data periodically and scrubbing it, which generally includes updating, standardizing and deduplicating records to get a single view of the data, even if it is stored in multiple and disparate systems.

Data Scrubbing: Often also called data cleansing. The process of amending or removing SAM data in a SAM database that is incorrect, incomplete, improperly formatted or duplicated.
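A sketch of the scrubbing idea (hypothetical fields; production pipelines add per-vendor matching rules and survivorship logic): normalize first, then deduplicate.

```python
def scrub(records):
    """Normalize key fields and drop empty or duplicate records."""
    seen, clean = set(), []
    for r in records:
        normalized = {
            "vendor": (r.get("vendor") or "").strip().upper(),
            "product": (r.get("product") or "").strip(),
        }
        key = (normalized["vendor"], normalized["product"])
        if normalized["vendor"] and key not in seen:
            seen.add(key)
            clean.append(normalized)
    return clean

dirty = [{"vendor": " ibm ", "product": "MQ"}, {"vendor": "IBM", "product": "MQ"}]
print(scrub(dirty))  # [{'vendor': 'IBM', 'product': 'MQ'}]
```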

Data Stewardship: The management and oversight of a SAM organization’s SAM data assets to help provide SAM users with high-quality data. Maintains agreed-upon data definitions and formats, identifies data quality issues and ensures that business users adhere to specified standards.

G

Garbage In – Garbage Out (GIGO): The quality of SAM output is determined by the quality of SAM input; e.g. a faulty SAM decision made from incomplete information. Originally coined by George Fuechsel, an early IBM programmer. Also captured by the saying “a fool with a tool is still a fool”, as a SAM tool can only process what it is given.

Garbage In – Gospel Out: The tendency to put unwarranted faith in the accuracy of computer-generated data.

M

Master Data Management (MDM): A method of enabling a SAM organization to link all of its critical data to one file, called a master file, that provides a common point of reference. When properly done, MDM streamlines SAM data sharing among personnel and departments. It includes training SAM personnel in how data is to be formatted, stored and accessed, as well as updating the master data on a regular basis.

Metadata: SAM data that describes other data. The prefix “meta” connotes an underlying definition or description, e.g. author, date created, date modified. Metadata facilitates finding and working with particular instances of SAM data.

O

OLAP (Online Analytical Processing): Enables users to analyze multidimensional data from various perspectives. OLAP is usually a crucial part of business intelligence. It is based on a multidimensional data model and allows for complex analytical as well as ad hoc queries with rapid execution times.
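As a small illustration of the “various perspectives” idea (a sketch with made-up data, using pandas rather than a full OLAP engine): a pivot over two dimensions is the simplest form of slicing and dicing a cube.

```python
import pandas as pd

# Made-up installation counts along two dimensions: vendor and region.
df = pd.DataFrame({
    "vendor": ["IBM", "IBM", "Adobe", "Adobe"],
    "region": ["EU", "US", "EU", "US"],
    "installs": [120, 80, 45, 60],
})

# "Slice and dice": aggregate installs per vendor across regions.
cube = df.pivot_table(values="installs", index="vendor",
                      columns="region", aggfunc="sum")
print(cube)
```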

R

Raw Data: Also called source or atomic data. SAM data that has not been processed for use.


Migrating Windows Server 2003 Can Save Your IT Security
by ps

Ten years are enough. On July 14th, 2015 we have to wave Windows Server 2003 goodbye. The reason: end of extended support. Being the Windows XP of the server world makes the farewell hard for millions of users. But having no migration plan yet puts your IT security seriously at risk. No support means no more updates, and affected systems become a welcome flaw for (cyber)intruders of all kinds. Custom Support may help, but it seems far too expensive to be a sound, long-term answer to security holes in your IT landscape. Proactively planning your migration now can save you some insomnia.

So what to do? Change to another operating system like Unix or Linux? In theory Unix is a handy idea (less expensive and more secure than Windows); in practice it may often not be feasible, because some of the most important applications only run on Windows-based systems. That leaves the upgrade option for the bulk of users. Windows Server 2012 R2 is Microsoft’s latest server operating system. Unfortunately, it cannot execute 16-bit applications at all and runs 32-bit applications only via an emulator. Sounds as if compatibility issues may ruin your day? Right. Hence, switch directly to Windows Server 2012 only if you were already using the 64-bit edition of Windows Server 2003. Otherwise, update to Windows Server 2008, but keep in mind that extended support for this product ends in 2020, so the migration to-dos are only postponed, not dealt with. Likewise, you can rethink your attitude towards cloud services; but as putting (sensitive) enterprise data in a public cloud is not to everybody’s taste, a hybrid solution may be the better alternative if migrating workloads is the issue rather than server (hard- and software) upgrades.

Given the complex mixture of technical, commercial and legal issues arising from migration requirements, it is wise to start executing your migration plans today, even though being done with all the tasks involved by July this year is not very realistic. In the meantime, bridging security gaps by falling back on EMET (Enhanced Mitigation Experience Toolkit) and intrusion prevention systems should be a matter of course, as should being able to isolate attacked systems.

Even if high costs are involved in developing new versions of old applications, acquiring 64-bit versions or alternatives and/or upgrading hardware, and budgets are tight, it is difficult to understand why so many organizations still seem reluctant to migrate, as legacy software is in the long term an even costlier venture.


Inventories are not enough
by ps

What data and information should be taken care of when managing vendor inventories is usually no secret, as audit literacy progresses. But irrespective of how well kept your inventory information is, its full economic impact often only unfolds if it is matched with (A) internal or external expert vendor and legal knowledge and (B) an organizational culture of learning. (A) because many pitfalls remain that can only be identified and overcome by people with extensive experience in dealing with information on a specific vendor. (B) because checking on results is only half the battle if no adequate action is taken on an organization-wide level, e.g. acquiring new, or modifying and reinforcing existing, knowledge, behaviors, skills, values and preferences, and/or synthesizing different types of information with regard to the different vendors.

If you are not sure how to recruit the right expert for your organization or how to design and implement a “learning organization”, contact us today for a free, confidential consultation.

From Data Orc to Business Wizard
by ps

It’s high time that IT moves on from data orc to business wizard. What do I mean by that? Data keeps accumulating in server rooms and clouds while we ask ourselves every day what we really know about our organizations, our business environments or our customers; IT is rarely more than a convenient tech delivery service while we still wonder whether all the applications we run really improve our productivity or drive our business. Clearly, we need a change of (IT) perspective.

Gathering, storing and securing data and caring for properly working soft- and hardware is (and will always be) important, no doubt. But sitting on a huge pile of data does not by itself support decision-making. All the buzz around Big Data and all-time-high IT budgets seems to underline that we need to move beyond bits-and-pieces thinking and embrace the vast and complex side of technology, information and knowledge residing not only in our data, in order to create new products and services (or at least improve old ones).

The tech-savvy geek tuning and controlling machines and apps like a sort of business orc will soon belong to the past, as IT switches to a more business-focused and therefore more strategic role, asking how it can grow the business and add business value. This will require a lot of knowledge about what is going on inside and outside a company; hence, IT staff will need more business and social skills. With the growing number of service providers in IT, there is no longer a need to do everything on one’s own. In-depth technical knowledge is fine, but knowing the right core competencies is even handier when it comes to promoting innovation and spurring business development. Architectures that allow flexible and agile technical change, likewise quick and secure delivery management, and effective and efficient data and vendor management are already gaining momentum in IT departments as they awaken to the requirements of the creative age.
