By Greg Kalina, Allstate & Paul Burnham, ServiceNow
ITAK V10 I4
We have all heard about data integrity and how important it is, but taking it from concept to reality is often harder than expected because of unforeseen challenges. This case study presents some of the data integrity and data quality challenges that had to be overcome during an IT Asset Management implementation project at Allstate Insurance Company.
Overview of Project:
Allstate chose ServiceNow as its primary platform to provide its CMDB, asset management repository and overall support for the related asset management and service management functions.
The objective was to achieve a complete and accurate hardware and software inventory of IT assets within the Allstate environments. The initial implementation included approximately 45,000 Windows-based desktops and laptops, 3,500 Windows-based servers, 3,500 non-Windows servers, nine mainframes and various other systems. There were also over 350 software products from Allstate’s top 15 vendors included in the project. These numbers grew considerably over the length of the project.
Key tools relevant to this case study include: Microsoft System Center Configuration Manager (SCCM), the primary discovery tool for workstations and Windows-based servers; BDNA Technopedia, the primary tool for normalization (and identification) of SCCM discovery data before it was loaded into ServiceNow; SCCM’s software metering function for usage data; and Citrix itself to identify assigned users of Citrix products.
The following are key concepts and why they were important.
Data quality: Refers to the overall ability to provide a clear understanding of the data for its intended purpose. Aspects of data quality include accessibility, accuracy, consistency, comprehensiveness, currency, granularity, relevancy and timeliness. Basically: is it good data?
- Why it was important: Without data quality, data becomes faulty or incomplete, losing its ability to serve its intended purpose.
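As an illustration, a minimal sketch of the kind of record-level checks that support data quality; the field names (`hostname`, `product`, `last_seen`) and the 30-day staleness threshold are hypothetical, not taken from the Allstate implementation:

```python
from datetime import datetime, timedelta

# Hypothetical inventory schema; field names are illustrative only.
REQUIRED_FIELDS = ("hostname", "product", "version", "last_seen")

def quality_issues(record, max_age_days=30):
    """Return a list of data-quality problems found in one inventory record."""
    issues = []
    for field in REQUIRED_FIELDS:
        if not record.get(field):          # comprehensiveness: field present and non-empty
            issues.append(f"missing {field}")
    last_seen = record.get("last_seen")
    if last_seen and datetime.now() - last_seen > timedelta(days=max_age_days):
        issues.append("stale record")      # currency/timeliness
    return issues

records = [
    {"hostname": "ws-001", "product": "TOAD DBA", "version": "12.1",
     "last_seen": datetime.now()},
    {"hostname": "ws-002", "product": "", "version": "9.0",
     "last_seen": datetime.now() - timedelta(days=90)},
]
report = {r["hostname"]: quality_issues(r) for r in records}
# report["ws-002"] flags the empty product field and the 90-day-old record
```

Checks like these answer the "is it good data?" question before the data is relied on downstream.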
Data integrity: Refers to maintaining the overall completeness, accuracy and consistency of data, with no intended or unintended alterations as it moves from the source of collection to the point of consumption. Basically: does the data remain an exact copy while in transit?
- Why it was important: Without assurance that the data has maintained its consistency (i.e., no intended or unintended alterations) while being transferred from collection source to consumption, the trustworthiness of the data is in question.
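The "exact copy in transit" idea can be verified mechanically by comparing checksums at the source and the destination. A minimal sketch in Python; the CSV export shown is invented for illustration:

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Digest used to confirm a payload is byte-for-byte unchanged."""
    return hashlib.sha256(data).hexdigest()

# Simulated export leaving the discovery tool (contents invented for illustration).
export = b"hostname,product,version\nws-001,TOAD DBA,12.1\n"
source_digest = sha256_of(export)

received = export                                # intact transfer
altered = export.replace(b"12.1", b"12.2")       # an unintended alteration in transit

intact = sha256_of(received) == source_digest    # True: integrity holds
tampered = sha256_of(altered) == source_digest   # False: alteration is detected
```

Publishing the source digest alongside each transfer gives the consuming system a cheap way to prove nothing changed en route.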
Understanding software licenses: Refers to knowing, in detail, the various requirements needed to properly identify consumption/use of a specific license. Basically: what is needed to properly count specific license metrics?
- Why it was important: Without this understanding, you would not know which key details were needed to support effective management of specific licenses.
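To illustrate why the license details matter, here is a sketch showing how the same deployment data yields different counts under different metrics; the metric names and install records are hypothetical, not Allstate license terms:

```python
# Hypothetical install records; product and metric names are illustrative.
installs = [
    {"product": "TOAD DBA", "host": "ws-001", "user": "alice"},
    {"product": "TOAD DBA", "host": "ws-002", "user": "alice"},
    {"product": "Reflection", "host": "ws-001", "user": "alice"},
]

def consumption(installs, product, metric):
    """Count license consumption under two common metric styles."""
    rows = [i for i in installs if i["product"] == product]
    if metric == "per_install":
        return len(rows)                           # one entitlement per installation
    if metric == "per_user":
        return len({i["user"] for i in rows})      # one entitlement per distinct user
    raise ValueError(f"unknown metric: {metric}")

consumption(installs, "TOAD DBA", "per_install")   # 2
consumption(installs, "TOAD DBA", "per_user")      # 1
```

One user with two installs consumes two per-install entitlements but only one per-user entitlement, which is exactly why the metric must be known before counting.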
Understanding tools within the environment: Refers to knowing, in detail, the capabilities and functionality of the various tools available within your environment to support your IT Asset Management function. Basically: which tool can provide the needed capability or functionality?
- Why it was important: Without this understanding, you would not know which tool could or could not provide the key details needed to support effective management of specific assets – in this case, software licenses.
The Five-Step Approach
We used a basic five-step approach during our implementation process:
Software Products Reviewed
For this article on our case study, we selected products to represent three of the key focus groupings within the Allstate environment. The focus groupings were:
- Products that required only deployment data
- Products that required deployment and usage data, both of which could be provided by an ITAM tool
- Products that required additional “non-discoverable” data that could not be provided by an ITAM tool
Based on these groupings, the following products were chosen as a representative sample of the challenges we had to overcome:
- Required only deployment data: Dell (Quest Software) TOAD products – TOAD Expert and TOAD DBA
- Required deployment and usage data collected from an ITAM tool: Attachmate Reflection products – Reflection and Reflection Suite for X
- Required additional “non-discoverable” data: Citrix XenDesktop products
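For the second grouping, deployment data alone is not enough: comparing it against usage data is what reveals installs that are never actually run. A minimal sketch under assumed host sets (not actual Allstate data):

```python
def reclamation_candidates(deployed, used):
    """Hosts where the product is installed but no usage was ever metered."""
    return sorted(deployed - used)

# Hypothetical host sets; names are illustrative.
deployed = {"ws-001", "ws-002", "ws-003"}   # discovery: product installed
used = {"ws-001"}                           # metering: product actually run

reclamation_candidates(deployed, used)      # ['ws-002', 'ws-003']
```

This kind of reconciliation is what makes usage data worth collecting: unused installs are candidates for reclamation before buying more licenses.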
Initial Attempt: TOAD Products
Additional Attempts: TOAD Products
Initial Attempt: Attachmate Products
Initial Attempt: Citrix XenDesktop Products
At the beginning of this project we thought we fully understood all the key aspects within the environment:
- Data quality
- Data integrity
- Software licenses
As it turned out, we failed to fully understand the capabilities and functionality of the available tools, which resulted in unnecessary work and lost time.
We overcame these and other challenges as we progressed through the project. We continued to gain a better understanding of the capabilities and functionality (strengths, weaknesses and limitations) of the available tools, requirements of specific licenses and how to collect and verify the data needed to properly manage those licenses. We shared the various lessons learned and put them to use as we moved forward, which made other software products less of a challenge to bring under management.
The last thing we will leave you with is this: data quality and data integrity are not interchangeable. Data quality relates to the ability of the data to serve its intended purpose; data integrity relates to the data remaining unchanged while in transit from its source to its consumption. Both are needed when performing asset management, and you need to validate that both have been achieved.