New AH Data Project
Once the Operational Data Committee (ODC) has completed prioritization of your data source, a formal project can be created to add your data to an Activity Hub (AH).
- 1 Project Scope
- 2 Assessment
- 3 Documentation and Planning
- 4 Data Integration Development
- 5 Data Integration Quality Assurance (QA)
- 6 View Development
- 7 View Quality Assurance (QA)
- 8 BI Tool Modeling
- 9 BI Tool Model Quality Assurance (QA)
- 10 Release Requirements Check
- 11 Release
- 11.1 Beta Release
- 11.2 UAT Release
- 11.3 Production Release
- 11.4 Exit Hypercare
- 11.4.1 Tasks
Project Scope
- Data source name
- Table name(s)
- Final curated view(s)
- Which view(s) are included in the initial go-live
Assessment
Data is examined for the kinds of information included, data type definitions, redundancy, and general condition.
Documentation and Planning
Existing data fields are mapped to the AH and a transfer plan is created. Curated views are designed based on business requirements and processes. The data definition committee (DDC), community of practice (COP), and data governance contacts are identified. The business term name and definition document is kicked off within the data definition committee. The data access process is identified and a ServiceNow request form is created.
Detailed Design Document
Customer completes the design document, listing out the fields needed for the initial release (see attached example).
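For illustration only, the field list captured in a design document can be thought of as a set of records pairing each source field with its proposed business term and definition, plus a flag for the initial release. All field names, terms, and definitions below are hypothetical, not drawn from a real Activity Hub source:

```python
# Hypothetical sketch of the field list a customer might capture in the
# detailed design document. Every name and definition here is illustrative.
requested_fields = [
    {"source_field": "TRIP_END_DT", "business_term": "Trip End Date",
     "definition": "Date the travel event concluded.", "initial_release": True},
    {"source_field": "EXP_TYPE_CD", "business_term": "Expense Type Code",
     "definition": "Code categorizing the expense line.", "initial_release": True},
    {"source_field": "AUDIT_FLAG", "business_term": "Audit Flag",
     "definition": "Indicates the report was selected for audit.", "initial_release": False},
]

# The design document calls out which fields are needed for initial release.
initial_release_fields = [f["source_field"] for f in requested_fields
                          if f["initial_release"]]
```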
Data Definition Committee
The AH Data Definition Committee provides guidance and subject matter expertise regarding the field names and definitions of data.
Role and responsibilities of committee members:
Source System Business Subject Matter Expert(s)
Definition:
- Primary analyst who will be leveraging content via the activity hub.
- Often the person who requested the data be added to the activity hub, or a member of their team.
- Must have a defined business need to leverage the data and a refined concept of how the data will be used in creating critical reports and/or analytics.
- Minimum 1 person and maximum 5 people for the initial phase.
Responsibilities:
- Provide business expertise on what data is currently leveraged from the source system and where that data is currently being accessed.
- Provide context on the business use cases of the data, as well as current reporting and/or analytical challenges that inclusion of the data within the activity hub is expected to address.
- Complete the provided template (see attached) for the initial data request prior to DDC meetings beginning. Explanation and instructions for completing this template will be provided to those identified for this role.
- Provide critical input on field names, definitions, and use cases.
- Prepare for each DDC meeting by reviewing the list of fields to be reviewed (provided by the DDC lead) and collecting relevant information.
- Represent their team and, to the best of their ability, the analytic community.
- Attend all DDC meetings to the best of their ability; provide an alternate when not able to attend.
Source System Technical Subject Matter Expert
Definition:
- Member of the primary team managing use of the source system.
- Where possible, should have analytic experience or, at minimum, an awareness of how analysts may leverage content from the source system (examples: for Concur, the IPPS team; for Tririga, the Facilities & Services team).
- Minimum 1 person from the source management team.
Responsibilities:
- Able to answer where the data is stored in the source system and how the data gets populated.
- Representing their team, provide business expertise and guidance related to proper naming, definition, context, and sample values for data fields.
- May also suggest fields to be included in the activity hub deliverable.
- Prepare for each DDC meeting by reviewing the list of fields to be reviewed (provided by the DDC lead) and collecting relevant information.
- Able to display source system fields and examples during DDC meetings.
- Attend all DDC meetings to the best of their ability; provide an alternate when not able to attend.
Analytic Community of Practice
Communities of practice consist of regular Zoom sessions, mailing lists and forums where analysts come together to discuss specific topics, share best practices, ask questions and get answers. Learn more at Blink > Analytics Community of Practice.
Data Governance
Data and analytics governance is the specification of decision rights and an accountability framework to ensure the appropriate behavior in the valuation, creation, consumption and control of data and analytics.
Data governance must consider privacy, security, access, management, and maintenance. Learn more at Blink > Data Governance.
Data Steward
A data steward will be identified by the Operational Data Committee. At this stage in the process, the data steward is responsible for:
- Setting access policies for Report Developer and Report Consumer per view
- Assigning a Protection Level classification per field
Access
Access to Activity Hubs focuses on two audiences: Report Developers and Report Consumers.
Report Developer access may be requested via:
- An Activity Hub-specific ServiceNow request form. The Data Steward provides the approval criteria and the approval route, which includes the requestor's manager and the data steward at minimum.
- A data source access form. BIA can work with the data source team to add a "would you like to be an Activity Hub report developer?" question to the data source access form.
- An existing access request process. BIA can work with the owner of the existing process to connect AH access to it.
Report Consumer access is granted by Report Developers when they share their completed business intelligence solution with the consumer Active Directory group.
For more information on Activity Hub security please see Blink > Data Access & Security.
Data Integration Development
This work is typically done by the ADIS team. Leveraging streaming technologies, data is physically moved from its source to the HANA business data platform. This step includes the initial historical data load as well as ongoing incremental loads and streams. After BIA provides the source-to-target mapping to ADIS, the assigned integrator can begin work.
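The source-to-target mapping handed from BIA to the integrator can be pictured as a simple record per column, pairing a source table and column with its target in the hub. A minimal sketch, with entirely hypothetical table and column names, along with the kind of sanity check an integrator might run before starting:

```python
# Minimal sketch of a source-to-target mapping. All table and column
# names are hypothetical placeholders, not real Activity Hub objects.
mapping = [
    {"source_table": "SRC.EXPENSE_HDR", "source_column": "RPT_ID",
     "target_table": "AH.EXPENSE_REPORT", "target_column": "REPORT_ID"},
    {"source_table": "SRC.EXPENSE_HDR", "source_column": "SUBMIT_DT",
     "target_table": "AH.EXPENSE_REPORT", "target_column": "SUBMIT_DATE"},
]

# Sanity check: every target column is mapped from exactly one source column.
targets = [(m["target_table"], m["target_column"]) for m in mapping]
duplicates = {t for t in targets if targets.count(t) > 1}
assert not duplicates, f"Ambiguous mappings: {duplicates}"
```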
Data Integration Quality Assurance (QA)
Transferred data is validated for accuracy. Issues are corrected and the data is rechecked until no accuracy issues remain.
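One common form this validation takes is a completeness check: compare row counts and a cheap aggregate "checksum" between source and target after a load. The sketch below uses Python's sqlite3 as a stand-in for the actual source system and HANA target, with hypothetical table names:

```python
import sqlite3

# Sketch of a post-load completeness check. sqlite3 stands in for the real
# source and HANA target; table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_expense (rpt_id TEXT, amount REAL);
    CREATE TABLE ah_expense  (report_id TEXT, amount REAL);
    INSERT INTO src_expense VALUES ('R1', 10.0), ('R2', 25.5);
    INSERT INTO ah_expense  VALUES ('R1', 10.0), ('R2', 25.5);
""")

def table_profile(table, key, amount):
    # Row count, distinct-key count, and a sum act as a cheap checksum.
    return conn.execute(
        f"SELECT COUNT(*), COUNT(DISTINCT {key}), SUM({amount}) FROM {table}"
    ).fetchone()

source_profile = table_profile("src_expense", "rpt_id", "amount")
target_profile = table_profile("ah_expense", "report_id", "amount")
loads_match = source_profile == target_profile
```

If the profiles diverge, the mismatched dimension (count vs. sum) points at whether rows were dropped or values were transformed incorrectly.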
View Development
Code is written to organize the data into user-friendly curated views.
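A curated view typically renames technical columns to business-friendly terms and joins in descriptive attributes so report developers do not have to work with raw keys. A minimal sketch, using sqlite3 as a stand-in for HANA and entirely hypothetical table, column, and view names:

```python
import sqlite3

# Sketch of a curated view over raw loaded tables. sqlite3 stands in for
# HANA; all object names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE ah_expense_hdr (report_id TEXT, submit_date TEXT, owner_id TEXT);
    CREATE TABLE ah_employee    (emp_id TEXT, emp_name TEXT);
    INSERT INTO ah_expense_hdr VALUES ('R1', '2023-01-15', 'E7');
    INSERT INTO ah_employee    VALUES ('E7', 'Sam Rivera');

    -- The curated view applies business-friendly names and resolves the
    -- owner key to a readable attribute.
    CREATE VIEW v_expense_report AS
    SELECT h.report_id   AS "Report ID",
           h.submit_date AS "Submit Date",
           e.emp_name    AS "Report Owner"
    FROM ah_expense_hdr h
    JOIN ah_employee e ON e.emp_id = h.owner_id;
""")

rows = conn.execute("SELECT * FROM v_expense_report").fetchall()
```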
View Quality Assurance (QA)
Data is validated for accuracy and alignment with business requirements. Data is rechecked until there are no issues.
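Business-requirement alignment can be expressed as a set of rule queries that must each return zero violations, e.g. "no null keys" or "no future dates". A sketch under those assumptions, again with sqlite3 standing in for HANA and hypothetical names:

```python
import sqlite3

# Sketch of view-level QA: each business rule is a query counting
# violations; the view passes when every count is zero. Names are
# hypothetical and the real checks would run against HANA.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE v_expense_report (report_id TEXT, submit_date TEXT);
    INSERT INTO v_expense_report VALUES ('R1', '2023-01-15'), ('R2', '2023-02-02');
""")

qa_checks = {
    "null report id":
        "SELECT COUNT(*) FROM v_expense_report WHERE report_id IS NULL",
    "future submit date":
        "SELECT COUNT(*) FROM v_expense_report WHERE submit_date > date('now')",
}

# Collect only the rules that found violations.
failures = {name: count for name, sql in qa_checks.items()
            if (count := conn.execute(sql).fetchone()[0]) > 0}
```

An empty `failures` dict corresponds to the "no issues" exit condition; anything else goes back for correction and a recheck.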
BI Tool Modeling
Business terms are applied to the user-friendly curated view and data is organized into folders within Cognos and Tableau for ease of use.
BI Tool Model Quality Assurance (QA)
With the data definition committee, data is validated for accuracy and alignment with business requirements. Data is rechecked until there are no issues.
New requirements may be identified and prioritized for a later phase.
Release Requirements Check
QA must be completed and passed. Data access policy must be approved and in place. Production support strategy must be defined and in place.
Release
Data is released to users via the Cognos and Tableau data analysis and reporting tools. The ServiceNow request form for report developer data access is released.
Beta Release
The first people to see the new views via Cognos will be subject matter experts and project team members. The data steward will provide BIA with a list of people to be given report developer access in Cognos.
UAT Release
Approved report developers will test the new view via Cognos or Tableau for one month.
Production Release
If no issues are found, the views will be migrated to HANA PROD, Cognos packages will be migrated to Cognos production, and Tableau data sources will be migrated to the Tableau production folder.
Exit Hypercare
Hypercare = 2 months post initial go live
After hypercare, the customer SME can elect to join one of the regular AH syncs or submit SNOW tickets moving forward.
Tasks
| Criteria | Target Metric | Status | Notes |
|---|---|---|---|
| Bug Tickets | <5 per month | | |