INFORMATION MANAGEMENT RECORDS (IMR) AUDIT
FINAL REPORT
July 2008

Prepared by: Ljubica Kuraica-Siveska
TABLE OF CONTENTS

1. EXECUTIVE SUMMARY
  1.1 INTRODUCTION
  1.2 OBJECTIVES
  1.3 APPROACH
2. INTERVIEWS - FINDINGS / RECOMMENDATIONS
  2.1 ADEQUACY AND EFFECTIVENESS OF TRAINING
  2.2 FREQUENCY OF USE
  2.3 STABILITY AND USER FRIENDLINESS
  2.4 SUPPORT AND COMMUNICATION
  2.5 INTEGRATION WITH QMS AND OTHER TECHNOLOGIES
  2.6 ABILITY TO MAINTAIN RESTRICTED INFORMATION
  2.7 COMFORT LEVEL WITH THE PROJECT APPROACH
3. IMR CAPACITY CHECK
APPENDIX I – RDIMS IMPLEMENTATION INTERVIEW QUESTIONNAIRE
APPENDIX II – RDIMS IMPLEMENTATION – MATURITY CRITERIA DESCRIPTION
APPENDIX III – DOCUMENTS REVIEWED
1. EXECUTIVE SUMMARY

The enclosed report summarizes the results of the Information Management Renewal (IMR) audit conducted at the National Energy Board in Calgary, Alberta, in May/June 2008. The audit was conducted by a contractor, Ljubica Kuraica-Siveska, under the direction of the Internal Audit and Review Services Advisor, Tony Mitchell. The report is based on information gathered from two primary sources: participant interviews and a review of project practices, processes and related documentation. These were evaluated and analyzed to identify opportunities for improvement and to develop recommendations. Two methodologies were used in this audit:

• The first approach was to look at the current records management practices, the RDIMS implementation work that has been done to date, and the work that must proceed to move the implementation to the next phase.
• The second approach, used in conjunction with the first, was used to assess the IMR management framework capacity. It is based on generally accepted best practices and therefore provides an additional opportunity to identify gaps between current and best practices, along with advice on how to address those gaps.

Since the beginning of the Information Management Renewal Project (January 2007), several significant goals have been achieved, including:

• Project Working Group endorsement of the RDIMS application configuration and profile.
• Staffing the IM Team.
• Training and coaching of the majority of the Board.
• Implementation and integration of RDIMS version 4.7 on users' desktops across the Board.
• System stability.
• Specialized knowledge bases designed for different Board functions.
• Commencement of IMR process engagements.
• Building relationships and a process-based focus.
• Compliance with Treasury Board of Canada's July 1, 2007 Policy on Information Management.

Much has been accomplished; some of the major benefits of RDIMS are obvious and RDIMS is being used by many employees. However, the system has not yet been fully embraced, and work remains to be done. Some of the major causes are:

• File plan (structure) not developed for most of the processes, and concerns about saving documents in the "To be filed" directory and being unable to locate them in future (based on improper file naming conventions, etc.).
• RDIMS not being integrated with QMS and other technologies such as ESIMS, ELEXIM, Translation software, etc.
• Lack of enforcement / accountability.
• No timeframes associated with user acceptance.
• Lack of communication.
• Work overload / other priorities.

As a consequence, there is high user resistance and a perception that RDIMS is user unfriendly and time consuming. Poor training retention and a loss of RDIMS momentum are also evident.

The overall conclusion is that although the project shows a solid framework and sense of direction, a few high-risk variables are still associated with the project's success. The maturity of the QMS processes, as well as the resources for QMS and RDIMS, remain ongoing challenges.

The audit identified 16 recommendations and 50 opportunities for improvement. Among those recommendations, the following need immediate attention:

• Close out the Implementation Phase and commence the Operations Phase (revise the scope, objectives, milestones and deliverables for this phase and have them officially approved by the steering committee).
• Communicate the Operations Phase of the implementation. Communicate what the next steps are and clarify what happens with the storage of information/records produced by QMS processes that will not be integrated this year (ensure that user perceptions are consistent).
• Revise the IM Policy to clarify responsibilities and accountabilities for everyone involved (accountability could then be monitored as part of the employee's RESULTS program).
• Formalize the QMS Engagement Process so that it is officially accepted and "signed off" by the Process Owner (once it is signed off, RDIMS becomes the official information repository).
• Incorporate Key Performance Indicators (KPI) and a Risk Factors Analysis into the project status reports to ensure accurate project control and monitoring.

Detailed findings with recommendations are discussed in Chapter 2. Additional opportunities for improvement to enhance the overall IMR management framework are presented in Chapter 3.

At this time I would like to thank the NEB and the personnel from the various Business Units for the information, feedback and assistance given to me by all involved.

Best Regards,
Ljubica Kuraica-Siveska
1.1 INTRODUCTION

The National Energy Board's Records Renewal Program was initiated in mid-2003 in response to a Consulting & Audit Canada report on records management practices at the NEB. In September 2005 a second phase of the program (the Phase 2 Project) was initiated, and in the summer of 2006 a rising level of concern about the project resulted in a decision to conduct a review and risk assessment of the program, performed by Lawrence Hobbs of Chinook Solutions Inc. During the assessment it was strongly recommended that the Phase 2 Project be carefully paused while management came to grips with the significant problems that affected the very foundation of the project. At the beginning of the 2007/2008 fiscal year, after the program had been revalidated and a proper business analysis developed, the implementation of the Information Management Renewal program started. To prevent some of the previous scenarios from happening again, it was decided that the program would be assessed in two time frames: in May/June 2008, during the implementation of the Records Document Information Management System (RDIMS), and during the latter part of the project, when it becomes fully operational.

1.2 OBJECTIVES

The objective of this report is to assess current records management practices, the RDIMS implementation work that has been done to date, and the work that must proceed to move the implementation to the next phase. The sets of objectives are:

• Examine the capabilities of RDIMS to collect and store permanent and transitory records.
• Examine the capability of RDIMS to collect and maintain confidential and secure records.
• Determine whether user requirements and concerns are being fully met with the software and the implementation process.
• Establish a comprehensive baseline against which the full implementation of RDIMS will be measured and compared, and provide short-term and long-term opportunities for improvement.
• Assess the management framework capacity and compare it with generally accepted best practices.
• Ensure that the design and implementation of RDIMS is in compliance with Treasury Board of Canada's July 1, 2007 Policy on Information Management.

The findings and recommendations are intended to provide input to the Internal Audit and Review Services Advisor, the Audit & Evaluations Committee, as well as the IM Manager.

1.3 APPROACH

The assessment was conducted by a contractor, Ljubica Kuraica-Siveska, under the direction of the Internal Audit and Review Services Advisor, Tony Mitchell. The final report is based on information gathered from two primary sources: participant interviews and a review of project practices, processes and related documentation. These were evaluated and analyzed to identify opportunities for improvement and to develop recommended strategies. The work was organized into the following four stages:
Stage 1 - Documentation Review and Project Practices

In Stage 1, extensive project documentation was collected and reviewed to establish the appropriateness of the project plan, the progress of the implementation to date, and the potential influence of policies, practices, procedures and behaviours on the chances of success. In addition, information from the Treasury Board, other government departments and outside sources was reviewed to determine legislative and policy requirements. Findings from previous studies, as well as the experiences and best practices of others, were also reviewed. (A list of documents reviewed for this report is included in Appendix III.)

Stage 2 - Interviews

Thirty participants were interviewed between May 21 and June 13, 2008. In order to get in-depth information, the interviewees (staff and managers) were selected from almost all of the Business Units and had various levels of involvement (some had already used RDIMS for 3+ years, while others were relatively new to the system). Each interview was conducted using a questionnaire and lasted between 30 minutes and 1 hour. There was somewhat less participation in the interviews than originally expected; this may be explained by conflicting schedules/other commitments, resistance, or simply the fact that RDIMS is not fully implemented and not frequently used by some interviewees. The interview questionnaire (Appendix I) was prepared to capture current users' perceptions of some of the key elements of the RDIMS implementation.

Stage 3 - RDIMS Capacity Check Assessment

A successful IMR implementation requires the efficient and effective transformation of business processes, technology and people skills. To assess the management framework, identify gaps between current and best practices, and provide advice on how to address those gaps, an IMR capacity check was conducted. The objectives of the capacity check are to:

• Compare against best practices.
• Bring together all elements of the IMR management framework.
• Form a comprehensive baseline of the current maturity state, against which the NEB will gauge progress over time.

The methodology and tool used for this assignment is based on the e-Government Capacity Check, originally developed by KPMG Consulting LP. It consists of six key elements (as presented in Table 1 below) and evaluation criteria for each key element across a maturity model scale of 1-5. (A detailed description of the capacity check and scoring model is available in Appendix II.)

Stage 4 - Response Formulation (Findings and Recommendations)

In this stage, 16 recommendations and 50 opportunities for improvement were identified and preliminary mitigation strategies were recommended before the results were validated (in draft form) with the Internal Audit and Review Services Advisor and the IM Manager. This final report was then developed and delivered.
 
Table 1 - RDIMS CAPACITY CHECK

Strategy (Where are we going?)
• Vision - Extent to which users and stakeholders have collaborated to develop the vision statement, the degree of alignment with the NEB's business strategies and Treasury Board direction, and vision communication within the organization.
• Governance - Effectiveness of the leadership and organizational accountabilities for the RDIMS transformation.
• Strategies, Plans and Policies - Extent to which existing business strategies, plans and policies are aligned with RDIMS.
• Resource Commitment - The level of funding and degree to which financial and human resources are committed and aligned with the RDIMS strategy.

Architecture (What are we developing?)
• Business Model - Definition of the business processes essential for RDIMS.
• Security - Definition of security technologies and standards to ensure that RDIMS transactions are secure.
• Data - Definition of data objects to support integration of the RDIMS applications.
• Application - Definition of how RDIMS applications are designed, how they integrate with existing internal and external systems, and where they reside.
• Technology - Definition of the technologies and standards for the technical components to host RDIMS initiatives.
• Network - Definition of the communication infrastructure for the transmission of RDIMS information.

Risk & Program Management (How are we managing?)
• Risk Management - Mechanisms in place to identify, assess, mitigate, and monitor all risks, including government-wide, organization-wide and project-specific risks associated with RDIMS.
• Portfolio Management - Mechanisms to plan, track, and evaluate the overall NEB project portfolio.
• Project Management - Mechanisms to manage projects in the RDIMS program to ensure the optimal deployment of initiatives.
• Business Transformation - Mechanisms to transform the organization's service delivery processes to an e-government business model.

Organizational Capabilities (What competencies are needed?)
• RDIMS Competencies - Mechanisms used to ensure that staff competencies in support of RDIMS initiatives are defined, acquired, developed and sustained for RDIMS design, delivery and ongoing operations.
• Tools & Techniques - Tools and techniques to support the organization in the design, delivery and ongoing operations of RDIMS.
• Organizational Learning - The ability to capitalize on RDIMS knowledge through the access, sharing, and management of information within a learning organization.

Values Management (Work with partners & clients?)
• Relationship Management - Mechanisms and support for the formation of partnerships between departments.
• Privacy Compliance - Mechanisms to ensure that confidentiality and anonymity are maintained in the course of conducting RDIMS transactions.

Performance Management (How are we doing?)
• Customer Satisfaction - Mechanisms to measure, evaluate, and learn from users' feedback on the effectiveness of RDIMS.
• Benefits Monitoring - Mechanisms to measure and assess the degree to which the expected benefits of the RDIMS program are being realized.
• Predictability - Mechanisms to monitor and measure the reliability and availability of the network, databases and RDIMS application systems and to compare them with predetermined service standards.
• RDIMS Maturity Reporting - Mechanisms to measure and report on the organization's progress towards implementing RDIMS.
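The capacity check rates each criterion above on the 1-5 maturity scale described in Stage 3 and rolls the ratings up into an element-level baseline. The Python sketch below is purely illustrative: the element and criterion names follow Table 1, but the ratings are placeholder values, not the scores assigned in this audit, and the per-element average is only one possible roll-up (the actual scoring model is described in Appendix II).

```python
# Illustrative sketch only: structure mirrors Table 1; the ratings below are
# placeholders on the 1 (lowest) to 5 (highest) maturity scale, not the
# scores assigned in this audit.
from statistics import mean

scorecard = {
    "Strategy": {"Vision": 3, "Governance": 2,
                 "Strategies, Plans and Policies": 3, "Resource Commitment": 2},
    "Architecture": {"Business Model": 2, "Security": 3, "Data": 3,
                     "Application": 3, "Technology": 3, "Network": 3},
    "Risk & Program Management": {"Risk Management": 2, "Portfolio Management": 2,
                                  "Project Management": 3, "Business Transformation": 2},
    "Organizational Capabilities": {"RDIMS Competencies": 2, "Tools & Techniques": 3,
                                    "Organizational Learning": 2},
    "Values Management": {"Relationship Management": 2, "Privacy Compliance": 3},
    "Performance Management": {"Customer Satisfaction": 2, "Benefits Monitoring": 1,
                               "Predictability": 3, "RDIMS Maturity Reporting": 1},
}

# Roll each element up to an average maturity score to form a baseline
# against which progress can be gauged over time.
for element, criteria in scorecard.items():
    print(f"{element}: {mean(criteria.values()):.1f}")
```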
2. INTERVIEWS - FINDINGS / RECOMMENDATIONS

The results presented in this section are based on information gathered through interviews with National Energy Board staff and managers. A considerable amount of feedback from 30 interviewees was collected and organized into seven major areas. Closed questions from the questionnaire permitted the production of the charts presented in the following sections.

2.1 ADEQUACY AND EFFECTIVENESS OF TRAINING

The majority of people (61%) rated the training as "Mostly" effective and adequate. They felt that the training overall was good and that the trainer's classroom style was very interactive and user friendly. For some people the training was very informative but too basic, and it was noted that they need closer ties to how they will use RDIMS within their own processes/procedures. There were comments that the online security information training was a bit dry and difficult to concentrate on. Considering that RDIMS is not fully operational at this time (e.g. file plan and process document profiles have not been developed for most of the processes), the timing of the training worked better for some users than for others.

Perceived Adequacy and Effectiveness of Training (%)
Effective: 26 | Mostly: 61 | A Little: 13 | Not at all: 0

Recommendation 1: Consider follow-up training sessions, developed based on the processes, that incorporate the QMS Engagement Process and deal with specific scenarios, tips and real-life examples of activities that relate to users' daily processes.

2.2 FREQUENCY OF USE

Most of the people received the training and have RDIMS on their desktops; however, they do not know what to do next. Some of them are even advised by their team leaders not to use RDIMS until the file plan is developed. As illustrated in the chart below, most people rarely or never use RDIMS. Some of the major causes are:

• File plan (structure) not developed for most of the processes. There are concerns about saving documents in the "To be filed" directory and an inability to locate them in future (based on improper file naming conventions, etc.).
• RDIMS not being integrated with QMS and other technologies such as ESIMS, ELEXIM, Translation software, etc.
• Lack of enforcement / accountability.
• No timeframes associated with user acceptance.
• Work overload / other priorities.
• Perceived as user unfriendly and time consuming, etc.
RDIMS Frequency of Use (%)
Daily: 22 | Weekly: 22 | Monthly: 17 | Never: 39
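The percentages charted in sections 2.1 and 2.2 come from tallying the closed questions across the interviews. The short Python sketch below is illustrative only: the tally function is generic, and the sample answer list is invented for demonstration, not the audit's raw interview data.

```python
# Minimal sketch of how a closed question's responses could be tallied into
# the percentage breakdowns charted above. The sample answers are invented;
# they are not the audit's raw interview data.
from collections import Counter

def percentage_breakdown(responses, categories):
    """Return {category: % of responses}, rounded to whole percents."""
    counts = Counter(responses)
    total = len(responses)
    return {c: round(100 * counts.get(c, 0) / total) for c in categories}

sample = ["Never", "Weekly", "Daily", "Never", "Monthly",
          "Weekly", "Never", "Daily", "Never", "Weekly"]  # hypothetical answers
print(percentage_breakdown(sample, ["Daily", "Weekly", "Monthly", "Never"]))
# -> {'Daily': 20, 'Weekly': 30, 'Monthly': 10, 'Never': 40}
```

Rounding to whole percents matches the presentation used in the charts.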
Recommendation 2: Close out the Implementation Phase and commence the Operations Phase. Start the Operations Phase as a new/phased project: revise the scope, objectives, milestones and deliverables for this phase and have them officially approved by the steering committee. To reflect this, revise all pertinent documents, such as:

• Project Plans
• Program Strategy
• Change Management Plan
• QMS Process Engagements, etc.