Guidance for Proof of Concept Pilot
Recommended Practice: Developing and Implementing an Enterprise-wide Electronic Records Management (ERM) Proof of Concept Pilot
A proof of concept pilot project is an opportunity to demonstrate the capabilities of Electronic Records Management (ERM) software in a limited area and in a controlled manner. A pilot project is an excellent risk mitigation strategy for an agency planning to implement an ERM system. It can also serve to inform or resolve an alternatives analysis for an agency during the investment planning phase. The pilot helps determine whether the software is appropriate for use by the agency and how easily it can be configured, providing hands-on experience for records managers, information technology (IT) personnel, and users.
This document applies the principles and "best practices" of IT project management to a proof of concept demonstration pilot for ERM whose purpose is to assess whether the solution should be deployed agency-wide. Based on the experiences of ERM pilot projects at the state and federal levels, this document can be used by agencies as a reference when they assemble pilot project teams, develop work plans, and solicit participants for an ERM pilot project. It is composed of six sections, followed by an Appendix of Resources for Conducting a Pilot Project.
1. Introduction
The strategic focus of the President's Management Agenda Electronic Government (E-Gov) initiatives is to transform federal programs and services into high-quality and well-managed cross-agency solutions to deliver services across the Federal government. The National Archives and Records Administration (NARA) is the managing partner for the ERM E-Gov Initiative. The ERM Initiative provides a policy framework and guidance for electronic records management applicable government-wide. The Initiative is intended to promote effective management and access to federal agency information in support of accelerated decision making. The project will provide federal agencies guidance in managing their electronic records and enable agencies to transfer electronic records to NARA.
This guidance document is one of a suite of documents to be produced under NARA's ERM Initiative that, when taken together, form the structural support for ensuring a level of uniform maturity in both the federal government's management of its electronic records and its ability to transfer electronic records to NARA.
This is the fifth of six documents to be produced under the Enterprise-wide ERM Issue Area. The earlier documents in the series provide guidance on topics ranging from capital planning and investment control to developing agency-specific functional requirements for ERM systems and evaluating Commercial Off-the-Shelf (COTS) products:
- The first document provides guidance for Coordinating the Evaluation of Capital Planning and Investment Control (CPIC) Proposals for ERM Applications ( http://www.archives.gov/records-mgmt/policy/cpic-guidance.html ).
- Electronic Records Management Guidance on Methodology for Determining Agency-unique Requirements ( http://www.archives.gov/records-mgmt/policy/requirements-guidance.html ) offers a process for identifying potential ERM system requirements that are not included in the Design Criteria Standard for Electronic Records Management Applications, DOD 5015.2-STD (v.2).
- Guidance for Evaluating Commercial Off-the-Shelf (COTS) Electronic Records Management (ERM) Applications ( http://www.archives.gov/records-mgmt/policy/cots-eval-guidance.html ) summarizes the Environmental Protection Agency's (EPA) experience determining agency-wide Electronic Records and Document Management System (ERDMS) requirements and identifying the COTS products that would best meet the needs of agency staff for both Electronic Document Management (EDM) and Electronic Records Management (ERM) functionality.
- Advisory guidance for Building an Effective ERM Governance Structure ( http://www.archives.gov/records-mgmt/policy/governance-guidance.html ) defines governance and its importance to the success of IT, the purpose and function of that governance, how project-specific governance (such as those instituted for enterprise-wide ERM) fits within and alongside other established governance structures, and the risks attendant in the absence of good governance.
The final guidance document in this series will be a "lessons learned" paper from EPA's proof of concept ERM pilot as well as other agencies' implementation experience. The guidance documents are aimed at helping federal agencies understand the technology and policy issues associated with procuring and deploying an enterprise-wide ERM system.
2. Application of this Practical Guidance Document
This practical guidance presents "lessons learned" and experience gained in the development of proof of concept pilots for ERM. It represents the experience of federal and state agency managers responsible for managing ERM pilot projects. As with other IT systems, agencies must adhere to OMB policies and guidance when planning for and selecting an ERM system. These policies are articulated in OMB Circular A-11, Part 7, Planning, Budgeting, Acquisition, and Management of Capital Assets [1], and OMB Circular A-130, Management of Federal Information Resources [2]. Additional OMB guidance is found in OMB Memoranda (see http://www.whitehouse.gov/omb/memoranda/index.html ).
The process described in the document provides insight into the steps necessary for making an ERM pilot project a success. The primary audiences for this document are those involved with the conduct of an ERM pilot project, including records managers, IT personnel, trainers, and selected end-user participants.
This document makes a number of assumptions about your level of knowledge concerning ERM systems and about your agency's capacity to acquire and implement an ERM system. Specifically, it assumes that you already have:
- Created awareness as to the importance of records management to the efficient operation of the agency. Support of senior management for the ERM project as a whole, and the pilot project as a priority activity for the agency, is critical to successful implementation of an enterprise-wide ERM system.
- Encouraged consistent record keeping behavior among staff through written policies, procedures, and records schedules that are up-to-date and include new programs and formats, providing training for new agency staff (and staff with new responsibilities).
- An understanding of ERM (purpose, components, and functionality) and how it differs from paper recordkeeping.
- Understood the drivers for ERM within your agency and made the business case for enterprise-wide ERM, linking project benefits to the agency's mission and objectives.
- Planned an enterprise-wide ERM system and completed the capital planning investment process. Additional assistance with the capital planning and investment control (CPIC) process can be found in the Guidance for Coordinating the Evaluation of CPIC Proposals for ERM Applications at http://www.archives.gov/records-mgmt/policy/cpic-guidance.html .
- Conducted a functionality requirement study to determine any agency-unique requirements not contained in DOD 5015.2-STD (v.2), Design Criteria Standard for Electronic Records Management Applications (http://jitc.fhu.disa.mil/recmgt/index.htm ). This includes a review of existing business processes and the development of a plan for improving those processes as you explore the ERM solution. (Complex business process reengineering issues can be significant and lengthy in their resolution and should be discussed within the organization and addressed prior to the implementation of the pilot. The pilot may highlight areas where additional adjustment to business processes and workflow are warranted.) Additional assistance for determining agency-unique requirements can be found in the ERM Guidance on Methodology for Determining Agency-unique Requirements ( http://www.archives.gov/records-mgmt/policy/requirements-guidance.html ).
- Evaluated several DOD 5015.2-STD (v. 2) compliant COTS ERM solutions, presenting your findings and recommendation to a governance body charged with ensuring that adequate financial resources and appropriately trained staff are allocated to the project, sanctioning tasks and timetables, and setting meaningful, measurable targets for the project.
- Coordinated and tested the system with the enterprise-wide taxonomy.
A pilot project allows you to test the COTS solution in your own environment, using your agency's records and staff. This necessary step—studying the impact and effectiveness of ERM at your agency—will help you determine how you can deploy the system agency-wide.
3. Planning the Successful Pilot
A pilot project can be the last major step before an agency commits to launching an ERM solution for use agency-wide, allowing you to gauge whether the proposed solution meets the needs of your agency (as defined in your requirements analysis). It is the first opportunity to test the technical capabilities of the system and experience how it operates with an agency's infrastructure, alongside other programs and systems, providing opportunities for agency staff to gain practical experience with ERM. Through this real life operational implementation, you can assess your agency's ability to utilize the system effectively. For instance:
- The pilot may reveal a need for additional technical staff and/or user training before enterprise-wide deployment.
- As a result of lessons learned through a pilot project, your agency may want to modify (or redesign) existing workflow processes to take full advantage of the capabilities of the technology.
- Existing records retention schedules may require revision to limit records series and disposition durations to a manageable number within the electronic environment.
To be a useful guide for full-scale implementation, a pilot should be carefully designed and evaluated. If the scope of the pilot is too narrow, it risks not containing enough records to be useful to users. Without a critical mass of records in the system, staff come to see searching it as a waste of time because their searches will not yield the desired results; the system goes unconsulted, and users resort to other means of finding the records they seek. Activities related to pilot projects can be divided into three distinct phases:
Preliminary -
- Define the purpose, goals, objectives, and scope of the pilot/proof of concept demonstration project
- Establish the success criteria for the pilot, with input from all stakeholders, technical staff, records management staff, and users
- Outline the benefits of conducting a pilot and risks of not doing so
- Establish an administrative infrastructure to support and guide pilot project activities.
Conduct of the pilot -
- Determine whether preliminary decisions and assumptions made regarding hardware and software performance, as well as service level required by technical staff, were accurate
- Develop and use tools facilitating documentation, communication/knowledge transfer, and metadata processes.
Test and evaluation -
- Assess hardware and software, system and database design, and procedures (for training, scheduling, system management, and maintenance)
- Test product(s) in dissimilar locations (e.g., in terms of RM and IT support delivery) for functionality, usability, and benefits derived from using ERM
- Validate requirements and performance expectations.
Each of these phases is described in more detail below.
3.1.1 Defining the purpose, goals, and objectives of the ERM pilot
Defining the purpose and goals for an ERM pilot project is essential to its success. These are not the same as those you established for your agency's ERM initiative. To illustrate this point, Figure 1 presents the goals of an ERM project for one federal agency alongside the goals for its pilot.
Figure 1. Goals for an ERM initiative and pilot project
To help you achieve the goals for your ERM pilot (as expressed in Figure 1) you will have to:
- Evaluate the ERM system design by:
  - Testing how it operates in your agency's environment (technical architecture/infrastructure and work processes)
  - Assessing its suitability for the tasks agency staff needs to perform
- Test management procedures for ERM
- Market the ERM system to staff and management.
Specifying these objectives will help you identify those aspects you should monitor during the pilot. Additionally, you will want to relate the objectives of the pilot to the goals of the ERM project itself, finding essential metrics to track that will confirm or refute the anticipated financial and non-financial benefits to be derived from ERM deployment at your agency. For example, to determine whether the solution would "Promote the use and re-use of existing agency information" (Figure 1), you could do the following (a minimal sketch of this kind of analysis appears after the list):
- Analyze records that are accessed multiple times by those groups targeted for participation in the pilot project (by type, relationship of producer to user, frequency of access, purpose of access/use, ultimate recipient type, response time to ultimate user) during a specified period of time prior to the introduction of ERM
- Track the same data among pilot project participants for the duration of the pilot
- Assess whether there is increased use and re-use of agency information as a result of participating in the ERM pilot, emphasizing information identified as having been produced outside of a team/work group or other working relationship. In other words, were the participants of the pilot identifying and retrieving information that they would not have found using manual techniques?
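For illustration only, the sketch below shows one way such a re-use metric could be computed from a hypothetical access log; the log format, the field names, and the definition of "reuse_rate" are assumptions made for this example and are not part of the original guidance.

```python
from csv import DictReader

def reuse_rate(log_path, start, end):
    """Share of accessed records opened by someone other than their producer
    during [start, end) -- a simple proxy for re-use.
    Assumes a CSV access log with columns: record_id, producer, user, timestamp (ISO 8601)."""
    accessed, reused = set(), set()
    with open(log_path, newline="") as f:
        for row in DictReader(f):
            if not (start <= row["timestamp"] < end):  # ISO 8601 strings sort chronologically
                continue
            accessed.add(row["record_id"])
            if row["user"] != row["producer"]:
                reused.add(row["record_id"])
    return len(reused) / len(accessed) if accessed else 0.0

# Compare a baseline period with the pilot period (hypothetical file name and dates).
baseline = reuse_rate("access_log.csv", "2005-01-01", "2005-04-01")
pilot = reuse_rate("access_log.csv", "2005-04-01", "2005-07-01")
print(f"Re-use rate: baseline {baseline:.0%}, pilot {pilot:.0%}")
```

A comparable calculation could be run for other metrics (frequency of access, response time to the ultimate user), so long as the same definitions are applied to the baseline and pilot periods.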
A well-defined pilot project, with carefully crafted purposes, goals, and objectives will make the evaluation of its success an easier task to accomplish. In the next sections, we address the need for establishing success criteria at the outset of your pilot, and the effect this has in assuring smooth agency-wide deployment. These initial steps will help to minimize risk as implementation plans are devised and modified per results of the pilot and limit the cost of rectifying mistakes made during full-scale deployment.
3.1.2 Establishing success criteria
To be successful, a pilot needs the support of management, adequate funding, and experienced and well-trained staff (for managing the pilot and evaluating the system's potential for use in the agency). One agency's pilot project for e-mail ERM identified several factors that had to be achieved in order to declare the project a success; a sketch of how such factors can be restated as measurable criteria follows the list:
- Successful search and retrieval whereby users could locate and retrieve records with minimal difficulty
- Development of a file plan containing sufficient detail to file documents appropriately while limiting the number of hierarchical levels (in the interest of simplicity for users)
- Installation and integration of software on the network and workstations: Unobtrusive on the desktop and reasonably quick to launch.
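As an illustration only, the sketch below restates success factors of this kind as explicit, testable thresholds; the criterion names and target values are assumptions for the example, not figures from the pilot described above.

```python
# Hypothetical success criteria for an ERM pilot, expressed as measurable thresholds.
# All names and target values are illustrative assumptions.
SUCCESS_CRITERIA = {
    "search_success_rate":   {"target": 0.90, "higher_is_better": True},   # share of searches that locate the record sought
    "file_plan_depth":       {"target": 4,    "higher_is_better": False},  # maximum hierarchical levels in the file plan
    "client_launch_seconds": {"target": 5.0,  "higher_is_better": False},  # time to launch the ERM client on the desktop
}

def evaluate(observed):
    """Return (criterion, met?) pairs for observed pilot measurements."""
    results = []
    for name, spec in SUCCESS_CRITERIA.items():
        value = observed[name]
        met = value >= spec["target"] if spec["higher_is_better"] else value <= spec["target"]
        results.append((name, met))
    return results

# Hypothetical observations gathered at the end of the pilot.
for name, met in evaluate({"search_success_rate": 0.93, "file_plan_depth": 3, "client_launch_seconds": 7.2}):
    print(f"{name}: {'met' if met else 'not met'}")
```

Writing the criteria down in this form, with stakeholder agreement, makes the final evaluation a matter of comparing observed values against agreed targets rather than a matter of opinion.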
3.1.3 Outlining the benefits of a pilot (and risks of not conducting one)
A pilot provides early visibility of ERM to your agency.
- Conducting a highly visible pilot project will help you "pre-sell" the system, getting staff accustomed to the principles of ERM and the benefits of its use. This will assist with agency-wide deployment of the system.
- The pilot provides a tangible way of communicating the potential of the system to those within the agency who are not yet convinced that this change is necessary (or worth the effort). The results of the pilot are evidence of the system's immediate, tangible value.
- Those participating in the pilot can become important advocates of its use agency-wide. These individuals and work groups have a vested interest in ERM being adopted by the agency because they had input into its development and became adept at using the system.
There are risks attendant to not conducting a thoughtfully designed and comprehensive pilot project. Pilots reduce the risk of investment by identifying technical risk (e.g., compatibility problems with existing systems and infrastructure), areas for policy and procedure changes (workflow issues), and information for production planning (e.g., providing information for developing a realistic cost estimate and implementation and training schedule). Information gained as a result of the pilot will mitigate acquisition risks and prepare for full production/deployment. In addition, the pilot will help to assure that requests for capital funding of the project are accurate reflections of need, based on actual experience with the product operating within the agency's technical environment, and facilitate the governance process for IT projects at your agency.
3.1.4 Defining the scope and duration of the pilot
The pilot provides an opportunity for records managers, IT personnel, and staff (users) to employ the ERM solution within the agency's computing environment, using its own records. Limiting the pilot's scope to a manageable size (in terms of number of records and individuals involved) allows a pilot project team to execute a sufficient number of transactions to:
- Determine whether the software is appropriate for use by the agency
- Assess how easily it can be customized
- Ascertain how it can best be deployed agency-wide.
The success of the pilot centers on the number and types of the groups, the individual participants, and the records selected to participate in the pilot. A phased approach to the pilot permits the number of individuals involved, locations, and groups to be expanded over time, introducing further complexities to the system and those involved with its management.
Choice of groups/areas of the agency to participate
Begin by identifying those business processes and functions to be part of your pilot project. Including several departments and offices in your pilot will ensure that problems encountered in one (such as an increased workload that does not permit staff to devote the time needed to the pilot project) will not bring the pilot to an immediate halt. Anomalies as to types of file formats used/not used (or accessed) will be lessened by involving several areas of the agency in your pilot.
To assess how offices outside of the headquarters location—possibly with limited records management and IT support staff available on-site—are able to utilize the system, you will want to include them in a second phase of the pilot. This will allow you to resolve initial technical difficulties (during the first phase of the pilot) before encountering new complications.
Understanding what electronic records are produced where—and which are mission critical—will help you to determine the areas of the agency to include in your pilot. Concentrating on the areas that are strategically significant to the agency (as well as those that rely heavily on records to accomplish their work) will bolster your case for ERM. Identifying other parts of the agency with related business processes (i.e., with which these groups routinely share records) will allow the pilot project team to map business processes and create a shared file plan for the pilot.
Areas of the agency that have participated in other technology-driven pilot projects will be accustomed to the pilot process. These "early adopters" are likely to be more receptive to change—open to new technology/ways of working—than others. By soliciting the participation of leaders in the agency (i.e., role models for others), you make agency-wide rollout of the system easier to execute. Others may come to view ERM as a status project and want to be part of the early phases of the system rollout.
Choice of individual participants for the pilot
Pilot project participants should represent a range of job types as well, from administrative support to senior management. All levels of aptitude using computers should be represented in your pilot so that you can gauge on-going requirements for support and additional training. Participants representing active, moderate, and infrequent need for accessing records will also improve your pilot's results:
- Heavy users will become more skilled using the system than occasional users, but they will be more demanding of the software
- Infrequent users will require a different set of features from the system to make their work easier.
Records to be included
Another way to delineate the scope of the project is through the records included in (or excluded from) your pilot. You should determine the level, type, and number of records required to adequately test the ERM system and processes. The number of records in your pilot should reach a "critical mass" yet remain small enough to be manageable, and the types of records should be diverse enough to test the system requirements and the goals of the pilot.
- You may choose to include all types of records or only specific records series (e.g., Administrative memos, Contracts, Policy memos, Complaints), but all types of files (documents, spreadsheets, diagrams, including e-mail as relevant supporting documentation to the complete official record) should be part of the pilot so as to test the system fully.
- If versioning is required, or if levels of access to your agency's records must be controlled (e.g., for classified material), these system functions will need adequate testing. Agencies should select a method for permitting access to records subject to Privacy Act restrictions that does not put a burden on administrators. For instance, Access Control Lists established for each office would have to be revised whenever reorganizations occur and as managers rotate in and out of positions.
The records chosen for the pilot project should represent the range of formats and versions (draft vs. final), as well as the security clearances of individuals utilizing the system. As group-level files begin to be used more consistently, the issue of ownership is bound to arise. Standardized file plans make the searching of co-workers' files possible; users need to be reassured that they remain the owners of their files and that they may limit permission for access. Decisions as to who has the authority to change records (or the data describing records) and who has rights to view, check out, and file records should be documented.
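As a minimal sketch of how such access decisions might be recorded (the group names, permissions, and user IDs below are assumptions for illustration), rights can be assigned to roles or groups rather than to named individuals, so that reorganizations and staff rotations require updating only group membership:

```python
# Minimal sketch of group-based access control for an ERM file plan.
# Group names, members, and permissions are illustrative assumptions.
GROUP_PERMISSIONS = {
    "records_managers":   {"view", "file", "check_out", "change_metadata", "apply_disposition"},
    "program_staff":      {"view", "file", "check_out"},
    "privacy_act_admins": {"view", "file", "check_out", "change_metadata"},
}

GROUP_MEMBERS = {
    "records_managers":   {"rm01"},
    "program_staff":      {"user17", "user23"},
    "privacy_act_admins": {"admin02"},
}

def can(user, action, restricted_to=None):
    """True if the user belongs to a group that holds the action.
    If a record is restricted (e.g., Privacy Act), only the named groups qualify."""
    for group, members in GROUP_MEMBERS.items():
        if user in members and action in GROUP_PERMISSIONS[group]:
            if restricted_to is None or group in restricted_to:
                return True
    return False

print(can("user17", "view"))                                        # True
print(can("user17", "view", restricted_to={"privacy_act_admins"}))  # False
```

Documenting the model in this form also gives administrators a single place to update when offices are reorganized, rather than editing lists attached to individual records.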
Groups invited to participate in the pilot project should be told of the importance of ERM to the agency's operation and the type of input they will have on the design process. People will want assurance that their input can make a difference. Participants should understand that their active participation is vital to the success of the pilot, as well as the importance of the pilot to the agency.
The size of your pilot project will affect the duration required to assess the efficacy of the system and the procedures you have created. More problems are likely to be encountered with larger databases than smaller ones. The more users participating in your pilot, the more time will be required for training, the more problems encountered that will require time to resolve, and the more user support that will be required.
Duration of the pilot
A simple pilot project that can be implemented without major difficulties takes at least three months to conduct, but pre-planning activities (including training) and post-pilot evaluation make a six-month timeframe for an ERM pilot desirable. Developing a solid administrative infrastructure will help you complete the pilot within the timeframe you have established, supplying sufficient data and analysis to make an informed decision to deploy the system agency-wide.
3.1.5 Setting up your administrative infrastructure (management team) and writing the pilot work plan
Before your pilot project begins, you should establish a team and create a work plan that addresses three key aspects:
- Records management protocols
- System design, test, integration, and operation
- Support and training.
The pilot project manager will have both a records management and a technical (IT) lead for the project. Each will form a team (including network and desktop engineers, network/software support personnel, security engineers, trainers, administrators, and records managers) able to address the concerns within their individual domains, updating one another continually as progress is made. The degree to which there is overlap (in areas such as pilot project communications, help desks, and training) will be clarified in the pilot work plan. Figure 2 denotes some of those pre-pilot planning activities being led by records managers (RMs).
Figure 2. Pre-pilot planning activities of RMs [3]
Issues and concerns to be dealt with by IT during this pre-pilot planning stage are in Figure 3.
Figure 3. Pre-pilot planning activities to be addressed by IT [4]
Your team should develop a work plan that describes how the pilot will be conducted and completed. To create a pilot project work plan specific to ERM, employ the project methodologies established in your own agency. Elements of the work plan include a description of roles and responsibilities for the team and participants in the pilot, a schedule and milestones (Gantt chart), stakeholder involvement, and management procedures. A model ERM pilot project work plan can be found on the State of Michigan Department of History, Arts and Libraries Web site within its RMA project grant proposal ( http://www.michigan.gov/hal/0,1607,7-160-18835_18894_25946-62996--,00.html ).
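Purely as an illustrative sketch (the tasks, leads, and dates below are assumptions, not drawn from the Michigan plan), the core elements of such a work plan can be captured in a simple structure that makes milestones and accountability explicit:

```python
from datetime import date

# Hypothetical pilot work plan entries: (task, responsible lead, start, finish, milestone?).
WORK_PLAN = [
    ("Install and configure ERM software on test server",  "IT lead",     date(2005, 9, 1),  date(2005, 9, 30),  True),
    ("Develop pilot file plan with participating offices", "RM lead",     date(2005, 9, 15), date(2005, 10, 15), True),
    ("Train Super Users and pilot participants",           "Trainer",     date(2005, 10, 1), date(2005, 10, 31), False),
    ("Conduct pilot operations",                           "Project mgr", date(2005, 11, 1), date(2006, 1, 31),  False),
    ("Final evaluation report",                            "Project mgr", date(2006, 2, 1),  date(2006, 2, 28),  True),
]

# Print a simple schedule summary in start-date order, flagging milestones.
for task, lead, start, finish, milestone in sorted(WORK_PLAN, key=lambda entry: entry[2]):
    flag = " [MILESTONE]" if milestone else ""
    print(f"{start} to {finish}  {lead:<12} {task}{flag}")
```

However the plan is recorded, each task should name a responsible lead and a target date so that slippage is visible early.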
Procedures and plans should be developed for dealing with issues that arise during the pilot, including:
- Communications and knowledge transfer: How will key information, techniques, and best practices be communicated to the team, pilot project participants, and beyond?
- Business process reengineering/change management strategy: While some business process reengineering will have occurred as a result of your functionality study to determine agency-unique ERM requirements, the pilot undoubtedly will uncover additional workflow issues that need to be addressed. How will your team make the pilot project participants comfortable with the changes?
- Baseline, interim, and final evaluation studies
- Training (of technical staff, records managers, and users).
A method for documenting pilot progress and performance of the system—analysis and assessment—should be created. Measurements, reporting structures, and accountability for each task should be documented. Feedback can be used to resolve system issues as well as procedures (for support, training, communications, etc.). Properly structured and employed, feedback from participants will inform the pilot, allowing the team to make incremental changes to the system and adjust the process to better suit the needs of the agency.
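One concrete way to capture such measurements and feedback (a minimal sketch only; the fields and categories below are assumptions rather than a prescribed format) is a structured problem and feedback log that the team reviews at its regular meetings and analyzes during evaluation:

```python
import csv
from dataclasses import dataclass, asdict
from datetime import date
from typing import Optional

@dataclass
class LogEntry:
    """One problem or feedback item recorded during the pilot (illustrative fields only)."""
    reported: date
    reporter: str           # participant or team member
    category: str           # e.g., "software", "training", "procedure", "infrastructure"
    description: str
    resolution: str = ""
    resolved: Optional[date] = None

LOG = [
    LogEntry(date(2005, 11, 3), "user17", "software",
             "Save-as dialog slow to display the file plan",
             "Vendor patch applied", date(2005, 11, 10)),
    LogEntry(date(2005, 11, 8), "rm01", "procedure",
             "Unclear which office files shared contract records"),
]

# Persist the log so trends by category can be analyzed during the evaluation phase.
with open("pilot_problem_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(LogEntry.__dataclass_fields__))
    writer.writeheader()
    for entry in LOG:
        writer.writerow(asdict(entry))
```

A log kept in this way also feeds the problem log and monitoring system described later in this section.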
Regularly scheduled meetings of the pilot project team—in-person, telephonic, or virtual—should include a selected subset of pilot project participants (Points of Contact or Super Users). This guarantees the valuable input of users for all decisions made by the team.
Once the groups targeted for the pilot have been identified, it is the job of the project management team to solicit their participation. Potential participants should be made to realize the extent of the commitment necessary to make the pilot a success. Enlist the assistance of agency management, as necessary, to underscore the importance of the project. This can extend to having a senior manager sign the invitation letter.
3.1.6 Minimizing risks associated with pilot projects
An agency should acknowledge that some prudent risk-taking is necessary when it comes to adopting new technology and changing business processes, as an ERM system surely will require. To minimize the risks associated with a pilot launch, the project team should:
- Establish clear performance objectives and evaluation criteria
- Involve and continually encourage pilot project participants to use the system
- Perform prototype work sessions with the software before customizing it
- Finalize system design
- Develop quality acceptance methodology
- Expand the pilot through incremental rollout to other areas of the agency and inclusion of other record formats
- Assure that pilot's requirements are measurable and clearly understood by participants.
Enumerating problems that the project team is likely to encounter—identifying possible ways in which to avoid or promptly address those situations—will minimize disruptions during the pilot, allowing you to maintain the schedule you have developed for your pilot project. To better prepare for these eventualities:
- Reviewing similar projects will help to identify potential problems that you may encounter as you begin your ERM pilot.
- Conducting pre-planning brainstorming exercises with your team can help you anticipate the challenges ahead.
For each potential problem, develop a contingency plan for what you will do if the problem occurs. This "best management" practice will increase the governance body's confidence in your team's ability to successfully implement ERM agency-wide. The following illustrate successful strategies for dealing with problems frequently encountered:
- Agencies often encounter resistance to changing work processes as ERM is introduced. One e-mail pilot project found that introducing new hires at the beginning of employment to the importance of good records management was the best strategy for conquering resistance to change regarding ERM.
- A version of the software should be up and running for use by the pilot project team before roll-out to the first group of pilot participants. Selecting individuals to train and work with the software during this pre-pilot phase will develop a cadre of Super Users who can serve as liaisons with the groups targeted as initial pilot project participants. When the quality of this pre-pilot phase is deemed acceptable, you can formally launch your ERM pilot.
- Managing users' expectations throughout the pilot will minimize the risk of pilot failure. This can be achieved, in part, through user training and constant communication with pilot project participants. Establishing communication vehicles for the rest of your agency (e.g., a "public" view of your pilot project Web site or online newsletter), keeping staff apprised of the progress being made vis-à-vis ERM, helps to remind people that the project is ongoing. This will make deployment in their area easier if the solution is adopted agency-wide.
A pilot project is a way for an agency to test and refine a new system with a production database before committing significant financial and human resources to full-scale implementation. It is an opportunity to address problems that present themselves to a small group of pilot test users, learning from mistakes before they have an impact on the entire agency. As such, its purpose is to reduce risks and save investment dollars, serving to validate initial estimates made to the governance body in terms of human and capital requirements for the IT project.
Certain critical decisions need to be made and documented before the pilot begins. This can only be accomplished by reviewing similar projects, determining whether any additional data is required before proceeding, and considering which performance data need to be collected through the pilot to enable meaningful evaluation. Specific elements necessary for the conduct of a pilot project include:
- A pilot monitoring system that consists of service level requirements for the pilot (e.g., data load, update, refresh) and a problem log noting any disruptions in service that occur during the pilot, including what was done to address each situation. A problem log not only documents the decision made in a given instance; it also restates the problem in more general terms, providing guidance for possible procedural change.
- A determination as to whether significant changes to the agency IT infrastructure will be required to execute the pilot, including the acquisition and installation of new hardware or modifications to the network.
- Availability of knowledgeable application developers (programmers) and systems analysts to work on the ERM project. Once you have assessed the capacity of technical support staff to monitor performance and troubleshoot during the pilot project, you will know how much outside support you will need. This assistance can be secured from the ERM vendor, from an outside contractor, or by hiring additional staff.
- You may wish to limit calls to your vendor from users by channeling them through key individuals on your pilot project team. This can serve to develop your in-house staff capabilities in supporting the software, documenting the types of questions that are arising among pilot project participants, assuring that the vendor addresses only software-specific problems.
- The pilot will help you develop a support system for offices where on-site records management and/or technical support is limited. This can be accomplished remotely, from headquarters or another office, or with the assistance of a local contractor. Support can include a user manual (made available online) supplemented by a help desk that can be reached by phone or e-mail. Listserves or Communities of Practice (CoP) will permit pilot project participants to help one another.
- Availability of analysts to identify and test potential business process improvements and measure their impact on the agency as well as budget analysts to accurately assess pilot costs and adjust predicted estimates for full-scale implementation.
- Tools facilitating documentation, communication/knowledge transfer, and metadata processes (and automated categorization) should be established for your pilot. These will help all involved in the pilot monitor what is happening and how it affects their work. A variety of methods should be employed: Intranet Web page including FAQs; listserve and/or CoP; in-person, telephonic, and/or virtual (online) meetings.
- Training is essential for all involved in the pilot project. You may need to reinforce agency staff's understanding of basic records management by:
  - Defining "What is a Record?"
  - Determining when a record should be captured
  - Explaining the differences that arise when dealing with electronic records
  - Providing guidance for who is responsible for entering e-records into the system (e.g., the originator of an e-mail, the recipient, or both)
  - Explaining how ERM will affect the work of those involved in the pilot project.
Your vendor may offer adequate training of core project team members, teaching IT personnel, records managers, and those charged with training others how to use the ERM system and customize the software. The agency's ERM trainer(s) can then offer pilot project participants an introductory workshop designed to familiarize participants with the basics of using the software, employing examples from the types of records most likely to be encountered by those in the class. These workshops can become the foundation for a new computer-based training (CBT) module for records management training.
The small-group classroom trainings (i.e., classes with up to 10 individuals) should be followed up by one-on-one sessions conducted at the users' workstations. These on-site visits can address any questions participants may have felt uncomfortable raising in front of the group. The hands-on approach, with users sitting at their own workstations, can address differences among computer settings.
Frequent visits to participants (to help with the cultural adjustment required by ERM and to encourage the use of the software) may not be possible. The State of Michigan (2002) designated a Super User within each office who served as a liaison between the office and the project team. Super Users were responsible for encouraging their co-workers to use the software, helping their co-workers learn the advanced features, and sharing the concerns of their co-workers with the project team. Advanced training was provided to Super Users and they met with the project team on a regular basis to discuss the features of the software and potential business process improvements that could be derived from using the software. These insights will be helpful as enterprise-wide training programs are developed in coordination with full-scale implementation.
Evaluation is perhaps the most important part of the pilot project. A carefully constructed pilot project will make provision for objective analysis of the results and an assessment as to how to proceed with full deployment. The evaluation team for one agency pilot project for e-mail ERM identified five categories of performance measures:
- Installation: Time to install on the network, test, and install on user workstations
- Training: Ready availability of training; keeping users well informed about training opportunities; providing assistance in registering for training; conducting well-organized and understandable training sessions; follow-up after training
- Usage: Streamlined procedures and the use of templates; meetings to increase comfort levels of users and to develop work-specific file plans; application effectiveness/user satisfaction; privacy/security issues adequately addressed
- Knowledge: Increased level of knowledge of RM after pilot
- Communication: Sharing lessons learned beyond the pilot.
Your evaluation of the system, with recommendations for further customization (either by internal staff, the vendor, or outside contractor), should be accompanied by evaluation of the processes and procedures for ERM as they evolved during the pilot and recommendations for their improvement prior to full-scale implementation. Additional suggestions for enhancements in training also are part of this comprehensive evaluation. Usability testing during the pilot will guide the team in modifying the interface and instructions prior to agency-wide system deployment.
The mechanisms designed into the project to monitor the progress of the pilot will inform the evaluation. These include:
- Communications/knowledge transfer mechanisms that you have set up for your pilot project, serving as a source for valuable feedback necessary for adequate analysis.
- Questions posed to help desks, as well as postings to the pilot's listserve or CoP, themed (by type of question posed) and analyzed (in terms of participant/unit and type of record/format involved).
- Minutes of telephone and Web-based conferences with pilot participants, as well as technical team meetings, providing additional input for the evaluation.
A formal approach to quantitative and qualitative analysis of the pilot project should be built into the pilot project plan. The methodologies employed can include a mix of surveys and interviews with participants conducted periodically, including the following (a minimal sketch of a baseline-versus-final comparison appears after the list):
- An initial baseline analysis will help you to understand the concerns of participants, giving you an opportunity to address them through pilot trainings and any communications mechanisms you established for the pilot.
- Interim assessments can evaluate the effectiveness of particular aspects of the pilot (e.g., training workshops). These can gauge changes in usage of the system (increasingly frequent usage with less time required per session) and user satisfaction (as the pilot team responds to requests from participants to modify the system/procedures).
- A final evaluation demonstrates the effects of ERM on business processes and indicates changes to be made before ERM is deployed agency-wide.
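For illustration only (the questions, rating scale, and responses below are assumptions), such a comparison could be summarized along these lines:

```python
from statistics import mean

# Hypothetical 5-point scale responses (1 = strongly disagree, 5 = strongly agree)
# to the same questions asked at baseline and again in the final evaluation.
SURVEYS = {
    "I can find the records I need quickly": {
        "baseline": [2, 3, 2, 3, 2],
        "final":    [4, 4, 3, 5, 4],
    },
    "Filing records takes too much of my time": {
        "baseline": [4, 4, 5, 3, 4],
        "final":    [3, 2, 3, 3, 2],
    },
}

for question, responses in SURVEYS.items():
    before, after = mean(responses["baseline"]), mean(responses["final"])
    print(f"{question}: baseline {before:.1f} -> final {after:.1f} (change {after - before:+.1f})")
```

Asking identical questions at each stage is what makes the comparison meaningful; interview findings can then be used to explain the changes the numbers reveal.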
Evidence that the concept was proven can be found in the repository of records created using the ERM system. The final evaluation report should detail how well the solution met agency functional, technical, and management expectations. In addition to technical recommendations made with regard to the solution, your final evaluation report should contain suggestions for improving the management procedures used during the pilot. These changes will facilitate deployment of the system agency-wide. A "Lessons Learned" section appended to the evaluation should be made available to those involved in future pilot projects.
Examples of baseline analysis for the State of Michigan Department of History, Arts and Libraries RMA pilot project can be found at http://www.mi.gov/documents/hal_mhc_rm_basequal_72425_7.pdf ; a final evaluation report can be accessed at http://www.michigan.gov/documents/hal_mhc_rm_finaleval_72433_7.pdf .
A pilot project provides agency staff with experience using an ERM system and, if the evaluation is favorable, supports approval to go ahead with full implementation. Agencies conducting an ERM pilot reduce their investment risk. The outcome will be:
- Better-trained staff in terms of records management processes and understanding as to the importance of ERM to the agency
- Well-developed technical, managerial, and production procedures
- An improved implementation plan
- Revised cost estimates and a realistic schedule for agency-wide deployment
- Support of management and users.
The adoption of standardized file plans and naming conventions will make information easier to search, locate, and use no matter what individual created it, broadening access and encouraging more regular use. This will allow managers to review ongoing work among staff and projects, tracking project progress. (Bikson, Baseline Qualitative Study, p. 8)
Implementing ERM should result in savings of time (due to increased productivity of staff) and space (e.g., reduced purchases of filing cabinets and less use of off-site storage). Additionally, there will be reduced duplication of saved material now that version-controlled information resides in shared stores. If smaller, specialized systems existing within the agency are replaced by the enterprise-wide ERM solution, the agency will save on the maintenance of those older systems. Based on its pilot, the State of Michigan Department of History, Arts and Libraries calculated savings from business process improvements achieved through ERM (Figure 4).
Figure 4. Business Process Improvements Achieved by the State of Michigan Department of History, Arts and Libraries Records Management Application Pilot Project [5]
Lessons learned by those involved with ERM pilot projects in the past can be grouped into three topic areas: Users and usage, Implementation, and Technology.
Users and usage
- Users want to be involved (e.g., in the ERM solution selection process, file plan development, and the strategy for agency-wide deployment of ERM), but may not have the time to devote to the project. Designating a Point of Contact (POC) within each group selected to participate in the pilot can keep the pilot project team aware of what is going on with the users, and keep the users involved in the decisions made by the pilot project team.
- While users want to be involved in policy decisions, they do not want to have to constantly think about ERM. Simplify file plans, simplify and automate organizational forms, use templates, and consider rule-based auto-categorization to minimize daily decision-making (a minimal sketch of rule-based categorization follows this list).
- Staff reacts differently to change. Long-time staff may resist change, but those whose work requires extensive information handling may be more accepting of ERM than others.
- Allow users to shape the software and associated procedures to the business processes, and accommodate user-generated innovations into the system. Users are afraid of losing control (e.g., the ability to add/remove files from the file plan). For those projects experiencing significant reluctance to adapt to new processes, participant acceptance can be improved by being responsive to requests for change, where possible. Examples from ERM pilots include making retention codes visible alongside the title of each file; relaxing the file rules to include non-records that users might need for business purposes; and adding transitory files for storing electronic documents participants want to keep for 60 days.
- Individuals need to see the difference ERM makes in their daily routine tasks, but this takes time. Usage of the software grows through the peer pressure associated with business process improvements. Individuals should see advantages in their own work if the implementation effort is to succeed. (Bikson, Baseline Qualitative Study, p. 13)
- Users won't use the system until they see benefit; won't see benefit until they use the system. Pre-sell the system by relating benefits to everyday tasks/routine work of staff. Find incentives for use; disincentives for avoidance. As time progresses and team-based work groups become more prevalent, reluctant staff will have to use ERM to retrieve records generated by others and pertinent to their work.
- Management support for the project influences the degree to which staff will utilize the system: While there may be strong support from senior management for ERM, there should be specific "continuing and visible support from the top for this particular pilot project during the trial period." (Bikson & Eveland, p. 14)
- It takes time to adjust to using a search engine as a retrieval tool instead of navigating file plans.
- Use the results of the pilot to respond to unrealistic expectations of users.
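As a purely illustrative sketch of the rule-based auto-categorization mentioned above (the rules, categories, and sample subjects are assumptions), simple keyword patterns can file routine records automatically so that users rarely have to choose a category themselves:

```python
import re

# Hypothetical mapping of patterns to file plan categories; order matters (first match wins).
RULES = [
    (re.compile(r"\binvoice|purchase order\b", re.I), "Financial/Procurement"),
    (re.compile(r"\bcontract\b", re.I),               "Contracts"),
    (re.compile(r"\bcomplaint\b", re.I),              "Complaints"),
]
DEFAULT_CATEGORY = "Unfiled/Needs review"  # routed to a records manager for manual filing

def categorize(subject: str) -> str:
    """Return the file plan category for a record based on its subject line."""
    for pattern, category in RULES:
        if pattern.search(subject):
            return category
    return DEFAULT_CATEGORY

for subject in ["Purchase order 1123 for desktop scanners",
                "Complaint received 3 May regarding permit delay",
                "Meeting notes, records management working group"]:
    print(f"{subject!r} -> {categorize(subject)}")
```

Rules of this kind should be developed with the records managers who own the file plan, and the "needs review" default keeps uncertain items visible rather than misfiled.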
Implementation
- The best way to learn is to keep ERM pilot projects simple. Don't try to test more variables than a pilot project can handle well.
- Pilot project teams need to have a solid understanding about what the software can deliver and how it works, making certain that it functions properly during pre-pilot testing before involving users.
- Make use of the pre-pilot period to prepare groups selected for participation in the pilot. Review the importance of RM to the agency and the differences between paper and electronic records management. Use this time to develop your POCs or Super Users by including them in training and testing of the system before the formal pilot launch.
- Determine the information that is important to capture, and automate as much of the process of documenting it as possible at the outset.
- Work the plan, but restructure the pilot if the situation warrants.
- Allot sufficient time for the pilot so that the system is fully tested and the team can evaluate user receptivity prior to agency-wide deployment.
- Incremental, phased rollout of the pilot will allow the project team to manage the process more effectively.
- Make multiple avenues to learning and help available to pilot project participants, noting that:
- Training is a substantial cost item.
- ERM software requires technical training that needs to be reinforced throughout the pilot and beyond.
- Introductory training sessions followed up by individualized coaching at participant workstations are most effective. Additional learning opportunities focusing on RM concepts and methods are desirable.
- Providing both "pull" and "push" options for support to pilot project participants allows users the option to learn in the manner most suited to them. Examples of "pull" items include loading user manuals and maintaining FAQs on an Intranet or Web site. Pilot project teams can be proactive in providing help through unsolicited calls to see if participants need further assistance, for example.
- Having a robust help system in place is essential to successful implementation.
Technology [6]
- Use of thin client architecture will allow IT staff to deploy the software quickly and easily, with no need to customize the desktop.
- Avoid macros and integrations with software that require extensive modification of individual desktops. Macros are unreliable and change frequently, and the software installed on desktops varies by department and location. Each new version will threaten the connectivity of the macro or integration.
- Integrate the product at the operating system level. Operating systems upgrade to new versions more slowly than desktop applications, and there are fewer of them to integrate with.
- Develop a robust Web-based product that works the same way a client server version of the product would work. Client-server architecture is difficult to deploy.
- Make the ERM software appear invisible to the user. Allow the ERM server and file plan to look like another local drive and directory that the user accesses when saving and opening documents. Let the user perform the "save as" or "open" function, see the ERM drive, and navigate through their file plan to the desired file. This will boost user acceptance, and it will reduce the amount of training required.
- When selecting any new software product, do not ask a vendor if their product is capable of performing a particular task, because the answer is almost always "yes." Ask the vendor to demonstrate exactly how the product performs the task, and analyze the demonstration from the perspective of the typical user. Remember the bottom line: the user is the most important person affected by a new product.
- When selecting new ERM software, be diligent in researching the viability of ERM products. The corporate acquisition process can cause high volatility in the vendor market, with some products losing support for continued innovation post-acquisition. Integration of product lines may delay promised product modification and improvement.
Pilots are particularly useful for complex projects such as ERM, where the software is new (either to the market or to the agency), the implementation represents a significant change in the way staff works (operating procedures), and user acceptance may be difficult to obtain. A pilot provides the practical experience necessary before introducing an ERM solution agency-wide. It allows an agency to test the system design in a controlled, real-world environment with actual production users participating as they perform their normal routines and tasks. This permits the project team to validate the system's design as well as modify the procedures it will employ before implementing enterprise-wide ERM (Blackberry pilot program framework, 2004). Participants are able to review and adjust business processes while interacting with the system, providing evidence of individual and organizational benefits to using ERM. Based on the results of pilot tests, teams can adjust initial estimates of the human and capital resources required to successfully deploy the solution enterprise-wide.
Activities related to pilot projects can be divided into three distinct phases:
Preliminary activities include defining the purpose, goals, objectives, and scope of the pilot/proof of concept demonstration project, establishing metrics for measuring the outcomes of the pilot, outlining the benefits of conducting a pilot and risks of not doing so, and establishing an administrative infrastructure to support and guide pilot project activities.
- Scope issues concern the choice of groups to participate in the ERM pilot and the correct mix of records (e.g., level, type, format) to yield a sufficient number of record transactions that adequately tests the functionality of the system and your agency's ability to make effective use of ERM
- Administrative issues include the selection of a pilot project team and development of a work plan that documents reporting structures and accountabilities for each task (e.g., training).
Conduct of the pilot will determine whether preliminary decisions and assumptions made regarding hardware and software performance, as well as service level required by technical staff, were accurate. Tools facilitating documentation, communication/knowledge transfer, and metadata processes will have to be developed and used during the pilot.
The test and evaluation phase assesses hardware and software, system and database design, and procedures employed during the pilot. These management procedures include training, scheduling, system management, and maintenance.
To be successful, a pilot needs the support of management, adequate funding, and experienced and well-trained staff (for managing the pilot and evaluating the system's potential for use in the agency). A successfully executed ERM pilot project will result in:
- Better-trained staff in terms of records management processes and understanding as to the importance of ERM to the agency
- Well-developed technical, managerial, and production procedures
- An improved implementation plan
- Revised cost estimates and a realistic schedule for agency-wide deployment
- Support of management and users.
Appendix: Resources for Conducting a Pilot Project
As noted in section 2 of the text, agencies should refer to OMB policies in OMB Circular A-11 Part 7, Planning, Budgeting, Acquisition, and Management of Capital Assets and OMB Circular A-130, Management of Federal Information Resources, and other OMB guidance available at http://www.whitehouse.gov/omb/memoranda/index.html.
While not necessarily limited to ERM, the following resources will be helpful to those responsible for planning, conducting, and/or evaluating pilot projects. A Records Management Application (RMA) case study was issued as a final report for a NARA National Historical Publications and Records Commission Grant to the State of Michigan Department of History, Arts and Libraries. The original proposal and evaluations conducted for the Michigan project are included in this resource listing.
Asbury, S. (n.d.). How to implement a successful AM/FM pilot project. Retrieved August 22, 2005 from http://www.byers.com/spatialage/technicalinformation/whitepapers/papers/sp_papers_pilot.html
Bikson, T.K. (n.d.). Records management application pilot project: Baseline qualitative study. Retrieved August 22, 2005 from http://www.mi.gov/documents/hal_mhc_rm_basequal_72425_7.pdf
Bikson, T.K. & Eveland, J.D. (n.d.). Records management application pilot project: Final evaluation. Retrieved August 22, 2005 from http://www.michigan.gov/documents/hal_mhc_rm_finaleval_72433_7.pdf
Bikson, T.K. (n.d.). Records management application pilot project: Interim qualitative study. Retrieved August 22, 2005 from http://www.michigan.gov/documents/hal_mhc_rm_interqual_72434_7.pdf
California Environmental Protection Agency. (2000, June 20). Final model pilot project work plan. Retrieved August 22, 2005 from the CA/EPA Environmental Management System Project Web site: http://www.calepa.ca.gov/EMS/Archives/wkpl0600.htm
National Archives (UK). (2001, August). Electronic records management: Framework for strategic planning and implementation, Version 1.0. Retrieved August 22, 2005 from http://www.nationalarchives.gov.uk/electronicrecords/advice/pdf/framework.pdf
Patterson, G. & Sprehe, T. (2002). "Principle challenges facing electronic records management in federal agencies today." Published in Government Information Quarterly 19 (2002) and retrieved August 22, 2005 from http://www.jtsprehe.com/challenges.htm
Research in Motion Ltd. (2004). Blackberry pilot program framework. Retrieved August 22, 2005 from http://www.blackberry.com/knowledgecenterpublic/livelink.exe/fetch/2000/7979/278390/BlackBerry_Pilot_Program_Framework.pdf?nodeid=644920&vernum=0
Research in Motion Ltd. (2004). Planning to implement Blackberry. Retrieved August 22, 2005 from http://wp.bitpipe.com/resource/org_990469824_534/Planning_to_Implement_BlackBerry_Bitpipe.pdf?site_cd=tmc
South Carolina Department of Archives and History. (2002, May). Electronic records program development grant proposal. Retrieved August 22, 2005 from http://www.state.sc.us/scdah/ergrant2003toc.htm
State of Michigan Department of History, Arts and Libraries. (Updated 2003, September 9). RMA project grant proposal. Retrieved August 29, 2005 from http://www.michigan.gov/hal/0,1607,7-160-18835_18894_25946-62996--,00.html
State of Michigan Department of History, Arts and Libraries. (2002, December 30). Records management application pilot project: Final report for National Historical Publications and Records Commission grant #2000-059. Retrieved August 22, 2005 from http://www.michigan.gov/documents/HAL_MHC_RM_Final_Report_63480_7.pdf
[1] See http://www.whitehouse.gov/omb/circulars/a11/current_year/a11_toc.html
[2] See http://www.whitehouse.gov/omb/circulars/a130/a130trans4.pdf
[3] Based on Blackberry pilot program framework (2004); National Archives (UK) (2001), Electronic records management: Framework for strategic planning and implementation, Version 1.0; Research in Motion Ltd. (2004), Planning to implement Blackberry; State of Michigan Department of History, Arts and Libraries (2002), Records management application pilot project: Final report for National Historical Publications and Records Commission grant #2000-059.
[4] Based on Blackberry pilot program framework (2004); National Archives (UK) (2001), Electronic records management: Framework for strategic planning and implementation, Version 1.0; Research in Motion Ltd. (2004), Planning to implement Blackberry; State of Michigan Department of History, Arts and Libraries (2002), Records management application pilot project: Final report for National Historical Publications and Records Commission grant #2000-059.
[5] This is an abbreviated version of the table presenting savings from business process improvements achieved through ERM as calculated by the State of Michigan Department of History, Arts and Libraries. The complete table can be found in Records management application pilot project: Final report for National Historical Publications and Records Commission grant #2000-059.
[6] Technology Lessons Learned 1-7 are based on State of Michigan Department of History, Arts and Libraries (2002), Records management application pilot project: Final report for National Historical Publications and Records Commission grant #2000-059, pp. 17-18.