ARCHIVED - Follow-up Report: 2002 Audit of Partnerships



It is a recognized good practice to perform a follow-up two years after the tabling date of an audit report. Consequently, a follow-up of the 2002 Partnerships Audit Report was added to Internal Audit's Plan for this year. The Plan was approved by NRC's Senior Executive Committee on July 24, 2004.

Follow-up work is performed as a review engagement. This is a moderate level of assurance which is normally limited to enquiry, analysis and discussion. When such work is conducted, internal audit provides a lower level of assurance than, for example, an audit.

Executive Summary

In 2002, NRC's internal audit performed an audit of partnerships – collaborative and fee-for-service agreements – at NRC's corporate level and at five institutes: Herzberg Institute of Astrophysics (HIA), Institute for Aerospace Research (IAR), Institute for Information Technology (IIT), Industrial Materials Institute (IMI) and Plant Biotechnology Institute (PBI). It is good practice to perform a follow-up two years after the tabling of an audit report. Therefore, in 2005, we reviewed the progress that the five NRC institutes had made in implementing the recommendations in our 2002 audit.

In its 2002 audit report, Internal Audit concluded that, in general, partnerships at NRC were well managed. However, the audit also found deficiencies in both the performance indicators for evaluating the results of collaborations for performance-reporting purposes, and in the way in which this information was reported.

Our follow-up found that some NRC institutes had generally improved with respect to the management of partnerships since 2002. For example, NRC institutes are now conducting environmental scans and business analyses through a documented process that clearly identifies needs, opportunities and risks. Institutes are also either carrying out, or planning to carry out, systematic follow-ups with clients when projects are completed. One of the institutes selected, IMI, already had all these processes in place when we carried out our 2002 audit.

Despite these improvements, our follow-up also found that more work is needed to improve NRC's accountability process, for instance, through better assessments of what partnerships have actually achieved. More specifically, the definitions for performance indicators need clarification, and better mechanisms are needed for determining the outcomes and impacts of research arrangements.

1.0 Introduction

1.1 – In October 2002, we carried out an audit covering partnership practices at HIA, IAR, IIT, IMI and PBI for collaborative and fee-for-service agreements. At that time, we reported that, in general, partnerships in the NRC were well-managed. However, significant deficiencies were found with regard to the indicators for evaluating the results of collaborations. We also found that the way in which performance information was being reported compromised its usefulness. This follow-up provides a status report on the extent to which NRC institutes have improved with respect to these deficiencies.

1.2 – Collaborative and fee-for-service agreements account for an important part of NRC's research activities. According to performance information that NRC's Corporate Services prepared, the total value of collaborations for 2003-04 was about $500 million. Appendix A defines what is meant by partnerships and describes various types of partnership agreements.

What we found in 2002

1.3 – With regard to improved practices, the audit found that:

  • IMI had intelligence-gathering mechanisms in place. We noted that other institutes might find these mechanisms useful as tools for strengthening their leadership role by identifying the needs of their clients, responding to those needs, identifying opportunities in weaker technological areas, and minimizing risk.
  • Penalty clauses for failing to deliver key components of a collaboration in a timely manner should be considered for contracts - whether for procurement, or agreements with partners - when NRC depends heavily on the client.
  • As is the case at IMI, institutes should have formal mechanisms in place to follow-up with their clients once a project has been completed.
  • It would be beneficial for the institutes to establish mechanisms that would enable them to determine the outcomes and impacts of research arrangements on an ongoing basis, without disclosing the nature of any intellectual property.
  • Negotiations with partners should address the use of these mechanisms and provide for an acceptable means of including them in contractual arrangements.

What was planned to take place after 2002

1.4 – In their responses to the 2002 report, the NRC Institutes that we audited agreed to implement the following initiatives:

  • To simplify corporate performance indicators with greater emphasis on meeting their own performance needs;
  • To share practices, over and above the examples provided by the audit team, with regard to gathering intelligence for planning purposes;
  • To incorporate penalty clauses in contracts where risk warrants; and
  • To exercise caution and careful consideration before negotiating requirements with partners to ensure that they provide information on the outcomes and impacts of collaborations. (The audit team intended that this would be done in the longer term, but that work in this area should begin now.)

2.0 -- Generally Accepted Framework used for the Audit

2.1 – Before carrying out the 2002 Audit of Partnerships, it was necessary to establish expectations against which to assess partnerships. In our case, this consisted of a framework for managing partnerships.

2.2 – The Treasury Board Secretariat had developed the most applicable framework that we found. This framework is entitled The Federal Government as 'Partner': Six Steps to Successful Collaboration. The 2002 audit built the following model based on this document.

Figure 1: Partnership Management Model

Partnership Management

Information on Potential Partners
(Collaborate by Choice not by Chance)

Identify and Analyze Requirements, Opportunities and Risks
(Check the Legalities and Definitions)

Consult with Specialists
(Legal Services, ASPM Footnote1, Corporate Services, Finance)

Plan and Prepare Agreements
(Maximize the Value of the Collaboration & Hold the Partners Accountable)

Achieve the Best Results with Sound Management Practices
(Program Management and Project Management)

Assess the Results of Working with Partners
(Post Project Assessments, Monitoring Results)

2.3 – Our audit criteria reflect the six steps in this model. A complete list of the criteria used during the 2002 audit is included in Appendix B. In addition to detailed interviews with the five Institute Directors General, interviews with institute and corporate staff, and a review of documentation, a randomly drawn sample of five partnership agreements was examined for the follow-up audit.

2.4 – Our findings listed in the following sections are discussed under the six headings that correspond to those of the Partnership Management Model.

3.0 -- Information on Potential Partners

2002 Audit Recommendation: Other Institutes should consider establishing methods of intelligence gathering such as those used by IMI and IAR in order to take a stronger leadership role by identifying the needs of their clients, responding to those needs, identifying opportunities in weaker technological areas and minimizing risk.

3.1 – We noted many cases in 2002 in which the partner with whom the work had been done had initiated the research work. We had expected to find that the NRC institutes had exercised a stronger leadership role by identifying the needs of their clients, responding to those needs, identifying opportunities in weaker technological areas and minimizing risk.

3.2 – As we will explain below, the follow-up found that all the institutes that we looked at in the audit had taken a stronger leadership position.

3.3 – The strategic direction of an NRC institute is determined through its ongoing intelligence-gathering activities, which are aimed at finding the best partners to team with and at identifying leading-edge technologies. This can be done by creating clusters – special groups consisting of suppliers, manufacturers, universities and other research centres. This process for gathering intelligence was in place at IMI in 2002. During the follow-up we found that IIT and PBI now have such a process, and that other NRC institutes, such as IAR, are moving toward adopting it.

3.4 – Unlike other NRC institutes, HIA's mandate is not of an industrial nature. It is to fulfill specific needs of the astronomy community as identified in its Long Range Plan. We will discuss intelligence gathering for this Institute below.

2002 Audit Recommendation: HIA should communicate its industrial strategy to NRC headquarters in order to ensure all parties have consistent expectations with regards to HIA commercialisation efforts.

3.5 – HIA: The institute has a long-term plan based on environmental scans and consultation with the Canadian astronomy community. However, after the corporate call for increased commercialization effort on the part of all NRC institutes, HIA lacked a commercial focus. Accordingly, we concluded in 2002 that HIA had to communicate its industrial strategy to NRC headquarters in order to ensure all parties had consistent expectations about the Institute's commercialization efforts.

3.6 – NRC is going through a governance-renewal exercise, the results of which will not be known before this follow-up has been completed. Because the renewal exercise is still in progress, it is at this point uncertain whether the across-the-board call for commercialization from all NRC institutes will continue. After the Council's strategic direction has been confirmed, HIA plans to update its strategic plan in the coming year. Nevertheless, HIA continues to gather intelligence to support its commercialization efforts, although these efforts are still at a very early stage.

2002 Audit Recommendation: IIT is encouraged to complete its work on a strategic plan, including a thorough intelligence gathering process and the identification of key technology areas of the future.

3.7 – IIT: During the 2002 audit, we encouraged IIT to complete its strategic plan, which was under development at the time. This planning exercise includes a comprehensive intelligence-gathering process similar to the one that IMI is using, and it involves creating a number of technology clusters or special interest groups. IIT is continuing to develop a strategic plan, which is expected to be completed by October 2005.

4.0 -- Identify and Analyze Requirements, Opportunities and Risks

2002 Audit Recommendation: We recommend that penalty clauses be considered for all contracts, whether procurement or agreements with partners, when there is a major dependency on the other party.

4.1 – In 2002, we found that some NRC collaboration/fee-for-service agreements did not include a penalty clause for late delivery. In one case, in the absence of a penalty clause, NRC was responsible for paying a late-delivery penalty to a third party for a delay that had originated with a contractor. We recommended that penalty clauses be considered for all contracts when there is a major dependency on another party.

4.2 – In general, our review of five randomly selected collaboration/fee-for-service agreements for each NRC institute visited during the 2002 audit showed that each of those institutes has acted on our recommendation.

5.0 -- Consult with Specialists

5.1 – In 2002, we concluded that all NRC institutes which we examined had appropriate processes in place with regard to consultation within the institute and with corporate branches and services.

6.0 -- Plan and Prepare Agreements

2002 Audit Recommendation: All agreements should provide mechanisms to resolve disputes and for clear deliverables through which project closure can be determined.

6.1 – We found in 2002 that not all agreements in our sample of partnerships included a mechanism for resolving disputes among partners. We also noted that not all agreements required a report to provide closure on the work done. Since both are considered best practices in managing partnerships, we recommended that agreements include dispute-resolution clauses. We also recommended that agreements specify clear deliverables through which project closure can be determined.

6.2 – In general, our 2005 follow-up found that all five NRC institutes that we audited in 2002 are now including both a dispute resolution clause (when warranted), and work plans leading to a final report. This finding is based on a random selection of five collaboration/fee-for-service agreements.

7.0 -- Achieve the best Results with Sound Management Practices

2002 Audit Recommendation: Institutes should have formal mechanisms in place to follow up with their clients once a project is completed.

7.1 – During the 2002 audit, we found that most partnership agreements had no formal requirements to follow up with partners on the results of the collaborations or services rendered.

7.2 – We also found that one institute (IMI) did have a formal process for following up with its clients. At the conclusion of a project, IMI sends out a client satisfaction survey. The responses are entered into a database, which allows the Institute to assess its overall results on a periodic basis. Consequently, we recommended that other NRC institutes should have formal mechanisms for following up with clients once a project has been completed.

7.3 – We found during our follow-up that all selected NRC institutes either plan to implement, or are working on implementing, a formal process for following up with their clients. For IIT and IAR, this process will be similar to IMI's. PBI will use exit interviews between clients and the institute's BDO. Because of its unique situation, HIA will focus on reviewing research publications.

2002 Audit Recommendation: The Business Relations Office's Intellectual Property information system should be developed as quickly as possible; and, the development team should address the question of a possible link to Sigma.

7.4 – Follow-up on Licences: In 2002 we found instances in which institutes had issued licences without ensuring that formal follow-up procedures were in place to collect any royalties due.

7.5 – We also found that the Business Relations Office Intellectual Property (IP) information system could help to address this problem. NRC institutes could use it as a central source of information on all patents and licences. The system was under development in 2002 and was not yet operational at the time of our follow-up; it is now at the proposal-review phase. If development work proceeds as planned, NRC staff tell us that it should be operational by early 2006. Plans for the system include a direct link to NRC's financial system, thus helping to integrate the entire IP (i.e., royalties) process.

7.6 – For institutes that, for reasons of proprietary information, are looking for an alternative to this system, we found another solution in use at IMI. The Institute monitors its IP, including royalties, through an institute database. IMI staff tell us that since the database was completed about one year ago, it has generated increased revenues amounting to hundreds of thousands of dollars. IMI also told us that this money has been reinvested in research.

7.7 – According to IMI staff, creating an institute database has yielded additional benefits: better support to staff in their decision-making process; better monitoring of patents, royalties and awards; and a better definition of roles and responsibilities among researchers, business development officers, the Institute's Administrative Services and NRC's Business Relations Office.

7.8 – Program Management: NRC institutes sometimes become involved in arrangements with partners that either require different groups within an institute to participate, or involve more than one institute. Such arrangements entail a very different management process from that required to manage a single project with dedicated resources. In 2002, our limited review indicated that program management was not an area of concern at NRC.

8.0 -- Assess the Results of Working with Partners

2002 Audit Recommendation: Institutes should review their projects to ensure that the value provided in performance indicators for collaborations and fee-for-service is accurate.

2002 Audit recommendation: Performance indicators should be reviewed to improve related definitions and, if necessary, provide additional categories for items that do not meet the definition developed (e.g. research collaboration vs. collaborations not involving research with partners).

Performance Indicators

8.1 – In 2002 we assessed the adequacy of the performance indicators related to collaborations and fee-for-service arrangements. The indicators for collaboration were the total number of agreements and the dollar value at the domestic and international level. The indicator for fee-for-service work was the number of clients. We found IMI information with respect to these indicators to be accurate. However, we found errors in the other institutes.

8.2 – Overall, we found during the follow-up, that progress in implementing these two recommendations has been slow since our 2002 audit.

Accuracy of performance values

8.3 – The 2002 audit indicated that inaccuracies in dollar values and the number of agreements occurred largely because the definitions used by corporate and each NRC institute were unclear, and because an error-prone manual system was used.

8.4 – Corporate Services is responsible for using the system to roll up performance information for all of NRC. Corporate Services plans to develop a computerized system for this task, which would replace the present error-prone system. Corporate Services also plans to clarify with NRC institutes the definitions used for the provision of performance information.

8.5 – Since the causes of these inaccuracies had not been completely dealt with at the time of the follow-up, we neither performed an accuracy test nor reported on the values provided for performance indicators (e.g., collaborations and fee-for-service).

Improvement of performance definitions

8.6 – During the 2005 follow-up, NRC institutes and corporate staff told us about their plans to clarify definitions and to better categorize the performance information provided. However no timeline has been set at this time to complete these tasks. As noted above, the clarification of definitions and categories is linked to the accuracy of performance information.

8.7 – We found during the follow-up that some progress has been made toward creating new categories for items that do not fit current definitions. However, NRC staff acknowledge that much work remains to be done on improving those definitions and providing additional categories before the performance data can be considered more accurate.

Evaluation of Success

2002 Audit Recommendation: It would be beneficial for the institutes, in cooperation with the Policy, Planning and Assessment Group, to establish mechanisms for the ongoing determination of outcomes and impacts of their research arrangements, without disclosing the nature of any intellectual property.

2002 Audit Recommendation: Negotiations with partners should address the application of such mechanisms and acceptable means of applying them should be included in contractual arrangements.

8.8 – In 2002 we found that NRC institutes were obtaining information on the overall success of their programs largely from cyclical evaluations done by the Planning and Performance Management group of Corporate Services. While formal cyclical evaluations are valuable, institutes would be better served by having continuous and more timely performance information available to them on the success of their programs. Providing this information would entail monitoring the success of projects on a continuous – rather than a cyclical – basis.

8.9 – In the 2002 audit, we did not find any instances in which institutes had approached partners to determine the impacts of earlier projects. It should be possible for institutes both to establish clear and measurable objectives for projects and to establish mechanisms with their partners for monitoring success against those objectives.

8.10 – Overall, we found as a result of the follow-up that work on these two recommendations is progressing slowly.

Continuously determining the outcomes and impacts

8.11 – In its March 2004 report on the management of research at the NRC, the Auditor General noted that implementing a performance management framework throughout the Council would be a complex undertaking, extending over a considerable period of time. However, the report also noted the need to establish "clear and concrete targets for the results measured by (NRC's) key performance indicators."

8.12 – NRC has a performance management framework in place and is in the initial stage of building an automated system for gathering performance data (as noted in the section above). It also recently cooperated with a few NRC institutes (on a pilot basis) to implement the framework. However, as NRC corporate staff told us, these efforts are still at a very early stage, and further development will require time and leadership.

Mechanisms for determining outcomes and impacts

8.13 – Projects are linked to the institutes' strategic plans. In turn, these link to NRC's Vision. As noted earlier, it should be possible to establish clear and measurable objectives for projects. The best approach to measuring results for projects in terms of these objectives would be for institutes and their partners to establish mechanisms for monitoring the success for these projects. However, institutes are telling us that their partners would resist including such a mechanism in contractual arrangements, because they feel that information related to intellectual property must be protected.

8.14 – Other approaches to measuring results (individually or in combination) would need further study. Institutes could use existing mechanisms to strengthen communication with clients on the impacts of agreements. For example:

  • IP databases (e.g., for monitoring sales volumes) could serve as a proxy for performance; such information is often used to compute royalties.
  • Follow-up questionnaires at the end of projects could assess, for instance, clients' performance expectations of NRC on projects.
  • Cluster group meetings could gather feedback at a more aggregated level (e.g., inquiring generally about NRC's performance).

8.15 – During our follow-up, we also found that PBI is trying to assess, on a pilot basis, the impacts of its work on specific industry sectors. Other NRC institutes might also want to pursue this approach.

9.0 -- Conclusion

9.1 – When we performed our 2002 audit on partnerships, most corporate-level guidance was provided at a high level through NRC's Vision. However, additional stewardship through a corporate strategic plan is considered a good practice by organizations such as the Office of the Auditor General. Currently, NRC is developing a corporate strategic plan. Developing such a plan should enable corporate and the NRC institutes to set expectations against which to measure results. It should also help institute staff to establish clearer and measurable objectives for projects.

9.2 – Our 2002 audit was based on a six-step process from the Treasury Board for managing successful collaborations. This process starts with clearly identifying needs, opportunities, risks and expectations for the activity sectors monitored by each NRC institute. It ends with assessing the results achieved in light of those expectations. It is suggested that this be done at the corporate, institute and (if possible) project levels. This process represents what the Auditor General's Office calls the accountability loop.

9.3 – During the 2002 audit, we found that NRC's institutes were, in general, managing partnership agreements well. However, the accountability framework (e.g., identification of needs and expectations, and valuation of collaborations for performance-reporting purposes) needed improvement. As noted above, in 2002, NRC institutes had the Vision to 2006 to work with. By the end of 2005, more guidance will be available for NRC institutes through an NRC corporate strategic plan. This new information will help to improve NRC's accountability process. It will also help NRC institutes to define expectations and to measure the extent to which they have been met.

9.4 – During the follow-up, we found that some NRC institutes had generally improved their management of partnerships since our 2002 audit. For example, with the exception of IMI, which was already doing so in 2002, NRC institutes are conducting environmental scans and business analyses, through a documented process, to clearly identify needs, opportunities and risks. In addition, they are awaiting finalization of the corporate strategic plan in order to align their own plans with it. They have generally increased their legal protection against risks through dispute-resolution and penalty clauses. They are also performing, or planning (again, with the exception of IMI, which was already doing this), systematic follow-ups with clients at the end of projects.

9.5 – However, we also found that more work is needed to improve NRC's accountability process, primarily through better assessing the results of working with partners. For instance, institutes should:

  • review performance indicators to improve related definitions and continue to provide additional categories for items that do not meet the definition developed; and
  • establish mechanisms for continuously determining the outcomes and impacts of research arrangements.

Appendix A -- NRC Partnerships Profiles


A.1 – In its paper entitled Impediments to Partnering and the Role of Treasury Board, dated May 13, 1998, the Treasury Board defines partnering as:

"An arrangement between two or more entities that enables them to work co-operatively towards shared or compatible objectives and in which there is some degree of shared authority and responsibility, joint investment of resources, shared risk taking and mutual benefit." Footnote2

A.2 – The Treasury Board also indicates that partnering arrangements may be consultative (share information), contributory (share financial and other support), operational (share work), or collaborative (share decision-making). These categories are cumulative, rather than mutually exclusive. An arrangement that can be characterized by one of the above terms usually exhibits the preceding characteristics as well.


A.3 – The Introduction to Collaboration and Services Agreement section of contract material provided by NRC's Legal Services states "In accordance with NRC policies, a services agreement is a research agreement where NRC is reimbursed by the other party for 100% of NRC's costs in the research project, whether the party does or does not conduct itself any research. Any other research agreement is a collaboration, including situations where there is no exchange of monies and situations where NRC is doing all the research but is only reimbursed for a portion of its research costs." However, as per the NRC Pricing Policy, which provides more detail to guide NRC in its interpretation of partnership agreements, full cost recovery is not the only factor in distinguishing between service and collaborative agreements. Consequently, full cost recovery is not an element of every service agreement.

A.4 – Our work has allowed us to identify the following types of partnerships in the NRC. Some of these arrangements have been a significant part of NRC operations for a long time. Others are more recent. Figure 2 below illustrates this.

Figure 2: NRC Arrangement History

A.5 – Descriptions of the various types of agreements utilized by the NRC are as follows:

Main Agreements

Fee-For-Service: Through this mechanism, the NRC keeps its laboratories and facilities open to Canadians who need them but could not afford their own. Fees are charged to recover the NRC's costs. These arrangements may or may not include the transfer of intellectual property.

Collaborative Agreements define the type and conditions of collaboration between the NRC and one or more parties on a specific subject. Each partner shares objectives, information, decision-making and the resulting benefits of joint research and contributes to the cost of the research. The latter can be dealt with through a sharing of resources (in-kind) or through a dollar contribution to the work.

A Spin-Off Agreement occurs when an employee creates a company, in which he or she has a principal financial interest, to exploit technology from the NRC.

Incubation Agreements are similar to Co-location Agreements (see below) but provide for specific collaborative efforts between the NRC and the partner to develop technology or knowledge of a technology for use in creating a new company.

Licensing Agreements allow other parties to use NRC intellectual property. Intellectual property includes any rights resulting from intellectual activity in the industrial, scientific, technological, literary, or artistic fields, including all intellectual creations legally protected through patents, copyright, industrial design, integrated circuit topography, and plant breeders' rights, or subject to protection under the law as trade secrets and confidential information.

Related or Stand Alone Agreements

Co-location Agreement: This occurs when an existing organisation wishes to co-locate in order to access NRC expertise and/or facilities for the purposes of developing technology that it could not develop without NRC cooperation. In addition to requirements for use of the space, contracts include health and safety, security and environmental requirements.

Equipment Use Agreements allow partners to use equipment on NRC premises.

A Contribution Arrangement is one through which an outside organisation provides funding to the NRC for a specific event that does not include research work.


A.6 – The NRC uses the following types of arrangements to facilitate its relationships with its partners:

Incubators provide access to NRC facilities and expertise for the purpose of developing technology or knowledge of a technology to be used in the creation of a new company (see Incubation Agreements).

Institute Technology Cluster: similar to a technology cluster (see below) but applied at the institute level. Group members identify requirements and directions for pre-competitive research, which may lead to Collaborative Agreements.

International Collaborations: efforts on behalf of the NRC to develop new capabilities that would not be possible through domestic activities alone. Strategic S&T information, intelligence, and technology foresight gathered through arrangements are used to enhance knowledge and technology, to transfer S&T information to Canadian firms and universities and to leverage innovation opportunities for Canadian industry internationally.

Joint Research occurs when a partner agrees to work with the NRC and possible other partners, on a specific research topic (see Collaborative Agreements).

Memorandum of Understanding (MOU) – see Strategic Alliances.

Multi-client Group consists of a grouping of many clients on a specific research topic of common interest (see Collaborative Agreements).

Spin-Ins occur when individuals who are not employees of the NRC wish to create a new company but require access to the NRC for the purposes of developing technology to be used in the new business (see Incubators and Incubation Agreements).

Spin-Offs are situations where an employee creates a company, in which he/she has a principal financial interest, for the purpose of exploiting technology from the NRC. The NRC may provide ongoing support with the technology (see Collaborative Agreements).

Spin-Outs occur when organisations complete their work in NRC incubators and are ready to proceed on their own. The NRC may provide ongoing support with the technology (see Collaborative Agreements).

Strategic Alliances are created when organizations with a common interest in a technological field agree to share information and concerns relating to that field. These may be dealt with in a master agreement but do not involve any monetary arrangements (e.g., MOU, Umbrella Agreement). Collaborative Agreements may occur later through specific agreements or as an annex to the master agreement.

Technology Clusters: an effort by the NRC to work closely with Canadian communities to help develop their innovative capacity in key technology fields through jointly formulated long-term strategies. Collectively the partners define technology requirements and directions.

Umbrella Agreement – see Strategic Alliances.

Appendix B -- Criteria Used During the 2002 Audit


Audit Criteria

We expected that partnerships would be founded on:

  • Environmental scan, business analysis and consultation;
  • The NRC Vision;
  • Institute mission, mandate and its strategic plan and outlook documents;
  • The capacity of the institute to do the work and of any existing or potential partners or collaborators to absorb the technology; and
  • A sound proposal, including a risk/benefit analysis.

We also expected that partnership arrangements used:

  • Are the most relevant for the requirements of the situation (e.g., spin-in/incubation versus collaboration);
  • Are legal and not a "true partnership";
  • Benefit Canadians;
  • Contain proper accountability arrangements; and
  • Provide for sound management practices.


Audit Criteria

We expected that arrangements would be based on the essential characteristics of:

  • Compatible objectives;
  • Clearly identified critical success factors;
  • All parties contributing resources;
  • A fair allocation of risk-taking among all parties; and
  • Sharing of benefits amongst all parties.


Audit Criteria

We expected that institutes would have consulted with appropriate functional specialists in the NRC with regard to issues concerning:

  • Contracting practices;
  • Intellectual property and licensing;
  • Return on investment;
  • Access to Information and Privacy; and
  • Security.


Audit Criteria

We expected that, prior to entering into an agreement, an analysis or plan would have been prepared that considers:

  • The partner's needs;
  • Market factors affecting the partner;
  • Associated risks;
  • Legal and security issues;
  • Existing and new competition for both the partner and the institute; and
  • Workplace issues such as space, working conditions, etc.

We also expected that provisions would be made in arrangements to ensure:

  • Clearly specified roles and responsibilities of the parties;
  • Performance expectations balanced with the capacity to deliver;
  • A well-defined structure to manage the arrangement;
  • An appropriate monitoring regime whereby the NRC can assess whether the arrangement is accomplishing what is expected;
  • Mechanisms for resolving any disputes among partners; and
  • Reasonable procedures to deal with non-performance in aspects of the arrangement.


Audit Criteria

Project Management

We expected that projects would:

  • Be managed in a manner sensitive to risk, complexity and resources of projects;
  • Have well defined objectives within an accountability framework;
  • Be approved in accordance with project approval requirements;
  • Have a comprehensive and coordinated definition of the overall scope of the project; and
  • Be adequately resourced by institutes.

We also expected that there would be processes in place:

  • That are consistent with the knowledge areas of the Project Management Institute;
  • To assess project risk and develop contingency plans;
  • To protect intellectual property developed during the project; and
  • To properly close a project, including ensuring that:
    1. all work has been recorded correctly;
    2. project documentation and administrative matters have been completed including matters such as invoices, payments and receipt of revenue;
    3. follow-up with the partner; and
    4. any necessary clean up or restoration.

Audit Criteria

Program Management

Program managers should ensure that:

  • A partnership arrangement is consistent with the strategic requirements of the institute prior to assigning any resources to it;
  • The right resources are identified or will be made available in the future to meet the requirements of a project;
  • Resources are available at the proper time to meet the requirements of a project; and
  • Technical and timing requirements are communicated to staff involved in a project.


Audit Criteria

We expected that provisions would be made in arrangements to ensure:

  • Appropriate evaluation of the success of the arrangement; and
  • An appropriate audit regime is clearly defined, where appropriate (e.g., licensing agreements).

We also expected:

  • That institutes would have processes in place to monitor the results of partnership arrangements subsequent to the completion of agreements and projects; and
  • That the information should be based on:
    1. common definitions of the types of partnerships; and
    2. uniform practices for quantifying the partnerships (e.g., number of partners versus number of agreements, and fee-for-service for partner organisations versus fee-for-service for non-partners).

Appendix C -- Acronyms

ASPM – Administrative Services and Property Management

BDO – Business Development Officer

HIA – Herzberg Institute of Astrophysics

IAR – Institute for Aerospace Research

IIT – Institute for Information Technology

IMI – Industrial Materials Institute

MOU – Memorandum of Understanding

NRC – National Research Council

PBI – Plant Biotechnology Institute

S&T – Science and Technology


Footnote 1

Administrative Services and Property Management

Return to footnote 1 referrer

Footnote 2

Drawn by the Treasury Board from J. David Wright and Alti Rodal, "Partnerships and Alliances," in New Public Management and Public Administration in Canada (Eds. M. Charih and A. Daniels), Institute of Public Administration of Canada and University of Quebec, 1997, pp. 266-267.

Return to footnote 2 referrer
