ARCHIVED - Evaluation of the Digital Technology Adoption Pilot Program


Final Report

Prepared for:
National Research Council
Prepared by:
Goss Gilroy Inc.
Management Consultants
Suite 900, 150 Metcalfe Street
Ottawa, ON K2P 1P1
Tel: (613) 230-5577
Fax: (613) 235-9592
E-mail: ggi@ggi.ca

May 24, 2013

Executive Summary

Purpose and Methodology

As part of the Government’s commitment to developing a Digital Economy Strategy, Budget 2011 announced measures to accelerate the adoption of digital technologies, build digital skills and showcase Canada’s digital technologies. This included the development of the Digital Technology Adoption Pilot Program (DTAPP). The objective of DTAPP is to accelerate the adoption of digital technologies in small- and medium-sized enterprises (SMEs) operating in Canada so they can boost their productivity and create economic growth and opportunity. The National Research Council (NRC) Industrial Research Assistance Program (IRAP) was selected to deliver DTAPP from November 14, 2011 to March 31, 2014.

An evaluation of DTAPP was undertaken to:

  • update NRC senior executives and managers on the early outcomes of the Program and provide them with information that may contribute to program improvements;
  • gather information to inform policy decisions related to the adoption of digital technologies by SMEs and to inform future program design; and
  • address the funding requirement, called for in the Treasury Board Submission, that an evaluation be carried out in fiscal year (FY) 2012-13.

The evaluation covered the period of 2011-12 to 2012-13. With the Program having only been recently implemented at the time of the evaluation, a process-oriented evaluation was conducted. Issues of relevance and performance, as per the Treasury Board Policy on Evaluation (2009), were also addressed by the evaluation. The evaluation was led by the NRC Office of Audit and Evaluation with the assistance of Goss Gilroy Inc. The evaluation methods, implemented between November 2012 and February 2013, included: key informant interviews; a survey of clients; a document and literature review; a review of administrative and performance data; and a special study designed to assess program design and delivery.

Profile of DTAPP

To support the achievement of its objectives, DTAPP has three main components:

  1. Supporting and increasing the adoption of digital technology in SMEs. The Program provides advisory services to SMEs and provides contribution funding to eligible SMEs for digital technology adoption projects, as well as to colleges and other organizations to assist SMEs in digital technology adoption.
  2. Improving the understanding of the link between adopting digital technology and productivity. This component involves data collection on barriers and strategies to technology adoption and the derivation of learnings and best practices.
  3. Raising awareness of the benefits and importance of adopting digital technology. The third element of DTAPP is raising awareness of DTAPP and promoting the benefits of digital technology uptake by Canadian SMEs.

DTAPP is administered as a standalone Program by NRC-IRAP. Between 2011-12 and 2013-14, NRC-IRAP was allocated $76.5M to deliver DTAPP.

Evaluation Findings

Program Relevance

The evaluation demonstrates that there is a productivity gap between Canadian SMEs and SMEs in other countries, and that the adoption of digital technology is one way to address this. While there are other ways to increase productivity, programs already exist in Canada to address those strategies. Some regional programs exist that address digital technology challenges in SMEs; however, DTAPP addresses this need at a national level. Evidence indicates that SMEs face barriers to adoption, requiring both financial assistance and advisory support.

In terms of the appropriateness of the Program design to meet the needs of SMEs experiencing productivity challenges, the evidence reviewed as part of this evaluation suggests a need for both financial assistance and advisory services in SMEs’ adoption of digital technology. Organizations and colleges were also found to play an important role in supporting SMEs in digital technology adoption. While opinions about the value of colleges were mixed, colleges saw themselves as having a role to play in training SMEs and providing referrals.

DTAPP is consistent with current government priorities to promote SME digital technology adoption to enhance productivity. The evaluation also confirmed the federal government’s role in raising awareness among SMEs regarding the benefits and importance of adopting digital technologies and, to a lesser extent, in supporting colleges to build and promote their digital technology capacity. Evidence to support a federal role in collecting information on barriers and strategies for SME technology adoption was mixed. Those from outside NRC-IRAP believed there is a role to be played by the federal government, whereas those inside NRC-IRAP voiced concern about this role, usually in the context of the perceived reporting burden on Industrial Technology Advisors (ITAs) and firms. Finally, the evaluation found that NRC-IRAP is an appropriate delivery agent for DTAPP, due primarily to the existing networks of ITAs, ongoing interactions with SMEs, ITAs’ technological backgrounds and NRC-IRAP’s national presence, among other reasons.

Program Performance - Reach

DTAPP has been successful in reaching its intended clients and delivering the planned program activities. The evaluation found evidence that DTAPP undertook a number of activities that raised awareness of the Program among SMEs, organizations and colleges. As planned, outreach activities declined over time to allow for a transition from awareness of the Program to awareness of the benefits of adopting digital technology. At the time of the evaluation, these outreach activities had resulted in over 600 engagements with firms. Financial assistance had been provided to close to 400 of these firms in support of a digital technology adoption project. As well, funding to 34 colleges and 31 organizations contributed to the provision of various types of services for SMEs, ranging from awareness sessions to diagnostic assessments. ITAs were found to have provided advice to SMEs, most commonly technical in nature. ITAs also facilitated linkages for SMEs and provided referrals, most commonly to other ITAs and technical resources.

In terms of improving the understanding of the link between adopting digital technology and productivity, the evaluation found that the Program has collected data on barriers and strategies related to digital technology adoption and productivity. While it is still too early to assess the linkage between digital technology and productivity, the evaluation identified challenges with the measure of productivity used by the Program (i.e., value added per employee).

Program Performance - Outcomes

DTAPP clients reported positive benefits of Program services on their ability to adopt digital technologies, with more than half having already adopted a digital technology and the remainder planning to adopt in the future. While it is too early to assess the effect of adopting digital technologies on client productivity, DTAPP clients anticipate many benefits from the adopted technologies, including lower production costs; increased productivity; improved management systems; and improved quality of products or services.

Given that DTAPP was a pilot program, the evaluation explored the adequacy of the resource levels and duration. Evidence suggests that while the pilot length was sufficient to test the Program design, it will not be long enough to truly test DTAPP’s ability to achieve its intended outcomes, since productivity growth often takes longer than the pilot length to fully materialize.

Program Performance - Design and Delivery

In terms of economy and efficiency, the evaluation found that the ability to leverage NRC-IRAP infrastructure and human resources, and the use of both DTAPP ITAs and NRC-IRAP ITAs to deliver the Program, contribute to an economic and efficient Program design. The evaluation also found that DTAPP is being delivered in an economic and efficient manner, largely because NRC-IRAP acts as the delivery agent. Clients are satisfied with the Program outputs, and both ITAs and funded organizations and colleges contribute to efficient and economic program delivery.

DTAPP has been delivered according to design, and while variations do exist in the way in which ITAs proceed through the DTAPP engagement process with firms, these differences are still in accordance with the original Program design. Two areas of program design in which changes could be made to optimize delivery included reporting and project selection criteria.

Various lessons were identified that could be taken from the development and implementation of DTAPP and applied to any additional programs that NRC-IRAP is asked to deliver. These included having knowledgeable and dedicated personnel (in this case, ITAs) already in place; allowing for adequate time to get the Program up and running; having clear communications regarding the Program and expectations to delivery personnel and partners; and ensuring the data that is collected is useful for analysis without being an undue burden.

Recommendations

  1. Based on the evidence that there is a need for the federal government to support productivity improvements in SMEs, that productivity can be improved through the adoption of digital technology and that there is no national-level program to encourage SMEs to adopt digital technologies, it is recommended that DTAPP be continued as a permanent Program.
  2. Given NRC-IRAP’s extensive network and existing connection with the SME community, its national presence and the technical expertise of ITAs, it is recommended that NRC-IRAP continue to be the delivery agent of DTAPP, should the Program be renewed.
  3. Given the challenges associated with current indicators used to measure the Program’s contribution to firm productivity (i.e., VAPE), it is recommended that DTAPP conduct a review of its performance measures to ensure that it can adequately measure the effect of digital technology adoption on productivity.
  4. In order to better understand the NRC-IRAP resources leveraged to deliver DTAPP, as well as to accurately estimate the level of operational resources required in the event that NRC-IRAP is asked to deliver additional programs, it is recommended that NRC-IRAP institute a mechanism to better track the level of effort spent on the activities carried out by field and support staff in support of each program.
  5. DTAPP should review the information that is being collected by ITAs and by clients to ensure that it is streamlined, yet meets the needs of Program management and is sufficient to address accountability requirements. When conducting a review of the DTAPP reporting process, the following considerations are suggested:
    • An effort should be made to ensure that information is collected on the advisory services provided by ITAs and on the services offered by funded organizations, along with the resulting impact on SMEs, two Program areas that are not addressed by the current reporting requirements; and
    • Increased integration with NRC-IRAP reporting requirements, including the application process.
  6. DTAPP should explore the feasibility of introducing additional criteria to select firm and organization/college projects to ensure the Program results in the greatest impact on SME productivity as is possible.

List of acronyms

Acronym Description
BDC Business Development Bank of Canada
CA Contribution agreement
CRM Client relationship management
CTO Contribution to organization
DTAPP Digital Technology Adoption Pilot Program
ERP Enterprise resource planning
FTE Full-time equivalent
FY Fiscal Year
HR Human resources
IC Industry Canada
ICT Information and Communication Technology
ICWS IRAP Client Website
IRAP Industrial Research Assistance Program
ITA Industrial Technology Advisor
NRC National Research Council
NSERC Natural Sciences and Engineering Research Council
OECD Organisation for Economic Co-operation and Development
PPA Post Project Assessment
RDA Regional Development Agency
R&D Research and development
ROI Return on Investment
SMEs Small- and medium-sized enterprises
SoF Status of Firm
TBS Treasury Board Secretariat
US United States
VAPE Value Added per Employee

1.0 Introduction

As part of the Government’s commitment to developing a Digital Economy Strategy, Budget 2011 announced measures to accelerate the adoption of digital technologies, build digital skills and showcase Canada’s digital technologies. This included the development of the Digital Technology Adoption Pilot Program (DTAPP). The objective of DTAPP is to accelerate the adoption of digital technologies in small- and medium-sized enterprises (SMEs) operating in Canada so they can boost their productivity and create economic growth and opportunity. The National Research Council (NRC) Industrial Research Assistance Program (IRAP) was selected to deliver DTAPP from November 14, 2011 to March 31, 2014.

An evaluation was undertaken to:

  • update NRC senior executives and managers on the early outcomes of the Program and provide them with information that may contribute to program improvements;
  • gather information to inform policy decisions related to the adoption of digital technologies by SMEs and to inform future program design; and
  • address the funding requirement, called for in the Treasury Board Submission, that an evaluation be carried out in fiscal year (FY) 2012-13.

The scope of the evaluation covered 2011-12 to 2012-13 and only those activities carried out by NRC-IRAP (i.e., not the activities carried out by Industry Canada (IC) in support of DTAPP, which focused on data collection). With the Program having only been recently implemented at the time of the evaluation, a process-oriented evaluation was conducted. In alignment with the Treasury Board Secretariat (TBS) Policy on Evaluation (2009), the evaluation also explored questions related to relevance and performance, including efficiency and economy.

The evaluation was led by the NRC Office of Audit and Evaluation with the assistance of Goss Gilroy Inc. The evaluation methods, implemented between November 2012 and February 2013, included:

  • Key informant interviews (primary data);
  • Survey of firms (primary data);
  • Study of program processes (both primary and secondary data);
  • Document and literature review (secondary data); and
  • Review of administrative and performance data (secondary data).

All of the methods implemented for the evaluation, as well as their limitations and challenges, are outlined in greater detail in Appendix A.

This report begins with a description of the Program (Section 2.0). An assessment of the relevance of DTAPP is then presented in Section 3.0. The extent to which the Program has achieved its early outcomes is presented in Section 4.0. Section 5.0 addresses efficiency and economy as well as the lessons learned from the DTAPP experience in terms of implementing a new program. Section 6.0 lays out management’s response to the evaluation’s recommendations and the actions that will be taken as a result of the evaluation. Appendix A presents additional details regarding the methodology and Appendix B provides a crosswalk between the evaluation questions and each of the methods. A selected bibliography is presented as Appendix C.

For qualitative lines of evidence (e.g., key informant interviews, study on processes), the following scale is used in the text of the report to indicate the relative weight of the responses for each of the respondent groups.

  • "All/almost all" - findings reflect the views and opinions of 90% or more of the respondents commenting on that particular issue;
  • "Large majority" - findings reflect the views and opinions of at least 75% but less than 90% of the respondents commenting on that particular issue;
  • "Majority/most" - findings reflect the views and opinions of at least 50% but less than 75% of the respondents commenting on that particular issue;
  • "Some" - findings reflect the views and opinions of at least 25% but less than 50% of the respondents commenting on that particular issue; and,
  • "A few" - findings reflect the views and opinions of at least two respondents but less than 25% of the respondents commenting on that particular issue.

2.0 Program Profile

2.1 Program Objectives and Activities

The 2011 federal budget announced measures to accelerate the adoption of digital technologies, build digital skills and showcase Canada’s digital technologies. These measures included the development of DTAPP. The objective of DTAPP is to accelerate the adoption of digital technologies in SMEs operating in Canada so that they can boost their productivity and create economic growth and opportunity.

Due to its extensive network of technical experts, national presence, capacity to deliver funding programs and experience working with SMEs, NRC-IRAP was selected to deliver DTAPP from November 2011 to March 31, 2014. Although it is delivered by NRC-IRAP, the objective of DTAPP differs from that of NRC-IRAP. While NRC-IRAP focuses on innovation, DTAPP focuses on productivity. Likewise, while the mandate of NRC-IRAP is to support technology development, DTAPP’s mandate is to support technology adoption. Another difference from NRC-IRAP is that DTAPP includes a strong learning component, with enhanced performance monitoring and measurement to gather information for the benefit of the broader SME community.

To support the achievement of its objectives, DTAPP has three main components: 1) supporting and increasing the adoption of digital technology in SMEs; 2) improving the understanding of the link between adopting digital technology and productivity; and 3) raising awareness of the benefits and importance of adopting digital technology. While NRC-IRAP has sole responsibility for providing advice to SMEs and financial assistance to SMEs, colleges and organizations, both NRC-IRAP and Industry Canada are responsible for collecting data and disseminating information on topics related to digital technology adoption (Footnote 1). Each of the Program components is described in greater detail below.

2.1.1 Supporting and increasing the adoption of digital technology in SMEs

In support of increasing the adoption of digital technology in SMEs, NRC-IRAP provides advisory services to SMEs and provides financial contributions to eligible SMEs for digital technology adoption projects as well as colleges and other organizations to assist SMEs in digital technology adoption. Each of these services is described below.

Provision of advisory services to SMEs: The provision of advisory services to SMEs occurs through a five-stage process, referred to as the ‘Engagement Process’. The five stages are described in Table 2.1, below.

Table 2.1: DTAPP Engagement Process

Stage Description
Stage 1 - Awareness: NRC-IRAP raises SME awareness of the benefits of adopting digital technologies to increase productivity and of how DTAPP can assist them in the adoption process. SMEs interested in DTAPP can request advisory services from NRC-IRAP Industrial Technology Advisors (ITAs). ITAs can also approach SMEs in their portfolio that they think may benefit from participating in DTAPP.
Stage 2 - Eligibility: A review of the firm's operations is conducted by the ITA to determine whether the SME is eligible to participate in DTAPP. This includes assessing the readiness of the firm's management team to adopt digital technology into their operations.
Stage 3 - Analysis: If the SME is willing to proceed with the Program, a DTAPP team is formed (which, at minimum, includes a Lead NRC-IRAP ITA and an appropriate DTAPP ITA) to conduct a diagnostic analysis. The team collects and records background information on the firm, including its history of technology adoption; its stage in the existing digital technology adoption process; a financial assessment; and a productivity assessment.
Stage 4 - Execution: The DTAPP team and the firm work together on a plan that identifies barriers to technology adoption and strategies to overcome them. During this stage, key tasks and milestones are also identified. The DTAPP team provides advice and referrals to other services and experts based on the actions that the SME needs to take to overcome the barriers to adoption and integrate the digital technology into their operations.
Stage 5 - Outcomes: The DTAPP team, with input from the firm, records the outcomes of the adoption process and lessons learned by both parties. The firm continues to record productivity and other post-engagement changes for up to three years after the end of the engagement.

Provision of financial services to SMEs: SMEs may be provided with non-repayable contributions for projects related to the adoption of digital technology to support labour costs, including those of graduates; contractor costs, including materials but excluding computer hardware and/or off-the-shelf software; travel and living expenses; feasibility and other studies; and/or training costs.

Engaging colleges and other organizations to assist SMEs: Approximately 5% to 10% of the DTAPP grants and contributions are provided to not-for-profit organizations and colleges to deliver DTAPP services to groups of SMEs or individual SMEs (e.g., increasing adoption awareness, training, workshops). Funding for colleges is also intended to facilitate the:

  • Referral of SMEs to colleges as contractors or to find qualified graduates to hire;
  • Colleges’ ability to build awareness of their digital technologies adoption expertise amongst SMEs; and/or
  • Capacity building within the colleges (e.g., integrating a Digital Technology Adoption Expert into the colleges’ industrial liaison function).

2.1.2 Improving the understanding of the link between adopting digital technology and productivity

The second component of DTAPP involves data collection on barriers and strategies to technology adoption and the derivation of learnings and best practices. NRC-IRAP is responsible for collecting information from SMEs that receive advisory services as well as from SMEs, organizations and colleges in receipt of financial assistance. NRC-IRAP must also analyze the information collected so that it can be used to raise awareness of the benefits and importance of adopting technologies.

Industry Canada also has a responsibility to collect data from businesses and individuals on topics related to digital technologies adoption as well (e.g., technologies being adopted, barriers to adoption, benefits of adoption, etc.). However, where NRC-IRAP’s data collection focuses on SMEs that received DTAPP services, Industry Canada’s focuses on businesses of all sizes in Canada, including SMEs.

2.1.3 Raising awareness of the benefits and importance of adopting digital technologies

The third element of DTAPP is raising awareness of DTAPP and promoting the benefits of digital technology uptake by Canadian SMEs. At the beginning of the Program, awareness efforts centered on the Program's services, whereas in later years the focus would shift to disseminating best practices and lessons learned on technology adoption.

2.2 Expected Results

In the short term (i.e., one to three years after Program implementation), DTAPP is expected to have supported SMEs in digital technology adoption, enabled organizations (including colleges) to support SMEs in digital technology adoption, and increased SME awareness of digital technology and its ability to address productivity challenges.

Over the medium term (i.e., more than three years after Program implementation), the expected results include increased college capacity to support digital technology solutions for SMEs, increased capacity among SMEs to adopt digital technologies and increased SME adoption of digital technologies. The ultimate expected outcome of DTAPP is increased SME productivity.

2.3 Governance, Clients and Partners

DTAPP is administered as a standalone Program by NRC-IRAP. The Vice-President of NRC-IRAP reports to NRC’s President, and has overall managerial accountability for the Program. The Vice-President is supported by five regional Executive Directors and a national office Executive Director. Together they form the Senior Leadership Team, which makes Program-wide decisions. A Director within NRC-IRAP National Office is responsible for the day-to-day management of DTAPP.

DTAPP’s primary clients are Canadian firms (incorporated, for-profit commercial entities) with 20 to 200 full-time equivalents; however, SMEs with 500 or fewer employees also qualify for support. DTAPP partners include colleges and not-for-profit organizations; those seeking to enhance their capabilities to support SMEs in digital technology adoption are also eligible for financial contributions.

2.4 Program Resources

This section describes the resources available to support the achievement of DTAPP’s objectives. NRC-IRAP was allocated $76.5M between 2011-12 and 2013-14 to deliver DTAPP. Table 2.2, below, provides an overview of the financial resources allocated to NRC-IRAP for DTAPP. In addition to the funding allocated to NRC-IRAP for DTAPP, Industry Canada was provided $3.5M to deliver its component of DTAPP (i.e., data collection and dissemination of information on topics related to technology adoption).

According to Program plans, NRC-IRAP would hire 15 term staff members to deliver DTAPP as well as leverage existing NRC-IRAP structures and resources where appropriate. The planned hires included ten DTAPP ITAs to deliver advisory services, three employees to collect and analyze performance data collected from DTAPP clients and two communications employees in support of raising awareness of DTAPP. As of the time of the evaluation, eight ITAs, two communications officers and three employees to support the data collection and analysis in the National Office had been hired.

Table 2.2: DTAPP Expenditures and Budgeted Financial Figures

Fiscal Year 2011-12: Personnel $89,154; Operating and Maintenance $320,271; Sub-total Operating $409,425; Contributions to firms $1,278,712; Contributions to organizations $998,329; Sub-total Contributions $2,277,041; TOTAL $2,686,466
Fiscal Year 2012-13: Personnel $1,691,667; Operating and Maintenance $1,050,000; Sub-total Operating $2,741,667; Contributions to firms $21,586,289; Contributions to organizations $3,333,711; Sub-total Contributions $24,920,000; TOTAL $27,661,667
Fiscal Year 2013-14: Personnel $1,758,333; Operating and Maintenance $650,000; Sub-total Operating $2,408,333; Contributions to firms $34,740,000 (distribution between firms and organizations to be determined; see Note 2); Sub-total Contributions $34,740,000; TOTAL $37,418,333
TOTAL, 3 years: Personnel $3,539,154; Operating and Maintenance $2,020,271; Sub-total Operating $5,559,425; Contributions to firms $57,605,001; Contributions to organizations $4,332,040; Sub-total Contributions $61,937,041; TOTAL $67,496,466

Source: Program Financial Data

  • Note 1: Whereas expenditures are final for FY2011-12, they are budgeted for FY2012-13 and FY2013-14.
  • Note 2: Based on available information, the planned distribution for funding to firms versus organizations in 2013-14 has yet to be determined.
  • Note 3: In FY2011-12, NRC-IRAP requested a re-profiling of $14.190M in Gs&Cs, which was denied. This funding was returned to NRC through the Monthly Forecast Review in January 2012. On November 8, 2012, a subsequent NRC-IRAP request to re-profile $8.5M from FY2011-12 to FY2013-14 was approved.
  • Note 4: In FY2011-12 NRC-IRAP returned $1,110K in Operating and $550K in Salary to NRC through the monthly forecast reviews (Jan. and Feb. 2012).
  • Note 5: Employee Benefit Plan (EBP) costs have been excluded from the operating figures.

3.0 Relevance

This section of the report addresses the dimensions of relevance including the extent to which DTAPP is: responding to a demonstrable need; appropriately designed to respond to SME challenges; aligned with government priorities; and consistent with federal roles and responsibilities.

3.1 Continued Need for the Program

In order to determine the extent to which there is a justifiable need to support SME technology adoption, the evaluation sought to assess: the extent to which there is a productivity gap in Canada and whether it can be addressed by digital technology adoption. It also investigated whether SMEs face barriers in adopting digital technologies and therefore need an assistance program.

Evaluation Question: Is there a justifiable need to support SME technology adoption to enhance their productivity through advisory services and /or financial support?

Key Findings:

  • There is a productivity gap between Canadian SMEs and SMEs in other countries, and the adoption of digital technology has been shown to be one way to increase productivity in SMEs. While there are other ways to increase productivity, programs already exist in Canada to address those strategies. Some regional programs exist that address digital technology challenges in SMEs; however, DTAPP addresses this need at a national level.
  • SMEs face various barriers to adopting digital technologies. The nature of the barriers appears to be well suited to the services offered by DTAPP (i.e., financial assistance and advisory services).

3.1.1 The Productivity Gap and Digital Technology Adoption

There is strong evidence in the literature (including Council of Canadian Academies, 2009, Arcand et al., 2008 and Arcand & Lefebvre, 2011) that the productivity growth rate has decreased in Canada since the 1980s compared to the United States (US) and other Organisation for Economic Co-operation and Development (OECD) countries. Several factors explain why Canada has lagged behind the US, including a greater concentration of SMEs in Canada, a resource-based economy and a taxation regime that was more favourable to SMEs than to large companies, resulting in a perception that productivity improvement was less pressing. Studies also show that the strong commitment of the US towards the adoption and use of digital technologies by firms over the past two decades is a key contributor to its sustained productivity growth (Council of Canadian Academies, 2009).

Indeed, many large-scale studies conducted in various countries (including Baldwin & Gellatly, 2007, de Mendonca et al., 2008, Majumdar & Chang, 2010 and McAfee, 2002) have demonstrated that digital technology adoption is positively correlated with business performance and productivity. While Canada was a leader in this area until the early 2000s, Canadian digital technology investment in 2010 amounted to only 60% of that of the US (Industry Canada, 2010).

When consulted directly about productivity challenges and the link between productivity and digital technology, DTAPP clients supported both notions. The survey results strongly reinforce the need to support SME technology adoption to enhance productivity. Over three-quarters of firms (77%) indicated that they had experienced productivity challenges in the last three years to a great or large extent (a negligible number indicated they had not experienced productivity challenges at all or had experienced them only to a small extent). This finding, coupled with the fact that 83% of clients indicated that the productivity challenges they had experienced were best addressed by digital technology to a great or large extent, suggests a strong need for DTAPP.

In addition to the adoption of digital technologies, the literature highlighted other ways to increase productivity. In particular, innovation can be a significant driver of productivity (e.g., development of new processes or new technologies) (Council of Canadian Academies, 2009, Parsons, 2011). Of course, there are existing programs to facilitate and encourage innovation among SMEs, such as NRC-IRAP and Scientific Research and Experimental Development tax credits. The literature (Rao, 2011) also suggests that supporting industries that have a positive influence on the productivity of other industries, such as biotechnology, aerospace, autos and information and communications technologies (ICT), also positively influences productivity. Again, there are existing programs that offer support to these industries (e.g., NRC and Industry Canada both offer programs in support of these industries, among others). Finally, the literature (including Arcand et al., 2008, Baldwin & Gellatly, 2007, and de Mendonca et al., 2008) also confirmed that the size of the firm and its financial capacity (e.g., to make capital investments) influence its capacity for productivity improvements. In this respect, there are many players in Canada who aim to improve the capacity, viability and overall success of SMEs (e.g., the Business Development Bank of Canada (BDC), Regional Development Agencies (RDAs), provincial governments and not-for-profit organizations). Thus, since other programs focus on these other drivers of productivity, DTAPP appears to be supporting a unique driver of productivity: digital technology adoption.

3.1.2 Barriers to Digital Technology Adoption

With the link between digital technology and productivity confirmed, it is important to understand the barriers that Canadian SMEs face in adopting these technologies. All lines of evidence confirm that SMEs face barriers to technology adoption. According to the survey alone, 87% of respondents confirmed their firm has faced barriers to technology adoption in the past. Table 3.1, below, provides an overview of the barriers faced by DTAPP firms.

Table 3.1: Barriers to Digital Technology Adoption

Barrier Type | Individual Barrier | Number of Times Barrier Selected | Percentage of TOTAL Firms | Percentage of All Selections
Human Resources | Availability of operational skills | 139 | 51% | 16%
Financial | High cost of adoption | 108 | 40% | 13%
Human Resources | Availability of adoption expertise | 99 | 36% | 12%
Organizational | Business process | 98 | 36% | 11%
Technical | Technical compatibility | 91 | 33% | 11%
Organizational | Leadership | 50 | 18% | 6%
Technical | Infrastructure | 30 | 11% | 4%
Organizational | Knowledge | 27 | 10% | 3%
Human Resources | Other human resources barrier | 23 | 8% | 3%
Financial | Poor/uncertain effect on bottom line | 21 | 8% | 2%
Financial | Other financial barrier | 14 | 5% | 2%
Technical | Other technical barrier | 12 | 4% | 1%
Organizational | Other organizational barrier | 11 | 4% | 1%
Business Environment | Lack of advice/support | 11 | 4% | 1%
Financial | High cost of funds | 9 | 3% | 1%
Organizational | Business strategy | 9 | 3% | 1%
Business Environment | Regulatory | 8 | 3% | 1%
Business Environment | Other business environment barrier | 5 | 2% | 1%

Source: DTAPP Administrative and Performance Data

In addition to the barriers reported by DTAPP firms in the Program performance data, the literature review and key informant interviews with most respondent types identified two additional barriers faced by firms: limited knowledge of the benefits of adopting digital technology and a lack of time to do so. Overall, these barriers appear well suited to the services offered by DTAPP (i.e., financial assistance and advisory services). The nature of the support required by SMEs to adopt digital technologies is addressed in greater depth in the subsequent section.

3.2 Appropriateness of Program Design to Meet Needs

In order to assess the extent to which the Program’s design was appropriate to meet the needs of SMEs, the evaluation explored: the types of assistance required by SMEs to adopt digital technologies; and the value added by organizations and colleges in meeting SME needs.

Evaluation Question: To what extent is the Program design appropriate to meet the needs of SMEs experiencing productivity challenges?

Key Findings:

  • The evidence reviewed as part of this evaluation suggests a need for both financial assistance and advisory services in SMEs’ adoption of digital technology.
  • Most interviewees felt that organizations and colleges play an important role in supporting SMEs in digital technology adoption. While opinions about the value of colleges’ role were mixed, colleges saw themselves as having a role to play in training SMEs and providing referrals.

3.2.1 Nature of Support Required by SMEs to Adopt Digital Technology

According to the survey, 98% of clients indicated that financial support is important for the adoption of digital technologies and 81% said that advice is important. Interviewees from NRC also indicated that advisory and financial services were equally needed. Most respondents from funded organizations and external stakeholders (such as RDAs, BDC and associations representing colleges) cited advisory services as being most needed, while some respondents from this group felt that both advisory and financial services were needed.

The literature reviewed also provides evidence of a need for government support to assist SMEs in digital technology adoption. According to the literature, government support is needed to: increase the purchase of digital technologies; increase staff training; provide low-cost subsidies to develop digital technology capacity; support a shift of focus toward the export/global market; and develop SME-specific digital technologies and distribution strategies. While DTAPP does not directly fund the purchase of digital technologies, it does support staff training, the development of digital technology capacity and the development of specific digital technologies.

3.2.2 Role of Funded Organizations and Colleges in Meeting SME Needs

DTAPP provides financial contributions to organizations and colleges to support SMEs in various ways, such as by raising adoption awareness and providing training and workshops. The evaluation explored the value of this involvement through interviews with all respondent groups. With respect to the value of funded organizations, most interviewees from all groups indicated that funded organizations and colleges bring value to DTAPP and, more directly, to SMEs. The most commonly mentioned value attributed to funded organizations was technical expertise, although awareness raising, knowledge of technology adoption, training and access to networks were also mentioned.

When asked about the value of funded colleges specifically, opinions were more mixed but still generally positive. Approximately half of NRC-IRAP management, ITA and federal partner respondents, as well as most DTAPP working group members and funded organization respondents (including those from colleges), saw the value of colleges in terms of technical expertise, training and the provision of an objective opinion. NRC-IRAP management respondents who were unsure about the value of colleges noted that faculty release time was a barrier to colleges being able to meet SME needs. A few others (including NRC-IRAP management and ITAs) believed that knowledge of technology adoption was an area in which colleges were less strong.

Funded colleges interviewed as part of the evaluation felt that they were able to best support SMEs in technology adoption through SME training. A few also mentioned that colleges can offer value through providing referrals to other organizations that can provide advice to SMEs or that can assist SMEs in adopting digital technologies. While almost all college respondents felt that their colleges already had the necessary capacity to play these roles, a few acknowledged that this capacity was not present in every college. Findings were mixed regarding whether SMEs are aware of the available capacity within colleges to help them in the adoption of digital technology. Respondents suggested a variety of ideas regarding what might be needed to build awareness among SMEs, including greater industry engagement, demystification of the concept of applied research, third parties devoted to increasing awareness and closer alignment between colleges and industry.

3.3 Alignment with Government Priorities

Evaluation Question: To what extent is DTAPP consistent with current government priorities?

Key Findings:

  • Evidence from the evaluation indicates that DTAPP is consistent with current government priorities to promote SME digital technology investments to achieve productivity enhancements.

Evidence was found in the literature and documents reviewed, as well as in interviews with NRC-IRAP senior management, to suggest that DTAPP is consistent with current government priorities to promote SME digital technology investments. In particular, the program is aligned with the 2007 Science and Technology Strategy to address Canada’s low uptake of digital technologies and consequent productivity decline. NRC-IRAP senior management echoed this, most commonly citing increasing Canada’s productivity and the innovation agenda as the priorities with which DTAPP is aligned. Furthermore, federal budgets and Speeches from the Throne (e.g., those from 2010 and 2011) have identified digital technology investment and Canada’s low productivity rate as top priorities to be addressed, notably through additional support and funding under the planned Digital Economy Strategy.

3.4 Role of Government

To understand the role and responsibility of federal government as it relates to DTAPP’s suite of programming, the evaluation explored: 1) the extent to which the federal government has a role in supporting SME productivity improvement generally and SME technology adoption specifically; 2) the extent to which the federal government has a role in supporting the capacity of colleges; 3) the extent to which the federal government has a role in the collection of information on barriers and strategies as well as raising awareness of the benefits of digital technology adoption among SMEs; and 4) the appropriateness of NRC-IRAP as the delivery agent of the Program.

Evaluation Question: Is DTAPP consistent with federal roles and responsibilities?

Key Findings:

  • There is a role for the federal government to support SME productivity growth and SME digital technology adoption. The evidence also confirmed the role for the federal government to raise awareness among SMEs regarding the benefits and importance of adopting digital technologies and, to a lesser extent, to support colleges to build and promote their digital technology capacity.
  • There is mixed evidence regarding the extent to which there is a federal role to collect information on barriers and strategies to SME technology adoption. Those from outside of NRC-IRAP believed there is a role to be played by the federal government whereas those inside of NRC-IRAP voiced concern about this role (usually in the context of the perceived reporting burden on ITAs and firms).
  • NRC-IRAP is an appropriate delivery agent for DTAPP, primarily due to the existing networks of ITAs, ongoing interactions with SMEs, ITAs’ technological backgrounds and NRC-IRAP’s national presence, among other reasons. No feasible alternative delivery mechanisms were uncovered by the evaluation.

3.4.1 Federal Role in Supporting SME Technology Adoption and Productivity Growth

Evidence was found in the documentation that the Government of Canada has an enabling and regulatory role to play in supporting the adoption of digital technologies by SMEs and increased productivity growth. In particular, an Industry Canada studyFootnote2 on the digital economy (2010) argued that the condition for the adoption and use of digital technologies was a "well-functioning marketplace governed by appropriate legislation and regulations" and that governments should ensure that the right legal and regulatory frameworks are in place so that SMEs use digital technologies to "streamline operations, improve services and cut costs." The Federal R&D Review (2011) found widespread consensus among economists, governments, think tanks and industry that the Government of Canada should increase investments to help businesses improve their competitiveness, and also concluded that Government funding is critical to support SMEs in the early stages of their development and technology adoption.

The results from the interviews and survey suggest that there is a role for the federal government to support SME productivity growth and SME digital technology adoption. All interview respondents that were able to comment on the role of the federal government, including internal and external stakeholders, saw a role for the federal government in this regard.

While survey respondents did not speak directly to the role of the federal government, a large majority (86%) reported that DTAPP provides their firm with services that could not be accessed through other public programs. Almost as many (83%) reported that DTAPP provides services that cannot be accessed through the private sector at a reasonable cost. Interestingly, only 61% of respondents agreed that, before DTAPP, their firm was not able to acquire the support needed to adopt digital technology; 15% disagreed with the statement, suggesting that they had been able to obtain digital technology support before DTAPP, and 24% were neutral (neither agreed nor disagreed).

Recommendation #1: Based on the evidence that there is a need for the federal government to support productivity improvements in SMEs, that productivity can be improved through the adoption of digital technology and the lack of a national level program to encourage SMEs to adopt digital technologies, it is recommended that DTAPP be continued as a permanent Program.

3.4.2 Federal Role in Supporting the Capacity of Colleges

The role of the federal government in supporting colleges to build and promote their digital technology capacity (e.g., equipment and infrastructure, internships) was explored as part of the evaluation. Most interviewees from funded organizations, funded colleges and external stakeholders saw a federal role in this area. A few interview respondents from funded organizations (i.e., not from colleges) did not feel support was needed or appropriate.

3.4.3 Federal Role in Collecting Information and Raising Awareness on Digital Technology Adoption and Productivity

The evaluation also explored whether there is a role for the federal government to collect information on barriers and strategies to SME technology adoption and raise awareness among SMEs regarding the benefits and importance of adopting digital technologies. Among interview respondents, almost all of those from funded organizations and external stakeholders (e.g., RDAs and stakeholder associations) felt that the federal government had a role in raising awareness. Likewise, most interview respondents saw a role for federal government to collect information on barriers and strategies.

Most NRC interviewees, including ITAs, questioned the value of the in-depth data collection that is part of the engagement process with SMEs, specifically the collection of information on barriers and strategies. A few mentioned that information on barriers and strategies is already available in public documents and that Industry Canada is conducting a survey of firms (including SMEs) as part of DTAPP that seeks such information. However, the extent to which this questioning has been influenced by the perceived burden of the data collection process (discussed in Section 5.4.2) is not clear.

3.4.4 Federal Role in Delivering DTAPP via NRC-IRAP

The appropriateness of NRC-IRAP as the delivery agent for DTAPP was investigated in the context of confirming whether DTAPP delivery is a federal role. Findings from all lines of evidence support NRC-IRAP in this capacity, including the document review and feedback from interviews. Among all interview respondent types, the most commonly cited reason supporting NRC-IRAP as the delivery agent of DTAPP was the extensive networks of ITAs. Other reasons cited by interview respondents included: ITAs’ ongoing interaction with SMEs; ITAs’ technological backgrounds; NRC-IRAP’s national presence; NRC-IRAP’s good reputation in helping SMEs; and NRC-IRAP’s existing processes and structures to track and monitor projects.

Reasons cited by the few interview respondents who had concerns about NRC-IRAP as the delivery agent included: a possible bias toward the technology sector (as opposed to sectors such as service, design and media); lack of expertise in digital technology and productivity; and NRC-IRAP being quite different from DTAPP in terms of target audience and area of focus (i.e., adoption of technologies and productivity enhancement versus research and development).

Findings from the document review suggest that there is existing capacity in some regions to support digital technology adoption, including Ontario’s Small Business Internship Program and Alberta’s Information and Communications Technology (ICT) Adoption Program. As well, two programs were identified as being complementary to DTAPP, including the Natural Sciences and Engineering Research Council’s (NSERC) College and Community Innovation Program and Business Development Bank’s (BDC) ICT Program.

DTAPP was designed to leverage the NSERC college network to identify potential college delivery partners, and clients in need of financing for a digital technology can be referred to the BDC program. However, there is also potential for duplication between DTAPP and BDC’s ICT programming, primarily with respect to diagnostic, coaching and advisory services, and support provided for the selection of a digital technology. Evidence suggests that the Program has worked to engage both NSERC and BDC to ensure collaboration between programs and to avoid duplication. While a Playbook outlining the possible roles for NRC, NSERC and BDC was developed early in the implementation of DTAPP, the federal partner respondents interviewed for the evaluation were only able to cite the development of the Playbook itself, without specifics around implementation in the regions. That said, DTAPP working group members and ITAs mentioned that regional ‘hub’ meetings between NRC, NSERC and BDC are particularly useful for collaborating with partners.

Most interview respondents across all respondent types did not think there were other delivery mechanisms, either inside or outside of government, that were better choices than NRC-IRAP. While some interview respondents (from all respondent types who were asked to comment) mentioned RDAs and colleges as possible alternative delivery agents, these were generally felt not to be feasible. Specifically, a few respondents from NRC-IRAP management, federal partners and external stakeholders considered RDAs to have less technical knowledge than required. A few respondents from NRC-IRAP management felt that capacity at colleges was uneven. A few external stakeholders and federal partners highlighted concerns over national consistency if RDAs were to deliver the program.

Recommendation #2: Given NRC-IRAP’s extensive network and existing connection with the SME community, its national presence and the technical expertise of ITAs, it is recommended that NRC-IRAP continue to be the delivery agent of DTAPP, should the Program be renewed.

4.0 Achievement of Expected Outcomes

Findings in this section focus on evaluation questions related to the achievement of outcomes, including reach, awareness and benefits to clients. This section also explores limitations in achieving outcomes, including time and resource levels.

4.1 Program Reach

In order to better understand the outcomes of DTAPP, the activities of the Program as well as its outputs were first assessed. The findings provided below highlight key components of the contributions to firms and contributions to organizations as well as the advisory services and linkages provided by both ITAs and funded organizations.

Evaluation Question: To what extent has DTAPP been successful in reaching its intended clients and delivering planned program activities?

Key Findings:

  • Between November 2011 and October 2012, the DTAPP engagement process had been started with over 600 firms. Financial assistance had been provided to close to 400 of these firms in support of a digital technology adoption project.
  • Funding to 34 colleges and 31 organizations contributed to the provision of various types of services for SMEs, ranging from awareness sessions to conducting diagnostic assessments.
  • ITAs were found to have provided advice and facilitated linkages for SMEs. Technical advice was most commonly provided, and referrals were most commonly made to other ITAs and technical resources.

4.1.1 Contributions to Firms

Number of funded firms - Between November 2011 and October 2012, over 600 firms began the DTAPP engagement process, while almost 400 had a DTAPP funded project (see Table 4.1). Regional variations are apparent, with Ontario having had the largest proportion of firms with funded projects and Quebec having had the largest proportion of firms that did not have a funded project.

Among the funded firms, the majority (57%) were new clients as opposed to existing NRC-IRAP clients (43%).Footnote3 ITAs interviewed in the evaluation indicated that, early in the program, they tended to target existing NRC-IRAP clients due to pressure to fund projects quickly. Within a few months of launch, however, ITAs were able to broaden the reach of DTAPP beyond existing NRC-IRAP clients and successfully reached a diversified client base. While there was some regional variation in the proportion of new clients, every region’s client base was at least 47% new (see Table 4.1). It should be noted that the 605 firms reached by DTAPP to date are on track with the Pilot target of 630 estimated beneficiaries. This reach was achieved despite challenges and delays in roll-out (discussed in detail in Section 6.0).

Table 4.1: Number of Funded and Non-funded Firms by Region between November 2011 and October 2012

Region | TOTAL Funded Firms (Number of New Firms) | TOTAL Firms That Had Not Received Funding at Time of EvaluationFootnote4 | TOTAL Firms
Pacific | 59 (31) | 13 | 72
West | 59 (35) | 15 | 74
Ontario | 108 (54) | 51 | 159
Quebec | 94 (68) | 126 | 220
Atlantic | 66 (31) | 14 | 80
TOTAL | 386 (219) | 219 | 605

Source: DTAPP Administrative and Performance Data

Funded firm profile - In general, the majority of funded firms had between 20 and 200 employees. This is consistent with the expected firm profile in terms of size. Funded firms were most commonly from the construction and related products services sector, the manufacturing and materials sector, and the ICT sector (see Figure 4.1).

Figure 4.1: Percentage of Funded Firms by Industry

Source: DTAPP Administrative and Performance Data

The evaluation found that the industrial profile of DTAPP clients was different than those of traditional NRC-IRAP clients. Some examples of the non-traditional SMEs served by DTAPP, as highlighted by NRC-IRAP key informant interviewees and ITAs included: a bridal salon; a potato farmer; a gas station; a winter holiday destination; businesses in the fashion industry; and businesses in forestry.

Funded firm projects - DTAPP funded a total of 400 projects between November 2011 and October 2012; in 14 cases, firms had two DTAPP projects. The average contribution over the course of a project was $68K, while the median was $75K. As is evident in Table 4.2, regional differences exist in the number of funded projects and the contribution amounts.

Table 4.2: Number and Value of Funded Firm Projects by Region between November 2011 and October 2012

Region | Number of Funded Projects | Average TOTAL Contribution per Project | Median TOTAL Contribution per Project
Pacific | 65 | $78,920 | $96,825
West | 62 | $75,023 | $76,618
Ontario | 111 | $54,391 | $50,000
Quebec | 95 | $66,694 | $75,726
Atlantic | 67 | $78,840 | $92,000
TOTAL | 400 | $68,561 | $75,000

Source: DTAPP Administrative and Performance Data

Note: Two projects were excluded since they were not ascribed to a firm and could not be ascribed to a region. Average and median total contribution agreement values are based on the planned amount for an entire project.

4.1.2 Contributions to Organizations and Colleges

Number of funded organizations and colleges - Between November 2011 and October 2012, DTAPP funded a total of 65 colleges and organizations: 52% were colleges and 48% were organizations. As shown in Table 4.3, Ontario had the largest number of funded organizations and colleges. Among the funded organizations, 43% were new clients while 57% were existing NRC-IRAP clients.Footnote5 Conversely, the proportion of colleges that were new (71%) was substantially higher than the proportion that were pre-existing (29%).

Table 4.3: Number of Funded Organizations and Colleges by Region between November 2011 and October 2012

Region | Number of Funded Organizations (Number of New Funded Organizations) | Number of Funded Colleges (Number of New Funded Colleges)
Pacific | 4 (1) | 4 (1)
West | 2 (1) | 3 (1)
Ontario | 11 (6) | 16 (14)
Quebec | 2 (0) | 8 (8)
Atlantic | 11 (3) | 3 (1)
National office | 1 (0) | 0 (0)
TOTAL | 31 | 34

Source: DTAPP Administrative and Performance Data

Note: details on whether organizations were new or existing were missing for the following: 1 from Pacific; 3 from Ontario; and 1 from Atlantic.

Funded organization and college projects - DTAPP had a total of 57 projects with funded organizations and colleges between November 2011 and October 2012. The average contribution to a funded organization or college project over the course of the project was $85K, while the median was $35K. The average financial contribution for projects with colleges was larger than for those with organizations (colleges: average = $167K, median = $88K; organizations: average = $65K, median = $28K). As is evident in Table 4.4, regional differences exist in the number of funded projects and the contribution amounts.

Table 4.4: Number and Value of Funded Organization and College Projects by Region

Region | Number of Funded Projects | Average TOTAL Contribution per Project | Median TOTAL Contribution per Project
Pacific | 9 | $75,170 | $36,203
West | 5 | $87,072 | $96,500
Ontario | 14 | $81,822 | $45,950
Quebec | 9 | $180,154 | $166,500
Atlantic | 19 | $23,741 | $21,000
National Office | 1 | $520,101 | $520,101
TOTAL | 57 | $85,087 | $35,000

Source: DTAPP Administrative and Performance Data

Note: Average and median total contribution agreement values are based on the planned amount for an entire project.

4.1.3 Advisory Services and Linkages

ITAs play a key role in DTAPP, and the survey of SMEs indicates that they delivered a wide range of services for the Program. As shown in Figure 4.2, technical advice was the most common type of service provided by ITAs to SMEs, while financial and human resource related advice was least likely to be provided. In addition to the advisory services presented in Figure 4.2, interviews with the ITAs confirmed that they also provided support in: finding consultants; planning and defining projects; monitoring contractors and progress on plans; and making sure the client understood the process.

ITAs also facilitate the development of linkages. As shown in Figure 4.3, the referrals and introductions ITAs made most frequently were to other ITAs and to technical resources. This highlights the ITA network and its role in allowing ITAs to provide appropriate services to clients outside of their particular areas of expertise.

Figure 4.2: Services Provided by ITAs

Source: Survey of participants. n=213. Number of respondents indicating "don’t know": 11-18. Number of respondents indicating "not applicable": 0. Text of question: "The following questions ask you about the services that were provided to you by your ITA team. Did your ITA team provide your firm with advice on or related to:"

Figure 4.3: Referrals and Introductions Provided by ITAs

Source: Survey of participants. n=213. Number of respondents indicating "don’t know": 0. Number of respondents indicating "not applicable": 0.

Text of question: "Did your ITA team provide your firm with a referral or an introduction to:"

4.1.4 Services Provided to SMEs by Funded Organizations and Colleges

Funded organizations are used by DTAPP to extend its reach and diversify the services available to SMEs. According to key informant interviewees, funded organizations and colleges played various roles. Many colleges were funded for time spent conducting assessments for SMEs: for example, college staff conducted diagnostics/assessments of processing lines and recommended new products and technologies. Staff from these organizations also mapped out processes and helped SMEs prioritize potential opportunities to adopt digital technology. In some cases, funds were used to train students to support the productivity enhancements of SMEs.

Some organizations and colleges used the funds for outreach and awareness activities to connect with SMEs in their area. For example, one organization organized workshops for SMEs where information and advice were provided to introduce SMEs to new technologies such as 3D prototyping. Generally speaking, organizations and colleges used the funds to hire staff or to draw on the time of existing staff, professors and/or technicians to conduct these activities.

4.2 Program Outreach Activities

Contributing to the Program’s reach, previously described, are its outreach activities. This section describes the awareness raising activities that DTAPP undertook to reach both SMEs and organizations and colleges.

Evaluation Question: To what extent has DTAPP increased awareness of its services to SMEs?

Key Findings:

  • DTAPP undertook a number of activities that raised awareness of the Program among SMEs, organizations and colleges. As planned, outreach activities declined over time to allow for a transition from awareness of the Program to awareness of the benefits of adopting digital technology.

Outreach to SMEs - DTAPP engaged in many outreach activities directed at SMEs. DTAPP administrative data on its awareness activities indicate that 182 awareness-raising events took place between November 2011 and September 2012. There were approximately 3,800 occasions on which SMEs had the opportunity to hear about DTAPP. Between January and April 2012, there were a total of 1,670 inquiries about DTAPP, with Ontario receiving the largest number of inquiries (36%), followed by the West (18%), the Pacific (16%), Quebec (14%) and the Atlantic (10%). While awareness-raising activities gradually declined between Program roll-out and September 2012, this is consistent with the Program’s design to shift from raising awareness of the Program itself to raising awareness of the benefits of technology adoption and enhanced productivity.

Awareness activities were undertaken by the NRC-IRAP communications team and by ITAs in the regions. Activities undertaken by the national communications team included: the Ministerial announcement of the Program; launch of the website and 1-800 number; development and dissemination of brochures and other promotional materials; and presentations to and meetings with networks/partners. ITAs were responsible for promoting the Program through their networks and partners to a broader range of firms and business sectors than those eligible under NRC-IRAP. NRC-IRAP ITAs largely conducted outreach to their network of existing clients and organizations, whereas DTAPP-specific ITAs largely chose to communicate with specific industry associations. Some ITAs reported that they worked with colleges to coordinate outreach. As discussed earlier, while initial outreach efforts were largely focused on NRC-IRAP clients, new and non-traditional NRC-IRAP SMEs were targeted within a few months of Program launch (see Section 4.1.1).

Outreach to organizations and colleges - Evidence suggests that the outreach activities undertaken by DTAPP for organizations and colleges were successful. Organizations and colleges interviewed as part of the evaluation felt that awareness-raising initiatives targeted at them had been effective. When asked how they became aware of DTAPP, about half of interviewees from organizations and colleges stated that they were involved at an early stage of program roll-out, either as part of the regular business of their funded college or organization, or as part of early meetings with NRC-IRAP. The other half indicated that they became aware of the program through discussions with ITAs.

4.3 Collection of information on digital technologies and productivity

As mentioned earlier, the objective of DTAPP is to accelerate the adoption of digital technologies in SMEs to increase their productivity. As a pilot, DTAPP was also meant to gather data on the barriers to digital technology adoption and strategies to overcome them as well as to assess the linkages between digital technology adoption and productivity, and to share these results with the SME community. One of the issues addressed by this evaluation was the extent to which data was collected and used for this purpose. This section addresses that issue.

Evaluation Question: To what extent has DTAPP collected information from SME clients on the link between adopting digital technologies and productivity and begun raising awareness of these findings?

Key Findings:

  • The Program has collected data on barriers and strategies related to digital technology adoption and productivity. While it is still too early to assess the linkage between digital technology adoption and productivity, the evaluation identified challenges with the measure of productivity being used by the Program (i.e., value added per employee).

4.3.1 Data Collection Process

In order to measure the outcomes of the program, including the link between digital technology adoption and productivity, data was collected from both ITAs and clients at different points during the DTAPP engagement process. ITAs were required to provide information on the barriers faced by firms and the strategies used to overcome them. Clients were asked to provide tombstone data as well as report on their productivity at different points in time over the course of the engagement and for three years following a completed engagement. Firms that received financial assistance were also required to report on outcomes resulting from their funded project. The Program supported staff in the data collection process by developing templates and guidelines and by offering training to ITAs.

Interviews with ITAs indicated that they often begin the data collection process with a client when they know that the client is serious about pursuing digital technology adoption. In most cases, this occurred at the second stage of the engagement process (i.e., the analysis stage). The rationale for this was two-fold: first, it takes some effort on the part of the ITA to enter a record; and second, entering data triggers what some ITAs perceive as an onerous reporting requirement on the SME, especially in the absence of a funded contribution from DTAPP. Many ITAs noted that the questionnaires they are required to complete for each stage of the engagement process were onerous and time-consuming (see Section 5.4 for more details on the data collection reporting process).

4.3.2 Use of Data

While ongoing data collection efforts by the Program had yielded information on barriers to technology adoption and strategies to overcome them, the data were not sufficient to assess the link between digital technology adoption and productivity at the time of the evaluation, given that only a small number of program participants had completed a full project cycle. The Program has begun analyzing and disseminating information on the barriers and strategies to technology adoption, at least internally, with plans in progress to extend this externally to SMEs. While the evaluation cannot comment on the Program's use of productivity data to date, there was evidence to suggest that the measure of productivity being used may create challenges in assessing the link between digital technology adoption and productivity.

Specifically, most ITAs expressed doubt regarding the measure of productivity being used by the Program, Value Added per Employee (VAPE), to demonstrate linkages between digital technology adoption and productivity improvement. Some suggested that metrics being used on individual DTAPP projects are more useful measures of productivity improvements (e.g., the Return on Investment (ROI) of specific investments rather than ROI in broad terms). Some ITAs also suggested that VAPE does not speak to the impact of the DTAPP intervention alone but rather to overall productivity, which might be driven by numerous factors. For example, VAPE ratios cannot effectively differentiate between changes that result from the DTAPP project and other variables, such as market factors that can lead to higher volumes and increased productivity as a result of economies of scale.
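The ITAs' concern can be made concrete with a minimal sketch. The report does not specify the exact VAPE formula used by the Program, so this example assumes the common definition of value added (revenue less purchased goods and services); all firm names and figures are hypothetical.

```python
def vape(revenue, purchased_inputs, employees):
    """Value Added per Employee (VAPE).

    Assumes the common definition of value added as revenue minus
    purchased goods and services; the Program's exact formula is not
    specified in the report, so this is an illustrative sketch only.
    """
    if employees <= 0:
        raise ValueError("employee count must be positive")
    return (revenue - purchased_inputs) / employees

# Illustration of the ITAs' concern: VAPE can rise for reasons
# unrelated to a DTAPP project. Here, a 30% increase in sales volume
# (with proportionally higher input costs) lifts the ratio even though
# no technology was adopted.
before = vape(revenue=2_000_000, purchased_inputs=800_000, employees=20)
after = vape(revenue=2_600_000, purchased_inputs=1_040_000, employees=20)
print(before, after)  # 60000.0 78000.0
```

Because market-driven volume growth alone moves the ratio, isolating the effect of a DTAPP project would require comparisons or case studies, as the ITAs suggested.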

While a few ITAs suggested that the data being collected may be relevant and might reveal trends, comparisons would be necessary to assess the actual impact of technology adoption on productivity. Therefore, the ITAs indicated that a proper assessment of productivity improvements as they relate to DTAPP will have to rely on case studies.

Recommendation #3: Given the challenges associated with current indicators used to measure the Program’s contribution to firm productivity (i.e., VAPE), it is recommended that DTAPP conduct a review of its performance measures to ensure that it can adequately measure the effect of digital technology adoption on productivity.

4.4 Achievement of Early Outcomes

While it is too early to make conclusions regarding the impact of DTAPP on productivity, this section presents the influence that DTAPP has had on clients' digital technology adoption behaviours. It also reports on the anticipated benefits resulting from DTAPP clients' adoption of digital technology.

Evaluation Question: To what extent has DTAPP resulted in benefits to client SMEs?

Key Findings:

  • DTAPP clients reported positive benefits of Program services on their ability to adopt digital technologies, with more than half having already adopted a digital technology and the remaining planning on adopting in the future.
  • While it is too early to assess the effect of adopting digital technologies on client productivity, DTAPP clients anticipate many benefits from the adopted technologies, including lower production costs; increased productivity; improved management systems; and improved quality of products or services.

4.4.1 Digital Technology Adoption by Client SMEs

According to survey findings, most SMEs using DTAPP shared positive views about the benefits stemming from DTAPP services in relation to their digital technology adoption efforts. As indicated in Figure 4.4, more than nine out of ten SMEs using the Program agreed that, as a result of DTAPP services, their firms were more willing to adopt digital technology, and that DTAPP services provided their firm with the support necessary to successfully adopt digital technologies.

Figure 4.4: Perceived Benefits of DTAPP Services

Source: Survey of participants. n=213. Number of respondents indicating "don’t know": 2-3. Number of respondents indicating "not applicable": 3-7. Text of question: "The questions in this section of the survey ask you about the benefits that your firm has experienced to date as a result of DTAPP services. These services include financial support, advisory services or both. Please indicate the extent to which you agree or disagree with the following statements."

As DTAPP is intended to increase awareness of the benefits of digital technologies and eventually assist SMEs in the adoption of such technologies, the evaluation assessed the extent to which the SMEs actually adopted the technologies related to DTAPP support. According to survey findings, the firms adopted the digital technologies related to DTAPP support in the majority (69%) of cases. In the remainder of cases (31%), while firms had not yet adopted the digital technology, they intended to do so in the near future. None indicated that they had not and would not adopt the digital technology related to DTAPP support.

An overview of the digital technologies that SMEs were most interested in adopting is presented in Table 4.5. Qualitative evidence from key informant interviews indicates that one of the most prevalent types of digital technology implemented was Enterprise Resource Planning (ERP) systems. Other digital technologies whose adoption DTAPP facilitates include software associated with specialized design and production equipment, such as 3D imaging, automation and robotics, and client relationship management (CRM).

Table 4.5: Technologies that SMEs were Interested in Adopting

Technology Type TOTAL Firms Percentage of TOTAL Firms
Business systems 187 26.4%
Information and communication technology 162 22.9%
Design, engineering and virtual manufacturing 155 21.9%
Plant systems 131 18.5%
Other 72 10.2%
TOTAL 707 100%

Source: DTAPP Administrative and Performance Data

4.4.2 Anticipated Benefits of Adopting Digital Technologies

While it is too early for the evaluation to determine the longer-term outcomes of adopting digital technology for a firm, such as productivity enhancements, the evaluation did assess the likely impacts of the Program. The anticipated benefits of adopting digital technologies were quite positive, and are presented in Figure 4.5.

Figure 4.5: Anticipated Benefits Arising from Digital Technology Adoption

Source: Survey of participants. n=213. Number of respondents indicating "don’t know": 6-19. Number of respondents indicating "not applicable": 0. Text of question: "Using 4-point scale below, please indicate the extent to which the digital technology that your firm adopted/will adopt with DTAPP support is likely to lead to the following benefits."

4.5 Pilot Duration and Resources

DTAPP was set up to test the Program design vis-à-vis the achievement of outcomes. This section discusses the adequacy of the resource levels and duration to this end.

Evaluation Question: To what extent are the duration and resources of the Pilot adequate to test the Program design and the achievement of its outcomes?

Key Findings:

  • The pilot length was sufficient to test the Program design; however, it will not be long enough to truly test the effectiveness of DTAPP, since outcomes such as productivity growth often take longer than the Pilot length to fully materialize.

There is evidence to suggest that the Pilot was sufficient in length to test the design and delivery of the Program. In terms of achieving intended outcomes, however, most internal respondents from NRC-IRAP indicated that the pilot duration was not adequate. Outcomes of this nature take longer to materialize after projects are completed than a three-year pilot allows. Some respondents (including a respondent from another federal organization) highlighted that the already-short duration of the Program was made even shorter by two factors: delays in Program roll-out caused by a later-than-expected Ministerial announcement of the Program (upon which Program implementation hinged); and a slower ramp-up in program delivery due to delays in hiring DTAPP ITAs and in implementing tools and other support systems.

The extent to which the Program was able to enhance productivity growth, improve the understanding of the link between technology adoption and productivity and raise awareness of the findings will need to be assessed closer to the end of the Pilot, or after its end.

5.0 Efficiency and Economy

This section provides evaluation findings related to the efficiency and economy of DTAPP.

5.1 Program Design

This sub-section presents the evaluation evidence related to the relationship between the DTAPP design and economic and efficient Program delivery.

Evaluation Question: To what extent is the Program design conducive to economic and efficient Program delivery?

Key Findings:

  • The ability to leverage NRC-IRAP infrastructure and human resources, and the use of both DTAPP ITAs and NRC-IRAP ITAs to deliver the Program contributes to an economic and efficient Program design. However, some aspects of the design could be improved to increase efficiency of the program.

The evaluation found that the DTAPP design largely contributes to economic and efficient delivery. However, some aspects of the design could be improved to increase efficiency. The most efficient and economical aspect of the DTAPP design is the leveraging of existing NRC-IRAP resources. Specifically, DTAPP leveraged significant infrastructure (e.g., administrative systems) and human resources (e.g., ITAs, managers and administrative staff) from NRC-IRAP to deliver the Program at very little additional operating cost. The incremental cost of delivering DTAPP consisted of eight DTAPP ITAs, two communications officers and three employees supporting data collection and analysis in the National Office.

A fundamental aspect of the design is the utilization of two types of ITAs (i.e., NRC-IRAP ITAs and DTAPP ITAs). DTAPP ITAs were hired for their expertise in digital technology adoption and/or productivity. Although the role of the DTAPP ITAs varies somewhat by region, the design concept of having designated resources with expertise in digital technology adoption and productivity is economical. By design, these DTAPP ITAs are available as a source of expertise for NRC-IRAP ITAs embarking on DTAPP engagements with SMEs. For specific details on the role of ITAs in DTAPP delivery please refer to Section 5.2.

A key component of the DTAPP design is the engagement process, described under the Program Profile (Section 2.0). By design, each stage of the engagement process includes a series of questionnaires for ITAs to complete regarding the barriers faced by clients in technology adoption and the strategies used to overcome them. ITAs found that the stages facilitated the delivery of DTAPP, particularly since it was a new program. Economy and efficiency issues in program design were observed regarding the reporting requirements associated with each stage. This is discussed in greater detail in Section 5. In addition, most respondents from funded organizations and colleges felt that the design was efficient because it reduces the risk for SMEs to adopt new technologies. This occurs because DTAPP benefits from NRC-IRAP ITAs’ pre-existing relationship with SMEs.

5.2 Program Delivery

This sub-section presents evaluation evidence regarding the extent to which DTAPP is being delivered in an efficient and economical manner. Client satisfaction with program delivery and outputs are described, as well as the role of ITAs, funded organizations and funded colleges in economic and efficient program delivery.

Evaluation Question: To what extent is the Program being delivered in an economic and efficient manner?

Key Findings:

  • Qualitative evidence indicates that DTAPP is being delivered in an economic and efficient manner, and clients are satisfied with Program delivery. However, this finding could not be validated with quantitative data because information on the time spent by NRC-IRAP staff delivering DTAPP was not available.

5.2.1 Client Satisfaction with Program Outputs

Overall, clients of DTAPP are satisfied with the Program's delivery processes. According to the survey findings, 90% indicated that they are satisfied with Program delivery. As Figure 5.1 demonstrates, they are satisfied both with the services provided by ITAs and with the financial assistance and the process used to distribute the contributions.

Figure 5.1: Client Satisfaction with DTAPP

Source: Survey of participants. n=213. Number of respondents indicating "don’t know": 1-11. Number of respondents indicating "not applicable": 0-20.Text of question: "On a five point scale, where one is very dissatisfied and five is very satisfied, how satisfied were you with the following"

5.2.2 Role of ITAs

Most key informants and process review participants reported that DTAPP is being delivered in an efficient and economical manner. This opinion was expressed most strongly by NRC respondents (including senior management, DTAPP working group members and ITAs), but interview respondents from funded organizations and colleges also shared this opinion.

Many key informants reported that the roles of the ITAs contribute to efficient and economical delivery. Essentially, the NRC-IRAP ITAs are the "face" of DTAPP with SMEs. NRC-IRAP ITAs tend to have the most interaction with SMEs and are typically deeply engaged in Stages 1 (Awareness) through 5 (Outcomes). Stage 2, the Eligibility stage, is often led by NRC-IRAP ITAs, which was considered appropriate by all ITAs interviewed during the process review. In certain cases, NRC-IRAP ITAs request assistance from DTAPP ITAs to assess a firm's eligibility (e.g., in the identification of SMEs' financial and human resource readiness to adopt a digital technology).

The involvement of the DTAPP ITAs in the remaining stages of the engagement process also varies. In a few regions, DTAPP ITAs are included in engagement teams only on an as-needed basis. Contrast this with models in some regions where at least one DTAPP ITA is integrated into each client engagement and models in other regions where DTAPP ITAs lead their own DTAPP engagements with SMEs.

One of the factors that influenced the role of DTAPP ITAs in client engagement was the sequencing of the training of NRC-IRAP ITAs and the hiring of DTAPP ITAs, resulting from delays in Program implementation. While the original plan was for DTAPP ITAs to be hired prior to roll-out, the official launch of the Program was delayed, as discussed earlier. Consequently, NRC-IRAP ITAs learned how to deliver the Program before the DTAPP ITAs were in place. As a result, it took NRC-IRAP ITAs time to integrate DTAPP ITAs into the process, which, as described above, has taken various forms. From an efficiency perspective, NRC-IRAP ITAs in some regions may not be taking full advantage of the support available from DTAPP ITAs. In these cases, resource use is minimized, thereby generating economy. However, greater efficiency may be achieved through increased integration of DTAPP ITAs, as their specialized skill sets may contribute to an increased quality or quantity of outputs.

Both NRC-IRAP ITAs and DTAPP ITAs indicated that the rapid roll-out of DTAPP did not allow for sufficient training. In the case of the NRC-IRAP ITAs, they reported that formal processes were being solidified as the Program was rolled out and that more training around Program guidelines would have been useful. This barrier was overcome by ITAs working through the process together and sharing their experiences. In the case of DTAPP ITAs, they reported that they did not receive sufficient training at the outset of the Program regarding how the DTAPP process works, guidelines and culture. Enhanced training on processes and workflows would have been beneficial in these cases.

Although evidence from key informant interviews indicates that DTAPP is being delivered in an efficient manner, it was not possible to validate this finding with quantitative data. An exploration of efficiency typically includes a review of the ratio of administrative costs to delivery costs. This, however, was not possible in the case of DTAPP due to the unavailability of information on the time spent by NRC-IRAP ITAs delivering DTAPP. Likewise, information was not available on the time spent by other NRC-IRAP staff to develop and implement the Program and all of the tools and support mechanisms required for delivery. If NRC-IRAP is to take on the delivery of additional programs in the future, this information would be beneficial from a management perspective and for better understanding the impact of delivering additional programs on NRC-IRAP resources. Moreover, should NRC-IRAP be asked to take on the delivery of additional programs, this information will help to ensure that it is best positioned to request an appropriate amount of resources for program delivery.

Recommendation #4: In order to better understand the NRC-IRAP resources leveraged to deliver DTAPP, as well as to accurately estimate the level of operational resources required in the event that NRC-IRAP is asked to deliver additional programs, it is recommended that NRC-IRAP institute a mechanism to better track the level of effort spent on the activities carried out by field and support staff in support of each program.

5.2.3 Role of Funded Organizations and Colleges

Funded organizations and colleges support efficient program delivery by using existing capacity to provide SMEs with information and to deliver training and analysis directly relevant to the SMEs' technology adoption needs. Funded organizations and colleges are also seen by ITAs as an objective source of information for SMEs, which is critical to the successful adoption of a digital technology. ITAs connect SMEs with funded organizations and colleges to ensure SMEs have sufficient information regarding productivity; hardware and software; best practices in implementation; and where to acquire technology. A few ITAs indicated that they referred DTAPP clients to colleges as a resource for technology-specific workshops and training, while others noted that colleges provided knowledge on what technology was available to meet business needs and where to access it. In order to continue to ensure efficient Program delivery by using funded organizations and colleges, NRC-IRAP should work to build greater awareness of the value of the services that organizations and colleges offer SMEs.

5.3 Program Design versus Delivery

This sub-section discusses the extent to which DTAPP has been delivered according to plan, what changes have occurred and what additional changes may be required to optimize delivery.

Evaluation Question: To what extent has DTAPP been delivered according to plan and what changes are required to optimize Program delivery in the field?

Key Findings:

  • DTAPP has largely been delivered according to the original Program design. While variations do exist in the way in which ITAs proceed through the DTAPP engagement process with firms, these differences are still in accordance with the original Program design.
  • Two areas of program design were identified in which changes could be made to optimize program delivery. These included reporting and project selection criteria.

Based on interviews with a sample of ITAs (including both NRC-IRAP ITAs and DTAPP ITAs), there is evidence to indicate that the delivery of DTAPP has been largely according to design. While each of the five stages of the DTAPP engagement process has been implemented as planned, the evaluation found that some stages, or components of stages, have received more emphasis than others. Table 5.1 describes the way in which each DTAPP engagement stage was delivered, as well as the challenges or issues associated with the stage. Footnote 6

Table 5.1: Description of DTAPP Delivery and Challenges or Issues Faced

Stage 1 - Awareness
Description of Delivery
  • Initial awareness-raising was conducted by NRC-IRAP ITAs, who focused on their existing network of clients and organizations, as well as by the NRC-IRAP National Office communications staff, which targeted a broader audience.
  • Once the DTAPP ITAs were hired, they selected specific industry associations to communicate with in order to reach beyond existing NRC-IRAP clients.
  • Some DTAPP ITAs also worked with colleges to coordinate outreach. For example, a few DTAPP ITAs noted that colleges had existing networks of SMEs they were working with. In these cases the DTAPP ITAs were invited to speak to groups of SMEs about DTAPP.
Challenges or Issues
  • One common challenge faced by both types of ITAs was that, while fulfilling their responsibilities for raising awareness, they wanted to avoid raising expectations they could not meet due to limited program funds. This contributed to fairly targeted awareness-raising activities.
  • In addition, some ITAs were critical of the outreach and communications activities led by NRC-IRAP National Office. For example, a few ITAs suggested that the content of the website could have been more specific in terms of describing eligibility criteria. This would theoretically cut down on unnecessary communications with ineligible firms.
  • Overall, most ITAs reported that the Program objectives and eligibility criteria were not clear at the time of the pilot launch. As a result, some NRC-IRAP ITAs lacked a sufficiently deep understanding of the Program to answer questions from clients at the outset.
Stage 2 - Eligibility
Description of Delivery
  • ITAs follow eligibility criteria set out to assess a firm’s eligibility (e.g., incorporated firm with fewer than 500 employees and an interest in increasing their productivity).
  • The approach to date has been to simply support eligible firms rather than projects that would have the most significant results or those that would not have proceeded otherwise.
  • Some ITAs integrated Stage 2 (Eligibility) and Stage 3 (Analysis). The information to fulfill eligibility requirements is typically collected by the ITA in an interview format with a project lead from the SME and/or a senior management representative.
  • Most ITAs specified that a site visit of SME operations was necessary during the Eligibility stage.
  • In addition to collecting information about the firm at this time, ITAs explained the expectations for participation in DTAPP to the firm (e.g., expectations pertaining to cooperation with ITAs, reporting requirements, etc.).
Challenges or Issues
  • A few ITAs and internal stakeholders noted that the Eligibility stage could benefit from additional “sifting” of projects to identify projects with the highest potential for improved productivity.
Stage 3 - Analysis
Description of Delivery
  • The amount of time spent on Stage 3 (Analysis) varies. While some ITAs emphasized that this stage is the most important and should be approached methodically, some ITAs felt that this was more of an administrative step and emphasized movement toward Stage 4 (Execution) as rapidly as possible. This expedited process reportedly took place for two reasons: pressure on ITAs to support eligible projects rather than optimal ones; and the fact that some SMEs had already completed their own analysis and had decided on a preferred technology adoption program.
  • Most ITAs reported that the Analysis stage involved assessing the firm’s readiness for change (e.g., management willingness, human resource readiness, etc.). For some ITAs, this stage also involved completing a VAPE calculation and predicting how it would change under various scenarios. In this stage, it is the role of the ITA to help the firm determine its needs and prioritize them.
  • In cases where the most comprehensive analysis took place, DTAPP teams worked together with an SME to identify productivity challenges, after which point the SME was referred to a funded organization for in-depth productivity analysis.
  • In Quebec, the Analysis stage is considered a pre-diagnosis stage, which determines whether the SME really needs to adopt new digital technology or should focus on something else. This component is fulfilled through a value chain study completed by funded colleges. This can assist, for example, in determining the appropriate software solution for a firm. ITAs noted that colleges in Quebec cannot keep up with the demand.
  • The involvement of the DTAPP ITAs increased during this stage due, at least in part, to the level of diagnosis and knowledge of productivity required.
Challenges or Issues
  • ITAs did not identify any major challenges or issues in delivering this stage.
Stage 4 - Execution
Description of Delivery
  • The most involved point of the DTAPP engagement process for ITAs and SMEs is Stage 4 (Execution). If a DTAPP ITA had not previously been drawn on by an NRC-IRAP ITA for a DTAPP project, this is generally when they would become involved.
  • Advice provided to SMEs at this stage was often operational and technical; however, at times it was financial in nature (e.g., advice on financial systems, referrals for financial support, etc.).
  • All ITAs reported that if they did not have the required expertise in a specific area, they drew on their network (e.g., ITAs, funded organizations and colleges) to find a solution.
  • ITAs saw neutral and objective sources of information as critical to the success of a DTAPP project. Objective sources of information include ITAs, a limited number of consultants, and funded organizations and colleges. Some ITAs indicated that they refer SMEs to colleges at this stage.
  • Locating suppliers is also a key component of the Execution stage. Some ITAs refer clients to known contractors or to Business Development Organizations in order to identify suppliers. A few ITAs reported that central repositories of service provider contact information were useful in identifying supports. For example, an ERP advisory database was assembled early on in DTAPP, and an advisory database of contacts to support automation tool implementation was also set up. These were developed by the manufacturing and materials sector team.
  • All ITAs stated that they monitor technology adoption closely during this stage. The ITAs monitored progress to ascertain whether any adjustments to the initial plan were required.
  • In addition to regular contact, all ITAs set up regular meeting times to conduct monthly follow-ups with funded firms.
Challenges or Issues
  • Locating objective consultants was reported to be a challenge. Although a few respondents noted that a central repository for consultant contact information existed, many ITAs were not aware of this.
  • A few ITAs reported that the main challenge at the Execution stage was getting the SME's staff ready to operate the technology, since many staff members are not necessarily technology-savvy or open to change. This involved translating and explaining a great deal of technical information to the SME to help them understand the process.
Stage 5 - Outcomes
Description of Delivery
  • Few projects had reached the Outcomes stage during the evaluation data collection time frame. As a result there were limited findings for this stage.
  • ITAs who could comment as part of the process review reported that SMEs had filled out the requisite paperwork and ITAs had validated the technology adoption. In some cases, this latter process involved a site visit where ITAs verified that the new technology was implemented and determined whether the goals of the project were achieved.
  • The majority of ITAs reported engaging informally to share learnings. A few ITAs noted that because DTAPP ITAs were involved in many projects, they are able to provide lessons learned and best practices to the broader ITA community in their region. Some ITAs noted that informal sharing of best practices is suitable, while others felt that it would be helpful to have a formalized repository of lessons learned and best practices.
Challenges or Issues
  • Most ITAs noted that, although some project impacts are evident at the Outcomes stage, the best measure of Program impacts would be two to three years after the technology adoption.

In terms of changes needed to optimize program delivery, the evaluation identified two primary areas in which changes could be made: reporting, and the criteria used to select projects for financial assistance.

Reporting - Reporting is required by both ITAs as part of the engagement process and by clients. Only a minority of those individuals involved with the design and delivery of DTAPP who were consulted for the evaluation indicated that the data collection and reporting requirements were appropriate. Criticisms largely centered around data collection and reporting being overly burdensome and time consuming. Most NRC interview respondents and some ITAs consulted suggested that reporting could be streamlined.

Challenges were also reported with the systems used for reporting. For example, the use of two different systems (i.e., the IRAP Client Website (ICWS) and SONAR, NRC-IRAP's Client Relationship Management system) was challenging, and internal interviewees indicated that they could be integrated or made to interact better with each other. In particular, a few ITAs pointed to SONAR as a source of difficulty in data collection (describing it as "antiquated" and plagued by crashes). That said, NRC-IRAP is currently developing an improved, web-based version of SONAR, which should address the systems issue for DTAPP reporting.

As well, the rationale behind the types of information being collected was not well communicated to, or understood by, those asked to collect the information. Most DTAPP working group members and a few NRC-IRAP senior managers and ITAs noted that they had not used Program data. A few individuals contacted for the evaluation said it was not obvious how the information collected would benefit the firms they were helping. A few ITAs suggested that data collection would improve if the purpose behind the data collection requirements were clearly communicated to ITAs at the outset; otherwise, it appeared to be an administrative exercise rather than performance reporting.

Interestingly, over three-quarters (81%) of surveyed clients felt that reporting requirements were reasonable. Among the 7% who felt reporting requirements were not reasonable, about half thought they were too complex or cumbersome, and a third stated that they were too repetitive or frequent. This suggests that the client-facing requirements do not need major adjustments.

In light of the challenges with reporting, the Program has been working to make changes to the reporting process. Program management continues to look at improving reporting requirements and reporting systems. Some changes have already occurred and demonstrate Program management’s effort to optimize program delivery. These include:

  • Changes to funding proposal requirements. The proposal process has been streamlined for small DTAPP-funded projects of limited scope and size (under $50K), such as feasibility studies and other preliminary investigations undertaken in preparation for a future DTAPP project.
  • Changes to ITA reporting requirements. In order to reduce input by ITAs, the requirement to update the Stage 4 (Execution) questionnaire monthly was eliminated, now requiring it to be updated only at the end of the engagement process. As well, the requirement to complete a monthly advisory services spreadsheet was eliminated.
  • Reduction in regional reporting. Regions are now required to submit DTAPP engagement examples on a quarterly basis, rather than monthly.

Despite these changes, further investigation seems appropriate as many process review participants reported that duplicative and onerous reporting requirements still exist.

Recommendation #5: DTAPP should review the information that is being collected by ITAs and by clients to ensure that it is streamlined, yet meets the needs of Program management and is sufficient to address accountability requirements. When conducting a review of the DTAPP reporting process, the following considerations are suggested:

  1. An effort should be made to ensure that information is collected on the advisory services provided by ITAs as well as the services offered by funded organizations and the resulting impact on SMEs, two Program areas not addressed by the current reporting requirements.
  2. Increased integration with NRC-IRAP reporting requirements, including the application process.

Project selection criteria - Interviews with ITAs and other internal stakeholders suggest that the Program could benefit from additional efforts to select projects with the highest potential for improved productivity. While estimates of productivity improvements (e.g., VAPE, value stream mapping) are made for most projects, these estimates are used primarily as a baseline tool for individual projects; they are not compared across applications to guide selection. Although there is currently no requirement to prioritize projects expected to yield the greatest productivity enhancements, given the demand for the Program, such a requirement may be a strategy to ensure the most effective use of limited resources.

Recommendation #6: DTAPP should explore the feasibility of introducing additional criteria to select firm and organization/college projects to ensure the Program results in the greatest impact on SME productivity as is possible.

5.4 Lessons Learned

This sub-section presents the lessons that can be taken from the development and implementation of DTAPP and applied to any additional programs that NRC-IRAP is asked to deliver.

Evaluation Question: What knowledge can be derived from the development and implementation of DTAPP and used by NRC-IRAP in the eventual implementation of other targeted initiatives?

Key Findings:

Of the lessons to take away, the following examples are considered to be key:

  • Having knowledgeable and dedicated personnel (in this case, ITAs) already in place;
  • Allowing for adequate time to get the Program up and running;
  • Having clear communications regarding the Program and expectations to delivery personnel and partners; and
  • Ensuring the data that is collected is useful for analysis without being an undue burden.

While those consulted for the evaluation put forward many suggestions, the most commonly shared learnings included:

  • Having knowledgeable and dedicated personnel (in this case ITAs) already in place. By having personnel who were already familiar with engaging SMEs, project design, accountability requirements, etc., the Program was able to have a fast roll-out and begin funding eligible projects quickly. Leveraging of NRC-IRAP personnel, structures and systems also ensured highly cost-efficient Program delivery.
  • Raising awareness of the Program, including its eligibility criteria. Program communication campaigns should present sufficient information so that potential participants know whether they meet the eligibility requirements and should therefore explore the Program further. In the case of DTAPP, ITAs mentioned that the general and far-reaching communications led to very high demand, in some cases straining capacity to respond to preliminary meeting requests and information requests around eligibility.
  • Allowing for adequate time to get the Program up and running. In particular, learnings occurred with respect to challenges in filling key staff positions, namely those of the DTAPP ITAs. All ITA respondents reported that having DTAPP ITAs hired and trained prior to Pilot launch would have been beneficial, as it would have added overall capacity and specialized expertise and eased the integration of this new position; however, it is not possible to hire staff before a program launch. That said, preparations for hiring can be done in advance to facilitate timely acquisition of the needed resources. In addition, a ramp-up period with realistic timelines for hiring and training staff following program launch must be considered.
  • Ensuring the data that is collected is useful for program management without being an undue burden. The desire was commonly expressed among ITAs for streamlined, more meaningful reporting, and improved reporting systems.
  • Having clear communications regarding the Program and expectations to delivery personnel and partners. Many ITAs stated that the intent of the Program and the delivery approach were not sufficiently explained to ITAs prior to Program launch. It was also mentioned that work should be done upfront to align and coordinate with other players and to be more strategic about who does what and how the Program can be delivered in complement to others.
  • Monitoring the spend rate for funded projects closely. A few DTAPP working group members and NRC-IRAP ITAs stated that the spend rate for DTAPP funded projects was closely monitored, which ensured that the Program expenditures closely matched the Program’s annual budget. This is critical to ensure optimal delivery to beneficiaries - particularly important in a short-term Pilot.
  • Ensuring sufficient human resources. Some ITAs reported a lack of adequately trained resources at specific project stages. In particular, some NRC-IRAP ITAs mentioned a need for more ITAs, or at least more capacity in a general sense, and a few specifically mentioned a need for more DTAPP ITAs.

6.0 Management Response

Recommendation | Status | Planned action(s) proposed | Person(s) responsible | Expected date of completion (M/D/Y) | Measure(s) of achievement
Recommendation #1:

Based on the evidence that there is a need for the federal government to support productivity improvements in SMEs, that productivity can be improved through the adoption of digital technology and the lack of a national level program to encourage SMEs to adopt digital technologies, it is recommended that DTAPP be continued as a permanent Program.

Accepted. Provide information and advice to Industry Canada and OGDs, as requested
  • Vice-President, NRC-IRAP
  • Executive Director, NRC-IRAP National Office
  • Director, NRC-IRAP Strategic & Operations Alignment
On-going within fiscal year 2013/14 (completion March 31, 2014). N/A to NRC-IRAP; this is a Government of Canada decision.
Recommendation #2:

Given NRC-IRAP's extensive network and existing connection with the SME community, its national presence and the technical expertise of ITAs, it is recommended that NRC-IRAP continue to be the delivery agent of DTAPP, should the Program be renewed.

Accepted. Provide information and advice to Industry Canada and OGDs, as requested
  • Vice-President, NRC-IRAP
  • Executive Director, NRC-IRAP National Office
  • Director, NRC-IRAP Strategic & Operations Alignment
On-going within fiscal year 2013/14 (completion March 31, 2014). N/A to NRC-IRAP; this is a Government of Canada decision.
Recommendation #3:

Given the challenges associated with current indicators used to measure the Program's contribution to firm productivity (i.e., VAPE), it is recommended that DTAPP conduct a review of its performance measures to ensure that it can adequately measure the effect of digital technology adoption on productivity.

Accepted

In collaboration with Industry Canada, a review of the performance measures will be undertaken at the end of the pilot.

NRC-IRAP does not see any benefit in implementing potential changes at this point in time - final months of the pilot - as this would negatively impact longitudinal value of data gathered to date.

  • Director, NRC-IRAP Strategic & Operations Alignment
September 30, 2014. Review is undertaken and results are implemented if DTAPP is extended.
Recommendation #4:

In order to better understand the NRC-IRAP resources leveraged to deliver DTAPP, as well as to accurately estimate the level of operational resources required in the event that NRC-IRAP is asked to deliver additional programs, it is recommended that NRC-IRAP institute a mechanism to better track the level of effort spent on the activities carried out by field and support staff in support of each program.

Accepted

One of the reasons NRC-IRAP was chosen by the government to deliver DTAPP is the synergy (and the associated savings) realized by having multiple programs available through a single source. ITAs provide a wide range of services to their clients which are not associated with a specific program but are in response to clients' needs. As such, services provided by ITAs cannot be isolated and attributed to a specific program delivered by NRC-IRAP.

Tracking time against a specific program is not a feasible option due to the nature of the services provided by ITAs. That said, alternative options to estimate the administrative cost of DTAPP will be explored by NRC-IRAP, and appropriate options implemented if DTAPP is extended.

  • Executive Director, NRC-IRAP National Office
  • Director, NRC-IRAP Strategic & Operations Alignment
March 31, 2014. Options to arrive at estimated administrative costs of DTAPP are reviewed, and an option implemented if DTAPP is extended.
Recommendation #5:

DTAPP should review the information that is being collected by ITAs and by clients to ensure that it is streamlined, yet meets the needs of Program management and is sufficient to address accountability requirements. When conducting a review of the DTAPP reporting process, the following considerations are suggested:

  1. An effort should be made to ensure that information is collected on the advisory services provided by ITAs as well as the services offered by funded organizations and the resulting impact on SMEs, two Program areas that are not addressed by the current reporting requirements.
  2. Increased integration with NRC-IRAP reporting requirements, including the application process.
Accepted

As a pilot program, DTAPP has been under constant improvement since launch. Significant efforts were made to convert paper-based data collection forms to electronic tools, to reduce to a minimum the data collected from clients, and to streamline processes. Recent changes have reduced ITA reporting from a monthly activity to an entry at the start and at the end of an engagement.

In response to 5a)
NRC-IRAP will explore options to track advisory services provided by ITAs as part of DTAPP, which may include a similar approach to the one used for NRC-IRAP.

With regard to funded Organizations, during the second year of the program several Word-based reporting documents were incorporated into the monthly Status Reports and the Final Report. These reports should provide a fairly detailed picture of the advisory services delivered by Organizations.

In response to 5b)
If DTAPP is extended, NRC-IRAP will seek concurrence from Industry Canada to integrate DTAPP and regular IRAP processes. With regard to reporting requirements, it should be noted that DTAPP reporting by clients models IRAP reporting almost exactly: clients are required to provide a status report with each claim as well as a final report when a project is completed. The only difference is that some report questions are DTAPP-specific.

  • Director, NRC-IRAP Strategic & Operations Alignment
Review of options to track advisory services to be completed by March 31, 2014. More consistent reporting of advisory services by ITAs and by organizations
Recommendation #6:

DTAPP should explore the feasibility of introducing additional criteria to select firm and organization/college projects to ensure the Program results in the greatest impact on SME productivity as is possible.

Accepted

DTAPP's "stage" approach, combined with its specific assessment criteria, enables ITAs to analyze each potential project and ensure it meets the program's objectives.

NRC-IRAP acknowledges that the assessment that will be undertaken at the end of this pilot program may lead to improvements to be proposed, including changes to its assessment criteria.

  • Director, NRC-IRAP Strategic & Operations Alignment
End of pilot assessment (September 30, 2014). Potential assessment criteria will be implemented, if DTAPP is extended.

Footnotes

Footnote 1

Industry Canada is responsible for collecting data (via a survey) from both individuals and businesses of all sizes on a variety of topics related to digital technologies adoption. This data collection will be based on international model surveys to ensure the Canadian experience can be understood in an international context and compared to other countries. Industry Canada will also contribute to awareness raising by coordinating with and leveraging work done by BDC and NSERC in this regard.

Footnote 2

This study was undertaken to provide context to consultations undertaken by Industry Canada in 2010 regarding the development of a digital economy strategy. This paper proposed a set of key challenges that must be addressed by a digital economy strategy, describes what had been done to date and posed questions on what needs to be done in the future.

Footnote 3

"Existing clients" are clients who had previously been involved with NRC-IRAP. "New clients" are clients who have not received NRC-IRAP advisory services for 2 years nor funding for the past 3 years.

Footnote 4

Refers to firms that had not yet received financial assistance for a project. While some of these firms may never receive financial assistance, many may eventually receive funding and were simply at an early stage in the engagement process and therefore not ready for financial assistance in support of their DTAPP project.

Footnote 5

"Existing clients" are clients who had previously been involved with NRC-IRAP. "New clients" are clients who have not received NRC-IRAP advisory services for 2 years nor funding for the past 3 years.

Footnote 6

Stage 1 (Awareness) is also described above in Section 4.0.

Appendix A: Methodology

The evaluation methodology was developed by the NRC Office of Audit and Evaluation in consultation with DTAPP stakeholders. An external consultant, Goss Gilroy Inc., was hired through a request for proposals process to carry out the evaluation.

The scope of the evaluation covered 2011-12 to 2012-13. While evaluation questions relating to Program design and implementation covered the broader time period, questions relating to the achievement of Program outcomes covered a more defined period, as appropriate, given that the Program was implemented in November 2011.

The selection of methods was based upon the most efficient means of addressing the evaluation issues in a rigorous way, while taking into account cost, time and resource constraints, as well as other considerations, such as evaluation scope, evaluation budget and minimizing response burden. The evaluation approach and level of effort was commensurate with the Program risk, which was assessed as low to medium during an assessment conducted as part of the planning phase.

A process evaluation was selected as the evaluation approach to be used given that the Program had only been recently implemented at the time of the evaluation. In alignment with the Treasury Board Secretariat (TBS) Policy on Evaluation (2009), the evaluation also explored questions related to relevance, achievement of early outcomes, and efficiency and economy (in terms of design and delivery).

The evaluation methods were structured to collect information on each of the evaluation issues using a multi-method approach allowing for triangulation (i.e., convergence of results across lines of evidence) and complementarity (i.e., developing better understanding by exploring different facets of a complex issue).  Where possible, there was a balance between quantitative and qualitative methods, with qualitative methods providing further description and explanation for the quantitative information. Both primary and secondary data sources were used for the evaluation. In all, five methods were implemented. These included:

  • Literature and document review;
  • Analysis of administrative and performance data;
  • Key informant interviews;
  • Surveys of SME client firms; and
  • Study on Program processes.

Technical reports were submitted for all methods. Details for each of the methods are included in the subsections below, as well as the methodological challenges and limitations.

A.1 Description of Methods

A.1.1 Literature and Document Review

A literature and document review was conducted to provide Program context and history, and to contribute to the analysis of relevance and performance. GGI was provided with a list of documents and literature to be considered for the evaluation. In addition to these sources, GGI reviewed other documents and sources of literature, as available. The review of internal documents included performance reports, presentations, and other internal reports. The review of external documents included those produced by other government departments and central agencies, working papers, and grey literature. Literature included peer-reviewed scholarly articles on the subjects of the Canadian productivity gap; technology adoption and its impact on productivity; challenges related to technology adoption by SMEs; and technology adoption strategies and programming developed in other countries.

The key limitation inherent in any document or literature review is the large amount of time required to gain a deep understanding of the issues at hand. Because digital technology adoption and its impact on productivity is still a relatively new area of research (i.e., focused in the last 10 years), the evaluation was able to consult most of the relevant literature and documents on the subject, resulting in a high degree of reliability in the findings drawn from this evidence.

A.1.2 Administrative and Performance Data Review

Financial and human resources information was reviewed and analyzed to provide context to the findings obtained through other lines of evidence. The evaluation also reviewed Program performance data from various sources to contribute to the assessment of the Program’s achievement of outcomes. Key data sources used in this review included:

  • Financial data from SIGMA;
  • Human resources data;
  • Client and project data from SONAR;
  • Data from internal tools including the Program Awareness Tracking Sheet, Stage 2 Eligibility Forms, Stage 3 Client Analysis Forms, Stage 4 Execution Forms and Stage 5 Outcomes Forms; and
  • Performance data from the Status of the Firm (SoF), the Post-Project Assessment (PPA) and Client Productivity Assessments.

This line of evidence was limited insofar as the number of firms that had received funding and completed a project was quite small at the time of the evaluation (i.e., there were 400 firms that had a funded project, of which only 25 had completed their funded project). This resulted in limited performance data. Also, Program data on ITA advisory services were not available, nor were data on the nature of the services provided by funded organizations and colleges and the resulting impacts on SMEs. In order to address the limitations associated with the Program administrative and performance data, the evaluation conducted a survey of client SMEs and interviews with the funded organizations and colleges.

A.1.3 Key Informant Interviews

The objective of the key informant interviews was to gather in-depth information, including views, perceptions, explanations, examples and factual information that addressed the evaluation questions.  In all, 40 key informant interviews were conducted. Where appropriate, interviewees were selected to represent all regions. Table A.1 presents the number of interviews conducted by respondent group.

Table A.1: Summary of Interviews

Key Informant Group Completed
Internal Program Personnel and Stakeholders
NRC-IRAP senior leadership and management 9
Members of the DTAPP Working Group 5
Sub-total: 14
External Stakeholders and Organizations
Funded organizations 10
Funded colleges 7
Stakeholders (e.g., RDAs, BDC, stakeholder associations) 6
Federal partners (Industry Canada, NSERC) 2
Foreign governments 1
Sub-total: 26
TOTAL 40

Interview guides were developed and pretested for each respondent group. Following an initial email sent by a program representative at NRC-IRAP, GGI followed up with key informants to secure their participation and schedule an interview. Where requested, respondents were sent a copy of the interview guide in advance of the interview. Most of the interviewees located in the National Capital Region were interviewed in person; those outside the National Capital Region, or who preferred not to meet in person, were interviewed by phone. Interviews were conducted in the preferred official language of the interviewee and lasted between 30 and 60 minutes, depending on the interview approach (phone or in-person) and the interviewee group.

The main limitation of the interview line of evidence is the limited number of individuals consulted in some respondent groups (e.g., federal partners). As well, while seven representatives from funded colleges and 10 representatives from funded organizations were interviewed, the views of these groups are not expected to be homogeneous, so caution must be used when extrapolating the views of these interviewees to the population as a whole. To the extent possible, the evaluation team attempted to ensure a representative sample (e.g., geographically and by level and length of involvement with DTAPP) and identified outliers in responses for particular groupings when they emerged.

A.1.4 Survey of SME Clients

A survey of SME clients was conducted to collect information on clients' experiences with the Program; the extent to which the Program, as designed, meets their needs; and the benefits they have experienced as a result of the Program. The survey of DTAPP clients included those who had received funding from the Program and was implemented online. In order to maximize the response rate, and since the incremental cost of each additional email invitation was negligible, the sample constituted the entire population of DTAPP clients to date (n=518 for whom the eligibility stage was completed). With this sample size and a response rate of 41%, the survey yielded 213 completed questionnaires.
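As a quick sanity check, the reported response rate is consistent with the sample and completion counts above; the short sketch below (illustrative only, not part of the evaluation's methodology) shows the arithmetic.

```python
# Illustrative check of the survey figures reported in the evaluation:
# 518 invited DTAPP clients (eligibility stage completed), 213 completed questionnaires.
invited = 518
completed = 213

response_rate = completed / invited
print(f"Response rate: {response_rate:.1%}")  # prints "Response rate: 41.1%"
```

This matches the 41% response rate reported in the text.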

The NRC evaluation team designed the survey instrument, which was reviewed and subsequently programmed by GGI. While the questionnaire was being finalized and programmed, NRC sent an email to all potential respondents to advise them of the evaluation and to inform them that they would receive an invitation from the consultant to participate in the survey.

The consultant pretested the survey with eight SMEs beginning in late November 2012. Pretesters were recruited by phone but completed the survey online so as to emulate the actual survey conditions. The consultant then followed up by phone with several questions to assess the functionality of the survey and the clarity of the questions. In light of the feedback, minor changes were made to the instrument, which was then translated into French.

After the pretest, the consultant trained interviewers (to conduct follow-up calls) and sent survey invitations to respondents. Emails (containing personalized links to the survey) were sent to about half of the sample on the first day. After all systems appeared functional, GGI sent the second half of the invitations. The consultant monitored responses and assessed the response rates throughout, adjusting for potential problems (e.g., bad emails).

The consultant sent reminder emails several days after sending the initial email, and interviewers made follow-up calls to all SMEs who had not yet completed the survey. Interviewers reminded the client to complete the survey, offered to resend the email invitations with the personalized link, and/or offered to conduct the survey over the phone with the client. A second email reminder was sent a few weeks after the first reminder. Allowing for these email reminders and follow-up calls to all non-respondents, the survey process lasted from November 27, 2012 to January 10, 2013 (allowing for a lull in responses over the winter holiday period).

In terms of limitations, the sample used for the evaluation was limited in size simply because the Program had been in operation for less than a year. Likewise, because very few clients had completed a project and had had sufficient time for impacts to materialize, it was difficult to collect information on the outcomes of the Program. To address this challenge, survey questions were posed in terms of anticipated rather than actual results. In addition, the survey was completed only by firms that had received financial assistance; thus, the data are not representative of DTAPP clients that had received only advisory services. Despite this, with a 41% response rate, the evaluation team had confidence in the representativeness of the survey findings for funded firms, which represent the largest proportion of DTAPP clients.

A.1.5 Study on Program Processes

A special study on Program processes was conducted to assess the extent to which the Program was implemented and delivered according to design. This line of evidence included 22 key informant interviews with ITAs designed to assess the strengths and barriers/challenges experienced in the planned delivery of the Program. Fifteen (15) of these interviews were conducted with NRC-IRAP ITAs who had led DTAPP engagements, while the remaining seven were completed with DTAPP ITAs. ITAs interviewed were suggested by the Program as having been sufficiently involved in the delivery of DTAPP, and included proportional representation of each of the five NRC-IRAP regions.

Interview guides were developed and pretested with three ITAs. Following an initial email sent by a Program representative at NRC-IRAP, GGI followed up to secure participation and schedule an interview. Respondents were sent a copy of the interview guide in advance of the interview. Interviews with ITAs were conducted by phone, in the preferred official language of the interviewee, and lasted between 30 and 60 minutes, depending on the interviewee group.

While all DTAPP ITAs were interviewed, the evaluation consulted only 15 NRC-IRAP ITAs. As mentioned, these 15 were identified as having good knowledge of DTAPP because they had delivered the Program in their region. However, the experiences and views of the 15 interviewees are not necessarily representative of those of all NRC-IRAP ITAs who have delivered a DTAPP project. To mitigate this challenge, during the interviews the evaluation team attempted to develop a broad understanding of how DTAPP was being delivered in each region, rather than at the individual level. As well, findings from the process review were cross-referenced with the interview evidence provided by NRC-IRAP managers in each region and at National Office so as to identify areas of inconsistency for follow-up.

Appendix B: Evaluation Matrix

Issue/Question Review of Program administrative and performance data Document and literature review Key informant interviews Study on program processes Survey with SME clients
Relevance: Continued need for program
R1. Is there a justifiable need to support SME technology adoption to enhance their productivity through advisory services and / or financial support?   checked checked   checked
R2. To what extent is the Program design appropriate to meet the needs of SMEs experiencing productivity challenges?   checked checked   checked
Relevance: Alignment with government priorities
R3. To what extent is DTAPP consistent with current government priorities?   checked checked    
Relevance: Alignment with federal roles and responsibilities
R4. Is DTAPP consistent with federal roles and responsibilities?   checked checked    
Performance: Achievement of Program outcomes
P1. To what extent has DTAPP been successful in reaching its intended clients and delivering planned program activities? checked checked checked checked checked
P2. To what extent has DTAPP increased awareness of its services to SMEs? checked checked checked checked checked
P3. To what extent has DTAPP collected information from SME clients on the link between adopting digital technologies and productivity and begun raising awareness of these findings? checked   checked checked  
P4. To what extent has DTAPP resulted in benefits to client SMEs? checked   checked   checked
P5. Were the resource levels and duration of the Pilot sufficient to adequately test the Program and its ability to achieve its intended outcomes? checked checked checked    
Performance: Demonstration of efficiency and economy (from a Program design and delivery perspective)
P6. To what extent is the Program design conducive to economic and efficient Program delivery? checked checked checked checked  
P7. To what extent is the Program being delivered in an economic and efficient manner?   checked checked checked  
P8. What knowledge can be derived from the development and implementation of DTAPP and used by NRC-IRAP in the eventual implementation of other targeted initiatives?     checked checked  
P9. To what extent has DTAPP been delivered according to plan and what changes are required to optimize Program delivery in the field?     checked checked  

Appendix C: Selected bibliography

  • Ahearne, M., Srinivasan, M., & Weinstein, L. (2004). Effect of Technology on Sales Performance: Progressing from Technology Acceptance to Technology Usage and Consequence. Journal of Personal Selling and Sales Management, XXIV (4), 297-310.

  • Albors-Garrigos, José; Hervas-Oliver, José Luis & Hidalgo, Antonio (2009). "Analyzing high technology adoption and impact within public supported high tech programs: an empirical case", Journal of High Technology Management Research, vol. 20, issue 2 (Sept 2009), pp. 153-168.

  • Alexopoulos, Michelle & Cohen, Jon (2012). "The effects of computer technologies on the Canadian economy : evidence from new direct measures", International Productivity Monitor, vol. 23 (Spring 2012), pp. 17-30.

  • Almon, Michael-John & Jianmin, Tang (2011). "Industrial structural change and the post-2000 output and productivity growth slowdown : a Canada-US comparison", International Productivity Monitor, vol. 22 (Fall 2011), pp. 44-81.

  • Alves de Mendonca, M.A., Freitas, F., & Moreira de Souza, J. (2008). Information Technology and Productivity: Evidence for Brazilian Industry From Firm-Level Data. Information Technology for Development, 14 (2), 136-153.

  • Arcand, Alan (et al.) (2008). Sluggish productivity growth in Canada: could the urbanization process be a factor?, Conference Board of Canada, Dec 2008, 38 pp.

  • Arcand, Alan & Lefebvre, Mario (2010). Canada’s lagging productivity : the case of a well-educated workforce lacking the much-needed physical capital, Conference Board of Canada, January 2010, 45 pp.

  • Arcand, Alan & Lefebvre, Mario (2011). Canada’s lagging productivity: what if we had matched the U.S. performance?, Conference Board of Canada, Nov 2011, 18 pp.

  • Aschaiek, Sharon (2011). "Tapping into the global market : technology puts the world within reach for SMEs", CMA Magazine, vol. 85, issue 6 (Nov 2011), pp. 16-17.

  • Association of Canadian Community Colleges (2011). Driving Canada’s Long-term Prosperity: Advanced Skills and Incremental Innovation (2011, August).  Submission by the Association of Canadian Community Colleges to the House of Commons Standing Committee on Finance 2012-2013 Pre-Budget Consultations.

  • Baldwin, John R. and Guy Gellatly. (2007). Innovation Capabilities: Technology Use, Productivity Growth and Business Performance: Evidence from Canadian Technology Surveys. Statistics Canada.

  • Bigné, J.E., Alda, J., & Andreu, L. (2007). B2B services: IT adoption in travel agency supply chains. Journal of Services Marketing, 22(6), 454-464.

  • Boothby, Daniel; Dufour, Anik & Tang, Jianmin (2010). "Technology adoption, training and productivity performance", Research Policy, vol. 39, issue 5 (June 2010), pp. 650-661.

  • Boulhol, H. (2009). The effects of population structure on employment and productivity, OECD Economics Department Working papers, no. 684, OECD, 2009. 36 pp.

  • Burke, Kelly (2010). "The impact of Internet and ICT use among SME agribusiness growers and producers", Journal of Small Business and Entrepreneurship, vol. 23, no. 2 (2010), pp. 173-307.

  • Canadian Digital Media Network (2012) 2012 Report.  May 23, 2012.

  • Canadian e-Business Initiative (2004). Fast Forward 5.0: Making Connectivity Work for Canada. September 2004.

  • Canadian e-Business Initiative (2003). Net Impact III: Overcoming the Barriers – A Qualitative Research Study Conducted by the Canadian e-Business Initiative. October 2003.

  • Canadian e-Business Initiative (2004). Net Impact Study Canada: Strategies for Increasing SME Engagement in the e-Economy. Final Report. September 2004.

  • Center for the Study of Living Standards (2008). The Canada-U.S. ICT Investment Gap in 2007: Narrowing but Progress Still Needed. Prepared for the Information Technology Association of Canada (ITAC).

  • Chen, Vivian (et al.) (2011). Performance 2011 : productivity, employment and growth in the world’s economies, Conference Board (US), June 2011, 45 pp.

  • Conference Board of Canada (2009). Labor Productivity Growth. Retrieved July 2012 from: http://www.conferenceboard.ca/hcp/details/economy/measuring-productivity-canada.aspx.

  • Conference Board of Canada (2009). Western Canada: productivity, competitiveness and potential – key findings, June 2009, 34 pp.

  • Corrocher, Nicoletta & Fontana, Roberto (2008). "Objectives, obstacles and drivers of ICT adoption : what do IT managers perceive", Information Economics and Policy, vol. 20, issue 3 (Sept 2008), pp. 229-242.

  • Corrocher, Nicoletta & Fontana, Roberto (2008). "Expectations, network effects and timing of technology adoption : some empirical evidence from a sample of SMEs in Italy", Small Business Economics, vol. 31, no. 4 (Dec 2008), pp. 425-441.

  • Coulombe, Serge (2011). "Lagging behind: productivity and the good fortune of Canadian provinces", Commentary – C.D. Howe Institute, vol. 331 (June 2011), 19 pp.

  • Earl, L. (2002). Science, Innovation and Electronic Information Division Starting the new century: technological change in the Canadian private sector, 2000-2002 (Science, Innovation and Electronic Information Division (SIEID); Statistics Canada).

  • Expert Panel for a Review of Federal Support to Research and Development (2011). Innovation Canada: A Call to Action.

  • Fin, B. (2006). Performance implications of information technology implementation in an apparel supply chain. Supply Chain Management: An International Journal 11 (4), 309-316.

  • Finance Canada (2011). Federal Budget in Brief - June 2011.

  • Fotini, M., Anthi-Maria, S., Euripidis, L. (2008).  ERP Systems Business Value: A Critical Review of Empirical Literature. Panhellenic Conference on Informatics, pp. 186-190.

  • FPT Economic Development Ministers (2012). Statement – Federal-Provincial-Territorial Meeting of Ministers Responsible for Economic Development on the Digital Economy, March 19, 2012.

  • Government of Canada (2010) Speech from the Throne – June 2010.

  • Government of Canada (2011). Government Response to the Senate Committee (on the Digital Economy in Canada). March 21, 2011.

  • Hollenbeck, C.R., Zinkhan, G.M., French, W., & Song, J.H. (2009). E-Collaborative Networks: A Case Study on the New Role of the Sales Force. Journal of Personal Selling and Sales Management, XXIX (2), 125-136.

  • Hyjek, Michael and Sharp, Jamie (2006). Special Study: Does ICT Matter to SMBs in Canada? Prepared for the Information Technology Association of Canada (ITAC).

  • Industry Canada (2007). Mobilizing Science and Technology to Canada's Advantage. (Also known as the S&T Strategy.)

  • Industry Canada (2010). Digital Economy in Canada – Industry Canada 2010 Public Consultations Paper.

  • Ifinedo, Princely (2011). "An empirical analysis of factors influencing internet/e-business technologies adoption by SMEs in Canada", International Journal of Information Technology and Decision Making, vol. 10, issue 4 (July 2011), pp. 731-766.

  • Information Technology Association of Canada (ITAC). (2008). Competition Policy Review Panel ITAC’s Submission on Consultation Paper: "Sharpening Canada’s Competitive Edge".

  • Council of Canadian Academies (2009). Innovation and Business Strategy: Why Canada Falls Short. The Expert Panel on Business Innovation.

  • Jenkins, Tom (2009). Canada 3.0: Defining Canada’s Digital Future. Making the Case for Digital Content – Presentation. December 3, 2009.

  • Majumdar, S.M., & Chang, S. (2010).  Technology diffusion and firm performance: It pays to join the digital bandwagon. Technology in Society, 32, pp. 100-109.

  • Mcafee, A. (2002). The Impact of Enterprise Information Technology Adaptation on Operational Performance: An Empirical Investigation. Production and Operations Management, 11 (1), pp. 33-53.

  • Meyer, Jenny (2011). "Workforce age and technology adoption in small and medium-sized service firms", Small Business Economics, vol. 37, no. 3 (2011), pp. 305-324.

  • Middleton, C. & Biggar, J. (2012). Government International Best Practices in Fostering ICT Adoption among Small and Medium Enterprises: Lessons for Canada (Prepared for Industry Canada).

  • National Research Council (2011). DTAPP Performance Measurement Strategy Final Report. October 18 2011.

  • National Research Council (2011). DTAPP Planning Document on Technology Adoption. December 2011.

  • National Research Council (2011). Colleges and their involvement in the delivery of DTAPP. August 3, 2011. NRC Corporate Policy and Strategy, Strategic and Operational Planning.

  • National Research Council (2012). Departmental Performance Report 2011-2012.

  • National Research Council (2012). 2012-2013 Report on Plans and Priorities.

  • Nicholson, Peter (2009). "Innovation and business strategy: why Canada falls short", International Productivity Monitor, vol. 18 (Spring 2009), pp. 51-71.

  • Niosi, Jorge (2009). "Bridging Canadian technology SMEs over the valley of death", International Productivity Monitor, vol. 18 (Spring 2009), pp. 80-84.

  • "Northern Juggernaut? A look inside the Canadian economy: an interview with Don Drummond", Harvard International Review, Summer 2012, pp. 71-73.

  • Ozelkan, E.C., Sireli, Y., Munoz, M.P., & Mahadevan, S. (2006). A Decision Model to Analyze Costs and Benefits of RFID for Superior Supply Chain Performance. PICMET 2006 Proceedings, 9-13 July, Istanbul, Turkey.

  • Parsons, Mark (2011). "Improving federal tax support for business R&D in Canada", Commentary – CD Howe Institute, no. 334 (Sept 2011), 26 pp.

  • Power, D. (2005). Determinants of business-to-business e-commerce implementation and performance: a structural model. Supply Chain Management: An International Journal, 10(2), 96-113.

  • Raffo, D.M., & Ferguson, R. (2007).  Evaluating the Impact of the QuARS Requirements Analysis Tool Using Simulation. Q. Wang, D. Pfahl, and D.M. Raffo (Eds.): ICSP 2007, LNCS 4470, pp. 307–319, 2007. Springer-Verlag Berlin Heidelberg 2007.

  • Rao, Someshwar (2011). Cracking Canada’s productivity conundrum, IRPP (Institute for Research on Public Policy) Study, no. 25 (Nov 2011), 36 pp.

  • Reuber, A. Rebecca & Fischer, Eileen (2011). "International entrepreneurship in internet-enabled markets", Journal of Business Venturing, vol. 26, issue 6 (Nov 2011), pp. 660-679.

  • Richard, J.E., Thirkell, P.C., Huff, S.L. (2007). An Examination of Customer Relationship Management (CRM) Technology Adoption and its Impact on Business-to-Business Customer Relationships. Total Quality Management, 18 (8), 927-94

  • Sharpe, A. (2005). What Explains the Canada –US ICT Investment Intensity Gap? (CSLS Research Report 2005-06). Ottawa, ON: Center for the Study of Living Standards.

  • Sharpe, A. & Arsenault, J.F. (2008). The Canada-US ICT Investment Gap: an Update (CSLS Research Report No. 2008-1). Ottawa, ON: Center for the Study of Living Standards.

  • Shufelt, Tim (2012). "Canada’s productivity gap is looking worse than ever", Financial Post-Productive conversations, May 29 2012.

  • Straub, E.T. (2009). Understanding Technology Adoption: Theory and Future Directions for Informal Learning. Review of Educational Research, 79, 625-649.

  • Therrien, Pierre & Hanel, Petr (2011). "Innovation and productivity : summary results for Canadian manufacturing establishments", International Productivity Monitor, vol. 22 (Fall 2011), pp. 11-28.

  • Ulmanis, Juris (2011). Information and communications technology factors for adoption and usage determinants in Latvian companies. Doctoral thesis.

  • Uwizeyemungo, S., & Raymond, L. (2009). Exploring an Alternative Method of Evaluating the Effects of ERP: A Multiple Case Study. Journal of Information Technology, 24, 251-268.

  • Van Ark, Bart (et al.) (2010). The 2010 productivity brief: productivity, employment and growth in the world’s economies, Conference Board (US) – Executive Action Series, no. 319 (January 2010), 30 pp.

  • Venkatesh, V., Morris, M.G., Davis, F.D., and Davis, G.B. (2003). User Acceptance of Information Technology: Toward a Unified View, MIS Quarterly, 27, pp. 425-478.

  • Wainwright, D., Green, G., Mitchell, E., & Yarrow, D. (2005). Towards a framework for benchmarking ICT practice, competence and performance in small firms. Performance Measurement and Metrics: The International Journal for Library and Information Services, 6 (1), pp. 39-52.

  • Warda, Jacek (2010). Leveraging ICT Adoption:  What Can Work for Business. January, 2010. JPW Innovation Associates Inc.

  • Zelbst, P.J., Green, K.W., & Sower, V.E. (2010). Impact of RFID technology utilization on operational performance. Management Research Review, 33(10), pp. 994-1004.

  • Zelbst, P.J., Green, K.W., Sower, V.E., & Baker, G. (2010).  RFID utilization and information sharing: the impact on supply chain performance. Journal of Business & Industrial Marketing, 25 (8), pp. 582–589.

  • Zhang, Heather & Smith, Michael R. (2012). "Globalization and workplace performance in Canada: cross-sectional and dynamic analyses of productivity and wage outcomes", Research in Social Stratification and Mobility, vol. 30, issue 3 (Sept 2012).

  • Zhu, K., & Kraemer, K.L. (2005). Post-Adoption Variations in Usage and Value of E-Business by Organizations: Cross-Country Evidence from the Retail Industry. Information Systems Research, 16 (1), pp. 61–84.
