Telecom Decision CRTC 2005-20

Ottawa, 31 March 2005

Finalization of quality of service rate rebate plan for competitors

Reference: 8660-C12-200315095

Table of contents

  Paragraph
Summary  
I. Introduction 1
Background 2
Proceeding 8
Issues 17
II. Principles 19
III. The rate rebate mechanism 42
Background 43
The basic structure of the rate rebate plan 45
Frequency of reporting and rebates 75
Treatment of repeated failures 87
Entitlement 100
Form of rate rebate 104
Activities beyond the reasonable control of an ILEC 110
IV. The Q of S indicators 119
Q of S indicators reported on a company-wide basis 121
Q of S indicators reported on a competitor-by-competitor basis 132
New Q of S indicators to complete the final RRP for competitors 170
V. Audits, disputes and non-compliance 203
Audits and dispute resolution 203
Late or non-filing of Q of S results 210
VI. Transition from the interim to the final RRP 215
Appendix A - Sample calculation  
Appendix B - Indicators  

In this Decision, the Commission finalizes the quality of service (Q of S) rate rebate plan (RRP) for competitors applicable to the large incumbent local exchange carriers (ILECs).

The structure of the final RRP is significantly different from the structure used in the interim rate adjustment plan for competitors. Under the final RRP, the total potential rebate amount (TPRA) for a month is set at 5 percent of the revenues derived in that month from competitor services provided to a competitor for which there is activity in that month under one of the approved Q of S indicators. Each active Q of S indicator is given equal weighting, and the total rebate payable in a month is equal to the TPRA multiplied by the number of Q of S indicators missed and divided by the number of Q of S indicators active in that month.

The final RRP encompasses only Q of S indicators which measure activity on a competitor-by-competitor basis. There are a total of 14 Q of S indicators included in the final RRP, including three new indicators relating to competitor digital network (CDN) services and Type C loops. The Commission establishes a mechanism for reporting activities beyond the reasonable control of an ILEC and initiates a process for setting service intervals for CDN services and Type C loops.

The ILECs are required to conduct internal and external audits on an annual basis and report any issues raised and findings made from these audits to the Commission.

The final RRP, with the exception of indicators 1.19 and 1.19A, takes effect 1 July 2005.

I. Introduction

1. In Finalization of the Quality of Service rate adjustment plan for competitors, Telecom Public Notice CRTC 2003-9, 30 October 2003 (Public Notice 2003-9), the Commission initiated the present proceeding to consider issues relating to the finalization of the quality of service (Q of S) rate adjustment plan (RAP) for competitors applicable to the large incumbent local exchange carriers (i.e., Aliant Telecom Inc. (Aliant Telecom), Bell Canada, MTS Communications Inc. (MTS) (now MTS Allstream Inc. (MTS Allstream)), Saskatchewan Telecommunications (SaskTel), TELUS Communications Inc. (TCI), TELUS Communications (Québec) Inc. (TELUS Québec) and Société en commandite Télébec (Télébec) (collectively, the ILECs)).

Background

2. In a series of decisions beginning with Quality of service indicators for use in telephone company regulation, Telecom Decision CRTC 97-16, 24 July 1997 (Decision 97-16), the Commission established a number of competition-related Q of S indicators. These indicators permit the Commission to monitor the provision of certain services to competitors by the large ILECs.

3. In Regulatory framework for second price cap period, Telecom Decision CRTC 2002-34, 30 May 2002 (Decision 2002-34), and Implementation of price regulation for Télébec and TELUS Québec, Telecom Decision CRTC 2002-43, 31 July 2002 (Decision 2002-43), the Commission established an interim Q of S rate adjustment plan for competitors (the interim RAP). The interim RAP was applicable to all of the large ILECs, except SaskTel, and was based on certain competition-related Q of S indicators which had been given final approval at the time Decision 2002-34 was issued.

4. In Decision 2002-34, the Commission required the ILECs to develop a competitor digital network access (CDNA) service to foster facilities-based competition. In Competitor Digital Network services, Telecom Decision CRTC 2005-6, 3 February 2005, the Commission determined which services should be provided by the ILECs to their competitors as part of competitor digital network (CDN) services. The existing Q of S indicators did not apply to CDN services and, consequently, these services were not covered by the interim RAP.

5. In Saskatchewan Telecommunications - Applicability of interim quality of service rate adjustment mechanisms and related matters, Telecom Decision CRTC 2003-36, 5 June 2003 (Decision 2003-36), the Commission concluded that it was appropriate for SaskTel to be subject to the same Q of S regime and interim RAP for competitors as was applicable to the other large ILECs.

6. In Finalization of interim competition-related Quality of Service indicators and standards, Telecom Decision CRTC 2003-72, 30 October 2003, the Commission finalized all interim competition-related Q of S indicators.

7. The interim RAP involved six competition-related Q of S indicators which had been given final approval at the time that Decision 2002-34 was released. Two of these indicators did not relate directly to specific tariffed rates and, consequently, the Commission decided that they would not be used when determining rebates under the interim RAP. Three of the remaining indicators had a minimum performance standard of 90% (i.e., the ILEC was expected to achieve the performance standard 90% of the time). The fourth indicator had a minimum performance standard of 80%. Under the interim RAP, if an ILEC failed to meet the minimum performance standard when providing service to a particular competitor, the ILEC was required to provide the competitor with a rate rebate equal to the total charges (either one time or recurring) paid in respect of the affected service multiplied by the difference between the minimum performance standard and the achieved performance (i.e., rebate = total charges paid x ((80% or 90%) - achieved %)).
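For illustration only, the following is a minimal sketch of the interim rebate arithmetic described above; the charge amount and achieved percentage used in the example are hypothetical.

```python
def interim_rebate(total_charges_paid: float, standard: float, achieved: float) -> float:
    """Interim RAP rebate: total charges paid for the affected service multiplied
    by the shortfall below the minimum performance standard.

    Returns zero when the standard is met or exceeded.
    """
    shortfall = max(standard - achieved, 0.0)
    return total_charges_paid * shortfall


# Hypothetical example: $10,000 in charges, a 90% standard, 82% achieved.
print(round(interim_rebate(10_000.00, 0.90, 0.82), 2))  # 800.0
```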

Proceeding

8. In Public Notice 2003-9, the Commission invited comments on the following issues relating to the establishment of a final RAP for competitors:

  1. those indicators to which the final RAP should apply;
  2. the appropriate formulae to calculate rate adjustments;
  3. the frequency for filing Q of S results;
  4. whether a mechanism should be established to address repeated failures by an ILEC to meet the standard set for a Q of S indicator;
  5. whether a mechanism should be established to address the failure of an ILEC to file results for a Q of S indicator when required;
  6. what form any rate adjustment should take;
  7. what form of dispute resolution mechanism should be established to deal with disputes between an ILEC and a competitor with respect to the correctness of reported Q of S results;
  8. what types of competitors should be entitled to benefit from the final RAP;
  9. the transition from the interim RAP to the final RAP; and
  10. any other matters relating to the establishment of a final RAP for competitors.

9. The Commission made Aliant Telecom, Bell Canada, MTS, SaskTel, TCI, Télébec and TELUS Québec parties to the present proceeding. Other persons were invited to register as interested parties by 19 November 2003.

10. The following persons registered as interested parties to the present proceeding: AGBriggs Consulting Inc., Allstream Corp. (Allstream), Angus Telemanagement Group Inc., Call-Net Communications Inc. (Call-Net), FCI Broadband, Gouvernement du Québec, Industry Canada, LondonConnect Inc., Microcell Solutions Inc. (Microcell), The Corporation of the City of Thunder Bay, TIA Telecommunications Issues and Analysis, and Xit télécom inc.

11. Public Notice 2003-9 set out the following four-stage process:

12. On 29 January 2004, the Commission received evidence from Aliant Telecom, Bell Canada, MTS, SaskTel and Télébec (collectively, the Companies); TCI and TELUS Québec (collectively, TELUS); Allstream, Call-Net, FCI Broadband, and LondonConnect (collectively, the Competitors); Microcell; and Xit télécom inc. on behalf of itself and Télécommunications Xittel inc. (Xit). In addition, Aliant Telecom, Bell Canada, MTS, SaskTel and Télébec filed data individually.

13. On 17 May 2004, the Commission received argument from the Companies, the Competitors, Microcell, TELUS and Xit.

14. On 1 June 2004, the Commission received reply argument from the Companies, the Competitors, Microcell and TELUS.

15. On 9 June 2004, Xit filed a "further reply argument", arguing that it was entitled to address any new information filed at the reply argument stage. In a letter to the Commission dated 15 June 2004, the Companies submitted that Xit's further reply argument dated 9 June 2004 was out of process and should not form part of the record of this proceeding. A similar request was filed by TELUS on 16 June 2004.

16. The Commission has determined that no new evidence was introduced at the reply argument stage by any party. Consequently, Xit's 9 June 2004 submission was out of process and consideration of this submission cannot be justified on the basis of procedural fairness or the rules of natural justice. Accordingly, the Commission has not taken this submission into consideration when making its determinations in this decision.

Issues

17. The Commission has identified five main areas where issues arise in connection with the establishment of a final RAP:

  1. the principles that should underlie the final plan;
  2. the rate rebate mechanism;
  3. the Q of S indicators to be included in the plan;
  4. audits, disputes and non-compliance; and
  5. the transition from the interim plan to the final plan.

Each of these areas, and the issues it entails, is discussed below.

18. The Commission notes that no tariff rate is actually adjusted under the "rate adjustment plan"; instead, rebates are provided when an ILEC fails to meet a Q of S performance standard. Accordingly, the Commission will refer to the plan as the rate rebate plan (RRP) rather than the rate adjustment plan.

II. Principles

19. A number of parties made submissions regarding the principles they believed should underlie the final RRP.

Positions of parties

20. The Companies proposed that the design of the final RRP be guided by the following general principles:

  1. Penalties should only apply where a Q of S indicator measures performance with respect to the provision of services to competitors;
  2. Performance for the Q of S indicator should be within the control of the Companies;
  3. The Commission-approved standard for a Q of S indicator should be set at a performance level which can reasonably be achieved by the Companies under normal operating circumstances;
  4. The Companies should not be penalized more than once for any particular activity associated with an order for a given due date; and
  5. Volumes for the Q of S indicator for the month in question should meet minimum volume threshold levels.

21. In support of these principles, the Companies argued that it would be unfair and unjust if they were subject to rate rebates for below-standard performance of activities over which they had no control or if a standard could not be reasonably achieved under normal operating circumstances. The Companies also argued that if an activity was not performed in accordance with the Commission's standard under one indicator, it would be unfair and unjust to penalize the Companies multiple times for that same activity, simply because it happened to be reported under multiple indicators. The Companies also submitted that in many instances, due to the low volume of orders or trouble reports during a month, the minimum standard would effectively become 100%, rather than 90%. They argued that this was inappropriate.

22. In addition to arguing in support of their proposed principles, the Companies also submitted that the amount of the rate rebates should be commensurate with the degree to which the mandated level of service had been missed. They submitted that the amount of rate rebate paid for a specific Q of S indicator should be reflective of the value of the services being provided to the competitor with respect to that indicator in the affected month.

23. The Competitors submitted that the final RRP should include measures to ensure that service levels would be consistently met by the ILECs. In their view, the rate adjustment payable under the final RRP should be substantial enough to deter the ILECs from missing the standard level for a service indicator in any month and not be treated as just a cost of doing business.

24. The Competitors also submitted that measures should be in place to ensure that, if the service levels were not met, the ILECs would act quickly to bring service levels up to the approved minimum standards. They argued that the rate adjustment payable for consistently missing an indicator over a number of months should be substantial enough that the ILEC would do everything in its power to quickly bring a service level to the approved standard.

25. In the Competitors' view, the final RRP should be transparent and adaptable to evolving competitor services, as well as easy to verify and audit. In particular, a competitor receiving a rebate should be able to readily understand and confirm the adjustment amount received. The Competitors argued that the proposed plan should be easily adaptable to evolving service levels and to the inclusion of new types of services.

26. The Competitors submitted that the magnitude of penalties administered as part of the retail and competitor Q of S plans should be such that the ILECs would not be motivated to meet service standards in one customer segment at the expense of standards in another customer segment. Neither should the rebate mechanism be designed so as to inadvertently encourage the ILECs to focus on services with high rates, at the expense of low tariff rate service elements. In addition, the level of the rebate should be such that the ILECs would not choose to pay rebates instead of adequately provisioning their networks.

27. The Competitors argued that the ultimate measure of success of the final RRP would be if no payments were due to competitors since this would be evidence that the ILECs were providing competitors with service quality at the approved levels.

28. The Competitors submitted that the Companies' proposal to implement a minimum volume threshold would inappropriately allow the ILECs to escape the application of the final RRP for competitors. Consequently, a competitor would be left with no protection from poor service simply because the volume of work in a given month fell below an arbitrary threshold. Furthermore, the proposal would remove any incentive for the ILECs to meet the approved standards in months when they experienced lower volumes.

29. TELUS suggested that the following principles should underpin the final RRP for competitors:

  1. The RRP should minimize the administrative burden on both the Commission and the ILECs. The costs of implementation should not outweigh the benefits;
  2. The RRP should be easy to understand. The methodology should be clear to all stakeholders and there should be confidence that the results reflect performance reality. The basis for calculating the adjustment should be objective and transparent;
  3. The RRP should establish incentives to ensure ILEC compliance with Q of S performance standards for services provided to the ILEC's own customers, as well as services provided by the ILEC to competitors;
  4. The rate rebates should be commensurate with the degree to which the mandated level of service was missed, and with the value of services the customer purchased; and
  5. The RRP should exclude from the calculation those periods when an ILEC is subject to circumstances or events beyond its reasonable control that would make it impossible to satisfy the Commission's mandated service standards.

30. TELUS agreed with the five general principles presented by the Companies and proposed adding the following two new principles:

  1. Penalties should in no case be allowed to grow so large as to dissuade ILECs from investing in facilities to provide essential and near essential services; and
  2. The risk that rate adjustments might fully erode the 15 percent mandated margin for essential and near essential services should be minimized and the risk that they might compromise the return of Phase II incremental costs should be eliminated.

Commission analysis and determination

31. The purpose of the Q of S regime for competitors, including the RRP for competitors, is to ensure that all competitors receive a Q of S from the ILECs of a sufficiently high level to enable the competitors to compete on a level playing field with each other and with the ILECs.

32. Based on this starting point, and in light of the submissions of the parties, the Commission is of the view that the following considerations are important to the design of the final RRP.

33. First, the RRP must focus on Q of S indicators for services provided to competitors. While obvious on its face, this requirement also implies that the RRP must be designed in a sufficiently flexible manner as to easily permit its expansion, or contraction, to reflect the addition or deletion of competitor services, as well as the introduction of new or the modification of existing service quality measures.

34. Second, the RRP must ensure that all competitors are treated fairly and receive an appropriate Q of S. This implies that the RRP should focus on Q of S indicators which are measured on a competitor-by-competitor basis. This will ensure that the service provided to each competitor is subject to the proper scrutiny and is not masked through the aggregation of data. This requirement also suggests that a rebate under the RRP should be directed to the competitor suffering the Q of S failure. In this way, no competitor would receive a windfall from a Q of S failure which did not affect it directly.

35. Third, since the RRP is intended to provide an ILEC with an incentive to provide an appropriate level of service to competitors, it must focus on matters within the control of the ILEC. An incentive mechanism would be irrelevant in a situation where an ILEC is not able to control or respond to events which impair the quality of service provided to a competitor. In such circumstances, any rebates or other adjustments could have no effect on the quality of service provided.

36. Fourth, the RRP must provide a sufficient incentive to ensure that an ILEC meets its Q of S obligations. In particular, the minimum performance standard must be recognized as a minimum, not a target, for performance. Similarly, provisioning the network to meet that minimum performance standard must not be viewed as over-provisioning. The RRP must not be designed so as to encourage an ILEC to view any rebates as simply a cost of doing business.

37. Fifth, the RRP is an incentive mechanism. This implies that if the RRP provides an appropriate incentive to meet each Q of S standard, then, in circumstances where a single performance failure may trigger two or more Q of S indicators, it would be inappropriate to impose multiple rebates. The ILEC should be adequately incented by a single rebate under a single Q of S indicator. As a corollary to this, a competitor should not receive a windfall because of an overlap in the design of the Q of S indicators.

38. Sixth, the RRP is an incentive mechanism, not a penalty mechanism. The focus must be on ensuring that any rate rebates provide an adequate incentive to an ILEC, and that rates remain just and reasonable, while avoiding a punitive effect.

39. Seventh, in order to be effective, the operation of the RRP must be simple and transparent. The RRP should be easy to understand, to administer and to audit.

40. Eighth, the RRP must be effective for all competitors, in all circumstances. The fact that a competitor may be smaller and have lower service volumes should not result in a situation where that competitor does not enjoy the same quality of service as larger competitors. In all cases where a competitor receives below-standard service quality, the ability of the competitor to compete effectively is impaired.

41. In light of these considerations, the Commission is of the view that the following principles should guide the selection of eligible Q of S indicators and the design of the final RRP for competitors:

  1. Q of S indicators to be included in the RRP should measure performance of the provision of services to competitors;
  2. Q of S indicators to be included in the RRP should be measured on a competitor-by-competitor basis;
  3. Q of S indicators to be included in the RRP should be calculated only in respect of an activity that is within the ILEC's control;
  4. Q of S indicators to be included in the RRP should be such that they do not duplicate activities that are already measured by other indicators in the RRP;
  5. Rate rebates under the RRP should provide sufficient incentive to ensure an ILEC meets its Q of S obligations;
  6. The RRP should maintain just and reasonable rates and must not operate as a penalty mechanism;
  7. The RRP should be easy to understand, to administer and to audit; and
  8. The RRP should be effective for all competitors that acquire services from the ILEC for which a Q of S indicator exists for that service, in all circumstances, irrespective of the type of competitor or the volume of service provided to a competitor.

III. The rate rebate mechanism

42. In the preceding section, the Commission identified eight principles which should guide the design of the final RRP for competitors. Four of these principles, numbers 5 through 8, relate directly to the design of the rate rebate mechanism. These four principles provide the foundation for the Commission's determinations in this regard.

Background

43. Six Q of S indicators were included in the interim RRP, with four of these generating rate rebate payments to competitors if Q of S results were below standard. The six indicators were indicators 1.8, 1.9, 1.10, 1.11, 2.7 and 2.8.

No rate rebate mechanism applied to indicator 1.10, as this indicator related to an activity for which no specific rate was paid by a competitor. For indicator 1.11, the Commission had yet to develop an adjustment amount per event.

44. In Decision 2002-34, the Commission adopted three formulae for determining a rate rebate, depending on the nature of the Q of S indicator involved:

  1. an indicator measured a service paid for by a competitor, in which case the rate rebate would be calculated as: (mandated percentage standard - achieved percentage) x (competitor-specific total tariff charges applied for the month for the specific rate element(s) in question);
  2. an indicator measured an activity which would affect a service used by a competitor, in which case the rate rebate formula would be: (mandated percentage standard - achieved percentage) x (competitor-specific total tariff charges applied for the month for the service in question); or
  3. an indicator measured an activity which did not directly relate to or affect a particular competitor service, in which case the rate rebate formula would be: (mandated percentage standard - achieved percentage) x (competitor's specific demand for the month for the activity in question) x (CRTC mandated adjustment amount per event).

Only the first two approaches were used in the interim RRP.
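For illustration only, the following is a minimal sketch of the three formulae described in paragraph 44; the charges, demand and per-event adjustment amount used in the example are hypothetical, and, as noted above, only the first two approaches were used in the interim RRP.

```python
def rebate_charge_based(standard: float, achieved: float, monthly_tariff_charges: float) -> float:
    """Formulae (i) and (ii): the shortfall below the standard multiplied by the
    competitor-specific tariff charges for the month (for the rate element(s) or
    the affected service, as applicable)."""
    return max(standard - achieved, 0.0) * monthly_tariff_charges


def rebate_event_based(standard: float, achieved: float, monthly_demand: int,
                       adjustment_per_event: float) -> float:
    """Formula (iii): the shortfall below the standard multiplied by the
    competitor's monthly demand for the activity and by a CRTC-mandated
    adjustment amount per event."""
    return max(standard - achieved, 0.0) * monthly_demand * adjustment_per_event


# Hypothetical inputs only.
print(round(rebate_charge_based(0.90, 0.85, 20_000.00), 2))  # 1000.0
print(round(rebate_event_based(0.90, 0.85, 400, 5.00), 2))   # 100.0
```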

The basic structure of the rate rebate plan

45. There were two main proposals for the structure of the final RRP. The first was put forward by the Competitors, the second by the Companies and by TELUS.

Positions of parties
The Competitors

46. The Competitors submitted that the interim RRP was overly complicated, as it used multiple formulae, and that it was administratively burdensome to implement and monitor. They argued that it was nearly impossible to audit the Q of S results reported or to confirm the accuracy of any adjustments paid. The Competitors also argued that the interim RRP did not provide adequate incentives for the ILECs to meet Q of S standard levels.

47. The Competitors submitted that experience to date had shown that the interim RRP was ineffective in bringing the ILECs' Q of S levels up to minimum performance standards. The Competitors argued that the quarterly rate adjustment payments made by the ILECs to competitors under the interim RRP had not sufficiently deterred the ILECs from providing poor service to competitors. In the Competitors' view, the ILECs had demonstrated that it was more cost effective to incur minimal payments for delivering substandard service than to deliver service at the approved minimum performance standards. According to the Competitors, the ILECs had an incentive to provide superior levels of service to their own end-customers at the expense of competitors.

48. In support of their argument, the Competitors provided several graphs illustrating the Q of S performance of TCI. The Competitors submitted that, when TCI's Q of S results over the July 2002 to September 2003 period for retail Q of S indicator 1.1A (Provisioning Interval - Urban) were compared with those for the similar competitor Q of S indicator 1.8 (New Unbundled Loop Order Service Intervals Met) for Allstream and Call-Net, TCI consistently met the standard for the retail service, while it never met the same 90% standard for the similar competitor service.

49. The Competitors further submitted that, when comparing TCI's performance for service repair over the same July 2002 to September 2003 period between retail Q of S indicator 2.1A (Out-of-Service Trouble Reports Cleared within 24 hours - Urban) and the similar competitor Q of S indicator 2.7 (Competitor Out-of-Service Trouble Reports Cleared within 24 hours), TCI met the minimum standard for the retail indicator nine times out of 15 (60 percent of the time). For the similar competitor indicator, TCI met the standard only five times out of 15 for Call-Net (33 percent of the time) and three times out of 15 for Allstream (20 percent of the time).

50. The Competitors also submitted that the exceptionally small quarterly rate adjustments paid under the interim RRP were not commensurate with the harm suffered by the Competitors or the deleterious effect inferior Q of S had on competition. The Competitors estimated that TCI paid competitors less than $10,000 per quarter in rate adjustments under the interim RRP. According to the Competitors, these amounts were insignificant compared to the revenues retained by TCI as the end customers were not migrated to the Competitors. In the Competitors' view, it was clear that the negative impact on competitors outweighed the financial cost of any payments levied under the interim RRP.

51. In light of these problems with the interim RRP, the Competitors proposed a final RRP for competitors which would be similar in concept to the retail rate adjustment plan and which, in the Competitors' view, would provide an appropriate financial incentive to ensure that the ILECs were suitably motivated to provide consistently high levels of service to competitors.

52. The Competitors proposed a final RRP that would function as follows. An ILEC would make a monthly rate rebate payment to a competitor if the ILEC failed to meet the minimum performance standard for one or more of the Q of S indicators included in the RRP. The amount of the rate rebate per indicator would be the indicator's proportionate share of the total rate adjustment value (TRAV) for the month, with each indicator for which there had been activity that month being given equal weight. The TRAV would be a percentage of the ILEC's monthly revenue from the competitor for those services covered by the RRP. The total payable to a competitor would be the sum of the rebates for each indicator below standard.

53. The Competitors proposed that the percentage of monthly revenues (both recurring and service charges) used to determine the TRAV should be set at a minimum of 5 percent. The Competitors submitted that a level of 5 percent or more should provide the ILECs with an appropriate inducement to provide service to competitors at the minimum standard levels approved by the Commission.

54. The Competitors argued that their proposal would address the shortcomings experienced with the interim RRP and would be more straightforward, easier to calculate and simpler to verify. In their view, the plan would be flexible and could accommodate new indicators as required for additional competitor services. Finally, since the TRAV for the final RRP for Competitors would be based on the same percentage of revenues as applicable to the retail Q of S adjustment plan, there would be less incentive for an ILEC to focus attention on meeting the Q of S requirements associated with one plan at the expense of the other plan's indicators.

55. The Competitors submitted that their proposed plan was simple and would be easy to implement, verify and audit, and that competitors would be able to readily understand and confirm the adjustment amounts received. The Competitors further submitted that their proposed plan would be as easily adaptable to evolving service levels as it would be to the inclusion of new types of services, in that additional metrics and services could be readily incorporated into the proposed plan as required. The Competitors submitted that their proposed RRP would ensure that the magnitude of rate adjustments would be such that ILECs would not be motivated to meet service standards in the retail customer segment at the expense of standards in the wholesale customer segment. They also argued that rate rebates under their proposed plan would be proportional to the harm caused to competitors by an ILEC missing indicator standards.

Microcell

56. Microcell submitted that the Competitors had put forward a workable proposal to remedy the deficiencies in the interim regime. According to Microcell, the Competitors' proposal would avert the shortcomings associated with the interim RRP, and would be both more straightforward and simpler to verify. Moreover, in Microcell's view, the Competitors' proposal would ensure that the ILECs would be suitably motivated to provide consistently high levels of service to competitors for all indicators.

57. Microcell submitted that, in particular, the Competitors' proposal would simplify the calculation effort, as it would not require as many inputs or calculation steps as the interim regime. It would also make it easier for competitors to validate the adjustment amounts received. Microcell also argued that the Competitors' proposal would permit competitors to receive compensatory payments in regard to services for which no revenues were attached and that it would be consistent with the RRP already in place for retail Q of S.

The Companies

58. The Companies argued that the approach established by the Commission in the interim RRP was working well and remained appropriate. However, given that the interim RRP only covered four indicators, the Companies submitted that it was necessary to determine which formulae would apply to the remaining Q of S indicators.

59. The Companies submitted that the rate adjustment formulae should be applied as follows: Formula (i) would apply to indicators 1.12, 1.13 and 1.18; Formula (ii) would apply to indicators 1.11, 2.7 and 2.9; and, indicators 1.6 and 2.6 would be modified to exclude any overlap with other existing indicators so that double counting would be eliminated and a new formula to calculate the rate adjustment for indicators 1.6 and 2.6 would be created.

60. The Companies argued that under their proposed final RRP, the magnitude of any rate rebates would not be punitive. The Companies submitted that, unlike the Competitors' proposed RRP, their RRP reflected the proportional nature of the rate rebates and total potential rate rebates would be based on the value of services provided to a competitor with respect to that indicator in the affected month.

61. The Companies argued that the amount of the TRAV proposed by the Competitors was punitive and lacked proportionality. The Companies also argued that the Competitors' proposal was biased towards larger competitors.

TELUS

62. TELUS submitted that the formulae set out in the interim RRP generally appeared to meet the policy objectives of the Q of S regime and that the rate rebates under the plan were commensurate with both the degree to which the mandated level of service was missed, and with the value of services the customer purchased.

63. TELUS submitted that Q of S had improved in comparison with early results when the plan came into effect. In TELUS's view, there was no need to completely revamp the interim RRP as proposed by the Competitors.

64. With regard to the Competitors' proposal, TELUS argued that it lacked proportionality and would be unfair. TELUS also argued that the Competitors' proposal would create an incentive for an ILEC to favour large competitors, since the potential rebates would be greater for established competitors who took many services from an ILEC. In TELUS's view, the final RRP should promote meeting Q of S standards equally for all competitors.

65. TELUS submitted that if the Commission were to accept the Competitors' proposed RRP, the RRP should be capped such that for any particular service, the rebate could not be in excess of the monthly tariff charges applicable to that service. TELUS also submitted that, in situations where the competitor's volume of orders was low, the competitor had a perverse incentive to ensure that the due date and reporting requirements were not met. According to TELUS, the competitor could then recover in excess of the value of the service, although by the time the rate rebate would be made, the service would probably already be delivered.

Commission analysis and determination

66. The Companies and TELUS argued in favour of a final RRP which would resemble the interim RRP, with minor variations. In the Commission's view, however, the competitor Q of S results for the ILECs since the initiation of the interim RRP indicate that that plan has not provided an adequate incentive for the ILECs to meet the minimum performance standards. Table 1, below, identifies the competition-related Q of S performance and amounts paid for Bell Canada and TCI in the period July 2002 to December 2004.

Table 1
Competition-related Q of S results for Bell Canada and TCI (July 2002 to December 2004)*
(results aggregated across competitors)

ILEC           # of missed indicators / # of indicators with results    Total rate rebates paid
Bell Canada    80 / 534                                                 $29,612
TCI            208 / 441                                                $75,864

Note: Affected competitors include Allstream, Call-Net, FCI Broadband, LondonConnect, TELUS Québec and Bell West.

* These results are for the Q of S indicators included in the interim RRP, which were 1.8, 1.9, 2.7 and 2.8.

67. The Commission reiterates that the Q of S standards are not performance targets; they are the minimum acceptable performance levels. The RRP is not intended to provide the ILECs with a choice between paying rate rebates or meeting the Q of S standards. The RRP is intended to ensure that the ILECs have an adequate incentive to achieve these standards and that the rates paid by competitors are just and reasonable in light of the service quality provided.

68. In light of the failure of the interim RRP to ensure satisfactory Q of S results, the Commission has concluded that the interim RRP mechanism must be modified.

69. The Commission is of the view that an RRP structured as suggested by the Competitors would likely be more effective in ensuring compliance with the Q of S standards than the interim RRP. The proposed mechanism would be simple to administer by the ILEC and easy to verify by a competitor. The equal weighting of Q of S indicators would ensure that each activity subject to service quality scrutiny would be given adequate consideration by the ILEC. In particular, an activity which is not directly linked to a particular tariff rate or readily identifiable service revenues would nonetheless be subject to a potential rate rebate directly proportional to the revenues derived from providing service to the competitor.

70. The Commission is also of the view that, based on the evidence in this proceeding, the amount of funds at risk under the final RRP must be significantly greater than the amount under the interim RRP in order to ensure that the ILECs have an adequate incentive to meet the Q of S standards. The Commission considers that a total maximum rebate of 5 percent of the revenues derived from a competitor in respect of competitor services would be appropriate. This level of potential rebate should provide an adequate incentive to the ILECs, while at the same time ensuring that rates are just and reasonable. Moreover, contrary to the argument of the Companies, such a rebate amount could not reasonably be viewed as punitive. The amount of the rebate would be based on revenues derived specifically from services provided to the competitor in question and the maximum rebate would only be realized if an ILEC failed to satisfy all competitor Q of S indicators with activity in a particular month. The Commission does not consider a 5 percent rebate in light of comprehensive Q of S failure to be either unreasonable or punitive.

71. The Commission notes that the Companies and TELUS argued against the idea that failure to satisfy a Q of S standard would trigger the full rate rebate for that indicator. In their view, such an approach would lack proportionality, as compared to the approach under the retail rate adjustment plan where the amount payable increases with the severity of the Q of S failure.

72. The Commission considers it important to emphasize that the ILECs operate under different incentives when they are providing service to retail customers as compared to when they are providing service to competitors. Many of the services provided to competitors only exist because the Commission has mandated their provision. The ILECs have little incentive to provide these services in a manner which facilitates the successful operation of their competitors' businesses.

73. In the Commission's view, given the incentives facing the ILECs in the provision of services to competitors, it would not be appropriate to establish a scaled rebate mechanism. Once again, it is important to emphasize that the Q of S standards are the minimum performance standard deemed acceptable. Failure to meet that standard must, therefore, trigger a rate rebate which will provide a sufficient incentive to ensure that any problems are immediately corrected and satisfactory service is provided thereafter. Requiring the full rebate amount for an indicator to be paid if there is a Q of S failure for that indicator should provide the appropriate incentive.

74. In light of the above, the Commission determines that the final RRP for competitors will operate as follows:

  1. the total potential rebate amount (TPRA) for a month is equal to 5 percent of the amounts billed to a competitor for services (existing plus incremental ordered during the month) covered by a Q of S indicator with activity in that month;
  2. each Q of S indicator with activity in that month is given equal weight and the potential rebate amount (PRA) for an indicator is the TPRA divided by the number of active indicators for the month; and
  3. the total rebate payable to a competitor for a month is the PRA multiplied by the number of Q of S indicators for which the ILEC failed to achieve the minimum performance standard that month.

A sample calculation of a monthly rebate is set out in Appendix A.
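By way of illustration only (the authoritative sample calculation appears in Appendix A), the following is a minimal sketch of the monthly calculation described in paragraph 74; the billing amount and indicator counts used in the example are hypothetical.

```python
def monthly_rebate(billed_for_covered_services: float,
                   active_indicators: int,
                   missed_indicators: int,
                   tpra_rate: float = 0.05) -> float:
    """Final RRP: the TPRA is 5 percent of the month's billings for covered
    services with activity; the PRA per indicator is the TPRA divided by the
    number of active indicators; the rebate payable is the PRA multiplied by
    the number of indicators for which the minimum standard was missed."""
    if active_indicators == 0:
        return 0.0
    tpra = tpra_rate * billed_for_covered_services
    pra = tpra / active_indicators
    return pra * missed_indicators


# Hypothetical month: $200,000 billed, 10 active indicators, 3 missed.
# TPRA = $10,000; PRA = $1,000 per indicator; rebate payable = $3,000.
print(monthly_rebate(200_000.00, 10, 3))  # 3000.0
```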

Frequency of reporting and rebates

Positions of parties

75. The Competitors submitted that the timeframe under the interim RRP between any failure to meet the Q of S indicator standard and the payment of the rate rebate was too long. The Competitors submitted that waiting 45 days following the end of each quarter made it difficult for competitors to verify the reported results. The Competitors argued that Q of S reports should be filed monthly, within 15 days after the previous month end and that any rate rebate should be issued within the same month as the report.

76. In support of their proposed timeframe, the Competitors noted that Bell Canada's Carrier Services Group (CSG) already provided similar reports, such as a reject report, top ten problems, performance results and competitive local exchange carrier (CLEC) card report, to CLEC CSGs on a monthly basis. The Competitors argued that this demonstrated that reporting Q of S results on a monthly basis would not be overly burdensome. The Competitors also argued that the ILECs should be required to provide, in electronic format, supporting details for the underlying formulae, as well as the values used in the calculation of the rate rebate amount. The Competitors submitted that this was not always done even though this information was essential for competitors to verify and confirm the rate rebates received.

77. The Companies were of the view that it would be appropriate to continue reporting competitor Q of S results on a quarterly basis. The Companies submitted that increasing the frequency of reporting would be inconsistent with the Commission's intentions to impose a minimum regulatory burden and to balance the interests of the three main stakeholders in the telecommunications market. The Companies also submitted that such a requirement would be contrary to section 7(f) of the Telecommunications Act (the Act), which sets out, as an objective of Canadian telecommunications policy, that regulation, where required, should be efficient and effective.

78. The Companies submitted that if the Commission were concerned that, with quarterly reporting, it might not be apprised soon enough of persistent below-standard Q of S results, an exception reporting process could be implemented, similar to that required for retail Q of S.

79. TELUS submitted that there was no demonstrated need to change the existing filing requirements. TELUS agreed with the Companies that increasing the filing requirements would be contrary to the Commission's principles of minimizing the administrative burden and costs of operating the RRP and would also be contrary to section 7(f) of the Act.

80. TELUS submitted that the administrative burden and costs to satisfy the Competitors' requested increased reporting would be considerable. TELUS also submitted that, should there be a reason to report Q of S results more frequently, the Commission had the power to tighten reporting on a case-by-case basis for an interim period.

81. Xit argued that waiting 45 days after a quarter to seek a remedy for poor Q of S was detrimental to competition. Xit submitted that a more appropriate period for filing Q of S results would be 30 days.

Commission analysis and determination

82. In Public Notice 2003-9, the Commission stated its preliminary view that for an RRP to be efficient, the time span between a failure to meet a Q of S indicator standard and the subsequent rate rebate should be as brief as reasonably possible. At the same time, the Commission recognizes the importance of ensuring that the regulatory burden on all parties is as light as reasonably possible.

83. In the Commission's view, the Competitors' concerns with the quarterly timeframe for reporting appear to be focused primarily on issues related to the availability of information. In particular, there would appear to be no financial basis for requiring monthly reporting. There was also no evidence to suggest that monthly reporting would provide an additional incentive for an ILEC to comply with the Q of S standards.

84. The Commission considers it reasonable to expect that both the ILEC serving a competitor and the competitor itself, would keep track of Q of S failures pertaining to that competitor. The Commission also considers it reasonable that an ILEC should provide the affected competitor with supporting details for its Q of S results and its rate rebate calculation. These information obligations would appear to address the Competitors' primary concern with the quarterly timeframe of the interim RRP.

85. Under the interim RRP, the ILECs are required to file their Q of S reports and make rate rebate payments on a quarterly basis, within 45 days of the end of a quarter. Based on the evidence in this proceeding, the Commission considers that the additional costs and administrative burden of monthly filings would not be justified on the basis of any expected gains in the effectiveness of the competitor Q of S regime. Accordingly, the Commission does not consider it appropriate to require monthly reporting. However, the Commission is of the view that the time period between the end of a reporting period and the payment of any rebates should be shortened. Given that services provided to each competitor will be tracked on an ongoing basis, the Commission considers it reasonable to require the ILECs to file their Q of S reports and make rate rebate payments within 30 days of the end of each quarter.

86. In light of the above, the Commission determines that:

  1. The ILECs shall continue to issue competition-related Q of S results on a quarterly basis;
  2. The ILECs must file those results with the Commission, providing a copy of competitor-specific results to the relevant competitor, within 30 days of the last day of the applicable quarter and make any rate rebate payments to competitors within the same 30-day time period; and
  3. The ILECs are required to file with the Commission, and provide to the relevant competitor, all supporting details associated with the determination of the Q of S results and the calculation of the rate rebate amounts.

Treatment of repeated failures

Positions of parties

87. The Competitors submitted that the failure of the ILECs to satisfy the Q of S standards on a recurring basis indicated that steps had to be taken to address repeated failures. The Competitors submitted that adding a repeat factor for each indicator in the final RRP would significantly help offset any economic incentive the ILECs may have in providing substandard service delivery on a recurring basis.

88. The Competitors proposed that the rebate for an indicator would be multiplied by that indicator's repeat factor. The repeat factor would initially be set at one and would be increased by one every month in which an ILEC reported a below standard Q of S result. The repeat factor would be reset to one once an ILEC delivered service at the approved standard for at least three consecutive months.
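For illustration only, the following is a minimal sketch of one reading of the repeat factor proposed by the Competitors (a mechanism the Commission ultimately declines to adopt below); the sequence of monthly results in the example is hypothetical, and details such as when the increment applies relative to the rebate for the month are not specified in the proposal as summarized above.

```python
def update_repeat_factor(repeat_factor, met_standard, consecutive_months_met):
    """One monthly update of the proposed repeat factor for a single indicator.

    The factor starts at one, rises by one for each month with a below-standard
    result, and resets to one once the standard has been met for at least three
    consecutive months. Returns the updated factor and streak of months met.
    """
    if met_standard:
        consecutive_months_met += 1
        if consecutive_months_met >= 3:
            repeat_factor = 1
    else:
        consecutive_months_met = 0
        repeat_factor += 1
    return repeat_factor, consecutive_months_met


# Hypothetical seven-month sequence of results for one indicator.
factor, streak = 1, 0
for met in [False, False, True, True, True, False, True]:
    factor, streak = update_repeat_factor(factor, met, streak)
    print(factor, end=" ")  # 2 3 3 3 1 2 2
```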

89. In the Competitors' view, applying a repeat factor would not have a punitive effect, but would be a means of ensuring just and reasonable rates as required by section 27 of the Act. The Competitors argued that the ILECs had complete control over their service performance, and could ensure that the repeat factor would never be applied. The Competitors submitted that the ILECs' customer-specific arrangements (CSAs) often contained escalating penalties when a service level objective was missed on a repeated basis.

90. The Companies submitted that if the Commission was concerned with an ILEC's persistent below-standard Q of S, it could require the implementation of an exception reporting process similar to that required for the retail Q of S regime. The Companies submitted that, in their view, such an exception reporting requirement for competitor Q of S indicators should alleviate any concerns the Commission might have regarding persistent below-standard service quality results.

91. The Companies submitted that the Competitors had not submitted any evidence to show that a repeat factor would be appropriate, based on past ILEC performance. The Companies were of the view that, even if a repeat factor were considered to be necessary, two features of the repeat factor proposed by the Competitors made it unreasonable and punitive.

92. First, the Companies submitted that the Competitors had not proposed a cap to the repeat factor, unlike other mechanisms that incorporated escalating penalties, such as those mandated in U.S. jurisdictions or in the Companies' own CSAs, where escalating penalties had been negotiated. Instead, under the Competitors' proposal, rate rebates would increase at an exponential rate ad infinitum.

93. Second, the Companies submitted that the proposed application of the repeat factor would not be symmetrical. According to the Companies, while the Competitors proposed that a repeat factor be applied as soon as the performance standard for an indicator was not met in two consecutive months, they also proposed that the repeat factor continue to apply until the standard had been met for three consecutive months. The Companies submitted that in the United States and in the CSAs noted above, penalties returned to their original level as soon as the standard had been met in one month.

94. The Companies submitted that the Competitors' proposal was punitive and contrary to the Act. According to the Companies, in light of the fact that the level of rate rebates would in no way be calculated with reference to the service rates for the month in question, the Commission did not have the legal authority to impose rate rebates of the nature proposed by the Competitors. The Companies submitted that the magnitude of the rate rebates proposed by the Competitors was prima facie unreasonable and punitive.

95. Microcell supported the inclusion of a repeat factor on the grounds that it would provide a strong incentive for the ILECs to more rapidly correct Q of S deficiencies.

96. TELUS opposed the Competitors' proposed repeat mechanism. TELUS submitted that in practical terms, this would result in an ILEC being penalized twice or more for the same failure. TELUS argued that the cumulative penalty could grow to an amount out of proportion with the actual harm caused. TELUS submitted that the Commission, through the application of its powers conferred in the Act, had considerable discretion to make inquiries and determinations with respect to repeated failures and could use this power on a case-by-case basis to assure itself that the necessary system changes were underway to fix any perceived Q of S indicator problems.

Commission analysis and determination

97. The Commission notes that in Decision 2002-34, it dismissed proposals to apply repeat factors to remedy the problem of repeated substandard Q of S results on the grounds that such a mechanism might become punitive. At that time, the Commission was of the view that the incentives established by the interim RRP would be sufficient to prevent repeated failures.

98. In the Commission's view, the competitor Q of S evidence indicates that the interim RRP did not provide the necessary incentives to prevent repeated failures. Since the initiation of the interim RRP, Bell Canada and TCI have repeatedly missed competitor Q of S standards for some indicators. For example, TCI missed indicator 1.8 for 21 consecutive months and 16 consecutive months in the provisioning of unbundled loops to Allstream and to Call-Net, respectively.

99. The Commission is of the view that the changes in the structure of the RRP set out above should significantly enhance the incentives for the ILECs to meet the Q of S standards. The Commission is concerned that the mechanism proposed by the Competitors to address repeated failures could become punitive in effect. Accordingly, the Commission does not consider it appropriate, at this time, to incorporate in the RRP a mechanism to address repeated Q of S failures.

Entitlement

Positions of parties

100. The Companies submitted that a competitor's eligibility for participation in the final RRP for competitors should differ depending upon whether the indicator for the service performance which was being measured was currently reported on a competitor-specific basis or on an aggregate company-wide basis. For competitor-specific indicators, the Companies submitted that, consistent with eligibility criteria included in each of the ILECs' tariffs, only CLECs and digital subscriber line (DSL) service providers should be entitled to rate adjustments. For indicators reported on a company-wide basis, the Companies proposed that all competitors for whom activity was reported under the indicators in question would be eligible for rate rebates.

101. TELUS submitted that only CLECs operating in an ILEC's territory should be entitled to receive the benefit of the competitor RRP.

102. The Competitors and Xit submitted that there should not be any restrictions on the type of competitor eligible to receive a rebate under the final RRP. In their view, any competitor obtaining services from an ILEC for which a competitor Q of S rate rebate would be applicable should be entitled to receive a rate rebate.

Commission analysis and determination

103. In the Commission's view, no party has identified a valid basis or any compelling evidence for discriminating between competitors so as to entitle one competitor to rate rebates under the RRP, while denying another competitor a similar right. The Commission considers that all customers of the ILEC should be eligible to receive a rate rebate should the ILEC not meet the approved Q of S standard. Accordingly, the Commission determines that any competitor that obtains facilities or services from an ILEC that are measured by a Q of S indicator included in the final RRP should be entitled to receive rate rebates if the Q of S results are below standard.

Form of rate rebate

Positions of parties

104. The Companies submitted that, given that successful processes had already been established for the issuance of rate adjustments, it should be left to the discretion of the individual ILEC whether competitors should be rebated by credit or by cheque. The Companies noted that, under the interim RRP, rate adjustments were made by cheque by Bell Canada and MTS, and by credits to competitor accounts by Aliant Telecom and SaskTel. The Companies submitted, however, that since proposed indicators 1.6 and 2.6 were measured company-wide, rate adjustments for these indicators should be made in the form of a credit to the competitor's account.

105. TELUS submitted that there was no need for the Commission to regulate the method that a rate rebate must take. TELUS was of the view that this should be negotiated between affected parties.

106. The Competitors submitted that any rate adjustment should take the form of a credit on a competitor's bill and should appear on the first competitor bill following the issuance of the Q of S report. The Competitors added that interest should be assessed on any payments made after the approved scheduled payment date.

107. Xit submitted that payments should be made by cheque to a competitor's account within 30 days of non-compliance.

Commission analysis and determination

108. The Commission notes that the rate rebate payment process under the interim RRP has not generated any complaints. In the Commission's view, making rate rebate payment methods subject to negotiation between the parties should permit the most suitable approach to be adopted for each competitor and ILEC. The Commission emphasizes, however, that such an approach would not mean that the form of payment would be at the discretion of the ILEC, as suggested by the Companies. In addition, as set out above, any late rate rebate payment from an ILEC should incur interest at the interest rate charged by the ILEC for late payment of ILEC bills.

109. In light of the above, the Commission determines that the rate rebate payment method made under the final RRP for competitors should be negotiated between the ILEC and the competitor and that, in instances of late payment by the ILEC, the outstanding payment shall incur interest accruing from the rate rebate payment due date at the interest rate charged by the ILEC for late payment of ILEC bills.

Activities beyond the reasonable control of an ILEC

Positions of parties

110. The Companies argued that they should not be penalized under the final RRP for any substandard performance which was directly and causally related to unusual activity with a CLEC, in circumstances where (i) the unusual activity was not forecast by the CLEC, (ii) the unusual activity could not reasonably be foreseen by the ILEC, and (iii) where the CLEC had not provided to the ILEC sufficient advance notice, complete with supporting quantitative information, of such unusual activity.

111. The Companies also proposed that a force majeure clause should be incorporated in the final RRP for competitors, which would state that:

No penalty shall apply in a month where failure to meet the standard is caused, in that month, by fire, strikes, default or failure of other carriers, floods, epidemics, war, civil commotions, acts of God, acts of public authorities, material change in circumstances, or other events beyond the reasonable control of the Company which could not reasonably be foreseen or provided against.

112. TELUS submitted that the final RRP should exclude from the calculation those periods when an ILEC was subject to circumstances or events beyond its reasonable control that would make it impossible to satisfy the Commission's mandated service standards. TELUS also submitted that the acts of other parties, such as other connecting carriers, unions, or public authorities could make it difficult to continue the normal operations of the ILEC, and that all of these actions were both unpredictable and beyond the reasonable control of the company.

113. The Competitors argued that circumstances related to acts of God, epidemics, war, and civil commotions would be beyond the reasonable control of an ILEC and would generally be felt throughout the industry. The Competitors submitted that in such cases, the ILECs should be permitted to exclude from the calculation of rate adjustments those results directly related to such clearly identified events. According to the Competitors, however, the wording of the Companies' proposed force majeure clause was too broad.

114. In the Competitors' view, if events other than acts of God, epidemics, war, and civil commotions should arise, thus affecting Q of S, these should be dealt with on a case-by-case basis. They submitted that the ILECs could apply to the Commission to exclude from the RRP certain events or occurrences, which directly impacted Q of S and were beyond an ILEC's reasonable control. The Competitors submitted that such applications should clearly describe the event in question and indicate how the ILEC's Q of S results for specific indicators would be impacted, and for how long.

Commission analysis and determination

115. The Commission determined, in Section II above, that one of the guiding principles for the design of the final RRP is that Q of S indicators included in the RRP should measure only activities that are within the ILEC's control. Accordingly, the Commission is of the view that there should be a mechanism for considering possible exclusions from Q of S results where circumstances beyond the control of an ILEC may have caused the ILEC to fail to meet a performance standard.

116. The Commission notes that the Companies argued that the final RRP should include a force majeure clause to address the issue of exclusions. The Commission does not consider it appropriate to include such a clause. In the Commission's view, the types of circumstances at issue are, by their very nature, unpredictable and unique and, therefore, are best dealt with on a case-by-case basis.

117. In light of the above, the Commission determines that if an ILEC believes that a performance failure for a Q of S indicator is attributable to circumstances beyond the control of the ILEC, the ILEC may apply to the Commission for a determination that the relevant failure should be excluded from the ILEC's Q of S results. The ILEC must apply for such relief within 21 days of the adverse event and must serve a copy of its application on any affected competitors at the same time as the application is filed with the Commission. In its application, the ILEC must clearly identify the adverse event in question, the effects of the event on specific Q of S indicators and the proposed adjustments to those Q of S results. The competitors will have 10 days to file comments on the ILEC's application with the Commission, serving a copy on the ILEC. The ILEC may file reply comments with the Commission, serving copies on any competitors filing comments, within a further seven-day period. Where a document is to be filed or served by a specific date, the document must be actually received, not merely sent, by that date.
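
For convenience, the filing sequence set out above can be expressed as a short date calculation. The sketch below is illustrative only: it assumes calendar days and that the application is filed on the last permitted day, whereas in practice the comment and reply periods run from the dates on which the application and comments are actually filed.

from datetime import date, timedelta

def exclusion_deadlines(adverse_event: date) -> dict:
    # Illustrative only, assuming calendar days: the ILEC applies within
    # 21 days of the adverse event, competitors comment within 10 days of
    # the application, and the ILEC replies within a further 7 days.
    application_due = adverse_event + timedelta(days=21)
    comments_due = application_due + timedelta(days=10)
    reply_due = comments_due + timedelta(days=7)
    return {"application": application_due, "comments": comments_due, "reply": reply_due}

# Hypothetical adverse event date, for illustration only.
print(exclusion_deadlines(date(2005, 9, 1)))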

118. The Commission will make a determination on an exclusion application as expeditiously as possible. In the event that a rebate calculation is required under the RRP prior to the Commission rendering its determination on an exclusion application, the rebate shall be calculated on the basis that the performance failure in question is to be included in the Q of S results.

IV. The Q of S indicators

119. In Section II above, principles 1 to 4 were established with regard to the inclusion and treatment of Q of S indicators in the RRP.

120. In light of these principles and the arguments raised in the submissions of parties, issues related to the Q of S indicators for the final RRP have been grouped and discussed under the following headings:

Q of S indicators reported on a company-wide basis

121. The following competition-related Q of S indicators are measured and reported on a company-wide basis:

Positions of parties

122. The Companies submitted that indicators 1.6 and 2.6 could be included in the final RRP for competitors if their definitions were revised to ensure that the information tracked by these indicators was not also captured by other indicators that were included in the RRP. In this regard, the Companies noted that indicator 1.6 included activity that was also reported under indicators 1.8, 1.9 or 1.12.

123. The Companies argued that indicator 1.7 had significant retail aspects to it and should not be included in the RRP for competitors. They noted that PIC activation was requested by the long-distance end-customer. The Companies were of the view that should an adjustment be necessary, the end-customer should be the one to benefit from any rate rebate due to poor service in PIC activation and not the APLDs.

124. The Competitors submitted that indicators 1.6, 1.7 and 2.6 should not be included in the final RRP for competitors on the grounds that these indicators were reported on a company-wide basis, not on a per-competitor basis. They also noted that activities measured under indicators 1.6 and 2.6 overlapped with activities measured under other Q of S indicators. The Competitors argued, however, that while these three indicators should not be included in the final RRP, the ILECs should still be required to report these indicator results to the Commission.

125. Microcell submitted that while indicator 1.7 should not be included in the final RRP for competitors, it should continue to be reported by the ILECs and monitored by the Commission.

126. TELUS argued that indicators 1.6, 1.7 and 2.6 should not be included in the final RRP on the grounds that they were not competitor-specific. TELUS submitted that modifying these indicators to be measured on a competitor-specific basis would require significant effort and that the costs would outweigh any benefits. TELUS also submitted that indicator 1.7 was a retail measure and should be retained in the retail Q of S regime.

Commission analysis and determination

127. The Commission notes that the three Q of S indicators at issue measure activities on a company-wide basis, not on a competitor-specific basis. As such, they do not fall within either the structure of the RRP approved above or the Q of S principles established in Section II.

128. The Commission further notes that indicators 1.6 and 2.6 measure activities that are also tracked by other indicators. While it might be possible to re-define indicators 1.6 and 2.6 to eliminate any duplication with other indicators, they would still be measured on an ILEC-wide basis and therefore would not adhere to the principles for the competitor RRP. The Commission does not consider it appropriate to totally redefine these indicators to both eliminate duplication and make them competitor-specific. Such a redefinition would completely change the character of the indicators and render them ineffective for monitoring the Q of S concerns which they were originally designed to address.

129. With respect to indicator 1.7, all parties who commented on this issue agreed that this indicator should not be included in the RRP for competitors. Given the retail character of indicator 1.7 and the fact that it is measured on a company-wide basis, the Commission is of the same view.

130. The Commission considers, however, that the ILECs should continue to track and report results for these three indicators. The Commission notes that this would allow it to monitor ILEC performance on these particular indicators and initiate remedial action if required. With respect to indicators 1.6 and 2.6, it would also allow comparison between the Q of S results for these competition-related indicators and retail Q of S indicators. This would permit the Commission to monitor an ILEC's performance in dealing with competitors as compared to the ILEC's treatment of its own end-customers.

131. In light of the above, the Commission concludes that:

Q of S indicators reported on a competitor-by-competitor basis

132. The Commission has approved 16 competition-related Q of S indicators which are measured on a competitor-by-competitor basis. These indicators are:

Positions of parties
The Competitors

133. The Competitors proposed that all indicators measuring activity on a competitor-by-competitor basis, with the exception of indicator 1.17, be included in the final RRP. The Competitors submitted that the definition of indicator 1.12 should be modified to ensure that it would capture local service requests (LSRs) with due dates longer than the approved standard due dates.

134. The Competitors also proposed that the definitions for the indicators that would be included be broadened to apply to additional services that competitors order from ILECs. In particular, the Competitors suggested that the definitions be expanded to include services such as digital network access (DNA)/CDNA (including Type C loops), asymmetric digital subscriber line (ADSL) access, Ethernet access, Megalink Primary Rate Interface (PRI), Business Line/Centrex and other services. The Competitors argued that under this approach, additional services could easily be added into the RRP as it became necessary. In order to take into account the more stringent service standards required for digital services such as DNA/CDNA, Ethernet and Megalink, the Competitors also proposed that the standard of indicator 2.7 be increased from 80% to 90%.

135. The Competitors further proposed that if the scope of indicators was broadened, the ILECs should be required to report Q of S indicator results at the service level (i.e., by loop, DNA/CDNA, ADSL access, Ethernet access, Megalink PRI, Business Line/Centrex, etc.) even though, for purposes of the RRP, the results would be aggregated across all the applicable services included in the indicator.

The Companies

136. The Companies proposed that indicators 1.11, 1.12, 1.13, 1.18, 2.7 and 2.9 be included in the RRP. The Companies argued that including the activities measured by indicators 1.8, 1.9, 1.10, 1.14, 2.8 and 2.8A would result in double counting since these activities were already measured by indicator 1.12.

137. The Companies submitted that, contrary to the Competitors' concerns, indicator 1.12 captured LSRs with due dates that were longer than the standard service intervals. The Companies nevertheless proposed that the definition for indicator 1.12 be modified to make it clear that the indicator measured whether a due date that had been confirmed with a competitor had, in fact, been met.

138. The Companies were of the view that indicators 1.10A, 1.11A, 1.14 and 2.7A should not be included in the final RRP for competitors on the grounds that the minimum performance standards could not reasonably be achieved under normal operating circumstances. The Companies questioned the Commission's legal authority to impose a penalty for failure to meet a standard that was so high as to, in effect, eliminate a due diligence defence for the Companies.

139. The Companies submitted that if the Commission were to include indicators 1.10A, 1.11A, and 2.7A in the final RRP for competitors, a more appropriate service standard for these indicators would be 90% for indicators 1.10A and 1.11A, and 80% for indicator 2.7A, instead of the existing standard of 100%. The Companies suggested that an appropriate standard for indicator 1.14 would be 1.5%, although the Companies submitted that because this activity was also captured by indicator 1.12, it should not be included in the final RRP as it would result in double counting of an activity in two different indicators.

140. The Companies submitted that indicator 1.17 should not be included in the final RRP for competitors for two reasons. First, the Companies noted that indicator 1.17 measured the CLECs' ability to submit complete information in LSRs and not the ability of the Companies to complete a specific task. The Companies submitted that they therefore did not have control over the Q of S reported by this indicator. Second, the Companies argued that the 5% objective fixed as a standard for this indicator was unreasonable given that from January to September 2003, for three competitors, this indicator varied from 8% to 34% with a non-weighted average of 18%.

141. With regard to the Competitors' proposal to broaden the definitions of the Q of S indicators to include additional services that competitors order from ILECs, the Companies argued that most of the services that the Competitors proposed be included in the existing Q of S indicators were offered by the Companies on a retail basis. The Companies submitted that competitors accessed these services under the same terms and conditions as other retail customers. The Companies argued that there was no justification for the Commission to mandate preferential service intervals for non-competitor services when used by CLECs. The Companies submitted therefore that, with the exception of 1.11 and 1.11A, the Commission should not accede to the Competitors' request to broaden the definitions of these indicators.

142. The Companies did not object to expanding the scope of facilities or services measured under indicators 1.11 and 1.11A to include the measurement of activity related to other interconnection trunks and not just Bill-and-Keep interconnection trunks.

143. The Companies also submitted that the Competitors' request to increase the standard for indicator 2.7 from 80% to 90% should be rejected on the grounds that the Competitors had not provided any sound policy reasons why the Commission-approved standard should be increased.

TELUS

144. TELUS submitted that indicator 1.17 should not be included in the RRP as it measured activity that was beyond the control of the ILEC.

145. TELUS argued that the 100% minimum performance standard for indicators 1.10A, 1.11A and 2.7A was impractical, unworkable and could not be achieved on a long-term, sustainable basis. TELUS also submitted that the 99.75% performance standard for indicator 1.14 was not realistic. Accordingly, TELUS submitted that these four indicators should be excluded from the final RRP.

146. TELUS proposed that if the service level standard for indicator 1.14 were revised to 2.5%, and the standard for indicator 2.7A were decreased from 100% to 80%, the inclusion of these indicators in the final RRP for competitors would be acceptable.

147. TELUS argued that indicators 1.8, 1.9 and 1.10 should not be included in the final RRP on the grounds that they overlapped with indicator 1.12. TELUS submitted that indicator 1.12 should be retained in the RRP.

148. TELUS supported the inclusion of indicator 1.11 and agreed with the Competitors that the definition of indicator 1.11 could incorporate all trunks that have local network interconnection (LNI) functionality. TELUS also supported the inclusion of indicator 1.13 and the broadening of its definition as proposed by the Competitors. TELUS supported the inclusion of indicator 1.18, but opposed the expansion of the definition of this indicator until TELUS had an opportunity to analyze the impact of such a change.

149. TELUS supported the inclusion of indicator 2.7 in the RRP but opposed the broadening of its definition or the raising of its standard, as proposed by the Competitors.

Commission analysis and determination

150. The Competitors proposed that the definition of several Q of S indicators be expanded to include a list of other services that are ordered by competitors from the ILECs and that are not already included in the current indicator definitions. The Companies opposed this proposal, except as it applied to CDN services, on the grounds that competitors should be treated on the same basis as retail customers in respect of retail services.

151. The Commission notes that, as a result of the follow-up proceeding to Incumbent local exchange carrier service intervals for various competitor services, Telecom Decision CRTC 2003-48, 18 July 2003 (Decision 2003-48), which examined whether service intervals should be established for certain services that CLECs order from ILECs and that are also offered on a retail basis, Commission staff issued a letter dated 21 July 2004 proposing that ILECs and CLECs pursue the possibility of the CLECs being treated by the ILECs in the same manner as the ILECs' large customers. Commission staff suggested that in certain situations, it may be appropriate for the ILECs and the CLECs to negotiate a CSA-type arrangement that would allow CLECs the opportunity to acquire those ILEC retail services as well as competitor services under terms and conditions that more closely meet the needs of the CLECs. The Commission notes that this matter is not resolved at this time.

152. In light of the above, and the Commission's determinations with respect to CDN services set out below, the Commission is of the view that it is not necessary, at this time, to consider expanding the definitions of the competition-related indicators to include retail services provided to competitors.

153. The Commission's conclusions with respect to individual indicators are set out in the following subsections.

Indicator 1.8 - New Unbundled Type A and B Loop Order Service Intervals Met

Indicator 1.9 - Migrated Unbundled Type A and B Loop Order Service Intervals Met

Indicator 1.10 - Local Number Portability (LNP) Order (Stand-alone) Service Interval Met

Indicator 1.12 - Local Service Request, Confirmed Due Dates Met

154. In the Commission's view, if indicators 1.8, 1.9, and 1.10 were included in the final RRP for competitors, this could result in double counting of those activities which are also measured by indicator 1.12, as currently defined. On the other hand, the Commission does not consider it appropriate to exclude indicators 1.8, 1.9 and 1.10 from the final RRP as this would eliminate a degree of precision in the monitoring of Q of S results which the Commission considers desirable. Consequently, the Commission determines that indicator 1.12 be redefined to eliminate any duplication with indicators 1.8, 1.9 and 1.10. The Commission also determines that the definition of indicator 1.12 should be modified to ensure that the indicator captures occurrences where LSRs have agreed and confirmed due dates that are longer than the approved standard due dates.

Indicator 1.11 - Competitor Interconnection Trunk Order Service Interval Met

155. All parties who commented on this issue agreed that indicator 1.11 should be included in the final RRP and that the scope of the indicator should be broadened to include additional LNI trunks, not just Bill-and-Keep trunks. LNI trunks include all trunk-side trunks, such as Bill-and-Keep Trunks, Extended Area Service (EAS) Termination and Transport Trunks, Local and Toll Transit Trunks, Emergency Service Trunks and Message Relay Trunks, as well as any other trunk-side or line-side trunk used to start, complete or enhance the LNI, such as signalling trunks or trunks to accommodate traffic overflow.

156. The Commission is of the view that indicator 1.11 should be included in the final RRP and should be expanded as suggested by the parties. The Commission notes that any modifications to indicator 1.11 would also require a modification to its associated trailing indicator 1.11A. Accordingly, the Commission determines that indicators 1.11 and 1.11A be modified to reflect the addition of other LNI trunks.

Indicator 1.10A - Local Number Portability Order (Stand-alone) Late Completions

Indicator 1.11A - Interconnection Trunk Order Late Completions

Indicator 2.7A - Competitor Out-of-Service Trouble Report Late Clearances

157. Indicators 1.10A, 1.11A and 2.7A are examples of trailing indicators. In several cases, the Commission has approved definitions, standards and business rules for pairs of indicators tracking the same function but involving different time periods for performance of the function. The first indicator measures whether an ILEC delivers a facility or service on time (the "main indicator"). The standard for the main indicator has in general been set at 80% or 90% to allow the ILEC some flexibility and account for unforeseen situations such as workload variations in the delivery of the ordered facilities or services. The second or trailing indicator captures the ability of the ILEC to deliver the same facility or service (which missed the original due date) one or five days later, depending on the type of service or facility ordered. The standard for trailing indicators has been set at 100% in order to ensure that all orders are accounted for by the ILEC.
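
The relationship between a main indicator and its trailing indicator can be illustrated with a short calculation. The sketch below is a simplified example only: the one-day trailing window and the data are assumptions chosen for illustration, and the authoritative definitions are those set out in Appendix B.

def main_and_trailing(days_late, trailing_window=1):
    # Illustrative main/trailing indicator pair. days_late gives, for each
    # order, the number of business days past the confirmed due date
    # (0 means completed on time). The main indicator is the share of orders
    # completed on time; the trailing indicator is the share of the late
    # orders completed within the trailing window (e.g. one or five days).
    total = len(days_late)
    on_time = sum(1 for d in days_late if d <= 0)
    late = [d for d in days_late if d > 0]
    within_window = sum(1 for d in late if d <= trailing_window)
    main = 100.0 * on_time / total if total else None
    trailing = 100.0 * within_window / len(late) if late else None
    return main, trailing

# 10 orders: 8 on time, 1 completed one day late, 1 completed three days late.
print(main_and_trailing([0, 0, 0, 0, 0, 0, 0, 0, 1, 3]))  # (80.0, 50.0)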

158. The Companies and TELUS argued that it was unreasonable and possibly illegal for the Commission to establish a performance standard of 100% for a trailing indicator as this would not afford them any margin for error, even in cases where they did their best to meet the indicator.

159. The Commission notes that when, after missing the initial service time limit, an ILEC misses a second due date to either provision or repair facilities to a competitor, the competitor suffers the consequences of the ILEC's failure. At a minimum, the competitor is obliged to make new arrangements with its end-customers to provision or repair the service in question. The competitor also faces the possibility that the customer may consider the competitor's repeated failure to meet due dates unacceptable and cease to deal with the competitor. In such cases, the ILEC's failure will have undermined the competitor's business relationship with its customer (or potential customer) and possibly resulted in a positive business opportunity for the ILEC with the frustrated customer. Given these potential consequences, the Commission considers it appropriate to hold the ILECs to a very high standard with respect to trailing indicators.

160. However, the Commission is not persuaded that a performance standard of 100% is required and is concerned that such a standard may be unduly onerous. The Commission has therefore concluded that it is appropriate to reduce the minimum performance standards for indicators 1.10A, 1.11A and 2.7A. Accordingly, the Commission determines that the service standard for indicators 1.10A, 1.11A and 2.7A, currently approved at 100%, be reduced to 90% and that the three indicators be included in the final RRP.

161. With regard to indicator 2.7A, the Commission considers it appropriate to include other LNI trunks in all indicators that monitor the repair of these facilities. Consequently, the Commission modifies the definition of indicator 2.7A accordingly.

Indicator 1.13 - Unbundled Type A and B Loop Order Late Completions

162. The Commission notes that as a result of the changes proposed with regard to indicator 1.12 above, indicator 1.13, which is a trailing indicator that captures LSRs missed by the ILECs and measured by indicators 1.8, 1.9 and 1.12, must be modified to reflect the changes in indicator 1.12. The Commission notes, in addition, that the standard for indicator 1.13 has already been established at 90%. Accordingly, the Commission includes indicator 1.13 in the final RRP with a standard of 90% and modified to reflect the changes in indicator 1.12.

Indicator 1.14 - Unbundled Type A and B Loops Held Orders

163. Indicator 1.14 overlaps with indicators 1.8, 1.9 and 1.12 since these latter indicators already capture LSRs that cannot be completed due to lack of facilities (i.e. held orders). Consequently, in order to avoid double counting, the Commission is excluding indicator 1.14 from the final RRP. However, the Commission directs the ILECs to continue to track and report indicator 1.14 for monitoring purposes.

Indicator 1.17 - Local Service Request (LSR) Rejection Rate

164. All parties who commented on this issue agreed that indicator 1.17 should not be included in the final RRP for competitors. Indicator 1.17 depends, in large part, on a competitor's ability to issue complete and error-free LSRs and is therefore not in the ILECs' control. Consequently, the Commission excludes indicator 1.17 from the final RRP. However, the Commission directs the ILECs to continue to track and report the indicator for monitoring purposes.

Indicator 1.18 - Local Service Request (LSR) Turnaround Time Met

165. Indicator 1.18 captures information that is also counted in several other Q of S indicators. In order to avoid double counting, the Commission excludes indicator 1.18 from the final RRP. However, the Commission directs the ILECs to continue to track and report the indicator for monitoring purposes.

Indicator 2.7 - Competitor Out-of-Service Trouble Reports Cleared Within 24 Hours

Indicator 2.9 - Competitor Degraded Trouble Reports Cleared Within 48 Hours

166. With regard to indicators 2.7 and 2.9, the Commission notes that all parties who commented on the treatment of these indicators agreed that they should be included in the final RRP. The Competitors submitted that the standard for indicator 2.7 should be increased from 80% to 90% to reflect the proposed expansion of the scope of the indicator (i.e. to include CDN and retail services used by competitors).

167. The Commission decided above that the scope of indicators will not be increased to include retail services. In addition, the treatment of CDN services is dealt with below. However, in keeping with the Commission's decision to expand the scope of indicator 1.11 to include all other LNI trunks, the Commission considers it appropriate to include other LNI trunks in indicators 2.7 and 2.9 which monitor the repair of these facilities. Accordingly, the Commission includes indicators 2.7 and 2.9 in the final RRP, modified as indicated, with the standard for indicator 2.7 unchanged at 80%.

Indicator 2.8 - Migrated Local Loop Completion Notices to Competitors

Indicator 2.8A - New Loop Status Provided to Competitors

168. Indicators 2.8 and 2.8A relate to an ILEC's obligation to report to a competitor that the provisioning of unbundled loops is completed and that the competitor can take possession of the loop. The Commission notes that misses of indicators 2.8 and 2.8A would also be counted as misses in the indicators that measure the provisioning of unbundled loops. The Commission considers that, given the type of activities included in indicators 2.8 and 2.8A, it would be impossible to modify their definitions so as to eliminate any duplication with other indicators. Consequently, in order to avoid double counting, the Commission excludes indicators 2.8 and 2.8A from the final RRP. However, the Commission directs the ILECs to continue to track and report these indicators for monitoring purposes.

Summary of Commission determinations

169. In light of the above, with regard to competition-related Q of S indicators reported on a competitor-by-competitor basis, the Commission concludes that:

The complete set of competition-related Q of S indicators, with their relevant definitions, measurement methods, standards and business rules, is set out in Appendix B.

New Q of S indicators to complete the final RRP for competitors

170. In Public Notice 2003-9, the Commission asked for comments on whether new indicators would need to be defined or existing indicators modified to take into account services provided to competitors such as CDN services, co-location space delivery, or any other service for which a measure of Q of S might be required.

Positions of parties
The Competitors

171. The Competitors proposed four new Q of S indicators:

172. The Competitors submitted that the ILECs were hindering the Competitors' ability to provide good quality services to their customers within timeframes that were equal to those that the ILECs offered to their own customers. In this regard, the Competitors expressed concern that the current indicators did not track service orders with a confirmed due date that exceeded the standard interval.

173. The Competitors were also concerned that their ability to compete for customers was encumbered by the fact that the ILECs were deploying more and more remote switches to provision local service to communities. According to the Competitors, the lack of copper continuity from customers served off remotes to the host serving switches made it much more difficult for competitors to compete for these customers. In addition, the Competitors were concerned that facilities delivered by ILECs to competitors were failing shortly after delivery, more often than they should. Further, the Competitors were of the view that they could more properly serve their customers if they had more consistent access to customer equipment record information.

174. Overall, the Competitors submitted that, if their proposed new indicators were implemented, the ILECs would be more likely to give competitors the services they need on time and on a more consistent basis.

Xit

175. Xit submitted that until recently, requests for support structures by third parties were limited and did not require tracking of ILEC performance through Q of S indicators. It also submitted that more non-dominant carriers were now ordering support structure services and were encountering difficulties dealing with the ILECs. As evidence, Xit pointed to two Part VII applications made to the Commission and to three separate occasions on which Xit had experienced inappropriate delays in obtaining support structure service. In its view, these incidents demonstrated that there were significant grounds to justify the establishment of Q of S indicators for support structure services.

176. Xit proposed that six new Q of S indicators for support structures be established and incorporated in the final RRP for competitors:

177. The first indicator proposed by Xit would measure the ILEC's ability to provide, by a specific date, search charge estimations to verify that there was room on the requested support structures to accommodate the application. The second indicator would measure the ILEC's ability to provide, by a specific date, a response to each portion of the application based on its findings during the search noted in step one above, the applicable construction standards and the various agreements that might exist on the requested support structures. If the application was still viable, the third indicator would measure the ILEC's ability to provide, by a specified date, an estimate of the charges that would apply to make the required support structures ready to accept the new facilities proposed, if any were required. Once the applicant decided to approve the make-ready costs and timeframes, the fourth indicator would measure the ILEC's ability to complete the make-ready activities by a particular date. If the applicant required the ILEC to install the new facilities on the support structures, the fifth indicator would measure the ILEC's ability to do so by a specific date. If the applicant chose to do the installation work itself, the sixth indicator would measure the ILEC's ability to inspect the work done by a specific date.

178. Xit provided definitions, measurement methods, and business rules for each of the proposed support structure service indicators. Xit suggested that the standard for the provision of services to be monitored by its proposed indicators be set at 100% and that the failure to meet the standard should result in a rate adjustment equal to 200% of all fees associated with the actions measured by the proposed indicators not met on the agreed upon target due date.

The Companies

179. The Companies were of the view that the indicators proposed by the Competitors were inappropriate, unnecessary and should be denied as the Competitors had provided no rationale as to why such indicators should even be considered.

180. The Companies considered that Xit did not demonstrate that there were sufficient grounds to establish an elaborate and costly tracking mechanism to monitor Q of S for support structures. The Companies submitted that the fact that carriers or distribution undertakings most affected by the Companies' provisioning practices had not participated in this proceeding was evidence that provisioning was not an issue. The Companies also submitted that when disputes arose, there was an effective dispute process in place and functioning well. The Companies further submitted that the 200% penalties proposed by Xit were excessive. The Companies were therefore of the view that the proposed indicators to monitor Q of S for support structures should be denied.

181. The Companies noted that Public Notice 2003-9 requested comments on whether Q of S indicators were needed to capture CDN services. The Companies were of the view that CDN services could be accommodated within the existing Q of S indicators structure. However, the Companies submitted that, if the Commission concluded new indicators were needed, then the following two new indicators would be appropriate:

TELUS

182. TELUS submitted that the existing, finalized competition-related Q of S indicators provided a more than adequate measure of virtually all facets of competitor Q of S. TELUS did not propose additional or new indicators.

183. With regard to Xit's proposed Q of S indicators to monitor activities related to support structures, TELUS submitted that no other party using support structures submitted evidence or argument in support of Xit's proposal. TELUS also observed that, since 2000, Canadian carriers and cable undertakings had added approximately 3.3 million metres of cable to TELUS's support structures, largely without incident. TELUS submitted that the existing mechanisms in the Support Structure Tariff and Support Structure Agreement provided a complete code for support structure issues and provided sufficient avenues for licensees to address any potential issues of non-compliance by ILECs in the provisioning of support structures.

184. TELUS reiterated its position that it was unreasonable for the Commission to impose a Q of S indicator standard of 100%, as proposed by Xit for its six Q of S indicators related to support structures. TELUS also submitted that the 200% penalties proposed by Xit were beyond the jurisdiction of the Commission.

185. With regard to Q of S indicators for CDNA services, TELUS was of the view that it was premature to consider these services since the Commission had not yet issued its determination with respect to the Competitor Digital Network Access service proceeding, Telecom Public Notice CRTC 2002-4, 9 August 2002 (Public Notice 2002-4).

Commission analysis and determination
Competitor proposals for new indicators

186. The Commission notes that indicator 1.12, as amended by this decision, captures activities that are scheduled to be completed on due dates that are longer than the standard service interval and that indicator 1.14 captures service requests that cannot be completed due to lack of facilities. The Commission is therefore of the view that indicators 1.15 and 1.16, as proposed by the Competitors, would be redundant. Consequently, the Commission will not establish indicators 1.15 and 1.16 as Q of S indicators.

187. With regard to indicator 2.10 proposed by the Competitors, the Commission notes that no evidence was provided to indicate whether facilities provided by the ILECs to competitors were failing at a rate that exceeded the rate experienced by the ILECs for their own customers. Neither was there any evidence as to why these facilities might be failing. In the Commission's view, a facility that is delivered to a competitor might stop working for a number of reasons that would not be the result of activity by the ILECs.

188. Under the current Q of S regime, the ILEC is required to deliver services in working order to the competitors. Although the Q of S regime requires that the ILECs repair subsequent problems within specific service intervals, the Commission is of the view that the concern expressed by the Competitors could be addressed by introducing a Q of S indicator to monitor service failure within the first 30 days of its delivery. This new Q of S indicator would not be included in the RRP at this time.

189. In light of the above, the Commission establishes competition-related Q of S indicator 2.12 to monitor service failures and/or degradation within the first 30 days of delivery for competitor services such as Type A and B unbundled loops and their sub-categories, Bill-and-Keep trunks and other LNI trunks for service providers, and CDN services and Type C loops. The specifics for this new indicator are described in Appendix B.

190. The Commission will monitor indicator 2.12 for a period of one year, beginning from the quarterly filing that commences the final RRP, in order to assess the effectiveness of and the continued need for the indicator.

191. With regard to indicator 2.11, as proposed by the Competitors, the Commission is concerned that the administrative burden associated with implementing this indicator would likely outweigh the benefit, given that the service to be monitored appears to be seldom used by competitors. The Commission notes that Competitive local exchange carrier access to incumbent local exchange carrier operational support systems, Telecom Decision CRTC 2005-14, 16 March 2005, has been issued to provide CLECs with access to, among other matters, ILEC customer-specific information. The Commission considers that the access provided through this decision should remove any need for an indicator to measure this service.

Proposed indicators for support structure services

192. In regard to the six new Q of S indicators proposed by Xit, the Commission notes that it receives very few complaints with respect to the current support structure provisioning process. Most complaints that the Commission receives relate to the interpretation of the tariff and the application of existing third-party support structure agreements. Accordingly, the Commission has concluded that it is not necessary to establish the Q of S indicators proposed by Xit at this time.

Proposed indicators for CDN services

193. The Commission notes that in Decision 2002-34, it determined that the provision of CDN services by ILECs to competitors would foster facilities-based competition and that CDN services should be classified as a competitor service. Accordingly, the Commission considers that in order to ensure that CDN services are provided to competitors on a non-discriminatory basis and in a timeframe similar to that which the ILECs offer to themselves, CDN services should be monitored through Q of S indicators.

194. While it would be possible to expand the definition of existing indicators to include CDN services, the Commission is concerned that doing so would dilute the effectiveness of the existing indicators by allowing ILECs the opportunity to provide some services below an acceptable Q of S standard while meeting the overall Q of S indicator standard. The Commission is therefore of the view that it would be more appropriate to monitor the provisioning and repair of CDN services from ILECs to the competitors through new Q of S indicators. The Commission therefore establishes the following three indicators:

A complete description, with measurement method, standard and business rules, of these three indicators is provided in Appendix B. These three new indicators are included in the final RRP. The Commission notes that, with the establishment of new indicator 1.19, the scope of new indicator 2.12, established above, is expanded to include CDN services and Type C loops.

CDN services and Type C loops service intervals

195. In the preceding section, the Commission introduced three new indicators to monitor Q of S for CDN services. While CDN services have an approved MTTR of four hours, no service intervals for the delivery of CDN services and Type C loops have been established.

Position of parties

196. In their evidence, the Competitors referenced the proposals of parties for CDNA service intervals in the follow-up proceeding to Decision 2003-48. In that follow-up proceeding, Allstream and FCI Broadband proposed service intervals for various CDNA service rates (Low speed access, Voice grade access, Fractional DS-1, DS-1, DS-3, OC-3, OC-12, and Type C Loop, i.e., DS-1). These provisioning intervals ranged from 4 to 17 business days, depending on whether facilities needed to be constructed or whether existing facilities could be used. Allstream and FCI Broadband indicated that their proposed intervals were based on the service intervals Bell Canada provided for Digital Network Access (DNA) service to its retail customers, but with two days subtracted in order to provide a competitor with extra time to provision its component of the service package and thereby provide service to its end-customers in a timeframe comparable to that of the ILEC.

197. The Companies indicated in their responses to interrogatories in the present proceeding that Bell Canada's service interval ranged from nine to 12 business days for a circuit comprising one DS-1 access or one circuit comprising up to two CDNA DS-1s and one intra-exchange DS-1 channel, depending on whether access facilities were available or not. The Companies indicated that if underlying feeder and/or distribution facilities were not available, the service interval would be negotiated. The Companies also indicated that the Bell Canada service interval was 12 business days for a circuit comprising one DS-3 access or one circuit comprising up to two CDNA DS-3s and one intra-exchange DS-3 channel, if DS-3 facilities were available. Once again, if underlying feeder and/or distribution facilities were not available, the service interval would be negotiated. Finally, the Companies stated that the Bell Canada service interval would be seven days for the provision of a Type C Loop if no construction was necessary, consistent with service intervals offered under both Bell Canada's retail digital network access (DNA) appointment plan, as well as under Bell Canada's competitor digital network access (CDNA) appointment plan. If construction was required, the appropriate service interval would have to be negotiated.

198. For the same three types of circuits as identified above for Bell Canada, TELUS estimated that the provisioning service intervals would be: (1) 19 business days if the request were for a service in Metro areas (i.e., downtown locations in Vancouver, Calgary and Edmonton), 27 days in urban areas (i.e., cities and locations within cities other than those covered by Metro) and 37 days in rural areas; (2) 42 business days if the request were for a service in the Metro areas, 42 days in urban areas and 52 days in rural areas; (3) 37 days in the worst case scenario.

Commission analysis and determination

199. The Commission notes that CDN service and Type C loops can be ordered together or separately. Therefore, it is necessary to establish service intervals both for CDN service and for Type C loops.

200. In the Commission's view, the ILECs should provide services to competitors within timeframes similar to those they provide to themselves, so that competitors may compete on an equitable basis with the ILECs. The Commission also considers it generally preferable for services to competitors to be available on the basis of a fixed service interval, rather than a negotiated service interval, since this latter approach may provide an ILEC with an inappropriate level of influence over how a competitor conducts its business.

201. Accordingly, the Commission is of the view that each ILEC should propose, with justification, service intervals for all CDN services and Type C loops offered by the ILEC. The ILEC's proposed service intervals should be the same as those provided to retail customers so that the competitor is not disadvantaged vis-à-vis the ILEC in provisioning service to end-customers. In those situations where the retail market service interval is subject to negotiation, the ILEC should propose a fixed service interval and, if it considers it warranted, explain why a negotiated service interval would be more appropriate in those circumstances.

202. The Commission directs Aliant Telecom, Bell Canada, MTS Allstream, SaskTel, TCI, Télébec and TELUS Québec to file with the Commission, serving copies on all parties to this proceeding, by 2 May 2005, proposed fixed service intervals for all CDN services and Type C loops offered by the ILEC, in each case providing justification for the proposed interval, and identifying, with justification, those situations where the ILEC believes the service interval should be subject to negotiation rather than being fixed. Parties may file comments with the Commission, serving copies on all parties to this proceeding, on the proposals of the ILECs, by 10 May 2005. The ILECs may file reply comments with the Commission, serving copies on all parties to this proceeding, by 16 May 2005. Where a document is to be filed or served by a specific date, the document must be actually received, not merely sent, by that date.

V. Audits, disputes and non-compliance

Audits and dispute resolution

Position of parties

203. The Companies proposed an internal audit process that would verify that they were reporting competitor Q of S results in a consistent and accurate manner. The Companies also proposed that each year, an independent external auditor would review the work performed by the internal auditors in cases where it was found that any indicator presented a net error rate of over 5 percent. The Companies submitted that the annual equivalent of any costs of funding the external auditor for that year should be drawn down from the deferral account of the ILEC in question. The Companies submitted that, should any significant disputes arise, the Commission should deal with them on a case-by-case basis.

204. The Competitors submitted that if parties could not resolve a disagreement bilaterally, a third-party audit should be used to resolve the dispute. The Competitors added that the cost of the third-party audit should be borne by the party found in the wrong.

205. TELUS submitted that any disputes should be dealt with on a case-by-case basis through the dispute resolution processes available to the Commission under the CRTC Telecommunications Rules of Procedure. TELUS also noted that there were more informal procedures available involving Commission staff, such as staff mediation.

206. Xit submitted that, in the case of a dispute concerning the accuracy of Q of S results, the Commission should not wait more than 30 days to appoint a Commission inspector to supervise the dispute resolution process referred to in the ILEC support structure tariffs.

Commission analysis and determination

207. The Commission is of the view that internal audits are appropriate to ensure that procedures and controls are put in place by the ILECs to deliver and maintain facilities and services on time to competitors. The Commission is also of the view that an independent audit of the Q of S results, RRP calculations and any rate rebate payments should be made annually and that this audit should take place as part of the annual external financial audit of the ILEC. In addition, the Commission considers that it would ensure transparency of the Q of S and RRP process if the findings from both the internal and external audits were filed with the Commission. In the Commission's view, both the internal and external audits are an integral part of the overall Q of S regime and, as such, should be funded directly by the ILECs. The Commission notes that if the costs of external audits were to have a material impact on an ILEC, the ILEC could apply for an exogenous adjustment under the price cap mechanism.

208. On the question of disputes between a competitor and an ILEC concerning the Q of S results, an RRP calculation or an RRP rebate, the Commission is of the view that the most appropriate approach would be for the matter to be brought to the Commission for resolution on a case-by-case basis.

209. Accordingly, the Commission directs the ILECs to:

Late or non-filing of Q of S results

Position of parties

210. The Competitors submitted that they considered timely filing of Q of S results important to the effectiveness of the RRP for competitors. The Competitors submitted that the ILECs should face a penalty of $5,000 per day for any late filing of, or late revisions to, their reported results. The Competitors submitted that similar filing penalties existed in the Verizon New Jersey Incentive Plan.

211. Xit submitted that failure by the ILECs to submit Q of S results on time could reasonably be interpreted to mean that they were attempting to hide repeated occurrences of substandard Q of S.

212. The Companies submitted that, given that there had been no significant problems with past reporting, there was no need to address this issue at the present time. The Companies argued that, in any event, the Commission could always investigate the reasons for such an occurrence at that time.

213. TELUS submitted that, based on experience, the failure of an ILEC to file Q of S results on time would be the exception, rather than the rule. TELUS argued that any delay in filing would not cause direct harm to a competitor. TELUS also questioned whether the Commission had the authority to collect penalties such as those proposed by the Competitors. TELUS submitted that, if it became apparent to the Commission that some direct action must be taken, the Act gave the Commission considerable discretion to make inquiries, including the ability to direct carrier compliance.

Commission analysis and determination

214. The Commission notes that the late filing of Q of S results has been rare since the implementation of the interim RRP. Of the 10 quarterly filings by the ILECs, since the initiation of the interim RRP, there has been only one instance of a late filing and that situation involved a single ILEC. The Commission also notes that it does not have the authority under the Act to impose fines for the late filing of Q of S results. However, if a situation were to arise where an ILEC was repeatedly late filing its Q of S results, the Commission would have considerable discretion to make inquiries and take remedial steps, including the authority to direct compliance. The Commission therefore concludes that no action is required at this time.

VI. Transition from the interim to the final RRP

Position of parties

215. The Competitors submitted that payments under the final RRP should be payable based upon the ILEC service quality performance for the finalized indicators commencing from the date each indicator was finalized.

216. The Companies submitted that the final RRP for competitors should be implemented on a going-forward basis. The Companies submitted that the final RRP for competitors should take effect at least 90 days after the date of the Commission's decision in this proceeding, in order to allow the ILECs sufficient time to set up the necessary processes and procedures.

217. The Companies argued that it would be unfair and unjust for the Commission to require the ILECs to pay rate rebates for past performance during a period in which they had no knowledge of which indicators might be included in the final RRP, of the magnitude of any potential rate rebates for those indicators, or of the methodology for the calculation of any such rate rebates.

218. The Companies further argued that the assessment of rate rebates on such a basis would constitute retroactive or retrospective ratemaking, which would be beyond the Commission's jurisdiction under the Act. The Companies acknowledged that the Act allows the Commission to make an interim decision and later make a final decision effective from the date on which the interim decision came into effect. However, the Companies submitted that the Commission must formally make an interim decision explicitly affecting a rate in order to be able to later revisit, and possibly revise, the rate in question. The Companies noted that in this case, the interim RRP only included indicators 1.8, 1.9, 1.10, 1.11, 2.7 and 2.8. The Companies argued that since the interim RRP did not apply to the remaining competition-related Q of S indicators, if the Commission established a final RRP that included some or all of those remaining indicators, the Commission would be without jurisdiction to apply that RRP retroactively or retrospectively.

219. TELUS submitted that a 90-day implementation period would be appropriate. For the same reasons identified by the Companies, TELUS was of the view that the final RRP for competitors should not include any retroactive rate rebates.

Commission analysis and determination

220. The Commission is of the view that it would be inappropriate for the ILECs to be required to make payments to competitors for services that were not part of the RRP prior to its finalization. Accordingly, the Commission determines that the final RRP for competitors shall apply on a going-forward basis only.

221. The Commission notes that the interim RRP and the final RRP for competitors are both based on quarterly reporting. Therefore, the Commission considers that the final RRP should commence with the start of the third quarter 2005, i.e., 1 July 2005.

222. The Commission notes that it has established a further process to deal with indicators 1.19 and 1.19A covering the provision of CDN services and Type C loops. The commencement of the collection of Q of S results for these two indicators will be established in the determination made during the further process.

223. The Commission therefore directs the ILECs to implement the final RRP for competitors, including all the determinations noted above but with the exclusion of indicators 1.19 and 1.19A at this time, on a going-forward basis, effective 1 July 2005.

224. The Commission notes that it intends to monitor the operation of the final RRP to determine whether any further measures may be required to ensure consistent compliance with the Q of S standards.

Secretary General

This document is available in alternative format upon request, and may also be examined in PDF format or in HTML at the following Internet site: www.crtc.gc.ca

Appendix A

Sample Calculation

Service Assumptions

In month activity

Active Indicators

Under the RRP, the activity generated during the month is measured by 9 main indicators and 5 trailing indicators.

The 9 main indicators are 1.8, 1.9, 1.12, 1.10, 1.19 and 1.11 for the provisioning of services, and 2.7, 2.9 and 2.10 for repair activities.

If the ILEC fails to meet the standards set for these provisioning and repair activities based either on standard or negotiated and confirmed service intervals, the late completion of the same activities will be measured by the following trailing indicators: 1.10A, 1.13, 1.11A, 1.19A and 2.7A. (Indicator 1.13 is the trailing indicator for indicators 1.8, 1.9 and 1.12).

This example assumes that the provisioning ILEC's activity during the month is measured by the following 14 indicators: 1.8, 1.9, 1.12, 1.10, 1.19, 1.11, 2.7, 2.9, 2.10, 1.13, 1.10A, 1.11A, 1.19A and 2.7A. This is equivalent to saying that, during the month, there are 14 indicators with activity for the provisioning ILEC.

Calculation of the TPRA and PRA

The TPRA is 5 percent of the ILEC's monthly revenue from the competitor services provisioned to the competitor, which in this case is $6,141,507.97 x 5% = $307,075.40.

The PRA per indicator is equal to $307,075.40 / 14 indicators with activity during the month = $21,933.96.
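
The same arithmetic can be written out as a short calculation. The sketch below simply reproduces the figures used in this sample; only the rounding of the displayed amounts is an assumption.

# Figures taken from the sample calculation above.
monthly_competitor_revenue = 6_141_507.97
active_indicators = 14

tpra = monthly_competitor_revenue * 0.05        # total potential rebate amount
pra_per_indicator = tpra / active_indicators    # potential rebate amount per active indicator

print(f"TPRA: ${tpra:,.2f}")                            # TPRA: $307,075.40
print(f"PRA per indicator: ${pra_per_indicator:,.2f}")  # PRA per indicator: $21,933.96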

Q of S Results

Assume the Q of S results are as follows:

Main indicators for provisioning:
Indicator | Standard | Result for month | Comments
1.8 | 90% | 84.85% | 112 LSRs corresponding to 166 loops for business delivered on time out of 132 LSRs for 186 loops issued in the month.
1.9 | 90% | 82.22% | 148 LSRs corresponding to 164 loops for business delivered on time out of 180 LSRs for 200 loops issued in the month.
1.12 | 90% | 80.00% | 3,824 LSRs corresponding to 9,569 loops for business delivered on time out of 4,870 LSRs for 11,961 loops issued in the month.
1.10 | 90% | 85.00% | 102 LSRs corresponding to 118 standalone ports completed out of 120 LSRs with 150 standalone ports issued in the month.
1.19 | 90% | 85.00% | 136 orders corresponding to 136 DS-1 CDN services delivered on time out of 160 orders for 160 DS-1 CDN services in 160 different locations issued in the month.
1.11 | 90% | 66.67% | 2 service requests corresponding to 48 EAS trunks delivered on time out of 2 service requests for 72 trunks issued in the month.

Trailing indicators for provisioning:
Indicator | Standard | Result for month | Comments
1.10A | 90% | 88.88% | 16 (out of 18) remaining LSRs corresponding to 24 (out of 32) remaining standalone ports completed within one working day of the confirmed due date during the month.
1.13 | 90% | 88.79% | 895 = (15 + 30 + 850) LSRs (new, migrated, negotiated) corresponding to 1,859 = (15 + 32 + 1,812) loops (new, migrated, negotiated) delivered one business day after the confirmed due date (standard or negotiated) out of 1,008 = (20 + 32 + 956) remaining LSRs corresponding to 2,448 = (20 + 36 + 2,392) remaining loops. A total of 113 LSRs corresponding to 589 loops were still not completed after the extended delivery date (standard or negotiated due date + 1 business day) during the month.
1.19A | 90% | 83.33% | 20 (out of 24) remaining orders corresponding to 20 (out of 24) remaining DS-1 CDN services delivered one business day after delivery of the first 136 CDN services in the month. The 4 remaining orders corresponding to 4 DS-1 CDN services in 4 different locations were not completed during the month.
1.11A | 90% | 0.0% | 0 (out of 1) remaining service request corresponding to 24 EAS trunks not completed 5 days after the delivery on time of the first 48 trunks during the month.

Main indicators for repair:
Indicator | Standard | Result for month | Comments
2.7 | 80% | 75% | 1,032 OOS reports corresponding to 1,376 (918 in band A + 458 in band B) loops cleared within 24 hours of their receipt by the ILEC out of 1,376 issued in the month.
2.9 | 90% | 80% | 496 degraded OOS reports for unbundled loops in band C completed within 48 hours of their notification out of 620 issued in the month.
2.10 | 4 hrs or less | 4.5 hrs | Average time taken by the ILEC to clear 8 DS-1 CDN services OOS reports issued by the competitor in the month.

Trailing indicator for repair:
Indicator | Standard | Result for month | Comments
2.7A | 90% | 88.66% | 305 (210 in band A + 95 in band B) OOS reports for unbundled loops (out of 344 composed of 230 in band A and 114 in band B) cleared within 24 hours after the original 24 hours set for indicator 2.7 in the month. The remaining 39 loops were cleared well after the total 48 hours (24 hours for 2.7 plus an additional 24 hours for 2.7A) within which measurements were taken.

Calculation of the Rebate

The Q of S results above show that out of the 14 indicators (9 main and 5 trailing) with activity during the month, the ILEC missed the standard for all 14 indicators.

The total rebate payable to the competitor would therefore be the PRA x 14 = $307,075.40, i.e., the full TPRA, since the standard was missed for all 14 active indicators.
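
Purely as an illustration, the sample arithmetic above can be restated as a short Python sketch; the revenue figure and indicator counts are the hypothetical values assumed in this example, not actual reported data.

```python
# Illustrative restatement of the Appendix A sample calculation.
# All figures are the hypothetical assumptions of this example.

monthly_competitor_revenue = 6_141_507.97  # revenue from competitor services in the month
active_indicators = 14                     # Q of S indicators with activity in the month
missed_indicators = 14                     # indicators for which the standard was missed

# Total potential rebate amount (TPRA): 5 percent of the month's competitor-service revenue.
tpra = round(monthly_competitor_revenue * 0.05, 2)            # $307,075.40

# Potential rebate amount (PRA) per indicator: the TPRA split equally among active indicators.
pra_per_indicator = round(tpra / active_indicators, 2)        # $21,933.96

# Total rebate payable: the TPRA times the proportion of active indicators missed
# (equivalently, approximately the PRA times the number of missed indicators).
total_rebate = round(tpra * missed_indicators / active_indicators, 2)  # $307,075.40

print(f"TPRA: ${tpra:,.2f}")
print(f"PRA per indicator: ${pra_per_indicator:,.2f}")
print(f"Total rebate payable: ${total_rebate:,.2f}")
```

Running the sketch yields a TPRA of $307,075.40, a PRA per indicator of $21,933.96 and, with all 14 active indicators missed, a total rebate equal to the full TPRA.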

Appendix B

Indicators

This appendix sets out all competition-related Q of S indicators with their relevant definitions, measurement methods, standards and business rules.

Indicator 1.6 - Competitor Installation Appointments Met

Definition: The total number of installation appointments booked and the number met, with percentage of those met relative to the total booked for customers who are also competitors.

Measurement Method: Completed orders are sorted to determine the actual number and percentage completed on the appointed date.

Geographical Basis: Company-wide, no geographic distinction.

Standard: 90% or more.

Reporting Format: Indicator 1.6 - Competitor Installation Appointments Met.

Indicator 1.8 - New Unbundled Type A and B Loop Order Service Intervals Met

Definition: The percentage of time that the due dates for the provisioning of new unbundled type A and B local loop orders are met within the applicable standard service interval.

Measurement Method: Completed new loop orders are compiled, and the percentage of those that were completed within the applicable standard service interval is reported. Orders for which the requested due date is beyond the applicable standard service interval are excluded from this measure.

Geographical Basis: Company-wide, no geographic distinction.

Standard: 90% or more.

Reporting Format: Indicator 1.8 - New Unbundled Type A and B Loop Order Service Interval Met.

Numerator: Number of orders for new type A and B unbundled loops that have met the standard interval due date for the month.

Denominator: Total number of orders for new type A and B unbundled loops for which a standard interval due date has been assigned for the month. Orders for which the requested due date is beyond the applicable standard service interval are excluded from this measure.

Business Rules:
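
Purely as an illustration of how a percentage-based indicator result is derived from the numerator and denominator defined above, the following sketch computes an indicator 1.8 result and tests it against the 90 percent standard; the order counts are the hypothetical figures used in the Appendix A sample.

```python
# Illustrative computation of a percentage-based indicator (indicator 1.8 here),
# using the hypothetical order counts from the Appendix A sample.

on_time_orders = 112        # numerator: new type A/B loop orders meeting the standard interval due date
orders_with_due_date = 132  # denominator: orders assigned a standard interval due date in the month

result = 100 * on_time_orders / orders_with_due_date  # 84.85%
standard = 90.0                                        # indicator 1.8 standard: 90% or more

print(f"Indicator 1.8 result: {result:.2f}%")
print("Standard met" if result >= standard else "Standard missed")
```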

Indicator 1.9 - Migrated Unbundled Type A and B Loop Order Service Intervals Met

Definition: The percentage of time that the due dates for the provisioning of migrated unbundled type A and B local loop orders are met within the applicable standard service interval.

Measurement Method: Completed loop migration orders are compiled, and the percentage of those that were completed within the applicable standard service interval is reported. Orders for which the requested due date is beyond the applicable standard service interval are excluded from this measure.

Geographical Basis: Company-wide, no geographic distinction.

Standard: 90% or more.

Reporting Format: Indicator 1.9 - Migrated Unbundled Type A and B Loop Order Service Intervals Met.

Numerator: Number of orders for migrated type A and B unbundled loops that have met the standard interval due date for the month.

Denominator: Total number of orders for migrated type A and B unbundled loops for which a standard interval due date has been assigned for the month. Orders for which the requested due date is beyond the applicable standard service interval are excluded from this measure.

Business Rules:

Indicator 1.10 - Local Number Portability (LNP) Order (Standalone) Service Interval Met

Definition: The percentage of time that due dates relating to orders for the standalone porting of numbers are met within the applicable standard service interval.

Measurement Method: Completed standalone LNP orders are compiled and the percentage of those that were completed within the applicable standard service interval is reported. Orders for which the requested due date is beyond the applicable standard service interval are excluded from this measure.

Geographical Basis: Company-wide, no geographic distinction.

Standard: 90% or more.

Reporting Format: Indicator 1.10 - Local Number Portability (LNP) Order (Standalone) Service Interval Met.

Numerator: Number of orders for standalone porting of numbers that have met the standard interval due date for the month.

Denominator: Total number of orders for standalone porting of numbers for which a standard interval due date was assigned for the month. Orders for which the requested due date is beyond the applicable standard service interval are excluded from this measure.

Business Rules:

Indicator 1.10A - Local Number Portability Order (Standalone) Late Completions

Definition: The percentage of orders for standalone porting of numbers that missed the confirmed due date, which are completed within one working day of the confirmed due date.

Measurement Method: Completed (standalone) local number portability orders that missed their confirmed due dates are compiled, and the percentage of those that were completed within one working day of their respective confirmed due dates is reported.

Geographical Basis: Company-wide, no geographic distinction.

Standard: 90% or more.

Reporting Format: Indicator 1.10A - Local Number Portability Order (Standalone) Late Completions.

Numerator: Total number of orders for standalone porting of numbers in the month that missed the confirmed due date, which were completed within one working day of the confirmed due date.

Denominator: Total number of orders for standalone porting of numbers completed in the month for which a confirmed due date was missed.

Business Rules:

Indicator 1.11 - Competitor Interconnection Trunk Order Service Interval Met

Definition: The percentage of time that the agreed upon due dates for the turn-up of Local Network Interconnection (LNI) trunks are met. LNI trunks include all the trunk side trunks such as the Bill-and-Keep Trunks, Extended Area Service (EAS) Termination and Transport Trunks, Local and Toll Transit Trunks, Emergency Service Trunks and Message Relay Trunks as well as any other trunk side type or line side type trunk used to start, complete or enhance the LNI such as signalling trunks or trunks to accommodate traffic overflow.

Measurement Method: Tracking of due dates met. The due date interval is 20 business days or shorter for line side type trunks, when augments to existing trunk groups are required where facilities exist and 35 business days when new trunk groups are required where no facilities exist.

Geographical Basis: Company-wide, no geographic distinction.

Standard: 90% or more.

Reporting Format: Indicator 1.11 - Competitor Interconnection Trunk Order Service Interval Met.

Numerator: Number of orders for LNI trunks that have met the standard interval (agreed upon) due date for the month.

Denominator: Total number of orders for LNI trunks for which a standard interval (agreed upon) due date has been assigned for the month. The due date interval is 20 business days or shorter for line side type trunks, when augments to existing trunk groups are required where facilities exist and 35 business days when new trunk groups are required where no facilities exist.

Business Rules:

Indicator 1.11A - Interconnection Trunk Order Late Completions

Definition: The percentage of orders for the turn-up of Local Network Interconnection (LNI) trunks for which the due date is missed, but which are completed within five working days of the due date.

Measurement Method: Completed orders for LNI Trunks which were not completed on their due dates are compiled, and the percentage of those orders which were then completed within the next five working days of their respective due date is reported.

Geographical Basis: Company-wide, no geographic distinction.

Standard: 90% or more.

Reporting Format: Indicator 1.11A - Interconnection Trunk Order Late Completions.

Numerator: Total number of orders for LNI Trunks which were not completed on their due date, but were then completed within the next five working days of the due date.

Denominator: Total number of completed orders for LNI Trunks for which a due date for that month was missed.

Business Rules:

Indicator 1.12 - Local Service Requests (LSRs) Confirmed Due Dates Met

Definition: The percentage of instances that the agreed upon and confirmed due date is met for the provisioning of LSRs other than LSRs for new/migrated loops and for standalone LNP orders measured by indicators 1.8, 1.9 and 1.10. The due date means the agreed upon and confirmed due date that is different from the standard due date measured under indicators 1.8, 1.9 and 1.10.

Measurement Method: Completed LSRs other than LSRs for new/migrated loops and for standalone LNP orders measured by indicators 1.8, 1.9 and 1.10 are compiled, and the percentage of those which were completed by the agreed upon and confirmed due date is reported. LSRs are to be counted as complete only if all constituent elements of the LSR order are complete.

Geographical Basis: Company-wide, no geographic distinction.

Standard: 90% or more.

Reporting Format: Indicator 1.12 - Local Service Request Agreed upon and Confirmed Due Dates Met.

Numerator: Total number of LSRs other than LSRs for new/migrated loops and for standalone LNP orders measured by indicators 1.8, 1.9 and 1.10, completed on the agreed upon and confirmed due date during the month.

Denominator: Total number of LSRs other than LSRs for new/migrated loops and for standalone LNP orders measured by indicators 1.8, 1.9 and 1.10, with agreed upon and confirmed due dates completed during the month.

Business Rules:

Indicator 1.13 - Unbundled Type A and B Loop Order Late Completions

Definition: The percentage of orders for unbundled type A and B loops and their sub-categories, for which the due date, as measured in indicators 1.8, 1.9 and 1.12, was missed, but which were completed within one working day of the confirmed due date. The due date means the standard service due date, unless the parties have agreed to another (earlier or later) due date.

Measurement Method: Completed loop orders that are not completed by their due dates are compiled, and the percentage of these which were completed within one working day of their respective confirmed due dates is reported.

Geographical Basis: Company-wide, no geographic distinction.

Standard: 90% or more.

Reporting Format: Indicator 1.13 - Unbundled Type A and B Loop Order Late Completions.

Numerator: Total number of orders for new and migrated type A and B unbundled loops and their sub-categories that have been completed within the month, but missed the confirmed due date by one working day.

Denominator: Total number of orders for new and migrated type A and B unbundled loops and their sub-categories completed within the month for which a due date has been missed.

Business Rules:

Indicator 1.14 - Unbundled Type A and B Loops Held Orders

Definition: The number of orders for type A and B loops and their sub-categories that were not completed on the confirmed due date because of a lack of facilities, expressed as a percentage of loop inward movement. The confirmed due date means the date assigned by the provisioning ILEC and does not necessarily reflect the standard service interval, nor the customer requested due date. Inward movement means instances in which there is the provisioning of new and the migration of unbundled loops or modifications to existing unbundled loops that require loop facility changes.

Measurement Method: Orders for unbundled loops are compiled and the percentage of these orders that were not completed on the due date as a result of the lack of facilities is reported.

Geographical Basis: Company-wide, no geographic distinction.

Standard: 0.25% or less.

Reporting Format: Indicator 1.14 - Unbundled Type A and B Loops Held Orders.

Numerator: Total number of completed orders for type A and B unbundled loops and their sub-categories (inward movement) that were not completed on their due dates that month due to a lack of facilities together with the total number of orders for the month not yet completed for which confirmed due dates cannot be met due to a lack of facilities.

Denominator: Total number of orders for type A and B unbundled loops and their sub-categories (inward movement) completed for the month, together with the total number of orders for the month not yet completed for which due dates cannot be met due to a lack of facilities.

Business Rules:

Indicator 1.17 - Local Service Request (LSR) Rejection Rate

Definition: The percentage of LSRs submitted by CLECs that are returned due to errors identified by the ILECs and based on an error that can be objectively demonstrated and that requires some corrective action that warrants the re-issue of an order.

Measurement Method: LSRs received and rejected are tracked and reported.

Geographical Basis: Company-wide, no geographic distinction.

Standard: 5% or less.

Reporting Format: Indicator 1.17 - Local Service Requests Rejected.

Numerator: Total number of LSRs rejected by the ILEC during the month.

Denominator: Total number of LSRs received by the ILEC during the month.

Business Rules:

Indicator 1.18 - Local Service Request (LSR) Turnaround Time Met

Definition: The percentage of instances that the applicable LSR confirmation interval is met, as defined in the Canadian Local Ordering Guidelines (C-LOG), and in accordance with applicable Commission decisions.

Measurement Method: Local Service Confirmations (LSCs) are compiled, and the percentage of these which were returned within the applicable standard interval, is reported.

Geographical Basis: Company-wide, no geographic distinction.

Standard: 90% or more.

Reporting Format: Indicator 1.18 - Local Service Request (LSR) Turnaround Time Met.

Numerator: Total number of Local Service Confirmations (LSCs) returned to the CLEC during the month within the applicable standard interval.

Denominator: Total number of Local Service Confirmations (LSCs) issued during the month.

Business Rules:

Indicator 1.19 - Confirmed Due Dates Met - CDN Services and Type C Loops

Definition: The percentage of time that the confirmed due dates are met for the provisioning of CDN services and type C loops.

Measurement Method: Completed service requests for CDN services and type C loops are compiled and the percentage of those which were completed by the confirmed due date is reported.

Geographical Basis: Company-wide, no geographic distinction.

Reporting Format: Indicator 1.19 - Confirmed Due Dates Met - CDN services and Type C Loops.

Numerator: The total number of CDN services and Type C loop requests that were completed on the confirmed due date during the month.

Denominator: The total number of CDN services and type C loop requests completed during the month.

Standard: 90% or more.

Business Rules:

Indicator 1.19A - CDN Services and Type C Loops - Late Completion

Definition: The percentage of CDN services and Type C loop orders for which the due date, as measured in indicator 1.19, was missed, but which were completed within one working day of the confirmed due date. The due date means the standard service due date, unless the parties have agreed to an earlier or later due date.

Measurement Method: Completed service requests for CDN services and type C loops that are not completed by their due dates are compiled, and the percentage of those which were completed within one working day of their respective confirmed due dates is reported.

Geographical Basis: Company-wide, no geographic distinction.

Reporting Format: Indicator 1.19A - CDN services and Type C Loops - Late Completion.

Numerator: The total number of CDN services and Type C loop requests that have been completed within the month, but missed the confirmed due date by one working day.

Denominator: The total number of CDN services and type C loop requests completed within the month for which a due date has been missed.

Standard: 90% or more.

Business Rules:

Indicator 2.6 - Competitor Repair Appointments Met

Definition: The total number of repair appointments booked and the number met, with percentages of those met relative to the total booked for customers who are also competitors.

Measurement Method: Completed orders are sorted to determine the actual number and percentage completed on the appointed date.

Geographical Basis: Company-wide, no geographic distinction.

Standard: 90% or more.

Reporting Format: Indicator 2.6 - Competitor Repair Appointments Met.

Indicator 2.7 - Competitor Out-of-Service Trouble Reports Cleared within 24 hours

Definition: The total number of initial out-of-service trouble reports and the number cleared within 24 hours, with the percentage of those cleared relative to this total.

Initial out-of-service trouble reports are reports relative to unbundled loops and their sub-categories as well as LNI trunks.

Measurement Method: Compilation of trouble report data gathered at each repair bureau.

Geographical Basis: Company-wide, no geographic distinction.

Standard: 80% or more.

Reporting Format: Indicator 2.7 - Competitor Out-of-Service Trouble Reports Cleared within 24 hours.

Numerator: Number of initial out-of service trouble reports cleared within 24 hours of their receipt during the month.

Denominator: Total number of initial out-of-service trouble reports received during the month.

Business Rules:

Indicator 2.7A - Competitor Out-of-Service Trouble Report Late Clearances

Definition: The percentage of trouble reports for type A and B unbundled loops and their sub-categories as well as LNI trunks that are not cleared within 24 hours (i.e., outside the performance standard of indicator 2.7), but which are cleared within the subsequent 24 hours.

Measurement Method: Trouble reports are compiled for type A and B unbundled loops and their sub-categories as well as for LNI trunks outside the performance standard of indicator 2.7, and the percentage of these trouble reports that are cleared within a subsequent 24-hour period is reported.

Geographical Basis: Company-wide, no geographic distinction.

Standard: 90% or more.

Reporting Format: Indicator 2.7A - Competitor Out-of-Service Trouble Report Late Clearances.

Numerator: Total number of initial out-of-service trouble reports received during the month for type A and B unbundled loops and their sub-categories as well as for LNI trunks cleared within 48 hours, excluding those cleared within 24 hours of their issuance.

Denominator: Total number of initial out-of-service trouble reports received during the month for type A and B unbundled loops and their sub-categories as well as for LNI trunks, excluding those cleared within 24 hours of their issuance.

Business Rules:

Indicator 2.8 - Migrated Local Loop Completion Notices to Competitors

Definition: The total number of completions of migrations of local loops and the number of notifications given on time by the incumbent telephone company to the competitors, notifying that the local loop migration is complete at the facilities of the incumbent telephone company, with the percentage of notifications given on time relative to this total.

Measurement Method: Completions of migrated local loops and the notifications given on time are sorted to determine the actual numbers and the percentage of notifications given on time.

Geographical Basis: Company-wide, no geographic distinction.

Standard: 90% or more.

Reporting Format: Indicator 2.8 - Migrated Local Loop Completion Notices to Competitors.

Numerator: Number of notifications of local loop migrations completed during the month given on time to the CLEC.

Denominator: Total number of completions of migrations of local loops scheduled for that month.

Business Rules:

Indicator 2.8A - New Loop Status Provided to Competitors

Definition: Percentage of order completion notices and order status reports provided to competitors for new type A and B unbundled loops and their sub-categories. Completion notices are to be provided to competitors as soon as possible following installation of an unbundled loop. Order status reports are to be provided to the competitor by 5:00 p.m. (in the ILEC serving territory) for uncompleted orders on the day for which the orders are scheduled.

Measurement Method: New loop orders are compiled, and the percentage of those for which the required completion notices and/or order status reports were provided to the competitor is reported.

Geographical Basis: Company-wide, no geographic distinction.

Standard: 90% or more.

Reporting Format: Indicator 2.8A - New Loop Status provided to Competitors.

Numerator: Total number of orders in the month for new unbundled type A and B loops and their sub-categories for which the required completion notices and/or order status reports were given.

Denominator: Total number of orders for new unbundled type A and B loops and their sub-categories scheduled to be completed in the month.

Business Rules:

Indicator 2.9 - Competitor Degraded Trouble Reports Cleared Within 48 hours

Definition: The total number of CLEC degraded trouble reports cleared by ILECs within 48 hours of notification.

Degraded trouble reports are reports relative to unbundled loops and their sub-categories as well as LNI trunks.

Measurement Method: Total degraded trouble reports are sorted to determine the actual numbers and the percentage of reports cleared.

Geographical Basis: Company-wide, no geographic distinction.

Standard: 90% or more.

Reporting Format: Indicator 2.9 - Competitor Degraded Trouble Reports Cleared Within 48 hours.

Numerator: Total number of degraded trouble reports reported by CLEC and cleared within 48 hours of their notification.

Denominator: Total number of degraded trouble reports received from CLEC during the month.

Business Rules:

Indicator 2.10 - Mean Time to Repair (MTTR) - CDN Services and Type C Loops

Definition: The mean time to repair (MTTR), on a monthly basis, of out-of-service trouble reports for CDN services and Type C loops received from competitors and completed during the month.

Measurement Method: Compilation of monthly trouble report data gathered at each repair bureau.

Geographical Basis: Company-wide, no geographic distinction.

Reporting Format: Indicator 2.10 - MTTR - CDN Services and Type C Loops.

Numerator: Total number of hours required to clear out-of-service CDN services and Type C loops trouble reports completed during the month.

Denominator: Total number of out-of-service CDN services and Type C loops trouble reports completed during the month.

Standard: 4 hour MTTR or less.

Business Rules:
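
As an illustration of the MTTR calculation defined above, the following sketch divides the total clearing time by the number of completed reports and tests the result against the 4-hour standard; the figures mirror the hypothetical Appendix A sample (8 out-of-service reports cleared in an average of 4.5 hours).

```python
# Illustrative computation of indicator 2.10 (MTTR), using the hypothetical
# Appendix A figures: 8 out-of-service reports cleared in 36 hours in total.

total_clearing_hours = 36.0  # numerator: total hours to clear completed OOS reports in the month
completed_reports = 8        # denominator: OOS reports completed in the month

mttr = total_clearing_hours / completed_reports  # 4.5 hours
standard_hours = 4.0                             # standard: 4-hour MTTR or less

print(f"Indicator 2.10 MTTR: {mttr:.1f} hours")
print("Standard met" if mttr <= standard_hours else "Standard missed")
```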

Indicator 2.12 - Service Failures within First 30 days

Definition: The percentage of services that have failed and/or degraded within 30 calendar days of delivery of the service.

Measurement Method: Total of failed and/or degraded trouble reports are sorted to determine the percentage of services that have failed and/or degraded within 30 calendar days of the completion of a new or change service request for the service. The results are expressed as a percentage of the total number of new and change service requests completed the previous month.

Geographical Basis: Company-wide, no geographic distinction.

Reporting Format: Indicator 2.12 - Service Failures within First 30 Days.

Numerator: Total number of failed and/or degraded trouble reports reported by competitor.

Denominator: Total number of new and change service requests completed the previous month.

Standard: Not applicable.

Business Rules:
