
Decision CRTC 2001-366

Ottawa, 20 June 2001

File no.: 8660-C12-05/00

To: Interested parties to Decision CRTC 2000-24 & Public Notice CRTC 2000-17

Re: CISC recommended competition-related Quality of Service indicators - Follow up to Decision CRTC 2001-217

On 9 April 2001, the Commission released Decision CRTC 2001-217, CRTC creates new quality of service indicators for telephone companies (Decision 2001-217). Decision 2001-217 established a new series of quality of service indicators that measure the supply and repair of services provided to Competitive Local Exchange Carriers (CLECs) by Incumbent Local Exchange Carriers (ILECs). The Commission believes that quality of service regulation of the ILECs remains necessary because competition is still limited in some areas. Customers are still largely captive to their ILEC, and in some cases competitors depend on the ILEC for their ability to compete. In setting these new standards of service for the ILECs, the Commission expects to foster an environment in which CLECs are better able to meet customer expectations and to compete more effectively in the market.

Decision 2001-217 requested that two CISC working groups, the Business Process Working Group (BPWG) and the Network Operations Working Group (NOWG), assist the Commission in developing intervals for certain specific indicators. The working groups were required to report back to the Commission with proposed measurement standards within 30 days. The BPWG and the NOWG submitted their reports on 11 May and 10 May 2001, respectively.

CISC Reports

The two working groups filed the attached reports, numbered BPRE028a and NORE024c, which recommended the following by consensus:

Revised definitions for three indicators;

Three new indicators with service intervals;

Inserting the service intervals discussed in the body of Decision 2001-217 into the formal definitions of nine specific indicators;

A performance standard for a particular indicator be set after one year of data collection.

The reports asked that the Commission determine the intervals for six indicators on which the groups' members were unable to agree, and that:

All the new intervals be established on an interim basis, with a report back to the Commission on the appropriateness of the intervals after one year.

The Commission determine the technical interpretations for aspects of the new indicators.

Finally, the reports contained the competitors' request that the ILECs report on a CLEC-by-CLEC basis, rather than combining the data for all CLECs into one report as the ILECs would prefer.

Conclusion

The Commission notes that both ILECs and CLECs were well represented within the BPWG and the NOWG.

In Decision 2001-217, the Commission established a regime requiring the ILECs to file quarterly reports for the indicators that measure competition-related intervals and standards, beginning with third quarter 2001 data. The indicators for which CISC's assistance was required to develop intervals were not included in that third quarter schedule. However, the Commission considers that there should be no delay in measuring the new indicators, and the start date for reporting them should therefore also be set at third quarter 2001 data.

The Commission considers the consensus-based recommendations of the CISC working groups to be reasonable and approves them as follows:

The revised definitions for indicators 1.8, 1.9 and 1.10 be adopted.

Three new indicators numbered 2.7A, 2.8A and 2.9 with service intervals be created.

The service intervals be inserted into the formal definitions of the nine indicators numbered 1.8, 1.9, 1.10, 1.12, 1.13, 1.15, 1.16, 1.17 and 1.18.

For indicator 1.14, a performance standard be set only after one year of data collection.

For indicators 2.7A, 2.8A, 2.9, 1.12, 1.13 and 1.15, where consensus within the working groups could not be reached, the Commission concludes as follows:

Indicator 2.7A - the actual benchmark is implemented immediately, rather than waiting until data has been collected and reviewed.

Indicator 2.8A - alternative B is adopted, which requires the ILECs to report on the status of all loops by the end of the working day.

Indicator 2.9 - a standard of 90% is set for the clearing of degraded trouble reports, rather than the 80% standard proposed by the ILECs (a worked sketch of this calculation follows this list).

Indicator 1.12 - a confirmed due date measurement is established for expedited orders, rather than the standard service interval, which may be longer.

Indicator 1.13 - orders that are late are measured against 1 working day after the confirmed due date (expedited orders are treated as under indicator 1.12).

Indicator 1.15 - standalone LNP orders that are late are measured against 1 working day after the confirmed due date (expedited orders are treated as under indicator 1.12).
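To illustrate how the 90% standard for indicator 2.9 operates, the following is a minimal Python sketch of the calculation. The record layout and field names are hypothetical and not part of the decision; only the 48-hour clearing window and the 90% standard come from the determinations above.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class TroubleReport:
    # Hypothetical fields; the decision does not prescribe a record layout.
    notified_at: datetime  # when the ILEC was notified of the degraded trouble
    cleared_at: datetime   # when the trouble report was cleared


def indicator_2_9(reports: list[TroubleReport]) -> float:
    """Percentage of competitor degraded trouble reports cleared
    within 48 hours of notification (indicator 2.9)."""
    if not reports:
        return 100.0  # assumed convention: no reports means nothing was missed
    cleared_in_time = sum(
        1 for r in reports
        if r.cleared_at - r.notified_at <= timedelta(hours=48)
    )
    return 100.0 * cleared_in_time / len(reports)


# Two of three reports cleared within 48 hours -> 66.7%,
# which falls short of the 90% standard set in this decision.
reports = [
    TroubleReport(datetime(2001, 7, 2, 9, 0), datetime(2001, 7, 3, 9, 0)),
    TroubleReport(datetime(2001, 7, 2, 9, 0), datetime(2001, 7, 5, 9, 0)),
    TroubleReport(datetime(2001, 7, 2, 9, 0), datetime(2001, 7, 2, 17, 0)),
]
pct = indicator_2_9(reports)
print(f"Indicator 2.9: {pct:.1f}% (standard: 90%)")
```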

The Commission directs that:

The new indicators be approved on an interim basis.

The working groups review all the data collected over a full year, up to and including second quarter 2002 data, and report to the Commission on the appropriateness and reasonableness of the indicators by 1 December 2002.

The ILECs report the new indicators on a CLEC-by-CLEC basis, in a manner similar to the other competition-related indicators established by the Commission in Decision 2001-217.

With regard to indicator 1.16, the Commission finds that the due date is to be considered met if the order is completed within the standard service interval.

The Commission will soon issue a document that provides:

a full explanation and commentary with respect to the setting of the intervals and standards;

a rationale for the Commission's determinations on the non-consensus items, and a technical interpretation for each indicator.

Appendix 1 describes the approved wording and definition of each indicator.

Yours sincerely,

Ursula Menke
Secretary General

Attachment

Appendix 1

Synopsis of indicators

Indicator 1.8 - New Unbundled Type A and B Loop Order Service Intervals Met
The percentage of time that the due dates for the provisioning of new unbundled type A and B local loop orders are met within the applicable standard service interval.

Indicator 1.9 - Migrated Unbundled Type A and B Loop Order Service Intervals Met
The percentage of time that the due dates for the provisioning of migrated unbundled type A and B local loop orders are met within the applicable standard service interval.

Indicator 1.10 - Local Number Portability Order (Standalone) Service Intervals Met
The percentage of time that due dates relating to orders for the standalone porting of numbers are met within the applicable standard service interval.

Indicator 1.12 - Local Service Request Confirmed Due Dates Met
Completed LSRs are compiled, and the percentage of these which were completed by the confirmed due date is reported.

Indicator 1.13 - Unbundled Type A and B Loop Order Late Completions
Completed loop orders which missed their confirmed due dates are compiled, and the percentage of these which were completed within 1 working day of their respective confirmed due dates is reported.

Indicator 1.14 - Unbundled Type A and B Loops Held Orders
The number of orders for type A and B loops which were not completed on the confirmed due date because of facility shortages, expressed as a percentage of loop inward movement.

Indicator 1.15 - Local Number Portability Order (Standalone) Late Completions
Completed standalone local number portability orders which missed their confirmed due dates are compiled, and the percentage of these which were completed within 1 working day of their respective confirmed due dates is reported.

Indicator 1.16 - Bill & Keep Interconnection Trunk Order Late Completions
Of the orders for the turn-up of Bill and Keep interconnection trunks that missed the confirmed due date, the percentage completed within 5 working days of that date.

Indicator 1.17 - Local Service Request (LSR) Rejection Rate
The percentage of LSRs submitted by CLECs that are returned due to errors perceived by the ILECs.

Indicator 1.18 - LSR Turnaround Time Met
The percentage of occasions that the applicable LSR confirmation interval is met.

Indicator 2.7A - Mean Time to Clear Competitor Out-of-Service Trouble Reports Outside the Performance Standard of Indicator 2.7
The ILECs will measure and report the mean time to repair local loops, capturing the actual length of time for occurrences that fall outside the performance standard of indicator 2.7.

Indicator 2.8A - New Loop Status Provided to Competitors
Compilation of all new loop orders on a given day, with status provided by 5:00 p.m. (ILEC time).

Indicator 2.9 - Competitor Degraded Trouble Reports Cleared Within 48 Hours
The total number of CLEC degraded trouble reports cleared by the ILECs within 48 hours of notification.
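As an illustrative companion to the definitions above, the sketch below computes a late-completion measure such as indicator 1.13 on a CLEC-by-CLEC basis, as this decision directs for the new indicators. The record layout is hypothetical, and the working-day rule here skips only weekends; an actual report would follow the ILEC's business calendar, including holidays.

```python
from collections import defaultdict
from dataclasses import dataclass
from datetime import date, timedelta


@dataclass
class LoopOrder:
    # Hypothetical fields; the decision does not prescribe a record layout.
    clec: str            # the CLEC that placed the order
    confirmed_due: date  # the confirmed due date
    completed: date      # the actual completion date


def next_working_day(d: date) -> date:
    """Day after d, skipping Saturday and Sunday (holidays ignored
    for simplicity; a real report would use the ILEC's calendar)."""
    nxt = d + timedelta(days=1)
    while nxt.weekday() >= 5:  # 5 = Saturday, 6 = Sunday
        nxt += timedelta(days=1)
    return nxt


def indicator_1_13(orders: list[LoopOrder]) -> dict[str, float]:
    """For each CLEC, among completed loop orders that missed their
    confirmed due date, the percentage completed within 1 working
    day of that date (indicator 1.13)."""
    late = defaultdict(list)
    for o in orders:
        if o.completed > o.confirmed_due:  # order missed its due date
            late[o.clec].append(o)
    return {
        clec: 100.0 * sum(
            1 for o in group
            if o.completed <= next_working_day(o.confirmed_due)
        ) / len(group)
        for clec, group in late.items()
    }


orders = [
    LoopOrder("CLEC-A", date(2001, 7, 6), date(2001, 7, 9)),    # Friday due, Monday done: 1 working day late
    LoopOrder("CLEC-A", date(2001, 7, 6), date(2001, 7, 11)),   # more than 1 working day late
    LoopOrder("CLEC-B", date(2001, 7, 10), date(2001, 7, 10)),  # on time, excluded from the measure
]
for clec, pct in indicator_1_13(orders).items():
    print(f"{clec}: {pct:.1f}% of late orders completed within 1 working day")
```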
