Later, monitoring was treated as a subset of federal review of state activities. For example, in 1986 the Report stated that "the program review process has two parts... review of plans submitted by states. . . and monitoring to assure adherence to state plans." When viewed as part of a general review of the states, monitoring was often treated as secondary to the review of "annual" state plans, which each state in fact submitted to DoED only every three years for funding approval. Monitoring activities were often labeled "compliance review," as distinguished from "plan review."
In the 1990s, references in the Annual Reports to monitoring were sometimes explicit and sometimes embedded in descriptions of the overall federal review process. The 1990 Report acknowledged the difficulty in pinning down the monitoring function, as "the federal program review activities . . . are closely related to other OSEP activities. . . as part of a comprehensive system of overall assistance to the states."
This section of the analysis uses the term "monitoring" to mean the federal assessment of the states' compliance with the provisions of IDEA, including assessment of the SEAs' activities to ensure that all public agencies in their respective states responsible for providing educational services to children with disabilities comply with Part B requirements.
It was not surprising that in 1979 and 1980 DoED focused on the development of procedures. During these years, the Bureau (as OSEP was then named) established a system of regular visits to half of the states each year, each consisting of a five-day stay by four or more staff members and including visits to local programs, state programs, and state agencies, as well as interviews with state and local officials, program administrators, parents, teachers, and an advisors' panel. In 1980, the Bureau reiterated that it "attempts" regular one-week visits to half of the states each year, using the same basic procedure described for 1979. That year, additional emphasis was placed on technical assistance to SEAs during these visits.
The 1980 Annual Report also provided an example of the Reports' somewhat cursory description of the results of monitoring, as compared to the detailed descriptions of procedural matters. In 1980, the substantive analyses of the outcomes of earlier visits were as follows: states "performed well" in development of Annual Program Plans, reporting, and administration of funds; IEPs were in place but not in compliance; LRE policies at the state level were good, although individual schools were "having difficulty" implementing them; complaints, while monitored by the Bureau, were handled at the level of the state departments of education. The Annual Report states that 320 complaints were processed between October 1978 and July 1979, and that most complaints concerned appropriate placement of children, but the results of the processed complaints are not mentioned. The substantive discussion of findings by the Bureau appeared to concentrate on procedural matters, such as successful design of rules, fund administration, and timely processing of complaints. This characteristic was repeated in most Reports over the past 20 years, as in the Eighth Annual Report, which gave "a detailed description of SEP's revised comprehensive compliance review system."
The emphasis on procedure increased as DoED refined its monitoring procedures to involve less scrutiny of outcomes. In 1981, DoED stated that "the Office will redirect its monitoring procedures . . . to focus on assuring that states are effectively monitoring local education agencies." DoED appeared to have increased its reliance on information provided directly by states themselves. The following year, DoED explained, "OSEP's role has necessarily changed. In fact, federal efforts since the enactment of P.L. 94-142 have periodically been modified to provide the states with increasing flexibility to implement the law in a manner consistent with local precedents and resources." The Report continued, "It should be noted that the current regulations were never expected to survive indefinitely without change."
Understandably, procedures underwent annual alterations in the early stages of the implementation of IDEA; since then, such changes have continued. The 1985 Report stated that "internal SEP concerns supplemented by questions from . . . Congress resulted in an intensive analysis of monitoring procedures that may lead to certain revisions in the process." The following year, OSEP began a staggered state plan schedule, as permitted by the Education Department General Administrative Regulations (EDGAR) to "allow for better coordination between the state plan and monitoring procedures."
The 1990 substantive discussion of the effectiveness of IDEA changed little. Most of the comments were praiseworthy but vague and directed at successful structures, such as "types and numbers of personnel providing services," and "continuing growth in SEA [state education agency] capacity to assess and assure conformity with EHA-B [Education for All Handicapped Children Act] requirements." Another procedural accomplishment highlighted that year was the elimination of a backlog of incomplete monitoring reports. The 1990 Report also stated that "it is anticipated that the monitoring process will continue to evolve and undergo adjustments in response to changing management needs." The 1997 Report stated that "over the past four years, OSEP has worked intensively to reorient and strengthen its monitoring system."
In 1981, DoED began a policy of individualizing monitoring to "take into account the particular conditions and variations that exist among the states." That policy hampered the ability to make progress comparisons between states. Because OSEP visited at most half of the states in a given year, no follow-up of individual state progress could be gleaned by comparing one year's Annual Report with the next. Because states were not individually named in summaries of compliance review findings, only OSEP had the information to trace such progress. While OSEP looked first at a state's Annual Program Plan, then monitored its compliance, and then watched for implementation of a Corrective Action Plan, it did not appear to link its findings from one stage to the next. In fact, one Report noted, "this information cannot be used as a basis for conclusions regarding compliance," but "it will be used as a basis for discussing trends that may reflect problems in the implementation of federal requirements." This discussion was not found.
The Eleventh Annual Report compiled data from 1985 to 1988 on noncompliance problems, noting particular trouble with SEA monitoring procedures, but the data were not examined longitudinally, and states were not individually identified. The 1990 report held that "by reviewing and assessing these data, OSEP may identify trends that raise concerns about the implementation of federal law," but went on to say that "issues or concerns" could not be found because "from year to year the problems identified change and the problems differ from state to state as well."
Of the nineteen Reports, fewer than half contained charts actually listing areas of state-level noncompliance. Seven Reports listed areas of state plan deficiencies; however, these charts addressed plan policies that were corrected before a state received funding, and did not address the effectiveness of implementation of those policies. For instance, in 1996 state plans were deficient in providing for procedural safeguards, individualized education programs (IEPs), least restrictive environment (LRE), right to education, private school participation, confidentiality, and general supervision. These deficiencies required "clarification or revision," and all such problems were resolved prior to final plan approval.
From 1990 to 1993, such plan deficiency charts were included, but no chart summaries were given reflecting results of compliance monitoring. More recently, the Reports returned to including charts on noncompliance along with, or in place of, plan deficiency summaries. These charts lacked detail, and the categories of noncompliance were broad and undefined, varying from year to year and making longitudinal analysis difficult. Also, the tallies of states that failed to ensure compliance for each category did not identify which states were in violation of their plans. Reports never contained follow-up charts on corrective actions taken by those states found to be out of compliance.
From October 15, 1980, until September 1, 1981, OSEP handled 150 complaints and referred to OCR 105 others that also alleged a violation covered by Section 504. Of the 105 referred complaints, 70 concerned placement or related services. In the 1982 Annual Report, DoED reported that OCR had returned or closed 41 cases by August 3, 1981. DoED also reported that OCR took an average of four months to close each case. While no review of OSEP's management of the complaint process was given, the Report discussed problems between OSEP and OCR producing "inconsistent policy interpretations" on "identical issues," leading to the creation of a task force.
In 1983, the Annual Report again analyzed complaints sent to OCR and OCR's turnaround record, although there was no discussion of complaints handled directly by OSEP. Suggestions of cooperation between the two offices appeared in the Eighth Annual Report, which reported data from both offices being used in combination by OSEP to assist states "in improving information collection and remedying the possible problems the information suggest[ed]." While some tension may have existed between OSEP and OCR regarding policy interpretations of the law in the complaint-handling process, it was largely resolved through a Memorandum of Understanding (MOU) between the two offices.
In later Annual Reports, the IDEA complaint process was rarely discussed except in the context of evaluating SEA policies for addressing complaints. Although DoED retained authority to involve itself in complaints based on IDEA that did not involve Section 504, one problem at the state level was a "failure to inform complainants of their right to request that the U.S. Secretary of Education review the state's handling of the complaints." This state responsibility was eliminated, however, with the passage of IDEA '97. OSEP remains "responsible for ensuring that each SEA . . . implements a complaint-management system that satisfies the requirements" set forth in §§ 76.780–76.782 of EDGAR.
IDEA anticipated that DoED would play a dual role of assisting states and enforcing the law with respect to the states. Several Annual Reports provided explanations of the law's withholding provision as prologues to discussions of monitoring, but then made no further reference to actual or contemplated use of that provision. Other Reports acknowledged primarily that "review activities provide information." DoED's ambiguity about the purpose of monitoring in these contexts suggested a disconnect between monitoring and enforcement.
DoED moved from labeling its activities as "administration," suggesting federal control, to "assisting," indicating more state control.
Additionally, it began by referring to "monitoring" states, implying enforcement, and then shifted to the more open-ended language of "review" and "teams." For instance, "when the SEA is asked to correct identified deficiencies, the PAR (Program Administrative Review) team works with the state by providing technical assistance that enables the SEA to comply with the law." By 1990, enforcement seemed secondary to teaming with and assisting states; review was described as "verification and support of the Corrective Action Plan," and OSEP began to hold "biannual meetings to exchange information with SEA officials."
In 1982, Annual Program Plans began to receive three-year approval to reduce time and paperwork for states. Additionally, the Bureau's monitoring activities "focused predominantly on assuring and strengthening state capacity to effectively monitor LEAs and public and private agencies." Parent involvement and visits to LEAs were no longer emphasized. DoED put greater emphasis on off-site monitoring of information submitted and data collected, describing those methods as "less intrusive" but "more continuous." It concentrated on developing procedures and obtaining information to create individual state profiles to be regularly reviewed and updated. These changes were made in response to an executive order of January 29, 1981, to reduce the burden and cost on the states and to ensure that regulations were no stricter than the demands of the statute.
By 1983, monitoring consisted of developing screening documents, plus three options: (1) off-site monitoring, (2) on-site monitoring at the state level, or (3) on-site monitoring at the state and local level, although an on-site review was required for each state at least once every three years. After such a program review, the SEAs responded with voluntary, self-imposed implementation deadlines and requests for technical assistance. These changes seemed to reflect more state autonomy and less direct enforcement. The 1990 Report cited EHA-B 612(6) as "specifically designat[ing] the SEA as the central point of responsibility and accountability." Nonetheless, the Eleventh Annual Report gave reassurance that the review procedure "has the capacity to verify that the requirements of the Act are being carried out." The Report further stated that OSEP would "determine with states the appropriate remedial measures that must be taken to correct identified discrepancies between the requirements and states' policies and procedures." In 1990, a new procedure was implemented.
DoED considered increased reliance on states to perform enforcement activities an appropriate response "to the growing capacity of state education agencies to assure the availability of a free appropriate public education to all handicapped children." "Federal efforts since the enactment of P.L. 94-142 had periodically been modified to provide the states with increasing flexibility to implement the law in a manner consistent with local precedents and resources," another Report noted. Technical assistance was also increasingly targeted to problems of individual states and coordinated with monitoring activities. In one of the few places an Annual Report focused on a specific problem area, it was discussed in the technical assistance section. The 1983 Report found that states were still "experiencing some difficulty with certain requirements of the laws."
Finding # IV.1A
Finding # IV.1B
Recommendation # IV.1
Finding # IV.2
Recommendation # IV.2
Part V presents an overview of some private litigation challenging states' failures to ensure compliance with Part B of IDEA, and describes, in part, the outcomes of key cases impacting state monitoring systems.