Excellent Educators for New Jersey

Educator Evaluation Frequently Asked Questions (FAQ)

Overview of Educator Evaluation in New Jersey

Q: What are the priorities of New Jersey's educator evaluation reform agenda?

A: The Department aims to improve the effectiveness of all educators in New Jersey's schools by:

  • Establishing a universal vision of highly effective teaching practice based on a common language and clear expectations;
  • Implementing teacher and principal practice measures that yield accurate and differentiated levels of performance;
  • Providing teachers with timely, actionable, and data-driven feedback;
  • Providing teachers and principals with targeted professional development opportunities aligned to assessment and feedback to support their growth;
  • Providing district administrators with improved tools by which to measure principal effectiveness;
  • Using multiple measures of performance data to inform personnel decisions.

The ultimate goal is to increase achievement for all students by ensuring that every New Jersey student has access to a highly effective teacher.

Q: Why do we need to change teacher evaluation practices in New Jersey?

A: Effective educators are the most important in-school factor for student success, but we currently lack a robust statewide evaluation system that adequately measures effectiveness. Teachers and principals need timely, meaningful feedback to improve their practice, and students deserve educators who are highly effective and continuously improve. A high-quality evaluation system will enable districts to identify each educator's professional development needs and support his/her growth. Differentiating educators based on their performance rather than treating them as interchangeable widgets is fair to educators and shows respect for their profession. A high-quality evaluation system will also help districts and schools improve their personnel decisions. By linking tenure and other personnel decisions as well as compensation levels to educator effectiveness, rather than to seniority and advanced degrees, school systems will be able to attract and retain more effective teachers and drive significant improvements in student learning. The New Jersey Educator Effectiveness Task Force Report, released in March 2011, outlines several steps for implementing an improved evaluation system.

Q: Are other states and districts changing evaluation practices?

A: Yes, many. The federal government's educational reform agenda has focused on supporting educators as professionals while also holding them accountable for student learning. Through a number of grant programs, the federal government has invested in states' development of innovative strategies that help educators improve student outcomes. Across the country, states are changing laws to support student performance-focused evaluation systems. At both the state and district levels, pilot programs are testing new systems and providing lessons learned. The Department is tracking this work closely and examining best practices to help inform our system as well.

Q:  What are the plans for statewide roll-out of the educator evaluation system?

A:  The new teacher and principal evaluation systems will be implemented in all districts in 2013-14.  The 2012-13 school year serves as a time for districts to build capacity for new evaluation procedures by following specific requirements detailed later in this FAQ.  In addition, several districts are participating in teacher and principal evaluation pilots during this school year.

On March 6, 2013, the Department of Education plans to propose regulations to the State Board of Education providing rules for educator evaluation and professional development as outlined in the TEACHNJ Act.  These regulations are scheduled to become effective at the beginning of SY13-14 and will provide greater detail about several elements of evaluation, including:

  • Calculation of the annual summative ratings for teachers, principals, assistant principals, and vice-principals;
  • Observation requirements for teachers;
  • Objective measures of student achievement for teachers of subjects and grades tested by the New Jersey Assessment of Skills and Knowledge ("NJ ASK") as well as for teachers of non-tested grades and subjects;
  • Measures of practice and student achievement for principals;
  • Mentoring;
  • Individualized professional development;
  • Corrective action plans; and
  • The role and duties of the School Improvement Panel.

The tentative 2013 timeline for State Board review of these proposed regulations is as follows:

March 6
  • Regs proposed for first discussion
April 3
  • Regs proposed for second discussion (including amendments incorporating State Board/public input, if applicable)
  • State Board to hear public testimony
May 1
  • State Board to discuss regs at proposal level
June 3
  • Proposed code published
  • Beginning of 60-day comment period
August 2
  • Close of 60-day comment period
September 4
  • State Board to consider final version of regs at adoption level
October 7
  • Office of Administrative Law (OAL) to publish final regs
  • Final regs go into effect

When proposed regulations are presented to the State Board, the Department will launch a statewide outreach initiative to communicate information about evaluation requirements for 2013-14.  Over the course of several weeks, we will conduct regional presentations, post a variety of resources on our website, and provide opportunities for educators and others to share questions and feedback.  Materials for various audiences (teachers, principals, etc.) and on various topics (evaluation system overview, student achievement and teacher practice measures, training, etc.) will be available for districts to tailor and use.  Any educator interested in providing input on topics to include in this outreach initiative should send an email to educatorevaluation@doe.state.nj.us.

Q: How will evaluation reform benefit educators?

A: New Jersey, like the vast majority of other states, does not have an evaluation system that adequately measures educator effectiveness, and the state only gives districts vague guidance. It's time we treat educators like the professionals they are by taking special care to identify and recognize greatness and spending more energy developing and supporting those needing help. To accomplish this, we need fair, credible, and rigorous evaluations to differentiate performance. Educators in pilot districts are helping shape this new system, providing vital feedback during development and implementation before statewide roll-out.

Q: How will evaluation reform benefit students?

A: More than two decades of research findings show that student achievement is strongly linked to teacher and principal effectiveness; highly skilled educators produce better student results. In order to improve student performance, we must recruit the best and brightest to the profession, prepare and reward outstanding educators, support them in honing their practice, and give them incentives to teach in schools with struggling populations. By implementing rigorous, transparent, and trustworthy evaluations, we aim to improve educator effectiveness and thus student outcomes.

Q: Will evaluations be subject to the Open Public Records Act (OPRA)?

A: No. Personnel files are generally exempt from OPRA except for name, title, position, salary, payroll record, length of service, date of separation and reason, and amount/type of pension received.  Evaluation records are included in this exemption, so they will not be made available for public access.

Tenure Law/Evaluation Connections

Q:  Which personnel does the TEACHNJ Act apply to?

A: The TEACHNJ Act reforms various elements of the tenure process for all employees who earn tenure under Title 18A, such as teachers, principals, janitors, athletic directors, staff at state institutions, and counselors.

The degree to which the tenure process is changed, and the specific elements that are changed for each employee, depend on:

1) how the employee formerly earned tenure (under which chapter);
2) the specific language in TEACHNJ; and
3) the definition of "teaching staff" (as provided in N.J.S.A. 18A:1-1; N.J.S.A. 18A:28-5; and N.J.S.A. 18A:6-119).

Where an individual employee appears to earn tenure through multiple chapters, the determination of whether a particular section of TEACHNJ applies to that individual should be made on a case-by-case basis and in conjunction with local board counsel.  Please see the following FAQs for more details on various applications of the law.

Q: For which personnel does TEACHNJ shorten the legal proceedings for tenure disputes?

A: Through TEACHNJ, the New Jersey Legislature has streamlined the process for tenure charges. Such changes apply to employees under tenure of office "in the public school system of the state," as well as employees "in any other educational institution conducted under the supervision of the commissioner" (N.J.S.A. 18A:28-5).  Thus, these reforms apply to employees earning tenure pursuant to:

  • N.J.S.A. 18A:60-1 (i.e., teachers at state institutions and the Dept. of Corrections);
  • N.J.S.A. 18A:17 (i.e., janitors, secretarial staff); and
  • N.J.S.A. 18A:28-5.

Q: For which personnel does the process for earning tenure change under TEACHNJ?

A: TEACHNJ reforms how teaching staff members employed in public schools and tenured under chapter 28 of Title 18A earn tenure.  For this group, tenure may only be earned after four years, and teaching staff members must earn a rating of at least effective in at least two out of three summative ratings.

The law defines teaching staff members as individuals in the positions of

  • teacher,
  • principal, other than administrative principal,
  • assistant principal, vice-principal,
  • assistant superintendent,
  • all school nurses including school nurse supervisors, head school nurses, chief school nurses, school nurse coordinators, and any other nurse performing school nursing services,
  • school athletic trainer, and
  • other employees who are in positions which require them to hold appropriate certificates issued by the board of examiners.

The new process for earning tenure does not apply to those who do not hold certificates issued by the State Board of Examiners, such as school business administrators shared by two or more school districts, teachers at state institutions and the Department of Corrections, janitors, secretarial staff, and anyone else who falls outside of the definition in N.J.S.A. 18A:28-5.

Q: For which personnel does the evaluation process link to tenure charges?

A: TEACHNJ requires an evaluation rubric with four defined ratings for all teaching staff members. However, only an ineffective rating for two consecutive years, or a partially effective rating followed by an ineffective rating, on an evaluation rubric will automatically trigger tenure charges. This applies to teachers, principals, assistant principals, and vice-principals who are serving as members of the professional staff of any district or regional board of education, or county vocational school (see definitions in N.J.S.A. 18A:1-1; N.J.S.A. 18A:28-5; and N.J.S.A. 18A:6-119).
The link between summative evaluation ratings and tenure charges may not apply to:

  • Teaching staff members other than teachers, principals, assistant principals, and vice-principals;
  • Staff members who earn tenure pursuant to N.J.S.A. 18A:60-1 (i.e., teachers at state institutions and the Department of Corrections); and
  • Staff members who earn tenure pursuant to N.J.S.A. 18A:17 (i.e., janitors, secretarial staff).

Q: How do teachers and principals earn tenure under the new law?

A: The new law links the earning and keeping of tenure to the results of a teacher or principal's annual summative evaluation. Any teacher, principal, assistant principal, or vice principal employed after August 6, 2012 must complete four years of employment to be eligible for tenure under the following evaluation requirements:

  • To earn tenure, a new teacher must complete a district mentorship program during his/her first year of employment.  After completion of this program, the teacher must be rated either effective or highly effective in two of the three subsequent years.
  • To earn tenure, a new principal, assistant principal, or vice principal must be rated either effective or highly effective in two annual summative evaluations within the first three years of employment, with the first effective rating on or after completion of the second year.

Q:  How do teachers and principals lose tenure under the new law?

A: If any tenured teacher, principal, assistant principal, or vice principal is rated ineffective or partially effective in two consecutive years according to the chart below, that employee will be charged with inefficiency.  The charges are promptly filed by the superintendent with the local board of education.  Within 30 days of the filing, the board of education shall forward the written charges to the Commissioner, unless the board determines that the evaluation process has not been followed.  After permitting the employee an opportunity to submit a written response to the charges, the Commissioner shall refer the case to an arbitrator to determine potential loss of tenure.  The chart below outlines these rating combinations and the related actions.

  • Ineffective, then Ineffective: the superintendent shall file a charge of inefficiency.
  • Partially Effective, then Ineffective: the superintendent shall file a charge of inefficiency.
  • Ineffective, then Partially Effective: the superintendent may file a charge of inefficiency or may defer the filing until the next year; in the following year (i.e., the third consecutive year), the superintendent shall file a charge of inefficiency if the annual rating is ineffective or partially effective.
  • Partially Effective, then Partially Effective: the superintendent may file a charge of inefficiency or may defer, under the same conditions as above.
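
For illustration only, the combinations in the chart above reduce to a simple decision rule: a charge must be filed whenever the second consecutive rating is ineffective, and may be filed (or deferred for a year) when the second rating is partially effective. The sketch below encodes that rule in Python; the function name and return strings are our own illustrative choices, not language from the statute.

    # Hypothetical sketch of the inefficiency-charge rule in the chart above;
    # names and return strings are illustrative, not statutory language.
    def inefficiency_action(year_a: str, year_b: str) -> str:
        """Map two consecutive annual ratings to the superintendent's action."""
        low = {"ineffective", "partially effective"}
        a, b = year_a.lower(), year_b.lower()
        if a not in low or b not in low:
            return "These ratings do not automatically trigger an inefficiency charge."
        if b == "ineffective":
            # Ineffective -> Ineffective, or Partially Effective -> Ineffective
            return "The superintendent shall file a charge of inefficiency."
        # Year B is Partially Effective: filing may occur now or be deferred;
        # a third consecutive low rating then requires filing.
        return "The superintendent may file a charge of inefficiency or defer it one year."

    print(inefficiency_action("Partially Effective", "Ineffective"))
    # -> The superintendent shall file a charge of inefficiency.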

In addition to these evaluation implications, tenure charges can still be brought for incapacity, conduct unbecoming, or other just cause.

Q:  How does the new law impact teachers hired before its passage who are already in the process of earning tenure under the previous law?

A:  Any teacher hired before the August 6, 2012 signing of the tenure bill is grandfathered into the previous 3-year tenure-granting process.  The Evaluation Office is in the process of defining further guidance for new evaluation procedures based on the new law, and will provide that information as it is finalized.

Q:  Are new teachers – board-approved prior to August 6, 2012 but not starting until September – grandfathered into the three-year requirement?

A:  Yes, any teacher board-approved before the August 6, 2012 signing of the tenure bill is grandfathered into the previous 3-year tenure-granting process.  The Evaluation Office is in the process of defining further guidance for new evaluation procedures based on the new law, and will provide that information as it is finalized.

Q:  Will summative ratings "count" this year (2012-13) toward tenure decisions?

A:  No – the only item "on the clock" is the mentorship year for new teachers.  No evaluation outcomes in the 2012-13 school year will impact tenure decisions.  2013-14 is the first year in which the statewide system will be in place and the first year the summative rating "clock" (i.e., teachers needing to be rated at least effective in two of three years) will start.

Q: How will the arbitration process be expedited as a result of the new law?

A:  There are four grounds for bringing tenure charges: (1) inefficiency, (2) incapacity, (3) unbecoming conduct, and (4) other just cause.  All tenure charges, regardless of the grounds, will go to an arbitrator.  For charges brought for inefficiency, the arbitrator can only consider the following: (1) whether the evaluation failed to adhere substantially to the evaluation process, including, but not limited to, providing a corrective action plan; (2) whether there is a mistake of fact in the evaluation; (3) whether the charges would not have been brought but for considerations of political affiliation, nepotism, union activity, discrimination as prohibited by State or federal law, or other conduct prohibited by State or federal law; or (4) whether the district's actions were arbitrary and capricious.  Only evaluations conducted in accordance with the rubric adopted by the Board and approved by the Commissioner may be used to bring a charge of inefficiency under this section.

There is no restriction in the law regarding what information can be considered by the arbitrator for the other three types of charges (incapacity, unbecoming conduct, or other just cause).

For all charges, the hearing shall be held within 45 days of the assignment to the arbitrator, and the written decision shall be rendered within 45 days from the start of the hearing.  This cap is intended to help ensure quicker resolutions.  Arbitrators who do not adhere to the timelines may be replaced.

Q:  How will the permanent arbitration panel be formed and assigned?

A:  The Commissioner of Education will maintain a panel of 25 permanent arbitrators.  Of the 25,

  • Nine will be designated by the New Jersey School Boards Association
  • Eight will be designated by the New Jersey Education Association
  • Five will be designated by the New Jersey Principals and Supervisors Association
  • Three will be designated by the American Federation of Teachers

The arbitrators will have knowledge and experience in the school employment sector.  The commissioner will inform the appropriate designating entity when a vacancy exists, and if that entity does not appoint an arbitrator within 30 days, the commissioner will designate one to fill that vacancy.  Arbitrators on the permanent panel will be randomly assigned by the commissioner to hear cases.

Q: What are the milestones required of all districts in 2012-13 resulting from the new law and according to the proposed Educator Effectiveness Evaluation System regulations?

A:  The TEACHNJ Act substantially modifies the process for evaluating teaching staff members. Under this new law, all districts must use evaluation rubrics to assess the effectiveness of teaching staff, and these rubrics must be reviewed and approved by the Commissioner of Education.  Districts must do the following in 2012-13:

  • Form a District Evaluation Advisory Committee (DEAC) to ensure stakeholder engagement by October 31, 2012.
    • Members must include teachers from each school level represented in the district; school administrators conducting evaluations (including at least one who participates on the School Improvement Panel and one special education administrator); central office administrators overseeing the evaluation process; the superintendent; a supervisor; a parent; and a member of the district board of education.  At the discretion of the superintendent, membership may be extended to the district business administrator and to representatives of other groups.
  • Adopt educator evaluation rubrics that include state-approved teacher and principal practice evaluation instruments by December 31, 2012.
  • Begin to test and refine evaluation rubrics by January 31, 2013.
  • Form a School Improvement Panel to oversee evaluation activities by February 1, 2013.
    • Members must include the school principal or designee, an assistant/vice principal, and a teacher.
  • Thoroughly train teachers by July 1, 2013.
  • Thoroughly train evaluators by August 31, 2013.
  • Thoroughly train principals and evaluators on the principal practice evaluation instrument by October 31, 2013.
  • Report to the Department on these activities by February 15 and August 1, 2013.

Q:  What is the reporting process for 2013?

A:  In 2013, the reporting process regarding evaluation rubrics for teachers, principals, vice-principals, and assistant principals will occur in two steps:

  • Districts will provide the Department with the names of their selected teacher and principal practice evaluation instruments and certify ongoing compliance with the TEACHNJ Act through a survey due on February 15, 2013.  Please note that these instruments (a) must be selected from the approved DOE lists and (b) are part of the broader evaluation rubrics, which comprise multiple measures of practice and student learning (see evaluation terminology explanations on our website).  Please view the January 22, 2013 memo for more details about this process.
  • Districts will report how specific aspects of their evaluation rubrics comply with Department standards by August 1, 2013 through a process to be detailed in a future communication. (Note: Department standards for evaluation in 2013-14 will be proposed to the State Board of Education in March 2013 and communicated widely).

Q: What support will the Department provide to districts in meeting evaluation requirements?

A: The Office of Evaluation will provide targeted support for some districts and will help districts with similar implementation concerns partner with one another to overcome shared obstacles.  Implementation Managers from the Office of Evaluation and County Office staff will provide field assistance as appropriate.  Finally, Regional Achievement Centers (RACs) will offer support for evaluation activities in the lowest-performing priority and focus schools.

Q: Are districts that have already selected evaluation instruments required to form a District Evaluation Advisory Committee?

A:  All New Jersey districts are required to form District Evaluation Advisory Committees (DEACs) to ensure stakeholder engagement in all aspects of evaluation reform.  Although districts may have already reviewed and selected an evaluation instrument, it is important that DEACs are in place to advise districts on implementation and ensure educators and other specified individuals have a forum for expressing questions, concerns, and other feedback about the evaluation system for teachers and principals.  The Department will continue to provide guidance on the role of DEACs supporting teacher and principal evaluation implementation in future communications.

Q: What are the requirements for the School Improvement Panel mandated by the law?

A: By February 1, 2013, every school must convene a School Improvement Panel that will oversee the mentoring and evaluations of teachers and identify professional development opportunities.  The charge of the School Improvement Panel (ScIP) is to ensure the effectiveness of the school's teachers.  Specific duties are as follows:

  • Oversee mentoring;
  • Conduct evaluations, including a mid-year evaluation of any teacher rated ineffective or partially effective in the most recent annual summative evaluation; and
  • Identify professional development opportunities.

Members of the ScIP must include the school principal or designee, an assistant/vice principal, and a teacher.  The principal will have final responsibility for ScIP membership but must consult with the majority representative in determining a suitable teacher to participate.  To do this, the association might submit suggested names for the principal to consider, or the principal might meet with association representatives to discuss teacher selection.  Principals will not be limited to choosing from among any suggested names.

Pending a State Board decision on the revised professional development regulations, the principal may decide how the school-level professional development committee will interface with the ScIP during this school year.

Q:  How is professional development aligned to evaluations according to the new law?

A: The law states that professional development aligned with evaluations must include a focus on supporting student achievement.  Educators must receive individual professional development plans.  Extra professional development is required for struggling teachers, and a corrective action plan must be created for teaching staff members rated ineffective or partially effective in the annual evaluation.  In addition, School Improvement Panels will identify professional development opportunities for school staff.

As communicated in a December 18, 2012 broadcast memo, the professional development planning process at the school and district levels for SY13-14 will remain the same as it was for SY12-13.  However, instead of submitting the district professional development plan to the County Professional Development Board for review, the chief school administrator should hold the district plan, pending action by the New Jersey State Board of Education and further guidance from the Department to be issued in the coming months.  Please refer to the memo for more information.

For more information, please visit the Office of Professional Development website (http://www.state.nj.us/education/profdev/pd/teacher/) and direct any questions about professional development requirements to teachpd@doe.state.nj.us.

Q:  Does the new law allow evaluation information to be made available to the public?

A: No. All identifiable information related to personnel evaluations will be confidential and not accessible to the public.

Q: Do the new law and regulations pertaining to evaluations and tenure pertain to charter schools?

A: Every charter school must develop and implement a high-quality, rigorous educator evaluation system, which must be approved by their board of trustees (subject to the review and approval of the Commissioner). The Office of Charter Schools will review educator accountability within the parameters established by the Department's Performance Framework and develop and disseminate guidelines for the establishment of charter schools' educator effectiveness evaluation systems in the coming months. Please visit the Department Charter Schools Website for more information.

Evaluation Requirements for All New Jersey Districts

Q:  What are the specified steps that all districts must take in 2012-13 to prepare for statewide implementation?

A: The following chart depicts deadlines and reporting procedures as districts prepare to implement new teacher and principal evaluations in 2013-14: 

  • Form District Evaluation Advisory Committee* (deadline: October 31, 2012; reporting: February 2013 survey)
  • Adopt educator evaluation rubrics that include state-approved teacher and principal practice evaluation instruments (deadline: December 31, 2012; reporting: February 2013 survey and August 2013 survey**)
  • Begin to test and refine evaluation rubrics (deadline: January 31, 2013; reporting: February 2013 survey)
  • Form School Improvement Panel (deadline: February 1, 2013; reporting: February 2013 survey)
  • Thoroughly train teachers on the teacher practice evaluation instrument (deadline: July 1, 2013; reporting: August 2013 survey)
  • Thoroughly train evaluators on the teacher practice evaluation instrument (deadline: August 31, 2013; reporting: August 2013 survey)
  • Thoroughly train principals and evaluators on the principal practice evaluation instrument (deadline: October 31, 2013; reporting: TBD)

*The District Evaluation Advisory Committee is described in the presentation and previous memos posted at http://www.state.nj.us/education/EE4NJ/presources/.
**The Department will collect specified information about rubric adoption in both surveys.

Q:  What are the "teacher and principal practice evaluation instruments" and how do they fit in to the larger "evaluation rubric?"

A:  The "teacher and principal practice evaluation instruments" are the specific tools used to assess the competencies of teacher and principal practice. The "evaluation system" refers to the overarching umbrella of all components of teacher and principal evaluation that are combined to generate a summative assessment of performance.

Both of these terms are further defined in the Definitions and Explanations of Educator Evaluation Terminology posted on this website.
  
Q: What are the requirements for selecting evaluation instruments?

A: Teacher and principal practice evaluation instruments must be approved by the New Jersey Department of Education. The Request for Qualifications (RFQ) process for state approval of teacher and principal practice evaluation instruments is complete for 2012, and the 2012 State-Approved Teacher and Principal Practice Evaluation Instruments Lists have been posted. We recognize that districts may wish to change selected instruments in the future as new and updated instruments become available.  We anticipate adding instruments to the approved lists through a future RFQ process in the spring or summer of 2013.  Districts will have the opportunity to share information about instrument changes through annual evaluation reporting procedures.

All districts are required to follow public bidding laws and regulations (detailed at http://www.state.nj.us/dca/lgs/lfns/10lfnlis.shtml) in acquiring approved evaluation instruments, and to consult with their Business Administrator (BA) for guidance. If the BA needs additional support, he or she should contact the county office of education and consult with the County School Business Administrator. Note that the providers of instruments on the approved list will not have contracts with the state, so districts must develop their own contracts. Please refer to additional FAQs on public bidding at: http://www.state.nj.us/education/EE4NJ/faq/
     
Q:  Why is the Department of Education requiring districts to go through an approval process for evaluation instruments?

A: The goals of a state-approved list of evaluation instruments are as follows:

  • to provide flexibility for districts to select an instrument that meets their distinct needs as well as state requirements;
  • to include a wide variety of approved instruments, including no- and low-cost instruments from other states and districts; and
  • to provide assurance to districts that their selected instrument meets the criteria established by the state.

Q: What is the LEA's responsibility regarding growth data this year and in the future?

A: As part of the pilot stage of developing new evaluation systems, the Department is establishing growth measures for math and language arts teachers across the state.  To prepare for statewide implementation of improved evaluations in 2013-14, we are taking several steps to provide the highest possible quality growth data to all districts.  In 2012-13, all districts are responsible for submitting Course Roster data, and Cohort 1 pilot districts are receiving some preliminary growth data for the purposes of the pilot only, as explained below:

  • All districts began providing Course Roster Submission data through NJSMART as of SY11-12.  This data is used to link individual teachers to students as appropriate.  In February of 2013, all districts will receive reports summarizing the data they provided in their SY11-12 Course Roster Submission in order to improve this process for subsequent years.
  • Using Course Roster Submission data and relevant Student Growth Percentile (SGP) scores from the 2011-12 NJ ASK, the Department calculated median Student Growth Percentile (mSGP) scores for qualifying individual teachers in Cohort 1 (2011-12) evaluation pilot districts.  The mSGP scores were sent to the pilot districts in January of 2013, and those districts have been asked to examine the data and share questions, concerns, and any other feedback with the Department.  (An illustrative sketch of the median calculation follows this list.)
  • All districts will provide Course Roster Submission data for SY12-13 at the end of this school year. 
  • The Department will use SY12-13 Course Roster Submission data and 2012-13 NJ ASK SGP scores to calculate mSGP scores for all qualifying teachers, and will provide that data to all districts in early 2014.  Guidance on how teachers' mSGP scores will be used in calculating summative evaluation scores for SY13-14 will be provided in forthcoming regulations.
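
To illustrate the arithmetic only: a teacher's mSGP is the median of the Student Growth Percentile scores of the students linked to that teacher through Course Roster data. The sketch below shows a minimal, hypothetical median calculation in Python; the roster layout and the rules for which students qualify toward a teacher's median are assumptions for illustration, not the Department's actual specification.

    # Minimal, hypothetical illustration of a median SGP (mSGP) calculation.
    # The roster layout and qualifying rules are assumptions, not the
    # Department's actual procedure.
    from statistics import median

    # teacher -> SGP scores (1-99) of students linked via course rosters
    roster_sgps = {
        "teacher_A": [34, 58, 72, 61, 45],
        "teacher_B": [81, 79, 90],
    }

    msgp = {teacher: median(scores) for teacher, scores in roster_sgps.items()}
    print(msgp)  # {'teacher_A': 58, 'teacher_B': 81}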

Q: Do these evaluation requirements pertain to charter schools?

A: Every charter school must develop and implement a high-quality, rigorous educator evaluation system, which must be approved by their board of trustees (subject to the review and approval of the Commissioner). The Office of Charter Schools will review educator accountability within the parameters established by the Department's Performance Framework and develop and disseminate guidelines for the establishment of charter schools' educator effectiveness evaluation systems in the coming months. Please visit the Department Charter Schools Website for more information.

Public Bidding and Procurement for Evaluation Instruments

Q: Must districts follow public bidding laws and regulations, as detailed at http://www.state.nj.us/dca/lgs/lfns/10lfnlis.shtml, in acquiring their approved evaluation instruments?

A: Yes. Districts should consult with their Business Administrator for guidance. If the BA needs guidance, they should contact their county office of education and consult with the County School Business Administrator.

Q. Can districts conduct sole source bidding if they have very specific requirements that only one vendor can provide?

A. No. The Public School Contracts Law does not include a sole-source exception; therefore, districts must use the competitive contracting process or the sealed bid process pursuant to N.J.S.A. 18A:18A-15(d) for the procurement of proprietary services. Sole source bidding is not allowable for New Jersey districts.

Q: Since the Department has posted lists of approved evaluation instruments, do districts need to conduct public bidding?

A: Yes; the providers of the teacher observation instruments on the approved list will not have financial contracts with the state, necessitating that districts develop their own contracts through their purchasing agents.

Q: Is there a bidding process that will take into account more than price in the selection of evaluation instruments?

A: Yes. A competitive contracting bid is described at N.J.S.A. 18A:18A-4.1 et seq. (also referred to as an "RFP" or Request for Proposal). A competitive contracting bid is awarded on the basis of price and other factors, and therefore should be written very specifically to meet the needs of the district.

Q. What is the difference between a sealed bid and a competitive contracting bid? Which is more expeditious? What are the rules that determine which can be used?

A. Sealed bids or "IFBs" (Invitations for Bids) are the typical bidding situations that most are familiar with. The sealed bid award is based solely on the lowest responsive/responsible bidder. A competitive contract is described at N.J.S.A. 18A:18A-4.1 et seq. It is also referred to as an "RFP" (Request for Proposal). A sealed bid is awarded on the basis of price alone; a competitive bid is awarded on the basis of price and other factors. In all instances, applicants should consult with their Business Administrator for guidance. If the BA needs guidance, he or she should contact the county office of education and consult with the County School Business Administrator.

Q. What is the quickest way to conduct the bidding process?

A. Sealed bids are the quickest method. In this process, the district submits specifications and accepts the bids a minimum of 10 days after the advertisement appears in the newspaper. Sealed bids are awarded on the basis of price alone (N.J.S.A. 18A:18A-4.5). Districts should confer with their Business Administrator on the appropriate bidding process. If the BA needs guidance, he/she should contact the county office of education and consult with the County School Business Administrator.

Q. What statute or regulations provide authority for competitive bidding?

A. According to Local Finance Notice LFN 2010-03, "Guidance on Local Government and Board of Education Procurement," in the development and implementation of a competitive contracting process for "school and district improvement services," districts must comply with the statutory (N.J.S.A. 18A:18A-4.1 et seq.) and regulatory (N.J.A.C. 5:34-4.1 et seq.) provisions of the process. The entire LFN is available at http://www.state.nj.us/dca/lgs/lfns/10lfnlis.shtml.

Q. If a district is already working with a vendor, and they want to deepen the work to meet the requirements of the evaluation system, does the district have to go out to bid? What type of bidding is required?

A. It depends on the type of contract the district currently has with the vendor. If the current contract was awarded without bidding, then the district must go out for bids. If the work was publicly offered and awarded, but the subject of the contract is materially different, the district must bid for the additional work. Please review the Administrative Code at N.J.A.C. 5:30-11.1 et seq. In all instances, applicants should consult with their Business Administrator for guidance. If the BA needs guidance, they should contact the county office of education and consult with the County School Business Administrator.

2012-2013 Teacher Evaluation Pilot

Q: What are the goals of the 2012-13 teacher evaluation pilot?

A:  The goals of the 2012-13 teacher evaluation pilot are to build on successes and lessons learned from the Cohort 1 (2011-12) pilot, national research, and other states to refine requirements for a statewide evaluation system; and to continue to actively engage district educators and stakeholders in shaping the new system.  To do this, we will facilitate collaboration between Cohort 1 and Cohort 2 pilot districts to expand upon the first year's work and share information.  We will also continue state and district evaluation advisory committee work and will engage an external researcher.

Q: What are the requirements of the 2012-2013 pilot?

A: All requirements of the 2012-2013 pilot are outlined in the Notice of Grant Opportunity (NGO) released on March 28, 2012. Due to federal funding restrictions, two separate NGOs were available: one for Title I LEAs with 100% of their schools receiving Title I funds and having schoolwide status, and another for all other districts (excepting Cohort 1 pilots).  Both versions of the NGO contain the same evaluation specifications and requirements.

Q: What are the major requirement differences between the 2011-12 and 2012-13 pilots?

A: Major requirement differences are as follows:

  • Some unannounced observations are required;
  • More flexibility is allowed on duration and number of observations;
  • The minimum number of observations differs for teachers of core and non-core subjects; and
  • New observation processes are required to ensure inter-rater agreement and accuracy, including use of external observers and double-scoring of some sessions.

Q: How do pilot districts interact with the Department throughout the program?

A:  Each pilot district convenes a district-level stakeholder advisory committee to oversee and guide the implementation of the evaluation system during the pilot period. Membership on this committee must include representation from the following groups: teachers from each school level (e.g., elementary, middle, high school), comprising at least one quarter of District Evaluation Advisory Committee membership; central office administrators overseeing the teacher evaluation process; the superintendent; administrators conducting evaluations; a special education administrator; a parent; and the local school board.  In addition, the committee must include a data coordinator who will be responsible for managing all data components of the district evaluation system. At the discretion of the superintendent, membership may also be extended to representatives of other groups, such as counselors, child study team members, instructional coaches, new teacher mentors, and students. One member of the advisory committee must be identified as the program liaison to the Department.

Q:  What role do educators have in the process?

A:  Educators from pilot districts are fully engaged in the program.  Teachers and administrators are trained on the teacher practice evaluation instrument.   They have the opportunity to join or provide feedback to the district advisory committee, which will regularly inform the Department on pilot progress, challenges, and opportunities for improvement.  They will help to shape the new evaluation system and are gaining experience with the system before it is implemented state-wide.

Q:  What is the structure of the state-level stakeholder advisory committee? How often do they meet, and are meetings open to the public?

A:  The statewide Evaluation Pilot Advisory Committee (EPAC) comprises stakeholders who collaborate with and advise the Department throughout implementation of the pilot program. Their role is to engage in outreach to stakeholders and constituencies and to provide feedback about issues and challenges to inform statewide implementation of an educator effectiveness evaluation system.  The EPAC has met regularly since September 2011. Because EPAC members are privy to, and provide guidance on, a variety of challenges and issues that pilots are facing in implementation, their deliberations are not open to the public. It is our expectation that many of the issues around implementation will be worked out and course corrections made during the pilot in preparation for statewide rollout.

In addition to the statewide EPAC, each pilot district forms its own advisory committee and appoints members of this committee to serve as liaisons to the statewide committee. Educators in pilot districts are able to present their feedback, questions, and concerns to the district-level advisory committee, which in turn can present them to the state EPAC.

Q: Are informal observations required in the second pilot year?

A:  The Department still encourages informal observations and walk-throughs as a best practice, but based on lessons learned from our Cohort 1 pilot districts, we are not requiring them for the second year of the teacher evaluation pilot.

Q: Why has a distinction been made between how many observations core subject teachers and non-core subject teachers receive in the pilot?

A:  Mathematics, Language Arts, Science, and Social Studies are universally regarded as the core content areas in K-12 to prepare college- and career-ready students.  Likewise, recruitment and retention figures across the country show that finding and retaining teachers in many of these subject areas, specifically math and science, is a difficult endeavor.  In turn, we believe that homing in on these skills and the related teaching is a critical investment of resources and time.  At this time, we are piloting a differentiation in the minimum number of required observations to learn from the field.  Please note that the numbers set forth in the pilot are simply minimums.  We have not lowered the number of observations for non-core teachers, but raised the number for core teachers.  Pilots may choose to observe both their core and non-core teachers at the higher level so that there is no differentiation in the number of observations.  LEAs are welcome and encouraged to conduct more observations for teachers they deem to be in need of more attention.

Q: Why are unannounced observations required? Do they preclude having pre-conferences?

A:  Unannounced observations are required in order to give the observer a different lens on teaching. Announced observations are also required. Unannounced observations do not necessarily preclude pre-observation conferences, which can happen some time during the week or so before the unannounced observation occurs.

Q: How do the new pilot observation requirements address the issue of capacity for training?

A: There are a couple of ways in which they address capacity issues. In the 2012-13 pilot year, observers are not required to hold a supervisory certification, but they must be trained, demonstrate proof of mastery in using the teacher practice evaluation instrument, and be calibrated. This allows others besides administrators to observe teachers. It also allows educators with subject-matter expertise to serve as observers, bringing another level of expertise to the evaluation and feedback process. In addition, the total number of minutes required for observations has been reduced except for tenured teachers in the core content areas.

Q. Can teacher or evaluator training be done through turnkey?

A. Yes. In the 2011-12 pilot program, Pemberton Township shared its approach to this process. In addition to training a group of teachers to provide turnkey training, Pemberton formed an auxiliary committee of teachers, supervisors, and principals to manage the messaging and ensure consistent training and communications among the turnkey trainers. This resulted in greater understanding of the teacher practice rubrics and greater clarity about procedures, and helped to ensure consistency in turnkey presentations during roll-out. Pemberton's approach to empowering and supporting teacher leaders as turnkey trainers and assembling an auxiliary committee represents a practice that other districts may want to adopt. Turnkey trainers also should be included in comprehensive evaluator training and supported in training other members of the faculty.

Q. How does the Department define proof of mastery?

A. Proof of mastery/certification refers to a set of requirements or assessments used upon completing training to determine whether a trainee observer has achieved mastery of the content of the training as well as accuracy and consistency in using the rubric as applied to practice. This designation would be conferred on candidates who have successfully completed training and achieved a high level of accuracy as defined for that instrument and rubric.

Q:  How will external observations be used in the 2012-13 pilot?

A: For all non-tenured teachers, a minimum of two observations must be conducted by an external observer. For all tenured teachers, a minimum of one observation must be conducted by an external observer. These observations will ensure teachers are receiving an assessment and related feedback from more than one source, and this data will be included in the teacher's summative rating.

An external observer must be appropriately trained and must not currently work in the school of the teacher he/she is observing. The observer must be either certified or have demonstrated proof of mastery in the teacher practice evaluation instrument adopted by the district, and must be held to all scoring quality monitoring standards.

Q. What can be used at the high school level to measure annual student achievement?

A. Pilot districts are required to develop assessments for the non-tested grades and subjects. The Department is developing guidance on appropriate measures during SY2012-13.

Q: How does this teacher evaluation system impact non-teaching staff (media specialists, guidance, CST, etc.)?

A:  The pilot teacher evaluation system is only for instructional staff.

Q:  How will the results of the pilot evaluation system be used?

A:  The pilot program, underway since September of 2011, has helped to inform new statewide procedures for evaluation as well as various elements of the new tenure law.  The pilot has offered the state and districts an excellent opportunity to collaborate on a rigorous, trustworthy, transparent system before full implementation in 2013-14.

Q. What type of on-site implementation support does the Department provide to pilot districts?

A. Department implementation managers work with pilot districts on site.  Pilot district advisory committee members are also represented on the statewide Evaluation Pilot Advisory Committee (EPAC) to hear from national experts, share lessons learned, and problem-solve.

Q: Must pilot districts purchase a web-based performance management tool to record teacher evaluations?

A: Given the large number of evaluations required, and the need to accurately capture the information on the multiple measures of teacher effectiveness that will make up the summative rating, it is critical that a web-based performance management tool be used. While several of the teacher practice instruments have proprietary web-based tools, it is not necessary to purchase one as long as the district is using a web-based tool that is able to collect, analyze, and report data according to Department specifications.

Q:  How will the 2012-13 pilot be evaluated?

A:  The Department will contract with an external research organization to conduct an independent evaluation of the teacher evaluation pilot program in 2012-13.  The evaluation, along with input from pilot districts and Department analysis, will be used to identify successes and challenges in implementing a new educator evaluation system and will inform statewide rollout in the 2013-14 school year.

Q:  How can districts not participating in the pilot get access to resources/information about program developments?

A:  This Department evaluation website provides information about the evaluation system, including detailed specifications for the teacher practice evaluation instrument, as well as other measures to be used in evaluations. New information will be posted on a regular basis as new measures are reviewed and approved. Guidance documents and resources are also available.

2012-2013 Principal Evaluation Pilot

Q: What are the goals of the new principal evaluation system?

A: The goal of the new principal evaluation system is to provide district administrators with improved tools by which to measure principal effectiveness, differentiate between those who are excelling and those who need support, and provide meaningful feedback in order to improve professional practice. The system will also help to improve principals' effectiveness by clarifying the expectations for performance, providing a common vocabulary and understanding of what principals need to know and be able to do, defining metrics that will be used to assess effectiveness, and providing meaningful feedback to inform a development plan for individual growth. In addition, a more comprehensive principal evaluation system will support districts in creating school- and system-wide collaborative cultures focused on continuous improvement through the use of multiple sources of student, teacher, and principal data to improve educators' practice and student learning. A high-quality principal evaluation system will enable districts to improve their personnel decisions concerning school leadership and will provide important data for districts and the state to use in assessing progress, setting goals and priorities, and making decisions about the professional development needs of school leaders.  Finally, the ultimate goal of teacher and principal evaluation reform is to increase achievement for all students by ensuring that every New Jersey student has access to a highly effective teacher.

Q:  How does this pilot interact with the teacher evaluation pilot?

A: The principal evaluation pilot program is the next step in the effort to improve educator evaluation state-wide, following the recommendations of the 2011 Educator Effectiveness Task Force.  This pilot builds on the work begun with the teacher evaluation pilot, underway since September 2011.  As part of the principal evaluation pilot, 10% of the principal's professional practice evaluation score will be based on the principal's effectiveness in human capital management responsibilities, such as fulfilling the requirements of district policies for the supervision and evaluation of teachers; observing and rating teachers consistently and accurately; and conducting pre- and post-observation conferences and providing feedback that will support teachers in improving their practice.  This component will be linked directly with participating districts' teacher evaluation practices.

Q:  How will the Department incorporate stakeholder feedback from the pilot?

A: Initially, principal evaluation pilot districts were asked to form a District Evaluation Advisory Committee (DEAC) to oversee implementation of their proposed principal evaluation system. As they build capacity for an improved teacher evaluation system according to state guidelines, they will add stakeholders to this committee so that it will oversee and align the district's work on both systems.  The state-level Evaluation Pilot Advisory Committee (EPAC), which was formed to guide the teacher evaluation pilot, has expanded to collaborate on principal evaluation activities.

Q: What are the program requirements for districts/consortia of districts participating in the principal evaluation pilot?

A: Participating LEAs/consortia will be given the flexibility to develop some elements on their own but will need to meet the following specific implementation requirements:

  • Implementation of a high-quality, research-based or evidence-supported instrument for evaluating principal practice during the 2012-2013 school year;
  • Incorporation of specific evaluation system policies and procedures;
  • Alignment of principal and teacher evaluation systems;
  • Provision of ongoing support for evaluators and principals;
  • Implementation of a data management system to store and analyze evaluation data;
  • Formation of a district evaluation advisory committee;
  • Creation and implementation of a pilot program communications plan;
  • Collaboration with the external researcher;
  • Collaboration with the Department;
  • Development, testing, and/or adaptation of evaluation components, measures, processes and sources of evidence; and
  • Creation of a process for linking evaluation results to individual, school and district professional development planning.

Q. What are the required components of the principal evaluation system?

A: The system comprises two main components: (1) assessment of the quality of professional practice and (2) assessment of student performance.  (A worked arithmetic example of how the component weights combine follows the list below.)

  • Fifty percent of a principal's evaluation must be based on measures of professional practice.
    • Districts/consortia must adopt a research- or evidence-based evaluation instrument and rubrics.
    • 40% of a principal's final summative evaluation rating will be derived from the evaluation of the principal's practice using this instrument.
    • 10% of the principal's professional practice evaluation score will be based on the principal's effectiveness in human capital management responsibilities, such as recruiting and retaining effective teachers and exiting ineffective ones.
  • Fifty percent of a principal's evaluation must be based on direct measures of student achievement as demonstrated by assessments and other evaluations of student work.
    • 35% of the total evaluation score must be derived from aggregated measures of student achievement.
    • Every principal must also be measured on school-specific goals related to student performance.  This measure or combination of measures should comprise 15% of the total evaluation of a principal's performance and be focused on the change in achievement of a subset of students. 
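
As an arithmetic illustration of how these weights combine, the sketch below computes a hypothetical summative score from the four components described above (40% practice instrument, 10% human capital management, 35% aggregated student achievement, 15% school-specific goals). The 0-4 component scores and the simple weighted average are our assumptions for illustration; the actual scales and combination method are set by district and state guidelines, not by this sketch.

    # Hypothetical weighted combination of the four principal evaluation
    # components; the 0-4 component scores and the simple weighted average
    # are illustrative assumptions, not the official method.
    weights = {
        "practice_instrument": 0.40,  # evaluation of principal practice
        "human_capital_mgmt":  0.10,  # human capital management responsibilities
        "student_achievement": 0.35,  # aggregated student achievement measures
        "school_goals":        0.15,  # school-specific student performance goals
    }
    scores = {  # example component scores on an assumed 0-4 scale
        "practice_instrument": 3.2,
        "human_capital_mgmt":  3.5,
        "student_achievement": 2.8,
        "school_goals":        3.0,
    }
    summative = sum(weights[k] * scores[k] for k in weights)
    print(round(summative, 2))  # 3.06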

Q: What training is required on the principal practice evaluation instrument?

A: The pilot required that rigorous and comprehensive training on the professional practice evaluation instrument and its application be provided prior to October 31, 2012. Training on the instrument is required for the following: all district- and school-level administrators, including, but not limited to, superintendents, assistant superintendents, directors, mentors, and other administrative staff responsible for evaluating or supporting principals; and all principals, vice/assistant principals, and supervisors. School board members are strongly encouraged to participate in training as well.

As a result of training, all participants should understand the principal practice evaluation instrument's domains/components of effective practice, specific performance indicators, rubrics, and sources of evidence. Administrators who will evaluate principals must demonstrate that they can apply the principal performance evaluation instrument accurately and consistently.  Training providers must issue certificates or statements of assurances that the evaluators have completed training on the instrument and its application.

After completion of the initial training on the practice evaluation instrument, each pilot LEA is expected to create time for follow-up training for those administrators who are evaluating principals and any other central office staff involved in implementing the evaluations. The purpose of this training is to provide the administrators with an opportunity to discuss implementation issues and concerns once they have begun applying the practice instrument in their districts and to receive additional support, as necessary.

In addition, participating LEAs/consortia must design a process, on their own or in collaboration with their instrument provider, to check the accuracy and consistency of those evaluating principals at least once per semester during the pilot year.

Q: Which district personnel should be included in training?

A: All superintendents, assistant superintendents, other central office administrators, principals, vice/assistant principals, supervisors, and mentors under active employment in each participating LEA should be included in training.  In addition, training should include:

  • District personnel who are involved in decisions related to the principal evaluation system
  • Staff who could evaluate vice principals and supervisors, even though vice principals and supervisors will not be included in the pilot program in 2012-13
  • Any mentors (e.g., Leader to Leader program mentors) who work with principals
  • All members of the District Evaluation Advisory Committee, which will include at least one school board member

Q: Are vice/assistant principals included in the pilot program?

A: Vice/assistant principals will not be evaluated during the 2012-13 pilot year, but they must participate in training.

Q: Will principals holding a provisional license and being mentored be evaluated in the pilot?

A: Principals holding a provisional license will not be evaluated in the pilot, but they should be included in training.

Q: What will evaluators be looking for in principal practice related to the Human Capital Management responsibilities?

A: Those evaluating principals will be expected to seek evidence of the principal's effectiveness in:

  • Fulfilling the requirements of district policies for the supervision and evaluation of teachers;
  • Observing and rating teachers consistently and accurately;
  • Conducting pre- and post-observation conferences and providing teachers with feedback that will support them in improving their practice;
  • Recruiting and/or retaining teaching staff;
  • Developing and monitoring teachers' required individual professional development plans;
  • Managing the implementation of the required school level professional development plan;
  • Providing opportunities for collaborative work time; and
  • Providing high quality professional development opportunities for staff.

Q:  How will the pilot be evaluated?

A:  The Department plans to contract with an external research organization to conduct an independent evaluation of the principal evaluation pilot program in 2012-13.  The evaluation, along with input from pilot districts and Department analysis, will be used to identify successes and challenges in implementing a new educator evaluation system and will inform statewide rollout in the future.

2011-12 Teacher Evaluation Pilot Information

Q:  Who participated in the school year 2011-12 pilot?

A:  The following ten districts were selected to participate in the pilot:  Alexandria Township (Hunterdon), Bergenfield (Bergen), Elizabeth (Union), Monroe Township (Middlesex), Ocean City (Cape May), Pemberton Township (Burlington), Red Bank (Monmouth), Secaucus (Hudson), West Deptford Township (Gloucester), and Woodstown-Pilesgrove Regional (Salem).

In addition, all schools receiving School Improvement Grant (SIG) funding at the time participated in the pilot. These included: Camden High School, Cramer College Preparatory Lab School, and U.S. Wiggins College Preparatory Lab School (Camden); Cicely Tyson High School (East Orange); Voc West Caldwell Campus (Essex); Fred Martin School of Performing Arts, Lincoln High School, and Snyder High School (Jersey City); Lakewood High School (Lakewood); Barringer High School, Brick Avon Academy, Central High School, Dayton Street School, Malcolm X Shabazz High School, Newark Vocational High School, and West Side High School (Newark); Dr. Frank Napier School of Technology and Number 10 (Paterson); and Abraham Clark High School (Roselle Borough).

Finally, Newark also participated in the pilot, through a separate grant.

Q:  How were districts selected for the pilot?

A:  The DOE followed evaluation procedures determined by state grants protocol.  According to this protocol, each application was evaluated and rated by a panel of three readers: one reader from within the Department and two readers external to the Department with deep knowledge of the content area.  We utilized experienced Department staff with knowledge of teaching and learning as well as the grants process.  We also included experienced school district practitioners, higher education staff working in teacher preparation, and individuals with an extensive understanding of teacher effectiveness issues.  All readers certified that no conflict of interest existed that would create an undue advantage or disadvantage for any applicant in the application evaluation and scoring process.

Applications were evaluated based on the quality, comprehensiveness, completeness, accuracy, and appropriateness of response to the guidelines and requirements of the governing NGO.

Grant application readers used the selection criteria listed below, as well as the application construction guidelines, as the basis for their evaluations:

  • The project plan is comprehensive and reasonable, addresses the identified local conditions and/or needs, and will contribute to the achievement of the intended benefits of the grant program.
  • The project goals and objectives are properly constructed and logically sequenced to substantiate the project plan, and are supported by specific and measurable indicators that will allow for objective assessment of progress toward achievement of the goals and objectives.
  • The project activities represent a well-defined and logically sequenced series of steps which will result in the achievement of each goal and corresponding objective(s).
  • The project budget is integrated with the comprehensive project plan, and proposed expenditures are necessary and reasonable for the effective implementation of the project activities.
  • The agency's commitment to the project is well-documented, and the agency possesses the requisite organizational capacity and authority, including necessary resources and relevant experience, to support successful implementation.

To ensure the widest possible distribution, the New Jersey Department of Education made awards to the highest-ranking application in each District Factor Group and in each region (north, central, south).  From the remaining applications, awards were made in rank order of total score, subject to available funds.  To be considered eligible, an application must have scored at least 65 points out of 100.

Q:  How was the pilot funded?

A:  To help pilot districts implement a strong evaluation system, the New Jersey Department of Education awarded a total of $1,160,000 in EE4NJ grants to pilot districts.  This is a major investment in this critical work and demonstrates the Department's commitment to working with districts and schools as partners.  LEAs secured the services of an outside vendor to provide training on a teacher practice instrument that meets the requirements set forth in the Notice of Grant Opportunity. Funding levels were derived from estimated costs for all required components of the EE4NJ program. Total final costs may be higher or lower than the grant amounts provided, depending on the provider and services chosen by the district to deliver the training and other program elements.

Q:  How were School Improvement Grant (SIG) schools and districts involved with the pilot?

A: The School Improvement Grant required that all participating districts implement teacher and leader evaluation systems in SIG schools in the 2011-12 school year. SIG schools therefore met the requirements of the teacher evaluation pilot using SIG funding.

Q: What were some of the key lessons learned from the Cohort 1 pilot?

A:  The following are some of the high-level lessons learned:

  • Stakeholder engagement is critical for ensuring buy-in during the initial implementation phase
  • District Evaluation Advisory Committee (DEAC) meetings that are open to additional staff members help build a culture of trust, transparency, and two-way communication
  • Selection and procurement of a teacher practice evaluation instrument requires buy-in from stakeholders and a process taking 4-8 weeks
  • Quality observer and teacher training is critical to help ensure teacher understanding, the quality of observer feedback, and the reliability and accuracy of observer judgments
  • Administrators face capacity challenges in completing the increased number of observations, so observations must be prioritized
  • Identifying and/or developing student achievement measures for teachers of non-tested grades and subjects presents a significant challenge

The Department continues to work with the pilot districts and related stakeholder advisory committees to address these lessons and apply them to ongoing evaluation reform efforts.

Q:  How is the pilot being evaluated?

A:  The New Jersey Department of Education selected Rutgers University Graduate School of Education to conduct an independent evaluation of the teacher evaluation pilot.  Rutgers' evaluation, along with input from the pilot districts and Department analysis, will be used to identify successes and challenges in implementing a new educator evaluation system and will inform statewide rollout of a new evaluation framework in the 2013-14 school year.

Q: What is the role of the external researcher?

A: The Department hired Rutgers University to serve as an external researcher to the teacher evaluation pilot in 2011-12. Rutgers is focusing its work on four main goals:

  1. To assess the extent and quality of schools' and districts' efforts to develop and implement measures of teacher performance and student achievement growth.
  2. To identify common contextual facilitators and barriers to the implementation of the new teacher evaluation system.
  3. To assess various stakeholders' perceptions of the teacher evaluation system.
  4. To examine the impact of implementing the new teacher evaluation system on collaborative school cultures and professional development.

Rutgers is providing interim reports and one final evaluation of the pilot.  More information on these reports is forthcoming.

Student Growth Measures and Evaluation

Note: answers adapted from MCAS Student Growth Percentiles: Interpretive Guide, March 2011, Massachusetts Department of Elementary and Secondary Education

Q: What is a growth model?

A: For K-12 education, a "growth model" is a method of measuring individual student progress on statewide assessments by tracking the scores of the same students from one year to the next. Traditional student assessment reports show student achievement on the given assessment, whereas growth reports show how much change or "growth" there has been in achievement from year to year.

Q: What questions can a growth model help answer?

A: The growth model allows districts and schools to more easily identify promising, or potentially struggling, programs and practices—and therefore to look deeper into what may or may not be working. A growth model can help answer such questions as:

  • How much academic progress did an individual or group of students make in one or more years?
  • How does an individual student's growth compare to that of students with similar prior test scores?
  • Is a student's, school's, or district's growth higher than, lower than, or similar to typical growth?
  • Which schools or districts demonstrate better than (or less than) typical growth for their students as compared to schools or districts with similar overall achievement?

Q: Why did New Jersey develop a growth model to measure student progress?

A: New Jersey developed a growth model to help answer the question, "How much academic progress did a student or group of students make in one year, as measured by NJ ASK, in relation to their academic peers?" This will help districts and schools to examine why results differ for certain groups of students and support discovery of which approaches are working best to help more students achieve.

While each district is responsible for developing the contours of its new evaluation system, the state has committed to developing measures of student growth based on NJ ASK, where applicable, for students, teachers, schools, and districts.  By using growth to calculate student outcomes, the department recognizes that our students enter each grade level at different starting points with unique challenges.  We believe we should focus on constant improvement at every point in the continuum of achievement, rather than merely on how many students attain proficiency.  This is a recommendation that we have heard consistently from educators, school and district leaders, and national experts, and we are committed to ensuring that achievement measures accurately and fairly account for growth.

Q: How does New Jersey measure student growth?

A: For subjects tested by the New Jersey Assessment of Skills and Knowledge (NJ ASK), New Jersey measures growth for an individual student by comparing the change in his or her NJ ASK achievement from one year to the next to that of all other students in the state who had similar historical results (the student's "academic peers"). This change in achievement is reported as a Student Growth Percentile ("SGP") and indicates how high or low that student's growth was as compared to that of his/her academic peers.
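
As a rough intuition for what an SGP means, the sketch below ranks one student's current-year score among a hypothetical group of academic peers. This simplification is for illustration only; it is not the statistical model the state actually uses to compute SGPs, and the scores shown are invented.

```python
# Rough illustration of the growth-percentile idea: rank a student's
# current-year score among "academic peers" (students with similar prior
# scores). For intuition only; not New Jersey's actual SGP methodology.
from bisect import bisect_left

def illustrative_growth_percentile(student_score: float,
                                   peer_scores: list[float]) -> int:
    """Percent of academic peers whose current-year score is below the
    student's score (roughly a 0-99 percentile)."""
    ranked = sorted(peer_scores)
    below = bisect_left(ranked, student_score)
    return round(100 * below / len(ranked))

# Hypothetical current-year scores for 8 peers with a similar score history.
peers = [182, 190, 195, 200, 204, 210, 215, 221]
print(illustrative_growth_percentile(205, peers))  # 62: grew more than ~62% of peers
```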

Q: Can students who perform at the top range of the Advanced level show growth?

A: Yes. Unlike other methodologies, the Student Growth Percentile methodology does not use the scale score as part of the 'math' in determining growth. Thus, it is possible for a student who scores a perfect or nearly perfect scale score in the first year and a perfect or nearly perfect scale score in the second year to still demonstrate growth relative to other students who also have a history of perfect or nearly perfect scores.

Q: How does New Jersey attribute student growth to a district, school, or teacher?

A: For a district or school, the Student Growth Percentiles (SGPs) for all students are compiled in an ascending numerical list to identify the median Student Growth Percentile (mSGP).  The mSGP is a representation of "average" growth for students in the district or school.  Half of the students had growth percentiles higher than the median and half had lower.

For an individual teacher, the score represents the mSGP for all of a given teacher's qualifying students in a school year.  To calculate this score, the department creates an ascending list of SGP scores of the qualifying students for an individual teacher.  The median score on this list becomes that teacher's mSGP score.  Each district is required to use NJSMART (New Jersey's student record system) to submit information detailing the assignment of students to individual teachers in a given school year.  After receiving roster data, the department can link individual teachers to their identified students' SGPs to determine the mSGP.
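
The median step described above is simple enough to show directly. The sketch below is a minimal illustration of that calculation; the student SGP values are hypothetical, and the actual department pipeline works from NJSMART roster submissions rather than a hand-typed list.

```python
# Minimal sketch of the mSGP calculation described above: compile the
# qualifying students' SGPs in ascending order and take the median.
# statistics.median averages the two middle values when the count is even.
from statistics import median

def teacher_msgp(student_sgps: list[int]) -> float:
    """Median Student Growth Percentile across one teacher's qualifying students."""
    if not student_sgps:
        raise ValueError("no qualifying student SGPs for this teacher")
    return median(sorted(student_sgps))  # sorting mirrors the text; median() handles order itself

# Hypothetical SGPs for one teacher's qualifying students.
print(teacher_msgp([12, 45, 67, 50, 88]))  # odd count  -> 50
print(teacher_msgp([12, 45, 67, 88]))      # even count -> 56.0 (mean of 45 and 67)
```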

The quality of the mSGP data that the department produces depends entirely on the accuracy of Course Roster Submissions by districts at the end of each school year.  If students are attributed to a teacher incorrectly, that teacher's mSGP score will be incorrect.

Q: What is a district's responsibility to ensure the accuracy of Course Roster Submissions?

A: Districts are responsible for ensuring that their data is accurate when submitted to NJSMART. Every year, from approximately mid-May to the end of June, a six-week "practice" submission window occurs for all NJSMART data submissions.  This practice window gives districts sufficient time to prepare their data and reach out for technical assistance to the NJSMART Help Desk as needed.  This helps to ensure district data meets the appropriate technical quality when the official submission window opens in the summer.  The department strongly encourages all districts to submit data in the practice window.

In February of 2013, the department will provide each district with a report summarizing its SY11-12 Course Roster Submission data.  Our goal for this exercise is to help all districts better prepare for accurate Course Roster Submissions for SY12-13 and beyond. Course Roster Submission summary reports will include the following:

  • Names and Staff Member IDs (SMIDs) of all teachers who would have received median Student Growth Percentile (mSGP) data for the previous school year under the parameters of the new evaluation system;
    • Please note: reports will not include the actual mSGP information, but simply a list of those to whom the data would apply.
    • The department has not calculated mSGP data for teachers in non-pilot districts and thus cannot provide such information for SY11-12.
  • The overall percent proficient in math and language arts; and
  • The district-level mSGP.

We encourage districts to use the summary reports to check the accuracy of their roster data.  As part of this process, districts should consider developing systems to ensure that submitted data is thoroughly vetted. The following resource from the Data Quality Campaign, entitled "Effectively Linking Teacher and Student Data," might be helpful as you examine these processes: http://www.tsdl.org/resources/site1/general/White%20Papers/DQC_TSDL_7-27.pdf.

Q: For which grades and subjects will New Jersey report growth data, and when will this information be available for all districts?

A: Currently, growth measures are developed only for teachers of Language Arts Literacy and Math in grades 4 through 8 who have the requisite number of individual student scores attributable to them. Beginning with SY13-14, growth data for all qualifying teachers across the state will be provided to all districts, not only as part of educator evaluations but also as a useful measure of students' academic progress.  The department will detail the use of median Student Growth Percentile (mSGP) scores in calculating teachers' summative evaluation ratings in forthcoming regulations.  Lessons learned from the distribution of this data to our pilot districts will inform decisions, and we will share more information when it is available.

We continue to work with and learn from pilot districts to ensure that all districts across the state have access to the best resources for how to measure student learning in non-tested grades and subjects.

Q: How is growth data being used in the evaluation pilot?

A:  We are currently conducting teacher and principal evaluation pilots.  In January of 2013, districts that participated in the first year of the teacher evaluation pilot received median Student Growth Percentile (mSGP) data for school year 2011-12.  The methodology used to produce this median SGP data applied only to the pilot and is not intended to signal the methodology to be used in future years; such policy decisions will be proposed in forthcoming regulations.  For the purpose of evaluations, mSGP will be only one factor among many in determining a summative evaluation rating. Those pilot districts are now reviewing the data closely and providing feedback to inform the best use of student growth data for all districts across the state.  While the department has provided the first iteration of growth measures to pilot districts, consistent with the objectives of this pilot, the state is not prescribing individual or school-based actions, either consequences or recognition, based on this data.  If a district chooses to use the data in any manner related to personnel actions, it must do so confidentially per N.J.S.A. 18A:6-121(d).  Please note that the TEACHNJ Act, signed into law in August 2012, prohibits the public release of this data.

Q: How can I learn more about growth measures in New Jersey?

A: For more information about measuring student growth, please view the video about Student Growth Percentile methodology at http://survey.pcgus.com/njgrowth/player.html. If you have questions about this data and its use in evaluation, please contact the Evaluation Office at educatorevaluation@doe.state.nj.us.