Sandie's Page

**Writing Assessment**

**Introduction** What are the reasons we assess writing? (Tompkins, 2008)

 * document student growth
 * report writing achievement to students and parents
 * direct learners' instruction
 * meet grade-level standards
 * evaluate the instructional program

**I. Past** Writing assessment has passed through three phases over the last fifty years: objective tests (1950-1970), holistically scored essays (1970-1986), and portfolio assessments (1986-present) (King, 2007).

**II. Present**

**Rubrics** are any set of criteria that describes a range of degrees of excellence or levels of development in an activity, process, or product (Andrade, 2009; Novak, 1996; Reeves & Stanford, 2009). Rubrics:
 * measure student performance more precisely than a standardized measurement
 * help teachers focus instruction on specific aspects of writing
 * help teachers show growth in students' writing development

**Portfolios** make students active participants in their own learning through a collection of writing samples that directly reflects the curriculum (King, 2007; Tompkins, 2008). A portfolio can document:
 * informal assessment
  * observing
  * conferencing
  * writing samples
 * record keeping
  * anecdotal notes
  * checklists
 * process and product
  * writing process checklists (prewriting, drafting, revising, editing, publishing)
 * student-teacher assessment conferences
 * children's self-assessments
 * district- and state-mandated assessments

= III. Future =

**Self-Revised Essays** are written and revisited several times during the semester by the writer. The teacher does not comment directly on the essay; instead, the other essays drafted during the semester provide students with concrete examples that target their needs (King, 2007; Tompkins, 2008).

**Curriculum-Based Measurements (CBM)** are a research-based assessment tool that has produced valid and reliable measures of writing performance. CBM can provide teachers with useful student writing data that informs, guides, and monitors student progress and teacher instruction relevant to student learning (Benson & Campbell, 2009).

How does one evaluate “good” writing? Assessing student writing can be a laborious task: writing is complex to do, let alone to measure, and the cognitive demands placed on young writers make early writing development difficult to assess accurately. Because writing is an interaction among the writer, the text, and the audience, teachers need to assess more than just the finished product; they must also assess the process and the effort that brought the student to the end result. Curriculum-Based Measurement (CBM) offers an approach to assessing writing with evidence of reliability and validity (Coker & Ritchey, 2010).

= What are CBM? = “Curriculum-based measurement (CBM) provides educators with a stronger link between assessment and instruction than do standardized tests of achievement” (Gansle et al., 2004, p. 291). Because CBM are closely linked to the curriculum, they are more responsive to student achievement. CBM are inexpensive to create, can be given more frequently, and take less time to score than standardized tests (Gansle et al., 2002, 2004).

CBM begin with a written prompt. Students write in response to the prompt for three to seven minutes, depending on grade level. Written responses are commonly scored in three areas: the number of words written, the number of correctly spelled words, and the number of correct word sequences (words in proper grammatical placement).

Students' writing progress can be monitored weekly or monthly. Using a scoring system, the teacher scores three separate writing pieces and uses a general indicator, a weekly increase of 1.5 in correct minus incorrect word sequences, to gauge students' writing proficiency. The data teachers collect can then inform instructional decisions that support student progress within the curriculum.
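The weekly check described above can be sketched as a simple comparison. The function name and the use of a plain average of the three scored pieces are illustrative assumptions; the 1.5 figure is the weekly gain in correct minus incorrect word sequences mentioned above.

```python
# Sketch of the weekly CBM progress check.
# Assumption: each argument is a list of correct-minus-incorrect word
# sequence (CIWS) scores from three scored writing pieces, and progress
# is judged against the 1.5-point expected weekly gain.

EXPECTED_WEEKLY_GAIN = 1.5

def on_track(this_week_scores, last_week_scores):
    """Return True if the average CIWS rose by at least the expected gain."""
    this_week = sum(this_week_scores) / len(this_week_scores)
    last_week = sum(last_week_scores) / len(last_week_scores)
    return (this_week - last_week) >= EXPECTED_WEEKLY_GAIN
```

For example, if last week's three pieces averaged 12 CIWS and this week's average 14, the gain of 2 exceeds the 1.5 benchmark and the student is considered on track.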

= What does the research say about CBM? =

Coker and Ritchey (2010) examined the use of CBM assessments with young writers in kindergarten and first grade. The researchers sought to measure students' writing development and to “use efficient and developmentally appropriate tasks to provide a manner to monitor growth over time” (p. 178). In an extensive literature review, McMaster and Espin (2007) found that most studies investigating CBM showed consistent evidence of 90% interrater scoring agreement.

In the study, 233 young learners in kindergarten and first grade were asked to write two sentences in response to a verbal prompt. Writing samples were collected three times over a six-month period (January, March, and May) and scored on two validated measures: the number of words written and the number of correctly spelled words.

Most kindergartners and first graders were able to complete the writing task using a word or phrase, and kindergartners produced more writing when CBM were administered later in the year. The researchers concluded that sentence writing is an appropriate means of ascertaining younger learners' writing development.

Benson and Campbell (2009) report on 40 published studies attesting to CBM's technical capabilities, as well as Fuchs's (2004) 30-year research history organized into three stages of studies.

**Stage 1 studies**
 * CBM validity measures diminish at higher grade levels, particularly in high school.

**Stage 2 studies**
 * These studies measured writing growth mainly from fall to spring, where other standardized tests may not have been sensitive enough to pick up the smaller writing progress that CBM demonstrated.

**Stage 3 studies**
 * Research in this area is insufficient: few studies have focused on monitoring struggling writers on a regular, weekly basis or on resulting changes in teachers' instructional methods.
= How can CBM be utilized in the classroom? = With CBM, it now seems viable for teachers to truly assess student writing and use that assessment information to guide instruction while still adhering to district- and state-mandated writing standards. CBM is teacher driven; quick and easy to administer; and valid, reliable, and, most importantly, measurable. Standardized testing has its place in the portfolio of student work, but rather than offering just one snapshot in time, CBM can provide data that supports and monitors students' writing progress.

At present, scoring CBM can be an arduous task, but with future practical classroom research and investigation, the scoring techniques could become as simple as scoring a Running Record in reading.

CBM can also be utilized in the higher grades with some modifications to its administration time and scoring. CBM can make struggling writers more successful by adding to the portfolio a scaffolded support system that meets learners where they are and advances their learning according to a measurable assessment rather than one high-stakes, large-scale writing test. CBM provides struggling writers with early intervention and teachers with the information necessary for response-to-intervention approaches.

No one assessment will capture all the complexities involved in writing. Combining present assessment techniques, such as rubrics and student portfolio collections, with active student participation in creating those collections can be a successful tool for both students and teachers. Graham (2007) discusses how writing needs to be valued by students and authentic, with an audience and clear, meaningful purposes that include support and feedback. Teachers gain better insight into writing instruction while students become engaged and better writers.

= Helpful Internet Resources =

 * Research Institute on Progress Monitoring (RIPM): click on the RIPM Research link, scroll down to RIPM Reports and Measures, and click on Early Writing. This easy-to-read website contains recent research on progress monitoring in writing, reading, math, and science, and provides helpful black-line master resources for use with general and special education curricula.
 * http://www.omnie.org/guidelines/files/CBM-for-Writing.pdf: a helpful website to assist educators in scoring CBM assessments. It provides directions, explanations, and scored samples.
 * Click on the link below to view the New Jersey writing standards.

= References =

Andrade, H. L., Wang, X., Du, Y., & Akawi, R. L. (2009). Rubric-referenced self-assessment and self-efficacy for writing. __The Journal of Educational Research, 102__ (4), 287-301.

Coker, D. L., & Ritchey, K. D. (2010). Curriculum-based measurement of writing in kindergarten and first grade: An investigation of production and qualitative scores. __Exceptional Children, 76__ (2).

Gansle, K., Noell, G. H., VanDerHeyden, A. M., Naquin, G. M., & Slider, N. J. (2002). Moving beyond total words written: The reliability, criterion validity, and time cost of alternative measures for curriculum-based measurement in writing. __School Psychology Review, 31__ (4).

Gansle, K., Noell, G. H., VanDerHeyden, A. M., Slider, N. J., Hoffpauir, L. D., Whitmarsh, E. L., & Naquin, G. M. (2004). An examination of the criterion validity and sensitivity to brief intervention of alternate curriculum-based measures of writing skill. __Psychology in the Schools, 41__ (3).

Graham, S., MacArthur, C. A., & Fitzgerald, J. (2007). Best practices in writing assessment. In __Best practices in writing instruction__ (pp. 265-286). New York: Guilford Press.

King, G. J. (2007). Assessing student writing: The self-revised essay. __Journal of Basic Writing, 26__ (2).

New Jersey assessment of skills and knowledge (NJ ASK) performance level descriptors, language arts literacy, grade 3. Retrieved March 2010 from State of New Jersey Department of Education: http://www.state.nj.us/education/assessment/descriptors/es/lal3.htm

Novak, J. R., Herman, J. L., & Gearhart, M. (1996). Establishing validity for performance-based assessments: An illustration for collections of student writing. __The Journal of Educational Research, 89__ (4), 220-233.

Reeves, S., & Stanford, B. (2009). Rubrics for the classroom: Assessments for students and teachers. __Delta Kappa Gamma Bulletin, 76__ (1).

Tompkins, G. E. (2008). Assessing children's writing. In __Teaching writing: Balancing process and product__ (5th ed., pp. 74-97). Upper Saddle River, NJ: Pearson Merrill Prentice Hall.

Troia, G. A. (Ed.). (2009). Assessment of student writing with curriculum-based measurement. In __Instruction and assessment for struggling writers: Evidence-based practices__ (pp. 337-353). New York: Guilford Press.