MEINFO-L Archives

MEINFO-L - Maine State Library Announcements and Information List

MEINFO-L@LISTS.MAINE.EDU

Subject:
From: "Norton, Sylvia K" <[log in to unmask]>
Reply To: Norton, Sylvia K
Date: Tue, 17 Jun 2003 14:55:38 -0400
> TO:  ALL SUPERINTENDENTS OF SCHOOLS
>
> The following message is being sent at the request of the Commissioner.
>
> ***** Please forward to School Principals *****
> --------------------------------------------------------------------------
> PLEASE NOTE:
> Below is an electronic copy of the NAEP Newsletter, the 2nd in a series
> of newsletters that you will receive electronically from your NAEP State
> Coordinator, John Kennedy. For the time being, this is the ONLY WAY you
> will receive information about Maine's NAEP scores and other important
> NAEP information. If you have any questions or did not receive the 1st
> NAEP Newsletter, please contact John Kennedy directly by phone at
> 624-6636 or via e-mail at [log in to unmask].
>
>
> NAEP Newsletter #2
> What is a NAEP Framework?
> A framework is the blueprint for an assessment.  In general, NAEP
> frameworks are documents drawn up by teams of teachers, school officials,
> and members of the general public to guide the development and scoring of
> items in each subject.
> NAEP frameworks generally have a ten-year lifespan.  The 1992 Reading
> framework was developed in what the National Assessment Governing Board
> (NAGB) calls a "national consensus effort" managed by the Council of Chief
> State School Officers (CCSSO).
> The NAEP Mathematics framework, which has been revised several times since
> 1990 to align with National Council of Teachers of Mathematics (NCTM)
> standards, is due to be replaced in 2005.
> The 2007 NAEP Reading framework is currently being developed by a national
> committee of educators supervised by American Institutes for Research
> (AIR).
> How Do NAEP Frameworks Generate Assessment Results?
> NAEP reports results for states and the nation as scaled scores and
> percentiles, and in relation to Achievement Levels.  NAEP also reports
> subscale scores corresponding to Reading contexts and Mathematics strands
> described in their respective frameworks:
> Reading Contexts
> Reading for Literary Experience
> Reading to Gain Information
> Reading to Perform a Task
> Mathematics Strands
> Number Sense, Properties, and Operations
> Measurement
> Geometry and Spatial Sense
> Data Analysis, Statistics, and Probability
> Algebra and Functions
> How Do NAEP Frameworks Compare with Maine's Learning Results?
> NAEP Reading subscales are based upon items sorted by passage type using
> the contexts specified by the NAEP Reading framework; MEA categories of
> items are structured differently because they are based upon Learning
> Results standards.
> In addition, NAEP Reading and Mathematics surveys use sets of
> sub-categories that are not reflected in the scaled scores.  The NAEP
> Reading framework also specifies four stances; the NAEP Mathematics
> framework, two domains:
> Reading Stances
> Initial Understanding
> Developing Interpretation
> Personal Response
> Critical Stance
> Mathematics Domains
> Mathematical Abilities
> Mathematical Power
> These sub-categories are used to classify and distribute the items in an
> assessment for a given year.  There appears to be a correspondence between
> NAEP Mathematics strands and the proposed MEA Mathematics clusters; the
> distribution of categories of items in both assessments is roughly similar
> if counted at the strand/cluster level.
> The distributions of multiple choice and constructed response items in
> NAEP and MEA assessments differ in that NAEP uses more constructed
> response and MEA uses more multiple-choice items overall, but this effect
> is offset by assigning different values to the different item types in MEA
> scoring.
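> The offset works through simple point weighting.  A minimal sketch in
> Python, with hypothetical point values (each multiple-choice item worth 1
> point, each constructed-response item scored 0-4 on a rubric; the actual
> MEA values may differ):
>
>     def raw_score(mc_correct: int, cr_rubric_scores: list[int]) -> int:
>         # Each multiple-choice item contributes 1 point; each
>         # constructed-response item contributes its 0-4 rubric score.
>         return mc_correct + sum(cr_rubric_scores)
>
>     # 30 multiple-choice items correct plus three constructed-response
>     # items scored 4, 2, and 3 on the rubric:
>     print(raw_score(30, [4, 2, 3]))  # -> 39
>
> Under weights like these, a small number of constructed-response items
> can carry as much weight as a much larger set of multiple-choice items.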
> Since the NAEP and MEA Reading blueprints differ so greatly, direct
> comparison of results cannot really be made except with respect to NAEP
> Achievement Levels, which represent the application of standards to the
> NAEP scaled scores.
> NAEP Achievement Levels
> In order to interpret the scaled scores in terms of student proficiency in
> each subject, NAGB has developed Achievement Levels of Basic, Proficient,
> and Advanced in mathematics, reading, history, geography, science,
> writing, and civics since 1990.  These are more easily compared with the
> performance levels in Maine's Learning Results than scaled scores are.
> While MEA scores are directly connected to standards set by Maine's
> Learning Results, NAEP scores are based upon the comparative performance
> of students across the nation.  Maine students participating in an
> assessment make up a representative cross-section of the student
> population of the state.  In 2000, fourth graders in Maine produced an
> average NAEP mathematics scaled score of 231 (compared with 226
> nationwide).  On the NAEP achievement scale for Mathematics, 214 is the
> lower cut-off point for Basic and 249 is the lower cut-off for Proficient.
> Only seven other states had average scores that were higher than Maine's,
> and those were only 1 to 3 points higher.  Statistically speaking, Maine's
> 4th graders performed on average as well as or better than their peers
> across the nation, but the majority of students sampled in the state and
> across the nation ranked in the less-than-proficient range.
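> To make the cut points concrete, the following Python sketch classifies a
> scaled score using only the two cut-offs quoted above (214 for Basic, 249
> for Proficient, grade 4 Mathematics); the Advanced cut-off is omitted
> because it is not given here:
>
>     BASIC_CUT = 214       # lower cut-off for Basic
>     PROFICIENT_CUT = 249  # lower cut-off for Proficient
>
>     def achievement_band(scaled_score: float) -> str:
>         # Map a NAEP scaled score to an achievement band.
>         if scaled_score < BASIC_CUT:
>             return "Below Basic"
>         if scaled_score < PROFICIENT_CUT:
>             return "Basic"
>         return "Proficient or above"
>
>     print(achievement_band(231))  # Maine's 2000 average -> "Basic"
>
> Maine's average of 231 falls in the Basic band, which is why an
> above-average state can still have most students below Proficient.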
> The percentage of Maine 4th graders at or above the Proficient level in
> Mathematics on the 2000 NAEP assessment was 25%, compared to 23% of the
> students meeting or exceeding standards on the MEA.  Clearly, both MEA and
> NAEP standards are rigorous.
> Using NAEP Item Maps
> The NAEP Achievement Levels have another implication for state assessments
> as a result of NAEP's development of item maps.  The Item Response Theory
> (IRT) statistical model that NAEP uses to develop scaled scores can be
> used to predict student performance on specific items. Each year NAEP
> releases 25% of the items used in that year's assessment.  IRT modeling
> can place these items on the achievement scale to show how the ability to
> answer a specific type of question correctly would relate to the overall
> achievement of students participating in the assessment.
> For example, a 4th grade student performing at the Basic level overall on
> the Mathematics assessment would probably be able to answer the following
> question correctly:
> "Determine how much change a person will get back from a purchase."
> But a student also answering the following question correctly has a high
> probability of achieving the Proficient level in 4th grade Mathematics:
> "Apply the concept of symmetry to visualize the result of folding a marked
> strip of paper."
> This suggests that specific MEA and NAEP items might be compared in
> content to make connections between the two assessments.  Released NAEP
> items can be found online.  Go to the National Center for Education
> Statistics (NCES) web site (http://www.nces.ed.gov).  Click on the arrow
> at the corner of the "Visit Popular NCES Sites" window, and select "Data
> Search Tools."  Click on the link "National Assessment of Educational
> Progress (NAEP) Questions Tool" about halfway down the page.
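> The item-mapping idea can be sketched with a two-parameter logistic (2PL)
> IRT model.  The 65% response probability and the item parameters below
> are illustrative assumptions, not NAEP's actual conventions; the point is
> only that IRT lets you solve for the ability level at which a given item
> is likely to be answered correctly:
>
>     import math
>
>     def p_correct(theta: float, a: float, b: float) -> float:
>         # 2PL IRT: probability of a correct response at ability theta,
>         # for an item with discrimination a and difficulty b.
>         return 1.0 / (1.0 + math.exp(-a * (theta - b)))
>
>     def map_item(a: float, b: float, rp: float = 0.65) -> float:
>         # Scale point where the response probability reaches rp,
>         # found by inverting the 2PL formula above.
>         return b + math.log(rp / (1.0 - rp)) / a
>
>     # A hypothetical item with a = 1.2, b = 0.4 maps to the ability
>     # level at which 65% of students would answer it correctly.
>     print(round(map_item(1.2, 0.4), 2))  # -> 0.92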
> Comparing Scores
> NAEP and MEA scoring scales are not as similar as they may seem.  MEA
> scores are statistically computed from individual students' raw scores,
> while NAEP scores are derived from abstract statistical values called
> thetas.
> Both assessments use IRT statistics, but each uses a different method and
> uses its results for different purposes.  MEA uses IRT for equating, which
> is a way to ensure that students taking different tests in different years
> in the same subject at the same level receive comparable scores.  MEA does
> not use IRT to create the scores themselves; NAEP does, because the
> Nation's Report Card is building a projection of student performance at
> the state and national level.
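> The last step, from thetas to reported scores, is a linear
> transformation.  In the Python sketch below, the slope and intercept are
> hypothetical placeholders chosen only to show the mechanics, not NAEP's
> actual scale coefficients:
>
>     SLOPE = 35.0       # hypothetical scale units per theta unit
>     INTERCEPT = 225.0  # hypothetical scale value at theta = 0
>
>     def to_scaled_score(theta: float) -> float:
>         # NAEP-style reporting scales are linear transformations of
>         # the underlying theta metric.
>         return SLOPE * theta + INTERCEPT
>
>     print(to_scaled_score(0.17))  # -> 230.95, i.e. about 231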
> MEA results have implications for instructional practices at the school
> level, while NAEP results do not. NAEP does not report scores at the
> school level because of the sampling methods it uses.  Each student
> participating in a NAEP assessment sees only a small portion of the items
> for that subject and level.
> Schools and students are selected by NCES across the state to provide
> complete coverage of the assessment items for a subject and level by a
> student population representative of the entire state. The same kind of
> sampling of students and items is done to generate a prediction of student
> performance for the nation as a whole.  It is unlikely that the group of
> students selected for a single school in Maine will see all of the items
> in an assessment.
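> This design is often called matrix sampling, and it can be sketched as
> follows; the pool size, block size, and assignment rule are simplified
> assumptions, not NAEP's actual booklet design:
>
>     import itertools
>
>     # Split a 60-item pool into 6 blocks of 10 items; each sampled
>     # student receives one pair of blocks, and cycling through all 15
>     # pairs covers every item across the sample.
>     ITEMS = list(range(60))
>     BLOCKS = [ITEMS[i:i + 10] for i in range(0, 60, 10)]
>     PAIRS = list(itertools.combinations(range(6), 2))  # 15 block pairs
>
>     def booklet_for(student_index: int) -> list[int]:
>         # Assign each student the items from one pair of blocks.
>         first, second = PAIRS[student_index % len(PAIRS)]
>         return BLOCKS[first] + BLOCKS[second]
>
>     # Each student sees only 20 of the 60 items, yet any 15 consecutive
>     # students jointly cover the whole pool.
>     print(len(booklet_for(0)))  # -> 20
>
> No single school's handful of sampled students is likely to exhaust all
> the block pairs, which is why NAEP cannot report school-level scores.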
> Both are valuable information resources for Maine schools in complying
> with NCLB requirements.  The federal government takes full financial
> responsibility for the NAEP assessment.  The state's contribution to the
> Nation's Report Card is the cooperation of some of its schools for a few
> hours each year or two.
> NAEP State Coordinator for Maine: [log in to unmask]
> All current NAEP frameworks may be viewed online at the NCES web site.
