Massachusetts will link 50 percent of community college funding to improvements in graduation rates, workforce development and minority and low-income student success. That’s one of the most ambitious performance-funding programs in the nation.
Forty-five states and the District of Columbia are moving forward on Common Core Standards, but support for common testing is eroding, reports StateImpact.
Georgia will use its own exam, instead of the costlier test developed by the Partnership for Assessment of Readiness for College and Careers (PARCC).
Two of Florida’s top elected leaders want Florida to leave PARCC, even though Florida is the fiscal agent for the testing consortium.
Already Alabama, North Dakota and Pennsylvania have left the consortium. Oklahoma plans to design its own test, and Indiana isn’t participating in PARCC governing board meetings right now. State education officials say they’re waiting until after a mandatory legislative review of the Common Core academic standards.
That brings the number of states participating in PARCC down to 18 plus the District of Columbia.
The crumbling of the testing consortia is a “disaster,” writes Andy Smarick on Flypaper.
At this point, I won’t be surprised if we end up with 20 or more different testing systems in 2014–15. So much for commonness, so much for comparability. Rigor and alignment with tough standards are likely the next to fall.
Blinded by “technocratic hubris,” common assessment advocates “underestimated how difficult it would be to undo decades of state policy and practice on tests,” writes Smarick. Governors and state chiefs will be reluctant to spend lots of money for a testing system that will make their schools and teachers look bad, he predicted six months ago.
The Common Core sky isn’t falling, responds Checker Finn, also a Fordhamite. This is “right sizing.”
The forty-five-state thing was always artificial, induced by Race to the Top greed and perhaps a crowd mentality. Never in a million years were we going to see forty-five states truly embrace these rigorous academic expectations for their students, teachers, and schools, meet all the implementation challenges (curriculum, textbooks, technology, teacher prep, etc.), deploy new assessments, install the results of those assessments in their accountability systems, and live with the consequences of zillions of kids who, at least in the near term, fail to clear the higher bar.
It’s “better for states to drop out in advance than to fake it, pretending to use the Common Core standards but never really implementing them,” Finn writes. “That’s long-standing California-style behavior (fine standards, wretched implementation), in contrast with Massachusetts-style behavior (exemplary standards and serious implementation—and results to show for it).”
Most of the drop-out states will keep the standards, but write their own tests or sign up with ACT. They’ll give up comparability, “one of the major benefits of commonality,” Finn writes. Some may change their minds later “or face up to the fact that (like Texas and Virginia) they don’t really want to use the Common Core at all.”
States define “proficiency” very differently, write Paul Peterson and Peter Kaplan in Education Next.
Massachusetts, Tennessee and Missouri have the highest expectations, while Alabama and Georgia expect the least of their students. Texas, Michigan, Idaho, Illinois and Virginia also set a low bar.
Proficiency standards declined in rigor in 26 states and D.C. between 2009 and 2011, while 24 states increased rigor, the study found.
The study grades the states for setting high standards, not on whether students meet those standards.
Having been graded an F in every previous report, (Tennessee) made the astounding jump to a straight A in 2011. . . state tests were made much more challenging and the percentage of students identified as proficient dropped from 90 percent or more to around 50 percent, a candid admission of the challenges the Tennessee schools faced.
West Virginia, New York, Nebraska, and Delaware also strengthened proficiency standards, while New Mexico, Washington, Hawaii, Montana, and Georgia lowered the bar.
Uneven at the Start, a new Education Trust report, looks at academic performance to predict how different states will meet the challenge of Common Core standards.
New Jersey, Maryland and Massachusetts show strong performance and improvement for all students — and for disadvantaged students, reports Ed Trust. Performance is weak in West Virginia and Oregon. Ohio and Wisconsin do well for students overall, but poorly for “one or more of their underserved groups.”
Education Trust also has updated its EdWatch reports, which analyze college and career readiness and high school and college graduation rates for all groups of students in each state. The state academic performance and improvement tool shows how each state compares with the national average and with other states.
Massachusetts colleges and universities hired 75 percent more administrators in the last 25 years, three times the rate of enrollment growth. Officials say they need to provide more student services and cope with more federal regulations.
All 11 million community college students and 1,200 community college presidents should demand equitable funding for community colleges, which serve the neediest students, writes a professor. If protest doesn’t work, litigate.
Testing controversies didn’t start with No Child Left Behind or Race to the Top, writes William J. Reese, an education historian at the University of Wisconsin, Madison, in the New York Times. “Members of the Boston School Committee fired the first shots in the testing wars in the summer of 1845.”
Many Bostonians smugly assumed that their well-funded public schools were the nation’s best.
. . . Citizens were in for a shock. For the first time, examiners gave the highest grammar school classes a common written test, conceived by a few political activists who wanted precise measurements of school achievement. The examiners tested 530 pupils — the cream of the crop below high school. Most flunked. Critics immediately accused the examiners of injecting politics into the schools and demeaning both teachers and pupils.
In 1837, education reformer Horace Mann, the “father of the common school,” became secretary of the newly created Massachusetts Board of Education, which was “part of the Whig Party’s effort to centralize authority and make schools modern and accountable,” writes Reese. “After a fact-finding trip abroad, Mann claimed in 1844 in a nationally publicized report that Prussia’s schools were more child-friendly and superior to America’s.” (Prussia was the Finland of the mid-19th century!)
Mann’s friend Samuel Gridley Howe was elected to the School Committee. As a member of the examining committee, he insisted on written rather than oral tests.
His committee arrived at Boston’s grammar schools with preprinted questions, which angered the masters and terrified students. Pupils had one hour per subject to write answers to questions drawn from assigned textbooks.
Only 30 percent passed. It turned out that students had “memorized material they often did not understand,” Reese writes.
The examiners believed that the teacher made the school, a guiding assumption in the emerging ethos of testing. Tests, they said, would identify the many teachers who emphasized rote instruction, not understanding. They named the worst ones and called for their removal.
. . . Anticipating an angry reaction from parents, Mann told Howe to deflect criticism from the examiners by blaming the masters for low scores. While the School Committee fired a few head teachers, parents nevertheless accused Howe of deliberately embarrassing the pupils and bounced him out of office in the next election.
Testing continued. Examiners caught one master leaking questions to students. They criticized a school for black students for low expectations and performance. They worried about how to evaluate school quality.
“Comparison of schools cannot be just,” the chairman of the examining committee wrote in 1850, “while the subjects of instruction are so differently situated as to fire-side influence, and subjected to the draw-backs inseparable from place of birth, of age, of residence, and many other adverse circumstances.”
The history is “eerily familiar,” writes Reese, author of Testing Wars in the Public Schools: A Forgotten History.
Massachusetts may eliminate a cap on charter schools in 29 low-performing school districts, including Boston, reports the Wall Street Journal. Two Democratic legislators introduced the bill.
(State Sen. Barry) Finegold, the bill’s sponsor and the son of public-school teachers, said his motivation sprang from conversations with parents in Lawrence, part of his district northwest of Boston, where the struggling school district was taken over by the state in 2011. The state has since brought in charter operators to run two low-performing schools, and parents told him, “we’d be out of here” had that not happened, Mr. Finegold said. “One thing I don’t think people realize—charter schools are keeping a lot of the middle class in cities,” he said.
More than 31,000 Massachusetts students attend charter schools, an increase of 20 percent in the past four years.
Massachusetts ranks its schools from Level One, the highest, to Level Five based on academic achievement, graduation and dropout rates. This year, 59% of charter schools in the state were Level One, compared with 31% of non-charter schools.
A recent CREDO study found Massachusetts charters produce learning gains statewide — very large learning gains for Boston students.
Boston charter students gain 13 additional months of learning in math and 12 extra months in reading compared to similar students in nearby district-run schools, concludes the latest CREDO study to find significant gains for urban charter students.
Eighty-three percent of Boston charter schools did significantly better than comparison schools; no Boston charter did worse. “The Boston charter schools offer students from historically underserved backgrounds a real and sustained chance to close the achievement gap,” said Margaret Raymond, who directs CREDO at Stanford University.
Statewide, the typical student in a Massachusetts charter school gains an extra one and a half months of learning per year in reading and two and a half in math.
Mike Goldstein, who founded the high-scoring MATCH charter in Boston, wants more research on why the city’s charters outperform Boston’s semi-independent “pilot” schools, which draw students with similar demographics. What are Boston’s charters doing right?
Some 45,000 Massachusetts students are on charter school waiting lists because the state caps the number of charters in Boston and other low-performing districts.
Test haters have become myth makers, write Kathleen Porter-Magee and Jennifer Borgioli on Gadfly.
The idea is that teachers know best and that standardized testing—or any kind of testing, really, other than the teacher-built kind—is a distracting nuisance that saps valuable instructional time, deflects instructors from what’s most essential, and yields very little useful information about student learning.
. . . research has consistently demonstrated that, absent independent checks, many teachers hold low-income and minority students to different standards than their affluent, white peers.
. . . Standardized tests not only help us unearth these biases but also put the spotlight on achievement gaps that need to be closed, students who need extra help, schools that are struggling, and so on. And by doing so, they drive critical conversations about the curriculum, pedagogy, and state and district policies that we need to catch kids up and get them back on the path to success.
Testing also is blamed for “drill-and-kill” instruction that existed long before the testing-and-accountability era, they write.
All else being equal, the students who typically fare better on state tests are those whose teachers focus not on empty test-taking tricks but rather on content-rich and intellectually engaging curriculum.
Ironically, an anti-testing position paper by the Chicago Teachers Union showed test-prepping teachers’ students scored lower on the ACT than students who were given “intellectually demanding work.”
Standardized tests don’t measure “what really matters” in education, such as critical thinking or social and emotional skills, critics complain. No test can measure everything, concede Porter-Magee and Borgioli. But many skills can be evaluated.
Anti-testers argue that setting standards and aligning assessments to them doesn’t work because it’s not what the Finns do.
Our own history suggests that it is exactly the states that have set rigorous standards connected to strong accountability regimes—most notably, Massachusetts—that have seen the greatest gains for all students, not just our most disadvantaged.
Meaningful reform will “require the effective measurement of student achievement that tests make possible,” they conclude.
Four-year college graduates’ skills don’t match available jobs, complained employers in Fort Collins, Colorado. A local liquor company employs three people with master’s degrees, including a beer stocker with a physics degree.
A college degree is a valuable investment, but the first four to five years after college are “tougher than they’ve ever been,” said Martin Shields, a Colorado State economics professor.