inBloom doomed by privacy concerns

Privacy and security concerns doomed the inBloom Student Data Repository, reports the New York Times. The Gates-funded non-profit, which offered to manage student records, will close.

The system was meant to extract student data from disparate school grading and attendance databases, store it in the cloud and funnel it to dashboards where teachers might more effectively track the progress of individual students.

But inBloom was set to collect more than academic data, notes the Times.

An inBloom video offered a vision (using fictional students) of new uses for data in education.

The inBloom database included more than 400 different data fields that school administrators could fill in.

. . . some of the details seemed so intimate — including family relationships (“foster parent” or “father’s significant other”) and reasons for enrollment changes (“withdrawn due to illness” or “leaving school as a victim of a serious violent incident”) — that parents objected, saying that they did not want that kind of information about their children transferred to a third-party vendor.

Parents in Louisiana were upset to learn their children’s Social Security numbers had been uploaded to inBloom. 

With states and school districts bailing, inBloom wilted.


Christmas cheer raises scores

Christmas cheer raises test scores, concludes Brookings’ Matthew Chingos.

He crunches PISA data to show that scores are higher in countries where Christmas is a public holiday. (First step: Exclude Shanghai.)

That’s confirmed by NAEP scores on fourth-grade math performance from 1990 to 2013, which show test scores rise and fall with holiday cheer (measured by consumer spending in November and December).

Standardizing the NAEP scores and putting the spending index on a logarithmic scale implies that if we could just have about 30% more holiday spirit, our students would do as well as those in Finland!

Brilliant, writes Jay Greene. And the reason why “random-assignment and other research designs that more strongly identify causation are so important.”
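For the curious, here is a minimal sketch (in Python, with made-up numbers rather than Chingos’s actual data) of the kind of correlation-mining the post spoofs: standardize the test scores, put the spending index on a log scale, fit a line, and then “extrapolate” how much extra holiday spirit a Finland-level score would take. It shows how easy it is to manufacture an impressive-looking relationship without any causal claim.

```python
# A toy illustration of the spurious-correlation exercise described above.
# All numbers are invented for demonstration; this is NOT Chingos's analysis.
import numpy as np

# Hypothetical yearly pairs: average 4th-grade math score and a holiday
# spending index (Nov.-Dec. consumer spending, arbitrary units).
scores   = np.array([213.0, 220.0, 224.0, 226.0, 235.0, 238.0, 240.0, 241.0, 242.0])
spending = np.array([ 80.0,  90.0, 105.0, 110.0, 140.0, 160.0, 175.0, 185.0, 190.0])

# Standardize the scores (z-scores) and log-transform the spending index.
z = (scores - scores.mean()) / scores.std()
log_spending = np.log(spending)

# Ordinary least-squares fit: z ~ slope * log(spending) + intercept.
slope, intercept = np.polyfit(log_spending, z, 1)

# "Extrapolate": how much more holiday spending would it take to reach a
# score one standard deviation above the mean? Correlation, not causation.
target_z = 1.0
needed_spending = np.exp((target_z - intercept) / slope)
extra_spirit_pct = 100 * (needed_spending / spending[-1] - 1)
print(f"slope = {slope:.2f}; 'needed' extra holiday spirit = {extra_spirit_pct:.0f}%")
```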

From idealist to ‘bad teacher’

John Owens quit a successful publishing career, studied education for a year in graduate school and became a writing teacher at a South Bronx high school that “considered itself a model of school reform.” It didn’t go well, Owens writes in Confessions of a Bad Teacher.
Owens talks to Ed Week Teacher’s Hana Maruyama about his “heartbreaking” year as a teacher.

His principal was obsessed with data, says Owens, but the numbers were meaningless. “I had to put in 2,000 points of data a week for my kids. Everything from attendance to homework. But I also had to put in things like self-determination. I mean, what is self-determination?”

He was told he was a “bad teacher,” he complains. “If I were a good teacher, the kids who had attention deficit hyperactivity disorder would sit still and learn. If I were a good teacher, the kids who didn’t speak English would speak English. If I were a good teacher, all the problems that these kids faced would be solved in my 46 minutes a day with them.”

Data, data everywhere, but what does it mean?

Many states are collecting extensive education data, but aren’t training teachers and parents in how to use the information effectively to help students learn, concludes the Data Quality Campaign.

Data should be used to improve student achievement and inform parents, not just for “shame and blame,” said Aimee Guidera, executive director of DQC.

Teaching the quantified student

“I am a bad teacher,” wrote Sujata G. Bhatt in Valerie Strauss’ Washington Post blog in the school test-taking season of 2011. Education reformers want to use data to drive instruction, reform and accountability, wrote Bhatt. “At what cost? Does this data really represent learning and knowledge?”

Since then, she’s embraced data, Bhatt writes in The Quantified Student.

She teaches in a high-poverty Los Angeles school. Many of her students aren’t fluent in English. In the fall of 2010, her fourth graders were particularly unprepared.

Since California’s standardized test for fourth graders measured skills almost all my students needed, I analyzed its requirements, broke them down into core concepts, and then worked and reworked these concepts with the students until they felt a sense of mastery over them. My daily job consisted of finding different, creative ways of approaching, teaching, and reteaching the same core skills so that most all students could incorporate them into their cognitive toolkits.

It worked. The students succeeded wildly. They returned to me for fifth grade with heightened confidence. They saw something new in themselves: the reward of effort and the joy of success.

They also came back with questions about “how many more points it would take to get to the next level, how many more problems they’d need to get right to get those points.”  They saw the test as a game they wanted to win.

Teaching the same cohort in fifth grade, she looked for ways for her students to explore their interest in data. 

We used math websites like TenMarks that enable students to learn about their own learning even as they practice new skills. We analyzed information graphics and dove into ways of presenting numerical information. We explored how numbers shape our understanding of ourselves and the world. And much of their enthusiasm and curiosity for these tasks came out of their interest in numbers from standardized testing.

She now believes standardized testing can help teachers understand how well they’ve taught and help students become “agents in their own learning.”

Testing — and evaluation systems built on test scores — need to get a lot better, Bhatt writes. But it makes more sense “to work to create better data than to fight data.”

Data analysis is an increasingly significant and empowering way of making sense of the world. All sorts of professions use data to interpret their work and decide upon courses of action. Why shouldn’t we in education?

In the high tech world there’s a growing movement called “The Quantified Self.” With quantified self models, adults use data to change habits and behaviors–to lose weight, exercise more, to calm themselves.

“Why not help our students become makers and masters of their own data, and help them use it to propel their own learning forward?” Bhatt asks.

The Measured Man is a fascinating — and somewhat alarming — Atlantic profile of Larry Smarr, an astrophysicist, computer scientist and highly quantified human.

Miami-Dade wins Broad Prize

Miami-Dade’s school district has won the Broad Prize for Urban Education, after five years as a finalist, reports Ed Week.

More black and Hispanic students are scoring “advanced” on state tests and graduating, the foundation said. In addition, more students are taking the SATs and earning higher scores.

(Superintendent Alberto) Carvalho drew attention to improvements in some of the district’s lowest-performing schools, which he attributed partly to the Data/COM (short for Data assessment, technical assistance, coordination of management, according to Carvalho) process. During Data/COM, school officials analyze a school’s challenges and debate solutions, Carvalho said.

. . . The district’s budget has also improved dramatically under Carvalho’s tenure, which was noted by the jury. “This may seem strange, but we actually embraced the economic recession as an opportunity to leverage and accomplish change,” he said. The district found additional government and foundation funding and made sure all spending was directed at improving student achievement, Carvalho said.

Runners-up were Palm Beach County (Florida), Houston and Corona-Norco (California).

Boston Superintendent Carol R. Johnson was honored as the best urban superintendent by the Council of the Great City Schools.

Data myths

Defusing myths about classroom data will help teachers reach all students, argues the Dell Foundation.

Common education data myths stifle progress

The value-added debate

Can a few years’ data reveal bad teachers? The New York Times’ Room for Debate takes on value-added analysis.

Pomp, circumstance and then what?

Few high schools track graduates to see if they’re succeeding in college or careers. Some states are linking high school and college data to evaluate success rates.

Also on Community College Spotlight:  Community college construction has stopped in Los Angeles. The district has billions in bond money, but can’t afford to pay for building maintenance or for instructors to use the new space.

Jerry Brown: Data is useless

School performance data is a “siren song for school reform,” wrote California Gov. Jerry Brown (pdf) in vetoing a bill to add “multiple indicators,” such as graduation rates, to the state’s Academic Performance Index.

This bill requires a new collection of indices called the “Education Quality Index” (EQI), consisting of “multiple indicators,” many of which are ill-defined and some impossible to design. These “multiple indicators” are to change over time, causing measurement instability and muddling the picture of how schools perform.

SB547 would also add significant costs and confusion to the implementation of the newly-adopted Common Core standards which must be in place by 2014. This bill would require us to introduce a whole new system of accountability at the same time we are required to carry out extensive revisions to school curriculum, teaching materials and tests. That doesn’t make sense.

Finally, while SB547 attempts to improve the API, it relies on the same quantitative and standardized paradigm at the heart of the current system. The criticism of the API is that it has led schools to focus too narrowly on tested subjects and ignore other subjects and matters that are vital to a well-rounded education. SB547 certainly would add more things to measure, but it is doubtful that it would actually improve our schools. Adding more speedometers to a broken car won’t turn it into a high-performance machine.

Over the last 50 years, academic “experts” have subjected California to unceasing pedagogical change and experimentation. The current fashion is to collect endless quantitative data to populate ever-changing indicators of performance to distinguish the educational “good” from the educational “bad.”

. . . SB547 nowhere mentions good character or love of learning. It does allude to student excitement and creativity, but does not take these qualities seriously because they can’t be placed in a data stream. Lost in the bill’s turgid mandates is any recognition that quality is fundamentally different from quantity.

There are other ways to improve our schools — to indeed focus on quality. What about a system that relies on locally convened panels to visit schools, observe teachers, interview students, and examine student work? Such a system wouldn’t produce an API number, but it could improve the quality of our schools.

Actually, I doubt it.  Maybe a state school inspector could evaluate school quality without student performance data by looking for signs of good character and love of learning.  Maybe not. A local committee would be easy to snow.

The vetoed bill, SB 547, had broad support, notes John Fensterwald of Educated Guess. The proposed Education Quality Index could have included “dropout rates, the need for remediation in college, success with career technical education programs, and graduation rates.” Standardized test scores would have counted for no more than 40 percent of the score in high school. While critics “questioned whether the EQI would be too squishy,” Brown complained “it would have demanded more of the same, hard data.”