Giving opinion its due

I have been thinking about Justin P. McBrayer’s New York Times op-ed, “Why Our Children Don’t Think There Are Moral Facts.” McBrayer notes that schools implicitly tout moral relativism by having students distinguish repeatedly between “fact” and “opinion.” According to McBrayer, this creates confusion: Not all truths are proven facts, and not all opinions are “mere” opinion. Unfortunately, when given “fact vs. opinion” exercises, students learn to treat all value statements as opinion. For instance, the statement “killing for fun is wrong” would count as opinion, when, in McBrayer’s view, it should be treated as fact. Over time, after performing many such exercises, students conclude (without thinking the matter through) that there are no moral facts.

I would take McBrayer’s argument one step further (or maybe in a direction he didn’t intend). Opinion itself was not always viewed as a one-off statement of belief or prejudice. It involved reasoning, choice, and judgment about things that were not fully known or proven. The word derives from the Proto-Indo-European *op- (“to choose”) and later from the Latin opinari (“think, judge, suppose, opine”). The OED gives, as its first definition of “opinion,” “What or how one thinks about something; judgement or belief. Esp. in in my opinion: according to my thinking; as it seems to me. a matter of opinion: a matter about which each may have his or her own opinion; a disputable point.” John Milton elevates the concept of opinion in his speech Areopagitica: “Opinion in good men is but knowledge in the making”; a similar idea appears in Thomas Usk’s The Testament of Love: “Opinyon is whyle a thyng is in non certayne, and hydde from mens very knowlegyng.” (Both quotes are included in the OED entry.)

Yet for all its former respectability, opinion has always run the risk of falling back on prejudice and superstition. This is particularly true of group opinion. John Stuart Mill argues for freedom of individual expression precisely because the alternative—unconsidered public opinion—holds so many dangers and so much power:

Men’s opinions, accordingly, on what is laudable or blamable, are affected by all the multifarious causes which influence their wishes in regard to the conduct of others, and which are as numerous as those which determine their wishes on any other subject. Sometimes their reason—at other times their prejudices or superstitions: often their social affections, not seldom their anti-social ones, their envy or jealousy, their arrogance or contemptuousness: but most commonly, their desires or fears for themselves—their legitimate or illegitimate self-interest. Wherever there is an ascendant class, a large portion of the morality of the country emanates from its class interests, and its feelings of class superiority.

Ironically, to protect individual opinion, one must also release it, to some degree, from responsibility. Mill does not say this outright, but it seems to follow from his argument:

This, then, is the appropriate region of human liberty. It comprises, first, the inward domain of consciousness; demanding liberty of conscience, in the most comprehensive sense; liberty of thought and feeling; absolute freedom of opinion and sentiment on all subjects, practical or speculative, scientific, moral, or theological.

If I can say whatever I want about any subject, if I am not bound to standards of research and reasoning, then my opinion is unfettered but also potentially trivial. On the other hand, if only the qualified elite may speak, then, as Mill notes, certain prevailing opinions go unquestioned while bright and necessary challenges are suppressed.

So, as opinion becomes liberated, it also degrades—to where it becomes near-synonymous with “something that can’t be taken seriously.” As McBrayer points out, the Common Core includes the standard “Distinguish among fact, opinion, and reasoned judgment in a text.” How did “opinion” become separate from “reasoned judgment”? The standard seems to imply that opinion does not involve judgment or reasoning; that is both peculiar and telling. One can have the best of both worlds: freedom of opinion combined with recognition that opinion can be well or poorly formed.

From what I have seen of the Common Core in word and practice, it treats opinion and argument as separate. Something that can’t be supported with “evidence” is regarded as mere opinion; something that can has a more elevated status. But facts are not always definitive and must be selected out of many; moreover, there are good arguments that don’t have “evidence” behind them. As a result, there is little room (and no good word) for inquiring into matters of uncertainty—matters that cannot be proven one way or another but that require more than a snap judgment.

To return to McBrayer’s example, killing for fun is wrong—few would dispute that—but why? Why did he not say “killing of any kind, for any reason, is wrong”? Perhaps he was leaving room for the possibility that killing may sometimes be necessary and thus not altogether wrong. In that case, how is “killing for fun” different? Let’s assume he is referring to the killing of humans; if it is true that human life has dignity (which, for the sake of brevity, I won’t define here), then human life should not be taken lightly. Kill if you must (though some would argue that there is never such necessity), but don’t kill gratuitously, whatever you do. Thus, “killing for fun is wrong” follows—or at least can follow—from the axiom that human life has dignity. I have not given any “evidence” that killing for fun is wrong, but I have identified a possible axiom behind the statement.

Opinion does not have to be trivial; it runs the gamut between folly and wisdom. Instead of dismissing opinion, schools should teach students to form theirs as well as they can.

Note: I made a few edits to this piece after posting it.

(I am delighted to be guest-blogging along with Rachel, Michael, and Darren. I probably won’t post anything else this week but will be back on April 4.)


Physics, ethics, zombies

Fighting zombies — and learning ethics?

Video games are used to teach everything from ethics to physics at a Norwegian high school, reports Tina Barseghian on Mind/Shift.

In a religious studies class, students watch a scene from The Walking Dead.

Supplies are running low and only four food items are left to ration, but there are 10 hungry mouths to feed. Who should eat? The grumpy old guy? The injured teen? The children? The leader?

Once the class reaches a consensus, they have to justify their choice with one of the concepts they’ve learned from moral philosophy. Was their decision guided by situational ethics, utilitarianism or consequentialism?

Games should be more than “chocolate-covered broccoli,” says teacher Tobias Staaby. He also uses Elder Scrolls V: Skyrim, a sword-and-sorcery action role-playing game, to teach about Norwegian romantic nationalism.

Physics students play Portal 2, which requires solving puzzles to escape a labyrinthine lab complex. Players “manipulate cubes, redirect lasers and tractor beams, time jumps, and teleport through walls . . . ”

“Should we have a large mass and height? Drop 50 kilograms from 50 meters? Oh, the air resistance kicks in – let’s shorten the height,” said (teacher Jørgen) Kristofferson, illustrating how his students toyed with the power of gravity.

“Real world experiments are important and the game can’t replace them,” he said, “but the game gives students a different perspective on the laws of physics, where mechanics are simulated by a computer to create a realistic gaming environment. It can also be a great source of discussion when the laws of physics are broken!” Students think about how the simulation deviates from reality and transform what might be perceived as a game’s shortcoming into a critical thinking opportunity.

An avid gamer, teacher Aleksander Husoy pioneered the idea by using Civilization IV to teach a cross-curricular unit in Norwegian, English and social studies.

Does Facebook need ethics education?

There has been outrage over Facebook’s psychological experiment on 700,000 unwitting users. In order to test its ability to manipulate users’ posts, Facebook used an algorithm that altered the emotional content of their news feeds. (In half of the cases, it omitted content associated with negative emotions; in the other half, positive emotions.)

According to an abstract, “for people who had positive content reduced in their News Feed, a larger percentage of words in people’s status updates were negative and a smaller percentage were positive. When negativity was reduced, the opposite pattern occurred.” The findings were published in the March 2014 issue of the Proceedings of the National Academy of Sciences (and reported in numerous places, including the Wall Street Journal article that informed this post).

Now, these findings aren’t surprising–who wants to be all cheery when your “friends” are down in the dumps?–but they left many people angry. An experiment of this kind isn’t just a misuse of data; it deliberately provokes people to post things they might not otherwise have posted, in a “space” (i.e., the news feed) that many consider their own, since it includes only what they want to include. (Yes, they’re mistaken in considering it their own, but Facebook does a lot to feed that illusion.)

Did Facebook have the right to conduct this experiment in the first place? Kate Crawford, visiting professor at MIT’s Center for Civic Media and principal researcher at Microsoft Research, says no. Moreover, she holds that ethics should be part of the education of data scientists. (For a more detailed exposition of this view, see danah boyd and Kate Crawford, “Critical Questions for Big Data,” Information, Communication & Society, 15:5, 662-679.)

What would “ethics education” look like in this context? Would it focus on the issues at hand, or would it examine ethics more broadly, with readings and analysis of ethical problems? Would it take the form of a professional development course, or would it start in high school or earlier?

It is possible that the Facebook controversy (and others like it) will lead to a greater emphasis on ethics in education. That could be promising if handled well. One pitfall of ethics education is that it may be reduced to specific issues and even mistaught. That is, those studying the “Ethics of Big Data” may never consider ethics outside of Big Data, or ancient ethical problems that relate to their own, or even the distinction between ethics and morality (which has been articulated in different ways but is worth considering in any case).

So ethics education, if taken up by “big data” and other nebulous entities, will need to go beyond a crash course or PD. Study ethics, but study it well. How do you do that? Read seminal texts, raise questions boldly, stay aware of your errors and fallacies, and put your principles and reasoning into practice. That’s just a start.

How to praise a child

Instead of praising kids for good grades or athletic achievements, parents and teachers should praise children for acting ethically, says Rabbi Joseph Telushkin.

Learning by teaching

Student work can illuminate teaching, writes Diana Senechal, who presents three students’ philosophy papers on Gotham Schools. She teaches at Columbia Secondary School for Math, Science & Engineering, a selective public school in New York City partnered with Columbia University. In the school’s Philosophy for Thinking program, “ninth-graders study rhetoric and logic; the 10th-graders, ethics and aesthetics; and the 11th-graders, political philosophy.”

She asked students to write about an ethical dilemma in their own lives or in a work of literature. A 10th-grade boy began:

While I was about to start this assignment, I spent about twenty minutes stressing over the fact that I couldn’t think of anything that made me question ethics. I complained to my mother that I couldn’t think of anything to say. I then asked her whether I should ask Professor Senechal whether I could make it up. Mom raised her eyebrow. “Is that ethical?” she asked.

He turned his dilemma about the assignment into the topic of the assignment, Senechal writes. He went on to analyze philosophical positions on lying, such as “Kant’s argument that any lying results in loss of dignity; utilitarian arguments that lying may be acceptable if it is used to a good end” and more.

He concludes that he is somewhere between Kant and utilitarians. Implicit in the discussion is his decision, for this particular occasion, not to lie.

“Real-life applications of philosophy need not be shallow, if the philosophical thought is strong,” Senechal decided.

College cheaters become adult cheaters

Students who cheat and lie in college are likely to behave dishonestly in the workforce, according to a University of Minnesota study, which relied on self-reporting.

Among the types of cheating examined were increasing the margins or typeface to make a paper seem longer, telling an instructor a false reason for missing a class or exam, obtaining questions to an exam from an unauthorized person before a test, writing a paper for someone else and preparing cheat sheets.

Those types of unethical actions in college were found to carry over into the workplace in the forms of taking long lunches, telling an employer a fake reason for missing work, writing a report for a co-worker, filling out a false expense report and presenting the ideas of co-workers as their own.

Dishonesty “tends to carry over” from college to adult life, Nathan Kuncel, the study’s co-author, told BusinessNewsDaily.

Making classroom rules

Who Makes the Rules in a Classroom? asks Nancy Flanagan on Teacher in a Strange Land. According to the latest dogma, good teachers get students to collectively write their own classroom rules.

It seems democratic and encourages “buy-in,” teachers believe, even if students are just as likely to break their own rules as ones set by teachers.

When Flanagan tried it in her own music classroom, students came up with a list of “don’ts” — as in don’t empty your spit valve on someone else’s chair — but “it never felt as if we were wrestling with the really important issues: Building a functioning community. Safety. Personal dignity. Kindness. Order. Academic integrity. Democracy.”

She offers ideas about creating classroom rules, such as:

• You’re shooting for influence, not control. Fact is, teachers never have absolute control over kids, even using techniques like fear, punishment, isolation and intimidation. (In edu-speak, “consequences.”) You want kids to behave appropriately because they understand that there are rewards for everyone in a civil classroom.

• No matter what rules you put on paper, your most important job is role-modeling those practices, not enforcing them. Behave the way you want kids to behave: Ignore minor, brainless bids for attention. Make eye contact with speakers. Don’t be an attention hog–your stories aren’t more important than theirs. Don’t be rude to kids. Apologize publicly when you’re wrong. Remember that you’re the adult in the room. It’s your calm presence that institutes order, not rules.

Don’t restate the obvious or load up on “don’ts,” she advises. But do give clear instructions when needed. “Stress: order facilitates learning, makes the class a pleasant place to be.”

• Integrity helps build community. The most important directives in democratic classrooms are around ethical practices: A clear definition of cheating, understood by all students, in the digital age. Why trust and personal best are more important than winning. Why substandard work isn’t ever OK. How true leadership–kids want to be leaders, too–is a function of respect.

“Carrots and sticks” can be counter-productive, Flanagan writes. Students’ good behavior is its own reward: They get to attend a “civil, well-managed” school.

Cheater prospers

I Used to Think … and Now I Think, reflections by education reformers, includes an essay by recently departed Atlanta Superintendent Beverly Hall, writes John Merrow.

In eight largely self-serving pages, Dr. Hall celebrates her accomplishments. She tells us that it took her three years to bring the school system under her direct control and “to institutionalize strong ethics requirements limiting the school board’s direct involvement with the day-to-day operations of the system.” . . .  Since the Georgia Bureau of Investigation report traces the cheating right to the superintendent’s desk, the sentence resonates with irony.

Hall received nearly $600,000 in bonuses during her time in Atlanta, Merrow notes. “How much of that was for raising test scores (fraudulently) is unclear, but the Board wants to ‘claw back’ those dollars.”


Once a cheater, always a cheater

High school cheaters “are far more likely than non-cheaters to lie to their spouses, bosses, and employees when they grow up,” writes Debbie Viadero of Inside School Research.  In a Josephson Institute study, 64 percent of high school students said they’d cheated on an exam in 2008, 42 percent said they’d lied to save money and 30 percent admitted stealing from a store. The study also talked to older people.

The study also found that, regardless of how old they are now, people who cheated in high school were three times more likely to lie to a customer (20% vs. 6%) or inflate an insurance claim (6% vs. 2%) and more than twice as likely to inflate an expense claim (10% vs. 4%) than people who never cheated in high school. The high school cheaters were also twice as likely to lie to or deceive their boss (20% vs. 10%) or lie about their address to get a child into a better school (29% vs. 15%) and one-and-a-half times more likely to lie to spouse or significant other (35% vs. 22%) or cheat on taxes (18% vs. 13%).

Viadero thinks character education would help. I’m dubious.