Friday, September 28, 2012

I'm Baaaaacccckkkk

Well, we survived summer and the start of school.  With the new school year comes many changes here in Greenwich.  We have a new Superintendent and a new Math Coordinator.  I have had the pleasure of meeting both of them, and I am very encouraged by their outlook and openness.   They have started the preliminary steps toward the math curriculum review, which will begin in earnest with the publication of the Math Monitoring Report at the beginning of November.  The new format for monitoring reports established by the Board of Education last school year should provide an excellent foundation and starting point for the review. 

Will Everyday Math figure in the equation for the review?  EDM is very different in structure from the Common Core Standards (e.g., spiraling versus teaching to mastery).  For EDM to claim alignment to the Common Core will require a major shift in their philosophy, or a great sales job.  But hey, didn’t that get us to where we are now?  Let’s hope we don’t repeat history (I thought we were talking about math, Brian).  Beware of salesmen bearing books!

Even if we dump EDM next year, we still have to worry about the “lost generation” of students who learned (and I use that word loosely) their elementary school math using that program.  Many teachers recognized the failings of EDM and heavily supplemented the program to try to teach the basics.  But not all teachers had the background or experience to supplement.  By the time we get rid of EDM, there will be five or six classes which received the majority of their math education in elementary school under Everyday Math.  What are we going to do to ensure that these students know the basics?  I’ve heard two stories of honors students (advanced math students) who got to high school, then had to ask their parents how to do long division.  Ask a middle school math teacher (off the record, if they will comment) what their take is on how well prepared the students are. 

Speaking of the Common Core, there has been much debate about the Common Core State Standards, with several states questioning their involvement.  The Standards were judged (at least in one review) to be better than most of the individual state standards, so only a few states (not Connecticut) would have an argument with the Standards based on quality.  The main reasons for a second look are (1) concerns about giving up local control for a “national” standard, and (2) cost.  Most states jumped to the CCSS to provide a path away from the requirements of No Child Left Behind.  This was something of a Catch-22: avoid NCLB by signing up for another centralized set of requirements.  Either way, the national government is increasing its involvement in what is, constitutionally, a state and local purview. 

Apparently most states also jumped before they understood the cost to implement the Common Core.  The publishers and the test makers are so very happy.

Next up: a slightly different take on the Achievement Gap.


Tuesday, June 26, 2012

Mom Told Me Not to Say “I Told You So”

The link below takes you to a press release about a research article on the relationship between “Math Success” and knowledge of fractions and long division.  As you might guess, there is a strong relationship.  I am trying to find a copy of the full article so that I can provide you with better information. 

Remember previous postings highlighting ability with fractions and decimal division (long division).  Remember the common complaints (there are so many, of course they are common) about Everyday Mathematics' lack of focus on these areas.
Take a look at the YouTube presentation by one of the authors:

Tuesday, June 12, 2012

And Now for Something Completely Different

I don’t necessarily want to get off the math track (sorry, tracking is a bad word) here, but an article in the NY Times on Monday makes me wonder what we are missing on the other side of the three R’s (writing and reading). 

The article talks about the changes made to the scoring of the writing portion of the Florida Comprehensive Achievement Test when the scores plummeted from last year.  Only 27% of fourth graders were proficient, down from 81% the previous year.  According to the article “The numbers fell so drastically because, as announced last summer, state officials toughened the standards, paying more attention to grammar and spelling as well as to the factual accuracy of supporting details.”

I may be old fashioned, but isn’t “paying attention” another phrase for actually grading the test properly?  How can you judge proficiency when you are discounting grammar and spelling, as appears to have been done under the old standards? 

The rest of the article is interesting, especially the solution to the reduced number of students reaching proficiency.

Monday, June 4, 2012

Push, Push, Push

I spoke at the Board of Education meeting on 24 May to urge the administration to start the Math Curriculum review before November, which is the date for the delivery of the next Math Monitoring Report.  The curriculum review policy is being discussed now in committee, and as written, would rely on the Monitoring Report to kick off the process.

First, congratulations to the high school and middle school Math teams for their first place finishes in Connecticut competitions.

Thank you for your recent actions regarding the acceleration of a math curriculum review.  This is a great first step. 

Recognizing the impact of the proposed revisions to Policy E001, I would like to make a case for the initiation of this review prior to the delivery of the next Math Monitoring Report scheduled for November 2012.  The reasoning for this is simple timing.  If that report triggers the start of the 12-18 month cycle for the review, the earliest implementation would be September 2014, which is the start of the school year with the new standardized testing. 
Initiation now, or more realistically as soon as a new Math Coordinator is appointed, still provides the opportunity to conduct a thorough review and to have time for professional development prior to the start of the 2013-14 school year.  The budget cycle would allow for this timing.   I recognize the unsettled nature of some of the programs being offered for sale as Common Core-compliant.  However, investigating the experiences of other districts which have already made moves to new curricula should provide the required information to select an appropriate, Common Core-aligned program.
The data contained in the January 2012 Math Monitoring Report can be augmented with the additional required elements as they become available.  Waiting until November would delay implementation a full year.  It would extend the use of Everyday Math one more year, and would also cause us to spend resources developing and implementing a transition plan with Everyday Math as its basis.
Please consider an immediate start to the review.
Thank you.

Tuesday, May 22, 2012

Great Question

During the follow up discussion regarding the Proposed Technology Plan 2012-2015, Board of Education member Jennifer Dayton posed a very searching and profound question: What can technology do to bring about an extraordinary rise in student achievement?
The response was minimal, but this question should form the basis for any investment in technology going forward. 
Are we investing for the sake of making some investment?  Or to have the "best stuff" in our schools?  The bottom line is results.  Are we raising student achievement significantly with our spending?
Great question.


At the recent Board of Education meeting, there was a discussion regarding homework policy, as part of the larger discussion around the monitoring report for Effective Learning Environment (E040).  The topic had been raised before, seemingly in the context of ensuring the consistency of the application of the homework policy across the district.  See specifically pages 6-7 and pages 12-13.

Pages 12-13 deal with the “Prior Years” management issue around homework, specifically, "the District will be examining research on homework and its relationship to student achievement."  The monitoring report references a book (Visible Learning) which brings together 800 studies around student achievement.  I reproduce the summary contained in the monitoring report below.  I have not read the book (it is not in the Greenwich Library collection).

The recommendation is that the district form a committee of teachers, parents and administrators to explore the homework issue.  From my reading, the research indicates support for less/no homework at the elementary school level, based on the lack of a significant impact on achievement. 

One comment from a Board member was that parents use homework as a means to understand what is going on.  This may indicate a lack of communication, or a lack of informative report cards (there's a surprise), or both.

Personal note: looking back, I don't recall having any significant or regular homework until seventh grade.  I still remember the excitement when we got our first homework assignments.  Boy, did that end quickly. 

This raises the question, in relation to learning such things as basic math facts: If the current direction is that math facts are to be practiced at home, what would happen if there were no more homework?  Would that drilling return to the schools/teachers?  Would it fall by the side of the road completely?  Perhaps ending the ridiculous Everyday Math homework and letting parents drill their kids in the math facts in the time saved would be an effective answer (and one which appears to be supported by the research - see the seventh bullet point regarding "Effects are highest....").  But what about the parents who are working, or don't get the importance?  Will the achievement gap widen?  Tough questions.

Highlights of Research on Homework

Educational researcher John Hattie spent fifteen years reviewing thousands of studies involving millions of students and teachers on the impact of different influences on student achievement. His findings are summarized in the book Visible Learning: A Synthesis of over 800 Meta-Analyses Relating to Achievement. Effect sizes standardize changes in student achievement, allowing us to compare the impact of different factors (an effect size of 1.0 equals one standard deviation). The typical effect size across all the studies Hattie reviewed is d = .40, and the author proposes that this is the level where a strategy or practice begins to noticeably impact student achievement. In Hattie’s words, "The effect size of 0.40 sets a level where the effects of innovation enhance achievement in such a way that we can notice real-world differences and this should be the benchmark of such real-world change." (p. 17) The list below presents highlights from Hattie’s meta-analyses on homework.

• The correlation between time spent on homework and achievement is near zero for elementary students.

• Greater effects for older students vs. younger students.

• Effects of homework are twice as large for high school students as for middle school students. Effects are twice as large for middle school students as for elementary school students.

• Greater effects for high ability students vs. low ability students.

• Higher effects when material was not complex or if it was novel.

• Homework involving higher level conceptual thinking and project based was the least effective.

• Effects are highest when homework involves rote learning, practice, or rehearsal of subject matter.

• Research favored short, frequent homework that was closely monitored by teachers.

• Homework does not help students develop time management skills.

• Direct parental instructional involvement in homework showed a negative relationship with achievement, while parental support for independent homework showed a positive relationship.

• Hattie's meta-analyses describe a "Zone of Desired Effects" which ranges from d = 0.40 to 1.2.  The effect for homework is d = 0.29.

• Negative impacts of homework:
  - Can undermine motivation
  - Can cause students to internalize incorrect routines and strategies
  - Can reinforce less effective study habits
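For readers curious about the arithmetic behind these numbers: the effect size Hattie uses is essentially Cohen's d, the difference between two group means divided by their pooled standard deviation. Here is a minimal Python sketch; the scores and group labels are invented for illustration, not taken from any actual study.

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d: difference in means divided by the pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    mean_a = sum(group_a) / n_a
    mean_b = sum(group_b) / n_b
    # Sample variances (n - 1 in the denominator)
    var_a = sum((x - mean_a) ** 2 for x in group_a) / (n_a - 1)
    var_b = sum((x - mean_b) ** 2 for x in group_b) / (n_b - 1)
    pooled_sd = math.sqrt(((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2))
    return (mean_a - mean_b) / pooled_sd

# Hypothetical test scores for a "with homework" and a "without homework" group
with_homework = [72, 75, 78, 80, 83, 85]
without_homework = [70, 73, 76, 78, 80, 82]
d = cohens_d(with_homework, without_homework)
```

On Hattie's benchmark, a d above 0.40 would count as a noticeable real-world difference; the reported homework effect of 0.29 falls short of that threshold.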

Hattie, J.A.C. (2009).  Visible Learning: A Synthesis of Over 800 Meta-Analyses Relating to Achievement. New York: Routledge.