Organize Your Life With Google Keep


Do you need a tool to help organize your busy life, or are you looking for a simple way to take notes on your Android, iPhone, or laptop? Google Keep should be the first thing you look at. Why? Well, for starters, it’s free, and most people I talk to seem to be looking for simplicity anyway. So why not start with a freebie and see if it meets your needs?

Google Keep is a great tool that I use both personally and professionally, and one I would also recommend for students. You can color-code notes, categorize them, easily make checklists, and share notes with collaborators, and everything syncs across your devices via your Google account, so you always have access to your notes.

If you are struggling to get organized, are ready to move beyond a hodgepodge of paper and digital notes, and are looking to become more efficient at prioritizing your tasks, I highly recommend taking a serious look at Google Keep.

Stay organized my friends!

Testing Considerations for the Minnesota Legislature, from Testing and Assessment Expert, Dr. Ben Silberglitt

When I took on the duties of coordinating research, evaluation, and assessment at the St. Croix River Education District (SCRED), fresh out of graduate school, one of the first tasks I was confronted with was helping our consortium of small, rural, Minnesota districts make sense of these new tests (the Minnesota Comprehensive Assessments, a.k.a. MCAs) and this brand new legislation (No Child Left Behind). As part of our work, I suggested that we lead groups of teachers through a “deep dive” into the assessment, dissecting the test by the types of questions asked, how much weight was given to each strand area, and matching that up with the scope and sequence of their curriculum.

I was working with a group of 3rd-grade teachers on the MCA math, reviewing how the test included a significant number of questions on predicting the outcomes of spinners, dice, and other “games.” Through this review, we discovered that those topics were well matched to the scope of their curriculum, but that two years prior the teachers had purposefully moved the units covering them to the end of the school year, after testing was over.

At the time, they had felt these topics were less rigorous (more “fun”) and would not prepare students as well for the MCA. Needless to say, they promptly moved these units to immediately before the test, and the following year third-grade math test scores jumped considerably.

The students’ overall learning didn’t change. The quality of their education was just as high. The socioeconomic status of the students arriving into the district hadn’t shifted. But suddenly, this school moved from near the middle of the pack to near the top of the state, in 3rd-grade math. From an accountability perspective, they were rock stars. What lessons are to be learned from this work, which was playing out in schools across the country?

As the adage goes, “what gets measured gets done,” and in this case, we have an example of a system adapting to the measurement pressures placed upon it. What’s broken about this system is not the act of measurement itself, but the extreme overfocus on summative, accountability-focused assessments that don’t provide timely information to educators.

The legislature recently commissioned a report on our state’s MCAs, and, to paraphrase, it found that we spend a lot of money on these tests, but no one finds them useful. The irony is thick here: you might have said the same thing about this commissioned report. I’ve been working with districts across the state for the past 15 years on topics related to data and testing. I frequently ask large groups at presentations, “How many of you find the MCA useful to help you plan instruction?” I haven’t seen a hand raised yet. I think the collective response by educators to this legislative report has been, “We already knew that!”

My hope is that our state legislature will make informed decisions and listen to what educators already know: Assessments have value, and each has a purpose, but no assessment meets all purposes. The reason the MCA isn’t valuable to educators is that its only purpose is accountability. Accountability isn’t a bad thing, but it isn’t everything.

The cognitive dissonance of policymakers contributed to bad assessment policy. The thinking went: if we spent $50 million on a test, it must be amazing. Our state education department leaders, under pressure to make these tests as valuable as policymakers believed they were, then made the fatal mistake of trying to make the MCAs a “silver bullet” – useful for every possible purpose. (It’s a formative, it’s a summative, it’s a Supertest!) Now we see the backlash, and assessment as a whole is at risk as the pendulum swings back.

What we need is policy that understands the need to balance different types of formative assessment (screening, diagnostic, and progress monitoring assessment) with the need for summative accountability and outcomes assessments. We aren’t overtesting our students; we are overtesting for just one purpose, and not providing enough support to districts for other purposes.

We need a wider range of assessments, and our resources (both money and time) should be redistributed across these purposes. This redistribution requires more flexibility for schools to implement research-based assessments that match their needs, and more support to help schools make these choices, as well as support for using these assessments effectively once they are in place. We can do all of this, but our staff continues to be hamstrung by outdated state-mandated assessments, pushed onto districts who are left to pay the extensive “hidden” costs of administration, coordination, professional development, and lost instructional time.

It’s time for a more sensible approach to testing in our state. Minnesota can continue its reputation as a trailblazer in education by doing what’s right for kids and supporting a balanced approach to assessment – one that focuses on the needs of educators and gives us the tools we need to do what we do best.


The State of Standardized Testing in Minnesota

Many educators have concerns about standardized testing, and this week the Office of the Legislative Auditor released its evaluation report on standardized student testing. The report confirms what many of us knew all along. Key findings in the report included:

  • MDE spent $19.2 million on standardized tests in FY 2016.  Federal sources contributed over one-third of the funding.
  • State-required tests strain the resources of many school districts and charter schools.  MDE does not systematically measure the local costs and impacts of state testing requirements.
  • The use of test scores at the local level varies widely, with many teachers and principals not feeling prepared to interpret the testing data.
  • Most school districts and charter schools administer other standardized tests and find their locally adopted tests more useful.

The full report can be accessed here.

I have two questions:

  1. How can we justify paying $19.2 million for standardized testing that offers little to no return on investment when we have students who struggle to access education due to lack of transportation, food, housing, and the social services they desperately need?
  2. Why are we spending $19.2 million on testing that doesn’t improve outcomes for students?