From Educon to Data – My Reflections

For those who listen to me, I often say that my trip to Philadelphia last summer for PLP Bootcamp was the transformative experience of my professional career.  It was after that experience that I began to write this blog, became a much more prolific contributor to Twitter, and began to build my personal learning network and fully leverage its power.  My experience at Educon this past weekend, coupled with my return to work just in time to analyze state assessment data (sigh), only solidified PLP as the professional life-changing event that it was and reinforced my continued journey as a learning leader (including the need to keep learning!).

First, Educon

So many have written about their experience at Educon that I am not quite sure I offer a truly unique perspective on the event, but I can say that it is like no other “conference”.  The conversations that didn’t stop until the wee hours of the morning at Con Murphy’s, coupled with the flutter of preweekend tweets, attest to the palpable buzz that defined this gathering.  What made this conference so different from others, other than the fact that SLA students ran it, was the access that all participants had to the conversations that were taking place.  This wasn’t the “sage on the stage” approach to workshops.  Sure, there were “big names” there, but when Will Richardson, Dean Shareski, Gary Stager, Chris Lehmann, Sheryl Nussbaum-Beach, and David Warlick (to name a few) weren’t talking, they were listening.  And, when the Tony Baldasaros of the world weren’t listening, they were talking.  It was a three-day conversation between and among 500 people passionate about teaching and learning.

Here’s the other thing… It wasn’t about technology.  Sure, technology was everywhere, and the work of the SLA technology team and students (who ran the help desk and the live streaming) needs to be commended, but this conference wasn’t about technology, in part because it was assumed that technology was part of learning; it was embedded, not an add-on.  There were laptops, netbooks, iPhones, iPods, Droids, etc., but they were so ubiquitous that no one noticed.  I couldn’t help but wonder what our schools would look like if classrooms “looked” the same.

Data

I have been the data “guru” in our district for about a half dozen years now, and I am really beginning to question my role in propagating its importance.  Data is very important in our district (as I am sure it is in many others), but I fear that it is used as a weapon to attack rather than a prescription to heal.  Unfortunately, since I have been called upon to testify so often about its meaning, I feel as though I have legitimized its ability to cause damage.

In part due to my experience at Educon and in part due to my continued struggle with the meaning of assessment data and our use of it, I wonder what our recent round of test scores really means.  I say that as an Assistant Superintendent of a school district that did very well on the state tests – 93% of our 8th grade students scored proficient or better on the state reading assessment, with one out of every two 8th graders scoring in the highest category.  But, in light of my experience with PLP and Educon, I have to say that I don’t know what that “success” truly means.  Does it mean that 93% of our kids have truly learned, and if so, learned what?  How does their performance on the state test relate in any way to the act of learning?  I am not naive enough to think that accountability in the age of NCLB will disappear, but I want a model that can somehow account for learning – learning like what I experienced at Educon this past weekend.  But, as hard as I try, I can’t adequately “quantify” my learning at Educon, most likely because it wasn’t linear.  Yet my time there was too valuable to pretend that it didn’t impact me as a learning leader.

This all leads me to the following question:

If accountability is here to stay (for the short term anyway),
and If accountability drives our schools, as we all know it does,
and If we accept the fact that traditional pedagogy doesn’t meet the needs of a 21st century classroom,

how do we account for the learning that I experienced at Educon in a way that satisfies accountability standards, in an effort to make learning in school more like that of Educon?


15 Responses to From Educon to Data – My Reflections

  1. Matt says:

    I love this quote – “it was assumed that technology was part of learning; it was embedded, not an add-on”.

    You ask a great question – how DO we measure such learning? How do we rate the value of a conversation, or of the reflection afterwards? Do you measure the practical application afterwards, and if so, how? These are important questions, and if we can determine their answers we can apply them to our students. Perhaps #edchat can help with that.

    • tbaldasaro says:

      Matt,

      Great point about #edchat. The problem in my mind is that we are measuring 20th century expectations, not 21st century outcomes. So, if accountability is here as we know it, how do we measure progress on future outcomes?

  2. Ed Alen says:

    Excellent post. PLP and EduCon have been transformative experiences for me as well. And it is a challenge in this era of accountability. And you raise a great point about how we are measuring 20th century skills.

    How does SLA do it? They are a public school with state assessments. It would be worth asking Chris. Perhaps he will comment.

    It is challenging to effect transformation of our schools. But it will be worth it. And I like the #edchat idea. I’ll watch for it.

  3. Wonderful, amazing questions about accountability and measuring learning. (And thanks for the kind words.)

    And thank you for contributing to my learning last weekend.

    • tbaldasaro says:

      Chris, I’m wondering if you could take a few minutes to address Ed’s question above. How do you create the culture you have at SLA all the while being accountable via state testing?

  4. Jon Birdsong says:

    Great write up Tony and more importantly, great questions.

    “There were laptops, netbooks, iPhones, iPods, Droids, etc., but they were so ubiquitous that no one noticed. I couldn’t help but wonder what our schools would look like if classrooms “looked” the same.”

    The needle is moving and this is something I like to see.

    Best,
    Jon

  5. Akee123 says:

    Ultimately, school leaders need to define their own local success. “How do we know if we are doing a good job?” It will never be just one assessment. But a combo of assessments, student/parent feedback, and qualitative data will help answer the question.

    -Rob

  6. Thanks for sharing your wonderful reflections!

    I’ve experienced a similar evolution of my thinking over the past few years. As a math major, numbers have never been a source of intimidation for me. Learning the ins and outs of our Provincial standardized assessment practices was relatively easy, and I used to be of the mind that our results gave us pretty clear direction with respect to where we needed to focus our instruction for future students. I had the same questions many have re: the validity of using data from a previous group of students to inform instruction for a future group of students, but on the whole I did not worry too much about that. I played the game as it should be played and did the best I could to support my teachers and their students.

    Over the last few years, however, I’ve come to appreciate the qualitative aspect of assessing the learning that occurs in our school. I get the difference between learning and tests now! I do not have an argument with accountability measures. Given the amount of funding spent on education, and being a taxpayer myself, I appreciate knowing what return I’m getting on that investment. The problem, as you so clearly identified, is: what is learning? This is where we need to start.

    Once we have a better sense of what learning is, I think this is an area where we in the schools can help drive change. If we start organizing, collecting, and presenting the data that represents student learning at the school level, as a supplement to standardized assessment results, we will show the rest of the learning picture.

    If we wait for others (i.e. our governments and think tanks that analyze standardized test achievement) to start asking for the rest of the information, we’ll be waiting a long time. Teacher/school/district leadership in this area is important to creating a desire in the public to know more about the learning that takes place in our schools!

    Is there a role for student self assessment of learning in this process? What about qualitative parent assessment of overall student learning?

    Sorry for the length of the comment! 🙂 You started me thinking again this morning!

    Cheers

    @acmcdonaldgp

    • tbaldasaro says:

      Thanks Sandy,

      One of the “positives” of more accountability is that it has brought to light the relative performance of our “atypical” learners. In the past, we hid their performance in the aggregate of our whole schools. We can’t do that today, and that is a good thing. So, I’m not necessarily an opponent of accountability. I’m more interested, however, in focusing on learning, not annual test scores. I just don’t know what those scores mean anymore.

  7. Josie says:

    “…I fear that it (data) is used as a weapon to attack rather than a prescription to heal”
    That is such a keen and key observation.
    I too have seen data used this way – as a means to attack, destroy, and distort rather than as something that can be harnessed to help move things forward in positive directions. Data may be said to be “neutral”, but it is not. Who gathers it, for what purposes, and to the exclusion of what else – all of that truly matters.
    Take a look at: Locking the Gate: Data is Dead
    http://wp.me/pKCQM-wP

    • tbaldasaro says:

      Josie,

      I’ll push back a little bit… Data (numbers) themselves are neutral. Our interpretation of what those numbers mean is not always neutral. That’s the problem.

      I would also suggest that the reason test data is gathered may not always coincide with the way that it is used. Often, data gathered for prescriptive reasons is used to judge. Doug Reeves put it best when he said, “Data is to be used as a physical, not an autopsy.”

  8. Accountability is a popular “buzz word” in education right now, but what are we actually accountable for in education today? Are we accountable to make sure that students know the discrete set of skills in a series of state-created standards? OR, are we accountable to ensure that all students maintain a frame of mind that is open to new ideas and brimming with non-linear thought? Personally, I find the second goal more palatable. What data supports open minds and non-linear thought? A long-term portfolio of content CREATED by students. I appreciate your post, and it has stimulated my thinking in this area!

    • tbaldasaro says:

      I love the portfolio idea. I have long been an advocate for students developing portfolios that include a heavy emphasis on growth and reflection. Those have the potential to be so much richer than a test score. Thanks for reminding me of that.
