In Their Own Words: Goal Setting is Often an Act of Desperation: Part 6 (2024)

Jun 17, 2024

In the final episode of the goal setting in classrooms series, John Dues and Andrew Stotz discuss the last three of the 10 Key Lessons for implementing Deming in schools. They finish up with the example of Jessica's 4th-grade science class.

TRANSCRIPT

0:00:02.4 Andrew Stotz: My name is Andrew Stotz, and I'll be your host as we continue our journey into the teachings of Dr. W. Edwards Deming. Today I'm continuing my discussion with John Dues, who is part of the new generation of educators striving to apply Dr. Deming's principles to unleash student joy in learning. This is episode six about goal setting through a Deming lens. John, take it away.

0:00:26.4 John Dues: Hey, Andrew, it's good to be back. Yeah, for the past handful of episodes or so, we've been talking about organizational goal setting. We covered these four conditions of healthy goal setting and then got into these 10 key lessons for data analysis. And then we've been looking at those 10 key lessons applied to an improvement project. And we've been talking about a project that was completed by Jessica Cutler, and she did a Continual Improvement Fellowship with us here at our schools. And if you remember, Jessica was attempting to improve the joy in learning of her students in her fourth grade science class. So last time we looked at lessons five through seven. Today we're gonna look at those final three lessons, eight, nine and ten, applied to her project.

0:01:15.7 AS: It's exciting.

0:01:17.1 JD: Yeah. So we'll jump in here. We'll do a description, a refresher of each lesson, and we'll talk about how it was applied to her specific project, and we'll look at some of her data to bring that alive for the folks that have video. Let's jump in with lesson number eight. So we've talked about this before, but lesson number eight was: more timely data is better for improvement purposes. So we've talked about this a lot. We've talked about something like state testing data. We've said it can be useful, but it's not super useful for improvement purposes, because we don't get it until the year ends. And students, in our case, have already gone on summer vacation by the time that data comes in. And, you know, analogous data probably exists in lots of different sectors, where you get data that lags to the point that it's not really that useful for improvement purposes.

0:02:15.8 JD: So when we're trying to improve something, more frequent data is helpful, because then we can see if an intervention that we're trying is having an effect, the intended effect. We can learn that more quickly if we have more frequent data. And there's not a hard and fast rule, I don't think, for how frequently you should be gathering data. It just needs to be in sync with the improvement context. I think that's the important thing. Whether it's daily or a couple times a day or weekly, or monthly, quarterly, whatever, it's gotta be in sync with whatever you're trying to improve.

0:02:50.5 AS: You made me think about a documentary I saw about how they do brain surgery and how the patient can't be sedated, because they're asking the patient questions about, do you feel this, and they're testing whether they're getting... They're trying to, let's say, get rid of a piece of a cancerous growth, and they wanna make sure that they're not getting into an area that's gonna damage their brain. And so, the feedback mechanism that they're getting through their tools and the feedback from the patient, it's horrifying to think of the whole thing.

0:03:27.7 JD: Yeah.

0:03:28.3 AS: It's a perfect example of why more timely data is useful for improvement purposes, 'cause imagine if you didn't have that information: you knock the patient out, you get the cancerous growth, but who knows what you get in addition to that.

0:03:43.7 JD: Yeah, that's really interesting. I think that's certainly an extreme example, [laughter], but I think it's relevant. No matter what our context, that data allows us to understand what's going on: variation, trends, whether our system is stable or unstable, how we should go about improving. So it's not dissimilar from the doctors in that example.

0:04:06.8 AS: And it's indisputable, I think, I would argue. But many people may not have it; they may be operating with data that's not timely. And so this is a reminder that we would pretty much always want that timely data. So that's lesson eight. Wow.

0:04:22.6 JD: Lesson eight. Yeah. And let's see how we can... I'll put a visualization on the screen so you can see what Jessica's data look like. All right. So now you can see. We've looked at these charts before. This is Jessica's process behavior chart for joy in science. So just to reorient, you have the joy percentage that students are feeling after a lesson on the x-axis, sorry, on the y-axis. On the x-axis, you have the school dates where they've collected this survey information from students in Jessica's class.

0:04:57.0 AS: Can you put that in Slide Show view?

0:05:00.4 JD: Yeah. I can do that. Yeah.

0:05:02.7 AS: Just it'll make it bigger, so for the...

0:05:06.5 JD: There you go.

0:05:07.8 AS: For the listeners out there, we're looking at a chart of daily, well, let's say it looks like daily data. There's probably weekends that are not in there because class is not on weekends, but it's the ups and downs of a chart that's moving within a relatively narrow range, and these are the scores that are coming from Jessica's surveying of the students each day, I believe. Correct?

0:05:34.2 JD: Yeah. So each day where Jessica is giving a survey to assess the joy in science that students are feeling, she's averaging all those students together. And then the plot, the dot, is the average of all the students' assessments of how much joy they felt in a particular science lesson.

0:05:54.7 AS: And that's the average. So for the listeners out there, John's got an average line down the middle of these various data points, and then he's also got a red line above and a red line below, above the highest point and slightly below the lowest point. Maybe you can explain that a little bit more.

0:06:15.4 JD: Yeah. So with Jessica, you remember originally she started plotting on a line chart or a run chart when we just had a few data points, just to kind of get a sense of how things are moving so she could talk about it with her class. And over time what's happened is she's now got, at this point in the project, which she started in January, now this is sort of mid-March, and she's collected two to three data points a week. So she doesn't survey the kids every day, just for time's sake, but she's getting two, three data points a week. And so by March, she started just a couple months ago, she's got 28 data points. So that sort of goes back to this idea of more timely data is better for improvement.

0:07:00.9 JD: And a lot of times, let's say a school district or a school does actually survey their students about what they think of their classes. That might happen at best once a semester or maybe once a year. And so at the end of the year you have one or two data points. So it's really hard to tell what's actually going on. Compared to this, Jessica's got these 28 data points in just about two months or so of school. So she's got 28 data points to work with. And so what she and her students are doing with this data then, one, they can see how it's moving up and down. So we have, the blue dots are all the plotted points, like you said, the green line is the average running sort of through the middle of the data, and then those red lines are our process limits, the upper and lower natural process limits that sort of tell us the bounds of the system.
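For readers who want to see the mechanics behind those red lines, here is a minimal sketch of how natural process limits are typically computed for an individuals (XmR) process behavior chart, using the average moving range between successive points (the "difference in each successive data point" John mentions next). The values, variable names, and helper function below are illustrative, not Jessica's actual data or tooling.

```python
# Minimal sketch: centerline and natural process limits for an individuals (XmR)
# chart, computed from the average moving range between successive points.
# The joy percentages below are hypothetical, not Jessica's actual data.

def process_behavior_limits(values):
    """Return (center, lower_limit, upper_limit) for an individuals chart."""
    center = sum(values) / len(values)                    # the green average line
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)      # average successive difference
    upper = center + 2.66 * avg_mr                        # upper natural process limit (red)
    lower = center - 2.66 * avg_mr                        # lower natural process limit (red)
    return center, lower, upper

joy_pct = [74, 77, 72, 79, 75, 76, 73, 78, 74, 77]        # hypothetical daily class averages
center, lower, upper = process_behavior_limits(joy_pct)
print(f"average={center:.1f}%, limits=({lower:.1f}%, {upper:.1f}%)")
```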

0:07:50.4 JD: And that's based on the difference in each successive data point. But the most important thing is that as Jessica and her students are looking at this, initially they're really just studying it and trying to see how things are going from survey to survey. So one of the things that Deming talked about frequently is not tampering with the data, which would be overreacting to a single data point. So let's say, a couple of days in, it dips down from where it started and you say, oh my gosh, we gotta change things. That's what Deming is talking about: not tampering, not overreacting to any single data point. Instead look at this whole picture that you get from these 28 data points and then talk about...

0:08:41.5 JD: In Jessica's case, she's talking with her students about: what can we learn from this data? What does the variation from point to point look like? If we keep using the system, the fourth grade science system, if we leave it as is, then we'll probably just keep getting data pretty similar to this over time, unless something more substantial changes, either in the negative or the positive. So right now they...

0:09:10.1 AS: And I think for the listeners, you can see that there's really no strong pattern that I can see from this. Sometimes it seems like there are little trends and stuff like that, but I would say that the level of joy in the science classroom is pretty stable.

0:09:32.1 JD: Pretty stable. Yeah. Pretty high. It's bouncing around maybe a 76% average across those two and a half months or so. And so you kind of consider this the baseline. They've got a good solid baseline understanding of what joy looks like in this fourth grade science classroom. Did that stop sharing on your end?

0:10:00.2 AS: Yep.

0:10:00.2 JD: Okay, great. So that's lesson eight. So clearly she's gathered a lot of data in a pretty short amount of time. It's timely, it's useful, it's usable, it can be studied by her and her students. So we'll switch to lesson nine now. So now they've got a good amount of data. They've got 28 data points. That's plenty of data to work with. So lesson nine is: now we wanna clearly label the start date for an intervention directly in the chart. And remember from earlier episodes, not only are we collecting this data, we're actually putting this up on a screen, on a smart board in the classroom, and Jessica and her students are studying this data together. They're actually looking at this exact chart and she's explaining, sort of like we just did for the listeners, what the chart means.

0:10:54.2 JD: And so over time, like once a week, she's putting this up on the smart board, and now kids are getting used to, how do you read this data? What does this mean? What are all these dots? What do these numbers mean? What do these red lines mean? That type of thing. And so now that they've got enough data, now we can start talking about interventions. That's really what lesson nine is about. And the point here is that you want to clearly, explicitly mark, with literally a dashed line in the chart, the day that you're gonna try something new. So you insert this dashed vertical line, we'll take a look at it in a second, on the date the intervention started. And then we're also gonna probably label it something simple so we can remember what intervention we tried at that point in time.
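As a concrete illustration of lesson nine, here is a minimal sketch of marking the intervention start date on the chart with a dashed vertical line and a short label, assuming matplotlib and simple placeholder lists; the dates, values, and label text are hypothetical.

```python
# Minimal sketch: mark a PDSA start date on the joy chart with a dashed vertical
# line and a short label. Dates and values are hypothetical placeholders.
import matplotlib.pyplot as plt

dates = ["3/14", "3/15", "3/18", "3/19", "3/21", "3/22"]   # hypothetical survey dates
joy_pct = [75, 71, 73, 74, 76, 78]                         # hypothetical class averages
pdsa_start = dates.index("3/19")                           # position of the intervention date

plt.plot(dates, joy_pct, marker="o")                       # the plotted joy percentages
plt.axvline(x=pdsa_start, linestyle="--", color="gray")    # dashed line at the start date
plt.text(pdsa_start + 0.1, max(joy_pct), "PDSA 1: deduction for noises")  # simple label
plt.ylabel("Joy in science (%)")
plt.show()
```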

0:11:42.7 JD: So what this then allows the team to do is to very easily see the data that happened before the intervention and the data that happened after the implementation of this intervention or this change idea. And then once we've started this change and we start plotting points after the change has gone into effect, then we can start looking for those patterns in the data that we've talked about, those three rules that we've talked about across these episodes. And just to refresh: rule one would be if we see a single data point outside of either of the limits, rule two is if we see eight consecutive points on the same side of that green average line, and rule three is if we see three out of four dots in a row that are closer to one of the limits than they are to that central line.
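To make the three rules concrete, here is a minimal sketch of how each one could be checked in code, assuming the plotted values, the centerline, and the limits are already known. The function names are illustrative, and the thresholds simply follow the rules as John states them.

```python
# Minimal sketch of the three signal rules described above. Helper names are
# illustrative; thresholds follow the rules as stated in the episode.

def rule_one(values, lower, upper):
    """Rule 1: any single point outside either natural process limit."""
    return any(v < lower or v > upper for v in values)

def rule_two(values, center, run_length=8):
    """Rule 2: eight or more consecutive points on the same side of the centerline."""
    run, prev_side = 0, None
    for v in values:
        side = "above" if v > center else "below" if v < center else None
        run = run + 1 if (side is not None and side == prev_side) else (1 if side is not None else 0)
        prev_side = side
        if run >= run_length:
            return True
    return False

def rule_three(values, center, lower, upper):
    """Rule 3: three out of four consecutive points closer to the same limit than to the centerline."""
    def near_limit(v):
        if v > center and (upper - v) < (v - center):
            return 1                                   # closer to the upper limit
        if v < center and (v - lower) < (center - v):
            return -1                                  # closer to the lower limit
        return 0
    flags = [near_limit(v) for v in values]
    return any(flags[i:i + 4].count(s) >= 3 for i in range(len(flags) - 3) for s in (1, -1))
```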

0:12:38.3 JD: So again, those patterns tell us that something significant, mathematically improbable, has happened. It's a big enough magnitude of change that you wouldn't have expected it otherwise. And when we see that pattern, we can be reasonably assured that the intervention that we've tried has worked.

0:12:56.0 AS: And let me ask you about the intervention for just a second, because I could imagine that if this project was going on, first question is, Jessica's students obviously know that this experiment is going on?

0:13:08.3 JD: Yes.

0:13:09.8 AS: Because they're filling out a survey. And my first question is, do they know that there's an intervention happening? I would expect that it would be yes, because they're gonna feel or see that intervention. Correct?

0:13:25.1 JD: Sure. Yep.

0:13:25.2 AS: That's my first point that I want to think about. And the second point is, let's imagine now that everybody in the classroom has been seeing this chart, and everybody's excited and they've got a lot of ideas about how they could improve. Jessica probably has a lot of ideas. So the temptation is to say, let's change these three things and see what happens.

0:13:46.5 JD: Yeah.

0:13:47.1 AS: Is it important that we only do one thing at a time, one intervention at a time, or not? So maybe those are two questions I have in my mind.

0:13:58.6 JD: Yeah, so to the first question, you're saying there might be some type of participant or...

0:14:02.3 AS: Bias.

0:14:03.3 JD: Observer effect, like they want this to happen. That's certainly possible. But speaking to the second question, what intervention do you go with? Do you go with one or do you go with multiple? If you remember, a couple of episodes ago we talked about, and we actually looked at, a fishbone diagram that Jessica and her students created, and they said, okay, what causes us to have low joy in class? And then they sort of mapped those, they categorized them, and there were different things like technology not working. If you remember, one was distractions, like other teachers walking into the room during the lesson. And one of them was other classmates making a lot of noise, making noises during class and distracting me. And so they mapped out different causes. I think they probably came up with 12 or 15 different causes as possibilities.

0:14:58.7 JD: And they actually voted as a class: which of these, if we worked on one of them, would have the biggest impact? So not every kid voted for it, but the item that the most kids thought would have the biggest impact was if we could somehow stop all the noises, basically. So they came up with that as a class, but it wasn't everybody's idea. But I think we've also talked about the lessons from David Langford, where once kids see that you're gonna actually take this seriously, take their ideas seriously and start acting on them, they take the project pretty seriously too. So maybe not a perfect answer, but that's sort of what we...

0:15:38.0 AS: I was thinking that ultimately you could get short-term blips when you do an intervention and then it stabilizes, possibly. That's one possibility. And the second thing I thought is, well, ultimately the objective, whether that's an output from a factory and improving that output, or whether that's the output related to joy in the classroom as an example, you want it to go up and stay up, and you want the students to see it and say, wow, look, it's happening. So, yeah.

0:16:11.7 JD: And there's different ways you can handle this. So this joy thing could go up to a certain point, and they're like, I don't know if we can get any more joy, like, it's pretty high. And what you could do at that point is say, okay, I'm gonna assign a student to just, every once in a while... we'll keep doing these surveys and we will keep plotting the data, but we're not gonna talk about it a lot. I'm just gonna assign this as a student's job to plot the new data points. And we'll kind of measure it, but we won't keep up with the intervention, 'cause we got it to a point that we're pretty happy with. And now as a class we may wanna switch our attention to something else.

0:16:45.2 JD: So say we've started getting into the winter months and attendance has dipped. Maybe we've been charting that, and we say, hey guys, we gotta kinda work on this. This has gone below sort of a level that's really good for learning. So let's think about as a group how we could come up with some ideas to raise that. So maybe you turn your attention to something else, 'cause you can't pay attention to everything at once.

0:17:07.2 AS: Yeah, and I think I could use an example from my Valuation Master Class Boot Camp, where students were asking for more personal feedback and I realized I couldn't really scale the class if I got stuck doing hundreds of gradings, basically. And that's when I came up with the concept of Feedback Friday, where one student from each team would present and then I would give feedback, I would give a critique, and it would be intense and all students would be watching, it would be recorded, and all of a sudden all the issues related to wanting this personal feedback went away. And therefore, once I instituted it on a regular basis, I went on to the next issue and I made sure that I didn't lose the progress that I had made and continued to make Feedback Friday better and better.

0:17:56.2 JD: Yeah. Yeah. That's great. That's great. I'll share my screen so you can kinda see what this looked like in Jessica's class now, what the chart looks like now. So now you see that same chart, that same process behavior chart, the exact same one we were just looking at, except now you can see this dashed vertical line that marks the spot where the intervention was started that we just talked about. And what the kids and Jessica are actually doing is running a PDSA cycle, a Plan-Do-Study-Act cycle. That's the experimental cycle in her class. And what they're running that PDSA on is, again, how can we put something in place to reduce the distracting noises. And so what the students actually said is, if we get a deduction for making noises, then there will be less noises. And so in the school's sort of management system, a deduction is sort of like a demerit.

0:19:00.0 JD: If you maybe went to a Catholic school or something like that, or some public schools had demerits as well, but basically it's like a minor infraction that goes home or gets communicated to parents at the end of the week. But the kids came up with this, so their basic premise, their plan, their prediction is: if there are less noises, we'll be able to enjoy science class, and if we give deductions for these noises, then there'll be less noises. So some people may push back, well, I don't think you should give deductions or something like that, and fine, you could have that opinion. But I think the powerful point here is the students created this, it was their idea. And so they're testing that idea to see if it actually has impact.

0:19:44.8 JD: And they're learning to do that test in this scientific thinking way by using the Plan-Do-Study-Act cycle, and seeing if it actually has an impact on their data. So at the point where they draw this dashed line, let's call that March 19th, we can see a couple of additional data points have been gathered. So you can see the data went up from 3/18 to 3/21. So from March 18th to March 21st, it rose from about, let's call it, 73% or so up to about 76% on March 21st. And then that next day it rose another percent or two, let's call that 78%.

0:20:28.1 JD: And so the trap here is you could say, okay, we did this intervention and it made things better. But the key point is the data did go up, but we haven't gathered enough additional data to see one of those patterns that we talked about that would say, oh, this actually has had a significant change. Because before the dashed line, you can see data points that are as high or even higher than some of these ones that we see after the PDSA is started. So it's too early to say one way or another if this intervention is having an impact. So we're not gonna overreact. You could see a place where you're so excited that it did go up a couple of days from where it was on March 18th before you started this experiment, but that's a trap. Because it's still just common cause data, still just bouncing around that average, it's still within the bounds of the red process limits that define the science system.

0:21:34.2 AS: I have an experiment going on in my latest Valuation Master Class Boot Camp, but in that case, it's a six-week period that I'm testing, and then I see the outcome at the end of the six weeks to test whether my hypothesis was right or not. Whereas here it's real time, trying to understand what's happening. So yes, you can be tempted when it's real time to try to jump to conclusions, but when you say, well, okay, I can't really get the answer until I've run the test for a fixed time period, then you don't have as much of that temptation to draw a conclusion.

0:22:14.1 JD: Yeah. And if I actually was... I should have actually taken this a step farther. I marked it with this Plan-Do-Study-Act cycle. What I should have done too is write "noises" or something like that, deduction for noises, some small annotation, so it'd be clear what this PDSA cycle is.

0:22:32.1 AS: In other words, you're saying identify the intervention by the vertical line, but also label it as to what that intervention was, which you've done before on the other chart. I remember.

0:22:42.1 JD: Yeah. And then it'd be, sort of, just looking at this when she puts this up on the smart board for the class to see it again too: oh yeah, yeah, that's when we ran that first intervention, and that was the intervention where we did deductions for noises. But the bigger point is that this never happens, where you have some data, you understand a system, you plan a systematic intervention, and then you gather more data right after it to see if it's having an impact. We never do that in education, ever. Never have I ever seen this before. Nothing like this. Just this little setup combining the process behavior chart with the Plan-Do-Study-Act cycle, I think, is very, very powerful and a very different approach than...

0:23:33.4 AS: Exciting.

0:23:34.6 JD: Yeah. Than the typical approach to school improvement. So I'll stop that share for a second there, and we can do a quick overview of lesson 10 and then jump back into the chart as more data has been gathered. So lesson 10 is: the purpose of data analysis is insight. Seems pretty straightforward. This is one of those key teachings from Dr. Donald Wheeler, who we've talked about. He taught us that the best analysis is the simplest analysis which provides the needed insight.

0:24:08.1 AS: So repeat lesson 10 again, the purpose of...

0:24:11.6 JD: The purpose of data analysis is insight.

0:24:14.7 AS: Yep.

0:24:15.6 JD: So just plotting the dots on the run chart and turning the run chart into the process behavior chart, that's the most straightforward method for understanding how our data is performing over time. We've talked about this a lot, but it's way more intuitive to understand the data and how it's moving than if you just stored it in a table or a spreadsheet. Got to use these time sequence charts. That's so very important.

0:24:42.2 AS: And I was just looking at the definition of insight, which is a clear, deep, and sometimes sudden understanding of a complicated problem or situation.

0:24:51.6 JD: Yeah. And I think that can happen, it's much more likely to happen, when you have the data visualized in this way than the ways that we typically visualize data, in just a table or a spreadsheet. And so in Jessica's case, we left off on March 22nd and they had done two surveys after the intervention. And so then of course what they do is they continue over the next four, five, six weeks, gathering more of that data as they're running that intervention, and then we can switch back and see what that data is looking like now.

0:25:28.3 AS: Exciting.

0:25:30.3 JD: So we have this same chart with that additional data. So we have data all the way out to now April 11th. So they ran this PDSA for about three, four weeks, about a month.

0:25:47.9 AS: And that's 11 data points after the intervention. Okay.

0:25:54.0 JD: Yep. Purposeful. So what was I gonna say? Oh, yeah. So three, four weeks for a Plan-Do-Study-Act cycle, that's a pretty good amount of time. Two to four weeks, I've kind of found, is a sweet spot. Shorter than that, it's hard to get enough data back to see if your intervention has made a difference. Longer than that, and you're getting away from the sort of adaptability, the ability to build on an early intervention, make the tweaks you need to. So that two to four week time period for your PDSA seems like a sweet spot to me. So she's continued to collect this joy in learning data to see... Basically what she and her class are doing is seeing if their theory is correct. Does this idea of giving deductions for making noises have an impact? Is it effective?

0:26:44.0 JD: So if they learn, if the data comes back and there is no change, no indication of improvement, then a lot of people will say, well, my experiment has failed. And my answer to that is, no, it hasn't failed. It might not have worked like you wanted, but you learn very quickly that that noise deduction is not going to work and we're gonna try some other thing, some other intervention. We learn that very, very quickly, within three or four weeks, that we need to try something new. Now, in the case of Jessica's class, that's not what happened. So you can actually see that dotted line, the vertical dotted line, is still at March 19th, and we have those 11 additional data points. And you can actually see, if you count, starting with March 21st, you count 1-2-3-4-5-6-7-8-9-10-11 data points that are above that green average line from before.

0:27:45.5 JD: So originally the red lines, the limits, and the central line would just be straight across. But once I see that eight or more of those are on one side of that central line, then I actually shift the limits and the average line, 'cause I have a new system. I've shifted it up, and that actually is an indication that this intervention has worked, because we said... Now, for those that are watching, it doesn't appear that all the blue dots are above that green line, but they were before the shift. Remember, the shift indicates a new system. So I go back to the point where the first dot of the eight or more in a row occurred, and that's where I have indicated a new system with the shift in the limits and the central line. So their theory was actually correct. This idea of giving a deduction for noises actually worked to improve the joy in Jessica's science class. It was a successful experiment.
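For those following along in code, here is a minimal sketch of the shift John describes: once eight or more consecutive points land above the old average, the centerline and limits are recalculated from the first point of that run onward. It reuses the process_behavior_limits() helper sketched earlier, handles only the upward run described here, and the data and names are illustrative.

```python
# Minimal sketch: shift the limits after a sustained signal. When eight or more
# consecutive points fall above the old centerline, recompute the average and
# natural process limits from the first point of that run onward.
# Assumes the process_behavior_limits() helper sketched earlier on this page.

def shifted_limits(values, old_center, run_length=8):
    """Return (start_index, center, lower, upper) for the new system, or None if no shift."""
    run, run_start = 0, None
    for i, v in enumerate(values):
        if v > old_center:                        # only looking for an upward run here
            run_start = i if run == 0 else run_start
            run += 1
            if run >= run_length:
                new_values = values[run_start:]   # recalculate from the first point of the run
                return (run_start, *process_behavior_limits(new_values))
        else:
            run, run_start = 0, None
    return None
```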

0:28:52.7 AS: Can I draw on your chart there and ask some questions?

0:29:00.5 JD: Sure. Yeah.

0:29:00.6 AS: So one of my questions is, is it possible, for instance, that in the preliminary period, let's say the first 20 days or so, things were kind of stabilized, and then what we saw is that things potentially improved here in the period before the intervention, and that the intervention caused an increase, but it may not be as significant as it appears based upon the prior, the most recent, let's say, 10 days or something like that. So that's my question on it. I'll delete my drawings there.

0:29:46.3 JD: Yeah, I think that's a fair question. So, the reason I didn't shift those before, even though you do see a pattern before the dotted line, is I considered that period a baseline period where we were just collecting, 'cause they hadn't tried anything yet. So Dr. Wheeler has this series of four questions. So in addition to seeing a signal, he's got these other sort of questions that he typically asks, and they're yes/no questions. And you want the answer to all of those to be yes. And one of 'em is, do you know why an improvement or a decline happened? And if you don't, then you really shouldn't shift the limits. So that's why I didn't shift them before. I chose not to shift them until we actually did something, actually tried something.

0:30:33.2 AS: Which is basically saying that you're trying to get the voice of the students, a clear voice, and it may be that over the time of the intervention, it could be that the... Sorry, over the time of the initial data gathering, the repetition of it may have caused students to feel more joy in the classroom because they were being asked, and maybe that started to adjust a little bit up, and there's the baseline, so. Yep. Okay.

0:31:01.6 JD: Yeah. And so this is sort of where the project ended for the fellowship that Jessica was doing. But what would happen, if we could sort of see what happened further out in the school year, is that either Jessica and the class could be sort of satisfied with where the joy in learning is at this point where the improvement occurred, or they could run another cycle, testing a tweaked version of that noise reduction PDSA, that intervention, or they could add something to it.

0:31:43.0 AS: Or they could have gone to another fishbone point. Maybe the noise wasn't actually... the students thought it would be the number one contributor, but maybe by looking at the next one they could see, oh, hey, wait a minute, this may be a higher contributor or not.

0:32:01.2 JD: Yeah. And when you dug into the actual plan, the specifics of the plan, how that noise deduction was going to work, there may be something in that plan that didn't go as planned, and that's where you would have to lean on... 'cause we've talked about the three sort of parts of the improvement team that you need. You need the frontline people, that's the students. You need the person with the authority to change the system, that's Jessica. And then someone with the knowledge of the system, profound knowledge, that's me. Well, Jessica and her students are the ones in there every day. So they're gonna have learning about how that intervention went that would then inform the second cycle of the PDSA, whatever that was gonna be, whatever they're gonna work on next. The learning from the first cycle is gonna inform that next cycle.

0:32:51.4 JD: So the idea is that you don't just run a PDSA once, but you repeatedly test interventions or change ideas until you get that system where you want it to be.

0:33:01.1 AS: So for the listeners and viewers out there, I bet you're thinking, gosh, Jessica's pretty lucky to have John help her go through this. And I think about lots of things that I want to talk to you about [laughter], about my testing in my own business, and in my own teaching, but also in my business. So I think one of the exciting things about this is the idea that we do a lot of these things in our head sometimes. I think this will make a difference, but we're not usually doing this level of detail in the way that we're actually performing the tests and trying to see what the outcomes are.

0:33:43.9 JD: Yeah, I think for school people too, when we've attempted to improve schools, reform schools, what happens is we go really fast and the learning actually happens very slowly, and we don't really appreciate what it actually takes to change something in practice. And what happens then to the frontline people like teachers... The reformers have good intentions, but the people on the front line just get worn out, basically, and a lot of times nothing actually even improves. You just wear people out. You make these big changes go fast and wide in the system and you don't really know exactly what to do on the ground. The opposite is what's happening in Jessica's classroom. They're actually learning fast by trying very small changes and getting feedback right in the place where that feedback needs to be given, right in the classroom, and then they can learn from that and make changes.

0:34:49.8 JD: And again, it may seem smaller. Maybe it doesn't seem that revolutionary to people, but to me, I think it's a completely revolutionary, completely different way to do school improvement that actually honors the expertise of the teacher in the classroom, takes into account how students are experiencing a change, and then I'm kind of providing a method that they can use to make that classroom better for everybody. And I think in doing so, students are more likely to find joy in their work, joy in their learning, and teachers are more likely to find joy in their work as well. So to me it's a win-win for all those involved.

0:35:34.9 AS: Fantastic. Well, should we wrap up there?

0:35:40.6 JD: Yeah, I think that's a good place to wrap up this particular series.

0:35:45.1 AS: And maybe you could just review the whole series of what we've done, just to kind of make sure that everybody's clear, and if somebody just came in on this one, they know a little bit of the flow of what they're gonna get in the prior ones.

0:36:00.4 JD: Yeah. So we did six episodes, and in those six episodes we started off just talking about what you need to have in place for healthy goal setting at an organizational level, and we put four conditions in place: before you ever set a goal, you have to understand the capability of your system, you have to understand the variation within your system, you have to understand if the system that you're studying is stable, and then you have to have a logical answer to the question, by what method? By what method are you gonna bring about improvement, or by what method are you gonna get to this goal that you wanna set? So we talked about that, you gotta have these four conditions in place, and without those we said goal setting is often an act of desperation.

0:36:49.7 JD: And then from there what we did is start talking about these 10 key lessons for data analysis, so as you get the data about the goal and you start to understand the conditions for that system or process, we could use those 10 data lessons to interpret the data that we're looking at or studying, and we basically did that over the first four episodes. In the last few episodes what we've done is look at those lessons applied to Jessica's improvement project, and that's what we just wrapped up, looking at those 10 lessons.

0:37:23.7 AS: I don't know about the listeners and viewers, but for me this type of stuff just gets me excited about how we can improve the way we improve.

0:37:33.4 JD: Yeah. For sure.

0:37:34.9 AS: And that's exciting. So John, on behalf of everyone at the Deming Institute, I want to thank you again for this discussion. And for listeners, remember to go to deming.org to continue your journey. You can find John's book, Win-Win: W. Edwards Deming, the System of Profound Knowledge, and the Science of Improving Schools, on amazon.com. This is your host, Andrew Stotz, and I'll leave you with one of my favorite quotes from Dr. Deming: "People are entitled to joy in work."
