TeachingInexperiencedPeople

I am leading a 2-hour simulation about software quality and time to market on Monday, July 22. This opportunity emerged from discussions I had with DavidSocha, an AYE participant, about designing experiential learning. David changed jobs recently. He is now an instructor at the University of Washington, and he is teaching CSE 403: Software Engineering Bootcamp this summer. David asked me if I would be a "guest lecturer" for his class. As you can see, I agreed to lead a simulation rather than lecture. So, the participants for the simulation will be college students who will soon join the work force.

I've been trying for the past few days to put myself in the students' shoes so that I can better relate to them. It isn't working. I cannot seem to get my size 13 feet into their shoes. I remember only bits and pieces from my transition from student to professional. The things that stand out for me before entering the work force were my yearnings to "work with the latest technology", "help people", "make my mark", and "make a difference". After a few months on my first job, I remember thinking that my most important learning was about people rather than technology.

I didn't have a clue about quality or time to market when I took my first job. I must be a slow learner, because it took me 25 years to digest the fundamental dynamics of the tradeoffs between quality, speed, and economy. I know that I still have a lot to learn about the dynamics. And I look forward to learning more. I would like you to share with me the answers to a few questions --
The good news is that the simulation will give the students real-world experience with the tradeoff between quality and speed. My objective is to help the students see beyond the context of design and coding to the larger context of software development and business. I would appreciate anything you would share with me about this topic. Thanks, SteveSmith 2002.07.17

Steve: One of the things you should definitely work in is forgetting the academic "all by myself" ethic - make them explore (measure?) each other's relative strengths [& weaknesses] with logging, code reading, brainstorming. There is no academic in TEAM. Time & rate code reuse vs. code from scratch? (I'm not sure what you can cover in 2 hours.) My 2 cents. --BobLee 2002.07.17

Steve, Bob has some great observations, and I agree with the ambitiousness of your task. Maybe if you picked something familiar to you and your students as the basis of a simulation, you could create that sense of wonder. A summer camp? A high school? When there are different roles (executive to developer), the debrief can bring out interesting perspectives, but I wonder how much would be picked up by a student. They have to have some experience with the thing to understand the people in it. I'm sure you'll do a great job with getting people in touch with each other. You might also ask your daughter to get a younger perspective. Adding 2 more cents - BeckyWinant 2002.07.18

Bob and Becky, Thank you for the replies. I don't feel my task is very ambitious. I have a well-designed simulation that I'm pleased to lead. The task for David Socha, the instructor, is ambitious. David is using experiential learning as the foundation for his course, which makes his course way different from the traditional college course. My hat is off to David. He has the students working in teams to produce a software product, and once a week he brings in an outsider to do a simulation.
I see the simulations as a means of enabling the students to broaden their view of software and social development. Bob, I've seen from the students' journals - another innovation from David Socha - that they are encountering issues around teamwork and individuality. Regarding measurement, the simulation measures time and quality. I'll ask the students, during the debrief, to come up with similar measures for their products. I still hope to hear about what AYE participants would like new professionals to know about tradeoffs. Imagine an interview situation. How would you test whether the person, experienced or inexperienced, was aware of the tradeoffs? How would you test whether the person would actually make tradeoffs? SteveSmith 2002.07.18

My first real-life lesson after college was that different stakeholders within the same company can have different expectations about my project's rate of progress. (Among other things) I was showing regular progress to my boss, who was telling me that my rate of progress was fine. [Of course, way back then, I started with writing infrastructure to enable building the rest of the app. I didn't have a schedule or plan, and the deadline was hazy.] At a company meeting, I heard the president of the company say that progress on my product was too slow... surprise -- my boss wasn't communicating with the president, or vice versa. Might you want to simulate something like that? KeithRay 2002.07.19

Keith's point suggests an area: negotiation. Being honest, able to say "No!" rather than placate, and looking for Win-Win at all times. Too many games, academic activities, and portrayals in the media push the Win/Lose game (positional negotiation). Understanding diversity opens the mind to Win-Win opportunities. BobLee 2002.07.19

Postscript: I had fun leading the simulation. Here is what stands out for me --
SteveSmith 2002.07.29

Actually, most of my college students were people with jobs in industry going to school part time - or else they were new freshmen and freshwomen who were lost in the college maze and wandered into our experimental programs. I never had much success with "conventional" college students. And welcome to the "academic environment doesn't fit for me" club. - JerryWeinberg 2002.07.29

Wow. The simulation did reach some of the students. Here are a few entries from the students' public journals:

Student 1. Monday's guest lecturer taught us about the tradeoffs of when and when not to ship your product. I must admit that Monday I thought that this was sort of a silly lecture. I understood from the game we played that there are inherent risks with the way you tackle any product release bug, and that some may be better than others, but you need to balance those fixes with wisdom and restraint at times. I thought that this was something that only applied to Microsoft and other corporations that build *poorly-designed* software. It wasn't until Friday, when we were in the lab and my classmates were making arbitrary (though needed and desired) architecture changes, that something clicked and I really got a glimpse of how crazy the real world gets sometimes, and how even software engineers with the best intentions and the best development strategies *do* have to ship product with known bugs. I must admit that before this "epiphany moment" I really thought that I was somehow exempt from that category - that I was one of the rare few who would never be faced with the question of whether or not to ship with bugs. In a sense, Friday was a look at my software engineering mortality in that it checked my delusions of perfect-coding grandeur.

SteveSmith 2002.08.02

Help. Would someone share with me how to indent a passage, such as the above quote? [Steve, I hope it was you asking for help on formatting your passage. - Shannon] Yes, it was me. Nice. Thank you, Shannon.
-Steve

Steve, one of the things that these logs demonstrate is how often we have an effect on people that we do not see at the time. Over time, as they start working in the "real world", more of them will make the connection. SherryHeinze 2002.08.03

Steve, it sounds from the comments like you might have been using my bead game to do this simulation. I can't remember; was that it? - JerryWeinberg 2002.08.04

Jerry, Yes, it was your "The Bead Game". I used marbles because I couldn't find beads. That change forced me to retitle the simulation "The Quality Game". I put the W&W copyright on all the handouts, which I collected at the end of the class. I also suggested that the students check out your web site. I've wanted to run the simulation ever since I first heard about it during the review of Jim Highsmith's Adaptive Software Development. This class gave me a chance to try it. Thank you for creating a terrific simulation. I loved running it. SteveSmith 2002.08.04

Can you describe the marble, aka bead, game? BobKing

Even better, maybe we could run it at AYE? - JerryWeinberg

I trust that that would be even better - I would like to be a part of that. - BobKing 2002.08.06

Jerry, I think doing the simulation at AYE would be a wonderful addition. You have my full support. Please let me know how I can help. SteveSmith 2002.08.06

Bring all your marbles. (And send me an email.) JerryWeinberg

For your information, it appears that the course that Steve helped teach (and helped design before the course started) was a very successful learning experience for the students (and for me, their "teacher"). On their anonymous reviews, the students gave me (and thus the "class") an overall grade of 4.5 out of 5, which is halfway between "very good" and "excellent". They especially liked the external "speakers", working together in the lab, the automated build system, and our stop-light.
We hung a physical stop-light in the computer lab and had it controlled by the automated build system to indicate the health of the students' build: after every check-in to the code repository, the light would turn yellow while the system was building and testing the new build, and then either green, if the build passed the regression tests, or red, if the tests failed. (The lights produce a visceral feeling when they go green or red.) The students even managed to deliver a product that did a small amount, despite being in chaos three weeks before they were to deliver their final CD-ROM.

The cycle of experiential workshops, working together on a single project (too large for a single superhero to do it all), weekly reflections, and occasional guidance and coaching created an amazing learning experience. Students were yelling at each other. Teams were claiming their part was done but the other teams' parts weren't (though when I asked how they could say this if the product did not yet run, the class was silent). People were in tears over how others were treating them. I had to "fire" one of the leaders the week before the end of the project. One student insisted on spending his time adding a cool feature with no customer value (though several students continued to defend him because the feature was "cool"). The students saw (and wrote about) how a single person (one of the students who was an experienced manager) could instill just a bit of project tracking to pull order out of chaos. They gave a great customer presentation that focused on customer value. And they asked to throw a party on the last day, at which they handed out CD-ROMs with autographed covers.

Thinking back over SherryHeinze's comment about the delay between the onset of a change and the measurement of the change, the true measure of success came this morning, a month after the course finished, while I was meeting with the student I had "fired."
We were talking about something else (it turns out we have a mutual interest) when he said that the course has changed the way he views every interaction with people. It has even improved his relationship with his girlfriend. Now that's success. - DavidSocha 9/25/2002

David, you've discovered why "happy sheets", the standard feedback sheets people fill out after conferences, workshops, classes, etc., don't work. The real learning comes later, as people practice and evolve what they learned. We don't ask for happy-sheet feedback at AYE. --JohannaRothman 2002.09.26

Johanna, I couldn't agree more. The only measure of learning I trust is changed behavior. And that takes time to practice and evolve. - DavidSocha 9/25/2002

I was also fortunate to be invited to "guest present" in David's course. Being an old guy, and not really sure what it is that I do, it raised some interesting questions for me:
I decided that what I've got to offer - especially in the company of the other guest presenters, experts all - is mostly some kind of perspective. I've been a lot of places and seen a lot of things. Maybe I could help them see the soup they were in a bit, and perhaps learn a tool or two for identifying the soup. I think that when attempting to teach anyone, but especially inexperienced people, some useful techniques are:
I came in about a week before the "firing." They were well into various crises. The classroom work began at 08:30 in the morning. So I tried to model, coach, and so on, starting with the crisis they were in. Most interesting for me:
I am pleased that, per the student comments, David's comments, and the survey information, I kept up with the quality of the course and the other guests. So I will sustain the things that worked, above, and improve what I can. If I have the opportunity to do a similar presentation again, I'll make three changes:
I think I know enough now to make an experiential session on this topic for people who aren't in a bootcamp together. The difference is in finding the energy to illuminate a model. If they don't have a common problem to understand, create one - this is what simulations do in any event. If they do have a common problem, you don't have to create one. Once David decided to have them actually build and attempt to ship something, they would have ample - er - experiential motivation. By coming in late in the game, I didn't need a "simulation": they were already doing one.

As for how we did, one of them asked at one point how relevant this experience was. I told them they were getting a dose of perspective that people get only after their first year, or year and a half, of doing real work, if ever. Then there's the U-W's student course feedback, which asks what the students got out of the course compared to their other experiences. Ratings of four or better are "very rare." The folks who ran the course - the TA was also exceptional - constructed their own survey, with high ratings within the available range (which included negatives), though with no baseline to compare against, of course. And several of the students have maintained contact and reported that they've used what they learned. So we've got several sources of information: "expert" opinions, two surveys, student journals, and ongoing contact with participants, which all seem to track. David (with some help) did a really cool thing here. - JimBullock, 2002.09.26
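The stop-light David describes earlier - yellow while a check-in is building and testing, then green or red depending on the regression-test result - is essentially a tiny state machine driven by build-system events. Here is a minimal sketch of that logic; all the names are hypothetical illustrations, not the actual UW setup, and the real lab version would switch a physical lamp instead of recording a value:

```python
from enum import Enum

class Light(Enum):
    """The three states of the stop-light."""
    GREEN = "green"    # last build passed the regression tests
    YELLOW = "yellow"  # a build is in progress
    RED = "red"        # last build failed its tests

class BuildLight:
    """Tracks which lamp the build system would have lit."""

    def __init__(self):
        # Start green: no build in progress, nothing known to be broken.
        self.state = Light.GREEN

    def on_checkin(self):
        # Every check-in to the repository kicks off a build,
        # so the light goes yellow while building and testing.
        self.state = Light.YELLOW

    def on_build_finished(self, tests_passed):
        # Build done: green if the regression tests passed, red if not.
        self.state = Light.GREEN if tests_passed else Light.RED
```

A check-in that fails its tests leaves the light red for everyone to see until a later check-in rebuilds and passes, which is where the visceral effect David mentions comes from.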
Updated: Thursday, September 26, 2002