
TeachingInexperiencedPeople

I am leading a 2-hour simulation about software quality and time to market on Monday, July 22. This opportunity emerged from discussions I had with DavidSocha, an AYE participant, about designing experiential learning.

David changed jobs recently. He is now an instructor at the University of Washington, teaching CSE 403: Software Engineering Bootcamp this summer. David asked me if I would be a "guest lecturer" for his class. As you can see, I agreed to lead a simulation rather than lecture.

So, the participants for the simulation will be college students who will soon join the work force. I've been trying for the past few days to put myself in the students' shoes so that I can better relate to them. It isn't working. I cannot seem to get my size 13 feet into their shoes.

I remember only bits and pieces from my transition from student to professional. What stands out for me from before entering the work force is my yearning to "work with the latest technology", "help people", "make my mark", and "make a difference". After a few months on my first job, I remember thinking that my most important learning was about people rather than technology.

I didn't have a clue about quality or time to market when I took my first job. I must be a slow learner because it took me 25 years to digest the fundamental dynamics of the tradeoffs between quality, speed, and economy. I know that I still have a lot to learn about those dynamics. And I look forward to learning more.

I would like you to share with me the answers to a few questions --

  • What were your yearnings as you entered the work force?
  • What did you know about making tradeoffs?
  • What do you wish you had known about making tradeoffs before starting your first software job?
  • What would you like a new professional who is working with/for you to know about making tradeoffs?

The good news is that the simulation will give the students real-world experience with the tradeoff between quality and speed. My objective is to help the students see beyond the context of design and coding to the larger context of software development and business.

I would appreciate anything you would share with me about this topic.

Thanks, SteveSmith 2002.07.17


Steve: One of the things you should definitely work in is forgetting the academic "all by myself" ethic - make them explore (measure?) each other's relative strengths [& weaknesses] with logging, code reading, brainstorming. There is no academic in TEAM. Time & rate code reuse vs. code from scratch? (I'm not sure what you can cover in 2 hours)

My 2 cents. --BobLee 2002.07.17


Steve, Bob has some great observations, and I agree that your task is ambitious. Maybe if you picked something familiar to you and your students as the basis of a simulation, you could create that sense of wonder. A summer camp? A high school?

When there are different roles (executive to developer), the debrief can bring out interesting perspectives, but I wonder how much would be picked up by a student. They have to have some experience with the thing to understand the people in it.

I'm sure you'll do a great job with getting people in touch with each other. You might also ask your daughter to get a younger perspective.

Adding 2 more cents - BeckyWinant 2002.07.18


Bob and Becky,

Thank you for the replies.

I don't feel my task is very ambitious. I have a well-designed simulation that I'm pleased to lead. The task for David Socha, the instructor, is ambitious. David is using experiential learning as the foundation for his course, which makes it very different from the traditional college course. My hat is off to David.

He has the students working in teams to produce a software product, and once a week he brings in an outsider to lead a simulation. I see the simulations as a means for enabling the students to broaden their view of the technical and social sides of software development.

Bob, I've seen from the students' journals (another innovation from David Socha) that they are encountering issues around teamwork and individuality. Regarding measurement, the simulation measures time and quality. During the debrief, I'll ask the students to come up with similar measures for their products.

I still hope to hear about what AYE participants would like new professionals to know about tradeoffs. Imagine an interview situation. How would you test whether the person, experienced or inexperienced, was aware of the tradeoffs? How would you test whether the person would actually make tradeoffs?

SteveSmith 2002.07.18


My first real-life lesson after college was that different stakeholders within the same company can have different expectations about my project's rate of progress. (Among other things) I was showing regular progress to my boss, who was telling me that my rate of progress was fine. [Of course, way back then, I started with writing infrastructure to enable building the rest of the app. I didn't have a schedule or plan, and the deadline was hazy.] At a company meeting, I heard the president of the company say that progress on my product was too slow... surprise -- my boss wasn't communicating with the president, or vice versa.

Might you want to simulate something like that? KeithRay 2002.07.19


Keith's point suggests an area: negotiation. Being honest, being able to say "No!" rather than placate, and looking for Win-Win at all times. Too many games, academic activities, and portrayals in the media push the Win/Lose game (positional negotiation). Understanding diversity opens the mind to Win-Win opportunities. BobLee 2002.07.19


Postscript: I had fun leading the simulation. Here is what stands out for me --
  • I marveled at how some teams were successful purely by luck -- a true affirmation of similar experiences in the "real world".
  • I had a difficult time connecting people with their experiences working in a team -- probably because they had little experience.
  • I enjoyed watching the lights go on when I asked questions about assumptions, especially who was the leader and whether the team had made intentional decisions about leadership and process.
  • I sensed that the students' model of the tradeoff process was theoretical rather than practical. They have talked about tradeoffs, but they've never had to make hard choices about them.
  • I enjoyed sharing a different model, The Iron Triangle of Software Development, for visualizing the tradeoff process.
  • I wondered how Jerry taught college students who had little experience.
  • The academic environment does not fit me.

SteveSmith 2002.07.29


Actually, most of my college students were people with jobs in industry going to school part time - or else they were new freshmen and freshwomen who were lost in the college maze and wandered into our experimental programs. I never had much success with "conventional" college students.

And welcome to the "academic environment doesn't fit for me" club. - JerryWeinberg 2002.07.29


Wow. The simulation did reach some of the students. Here are a few entries from the students' public journals:
Student 1. Monday's guest lecturer taught us about the tradeoffs of when and when not to ship your product. I must admit that Monday I thought that this was sort of a silly lecture. I understood from the game we played that there are inherent risks with the way you tackle any product release bug and that some may be better than others, but you need to balance those fixes with wisdom and restraint at times. I thought that this was something that only applied to Microsoft and other corporations that build *poorly-designed* software. It wasn't until Friday when we were in the lab and my classmates were making arbitrary (though needed and desired) architecture changes that something clicked and I really got a glimpse of how crazy the real world gets sometimes and how even software engineers with the best intentions and the best development strategies *do* have to ship product with known bugs. I must admit that before this "epiphany moment" I really thought that I was somehow exempt from that category - that I was one of the rare few who would never be faced with the question of whether or not to ship with bugs. In a sense, Friday was a look at my software engineering mortality in that it checked my delusions of perfect-coding grandeur.

Student 2. When we started the marble game, I believed that we worked out a good strategy for a good final score. Unfortunately, we were dealt a bunch of unlucky blows from the good fix bag. I thought this was very interesting, because I started to feel a little desperate and panicky (or at least I was able to imagine that I would feel that way if we were really trying to get a real product ready to ship). After our group got a little desperate, we started using quick and dirty fixes, because I think we figured that we had to make changes (ANY CHANGES) even if they were bad changes! We ended up with one of the lowest scores, but we also had the most bad marbles to start with. I just hope I never experience that kind of situation when working on a real software project!! [<lol> Don't worry. You will. - Steve]

Student 3. Last week, I somewhat unhappily learned that you actually *do* have to make tradeoffs with the quality of your code; I thought Monday's presentation was unnecessarily cynical and randomized, but from some of the things I've heard in the lab the rest of the week, it seems to be reasonably accurate. Well, perhaps not as random as pulling marbles out of a bag, but frequently appearing so. Watching the build light go red, even occasionally, and knowing that it was code I had no control over, which I couldn't just directly fix, was an interesting experience; I'm used to small groups, where if something stops working, chances are I can help.

Student 4. I learned the need to be intentional. The money changing game that we played with David Schwartz pointed it out. The quality assurance game that we played with Steve Smith not only brought it out but also really focused my attention on it. I was shocked. Here I was attempting to make calculated deductions about how many black marbles were left and yet I wasn't thinking about what I was doing. We got really lucky. The Mikam Tribe aimed for 400,000 points and stuck with that goal in spite of the amount of poker chips they used. They were very intentional even though luck was not on their side. From that class on, I started being intentional. As if selling or making some product to sell, I asked myself "why am I doing this?", "what do I expect?", "how long will I be willing to wait for this? why?" Finally, after two workshops and reading Mastery, I see this change underway! It's exciting! It feels much more satisfying to intentionally aim for a direction than to react to whatever comes along.

SteveSmith 2002.08.02

Help. Would someone share with me how to indent a passage, such as the above quote? [Steve, I hope it was you asking for help on formatting your passage. - Shannon] Yes, it was me. Nice. Thank you, Shannon. -Steve


Steve, one of the things that these logs demonstrate is how often we have an effect on people that we do not see at the time. Over time, as they start working in the "real world", more of them will make the connection.

SherryHeinze 2002.08.03


Steve, it sounds from the comments like you might have been using my bead game to do this simulation. I can't remember; was that it? - JerryWeinberg 2002.08.04
Jerry, Yes, it was your "The Bead Game". I used marbles because I couldn't find beads. That change forced me to retitle the simulation to "The Quality Game". I put the W&W copyright on all the handouts, which I collected at the end of the class. I also suggested that the students check out your web site.

I've wanted to run the simulation ever since I first heard about it during the review of Jim Highsmith's Adaptive Software Development. This class gave me a chance to try it. Thank you for creating a terrific simulation. I loved running it.

SteveSmith 2002.08.04


Can you describe the marble aka bead game? BobKing
Even better, maybe we could run it at AYE? - JerryWeinberg
I trust that that would be even better - I would like to be a part of that. - BobKing 2002.08.06
Jerry, I think doing the simulation at AYE would be a wonderful addition. You have my full support. Please let me know how I can help. SteveSmith 2002.08.06
Bring all your marbles. (and send me an email) JerryWeinberg
For your information, it appears that the course that Steve helped teach (and helped design before the course started) was a very successful learning experience for the students (and for me, their "teacher"). On their anonymous reviews, the students gave me (and thus the "class") an overall grade of 4.5 out of 5, which is half-way between "very good" and "excellent". They especially liked the external "speakers", working together in the lab, the automated build system, and our stop-light. We hung a physical stop-light in the computer lab and had it controlled by the automated build system to indicate the health of the students' build: after every check-in to the code repository, the light would turn yellow while it was building and testing the new build, and then either green, if the build passed the regression tests, or red, if the tests failed. (The lights produce a visceral feeling when they go green or red.) The students even managed to deliver a product that did a small amount, despite being in chaos three weeks before they were to deliver their final CD-ROM.
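
In rough code terms, the stop-light logic amounts to something like the minimal sketch below. This is an illustration only, not the actual CSE 403 system: the "make build" / "make test" commands and the set_light() helper are hypothetical placeholders for whatever the real build server and light hardware used.

    import subprocess

    def set_light(color: str) -> None:
        # Placeholder for driving the physical stop-light; here it just prints.
        print(f"stop-light: {color}")

    def run_step(cmd) -> bool:
        # Run a command and report whether it succeeded.
        try:
            return subprocess.run(cmd).returncode == 0
        except OSError:
            return False

    def on_checkin() -> None:
        set_light("yellow")                            # building and testing the new check-in
        built = run_step(["make", "build"])            # hypothetical build command
        passed = built and run_step(["make", "test"])  # hypothetical regression tests
        set_light("green" if passed else "red")        # report the result to the lab

    if __name__ == "__main__":
        on_checkin()

In a setup like the one described, something like on_checkin() would be triggered after every commit to the code repository.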

The cycle of experiential workshops, working together on a single project (too large for a single superhero to do it all), weekly reflections, and occasional guidance and coaching created an amazing learning experience. Students were yelling at each other. Teams were claiming their part was done, but the other teams' parts weren't (though when I asked how they could say this if the product did not yet run, the class was silent). People were in tears over how others were treating them. I had to "fire" one of the leaders the week before the end of the project. One student insisted on spending his time adding a cool feature with no customer value (though several students continued to defend him because the feature was "cool"). The students saw (and wrote about) how a single person (one of the students who was an experienced manager) could instill just a bit of project tracking to pull order out of chaos. They gave a great customer presentation that focused on customer value. And they asked to throw a party on the last day, at which they handed out CD-ROMs with autographed covers.

Thinking back over SherryHeinze's comment about the delay between the onset of a change and the measurement of the change, the true measure of success came this morning, a month after the course finished, while I was meeting with the student I had "fired." We were talking about something else (it turns out we have a mutual interest) when he said that the course has changed the way he views every interaction with people. It has even improved his relationship with his girlfriend. Now that's success. - DavidSocha 9/25/2002


David, you've discovered why "happy sheets", the standard feedback sheets people fill out after conferences, workshops, classes, etc. don't work. The real learning comes later as people practice and evolve what they learned. We don't ask for happy sheet feedback at AYE. --JohannaRothman 2002.09.26
Johanna, I couldn't agree more. The only measure of learning I trust is changed behavior. And that takes time to practice and evolve. - DavidSocha 9/25/2002

I was also fortunate to be invited to "guest present" in David's course. Being an old guy, and not really sure what it is that I do, I found it raised some interesting questions for me:

  • What have I really got to offer these people?
  • How can I meet them where they are?

I decided that what I've got to offer - especially in the company of the other guest presenters, experts all - is mostly some kind of perspective. I've been a lot of places and seen a lot of things. Maybe I could help them see the soup they were in a bit, and perhaps learn a tool or two for identifying the soup. I think when attempting to teach anyone, but especially inexperienced people, some useful techniques are:

  • Modeling. Be what you want them to grasp.
  • Coaching, or assists. Help them just a little, to do something themselves.
  • Reflecting. On what they're experiencing, and on what you're experiencing.
  • Connecting. A model or idea to what's happening for them right now. Reflection and doing go together.

I came in about a week before the "firing." They were well into various crises. The classroom work began at 08:30 in the morning. So, I tried to model, coach and so on, starting with the crisis they were in. Most interesting for me:

  • I tried to meet them where they were. When we started talking, the planned presentation went right out the window. They didn't need a created example of why they might want some tools - they had plenty.
  • While we were talking, I pointed out what I was doing right in the moment, sometimes. I pointed out what they were doing in the moment, sometimes. Sometimes they got it, sometimes they didn't.
  • They were skeptical of the unqualified correctness of the project management and software engineering canon. (I reinforced this: "They're just models. Take them from everywhere." Later I gave them a big list of "everywhere.")
  • They got the idea of a stopwatch for finding the energy. They didn't get that that is what I was doing with them (or at least didn't say so in their journals.)
  • They got the use of a diagram of effects to identify opportunities for intervention. One student even used this technique successfully with a management / organization problem outside of class - with a nonprofit where she volunteers. Bonus.
  • They didn't notice (or at least nobody mentioned) that while we were talking about their current problems, I was doing verbal systems diagrams. By the time we got to making one on the board, we were just capturing what we'd been doing for an hour. (BTW, that's a very atypical lecture approach, and a typically successful consulting approach.)
  • I sent them a journal entry of my own about this experience. Seemed only fair.
  • I got to notice again, some more, how deeply intermingled development processes & tools are with the associated management of development. Early in my career, I was perhaps fortunate to be on several teams that got ignored. We did what made sense to us, without much "helping" from management. So I tend to forget that managers matter, especially when they are being "helpful" and think they know something.

I am pleased that, per the student comments, David's comments, and the survey information, I kept up with the quality of the course and of the other guests. So, I will sustain the things that worked, above, and improve what I can. If I have the opportunity to do a similar presentation again, I'll make three changes:

  • I'd get a few small readings and a thought experiment out ahead of time. This didn't happen this time, simply due to my schedule.
  • I'd show up for a lab session ahead of time. I'd get to learn about the system being built and its current state, and they'd get to know me a little. This was also intended, but my schedule did not permit it.
  • I'd absolutely show up for the other guest presenters' sessions.

I think I know enough now to make an experiential session on this topic for people who aren't in bootcamp together. The difference is in finding the energy to illuminate a model. If they don't have a common problem to understand, create one - this is what simulations do in any event. If they do have a common problem, you don't have to create one. Once David decided to have them actually build and attempt to ship something, they would have ample - er - experiential motivation. By coming in late in the game, I didn't need a "simulation": they were already doing one.

As for how we did, one of them asked at one point how relevant this experience was. I told them they were getting a dose of perspective that people get only after their first year, or year and a half, of doing real work, if ever. Then there's the UW's student course feedback, which asks what the students got out of the course compared to their other experiences. Ratings of four or better are "very rare." The folks who ran the course - the TA was also exceptional - constructed their own survey, which also came back with high ratings within the available range (which included negatives), though with no baseline to compare against, of course. And several of the students have maintained contact and reported that they've used what they learned. So we've got several sources of information: "expert" opinions, two surveys, student journals, and ongoing contact with participants, which all seem to track. David (with some help) did a really cool thing here.

- JimBullock, 2002.9.26


Updated: Thursday, September 26, 2002