Some time ago a client approached our company requesting a course on a particular programming language. It had to be intensive, five days long, and aimed at sysadmins.
My knowledge of the topic and of teaching methodologies came mainly from my professional background: using different tools to solve issues in production environments, doing data migrations, running scripts, etc.
But this course required the extra effort of digging into those already familiar subjects with a different, more pedagogical approach. Moreover, the material had to grow in complexity as the course went on. Nonetheless, it was a challenge I was eager to take on, since it would let me gain (and pass on) the depth of knowledge it required.
The intention of this post is to tell you about the steps and methodologies I used to build a training plan.
Having a plan vs. building a plan
When we started making a plan for it, the first question I had was: what is the scope? Luckily this time, we got off on the right foot since we had a list of topics handed to us by the client in advance.
Although such topics weren't unknown to me as a developer, googling a topic to untangle some obscure aspect we can't solve is certainly not the same as preparing, from start to finish, a course agenda that assumes the attendees know nothing about it.
The first thing I understood was that I needed a plan. Could the list of topics that they had given us for training fulfill this need?
The answer was no. That list was an end, a goal, not a plan. While I was developing the steps of the plan, I understood the difference between the content list and a planned schedule.
A key point to take into account was evaluating how people would improve their knowledge and retain concepts as time and preparation went on, as well as remembering one's own learning experiences and the trial and error involved in the process.
I focused on the aspects that seemed most complex back when I was learning, and on what I could have done, or known, to avoid the bitter experience of spending days on subjects that, presented differently, would have been easier to understand or solve.
Using acts of faith
My intuition as a developer is that no matter what the code says, it's just lines on a screen until you see the outcome. This seems pretty elementary, but it can get really confusing for an inexperienced programmer.
When you see on the screen:
> print('hello world')
It’s just the intention of printing a message, but to the eyes of someone new to this world, the meaning and intention might not be so evident until they press enter and see
> print('hello world')
hello world
Inherently, we need to try things to understand how they work, before moving forward to greater knowledge. We need to build solid foundations that we can trust. We need to stand in order to walk and to walk in order to run.
One technique that extensively promotes this dynamic is TDD. (I assume you know what TDD is; if not, it's a good time to learn. There's a lot of material online, and I personally recommend the post: 6 tips for a powerful TDD session.) TDD promotes writing tests before code, and one of its best qualities is boosting developer confidence in the quality of the software.
Explaining what TDD is to an audience that may not know how to code can be very difficult. But at its core, the TDD mantra (test, code, refactor) is quite simple. So I decided I'd teach using the same methodology, but in a more implicit way.
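To make the mantra concrete, a single iteration might look like this. This is only a sketch in Python (the post never names the course language), and `slugify` is a hypothetical function invented for the illustration:

```python
# Step 1 (test): write the test first. Running it now would fail,
# because `slugify` does not exist yet.
def test_slugify():
    assert slugify("Hello World") == "hello-world"

# Step 2 (code): write the simplest code that makes the test pass.
def slugify(title):
    return title.lower().replace(" ", "-")

# Step 3 (refactor): clean up the code while the test keeps passing.
test_slugify()
```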
I appealed to these main ideas:
- A test (believe me) is an efficient way of trying things out.
- Executing them should be simple.
- In order to learn more by themselves, they should create simple tests, break them, and then solve them by learning and applying something new.
- This whole process is iterative and incremental.
Then I put all the course material together, based on the premise that for each topic or characteristic of the language to be explained, we’d have:
- Tests that validate that what I explained works as expected.
- Students who try them and become friends with the tests.
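For instance, a topic's material could pair a short explanation with tests like these. This is a sketch in Python, which is an assumption on my part, since the post doesn't name the language:

```python
import unittest

class TestListBasics(unittest.TestCase):
    # A test that validates the behavior just explained in class.
    def test_append_adds_to_the_end(self):
        items = [1, 2]
        items.append(3)
        self.assertEqual(items, [1, 2, 3])

    # A test students deliberately break (e.g. by changing the
    # expected value), then fix by learning the actual behavior.
    def test_slicing_excludes_the_end_index(self):
        self.assertEqual([1, 2, 3, 4][1:3], [2, 3])

if __name__ == "__main__":
    unittest.main()
```

Breaking a test like the second one, and then repairing it, turns each language feature into something the student has verified for themselves rather than taken on faith.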
Assembling the training
It took me about a week to put together a complete plan that included the technical topics and related tests, complemented by the corresponding slides.
This is where the difference between building a training programme and delivering one becomes noticeable. Many of the topics developed for the practical part could, when turned into theoretical material, easily provoke discomfort or confusion given their depth (while always staying committed to testing the concepts previously taught).
The only way to accomplish this was to accept that I couldn't teach and develop complete subjects in a single day. I understood that knowledge needs to rest; the mind needs to adapt so it can make room for new knowledge.
Under this premise, I divided full topics into parts, increasing complexity toward the end, where the knowledge and techniques employed would complement each other on solid (tested!) foundations. From my point of view, this visibly favors the learner's momentum and confidence.
It’s also very important to get feedback from people with training experience. I was lucky enough, during the assembly of the course, to count on experienced trainers to exchange opinions. A sort of code review.
I am only a couple of weeks away from the beginning of the training course, and my work is almost finished. The heuristics I found fundamental in assembling all the material were:
- Thinking about my experience as a student and the difficulties I had back then, instead of my comfort as a teacher.
- Understanding the concept of iterative and incremental. I’m positive that the material will be ready before the start of the course.
- Understanding that knowledge is a road to be built, with focus on the students, who are our companions on the road. It’s a trip we are making together, and I’m not just the driver.
I wish, and hope, to put together a second part of this post, commenting in retrospect on the successes and failures I observe while delivering the training.