[Image: NGCM workshop panorama. "Workshop in action" by Simon Hettrick, licensed under CC BY 2.0.]

The doctoral training centre in Next Generation Computational Modelling (NGCM) is hosting its first summer academy this week. The schedule is jam-packed with talks from NVIDIA, Microsoft, FEniCS, the IPython developers, Intel, and more.

Day one was a crash course in basic Python, unit testing, and version control: three key elements of any computational research, as the SSI's Simon Hettrick argued in the closing lecture. Most of the material was recycled from the ever-popular Software Carpentry curriculum.
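
To give a flavour of the level, here is a minimal sketch of the kind of function-plus-test exercise a Software Carpentry-style crash course builds up to; the function name and the use of pytest are illustrative rather than exactly what was taught on the day.

    # conversions.py -- a tiny, well-defined function of the sort used to motivate testing
    def fahr_to_celsius(temp_f):
        """Convert a temperature from Fahrenheit to Celsius."""
        return (temp_f - 32) * 5.0 / 9.0

    # test_conversions.py -- run with `pytest`; each test checks one known value
    from conversions import fahr_to_celsius

    def test_freezing_point():
        assert fahr_to_celsius(32) == 0

    def test_boiling_point():
        assert abs(fahr_to_celsius(212) - 100) < 1e-9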

It was my first experience of helping at a Software Carpentry session, and an interesting reminder of what Python, unit testing, and version control look like to new users (it's not always pretty). I thought I would write up a little of what went well and some lessons I learned for future sessions. I hope the suggestions will be useful to anybody running a basic programming crash course.

Preliminaries

  • Basic CLI commands must be understood: it is too easy to assume that everybody knows what cd .. means (see the short sketch after this list).
  • Understand the experience of the students: we've heard it a million times, but it's important to remember.
  • The point of each task must be understood: when students understand the problem, they're problem solving; when they don't, they're just following a set of commands.
  • Have a group chat: we used Slack, and it worked well for promoting questions and answers and for sending out links, details, files, and so on.
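
As a concrete illustration of the first point, these are the kinds of shell commands it is easy to assume everybody already knows; the directory name is just an example.

    pwd             # print the directory you are currently in
    ls              # list the files and directories in it
    cd workshop     # move into a directory called 'workshop' (illustrative name)
    cd ..           # move back up to the parent directory
    cd ~            # jump straight back to your home directory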

When students get lost

It is critical to create an environment where everybody is comfortable asking questions. The workshop format, with teaching and live coding happening at the same time, makes it difficult for students if they fall behind.

This can be helped by encouraging questions and creating an environment where the students feel that they're learning together. Students need to feel comfortable interrupting the flow of the teaching to ask questions. Software Carpentry seems to work best with a lean curriculum.

The Slack channel and a system of red flags were two ways we encouraged questions and spotted lost students, but any method that serves the same purpose would be beneficial.

Some suggestions

Some of these problems are difficult to surmount, but I think the following solutions could (and should!) be explored in future sessions:

  • Sometimes students end up accidentally exiting Python, bash, or their text editor, and they get lost. A quick 'get-back-on-track' guide would be indispensable for leading students back onto the path of the teaching (a sketch of what such a guide might cover follows this list).
  • I did have some doubts about the format; it can be difficult if students fall behind. One solution would be to cater for the lowest skill level in the class and provide extra problems for those who are getting ahead.
  • Encouraging questions and helping students to feel that they're learning collectively is critical. Any novel tools that improve this would be welcome.
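
As an example of the first suggestion, a get-back-on-track guide might cover little more than the standard escape routes; the exact contents would depend on the tools taught.

    exit() or Ctrl-D    # leave the Python interpreter and return to the shell
    exit or Ctrl-D      # leave a shell you started by accident
    Ctrl-X              # leave the nano editor (it asks whether to save)
    Esc then :q!        # leave vim without saving changes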

Overall, the event was a great success, and I look forward to the biannual Software Carpentry events kicking off at the University of Southampton this summer.

