Several months ago, I volunteered to sit on an exam sub-committee with the Association of Dynamics Professionals. This simply means I volunteered to help write and review exam questions for Dynamics GP exams. Having passed both the exams that exist today (Financials and Install/Config), I felt it was a good chance to help improve the exams and write for future exams too. For those who are unfamiliar with the association, here is a blurb from their website on what they are all about:
The Association of Dynamics Professionals (DynamicsPro) is an independent, not-for-profit membership organization devoted to the global Microsoft Dynamics community of partners, customers and Microsoft.
Our mission is to establish and maintain professional competency standards and assessments for the betterment of the entire community.
Our vision is that every Dynamics professional recognizes and adheres to the DynamicsPro certification and quality standards for the implementation, support and use of Microsoft Dynamics products.
So, back to the exams. There are several exam-writing sub-committees for both NAV and GP, each led by a member of the overall exam committee, so that we're broken into smaller groups. This makes it much easier to schedule time to meet (virtually, in my group's case) than trying to coordinate one large group.
Well, we have a lot of fun in the group I'm working with and I must say, until recently, the majority of the "fun" has been a constant back-and-forth jabbing at language differences between Canadians and Americans. The group I am working with had, at one point, 6 people: 2 Americans and 4 Canadians, as it happens. I say "until recently" only because our group dynamics have changed, with some members moving to other groups and some unable to maintain the commitment. We still have a mix, but with only 1 American at the moment, the rest of us Canucks have let her off the hook for a while. :)
It may not sound thrilling but we have to make fun where we can, right? We are writing technical exam questions after all!
Here are just some of the goings-on that occur in one of our typical meetings, which are held via a WebEx session so we can read and review the various questions.
- Canadian vs. US English in some cases, and Canadian vs. US terminology in others, start minor squabbles, in jest of course. Rod, designated to herd our particular group of cats, is, I'm sure, tired of correcting all of the terminology and language differences we present to him as questions. Check vs. Cheque. State vs. Province. Zip Code vs. Postal Code. The Canadian in me cannot for the life of me spell "check" when I mean to say "cheque". Sometimes, I think we secretly go out of our way to load up our questions with topics where there are language or terminology differences, just to be a pain in the butt.
- Conversations of any kind around Cheques/Checks then morph into making fun of the Canadian spelling by pretending it's pronounced differently. "Chuh-kay-book" is my favourite. Thanks, Amber! One day, one of the group threatened to write questions spelling it "Czechbook" just to be unique.
Setting the country squabbles aside, and on a more serious note (since we are being productive most of the time once we stop laughing at each other), here are some of the other interesting things we run into!
- Do we use dashes or arrows to identify navigation? I swear this comes up at every meeting. We meet every two weeks and are all busy in between, and since we don't tend to write questions for every meeting, by the time we write new ones every 4 or 6 weeks, we forget things like this!
- Do we use window names, menu names or the menu navigation name? See above, for the same problem.
- Rod gets to edit the questions and is sharing his screen with us so we get to "bark" orders at him, kind of! Can we capitalize that? Can we remove the capitals from that? Can we add a question mark? Can we add a comma? Can we remove that comma? That sort of thing is pretty common.
- And then there are the questions we all agree hit on a good exam question area, but we can't get there from where we start. "Can we blow this question up?" inevitably comes up once every meeting or two. It helps a lot that none of us appear to be offended in the least when it's our own question being blown up and re-worded a different way.
Generally, writing questions isn't as easy as one might think. I don't think I ever gave an exam question a minute's thought before I joined this sub-committee. It's been a really interesting insight into the "science", as well as the art, of writing good questions. We often have the basis for a good question but can't work out a better way to word it, so someone takes it away and brings it back at a future meeting after mulling it over.
The basic structure is interesting:
- The Stem: this is the actual question. Often we spend just as much time re-working the question to get the wording, tone, and tense right as we do on the answers.
- The Key: this is the answer, or in our case, the number of the correct option (our questions are multiple choice with four options). Hopefully, this is the easiest part!
- The Distractors: these are the hard part, the wrong but plausible answers. Many, many times we ask ourselves various questions to keep the question and options "reasonable". Are any of them obvious throwaways? We follow something I believe we refer to as the ignorance test (initially coming from David Musgrave, I believe): if someone knows nothing about GP, can they logically work through the options and rule anything out too easily? If the answer is yes, we re-work the options. There's a fine line between "tricking it up" (making it stupidly hard) and developing legitimately challenging questions that an experienced GP consultant or power user would get right, but an inexperienced one wouldn't necessarily.
- The Cognitive Level: this refers to whether the question tests pure memory/recall, application of knowledge, or analysis of a situation. Ideally, there is a mix of all three throughout an exam, as we don't want the entire exam to be based strictly on memory.
It's incredible how hard it is to come up with 3 good wrong answers! I can always come up with 1, and usually 2, but I find the 3rd one to be quite the challenge at times. It's also incredible that, working with other very knowledgeable Dynamics GP consultants, we can not only write questions that we sometimes get wrong ourselves, but we can typically work through to get the wording right when we put our heads together!
It's been a very interesting endeavour and I've enjoyed it immensely… it's great to dig into many areas of the GP functionality to write questions that are newer to GP 2015 or 2016 as well as write questions in areas that we figure are "old hat" but still need to be understood to be a good consultant!