May 17, 2019 Delivery WG Call Notes


Action Items

  • Zac and Jeff: recruit state participants into the WG
  • Zac: develop the WG scope/charge
  • Zac: develop thoughts on how to determine the effectiveness of the new eval instruments
  • Amy: comment/edit training deployment terminology worksheet
  • Amy: draft slides for training delivery ideas for the Communication WG leadership
  • Jeff: after Amy feels ready with her slides, schedule a call with the Communication WG to hear Amy's thoughts on delivery

Attendees

Jeff, Amy, Zac

Agenda

  • Review April action items
  • Orient new WG members to tasks and priorities
  • Review charge and ask for this WG
  • Review Jeff's terminology memo
  • Discuss how to identify the best deployment options for a course

Call Notes

'April Action Items'

  • Jeff: research terminology; status: completed, see white paper
  • Jeff: reach out to MJO training coordinators about upcoming trainings to test the new evaluation instruments; status: communicated with the other MJOs; the new instruments are being deployed by LADCO, MARAMA, and SESARM
  • Zac: draft WG charge; status: pending
  • Zac: draft WG ask to get people on board for the April 17 call; status: a larger ask for the JTC/WGs was developed and sent to the MJOs

'Introductions'

  • Amy described her background in education and training
  • Jeff described the history of the national training program; EPA effectively left the program in 2004, now returning under Peter Tsirigotis' leadership
  • Zac described the WG, priorities, and focus on the delivery options in the near term
  • Need to establish communication channels with other JTC WGs; looking to the steering committee for guidance

'Evaluation Instrument Discussion'

  • How do we know if this new eval is working better? Zac suggests that if we receive more meaningful narrative responses than with the previous eval, it's a success; the new evals were designed to elicit more frequent and meaningful narratives about the student experience; are there other metrics we need to consider?
  • Need to convey to the LMS workgroup the need for a better content management system (CMS) in the LMS to allow for conditional responses (i.e., if you answer 1 or 2, you need to substantiate with a narrative); the LMS should automatically email a link to the survey after a course is completed (see the sketch after this list)
  • Is the course evaluation anonymous? Respondents may be bolder when anonymous because there is less accountability; could the survey be anonymous to get more meaningful answers? Currently APTI-Learn is not anonymous; we would need to convey that the person's identity would not be revealed with their response, but that it is not an anonymous survey (the respondent must be known in the database to connect to certificate generation); the new LMS should have an option to toggle on/off whether respondent identities are provided in the evaluation results summary
  • The nature of making the eval a requirement in the process is that some people will rush through the instrument just to get their certificate
  • Evals are currently connected only to the classroom courses; SI (self-instructional) courses need an evaluation too; can we create a process to use an evaluation for and automate the SI courses (take the course -> get link to eval -> certificate), similar to the classroom process?
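
A minimal sketch of the conditional-response rule and the automated course -> eval -> certificate flow discussed above. The function, field, and URL names here are hypothetical placeholders, not the actual APTI-Learn or LMS API:

    # Illustrative sketch only; names are hypothetical, not the APTI-Learn / LMS API.

    def validate_eval_response(rating, narrative):
        """Conditional-response rule: ratings of 1 or 2 must be substantiated
        with a narrative comment."""
        errors = []
        if rating in (1, 2) and not narrative.strip():
            errors.append("A rating of 1 or 2 requires a narrative explanation.")
        return errors

    def send_email(to, body):
        """Placeholder stand-in for the LMS email service."""
        print("EMAIL to %s: %s" % (to, body))

    def issue_certificate(to, course_id):
        """Placeholder stand-in for certificate generation."""
        print("CERTIFICATE issued to %s for %s" % (to, course_id))

    def on_course_completed(student_email, course_id):
        """Desired automation for classroom and SI courses alike:
        take the course -> get link to eval -> certificate."""
        eval_link = "https://lms.example.org/eval/%s" % course_id  # hypothetical URL
        send_email(student_email, "Please complete the course evaluation: " + eval_link)

    def on_eval_submitted(student_email, course_id):
        """Certificate is generated only after the eval is submitted, which is
        why the respondent must be known in the database (not fully anonymous)."""
        issue_certificate(student_email, course_id)

    if __name__ == "__main__":
        print(validate_eval_response(2, ""))  # rule triggers: narrative required
        on_course_completed("student@example.gov", "EXAMPLE-101")
        on_eval_submitted("student@example.gov", "EXAMPLE-101")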

'Orient new WG members to tasks and priorities'

  • In the near term the focus is on agenda item 5: identifying the best deployment options for a course

'Review Jeff's terminology memo'

  • Amy will mark-up the latest version of the memo

'Best Deployment Option Discussion'

  • Amy described her process for identifying delivery options for a course
  • Choose the right tool for the job: do you want people to just have knowledge, or do you need practical/participatory experience? The #1 training option will always be face-to-face because it provides opportunities for the full range of communication options (audio, visual, hands-on, etc.), but it may not always be feasible/practical
  • When determining whether a course should be live, webcast, or e-learning, decide what will create proficiency; if proficiency is needed, then interaction is needed; simply passing along knowledge can be more passive; the higher up the proficiency scale you need to go, the more interactive the training needs to be (practice applying skills builds greater proficiency)
  • The process for finding the right option depends on the factors in place: consider the geography of the students and their travel needs, and balance flexibility of access with building proficiency (summarized in the sketch after this list)
  • E-learning is great for foundational knowledge, understanding a concept, and criteria for understanding when a process/task is complete (not a lot of judgment)
  • Should not have to reinvent the wheel when creating e-learning, but it's not just putting slides online; to properly transfer the format, cater to the objectives of the course; e.g., if the classroom version was full of labs, it's hard to transfer to e-learning
  • As the Curriculum WG is already working on the CEMs courses, there is a pressing need to convey information to them on how to best choose deployment options
  • Jeff asked if Amy could communicate to the Curriculum WG the same thoughts that she just shared with us about course and deployment selection
  • Zac suggests that we get a best (and worst) practices summary out in front of the CEMs WG as a way to build some knowledge, but also to get them to slow their schedule until we pull together more comprehensive guidance
  • Zac asked Amy to put together slides that summarize her thoughts on learning delivery and best/worst practices for discussion (not recorded) with the Curriculum WG leadership; we would then evolve this into a webinar for the larger Curriculum WG membership that we could record
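
A rough illustration of the selection reasoning above, mapping the factors discussed (knowledge vs. proficiency, hands-on needs, student geography) to a candidate delivery option. The categories and decision order are simplifications for discussion, not formal WG guidance:

    # Simplified sketch of the delivery-option reasoning; categories and decision
    # order are illustrative only, not formal guidance from the WG.

    def suggest_delivery_option(needs_proficiency, needs_hands_on, students_dispersed):
        """Map the discussion factors to a candidate delivery option."""
        if needs_hands_on:
            # Face-to-face offers the full range of communication options:
            # audio, visual, hands-on.
            return "face-to-face classroom"
        if needs_proficiency:
            # Proficiency requires interaction; a live/webcast format can still
            # provide practice when students are geographically dispersed.
            if students_dispersed:
                return "live webcast with interactive exercises"
            return "face-to-face classroom"
        # Passing along foundational knowledge can be more passive.
        return "self-paced e-learning"

    if __name__ == "__main__":
        print(suggest_delivery_option(needs_proficiency=True,
                                      needs_hands_on=False,
                                      students_dispersed=True))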