
Megan's 15/5s

A 15/5 is a short progress report. Taking about fifteen minutes to write and five minutes to read, it keeps everyone up to date on the weekly goings-on of the project. These reports will be written both for team progress and for individual progress. Check out the team members' pages for their individual 15/5 reports.

Third 15/5

February 9, 2016

 

This week we continued working on our group projects by altering the Needs Assessment and tackling the Task Analysis document. The Needs Assessment looked good overall, but it was important to revisit the document and amend some of the goals and problems so that the inadequacy of training is projected not onto the learners but onto the governing bodies that should be providing adequate training for the regional commissions in our project. When constructing these documents, it's important to consider the wording of who is inadequately providing training and what the learners will need to do in order to achieve the goals and solve the problem.

I added the testing strategy portion to the Task Analysis document this week, and again, it was interesting to consider how essential the wording in the document is. One small change I suggested was to alter the objective verb "discuss". Unless a module like the one we're creating is used in a course setting with an active professor or manager, it's difficult to use the verb "discuss", as that would typically call for more open-ended answers. We chose instead to opt for verbs like "identify", "recall", and "recognize", since they fit better with the multiple choice, drag-and-drop, and pick-many freeform question types available through Articulate. While considering the depth of knowledge for each objective is important, sometimes the indicators of a goal should focus more on straightforward responses that can be determined as right or wrong. By doing this, the feedback in the module can immediately provide guidance, and the user can continue without question as to his or her accuracy or understanding of the content.


First 15/5
January 26, 2016
 
After already hitting the ground running with my groupmates for the Carl Vinson Institute project, I am looking forward to working with a group of such ambitious and knowledgeable classmates again this semester.  We have a great rapport as a group, and I am excited to see how everyone contributes to making this project a success!  As lead evaluator, I have a unique opportunity to explore more of the evaluation and assessment portion of the project.  Since I have somewhat limited experience creating effective evaluation tools in an instructional design setting, I see this as a chance to expand my knowledge and skill level in evaluating and assessing.  I also think this will be a great opportunity to integrate the content of my other IDD course this semester, which centers on the evaluation process.  I already have some ideas flowing as to what types of tools and strategies I may want to employ as we continue our progress on the project.
 
Second 15/5

February 2, 2016

 

This week we continued our work on the CVI project, and I had the opportunity to take a look at the Needs Assessment document created by Tracy and Nora.  The document did a great job outlining the direction our project will take.  Our prime objective as a group this semester is to create an e-learning module that provides an overview of regional commission formation and legislative mandates, solving the problem of regional commission council members not being familiar with the laws that govern their decisions.

 

We are also now drafting our Task Analysis document.  In addition to meeting Wednesday after class to further discuss and edit this document, we are each pitching in to complete the remaining sections, tailoring the document to our individual plans and the expectations of our roles within the project.  I will be focusing on the testing strategy section.  I have yet to articulate this fully, but I would like to use a variety of questioning techniques within Articulate to check for understanding.  To meet the objectives we outline, I am currently planning on testing users with multiple choice questions, drag-and-drop tasks, and hotspot assessment functions.


Fourth 15/5

February 16, 2016

 

This week we focused on the learning theory and design behind the CVI module.  While the discussion and background behind the decisions were complex, it really boiled down to which route would best enable our learners to achieve the instructional goal.  Although I think we all had to rack our brains analyzing paradigms, theories, and approaches, it turned into a great opportunity to review key principles of the major paradigms and learning theories.  While Behaviorism and Cognitivism were the two paradigms that best fit our project, we had to narrow down our expectations and consider which of these lines of practice would best fit the instructional goals of the project.  Although Tracy confirmed that a behavioral approach could be viable, since certifications for successful training completion are issued and valued among Regional Commission Council members, I agree with the team that Cognitivist theories lend themselves better to our goals for this project.

 

The next task was establishing which theory fit best, and as I outlined this week in the treatment rationale, some Cognitivist theories threw a wrench in our plans by heavily emphasizing background knowledge and previous skills.  As Tracy analyzed in the User Profile, our learners have varied levels of skill and knowledge regarding the formation of the councils and local and state government legislation in general.  This, however, is not a detriment to our project; it can instead be seen as an opportunity to create a level playing field.  With that in mind, the Cognitive Theory of Multimedia Learning seems to fit this project very well: it acknowledges that knowledge acquisition results from connections with prior knowledge, and it gives us the flexibility to convey informational content through audio, text, and graphic images to enhance the learning experience.

 


Fifth 15/5

February 23, 2016

 

This week I focused mainly on the Evaluation Plan for our project.  I found that while my Evaluation Plan from a previous class was helpful in constructing the document for this project, the 6210 course requirements shaped this into a much more complex evaluation plan.  As in that earlier plan, I chose to separate the foci of the alpha and beta tests.  The alpha test will focus mainly on usability and design.  Evaluators will take this opportunity to comment on the functionality, navigability, and overall appeal of the module.

 

However, rather than relying solely on surveys and feedback, I've also added profile questionnaires to the surveys for both alpha and beta tests to gain more insight into our evaluators and their areas of expertise.  The beta test will also feature a survey, but these evaluators will focus on instructional effectiveness.  They will respond to statements regarding content and instructional design.  Another feature of this evaluation plan not included in my previous project is a pre-test and post-test to assess learning and improvement in the users' understanding of key concepts related to our instructional objectives.  I look forward to seeing how effectively our module conveys content and builds upon the users' knowledge. 

 


Seventh 15/5

March 8, 2016

 

This week we had the opportunity to look at the storyboards and get a feel for the module as we begin production.  As the lead evaluator, I have really enjoyed focusing my efforts on writing the evaluation plan and using my research and instrument development to help with the evaluation project in EDIT 7350 as well.  I feel I have a much better grasp on the purposes and procedures of alpha and beta testing for evaluating usability and effectiveness.  I understand much better who should be involved in the evaluation process, and I especially look forward to the results of the expert review.  I have not used an expert review in the past, but I did work closely with SMEs who offered suggestions regarding content and clarity.  Since this content is so far removed from any field of my expertise, I think this is the perfect opportunity to have an expert in Georgia's state and local government check out the module so we can gain feedback on the content's accuracy and the module's overall coherence and instructional quality.

 

While the evaluation process is on pause until the module is complete, I am now looking forward to creating the assessment activities to include in the module.  I feel the module's design so far is building the foundation for a great tool that clearly conveys the content and engages users through interactivity.  I look forward to continuing those themes in my plans for the assessment activities.

 


Sixth 15/5

March 1, 2016


I spent a number of hours this weekend diving into the evaluation documents for both the 6210 and 7350 courses.  I feel I made great progress on the 6210 Evaluation Plan, and everyone has been offering very constructive feedback on a few edits that need to be made.  After discussing evaluation opportunities with my 7350 group, I felt adding an expert review to our 6210 group project would be a smart solution for gauging instructional quality.  Our alpha test consists of an evaluator profile and a survey regarding the module's usability and design.  Then, in addition to the alpha test, we will also have an expert review the module and complete a questionnaire to provide feedback on the accuracy of the content and the level of effectiveness.

 

Subsequently, our beta test will begin with a pre-test to gauge our users' baseline knowledge of the formation of Georgia's Regional Commission Councils.  The users will then complete the module and afterward respond to a participant profile questionnaire, a post-test, and a survey regarding instructional effectiveness.  I emphasized the importance of gauging whether the module adequately facilitated users in achieving the instructional objectives.  I look forward to seeing the feedback we gain from the evaluation stage and what modifications we can make in response.
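Since the pre-test/post-test pairing is central to how we will judge learning, here is a minimal sketch in Python - with entirely hypothetical scores, not our actual data - of how the gains might be summarized once the beta test results are in:

# Hypothetical paired scores (percent correct) for four beta testers.
pre_scores  = [40, 55, 30, 60]
post_scores = [75, 85, 70, 90]

# Raw gain: how many points each user improved from pre-test to post-test.
gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
print(f"Average raw gain: {sum(gains) / len(gains):.1f} points")

# Normalized gain: each user's improvement as a share of the improvement possible.
norm_gains = [(post - pre) / (100 - pre)
              for pre, post in zip(pre_scores, post_scores)]
print(f"Average normalized gain: {sum(norm_gains) / len(norm_gains):.2f}")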


Eighth 15/5

March 15, 2016

 

Spring Break - no project work to report.

 


Ninth 15/5

March 22, 2016

 

Wow!  I have been overwhelmed the past two weeks by EDIT 6210 and 7350 work, but the progress our groups are making on our current projects is looking great!  I have really enjoyed diving deeper into the module and creating its assessment prompts.  Since much of our content is informational, I felt the assessments were a great opportunity to prioritize interactivity and application of the content rather than just recall.  After editing a few of the assessment questions and adding a couple more scenario-style prompts yesterday, I think the assessment portions are really starting to come together.  Lana's work on the module has produced a very professional and promising result so far, and I look forward to seeing the progress once she has added the assessment slides, too.  In relation to the module as a whole, I tried to create assessment questions that seemed vital to the learning objectives and to the role of a commission member.  I also used a familiar tone in the text for the feedback boxes to connect more fluidly with the design of the audio in the module.  I think this module is really coming along nicely, and I look forward to starting the alpha testing phase once it is finished so we can gain feedback on our work.

 


Tenth 15/5

March 29, 2016

 

This week we have focused mainly on development.  Lana is working hard to create all the slides, and I have revised some of my assessments to better reflect Dr. Clinton's suggestion of scenario-based assessment.  By incorporating assessment questions at higher complexity levels and integrating first-person decision-making perspectives, the assessment questions are now more varied and will hopefully be more meaningful to the users.  In addition to strengthening the assessment questions, we have also had an ongoing conversation about the content as a whole.  Which content is essential for users to know, and which was perhaps given too much weight during our task analysis stage?  While most of the content and the module are working beautifully, it is important to reconsider content at times - whether to adjust or affirm it - to make sure the product meets the overall instructional goal and the expectations of the client.  Calling this a development stage may be a misnomer; in fact, we constantly apply informal evaluation to check for functionality flaws, accuracy of content, and instructional quality.  I think this is a solid strategy, though, for delivering a more finely tuned product to the alpha and beta testers.  Hopefully, with so much work and thought put in at this end of development, we will not have to spend as much time and effort making adjustments during the formal evaluation stage.

 


Eleventh 15/5

April 5, 2016

 

This week I have been focusing on two major pieces of our EDIT 6210 curriculum - the comprehensive oral exam and the development stage of our project.  First, I actually enjoyed reviewing the major themes we have learned throughout the IDD program for the exam this week.  At first it was a bit daunting to piece together a study document, but once I started, I found that much of my knowledge of the assigned topics had translated into practice and deeper understanding through the projects we have done over the past two years.  By doing these projects, we really do gain a better sense of theories and paradigms by putting these ideas into practice in the instructional design process of our own projects.

 

Second, the group continues to delve into the development stage: Lana worked hard to revise content slides this week, and Nora continued her work on the audio narration for our project.  With the content revisions, I also added an assessment question for the two new content slides to make sure that information was assessed.  Since the content on these two slides was largely informational, I opted for a simple recall question rather than another, more complex, scenario-based question.  While it may be lower on the depth-of-knowledge spectrum, I think these recall questions are also important for serving as a summary or a way of checking a user's understanding of the main idea of an informational text.  As always, it's important to have users apply and synthesize decisions using content knowledge - but sometimes I think it serves well to recap information with a simple, straightforward assessment question, too.

 


Twelfth 15/5

April 11, 2016

 

This week has been our alpha testing week.  As lead evaluator, I am looking forward to thoroughly reviewing the testers' responses this weekend.  I plan to compile an informal list of concerns immediately to send out to the group.  This will make everyone aware of major issues and enable the group to plan solutions to these problems.  From there, I will analyze the alpha testers' responses and create graphs to display the data for our evaluation report.  While alpha testing focused on usability and functionality, I am looking forward even more to analyzing the beta test responses over the next week.  Beta testing focused on instructional quality and content accuracy.  I think the pre-test and post-test data will provide the best insight into how useful and helpful the module really is.
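As a rough sketch of what those graphs might involve - using made-up ratings rather than our actual alpha test data - the 1-5 survey responses could be averaged and charted in Python with matplotlib:

import matplotlib.pyplot as plt

# Hypothetical ratings from three alpha testers
# (1 = Strongly Disagree, 5 = Strongly Agree).
ratings = {
    "Navigation is intuitive": [4, 5, 3],
    "Buttons work as expected": [5, 4, 4],
    "Visual design is appealing": [3, 4, 5],
}

statements = list(ratings)
averages = [sum(scores) / len(scores) for scores in ratings.values()]

# One horizontal bar per survey statement, showing the average rating.
plt.barh(statements, averages)
plt.xlim(1, 5)
plt.xlabel("Average rating (1-5)")
plt.title("Alpha test usability ratings")
plt.tight_layout()
plt.show()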

 


Thirteenth 15/5

April 18, 2016

 

This week I have focused on sorting through the alpha test results and ranking the concerns according to the significance of their impact on the usability and effectiveness of the module.  The alpha test consisted of an evaluator profile questionnaire and a usability survey featuring statements rated on a 1-5 scale (Strongly Disagree to Strongly Agree) and open-ended short-response questions.  The goal of this test was to have instructional design professionals examine the module in its current state to check for usability, navigability, and functionality issues.

We received some very thorough and insightful feedback from our three alpha testers, and I had the chance to organize these concerns in a chart to kickstart our modification process.  The chart consists of two columns - one shares the modification suggestion based on an evaluator's concern, and the other shares how we addressed that concern in the module.  Since development is a continuous process, some concerns had already been addressed over the past week.  Others, however, will now serve as a helpful to-do list as the development stage closes with last-minute fixes and modifications.

In this chart, I kept in mind the heuristics system I worked with as an evaluator for the EDIT 7350 project as I categorized and prioritized the modification suggestions.  Modifications with a major impact on usability or instructional effectiveness were placed in the primary section; these must be addressed first.  Modifications that should be considered but do not have a major impact on usability or instructional effectiveness were placed in the secondary section; these changes may be made depending on group decision and time limitations.  Lastly, modifications that were largely subjective in nature and would not affect usability or instructional effectiveness were placed in the tertiary section; these may be considered as alternatives to the current design but will not be executed unless the group deems them an improvement.
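To make the chart's logic concrete, here is a minimal sketch in Python - with invented entries standing in for the real alpha test feedback - of how suggestions and their resolutions can be grouped by priority tier:

from collections import defaultdict

# Each entry pairs a modification suggestion with how it was addressed.
# All three entries are illustrative, not actual evaluator feedback.
suggestions = [
    ("primary", "Next button unresponsive on one slide", "Fixed the slide trigger"),
    ("secondary", "Audio volume varies between slides", "To be normalized before beta"),
    ("tertiary", "Prefers a darker title color", "Deferred pending group decision"),
]

# Group the (concern, resolution) pairs by tier.
chart = defaultdict(list)
for tier, concern, resolution in suggestions:
    chart[tier].append((concern, resolution))

# Print the two-column chart, primary concerns first.
for tier in ("primary", "secondary", "tertiary"):
    print(tier.upper())
    for concern, resolution in chart[tier]:
        print(f"  {concern:40} | {resolution}")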


Fourteenth 15/5

April 25, 2016

 

This week I've been piecing together our Evaluation Report - and it has been quite a task!  The evaluation began with alpha testing, where instructional design experts responded to a survey regarding the module's usability.  They commented on aspects including navigability, functionality, and design elements.  We received the most helpful and cohesive feedback from this testing group.  I organized the modification suggestions based on feedback and categorized the changes as primary (most needed), secondary (should be considered), and tertiary (optional).  This was a long and detailed list, but I think this part of the evaluation helped us the most in assessing and resolving critical flaws within the module that were affecting usability and instructional effectiveness.

 

From there, we implemented beta testing, where a group participated in pre-testing, module completion, post-testing, and a quality survey to gauge effectiveness and assess learning performance.  While this was an essential piece of the evaluation puzzle, it is by nature less useful in an evaluation with such a short time span.  Beta testing focused on effectiveness - judging qualities like content accuracy and relevance - which brought forth feedback that would not be feasible to act on in the time allotted for this project.  That said, if the evaluation were conducted on a longer-term scale, this could be an important element to consider in the development and modification of the module.

 

