
SYSTEMS APPROACH TO TRAINING

(SAT) MANUAL

JUNE 2004
SUGGESTION FORM

From:

Subj: RECOMMENDATION FOR IMPROVEMENT TO THE SAT MANUAL

1. In accordance with the Foreword to the Systems Approach to
Training (SAT) Manual, which encourages commands to provide
suggestions for improving the publication, the following
unclassified recommendation is submitted:

______ __________ _________ _______________


Page Para. No. Line No. Figure/Appendix

Nature of Change: ___Add ___Delete ___Change ___Correct

2. Proposed New Text: (Verbatim, double-spaced; continue on
additional pages, if necessary.)

3. Justification/Rationale/Impact of Proposed Change: (Include
source; may be single spaced.)
PREFACE

The Systems Approach to Training (SAT) Manual was developed
to support Marine Corps training/education policy and Department
of Defense (DoD) military training program requirements. This
Manual serves as a primary source of information and guidance,
mainly for use by the formal school/training centers'
instructional staff, for instructional program development and
management.

The SAT Manual is divided into five chapters, each chapter
corresponding to a phase within the SAT model: Analyze, Design,
Develop, Implement, and Evaluate. In many of the sections
within each chapter, topic material is presented first, followed
by procedural steps for performing a task or function.
Throughout the Manual, hypothetical examples are provided to
illustrate a concept, topic, or procedure.

While the information contained in the SAT Manual is based
on and derived from accepted adult learning theories and current
instructional development practices, the Manual is designed as
an introduction to these topics. Additional research in
education-related fields is recommended for those personnel who
participate in the development or management of instruction.

EXECUTIVE SUMMARY

Overview. The mission of any instructional system is to
determine instructional needs and priorities, develop effective
and efficient solutions to meet these needs, implement these
solutions in a competent manner, and assess the degree to which
the outcomes of the system meet the specified needs. To achieve
this in the most effective way possible, a systems approach to
the process and procedures of instruction was developed. The
resulting model, entitled Instructional Systems Design (ISD),
was later adopted by the Marine Corps as the Systems Approach to
Training (SAT). The model, whether it is referred to as ISD or
SAT, is a recognized standard governing the instructional
process in the private sector and within the Department of
Defense (DoD).

Goal of Instruction

The goal of Marine Corps instruction is to develop
performance-based, criterion-referenced instruction that
promotes student transfer of learning from the instructional
setting to the job. For a learning outcome to be achieved,
instruction must be effective and efficient. Instruction is
effective when it teaches learning objectives based on job
performance requirements and efficient when it makes the best
use of resources.
SAT is a comprehensive process that identifies what is
performed on the job, what should be instructed, and how this
instruction should be developed and conducted. This systematic
approach ensures that what is being instructed are those tasks
that are most critical to successful job performance. It also
ensures that the instructional approach chosen is the most time
and cost efficient. The SAT process further identifies
standards of performance and learning objectives. This ensures
that students are evaluated on their ability to meet these
objectives and that instructional courses are evaluated based on
whether or not they allow student mastery of these objectives.
Finally, the SAT identifies needed revisions to instruction and
allows these revisions to be made to improve instructional
program effectiveness and efficiency.

Intent of SAT

The SAT was created to manage the instructional process for
analyzing, designing, developing, implementing, and evaluating
instruction. The SAT serves as a blueprint for organizing or
structuring the instructional process. The SAT is a set of
comprehensive guidelines, tools, and techniques needed to close
the gap between current and desired job performance through
instructional intervention.

The Marine Corps originally targeted the SAT for use in
formal schools, but the comprehensive system applies to
unit/field training as well as to education. The SAT is a
flexible, outcome-oriented system based on the requirements
defined by education and training. Whether referred to as
education or training, the instructional process is the same; it
is the outcomes that are different. Therefore, in keeping with
the intention of the SAT model, throughout this SAT Manual, the
term instruction will be used to discuss both training and
education.
Benefits of SAT

The Systems Approach to Training is a dynamic, flexible
system for developing and implementing effective and efficient
instruction to meet current and projected needs. The SAT
process is flexible in that it accounts for individual
differences in ability, rate of learning, motivation, and
achievement to capitalize on the opportunity for increasing the
effectiveness and efficiency of instruction. The SAT process
reduces the number of school management decisions that have to
be made subjectively and, instead, allows decisions to be made
based on reasonable conclusions drawn from carefully collected
and analyzed data. More than one solution to an instructional
problem may be identified through the SAT; however, the
selection of the best solution is a goal of SAT.
The SAT is a continuous, cyclical process allowing any one
of the five phases, and their associated functions, to occur at
any time. In addition, each phase within SAT further builds
upon the previous phase, providing a system of checks and
balances to ensure all instructional data are accounted for and
that revisions to instructional materials are identified and
made.

It is not the intent of the SAT process to create an
excessive amount of paperwork, forms, and reporting requirements
that must be generated by each formal school/training center
conducting instruction. This would serve only to detract from
the instructional program. The SAT process does not provide a
specific procedure for every instructional situation that can be
encountered. Instead, it presents a generalized approach that
can be adapted to any instructional situation.
SAT Phases. The SAT model simplifies and standardizes the
instructional process into manageable subsets. The SAT process
is made up of five distinct phases, each serving a specific
purpose. The five phases are Analyze, Design, Develop,
Implement, and Evaluate. Each of these phases involves inputs,
a process, and outputs. The successive phases of the SAT build
upon the outcomes of the previous phase(s).
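The input-process-outcome structure of the five phases, and the way each phase's products feed the next, can be sketched as a small data model. This is purely an illustrative sketch: the phase names and products come from the summary tables in this section, while the code structure and the wrap-around next_phase helper are assumptions reflecting the continuous, cyclical nature of SAT described in this summary.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Phase:
    """One SAT phase: inputs feed a process that yields outcomes."""
    name: str
    inputs: List[str]
    outcomes: List[str]

# The five phases named in the Manual; the inputs and outcomes are the
# entries from the phase summary tables, chained so that each phase
# builds upon the products of the previous phase(s).
SAT_PHASES = [
    Phase("Analyze", ["Job task data"],
          ["Task List", "Individual Training Standards (ITS)",
           "Instructional Setting"]),
    Phase("Design", ["ITS"],
          ["Target Population Description (TPD)", "Learning Objectives",
           "Test Items", "Delivery System", "Sequenced TLOs"]),
    Phase("Develop", ["Learning Objectives", "TPD", "Delivery System",
                      "Test Items"],
          ["Course Schedule", "Master Lesson Files (MLF)", "Media",
           "Revised instructional materials", "CDD/POI"]),
    Phase("Implement", ["Instructional Materials"],
          ["Delivery of instruction", "Course data"]),
    Phase("Evaluate", ["Course data"],
          ["Revisions to instruction", "Course Content Review Board (CCRB)"]),
]

def next_phase(current: str) -> str:
    """Return the phase that consumes the current phase's outputs.
    The cycle wraps around, since SAT is a continuous, cyclical
    process: Evaluate feeds revisions back into Analyze."""
    names = [p.name for p in SAT_PHASES]
    return names[(names.index(current) + 1) % len(names)]
```

The wrap-around in next_phase captures the point made above: any phase can occur at any time, and evaluation drives revision of earlier work rather than ending the process.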

1. Analyze. During the Analyze Phase of SAT, a particular job
or Occupational Field/Military Occupational Specialty
(OccFld/MOS) is analyzed by Marine Corps Combat Development
Command (MCCDC, C 461) to determine what job holders perform on
the job, the order in which they perform it, and the standard of
performance necessary to adequately perform the job. The
result, or outcome, of the Analyze Phase is Individual Training
Standards (ITS) selected for instruction. ITSs are behavior
statements that define job performance in the Marine Corps and
serve as the basis for all Marine Corps instruction. The
elements of the Analyze Phase are:
Job Analysis. Job or occupational analysis is performed to
determine what the job holder must know or do on the job. Job
analysis results in a verified list of all duties and tasks
performed on the job.
Task Analysis. Task analysis (sometimes called Training
Analysis) is performed to determine the job performance
requirements of each task performed on the job. Job
performance requirements include a task statement, conditions,
standard, performance steps, administrative instructions, and
references. Job performance requirements in the Marine Corps
are defined by Individual Training Standards (ITSs). ITSs
define the measures of performance that are to be used in
diagnosing individual performance and evaluating instruction.

Selection of Tasks for Instruction. Current instructional needs
are determined by selecting tasks for instruction. Tasks are
selected based on data collected concerning several criteria
relating to each task. A by-product of this process is the
determination of the organization responsible for conducting the
instruction and the instructional setting assigned to each task.

Input          Process        Outcome

Job task data  Job analysis   Task List
               Task analysis  Individual Training Standards (ITS)
                              Instructional Setting
2. Design. During the Design Phase of SAT, formal
school/training center instructional developers equate task
performance under job conditions (ITSs) to task performance
within the instructional setting (learning objectives). The
goal of this phase is to simulate as closely as possible the
real-world job conditions within the instructional environment.
The closer the instructional conditions are to those required in
the work setting, the more likely it is that the student will
transfer the learning to the job. The Design Phase is made up
of five separate sections, each of which has a specific purpose:

Write a Target Population Description (TPD). The TPD defines
the student population entering a course.

Conduct Learning Analysis. The learning analysis is conducted
to develop the learning objectives. The learning analysis
describes what the students will do during instruction.

Write Test Items. Test items are derived from the learning
objectives and are used to determine if the students have
mastered the learning objectives.

Select Delivery System. The delivery system is the primary
means by which the instruction is presented to the students.

Sequence Learning Objectives. Learning objectives are sequenced
to allow students to make logical transitions from one subject
to the next. Sequenced learning objectives provide efficient
instruction and serve as a draft course structure.

Input  Process                    Outcome

ITS    Define student population  Target Population Description (TPD)
       Conduct learning analysis  Learning Objectives
       Define evaluation          Test Items
       Select media and method    Delivery System
       Organize instruction       Sequenced Terminal Learning
                                  Objectives (TLO)

3. Develop. The Develop Phase of SAT builds on the outcomes of
the Analyze and Design Phases. The Analyze Phase identifies
those tasks to be instructed and the desired standard to which
those tasks must be performed. The Design Phase outlines how to
reach the instructional goals determined in the Analyze Phase by
converting job tasks to tasks taught in the instructional
environment, and further builds the foundation for instruction.
During the Develop Phase, instructional developers from the
formal school/training center modify the instructional program
to fit the requirements identified in the Analyze and Design
Phases. The elements of the Develop Phase are:

Develop Course Schedule. The course schedule provides a
detailed structure for the course to include lesson times,
titles, designators, locations, and references to be used.

Develop Instruction. This section details the process for
developing the lesson plans and supporting course materials that
instructors will present during the Implement Phase. Maximizing
the transfer of learning is the goal of developing instruction.

Develop Media. This section takes the media selected during the
Design Phase and develops them into their final form for
presentation to the students. The purpose of media is to
enhance the instruction and the transfer of learning by
presenting lesson material in a manner that appeals to many
senses, complements student comprehension level, and stimulates
student interest.
Validate Instruction. The goal of validation is to determine
the effectiveness of instructional material and to make any
necessary revisions prior to implementation.

Develop Course Descriptive Data (CDD) and Program of Instruction
(POI). The CDD provides a detailed summary of the course
including instructional resources, class length, and curriculum
breakdown. The POI provides a detailed description including
structure, delivery system, length, learning objectives, and
evaluation procedures. A formal course of instruction must have
an approved POI.

Input                Process              Outcome

Learning Objectives  Organize course      Course Schedule
TPD                  Develop instruction  Master Lesson Files (MLF)
Delivery System      Develop media        Media
Test Items           Validate             Revised instructional
                     instruction          materials
                     Develop supporting   CDD/POI
                     course materials

4. Implement. During the Implement Phase of SAT, instructors
within the formal school/training center prepare the class and
deliver the instruction. The purpose of the Implement Phase is
the effective and efficient delivery of instruction to promote
student understanding of material, to achieve student mastery of
learning objectives, and to ensure a transfer of student
knowledge from the instructional setting to the job. The
elements of the Implement Phase are:

Prepare for Instruction. Preparation involves all those
activities that instructors and support personnel must perform
to ready themselves for delivering the instruction.

Implement Instruction. Implementing instruction is the
culmination of the analysis, design, and development of
instructional materials. Although the instructional developer
designed and developed the instructional material so that it
maximizes transfer of learning, the way the instructor presents
the material will play a crucial part in determining whether
students learn and transfer that learning to the job.
Implementation is the instructor’s delivery of instruction to
the students in an effective and efficient manner.
Input                    Process                  Outcome

Instructional Materials  Prepare for instruction  Delivery of instruction
                         Implement instruction    Course data
5. Evaluate. The Evaluate Phase of SAT measures instructional
program effectiveness and efficiency. Evaluation and revision
drive the SAT model. Evaluation consists of formative and
summative evaluation and management of data. Formative
evaluation involves validating instruction before it is
implemented and revising instruction to improve the
instructional program prior to its implementation. Formative
evaluation is ongoing at all times both within and between each
phase of the SAT model. Summative evaluation is conducted after
a course of instruction has been implemented. Summative
evaluation assesses the effectiveness of student performance,
course materials, instructor performance, and/or the
instructional environment. There are three parts to evaluation:

Plan and Conduct. The purpose of planning and conducting
evaluation is to develop and implement a strategy for
determining the effectiveness and efficiency of an instructional
program.
Analyze and Interpret. After the evaluation data have been
gathered during the conduct of evaluation, the results are
analyzed and interpreted to assess instructional program
effectiveness and efficiency.

Document and Report. Evaluation data is managed and the results
of evaluation are documented and reported so that instruction is
revised, if necessary.

Input        Process                       Outcome

Course Data  Conduct Formative Evaluation  Revisions to instruction
             Conduct Summative Evaluation  Data on instructional
                                           effectiveness
             Manage Data                   Course Content Review
                                           Board (CCRB)

SYSTEMS APPROACH TO TRAINING MANUAL

TABLE OF CONTENTS

CHAPTER

1 ANALYZE

2 DESIGN

3 DEVELOP

4 IMPLEMENT

5 EVALUATE

6 ADULT LEARNING

7 ADMINISTRATION
Systems Approach To Training Manual Analyze Phase

ANALYZE PHASE

[Sidebar: the SAT model (Analyze, Design, Develop, Implement,
Evaluate) with the Analyze Phase highlighted. Analyze Phase
elements: Job Analysis, Task Analysis, Determination of
Instructional Setting.]

In Chapter 1:

1000 INTRODUCTION

1200 JOB ANALYSIS
 Job Analysis Requirements
 Task Criteria
 Duty Areas
 Initial Task List Development
 Task List Verification
 Refining the Task List
 Identifying Tasks for Instruction

1300 TASK ANALYSIS
 Purpose
 Training Standard Development
 ITS Components
 T&R Components
 ITS/T&R Staffing

1400 INSTRUCTIONAL SETTING

1500 ROLES AND RESPONSIBILITIES
 TECOM Responsibilities
 Formal School/Det Responsibilities
Chapter 1

1000. INTRODUCTION

The Analyze Phase is a crucial phase in the Systems Approach to Training
(SAT) process. During this phase, job performance data is collected,
analyzed, and reported. This analysis results in a comprehensive list of tasks
and performance requirements selected for instructional development. In
the Marine Corps, job performance requirements are defined in Individual
Training Standards (ITS) Orders and Training and Readiness (T&R) Manuals.
The Analyze Phase consists of three main processes: job analysis, task
analysis, and determining instructional setting.

This chapter has four separate sections. The first three cover the three
Analyze Phase processes and the fourth provides the administrative
responsibilities.

1. Job Analysis: “What are the job requirements?”

2. Task Analysis: “What are the tasks required to perform the job?”

3. Determine Instructional Setting: “Will the Marine receive job
training in a formal school/detachment setting or through MOJT?”

4. Requirements and Responsibilities in the Analyze Phase: “What
are the roles and responsibilities of each element in the training
establishment?”

INPUT          PROCESS                  OUTPUT

New Doctrine   Job Analysis             ITS Order
New Equipment  Task Analysis            or
Manpower Reqs  Determine Instructional  T&R Manual
OccFld Reorg   Setting

Figure 1-1


1100. PURPOSE

The purpose of the Analyze Phase is to accurately determine what the Marine
must know and do on-the-job. Job Analysis is done through a systematic
research process called the Front-End Analysis (FEA) to collect, collate, and
report job performance data. Task analysis is accomplished by convening a
Subject Matter Expert (SME) conference. This conference, attended by
representatives from the operating forces, formal school, occupational field
sponsor, and TECOM, reviews the results of the FEA and produces a draft ITS
Order to describe training standards. SMEs then determine the instructional
setting for each task and finally produce the draft Target Population
Description (TPD). The draft ITS/T&R is then staffed and, when final changes
are made, it is published in the MCO 1510 or 3500 series.

The formal schools are responsible for reviewing the signed ITS/T&R to identify
those tasks/events that they are responsible for teaching. The curriculum
developers then enter the relevant tasks into MCAIMS and begin the development
of the Program of Instruction. To accelerate the design and development phases
of the SAT process, schools can begin entering the tasks into MCAIMS from the
draft ITS/T&R that is published immediately following the SME conference.

As part of instruction, formal schools and detachments design, develop,
implement, and evaluate their curricula based on existing ITS/T&R. The
development of ITS/T&R within the Analyze Phase is unique to TECOM, and is
normally performed under the guidance of Ground Training Branch (GTB) or
Aviation Training Branch (ATB). Formal schools/training detachments within the
Marine Corps will not develop ITS/T&R independently without prior approval of
TECOM (GTB/ATB).
1. When ITS/T&R already exist for an MOS, the school developing instruction for
that MOS does not have to analyze the job. However, the formal
school/detachment is responsible for reviewing the ITS/T&R for accuracy and
completeness, and for recommending changes to TECOM (GTB/ATB).
2. If the ITS/T&R is awaiting signature following an SME conference, the school
responsible for instruction should obtain authorization from CG, TECOM to
commence course design, development, and implementation based on the
draft training standards.

The results of this phase form the basis for the entire instructional process by
clearly defining the target population, what Marines are actually performing on
the job, what they will need to learn in the formal school, and what will be
learned through managed on-the-job training (MOJT). The Analyze Phase is
concerned with generating an inventory of job tasks, selecting tasks for
instruction, developing performance requirements, and analyzing tasks to
determine instructional setting.


1200. JOB ANALYSIS

The first step in the Analyze Phase is the completion of a Job Analysis that is
conducted through the FEA process. TECOM (GTB) collects, examines, and
synthesizes data regarding each Occupational Field/Military Occupational
Specialty (OccFld/MOS). This data may include time in grade and MOS, career
progression, tasks performed on the job, instructional location, level of
instruction, etc. Job analysis is the collection and organization of data that
results in a clearly defined description of the duties, tasks, and indicative
behaviors that define that job. Job analysis involves finding out exactly what
the Marine does on the job rather than what the Marine must know to perform
the job. The product of job analysis is a verified list of all duties and tasks
performed on the job and the identification of those tasks that must be taught
in the formal school/detachment. Once the Job Analysis is complete, an FEA
Report is produced and serves as a key input to the Subject Matter Expert
(SME) conference held to define the training standards and determine
instructional setting.

Job Analysis Requirements

Job analysis begins once a requirement for training has been identified and
validated. Job analysis requirements are typically generated by:

1. The introduction of new or better weapons/support systems.
2. Organizational changes such as changes in MOS structure and career
field realignments.
3. Doctrinal changes required by new laws, Department of Defense (DoD)
requirements, and Marine Corps needs.
4. Evaluations indicating that a change in instruction is required.
5. Direction from higher headquarters.

Task Criteria

A task is a behavior performed on the job. A task is defined by specific criteria
and must:

1. Be a logical and necessary unit of work.
2. Be observable and measurable or produce an observable and
measurable result.
3. Have one action verb and one object.
4. Be a specific act done for its own sake.
5. Be independent of other actions.
6. Have a specific beginning and ending.
7. Occur over a short period of time.
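Several of these criteria are judgment calls for SMEs, but the structural ones can be screened mechanically before a review board meets. A minimal sketch in Python, assuming a simple word-level check; the function name, the compound-statement heuristic, and the list of non-observable verbs are illustrative assumptions, not doctrine:

```python
def check_task_statement(statement: str) -> list:
    """Flag structural problems in a draft task statement.

    Covers only the mechanically checkable criteria from the list
    above (single action verb and object, observable wording);
    judgment criteria such as "logical and necessary unit of work"
    still require SME review.
    """
    problems = []
    words = statement.strip().rstrip(".").split()
    if not words:
        return ["empty statement"]
    verb = words[0].lower()
    # One action verb and one object: a compound "and" usually
    # signals two tasks rolled into a single statement.
    if "and" in (w.lower() for w in words):
        problems.append("may contain more than one action/object")
    # Observable and measurable: verbs like "know" or "understand"
    # describe internal states, not observable behavior.
    if verb in {"know", "understand", "appreciate", "be", "comprehend"}:
        problems.append(f"verb '{verb}' is not observable")
    if len(words) < 2:
        problems.append("no object given for the action verb")
    return problems
```

For example, a statement beginning with "Understand" would be flagged as non-observable, while "Assemble the radio set" passes the structural screen and moves on to SME judgment.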


Duty Areas

To facilitate survey of job incumbents and correlation of survey data, closely related
tasks within a task list are grouped by duty area for the purposes of job analysis. A
duty area is an organizer of data consisting of one or more tasks performed within
one functional area. Duties are generally very broad categories. One or more duties
make up a job. A duty may be defined by:

1. a system (e.g., Small Arms Weapons, Mines and Demolitions, Communication
Equipment).
2. a function (e.g., Administrative Functions, Patrolling Functions).
3. a level of responsibility (e.g., Train Logistics Personnel, Supervise Intelligence
Personnel).
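The grouping itself is straightforward to model. A sketch, where the duty-area names echo the examples above but the task statements themselves are hypothetical:

```python
from collections import defaultdict

def group_by_duty_area(tasks):
    """Group (duty_area, task) pairs into the duty-area structure
    used for job-analysis surveys: one or more tasks per
    functional area."""
    duty_areas = defaultdict(list)
    for duty, task in tasks:
        duty_areas[duty].append(task)
    return dict(duty_areas)

# Hypothetical task list for illustration.
sample = [
    ("Small Arms Weapons", "Maintain the service rifle"),
    ("Small Arms Weapons", "Zero the service rifle"),
    ("Administrative Functions", "Prepare a unit diary entry"),
]
```

Grouping `sample` yields two duty areas, with the two rifle tasks correlated under Small Arms Weapons for survey purposes.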

1. Initial Task List Development. The first step in Job Analysis is the
development of an initial task list and is conducted primarily by TECOM (GTB). This
process can include the initial identification of duties or functional areas in which the
tasks will be organized. An initial task list is developed by a combination of the
following means:

a. Reviewing technical documentation and references pertaining to the job. This
documentation might also be obtained from various sources outside the Marine
Corps. These sources may address similar jobs and tasks and have generated
materials that may be applicable for task list development. These sources include:

1) Other Service Schools. These include Navy, Army, Air Force, or Coast Guard
formal schools, such as U.S. Army Engineer School at Ft. Leonard Wood, MO,
U.S. Army Signal School at Ft. Gordon, GA, and Air Force Communications
Technical School at Lowry Air Force Base, CO.

2) Trade Organizations/Associations. Civilian or industry trade
organizations/associations, such as the Society for Applied Learning
Technology (SALT) or the Association of Naval Aviation, can provide additional
resources and technical support.

3) Defense Technical Information Center (DTIC). DTIC offers training studies,
analyses, evaluations, technical articles and publications.

b. Convening a board of subject matter experts (SME) who can detail the
requirements of a specific job.

c. Conducting interviews with SMEs.

d. Soliciting input from Marine Corps formal schools/detachments and Centers of
Excellence (COE).


2. Task List Verification. The next step in Job Analysis involves
verifying the task list in terms of accuracy and completeness. Verification
ensures that the tasks on the list are actually those performed by members of
the OccFld or MOS. Task list verification is normally conducted by TECOM
(GTB) during the FEA by one or more of the following methods:

a. Administering survey questionnaires to job incumbents.
b. Conducting interviews with SMEs.
c. Observing actual job performance of tasks at the job site.
d. Convening a board of SMEs to review the task list.

3. Refining the Task List. After the data in the previous two steps have
been collected, the task list is refined and consolidated. A final review of the
task list should be made to ensure all tasks meet the criteria for a task
discussed previously in this Section.
4. Identifying Tasks for Instruction. The final step in job analysis
involves identifying specific tasks that may require formal instruction. Some
tasks may not be taught because they are relatively simple to perform, are
seldom performed, or only minimum job degradation would result if the tasks
were not performed. To properly select tasks for instruction, TECOM (GTB)
collects data on several criteria relating to each task. This is accomplished
through administration of a survey questionnaire sent to job incumbents and
SMEs. The data collected represents the judgments of a statistically valid
sample of job incumbents and SMEs who are familiar with the job. The
responses to the survey are analyzed using statistical analysis procedures. The
following criteria may be considered when selecting tasks for instruction and
are included in the survey questionnaire administered by TECOM (GTB).

a. Percent of jobholders performing the task.
b. Percentage of time spent performing the task.
c. Criticality of the task to the job.
d. Frequency of task performance.
e. Probability of inadequate performance.
f. Task learning difficulty.
g. Time between job entry and task performance (task delay
tolerance).
h. Resource constraints at the schoolhouse.

Survey responses to each of these criteria are then analyzed and a Front End
Analysis Report (FEAR) is produced that will assist in the task analysis and
determination of instructional setting.
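Once survey responses for criteria a through h are in hand, they must be combined into a priority ordering across tasks. The Manual names the criteria but not a formula; one common approach is a weighted sum, sketched below. The weights, the 1-5 rating scale, and the shortened criterion names are illustrative assumptions, not part of the FEA procedure:

```python
def selection_score(task_ratings: dict, weights: dict) -> float:
    """Combine survey ratings on the selection criteria into a
    single priority score for one task. Any actual scoring rule
    would be set by the analysts conducting the FEA."""
    return sum(weights[c] * task_ratings[c] for c in weights)

# Criteria a-h above, with assumed weights summing to 1.0.
WEIGHTS = {
    "percent_performing": 0.15,
    "time_spent": 0.10,
    "criticality": 0.25,
    "frequency": 0.10,
    "prob_inadequate_performance": 0.15,
    "learning_difficulty": 0.10,
    "task_delay_tolerance": 0.10,
    "resource_constraints": 0.05,
}

# A task rated at the midpoint of a 1-5 scale on every criterion
# scores 3.0, since the weights sum to 1.0.
midpoint = {c: 3.0 for c in WEIGHTS}
score = selection_score(midpoint, WEIGHTS)
```

Ranking tasks by such a score gives analysts a defensible, data-driven ordering to review, in keeping with the Manual's emphasis on decisions drawn from carefully collected and analyzed data rather than subjective judgment.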


1300. TASK ANALYSIS

The second step in the Analyze Phase is to conduct a Task Analysis that sequences
and describes observable, measurable behaviors involved in the performance of a
task or job. Task analysis is conducted by a SME conference. It involves the
systematic process of identifying specific tasks to be trained, and a detailed analysis
of each of those tasks in terms of frequency, difficulty, and importance.

The purpose of task analysis is to:

1. Define the task list based on SME input.
2. Develop ITS/T&R that identify the conditions, standards, and
performance steps necessary for the successful completion of a task.
3. Determine where the tasks will be instructed (formal school or via MOJT
at the unit level).
4. Produce a target population description that will guide the formal school
or unit in the preparation of instruction/training.
Below are questions to ask when performing a Task Analysis:

1. How difficult or complex is the task?
2. What behaviors are used in the performance of the job?
3. How frequently is the task performed?
4. How critical is the task to the performance of the job?
5. To what degree is the task performed individually, or to what degree is the
task part of a set of collective tasks?
6. If a subset of a set of collective tasks, what is the relationship between the
various tasks?
7. What is the consequence if the task is performed incorrectly or is not
performed at all?
8. To what extent can the task be trained on the job?
9. What level of task proficiency is expected following training?
10. How critical is the task?
11. What information is needed to perform the task? What is the source of
information?
12. What are the performance requirements?
13. Does execution of the task require coordination between other personnel
or with other tasks?
14. Are the demands (perceptual, cognitive, psychomotor or physical) imposed
by the task excessive?
15. How often is the task performed during a specified time-frame (i.e., daily,
weekly, monthly, yearly)?
16. How much time is needed to perform this task?
17. What prerequisite skills, knowledge, and abilities are required to perform
the task?
18. What are the current criteria for acceptable performance?
19. What are the desired criteria?
20. What behaviors distinguish good performers from poor performers?
21. What behaviors are critical to the performance of the task?


1. Training Standard Development. Once the task list is finalized,
performance requirements must be developed for every task selected for
instruction. In the Marine Corps, performance requirements for all
occupational field specialties (OccFld) are defined by Individual Training
Standards (ITS). ITS published in either an ITS Order or a Training and
Readiness (T&R) Manual serve as the basis for all individual instruction in units
and in formal schools/detachments. Formal schools/detachments are
responsible for teaching the training standards designated for instruction in
the formal school. These ITS/T&R events appear as tasks in Appendix B of
the Course Descriptive Data (CDD) produced by the formal school (see
Chapter 3).

2. Development of ITS/T&R. Once tasks are verified and the task
lists are refined, ITS/T&R may be developed. Often, many elements of the
ITS (e.g., performance steps, conditions, standards) are collected while the
task list is being refined. This enables a better understanding of the task and
can serve as a check to ensure the tasks are actually performed on the job. A
working group conference composed of subject matter experts (SME) is
particularly effective for examining how a task is to be completed by
identifying the performance steps and the sequence of those performance
steps, conditions, and standards necessary to successfully accomplish the
task.

a. ITS Components

1) Task. The task describes what the job holder must do.
2) Condition(s). The conditions set forth the real-world
circumstances in which the tasks are to be performed. Conditions
describe the equipment and resources needed to perform the task
and the assistance, location, safety considerations, etc., that
relate to performance of the task.
3) Standard(s). Standards provide the proficiency level expected
when the task is performed. Standards can measure a product, a
process, or a combination of both. Standards must reflect a
description of how well the task must be performed. This standard
can cite a technical manual or doctrinal reference (e.g., ...in
accordance with FMFM 1-3), or the standard can be defined in
terms of completeness, time, and accuracy.
4) Performance Step(s). Performance steps specify the actions
required to accomplish a task. Performance steps follow a logical
progression.
5) Reference(s). References are doctrinal publications (e.g.,
technical manuals, field manuals, Marine Corps Orders) that
provide guidance in performing the task in accordance with the
given conditions and standards. References cited should be
current and readily available to the Marine.

6) Administrative Instructions. Administrative instructions provide the
instructor with special circumstances relating to the ITS, such as
simulation requirements and safety or real world limitations, which may
be a prerequisite to successful accomplishment of the ITS.

b. Composition of a T&R Event. A T&R event contains the following components:

1) Event Code. The event code is a three-letter and three-digit designator. The three-letter code is used for grouping events
according to their functional area. For collective events, these
groupings are derived directly from the community’s METs. The
three-digit code is used to arrange events in a progressive sequence.
The purpose of coding events is to provide Marines with a simplified
system for planning, tracking, and recording unit and individual
training accomplishments.

a) Functional Area Grouping. Categorizing events with the use of a recognizable three-letter code makes the type of skill or
capability being referenced fairly obvious. Examples include DEF
(defensive tactics), MAN (maneuver), NBC (nuclear, biological, and
chemical), etc.

b) Sequencing. A numerical code is assigned to each training event. The higher the number, the more advanced the capability
or skill being evaluated. For example, PAT-201 (patrolling) could
be patrolling conducted at the squad level, PAT-240 could be
patrolling at the platoon-level, PAT-301 could be patrolling at the
battalion level, etc.

2) Event Description. The event description is a narrative description of the training event.
3) Tasks. A listing of the tasks that are done together to accomplish
the training Event. Tasks are defined on page 1-3. There are
normally multiple training tasks contained in each event. Tasks may
or may not be completed sequentially.
4) Condition. Condition refers to the constraints that may affect event
performance in a real-world environment. It includes equipment,
tools, materials, environmental and safety constraints pertaining to
event completion.
5) Standard. Standards are the metric for evaluating the effectiveness
of the event performance. It identifies the proficiency level for the
event performance in terms of accuracy, speed, sequencing, and
adherence to procedural guidelines. It establishes the criteria of how
well the event is to be performed.
6) Performance Steps. Performance steps specify the actions required
to accomplish a task. Performance steps follow a logical progression,
and should be followed sequentially, unless otherwise stated.
Normally, performance steps are listed only for 100-level individual
T&R events (those that are taught in the entry-level MOS school), but
may be included in upper-level events when appropriate.
7) Prerequisite(s). Prerequisites are the listing of academic training
and/or T&R events that must be completed prior to attempting
completion of the event.
8) Reference(s). References are the listing of doctrinal or reference
publications that may assist the trainees in satisfying the performance
standards and the trainer in evaluating the performance of the event.

9) Ordnance. Each event will contain a listing of ordnance types and quantities required to complete the event.

10) External Support Requirements. Each event will contain a listing of the external support requirements needed for event completion (e.g., range, support aircraft, targets, training devices, other personnel, and non-organic equipment).

11) Combat Readiness Percentage (CRP). The CRP is a numerical value used in calculating training readiness. The CRP
value for each event is determined by that event’s overall
importance within the training syllabus for that unit,
occupational specialty, or billet.

12) Sustainment Interval. The sustainment interval is the period, expressed in months, between the evaluation or retraining required to refresh perishable skills and assure readiness. Skills and
capabilities acquired through the accomplishment of training
events are to be refreshed at pre-determined intervals. Those
intervals, known as sustainment intervals, are developed at the
respective T&R conference to standardize currency
requirements for Marines to maintain proficiency.
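Several of the components above are simple structured data: the event code (component 1), the CRP value (component 11), and the sustainment interval (component 12). Purely as an illustrative sketch — the function names, record layout, and the currency rule below are assumptions for demonstration, not prescribed by this Manual — they might be modeled as:

```python
import re
from datetime import date

# Hypothetical helpers; names and rules are illustrative, not part of the
# T&R system itself.
EVENT_CODE = re.compile(r"^(?P<area>[A-Z]{3})-(?P<seq>\d{3})$")

def parse_event_code(code):
    """Split a code such as 'PAT-201' into its functional-area group,
    sequence number, and hundreds-level (higher number = more advanced)."""
    m = EVENT_CODE.match(code)
    if m is None:
        raise ValueError(f"not a three-letter/three-digit event code: {code!r}")
    seq = int(m.group("seq"))
    return m.group("area"), seq, (seq // 100) * 100

def months_between(earlier, later):
    # Whole-month difference between two dates.
    return (later.year - earlier.year) * 12 + (later.month - earlier.month)

def is_current(last_completed, sustainment_months, today):
    """True if the event was refreshed within its sustainment interval.
    (An assumed currency rule for illustration only.)"""
    if last_completed is None:
        return False
    return months_between(last_completed, today) <= sustainment_months

print(parse_event_code("PAT-201"))
print(is_current(date(2004, 4, 15), 3, date(2004, 6, 1)))
```

For example, PAT-201 parses into the PAT functional area, sequence 201, at the 200 level; an event with a 3-month sustainment interval last completed in April 2004 is still current in June 2004.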

3. ITS/T&R Staffing. ITS/T&R staffing involves soliciting comments from affected individuals or organizations throughout the Marine Corps, and
then integrating those comments into the ITS/T&R document. The Operating
Forces, formal schools/training detachments, and OccFld sponsors (and
designated SMEs under special circumstances) will be included on the
ITS/T&R staffing distribution list. TECOM (GTB/ATB) will coordinate final
review, and will consolidate and reconcile all recommendations.

Upon completion of this process, necessary changes will be incorporated into the final ITS/T&R draft Order for signature. ITS/T&R Manuals are forwarded
to CG, TECOM for approval and signature.

Once final approval and signature has been received, the training standards
are published as either a T&R Manual in the MCO 3500-series, or as an ITS
Order in the MCO 1510-series, and can then be distributed throughout the
Marine Corps.


SECTION 1400. INSTRUCTIONAL SETTING

The third process in the Analyze Phase involves determining the instructional
setting for each individual training standard (ITS) task behavior. Instructional
setting is important because it defines who is responsible for instructing the task
and the level of proficiency the student must achieve when performing the task in
an instructional environment. TECOM is responsible for determining the
organization responsible for conducting the instruction and the level of instruction
assigned to each task. This is done during the SME Conference while ITS/T&R
events are being developed. When determining instructional setting, two guiding
factors must be used -- effectiveness and efficiency. The Marine Corps seeks the
best training possible within acceptable, affordable costs while meeting the
learning requirement.

1. Responsibility for Instruction. Once the job is defined and the


ITS/T&R events are developed, the job structure can be broken down into
organizations that will assume responsibility for instruction. The tasks must be
divided into four groups:

a. Those that are to be included in a formal learning program.


b. Those that are to be included in a Managed On-the-Job-Training (OJT)
program.
c. Those that can be covered via computer-based instruction or via
simulation.
d. Those for which neither formal instruction nor OJT is needed (i.e., tasks that can be learned by using job performance aids or self-study packets).

2. Instructional Setting. The purpose of entry-level formal school instruction is twofold: to teach the minimum skills necessary to make the Marine
productive immediately upon arrival at his first duty station; and to provide the
Marine with the necessary prerequisites to continue instruction in an MOJT
program. Instructional setting refers to the extent of instruction assigned to each
Individual Training Standard (ITS) task behavior. Instructional setting is
generally determined by convening a board of job incumbents and SMEs to
discuss the extent of instruction required to adequately perform the task.
Instructional settings are published in the T&R Manual or ITS Order.
Instructional settings in T&R Manuals are prescribed only for entry-level training
by listing them as 100-level events. Enclosure (3) of the ITS System (MCO 1510-
series) prescribes instructional settings in the following manner:

a. Tasks taught to standard are indicated by an “S” in the FS/MOJT column.


b. Tasks taught as preliminary or introductory in the formal school setting
are depicted with a “P” in the FS/MOJT column. These tasks require follow-on
instruction at the unit through MOJT for the Marine to achieve the standard of
proficiency required.
c. Tasks that are not taught at the formal school have no designator in the
FS/MOJT column.
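The FS/MOJT column conventions above amount to a three-way lookup. As a minimal sketch (the function and mapping names are hypothetical helpers, not part of the ITS System), they could be captured as:

```python
# Hypothetical mapping of the FS/MOJT column designators described above.
FS_MOJT_MEANING = {
    "S": "Taught to standard at the formal school",
    "P": "Taught as preliminary/introductory; follow-on MOJT is required "
         "to reach the standard",
    "": "Not taught at the formal school",
}

def instructional_setting(designator):
    """Translate an FS/MOJT column entry into its meaning."""
    key = designator.strip().upper()
    try:
        return FS_MOJT_MEANING[key]
    except KeyError:
        raise ValueError(f"unknown FS/MOJT designator: {designator!r}")

print(instructional_setting("p"))
```

An empty column entry maps to "not taught at the formal school," mirroring rule c above.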


SECTION 1500. REQUIREMENTS AND RESPONSIBILITIES IN THE ANALYZE PHASE
1. Training and Education Command [TECOM (GTB/ATB)]
Responsibilities. A systematic approach to the design of instruction
requires an identification of the tasks performed on the job. Job performance
in the Marine Corps is defined and supported by training standards. Training
standards published in ITS orders or as individual events in T&R Manuals are
the primary source for the development of all Marine Corps instruction.
TECOM (GTB/ATB) is responsible for coordinating all the steps in the Analyze
Phase and for managing the FEA process. TECOM will coordinate the
development of ITS/T&R for military occupational fields (OccFld) and military
occupational specialties (MOS). The culmination of the Analyze Phase is an
approved set of training standards for an OccFld or MOS, published as a
Marine Corps Order (MCO) in the 1510 or 3500 series.

a. Job Analysis. As part of the FEA process, TECOM (GTB) is responsible for conducting job analyses. Additionally, TECOM (GTB) will
collect supporting information that will assist in the identification and selection
of tasks for instruction. TECOM (GTB) publishes the analysis results in a Front-
End Analysis Report (FEAR).

b. Task Analysis. TECOM (GTB/ATB) is responsible for convening the SME conference. The conference conducts formal task analysis and produces
the refined task list.

c. Determination of Instructional Setting. The SME conference also determines where the tasks should be taught, either at the formal
school/detachment, or in the operating forces/supporting establishment. The
TECOM task analyst conducting the SME conference will publish the
instructional setting in the T&R Manual or ITS Order.

2. Formal School/Detachment Responsibilities. The formal schools play important roles during the Analyze Phase.

a. Job Analysis. The formal school/detachment advises the task analyst within TECOM (GTB/ATB) on the construction of task lists that will be
used to build FEA questionnaires. The school also sends key personnel to the
SME conference who can make decisions on behalf of the commander.
Formal school personnel actively participate in the final step of Job Analysis –
selection of the tasks for instruction -- by making recommendations on
whether or not the task can be properly taught at the school. When the
requirements of the task exceed current resources, the SMEs make
recommendations for additional resources.

b. Task Analysis. Since task analysis involves determining the condition, standard, performance steps, etc., having the resident experts from
the formal school participate in this process is critical.

c. Determination of Instructional Setting. The determination of where
the tasks should be taught, either at the formal school/detachment, as part of a
web-based course, or as part of an MOJT program in the operating forces is
essential. Formal school/detachment personnel provide key inputs to this step
during the SME conference.

The Determination of the Instructional Setting is the final process in the Analyze Phase. The output of this phase is:

 Individual Training Standards (ITS) Order or Training and Readiness (T&R) Manual.

This output becomes the input to the Design Phase. The first step of the Design Phase will be to write a Target Population Description (TPD) for the course to be developed from the events/ITS identified during the Analyze Phase.

Systems Approach To Training Manual Design Phase

DESIGN PHASE

[SAT model diagram: ANALYZE – DESIGN – DEVELOP – IMPLEMENT – EVALUATE. The Design Phase comprises:  Produce and Analyze TPD   Conduct a LA   Sequence LOs]

In Chapter 2:

2000 INTRODUCTION 2-1

2100 WRITE THE TARGET POPULATION DESCRIPTION 2-2
 Purpose 2-2
 Role of TPD in instruction 2-2
 Steps in Writing the TPD 2-2

2200 CONDUCT LEARNING ANALYSIS (LA) 2-4
 Purpose 2-4
 Steps to conduct a LA 2-4
 Develop Learning Objectives (LO) 2-13
 Components of LOs 2-14
 Record LOs 2-17
 Writing LOs 2-18
 Writing ELOs 2-21
 Develop Test Items 2-23
 Select Instructional Methods 2-43
 Select Instructional Media 2-52

2300 SEQUENCE TERMINAL LEARNING OBJECTIVES (TLO) 2-59
 Purpose 2-59
 Relationships between TLOs 2-59
 Guidelines for sequencing TLOs 2-62


2000. INTRODUCTION

The outputs of the Analyze Phase, the ITS Order or the T&R Manual, become the
inputs to the Design Phase. During the Design Phase, the curriculum developer
takes the ITS Tasks or T&R events designated to be taught at the formal
school/detachment, and attempts to simulate, as closely as possible, the real-
world job conditions within the instructional environment. The closer the
instruction is to real world job requirements, the more likely it is that the student
will transfer the learning to the job.

The Design Phase consists of these three processes:

1. Write the Target Population Description (TPD): "Who is coming for instruction and what knowledge, skills, and attitudes (KSAs) must/will they bring with them?"

2. Conduct a Learning Analysis: "What do I have to teach with?" and "What will be taught, evaluated, and how?"

3. Sequence TLOs: "In what order will the instruction be taught to maximize both resources and the transfer of learning?"

(A Formal School/Detachment is any MOS or professional development school in the Marine Corps.)

Figure 2-1. Design Phase flow: INPUT (ITS or T&R) → PROCESS (Write TPD; Conduct Learning Analysis; Sequence TLOs) → OUTPUT (TPD; Learning Objectives; Test Items; Methods/Media; Sequenced TLOs).


SECTION 2100. WRITE THE TARGET POPULATION DESCRIPTION

INTRODUCTION. The first process of the Design Phase is to write the Target
Population Description (TPD). A TPD is a description of the knowledge, skills,
and attitudes (KSAs) students are expected to bring to a course of instruction. It
provides a general description of an average student and establishes the minimum
administrative, physical, and academic prerequisites they must possess prior to
attending a course. During the Design Phase, the TPD will provide guidance for developing objectives and selecting instructional strategies that will meet the needs of the students.

2101. ROLE OF TPD IN INSTRUCTION

The TPD provides the focus for designing instruction. For instruction to be
effective and efficient, it must build upon what students already know.
Considering the TPD allows the curriculum developer to focus on those specific knowledge and skills a student must develop. For example, if knowing the
nomenclature of the service rifle is required for the job, and the students entering
the course already possess this knowledge, then teaching this specific information
is not required. Conversely, if students entering a course do not know the service
rifle nomenclature, then they need instruction. The TPD also allows the curriculum
developer to select appropriate methods of instruction, media, and evaluation
methods. For example, experienced students can often learn with group projects
or case studies and self-evaluation. Entry-level students generally need instructor-
led training and formal evaluation. In summary, the TPD describes the average
student in general terms, establishes prerequisites, serves as the source document
for developing course description and content, and is used to design instruction.

2102. STEPS IN WRITING THE TPD

1. Obtain Sources of Data. To clearly define the target population, gather data from the appropriate sources listed below. These references outline job
performance by detailing what tasks must be performed on the job and the
specific requirements of that particular job.

a. MCO P1200.7_, Military Occupational Specialty (MOS) Manual.


b. Marine Corps Order (MCO) P3500 Series, Training and Readiness (T&R).
c. Marine Corps Order (MCO) 1510 Series, Individual Training Standards
(ITS).
Additionally, information can be obtained from the OccFld Sponsor and Task
Analysts (GTB) by means of phone conversation and/or electronic message.


2. Gather and Review Student Background Information. While considering the adult learning characteristics identified in Chapter 6 and the
resources identified above, review pertinent student background information. In
order to ensure the course prerequisites are correct and that the training program
is developed to meet the attributes of the TPD, organize this information into the
following categories:

a. Administrative. Certain prerequisites may be necessary due to administrative requirements of the school or the course material. These
prerequisites include the student’s rank, MOS, security clearance, time
remaining in service, or police record (which may mean exclusion from
certain types of instruction).

b. Physical. Physical information includes specific skills and general fitness, which may include age, height, color perception, vision acuity, physical limitations, etc.

c. Academic. Academic information represents an inventory of the knowledge and skills the student must or will possess prior to the start of
instruction. These prerequisites may include specific basic courses already
completed, reading level, test scores, training experience and GCT/ASVAB
scores.

3. Write the TPD. Capture information that describes the general characteristics of the average student attending the course. Summarize the data
into a concise paragraph describing the target population. Organize the general
information describing the average student so that it is grouped together and any
prerequisites are grouped together.

TPD FOR CURRICULUM DEVELOPER COURSE

This course is designed for Sergeant through Lieutenant Colonel and civilian employees who perform curriculum development duties at a Marine Formal School or Detachment. Prior to being enrolled in this course, students are required to complete the Systems Approach to Training Interactive Multimedia Instruction (IMI) and the Operational Risk Management IMI. Most students attending the course have experience as an instructor at a Formal School or Detachment, are able to use Microsoft Word and PowerPoint, and possess effective written communication skills.

Figure 2-2. Sample Target Population Description (TPD)


2200. CONDUCT A LEARNING ANALYSIS

The second process of the Design Phase is to conduct a Learning Analysis to define what will be taught. The purpose of the Learning Analysis is to examine the real-world behavior that the Marine performs in the Operating Forces and transform it into the instructional environment. A Learning Analysis produces three primary products essential to any Program of Instruction (POI): learning objectives, test items, and methods/media. This process allows for adjustments to be made to accommodate resource constraints at the formal school/detachment. A Learning Analysis must be performed for every task covered in new courses. Additionally, each new task added to either the Individual Training Standard (ITS) Order or Training and Readiness (T&R) Manual and taught at the formal school requires a Learning Analysis.

2201. STEPS TO CONDUCT A LEARNING ANALYSIS


1. Gather Materials The first step in conducting a Learning Analysis is to gather
materials. Once the scope of the course that the curriculum developer is designing is
determined (by reading guidance from TECOM or the school commander), obtain the:

a. ITS order or T&R manual – to determine what tasks the jobholder performs.
b. Publications – like orders, directives, manuals, job aids, etc. that will help
analyze the tasks to be taught.

c. Subject Matter Experts – to fill in details that the publications will not. SMEs
will conduct the brainstorming session along with the curriculum developer.

d. Learning Analysis Worksheet (LAW) - Use the LAW found in the SAT Manual,
enlarge it to turn-chart size, or create one on a dry erase board (take a digital photo to
record results). It does not matter which technique is chosen, as long as a record of
the analysis is created.

e. Previously developed LAWs and LOWs for established courses under review.

Figure 2-3 is an extract from an ITS task list. Figure 2-4 is an ITS and component
description. Figure 2-5 is an extract of a T&R event and component description.

See Figures 2-3, 2-4, and 2-5 on the next several pages.


Figure 2-3 ITS Task List Extract


SUMMARY/INDEX OF INDIVIDUAL TRAINING STANDARDS

1. General This enclosure contains a summary listing of all of the ITS tasks grouped by MOS and Duty Area.

2. Format The columns are as follows:

a. SEQ Sequence Number. This number dictates the order in which tasks for a given duty area are
displayed.

b. TASK ITS Designator. This is the permanent designator assigned to the task when it is created.

c. TITLE ITS Task Title.

d. CORE An “X” appears in this column when the task is designated as a “core” task required to earn the
title United States Marine and Basic Rifleman.

e. FS/D Formal School/Detachment. An “X” is in this column when the FS/D is designated as the initial
training setting.

f. PST Performance Support Tool. An “X” in this column indicates that at least one PST is associated with
this task. Consult enclosure (6) for details.

g. DL Distance Learning Product. An “X” in this column indicates that at least one DL product is associated
with this task. Consult enclosure (6) for details.

h. SUS Sustainment Training Period. An entry in this column represents the number of months within which
the unit is expected to train or retrain this task to standard provided the task supports the unit’s METL.

i. REQ BY Required By. An entry in this column depicts the lowest rank required to demonstrate
proficiency in this task.

j. PAGE Page Number. This column lists the number of the page in enclosure (6) that contains detailed
information concerning this task.

SEQ TASK # TITLE CORE FS/D PST DL SUS REQ BY PAGE

MOS: MCCS, Marine Corps Common Skills


DUTY AREA 11 – INDIVIDUAL WEAPONS (IMCCS)

1) MCCS.11.01 PERFORM WEAPONS HANDLING WITH M16 ..... X X 12 Pvt 6-A-38
2) MCCS.11.02 MAINTAIN THE M16A2 SERVICE RIFLE ..... X X 12 Pvt 6-A-38
3) MCCS.11.03 ENGAGE TARGETS WITH THE M16A2 SERVICE RIFLE AT THE SUSTAINED RATE ..... X X 12 Pvt 6-A-39


FIGURE 2-4. Individual Training Standard

TASK: MCCS.11.02 (CORE). MAINTAIN THE M16A2 SERVICE RIFLE

CONDITION(S): Given an M16A2 Service Rifle, cleaning gear and lubricants
STANDARD: To meet serviceability standards per the TM

PERFORMANCE STEPS:

1. Handle the weapon safely.


2. Place the rifle in Condition 4.
3. Disassemble the rifle.
4. Clean the rifle.
5. Lubricate the rifle.
6. Reassemble the rifle.
7. Perform function check.

INITIAL TRAINING SETTING: FS/D Sustainment (12) Req By (Pvt)

REFERENCE(S):

1. MCRP 3-01A, Rifle Marksmanship


2. TM 05538C-10/1A, Operator’s Manual for Rifle, 5.56mm M16A2 W/E.

ADMINISTRATIVE INSTRUCTIONS: NONE

Individual Training Standards Component Description


1. General. ITS's contain six components: task, condition(s), standard(s), performance steps, reference(s), and administrative instructions.

2. Alphanumeric System. Each ITS is identified by a designator consisting, in order, of four alphanumeric characters, a period, two numbers, a period, and two additional numbers.

a. The first four characters identify the job and should be the same as the
MOS if one exists. For the instructor, the job designator is 8806.
b. The two Arabic numerals following the first period represent a DUTY area
of the JOB. The designator for the first DUTY area under JOB 8806 is 8806.01.

c. The last two Arabic numerals within the designator represent a task within the DUTY area. The first TASK under the first DUTY area of JOB 8806 is identified
as 8806.01.01. The second TASK under the third DUTY area of JOB 8806 is
designated as 8806.03.02, and so forth.
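The designator scheme above decomposes mechanically into its JOB, DUTY-area, and TASK parts. A minimal sketch (the function name and validation rules are illustrative assumptions, not part of the ITS System):

```python
# Hypothetical helper: decompose an ITS designator such as "8806.03.02"
# into the JOB, DUTY-area, and TASK components described above.
def parse_its_designator(designator):
    parts = designator.split(".")
    # Four alphanumeric characters for the JOB, then two two-digit fields.
    if (len(parts) != 3 or len(parts[0]) != 4
            or any(len(p) != 2 or not p.isdigit() for p in parts[1:])):
        raise ValueError(f"not a valid ITS designator: {designator!r}")
    job, duty, task = parts
    return {"job": job, "duty": duty, "task": task}

# 8806.03.02: the second TASK under the third DUTY area of JOB 8806.
print(parse_its_designator("8806.03.02"))
```

For instance, 8806.01.01 decomposes into job 8806, duty area 01, task 01, matching the worked example in the text.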

3. ITS Components

a. Task The task describes what a Marine has to do. It is a clearly stated,
performance-oriented action requiring learned skills and knowledge. A rank
(grade) is noted for each task. This rank is the grade at which the Marine must
be able to perform that task to standard.


b. Condition(s). The conditions set forth the real world or wartime circumstances under which the tasks are to be performed. This element of an ITS underscores "realism" in training. When resources or safety requirements limit the conditions, this should be stated in Administrative Instructions. It is important to understand that the conditions set forth in this Order are the minimum and may be adjusted when applicable.

c. Standard(s). A standard states exactly the proficiency level to which the task will be performed. It is not guidance, but a very carefully worded
statement, which sets the proficiency level required when the task is
performed. The standard is the established acceptable level of task
performance under the prescribed conditions.

d. Performance Steps. There must be at least two performance steps for each task. Performance steps specify actions required to fulfill the
proficiency established by the standard. These performance steps indicate a
logical sequence of collective actions required to accomplish the standard.

e. Reference(s). Reference(s) are directives and doctrinal/technical publications that specify, support, or clarify the performance steps. References
should be publications that are readily available.

f. Administrative Instructions. Administrative instructions provide the trainer/instructor with special circumstances relating to the ITS such as safety
or real world limitations which may be a prerequisite to successful
accomplishment of the ITS.

g. Initial Training Setting. All ITS's are assigned an Initial Training Setting that includes a specific location for initial instruction [Formal School
(FS) or Operating Forces], level of training required at that location
(Core/Core-Plus/MOJT), a sustainment factor (number of months between
evaluation or retraining to maintain the proficiency required by the standard),
and a “Required By” rank (the lowest rank at which task proficiency is
required).

h. Training Material (Optional). Training materiel includes all training devices, simulators, aids, equipment, and materials [except ammunition and
Marine Corps Institute (MCI) publications] required or recommended to
properly train the task under the specified condition and to the specific
standard. Mandatory items are preceded with an asterisk (*).

i. Ammunition (Optional). This table, if present, depicts ammunition, explosives, and/or pyrotechnics required for proper training of the ITS.

j. Current MCI(s) (Optional). This section includes a list of any currently available MCI publications designed to provide training related to this
task.


Figure 2-5 Components of a T&R Event


Event Description Sust Int

57XX-OPS-1010 Conduct operations in a chemically contaminated environment 3 mths

Tasks:
1. Prepare for NBC operations.
2. Prepare for chemical attack.
3. React to chemical attack.
4. Prepare to cross a chemically contaminated area.
5. Cross a chemically contaminated area.
6. Decontaminate individual Marines.
7. Conduct hasty equipment decontamination.
8. Conduct MOPP gear exchange.

Condition: Under any terrain and weather condition, given normal individual and unit combat
equipment.

Standard: All personnel don MOPP gear and are ready to continue unit movement, combat
support or combat service support within 10 minutes of the alarm. Perform operation in MOPP
gear for a minimum of 30 minutes. Decontaminate all personnel and equipment within 2
hours.

Performance Steps: N/A

References: MCO 3400.3, FMFM 11-1, FMFM 11-10, FMFM 11-9

Ordnance: N/A

External Support: Movement range appropriate for unit size.

Training and Readiness Event Component Description

1. General. An event contained within a T&R Manual is an individual or collective training standard and contains the following components.

a. Event Code. The event code is a 4-character alpha and/or numeric MOS designator, followed by an up to 4-character alpha and/or numeric functional area designator, followed by a 4-digit numeric sequence designator. The purpose of coding events is to provide Marines with a simplified system for planning, tracking,
and recording individual and unit training accomplishments. Grouping and
sequencing individual skills and unit capabilities build a “picture” for the user
showing the progression of training.
Grouping: The code is used for grouping events according to their functional
area. Categorizing events with the use of a recognizable up to 4-letter code
makes the type of skill or capability being referenced fairly obvious. Examples
include DEF (defensive tactics), MAN (maneuver), NBC (nuclear, biological, and
chemical), RAD (Radar), etc.
Sequencing: The 4-digit code is used to arrange events in a progressive
sequence.


b. Event Description Narrative description of the training event.


c. Tasks. A listing of the component parts of the T&R Event. A task is usually a unit of work performed over a finite period of time that has a specific beginning and end, can be measured, and is a logical, necessary unit of performance. In the T&R Program, a unit of work may actually be a T&R Event, and may reappear in a higher-level event as a component part – a Task. There is often more than one task for each training event. A 100-level Event (to be taught at the formal school) consists of a single task that is the Event Description and will not normally have a listing of Tasks as shown here. A 100-level event will often have multiple performance steps, although they may not be listed in the T&R Manual if an event’s reference contains the necessary performance steps.
d. Condition Condition refers to the constraints that may affect task
performance in a real-world environment. It includes equipment, tools,
materials, environmental and safety constraints pertaining to the task
completion.
e. Standard Standards are the metrics for evaluating the effectiveness of the event performance. They identify the proficiency level for the event performance in terms of accuracy, speed, sequencing, and adherence to procedural guidelines. They establish the criteria for how well the event is to be performed. A standard is not guidance; it states exactly the proficiency level to which the task will be performed. Whenever possible, the standard should cite a reference that defines the tasks in procedural or operational terms.
f. Performance Steps Performance steps specify the actions required to
accomplish a task. Performance steps follow a logical progression and should be
followed sequentially, unless otherwise stated. Normally, performance steps are
listed only for 1000-level individual events (those that are taught in the formal
MOS school). Listing performance steps is optional for 2000-level events and
above.
g. Prerequisite(s) Prerequisites are the listing of academic training
and/or T&R events that must be completed prior to attempting completion of the
event.
h. Reference(s) References are the listing of doctrinal or reference
publications that may assist the trainee in satisfying the performance standards
and the trainer in evaluating the performance of the event.
i. Ordnance Each event will contain a listing of ordnance types and
quantities required to complete the task.
j. External Support Requirements Each event will contain a listing of
the external support requirements needed for event completion (e.g., range,
support aircraft, targets, training devices, other personnel, and non-organic
equipment).
k. Combat Readiness Percentage (CRP) The CRP is a quantitative
numerical value used in calculating individual and collective training readiness.
The CRP of each event is determined by its relative importance to other events.
l. Sustainment Interval The number of months between evaluation or retraining of the individual or collective event required to maintain proficiency.

2. Determine Training Requirements Review the ITS order or T&R
manual to determine what tasks must be taught at the formal school/detachment.

a. For an ITS order, refer to enclosure three. Those tasks designated for instruction at a formal school will have an alpha indicator in the column labeled "FS". This information is also spelled out for each task in Appendix A to Enclosure 6 of the ITS order.

b. In a T&R manual, all tasks taught at the formal school for initial,
individual MOS training are listed at the 1000-level. For MOS progression training
conducted at the formal school, select events are identified in the manual.
In some cases, topics that need to be taught at a formal school/detachment will
not have corresponding tasks in the ITS order or T&R manual. To teach these
topics in a formal school/detachment, one of two courses of action must be
followed. The first course of action is to designate the lesson as "Lesson
Purpose"; it will not have learning objectives. Examples are a course overview or
an introduction to a series of tasks being instructed. Lesson purpose classes must
be kept to a minimum, because they use school resources (like time) without
directly supporting a given task. The other course of action is to contact the task
analyst at TECOM for further guidance. It is possible that the ITS order or T&R
manual needs a task added to it, and the analyst can provide authority to teach
until the revision is made.

3. Analyze the Target Population Description Before the knowledge, skills, and attitudes (KSAs) are determined, the target population must be analyzed. The TPD is analyzed so that the curriculum developer can determine the KSAs the students will bring into the instructional environment. Instruction must capitalize on students' strengths and focus on those KSAs the students must develop or possess to perform satisfactorily on the job. The goal is for the learning analysis to reveal the instructional needs of the target population so that selected methods and media are appropriate for the student audience.

4. Record Task Data Record the data found in the ITS order or T&R manual. The LAW in Appendix A serves as a guide for what information to record. Record the T&R event or ITS duty description, the T&R event or ITS duty code, the task, the task code, and the conditions and standards associated with the task. Then record each performance step. A good strategy to stay organized and focused is to record only one performance step per page. It is also a good idea to fill out all LAWs required for a learning analysis prior to beginning step 5.

5. Generate Knowledge, Skills, and Attitudes for Each Performance Step When generating knowledge, skills, and attitudes (KSAs), analyze each performance step and break it down into a list of KSAs required for each student to perform that step. Consideration of the level of detail needed, transfer of learning, target population, and school resources is essential. The method used to identify KSAs is commonly called "brainstorming."

Brainstorming is the process used by SMEs and curriculum developers working together to ensure that KSAs are generated for each performance step. In order to do this, the differences between knowledge, skill, and attitude must be identified:

Chapter 2 2- 10
Systems Approach To Training Manual Design Phase
a. Knowledge is information required to effectively accomplish a step, task,
or job. Knowledge involves storing and recalling information and refers to
the learning of names, facts, processes, and principles. Examples include
“know rifle nomenclature”; “know the format of the operations order”;
“know the components of a NSN” etc.

b. Skill is the ability to perform an activity that contributes to the accomplishment of the step, task, event, or job. Examples include "be able to disassemble a rifle"; "be able to organize inventory" etc.

c. Attitude is the feeling or emotion in regard to a fact or state. Since the majority of these cannot be observed or measured within the confines of the instructional setting, they are generally not recorded during the Learning Analysis. The exception is when analyzing the lower levels of receiving and responding within the affective domain. (See Chapter 6, Section 6502.)

Knowledge and skills are generated from references for the subject or task,
such as an operator’s manual, SOP, user’s guide, and so forth. Also, consider
the knowledge and skills that the target population possesses upon entering the
course. This will ensure that resources are not wasted on instruction of
knowledge and skills that the target population already possesses.

KSAs are brainstormed and recorded with one object and one verb; the words "or" and "and" cannot be used, as they would introduce a second object or verb. A knowledge or skill must be recorded for each performance step to indicate that the step has been analyzed and not overlooked. If no knowledge or skill can be generated for the performance step, then record the performance step itself as the knowledge or skill. These KSAs are an essential part of lesson plan development, as they will become the information contained in the lesson plan.
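The one-verb, one-object rule can be spot-checked mechanically. The sketch below is purely illustrative and not part of MCAIMS or any official SAT tooling; it only flags the two conjunctions the rule prohibits, leaving judgment about verbs and objects to the curriculum developer.

```python
import re

# Illustrative sketch, not an official SAT tool: flag KSA statements that
# contain the standalone words "and"/"or", since those conjunctions would
# introduce a second verb or object.
def flags_conjunction(ksa_statement):
    return re.search(r"\b(?:and|or)\b", ksa_statement, re.IGNORECASE) is not None
```

For example, "know rifle nomenclature" passes, while "know rifle nomenclature and assembly" is flagged for rework into two separate KSAs.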

6. Group KSAs Review all the knowledge and skills generated for the entire task/event, regardless of the performance step under which they were initially brainstormed. Circle and/or color-code the ones that are duplicative, very similar, or common to one or more performance steps. For each group, answer the question: "What behavior would confirm that the student possesses these skills and knowledge?" Complementary knowledge and skills are grouped to reduce the number of Enabling Learning Objectives (ELOs). Therefore, the number of performance steps does not necessarily equate to the number of ELOs. Record behaviors on a working copy/scratch paper and retain it, since these behaviors are the basis for developing the ELOs. Also, use the scratch paper for notes and other considerations or decisions that are made.

Specifically, grouped knowledge and/or skills that are beyond the scope of instruction (for more experienced personnel) or are possibly taught elsewhere (in the course or school) still need to be grouped and recorded as the Learning Analysis progresses. (The TLO, or Terminal Learning Objective, has already been identified by the task (ITS) or event (T&R).) For example, if any grouped KSAs identified during the Learning Analysis directly relate to the TPD of the course, they would be designated as "TPD." Additionally, if a grouped KSA were taught in an earlier portion of the course, then it would not need to be re-taught but merely recalled. These grouped KSAs will be designated as delete ("del") since they will not be taught in follow-on lessons. However, since these KSAs were identified during the Learning Analysis, they must be recorded for every task. This is critical to ensure that when future modifications to the course are made, key KSA groupings are not lost or dropped from the instruction.
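The de-duplication part of step 6 can be sketched as code. This is a hypothetical illustration, not an official tool: it only groups exact duplicates across performance steps, whereas merging "very similar" KSAs still requires SME judgment.

```python
from collections import defaultdict

# Illustrative sketch of step 6: collect identical KSA statements that were
# brainstormed under different performance steps. Only exact (case- and
# whitespace-insensitive) duplicates are grouped; "very similar" items
# still require SME judgment to merge.
def group_duplicate_ksas(ksas_by_step):
    groups = defaultdict(list)
    for step, ksas in ksas_by_step.items():
        for ksa in ksas:
            groups[ksa.strip().lower()].append(step)
    # Return only KSAs common to more than one performance step.
    return {ksa: steps for ksa, steps in groups.items() if len(steps) > 1}
```

The returned mapping shows which KSA statements recur and under which performance steps, which is the information the developer would otherwise capture by circling or color-coding.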

7. Sequence Groupings Review the draft behavior for each individual task/event and all the groupings of knowledge, skills, and/or attitudes. The question to be answered during this step is, "Which grouping(s) of knowledge, skills, and/or attitudes should be taught first?" There are several methods curriculum developers use to sequence and present course material. The developer will use one, or a combination of two or more, of the methods listed below to sequence the groupings.

a. Whole to Part Present the result or product first, and then present the process or each step.

b. Part to Whole Present the process or steps first, then teach the final result or product.

c. Simple-to-Complex Present concepts that students may be familiar with or that are less complicated, then build on these concepts by presenting newer or more difficult ones.

d. Complex-to-Simple Actions are sequenced in terms of decreasing complexity; each is associated with the larger complex structure of which it is a part.

e. Chronological Present concepts or ideas in the order they occur over time, such as with historical events.

f. Sequential Present procedures or steps in the order they are performed on the job.

g. Cause and Effect Order Actions are sequenced to demonstrate cause and effect relationships. This technique is appropriate for relationships that personnel must commit to long-term memory and for which performance failures in the training environment can be tolerated.

h. Critical Order Actions are sequenced in order of relative importance, whether from the least important to the most or vice versa, depending on the situation. Tasks favoring this technique are those that require an important action such as "Clear the weapon before starting disassembly."

i. Known-to-Unknown Order Familiar topics are considered before
unfamiliar ones. This technique is appropriate in situations where the target
audience has some familiarity with the type of action, but the specific action is
generally unknown to them. For example, maintenance of military commercial
vehicles would precede maintenance of lesser-known military specific vehicles.

Under each performance step, assign a lower-case alpha designator to each grouping of KSAs, based on the order in which they will be taught. For the first group, the lower-case "a" would be assigned, "b" for the next, and so on. If the groupings exceed a to z, continue with aa, ab, ac, etc.
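The designator sequence described above (a through z, then aa, ab, ac, ...) behaves like bijective base-26 numbering, the same scheme spreadsheet columns use, in lower case. A small illustrative helper, assuming 1-based group positions:

```python
def alpha_designator(position):
    """Return the lower-case designator for a 1-based group position:
    1 -> 'a', 26 -> 'z', 27 -> 'aa', 28 -> 'ab', and so on."""
    label = ""
    while position > 0:
        position, remainder = divmod(position - 1, 26)
        label = chr(ord("a") + remainder) + label
    return label
```

Subtracting 1 before each divmod is what makes the numbering bijective, so that "z" rolls over to "aa" rather than to a designator containing a zero digit.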
8. Record Learning Analysis in MCAIMS MCAIMS is the designated database that the Marine Corps uses for managing instruction. For documentation of the Learning Analysis process, all elements must be recorded in MCAIMS. Refer to the MCAIMS User Manual for MCAIMS instructions. A Learning Analysis Worksheet (LAW) must be produced for inclusion in the Master Lesson File (MLF). The LAW used in the MLF may be either paper-based and/or the MCAIMS version. The required components of an MLF are discussed in Section 3800 of this manual. See Appendix A for the paper-based LAW and Appendix C for the Learning Analysis checklist.

2202. DEVELOP LEARNING OBJECTIVES

The learning objective is the first of three primary products of the Learning
Analysis. A learning objective is defined as the precise behavior that the student
will accomplish under a specified condition, and to a prescribed standard. It is a
“contract” between the instructor and the student.

The purpose of a learning objective can be broken down into five areas, all of equal importance. The learning objective should:

1. Tell the student what he/she will be able to perform. (Behavior)
2. Describe the conditions under which the performance will occur. (Condition)
3. Tell how well the student will perform. (Standard)
4. Establish the basis for measurement of the performance.
5. Provide a focus for the instructor and the student.


2203. COMPONENTS OF A LEARNING OBJECTIVE


Prior to writing a learning objective, it is important to have an understanding of
each component: behavior, condition, and standard.

1. Behavior The behavior is the action the student is expected to perform after instruction. The behavior must:

a. Contain one action verb and one object To avoid confusion by both the student and the instructor, the behavior needs to state a single action and a single object. For example, "type an electronic mail message." In this example, "type" is the action verb and "message" is the object.

b. Be free of ambiguity When a behavior is observable, measurable, and uses qualifiers when necessary, the behavior will mean the same thing to all students. An action verb must be observable in order to be measurable. It should paint a picture in the student's mind of what must be accomplished. This is true whether it is knowledge or a skill. Some verbs require further explanation. For instance, the verb "describe" requires a qualifier, either "in writing" or "orally." This will eliminate any confusion on the part of the student as to how he will be required to demonstrate the behavior. Examples of other verbs that require qualifiers are "explain," "select," and "list." By qualifying the action statement, the action or the product of that action is made observable. Some verbs are not useful even when qualified. These verbs are impossible to directly observe. For example, a person cannot see someone "know." A person cannot see someone "understand." These words are intangibles. Other verbs that are not useful are known as constructs. A construct is something that exists only in the mind. Love and hate are constructs. We cannot see, hear, taste, smell, or feel love and hate, at least not in the physical sense, even though we know they exist. (See Figure 2-7 for a comprehensive domain/verb listing.)

c. Be stated in student terms Instructors must understand that they


already possess knowledge that the student does not. Do not use acronyms or
technical terms that could create confusion. Keep it simple, clear, and concise.

d. Be a realistic performance of the behavior in the instructional environment The behavior must reflect what the student will do within the confines of the instructional environment and should replicate as closely as possible what the student will do on the job.


2. Condition The condition describes the situation under which the behavior will take place. Conditions specify the resources provided to the student and the environment in which the student must perform the behavior. The formal school/detachment must attempt to duplicate the condition identified in the learning objective. Conditions can be broken down into three areas: aiding/limiting, environmental, and implied.

a. Aiding/Limiting Conditions A description of what a student will or will not have available to him/her when performing the task. These include references, tools, equipment, job aids, facts, formulas, specific situations, special instructions, and cues. If the task must be simulated because performance could be hazardous or impracticable to reproduce, then the conditions must reflect this simulation. For example, "in a simulated contaminated environment."

(1) Aiding Conditions Any information or resource that is available to the student is considered an aiding condition. Some examples are listed below:

With the aid of references, draft a letter in accordance with the references.

Given tools and equipment, tune an engine in accordance with the references.

Given an observation scenario…, complete a "SALUTE" report in accordance with the references.

(2) Limiting Conditions Any information or resource that is not available to the student is considered a limiting condition. Some examples are listed below:

Without the aid of references, perform immediate action in accordance with the references.

While blindfolded, assemble an M16A2 rifle in accordance with the references.
b. Environmental Conditions Environmental conditions describe the
environment in which the student will be performing the behavior. These
conditions can be physical or social.

(1) Physical Physical conditions deal with the time of day, weather, location, and facilities. A few examples are listed below.


During the hours of darkness, navigate from point A to point B in accordance with the references.

Drive a HMMWV in mountainous terrain per the reference.

Given a mess galley, bake a cake according to the recipe.

(2) Social Most learning objectives address the student as an individual, but they may also identify the student as a member of a team. For example, "as a member of a machine gun team..." This is an important aspect of the social environment, since the person performing the behavior could be affected by what the other team members do or fail to do.

c. Implied Conditions Quite often the verb or object in a learning objective will have an implied condition in it. The learning objective, "Without references, drive an LAV over rough, hilly terrain in accordance with the Rough Terrain Checklist," has an implied condition. It implies that the driver will have an LAV, and anything else required to operate it over rough, hilly terrain. For tasks that require the Marine to be equipped with individual equipment, all efforts need to be made to simplify the condition statement with regard to these items. Instead of listing each piece of gear that the Marine would wear, a generic statement such as "while wearing a combat load" needs to be used. Clarification of those components that make up a combat load is provided during the lesson or in a reference.

3. Standard The standard describes the level of proficiency to which the behavior must be performed. Standards state the quantity and/or quality of acceptable behavior. There are four criteria for a good standard: completeness, accuracy, time, and realism.

a. Completeness A standard specifies the level of task completion that


indicates acceptable performance of the task behavior. For instance, a standard
may specify the precise nature of the output, the number of features that the
output must contain, the number of steps, points, pieces, etc., that must be
covered or produced, or any quantitative statement that indicates an acceptable
portion of the total.

For example:

List, in writing, 3 of the 5 performance steps per the instructions.

Orally state in sequence the 11 general orders in accordance with the reference.

Provided tools and equipment, tune up an engine so that the engine idles at the proper RPM per the reference.


b. Accuracy A standard indicates what is considered an accurate


performance of a task behavior. Standards specify how well the behavior must be
performed and are normally contained in references such as Marine Corps Orders,
Technical Manuals, and Field Manuals. Only those references that describe in
detail an acceptable standard of performance may be cited. If parts of the
standard are contained in more than one reference, all references must be cited.

For example:

Given a formula, solve a mathematical problem to within two decimal places.

Provided a range and ammunition, fire the rifle (annual rifle requalification) with a minimum score of 25.

Given an information worksheet, complete form DD 295 in accordance with the reference.

c. Time If the task is time critical, then the minimum time requirement must
be specified in terms of days, hours, minutes, or seconds.

For example:

Given the alarm for "gas," don the field protective mask within 9 seconds.

Provided a rough draft, type a letter at a minimum speed of 40 words per minute.

Given a mission, generate a 5-paragraph order per the reference in less than 2 hours.

d. Realistic The standard must be realistic in order to expect the student to perform the behavior based on the instruction provided. A standard is deemed realistic when the time, accuracy, and completeness criteria allow for successful completion.
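Since every learning objective combines the three components above (condition, behavior, standard) into one complete sentence, the assembly can be sketched as simple string composition. The helper below is hypothetical, purely to illustrate the structure; it is not an MCAIMS function, and real objectives are written and reviewed by curriculum developers rather than generated.

```python
def compose_objective(condition, behavior, standard):
    # Hypothetical illustration of the condition-behavior-standard structure;
    # not an MCAIMS function. Produces a single complete sentence, as
    # required for TLOs and ELOs.
    return f"{condition}, {behavior} {standard}."

# Example using the manual's own M16A2 objective:
tlo = compose_objective(
    "Given an M16A2 service rifle and with the aid of reference",
    "disassemble the M16A2 service rifle",
    "in accordance with the reference",
)
```

Keeping the three components as separate fields mirrors the Learning Objective Worksheet, where each is recorded and reviewed individually before the full sentence is written.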

2204. RECORD LEARNING OBJECTIVES

Before writing a learning objective, the curriculum developer must understand that the Learning Objective Worksheet (LOW) is produced as documentation for the Master Lesson File (MLF). All learning objectives must be recorded in MCAIMS for the production of LOWs, inclusion in the MLF, and assignment to concept cards. The LOW is a required element of the MLF. Refer to Chapter 3, Section 3800 for more information on the required MLF components. The LOW contains the TLO behavior and, if necessary, a rationale for downgrading. The LOW also contains test/evaluation items for learning objectives and selected methods and media. See Appendix A for the LOW and Appendix C for the LOW Checklist. Test items, methods, and media are covered later in this chapter. To ensure that the curriculum produced during the Develop Phase matches the task in the ITS/T&R, the learning objectives must be copied verbatim to the lesson plan and student materials.


2205. WRITING TERMINAL LEARNING OBJECTIVES

The ITS/T&R describes the real-world task the Marine performs in the Operating Forces. The TLO behavior describes the task the student performs in the instructional setting. The TLO must be directly related to the ITS/T&R because instruction must prepare the student for the job. For each task in the ITS/T&R designated to be taught at the formal school/detachment, one TLO is required. The school must attempt to replicate as closely as possible the task behavior, condition, and standard of the ITS/T&R in the TLO.

1. Record Behavior from ITS Order/T&R Manual When writing a TLO, the first step is to copy the behavior directly from the ITS order/T&R manual. The paper-based Learning Objective Worksheet (LOW) found in Appendix A can be used to record the learning objective.

2. Record Condition and Standard from ITS Order/T&R Manual The next step in writing a TLO is to copy the condition and/or standard from the ITS order/T&R manual. Upon writing the condition and/or standard, it may be determined that minor modifications are needed to reflect the actual conditions of the instructional setting and/or the standard by which mastery is measured. When writing ELOs, modifications to the condition and/or standard may be needed (i.e., the TLO is a performance-based objective, but a knowledge-based ELO is needed in order to master the TLO). In that case, the modification must also be reflected in the TLO (i.e., the TLO's condition and standard must reflect every element in the subordinate ELOs).

3. Compare Formal School/Detachment Resources Against the Task List Ideally, the formal school/detachment delivers instruction that duplicates the real-world task behavior. However, this is not always possible due to numerous limiting factors called resources (refer to the list of resources below). The ultimate goal is to teach students the exact behaviors identified by the ITS/T&R (when those behaviors are performed in the instructional setting, they replicate verbatim what is defined in the ITS/T&R). If the school does not possess the resources to teach the task/event to standard, there are two possible courses of action. The first is to downgrade the behavior; the second is to request additional resources.

a. If the school is going to downgrade the behavior, then the TLO is modified to accommodate the constraints of the instructional setting. Once the behavior is changed, a downgrade justification must be provided in the Program of Instruction (POI). Modification of the behavior is done only as a last resort and with the approval of Training Command. This situation may require a modification to the T&R event/ITS task. The conditions and standards of the ITS/T&R may have to be modified to accurately reflect what the student is or is not provided and how his/her performance is measured within the instructional environment. Figure 2-6, TLO Construction Flowchart, provides a guide for this decision-making process. For more information on how to perform a downgrade justification in MCAIMS, see the MCAIMS User's Manual.

b. If the school is going to request additional resources, then justification must be forwarded to CG, Training Command via formal correspondence. The justification must include rationale for why the task/event must be taught to standard and not downgraded. It must also include a detailed list of all additional resources and an explanation of how they will be used.

Resources include but are not limited to:

a. Time
b. Manpower
c. Facilities
d. Equipment
e. Budget
f. Safety (The ability to perform the task in an instructional environment safely)

The availability of these will impact how instruction is designed throughout the course.

4. Determine Evaluation Methods How are students going to be evaluated during this training evolution? This determination is based on a combination of the formal school/detachment's resources and the task behavior statements. Ideally, the formal school/detachment is able to employ an evaluation method that matches the behavior identified in the task statement. If the task is performance-based (psychomotor), then performance evaluation of the process, the product, or both needs to occur. If the behavior is knowledge-based (cognitive), then the evaluation will be written or oral. Regardless of the method of evaluation selected, a provision for remediation of tasks/events not mastered by students must also be taken into account. Resources are allotted for review and retesting of these individuals but must also be accounted for within the Program of Instruction (POI) (refer to Chapter 3, Section 3201 on Exam Concept Cards). Since resources are often limited, performance-based evaluation and remediation are not always possible. However, every attempt must be made to secure required resources from TECOM to ensure that training and evaluation replicate actual job conditions and standards.

5. Complete TLO The final step is to write the complete TLO in the form of a complete sentence. For example: "Given an M16A2 service rifle and with the aid of reference, disassemble the M16A2 service rifle in accordance with the reference." Each TLO is individually numbered to ensure that each ITS/T&R is accounted for in the instruction. This number is important since it provides an audit trail that is used by the school and CG, Training Command to identify items in the Course Descriptive Data (CDD) and Program of Instruction (POI).

a. ITSs are numbered with the MOS/ITS number. For example: 03XX.01.01, MCCS.01.01 (Marine Corps Common Skills).

b. T&Rs are numbered by event: 57XX-OPS-1010.
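The two numbering schemes above can be distinguished mechanically. The patterns below are inferred solely from the examples given in this section (03XX.01.01, MCCS.01.01, 57XX-OPS-1010); they are illustrative assumptions, not an official numbering specification.

```python
import re

# Patterns inferred from the examples in this section; illustrative only,
# not an official numbering specification.
ITS_NUMBER = re.compile(r"^[A-Z0-9]{4}\.\d{2}\.\d{2}$")    # e.g. 03XX.01.01
TR_NUMBER = re.compile(r"^[A-Z0-9]{4}-[A-Z]{1,4}-\d{4}$")  # e.g. 57XX-OPS-1010

def numbering_scheme(objective_number):
    """Classify a learning-objective number as ITS-style, T&R-style, or unknown."""
    if ITS_NUMBER.match(objective_number):
        return "ITS"
    if TR_NUMBER.match(objective_number):
        return "T&R"
    return "unknown"
```

A check like this could help preserve the audit trail between TLO numbers and the CDD/POI, since a number that matches neither pattern is likely a transcription error.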


FIGURE 2-6
TLO Construction Flowchart

1. Record the ITS Task/T&R Event.

2. Is the Condition OK? If no, modify the Condition to reflect the instructional setting.

3. Is the Standard OK? If no, modify the Standard to reflect the instructional setting.

4. Is the Behavior OK? If no, call the Task Analyst and provide downgrade justification.

5. The combined elements form the Terminal Learning Objective.

All learning objectives must be recorded in MCAIMS for production of LOWs, inclusion in the MLF, and assignment to concept cards. See the MCAIMS User's Manual.


2206. WRITING ENABLING LEARNING OBJECTIVES

ELOs are subordinate to the TLO and are statements of behavior, condition, and standard written to achieve the TLO. ELOs are derived from the identified knowledge and skills needed by the students to perform the steps identified in the ITS/T&R task. Remember, the number of ELOs is not necessarily equal to the number of ITS/T&R performance steps. There may be more or fewer ELOs depending on how KSAs were grouped. ELOs are written to emphasize teaching points and to evaluate student understanding/performance.

ELOs are determined in the Learning Analysis when a list of required knowledge and skills is generated. One behavior is developed for each grouping of common, similar, or duplicative knowledge and skills that is assigned an "alpha" designator. This behavior is derived by keeping the evaluation method in mind and answering the following question: "What one behavior will the students perform that confirms they possess the grouped knowledge/skills?"

ELOs support the TLO; therefore, any addition to the condition and/or standard must also be added to the TLO. Since every task performed has a definitive beginning and end, all TLOs should have a minimum of two ELOs. ELOs provide the level of detail necessary to describe the knowledge and skills needed to master the task. TLOs will not be repeated as ELOs since this is contrary to logical learning analysis.

Below is a list of steps for writing ELOs. Figure 2-7, Enabling Learning Objective Construction Flowchart, provides a flowchart of the steps.

STEP 1

1. Document the Behavior When writing an ELO, the first step is to document the behavior that was identified by each grouping of KSAs. The behavior identified may need to be modified so that it follows the rules of a well-written behavior. Refer to Section 2203 for more information on writing the behavior.

STEP 2

2. Determine the Condition and Standard As a rule, if the ELO condition and/or standard differs from the TLO’s condition and/or standard, then that condition and/or standard needs to be added to the TLO. However, there may be extenuating circumstances when an ELO’s condition and/or standard may differ from that of the TLO. Justification for the difference must be noted on the Learning Objective Worksheet (LOW) found in the Master Lesson File (MLF).

STEP 3

3. Record the Completed Enabling Learning Objective The final step in writing the ELO is to record it. Like the TLO, the ELO must be written in the form of a complete sentence. For example: Given an M16A2 service rifle and with the aid of reference, disassemble the M16A2 service rifle in accordance with the reference.


Figure 2-7. Enabling Learning Objective Construction Flowchart

1. Grouping of KSAs (assigned an alpha designator). Refer to page 2-12.
2. Select a verb and write a draft behavior statement that reflects the grouped KSAs. Refer to page 2-12.
3. Apply the rules from page 2-14 to modify the draft behavior. Finalize the behavior statement of the ELO.
4. Write the Condition to reflect the Instructional Setting and ITS/T&R.
5. Write the Standard to reflect the Instructional Setting and ITS/T&R.
6. Combined Elements = Enabling Learning Objective.

All learning objectives must be recorded in MCAIMS for production of LOWs, inclusion in the MLF, and assignment to concept cards. See Domains of Learning (Section 6500) and the MCAIMS User’s Manual.

The verbs listed below are based upon the Levels of the Cognitive Domain. Refer to Chapter 6, Section 6500, for more information on the Domains of Learning.

Knowledge: arrange, define, list, memorize, name, organize, relate, recall, cite, find, group, label, match, locate, omit, pick, quote, repeat, identify, recite, tally, point to, underline

Comprehension (Translation): change, reword, convert, expand, transform, alter, vary, restate, qualify. (Interpretation): infer, explain, construe, outline, annotate, expound. (Extrapolation): project, propose, advance, contemplate, submit, illustrate.

Application: utilize, solve, adopt, employ, use, avail, capitalize on, consume, exploit, profit by, mobilize, operate, ply, handle, manipulate, exert, exercise, try, devote, hold, wield, put in action, put in use, make use of, take up, apply, calculate

Analysis: breakdown, uncover, look into, dissect, examine, divide, simplify, reason, include, deduce, check, audit, inspect, section, scrutinize, survey, search, screen

Synthesis: create, combine, build, compile, make, structure, reorder, reorganize, develop, produce, compose, construct, blend, cause, effect, generate, form

Evaluation: judge, decide, rate, prioritize, appraise, assess, rank, weigh, accept, reject, determine, referee, umpire, decree, rule on, award, criticize


2207. DEVELOP TEST ITEMS

The purpose of any test is to find out whether the objectives have been achieved. If a task is important enough to dedicate resources to teach, it is equally important to dedicate resources to evaluate it. Test items are designed to determine if the learner has acquired the KSAs to perform an objective or task. This promotes learner development by providing feedback to the student and enabling the student to demonstrate mastery. Evaluation is also critical to maintaining or improving the effectiveness of instruction.

STEP 1

1. Analyze the Learning Objective Learning objectives tell the student what he/she is expected to know or be able to perform following instruction. Test items are written to assess the student’s level of mastery of the learning objective. Prior to writing test items, the curriculum developer must analyze the behavior, condition, and standard.

a. Behavior The test item must be written to evaluate whether the student has acquired the knowledge and skills and/or developed the appropriate attitude required by the learning objective. The verb used in the behavior will require either knowledge or performance. The behavior tells the curriculum developer whether the test will be knowledge-based or performance-based. The only exception is when there is a downgrade justification. Refer to Section 2205 for more information on downgrade justification.

b. Condition The condition provides directions to the student on what will be available and under what conditions he/she will be tested. For example, “given a scenario” as a condition statement means to the test developer that a scenario will need to be a part of the test item.

c. Standard The standard establishes the criteria of how well the event is to
be performed. The standard, as it is expressed in the learning objective, may need
to be reiterated verbatim in the test item or in the test instructions.

Test items that are written to reflect the behavior, condition, and standard outlined
in the learning objective are called criterion-based test items. The test item is
written so that the student will perform the behavior stated in the learning objective,
under the conditions specified, and to the established standard. If the behavior is to
“Disassemble the M16A2 Service Rifle,” then the test item cannot require the student
to both disassemble (psychomotor) and to list sequentially the steps (cognitive) to
disassemble the M16A2 service rifle. To be consistent, the curriculum developer
must ensure that the test item replicates the behavior statement. In this M16A2
example, the proper test item would be disassemble (psychomotor). Remember, the
learning objective is a contract between the instructor and the student.
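The consistency rule above can be sketched as a simple automated check. This is only an illustrative sketch: the `replicates_behavior` function and the verb-to-domain table below are hypothetical assumptions, not part of the SAT Manual or MCAIMS.

```python
# Toy sketch of the criterion-based consistency rule: the test item should
# replicate the learning objective's behavior verb, not swap domains
# (e.g., psychomotor "disassemble" vs. cognitive "list").
# The verb-to-domain mapping is an illustrative assumption.
DOMAIN = {
    "disassemble": "psychomotor",  # performance verbs
    "operate": "psychomotor",
    "list": "cognitive",           # knowledge verbs
    "describe": "cognitive",
    "identify": "cognitive",
}

def replicates_behavior(lo_verb, test_item_verb):
    """True when the test item verb matches the LO verb, or at least its domain."""
    lo_domain = DOMAIN.get(lo_verb)
    item_domain = DOMAIN.get(test_item_verb)
    return lo_verb == test_item_verb or (
        lo_domain is not None and lo_domain == item_domain
    )
```

Under these assumptions, `replicates_behavior("disassemble", "list")` returns False, flagging exactly the M16A2 mismatch described above.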

2. Determine Type of Test Item Performance-based
(psychomotor/cognitive) and knowledge-based (cognitive) are the two types of
test items used to measure student mastery of learning objectives.

STEP 2

The Marine Corps strives for performance-based instruction and testing to increase the transfer of learning from the instructional environment to the job. For this reason, the TLO is derived from the ITS (the actual performance). The ELOs are derived from KSAs needed to support the TLO. A test item that requires the student to perform a task (or part of a task) that is performed on the job (whether the performance is filling out forms, writing operations orders, or operating a radio) is considered a performance-based test. In some circumstances, a performance test may be a written test designed as a job sample for personnel whose responsibilities involve administrative duties. For example, completing a DD Form 1057 is a valid performance test for a student who must prepare one on the job. A performance test duplicates the job behavior(s) by using the same equipment, resources, setting, or circumstances that the student will encounter on the job.

a. Performance Test Items Performance test items are used to measure the knowledge of a subject as well as the ability to perform the skills. Knowledge at each of the learning levels (e.g., facts, rules, procedures, discriminations, and problem solving) may be required to successfully perform the skill. Refer to Chapter 6, Adult Learning, for more information on learning levels.

A performance test item can evaluate a process, a product, or both. The type of
test item that evaluates a process is valuable for tasks where, if the process is
not fully evaluated, much could be lost in the evaluation of the final product. For
instance, if a student makes a mistake in the process, but the end result is
correct, evaluators using this method are aware that a mistake was made. A
performance examination that evaluates a product must use specific criteria to
measure how well the student meets the desired outcome/objective. This type of
test item is useful for evaluating tasks that can be performed in a number of
different ways and still achieve the desired outcome. It is possible to have a test
item that evaluates both the process and product.

FOCUS OF ASSESSMENT

Assessing the Process
 There is no product, or product evaluation is infeasible (e.g., unavailable or too costly).
 The procedure is orderly and directly observable.
 Correct procedure is crucial to later success.
 Analysis of procedural steps can aid in improving a product.

Assessing the Product
 Different procedures can result in an equally good product (e.g., writing a theme).
 The procedure is not available for observation (e.g., take-home work).
 The procedural steps have been mastered.
 The product has qualities that can be clearly identified and judged.

Assessment of Student Achievement. By Norman E. Gronlund. pp. 142-143.


PERFORMANCE-BASED TEST ITEMS

Advantages
1. Can evaluate complex learning outcomes and skills that cannot be evaluated with traditional paper-and-pencil tests.
2. Provides a more natural, direct, and complete evaluation of some types
of reasoning, oral, and physical skills.
3. Provides greater motivation for students by clarifying goals and making
learning more meaningful.
4. Encourages the application of learning to "real life" situations.
Limitations
1. Requires considerable time and effort to use.
2. Judgment and scoring performance can be subjective and burdensome,
if the evaluator is not knowledgeable in the assessment of the student’s
performance.
3. Evaluation must frequently be done individually, rather than in groups. If evaluation is done in groups, individual task mastery must be carefully tracked so that performers are not penalized for non-performers.

Assessment of Student Achievement. By Norman E. Gronlund. p. 137.

b. Knowledge (Cognitive) Test Items Time, cost, safety, and resource constraints do not always permit performance-based instruction and evaluation. If learning objective behaviors must be adapted and cannot duplicate the behavior, conditions, and standards of the job, the test item still must mirror the learning objective. Once the actual behavior is adapted, a knowledge-based learning objective and written test item are developed. Written test items can still provide realistic scenarios and circumstances, but must measure the stated learning objective. For example, if resource constraints prevent the formal school/detachment from having the students "climb a mountain,” an adapted learning objective and corresponding written test item would be to "describe the steps to climb a mountain.” Some new information must simply be measured through cognitive evaluation.

Figure 2-8 lists the types of knowledge test items:

 True/False
 Multiple Choice
 Matching
 Listing
 Fill-in-the-Blank
 Short Answer
 Labeling
 Essay

Figure 2-8. Knowledge Test Items

The following paragraphs describe and outline the advantages and disadvantages of each.


c. True/False Test Items This type of test item is rarely effective for testing higher-level cognitive skills; it deals mostly with simple factual information and recall. True/False test items are the least preferred method of testing in the Marine Corps. Alone, this test item should not be used for evaluation, because a true/false item always carries a fifty percent chance of being guessed correctly and is therefore less reliable than other test items. It would not be a good idea to send a graduate from the school out on the job based on evaluations supported solely by true/false test items; the students could have guessed their way to graduation. Another drawback is that true/false items are extremely difficult to write correctly, and most are poorly written. However, when used in conjunction with a short answer test item requiring the student to justify responses, true/false items can help solidify the student's comprehension of the topic/task.

TRUE/FALSE ITEMS


Advantages
1. The item is useful for outcomes where there are only two possible
alternatives (e.g., fact or opinion, valid or invalid).
2. Less demand is placed on reading ability than in multiple-choice items.
3. A relatively large number of items can be answered in a typical testing
period.
4. Complex outcomes can be measured when used with interpretive
exercises.
5. Scoring is easy, objective, and reliable.
Limitations
1. It is difficult to write items beyond the knowledge level that are free
from ambiguity.
2. Making an item false provides no evidence that the student knows
what is correct.
3. No diagnostic information is provided by the incorrect answers.
4. Scores are more influenced by guessing than with any other item
type.

Assessment of Student Achievement. By Norman E. Gronlund. p. 79.
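The fifty-percent guessing risk noted above can be quantified with a simple binomial model. The sketch below is illustrative only: the function name and the 80 percent passing score are assumed values for demonstration, not a SAT Manual standard.

```python
from math import ceil, comb

def p_pass_by_guessing(n_items, passing_fraction=0.8, p_guess=0.5):
    """Probability of reaching the passing score by blind guessing alone.

    p_guess is 0.5 for true/false items and 0.25 for four-option
    multiple choice; the binomial model assumes independent items.
    """
    need = ceil(passing_fraction * n_items)  # correct answers needed to pass
    return sum(
        comb(n_items, k) * p_guess**k * (1 - p_guess) ** (n_items - k)
        for k in range(need, n_items + 1)
    )
```

Under these assumptions, a pure guesser still passes a 10-item all-true/false test about 5.5 percent of the time (`p_pass_by_guessing(10)` is 56/1024), while with four-option multiple choice items (`p_guess=0.25`) the chance drops to roughly 0.04 percent, which illustrates why true/false items alone are considered unreliable.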

d. Multiple-Choice Test Item This type of test item is versatile and flexible.
It is also the most common, and probably the most abused, of all test items. This
item can measure a wide range of cognitive abilities ranging from simple recall of
information to understanding of complex concepts. It is a quick and easy item to
score whether using computerized grading or a paper-based answer key. This is one
of the primary reasons this type of test item is seen so much in formal schools that
process large groups of students. It is time efficient as well as fairly simple to
construct if a few rules are followed.

MULTIPLE CHOICE ITEMS


Advantages
1. Learning outcomes from simple to complex can be measured.
2. Highly structured and clear tasks are provided.
3. A broad sample of achievement can be measured.
4. Incorrect alternatives provide diagnostic information.
5. Scores are less influenced by guessing than true-false items.
6. Scoring is easy, objective, and reliable.
Limitations
1. Constructing good items is time consuming.
2. It is frequently difficult to find plausible distracters.
3. This item is ineffective for measuring some types of problem solving
and the ability to organize and express ideas.
4. Reading ability can influence score.

Assessment of Student Achievement. By Norman E. Gronlund. p. 60.


e. Matching Test Item A matching test item is used to measure a student’s ability to recognize facts and discriminate among related or similar items. Matching test items normally use two columns of related items, and students are required to match a series of items listed in one column with related items in the other column. This provides a way to test various knowledge factors simultaneously. Always have more responses than premises; this keeps the student from ascertaining correct responses by process of elimination.

MATCHING ITEMS
Advantages
1. A compact and efficient form is provided where the same set of responses fits a series of item stems (i.e., premises).
2. Reading and response time is short.
3. This item type is easily constructed if converted from multiple-choice
items having a common set of alternatives.
4. Scoring is easy, objective, and reliable.
Limitations
1. This item type is largely restricted to simple knowledge outcomes based
on association.
2. It is difficult to construct items that contain a sufficient number of
responses that are of similar kind or nature.
3. Susceptibility to irrelevant clues is greater than in other item types.

Assessment of Student Achievement. By Norman E. Gronlund. p. 85.

f. Listing Test Item A listing test item measures the student's knowledge of information presented during instruction. This item requires the student to list a specified number of items in response to a question. Listing test items should not be used if the student’s grammar skills are not at the appropriate level (refer to the TPD).

LISTING ITEMS
Advantages
1. Easy to write.
2. Guessing is less likely than in selection-type items.
3. Preparation time is less than that for selection-type items.

Limitations
1. It is difficult to phrase statements so that only one answer is correct.
2. Spelling ability contaminates scoring.
3. Scoring is tedious and time-consuming.


g. Fill-in-the-Blank Test Items This type of item tests the student’s knowledge and/or comprehension of information presented during instruction. A fill-in-the-blank test item requires the student to write a short answer in the blanks provided within the statement/question. The maximum number of blanks should be limited to two within a question or statement. Fill-in-the-blank test items are written as statements and do not require an action verb. Fill-in-the-blank test items do not test the student’s ability to organize thoughts and ideas, and they are not useful for problem solving.

h. Short Answer Test Items A short answer test item is used to evaluate the student when recall is important. “Short answer” refers to a one-word, number, or very short phrase response, which the student creates. Short answer test items are useful because the student has no list to select from and nothing to jog his or her memory. This type of item is unsuitable for complex learning.

FILL IN THE BLANK/SHORT ANSWER ITEMS


Advantages
1. Easy to write.
2. Guessing is less likely than in selection-type items.
3. Well suited to computational problems and other learning
outcomes where supplying the answer is important.
4. A broad range of knowledge outcomes can be measured.
Limitations
1. It is difficult to phrase statements so that only one answer is
correct.
2. Scoring is contaminated by student’s spelling ability.
3. Scoring is tedious and time-consuming.
4. Not very adaptable to measuring complex learning outcomes.


i. Labeling Test Items Labeling or identification test items are used to measure a student’s ability to recall facts and label parts in pictures, schematics, diagrams, or drawings. This form of test is most often used to measure recognition of equipment components or other concrete objects. It has wide application when teaching complex processes, especially via Interactive Multimedia Instruction (IMI).

LABELING ITEMS
Advantages
1. Tests student’s visual recognition of equipment components or other
concrete objects.
2. Guessing is unlikely.
3. Scoring is easy.

Limitations
1. Must have a good diagram, sketch or illustration to be effective.
2. Scoring is contaminated by student’s spelling ability.

j. Essay Test Items The essay test item is fairly simple for the instructor to produce but requires complex thought from the student. It differs from the test items covered so far in that it generally requires the student to communicate the response to the evaluator in his or her own words. The nature of the test item makes it one of the most difficult for a student to complete and, by far, the most difficult to evaluate. The evaluator is often required to make a subjective assessment of whether the student has communicated the correct response. It is critical that the student clearly understand the requirements of the learning objective, and that the instructor replicate the learning objective in the essay test item. Essay test items are usually used for learning objectives that are not readily measurable, such as mental skills like judging, problem solving, evaluating, and analyzing, to name just a few.

ESSAY ITEMS
Advantages
1. The highest level learning outcomes (analysis, synthesis, evaluation)
can be measured.
2. Preparation time is less than that for selection-type items.
3. The integration and application of ideas is emphasized.
Limitations
1. Each question is time intensive for measuring each learning objective.
2. It is difficult to relate scores to intended learning outcomes because of
the student's freedom to select, organize, and express ideas.
3. Scores are raised by writing skill and bluffing and lowered by poor
handwriting, misspelling, and grammatical errors.
4. Scoring is time consuming, subjective, and tends to be unreliable.

Assessment of Student Achievement. By Norman E. Gronlund. p. 103.


STEP 3

3. Write Test Items Once the decision has been made on the type of test most appropriate for an objective, the curriculum developer must write the test item(s). During this step, the curriculum developer writes test items to be recorded on the LOW. Grading criteria and the construction of the test occur in the Develop Phase. Refer to Section 3500 for information on Constructing Tests. Each type of test item has a different set of guidelines to follow. Following these guidelines will assist the curriculum developer in writing valid test items.

a. Writing Performance-Based Test Items This involves stating the performance objective, creating the checklist (if applicable), instructions to the evaluator, and instructions to the student.

When developing performance test items, use the following steps:

1. State the performance objective as a brief description of what the student must accomplish for successful completion of the performance test.
2. List steps/activities/behaviors (process) or characteristics (product).
3. Note common errors that are made when using the checklist.
4. Arrange the activities or steps and characteristics in correct order.
5. Review the checklist for accuracy and completeness.

For an example of a performance-based test item, see Figure 2-9.

1) Checklist Performance test items, which require the student to perform a task, usually take the format of a checklist. The checklist is developed to correspond to the steps or activities of the task being performed and the underlying knowledge and skill elements. Checklists need to be detailed; this helps identify precisely what occurred during performance. The checklist should identify elements that have been taught and measure the behavior. Ensure that all the criteria are included so that the evaluator will be able to tell how well the student meets the objective. A checklist can be either a YES/NO (Mastery/Non-mastery) checklist or a scaled-credit checklist with points for each specific action that the student performs. The formal school/detachment will identify which of these will be used in the overall evaluation of the student (see Scoring and Grading in Test Construction, Section 3504). Additionally, a determination must be made of whether the student should have the checklist when being evaluated. If the checklist will be used out on the job, then the student should be allowed to use the checklist during the evaluation.
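The two checklist scoring schemes can be sketched as small functions. This is an illustrative sketch only: the function names are invented, and the step names and point values are loosely borrowed from the M16A2 example, not prescribed weights.

```python
# Sketch of the two checklist scoring schemes: YES/NO (Mastery/Non-mastery)
# versus scaled credit. Step names and point values are illustrative
# assumptions, not SAT Manual weights.

def mastery_score(results):
    """YES/NO checklist: mastery only if every step is performed (all YES)."""
    return all(results.values())

def scaled_score(results, points):
    """Scaled-credit checklist: award the assigned points for each step performed."""
    return sum(points[step] for step, done in results.items() if done)

results = {
    "Cleared the rifle": True,
    "Removed the sling": True,
    "Removed the hand guards": False,
}
points = {
    "Cleared the rifle": 5,
    "Removed the sling": 2,
    "Removed the hand guards": 3,
}
```

Here `mastery_score(results)` is False (one step was missed) while `scaled_score(results, points)` still credits 7 of 10 points, which illustrates why the formal school/detachment must decide up front which scheme governs the overall evaluation.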


2) Process Checklist When a performance test requires the steps or activities to be rated, a process checklist is used. The process checklist should contain all of the essential steps or activities required for successful performance. Process checklist construction guidelines are as follows:

a) Use when the performance of steps or activities of a task is to be evaluated.
b) The steps or activities must be observable.
c) Define all of the steps or activities of the task being performed.
d) Sequence steps or activities in order of performance.
e) Provide space for “checking” the performance of each step or activity.
f) Provide space for recording and describing errors.

3) Product Checklist When a performance test item requires the product of a process or task to be evaluated, it is beneficial to use a product checklist. The product checklist should identify criteria or characteristics of product acceptability.

a) Use a checklist when the LO requires the student to produce a product.
b) Use a checklist when the product can be readily evaluated.
c) Use a checklist when there are no fixed or set procedures.
d) Identify the characteristics of the product.
e) Provide space on the checklist for product rating.
f) Provide space on the checklist for comments about the product.

4) Instructions to the Evaluator

a) The instructions specify all the information required by the evaluator, including planning and set-up of the exam, ensuring required student materials are at hand, and matching the conditions stated in the learning objective to perform the behavior.
b) The instructions cover what the evaluator needs to evaluate the student, such as checklists, tools, etc.
c) The instructions additionally state any start/stop signals, safety considerations, and time limits that the instructor should emphasize to the student. Administrative information, such as disposition of the completed evaluation, needs to appear in the instructions, if necessary.
d) The instructions must be detailed enough to cover everything the evaluator needs to know or do to make the evaluation happen.

5) Instructions to the Student Instructions include student directions, specifically any start/stop directions, any safety considerations, time limits, and how the performance will be evaluated. The instructions to the student must be clear to ensure that every student is evaluated on the ability to perform the behavior stated in the learning objective.


Figure 2-9. Sample Performance Checklist

Learning Objective: Without reference, given an M16A2 rifle, disassemble the rifle in 30 seconds in accordance with the procedures listed on pages 2-29 to 2-32 of FMFM 0-8.

Test Item:

1. Instructions to the Evaluator: Ensure you have an adequate training facility to conduct testing. Also, ensure the student has an M16A2 rifle. Inform the students that they have 30 seconds to disassemble the rifle. Inform students that if the time limit is not adhered to or a step is missed, the student will be given remedial training and retested. If the student fails a second time, he/she will be recommended for an academic review board. Ask the students if they have any questions. Tell the students to begin, and evaluate each student using the checklist provided. Once the test is completed, let the students know if they passed, send them to their next test station (if applicable), and turn the completed checklists in to the Academics Chief.

2. Instructions to the Student: When the instructor says begin, disassemble the rifle. You have 30 seconds. You will be evaluated using a performance checklist detailing the disassembly procedures of the M16A2 rifle in accordance with FMFM 0-8. If you fail to complete this task in the time given, you will receive remedial training. If you miss a step in the process, you will receive remedial training. After completion of remedial training, you will be retested. If you fail the second attempt, you will be recommended for an academic review board. Do you have any questions? You may begin.

3. Performance Checklist                                    YES  NO

   a. Cleared the rifle.                                    ___  ___
   b. Removed the sling.                                    ___  ___
   c. Removed the hand guards.                              ___  ___
   d. Separated rifle into two main groups.                 ___  ___
   e. Removed the charging handle.                          ___  ___
   f. Disassembled bolt carrier group.                      ___  ___
      1) Removed firing pin retaining pin.                  ___  ___
      2) Removed the firing pin.                            ___  ___
      3) Removed cam pin.                                   ___  ___
   g. Disassembled the weapon in 30 seconds or less.        ___  ___

b. Writing Knowledge-Based Test Items

1) True/False Test Items. True/False items are comprised of statements rather than questions. The item must be directly related to a learning objective. True/False items are designed to test knowledge, which means that they should be related to “knowledge” (Know-How-To or Know) from the learning analysis. Guidelines for writing true/false test items are as follows:

a) Include only one idea in each statement.
b) Place the crucial element at or near the end of the statement.
c) Avoid using negatives such as “no” or “not”; they tend to confuse students.
d) Do not use absolutes such as “all,” “every,” “none,” and “never.”
e) Do not use statements containing “some,” “any,” and “generally.”

Below is a checklist that can be used to evaluate true/false test items.

TRUE/FALSE ITEMS CHECKLIST


YES NO
1. Is this type of item appropriate for measuring the
learning objective?
2. Does each statement contain one central idea?

3. Can each statement be undisputedly judged true or


false?
4. Are the statements brief and stated in simple, clear
language?
5. Are negative statements used sparingly and double
negatives avoided?
6. Are statements of opinion attributed to some
source?
7. Is there approximately an even number of true and
false statements?
8. When arranged in the test, are the true and false
items put in random order?

Assessment of Student Achievement. By Norman E. Gronlund. p. 85.

Learning Objective:

Without the aid of reference, given an M16A2 service rifle, associated equipment, and ammunition, identify “make safe” procedures in accordance with FMFM 0-8.

Test Item: When given the command to “make safe,” the shooter will     True   False
place the M16A2 service rifle in Condition 3.                          ____   ____

Figure 2-10. Sample True/False Test Item



Components of Multiple Choice Test Items. Before getting into the rules for writing this type of test item, the various components of a multiple-choice test item need to be discussed. There are three basic components: the stem, the responses, and the distracters.

1. Stem. The stem is a statement of the problem and should be worded in simple and understandable terms. Wording should be appropriate to the subject matter and to the group being tested. The solution to the problem should not depend upon the student's ability to translate complex sentence structure contained in the stem. Basically, there are two types of stems: the incomplete statement or the complete statement (usually in the form of a question).

Incomplete Stem: _________________ is the first step in disassembling the M16A2.
Complete Stem: When disassembling the M16A2, what is the first step?

2. Responses. Apart from the stem, the test item also consists of several possible answers or responses, only one of which is to be accepted as the correct response. There are only two types of responses: the correct response and alternative responses. Alternative responses are also known as distracters.

3. Distracters. Distracters are incorrect alternative responses to the question; however, all distracters are worded to be believable. Distracters can best be composed from commonly mistaken ideas and common misconceptions concerning the subject matter. Care should be taken in forming the distracters. Distracters should not be designed to deceive students; rather, they are designed so that a student who knows the material will clearly recognize that each distracter is incorrect and will be able to select the correct response. Do not fall into the trap of presenting the student with a choice between several "correct" responses.

Example (continuing the M16A2 stem above):
a. Take out the buffer. (distracter)
b. Take off the hand guards. (distracter)
c. Clear the weapon. (correct response)
d. Take off the lower receiver. (distracter)


2) Writing Multiple Choice Test Items. Multiple choice test
items are used to test facts and application of rules and
procedures. They may also be used to test discriminations and
problem solving. Guidelines for writing multiple choice test
items are as follows:

a) Do not use the articles “a” and “an” at the end of the stem;
this tends to indicate the correct answer.
b) All responses should follow grammatically from the stem.
c) All responses should be of approximately the same length.
d) All responses should have a similar grammatical structure.
e) All responses should use similar terminology.
f) Provide as many responses as necessary but normally four.
g) Position the correct response randomly throughout the test.
h) Ensure that there is only one correct answer.
i) Distracters should be plausible (believable) but incorrect.
j) Logically order all responses. Examples are smallest to largest,
chronological order, or whatever makes sense.
k) Underline or CAPITALIZE all negatives and "in sequence"
words. It is best if negatives are not used in the stem.
l) Ensure that all items are independent from other items. No
hints at the answer to other test items should be in any item.
m) Avoid "all of the above,” "none of the above,” or "A and B
only" in responses. This kind of response reduces the validity
and reliability of test items.
n) Avoid the use of absolutes such as "never" or "always" since
they tend to assess the student's attention to detail rather
than the subject.
o) Never use double negatives or double-talk, such as “What
response is never true?”

Learning Objective: Without the aid of reference, select in sequence the basic steps for
performing preventive maintenance on the M16A2 service rifle in accordance with FMFM 0-8.

Test Item: IN SEQUENCE, select the basic steps for performing preventive maintenance on
the M16A2 service rifle.

a. Disassemble, clean, lubricate, inspect, reassemble, clear, perform functions check.
b. Clear, disassemble, clean, inspect, lubricate, reassemble, perform functions check.
c. Disassemble, clean, reassemble, lubricate, perform function check, clear, inspect.
d. Clear, disassemble, clean, inspect, reassemble, lubricate, perform functions check.

Figure 2-11. Sample Multiple Choice Test Item


MULTIPLE-CHOICE ITEMS CHECKLIST


YES NO
1. Is this type of item appropriate for measuring the intended
learning outcome?
2. Does the item task match the learning task to be
measured?
3. Does the stem of the item present a single, clearly
formulated problem?
4. Is the stem stated in simple, clear language?
5. Is the stem worded so that there is no repetition of
material in the alternatives?
6. Is the stem stated in positive form wherever possible?
7. If negative wording is used in the stem, is it emphasized
(by underlining or caps)?
8. Is the intended answer correct or clearly best?
9. Are all alternatives grammatically consistent with the stem
and parallel in form?
10. Are the alternatives free from verbal clues to the correct
answer?
11. Are the distracters believable (plausible) and attractive to
the uninformed?
12. To eliminate length as a clue, is the relative length of the
correct answer similar to that of the distracters?
13. Has the alternative "all of the above" been avoided and
"none of the above" used only when appropriate?
14. Is the position of the correct answer varied so that there is
no detectable pattern?
15. Does the item format and grammar usage provide for
efficient test taking?

Assessment of Student Achievement. By Norman E. Gronlund. p. 75.


3) Writing Matching Test Items. A matching test item contains a list
of premises (items that require responses), a list of responses (possible
answers), and a brief explanation of how the premises and responses are
related. Guidelines for writing matching items are as follows:

a) Provide clear, concise directions on how to match the items in the
two columns.
b) Indicate whether the responses may be used more than once.
c) Limit the choices in a test item to a single subject category.
d) Arrange the responses in a logical order.
e) The responses and premises should have parallel grammatical
construction.
f) Keep each list of premises and responses as brief as possible. It is
recommended to have no more than ten items.
g) Always have more responses than premises.
h) The entire matching test item should be kept on the same page.

Learning Objective: Without the aid of reference, identify the individual components of
the three main groups in an M16A2 service rifle per TM 05538C-10/1.

Test Item: Column A contains the three main groups of the M16A2 service rifle and
column B contains a list of individual rifle components. Match each set of components
to its main group.

A B

___ Upper receiver a. Hand guards, rear sight, ejection port.

___ Lower receiver b. Bolt, firing pin, cam pin.

___ Bolt carrier group c. Slide, half cock notch, ejector.

d. Selector switch, magazine release, trigger.

Figure 2-12. Sample Matching Test Item


MATCHING ITEMS CHECKLIST


YES NO
1. Is this type of item appropriate for measuring the
intended learning outcome?
2. Does the item task match the learning task to be
measured?
3. Does each matching item contain only similar
material?
4. Are the lists of items short with the brief responses on
the right?
5. Is an uneven match provided by making the list of
responses longer or shorter than the list of premises?
6. Are the responses in alphabetical or numerical order?
7. Do the directions clearly state the basis for matching
and that each response can be used once, more than
once, or not at all?
8. Does the complete matching item appear on the same
page?
Assessment of Student Achievement. By Norman E. Gronlund. p. 87.

4) Writing Listing Test Items. A listing test item requires the
student to list a specified number of items in response to a question.
For example, a student may be asked to list the seven basic steps
for performing preventive maintenance on the M16A2. Listed
below are a few guidelines to keep in mind when writing a listing
test item.

a) The student should always be told the number of items to be
listed.
b) A listing test item can cover a complete procedure, such as the
steps in the process of disassembling the M16A2.
c) If the sequence of the process is important for the student to
know, then "in sequence" should be highlighted or printed in bold
text. For instance, if a Marine were being tested on failure to fire
procedures before going to the rifle range, then "in sequence"
would be very important.
d) Provide blanks of the same length, long enough for the student's
answers.

Learning Objective: Without the aid of reference, list in sequence the five
phases of the SAT process in accordance with the SAT Manual.

Test Item: List IN SEQUENCE the five phases of the SAT process.
________________
________________
________________
________________
________________

Figure 2-13. Sample Listing Test Item


5) Writing Fill in the Blank Test Items. A fill in the blank test item
requires the student to recall facts and supply one or more key
words that have been omitted from a statement. When placed in
the appropriate blanks, the word(s) make the statement complete,
meaningful, and true. Listed below are a few guidelines to keep in
mind when writing a fill in the blank test item.

a) Leave blanks for key words only.
b) Keep items brief.
c) Make all blanks approximately the same size.
d) Grammatical cues to the correct answer, such as the articles
"a" and "an" just before the blank, should be avoided.
e) Ensure that only one correct answer is possible for each blank.
f) Ensure that the sentence has enough context to cue the correct
response.

Learning Objective: Without the aid of reference, describe in
writing the performance characteristics of the M16A2 service rifle in
accordance with TM 05538C-10/1.

Test Item: The maximum effective range of the M16A2 service
rifle is _____ meters at individual/point targets and _____ meters at
area targets.

Figure 2-14. Sample Fill In The Blank Test Item

6) Short Answer Test Items. Listed below are a few guidelines to
keep in mind when writing a short answer test item.

a) Phrase the item so that the required response is concise.
b) The item may be written as either a question or a statement.
c) Provide space for the student to answer.
d) Provide the same amount of space for each answer.

Learning Objective: Without the aid of reference, describe in writing
the performance characteristics of the M16A2 service rifle in accordance
with TM 05538C-10/1.

Test Item: State the cyclic rate of fire for the M16A2 service rifle.

_______________________

Figure 2-15. Sample Short Answer Test Item


SHORT ANSWER/FILL IN THE BLANK ITEMS CHECKLIST


YES NO
1. Is this type of item appropriate for measuring the intended
learning outcome?
2. Does the item task match the learning task to be measured?
3. Does the item call for a single, brief answer?
4. Has the item been written as a direct question or a well-
stated incomplete sentence?
5. Does the desired response relate to the main point of the
item?
6. Have clues to the answer been avoided (e.g., "a" or "an,"
length of the blanks)?
7. Are the units and degree of precision indicated for numerical
answers?

Assessment of Student Achievement. By Norman E. Gronlund. p. 99.

7) Labeling Test Items. Listed below are a few guidelines to keep in mind
when writing a labeling test item.

a) Make all sketches, drawings, or illustrations clear and of sufficient size.
If possible, use the actual parts of a unit.
b) Provide sufficient information to indicate what the equipment is and
which part is to be labeled.
c) The parts to be labeled or identified should be clearly pointed out by
using lines or arrows.
d) Ensure that only one definite answer is possible.


8) Essay Test Items. An essay test item requires a more or less
extensive discussion by the student. It should be used when students
are expected to recall facts; apply rules and procedures; think
reflectively or creatively; organize knowledge in the solution of a
problem; and express their solution in writing. Listed below are a
few guidelines to keep in mind when writing an essay test item.

a) State the essay test item clearly so the student knows exactly
what type of discussion is expected.
b) The essay test item should ask for comparisons, decisions,
solutions, cause-effect relationships, explanations, or summary.
c) When possible, use more essay test items and limit the discussion
of each.
d) Set limits on essay test items such as time or number of words.

Learning Objective: Without the aid of reference, based upon the
nature and theory of war, evaluate the Normandy Campaign.

Test Item: Within a one-hour time limit, compare and contrast the
theories of war applied by the Axis and Allied forces in the invasion of
Normandy.

Figure 2-16. Sample Essay Test Item

ESSAY ITEMS CHECKLIST

                                                              YES NO
1. Is this type of item appropriate for measuring the
intended learning outcome?
2. Does the item task match the learning task to be
measured?
3. Is the question designed to measure complex learning
outcomes?
4. Does the question make clear what is being measured?
5. Has terminology been used that clarifies and limits the
task (e.g., "describe," not "discuss")?
6. Are all students required to answer the same questions?
7. Has an ample time limit been indicated for each
question?
8. Have adequate provisions been made for scoring answers
(e.g., model answers or criteria for evaluating)?

Assessment of Student Achievement. By Norman E. Gronlund. p. 109.


STEP 4

4. Recording Test Items. Test items are recorded on the Learning Objective
Worksheet (LOW), which is a required document of the Master Lesson File (MLF).
Refer to Section 3600 for more information on the required MLF components. See
Appendix A for the LOW and Appendix C for the LOW Checklist. Enter test items
into the instructional management system (MCAIMS/TIMS). Entering test items
into MCAIMS will facilitate the automated grading/scoring of tests, tracking of GPAs
(if applicable), and test item analysis (discussed in Section 5300 of this manual).
See the MCAIMS User's Manual.

2208. SELECT INSTRUCTIONAL METHODS

One of the most important tasks to be performed by the curriculum developer or a
member of a design team is selecting the instructional method. An instructional
method is the approach used to present instruction. The method selected will have
a direct impact on both the quality of the training system and its cost effectiveness.
Any given lesson will probably incorporate two or more methods to serve different
purposes at different points in the progression of the lesson.

STEP 1

1. Consider the Advantages and Limitations of Methods. In order to
evaluate instructional methods, consideration of the advantages and limitations
inherent to each is required. There are thirteen major types of instructional
methods. See Instructional Methods on the next several pages:


INSTRUCTIONAL METHODS: ADVANTAGES AND LIMITATIONS

Lecture (Formal, Informal, Briefing, Guest). Formal lecture involves one-way
communication used for reaching a large audience in a classroom setting.
Informal lecture involves considerable interaction between the instructor and
student in the form of both questions and discussion.
  Advantages:
  - Ideal for presenting many ideas in a short time.
  - Suitable for introducing a topic.
  - Convenient for instructing large groups.
  - Supplements material from other sources.
  Limitations:
  - Does not provide an avenue for the instructor to estimate student progress.
  - No active participation by students.
  - Dependent on the instructor's speaking skills.
  - Not responsive to individual needs of students.
  (Informal lectures, however, accommodate these concerns.)

Indirect Discourse (Panel Discussion, Dialogue, Teaching Interview). Involves
verbal interaction among two or more persons, which is seen and heard by
students. Some examples include dialogue, a teaching interview, a panel
discussion (debate), skits, playettes, and other dramatizations.
  Advantages:
  - Can effectively be used for extremely large groups.
  - Facilitates higher level cognitive skills.
  - Class size is not an issue with this method.
  - Recommended method to reach high levels of learning.
  Limitations:
  - Does not permit students' needs to be satisfied.
  - Instructors cannot gauge if learning has transferred.
  - Requires a high level of instructor expertise to be effective.
  - Evaluation is not inherent in the method.
  - Not responsive to individual needs of students.

Demonstration. This instructional method is used to allow students to observe
instructors perform a sequence of events. It is designed to teach a procedure,
technique, or operation.
  Advantages:
  - Enables performance standards to be demonstrated.
  - Provides immediate feedback.
  - Method may be tailored during instruction.
  - Responsive to individual needs.
  - Extremely effective when used in conjunction with lecture or prior to
    practical application.
  - Evaluation is inherent in the method.
  - Instructors can tell if learning has transferred.
  Limitations:
  - Time consuming to develop and requires a great deal of preparation.
  - Requires a high level of expertise.
  - Instructor must be able to anticipate student error.
  - Best conducted in small groups.
  - Success is dependent on demonstrator skills.

Reading (Books, Reference Publications, Web-based Material, Manuals,
Handouts). The assignment to a student of printed materials including books,
periodicals, microfilms, manuals and regulations, and handouts.
  Advantages:
  - Most effective and time efficient means of presenting material.
  - Students progress at own pace.
  Limitations:
  - Not responsive to individual needs.
  - Dependent on availability of resources.
  - Evaluation is not inherent in the method.
  (Should be used as a supplement to formal Marine Corps curricula. In
  entry-level courses, should be used sparingly.)


Self-Paced (Programmed, Modular, Computer Assisted, Mediated). Self-paced
instruction is a learning program which is organized so that students are
allowed to move through it at their own pace under the guidance of an
instructor. Some typical applications include programmed instruction (paper
and computer), modular instruction (prepackaged units of instruction containing
clear statements of objectives), computer-assisted instruction (computer used
as a vehicle for interaction), and mediated instruction (slides, film, tapes,
and cassettes).
  Advantages:
  - Accommodates learning rates.
  - Provides immediate feedback.
  - Responsive to individual needs.
  - Evaluation is inherent in the method.
  Limitations:
  - Has rigid rules and requires considerable development time.
  - Instructor experience must be high to utilize this method effectively.
  - Directed towards individual learning.

Questioning (Socratic Method, Student Query). Questioning as a method is used
to emphasize a point, stimulate thinking, keep students alert, check
understanding, review material, and seek clarification. Examples of this
method are the Socratic method (instruction by asking students questions) and
student query (students asking questions).
  Advantages:
  - Reaches higher levels of learning. Stimulates higher order thinking.
  - Effective at developing mental skills.
  - Evaluation is inherent in the method.
  - Responsive to individual needs and differences.
  Limitations:
  - Will not work if students are unfamiliar with the topic.
  - Requires a high level of instructor expertise to be used effectively.
  - Lends itself best to one-on-one instruction or groups of 8-12.

Discussion-Non Directed (Peer Controlled Seminar, Free Discussion).
Non-directed discussion is an individual/group interactive process in which
task- or objective-related information and experiences are evoked from a
student or the group. This method places the responsibility for learning on
the students through their participation.
  Advantages:
  - Works best if students have experience with the lesson topic.
  - Responsive to the individual needs of students.
  - Instructors play a limited/passive role.
  - Recommended for both lower and higher level cognitive skills.
  - Most effective for small groups of 8-12 students.
  Limitations:
  - Danger that the seminar method will pool ignorance.
  - Natural leaders of the class may dominate discussion.
  - Evaluation is not inherent in the method.


Guided Discussion. Guided discussion provides interaction among students and
instructors. This instructional method develops concepts and principles
through a group process and the unobtrusive guidance of the instructor.
  Advantages:
  - Involves interaction by all.
  - Allows students to exchange ideas, values, and attitudes.
  - Responsive to the individual needs of students.
  - Effectively used for teaching in the Affective Domain.
  Limitations:
  - Not recommended for simple recall of information.
  - Effective utilization of this method requires a high level of instructor
    expertise.
  - Instructors must be able to judge the value of student responses.
  - 8-12 students is the optimum size to conduct a guided discussion.
  - Evaluation is not inherent in the method.

Practical Application (Individual Projects). Students interact with things,
data, or persons as necessary to develop the skills to master the learning
objectives.
  Advantages:
  - Provides students maximum flexibility to practice and demonstrate acquired
    skills in a controlled setting.
  - Method combines well with other methods.
  - Evaluation is inherent in the method.
  - Responsive to students' special weaknesses, interests, and needs.
  - One of the best methods for ensuring learning at the higher levels of
    application, analysis, and evaluation.
  Limitations:
  - Time consuming.
  - Requires supervision and informal evaluation by the instructor.
  - Can take place outside the classroom.
  - Students need to acquire mastery for this method to be effective.
  - High level of instructor expertise.
  - Designed for individual instruction.

Field Trips. A field trip is an out-of-classroom experience where students
interact with persons, locations, and materials or equipment for the
attainment of objectives. Typically used for affective purposes rather than
for measurable cognitive development.
  Advantages:
  - Students encounter real settings appealing to all senses.
  - Method is highly recommended for reaching the affective domain.
  Limitations:
  - May require extensive logistical considerations.
  - Instructor must be at the comprehension level.
  - Not typically used as much for cognitive development.
  - Evaluation is not inherent in the method.
  - Not responsive to individual needs.


Simulations (Role-Playing, Games). Simulations are low risk educational
experiences which substitute for some real life situation. They may involve
groups or whole units. Some kinds of simulations are role playing, in-basket
exercises (used in random order to simulate a series of matters or decisions
which a leader might actually encounter), organizational or management games
(students manipulate an organization or some component part to produce certain
outcomes), and hardware simulations (students use trainers that resemble, to
some degree, the equipment that is to be used on the job; e.g., flight
simulators and virtual reality).
  Advantages:
  - Low risk and effective as capstone methods following a block of
    instruction.
  - Students can operate at the highest cognitive level in a low-risk
    environment.
  - Student weaknesses and strengths can be quickly identified and worked with.
  - Recommend few students per instructor.
  - Evaluation is inherent in the method.
  - Responsive to students' needs.
  Limitations:
  - Not usually recommended for imparting knowledge to students.
  - Knowledge is presumed to be a prerequisite for this method.
  - Elaborate versions may require special equipment.
  - Few students per instructor during the simulation itself.
  - Simulation areas are of various sizes and configurations.
  - Requires trained staff to conduct.

Case Study. A learning experience in which students encounter a real-life
situation in order to achieve some educational objective.
  Advantages:
  - Students develop new insights into the solution of specific on-the-job
    problems.
  - No follow-up evaluation is necessary.
  - Responsive to students' needs, differences, and creativity.
  - Evaluation is inherent in the method.
  - One of the best methods for reaching higher levels in the cognitive domain.
  Limitations:
  - Can be time consuming.
  - Students must have a thorough understanding at the comprehension level
    prior to starting.
  - Level of instructor expertise is high.
  - Size of class is normally small, but may accommodate larger groups.

Coaching. A learning experience where face-to-face interaction occurs between
the instructor and the student in order to meld individuals with diversified
backgrounds, talents, experience, and interests; encouraging them to accept
responsibility and seek continued improvement and achievement.
  Advantages:
  - Enhances learning and enables performance standards to be demonstrated.
  - Provides immediate feedback.
  - Responsive to individual needs.
  - Extremely effective when used in conjunction with lecture or prior to
    practical application.
  - Evaluation is inherent in the method; instructors can tell if learning
    has transferred.
  Limitations:
  - Time consuming to develop.
  - Requires a great deal of preparation.
  - Requires a high level of expertise.
  - Instructor must be able to anticipate student error.
  - Best conducted in small groups or individually.


STEP 2

2. Review Method Considerations. In addition to considering the
advantages and limitations of each method, the curriculum developer must
review the following: learning objectives, the TPD, adult learning principles,
transfer of learning, and resource constraints.
a. Learning Objectives. The method chosen must complement the kind of
learning to be undertaken by the students (e.g., cognitive, affective,
psychomotor). Based upon the domain and the level required by the learning
objective, methods of instruction are chosen that will enable students to perform
at the specified level. A combination of methods works best. (See Chapter 6,
Adult Learning, Section 6500 for more on Using Domains of Learning.)

For example: If the learning objective required learners to assemble a
piece of equipment, then the informal lecture method alone is
inadequate to teach that particular skill. Since the objective is a motor
skill, students would benefit by adding demonstration and practical
application.

b. Target Population Description (TPD). Consider the level of
motivation, background, knowledge, and skill level of the target population.

For example: Since the case study method requires the learners to
analyze and evaluate the subject matter, the case study method would
not be the appropriate method for students with no prior knowledge.

c. Consider Adult Learning Principles. Typically, adults are self-directed
learners and bring their own experiences to the classroom. Research has shown
that they learn best:
   1) through problem-based learning.
   2) in small groups.
   3) when challenged.
NOTE: The TPD must also be considered along with these principles.
(See Chapter 6, Adult Learning, for more on Adult Learning Principles.)

For example: If the course is entry level, the students would not
bring a lot of experience to the classroom and problem-based learning
would not be appropriate. Also, courses for entry-level students are
generally large in size, which may not allow for certain interactive
methods.

d. Transfer of Learning. Transfer of learning from the instructional
environment to the job is most likely to happen when the conditions of learning
best replicate what is being done on the job. Students are more likely to
remember when instruction is active and geared toward different learning styles.
When possible, incorporate all three learning styles (visual, auditory, and
kinesthetic) into instruction. (See Chapter 6, Adult Learning, Section 6300 for
more on Learning Styles.)

For example: If it has been determined that the TPD learns best
kinesthetically, then consider methods that are interactive and allow
students to do something. Simulation and/or practical application
methods should be considered.

e. Resource Constraints. Although resource constraints should not be
the primary factor in determining instructional methods, availability of resources
must be considered. This can include minimum level of instructor experience,
class size, evaluation potential, and the ability to meet the individual needs of
students. However, new methods will never be incorporated to break the status
quo if curriculum developers do not identify them.

For example: If the curriculum developer wanted to use the
demonstration method to show students how to field strip various
weapons, the experience level of the instructors would need to be
considered. In this particular case, instructor experience must be
high in order for the demonstration to be successful.

STEP 3

3. Select Method

a. Methods Selection Matrix. The methods selection matrix (see
Figure 2-17) can be used for evaluating existing courses as well as planning
new ones. Developers must keep in mind that the matrix provides
recommendations based upon the assumptions listed below. With an
understanding of these assumptions, the grid is a valuable tool for the
curriculum developer to select methods of instruction. While these assumptions
may affect the interpretation of the matrix, the grid's recommendations should
be useful for all schools and courses under most situations.

Five Method Selection Matrix Assumptions

(1) The terms and categories follow the general terms and definitions
of Bloom's Taxonomy/Domains of Learning. (See Chapter 6, Section
6300 for more on Domains of Learning.)

(2) The lessons being analyzed are assumed to be relatively short.
Extended periods of instruction present many considerations beyond
the scope of this grid.

(3) The methods are analyzed in their "pure" form, that is, not
combined with other methods. A method that is not recommended as
an exclusive approach to instruction may be highly recommended in
combination with another method.

(4) Quality instructional materials and adequate teaching skills must
be present. Poorly prepared materials and weak instructor skills have
a negative effect on all recommendations.

(5) The learners are adults and tend to be task oriented, highly
motivated, possess prerequisite skills for a given learning situation,
and often prefer interactive methodologies.

b. How to Use the Methods Selection Matrix

Matrix Selection Example
  Behavior: Clean the M16A2 service rifle.
  Factors and Constraints: Highly skilled instructors, large class, desire
  for evaluation to be inherent, and no concern about individual needs.

1) Review the Domains of Learning in Chapter 6, Section 6300. Identify
   whether the learning objective is written in the Cognitive, Psychomotor,
   or Affective Domain. Find "Domains and Levels" in the left-hand
   column. Circle or highlight the correct level. Using the matrix selection
   example above, the appropriate domain and level has been selected in
   Figure 2-17.

   - For learning objectives that are written in the Cognitive Domain,
     identify whether the learning objective is written at the knowledge
     level, the comprehension level, or one of the higher levels.

   - For learning objectives that are written in the Psychomotor
     Domain, identify whether the learning objective is written at the
     lower level or the higher level.

   - For learning objectives that are written in the Affective Domain,
     identify whether the learning objective is written at the lower level
     or the higher level.

2) After selecting the appropriate level, circle or highlight all of the
   HRs (Highly Recommended) in that row. By doing this, the methods
   that are highly recommended for that domain and level are identified.
   The "Grid Key" is located in the upper left-hand corner. In the
   example, the lower level of the Psychomotor Domain was selected in
   the last step. The HRs have been circled for that level in Figure 2-17.

3) Next, find the "Factors and Constraints" section under the columns
   selected in the previous step and circle the factors and constraints for
   each column. Refer to Figure 2-17 for an example. This section allows
   the curriculum developer to review the factors and constraints to
   determine whether the method(s) indicated are feasible.

   - Minimum Level of Instructor Expertise. On this row, the level
     of instructor expertise required for the method is identified: "NI"
     (New Instructor) or "EI" (Experienced Instructor).

   - Class Size. On this row, the appropriate class size for the method
     is identified: LRG (large, 20+), MED (medium, 13-24), SM (small,
     2-12), or INDIV (individual).

   - Evaluation Inherent in Method. This row identifies whether
     evaluation is inherent in the method.

   - Responsive to Individual Needs. This row identifies whether
     the method is responsive to individual needs.

4) Choose the method that BEST reflects the resource constraints and the
   Domain/Level of learning required to achieve the learning objective.
   Most methods will have limitations that the school will have to address.
   If the constraints cannot be overcome, then consider methods that are
   R (Recommended). However, methods marked NR (Not Recommended)
   should not be used. See the bottom of the matrix for the method chosen.

See Methods Selection Matrix on the next page.

Chapter 2 2- 50
Systems Approach To Training Manual Design Phase
FIGURE 2-17 Methods Selection Matrix

METHODS SELECTION MATRIX

GRID KEY: HR - Highly Recommended; R - Recommended; NR - Not
Recommended; NI - New Instructors; EI - Experienced Instructors;
LG - Large Class; MED - Medium Class; SM - Small Class; Indiv - Individual.
Class size: 1 = indiv; 2-12 = small; 13-24 = medium; 20+ = large.

PRESENTATION METHODS: Lecture (Formal, Informal, Briefing, Student
Speech); Self-Paced (Programmed, Modular, Computer Assisted, Mediated);
Demonstration (Operation of Equipment or System); Reading (Books,
Periodicals, Microforms, Manuals, Handouts); Indirect Discourse (Panel
Discussion, Interview).

STUDENT VERBAL INTERACTION METHODS: Questioning (Socratic Method,
Student Query); Controlled Seminar (Free Discussion); Guided Discussion
(Instructor Controlled); Discussion-Non-Directed (Peer Discussion,
Dialogue, Teaching Interview).

APPLICATION METHODS: Case Study; Practical Application (Individual or
Group); Coaching; Field Trips; Simulations (Role-Playing, Games).

For each method, the matrix rates suitability (HR/R/NR) against the domains
and levels of learning: Cognitive (Knowledge, Comprehension, Higher
Levels), Psychomotor (Lower and Higher Levels), and Affective (Lower and
Higher Levels). Beneath the ratings, a Factors and Constraints section
lists, for each method: the Minimum Level of Instructor Expertise (NI/EI),
the appropriate Class Size, whether Evaluation is Inherent in the Method
(Yes/No), and whether the method is Responsive to Individual Needs
(Yes/No). In the example, the HRs circled for the lower level of the
Psychomotor Domain include Demonstration and Practical Application.

** Consider breaking the class into small groups if the number of students is large and there is instructional staff to support it.
Demonstration and Practical Application will work, but more demonstrators and instructors will
be needed to overcome the class size by breaking into groups. Even more instructors would be
necessary for Coaching. Demonstration and Practical Application are the methods chosen.
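The matrix lookup described in steps 1-4 can be sketched in code. This is a minimal illustration only; the ratings below are a small hypothetical excerpt, not the full Figure 2-17 matrix.

```python
# Minimal sketch of the Figure 2-17 lookup in steps 1-4.
# The ratings below are a hypothetical excerpt, not the full matrix.
RATINGS = {
    "Lecture": {("Cognitive", "Knowledge"): "HR", ("Psychomotor", "Lower"): "NR"},
    "Demonstration": {("Cognitive", "Knowledge"): "NR", ("Psychomotor", "Lower"): "HR"},
    "Practical Application": {("Psychomotor", "Lower"): "HR"},
    "Field Trips": {("Psychomotor", "Lower"): "R"},
}

def candidate_methods(domain, level):
    """Return (highly recommended, recommended) methods for a domain/level.

    Methods rated NR for the domain/level are excluded, mirroring step 4.
    """
    hr = [m for m, r in RATINGS.items() if r.get((domain, level)) == "HR"]
    rec = [m for m, r in RATINGS.items() if r.get((domain, level)) == "R"]
    return hr, rec

hr, rec = candidate_methods("Psychomotor", "Lower")
print(hr)   # ['Demonstration', 'Practical Application']
print(rec)  # ['Field Trips']
```

The HR list is considered first; the R list is the fallback when constraints rule the HR methods out.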


STEP 4
4. Record Instructional Methods The instructional method chosen is
recorded in MCAIMS for printing on the Learning Objective Worksheet (LOW).

2209. SELECT INSTRUCTIONAL MEDIA

In any instructional situation there is a message to be communicated. Video,
television, diagrams, multimedia, computers, and printed material are
examples of media used to communicate the message. Media are the delivery
vehicles used to present instructional material or the basic sensory stimuli
presented to a student to induce learning; in other words, the means used to
give information to the students. Appropriate media ensure that information
is presented to students by the most effective and cost-efficient means
possible.

Selection of media types must not be influenced by the curriculum developer’s
experience as a student. Rather, the curriculum developer should select a
media mix that is best suited for the TPD. Selection of media types must also
take into consideration theories of adult learning.

STEP 1
1. Consider the Advantages and Limitations of Media Media have various
characteristics that make them either suitable or unsuitable for particular
training situations. Consider the characteristics listed in Figure 2-18
carefully to ensure that the appropriate media are selected for the
instructional system.

See Media Figure 2-18 on the next page.


Printed Materials. Printed material must be kept simple, yet meaningful,
displaying only small amounts of information.
 Advantages: Easily constructed. Easily duplicated. Effective for indoor
use. May be enlarged, displayed, or distributed to students as a handout.
Low cost. Readily available. Computer-generated graphics and charts can be
easily revised.
 Limitations: Cannot be tailored to needs of students during instruction.
Can only be used outdoors if weather permits. Time consuming if images are
illustrations, photographs, or digital images. Flat pictures cannot be
revised.

Chalkboards and Dry Erase Boards. Common presentation media tools that
are familiar to instructors and students alike.
 Advantages: Easy to prepare and use. May be used to show development or
buildup of an event or display. Allow information to be tailored during
instruction. Effective when instruction calls for a great deal of writing.
Portable (in some instances). Low in cost. Readily available.
 Limitations: Displays are not effective if lettering is not large enough
to be seen by all.

Turn Chart. Simple and familiar in design, especially in small interactive
classes.
 Advantages: Easy to prepare and use. May be used to show development or
buildup of an event or display. Allows information to be tailored during
instruction. Effective when instruction calls for a great deal of writing.
Allows for interaction between instructor and students. Portable. Low in
cost. Readily available.
 Limitations: Displays are not effective if lettering is not large enough
to be seen by all. Can only be used outdoors if weather permits, unless
lamination is applied (cost factor).

Figure 2-18 Media Characteristics (continued)


Models/Mock-Ups. Models/mock-ups are representations of actual equipment,
structures, or devices. They seek to represent actual items when the items
are too large, too difficult, or too dangerous to be brought into the
classroom.
 Advantages: Appeal to students’ sense of touch. Realistic. Show details.
Useful in demonstrations and hands-on experiences.
 Limitations: Time consuming to develop. May require specialized
personnel. May require assistant instructors. Class size limited to the
size of the model/mock-up. May be costly. May not be readily available.
Cannot be revised (only minor modifications can be made).

Actual Item/Object. AIOs are the equipment or devices that are actually
utilized in the performance of the task or job. AIOs may be too difficult,
or too dangerous, to be brought into an indoor classroom, and therefore
outdoor facilities may need to be utilized.
 Advantages: Appeal to students’ sense of touch. Realistic. Show details.
Useful in demonstrations and hands-on experiences.
 Limitations: Time consuming to develop. May require specialized
personnel. May require assistant instructors. Class size limited to the
size of the actual item/object. May be costly. May not be readily
available. Cannot be revised (only replaced).

Overhead Transparencies. Overhead transparencies are presented using an
overhead projector that projects images on a large screen or wall.
 Advantages: Effective for presenting text, drawings, pictures, charts,
diagrams, or graphs to large audiences. Easy to develop if handwritten or
computer generated. Easy to produce ahead of time. Can be tailored during
instruction. Students may write on their own for presentations. Instructor
can maintain eye contact, and easy to use. Low in cost. Readily available.
 Limitations: Can only be used where there is a source of electricity.
Requires a large screen and projector.

Figure 2-18 Media Characteristics (continued)


Slides. Slides are presented using a slide carousel and projector that
projects images on a large screen or wall.
 Advantages: Effective for presenting still images of text, photographs,
and graphics to large audiences. Ideal for enlarging images. Easy to
develop if computer generated. Can be combined in any sequence. Instructor
can maintain eye contact with students. Slide projectors are easy to use.
Easy to update, move, or rearrange.
 Limitations: Slides can only be used where there is a source of
electricity. Requires a large screen. Projector must be monitored for
overheating. Requires additional equipment (e.g., slide carousel, extra
bulb). Can be costly to develop if photos are developed into slides.

Audiotapes. Audiotapes are generally used in conjunction with other media
(e.g., supplementing a slide presentation).
 Advantages: Effective for self-paced instruction. Easy to use.
 Limitations: Lengthy to develop. Costly to develop. Cannot be tailored
during instruction. Requires a source of electricity and additional
equipment (e.g., audio deck). Cannot be revised.

Videotapes/Film. Videotapes/film recreate or show footage of actual sites,
events, procedures, or equipment that is difficult or dangerous to observe
in class (e.g., volcanic eruption).
 Advantages: Effective for recreating actual events and for presenting the
correct method for performing a set of procedures. Reproduced at a low
cost. Readily available from commercial sources. Easy to use.
 Limitations: Generally requires a great deal of planning. Cannot be
tailored during instruction. Requires a source of electricity. Normally
requires additional equipment (e.g., TV). Has a high initial development
cost. Cannot be revised (copyright).

Figure 2-18 Media Characteristics (continued)


Computer Based Training (CBT). Computer based training utilizes the
computer as an instructional device.
 Advantages: Allows students varying levels of control over the rate and
sequence of their learning. Provides immediate feedback. Provides
reinforcement. Tracks student progress. Provides diverse learning
experiences.
 Limitations: Educators and students may have unrealistic expectations.
Teaches only a limited range of interaction. Human interaction is reduced
or completely eliminated. Start-up cost for both hardware and software can
be very expensive. Software cannot be revised.

Interactive Video Disc (IVD). Computer based interactive video creates a
multimedia learning environment that utilizes both video and
computer-assisted instruction. The recorded video material is presented
under computer control to viewers who see images, hear sound, and make
active responses. The video portion is provided through a videocassette,
videodisc, CD-ROM, or DVD. The interactivity is provided through computers.
 Advantages: Multimedia format. Learner interactivity. Individualization
to learner needs. Very flexible. Can be used to provide simulation
experiences.
 Limitations: Very expensive to produce. Can be time consuming for the
learner to search through frames. Start-up cost for both hardware and
software can be very expensive. Software cannot be revised.

Compact Disc Interactive (CDI). CDI relies on a highly intelligent special
player that is connected to a standard television or monitor. CDI
incorporates text, audio, graphics, and animation into the programs. The
user interacts by using a remote controlled unit, which also has a joystick
and activation buttons for interacting with the program. Sony
PlayStation TM and Nintendo TM are examples of CDI.
 Advantages: Easy to use. High quality sound and video. Easy to connect
to a home television set. Relatively inexpensive, costing about as much as
a VCR.
 Limitations: Limited number of available titles and applications. Does
not have a keyboard or disc storage. Cannot be revised.

Virtual Reality (VR). Virtual reality is one of computer based
technology’s newest applications. It is a three-dimensional environment
where the user can operate as an active participant. VR allows the learner
to interact with the environment in a unique way. Head mounted displays,
gloves, joysticks, and headphones are examples of the types of equipment
used to create the virtual experience.
 Advantages: Creates a realistic world without subjecting viewers to
actual or imagined dangers or hazards. Provides students with opportunities
to explore places not feasible in the real world (e.g., outer space or
inside an active volcano).
 Limitations: Very complex to use, thus does not lend itself to most
classrooms. VR equipment tends to be expensive. VR software itself cannot
be revised.

Figure 2-18 Media Characteristics (continued)


Interactive Multimedia Instruction (IMI). Interactive multimedia
instruction systems incorporate the computer as a display device,
management tool, and/or source of graphics, pictures, text, and sound in an
interactive format through hypertext. The goal of hypertext is to immerse
users in an interactive environment of sounds and still and motion images
that are connected in diverse ways. Examples of IMI development programs
are HyperStudio, Authorware, and Toolbook2.
 Advantages: Users can navigate through information by selecting routes
via buttons or hot spots that suit their personal needs and learning
styles. Users can also create their own special connections within the
information. IMI is generally easy to revise.
 Limitations: Users can get lost in a hyperlink environment. IMI can lack
structure. Some programs can be difficult to use and are time consuming to
produce. Ranges from moderate to very expensive for the more complex
programs.

Computer Mediated Conferencing, Video Teleconference, Virtual
Conferencing, Interactive Television, and Desktop Video Conferencing.
All of these methods describe learning via telecommunications. These types
of media formats permit cost-effective training to large numbers of people
who may be distributed across numerous sites.
 Advantages: Real-time interactivity to large audiences in a
cost-efficient way. All television/computer systems allow the transmission
of motion images and sound over a distance. Learners can communicate with
the instructor and with each other via telephone or two-way video.
 Limitations: The classroom used must be dedicated for two-way
communications and generally cannot be used for other purposes. Learners
may feel isolated. Technical problems may interrupt instruction.
Instructors may not feel comfortable using these mediums. Students may be
reluctant to assume greater responsibility in this type of setting.
Start-up cost may be expensive depending on requirements. May not be
available due to space constraints.

Figure 2-18 Media Characteristics


STEP 2
2. Review Media Considerations The type(s) of media selected should
enhance the presentation of information and complement the method of
instruction, be available, and be able to be developed prior to the onset
of the course. Curriculum developers weigh these factors and select the
media for the course of instruction being taught. The following factors
are considered and analyzed prior to selecting instructional media:

a. Target Population Description: Consider the abilities, education
level, and learning preferences of the learner to select media that meet
their abilities and learning preferences.

b. Learning Objective: Identify the learning domain addressed in the
learning objective as either Cognitive, Psychomotor, or Affective. Refer to
Chapter 6, Adult Learning, for more information on Domains of Learning.

c. Class Size: Ensure the type of media selected complements the size
of the class. For the Marine Corps, in most situations: 1-9 students is
considered a small class, 10-20 students a medium class, and over 20
students a large class.

d. Resources: Money, time, scheduling, facilities, personnel, and
equipment availability must also be considered to assess whether certain
types of media are available, cost effective, and/or feasible to use.

e. Learning Styles: Consider that students will have different learning
styles. Instruction is best when it accommodates visual, auditory, and
kinesthetic learners. Refer to Chapter 6, Adult Learning, for more
information on Learning Styles.

STEP 3
3. Select Media After considering the target population description,
learning objective, class size, resources, and learning styles, select the
media that best accommodates these factors.

STEP 4
4. Record Instructional Media The instructional media chosen are
recorded in MCAIMS for inclusion on the Learning Objective Worksheet (LOW),
which is part of the POI. For information on MCAIMS, see the MCAIMS Users
Manual.


SECTION 4
2300. SEQUENCE TERMINAL LEARNING OBJECTIVES (TLO)

Sequencing TLOs is the final process of the design phase and provides a
foundation for developing course structure. Once this is completed, instruction is
developed. The purpose of sequencing TLOs is to ensure the instruction
promotes learning by the optimum placement of learning objectives. Sequencing
TLOs provides the following benefits:

1. Efficiency Sequencing TLOs allows for an efficient system of instruction
while avoiding duplication.
2. Transition Properly sequenced TLOs allow the student to make a logical
transition from one skill or knowledge to the next while avoiding confusion. This
ensures that supporting knowledge and skills are acquired before dependent
subject matter is introduced.
3. Structured Learning Sequenced TLOs serve as a rough course structure
and outline the strategy for instruction. This is important as it facilitates
learning, particularly when introducing new concepts or material. Transfer
of learning is maximized when closely related learning objectives are kept
together.

2301. RELATIONSHIPS BETWEEN TLOS

To sequence TLOs, they are organized into broad categories. The relationships
between them are determined, and they are sequenced in the order implied by
their relationship. Learning objectives do not necessarily have to be taught in the
sequence they are listed; for instance, facilities and equipment may not be
available for this. The organization provides the optimum sequence for learning,
but it may not always be possible to instruct the course in this order.

1. Grouping TLOs Before TLOs are sequenced, they should be grouped.
TLOs that deal with the same subject have a shared element relationship and may
be grouped together. The shared element may be that of an object (e.g.,
ammunition, supply procedures, M16A2 rifle) or a condition (e.g., a desert
environment, using a specific piece of equipment, nighttime).

a. Same Object TLOs with the same object may be grouped together
(e.g., all TLOs pertaining to the M16A2 rifle or all TLOs pertaining to a
communications annex). Same object TLOs can often be determined by
reviewing the Individual Training Standards (ITSs)/Training and Readiness (T&R)
events, because all tasks are grouped by duty areas that define similarities among
them. TLOs may be grouped by these same areas also. Grouping TLOs this way
maximizes the transfer of learning because closely related TLOs are kept together.


b. Same Condition The environment and the resources within a school
should be considered when grouping TLOs. TLOs may be grouped by like resources
(e.g., all instruction requiring the use of a radio or all instruction that takes place on
the firing range). Grouping learning objectives with the same condition maximizes
instructional time (e.g., instructional time is not lost due to traveling from one
location to another or due to obtaining the same equipment at different times
throughout the course), allowing an efficient system of instruction.
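Grouping by a shared element, as described above, amounts to bucketing TLOs by a chosen key. A minimal sketch, with hypothetical TLO records:

```python
# Minimal sketch of grouping TLOs by a shared element (object or condition).
# The TLO records below are hypothetical.
from collections import defaultdict

tlos = [
    {"id": "TLO-1", "object": "M16A2 rifle", "condition": "firing range"},
    {"id": "TLO-2", "object": "M16A2 rifle", "condition": "classroom"},
    {"id": "TLO-3", "object": "radio", "condition": "field"},
]

def group_by(tlos, element):
    """Bucket TLO ids by the value of the chosen shared element."""
    groups = defaultdict(list)
    for tlo in tlos:
        groups[tlo[element]].append(tlo["id"])
    return dict(groups)

print(group_by(tlos, "object"))
# {'M16A2 rifle': ['TLO-1', 'TLO-2'], 'radio': ['TLO-3']}
```

The same function groups by condition instead (`group_by(tlos, "condition")`), mirroring the choice between same-object and same-condition grouping.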

2. Relationships in Sequencing To logically sequence TLOs, the curriculum
developer must determine the relationship among them. The four most important
relationships in sequencing TLOs are dependent, supportive, independent, and
conflicting. They are described in detail below:

a. Dependent Relationships Dependent relationships exist between TLOs
that are a prerequisite to other TLOs. Personnel must master the dependent TLO
before they can master the others. Examples of actions having a dependent
relationship are:

For a sniper to engage a target, he must first learn to apply the principles
of marksmanship.

To send a message in Morse code, one must first learn to identify the
Morse code symbols for each letter and number.

b. Supportive Relationships In supportive relationships, skills and
knowledge in one TLO have some relationship to those in another TLO. The learning
involved in mastery of one TLO transfers to the other, making the learning involved
in the mastery of the other easier. In a supportive relationship, TLOs are sequenced
so that a logical transition can be made from one skill or knowledge to the next. This
type of sequencing is largely dependent upon the curriculum developer's expertise in
the subject matter and subjective judgment of what makes the learning of a task
easier. For example, "maintain a checkbook ledger" has a supportive relationship to
"balance a checkbook.” You could learn how to balance a checkbook without first
learning to maintain a checkbook ledger. However, learning to maintain a ledger
first will make balancing a checkbook much easier.

Other examples of actions having a supportive relationship are:

“Disassemble the M16A2 service rifle." Disassembling the M16A2 service
rifle has a supportive relationship to "assemble the M16A2 service rifle."

“Drive a 1/4 ton truck.” Driving a 1/4 ton truck has a
supportive relationship to “drive a 5 ton truck.”

“Write learning objectives.” Writing learning objectives has a supportive
relationship to “given learning objectives, write a lesson plan.”


c. Independent Relationships In an independent relationship, skills and
knowledge in one learning objective are unrelated to those in another TLO. For
example, "balance a checkbook” has nothing to do with "select investments.”
Arrange TLOs with independent relationships in any reasonable sequence.
However, they should not be placed between TLOs having dependent or supportive
relationships. Examples of actions having an independent relationship are:

“Balance a checkbook” is independent of “select investments.”

“Solve mathematical equations (general math, geometry, calculus)” is
independent of "solve scientific equations (chemistry, physics).”

"Disassemble the M16A2" is independent of "disassemble the 9mm
pistol."

d. Conflicting Relationships Conflicting relationships exist between TLOs
that involve opposite responses to the same cue in a different context. These
responses must be clearly related to the situation in which the cue is received. The
two actions should be taught together, and the reason for the opposite response to
the same cue explained and reinforced. The conflicting element that causes two
very similarly stated TLOs to be conflicting usually involves a visual or auditory
cue within the learning objectives.

e. Remember to sequence the TLOs with conflicting relationships as close to
one another as possible so that the conflicting issues/concerns can be addressed.
Examples of conflicting elements presented in similarly stated actions are:

In the TLO “As a member of a platoon and on the command fall in, fall in
platoon formation per the NAVMC 2691 W/CH 1,” this command could
mean two distinctly different movements, depending on whether the
platoon has weapons or not. You may want to teach these movements
close to each other to show the major differences and make it clear to
the platoon.

In the TLO “As a member of a platoon and on the command right face,
execute a right face per the NAVMC 2691 W/CH 1,” the same holds true
depending on whether the platoon is armed. If the platoon is not armed,
on the command of execution, “Face,” the individuals in the platoon
simply execute a right face. On the other hand, if armed, the individuals
in the platoon have to execute trail arms, right face, and then order arms.

f. Relationship Table Not all actions fit neatly into one of the above
categories. Some may seem to be both dependent and supportive. Other
combinations may seem to be just as possible. The two things to remember are to
have justification for the sequence and that in some cases the sequence can be
changed. Sequencing decisions need to be documented to provide an audit trail.
The table in Figure 2-19 summarizes the relationships between TLOs.


DEPENDENT: Knowledge and skills in one TLO are closely related to those in
the other TLO. To master one of the TLOs, it is first necessary to master
the other. TLOs must be arranged in the sequence indicated by the knowledge
and skills hierarchy.

CONFLICTING: Knowledge and skills in one TLO conflict in some respect with
those in another TLO. Mastering one TLO may cause difficulty in mastering
the other TLO. TLOs must be taught closely together, directly addressing
the conflicting elements between the two TLOs.

SUPPORTIVE: Knowledge and skills in one TLO have some relationship to
those in the other TLO. Mastering one TLO transfers to the other, making
the learning involved in the mastery of the other easier. TLOs should be
placed close together in the sequence to permit optimum transfer of
learning from one TLO to the other.

INDEPENDENT: Knowledge and skills in one TLO are unrelated to those in the
other TLO. Mastering one TLO does not simplify mastering the other. In
general, the TLOs can be arranged in any sequence without loss of learning
efficiency.

Figure 2-19 Relationship Table

2302. STEPS FOR SEQUENCING TERMINAL LEARNING OBJECTIVES

The following are steps for sequencing TLOs:

1. Group the TLOs based on shared elements.

2. Determine if the relationship between the TLOs is dependent,
supportive, independent, or conflicting.

3. Arrange TLOs based upon their relationship.
a. Sequence the TLOs with dependent relationships in a hierarchical
arrangement.
b. Sequence TLOs with supportive relationships in an order that permits the
optimum transfer of learning from one learning objective to another.
c. Sequence the TLOs with independent relationships in any logical order.
Since the TLOs are independent of one another, the sequence in which they are
presented will not affect learning. Remember that these TLOs stand alone and
should not be placed between dependent or supportive TLOs, as this would disrupt
the transfer of learning.
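The hierarchical arrangement of dependent TLOs in step 3a amounts to a topological ordering of prerequisites. A minimal sketch using Python's standard library (3.9+); the TLO titles and dependencies are hypothetical, drawn from the earlier examples:

```python
# Sketch of arranging TLOs with dependent relationships hierarchically,
# treating prerequisites as graph edges and sorting topologically.
# The TLO titles and dependencies are illustrative.
from graphlib import TopologicalSorter

# dependent TLO -> set of prerequisite TLOs that must be mastered first
deps = {
    "Engage a target": {"Apply the principles of marksmanship"},
    "Send a message in Morse code": {"Identify Morse code symbols"},
}

order = list(TopologicalSorter(deps).static_order())
# static_order() places every prerequisite before the TLO that depends on it
assert order.index("Apply the principles of marksmanship") < order.index("Engage a target")
print(order)
```

Independent TLOs (no edges between them) may come out in any relative order, which matches step 3c: their sequence does not affect learning.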

Sequencing TLOs is the final process in the Design Phase. The outputs of
this phase are:

 TPD
 Learning Objectives
 Test Items
 Method and Media
 Sequenced TLOs

These outputs become the inputs to the Develop Phase, which begins with
developing course structure.

Chapter 2 2- 62
Systems Approach To Training Manual Develop Phase

DEVELOP PHASE

In Chapter 3:

3000 INTRODUCTION 3-1

3100 DEVELOP A COURSE STRUCTURE 3-2
 Purpose 3-2
 Steps for Developing a Course Structure 3-2

3200 DEVELOP CONCEPT CARDS 3-5
 Purpose 3-5
 Categories of Concept Cards 3-5
 Concept Card Elements 3-6

3300 CONDUCT AN OPERATIONAL RISK ASSESSMENT (ORA) 3-8
 Steps for Conducting an ORA 3-8

3400 DEVELOP LESSON MATERIALS 3-18
 Develop Lesson Materials 3-18
 Secure Resources 3-18
 Write the Lesson Plan 3-18
 Components of a Lesson Plan 3-19
 Student Materials 3-28
 Develop Media 3-30
 Media Development Considerations 3-31
 Media Delivery Format 3-35
 Instructor Preparation Guide 3-38

3500 CONSTRUCT TESTS 3-40
 Purpose 3-40
 Methods of Tests 3-40
 Categories of Tests 3-41
 Types of Tests 3-41
 Steps for Constructing Tests 3-42

3600 CONDUCT VALIDATION 3-49
 Purpose 3-49
 Methods of Validation 3-49
 Types of Data 3-51
 Steps for Validating Instruction 3-52
 Validation Authority 3-54

3700 DEVELOP A CDD & POI 3-55
 Purpose 3-55
 CDD 3-55
 POI 3-62

3800 ASSEMBLE A MLF 3-64
 Purpose 3-64
 Minimum Requirements 3-64
 Optional Components 3-65
 Steps for Assembling 3-66


3000. INTRODUCTION

The Develop Phase of the Systems Approach to Training (SAT) process builds on
the outputs of the Design Phase to produce the Course Descriptive Data (CDD)/
Program of Instruction (POI) and the Master Lesson File (MLF).

This chapter is broken down into the following eight sections:

1. Develop A Course Structure The course structure is a detailed
chronological document identifying the implementation plan for a course.

2. Develop Concept Cards Academic and administrative concept cards are
created to assign resources within the formal school/detachment to lessons,
evaluations, and events.

3. Conduct an Operational Risk Assessment (ORA) An ORA will be
conducted on each lesson/event within a Program of Instruction (POI). The
associated ORA tools will be incorporated into the Master Lesson File (MLF).

4. Develop Lesson Materials Lesson plans, student materials, supplemental
student materials (optional), media, and the Instructor Preparation Guide (IPG)
are all lesson materials to be used during the Implement Phase.

5. Construct Tests Constructing a test involves selecting and placing the test
items from the Learning Objective Worksheet (LOW) on the appropriate test.
It also involves providing ample instructions to the student, instructions to the
evaluator, and developing the grading criteria for each test given in the course.

6. Validate Instruction The goal of validation is to determine the effectiveness
of instructional material prior to implementation.

7. Develop Course Descriptive Data (CDD) and Program of Instruction
(POI) The CDD provides a summary of the resources required to administer a
course, and the POI provides a detailed description of the course. These
documents record the formal school's plan for satisfying the training
requirements listed in the Individual Training Standard (ITS) or Training and
Readiness (T&R) order.

8. Assemble A Master Lesson File One Master Lesson File (MLF) is compiled
for EVERY class taught at the formal school/detachment in order to provide
continuity of instruction.

INPUT: TPD; Learning Objectives; Test Items; Methods/Media; Sequenced TLOs

PROCESS: Develop Course Structure; Develop Concept Cards; Conduct ORA;
Develop Lesson Materials; Construct Tests; Validate Instruction;
Develop CDD/POI; Assemble MLF

OUTPUT: CDD/POI; MLF

Figure 3-1


SECTION 1
3100. DEVELOP COURSE STRUCTURE

The purpose of developing course structure is to determine how much content is
appropriate for a single lesson or a single exam and to arrange the lessons and
exams in a logical sequence. The course structure provides an outline of how the
lessons in the course will flow from start to finish. Course structure is not a course
schedule; however, the course structure provides a guideline for developing the
course schedule. A course structure contains lesson titles, methods, academic
hours, and lesson designators.

3101. STEPS FOR DEVELOPING COURSE STRUCTURE

The four steps for developing course structure are: review source materials,
determine lesson/exam content, estimate instructional hours, and assign
lesson/exam titles and designators.

STEP 1
1. Review Source Material The first step in developing course
structure is to review the following items:

a. The Learning Objective Worksheets (LOW) for the course, which
contain the Terminal Learning Objectives (TLO) and their
associated Enabling Learning Objectives (ELO), the delivery
system, and test items.

b. Directives from higher headquarters that may influence the length


of the course.

c. School Standing Operating Procedures (SOP) for any additional


school academic requirements that may affect the course.

2. Determine Lessons/Exam Content The second step in


developing course structure is to decide how many objectives are
appropriate for a single lesson or exam. There is a process used to
determine lessons/exams.
STEP 2
See The Process of Determining Lesson/Exam Content
on the next page:


a. Review Objectives When reviewing the objectives, consider the following:

1) The complexity of an objective. If it is lengthy or technical, the lesson
may need to be divided into several lessons.

2) The domain of the objective. In general, objectives in the cognitive
domain require fewer resources to teach. Several cognitive objectives may
be reached during one lesson. However, an objective in the psychomotor
domain can require more methods and more resources. Therefore, it may
require more than one lesson.

3) Select closely related objectives. If multiple objectives are chosen for
one lesson, select those objectives that are closely related. When
combined, they must make a logical, self-contained group suitable for an
individual lesson.

a) Learning objectives are organized so that the group has a natural
beginning and ending point.

b) Look for “natural breaks” in the sequenced learning objectives that
indicate major changes in subject matter (e.g., changing from one
system to another or going from knowledge-based instruction to
performance-based instruction).

b. Consider Target Population Description (TPD) The level of experience
the average student will bring into the classroom must be considered. Due to their
lack of experience, entry-level students may not be able to comprehend multiple
objectives in a single lesson. Remember that the students are seeing the material
for the first time.

c. Assign Lesson/Exam Titles All lessons and exams are assigned titles (e.g.,
Perform Preventive Maintenance on the M16A2 Rifle). The titles must be
meaningful and relate to the lesson or exam content.

3. Estimate Instructional Hours An estimate of the hours required for
each lesson is necessary to ensure that the proposed curriculum does not exceed
the maximum training days authorized for the course. Academic time includes all
hours devoted to instruction, review, evaluation, and re-testing of the TLOs/ELOs.
Some other non-academic events, such as the course overview, end of course
critiques, or training aid maintenance, should be included in this estimate. All
other events not directly related to the structure of the course are administrative
in nature and not considered when building course structure. If the estimate
exceeds the maximum authorized training days, the school/detachment must
contact CG, TECOM (GTB/ATB) for guidance. When estimating instructional time,
consult and review the following:

a. Time requirements for similar lessons in other approved courses. This will
give you an estimate of how long it may take to teach your lesson.

b. The number and complexity of learning objectives within each lesson. A
learning objective's complexity is based upon whether its behavior is
knowledge-based or performance-based, what conditions must be
present, and how the behavior is evaluated.

c. The amount of time spent performing the task on the job. Normally,
teaching a task takes longer than performing it on the job.

d. Review the instructional method selected for each learning objective. For
instance, performance-based instruction with practical application will take
longer to conduct than a lecture.

e. Total the number of hours and divide by 8 (the maximum number of
training hours per day under peacetime training conditions). This will
provide the estimated number of training days for the course.

f. Then review the approved Course Descriptive Data (CDD) or CG, TECOM
development directive. The CDD or directive will state the training days
authorized for the course.
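The arithmetic in steps e and f can be sketched as follows. The lesson titles,
hour figures, and authorized-day count are hypothetical; the 8-hour peacetime
training day is the value stated in step e.

```python
import math

# Estimated academic hours per lesson (hypothetical figures)
lesson_hours = {
    "SAT Overview": 1.5,
    "Effective Comm": 2.5,
    "Effective Comm Exam": 1.0,
    "Conduct a Lesson": 2.0,
    "Conduct a Lesson Exam": 1.0,
}

HOURS_PER_TRAINING_DAY = 8  # maximum under peacetime training conditions

total_hours = sum(lesson_hours.values())
# Round up: a partial day still occupies a training day on the schedule.
estimated_days = math.ceil(total_hours / HOURS_PER_TRAINING_DAY)

authorized_days = 2  # from the approved CDD or development directive (hypothetical)
if estimated_days > authorized_days:
    print("Estimate exceeds authorized days; contact CG, TECOM for guidance.")
```

If the estimate comes out above the authorized figure, the course structure must
be revisited or guidance requested, as described above.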

4. Assign Lesson/Exam Titles and Designators These codes are a
quick way to identify a lesson or exam. Designators must be purposeful, relate to
the lessons, and be numbered sequentially. They can be written in any format,
provided they are meaningful (e.g., TBS.1 for Lesson 1 of The Basic School
Course; BIC.10 for Lesson 10 of the Basic Infantryman Course). School SOP will
provide guidance for the assignment of designators. Further, the rationale for
assigning lesson/exam designators is located in Section IV of the POI. See the
MCAIMS User’s Manual for further explanation. Figure 3-2 below is a sample
course structure.

SAMPLE COURSE STRUCTURE

TD-1     Lesson                  Method      DESIG   Est Hours
______   SAT Overview            L           IT-00   1.5
______   Effective Comm          L/D/PA/G    IT-01   2.5
______   Effective Comm Exam     X (W)       IT-04   1
______   Conduct a Lesson        L/D/PA/G    IT-04   2
______   Conduct a Lesson Exam   X (P)       IT-08   1

Note: Refer to Chapter 7, Administration, for guidance on how to use the
Course Structure to produce a Course Schedule.

Figure 3-2


3200. DEVELOP CONCEPT CARDS

A concept card is a document that gives the reader a snapshot of one entire
lesson, exam, or event during a course of instruction. It identifies all of the
learning objectives, instructional methods and media, and the resources required
to conduct the entire lesson, exam, or event. Concept cards have both a primary
and a secondary purpose. The primary purpose is to provide the school with a
way to manage its resources. The secondary purpose of a concept card is to
document the formal school/detachment's plan for implementing the ITSs.
Concept cards make up the bulk of Section IV of the Program of Instruction (POI)
and are produced in MCAIMS. A concept card must be produced for each lesson,
exam, and administrative event for inclusion in the Master Lesson File (MLF).

3201. CATEGORIES OF CONCEPT CARDS

There are two categories of concept cards: Academic and Administrative.

1. Academic Concept Card There are three specific types of academic concept
cards: Lesson Purpose, Task Oriented, and Exam.

a. Lesson Purpose Lesson purpose concept cards are created when the
instructional content is not specifically related to a task list (ITS) for the
course and does not address any TLOs/ELOs. The lesson purpose
concept card will have a clearly defined lesson purpose statement
reflecting the rationale for presenting the lesson (i.e., orientation or
overview).

b. Task Oriented Task oriented concept cards identify the instructional
content as it relates to at least one task within the task list for the course.
They address one or more TLO(s) and their associated ELOs being taught.

c. Exam Exam concept cards capture the resources required for the
evaluation of at least one task within the task list (ITS/T&R Event) for the
course. One exam concept card is created for each initial exam
administered over the course of instruction. Retesting is a remedial
action available to school commanders but must be accomplished with
resources provided for the course. The remediation hours and resources
required for any retests are captured in an administrative concept card.
These hours do not go toward the 40-hour training week. Exceptions to
this policy may be authorized by CG TECOM.

Note: Every task oriented/exam concept card will have the TLO or associated
ELOs that the lesson/exam supports.

2. Administrative Administrative concept cards capture all of the
non-instructional information required to conduct the course. An example would
be the graduation ceremony upon completion of a course. Administrative concept
cards are found in Annex Z of the POI.


3202. CONCEPT CARD ELEMENTS

MCAIMS is the program used by the Marine Corps to record all elements of the
concept card. See APPENDIX B for a sample paper-based Concept Card. See the
MCAIMS User’s Manual for instructions for recording in MCAIMS. The elements of
a concept card are:

1. Heading The information in the heading will include the name of the course,
letter of the annex, and the title of the annex. MCAIMS will indicate the date
printed in the upper right hand corner.

2. Annex Annexes are established with an alpha designator to represent
subject areas into which concept cards are grouped. They may be established
according to the duty areas by which ITSs/T&R events are organized or according
to some other logical grouping. Annexes A through Y are reserved for academic
concept cards (task-oriented, lesson purpose or exam). Exam concept cards may
be assigned to the same annex in which the related task oriented cards appear or
may be assigned a separate annex of their own. Annex Z is reserved exclusively
for administrative concept cards.

3. Lesson, Exam, Event ID The lesson, exam, or event ID is assigned during
the development of the course schedule and is recorded here.

4. Lesson, Exam, Event Title The lesson, exam, or event title is assigned
during the development of the course schedule and is recorded here.

5. Hours (Total) The total amount of time required to conduct a lesson, event,
or exam is reflected here. This is automatically calculated within MCAIMS based
on the total of hours allocated to individual methods. See hours (per method)
below.

6. Method Instructional methods selected during the design phase and
recorded on the LOW are assigned to the concept card. This is done not only for
quality control, but also as a quick reference. The method is recorded as a code
or abbreviation.

7. Hours (per method) The overall time for the lesson is broken down to
reflect how much time is allotted for each of the selected methods.

8. Student Instructor (S:I) Ratio The student instructor ratio is determined
so that it complements the selected method. For example, a ratio of 30:1
(students:instructor) would be appropriate for a lecture. During practical
application, where additional instructors are required for closer supervision of the
students, a 6:1 or 30:5 ratio might be necessary. The difference between these two
ratios is determined by how the practical application exercise is actually
conducted. Are the students and instructors working in small groups, or are
additional instructors merely added to the classroom for additional control?
Mathematically, MCAIMS will treat the ratios the same for the Instructor
Computation Worksheet (Lockstep). Curriculum developers must remember the
concept card provides a “snapshot” of what is actually transpiring in the
classroom. (See the concept card section of the MCAIMS Users Manual for
specific guidance.)
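The point that mathematically equivalent ratios produce the same instructor
count can be illustrated with a small sketch. The rounding rule and class size
here are illustrative assumptions, not the actual MCAIMS Instructor Computation
Worksheet logic.

```python
import math

def instructors_required(class_size: int, students: int, instructors: int) -> int:
    """Minimum instructors for a class under an S:I ratio of students:instructors.

    Illustrative only -- the actual Instructor Computation Worksheet
    computation lives in MCAIMS and is not reproduced here.
    """
    # Students one instructor can supervise under this ratio
    students_per_instructor = students / instructors
    # Round up: a partial group still needs a whole instructor
    return math.ceil(class_size / students_per_instructor)

# A 6:1 ratio and a 30:5 ratio are mathematically identical for 30 students...
count_6_to_1 = instructors_required(30, 6, 1)
count_30_to_5 = instructors_required(30, 30, 5)
# ...but the concept card should still describe how the exercise is actually
# run (small groups vs. extra instructors in one classroom).
```

This is why the ratio recorded on the concept card should reflect how the
exercise is actually conducted, not just the arithmetic.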

9. Media Media that were selected during the design phase and recorded on
the LOW are assigned to the concept card. This is done not only for quality
control, but also as a quick reference. The media are recorded as a code or
abbreviation.

10. Learning Objectives On task oriented or exam concept cards, the learning
objectives included in the lesson for that task are recorded.

11. Lesson Purpose A detailed lesson purpose statement will replace the
learning objectives on a Lesson Purpose Concept Card.

12. Ammo Requirements Those learning objectives requiring ammunition for
instruction and evaluation must have the Department of Defense Identification
Code (DODIC) and nomenclature for each ammunition type used. This
information can be found in MCO P8011.4H, Marine Corps Table of Allowance for
class V (w) material (peacetime). The DODICs are broken down by the number of
rounds per student, expended and unexpended, during the execution of the
lesson. The number of support rounds, expended and unexpended, are also
recorded by DODIC.

Notes. This section can be used to provide a word picture describing the
execution of the class, exam, or event. It can be used to capture, in detail, any
information that clarifies additional instructional and resource requirements such
as:

a. Safety (e.g., highest initial and residual Risk Assessment Codes [RAC])
b. Justification of exam method
c. Instructor ratios (e.g., justification of additional instructors for different
methods and safety)
d. Logistical requirements (e.g., requests for transportation, ammunition, or
ranges)
e. External personnel support (e.g., corpsman, RSO, contractors, guest
lecturers)
f. External facilities (e.g., pool, laboratories)
g. External equipment support (e.g., helicopters, computers, radios, tanks)

Note: Any logistical requirements identified on task oriented concept cards will
be transposed to the Instructor Preparation Worksheet.

13. References References are source documents that include doctrinal
publications, technical manuals, field manuals, and Marine Corps Orders. These
references provide guidance for performing the task in accordance with the given
conditions and standards.

14. Signature Blocks MCAIMS allows names or titles of up to five officials to be
entered on the concept card for the purpose of routing/approval. Routing and
approval procedures are normally found in the formal school/detachment SOP.

15. Optional Items Phase and group designators are optional elements that
can be entered to identify certain timeframes or instructional teams responsible
for specific instruction during implementation of the POI. For example, Recruit
training has Forming, 1st, 2nd and 3rd Phase. During 2nd phase Weapons and Field
Training Battalion (WFTB) is responsible for the instruction of marksmanship skills.


3300. CONDUCT AN OPERATIONAL RISK ASSESSMENT (ORA)

The formal school/detachment conducts an Operational Risk Assessment (ORA)
on all events, lessons and exams. The concept of Risk Management formalizes
the process of training safely and efficiently. Rather than relying solely on
individual experience to achieve the best results, risk management provides
careful analysis and control of hazards in each lesson.

3301. STEPS FOR CONDUCTING AN ORA

There are two circumstances when a curriculum developer will conduct an
Operational Risk Assessment (ORA): an ORA is conducted either on a new
lesson or on an existing lesson. In either case, the decisions for conducting
an ORA must be based upon well-documented facts about the lesson.

When conducting an ORA on an existing lesson, there is more information to
work with, such as test results, After Instruction Reports (AIR) and even mishap
reports. For a lesson under development, the initial ORA is conducted once the
concept card is completed. The concept card provides what will be taught
(objectives), which methods will be used, and what forms of media will support
the lesson. There may also be important information about ammunition and
other support requirements that by their nature have associated hazards. Based
on information from the concept card and other references, the initial ORA is
completed. This enables the curriculum developer to proceed with developing
the lesson materials while considering the related hazards and identified controls.
By conducting the ORA at this point, safety measures can be incorporated into
the lesson plans and related documents. Therefore, risks to personnel (students,
instructors, etc.) are minimized even before the first validation of the lesson. It’s
important to understand that this will be the initial ORA and that it is necessary
to review, update, and finalize the ORA after validation.

The steps for conducting an ORA are as follows:

1. Identify hazards.
2. Assess hazards.
3. Make risk decisions.
4. Implement controls.
5. Supervise.
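The five steps above can be sketched as a data structure that mirrors one row
of the ORA Worksheet. The field names and RAC numbers below are illustrative,
not the official ORAW column headings.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ORAWRow:
    """One hazard entry on an ORA Worksheet (illustrative field names)."""
    major_step: str                      # from the learning objectives
    sub_step: str                        # from SMEs, references, or the LAW
    hazard: str                          # Step 1: identify hazards
    initial_rac: Optional[int] = None    # Step 2: assess hazards
    controls: list = field(default_factory=list)  # Step 3: make risk decisions
    residual_rac: Optional[int] = None   # Step 3: reassess with controls in place
    how_to_implement: str = ""           # Step 4: implement controls
    how_to_supervise: str = ""           # Step 5: supervise

# Worked through the car-jack example used later in this section:
row = ORAWRow(
    major_step="Jack up a car",
    sub_step="Raise the vehicle",
    hazard="Vehicle slips off the jack",
)
row.initial_rac = 2                      # notional value from the RAC matrix
row.controls.append("Use jack stands; instructor demonstrates placement")
row.residual_rac = 4                     # reassessed with the control in place
row.how_to_implement = "Instructor issues jack stands before practical application"
row.how_to_supervise = "Instructor observes each student's stand placement"
```

Each column of the worksheet is filled in as the corresponding step is completed,
which is why the steps must be performed in order.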


1. Identify Hazards Identifying hazards is the most important of the five steps
in an Operational Risk Assessment. Failure to recognize a hazard means that a
control measure will not be developed and implemented. As a result, the hazard
could negatively impact training. This is the most time-consuming step in the
ORA. Time and resources must be spent wisely to identify all hazards associated
with a lesson.

There are four sub-steps that must be completed to effectively identify hazards:

a. Gather resources.
b. Identify the major steps (learning objectives).
c. Identify sub-steps (skills).
d. Identify hazards.

a. Gather Resources There are several resources that may be used to assist in
conducting this analysis. Because identifying hazards is so critical, use all
resources that are available. Some suggestions are:

1) References References, especially technical manuals, often include
warning statements or icons that warn the user of a hazard. They can also
help identify hazards associated with the sub-steps.

2) The Master Lesson File The LAW will definitely help in identifying the sub-
steps. The LOW will provide visibility of the test items that may have hazards
associated with them. The concept card lists the learning objectives which
may be considered as the major steps associated with a lesson or exam. The
concept card will also list the major steps for an event (e.g., physical training,
graduation). Methods and media used for the lesson are identified on the
concept card and may have associated hazards. A lecture about using
explosives obviously has fewer hazards than a demonstration. Likewise, a
video of an explosion has less risk than observing an actual demolition. When
conducting an ORA on an existing lesson, it is also necessary to review the
lesson plan, student outline and media for reference to possible hazardous
situations.

3) School SOP/Orders These documents can provide guidance (like actions
during inclement weather) that will act as controls. They may also spell out
who has authority to make risk decisions at different levels.

4) An ORA Worksheet (ORAW) The ORAW is used to record the results of an
Operational Risk Assessment. It’s helpful to obtain an electronic copy of the
ORAW, since the columns are small and it is hard to handwrite all of the
information in a column.

5) A Subject Matter Expert (SME) An SME is someone who has thorough
knowledge of a particular job. It’s important to remember that the “operation”
that is being analyzed is a training operation in a formal school. It’s not just
an experienced job holder that is needed, but one who has experience
teaching. In this case, an SME is an instructor who has taught the lesson
being analyzed or a similar lesson.

6) Data from past classes. If conducting an ORA on an existing lesson, data can
be gathered by interviewing graduates, studying test results, and reviewing
mishap reports. This can reveal hazards not found elsewhere. Data and points
of contact for former students can be found in the course books or other course
records.

b. Identify the Major Steps To identify the major steps in a lesson, begin by
looking at the learning objectives on the concept card. Both the terminal and
enabling learning objectives are the major steps in the lesson. Record the major
steps in the column labeled “Major Steps” on the ORAW. It’s not necessary to
record more than the behavior statement from the learning objectives. If an ORA
is being conducted on a “lesson purpose” lesson (a lesson with no learning
objectives), identify the major steps by reviewing the lesson plan. The major
steps are the main points of the lesson.

Figure 3-3. ORAW with Major Steps listed.

c. Identify Sub-Steps To determine what hazards exist, the major steps must
be broken down into sub-steps. If the lesson is being conducted solely by
lecture, there are probably no hazards. Demonstrations are normally safe for the
students, but may expose the demonstrator to hazards. A practical application
will expose all personnel to hazards. Keep the methods in mind when
determining the sub-steps. The sub-steps can be determined by consulting the
SMEs, reading a technical manual or other publication, or reviewing knowledge,
skills, and attitudes from the LAW. For example, if a major step is to jack up a
car, the sub-steps would be: (1) remove the jack, (2) place jack under the car,
and (3) raise the vehicle. This information is readily available by researching the
references or the LAW. The SME can add important details, like describing the
instructional environment or how training sub-steps are different from real-world
performance. This type of information can only come from an SME; it’s not in the
books! Record the sub-steps in the column labeled Sub-steps on the ORAW.


Figure 3-4. ORAW with Sub-Steps listed.

d. Identify Hazards To identify hazards, look at each sub-step and ask the
question, “What could happen while performing this sub-step that may cause
lesson failure, injury to the student, or damage to property?” Here are some
resources to help answer this question:
1) Technical publications
2) SME
3) Concept card
4) Lesson Plan (if developed)
5) AIRs and IRFs
6) Mishap reports

Technical publications often contain warnings about hazards. Consider if this
warning is applicable to the training environment, and if so, record it. An SME who
has taught the lesson (or a similar lesson) before can also help. That SME will
know intricate details of the sub-step that will help identify all hazards. Interview
the SME to determine details about the training, such as mistakes that students
often make. This will allow the curriculum developer to plan for potential
mistakes by developing a control to manage the hazard. Consider not just the
sub-steps, but also the method of instruction. If an SME is not available,
determine how the lesson will be taught by looking at the concept card. If
conducting an ORAW on an existing lesson, the lesson plan will provide
information in more detail than the concept card. Record the hazards associated
with each sub-step in the column labeled List Hazards on the ORAW.


Figure 3-5. ORAW with Listed Hazards.

2. Assess Hazards A hazard is defined as a condition that can impair
mission accomplishment, but it does not indicate to what extent. The risk
associated with a hazard is quantified by using the Risk Assessment Code Matrix.
The matrix is used as a tool to assess the severity and probability of each hazard
and to assign a Risk Assessment Code (RAC).

a. Assess the Level of Severity of the Hazard (Level I, Level II, Level
III, and Level IV) The severity is defined as the potential degree of injury,
illness, property damage, loss of assets (time, money, personnel), or effect on
mission. If the hazard does occur, how bad will the damage be? This data may
be found by reviewing mishap reports from the school and units, if available. If
not, use the worst realistic case that could possibly happen. Hazard severity
categories are assigned as Roman numerals according to the following criteria:

Category I – The hazard may cause death, loss of facility/asset or result in
grave damage to national interests.

Category II – The hazard may cause severe injury, illness, property damage,
damage to national or service interests or degradation to efficient use of assets.

Category III – The hazard may cause minor injury, illness, property damage,
damage to national, service or command interests or degradation to efficient use
of assets.

Category IV – The hazard presents a minimal threat to personnel safety or
health, property, national, service or command interests or efficient use of assets.


b. Assess the Probability of the Hazard (LIKELY to happen, PROBABLY
will happen, MAY happen, or UNLIKELY to happen) What is the chance the
hazard will occur? Again, use available data or make a realistic estimate. Mishap
probability will be assigned an English letter according to the following criteria:

Sub-category A – Likely to occur immediately or within a short period of time.
Expected to occur frequently to an individual item or person or continuously to a
group.

Sub-category B – Probably will occur in time. Expected to occur several times
to an individual item or person or frequently to a fleet, inventory or group.

Sub-category C – May occur in time. Can reasonably be expected to occur
some time to an individual item or person or several times to a group.

Sub-category D – Unlikely to occur.

c. Risk Assessment Code – The RAC is an expression of risk that combines
the elements of hazard severity and mishap probability. Using the matrix shown
below, the RAC is expressed as a single Arabic number that can be used to help
determine hazard abatement priorities.

Figure 3-6. Risk Matrix

Find the Intersection of the Severity Row and the Probability Column. This is the
RAC. Record this RAC on the ORAW in the column labeled “Initial RAC.”
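The intersection lookup can be sketched as a table in code. The RAC values
below follow the common naval safety 4x4 matrix and are an assumption; verify
them against Figure 3-6 before use.

```python
# RAC matrix: severity category (I-IV) x mishap probability (A-D) -> RAC 1-5.
# 1 = most serious risk. Values assume the common naval safety matrix;
# confirm against the approved matrix (Figure 3-6) before relying on them.
RAC_MATRIX = {
    "I":   {"A": 1, "B": 1, "C": 2, "D": 3},
    "II":  {"A": 1, "B": 2, "C": 3, "D": 4},
    "III": {"A": 2, "B": 3, "C": 4, "D": 5},
    "IV":  {"A": 3, "B": 4, "C": 5, "D": 5},
}

def assign_rac(severity: str, probability: str) -> int:
    """Find the intersection of the severity row and the probability column."""
    return RAC_MATRIX[severity][probability]

# A severe-injury hazard (Category II) that may occur in time (Sub-category C):
initial_rac = assign_rac("II", "C")  # recorded in the "Initial RAC" column
```

A lower RAC number indicates a more serious risk, which is why controls are
developed first for the hazards with the highest-priority (lowest-numbered) RACs.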


3. Make Risk Decisions When presented with risks in a lesson, a decision
must be made as to whether the benefits of the lesson outweigh the risk. Before
considering this, an effort must be made to control the risk. Risk controls are
designed to change risk by lowering the probability of occurrence and/or
decreasing the severity of a risk.

a. Begin by Focusing on the Hazards that Have the Highest RAC If it
is determined that a hazard with a high RAC cannot be minimized, then a decision
may be made to not conduct the training at all. In this case, time and resources
are not expended on analyzing hazards with low RACs.

b. Decide if the Benefit of the Training Outweighs the Risk If the
controlled risk outweighs the benefit, the decision-maker may still choose to
accept the risk.

c. Select a Control Measure Look at different ways to work with the risk
in order to mitigate it to the lowest level possible. There are many decisions that
can be made on how to handle the associated risk. Ensure those controls will
allow the learning objectives to be met. Some options are listed below:

1) Accept the Risk There will always be a risk level associated with any
hazard; however, there are acceptable levels of risk. For hazards where
the risk level is low or when the benefits outweigh the possible costs for
higher risks, the decision may be made to accept the risk. When training
reconnaissance Marines, trainers must accept some of the risk.

2) Reduce the Risk This is the most widely used variety of risk control.
Look at all available resources (e.g., SMEs, technical manuals, safety
officer) when developing controls.

3) Avoid the Risk Avoiding the risk requires canceling or delaying that
portion of the lesson, but this option is rarely used due to the importance
of the lesson. Sometimes it may be possible to avoid a risk by going
around the risk, or performing the operation in a different way. Pilot
trainees avoid the risk of crashing by doing much of their training in a
simulator.

4) Compensate for the Risk Creating a redundant capability may
compensate for the risk under certain circumstances. A driving instructor
often has two brake pedals in a training car. If the student fails to stop,
the redundant brake pedal gives the instructor control, thus reducing the
risk.

5) Delay the Risk The risk may be delayed for a couple of hours or a day
if need be to reduce the severity and probability of the risk. A forced march
at a school could be delayed during inclement weather, for example.


6) Spread the Risk Risk is commonly spread out either by increasing the
exposure distance or by lengthening the time between exposures to an
element. By spreading the distance or exposure, the chance of the risk
occurring is reduced, and the severity may be decreased proportionately
as well.

7) Transfer the Risk When the possible losses or costs are shifted or
transferred to another entity, the risk to the original individual or
organization may be either greatly decreased or eliminated altogether.

8) Reject the Risk A decision to reject the risk may be made when the risk
exceeds the lesson benefits. Sometimes, rejecting a risk at the curriculum
developer level may mean that someone else at a higher level will have to
reconsider the risk.

d. Record Controls on the ORAW Once the best control has been
determined, record the results on the ORAW in the column labeled Develop
Controls. Reassess each RAC with the selected control in place. Since a control
should reduce the severity, probability, or both of a hazard, the residual RAC is
probably lower than the original one. Record the resulting lower RAC on the
ORAW in the column labeled “Residual RAC.”

e. Provide Cease-Training Criteria A special type of control is the
Cease-Training Criteria (CTC). Instructions for Cease Training must be included
when there is potential for serious injury or damage to equipment. CTC is a
pre-determined circumstance where training must be stopped to avoid a serious
incident. A list of the criteria is recorded on the ORAW. These criteria are also
recorded during the introduction of the lesson plan, so that the instructor can
brief the class on the CTC, how training is stopped, and how training can resume.
Specific procedures on resuming training may be found in the formal
school/detachment SOP.

Figure 3-7. ORAW with Risk Decisions.


f. Risk versus Benefit Once the control measures have been selected, it is
time to make risk decisions. Analyze the overall level of risk for the lesson with
the selected controls in place. Decide if the benefits of the lesson outweigh the
reduced level of risk. If the risk level is still too high for the benefits, a decision
to continue or discontinue training must be made at the proper level.

g. Identify the Proper Level of Decision Making The person at the right
decision level must be able to effectively oversee implementation of these
controls. Once this is known, making control decisions involves two components:

1) Which controls to implement. Exercise those controls that will reduce the
lesson risk to an acceptable level.

2) How much can be spent. Consider the cost of resources needed to
implement the control.

4. Implement Controls Once the risk control decision is made, resources
must be available to implement the specific controls. Four things must be
accomplished in this step:

a. Record the Instructions Record how to implement the controls on the
ORAW in the column labeled “How to Implement.” For example, if the control for
an eye hazard is to wear impact-resistant safety goggles, record who will make
that happen (probably the instructor), when they will do it, etc. The idea is to
explain the who, what, when, where, and how of implementing the control.
Record that information in this column.

Figure 3-8. ORAW with Implement Controls and CTC.

b. Make Implementation Plan Clear The implementation plan for
providing the necessary controls must be clear and concise. For example, if
safety goggles are listed as a control, be sure to include information like who will
get the goggles, who will maintain them, instructions for inserting notes about
wearing them in the lesson plan or student materials, etc.

Chapter 3 3-16
Systems Approach To Training Manual Develop Phase

c. Establish Accountability All key players must know what they are
responsible for and be held accountable for their part of the plan. Establish
accountability by writing specific names or billet titles on the ORAW.
Accountability for the entire plan is also established in the “Approving Signature”
block.

d. Provide Support The formal school/detachment must provide support to
the people helping accomplish the plan.

STEP 5

5. Supervise The fifth and final step of the Operational Risk Assessment
involves determining the efficiency and effectiveness of risk controls throughout
the operation. Supervision requires the monitoring of risk control implementation
to ensure that all controls are implemented as planned. Any ineffective controls
are detected and corrected. Any unforeseen hazards are recorded and controls
developed.

a. Document the Supervision Plan Explain the details of the supervision
plan on the ORAW in the column labeled “How to Supervise.” These plans should
include details on how the instructor is to supervise the students.

b. Supervise the Implementation There are many ways to evaluate
whether the plan is functioning as intended. Some of the ways are
through the After Instruction Report (AIR), End of Course Critiques (ECC) and by
observing the lesson being taught. To ensure effectiveness, ask “Does the plan
work?” or “Are the controls working and are they effective?” If not, what needs
to be changed?

c. Decide a Course of Action Review the cost/benefit balance. Based on
the data gathered from supervision efforts, there may be some things that need
to be changed to make the plan more effective.

d. Document the Changes Any changes in the original plan must be
documented. All changes must be made clear to everyone involved in the course,
such as the course chief, instructors, etc. Finish the ORAW by having the proper
authority sign the ORAW in the “Approving Signature” block.

Figure 3-9. Approved ORAW.


SECTION 3400. DEVELOP LESSON MATERIALS

The purpose of developing instruction is to generate the lesson plans, student
outlines, supplemental student material, media, and an Instructor Preparation
Guide (IPG) to support the training. Curriculum developers must create materials
that support student learning and complement instruction. Real world relevance
is the key in the development of lesson materials to maximize the transfer of
learning from the instructional setting to the job. Relevance dramatically
increases the student’s motivation to learn and retain those skills that will help in
the performance of the job. The steps in developing lesson materials include
securing resources, writing the lesson plan and student materials, developing the
media, and creating an Instructor Preparation Guide.

3401. SECURE RESOURCES

LOW – Learning Objective Worksheet.
LAW – Learning Analysis Worksheet.

The development of instructional materials begins with acquiring all the resources
necessary for instruction. A final review of the concept card for the lesson needs
to be made to ensure that required resources are available. The concept card
will provide the methods to use, how much time is allowed for each method, the
type of media, instructor/student ratio, and other notes regarding the lesson. In
addition to the concept card, the Learning Analysis Worksheet(s) (LAWs),
Learning Objective Worksheet(s) (LOWs), and all applicable references are
reviewed to provide the background and thought process from the Design phase.
This information will assist the curriculum developer in ensuring that the written
lesson meets the intended lesson goals.

3402. WRITE A LESSON PLAN

The lesson plan is a comprehensive document that details the strategy to be used
in teaching the learning objectives. Before learning the mechanics of writing a
lesson plan, it is important to understand the function and components of the
lesson plan.

1. Functions of a Lesson Plan The lesson plan has three functions:

a. First, it provides the instructor, regardless of individual level of expertise,
with a detailed script establishing all the information concerning preparation and
delivery of the lesson content.

b. Second, it establishes continuity and standardization of instruction so that
the same information is taught every time.

c. Third, it provides a historical document of what has been taught at the
formal school/detachment.


2. Components of a Lesson Plan The title page, introduction, body, and
summary are the primary components found in a lesson plan. Refer to Appendix
B for a sample of the format to use when writing the lesson plan.

a. Title Page The title page is the cover sheet for the lesson plan.

b. Introduction The introduction is written to gain the attention of the
student at the beginning of the lesson and provide an overview of what the
student can expect in relation to the lesson. There are seven parts to an
introduction. They are the gain attention, overview, learning objectives, method
and media, evaluation, Safety/Cease Training brief (per the ORAW), and the
transition into the body. These parts will be discussed in more detail later in this
section.

c. Body The body of a lesson plan is a detailed script of the content to be
presented. It is written to cover all of the information necessary to master the
learning objectives for the lesson. It includes the main ideas, lesson content,
transitions, instructor notes, and cues for the instructor.

d. Summary The summary provides a review of the main ideas, reinforces
the importance of the content, and includes administrative instructions for the
students.

How to Write a Lesson Plan

Writing the lesson plan involves producing a detailed script that establishes what
needs to be said and done during the lesson so that the students are able to meet
the learning objectives. The lesson plan is written in the following sequence:

1. Title Page.
2. Body.
3. Introduction.
4. Insert Transitions, Instructor Notes, and Cues.
5. Summary.

The title page is produced first. Next, the body is outlined and written
so that a conceptual framework is established. This conceptual
framework establishes the main ideas and the sequence in which they
will be introduced. Since the introduction provides an overview of the
conceptual framework, it is written after the body is developed. Once
the introduction is completed, transitions, instructor notes, and cues are
inserted into the lesson. Last, the summary is written to bring closure
to the lesson.
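The writing sequence above deliberately differs from the order in which the components appear in the finished plan. A minimal sketch of that distinction follows; the component names come from the Manual, but representing them as Python lists and the `assemble` helper are purely illustrative.

```python
# Illustrative sketch: the order components are WRITTEN versus the order
# they APPEAR. The names come from the Manual; the code is hypothetical.

# Writing order: the body is drafted before the introduction so the
# conceptual framework exists before it is previewed.
WRITING_ORDER = [
    "Title Page",
    "Body",
    "Introduction",
    "Transitions, Instructor Notes, and Cues",
    "Summary",
]

# Order of the primary components in the finished lesson plan.
DOCUMENT_ORDER = ["Title Page", "Introduction", "Body", "Summary"]

def assemble(drafts):
    """Return drafted components in document order, regardless of the
    order in which they were written. `drafts` maps component name to
    its drafted text."""
    return [name for name in DOCUMENT_ORDER if name in drafts]
```

Whatever order the drafts were produced in, `assemble` emits them title page first, then introduction, body, and summary.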


The following steps detail how to write each component:

STEP 1

1. Title Page The title page contains the school’s name and address centered
at the top. Centered in the middle of the page are the lesson title, lesson
designator, course title, course identification number, and the date the lesson
was developed or revised. At the bottom of the page is the signature block and
date of approval. (See Appendix B for a sample lesson plan.)

STEP 2

2. Writing the Body When writing the body, the curriculum developer
establishes and sequences the main ideas, inserts content, provides examples,
and determines and inserts methods. This is done so that the material flows and
aids the transfer of learning.

Steps in writing the body are as follows:

a. Establish Main Ideas.
b. Sequence Main Ideas.
c. Insert Content.
d. Provide Examples.
e. Determine Placement of Methods.
f. Insert Methods.

a. Establish Main Ideas Within the body, main ideas are numbered 1, 2,
3, and so on. As a general rule, the main ideas correspond with the learning
objectives. However, there are times when an ELO is complex and must be
broken into more than one main idea. The main ideas need to be bold,
underlined, and uppercase so that they can be easily distinguished from the rest
of the lesson content by the instructor. The grouped and sequenced knowledge
and skills that were produced during the learning analysis can be used to break
the main ideas into specific subheadings detailing the outline. Such knowledge
and skills provide the curriculum developer with more comprehensive information
to cover within the lesson.

The format for the body is as follows:

1. MAIN IDEA #1.

a. Paragraph Heading.

(1) Paragraph Heading.

(a) Paragraph Heading.

1 Paragraph Heading.

a Paragraph Heading.
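The five levels of subordination above can be generated mechanically. The following helper is hypothetical (not part of the SAT Manual); it simply reproduces the label at each outline depth — depth 0 is a main idea ("1."), then "a.", "(1)", "(a)", and the bare "1" and "a" that appear underlined in the printed plan.

```python
# Illustrative helper producing the paragraph label for each level of the
# outline format shown above. Hypothetical code, not from the Manual.

import string

def outline_label(depth, index):
    """Return the label for the `index`-th heading (1-based) at `depth`."""
    letter = string.ascii_lowercase[index - 1]
    forms = [
        f"{index}.",    # depth 0: 1.  (main idea)
        f"{letter}.",   # depth 1: a.
        f"({index})",   # depth 2: (1)
        f"({letter})",  # depth 3: (a)
        f"{index}",     # depth 4: 1  (underlined in the lesson plan)
        f"{letter}",    # depth 5: a  (underlined in the lesson plan)
    ]
    return forms[depth]
```

So the third sub-subparagraph at depth 2 is labeled `(3)`, and the second at depth 3 is `(b)`, matching the format block above.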

b. Sequence Main Ideas The main ideas are normally presented in the
same order as the learning objectives. The initial sequence of the learning
objectives was determined during the Design Phase. However, the curriculum
developer may have to re-sequence the main ideas to ensure that the lesson plan
flows logically, student retention is maximized, and logistical constraints are met.


c. Insert Content Content is now inserted to explain, in detail, the main
ideas and subheadings. The information is compiled from the references noted on
the concept card (e.g., technical manuals, Orders, and reference publications). It
is the curriculum developer’s responsibility to ensure that the information is
explained in such a way that the instructor can easily understand the content.

d. Provide Examples In addition to the teaching points, real world
examples and experiences are placed in the outline to aid the learning process
and provide realism for students. The material must be structured to present a
variety of examples to explain a topic. The use of multiple examples helps
students with varying experience and comprehension levels better understand the
material. Examples also emphasize how to do or how not to do something and
why.

e. Determine Placement of Methods The methods that were
determined during the Design Phase are listed on the concept card. However, the
placement of the method in the lesson plan is determined at the time that the
lesson plan is being developed. For performance-based learning, the placement
of lecture, demonstration, and practical application methods in the lesson plan is
important to the learning process and level of retention.

1) Lecture (Formal/Informal) Lecture is assumed as the method for
presenting the content of the lesson unless another method is noted.
All other methods will have an associated instructor note and specific
directions for employing that method. Lecture is generally used to
provide the foundational knowledge required for the development of
skills or attitudes. For instance, lecture is generally used before
demonstration and practical application so that the students are
familiar with the process or procedures before seeing and performing
them.

2) Demonstration Demonstrations usually take place during or
following the lecture. When using demonstration, the instructor
explains the process or procedure while performing it for students.
Demonstrations can be placed anywhere in the lesson, but are
normally placed immediately prior to the practical application.

3) Practical Application Whenever students are learning a new skill or
acquiring knowledge, they should be given the opportunity to practice
what was taught. The more practice students are given during
instruction, the more likely students are to retain the information and
apply it to the job. When a practical application is involved, decisions
must be made on the placement and frequency of the practice
session(s) (massed versus distributed). Additionally, it must be
determined whether the task(s) need to be taught as a whole or
broken into parts (whole versus part practice sessions). More detail
on these types of practice sessions is provided below, along with
charts to aid in this decision-making process.

 Massed Versus Distributed Practice Sessions
 Whole Versus Part Practice Sessions


 Massed Versus Distributed Practice Sessions In massed practice,
the learner engages in one or a few intensive, extended periods of
practice with little or no rest between. The alternate form of practice is
called distributed, in which the learner participates in many relatively
short practice sessions spaced over time.

Based on the time constraints of the course, the curriculum developer
must decide whether to divide practice periods into segments of
distributed practice or plan one continuous session of massed practice.
For instance, distributed practice interspersed with rest periods
permits more efficient learning of psychomotor skills than does massed
practice. The reason for this is that rest periods allow students to
overcome the fatigue that builds up when performing the same
procedures continuously. The greater the length or difficulty of the task,
the more appropriate distributed practice is relative to massed practice.

Practice sessions should be:

                 Shorter & More Frequent             Longer and Less Frequent

If the Task      Is simple, repetitive, or boring    Is complex
                 Demands intense concentration       Has many elements
                 Is fatiguing                        Requires warm-up
                 Demands close attention to detail   Is a new one for the performer

If the Learner   Is young or immature (unable to     Is older or more mature
                   sustain activity)
                 Has short attention span            Is able to concentrate for long
                                                       periods of time
                 Has poor concentration skills       Has good ability to focus attention
                 Fatigues easily                     Tires less easily

Smith and Ragan (1999), Instructional Design, 2nd Edition.
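The chart can be read as a simple tally: each task or learner attribute argues for shorter/more-frequent (distributed) or longer/less-frequent (massed) sessions. The sketch below is a hedged illustration of that reading — the attribute names are paraphrased from the chart, and the equal-weight vote counting is an assumption of the example, not a rule from Smith and Ragan or the Manual.

```python
# Hedged sketch turning the chart above into a tally. Attribute names are
# paraphrased from the chart; the equal-weight scoring is an assumption.

SHORTER_MORE_FREQUENT = {
    "task_simple_repetitive_or_boring", "task_demands_intense_concentration",
    "task_fatiguing", "task_demands_close_attention",
    "learner_young_or_immature", "learner_short_attention_span",
    "learner_poor_concentration", "learner_fatigues_easily",
}
LONGER_LESS_FREQUENT = {
    "task_complex", "task_many_elements", "task_requires_warm_up",
    "task_new_to_performer",
    "learner_older_or_mature", "learner_long_concentration",
    "learner_good_focus", "learner_tires_less_easily",
}

def recommend_practice(attributes):
    """Tally which column of the chart the given attributes fall into."""
    shorter = len(attributes & SHORTER_MORE_FREQUENT)
    longer = len(attributes & LONGER_LESS_FREQUENT)
    if shorter == longer:
        return "either"
    if shorter > longer:
        return "distributed (shorter, more frequent sessions)"
    return "massed (longer, less frequent sessions)"
```

A fatiguing task taught to learners with short attention spans, for example, tallies toward distributed practice, consistent with the psychomotor-skill discussion above.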

 Whole Versus Part Practice Sessions The curriculum developer must
decide if it is more efficient to teach an entire task at each practice session
(whole) or to teach individual subtasks initially (part) and begin combining
them as the student masters each subtask. For tasks that have highly
interrelated subtasks (e.g., preparation of an operations order), the whole
method is more efficient than the part method. When tasks do not have
highly interrelated subtasks (e.g., preventive maintenance of the M16A2
rifle), the part method is superior to the whole method.

                 Emphasize Whole                           Emphasize Parts

If the Task      Has highly dependent (integrated) parts   Has highly individual parts
                 Is simple                                 Is very complex
                 Is not meaningful in parts                Is made up of individual skills
                 Is made up of simultaneously              Requires limited work on parts or
                   performed parts                           different segments

If the Learner   Is able to remember long sequences        Has a limited memory span
                 Has a long attention span                 Is not able to concentrate for a
                                                             long period of time
                 Is highly skilled                         Is having difficulty with a
                                                             particular part
                                                           Cannot succeed with the whole
                                                             method

Smith and Ragan (1999), Instructional Design, 2nd Edition.
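The core of the whole-versus-part rule stated above is subtask interrelatedness. The following one-line helper is a minimal, hypothetical sketch of that rule of thumb: the 0-to-1 interdependence scale and the 0.5 threshold are illustrative assumptions, not values from the Manual.

```python
# Minimal sketch of the whole-versus-part rule of thumb above: highly
# interrelated subtasks favor whole practice; largely independent subtasks
# favor part practice. The scale and threshold are illustrative.

def practice_strategy(subtask_interdependence):
    """0.0 = fully independent subtasks ... 1.0 = fully integrated.

    E.g. preparing an operations order (integrated) would sit near 1.0
    and return 'whole'; M16A2 preventive maintenance (separable steps)
    would sit near 0.0 and return 'part'.
    """
    return "whole" if subtask_interdependence >= 0.5 else "part"
```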


f. Other Methods During the Design phase, the appropriate method(s) for
each learning objective was determined and placed on the Learning Objective
Worksheet (LOW). The placement of method(s) in the lesson plan is determined by
where the content for each learning objective is being taught. However, there are
methods that can cover multiple learning objectives (see Section 2207). One such
method is the case study that is placed at the end of the lesson for reinforcement.
Before making decisions concerning the placement of alternate methods, consider
the following:
1) The amount of knowledge and/or skill that the student needs as
prerequisite for the method to enhance the learning process.
2) The amount of knowledge and/or skill that the student brings into the
lesson.
g. Insert Methods Whenever there is a method, other than lecture, inserted
in a lesson plan, specific instructions must be provided to the instructor. This
provides the instructor with the details so that he/she is able to implement
instruction as intended. When practical application is inserted into the outline,
practice and provide-help headings are indicated to offer detail to the instructor. All
other methods will have student role and instructor role headings. These headings
are described in detail below.
1) Practical Application There are three headings used when inserting
practical application. An example of the format follows.
 Practical Application Heading This heading is uppercase, bold, and
underlined. Beside practical application, general information is provided
to include group size, if applicable, and setup (handouts, turn charts,
actual items to distribute, etc.) for the practical application. The
purpose of the practical application and the desired outcome should be
explained.
 Practice Heading This heading is uppercase and bold. Beside practice,
the curriculum developer describes in detail step-by-step instructions for
what the student’s role in the practical application will be.
 Provide-Help Heading This heading is also uppercase and bold.
Underneath the provide help heading are three subheadings describing
the instructor’s role before, during, and after the practical application.
The subheadings include the safety brief, supervision and guidance, and
debrief.

PRACTICAL APPLICATION. Provide general information to include group size, if
applicable, and setup (handouts, turn charts, actual items to distribute, etc.).
Provide the purpose of the practical application.
PRACTICE: Describe in detail step-by-step instructions for what the student's role
in the practical application will be.
PROVIDE-HELP: Describe the Instructor's role.
1. Safety Brief: (per the ORAW) This is a brief to the students on safety
precautions and what to do if there is a mishap.
2. Supervision and Guidance: Describe what the instructor is doing during the
PA, i.e., moving about the room, assisting students, answering questions.
3. Debrief: (If applicable) Allow participants opportunity to comment on what
they experienced and/or observed. Provide overall feedback, guidance on any
misconceptions, and review the learning points of the practical application.
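A drafted practical-application block can be checked against the required headings mechanically. The checker below is a hypothetical illustration (not from the Manual); the heading strings come from the format above, but the function and its name are ours.

```python
# Illustrative checker that a drafted practical-application block contains
# the headings described above. Heading strings come from the format; the
# function itself is hypothetical.

REQUIRED_HEADINGS = ["PRACTICAL APPLICATION", "PRACTICE", "PROVIDE-HELP"]
PROVIDE_HELP_PARTS = ["Safety Brief", "Supervision and Guidance", "Debrief"]

def missing_pa_headings(block_text):
    """Return every required heading or provide-help subheading that does
    not appear in the drafted block of lesson plan text."""
    return [h for h in REQUIRED_HEADINGS + PROVIDE_HELP_PARTS
            if h not in block_text]
```

A complete block returns an empty list; a draft that stops after the practice instructions is flagged as missing `PROVIDE-HELP` and its three subheadings.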


2) Other Methods There are also three headings used when inserting
other methods. An example of the format is on the next page.
 Method Heading The method heading identifies the method being
used. This heading is uppercase, bold, and underlined. Beside the
method, general information is provided to include group size, if
applicable, and setup (handouts, turn charts, actual items to
distribute, etc.). The purpose and desired outcome should also be
explained here.
 Student Role Heading This heading is uppercase and bold. Beside
student role, the curriculum developer describes in detail step-by-
step instructions for what the student’s role is during the method.
 Instructor Role Heading This heading is also uppercase and bold.
Beside instructor role are three subheadings describing the
instructor’s role before, during, and after method implementation.
The subheadings include the safety brief, supervision and guidance,
and debrief.

DEMONSTRATION. Provide general information to include group size, if
applicable, and setup (handouts, turn charts, actual items to distribute, etc.).

STUDENT ROLE: Describe in detail step-by-step instructions for what the
student's role during the demonstration will be.

INSTRUCTOR ROLE: Describe Instructor's role.
1. Safety Brief: (per the ORAW) This is a brief to students on safety
precautions and what to do if there is a mishap.
2. Supervision and Guidance: Describe a detailed script of exactly what the
instructor is doing during the demonstration.
3. Debrief: (If applicable) Allow students the opportunity to comment on
what they experienced and/or observed. Provide overall feedback, guidance on
any misconceptions, and review the learning points of the demonstration.

STEP 3

3. Write the Introduction There are seven parts to an introduction. They are
the gain attention, overview, learning objectives, method (and media),
evaluation, Safety/Cease Training brief, and the transition into the body.

a. Gain Attention The gain attention is developed to capture the students’
interest. It must relate to the lesson content and inform the students why the
information is important. The gain attention needs to provide the student with
why he/she needs to learn the information. This is often referred to as the
WIIFM ("What's in it for me?"). According to adult learning principles, adults are
motivated to learn to the extent that they perceive what they learn is applicable
to what they do (Refer to Chapter 6 for more on adult learners). By providing the
relevance and intent of the lesson, the attention of the students is gained. The
curriculum developer needs to provide in the lesson plan a completed gain
attention, along with a few other possible ideas. Extra lines are placed below the
gain attention so that the instructor can personalize the completed gain attention
or use one of his/her own ideas. Any changes within the lesson plan need to be
approved by the appropriate personnel in accordance with the formal
school/detachment’s Standing Operating Procedures (SOP).


The following items can be used for gain attentions:

 Film clip.
 Skit.
 Historical/actual event.
 Rhetorical question.
 Unexpected/surprising statement.

Regardless of the type of gain attention used, its elapsed time should be in
proportion to the overall length of the lesson. For example, a gain attention for
a one-hour class should be no more than 3-5 minutes.
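The proportionality guideline above can be expressed as simple arithmetic. In the sketch below, the 5-minutes-per-60-minute-lesson ratio is taken from the example in the text; applying that ratio linearly to other lesson lengths is an assumption of the illustration, not a stated rule.

```python
# Hedged sketch of the proportionality guideline above. The 5/60 ratio
# comes from the cited example; linear scaling is our assumption.

def max_gain_attention(lesson_minutes, ratio=5 / 60):
    """Upper bound, in minutes, for the gain attention of a lesson."""
    return lesson_minutes * ratio
```

Under this assumption, a one-hour class allows up to 5 minutes and a two-hour class up to 10.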

b. Overview In the overview, the instructors can first introduce themselves
along with their qualifications or experience. The overview then describes the
intended outcome of the instruction and the conceptual framework of the lesson. A
conceptual framework informs students of the learning agenda for the lesson by
stating the main ideas that will be covered to achieve the desired outcome. By
providing the conceptual framework, student anxiety is decreased. Adult learners
prefer to be oriented to what will be covered (Refer to Chapter 6 for more
information on adult learners). The overview may also state the lesson's relationship
to other lessons within the course, if applicable.

c. Learning Objectives Learning objectives are presented early in the lesson
to inform students what knowledge or skill is required for successful completion. It
is critical for students to understand at the outset of a lesson what is expected of
them. A lesson presents one or more Terminal Learning Objectives (TLOs) and
two or more Enabling Learning Objectives (ELOs). Even if an ELO
within a lesson only partially supports a TLO, that TLO is listed. Listing the TLO(s)
provides focus for both the student and instructor. The TLO(s) are the desired
goal(s) within the lesson plan and the ELO(s) are the main ideas that support the
TLO(s). The TLO(s) and ELO(s) are transferred to the lesson plan verbatim and in
the same sequence as they appear on the concept card. For lesson purpose classes,
a statement is placed in this section to state, “There are no formal learning
objectives.”

d. Method/Media The method/media section describes the delivery system
that was selected in the learning analysis. This information is found on both the
concept card and Learning Objective Worksheet (LOW). In addition, the
method/media section of the introduction is the natural area to place administrative
instructions that affect the delivery of the lesson. An instructor note must be
inserted immediately following this section to ensure these instructions are delivered
to the students.

An example of the format for the note explaining Instructional Rating
Forms (IRF) is below. Insert the instructor note to explain IRFs between the
Method/Media and the Evaluation portions of the Introduction.

INSTRUCTOR NOTE
Explain Instructional Rating Forms.


e. Evaluation Most learners want to know how, when, and where they will
be tested on a lesson’s content. In order to reduce student anxiety, the
evaluation section of the introduction describes the type of evaluation, time, and
location (i.e., “in accordance with the training schedule”) of where the students’
knowledge or skills will be evaluated. This information can be obtained from the
training schedule and the exam concept card. For lesson purpose classes, there
are no formal evaluations.

f. Safety/Cease Training (CT) Brief Lessons that involve risk of injury
or damage to equipment must include a safety brief. When developing the
Safety/Cease Training brief, refer directly to the ORA worksheet. Explaining to
the students that hazards have been identified and controls implemented to
minimize the risks will reduce anxiety about the training. Identified controls and
hazards are provided from the ORAW completed on the lesson. This also serves
to make the students safety conscious before the first main idea is introduced.
Additionally, the CT will be briefed if all students are required to know and
initiate a stoppage of training. Regardless of the student’s role, the instructor is
responsible for reviewing and executing the CT policy for the lesson spelled out
on the Operational Risk Assessment Worksheet located in the Master Lesson File
(MLF).

g. Transition A transition is placed between the introduction and the first
main idea. This transition is where the instructor ensures all students have an
understanding of what is going to be taught, how it is going to be taught, how
they will be evaluated, and Safety/Cease Training procedures. The transition then
introduces the first main idea. The curriculum developer must provide the
transition, along with blank lines, so that the instructor can personalize it.

An example of a transition into the body of a lesson.

TRANSITION: Are there any questions about what will be covered, how
it will be covered, or how you will be evaluated? Do you have any questions
about the safety or Cease Training procedures? Now let’s talk about (first
main idea).
_____________________________________________________________
_____________________________________________________________
_____________________________________________________

STEP 4

4. Insert Transitions, Instructor Notes, and Cues Now that the body is
outlined and the introduction is developed, the next step is to insert transitions,
instructor notes, and cues (time, media, and break) into the lesson.

a. Types of Transitions to Insert Transitions tie together the different
components, methods, and main ideas within the lesson by smoothly
summarizing one main idea and introducing the next idea. The transition should
reinforce the conceptual framework, allowing the instructor to probe and gather
feedback from the students.

1) Transition A transition is placed between the introduction and the
first main idea, between each main idea in the lesson, and between
the last main idea and the summary. A transition contains three
elements: review, probe, and introduce.

 Review. The review is done by smoothly summarizing the main idea
that was just taught. The students are then asked if they have any
questions about the information covered so far.


 Probe. The probe allows the instructor to confirm student
comprehension by asking questions and accepting feedback. At least
one question should be asked during a transition that relates to the last
main idea covered. The curriculum developer writes the question to be
posed and its intended answer directly in the lesson plan. Questions
need to be challenging and cover the critical point of the last main idea.
Unless a direct question actually appears in the transition,
inexperienced instructors may not ask a probing question. However,
blank lines are left at the end of each transition so that the instructor
has room to personalize each transition. If the instructor personalizes
the transition, he/she must be sure to review, probe, and introduce the
next main idea. The transition should not be the only place where
questions are posed. (Refer to Section 4301 for more on questioning.)

 Introduce next main idea. The introduction of the next main idea takes
the instructor smoothly into the content to be covered next.

In summary, after a review of the main idea, an overhead question appears to
allow students an opportunity to ask any question on the material (i.e., “Are there
any questions on ___?”). This is followed by at least one direct question that will be
asked during the transition. It comes from the last main idea taught (i.e., Q: “What
is the… A:”). Then the next main idea is introduced.

An example of a transition is in the shaded box below:

TRANSITION: Now that we've talked about why the SAT is important, are there
any questions? QUESTION: Why does the Marine Corps use the Systems Approach
to Training? ANSWER: The process reduces subjectivity in how formal school
decisions are made regarding curriculum and instruction. Now, let's talk about how
the SAT is used in the formal school/detachment environment.
__________________________________________________________
__________________________________________________________

2) Interim Transitions Interim transitions are used when a method
(e.g., practical application, demonstration, etc.) or break is inserted
within the main idea. These transitions are placed before the new
method to provide the instructor with guidelines on how to transition
from the lecture to the next method or break. Once the method or
break is complete, another transition must be placed to close out the
method and transition back to lecture. An interim transition is different
from other transitions because it does not require a probing question.
If a new main idea is to be introduced following the method or break,
then a transition should be written to review, probe, and introduce the
next main idea.

An interim transition contains two elements:
1. Review.
2. Introduce the method or next heading.

An example of an interim transition is in the shaded box below:

INTERIM TRANSITION: Thus far, we have discussed the techniques used for
effective questioning. Does anyone have any questions about questioning
techniques? At this point, we're going to do a practical application where you will
use the different questioning techniques.
_____________________________________________________________

b. Insert Instructor Notes Instructor notes include information pertinent
to the conduct of the lesson and can appear throughout the lesson. These notes
are normally short and concise. Instructor notes are especially valuable to
alternate instructors. As with cues, instructor notes should stand out from the
normal text as illustrated below with a text box.

An example of the format for an instructor note is below:

INSTRUCTOR NOTE
Have the students refer to TM 9-2350-264-10-1
for preparing the driver’s station on the
M1A1 tank.

c. Insert Cues There are three types of cues contained in the lesson to
assist the instructor with the presentation of the material. All cues must stand
out from the regular text of the lesson. The three types of cues are time cues,
media cues, and break cues.

1) Time Cues Time cues are approximations of the amount of time
required by the instructor to present each lesson component.

 Main Headings. Time cues for main headings (Introduction, Body,
and Summary) are placed right justified of the heading. The cue is
bold, capitalized, and in parentheses: (30 MIN). The main heading
time cues (Introduction, Body, and Summary) add up to equal the
total lesson time.

• Main Ideas. Time cues for the main ideas within the Body are placed
two spaces after the main idea. The main idea cue is bold, in natural
case, and in parentheses (30 min). All main idea time cues add up to the
Body time cue. The sum of all the main heading time cues [and, at times,
method time cues (see below)] in a lesson plan equals the total time for
the lesson reflected on the concept card.

• Methods. The time allotted for a particular method (demonstration,
practical application, case study, etc.) is explained within the method
instructions, with one exception: if a method is not within a main idea,
then it has its own time cue. For example, if three main ideas are
covered and a practical application is inserted at the end to practice
material covered in all of the main ideas (mass practice), then that
practical application is given its own time cue since the method is not
associated with any one main idea.
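The arithmetic behind time cues can be expressed as a simple consistency
check. The sketch below is illustrative only; the function name and the
sample lesson data are hypothetical and not part of the SAT Manual.

```python
# Hypothetical sketch: verify that the time cues in a lesson plan are
# consistent. Main idea cues (plus any stand-alone method cues, such as a
# mass-practice application) must sum to the Body cue, and the
# Introduction, Body, and Summary cues must sum to the total lesson time.

def check_time_cues(total, intro, body, summary, main_ideas,
                    standalone_methods=()):
    """Return True if the time cues add up as described above."""
    body_ok = sum(main_ideas) + sum(standalone_methods) == body
    total_ok = intro + body + summary == total
    return body_ok and total_ok

# A 60-minute lesson: 5-minute introduction, 50-minute body (two
# 20-minute main ideas plus a 10-minute mass-practice application),
# and a 5-minute summary.
print(check_time_cues(60, 5, 50, 5, [20, 20], [10]))  # True
```

A curriculum developer could run such a check before submitting a lesson
plan to catch cues that no longer add up after edits.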
2) Media Cues. Media cues are placed throughout the lesson to tell the
instructor what media to use and when to present it during the lesson.
An alpha/numeric designator is used to identify a specific medium at a
specific point in the lesson. For example, (ON SLIDE #1) indicates to
the instructor to employ slide #1 of the presentation.
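Because media cues follow a predictable alpha/numeric pattern, they can
be located mechanically. The following is an illustrative sketch only:
the function name is hypothetical, and the regular expression assumes
cues follow the "(ON <MEDIUM> #<number>)" form shown in the example
above.

```python
import re

# Hypothetical sketch: pull media cues such as "(ON SLIDE #1)" out of
# lesson plan text so they can be checked against the slide deck.
CUE_PATTERN = re.compile(r"\(ON\s+([A-Z]+)\s+#(\d+)\)")

def find_media_cues(lesson_text):
    """Return (medium, number) pairs in the order they appear."""
    return [(m.group(1), int(m.group(2)))
            for m in CUE_PATTERN.finditer(lesson_text)]

text = "Introduce the topic. (ON SLIDE #1) Discuss each step. (ON SLIDE #2)"
print(find_media_cues(text))  # [('SLIDE', 1), ('SLIDE', 2)]
```

A check like this could confirm that cue numbers run in sequence and
that every cue has a matching slide.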
3) Break Cues. Students are more likely to retain information if breaks
are built into the course. It is generally recommended to allow a
ten-minute break after every 50 minutes of instruction. Remember, it is
important for the instructor to summarize information via a transition
from day-to-day, lesson-to-lesson, and before and after breaks so the
flow of instruction is not disrupted. A related activity can also be
inserted to regain the students’ attention to the subject material after
the break.
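The 50/10 guideline above is simple arithmetic and can be sketched as a
schedule calculation. This is an illustrative example; the function name
and approach are hypothetical, not prescribed by the manual.

```python
# Hypothetical sketch of the 50/10 guideline: a ten-minute break after
# every 50 minutes of instruction. Given total instruction minutes,
# compute the clock offsets (from lesson start, counting earlier breaks)
# at which each break begins. No break is needed after the final block.

def break_schedule(instruction_minutes, block=50, rest=10):
    """Return the offsets (in minutes) at which breaks begin."""
    starts = []
    elapsed = 0
    remaining = instruction_minutes
    while remaining > block:
        elapsed += block          # teach one full block
        starts.append(elapsed)    # break begins here
        elapsed += rest           # break elapses on the clock
        remaining -= block
    return starts

print(break_schedule(120))  # [50, 110]
```

For 120 minutes of instruction, breaks fall after the 50th and 110th
minutes on the clock, with the final 20-minute block ending the lesson.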
NOTE: A break cue is not counted as a separate time cue when it is within
a main idea. However, if the break cue falls between main ideas, then it
becomes a time cue. In that case, the main idea time cues and the break
cue are added together to equal the Body time cue.

An example of the format for a break cue is below:

(BREAK - 10 Min)
5. Review Lesson Plan and ORA (STEP 5) Compare the lesson plan with the
ORA worksheet to ensure that all hazards have been identified and that
the controls have been integrated into the lesson plan.
6. Write the Summary (STEP 6) The summary is a short paragraph which
restates the main ideas (conceptual framework) discussed during the
lesson. The summary is not used to re-teach material nor to introduce
new material; it is concise. No questions are asked or answered during
the summary of a lesson. All questions should have been answered in the
body and in the transitions. The summary needs to provide closure, a
“mini-WIIFM,” quote, or similar item, which will confirm why it is
important for the student to remember what was taught. The final element
of the summary given to the student is closing instructions, i.e., “Fill
out IRFs and take a ten-minute break.”
3403. STUDENT MATERIALS
Student materials serve as a guide to what is being presented in the
course. These materials can provide class participants with additional
facts and information. They also serve as a study guide that should be
referred to during the course and/or as a job aid that students can take
back to their unit following completion of the course. When developing
student materials, the appeal and ease of their use need to be
considered. These materials are supplied to aid the student in his/her
learning. There are two types of student materials: student outlines and
supplemental student materials. Each is designed for a specific purpose
that will aid the student during the course.

(IRF – Instructional Rating Form.)
1. Student Outline The student outline provides the student with a
general structure to follow during the class and a conceptual framework
that highlights the main ideas of the class. It contains the learning
objectives, lesson outline, and any references used to build the lesson.
It also includes any additional notes or information, such as graphics
(charts, graphs, clip art, photos, diagrams), deemed necessary. The
student outline does not have to be in Naval format or in outline form;
it should be developed in a way that the student is able to easily
follow and use. Regardless of the format, all pertinent information from
the lesson plan should be included, as described above. Appendix B
provides examples of some different formats for student outlines.
Student outlines can be written using one or a combination of the
following styles:
a. Completed This style provides students with a "cut-and-paste" of the
body from the instructor's lesson plan that excludes the administrative
information, introduction, any cues, instructor notes, and/or the
summary. This style is desirable when students are expected to read the
entire outline during class, are unable to take notes or follow along
during class, or when instruction takes place outdoors. It is very
useful as a study guide or a job aid.
b. Fill in the blank This style uses an abridged form of the completed
style with key terms or phrases omitted and replaced with blank lines. It is
developed as a skeleton outline of the lesson plan. It is the student's
responsibility to follow the lecture and fill in the missing information. When
students complete the missing key terms or phrases, they are more likely to
remember the material as they heard, saw, and wrote it. Presentation of the
lesson must be structured to allow students time to fill in the blanks. This style
of outline is not recommended for subjects of a technical nature.
c. Bullet This style incorporates short, informational statements
presented in the same sequence as in the lesson plan. The student must
take detailed notes to complete the information. Curriculum developers
must take this into consideration and leave sufficient “empty space” for
students’ notes in the outline. The bulleted style is not recommended
for those students with little or no knowledge of the subject.
2. Supplemental Student Materials Supplemental student materials
include handouts, other than the student outline, given to the class in
support of the instruction. Supplemental student materials may include
advance handouts to prepare the student for class. Additionally, they
may include answer keys to quizzes, additional articles for reading, and
reference materials such as technical manuals, graphs, charts, formulas,
figures, and maps. The use and number of supplemental student materials
is optional, and they can be presented in any format. A distinction
needs to be made between supplemental student materials and classroom
instructional aids, and it is based on ownership: supplemental student
materials are items that students are able to take with them following
the class, while instructional aids belong to the instructor for use in
the classroom. Although the students use instructional aids during
class, they do not retain them at the end of the lesson. All
supplemental student materials should support the learning objectives
being taught.
3404. DEVELOP MEDIA

Instructional media are developed to enhance verbal information and
increase the student’s ability to retain the information identified in
the learning objectives by appealing to the different senses.
Several factors affect the development of instructional media and materials. The
relative importance of each of these factors depends on the media that have been
selected. These factors are personnel, time, funds, equipment, and facilities. The
curriculum developer determines whether they will use the media and material in
their current form, make modifications, or purchase/create anew. This decision is
based upon the resources available to the formal school/detachment as identified on
the concept card. Resources may be available through local commands and/or
bases. For example, major Marine Corps installations have Combat Visual
Information Centers (CVIC) and Combat Camera Reproduction Units with trained
personnel to operate media production equipment and can produce various types of
media. Curriculum developers and/or instructors can contact the center/unit and
request the media type.
• Using existing materials is cost and time effective. The impact on
other classes must be considered.

• Modifications can be made easily to some, but not all, media types.
Cost and time may be impacted.

• Purchasing new materials involves few personnel but generally entails
a substantial acquisition cost.

• Creating new materials will involve all the resources of personnel,
time, funds, equipment, and facilities.
Copyright Some material used as media may be copyrighted. Permission to
use copyrighted material can be requested from the owner/holder. If
copyright permission has been given, then it should be referenced in the
Master Lesson File (MLF) and on the media. Permission can usually be
obtained in a timely manner by faxing a letter to the copyright owner
requesting permission and stating that “the material will be used for
government training/education.” In certain cases, copyrighted material
can be used without permission from the owner/holder under the "fair
use" doctrine. However, the appropriate legal authority should be
consulted when using copyrighted materials under "fair use.” Contact
your command/base legal services office for further information on the
“fair use” doctrine.
3405. MEDIA DEVELOPMENT CONSIDERATIONS

Using the lesson plan and visualizing different ways to present each
point can generate new ideas for media. A storyboard (a visual and
verbal outline of the lesson) is one way to plan the presentation. Not
all curriculum developers will use a storyboard to write down or sketch
ideas, but it is a tool that generally saves time. Regardless of how the
ideas are brainstormed, there are a few general guidelines regarding
alignment, fonts, lettering, graphic devices, and color that can enhance
development. Use a consistent design (e.g., backgrounds, colors, and
fonts) throughout a presentation.
1. Alignment Alignment sets a clear pattern to help the viewer find
information in each frame and creates a sense of clarity within the
presentation. The alignment scheme chosen for each presentation dictates
how the elements will line up throughout the presentation.

a. Flush Left Alignment Flush left text is natural and easy to follow,
establishes a solid visual anchor, and is the preferred alignment for
presentations.

b. Centered Alignment Centered alignment is the most formal and
conservative format. It aligns information in the middle throughout the
presentation without visual anchors. Centered alignment is predominantly
used for headings; it appears primarily in print media and only
sparingly in a presentation.

c. Right Alignment Flush right is the opposite of flush left. It is
best to avoid flush right text. Align graphics flush right only if there
is a strong left vertical image. Right alignment can be used to
intentionally call attention to text. Numbers are always flush right.

d. Justified Text Justified text is flush on both the right and left
sides. Print media such as newspaper columns are usually justified.
2. Font Type In presentation design, two types of fonts are normally
used: serif and sans serif.

a. Serif Fonts Serif fonts have a small finishing stroke at the ends of
the main character stem. Serif fonts are primarily used for print media,
such as books, because they are easy to read in quantity. Examples:
Times New Roman, Courier New, and Rockwell.

b. Sans Serif Fonts Sans serif fonts are letters without small
finishing strokes. They are rarely used for books but are preferred for
projected media. They project well and are more legible than serif
fonts. Examples: Arial, Avant Garde, and Tahoma.
3. Font Size Be sure to use a consistent type size throughout the
media, whether it is print or projected media.

a. The recommended font size for projected media is 32-36 points.

b. Font size should be no smaller than 24 points if it is going to be
projected.

c. The recommended font size for print media is 10-12 points.

d. The size of print to use for turn charts depends upon the room size
and the number of students. Lettering on turn charts needs to be at
least 1 inch high.
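The size rules above amount to a small set of range checks. The sketch
below is illustrative only; the function name and structure are
hypothetical, and the thresholds simply mirror the guidance above.

```python
# Hypothetical sketch: validate a font size against the guidance above.
# Projected media: minimum 24 pt (32-36 pt recommended).
# Print media: 10-12 pt recommended.

def font_size_ok(points, medium):
    """Return True if the size is acceptable for the given medium."""
    if medium == "projected":
        return points >= 24
    if medium == "print":
        return 10 <= points <= 12
    raise ValueError("medium must be 'projected' or 'print'")

print(font_size_ok(32, "projected"))  # True
print(font_size_ok(18, "projected"))  # False
```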
4. Case When using words in any form of media, there are four different
cases that can be used.

a. Title Case In this case, every word has the first letter
capitalized, except articles (a, an, the). As the name implies, it is
used for titles. Notice that the items on this page that are bold and
underlined use title case.

b. Sentence Case This is normal capitalization, where the first word of
every sentence is capitalized. The sentences found on this page use
sentence case.

c. Lower Case This case uses no capital letters at all.

d. Upper Case This case uses all capital letters. It is useful for
highlighting important terms and should be used sparingly. It is not
recommended to use UPPERCASE LETTERS ONLY. All uppercase letters
generally reduce legibility and slow down reading. In projected media,
upper case can be used for titles; however, sentence or bulleted text
should not use all upper case.
5. Graphic Devices

a. Lines Lines can be used to add clarity to a layout, but use them
sparingly. To emphasize a point, try changing the color or the weight
(boldface) rather than underlining. Pay strict attention to the weight
and the color of the line.

b. Borders Presentation graphics rarely benefit from a border around
the whole frame. Borders can convey the message that whatever is outside
the border is not related to what’s inside. However, borders can be used
to frame a picture or a portion of text. Used this way, borders can
enhance the visual appeal.

c. Boxes A solid color box set against the background offers a good
alternative to a border. This can be a good way to call attention to
content. Applying 3-D capability can show depth.

d. Shadows Shadowing is an extremely useful technique for generating
dimension.
e. Images Most audiences sincerely appreciate image simplicity. Images
may be pictures or clip art. Generally, using pictures that display real
world meaning or tell a story can aid a student's ability to remember
the teaching point. A picture can be worth a thousand words. When using
images, it is important to ensure that the image is not skewed. Images
become skewed when their width or height is changed disproportionately.
To avoid skewing a picture, drag the corner of the picture so that the
width and height increase by the same proportion.
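The skew-avoidance advice above is simply proportional scaling, which
can be sketched in a few lines. The function below is an illustrative
example, not part of the manual.

```python
# Hypothetical sketch: resize an image's dimensions without skewing it.
# Scaling width and height by the same factor preserves the aspect ratio.

def resize_keep_aspect(width, height, target_width):
    """Scale (width, height) so the width matches target_width."""
    factor = target_width / width
    return target_width, round(height * factor)

# An 800x600 picture enlarged to 1000 pixels wide stays in 4:3 proportion.
print(resize_keep_aspect(800, 600, 1000))  # (1000, 750)
```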
f. Image Overload Don’t fall prey to image overload and frame clutter.
Just because the capability is available to add unlimited typefaces,
shadows, patterns, colors, and lines does not mean they will contribute
to the presentation. Well-made images call attention to the content
rather than themselves.
6. Why Use Color? Color is the best part of the presentation but also
the hardest to handle. The reason to use color in a presentation is to
show things the way they are seen: green grass, blue sky, orange sunset.

a. Color Association There can be benefits from color associations by
developing signature color schemes in the presentation. For example, all
money is green; all definitions are highlighted in blue. If expected
colors are used, such as blue for skies or red for stop, the audience
will associate certain colors with images or meanings.
b. Color Differentiation Colors can distinguish like and unlike
elements. Similar bits of information can be subtly differentiated using
light and dark tones of the same color. Consistent graphic elements
should be linked from frame to frame with a consistent application of
color. Providing good contrast in colors not only enhances the ability
to read and distinguish between graphic devices, but it also takes into
consideration those who may have less than normal color vision. The
trick is to enhance differences in brightness and to avoid color
combinations that do not contrast well.

c. Color Emphasis By assigning a particular element a color that is
brighter or lighter than the rest, emphasis can be brought to that
element.
Making a Color Palette

• Select the background color first. Normally, the background in
electronic presentations is dark. Dark backgrounds require light or
white text. The bullet color should contrast with the background.
Select colors that contrast with the background but do not overwhelm.
Be consistent with your colors throughout the presentation. If you
can’t read it, neither can your audience.
• Title each turn chart page or poster.

• Use more than one color, but not more than four. Reproductions of
pictures or diagrams may already have more than four colors.

• If writing information on a turn chart for repeated use, make sure to
use broad-tip marking pens that provide contrast but will not bleed
through to the next sheet. Watercolor brands will not bleed.

• Allow 3”-4” margins on both sides. When developing turn charts and
posters, consider the size of the classroom that the visual will be used
in, the seating arrangement, and the number of students. Keep a large
margin at the bottom so that images and information are not below the
student’s line of view.

• Print rather than use cursive writing.

• Ensure that images and writing are large enough to be legible to the
intended group.

• Keep words short or use well-understood abbreviations.

• Include simple drawings, symbols, and charts if they help the learning
point.

• Laminate to protect for repeated use.

• Indexing tabs can be used to quickly reference a specific chart.
7. Projected Media Projected media includes transparencies, projection
of actual objects (opaque projector), computer-aided graphics, and
video.
a. Transparencies

• With transparencies, more graphical elements than text need to be
used. Viewers will stop reading text once they know what it says, but
they will continue to look at illustrations. Use text for key ideas
only.

• Use bullets with key words and phrases.

• Ensure that text will be large enough and clear enough for easy
reading. Use a plain typeface. If the transparency can be read while it
is lying on the floor, then the lettering is large enough.

• Background. Blue transparencies provide a neutral background for
black text and illustrations; yellow transparencies with black text are
highly legible; clear transparencies are best if color is used in the
content.

• Place frames on transparencies. This will eliminate light leak when
projecting them.

• Number the slides to correspond with the media cues in the lesson
plan.
b. Computer Aided Graphics (i.e., PowerPoint)

• One idea, thought, or concept per screen.

• It is recommended that no more than two fonts be used in a visual.

• For projected media, adhere to the six-by-six rule: no more than six
lines per visual and six words per line. Keep it short and simple
(KISS).

• Provide generous use of empty space to increase ease of reading.
Leave blank lines between paragraphs (print media) or bullets (projected
media).

• Check for technical accuracy, completeness, programming errors, and
blurred slides prior to publication or production to ensure quality.
Spell-check the presentation.

• Projected media will look different on the screen than it does on the
computer monitor. Different projection systems (LCD projectors,
televisions, and large monitors) will all display slightly different
shades of colors. Colors need to have good contrast to project well.

• Be careful with animation and sound. Too much animation or sound can
become distracting to the audience.

• It is critical to construct a layout structure that will work for the
whole presentation. All frames should have the same basic structure,
backgrounds, color palette, type style and size, heading placement, and
alignment. Most computer graphics software titles allow a choice of
templates that will give the presentation this unity.

• Ensure the presentation is properly paced, not too fast or too slow.
Do not overload students with too much information or too many images at
one time. For example, when developing a slide show, it may be
appropriate to animate one bullet at a time.
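The six-by-six rule lends itself to a quick automated check of slide
text. This is an illustrative sketch; the function name is hypothetical.

```python
# Hypothetical sketch: check the bullet lines of one visual against the
# six-by-six rule (no more than six lines per visual, six words per line).

def passes_six_by_six(slide_lines):
    """Return True if the lines satisfy the six-by-six rule."""
    return (len(slide_lines) <= 6
            and all(len(line.split()) <= 6 for line in slide_lines))

slide = ["Analyze", "Design", "Develop", "Implement", "Evaluate"]
print(passes_six_by_six(slide))  # True
print(passes_six_by_six(["This line has far too many words for one bullet"]))  # False
```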
c. Video Videos can be produced in-house by curriculum developers or
with the aid of a videographer [Combat Visual Information Centers
(CVIC), Combat Camera Reproduction Units, etc.]. Video can be used
stand-alone or in conjunction with computer-aided graphics.
Additionally, video clips can be found on the Internet, or full-length
tapes can be purchased. As with all media, ensure that appropriate
measures are taken if material is copyrighted.
Some guidelines to consider when developing video or inserting video
clips into other media are:

• Ensure that the computers that will be used for video/audio clips can
support the presentation. Pictures, clip art, and video can use a lot of
memory.

• Ensure that images project well. Sometimes when video is enlarged to
full screen, the quality is compromised. When a video is extended beyond
its normal size, it can appear choppy and blurred.

• Material must focus on the objectives. Edit unrelated material. The
video must communicate and reinforce major points. A video may be fun
and interesting, but it needs to provide focus to class content.

• Show only the relevant portion of the video.

• Abstain from using offensive material (profanity, etc.) that can
create barriers to learning. When using movies or video clips, consider
all of the audience and ensure the material maintains professionalism.

• Video clips should be no more than 5-8 minutes in length, as the
viewer can easily lose focus of the intent of the clip and become
engrossed in the video. Instructor notes need to detail how to introduce
the video (what key elements the students need to watch for) and how to
conduct discussion or questioning following the clip (tie what was
observed into the lesson objective). As a general rule, full-length
videos of thirty minutes or more need to be stopped every 10-15 minutes
for discussion or questions. This information needs to be stated in the
instructor’s notes.
3407. INSTRUCTOR PREPARATION GUIDE

The Instructor Preparation Guide is a required element of the Master
Lesson File (MLF). This guide is created to provide the instructor with
information that is critical to preparing to implement the lesson.
Detailed information is given so that the instructor understands what
resources are necessary for the lesson. Much of the information provided
under administrative information is copied from the concept card. Though
this guide is an MLF item, instructors can make a copy so that they can
check off items when preparing for the lesson. An example of the
Instructor Preparation Guide can be found in Appendix B. The minimum
components for the Instructor Preparation Guide are listed below. This
checklist can be added to as needed.
1. Lesson Title and Lesson Designator The lesson title and lesson
designator are provided to identify the lesson. Both can be found on the
concept card.

2. Total Lesson Time Refer to the concept card for the total lesson
time. This provides the instructor with the amount of time that he/she
has to teach the lesson.

3. References List all of the references from the concept card.

4. Location of Test The location of the test is provided so that the
instructor will know where to go to gather the test materials.
5. Personnel Required List all personnel that will be required to
implement the lesson (e.g., instructors, support personnel, Corpsmen).
Check the student-to-instructor ratio and the notes on the concept card
for this information.
6. Facilities The facilities required for the lesson need to be listed (e.g.,
classrooms, labs, ranges, etc.). Some facilities may require prior coordination to
ensure availability.
The above components form the top portion of the Instructor Preparation
Guide and are listed as follows:

LESSON TITLE: Assemble a Master Lesson File

LESSON DESIGNATOR: CD0209

TOTAL LESSON TIME: 30 Minutes

REFERENCES: MCO 1553.2, SAT Manual, MCAIMS User’s Manual

LOCATION OF TEST: See Course Chief

PERSONNEL REQUIRED: 1 Instructor

FACILITIES: 30 seat classroom
7. Review Course Materials This checkbox is necessary so that the
instructor will review the course materials to identify any potential
problems prior to instruction.

REVIEW COURSE MATERIALS:
□ Review the course/training schedule, administrative requirements,
student background information, lesson plans, student materials, media,
and evaluations (tests).
8. Personalization This checkbox is necessary so that the instructor
adds personalization to the lesson plan.

PERSONALIZATION:
□ Personalize the lesson plan by adding subject matter detail, relating
personal experiences, providing examples, and/or using interactive
techniques.
9. Materials/Equipment All materials and equipment needed to conduct
the lesson are listed here with check boxes so that the instructor can
gather materials well in advance of the lesson. Materials may include
models, mockups, audiovisual equipment, handouts, etc.

MATERIALS/EQUIPMENT:
□ Video Cassette for Gain Attention
□ VCR
□ 30 Brown Binders for Master Lesson Files
□ 30 Master Lesson File Checklists
10. Exercise Setup and Planning Each exercise (e.g., demonstration,
practical application) is listed here. Underneath each, the setup and
planning is described in sequence with check boxes to the side.

EXERCISE SETUP AND PLANNING:
Demonstration
□ An MLF binder is ready to hand out to each student.
□ The MLF checklists are ready to hand out to each student.
□ Ensure that the classroom is set up so that the demonstration can be
seen by all.

11. Safety The ORA worksheet is a required element of the MLF and must
be reviewed by the instructor. This checklist also requires that the
instructor reassess the environment for changes (e.g., weather or worn
equipment) and report findings on the AIR.

SAFETY:
□ Review the ORA worksheet in the Master Lesson File.
□ Reassess the environment for changes that affect the original ORA.
Document any additional considerations/controls on the After Instruction
Report (AIR) for future reference.
12. Other Possible Items Additional items can be added to the checklist
if deemed necessary by the formal school/detachment.

13. Approving Signature and Date A space is provided for the designated
approving authority’s signature and date. The formal school’s SOP will
dictate who approves the Instructor Preparation Guide.
3500. CONSTRUCT TESTS SECTION

Tests are designed to evaluate whether the learner has the knowledge and
skills required to master the objective or task. Back in the Design
Phase, test items were developed for each learning objective. Now, based
upon the course structure and when specific learning objectives are to
be tested, the test is constructed. Before going into the steps for
constructing a test, there must be an understanding of the methods,
categories, and types of tests.
3501. METHODS OF TESTS

1. Knowledge-Based Test As was discussed in Chapter 2, Section 2207 of
the Design Phase, knowledge-based testing can be done through oral or
written tests. This method of testing does not evaluate the student’s
ability to perform the required job skills; however, it does determine
if the student knows how to perform them. Advantages of this method are
its high degree of objectivity in scoring, the capability of measuring a
large number of facts, ideas, or principles in a relatively short time,
and the convenience it offers for statistical analysis. A number of
factors force formal schools to administer knowledge tests: time, cost,
safety, and resource constraints do not always permit performance-based
testing.
2. Performance-Based Test This evaluation deals with the assessment of
technical skills, usually physical/motor skills. It usually deals with
physical performance that follows a procedure or sequence of steps
(called a process) or with the end result (called a product). A test
item that requires the student to replicate a task that is performed on
the job is considered performance-based. A performance-based test will
usually have a checklist that clearly defines the steps or procedures
that must be completed to master the objective. In some circumstances, a
written test can be considered a performance-based test if the student
actually performs that item on the job. For example, filling out a DD
Form 214 is a valid performance test for a student who actually fills
one out on the job. A performance test duplicates the actual behavior by
using the same equipment, resources, setting, or circumstances that the
student will encounter on the job.
3502. CATEGORIES OF TESTS

There are different purposes for giving tests. Below are some categories
of testing along with their purposes. Since criterion-referenced testing
is the preferred method of evaluation for the Marine Corps, more focus
has been given to it.

1. Criterion-Referenced Test These tests are used to evaluate the
student’s accomplishment of the criterion objective and to determine the
effectiveness of the instructional system. Criterion-referenced tests
are composed of items based on specific learning objectives. Each
individual’s ability to demonstrate mastery of the learning objectives
is measured. The learner’s achievement is measured against the
predetermined criterion established in the learning objectives.

2. Diagnostic Test The purpose of diagnostic testing is to measure the
achievement of the supporting skills and knowledge that contribute to
the ability to perform the criterion objective.

3. Survey Test These tests are designed to determine what prospective
students already know and can do before receiving instruction. The test
is administered while the instructional system is being developed and
provides important design data.
3503. TESTING INTERVALS

A student’s knowledge and skill level can be tested at different intervals before,
during, and after instruction. A pretest, progress test, and a posttest accomplish
this.
1. Pretest A pretest is designed to identify how much the student knows or is
able to do prior to starting the course. This kind of testing is diagnostic in
nature; it establishes the student's level before the course begins.
2. Progress Test A progress test is administered throughout a course to
evaluate student progress and to determine the degree to which students are
accomplishing the learning objectives.

3. Post-test The purpose of a post-test is to identify/evaluate the
effectiveness of instruction and how well the student learned. It is also a
certification process. The student's ability to graduate from the course is
generally based on post-test results. Therefore, certification that the student is
able to go out in the real world and perform the job is provided through
graduation.


3504. STEPS FOR CONSTRUCTING TESTS

The test items have already been written. Now the challenge is to properly
assign and arrange the test items, determine the grading criteria, develop a
scoring method, and develop the testing instructions.

1. Determining Mastery

a. Mastery Learning Criterion-referenced testing is the preferred
method of testing for learning objectives taught in the formal school/training
center. The criteria for test mastery are established by the learning
objectives. The student, when completing a test, receives either a master
(pass) or non-master (fail) for each learning objective. The student may be
assigned an overall score, but it does not remove the responsibility of
mastering each learning objective. Students who non-master a learning
objective will receive remedial instruction and retesting until they reach the
standard for mastery. The formal school will establish the remediation policy
based on school resources (i.e., time, equipment utilization, availability of
instructors). Students who do not master the learning objective during the
established number of retests could be recycled through the program or
dropped from the course.

b. Determination of Course Mastery The term "mastery" can be
misleading: mastery does not mean or require that students pass with 100%.
Students graduating from a course must, however, master 100% of the
learning objectives.

c. Determination of Learning Objective Mastery A determination is
made by the formal school/detachment as to what is the acceptable level of
performance for mastery of each learning objective. It may easily be that, for
some objectives, a score of 60% is indicative of mastery, while for others a
score of 90% or higher would be required. The determination is based upon
the criticality of the objective. Mastery of all ELOs does not necessarily result
in the mastery of the TLO, just as receiving a minimum score on each
individual event of the PFT does not necessarily mean that you receive an
overall passing score on the PFT.
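
The mastery rules above (a per-objective threshold set by criticality, with course mastery requiring mastery of every learning objective rather than a 100% score) can be sketched in code. This is an illustrative sketch only; the objective identifiers and threshold values below are hypothetical, not taken from the SAT Manual.

```python
# Per-objective mastery thresholds, set by the school based on the
# criticality of each objective (values here are invented examples).
THRESHOLDS = {
    "TLO-1": 0.80,   # high-criticality terminal learning objective
    "ELO-1a": 0.60,  # lower-criticality enabling learning objective
    "ELO-1b": 0.90,
}

def lo_mastered(lo, score):
    """A learning objective is mastered when the score meets its threshold."""
    return score >= THRESHOLDS[lo]

def course_mastered(scores):
    """Course mastery requires mastery of 100% of the learning objectives,
    not a 100% score on any single objective."""
    return all(lo_mastered(lo, s) for lo, s in scores.items())

scores = {"TLO-1": 0.85, "ELO-1a": 0.70, "ELO-1b": 0.92}
print(course_mastered(scores))  # True: every LO met its own threshold
```

Note that, as in the PFT analogy above, an overall score is not the deciding factor: a single non-mastered objective makes the whole result non-mastery.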


2. Assigning Test Items When determining what test items to use, the
idea is to measure all learning objectives. Formal evaluation of learning
objectives is accomplished by testing each learning objective.

Informal evaluation of learning objectives is accomplished through class work,
homework, quizzes, and practical application. This does not meet the
requirement to test learning objectives in the formal school/detachment.
There is no established formula for determining the most appropriate number of
test items required for testing any given learning objective. However, the
guidelines listed below are factors to consider:

a. Criticality of skill This refers to how important the skill is in relation to
its application to actual job performance.

 High: Skill is used during job performance.
 Moderate: Skill influences job performance.
 Low: Skill has little influence on job performance.

b. Other Criticality Factors Refers to a learning objective’s importance as
related to the performance of a job task.

 Safety to personnel/equipment: Critical tasks are those which
are considered high risk or dangerous.
 Frequency of performance: The more often a task is performed,
the more critical it becomes.
 Learning objective’s importance to on-the-job performance.
 Learning objective’s importance to the overall course mission.

c. Criticality of the objective When more critical and less critical
objectives are measured on the same test, the more critical objectives should
have more items to ensure that the test reflects the critical aspects of the
course.

d. Instructional time allotted to present the material For example, if
the majority of the material covers one objective, then the majority of the test
items should cover that objective. This mirrors the emphasis given in the
classroom.

e. Complexity of the material The more complex the material, the more
test items are required to ensure understanding.
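
The weighting factors above (criticality, instructional time, complexity) can be sketched as a proportional allocation of a fixed item budget. This is an illustrative sketch only; the objective names and weights are hypothetical, and the weights themselves would come from the school's own judgment of the factors listed above.

```python
def allocate_items(weights, total_items):
    """Give each learning objective at least one test item (every LO must be
    tested), then distribute the remaining items in proportion to weight."""
    base = {lo: 1 for lo in weights}
    remaining = total_items - len(weights)
    total_w = sum(weights.values())
    extra = {lo: round(remaining * w / total_w) for lo, w in weights.items()}
    return {lo: base[lo] + extra[lo] for lo in weights}

# Higher weight = more critical / more instructional time / more complex.
weights = {"LO-1": 3.0, "LO-2": 1.0, "LO-3": 2.0}
print(allocate_items(weights, 20))  # {'LO-1': 9, 'LO-2': 4, 'LO-3': 7}
```

Because of rounding, the allocated counts may not always sum exactly to the budget; the point of the sketch is the proportional emphasis, not a formula — as the text notes, there is no established formula.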


f. Table of Specification Another method of determining which test items
to use is a Table of Specification such as the one below. The Table of
Specification looks at the learning objectives and assists the test designer in
determining the number of test items to include for a specific objective. This
particular table relates to the cognitive domain. For more information on the
cognitive domain, refer to Chapter 6, Adult Learning.
1) Write in the lesson name.
2) Under “Major Subjects/Topics,” list the main subject/tasks for the
lesson(s).
3) Identify whether the learning objective requires knowledge,
comprehension or application.
4) Identify the learning objective(s) associated with the main
subjects/task. Under (1), write the number of learning objectives
covered in the main heading.
5) Identify the critical KSAs required to achieve the learning objective.
Determine how many test items it will take to determine mastery of
the learning objective. Since all learning objectives must be tested,
the number under (2) cannot be less than the number under (1).
6) Under (3), write in the actual number of test items.
7) List totals of rows under the “Totals By Subject/Topic” columns.
8) List totals of columns on the bottom row beside “Totals By LO Level.”

Figure 3-10. Table of Specification

1. WORKSHEET TABLE OF TEST SPECIFICATIONS FOR ___Select Delivery System___
   (LESSON NAME)

3. LEVEL OF LEARNING OBJECTIVES (each subject’s counts are entered under the
   applicable level column):
   A. Knowledge (Facts, Terms, Symbols)
   B. Comprehension (of Concepts, Principles)
   C. Application (Information, Concepts, Principles)

2. MAJOR SUBJECTS/TOPICS                                       4. TOTALS BY
   (Learning Objective Designator)              (1) (2) (3)    SUBJECT/TOPIC

1. Adult Learning (9806.01.04c, 9806.01.05c)     2   5   5      2   5   5
2. Learning Domains (9806.01.04b, 9806.01.05b)   2   6   6      2   6   6
3. Learning Styles (9806.01.04c, 9806.01.05c)    2   6   6      2   6   6
4. Method Selection (9806.01.04, 9806.01.04a,
   9806.01.04d)                                  3   3   4      3   3   4
5. Media Selection (9806.01.5, 9806.01.5a,
   9806.01.5d)                                   3   3   4      3   3   4

5. TOTALS BY LO LEVEL                           12  23  25     12  23  25

(1) = Number of Objectives; (2) = Minimum Number of Test Items; (3) = Actual Number of Test Items
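
The arithmetic of the worksheet above can be sketched in code: for each subject/topic the designer records (1) the number of objectives, (2) the minimum number of test items, and (3) the actual number of test items, then totals the rows and columns. The topic names and counts below are taken from the example table; the code itself is only an illustration, not part of the SAT Manual.

```python
# (number of objectives, minimum test items, actual test items) per topic.
rows = {
    "Adult Learning":   (2, 5, 5),
    "Learning Domains": (2, 6, 6),
    "Learning Styles":  (2, 6, 6),
    "Method Selection": (3, 3, 4),
    "Media Selection":  (3, 3, 4),
}

# Every learning objective must be tested, so (2) can never be less than (1),
# and the actual count (3) must cover at least the minimum (2).
for topic, (n_obj, n_min, n_actual) in rows.items():
    assert n_min >= n_obj and n_actual >= n_min, topic

# Column totals, matching the "TOTALS BY LO LEVEL" row of the worksheet.
totals = tuple(sum(col) for col in zip(*rows.values()))
print(totals)  # (12, 23, 25)
```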


3. Arranging Test Items When making decisions on how to arrange test
items, consider the following:
a. Test item placement Test items should be placed on the page so
each item stands out clearly from the others. For example, a true or false item
that is two lines long would have single spacing with double-spacing between
items. A space should separate the stem of multiple-choice items and the list
of answers. The answers should be in a single column beneath the stem and
should be indented beyond the paragraph margin.

Example of Multiple Choice Item:

1. What are the three Domains of Learning?

a. Auditory, Visual, Kinesthetic
b. Intellect, Value, Tactile
c. Knowledge, Skill, Attitude
d. Cognitive, Affective, Psychomotor

b. Arrangement of test items Items of the same type (e.g., multiple
choice, short answer, essay) are grouped together in a test. Individual test
items should also be arranged in approximate order of difficulty, which allows
the students to progress as far as they can without spending excessive time on
difficult items at the first part of the test.

c. Design A test is designed so that the majority of the students can
complete it. When many students cannot complete a test, efficiency is lost and
student morale suffers.

d. Layout/Format Below are some guidelines to consider when
formatting the test:

1) Space items for easy reading and responding.
2) Provide generous borders.
3) List alternative responses vertically beneath the stem (multiple
choice).
4) Do not split an item onto two separate pages.
5) If an answer sheet is not being provided, allow space for student
answers.
6) Number items consecutively throughout the test.
7) If separate answer sheets are used, number them so a check can be made
for complete sets of materials before and after test administration.
8) Select an arrangement of items that serves the purposes of the test.


EVALUATING THE ASSEMBLED TEST
1. Relevance Do the test items present relevant tasks?
2. Conciseness Are the test tasks stated in simple, clear language?
3. Soundness Are the items of proper difficulty, free of defects, and do they have answers that are
defensible?
4. Independence Are the items free from overlapping, so that one item does not aid in answering
another?
5. Arrangement  Are items measuring the same outcome grouped together?
 Are items of the same type grouped together?
 Are items in order of increasing difficulty?
6. Numbering Are the items numbered in order throughout the test?
7. Directions  Are there directions for the whole test and each part?
 Are the directions concise and at the proper reading level?
 Do the directions include time limits and how to record answers?
8. Spacing Does the spacing on the page contribute to ease of reading and responding?
9. Typing Is the final copy free of typographical errors?

Assessment of Student Achievement. By Norman E. Gronlund. p. 122.

4. Developing Grading Criteria Grading criteria describe the standards
by which the student will be measured and factors that will be considered in
determining the student’s grade on an individual performance or knowledge
test/test items.
a. Uses of criteria Grading criteria enable the instructor to determine
whether or not the student or group has met the objective. Additionally, they
provide an unbiased, objective evaluation of the student’s ability with respect
to a particular area of performance or knowledge. Above all, grading criteria
should describe what the student is expected to do and what happens if the
requirements are not met.

b. Grading Criteria for Performance Evaluations The creation of
grading criteria may be the most critical step in performance evaluation test
development because it ensures standardized grading. The scoring guide
contains a description of each step or group of steps to be graded. A pass/fail
checklist describes in detail what constitutes satisfactory and unsatisfactory
performance. The course grading scheme is a factor: if the course is graded
mastery or non-mastery, a checklist may be the most appropriate to use; if
the course is graded with a numerical grade, a rating scale may be the most
appropriate to use. When defining the checklist steps and rating scale
decisions, write all behaviors in sufficient detail so that all tasks are as
precise as possible. The more completely the behaviors are described,
the more effective the Job Sheet Checklist/Rating Scale will be. This helps
remove instructor subjectivity from the grading process. Performance- and
knowledge-based testing should not be combined. Multi-part tests can be
constructed in MCAIMS to support situations where both forms of testing are
needed.


c. Other important grading criteria factors should include:

1) Compliance with required safety precautions.
2) Correct operation of equipment after completed assembly.
3) Physical testing of the finished job.
4) Time required to complete the job.
5) Skill in using tools.
6) Care and use of the equipment.
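
The pass/fail checklist described above can be sketched as follows. This is a hedged illustration only: the step descriptions are invented for the example, and a real checklist would describe each behavior in the detail the text requires.

```python
# Hypothetical performance checklist; each step is graded satisfactory
# (True) or unsatisfactory (False) by the instructor.
CHECKLIST = [
    "Complied with required safety precautions",
    "Equipment operated correctly after completed assembly",
    "Completed the job within the allotted time",
]

def grade_performance(observed):
    """Mastery (pass) only if every checklist step was performed
    satisfactorily; any missed step means non-mastery (fail)."""
    return "master" if all(observed[step] for step in CHECKLIST) else "non-master"

observed = {CHECKLIST[0]: True, CHECKLIST[1]: True, CHECKLIST[2]: False}
print(grade_performance(observed))  # non-master
```

Because the checklist predefines what counts as satisfactory, two instructors grading the same performance should reach the same result, which is the standardization the text calls for.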

5. Develop a Scoring Method

a. Manually graded A key or template needs to be developed to eliminate
any subjectivity in the scoring process. Ensure this item is safeguarded against
compromise. The essay test requires different scoring criteria. A model answer is
required that lists all essential data a knowledgeable student can be expected to
provide. This model is used as the standard answer by which all other answers
are scored and the worth of each item or part of an item is set.
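
Scoring against a key or template, as described above, can be sketched in a few lines. The item numbers and answers below are hypothetical; in practice the key would be the safeguarded document the text describes.

```python
# The answer key (safeguarded against compromise); item number -> answer.
KEY = {1: "d", 2: "a", 3: "c"}

def score(answers):
    """Compare a student's answers against the key, returning per-item
    results and the raw score; the key removes subjectivity from scoring."""
    results = {item: answers.get(item) == correct for item, correct in KEY.items()}
    return results, sum(results.values())

results, raw = score({1: "d", 2: "b", 3: "c"})
print(raw)  # 2 of 3 items correct
```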

b. Automated grading system If tests are to be machine scored,
precautions must be taken to see that the items can be used with machine-
scored answer sheets. The MCAIMS Users Manual lists the only stock answer
sheet to be used with schools’ NCS Optical Mark Reader scanners.

Note: If test items have been recorded in MCAIMS and assigned to a test, a
computer equipped with an optical mark reader scanner has the capability to
score tests. This process is explained further in the MCAIMS User’s Manual.
RULES FOR SCORING ESSAY ANSWERS

1. Evaluate answers to essay questions in terms of the learning outcomes
being measured.
2. Score restricted-response answers by the point method, using a model
answer as a guide.
3. Grade extended-response answers by the rating method, using defined
criteria as a guide.
4. Evaluate all the students' answers to one question before proceeding to
the next question.
5. Evaluate answers to essay questions without knowing the identity of the
writer.
6. Whenever possible, have two or more persons grade each answer.

Assessment of Student Achievement. 6th Ed. By Norman E. Gronlund. p. 48.


6. Test Instructions for the Student Once the desired test items are
prepared, focus on the required information identifying the test. A complete
set of instructions, whether written, oral, or by visual aid, must be given to
the student. For written tests, a sample test item is given so that students
understand how they should answer the question (i.e., circle, write out, “X”).
The student instructions should specify the following:

a. References and materials to be utilized during the test (if any).
b. Any rules for the test (e.g., “No talking.”)
c. Time allowed for each section or for the whole test.
d. How to proceed with the test (i.e., individually, from part to part, from
page to page or whether to wait for a signal.)
e. Procedures to follow after completing the test.
f. School’s policy on cheating.

Student evaluation instructions are covered in Chapter 4, Section 4400.

7. Test Instructions for the Test Administrator/Proctor Specific
instructions need to be written for the test administrator/proctor so that
there is uniformity in how the test is administered. The instructions
should tell what is required for preparation in giving the test, how the test is
to be administered, and how remediation is handled.

a. Instructions for Preparing to Give the Test should specify:

1) What the testing materials are.
2) Where to gather the testing materials.
3) How many students can be tested at a time, if there is a limit.
4) Required testing environment (e.g., computer classroom, motor pool).
5) Seating arrangements (if applicable).
6) Preparation of a “Testing” placard to be displayed outside the testing
environment.

b. Instructions for Administering the Test should specify:

1) Whether the students can use references or other materials during
the test.
2) The requirement to inform students of the school’s cheating policy.
3) Amount of time the students are given to complete the test.
4) Whether the test administrator/proctor is to answer questions during
the test.

c. Remediation Instructions should specify:

1) Type of remediation that will be conducted.
2) Where the retest will be located.
3) Procedures for giving the retest.

Preparing detailed instructions makes the administration of the test more
likely to be standardized. The overall effect of this standardization is more
reliable test results on student progress and level of mastery.


3600. CONDUCT VALIDATION

Validation is a process of trying out instructional materials and course materials
prior to implementation to ensure that mastery of the learning objectives is
possible and reasonable. Validation involves examining the effectiveness of
instructional materials by identifying strengths and weaknesses. The instructional
material should be presented to members of the target population to determine
its effectiveness. If the instruction does not enable students to reasonably
achieve mastery, it is revised until it does.

3601. METHODS OF VALIDATION

1. Curriculum Validators There are a variety of methods for validating
instruction. Validation of instructional materials should involve as many methods
as possible. If all methods are to be used, they should be conducted in the order
in which they are presented below. Personnel other than the curriculum
developer(s) should conduct the validation to enhance objectivity. The personnel
conducting the validation are referred to as curriculum validators.

2. Subject Matter Expert (SME) Technical Data Review SME technical
data review involves reviewing course materials to ensure the technical accuracy of
instructional material content. Although the instructional materials are not in final
form at this stage, the content should still support the information provided in
technical manuals and orders, job guides, and checklists. SME participation will help
identify specific problem areas and provide additional technical data.

3. Curriculum Validation Teams (CVT) The CVT is a method of validation in
which a team composed of an experienced jobholder, a novice, a supervisor, an
instructor, and a curriculum developer meets to review the instructional materials.
The curriculum validator will coordinate the meeting as a facilitator only. As with the
SME technical data review, the instructional materials are not yet in final form. Each
of the participants of the CVT will examine the material from their different
perspectives, ensuring that materials are technically accurate and instructionally
sound and that the learning level is appropriate to the target audience. For
instance, a novice can point out gaps in the content that may be unnoticeable to
SMEs, or vice versa. If there are disagreements among participants, a technical
data review involving all participants may be assembled to resolve the issue.

4. Pilot Course In this validation method, instructional materials in final form
are presented to a target population group. This validation method is important
because it takes into account individual student learning differences. Student
samples should represent the entire range of the skill and knowledge level of the
target population. Instructional materials should be presented under normal
environmental conditions. For example, if the materials are intended for classroom
or field use, that is the environment in which the trials should be conducted. The
decision to use a pilot course as a validation method is based on the availability of
the necessary members of the target population and on time.

a. Small Group In a small group validation, the curriculum validator presents
instructional materials to a small group (2-4 individuals) of the target
population to determine if mastery can be attained.

b. Large Group During large group validation, the lesson plan is presented
to a group of 5 or more people for validation. Presenting the lesson to a
large group allows many people from different learning perspectives to
receive the instruction. If time is a constraint, large group validation can be
conducted concurrently with implementation.


5. Validation at First Implementation This type of validation involves
presenting instructional materials, in their final form, to members of the target
population at first implementation. In this case, validation and implementation are
conducted concurrently for one presentation of a scheduled class. This is NOT the
preferred method of validation, and is done only when there is not enough time to
conduct validation of materials prior to implementation. Validation at first
implementation should only be done as a last resort.

3602. TYPES OF DATA

The following are types of data gathered during validations. Depending upon the
type of validation, data may vary in quantity.

1. Data Collected from Students Student data are collected to determine
the attitude of students when they are presented with instruction, particularly
anything that kept them from attaining mastery of the learning objectives.
Additional student background information including age, time in service, past
experience, past academic experience, current job assignment, etc., should also be
collected. In the collection of data from students, students should provide their
comments on the following:

a. Length of instruction.
b. Comprehension of instruction.
c. Student interest/motivation level.
d. Realism to the job.

2. Instructional Material Data Information on the effectiveness of the
instructional material should be gathered from instructors, SMEs, students, and
curriculum developers. These data can include effectiveness of:

a. Lesson plan.
b. Student outline.
c. Supplemental student materials.
d. Media.
e. Tests (see Chapter 5, section 5300, for
procedures for analyzing test items).
f. Practical applications.


3. Instructional Procedures Data Data on the effectiveness of the delivery
system (instructional methods and media) should be gathered from instructors,
SMEs, students, and curriculum developers. These data may include effectiveness
of:

a. Method of instruction.
b. Order of instruction (training schedule).
c. Instructor presentation.
d. Number of instructors.
e. Instructional setting.

4. Test Item Data During validation, test items should be analyzed to
determine if they measure the knowledge or skills required by the learning
objectives. Test items should also be analyzed for reliability to determine if they
produce consistent results. This is done through a process called test item analysis.
Test item analysis is a set of procedures for evaluating the effectiveness of test
items. Item analysis will identify which test items need to be revised or rejected. It
is critical to conduct item analysis during validation prior to course implementation to
ensure that the test items are valid. Chapter 5, Section 5300 presents detailed
procedures for conducting test item analysis.
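
Two statistics commonly used in test item analysis can be sketched as follows. This is only an illustration of the general technique: the difficulty index (proportion of students answering an item correctly) and a discrimination index (upper-group minus lower-group proportion correct). The SAT Manual's own procedures are those in Chapter 5, Section 5300, and the data below are invented.

```python
def difficulty(item_results):
    """Proportion of students answering the item correctly (0.0 to 1.0)."""
    return sum(item_results) / len(item_results)

def discrimination(item_results, total_scores):
    """Upper-group minus lower-group proportion correct. A positive value
    means stronger students got the item right more often, suggesting the
    item produces consistent, discriminating results."""
    paired = sorted(zip(total_scores, item_results), reverse=True)
    half = len(paired) // 2
    upper = [r for _, r in paired[:half]]
    lower = [r for _, r in paired[-half:]]
    return difficulty(upper) - difficulty(lower)

item = [1, 1, 1, 0, 1, 0, 0, 0]            # 1 = correct on this item
totals = [95, 90, 88, 80, 75, 60, 55, 50]  # overall test scores
print(difficulty(item))              # 0.5
print(discrimination(item, totals))  # 0.5
```

Items with very low discrimination (or very extreme difficulty) are the ones item analysis would flag for revision or rejection.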

3603. STEPS FOR VALIDATING INSTRUCTION

1. Review Formal School/Detachment Standing Operating Procedures
(SOP) The information needed to plan validation may be contained in the
school validation plan located in the school's Academic SOP. This document may
provide additional guidance on types of validation trials, data collection methods,
and appropriate authority for approval.

2. Plan and Schedule Validation Plan and schedule validation to allow
enough time to incorporate any improvements into the lessons prior to the start of
the course. This is a critical step that must be well thought out. Validation is
planned so that all trials can be conducted, data analyzed, and revisions made prior
to implementation of the course. During this step, the type of data to be gathered
(see Section 3602) and the type of validation methods (see Section 3601) are
determined.


3. Determine Data Collection Procedures Once the validation method is
selected, determine the system for collecting data. These data may be collected
using surveys, questionnaires, interviews, group discussions, observations, or other
methods (see Chapter 5, Section 5603). Curriculum validators should ask open-
ended questions so that participants can genuinely express their feelings, opinions,
and perceptions of the effectiveness of the instruction. Curriculum validators must
keep in mind that the purpose of validation is to obtain information that will improve
instruction.

4. Implement Validation Plan Using the validation methods planned in Step
2 and the data collection procedures identified in Step 3, conduct the validation.

a. SME Technical Data Review Provide SMEs with instructional materials or
instructional material content. Directions should be provided, as well as the
objectives of the validation.

b. CVT The curriculum validator gathers members for the CVT and serves as
the facilitator of the meeting. The curriculum validator should ensure the following:

1) All participants contribute to the meeting.
2) Recommendations for revisions are understood by all participants and
are recorded.
3) Any other feedback concerning the effectiveness of instruction is
collected and recorded.

c. Pilot Course Trial A pilot trial is the most comprehensive and time-
consuming validation to conduct. It involves conducting an actual class with a
group of students within or similar to the target population group. To conduct a
pilot trial, the curriculum validator will:

1) Gather students from the target population to receive the instruction.
2) Arrange the instructional setting as it will be arranged for the actual
implementation of the class.
3) Identify and brief instructors who will participate in the field trial.
4) Develop questionnaires to collect data from students and instructors
concerning their attitudes toward the effectiveness of instruction.
5) Ensure the instruction is conducted as it will be implemented.


5. Interpret and Record Validation Results Interpret and record data
from the validation. Since there is no specific format for doing this, curriculum
validators should record the results in a manner that meets their validation
objectives. For example, data can be summarized in a brief paragraph annotating
how many comments were made and the trends found, detailing instructional
strengths and deficiencies. If the data were collected using a scaled rating system,
the answers should be averaged and presented as an average response for each
question. This summation should also include recommendations for solutions to
correct instructional deficiencies. See Chapter 5, Section 5300 for detailed
procedures concerning the analysis and interpretation of data.
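
Averaging scaled-rating responses per question, as described above, can be sketched in a few lines. The question labels and the 1-5 ratings below are illustrative only, not actual validation data.

```python
# Ratings collected from validation participants (1 = poor, 5 = excellent).
responses = {
    "Length of instruction":        [4, 5, 3, 4],
    "Comprehension of instruction": [2, 3, 2, 3],
    "Realism to the job":           [5, 4, 5, 5],
}

# Present an average response for each question.
averages = {q: sum(r) / len(r) for q, r in responses.items()}
for question, avg in averages.items():
    print(f"{question}: {avg:.2f}")
```

A low average on a question (here, comprehension) would be recorded as an instructional deficiency, with a recommended solution, before the results are reported for approval.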

6. Report Validation Results Once validation data are collected and the
results are summarized, make recommendations for correcting problems. The
summarized results will indicate what materials, methods, or media need revision.
Report the validation results to the validation authority for approval.

3604. VALIDATION AUTHORITY

The responsibility for validation of instruction ultimately rests with the formal
school/detachment commander. The method of validation is based on resources
available. The commander provides guidance on conducting validations through a
validation plan, usually found in the Standing Operating Procedures (SOP). The plan
will identify who has validation authority. Decisions about how to validate are based
on resources, as outlined below.

For example, the following decisions concerning validation must be made by the
formal school/detachment:

 What personnel are available to conduct the validation (SMEs,
instructors, curriculum developers, etc.)?

 How many methods of validation (see Section 3601) will be used in
validating course material? What specific revisions to instructional
materials can be undertaken and still meet the planned course
schedule?

 How do we obtain members of the target population for validation?
If actual members of the target population are not available, then
the school director should select individuals with backgrounds as
similar as possible to those of the desired target population.

 How much time is available? If your time to design and develop a
course is limited, you will have to choose a validation method that
fits within the time constraints.


3700. DEVELOP A COURSE DESCRIPTIVE DATA (CDD) & PROGRAM OF
INSTRUCTION (POI)

Every Marine Corps formal school/detachment must have an approved Program of
Instruction (POI). A POI documents a formal school's plan for instructing Individual
Training Standards (ITS). Specifically, a POI describes a course in terms of
structure, delivery methods and media, length, intended learning objectives, and
evaluation procedures. It also serves as a historical record that reflects the
continual evolution of the course. An important element of the POI is the approved
Course Descriptive Data (CDD) document. The CDD provides a summary of the
course including instructional resources, class length, and curriculum breakdown.
The CDD provides the justification and documentation for development or
refinement of POIs taught at Marine Corps formal schools and training detachments.

3701. COURSE DESCRIPTIVE DATA (CDD)

1. Description An approved CDD authorizes the school/detachment to develop a
new course or it authorizes a change to an existing course. The CDD does the
following:

 Indicates the school’s concept of how the course will meet the training
needs as established in the ITS order or T&R manual.
 Identifies resource requirements needed to conduct the course from which
decisions can be made.
 When approved, it authorizes further course development or refinement and
commits TECOM resources for implementation.

A school/detachment may submit a CDD
and POI as a proposal to change a course for a number of reasons, to include
departure from requirements published in an ITS order or T&R manual; new
equipment; and revised tactics, techniques, and procedures. Full justification for
any changes must accompany the revised CDD. Normally, the justification for
change is contained in the Record of Proceedings (ROP) written after a CCRB.

a. Formal School/Detachment Responsibilities Formal schools or
detachments should submit a MCAIMS-generated CDD to CG, TECOM
(GTB/ATB) to justify resource requirements for a new or revised course of
instruction. If no significant changes are made, then a revised CDD/POI capturing
all minor changes must be submitted every three years. A formal school cannot
implement a course without a TECOM-approved CDD. All formal schools and
detachments will review active CDDs annually in connection with TECOM Financial
Management Branch's annual budget data call.

b. Submission and Approval of CDD The CDD is one of the documents
that TECOM uses to manage formal school instructional requirements.

1) A CDD is submitted to TECOM for review, staffing, and approval. TECOM
(GTB/ATB) has staff cognizance for CDD review, coordination, and
approval.


2) New Course of Instruction A formal school/Detachment will submit a
CDD and a cover letter requesting approval to add a new course. The
cover letter should address why the course is required, what deficiencies
it will correct, and why it should be conducted in a formal school setting.
MCO 1553.2_ describes the requirements for CDDs for new courses.

3) TECOM (FSTB) records the information contained in the CDD, along with
data collected from the Occupational Field (OccFld) sponsor at
Manpower and Reserve Affairs, into the Training Resource Requirement
Management System (TRRMS) database. TRRMS processes this
information and produces a Training Input Plan (TIP) reflecting the
annual instructional requirements and a four-year instructional plan for
each formal school.

4) The CDD is also a source document for assignment of students to formal
courses of instruction. For each course listed on the TIP, TRRMS
generates a Training Quota Memorandum (TQM), which is loaded to the
BY NAME ASSIGNMENT (BNA) system, an automated information system
that enables order-writing commands to assign specific Marines to
available course seats. These memoranda translate the annual TIP
requirement into actual class seat quotas and form the basis for order
writing.

c. Elements of a CDD (see APPENDIX B for a sample CDD). Each element
of a CDD, in order of presentation (appearing as items #1 through #24 in a
CDD), is addressed below:

1) Course Title The course title must appear as it is listed in MCO
P1080.20_ (JUMPS/MMS Codes Manual) unless a change is required
or the CDD is for a new course.

2) Location Record the complete address for each location the course
is taught.

3) Marine Corps Service School Code (SSC) The SSC must
correspond to the SSC listed in MCO P1080.20_ (JUMPS/MMS Codes
Manual). If the course is new, record “To be determined.”

4) Other Service Course Number Use other pertinent service
course numbers as provided by other branches of the military. If
other service course numbers are not applicable, record “NA.”

5) Military Assistance Program Articles and Service List
Number The military assistance program articles and service list
number is a seven digit alphanumeric code used to identify a course
intended for foreign military instruction. If this type of instruction is
not applicable, record "NA.”

6) Purpose Include a concise statement about the goals of the
instructional program.

7) Scope Provide a list of the main subjects covered in the course.
The list should be comprehensive, including all topic areas.

8) Length (Peacetime) Record the total number of instructional
days required for the course. The peacetime instructional week
includes an average of 40 hours (8-hour day x 5 work days). Do
not include holidays or weekends where instruction does not
occur. TECOM’s Financial Management Branch, will reconcile any
exceptions, such as holidays, by comparing the number of
instructional days to the TIP.

9) Curriculum Breakdown (Peacetime) Provide a breakdown of
the curriculum in academic and administrative hours (see Section
3101). The Peacetime instructional week includes an average of
40 hours (8-hour day x 5 work days), 35 of which will be
academic time (Administrative time exceeding five hours per
week must be conducted after hours or justified in a waiver
request). For detailed organizational and tracking purposes of
instructional hours, academic hours should be further broken
down into methods (e.g., practical application, lecture,
demonstration, performance evaluation, written evaluation).
Administrative hours should also be broken down into appropriate
methods. See the MCAIMS Users Manual for details.

10) Length (Mobilization) Record the total number of instructional
days required for the course during wartime mobilization. During
mobilization, the instructional week averages 60 hours (10-hour
day x 6 days). For courses on three shifts with equipment or
facility constraints, the mobilization instructional week averages
48 hours (8-hour day x 6 days). This time includes both
academic and administrative hours. If the course will discontinue
upon mobilization, enter “NA.” If the course length is the same
during mobilization as in peacetime, click “Same as peacetime.”
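To illustrate the arithmetic behind items 8 and 10, the sketch below converts a hypothetical 280-hour curriculum into instructional-day counts under the day lengths given above (the course hours are invented for illustration; actual lengths are reconciled by TECOM against the TIP):

```python
import math

# Illustrative only: convert a course's total curriculum hours into
# instructional days under the week lengths given in the SAT Manual.
# Peacetime: 8-hour day x 5 days; mobilization: 10-hour day x 6 days;
# constrained mobilization (three shifts): 8-hour day x 6 days.

def instructional_days(total_hours, hours_per_day):
    """Round up to whole instructional days; a partial day counts as a full day."""
    return math.ceil(total_hours / hours_per_day)

# Hypothetical 280-hour course:
peacetime = instructional_days(280, 8)       # 8-hour peacetime day
mobilization = instructional_days(280, 10)   # 10-hour mobilization day
constrained = instructional_days(280, 8)     # 8-hour day, 6-day week

print(peacetime, mobilization, constrained)  # 35 28 35
```

Note that the constrained-mobilization course needs the same number of instructional days as the peacetime course; only the six-day week shortens its calendar length.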

11) Curriculum Breakdown (Mobilization) Provide a breakdown
of the curriculum in academic and administrative hours for
mobilization. During mobilization, it is likely that academic hours
will increase and administrative hours will decrease. If the course
will discontinue upon mobilization, enter “NA.” If the curriculum
breakdown is the same during mobilization as in peacetime, click
“Same as peacetime.”

12) Maximum Class Capacity Record the maximum number of
students who can receive instruction using available resources.
Resources include classrooms, messing, billeting, equipment,
budget, and personnel.

13) Optimum Class Capacity Record the number of students per
class that can take maximum advantage of all the resources (e.g.,
facilities, equipment, instructional capabilities) available to the
school.

14) Minimum Class Capacity Record the minimum number of
students per class that will make the course cost effective.

15) Class Frequency Record the number of classes required to
support the TIP for the current year.
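Items 12 through 15 are related arithmetically: the annual TIP requirement divided by the class capacity yields the class frequency. The sketch below illustrates this relationship with hypothetical figures (no actual TIP data is used):

```python
import math

# Illustrative sketch of the relationship between items 12-15: the number
# of classes needed to satisfy an annual TIP requirement at a given class
# capacity. All figures below are hypothetical.

def classes_required(annual_tip_quota, class_capacity):
    """Classes per year, rounding up so every required student gets a seat."""
    return math.ceil(annual_tip_quota / class_capacity)

annual_quota = 450       # hypothetical annual TIP requirement
optimum_capacity = 40    # hypothetical optimum class capacity (item 13)

print(classes_required(annual_quota, optimum_capacity))  # 12
```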

16) Student Prerequisites List the prerequisites that personnel
must meet to attend the course. This information can be found in
the Target Population Description (TPD) developed in the Analysis
Phase and filed at the school.

17) MOS Received Record the Military Occupational Specialty (MOS)
assigned to the student upon successful completion of the course.
If the course does not result in an MOS assignment, record
“None.”

18) Quota Control Record the name of the agency managing course
quotas. The OccFld sponsor can provide this information.

19) Funding Record the name of the agency that funds temporary
additional duty (TAD) expenses for students attending the course. In
those instances where the using agency must also bear cost, an
explanatory statement must be contained in this section. Courses
are funded from a variety of sources, depending upon a number of
factors such as student type, length of course, and career track.
Basic guidelines for schools to determine the funding source are:

 Courses over 139 days or 20 weeks at one location are PCS and funded by
MMOA/MMEA.
 Courses less than 139 days or 20 weeks may be unit-funded or TECOM-
funded.
 Entry-level pipeline students – normally funded by MMOA or MMEA.
 Lateral Move students – may be unit-funded or TECOM-funded.
 Reserve students – normally funded by MARFORRES.

20) Reporting Instructions Designate to whom the student will
report when arriving for a course of instruction, to include
information on transportation and directions (both during and after
working hours). Contact phone numbers, fax numbers,
organizational e-mail, and website addresses must be included.
Also include a statement indicating the availability of
government billeting and messing. Provide telephone number and
office contact information to obtain billeting reservations or confirm
that government quarters are not available. If there is more than
one school location, include a separate set of instructions for each
location.

21) Instructor Staffing Requirements Instructor staffing
requirements are based on the academic course hours, computed in
accordance with ITRO agreements, and calculated automatically by
MCAIMS in the Instructor Computation Worksheet of the POI. Although
instructor-staffing increases may
be validated based on an approved CDD, the CDD itself will not
generate a table of organization (T/O) change. After approval,
separate correspondence must be submitted to CG, TECOM G-1
requesting a T/O change.

This section of the CDD lists the school's T/O number and its date, and the
instructor and instructor supervisor billets by line number, grade, billet name, MOS
requirements, and number, indicating those line numbers not currently filled. The
Instructor Computation Worksheet (ICW) used to compute requirements should be
included as an appendix to the CDD with the POI. Additional comments as to
whether the billet is filled or not are required.

For formal school/training detachments located at another service's location,
refer to MCO 1580.7_ and compute instructor-staffing requirements using the Inter-
service Training Review Organization (ITRO) manpower computation formula. The
ITRO Manpower Analysis Subcommittee Procedures Manual detailing this formula
may be obtained by contacting TECOM G-3.
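The actual manpower computation is defined in the ITRO Manpower Analysis Subcommittee Procedures Manual and performed automatically by MCAIMS; the sketch below is not that formula, only a simplified illustration of the kind of calculation involved. All inputs are hypothetical:

```python
import math

# NOT the ITRO formula -- a purely illustrative, simplified model of an
# instructor-staffing computation. MCAIMS performs the real calculation in
# the Instructor Computation Worksheet. All inputs below are hypothetical.

def instructors_required(contact_hours_per_class, classes_per_year,
                         instructor_ratio, available_hours_per_instructor):
    # Total platform hours the school must cover in a year, scaled by the
    # number of instructors needed in front of each class simultaneously.
    annual_contact_hours = (contact_hours_per_class * classes_per_year
                            * instructor_ratio)
    return math.ceil(annual_contact_hours / available_hours_per_instructor)

print(instructors_required(
    contact_hours_per_class=245,        # academic hours per class iteration
    classes_per_year=12,
    instructor_ratio=2,                 # e.g., 2 instructors per class period
    available_hours_per_instructor=1000,
))  # 6
```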

22) School Overhead List those billets, other than instructors,
required to operate the school. Personnel on school overhead often
perform curriculum development, test and evaluation, and equipment
maintenance, or fill billets such as Company GySgt, 1stSgt, XO, and CO. If
there is a need for more personnel (evaluators, curriculum developers, etc.)
than listed on the T/O, include them here.

23) Training/Education Support Requirements Item 23 is used to
list resource requirements other than personnel requirements
defined in the previous two paragraphs. List all training/education
support requirements, specifically emphasizing the portions that
exceed already on-hand items or quantities in the school’s approved
Table of Equipment (T/E) that are directly related to the course of
instruction (e.g., training devices, Class V, operations, and
maintenance funds). Additionally, consumables or locally
purchased items should be identified here if deemed appropriate by
the Commanding Officer/Director of the school to account for all
items used by the school. Format is flexible, but should contain the
following information: (a) Table of Authorized Material Control
Number (TAMCN); (b) National Stock Number (NSN); (c) Unit of
Issue (U/I); (d) Quantity on Hand and Quantity Short (which together
total the quantity required); (e) Unit cost; (f) Extended cost (quantity
required x unit cost). Increases in training/education support
requirements are reviewed by TECOM (G-3 and Financial
Management) for consideration in the planning, programming, and
budgeting process. Although training/education support increases
may be validated based on an approved CDD, the CDD itself will not
generate a T/E change. Once approved, separate correspondence
must be submitted to CG, TECOM (Financial Management)
requesting a T/E change.

 If an approved T/E does not exist, list all training/education support
requirements necessary to implement the course.

 The required format for a T/E item is the Table of Authorized Material
Control Number (TAMCN), National Stock Number (NSN), Unit of Issue
(U/I), Quantity On Hand, and Quantity Short (which will total quantity
required), unit cost and extended cost (quantity required x unit cost). This
information can be obtained from the Consolidated Memorandum Receipt
(CMR) for the formal school/detachment.

 For facilities, identify the building(s), classrooms, or ranges by number
and/or square feet.

 The required elements for listing Class V items are: Department of Defense
Identification Code (DODIC) number, ammunition type, number of rounds
per student, number of demonstration rounds per class, total number of
rounds per class, total number of rounds per year. TECOM (G-4 Ammo
Section) reviews Class V items by ensuring the round count matches the
round count contained in the applicable ITS order/T&R manual.
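The T/E and Class V arithmetic described above can be illustrated as follows; the quantities and costs below are placeholders, not real TAMCN/NSN/DODIC data:

```python
# Illustrative bookkeeping for item 23. te_line() applies the required T/E
# format (quantity required = on hand + short; extended cost = quantity
# required x unit cost), and class_v_rounds() shows the round-count roll-up
# that TECOM (G-4 Ammo Section) checks against the ITS order/T&R manual.
# All figures below are hypothetical.

def te_line(qty_on_hand, qty_short, unit_cost):
    qty_required = qty_on_hand + qty_short
    return qty_required, qty_required * unit_cost   # (quantity, extended cost)

def class_v_rounds(rounds_per_student, students_per_class,
                   demo_rounds_per_class, classes_per_year):
    per_class = rounds_per_student * students_per_class + demo_rounds_per_class
    return per_class, per_class * classes_per_year  # (per class, per year)

qty, extended = te_line(qty_on_hand=6, qty_short=4, unit_cost=250.00)
print(qty, extended)        # 10 2500.0

per_class, per_year = class_v_rounds(50, 40, 100, 12)
print(per_class, per_year)  # 2100 25200
```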

24) ITS Task List Record the tasks/events taught in the course. This
task list must be an approved ITS task list. If an approved ITS
does not exist, include a locally generated task list (See Chapter 1,
Analysis phase for the steps involved in generating a task list).

 Recommended Changes to ITS The CDD is one document for the formal
school/detachment to recommend changes to ITSs. If a review of the
approved ITS task list reveals ITS tasks that should be deleted or added,
these recommended revisions must be recorded here. Recommendations
should be based on a thorough review of the ITSs/T&Rs and any available
course or job documentation obtained by the formal school/detachments
concerning the Military Occupational Specialty (MOS)/job. When
recommended revisions to ITSs are submitted with a CDD, this initiates the
process with TECOM (GTB/ATB) for getting changes made to the ITSs.

 Added ITS ITSs that are recommended for addition are written as ITS/T&R
task behaviors and placed in the task list under the duty area and in the
recommended order in which they would appear if they were approved. If a
new duty area is recommended for addition, it should be named and include
the new ITS/T&R components. An appropriate ITS/T&R designator number
for each new ITS/T&R event should be included representing the MOS.

 Deleted ITS For every recommended addition and/or deletion to the
approved ITS task list, justification must be provided. Justification is
inserted into the CDD Notes portion of the CDD immediately following
item #24. Each ITS task behavior designator is listed, followed by the
justification for the recommended change. For guidance concerning
justification, contact TECOM (GTB/ATB).
 Approval Procedures. In their review of the CDD, TECOM (GTB/ATB)
will approve or provide guidance concerning the recommended changes to
the ITS task list. If the recommended changes are approved, the ITS task
list is renumbered accordingly, using the next available task number within
the duty area. The POI will then be developed based on the new
approved ITS task list and its numbering system. If TECOM (GTB/ATB)
does not approve the recommended changes, the formal
school/detachment will be provided with guidance to follow.

d. Following CDD Approval Once the CDD has been approved, the approved
task list is recorded in Appendix B of the POI. When submitting the POI for
approval, record in item #24 the following wording: “Refer to Appendix B for task
list.”

 Optional Items in a CDD Optional items are those items that the formal
school/detachment feels amplify or clarify the information contained in the
CDD. These items may be included in the CDD and POI as appendices.
Contact TECOM (GTB/ATB) for guidance before including additional optional
items. Examples of optional items include the sequencing of the lessons,
training/education support requirements, complete listing of TLOs/ELOs,
student performance evaluation checklist, instructor prerequisites, etc.

e. Submission and Approval of CDD Formal schools/detachments submit
their CDD for staffing and approval to CG, TECOM (GTB/ATB) as follows:

 New Course of Instruction A formal school/detachment will submit a
CDD and a cover letter requesting approval to add a new course. The cover
letter should address why the course is required, what deficiencies it will
correct, and why it should be conducted in a formal school setting.

 Revised Course of Instruction The CDD requires resubmission only if
there is a change to one of the previously approved elements of the course.
The current TECOM approval authority for the CDD must be cited in the
forwarding correspondence.

3702. PROGRAM OF INSTRUCTION (POI)

The POI serves as a formal school/detachment plan for implementing and
evaluating a formal course of instruction. A POI is the management tool for
conducting a course. At a minimum, a formal school/detachment must have a
locally approved (by signature of school/detachment commander) POI for every
course of instruction it delivers. For each school, the POI is used as an important
element in the documentation and historical record that reflects the evolution of a
course. Accordingly, a copy of the POI is maintained at the school to document this
evolution (see Appendix B for an abbreviated version of a POI).

1. POI Development Process Using the information from the approved CDD,
the formal school will develop the POI. MCO 1553.2_, Management of Marine Corps
Formal Schools and Training Detachments, contains POI submission and approval
requirements and procedures. The curriculum module of the Marine Corps
Automated Instructional Management System (MCAIMS) is used to develop the POI.

2. POI Content Requirements Development of the POI primarily involves
the consolidation of materials produced during the Analysis and Design Phases.
MCO 1553.2_ mandates minimum POI content requirements. TECOM (GTB/ATB)
must clear any additional items for inclusion in the POI prior to submission.

3. POI Requirements Listed In Order:

a. Title Page The title page provides information necessary to identify the
document. This includes the course title, SSIC, school name/address, and effective
date. The effective date is left blank until the POI is approved; the approval date
is then recorded. Each time a revised POI is approved, the new approval date
is recorded.

b. Certification Page The signed certification page signifies that the CG,
TECOM has reviewed and approved the POI. The approved POI directs the school
commander to implement the course of instruction. For local approval, the school
commander will sign a local certification page.

c. Record of Changes Page The record of changes page is a chronological
log of all changes made to a POI. Each entry must indicate the change number,
date of change, date received, date entered, and the signature of the individual
entering the change.

d. Table of Contents This table details the content of the POI and is
arranged by section number and section title. The table of contents should include
by section the following: CDD, Summary of Hours, Scope of Annexes, Concept
Cards, Student Performance Data, and Distribution List.

e. CDD Section I of the POI consists of the CDD with preface. The preface
should include a brief purpose statement and the address where comments and
recommendations concerning the POI may be sent. The 24 elements of the CDD
provide a summary of the course.

f. Summary of Hours Section II of the POI consists of a summary of the
course. Included are two items: a breakdown of the academic and administrative
hours, and revision data.

 All academic hours are organized by using annexes. Annexes
organize the concept cards contained in the POI into chapters or
topic areas. Annexes can duplicate the duty areas by which
ITSs/T&Rs are organized, or they may be some other organizational
grouping determined by the developer of the POI. Annexes A-Y are
for academic concept cards and annex Z is reserved for
administrative concept cards. Because MCAIMS automatically
calculates academic and administrative hours from each concept
card, the totals shown in this paragraph will match the instructional
hours represented on the concept cards and the curriculum
breakdown in the CDD (items #9 and #11).

 Revision data is listed by lesson designator, lesson title, and lesson
time expressed in hours. The previous and current lesson
designators and hours are listed (when applicable) and rationale is
provided for each change to these items.
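The consistency requirement noted above (concept-card hour totals matching the curriculum breakdown in CDD items #9 and #11) is enforced automatically by MCAIMS; the sketch below illustrates the check with hypothetical concept-card data:

```python
# Illustrative consistency check mirroring what MCAIMS does automatically:
# academic/administrative hours rolled up from the concept cards must match
# the curriculum breakdown reported in CDD items #9 and #11.
# The concept-card data below is hypothetical.

concept_cards = [
    {"annex": "A", "academic": 8.0,  "administrative": 0.0},
    {"annex": "A", "academic": 4.0,  "administrative": 0.0},
    {"annex": "B", "academic": 23.0, "administrative": 0.0},
    {"annex": "Z", "academic": 0.0,  "administrative": 5.0},  # Z = administrative
]

academic_total = sum(c["academic"] for c in concept_cards)
admin_total = sum(c["administrative"] for c in concept_cards)

cdd_breakdown = {"academic": 35.0, "administrative": 5.0}  # hypothetical item #9

assert academic_total == cdd_breakdown["academic"]
assert admin_total == cdd_breakdown["administrative"]
print(academic_total, admin_total)  # 35.0 5.0
```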

g. Scope of Annexes The scope of annexes carries a subheading,
Academic Subjects, and provides a description of the scope of each annex contained
in the POI.
If there is a difference in the scope between the conduct of the course during
peacetime and mobilization, it must be annotated here.

h. Concept Cards Section IV of the POI is made up of the concept cards
(see section 3200 concerning concept cards). Concept cards comprise the bulk of
the POI and provide a snapshot of all lessons, examinations, and administrative
events. An introduction is provided to describe the contents of the concept cards,
the location of the learning objectives report, and the summaries of instructional
hours.

i. Student Performance Evaluation Section V of the POI documents the
scope of the evaluation, standards for successful performance, and evaluation
procedures. Refer to the school SOP and MCAIMS users manual for guidance on
specific evaluation procedures. Student evaluation must be detailed and include, at
a minimum, the evaluation philosophy (mastery/non-mastery/GPA), methods of
evaluation (e.g., written, performance, remediation), Fitness Reports (if applicable),
Pro/Con marks (if applicable), and disposition of academic failures (recycle/MOS re-
designation procedures).

4. Distribution List This section is automatically generated by MCAIMS.

3800. ASSEMBLE A MASTER LESSON FILE


A Master Lesson File (MLF) is a compilation of living documents that provides
everything needed to conduct a lesson. The MLF is kept at the
formal school/detachment and serves as the central repository for all the
instructional and supporting materials for a given lesson. A MLF must exist for
each lesson taught. All copies of materials that are created for distribution must
come from the MLF. Since the MLF is a living document, it can be altered to fit
current doctrine or updated to provide better media, more complete handouts, new
methodology, etc. The MLF is constantly being improved and is the most up-to-
date file on what is occurring at the school for a particular lesson. Thus, it provides
accountability, documents the use of school resources, and most importantly,
provides continuity.

3801. MINIMUM REQUIREMENTS


In an academic MLF, nine items must be present. However, inclusion of
supplemental materials and media (i.e., the actual items) in the MLF may not
always be practical; therefore, they are not always required to be physically
present. For each of these items, there will also be a completed checklist. In a
lesson purpose class, the first two items are omitted.

1. Learning Analysis Worksheet The Learning Analysis Worksheet (LAW) is
required in the MLF because it documents the transition between ITS
tasks/events and learning objectives.

2. Learning Objective Worksheet The Learning Objective Worksheet
(LOW) is placed in the MLF because it describes the anticipated learning outcome,
provides a test item for each Learning Objective (LO), and contains the selection of
methods and media for that specific LO.

3. Concept Card A concept card is placed in the MLF because it is a quality
control document. The concept card is always located in the Program of
Instruction (POI), but for quick reference will be placed in the MLF. The concept
card provides a quick snapshot of the class (e.g., learning objective(s), method,
media, instructor-to-student ratio, references).

4. Operational Risk Assessment Worksheet (ORAW) The ORAW
documents the school plan to conduct training in the safest manner possible and
records the 5-step Operational Risk Management process as it relates to the
lesson. Refer to appendix B and MCO 1553.2_ for further guidance on the
preparation of the ORAW.

5. Instructor Preparation Guide This document is used to guide the
instructor in preparing for the lesson.

6. Lesson Plan No MLF is complete without a lesson plan, and no lesson can be
conducted without one. The lesson plan is written in such detail that an
alternate instructor, with minimal preparation time, could effectively deliver the
lesson.

7. Student Outline The student outline will be contained in the MLF.

8. Supplemental Student Materials Any other materials used to enhance
instruction or student learning during the class should be maintained in the MLF. If
the actual copies are not maintained in the MLF, a locator sheet is used to inform the
instructor where to locate these materials.

9. Media Media and/or a list of supporting papers are placed in the MLF. If the
actual media are not contained in the MLF (e.g., films, tapes, wallcharts), then
supporting papers that list the media required and where they are located should be
included. It may be possible to provide paper copies of slides, transparencies, or
wallcharts in the MLF. Any copyright authorizations related to the media should also
be filed here.
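The nine minimum items above lend themselves to a simple completeness check, of the kind the MLF checklists in APPENDIX C support. The sketch below is illustrative only; actual checklist requirements come from MCO 1553.2 and local SOP:

```python
# Illustrative completeness check for a Master Lesson File, based on the
# nine minimum items in section 3801. For a lesson purpose class, the first
# two items (LAW and LOW) are omitted.

MLF_ITEMS = [
    "Learning Analysis Worksheet",
    "Learning Objective Worksheet",
    "Concept Card",
    "Operational Risk Assessment Worksheet",
    "Instructor Preparation Guide",
    "Lesson Plan",
    "Student Outline",
    "Supplemental Student Materials",   # a locator sheet may substitute
    "Media",                            # supporting papers may substitute
]

def missing_items(mlf_contents, lesson_purpose=False):
    """Return required items not present, in checklist order."""
    required = MLF_ITEMS[2:] if lesson_purpose else MLF_ITEMS
    return [item for item in required if item not in mlf_contents]

# A lesson purpose MLF that lacks only its media:
on_hand = set(MLF_ITEMS[2:]) - {"Media"}
print(missing_items(on_hand, lesson_purpose=True))  # ['Media']
```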

3802. OPTIONAL COMPONENTS


Each formal school/detachment’s SOP will dictate which optional components must
be kept in a MLF.

Some examples of optional components are:

1. ITS Extract An extract from the applicable Individual Training Standard (ITS)
may be included as a foundation for the material being taught.

2. Approval Signature Most schools require verification by a supervisor in an
official capacity for any or all documents found in the MLF. This can be placed on a
separate form that depicts whose signature is necessary for approval.

3. Other Course Related Materials Any other item dictated by local SOP
should be contained in the MLF. These may include tests, training area requests,
and other items used during the lesson that aid in the delivery or evaluation of
that particular class.

3803. STEPS FOR ASSEMBLING A MASTER LESSON FILE

STEP 1  REVIEW MCO 1553.2 AND LOCAL SOP

Review the current copy of MCO 1553.2 (Marine Corps
Formal Schools and Training Centers Order) along with
your school’s SOP to determine the requirements of the
MLF.

STEP 2  GATHER DOCUMENTS AND CHECKLISTS

Once you have determined your requirements, gather all
documents along with their blank checklists.

STEP 3  COMPLETE THE MLF CHECKLIST

Complete checklists for each component of the MLF.
Sample checklists for each component may be found in
APPENDIX C of this Manual. Similar or additional
checklists for the optional MLF components may be
mandated by local SOP.

STEP 4  ARRANGE EACH DOCUMENT IN THE MLF

Arrange each document in the MLF in accordance with
your checklist. Ensure each checklist is completed to
verify that all required items are included in the MLF.

IMPLEMENT PHASE

[Sidebar figure: the SAT model — Analyze, Design, Develop, Implement,
Evaluate — highlighting the Implement Phase activities: Review Lesson
Material, Time-Critical ORA, Prepare for Instruction, Conduct Instruction,
Administer Student Evaluations, and After Lesson Management.]

In Chapter 4:

4000 INTRODUCTION 4-1
 Purpose 4-1

4100 REVIEW LESSON MATERIAL 4-2
 Purpose 4-2
 Review Course/Training Schedule 4-2
 Review Lesson Plan 4-2
 Review Student Materials 4-3
 Review Media 4-4
 Review Operational Risk Assessment (ORAW) 4-4
 Review Instructor Preparation Guide (IPG) 4-4
 Review Student Test 4-4

4200 TIME-CRITICAL OPERATIONAL RISK ASSESSMENTS (ORA) 4-5
 Purpose 4-5
 Identify Change(s) 4-5
 Apply 5-Step Process 4-6

4300 PREPARE FOR INSTRUCTION 4-7
 Purpose 4-7
 Instructional Environment 4-7
 Rehearsals 4-9
 Instructor Evaluation 4-12

4400 CONDUCT INSTRUCTION 4-13
 Purpose 4-13
 Effective Communication 4-13
 Conducting the Lesson 4-24
 Method Employment 4-25

4500 ADMINISTER STUDENT TESTS 4-36
 Purpose 4-36
 Types of Tests 4-36
 Methods of Testing 4-37
 Steps in Administering Student Tests 4-38

4600 AFTER LESSON MANAGEMENT 4-42
 Purpose 4-42
 Completing an AIR 4-42

4000. INTRODUCTION

During the Implement Phase the following is accomplished: reviewing the lesson
materials, preparing for instruction, conducting instruction, administering student
tests, and performing after-lesson management. This includes the instructor
reviewing the training schedule and class materials (to include tests), preparing
personnel and the training environment, and conducting rehearsals. Once the
instruction has been delivered, the instructor must administer student evaluations to
determine if the learning objectives have been met. The instructor must conclude the
instruction by completing an After Instruction Report (AIR) to document the course
data for later use in the Evaluation Phase.

4001. PURPOSE

Using the curriculum produced during the Develop Phase, the instructor executes
the instruction during this phase. The purpose of the Implement Phase is the
effective and efficient delivery of instruction to promote student understanding of
material, to achieve student mastery of learning objectives, and to ensure a transfer
of student knowledge and skills from the instructional setting to the job. This is
accomplished by effectively implementing the POI that was designed, developed,
validated, and approved during the previous phases.

The Implement Phase is made up of six separate sections, each of which has a
specific purpose.

1. Review Lesson Materials This section provides guidance on reviewing lesson
plans, student materials, media, the Operational Risk Assessment Worksheet
(ORAW), the Instructor Preparation Guide (IPG), and tests when preparing to
conduct a lesson.
2. Time Critical Operational Risk Assessment The Time-Critical Operational
Risk Assessment addresses preparing for and reacting to changes in the
instructional environment that affect safety.
3. Prepare for Instruction This section addresses preparing the instructional
environment, rehearsing, and preparing for instructor evaluations.
4. Conduct Instruction This section addresses effective communication, steps
in conducting a lesson, and how to employ instructional methods.
5. Administer Student Tests A step-by-step process for administering student
tests is provided in this section.
6. After Lesson Management This section provides the responsibilities of an
instructor after the lesson.

[Figure: Implement Phase flow — Input: CDD/POI and MLF; Process: Review
Lesson Materials, Prepare for Instruction, Conduct Instruction (Delivery of
Instruction), Administer Tests, and After Lesson Management; Output: Course
Graduates and Course Data.]
4100. REVIEW LESSON MATERIALS


Reviewing lesson material involves all those activities that instructors must perform
before preparing and conducting instruction. Instructors must have a clear
understanding of all aspects of the lesson. This is accomplished by reviewing the
course/training schedule, the MLF, and tests. By reviewing these items, the
instructor can identify any conflicts, questions, or potential problems before the
rehearsals begin. More importantly, the instructor can make required adjustments
prior to delivering the instruction to the students. The instructor must ensure the
lesson plan, student materials, and media all have the same information.

4101. REVIEW COURSE/TRAINING SCHEDULE

The instructor should review the schedule as early as possible before instruction
begins. This allows the instructor ample time to deal with any conflicts or problems.
By reviewing the schedule early, the instructor has time to schedule resources (e.g.,
ranges, weapons, or transportation), rehearsals (e.g., a dress rehearsal in the
instructional setting), and any administrative requirements (e.g., printing of student
materials).

4102. REVIEW LESSON PLAN

Detailed lesson plans ensure that the instructor has all the critical information
needed to maximize student learning. The purpose of reviewing the lesson plan is to
ensure it contains all of the required components; to match the learning objectives
to the information in the lesson plan; and to personalize the lesson plan to the
instructor’s style of delivery. After reviewing the lesson plan, the instructor should
fully understand lesson content and have confidence in the detailed script that allows
for the smooth and effective delivery of instruction.

1. Lesson Plan Personalization The instructor will be provided with the
approved lesson plan for the block of instruction. The instructor personalizes the
lesson plan, tailoring it to his or her style of teaching. Lesson plan personalization
allows the instructor to make the class unique without deviating from the approved
content. Personalization includes adding subject matter details, related personal
experiences, and discussion topics which may be needed to cover the topic in
greater depth. Personalization also includes the addition of notes to indicate when
to stress a point, relate a personal experience, or use an example or analogy.
2. Subject Matter Detail Use this type of information to provide technical data
such as purposes, descriptions, facts, operations, and functions. Course reference
materials provide this information.
3. Instructional Techniques Use carefully written questions, well-planned
media, or additional student/instructor activities to enhance the lesson.


4. Personal Experience Relate personal on-the-job experiences to the lesson to
increase student interest. Relating personal experiences has the positive effect of
reinforcing the practical application of the material. It also serves to increase
student interest and motivation.
5. Examples and Analogies When possible, support main points of the lesson
plan with examples and analogies to simplify the concepts or ideas being taught. Use
them as a part of personalization for each lesson. For example, if the lesson is on
the way sound waves travel through air, but the class has difficulty understanding
that concept, then perhaps an analogy, such as “it is similar to the way ripples
travel through water after a stone is dropped,” will help them understand.

4103. REVIEW STUDENT MATERIALS

Student materials assist the instructor in the delivery of instruction by providing
tools that stimulate the learner and reinforce key concepts. An instructor influences
the transfer of learning by the way the content of the Master Lesson File (MLF) is
used. There are two types of student materials: student outlines and supplemental
student materials. All student materials must be reviewed to ensure they match and
support the lesson. Using outdated or irrelevant materials must be avoided at all
costs. The students' performance and motivation will suffer when they are taught
knowledge and skills that no longer pertain to the job.

1. Student Outlines The student outline is the primary document that supports
the instruction. This outline provides the student with a general structure to follow
during the class and a conceptual framework that highlights the main ideas of the
class. The primary purpose for reviewing the student outline is to ensure it is
written in proper terms for the student, not the instructor, and to verify that it
contains all required components.

2. Supplemental Student Materials Supplemental student material is any
material, in addition to the student outline, provided to the student prior to or
during instruction. Supplemental student materials may include advance handouts
to prepare the student for class (e.g., orientation material), answer keys to quizzes,
additional articles for reading, and reference materials (e.g., technical manuals,
graphs, charts, formulas, figures, maps). The use and number of supplemental
student materials is optional, and they can be presented in any format that will be
easily understood by the student. The difference between supplemental student
materials and classroom media is that students take ownership of the former, while
the latter remains the property of the school. The primary purpose for reviewing
supplemental student materials is to ensure the information does not contradict the
information contained in the student outline and that it is an essential tool required
to meet the learning objective.


4104. REVIEW MEDIA

Instructional media can come in many forms. The primary purpose for reviewing
media is to ensure that they match the information in the lesson plan and are visible
to the students in the classroom.

4105. REVIEW OPERATIONAL RISK ASSESSMENT WORKSHEET (ORAW)

The purpose of the ORAW is to record the results of an Operational Risk Assessment.
During the Develop phase of the SAT, an ORAW is developed and then maintained in
the Master Lesson File (MLF). An ORAW is required for every lesson. However, some
lessons may not have any hazards identified for the lesson, in which case the ORAW
will state “No Identified Hazards.” Through the ORA, identifiable hazards are listed,
assessed, risk decisions are made, controls are developed and placed in the lesson
plan, and supervision of the controls is determined. Instructors must identify the ORA
and review it for safety issues pertaining to the lesson prior to the conduct of the
lesson. The ORA must also contain the Cease Training Criteria (CTC) for the lesson.
These criteria detail the circumstances when training must be stopped. The CTC is
specified in the safety brief of the introduction in the lesson plan. When there are
CTC associated with a practical application or other method, they are reiterated prior
to the practical application. For each safety control identified on the ORAW, a
corresponding control must be in the lesson plan where applicable. It is absolutely
imperative that this information is reviewed for accuracy to ensure the safety of the
students during the lesson. Ensure that the ORAW is valid by looking at the
approving signature and date. Any problems concerning the ORAW (such as
acquiring resources necessary to implement controls, etc.) must immediately be
brought to the attention of the appropriate authority.

4106. REVIEW INSTRUCTOR PREPARATION GUIDE (IPG)

The Instructor Preparation Guide is a required element of the Master Lesson File
(MLF). This checklist is created to provide the instructor with information that is
critical to the preparation for implementing the lesson. Detailed information is given
so that the instructor understands what resources are necessary for the lesson. Much
of the information provided under administrative information is copied from the
concept card. Though this checklist is an MLF item, instructors can make a copy so
that they can check off items when preparing for the lesson.

4107. REVIEW STUDENT TEST

The primary purpose for reviewing the student test is to ensure the instructor has a
complete understanding of how the students will be evaluated. Every block of
instruction begins with an introduction. One of the steps in the introduction is to
explain how the students will be evaluated. By reviewing the test, the instructor will
also determine if the test items are supported by the content of the lesson plan,
instructional materials, and student materials. The instructor must never use this
information to teach specific test items or questions.


SECTION 4200. TIME-CRITICAL OPERATIONAL RISK ASSESSMENT (ORA)

When instructing in the classroom, the need for Operational Risk Management
(ORM) is paramount. Instructors are entrusted with the safety of the students.
For this reason, ORM is needed in every aspect of training, whether the training is
in the classroom or out on the machinegun range. Hazards still exist in the
instructional environment. That is why the curriculum developer at the
schoolhouse has done an in-depth Operational Risk Assessment (ORA) and placed
a report of the assessment in the Master Lesson File (MLF). Though the in-depth
ORA is already done, the instructor can have an impact on controlling risk by
conducting a Time-Critical ORA when applicable.

4201. IDENTIFY CHANGE(S)

Change has been called the “Mother” of all risk. Changes can occur during the
preparation of the lesson, during the conduct of the lesson, and during the
administration of tests. When talking about changes, what is really being
discussed is what can happen in the instructional environment to change the
hazards documented in the in-depth ORA that was completed by the curriculum
developer. Remember, the instructor is in charge of the class and must ensure the
safety of the students. There are several tools that can be used to aid the
instructor in preparing for the lesson. Change Analysis and the What If Tool (WIT)
will help identify changes or potential changes. Once a change has been
identified, a determination can be made as to whether the associated hazard is a high
or low risk. If the risk is determined to be high, then training is ceased to ensure
the safety of the students. If the risk is determined to be low, then the instructor
applies a Time-Critical ORA to ensure the safety of students and continue with the
training. If possible, enlist the aid of experienced instructors when using these
tools. Their experience can shed light on areas an inexperienced instructor may
not have thought about.

1. Change Analysis Change Analysis is an excellent tool for use in time-critical
applications where change has been introduced. It is very simple to use:
simply look at a training event and ask, "What is different?" As the name implies,
Change Analysis allows the identification and assessment of hazards resulting from
planned or unplanned changes to a lesson. A case in point would be an event
that has been thoroughly planned and briefed, until something or somebody
introduced a change and the whole plan fell apart. Examples of when to apply
change analysis include when assessing the impact of:

a. Resource cuts, to include time, dollars, people or other resources.


b. Changes in weather or the environment.
c. Changes to equipment or supplies, such as a HMMWV truck instead of a 7-
ton truck.
d. Changes to the location of a classroom or the number of students
attending class.
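The "What is different?" check at the heart of Change Analysis can be sketched as a simple comparison between the lesson as planned and the lesson as it will actually run. This is an illustrative sketch only; the field names and values below are hypothetical examples, not items drawn from an actual ORAW.

```python
# Illustrative sketch of Change Analysis: compare the lesson as planned
# against the lesson as it will actually run and ask, "What is different?"
# The field names and values below are hypothetical examples.

def change_analysis(planned, actual):
    """Return each field that changed, mapped to its (planned, actual) values."""
    changes = {}
    for field in planned:
        if actual.get(field) != planned[field]:
            changes[field] = (planned[field], actual.get(field))
    return changes

planned = {"vehicle": "7-ton truck", "location": "Classroom 101", "students": 20}
actual = {"vehicle": "HMMWV", "location": "Classroom 101", "students": 25}

# Each difference surfaced here is a candidate change to assess for new hazards.
print(change_analysis(planned, actual))
```

Each difference the comparison surfaces is then assessed as a potential hazard before training continues, exactly as the examples above (equipment substitutions, location changes, class-size changes) suggest.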


2. What If Tool (WIT) Asking the question, “What If?” may possibly identify
additional hazards not even considered by the in-depth ORA. To use the WIT,
assume that Murphy's Law applies. Murphy's Law states, “What can go wrong, will
go wrong.” Remember to consider possible mistakes or problems. Look at the
worst-case scenario, even if it does not seem likely to happen. Also, consider the
mistakes or problems that are much more likely to happen, but may not be as
detrimental. “What-If” questions can be formulated around human errors, process
upsets, and equipment failures. These errors and failures can be considered during
normal operations and during training activities. The questions could address any of
the following situations:

a. Failure to follow procedures


b. Operator inattentive or operator not trained
c. Equipment failure
d. Instrumentation calibrated wrong
e. External influences such as weather, fire
f. Combination of events such as multiple equipment failures

Experienced personnel are knowledgeable of past failures and likely sources of
errors. That experience should be used to generate "What-If" questions.

4202. APPLY THE 5-STEP PROCESS

Time-Critical Risk Management will suffice only when the risk is low. It is used when
there is no need to develop a written Risk Assessment for an evolution, such as
would be required for a Deliberate or In-depth level of ORM (refer to Chapter 3,
Section 3300 for more information on the In-depth ORA and the 5-step ORM
process). It is also particularly helpful in choosing the appropriate course of action
when an unplanned event occurs during the execution of a planned operation or
training course.

The Time-Critical level of Operational Risk Assessments is employed by experienced
personnel to consider risk while making decisions in a time-compressed situation. In
the steps of Risk Management, identify the hazard(s), make an assessment by
examining probability and severity associated with the hazard, and use the Risk
Matrix to determine a Risk Assessment Code (RAC). Next, the instructor must make
a risk decision. Only instruction that is not of a high-risk nature is suitable for an
instructor-applied five-step process. If the risk is low and the
decision is to continue training, then the instructor must implement some form of
control and supervision to reduce the risk. This has already been done for the
instructor in the In-Depth ORA worksheet. However, changes do occur and that is
when the instructor needs to apply a Time-Critical ORA. Clearly, the assignments of
risk are subjective; different people may assign different values. The point is to
increase Situational Awareness so that a mishap or incident is more likely to be
avoided.
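The decision flow described above -- assess probability and severity, derive a Risk Assessment Code (RAC), then make a risk decision -- can be sketched as follows. The category names, the scoring rule, and the low-risk threshold are illustrative assumptions for this sketch, not the official Risk Matrix or RAC values.

```python
# Illustrative sketch of the Time-Critical ORA decision flow. The category
# lists, the scoring rule, and the RAC threshold below are hypothetical
# assumptions, not the official Risk Matrix or RAC tables.

PROBABILITY = ["unlikely", "may occur", "probable", "likely"]
SEVERITY = ["negligible", "marginal", "critical", "catastrophic"]

def assess_hazard(probability, severity):
    """Derive an illustrative RAC: 1 = highest risk, 5 = lowest risk."""
    score = PROBABILITY.index(probability) + SEVERITY.index(severity)  # 0..6
    return max(1, 5 - score)

def risk_decision(rac, low_risk_threshold=4):
    """Continue only when the risk is low; otherwise cease training."""
    if rac < low_risk_threshold:
        return "cease training"
    return "implement controls, supervise, and continue training"

# A likely, catastrophic hazard yields the highest-risk RAC.
print(risk_decision(assess_hazard("likely", "catastrophic")))  # cease training
```

As the text notes, the assignments of risk are subjective; the value of walking through the steps, even informally, is the increase in Situational Awareness.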


SECTION 4300. PREPARE FOR INSTRUCTION

The preparation portion of the Implement Phase involves all those activities that
instructors and support personnel must perform to get ready for delivering the
instruction. To maximize the transfer of knowledge and the development of skills
by the learner, instructors must rehearse the lesson, prepare instructional
materials, and prepare the instructional environment. This is accomplished by
organizing the instructional material and environment in a manner that promotes
the smooth exchange of information between the instructor and the students.
Prior to conducting instruction, instructors should think about how to influence the
following: transfer of knowledge and skills, the instructional environment, delivery,
facilitation techniques, use of media, and questioning techniques.

4301. INSTRUCTIONAL ENVIRONMENT

Prior to delivering instruction, the instructor must prepare the instructional
environment for an organized and smooth presentation to maximize the transfer of
knowledge and skills. The instructional environment refers to the instructional
setting (classroom), media/equipment, support personnel, student materials, and
the administrative functions the instructor must perform.

1. Prepare Instructional Setting (Classroom) The instructor must
ensure that the instructional setting replicates the job setting as much as possible.
This is achieved by organizing and placing required equipment or supplies, as they
would be in the job setting. The instructor must also ensure that the instructional
setting is conducive to learning. This is accomplished by ensuring the following:

a. Lighting and ventilation are adequate, media equipment is
accessible, and the climate control is functioning properly.

b. Chairs and desks are available for each student.

c. Unnecessary distractions are removed.

d. If an outdoor area is to be used, the instructor must survey the
area to ensure it can be prepared per the specific lesson plan and local
Standing Operating Procedure (SOP). An alternate site should be
designated in the event the primary site cannot be used.

e. Ensure that all ORM and safety considerations have been
addressed.

2. Prepare Media/Equipment The instructor must gather and set up all the
instructional equipment and media required for the presentation of the lesson.
Equipment can include items such as Digital Video Disc (DVD) players, Liquid Crystal
Display (LCD) projectors, computers, etc. Media can include board media
(chalkboards, dry erase boards, etc.), established media (actual item/object, printed
materials, etc.), computer media (Computer-Based Tutorials (CBT), Interactive
Media Instruction (IMI), etc.), and multimedia (computer aided graphics, audio,
video, etc.). Equipment and media preparation should include a review of the
following requirements:

a. All the required equipment is operational. If the equipment
cannot be repaired or replaced, alternate media and
equipment must be obtained.

b. The media must be easily seen and heard from any part of the
instructional area.

c. The media are in good condition. The media are appropriate to the
subject matter and target audience.

3. Brief Support Personnel Support personnel include assistant instructors,
demonstrators, role players, Corpsmen (when applicable), and any other personnel
who will be involved in the presentation or support of instruction. The instructor
must brief support personnel so that each person’s role is clearly understood.
Additionally, the learning objectives of the lesson and any needed preparations for
instruction must also be briefed.

a. The primary instructor is responsible for ensuring that all
personnel are informed when to meet. Some personnel may need
to be at the instructional area early to secure and set up
equipment or to have student materials in place prior to the start
of the class.

b. Demonstrators should be briefed on their roles and, if time
permits, a walk-through of the demonstration should be conducted
prior to instruction.

4. Prepare Student Materials The instructor must ensure that all materials
required by the students are available, in good condition, and ready to be
distributed. These may be student outlines (primary document that supports the
instruction) or supplemental student materials (something other than the student
outline that is retained by the student after instruction).


5. Perform Administrative Functions There are several administrative
functions the instructor must address prior to implementation of instruction. The
following is a list of some of these administrative actions:

a. Verifying the time and location of the class.

b. Obtaining the class roster.

c. Making arrangements for monitor/visitor seating in accordance with local
SOP.

d. Ensuring appropriate security or safety measures have been followed.

e. Preparing all administrative paperwork for presentation.

6. Personal Appearance One of the last things to do before "stepping on
the platform" is look in the mirror to check personal appearance. Whether military
or civilian, an instructor must make sure that his/her attire is neat and
professional. There is nothing worse than an instructor who appears before a
class looking sloppy and unkempt, which in most situations distracts the learners’
attention from the material.

4302. REHEARSALS

Most people perform best when they are well prepared. The success of any
presentation is a direct result of the amount of work that went into preparing it.
Rehearsal is the process in which an instructor practices delivering his/her lesson.
Rehearsing the lesson will reveal the most effective wording, enhance the
instructor’s knowledge of the subject matter, ensure a smoother flow of the
presentation, and increase the chances for success. Rehearsal also provides the
instructor a gauge of how his or her delivery fits the allocated time for the lesson.


1. Types of Rehearsals The three types of rehearsals are: individual, small
critical audience, and dress rehearsal. Each of these can stand alone; however,
preparation is maximized when they are all conducted in sequence.

a. Individual The individual rehearsal requires the instructor to practice
delivering the material alone. Individual rehearsals can take place anywhere,
anytime, and at the convenience of the instructor. Some instructors rehearse on
their way to work in their car, in the shower, or while watching television. It is
recommended to videotape individual rehearsals when possible.

b. Small Critical Audience Upon completion of an individual rehearsal, the
lesson should be presented to a small group of people. Emotional attitudes must
be considered when selecting the audience. Ensure the people selected will provide
constructive feedback. Peers make the best critical audience, but using family
members at least provides an opportunity to rehearse in front of an audience. The
instructor should be thick-skinned enough to accept feedback at face value. Tape
this rehearsal if possible.

c. Dress The dress rehearsal should be the final rehearsal and most
important of all rehearsals. By this point, every effort should have been made to
remove any discrepancies in the lesson. This rehearsal should be accomplished in
the instructional setting that will be used when the actual lesson is conducted.
Rehearse with all media and equipment that will be used on presentation day.
Also, make certain any assistant instructors or support personnel are available to
rehearse during the dress rehearsal. As with the other two types of rehearsals, tape
this if possible.

2. How to Rehearse There are several keys to remember when rehearsing.

a. Avoid Memorization Never memorize the lesson because it will give the
presentation a canned effect that causes the instructor to appear robotic. Know the
outline (conceptual framework), sequence, and the points to be covered, but do not
memorize the lesson verbatim (word for word) from the lesson plan.

Below are some recommendations that can help avoid memorization:

1) Read the lesson plan at least twice and highlight words or key
phrases that need to be emphasized. If anything is unclear, request
guidance from other instructors.

2) Research the technical manuals and references to broaden
knowledge of the subject.


3) Review all supplemental material.

4) Print the media (3 slides per page) and write notes on the right
hand side of the page. The notes can include key phrases from
the lesson, examples, analogies, stories, or anything else that
needs to be mentioned or accomplished when that particular slide
is displayed. If using a turn chart or transparencies, write notes
as well. Once the instructor is comfortable, rehearse without the
notes.

b. Rehearse by Parts If there is any part of the lesson that feels
uncomfortable or needs more practice, rehearse that part separately until confidence
is gained with the material and delivery.

c. Rehearse for Criticism After completing the previous step, rehearse the
lesson for the sake of criticism in front of an audience. This audience should be
fellow instructors or curriculum developers responsible for the development of the
curriculum.

d. Rehearse the Whole Lesson After the instructor rehearses and is
comfortable with the different parts, the lesson should be rehearsed from start to
finish. This is essential to ensure that the lesson flows smoothly; an instructor can
get a false sense of security when comfortable rehearsing only specific parts.

3. Evidence of Rehearsal The following are indicators of effective rehearsal.
It is important to note that a lack of rehearsal may cause students to form
negative opinions regarding the lesson, the instructor's professionalism and
abilities, and the course or instructional program. However, proper rehearsal will
produce the following positive results.

a. Presentation Flows Smoothly If the entire presentation flows smoothly,
it is most likely due to instructor rehearsal. Conversely, if the presentation is
choppy or disjointed, it can be presumed that the instructor did not rehearse
appropriately.

b. Instructor Appears Knowledgeable When an instructor appears
knowledgeable about the subject matter, it is evidence of rehearsal.

c. Instructor Appears Comfortable The next consideration is whether or
not the instructor appears comfortable in the classroom. The instructor should
know where all the equipment and media are located and the presentation should
not be interrupted because the instructor could not operate the equipment or
media. If the instructor appears relaxed while delivering the presentation, then he
or she most likely spent enough time rehearsing.

d. Time Limit Further evidence of rehearsal is the effective delivery of the
instruction within the time allocated. If the instructor remains within the time
limit, then it is most likely due to rehearsal.


4304. INSTRUCTOR EVALUATION

Evaluation of instructors for the purpose of improving the quality of training is an
ongoing process. All instructors should welcome the opportunity to be evaluated by
others. Through this evaluation process, the instructor will receive feedback on
strengths as well as those areas that need improvement.

1. Types Two types of instructor evaluations are conducted: content and delivery.
Content evaluations are normally conducted by occupational field subject matter
experts to verify the content qualifications of the instructor. Seasoned instructors
who have completed training at the Instructional Management Schools evaluate the
instructor’s ability to effectively deliver the training. Schools should contact the
delivery experts at the IMS for specific delivery evaluation support. Further, school
and detachment commanders can request a Curriculum Assistance Visit (CAV) from
CG, TECOM (Training Management and Evaluation Section). The CAV team provides
expert consultation on all aspects of the curriculum and instruction.

2. Scheduled or Unscheduled Evaluations may be conducted on a scheduled
or unscheduled basis. Each method of evaluation has its advantages and
disadvantages. A scheduled evaluation allows the instructor to prepare for the
evaluation. It may also allow the instructor time to prepare a “show” that is not
typical of usual performance. An unscheduled evaluation permits the evaluator to
observe the instructor in a normal mode, which can result in a more realistic
appraisal of the instructor. The drawback to an unscheduled evaluation is that an
instructor may feel threatened and fail to perform at normal levels. Whether the
evaluation is scheduled or unscheduled, the instructor should never deviate from his
or her usual performance for the benefit of the evaluator.

3. Preparing for Evaluation Instructors need to always be prepared for an
evaluation, because they are always being evaluated when they instruct--by their
students. Instructors should always view the evaluation process as an opportunity to
gather information that will help them become more effective as instructors. A
preliminary meeting with the evaluator will aid the instructor in preparation for the
evaluation. The evaluator should answer any question the instructor may have and
should provide the instructor with a copy of the instrument(s) being used during the
evaluation.


SECTION 4400. CONDUCT INSTRUCTION

The effective and efficient delivery of instruction is a key point in the SAT process.
Although the curriculum developer may have designed and developed the material
so that it would maximize the transfer of learning, it is crucial that the instructor
present the lesson in a manner that ensures comprehension and on-the-job
application. While comprehensive planning and preparation early in the Implement
Phase are necessary, they do not guarantee success. The instructor must
communicate effectively, conduct the lesson, and manage the classroom during
and after the presentation.

4401. EFFECTIVE COMMUNICATION

How an instructor presents information can influence student understanding,
retention, and ultimately, on-the-job performance. In conducting instruction, the
instructor should adhere to the following effective communication guidelines to
ensure the maximum transfer of knowledge and skills to the students.

1. Communication Process Communication is the act of sending and
receiving messages and providing feedback on those messages. The messages can
be verbal, nonverbal, written, or physical--even a lack of action can be a message.
Communication is an on-going process; however, it is incomplete if the person with
the message does not have a person to receive the message. Therefore,
communication is always an exchange between two or more people. In Figure 4-1,
the communication model "freezes" the process so that what happens during
communication can be examined.
[Figure: the Sender and the Receiver, linked by Message/Feedback, exchange information through the Communication Process.]

Figure 4-1 Communications Model


2. Communication Techniques The communication techniques that
instructors must skillfully employ in the classroom are: verbal, nonverbal,
listening, and questioning. These techniques dramatically affect the transfer of
learning and the instructor's ability to maintain student attention.

a. Verbal There are eight speech techniques that instructors must be cognizant
of when speaking.

1) Volume Volume is the loudness or softness of a speaker's voice. Be sure to
adjust your voice to the acoustics of the room, the size of the audience, and
the level of background noise. If an instructor speaks too loud, he or she
could be perceived as overbearing. If an instructor speaks too softly,
students will have difficulty hearing the material and may perceive the
instructor as timid or unsure of the content. Remember that the speaker's
voice always sounds louder to the speaker than to a listener. If students look
puzzled, are leaning forward in their seats, or are otherwise straining to hear,
then the instructor needs to talk louder.

2) Rate Rate involves the speed at which a person speaks. The best rate of
speech depends partly on the mood the speaker is trying to create. If a
person wanted to communicate the experience of mastering the Crucible or to
express the excitement upon graduation from boot camp, then a faster-than-
normal rate may be used. If speech is too slow, it may put students to sleep.
If too fast, students may lose track of the ideas that the instructor is trying to
convey. Change the rate of delivery to get students' attention and to hold
their interest. The rate of speech should be governed by the complexity of
the subject and the emotion to be expressed.

3) Dialect Most languages have dialects, each with a distinctive accent,
grammar, and vocabulary. Dialects are usually based on regional or ethnic
speech patterns. These dialects affect the way people talk in different parts
of the country. For example, in the southern U.S., parents may tell their
children to stop "squinching" their eyes while watching television and to go
clean up their rooms "rat" now. There is no such thing as right or wrong
dialect. However, it can be troublesome for the instructor when the audience
does not share that dialect. In such a situation, this may cause listeners to
make negative judgments about the speaker's personality, intelligence, and
competence. Even worse, students may not be able to understand the
material being taught.

4) Pronunciation Pronunciation is the accepted standard of sound and rhythm
for words in a given language. Below are some of the most frequently
mispronounced words in the English language:


Word        Common Error   Correct Pronunciation
genuine     gen-u-wine     gen-u-win
arctic      ar-tic         arc-tic
nuclear     nu-cu-lar      nu-cle-ar
February    Feb-u-ary      Feb-ru-ary

Every word leads a triple life: it is read, written, and spoken. Most people recognize
and understand many more words in reading than they use in ordinary writing and
about three times as many as occur in spontaneous speech. This is the reason for
occasionally stumbling when speaking words that are part of reading or writing
vocabularies. In other cases, commonplace words may be mispronounced out of
habit. If there are any doubts about the proper pronunciation of certain words, check
the dictionary or listen to someone say it properly.

5) Articulation Articulation is the delivery of particular speech sounds. Sloppy
articulation is the failure to form particular speech sounds distinctly and
carefully. Most of the time poor articulation is caused by laziness. Words are
habitually chopped, slurred, and mumbled, rather than enunciating plainly.
Though it is known that "let me" is not "lemme,” "going to" is not "gonna,”
and "did you" is not "didja,” yet we persist in articulating these words
improperly. If sloppy articulation is used, work on identifying and eliminating
common errors so that thoughts and ideas can be effectively expressed to
students.

6) Force Use force by emphasizing the correct word or syllable. Placing emphasis on different words or syllables can change the meaning of a sentence. Practice placing emphasis on the underlined word in the following sentences: Why did you join the Marine Corps? Why did you join the Marine Corps?

7) Inflection Inflection refers to changes in the pitch or tone of a speaker's voice. It is the inflection of the voice that reveals whether a question is being asked or a statement is being made, or whether a person is being sincere or sarcastic. Inflections can also make a person sound happy or sad, angry or pleased, dynamic or listless, tense or relaxed, interested or bored. If all sentences end on the same inflection (upward or downward), work on varying pitch patterns so they fit the meaning of the words. Inflection is one of the keys to expressing something emotional, persuasive, or convincing. Using inflection can make the difference between just saying words and making ideas meaningful.

8) Pause Learning how and when to pause is a major challenge for instructors.
Even a moment of silence can seem like an eternity. As confidence is gained,
however, it will be discovered how useful the pause can be. It can signal the
end of a thought, give students a chance to absorb the material, give a
speaker an opportunity to concentrate on the next point, and lend dramatic
impact to a statement. Unfortunately, many times pet words are used in place
of a pause, such as "um," "OK," "er," and "uh." These can become extremely
annoying and distracting to students. To minimize the use of pet words, be
familiar with the material, be well rehearsed, and make a conscious effort to
use a natural pause in its place.


b. Nonverbal Communication (Platform Behavior) Communication is not complete without the nonverbal signals that complement verbal communication. The factors of posture, movement, nervousness, gestures, facial expressions, and eye contact can contribute to, or hinder, the communication process.

1) Posture Posture is very important; it shows enthusiasm for the subject. Posture refers to platform stance, which should be comfortable without being slouchy. Do not lean on the lectern; in fact, it is best to stay completely away from the lectern in classroom instruction. Remember to stand erect with confidence.

2) Movement Move with a purpose. Is movement excessive? Is there a reason for movement? Movement can attract the attention of the listener. Move to convey a thought or as an aid in transitions. The basic rule in the use of movement is moderation. Avoid: moving constantly, staying anchored to the podium, standing in one spot, blocking media, dragging feet, and swaying back and forth.

3) Nervousness Some nervousness or anxiety is natural and normal. Nervousness can, however, cause poor voice techniques and distracting mannerisms.

Overcome nervousness by:

 Focusing on student learning.

 Rehearsing the lesson.

 Having a positive mental attitude.

 Relaxing and enjoying teaching.

 Being organized.

4) Gestures Gestures are the motions of the instructor's hands or arms. The primary rule is this: the gestures made should not draw attention to the instructor or distract from the message. Gestures should appear natural and spontaneous, help to clarify or reinforce ideas, and be suited to the audience. Gestures tend to work themselves out as experience and confidence are acquired. Avoid: flailing the arms about, rubbing hands, cracking knuckles, slapping legs, toying with rings, or any other distracting motions. Think about communicating with students, and gestures will take care of themselves just as they do in conversation.

5) Facial Expressions Facial expressions can reinforce, modify, or even contradict the spoken word (showing an instructor's thoughts and feelings). Instructors who appear relaxed and express enthusiasm for the subject create a bond with their students and make them feel comfortable (e.g., a smile indicates pleasure). Expressionless instructors are usually unprepared or nervous, focusing too hard on their delivery vice their students, uninterested in the subject, or not attempting to make learning fun.


6) Eye Contact The use of the eyes is probably the most meaningful channel
of nonverbal communication available. An instructor’s eyes convey thoughts
and feelings and can open communication, prolong communication, or cut off
communication. As eye contact is established, remember to:

 Be alert Be alert for student reactions. Can they hear? Do they understand? A stare used in conjunction with silence can be quite useful in gaining the attention of misbehaving or inattentive students.

 It isn't enough to just look at listeners How the instructor looks at students also counts. A blank or intimidating stare is almost as bad as no eye contact at all.

 Try to establish eye contact with the whole class Some common errors
are darting the eyes around the room, looking at the floor or demonstrators
vice the audience, or looking at one part of the audience while ignoring the
rest. The rule of thumb is to hold the eye contact until communication
occurs.

c. Listening Look at Figure 4-2: notice that on an average day, 9% of our time is
spent writing, 16% is spent reading, 30% is spent speaking, and the major portion,
45%, is spent listening. Listening takes in more information and is used more than
reading and writing combined.

[Figure 4-2 is a bar chart showing the percentage of an average day spent on each communication activity: Writing 9%, Reading 16%, Speaking 30%, Listening 45%.]

Figure 4-2


1) Definition Listening is paying close attention to and making sense of what is being heard. It is the channel used most often for learning. Ironically, it is the least understood function of all. When thinking about listening, the tendency is to assume that listening is basically the same as hearing. This is a dangerous misconception because it leads many to believe that effective listening is instinctive. As a result, little effort is made to learn how to develop listening skills, and a vital communication function is unknowingly neglected. Consequently, misunderstandings, confused instructions, loss of important information, and frustration are created.

2) Exercises There are exercises that can be performed to increase awareness of listening efficiency. In a common exercise, the individual who will be the listener picks a person as the speaker and asks that person to help with a listening check: the listener listens to the speaker and then summarizes what he or she thinks the speaker said. If the listener is unable to summarize, the speaker helps examine what is lowering listening efficiency. Another exercise is simply writing down all the sounds heard in a certain time frame. Over a period of time, listening practice should help improve listening efficiency and two-way communication in the classroom.

(a) Instructor Barriers to Listening As instructors, be aware of signals that give students the perception that you are not listening to them. These barriers interrupt the communication process, as the model below depicts.

[Diagram: the communication process, in which the sender (student) transmits a message to the receiver (instructor) and receives a message/feedback in return; instructor barriers interrupt this exchange.]

Figure 4-3 Instructor Barriers to Listening


 It is important for instructors to orient their body towards the speaker (student) and maintain eye contact when answering or receiving a question.

 Folded arms or hands on hips are examples of body language or gestures that can indicate a lack of interest in the speaker or may intimidate the speaker (student).

 Rolling the eyes is an example of an instructor facial expression that may signal disapproval or disinterest.

 Instructors should not allow emotionally laden words to distract them during questions. Examples: a student uses the word sex vice gender, or WM vice female Marine.

 Instructors should avoid using words or phrases that may have a negative
effect on students when directed by instructors/fellow students such as:
rock, idiot, stupid, lost one, wonder child, you fail to understand, you do
this all the time, or are you confused again?

(b) Student Barriers to Listening An instructor must be aware of the possibilities that cause student barriers to listening. Below are common causes of poor listening, along with signs that will cue the instructor. Recognizing these signs will assist tremendously with identifying the barriers and help minimize the interruption of the communication process.

(1) Lack of Concentration The brain is incredibly efficient. Although we talk at a rate of 120-150 words per minute, the brain can process 400-800 words a minute. This would seem to make listening very easy, but it actually has the opposite effect. Because the brain can take in a speaker's words and still have plenty of spare "brain time," there may be the temptation to give in to physical or mental distractions. Concentrating is hard work.

 Signs: Lack of eye contact with the instructor, tapping a foot or pencil, fidgeting, doodling, clock-watching, inability to answer questions, a look of confusion, or lack of involvement in class discussions.

(2) Listening Too Hard Listening too hard happens when a student tries to turn into a human sponge, soaking up a speaker's every word as if every word were equally important. Students try to remember all the names, all the dates, and all the places. In the process, students often miss the speaker's point by concentrating on too many details. Even worse, they may end up confusing the facts as well. It is impossible to remember everything a teacher says.

 Signs: Student is frantically trying to write down every word; seems frustrated, confused, or overwhelmed.


 Suggestions: Tell the student to try highlighting the student outline, recording the class, and/or developing note-taking skills. The student should take notes in the form of a key-word outline: a rough outline that briefly notes a teacher's main points and supporting evidence. Students who take effective notes usually receive higher grades than those who do not.

(3) Jumping to Conclusions This may also be referred to as "putting words into an instructor's mouth." It is one reason why communication is poor between those who are close. A person does not listen to what is being said because he/she thinks he/she already knows what is meant. Another way of jumping to conclusions is by prematurely deciding that a topic is boring or misguided. The student may decide that an instructor has nothing valuable to say. For example, the topic could be arguments to support women being in combat; if a student disagrees with the precept, the instructor may be tuned out. Nearly every class has something to offer - whether it is information, a point of view, or a technique.

 Signs: Interrupting other students, lack of enthusiasm, disruptive behavior, or lack of concentration.

(4) Focusing on Delivery and Personal Appearance People tend to be judged by how they look or speak. Some people become so distracted by a speaker's accent, personal appearance, or vocal mannerisms that they lose sight of the message. Focusing on a speaker's delivery or personal appearance is one of the major barriers in the communication process, and it is something that always needs to be guarded against.

 Signs: Disrespectful to the instructor, know-it-all attitude, distracting behavior.

 Suggestions: Apply the speaking techniques discussed earlier, use class management techniques, rehearse the lesson, and maintain high appearance standards.

d. Questioning By asking questions throughout the lesson, instructors can emphasize a teaching point, monitor student comprehension, stimulate thinking, increase interest, and promote student participation. Instructors tend to ask questions in the "knowledge" category 80% to 90% of the time. These questions are not bad, but using them all the time is. Instructors should try to use higher-order questions as defined by Dr. Bloom in Chapter 6. Questions that cause the learner to process, synthesize, and apply the knowledge presented during the instruction lead to better comprehension and application.

(1) Characteristics of a Well Constructed Question

 Clear - state questions in language familiar to the students and phrase the
question so that the students understand its meaning.


 Concise - contains only one idea and is short enough for students to
remember (not too wordy).

 Relevant - relates to the subject or material taught in the lesson.

 Thought Provoking - stated so that the answer is not suggested in the question; open-ended (cannot be answered with a yes or no response); the answer must not be displayed in the classroom (media); apply Bloom's Taxonomy (range of higher-level questions) as discussed in Chapter 6.

(2) Asking students questions

Step 1 - ASK the question (call students by name). Ensure the question
is well constructed.

Step 2 - PAUSE to allow the student time to think. If the student cannot answer, rephrase the question or redirect the question to another student. For example: "Can someone help him/her out?" or "Sgt Smith, can you help him/her out?" Once the question has been answered, move to the next step.

Step 3 - ENSURE EVERYONE HEARD the answer. For example, "Did everyone hear his/her answer?"

Step 4 - PROVIDE FEEDBACK. Inform the class whether or not the answer was correct. For example: "That's right" or "Good job." Avoid saying "wrong answer"; try to rephrase your response. For example: "That wasn't quite what I was looking for; can someone help him/her out?" or "Does everyone agree with that?" or "Does anyone have anything to add to that?" If no one can answer the question, provide the answer and clear up any confusion.

(3) Receiving questions from students The situation will dictate whether
or not Steps 2 and 3 are necessary. Therefore, both steps are left up to the
discretion of the instructor (optional).

Step 1 - RECEIVE the question. Ensure students raise their hands and
solicit one student at a time (by name).

Step 2 - REPHRASE. If the question is unclear, rephrase it or have the student rephrase it. If the instructor rephrases the question, verify the student's question before moving to the next step. For example, "Let me make sure I understood your question. You wanted to know if we are off this weekend. Was that your question?" (OPTIONAL STEP)


Step 3 - ENSURE THE QUESTION WAS HEARD. State, "Did everyone hear SSgt Hall's question?" If the question was obviously loud enough for everyone to hear, this step may be skipped. If it was not loud enough, repeat it (paraphrase if needed) or have the student repeat it. (OPTIONAL STEP)

Step 4 - ANSWER the question. The instructor can either answer the
question or redirect the question to the entire class to allow for student
participation. For example, "That's a good question, can anyone answer
it?" If it cannot be answered then provide the answer. If the instructor
does not know the answer, let the student know that he/she will find out
and get back with him/her at the break or after class.

Step 5 - VERIFY. Ask the student if the answer provided was adequate. For example: "Did that help you out?" "Did that clear up any confusion?" or "Did that answer your question?"

3. Facilitation Techniques Transfer of learning refers to the extent to which students can readily apply on the job the material/skills learned in the instructional setting. The instructor influences the transfer of learning through facilitation techniques, and the way a lesson is presented will influence the success of the instruction. The instructor should strive to provide real-world relevance, maintain student focus, control the lesson, apply motivation techniques, and interact with students. Each is discussed below.

a. Real World Relevance Whenever possible, maximize the similarity between the instruction and the job situation to show relevance. The instructor can also physically organize the instructional environment to create a realistic job setting for instruction.

b. Student Focus The most common attention-getting techniques used by instructors are:

1) Direct Attention Essentially, this consists of directing students' attention to what was said or will be said through the use of verbal statements, gestures, or even a pause. For example: "Know this diagram well!" A combination is even more effective, but be careful not to overuse these techniques.

2) Present Concepts from Simple to Complex Discuss basic principles and ensure they are understood before introducing complicated details.


c. Control the Lesson Ensure the objectives of the class are met and that the
discussion/questions do not go beyond the focus of the class. In addition, create a
comfortable learning environment and use discretion/tact when correcting a student's
inappropriate or disruptive behavior so that it is not detrimental to the learning
environment.

d. Motivation Techniques For learning to be effective, students must be motivated to learn. Motivation is a shared responsibility between the instructor and the student: the learner controls the desire to learn, and the instructor controls the stimulation. Below is a list of what instructors can do to stimulate that motivation in students.

1) Give Recognition When students do something worthy of recognition, instructors need to give positive feedback to the student. Such recognition makes the student feel alive, important, and significant.

2) Serve as a Good Role Model An instructor has considerable influence on the students' motivation through the example given. Show them the proper way to complete a task, wear a uniform, or treat students: PRACTICE WHAT YOU PREACH. Research indicates that teachers with low self-concepts tend to have students in their classroom with lower self-concepts, and vice-versa.

3) Stimulate Cooperation Among Students Modern society places a lot of emphasis on competition. While competition with the self can lead to improved performance as students strive to do their best, competition against others can result in negative perceptions of the self, especially if it isolates a person. With cooperation, everyone can experience the success of the group, and no one is viewed as the winner or loser.

4) Consider Mastery Learning Mastery is defined in terms of a specific set of major objectives that students are expected to exhibit by subject completion. Using this approach, a student's performance is measured against the objectives rather than against the performance of other students. Students learn at different rates; therefore, the instructor sets expectations for each individual. This allows the time for learning to vary, so all or almost all students achieve the desired level of mastery.

5) Have High but Reasonable Expectations for Students A considerable amount of research suggests that students perform up to the expectations that instructors have for them. Students grow, flourish, and develop better in a relationship with someone who projects an inherent trust and belief in their capacity to become what they have the potential to become.

6) Recognize Potential in Students Behavioral scientists have concluded that humans function at 10 percent or less of their potential. Negative self-concepts certainly stand in the way of releasing the potential of students.

7) Provide Examples and Analogies Providing a variety of examples and analogies when teaching concepts or skills will help solidify the key elements of the material and can further motivate students to learn.


8) Recognize Individual Differences As discussed in Chapter 6 (Adult Learner), some students learn at a slower pace than others, and some students require different stimuli to become motivated to learn. The instructor must establish an effective student-instructor relationship. It is important that the instructor does not create barriers, but builds rapport with the students and shows empathy and genuine concern for their learning.

9) Provide Feedback Student performance improves when the instructor provides meaningful feedback on performance. Timely and constructive comments about student performance provide recognition of students' efforts and help to correct errors. Used appropriately, feedback should specify clearly the action being reinforced and should be believable. Examples: "Good point!" "Outstanding." "Sgt Frost, that's a good idea! Let's discuss what might happen if you implemented that concept." Provide written comments on student assignments about the strengths and weaknesses of the student's ideas/concepts. But be cautious with praise: if it is used too often or inappropriately, it can have a negative effect on the motivation of adult learners.

e. Interaction with Students Learning is an active process for adult learners. The instructor should strive to involve students in the instruction process. To do so, the instructor should be aware of students' prior knowledge, the context in which the material is presented, and how learning will be applied to the job, with the realization that student understanding of new information depends on how well it relates to their prior knowledge. Probe throughout the lesson to increase interaction. Have students answer each other's questions whenever possible, and allow the adult learner every opportunity to take responsibility for his or her own learning.

4402. STEPS IN CONDUCTING THE LESSON

1. Present the Introduction (STEP 1) The instructor provides the students with a brief preview of the class by explaining the purpose of the class, reviewing the learning objectives with the students, explaining how the lesson will be taught (including mentioning the administrative instructions), and stating how and when students will be evaluated.

a. The first element (Gain Attention and WIIFM - What Is In It For Me? Why do I need to listen to this class?) must always be mentioned first, and the remaining elements should be mentioned as a structured event using the acronym GOLMEST (Gain Attention, Overview, Learning Objectives, Method/Media, Evaluation, Safety, and Transitions). By employing this sequence, your students will become familiar with the important elements of the introduction, and this will help reduce the number of student questions that always seem to pop up about the introduction.

b. The introduction must be completed prior to transitioning into the body of the lesson.


2. Present the Body (STEP 2) After presenting the introduction, present the body of the lesson. The body will be presented in the same sequence as the learning objectives in order for the lesson to "flow smoothly."

a. Transitions tie together the main ideas in the lesson, smoothly summarizing one main idea and introducing the next one. They essentially form "bridges" that reinforce the conceptual framework, enabling the instructor to probe for understanding and gather feedback from students before opening the next main idea.

b. The term "probing" simply means asking follow-up questions to students. Probes can ask for specifics, clarifications, consequences, elaborations, parallel examples, relationships to other issues, or explanations. Probes are important because they help students explore and express what they know, even when they aren't sure they know it. You should probe throughout the lesson to assess students' comprehension of the material. You can probe at any time, but the questions must be thought provoking and should not be simple "recall" questions ("recall" is memorization of the subject without displaying comprehension) as discussed in Section 4301 (Effective Communication).

3. Present the Summary (STEP 3) Once finished with the last main idea, transition into the summary. In the summary, the instructor must mention all the main ideas that were covered in the lesson. In addition, provide closure that explains why the student just sat through the lesson. Then provide closing instructions to alleviate any concerns the student may have (e.g., fill out IRFs and take a ten-minute break).

4403. METHOD EMPLOYMENT

The definition of instructional methods is "an educational approach for turning knowledge into learning." Instructional methods are the "how to" in the delivery of training. The methods used in any learning situation are primarily dictated by the learning objectives decided upon by the course developers. In many cases, a combination of methods is used to intensify the learning experiences. All instructors need to understand the following methods and their responsibilities in using them: lecture, indirect discourse, demonstration, reading, self-paced, questioning, non-directed discussion, guided discussion, practical application, field trips, simulations, case study, and coaching. The lecture method and the demonstration method are the two most commonly used in Marine Corps training. However, for purposes of this chapter, the methods are discussed as sequenced above.

1. Lecture (Formal, Informal, Briefing, Guest) The lecture method is an instructional presentation of information, concepts, or principles. Its main purpose is to present a large amount of information in a short period of time. The lecture method is an efficient way to introduce a new topic of study or present background material students need for future classes.


a. A formal lecture allows instructors to present a subject to a large audience because no media are used and there is no interaction between the students and the instructor. The lecture method depends primarily on student listening and note-taking skills for the transfer of learning. The instructor must have effective speaking skills, an in-depth knowledge of the subject matter, and realistic examples and analogies to use with explanations. In preparing to deliver a lecture, the instructor must set clear-cut goals and objectives. The instructor should remember that the only feedback received from the audience will be nonverbal communication. Since the audience may lose interest with no active part in the instruction, the lecture should last no more than 30 minutes. Lectures should be short, well organized, and to the point.

b. In the informal lecture, the size of the group is usually smaller than the
formal lecture and student participation develops when the instructor questions the
students or they question the instructor on points presented. Considerable verbal
interaction between instructor and student is often possible in the form of both
questions and discussion. The delivery style is even more conversational, with
students often addressed directly by name. An informal lecture with media is
commonly used in the Marine Corps for presenting information, concepts, and
principles. Most learning takes place through the sense of sight. It follows then that
all students must be able to see the media being used, which will limit class size.
The media used can reduce the amount of explanation time required for students to
grasp concepts, structures, and relationships. Instructors simply cannot get some
ideas across to students without the use of media. For example, think how difficult
an explanation of the operation of the internal combustion engine would be without
the use of media.

When using informal lecture with media, the instructor must prepare properly.
That includes practicing with the actual media in the places they will be used.
Instructors should plan the timing of the use of media to keep the students'
attention and to stress important points. Since the instructor’s explanation of
the media will require the use of effective instructor techniques, he/she needs to
decide which ones to use. Mentally rehearse those techniques and practice
using the media until the lecture can be presented smoothly.

c. A briefing is a formal or informal presentation in which a variety of significant facts are presented as concisely as possible. The briefing is rarely concerned with material beyond the knowledge level and is almost always accompanied by media in various forms. Strictly speaking, the briefing is not a teaching method, but it is sometimes used in school situations.

d. A guest lecture is a presentation by a person other than the instructor, usually an expert. It is used to give variety to the class period or to supply information in an area where the instructor is not an expert.


2. Indirect Discourse (Panel Discussion, Dialogue, Teaching Interview) These presentational methods provide situations in which the skill or material to be learned is in some way presented to or demonstrated for the learner. In some presentational methods there is little if any activity or interaction required of students other than their attention and desire to learn. When a question-and-answer period follows the interview, students can interact with the expert.

a. A dialogue is an interaction between two or more persons, one of whom may be the instructor. It is generally used to present sharply opposing points of view for students. The dialogue is often highly structured toward preplanned goals and may take the form of questions and answers between the participants.

b. A panel is a structured or unstructured discussion between two or more experts (generally excluding the regular instructor) presented in a variety of ways, such as constructive arguments followed by debate, responses to questions from the instructor or the students, a preplanned agenda, a fixed or a random order of speakers, or free discussion.

c. A teaching interview is a presentation in which the instructor questions a visiting expert, following a highly structured plan that leads to the educational objectives. The advantage of the teaching interview over the guest lecture is that the instructor controls the expert's presentation. The expert normally requires little or no advance preparation, but responds on the spur of the moment from general experience.

3. Demonstration The basic, and most often used, method of instruction for
teaching skill-type subjects is the demonstration method. It covers all of the steps
the students need to learn a skill in an effective learning sequence. Though it
primarily appeals to auditory and visual learners, it is also extremely effective when
used in conjunction with lecture and prior to practical application. This method
always includes a demonstration step and a performance step and allows other steps
as needed. Use the following techniques when giving an actual demonstration:

a. Position the students and media properly Direct the students to gather
around a worktable or media and make sure every student has an unobstructed
view. Make sure that all students will be able to see and hear the demonstration.
This should be accomplished right before the lesson; someone else may have used
the classroom and rearranged the setting. A demonstration will not be effective if
someone cannot see it.

b. Show and explain the operations Perform the operations in step-by-step order. Whenever possible, present the telling and doing simultaneously. Do not hurry; the instructor should not normally emphasize speed in performing operations or in moving from one operation to another in the demonstration step. Make certain the students understand the first step before proceeding to the second, and so on. Repeat difficult operations. Pause briefly after each operation to observe student reaction and to check student comprehension.

Chapter 4 4-27
Systems Approach To Training Manual Implement Phase

c. Observe safety precautions Rigging a safety line, donning a safety mask, or tagging an electric cable may take a few more seconds, but time is not wasted. Instead, the instructor has impressed the students with the importance of exercising extreme care in dealing with potentially dangerous equipment.

d. Give proper attention to terminology Call each part of the media used
by its proper name each time attention is called to it. Getting students to retain the
correct nomenclature requires more than just mentioning the name. The following
suggestions should prove helpful:

1) List the names of parts.

2) Refer students to any available chart that shows the parts and names of parts.

3) Conduct a terminology drill on the parts of the actual item/object while it is being assembled or disassembled, as appropriate.

e. Check student comprehension carefully Ask questions during the demonstration step that require the students to recall nomenclature, procedural steps, underlying principles, safety precautions, and the like. Watch the class for reactions indicating lack of attention, confusion, or doubt. Do not depend solely upon visual observations.

f. Obtain necessary assistance When teaching skills, such as donning a field protective mask, in which a distinction between right and left is important, use an assistant instructor. Ask the assistant to stand so that the class may see what he or she is doing. Then direct the assistant in performing the activity while observing the reaction of the students.

g. Check equipment and tools The most important items to check are the
equipment and tools that will be used to conduct the demonstration. Ensure all
equipment is functioning properly.

h. Rehearse When the instructor rehearses, he or she needs to perform the actual demonstration. Rehearsing in the mind is not the same as rehearsing by doing. Rehearsal by doing will reveal possible problems. If an assistant is being used, practice with that person as discussed in Section 4200 (Prepare for Instruction).

i. Start simple Remember the law of primacy when performing the demonstration step. Always proceed from simple to complex in logical sequence; show the correct way to perform the steps the first time you demonstrate them. Along with teaching a skill, develop proper attitudes, such as the desire to perform safely and the desire to exercise economy of time and effort.

4. Reading (Books, Periodicals, Microfilms, Manuals, Handouts)

a. Reading assignments for students may include the following printed materials: books, periodicals, microfilms, manuals and regulations, and handouts. This is a very effective and time-efficient method of presenting material, since students can progress at their own pace.

b. However, since individuals read at different speeds, keeping the entire class on schedule can be a challenge. Individual reading is also dependent on the availability of resources. Reading is geared toward individual instruction, and the instructor must be very knowledgeable about the material.

5. Self-Paced (Programmed, Modular, Computer Assisted, Mediated)

a. Programmed instruction is a method of instruction that usually includes a carefully planned sequence of small units of instruction that require the learner to respond to cues and receive immediate feedback. Various media (books, teaching machines, and computers) are used to deliver the programmed instruction to the learner.

b. Modular instruction consists of prepackaged units of instruction that typically contain a clear statement of objectives and all necessary learning resources to permit the learner to achieve these objectives. A module can be a complete unit or part of a course.

c. Computer-assisted instruction is a learning experience that uses a computer as the vehicle for interaction between the learner and the planned course of instruction.

d. Mediated instruction includes such devices as slides, films, tapes, and cassettes used to present the planned course of instruction to the learner.

6. Questioning (Socratic Method, Student Query) For those instructors who want to emphasize a point and stimulate student thinking, this method is very effective. It not only keeps students focused, but also checks their understanding and draws out points that need clarification. Two examples of this method are the Socratic Method and Student Query. Both require a high level of instructor expertise.

a. Socratic Method While rarely seen in its pure form, instruction by asking students questions is a method as old as ancient Greece and as modern as a great books course. The method may resemble a guided discussion, but the goal is often to obtain specific answers to specific questions (reiteration) and not to stimulate discussion. An instructor may use the method to “trap” students into inconsistencies in logic, which sharpens their thinking skills. Law professors often use the method to “interrogate” specific students with a series of questions as they might be used in a court of law.

b. Student Query. “Students asking questions” is often used in combination with other methods such as the lecture, the panel discussion, or the teaching interview, but it can be used by itself, either on a one-to-one basis in tutoring or coaching or as part of small or large groups. The method is student controlled, although a skilled responder can also control the session to a certain extent. Students’ questions may often be a measure of the degree of their understanding of a particular subject. That is, they “know enough to ask the right questions.”

7. Discussion-Non-Directed (Peer Teaching, Small Group, Free Discussion) In its original form, the peer-controlled seminar is a group of highly qualified peers (such as a doctoral-level faculty) who meet periodically for the exchange of ideas, usually in the form of prepared papers with discussion or questions following. The research seminar resembles a peer-controlled seminar when the instructor allows qualified students to lead the discussion with the instructor providing proper supervision. In Professional Military Education (PME), a peer often acts as a “facilitator” to lead discussions or conduct workshops. When this method is used, the instructor should provide a statement of the educational objectives and a suggested discussion guide, and should require some tangible evidence of the results of the discussion.

8. Guided Discussion The guided discussion is an activity in which people talk together to share information about a topic or problem or to seek a solution supported by the available evidence. When using discussion, make sure the seating arrangement allows all participants to have eye contact with each other. This limits class size.

a. This method involves an interchange of ideas by the students while the instructor provides guidance. Used alone or in combination with other methods, it stimulates every student to think constructively. It also encourages students to share their personal experiences and knowledge with their classmates and to contribute ideas as a means of solving problems.

b. Initiating discussion and channeling students' thinking and responses along predetermined lines is called "directed discussion." This method is useful in teaching skills such as problem solving and understanding cause-and-effect relationships.

c. Directed discussion is often used in training that is conducted for the purpose
of developing favorable attitudes toward a subject or situation. When that is your
purpose, directed discussion gives students more freedom to express their opinions.
The success of directed discussion depends to a large extent on instructor leadership.

d. As in previous methods discussed, the success of a discussion depends on careful planning. Remember that some elements of the discussion method are included in every other method of instruction except for a straight lecture. The goal in using the discussion method is to actively involve your students in the learning process. The old Chinese proverb, "I hear and I forget, I see and I remember, I do and I understand," certainly applies in the training arena. Strive for maximum student involvement.

9. Practical Application This is a method of practice used to reinforce a skill or a task as it relates to the workplace. This method is not an examination. The student should be supervised and then provided feedback to determine if more practice is needed. This method generally follows an instructor demonstration, and the student replicates the instructor demonstration alone or in groups.

a. Individual/Group Projects

1) Determine Size Determine whether the exercise will be accomplished on an individual basis or in groups.

2) Adequate Space If the lesson will be conducted in a classroom, make sure there is adequate room for the students to perform any necessary skills. If it is outside, ensure the area is clear and safe.

3) Double-Check Double-check the materials, equipment, and tools the students will be using when conducting the practical exercise. Ensure all the material is current and available. Also, ensure the equipment is functioning properly.

b. Supervise, Observe, Help The job of the instructor is to supervise, observe, and provide help. The instructor or assistant instructors should supervise to facilitate learning, watching the students and correcting any mistakes made during the exercise. It is a good idea to talk to your assistant instructors to determine if they have observed anything additional.

10. Field Trips Field trips are extensions of classroom instruction and provide
worthwhile learning opportunities for students to participate in unique and enriching
educational experiences. Instructors should develop systematic procedures for
ensuring that all trips provide optimal learning opportunities for students. The
following minimal procedures should be used when conducting field trips:

a. Identify any special requirements for participation on the trip--special skills, fitness, certification--as well as any hazards or dangers on the trip or at the site that might affect the health and safety of the students.

b. Obtain approval where appropriate.

c. Ask students to advise you of any special disabilities, problems or needs that
may need to be accommodated.

d. Consider the need for special clothing or equipment in case of weather or other conditions.

e. Determine transportation needs--reservation of vehicles, drivers, need for site supervision.

f. Plan for emergencies--theft, illness, vehicle emergency, weather delays, student misconduct or threats to the safety of others.

g. Communicate information to students in advance about schedules, departure locations, route, rest and meal stops, lodging, emergency procedures, protocol for problems, and rules of conduct.

h. Familiarize students with the site and their surroundings.

i. Identify learning objectives for the field trip to assist the students’ learning.

11. Simulations (Role-playing, Games) Many Marines in supervisory or administrative billets require proficiency in two separate and distinct skill sets. The first skill set is MOS related, while the second deals with leadership and interpersonal skills. Simulations are a preferred method for building proficiency in these two areas.

a. Role-playing requires the students to assume active roles in a low-risk simulated situation that involves effective, realistic behaviors. It may involve individuals, groups, or whole units. The role-play is followed by a group discussion that gives students a chance to re-examine their behavior. It is particularly useful in teaching the development of interpersonal skills (e.g., leadership or counseling skills). The new skill is normally taught through lecture and then practiced within the role-play. For example, one student could play the role of an instructor and the other could play the role of the student. However, role-playing is also used in MOS training, such as firefighting, flight training, and M1A1 training. In these examples, training simulators are used to create "real life" situations while controlling risk to personnel and equipment.

b. Successful role-playing provides a chance for every student to take part in the lesson. It provides vivid experiences both for the participants and for the observers. Simulation mainly prepares or refreshes both MOS and interpersonal skills. However, it does not eliminate the need for Marines to learn through application on-the-job. Prior to selecting the type of role-play to be used, the instructor must consider how many students are involved and how to deal with difficult students (overly defensive, or non-participating). The instructor must check the master lesson file for a detailed orientation package that describes the student’s role in the overall scenario and any supporting information.

1) Types of Role-Play

a) Single The simplest role-play involves two people who are asked to re-
enact a problem either from a description or one that came up in a
previous discussion. The advantage here is the whole group is able to
see and then discuss the same problem. The disadvantage is that the
chosen players may feel self-conscious about being the focus of
attention and only those two players get to practice the behaviors. It is
recommended that the instructor ask for volunteers for the role-play.

b) Double Each player has an alter ego who stands behind the player
adding comments or questions during the role-play that perhaps the
primary person may be thinking but not saying. The second player can
be assigned to the role or participants can spontaneously get into the
action when they think of an additional response. They can also help
out the primary player with a new idea or get that player back to reality.
The facilitator should demonstrate this type of role-play before getting
others to try it.

c) Reverse During the role-play, the facilitator asks the two students to
switch roles and seats.

d) Rotation During the role-play, the facilitator asks new participants to continue the role-play.

e) Multiple Small groups are formed and they simultaneously enact the
role-play. Processing may be more difficult.

2) Employment The instructor must ensure that all students understand
related material and the objective of the role-play. The instructor must state the
behavioral objectives, step-by-step instructions, any rules, and tell the students
that the role-play is not a pass/fail exercise.

a) Pass out Role Play Information Hand out all background information
and allow the students enough time to read it carefully and provide
clarification as needed.

b) Demonstrate Conduct a demonstration of a role-play prior to its first use in a course.

c) Assign and Define Roles Verbally designate roles or distribute printed descriptions of the roles and an observers' handout. Think about how to handle students who have been part of a similar situation, get overly defensive, or do not want to participate.

d) Monitor Create a comfortable environment to encourage active participation.

e) Focus Ensure participants focus on the process of practicing interpersonal skills rather than the content of the situation.

3) Supervise, Observe, Provide Guidance The job of the instructor is to supervise, observe, and provide guidance to the students. The instructor or assistant instructors should facilitate learning by refocusing the group and correcting any mistakes students make during the exercise. It is a good idea to talk to any assistant instructors to determine if they have observed anything additional.

12. Case Study Case studies are normally developed from actual events that
have occurred in the operating forces or supporting establishment. Case study
focuses predominantly on analyzing and understanding the process of making
decisions and making sense of complex or ambiguous information. Case studies are
an excellent method for bringing together multiple learning points under a culminating
exercise that causes students to process, analyze, and synthesize information. The
instructor will normally present a case study in printed form, but it may also be
presented using pictures, films, role-playing, or oral presentations. After the case
study is presented, the class can be divided into groups. The students then analyze,
discuss, and report the key elements of the case and the lessons to be learned.

a. Objective The main objective of a case study is for students to gain practical
knowledge from an actual event and to develop analytical and problem-solving skills.
The greatest value of the case study is that it challenges students to apply what they
know and comprehend to a realistic situation. Normally in the case study, concepts
and principles are not taught directly. Instead, they emerge gradually as students are
forced to formulate theories to support their case decisions. In preparation, the
instructor should do the following:

1) Distribute Copies of the Case

2) Make the Following Suggestions

a) Skim Read the first few paragraphs of the case, and then skim the
rest to find out in general what the case is about and what kind of
information is included for analysis.

b) Facts as you go Put yourself in the position of the main character in the case and ask yourself what the basic issue/problem is, how the issues/problems are affected by the information presented in the case, and how those issues/problems should be handled.

c) Take Notes Note the basic issues on a sheet of paper. Then read
through the case again, jotting down the relevant considerations for
each problem.

3) Develop Solutions Instruct the students to develop possible solutions to the case issues as they are reading. Solutions must be supported by evidence found in the case.

4) Instruct the Students to Begin Reading Allow ample time for careful reading of the case.

5) Re-Read Go back and carefully read the entire case, underlining information key to the case.

6) Opening Question Some case leaders begin with the question, "What
is the issue here?" Then go on to, "What are the pertinent facts?”
Others begin with a more general question, "What action should be
taken?” The approach depends on the intellectual maturity of the
students and the subject matter.

7) Refrain from Lecturing The case study method is inherently a student-centered approach. Keep instructor comments to a minimum and let the students do the talking.

8) Be Non-Directive In most case studies, there is no single correct answer. It is more important to lead the students toward the application of sound principles than to persist in an endless search for a single correct answer. The instructor should focus on facilitation and must avoid imposing personal views and passing judgment on student contributions. The instructor’s role is to encourage independent thinking and the achievement of the lesson objective.

9) Summarize The key learning points (no more than three or four) must tie back to the learning objective.

b. Controlling Participation The case discussion is controlled much like the guided discussion, except that in this case, the instructor may feel free to enter the discussion. However, he/she needs to remain neutral. The instructor can keep track of the discussion on the chalkboard, turn chart, dry erase board, etc., so that the entire class has a visual record of where the discussion has been and where it is going.

13. Coaching This method is an intensive learning experience for individuals or small groups. It is characterized by significant student involvement and immediate instructor feedback. A videotape of student performance is an excellent teaching aid when supplemented by an instructor’s analysis and critique. This technique is particularly effective in instructor training.

a. Preparation This is the key to coaching. The first thing the instructor must
do is to identify the student’s current strengths, weaknesses, and overall level of
competence. After identifying these key elements, the instructor/coach takes the
following steps:

1) Identify Needs List specific knowledge, skills, or attitudes to be addressed with the application.

2) Determine Desired Goal The goals should address the identified needs.

3) Select Activities List resources, strategies, and initiatives needed for development.

4) Determine Target Dates
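
The four preparation steps above amount to a simple written plan for each student. As a hypothetical illustration only (the field names and example entries are assumptions for this sketch, not SAT Manual requirements), a coaching plan could be recorded as:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class CoachingPlan:
    """One student's coaching plan, mirroring preparation steps 1-4 above.
    All field contents are hypothetical examples."""
    needs: list          # step 1: knowledge, skills, or attitudes to address
    goals: list          # step 2: desired goals tied to the identified needs
    activities: list     # step 3: resources, strategies, and initiatives
    target_dates: dict   # step 4: goal -> target completion date

# Hypothetical plan for a student instructor
plan = CoachingPlan(
    needs=["questioning technique"],
    goals=["use thought-provoking questions during practice teaching"],
    activities=["observe a senior instructor", "videotaped practice session"],
    target_dates={"use thought-provoking questions during practice teaching": date(2004, 6, 30)},
)
print(plan.needs)
```

Writing the plan down this way makes it easy to review and modify goals later, as step 8 under Employment directs.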

b. Employment

1) Define Roles Discuss your role, goals, and target dates with the student
and reach an agreement.

2) Probe Determine what the student already knows and build on that
knowledge throughout a step-by-step process. Use thought-provoking
questions (Effective Communication) and have the student explain
performance. Demonstration prior to the exercise is highly recommended.

3) Problem Solving Teach the students to search for alternatives and solve
problems on their own. Strive to make them self-sufficient (minimal
guidance needed). This will increase their confidence and ensure they do
not immediately request assistance. Provide suggestions if needed.

4) Intervention Know when to intervene and when to stand back from situations and let the learner figure out a solution. Become involved in risky situations that demand your intervention, but avoid unnecessary involvement that will detract from your learners’ training and achievement.

5) Feedback It is extremely important to tell students how they are doing throughout the exercise so they can get a sense of achievement.

6) Supervise, Observe The job of the instructor is to supervise and observe. The instructor or assistant instructors should supervise to facilitate learning, watching the students and correcting any mistakes made during the exercise. Observe the exercise for any discrepancies.

7) Collect and Analyze Performance Data

8) As Needed, Review and Modify Goals or Training

9) Evaluate Performance

4500. ADMINISTER TESTS

The primary purpose for administering tests is to determine whether the learning objectives have been met, to improve instruction, and thereby to increase student learning. This is accomplished by having a well-thought-out evaluation process. The following is a basic process to be used by formal schools/detachments. However, some schools may need to modify this process because of the unique nature of their instruction and/or resource constraints.

4501. TYPES OF TESTS

A student’s knowledge and skill level can be tested at different intervals before,
during, and after the course of instruction. This is accomplished by a pre-test,
progress test, and post-test.

1. Pre-Test A pre-test is administered to students prior to entry into a course or unit of instruction to determine the knowledge, skills, and behaviors the students already possess in a given subject. A pre-test is useful for tailoring instruction to match the entering student’s knowledge and skill level. Example: A pre-test may reveal that incoming students have in-depth knowledge of M16A2 rifle loading and unloading procedures. With this information, an instructor can teach loading and unloading procedures as a refresher only.
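
The tailoring decision described above can be sketched in code. This is a hypothetical illustration only; the function name, the 80% mastery threshold, and the three delivery modes are assumptions, not Marine Corps standards:

```python
def plan_instruction(pretest_score, mastery_threshold=0.80):
    """Choose a delivery mode for a unit based on a student's pre-test score.

    The 0.80 mastery threshold is an assumed value; a school would set its
    own standard in its course documents.
    """
    if pretest_score >= mastery_threshold:
        return "refresher"   # students already possess the knowledge/skills
    elif pretest_score >= mastery_threshold / 2:
        return "standard"    # teach the unit as designed
    else:
        return "remedial"    # extra instruction before the unit begins

# Example: an incoming student who scores 92% on M16A2 loading and
# unloading procedures would receive refresher instruction only.
print(plan_instruction(0.92))  # refresher
```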

2. Progress Test A progress test is administered throughout a course to evaluate student progress and to determine the degree to which students are accomplishing the learning objectives.

3. Post-Test A post-test reveals the effectiveness of instruction and how well the
student learned by determining whether or not the learning objectives were
achieved. Test items are designed to duplicate the behavior expressed in the
learning objectives so that this determination can be made.

4502. METHODS OF TESTING

1. Performance-Based Testing A performance test duplicates the job behavior(s) by using the same equipment, resources, setting, or circumstances that the student will encounter on the job. The Marine Corps strives for performance-based instruction and testing to increase the transfer of learning from the instructional environment to the job. Normally, a performance checklist is used to record the student’s level of mastery on the test. The test must have specific instructions for both the instructor and the student.
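
A performance checklist of the kind described above can be sketched as a simple pass/fail record. In this sketch the step names and the every-step-must-pass mastery rule are assumptions for illustration, not prescribed standards:

```python
def score_checklist(results):
    """Score a performance checklist.

    `results` maps each checklist step to True (performed correctly) or
    False.  Mastery here is defined as every step performed correctly --
    an assumed standard for this illustration."""
    failed = [step for step, ok in results.items() if not ok]
    return {"master": not failed, "failed_steps": failed}

# Hypothetical checklist for disassembling an M16A2 (step names assumed)
checklist = {
    "clear the weapon": True,
    "remove the sling": True,
    "separate upper and lower receivers": False,
}
print(score_checklist(checklist))
```

Recording results step by step also tells the instructor exactly which steps need further practice, supporting the feedback called for under Practical Application.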

2. Knowledge-Based Testing Knowledge-based tests can be oral or written. This method of testing does not evaluate the student’s ability to perform the required job skills; however, it does determine if the student knows how to perform the required job skills. The advantages of knowledge-based tests are a high degree of objectivity in scoring and the capability of measuring a large number of facts, ideas, or principles in a relatively short time. The most frequently used knowledge tests are:

a. Multiple-choice
b. Matching
c. True-false
d. Essay
e. Short answer
f. Completion (fill-in-the-blank)

There are other knowledge-based tests known as authentic assessments. These include:

a. Take-home tests This type of test allows students to take the test at home with the use of references and resources.

b. Open-book tests This type of test can reduce stress, but may decrease the
student’s motivation to study.

c. Paired testing This type of test allows students to work in pairs on single essay
exams. Pairs can be self-selected or assigned.

d. Portfolios This may not be a specific test but merely a collection of a student’s work. A student's portfolio may include sample papers (first drafts and revisions), journal entries, essay exams, and other work representative of the student's progress. Portfolios may be given a letter grade or a master/non-master rating.

4503. STEPS IN ADMINISTERING STUDENT TESTS

1. Gather Test Materials When gathering test materials, an instructor needs to know the materials required, the type of test to be given, and have access to the materials.

a. The materials needed to administer a test will depend on the type of test to be given.

b. If the test is knowledge-based, the instructor needs enough copies of the test, test booklets, and answer sheets for each student. The instructor should also ensure the students have a writing instrument (pen/pencil) to answer the questions.

c. Extra answer sheets, pencils, and other materials that may be needed should be available.

d. If the test is performance-based, such as disassemble/assemble an M16A2, the instructor will need at least one M16A2 and performance checklists for the students to demonstrate the ability to disassemble/assemble the M16A2.

When gathering test materials, here are some simple questions an instructor should ask prior to a test, covering the who, what, where, when, and how:

e. Who will be administering the test?

f. What type of test is being administered?

g. Where are test materials located and does liaison need to be made to access
materials?

h. Where is the test being administered?

i. When is the test being administered?

j. How is the test being administered?

2. Prepare the Environment When preparing the environment, the selection of a place to administer a test is very important for reliable evaluation results. Some of the key elements that need to be considered are as follows:

a. Arrange for tests to be administered in the morning when students
are fresh and alert. Students have a higher probability of not doing as well
in the afternoon due to fatigue. Note: This does not apply if the conditions
of the test require fatigue or a specific time of day. Example: Conduct a
night attack.

b. Ensure the environment is prepared and conducive to testing. The environment should be quiet, well ventilated, have adequate lighting, and provide the student with ample working space.

c. Arrive at the testing room well in advance of the class to ensure all testing materials have been gathered, are assembled, and are ready when administering the test to the students. Some instructors prefer to have the tests and other materials in place prior to the students arriving.

d. Post a sign or a placard outside each doorway to announce that a test is being conducted.

e. Instructors should follow their local Standing Operating Procedures (SOP) for handling visits by distinguished guests.

f. Ensure that logistical and safety requirements are met.

3. Clarify Directions When administering a test, provide clear and concise instructions/directions to avoid confusion. When students understand exactly what they are supposed to do, they are less likely to become nervous or tense. Therefore, their test scores will represent a more accurate picture of their achievement. Although carefully written instructions/directions for taking the test should be a part of the evaluation, oral directions should be given as well. When providing instructions/directions to the students, there are some key elements that need to be kept in mind. A complete set of instructions provided in written form, orally, and/or by media should specify at a minimum the following:

a. The test instructions. These should be kept uniform from class to class.

b. How the test will be collected. After conducting the test, the evaluator
must collect all test materials in a predetermined order.

c. The time allowed for each part of the test.

d. Beginning and ending test times. If the test has time limits, these need to be announced and observed. Example: Beginning and ending times written on the chalkboard or dry erase board.

e. How students will proceed when taking the test. Students should be directed whether to proceed individually from part to part and from page to page, or to wait for a signal or further instructions.

f. The number of test items on the test and how the student is to
respond. It is often a good plan to provide a sample test item with the correct
response.

g. What references or tools may be used during the test.

h. Inform the students of the procedure(s) to follow when they have completed the test. Are they free to turn in their papers and leave the room, or are they to remain seated until all materials are collected?

i. Inform students to keep their eyes on their own paper.

4. Provide An Opportunity For Questions After providing the students with instructions/directions and prior to the students taking the test, the evaluator needs to invite the students to ask questions concerning procedures and make it clear whether questions may or may not be asked of the instructor after the test begins. If any questions arise, restate the instructions/directions clearly and check back with the student(s) to see if they understand them.

5. Conduct the Test After the test materials have been gathered, the
environment prepared, the instructions/directions given, and an opportunity for
questions has been provided, the evaluator is ready to conduct the test. Some
elements that the evaluator should apply, as well as keep in mind when conducting a
test, are as follows:

a. Start and stop the test on time if a time has been given.

b. Monitor the test throughout the testing period by frequently walking about the
classroom.

c. Keep distractions to a minimum.

d. Collect the tests in a pre-determined order.


e. Before conducting a review with the students, the instructor should pass out
Examination Rating Forms (ERFs) to at least 10% of the students who took the test.
This is to gather data on the students’ impression of the test and its overall
process.

f. Conduct a review of the test with the students. The review should cover the
correct performance that was expected of the student. This review should always be
conducted before the students receive their results. Students will often try to
debate or justify their answers once they learn their grade. This type of exchange
will hinder the review process and could create student/instructor barriers that
will be difficult to overcome. In the event a student does want to debate an answer,
inform the student to wait until results are issued, as that is the appropriate
time for recourse.

6. Scoring and Grading A test may be valid, reliable, and comprehensive, but
if it is not scored and graded properly, individual scores and grades are useless.

a. Knowledge-Based Tests When scoring and grading knowledge tests, an answer
key along with a grading key must be used to ensure standardized results for each
test being scored and graded. Scoring is nothing more than marking the correct
answers on a copy of the test answer sheet and then utilizing it to score the
students’ test answer sheets. Grading is done after the test has been scored by
assigning numerical values in accordance with the grading key.

Example:

When using a bubble-sheet test with choices a, b, c, d, or e, it is possible to take
a copy of that evaluation, punch out the desired answers, and then utilize it as a
key to score the test answer sheets.
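The scoring-then-grading sequence described above can be sketched in a few lines of code. This is a hypothetical illustration only: the answer key, the percentage grading rule, and the function names are invented for the sketch, not taken from any Marine Corps grading key.

```python
# Hypothetical sketch: score a 5-item multiple-choice test against an answer
# key, then grade the raw score. The key and grading rule are illustrative.

ANSWER_KEY = {1: "b", 2: "d", 3: "a", 4: "c", 5: "b"}  # item -> correct choice


def score(responses):
    """Scoring: count the items answered correctly against the answer key."""
    return sum(1 for item, correct in ANSWER_KEY.items()
               if responses.get(item) == correct)


def grade(raw_score, total=len(ANSWER_KEY)):
    """Grading: convert a raw score to a numerical value (here, a percentage)."""
    return round(100 * raw_score / total)


student = {1: "b", 2: "d", 3: "c", 4: "c", 5: "b"}  # one student's answer sheet
raw = score(student)    # student missed item 3, so 4 of 5 correct
print(raw, grade(raw))  # 4 80
```

The point of separating the two functions mirrors the text: scoring marks what is correct, and grading assigns a value to the result afterward.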

b. Performance-Based Tests When scoring and grading a performance test, a
performance checklist is usually made. This checklist must be keyed to a skill
level, showing whether the student has accomplished the desired skill. Some
performance checklists may only involve a master or non-master qualification. In
any case, if multiple instructors are involved in the scoring and grading process,
all instructors must use the same scoring and grading procedure.

Example:

If one instructor assigns a “Poor” score and another instructor assigns a “Good”
score to the same performance, the grades may express instructor bias rather than
student proficiency.
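A master/non-master checklist of the kind described above can be sketched as follows. The checklist steps and the all-steps-passed standard are invented for illustration; a real checklist would come from the task's training standard, and every instructor would apply the same one.

```python
# Hedged sketch of a master/non-master performance checklist. The steps and
# the pass standard (every step performed correctly) are hypothetical.

CHECKLIST = [
    "dons protective equipment",
    "performs function check",
    "clears weapon",
    "completes within time limit",
]


def is_master(observed):
    """Master only if every checklist step was observed performed correctly."""
    return all(observed.get(step, False) for step in CHECKLIST)


result = is_master({step: True for step in CHECKLIST})
print("master" if result else "non-master")  # master
```

Because the standard is encoded once and shared, two instructors scoring the same performance against it cannot diverge the way the “Poor”/“Good” example does.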


4600. AFTER-LESSON MANAGEMENT (SECTION 6)

The primary purpose for employing after-lesson management is to ensure the
effective and efficient use of school resources. By ensuring the instructional
environment is well maintained, the instructor saves the school valuable
resources. The secondary purpose is to capture specific lesson-related data for
future use in the school's evaluation program.

After-lesson management actions are all the activities that must be performed after
the lesson has been conducted. These activities include:

1. Removal of media from the instructional environment.

2. Securing all classified material.

3. Leaving the instructional environment as it was found.

4. Conducting a cleanup of outdoor facilities.

5. Turning in any equipment and resources temporarily borrowed for the lesson.

6. Reviewing the school SOP. There may be additional after-lesson management
actions or requirements (e.g., march the students to chow).

7. Completing the After-Instruction Report (AIR).

4601. COMPLETING AN AFTER-INSTRUCTION REPORT

After conducting a lesson, it is an instructor’s responsibility to assess the
effectiveness of instruction. The primary means of recording this assessment is the
After-Instruction Report (AIR). Included in the AIR is the compilation of IRF data,
instructor’s analysis, and recommendations for improvement. The AIR is a single
document that summarizes one iteration of a lesson. To have an effective AIR, the
following must be completed: collect data, analyze data, record data, make
recommendations, and submit the completed AIR. See APPENDIX F for a sample
AIR.


1. Collect Data This is predominantly done through two sources:

a. Students By providing Instructional Rating Forms (IRFs) to students and
allowing them the opportunity to respond to the lessons, formal
schools/detachments are provided data to make future revisions, if necessary.
Data feedback from the students may include, but is not limited to, problems with
a lesson, instructors, or other materials associated with instruction. IRFs should
be completed for each lesson. The frequency and number of rating forms used will
depend upon the school’s Standing Operating Procedures (SOP). At a minimum, survey
ten percent of the students. When a lesson is being given for the first time, it
is recommended that all students complete an IRF. More information on the IRF can
be found in Chapter 5, Section 5205. See APPENDIX D for a sample IRF.

b. Instructors Instructors are a valuable source of data. They can report
problems with any part of the instruction. This could include, but is not limited
to, the instructor’s observation of student difficulties with certain learning
objectives, the amount of time spent in presenting a lesson, the instructional
environment, and opinions about instructional materials. Instructors can make any
recommendations associated with the lesson and the course as a whole. All
instructor comments are recorded on the AIR.
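The ten-percent IRF sampling minimum described above is simple to compute. The rule (a ten-percent floor, and every student surveyed on a lesson's first iteration) comes from the text; the function name and the choice to round up are this sketch's own assumptions, so a school's SOP governs in practice.

```python
# Illustrative sketch of the IRF sampling minimum: at least ten percent of
# the class, and every student when a lesson is given for the first time.
import math


def irfs_to_issue(class_size, first_iteration=False):
    """Minimum number of IRFs to distribute for one lesson."""
    if first_iteration:
        return class_size  # new lesson: survey every student
    # Round the ten-percent floor up so small classes still yield feedback.
    return max(1, math.ceil(0.10 * class_size))


print(irfs_to_issue(35))                        # 4
print(irfs_to_issue(35, first_iteration=True))  # 35
```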

2. Analyze Data Before data can be analyzed, the instructor should organize
data into topic areas. For example, an instructor could organize using the four
broad categories listed below:

a. Problems with the course material.

b. Problems with student performance.

c. Problems with instructor performance.

d. Problems with the instructional environment.

Instructors should review their notes and comments for each of the topic areas that
were identified. Then, look for any trends in the data and draw tentative conclusions
concerning effectiveness or efficiency of the lesson. The process of identifying trends
involves the instructor looking for data that occurs more than once. A single,
provocative comment would not be considered a trend. For example, a trend might
be recorded of students missing a particular question or several of the same
comments from IRFs. From these trends, identify problem areas and make
recommendations for change. Problem areas can also be identified from singular
comments on an IRF. For example, if a student pointed out that the outline quoted a
Marine Corps Order that was superseded, this would be an immediate problem area,
with no need to establish a trend of similar comments.
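The trend-identification step above (flagging data that occurs more than once) can be sketched as a simple tally. The sample comments and topic labels below are hypothetical, invented only to show the mechanics.

```python
# Minimal sketch of trend identification: tally IRF comments by topic area
# and flag any comment that occurs more than once. Sample data is invented.
from collections import Counter

comments = [
    ("course material", "handout cites superseded order"),
    ("student performance", "missed question 7"),
    ("student performance", "missed question 7"),
    ("instructor performance", "spoke too quickly"),
    ("student performance", "missed question 7"),
]

tally = Counter(comments)
trends = [item for item, count in tally.items() if count > 1]
print(trends)  # [('student performance', 'missed question 7')]
```

A single provocative comment never appears in `trends`, matching the text; an immediate problem area (such as a superseded order) would still be handled individually, outside this tally.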


3. Record Data Once all data has been collected and analyzed, record the data
on the AIR. Listed below are the procedures for recording data:
a. Instructional Rating Form (IRF) Data After the block of instruction, the
instructor should collect all IRFs and compile the data. Record the compiled data
in the appropriate block of the AIR. This is done right after instruction because
the instructor still has a fresh memory of what took place during instruction and
can analyze the feedback given by the students. After analyzing the data, the
instructor can also make comments and recommendations related to areas of concern
dealing with students, instruction, and the student feedback.

b. Time-Critical Operational Risk Assessments If new safety requirements are
identified during the lesson, the instructor should record the ORM lessons
learned, additional controls used, and/or occurrences in the Instructor Comments
area labeled “Reassessment of ORA.” Including the risk assessment in the AIR
allows other instructors to benefit in the future.

4. Make Recommendations Recommendations come in the form of instructor
comments. These recommendations are based on the instructor’s analysis of the
identified trends. Recommendations to revise instruction should include the
following:

a. A statement of the problem (for example, only 10% of the students stated
that their knowledge increased as a result of the lesson).

b. The probable cause(s) of the problem (for example, the lesson is written for
a much less experienced target population).

c. All possible alternative solutions to the problem (for example, a suggested
solution may be to redesign the lesson for the experienced target population or
make the lesson a self-paced homework assignment).

Chapter 4 4-44
Systems Approach To Training Manual Evaluate Phase

EVALUATE PHASE

(Graphic: the SAT model (Analyze, Design, Develop, Implement, Evaluate) with the
Evaluate Phase highlighted.)

In Chapter 5:

5000 INTRODUCTION 5-1

5100 PLAN EVALUATION 5-2
 Purpose 5-2
 Identify Evaluation Type 5-3
 Identify Evaluation Issues 5-4
 Select Evaluation Approach 5-5

5200 CONDUCT EVALUATION 5-8
 Purpose 5-8
 Document Review 5-10
 Analysis Phase Evaluation 5-12
 Design Phase Evaluation 5-12
 Develop Phase Evaluation 5-14
 Implement Phase Evaluation 5-16
 Instruments for Overall Course Evaluation 5-21

5300 ANALYZE & INTERPRET DATA 5-26
 Purpose 5-26
 Organize Data 5-27
 Quantify & Interpret Data 5-28
 Summarize Data 5-55

5400 MANAGE DATA 5-57
 Purpose 5-57
 MCAIMS 5-57
 Databases/Spreadsheets 5-57
 Course History Folders 5-58
 Records of Proceedings (ROP) 5-58

5500 CCRB 5-59
 Purpose 5-59
 CCRB Function 5-60
 CCRB Uses 5-60
 CCRB Preparation 5-61
 Submitting the ROP 5-64

5600 ADMINISTRATION 5-66
 Purpose 5-66
 Evaluation Requirements 5-67
 Prepare an Evaluation Plan 5-68
 Sampling 5-72
 Design Evaluation Instruments 5-74


5000. INTRODUCTION

The purpose of the Evaluate Phase of the Systems Approach to Training (SAT) is
to determine the effectiveness and efficiency of an instructional program. This
chapter provides guidance for a systematic and standardized approach to
assessing the effectiveness and efficiency of an instructional program in each
phase of the SAT. It details specific steps, the evaluation instruments used, and
statistical methodologies to allow easy reference on how to conduct, analyze, and
interpret evaluation results. Evaluation data is used to ensure that instruction is
providing the Marine Corps with combat-effective Marines; to monitor the
allocation of funding and resources for an instructional program; and to provide
the basis for decision-making concerning the maintenance, revision, continuation,
or termination of an instructional program. Using the processes and procedures
outlined in this chapter, formal schools and unit commanders can establish a
systematic evaluation program to evaluate instruction, identify training
deficiencies, document evaluation results, and make recommendations for use by
decision-makers to modify, continue, or terminate a program.

This chapter has six sections. The first five cover the five Evaluate Phase
processes, and the sixth covers administrative responsibilities:

1. Plan Evaluation This section provides an introduction to the types of
evaluation and guidance for determining the focus of an evaluation.

2. Conduct Evaluation This section describes how evaluation takes place within
each phase of the SAT to provide checks and balances. It addresses specific ways
to conduct evaluation for each phase of the SAT process.

3. Analyze Data This section takes the evaluator through the steps of
organizing, quantifying, interpreting, and summarizing data so that information
supporting changes can be presented in a Course Content Review Board (CCRB).

4. Manage Data This section addresses how to manage the documentation of
evaluation results and recommendations for revising or refining an instructional
program.

5. Conduct Course Content Review Board (CCRB) This section addresses how to
prepare for and conduct a CCRB.

6. Administration This section references the directives requiring evaluation
at the formal school/detachment. It also covers developing an evaluation plan,
how to sample a population, and the design of evaluation instruments.

INPUT: Delivery of Instruction; Course Data
PROCESS: Plan Evaluation; Conduct Evaluation; Analyze Data; Manage Data;
Conduct CCRB
OUTPUT: Evaluation Summary or CCRB ROP; Course Revision Plan


5100. PLAN EVALUATION (SECTION 1)

Thorough and systematic planning is key to a successful evaluation. For an
evaluation to provide the information required for making decisions concerning an
instructional program, the evaluation must identify the critical evaluation issues
and topics influencing the program. These topics will define the focus of the
evaluation. Potential evaluation questions, criteria, and issues need to be
identified and specific evaluation topics selected. Recognizing important questions
and avoiding minor issues will enhance the merit of the evaluation by providing the
data required for making informed decisions about an instructional program. This
section provides an introduction to the types of evaluation and guidance for
determining the focus of an evaluation. A few questions are listed in Figure 5-1 to
assist in providing focus to the evaluation process by establishing the need.

QUESTIONS FOR DETERMINING EVALUATION NEED

1. Does the instructional program affect a large segment of the Marine Corps?

2. Are many iterations of the instructional program planned? Normally, a
one-time program will not be evaluated.

3. Have instructional program deficiencies been identified by the using
command(s)?

4. Has there been an equipment change, technology advance, or doctrinal change
that may affect the instructional program?

5. Will evaluation information affect important instructional program decisions
scheduled to take place? Such decisions may relate to course content, course
length, funding, continuation, instructor requirements, or student throughput.

Figure 5-1. Questions for Determining Evaluation Need.


5101. IDENTIFY EVALUATION TYPE


There are two types of evaluation. A distinction between the two types of
evaluation can be made by first determining when the evaluation will be
conducted; and then, what will be the focus of the evaluation.

1. Formative Evaluation Formative evaluation is conducted during the
development of an instructional program. It is also possible to conduct formative
evaluation through the first iteration of implementation, but this is not the
preferred method for validating instruction. Validating instruction (formative)
will involve content reviews by Subject Matter Experts (SMEs), Process Action
Teams (PATs), and field trials. These validation methods are discussed in more
detail in Chapter 3, Section 3402. Formative evaluation will never assess student
performance, will rarely assess the instructional environment, and will only
occasionally assess instructor performance. Formative evaluation provides
information useful for improving an instructional program and leads to decisions
concerning instructional program development. For example, during the development
of a course curriculum, formative evaluation could involve review of the
Individual Training Standards (ITS) Order/Training and Readiness (T&R) Manual,
content review of course materials by SMEs, and validation of instruction.
Formative evaluation results in feedback for the curriculum developer, who then
uses the information to make the necessary revisions to course materials (e.g.,
lesson plans, student materials, media, test items).

Note: The primary object of formative evaluation is to review the effectiveness
and efficiency of course materials and to make any revisions necessary prior to
implementation of the course materials.

2. Summative Evaluation Summative evaluation is conducted after a Program of
Instruction (POI) has been implemented. It provides judgments about a program's
worth or merit. This type of evaluation can be conducted by schoolhouse personnel
or by personnel external to the school (e.g., a TECOM instructional systems
specialist). Summative evaluation leads to decisions concerning program
improvement, continuation, extension, or termination. For example, after a course
curriculum is completely developed, a summative evaluation might be conducted to
determine how well graduates are performing on the job following instruction.
Summative evaluation assesses the effectiveness of student performance, course
materials, instructor performance, and/or the instructional environment.
Summative evaluation can also be a comprehensive assessment of all these factors
to evaluate the instructional program's overall effectiveness and efficiency.


5102. IDENTIFY EVALUATION ISSUES


A school commander must identify the curriculum and instruction issues to be
addressed during the evaluation so that the proper information can be gathered
to determine the effectiveness of the program.

1. Gather Information The evaluator begins the identification process by
generating an exhaustive list of potentially important questions, criteria, and
issues. Possible questions to use for each phase of the SAT process can be found
in the next section. To develop this comprehensive list, the evaluator must gather
information from a variety of sources including:

a. Subject Matter Experts, instructors, students, and managers to identify
questions, concerns, and goals regarding the instructional
program and formal school/detachment. The evaluator should focus on
obtaining input from those individuals who are or will be affected by the
results of the evaluation.

b. Existing curriculum, instructional documentation, previous evaluation data,
Marine Corps directives, local Standing Operating
Procedures (SOP), and other appropriate doctrinal publications.

2. Select Evaluation Topics It is usually not feasible to address all issues
in one evaluation. Practical considerations, such as availability of resources and
time constraints, will limit what can be addressed. If resources are not available
and the evaluation is critical, it must be postponed until they are available. The
evaluator must narrow the scope of the evaluation to address the most critical
questions and issues affecting the instructional program. The conduct of the
evaluation will be driven by the topics selected. Figure 5-2 provides criteria that
can be used for selecting evaluation topics.

Criteria That Can Be Used in Selecting Evaluation Topics

a. Who will use the information?

b. Issues that reduce present uncertainty, provide information not already
available, or yield important information.

c. Issues that address a critical concern of the instructional program.

d. Issues that, if not addressed, seriously limit the scope or comprehensiveness
of the evaluation.

Figure 5-2. Criteria Used in Selecting Evaluation Topics.


In addition to the above criteria, the selection process may also be based on
decisions that will be made as a result of the evaluation. These can include
decisions concerning:

a. Whether instructional needs are being met.

b. The development or acquisition of new training aids, devices, or systems.

c. The continuation, modification, expansion, or termination of an instructional
program.

d. The extent to which the instructional program is being implemented as
designed.

e. The relative value/cost of an instructional program compared to comparable
programs.

5103. SELECT EVALUATION APPROACH


Once the focus of the evaluation is defined, the evaluation approach is selected.
Three approaches to evaluation are recommended for use in the Marine Corps:
objectives-oriented, management-oriented, and operational test and evaluation.
These approaches are based on the goal of the evaluation; they determine the
focus of the evaluation but do not change the procedure for conducting
evaluation.

1. Objectives-Oriented Evaluation The objectives-oriented approach determines
the extent to which learning objectives have been achieved. It is
the most common evaluation approach used in the Marine Corps. Information
obtained from such an evaluation can be used to revise the goals of the
instructional program, the program itself, or the instruments and methods used
to measure instructional effectiveness. Figure 5-3 describes the focus of
objectives-oriented evaluation.

When using Objectives-Oriented Evaluation, the focus is on determining whether:

a. Students master the learning objectives.

b. Learning objectives meet the goal(s) of the program and support the
Individual Training Standards (ITS).

c. The standards in the learning objectives are realistic and obtainable.

d. Student tests support learning objectives.

e. Graduates are able to perform the tasks in the operating forces.

Figure 5-3. Objectives-Oriented Evaluation.


2. Management-Oriented Evaluation The management-oriented approach to
evaluation entails collecting information to aid management decision-making as an
instructional program operates, grows, or changes. This approach enables the
school director to determine if an instructional program responds to changes in
technology, resources, new developments in instruction, or day-to-day operations.
For example, if an upgrade to a computer program for inventory control is being
implemented, the school director may direct that an evaluation be conducted to
determine the upgrade’s effect on the instructional program. The formal
school/detachment's concerns, informational needs, and criteria for instructional
effectiveness guide the direction of the evaluation. Figure 5-4 describes how
management-oriented evaluation assists the decision-maker.

When using Management-Oriented Evaluation, the approach allows decision-makers
to:

a. Determine what instructional needs or objectives should be addressed to
provide a basis for assessing the effectiveness of instruction. For example, the
introduction of new equipment would identify a need to revise learning objectives
and create or modify a lesson plan to incorporate instruction on that equipment.

b. Determine resource requirements and their availability and adaptability to
alternative instructional strategies. The decisions may facilitate the design of
the instructional program and, ultimately, provide the formal school with a basis
for assessing how well the program is being implemented. For example, instruction
on a new piece of equipment may require additional instructors or specialized
training equipment that traditional lecture/demonstration methods do not support.
Alternative strategies, such as Mobile Training Teams (MTT), distance learning,
Computer-Based Training (CBT), etc., may be proposed.

c. Determine how well a program is being conducted, what barriers threaten its
success (e.g., lack of resources, instructors, facilities), and what revisions
are required. Once these questions are answered, instructional or administrative
procedures can be monitored, controlled, and refined. For example, an evaluation
of instructor performance and instructional environment may indicate a need to
increase instructor preparation time or improve the instructional environment.

d. Determine whether to continue, modify, or refocus a course of instruction.
An evaluation of graduate performance on the job will provide data to aid these
decisions.

Figure 5-4. Management-Oriented Evaluation.


3. Operational Test and Evaluation Operational test and evaluation is an
approach that enables the evaluator to determine whether a product represents a
significant improvement or benefit over alternative products. Example products
include an off-the-shelf instructional program, an instructional method or
medium, a training system/device, etc. This approach is effective when an
existing product is being evaluated for implementation. This approach also allows
the evaluator to assess the effectiveness of a product while it is still under
development. When determining whether an alternative product represents an
improvement over an existing product, the evaluator should consider the following
factors: cost, benefits, effectiveness, and feasibility. Figure 5-5 describes how
operational test and evaluation assists the decision-maker.

When using Operational Test and Evaluation, decision-makers are able to consider:

a. Cost. Cost is analyzed to determine if it will be cost efficient to invest
in an alternative product or upgrade the existing product.

b. Benefits. This analysis includes determining how the benefits among products
will be measured. The analysis results in the determination of whether the
benefits are worth the expenditure of resources (e.g., time, money, personnel)
to implement.

c. Effectiveness. An analysis of product effectiveness is performed to
determine whether an alternative product will be more effective than an existing
product in meeting the goals of the instructional program.

d. Feasibility. A final analysis is that of feasibility. How feasible would it
be for the school to invest the resources necessary to educate their personnel
and structure/acquire the facilities required to use the alternative product? If
the benefits and effectiveness of the alternative product are minimal, would it
be feasible to alter the school budget to implement an alternative product?

Figure 5-5. Operational Test and Evaluation.


5200. CONDUCT EVALUATION (SECTION 2)

In Marine Corps training, the revision of courses is paramount to meeting the
needs of the operating forces. Whether it is affected by new equipment, new
orders, or new technology, how a task is performed in the operating forces can
change (and does more often in some MOSs than others). Formal
schools/detachments must be prepared to obtain data/information compiled from
different phases of the SAT process to improve the product. As the SAT model
shows on page 5-0, evaluation can require revisiting any phase of the SAT
process. The diagram in Figure 5-6 shows the variety of routes that can be taken
in evaluation.

This section describes how evaluation takes place within each phase of the SAT to
provide checks and balances. This section allows the user of this manual to
address specific ways to conduct evaluation for each phase of the SAT process.
For a new course being developed, this process shows how evaluation (formative)
occurs during the initial stages of course development when limited data is
available. Evaluation during this time can reveal potential problems prior to
course implementation. Existing courses (summative), however, will have data
that is used to assist in identifying the strengths and weaknesses within the
course as it is. Evaluation instruments have been identified and information is
provided on conducting the evaluation. However, specific guidelines on the
development of evaluation instruments and sampling a population can be found in
Section 5600. Referrals to other sections are made regarding how data is
analyzed and interpreted after it is collected.


Figure 5-6 diagrams the conduct of evaluation:

 Document Review (DR): Section 5201 describes how document review assists
schools/detachments in the evaluation decision-making process.

 Analyze, Design, Develop, and Implement Phase Evaluations: Sections 5202-5206
discuss the different instruments used to conduct evaluation for each phase.

 Analyze & Interpret Data: Section 5300 describes the process for analyzing
data collected from the individual phases.

 Manage Evaluation Data: Section 5400 provides guidance concerning
documentation of evaluation results and recommendations.

 Course Content Review Board (CCRB): Section 5500 describes the process of
conducting a CCRB.

Figure 5-6. Course Evaluation.


5201. DOCUMENT REVIEW

During any stage of the evaluation process, a review of documents significant to
the course and school can assist in the decision-making process and approach to
evaluation. Some of the documents listed may or may not be available depending
on whether the evaluation is for a new course/school or an existing course.
Additional documents to those discussed here may also be identified. Listed below
are documents to be discussed in more detail later in this section.

1. Individual Training Standard (ITS) Order/Training and Readiness (T&R) Manual
2. Course Descriptive Data/Program of Instruction (CDD/POI)
3. Master Lesson File (MLF)
4. School's Standing Operating Procedures (SOP)
5. School's Evaluation Plan
6. Inspection Reports/Assist Visit Reports (if applicable)
7. Record of Proceedings (ROP)

1. Individual Training Standard Order/Training and Readiness Manual The
ITS/T&R defines the training requirement and serves as the base on which
instruction is built. Therefore, the ITS/T&R must always be reviewed to determine
if there is a disconnect between the curriculum and the training standard. For
instance, if evaluation data indicates a problem with Terminal Learning
Objectives (TLOs), then it is probable that the problem is with the construct or
content of the ITS/T&R event. Chapter 2, Section 2204, provides the procedure for
downgrading the TLO if a problem like this is revealed.

2. Course Descriptive Data/Program of Instruction (CDD/POI) All existing
courses will have a CDD and POI according to MCO 1553.2_. These documents
(maintained in MCAIMS) provide the resources required for the course, learning
objectives, instructional hours, number of instructors required for each class,
methods and media, and more.
a course. For example, an evaluator needs to ensure that the class reflects the
POI. If there are problems with the approved Program of Instruction, then the
data needs to be gathered so that it can be presented at a Course Content Review
Board (CCRB). Refer to Chapter 3, Section 3500 for more information on
CDD/POI.


3. Master Lesson File (MLF) An MLF is required for each class that is
taught in the course. All of the documentation required to conduct the class is
in the MLF. More information on specific contents can be found in Chapter 3,
Section 3600. If the course is new, then this file will not be produced until
the end of the Develop phase. For existing courses, the MLF is of great value
for comparing evaluation data against the approved materials. For instance, if
a student comments on an Instructional Rating Form (IRF) that numerous words
are misspelled in the student handout, then the MLF can be pulled and checked
for misspelled words. If the words are not misspelled in the MLF, then an
internal problem exists: the MLF is not being used as THE source document.

4. School Standing Operating Procedures (SOP) The school's SOP or Academic
SOP specifies academic policy for the school. The SOP may provide information
on how data is gathered and compiled for the school and the resources available
to provide evaluation data. This is key information for evaluation of the
design, develop, implement, and evaluate phases. This document may not be
available to a new school, but it needs to be developed to provide policy and
procedures. A checklist providing some key elements to include in an SOP can
be found in APPENDIX E.

5. Evaluation Plan The evaluation plan needs to be reviewed so that there
is an understanding of the evaluation process in accordance with school policy.
At some schools, the evaluation plan may be found in the school's SOP. Refer to
Section 5602 for more information on an evaluation plan.

6. Inspection Reports/Assist Visit Reports Some Military Occupational
Specialties (MOSs) have inspection teams that visit the operating forces to
ensure that the standards required by the Marine Corps are adhered to. If
possible, retrieve information revealing strengths and weaknesses from the
operating forces so the school can use the data to assist in improving the
instructional program. The challenge comes in determining whether the
strengths/weaknesses are linked to the schoolhouse, the operating forces, or
both.

7. Record of Proceedings (ROP) The ROP provides documentation of the
discussion items and recommendations made during a Course Content Review Board
(CCRB). For existing courses, this provides data on recommended changes,
additional operational needs that were identified, additional resource needs at
the schoolhouse, etc. Sometimes, the ROP will reveal areas where additional
information/data needs to be collected to determine or support needs that were
identified during the CCRB. Refer to Section 5500 for more information on the
ROP and CCRBs.


5202. ANALYSIS PHASE EVALUATION

Data is collected during the Analysis Phase to identify the task list, ITS,
instructional setting, and the Target Population Description (TPD). Through the
methods discussed in Chapter 1, the products of the Analysis Phase are
determined by TECOM. Methods of evaluation are established to ensure the
accuracy of the outputs from the Analysis Phase. If evaluation data at the
formal school/detachment identifies a problem with the outputs, then all
supporting data is sent to the Task Analyst at TECOM. The questions in Figure
5-7 will assist in examining the outcomes of the Analysis Phase.

Figure 5-7. Evaluating the Analysis Phase.

1. Does the ITS reflect the task necessary to perform the job in the
operating forces?
2. Does the task analysis include all of the prerequisite skills and
knowledge needed to perform the learning goal, and is the prerequisite
nature of these skills and knowledge accurately represented?
3. Does the environment accurately replicate, within the confines of
resources, the environment where the job is performed?
4. Does the target population description accurately define those who
perform the task?

CONDUCT EVALUATION

An Analysis Phase review occurs prior to the development of instructional
materials for a new course. A review may take place for an existing course when
end-of-course evaluation and post-course data indicate a gap between what is
taught at the school and what is being performed in the operating forces. In
accordance with MCO 1200.13E, a Front-End Analysis (FEA) is initiated when job
requirements change or a performance deficiency is detected. This, too, is a
type of Analysis Phase review.

5203. DESIGN PHASE EVALUATION

During the Design Phase, knowledge and skills are identified, learning
objectives and test items are developed, the delivery system is selected, and
the sequence of instruction is determined. Methods of evaluation must be
established to ensure that these outputs are accurate. The questions in Figure
5-8 will assist in examining the outcomes of the Design Phase.


Figure 5-8. Evaluating the Design Phase.

1. Do the knowledge and skills accurately reflect what needs to be taught
for each performance step?
2. Do the learning objectives support the ITSs?
3. Does the learning objective accurately and clearly state what
knowledge/skill will be required for performing the job?
4. Does the test accurately measure the knowledge, skill, or the task being
taught?
5. Are the testing methods appropriate to the subject matter (knowledge
vs. performance-based)?
6. Do the test items consistently measure the same knowledge or
performance?
7. Do the assessment instruments and their related mastery criteria reliably
distinguish between competent and incompetent learners?
8. Is the delivery system selected appropriate for the level of knowledge
that the target population will possess?
9. Is the sequence of instruction organized logically to enhance the process
of learning the material?

CONDUCT EVALUATION

Throughout the Design and Develop phases of the SAT process, checklists are
used to ensure accuracy and to guide decision-making. Checklists provide
continuity to the process and a standard for the product. During the Design
phase, checklists provide detailed questions on products of the Design phase.
For new courses, these checklists must be completed and placed in the MLF for
each class in the course. In existing courses, these should be reviewed if there
are indicators that the products of this phase are flawed. The checklists are
available in the appendices. Additional items can be added to the checklists to
meet school needs.

Learning Analysis Worksheet (LAW) Checklist. The LAW checklist ensures that
components are recorded from the ITS verbatim. It also checks to make sure
that the knowledge and skills were identified and grouped for each performance
step. The LAW checklist can be found in APPENDIX C. Refer to Chapter 2,
Section 2200, for more information on learning analysis.

Learning Objective Worksheet (LOW) Checklist. The LOW checklist ensures that
the behavior, condition, and standard of the learning objectives are accurate
and clear. The LOW checklist can be found in APPENDIX C. Refer to Chapter 2,
Section 2202, for more information on learning objectives.


Test Item Checklist. The test item checklist ensures that test items replicate
the behavior, standards, and conditions identified in the learning objectives.
Many questions can be included on the checklist to require the test developer to
assess each test question (knowledge or performance) for clarity and
conciseness. The test item checklist can be found in APPENDIX C. Refer to
Chapter 2, Section 2206, for more information on test items.

Construct a Test Checklist. The construct a test checklist ensures that the
test is constructed to include detailed instructions, scoring criteria, appropriate
grouping of test items, and any safety precautions. Refer to Chapter 2, Section
2300, for more information on constructing a test.

5204. DEVELOP PHASE EVALUATION

During the Develop Phase, the course schedule is determined, media is produced,
Master Lesson Files (MLFs) are created, and the CDD/POI is generated. Methods
of evaluation must be established to ensure that these outputs are accurate.
The questions in Figure 5-9 will assist in examining the outcomes of the
Develop Phase.

Figure 5-9. Evaluating the Develop Phase.

1. Does the content present a consistent perspective?
2. Do the instructional materials support the learning objectives?
3. Does the instructional method facilitate maximum learning?
4. Is the instructional method appropriate to the subject matter?
5. Are training aids suitable to the instruction and subject matter?
6. Are examples, practice exercises, and feedback realistic and accurate?
7. Is the approach consistent with current instructional theory in the content
area?
8. Is sufficient time allotted for instruction and practice?

CONDUCT EVALUATION

Several forms of evaluation take place during the Develop phase. For both new
and existing courses, checklists are used to evaluate the products of the
phase. For a new course, the checklists are completed and placed in the MLF as
source documents. Once course development is completed, validation takes place
so that problems with the Program of Instruction (POI) are identified prior to
implementation. When evaluating an existing course, the checklists in the MLF
are still referenced and reviewed periodically. If evaluation indicates
problems with the POI, then the checklists need to be reviewed. However,
reviewing the checklists may not identify the problem, and an expert review may
be required. An expert review, not to be confused with an SME review, is
discussed in more detail below. Figure 5-10 shows the different paths that
evaluation takes depending upon whether the course is new or existing.


Figure 5-10. Conduct of Evaluation in the Develop Phase.

NEW COURSE:      Develop Checklists (see 1) --> Validation (see 3)

EXISTING COURSE: Develop Checklists (see 1) --> Expert Review (see 2)

1. Develop Phase Checklists During the Develop phase, checklists provide
detailed questions on products of the Develop phase. The checklists are
available in the appendices. Additional items may be added to the checklists
to meet school needs.

a. Concept Card Checklist The concept card checklist ensures that the
necessary components of the concept card are present and accurate. The concept
card checklist can be found in APPENDIX C. Refer to Chapter 3, Section 3200,
for more information on concept cards.

b. Lesson Plan Checklist The lesson plan checklist ensures that each
component required in a lesson plan is present and complete. The lesson plan
checklist can be found in APPENDIX C. Refer to Chapter 3, Section 3302, for
more information on lesson plans.

c. Student Outline Checklist The student outline checklist ensures that
each component required in the student outline is present. The student outline
checklist can be found in APPENDIX C. Refer to Chapter 3, Section 3303, for
more information on student outlines.

d. Method/Media Checklist The method/media checklist ensures that the
method and media used are consistent with the learning objective behavior. The
method and media checklist can be found in APPENDIX C. Refer to Chapter 2,
Section 2207, for more information on methods. Refer to Chapter 2, Section
2208, and Chapter 3, Section 3304, for more information on media.


2. Expert Review An expert review can be held for further examination of
the Design and Develop phases. In these reviews, experts examine the material
prior to implementing the instruction. An expert review is different from a
Subject Matter Expert (SME) review in that the expert review involves more than
SMEs. The experts may include SMEs, seasoned curriculum developers, and/or
experienced education specialists. During a content review, an SME examines
the content of the instructional material for accuracy and completeness. Then
an individual familiar with the target audience (possibly someone from the
operating forces) reviews it for appropriateness. This individual may look at
vocabulary, examples, and/or illustrations. The education specialist can
evaluate the presentation of the content against current educational thought
and practice. For a new course, expert reviews can take place toward the end
of the Design phase or at the beginning of the Develop phase. For an existing
course, this review can take place at any time.

3. Validation The process of validation occurs for new courses prior to
implementation. The best indication of whether the instruction is effective is
to try it out on a population representative of the students expected to be in
the classroom. This will provide information on how well the learners are able
to learn and the problems encountered with the instruction in its current form.
Validation allows changes to be made prior to the implementation of the
instruction. Methods of validation are covered at length in Chapter 3, Section
3400.

5205. IMPLEMENT PHASE EVALUATION

During the Implement Phase, instruction is delivered. Evaluating the outcome
of instruction is imperative to identifying the strengths and weaknesses of the
course as a whole. The Implement phase is where most evaluation data is
compiled at the formal school/detachment. Once a course is implemented,
evaluation is conducted for each iteration of the course. Since this is a
continuous process, it is important that each school have an evaluation plan in
place to ensure that data is collected properly and that there is
standardization. More information on writing an evaluation plan can be found
in Section 5602. The four common topics evaluated in the Implement Phase are
course materials, instruction, instructional environment, and student
performance. The questions in Figure 5-11 will assist in examining these four
topics.


Figure 5-11. Evaluating Implementation.

Evaluating Course Materials

1. Do the instructional materials support the learning objectives?
2. Is the student outline easy to follow?
3. Are training aids suitable to the instruction and subject matter?
4. Are the test instructions clear and understandable?
5. Is the format of the test easy to follow? (Students don't have to flip
pages, like questions are grouped together, etc.)
6. Do students have all of the materials (equipment, calculator, etc.)
necessary to complete the test?
7. Do students use the course materials available to them?

Evaluating Instructor

1. Is the instructor's presentation of instruction effective?
2. Does the instructor promote student participation?
3. Does the instructor provide feedback to the students?
4. Does the instructor have sufficient knowledge of the course material?
5. Does the instructor communicate and interact effectively?
6. Does the instructor utilize media effectively?
7. Is the administration of tests effective?

Evaluating Instructional Environment

1. Does the instructional setting facilitate maximum learning?
2. Do available resources allow the course to be as performance-based as
possible?
3. Is the instructor to student ratio adequate?
4. Is the instructional environment appropriate to the subject matter and
realistic to the job setting?

Evaluating Student Performance

1. Are students mastering the learning objectives?
2. Are students able to perform tasks?
3. Are there test items or tasks that students repeatedly have problems
mastering?


CONDUCT EVALUATION

Figure 5-13 provides a breakdown of which instruments are used to provide data
regarding course materials, instruction, instructional environment, and student
performance, how the instrument is used, when it's used, and who completes the
instrument. Most of the instruments will fall under more than one category. As
identified in Figure 5-13, evaluation data for the implement phase is gathered
during the course, immediately following the course, and even three months
following the course. When reviewing data, keep in mind that all data has to be
considered to get a true picture of instruction. Once the data is compiled, it is
then compared and analyzed so that trends between classes can be identified.

1. Instructional Rating Form (IRF) The IRF is a student reaction form to
instruction. Common types of feedback revealed by IRFs can be found in Figure
5-12. Information provided by the students can identify areas of strength and
weakness in a lesson. However, this should not be the sole indicator of
proficiency or effectiveness. For every block of instruction, the IRF is
distributed at the beginning of class to, at a minimum, 10 percent of the
students. Students are provided time to complete the forms at the end of the
class. The school SOP may designate a higher percentage of IRFs to be
completed for each class, but it must be at least 10 percent. Students should
be informed that IRFs are not restricted to the selected individuals and that
anyone in the class can complete an IRF at any time. IRFs provide the
student's immediate reaction to the lesson. Specific information regarding a
particular lesson may be lost unless data is gathered for each lesson. A
sample IRF can be found in APPENDIX D. Data from the IRF is transferred to the
After Instruction Report (AIR), where the instructor also makes comments
regarding the lesson. The AIR is discussed in detail later in this section and
in Chapter 4, Section 4500. Information regarding quantifying and interpreting
the results of questionnaires can be found in Section 5302.

Figure 5-12. Common Types of Feedback from Student Reaction Forms.

Progress with Objectives: Did the instruction meet the stated learning
objectives?

Class Content: Did the content make sense?

Instructional Materials: Were the materials useful?

Pre-Work Materials: Were the pre-work materials necessary?
Helpful?

Assignments: Were the out-of-class assignments helpful?

Methods of Delivery: Was/Were the method(s) of delivery
appropriate for the objectives?

Instructor/Facilitator: Was/Were the facilitator(s) effective?

Overall Evaluation: What is your overall rating of the
lesson/course?
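Section 5302 covers quantifying questionnaire results in detail; as a rough
sketch of the underlying arithmetic, the ratings from a stack of IRFs can be
averaged per question. The following is a hypothetical illustration only: the
question names, scores, and the 1-5 rating scale are assumptions for the
example, not something this Manual prescribes.

```python
# Hypothetical tally of IRF ratings for one lesson. Assumes each
# completed IRF rates several questions on a 1-5 scale; the question
# names and scores below are illustrative, not from the Manual.
from statistics import mean

irf_responses = [
    {"objectives_met": 5, "content_clear": 4, "materials_useful": 4},
    {"objectives_met": 4, "content_clear": 3, "materials_useful": 5},
    {"objectives_met": 5, "content_clear": 4, "materials_useful": 4},
]

def summarize_irfs(responses):
    """Average each question's rating across all submitted IRFs."""
    questions = responses[0].keys()
    return {q: round(mean(r[q] for r in responses), 2) for q in questions}

print(summarize_irfs(irf_responses))
```

Averages like these, tracked from class to class on the AIR, make trends
easier to spot than raw comment sheets.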


2. End of Course Critiques (ECC) Like the IRF, the ECC is also a student
reaction form. It provides feedback on the areas listed in Figure 5-14.
However, the ECC references the course in broader terms than the IRFs do. This
particular instrument reveals information on the course as a whole. ECCs
should, if possible, be completed by 100 percent of the class. These critiques
are completed after the program of instruction is complete. Students who did
not fill out an IRF or ERF during these periods may provide comments on the
ECC in the areas of instruction or evaluation. An example of an ECC can be
found in APPENDIX D. Any information specific to a lesson gathered from the
ECC is documented on the AIR for that lesson. The AIR is discussed in more
detail later in this section and in Chapter 4, Section 4500. Information
regarding quantifying and interpreting the results of questionnaires can be
found in Section 5302.

3. Instructor Evaluation Checklist This particular checklist is used when
evaluating an instructor. The Instructor Evaluation Checklist critiques the
same elements that are evaluated at the Formal School Instructor Course (FSIC),
Instructional Management School. FSIC graduates have been taught and evaluated
on all of the items of this checklist. The Instructor Evaluation Checklist
reflects Marine Corps requirements for formal school/detachment instructors to
provide standardization to instruction. It covers platform techniques,
thorough coverage of the lesson, questioning techniques, communication skills,
employment of method/media, and instructor/student interaction. Additional
requirements can be added to the checklist by schools, but the requirements
should not be modified unless revised and adopted by the Instructional
Management School and TECOM. The evaluators of instructors need to be
graduates of the FSIC so that they are familiar with the requirements. The
Instructor Evaluation Checklist can be found in APPENDIX E. Information
regarding quantifying and interpreting the results of a checklist can be found
in Section 5302.

4. Observation Checklist An observation checklist is available to be used
by an evaluator who is reviewing a class in session. The class may be in a
classroom setting or a field setting. This checklist provides a list of items
to assist in evaluating course materials, instruction, instructional setting,
student interaction, and class exercises. Unlike the Instructor Evaluation
Checklist, the focus provided by this checklist is not on the instructor, but
rather on class content and effectiveness. This checklist allows an observer
to evaluate whether the instruction, student materials, and media follow the
lesson plan and materials submitted in the MLF. This checklist allows room for
other comments by the observer. Comments may include recommendations to change
the method, media, student materials, instructional environment, etc. If the
changes are minor, then they may be made immediately. Otherwise, data gathered
from the checklist remains as documentation for the next convening Course
Content Review Board (CCRB). Evaluators should be familiar with the program of
instruction and be graduates of the Curriculum Developers Course (CDC) and the
Formal School Instructor Course (FSIC). The frequency of observations can be
determined in the school SOP. An example of an observation checklist can be
found in APPENDIX E. Modifications, in the form of additions, can be made to
this checklist to meet the needs of the schoolhouse. Information regarding
quantifying and interpreting the results of a checklist can be found in
Section 5302.


5. Environment Checklist The environment checklist reveals information
about physical conditions and training conditions. If training takes place in
a classroom environment, then information regarding lighting, noise, classroom
setup, ventilation, room temperature, etc., is available through an environment
checklist. This checklist can be completed by the instructor prior to the
class or by a classroom observer during the class. An environment checklist
for training that occurs outside of a classroom can reveal information about
setup and availability of equipment, ventilation, noise, facilities, and the
overall conditions under which training took place. Safety can be included in
the environment checklist to eliminate the additional safety checklist. An
example of an environment checklist can be found in APPENDIX E. Information
regarding quantifying and interpreting the results of a checklist can be found
in Section 5302.

6. Safety Questionnaire The safety questionnaire is distributed to students
so that the student has an opportunity to assess whether he/she was informed
about safety issues. Were students provided ample instructions regarding
safety? Was safety emphasized in the instruction? Did the instructor
exemplify safety in the training environment? The formal school/detachment's
SOP may have specific guidelines on how this is assessed. Courses where
students are exposed to potentially dangerous situations must ensure that
Operational Risk Management (ORM) is referenced. Refer to MCO 3500.27 for more
information on ORM. A sample safety questionnaire can be found in APPENDIX D.
Information regarding quantifying and interpreting the results of a
questionnaire can be found in Section 5302.

7. Safety Checklist This checklist is to be completed by the instructor or
a qualified observer. The items on the checklist indicate whether the training
facility has been set up to present a safe working environment. It can also be
used in addition to the observation checklist to provide information on whether
the instructor provided ample instructions regarding safety, emphasized safety,
and practiced safety in the training environment. Courses where students are
exposed to potentially dangerous situations must ensure that Operational Risk
Management (ORM) policy is referenced. An example of a safety checklist can be
found in APPENDIX E. Information regarding quantifying and interpreting the
results of this checklist can be found in Section 5302.

8. Examination Rating Form (ERF) Immediately following an examination
(performance or written), ERFs are distributed to, at a minimum, 10 percent of
the students. Students are advised that these forms will not be viewed until
after the students have received their grades for the test. The ERF allows
the school to assess the students' perception of a test's suitability and
fairness. This does not give the students the final say on the validity of
the test, nor does it suggest that their judgment is necessarily accurate.
However, it does provide the students' reactions to the test, providing
information that cannot be assessed through test scores alone. This
information can be used to adjust confusing questions, instructions,
facilities, equipment, etc. The results should be indicated on the After
Instruction Report (AIR) of the class teaching the learning objectives tested.
The AIR is discussed in detail later in this section and in Chapter 4, Section
4500. An example of an ERF can be found in APPENDIX D. Information regarding
quantifying and interpreting the results of this questionnaire can be found in
Section 5302.


9. Practical Application/Class Exercises Practical application and class
exercises are evaluative tools that the instructor(s) use to assess the
progress of students. If students are having a particular problem with a
practical application or during a class exercise, then it may be necessary to
make adjustments in the training schedule (late training day or extra work
during lunch) to spend more time on the problem area. This is especially
necessary when the course builds on elements learned in previous material.
This information needs to be annotated under "Instructor Comments" on the
After Instruction Report (AIR) for documentation.

10. Tests During the course implementation stage, pre-tests, written
examinations, and performance examinations can be given. Each test has a
different purpose. This is discussed more in-depth in Chapter 2, Section 2301.
Test scores reveal how well an individual in the class performed. Item
analysis reveals how well students performed on each item in comparison with
the rest of the class. This information should be tracked over time and aids
in determining the validity and reliability of the test. Refer to Section 5302
for more information on test analysis and determining the validity and
reliability of tests.
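As a sketch of the item analysis mentioned above, the difficulty of each test
item can be computed as the proportion of the class that answered it
correctly. This is a hypothetical illustration; the student names, data, and
function are assumptions for the example, and actual schools would pull scores
from their own gradebooks or MCAIMS.

```python
# Hypothetical item analysis for one written examination. Each row is
# one student's results; True means the item was answered correctly.
# Names and data are illustrative only.
results = {
    "Student A": [True, True, False, True],
    "Student B": [True, False, False, True],
    "Student C": [True, True, False, True],
    "Student D": [True, False, True, True],
}

def item_difficulty(results):
    """Proportion of students answering each item correctly; items with
    low proportions are candidates for review as defective test items or
    as signs of a weak block of instruction."""
    students = list(results.values())
    return [
        sum(answers[i] for answers in students) / len(students)
        for i in range(len(students[0]))
    ]

print(item_difficulty(results))
```

Tracking these proportions across class iterations, as the paragraph suggests,
helps separate a one-off weak class from a genuinely defective item.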

a. Pre-Test The results of a pre-test can be used for tailoring
instruction to the target audience. They can also be compared with post-test
data to determine if instruction was effective. For instance, if students are
unable to perform a task before instruction, but can perform that task after
instruction, a general determination can be made as to the effectiveness of
instruction. Of course, there are other factors outside of instruction, such
as peer teaching and additional reading, that may have contributed to
learning.
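The pre-test/post-test comparison can be sketched as a simple average gain per
class. The data below is hypothetical, and, as the paragraph notes, factors
such as peer teaching can also contribute to any measured gain.

```python
# Hypothetical pre-test vs. post-test scores for one class (percent
# correct). The names and numbers are illustrative only.
pre_scores  = {"Student A": 40, "Student B": 55, "Student C": 35}
post_scores = {"Student A": 85, "Student B": 90, "Student C": 80}

def average_gain(pre, post):
    """Mean improvement from pre-test to post-test. A large positive
    gain suggests, but does not prove, that instruction was effective."""
    gains = [post[s] - pre[s] for s in pre]
    return sum(gains) / len(gains)

print(round(average_gain(pre_scores, post_scores), 1))
```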

b. Performance/Written Examinations Results from performance and written
examinations reveal whether the student has mastered the learning objectives.
Test scores can be compared, specific problem items can be identified and
linked to specific classes or learning objectives, and defective test items
can be identified. Refer to Section 5302 for more information on test
analysis.

5206. INSTRUMENTS USED FOR OVERALL COURSE EVALUATION

The instruments discussed above have been specific to course materials,
instructor, instructional setting, or student performance. This section will
discuss student data forms, After Instruction Reports (AIRs), post-graduate
surveys, and site visits normally associated with the formal Evaluation Phase
of the SAT process. Questions that these evaluation instruments can be
designed to answer are found in Figure 5-13.


Figure 5-13. Questions for Course Evaluation.

1. Who is represented in the student population?
2. Have there been changes to the method of performing the task?
3. Are tasks performed differently in the operating forces?
4. Is there new equipment or computer programs being used in the
operating forces?
5. Has the environment changed?
6. Are students who pass the test (evaluation) able to perform their job
successfully?
7. Do supervisors feel confident in the graduates from the formal
school/detachment?
8. Do the students feel confident in the skills taught at the formal
school/detachment when they get to the operating forces?
9. Do graduates of the course believe non-essential instruction is contained in
the instructional program?
10. Are graduates performing well on the job?
11. Are graduates performing better than they did before instruction?
12. What tasks are causing graduates difficulty on the job?

1. Student Data Form Student data will reveal information about the
population. This data is generally collected at the beginning of the course. Some
of the student data may be available from By-Name Assignment (BNA). A student
data form completed by the student reveals background knowledge, computer
experience, student expectations, language proficiency, etc. This data can be
helpful in determining why students do particularly well or not so well on a test.

2. After Instruction Report (AIR) An AIR is a report that consolidates the
student reaction, instructor reaction, and test scores into one form so that
data analysis can be performed. An example of an AIR can be found in APPENDIX
F. Refer to Chapter 4, Section 4500, for information on how an AIR is
completed.

3. Post-Graduate Survey The post-graduate survey is developed to assess how
well the graduate felt that he/she was prepared for his/her job. It can also
be developed to find out the types of equipment being used, computer programs
used, content not covered, suggestions/recommendations, etc. Post-graduate
surveys are course specific and sent to graduates approximately 3 months after
graduation. For courses with extenuating circumstances where graduates are
delayed from performing the job, e.g., a backlog in obtaining security
clearances, the timeframe may be extended up to 120 calendar days after the
graduation month. Document the reasons for extending the 90-day timeframe.
Surveys can be mailed, emailed, or used for interviewing graduates over the
phone or in person. An example of how to prepare a post-graduate survey can
be found in APPENDIX D.
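To illustrate the timeframe above, a scheduling helper might compute the
survey date from the graduation date. This is a hypothetical sketch only: the
90-day default and 120-day extension are taken from the paragraph, but the
Manual measures the extension from the graduation month, which this simplified
version does not model.

```python
# Hypothetical helper for scheduling a post-graduate survey. The
# function and its day counts are a simplified illustration, not
# Marine Corps policy.
from datetime import date, timedelta

def survey_send_date(graduation: date, extended: bool = False) -> date:
    """Target date to send the survey: roughly 90 days (3 months) after
    graduation, or up to 120 days when graduates are delayed from
    performing the job (e.g., awaiting security clearances)."""
    return graduation + timedelta(days=120 if extended else 90)

print(survey_send_date(date(2004, 6, 15)))
```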


4. Site Visit Site visits provide the school with an opportunity to visit where
graduates from the school will perform their duties. Both interviews and
observations can be conducted during a site visit. Environment, work conditions,
and equipment can be viewed, while allowing schoolhouse representatives to
conduct interviews with supervisors and graduates. Representatives from the
schoolhouse need to possess a thorough knowledge of the instructional programs
related to the site to be effective. Additionally, they need to possess knowledge of
educational and training principles so that recommendations for improvement to
the program can be documented and presented at the next CCRB.

a. Observation Observation will reveal the environment that the graduate
contends with, how well he/she is able to perform in the environment, how well
he/she implements what was taught in the course, and how well what was taught
in the course coincides with what is happening in the operating forces.
Developing a checklist for use during the observation provides a standard of
comparison. When observing graduates, it is recommended to observe recent
graduates as well as graduates who have been in the operating forces for a
while. This provides the ability to compare what is learned through on-the-job
training and the consistency between the operating forces and the formal
school/detachment. An example of an observation checklist can be found in
APPENDIX E. Section 5302 provides guidance on how to quantify and interpret
data from a checklist. Designing checklists is covered in Section 5603.

b. Interview During a site visit, interviews are conducted with supervisors
and graduates from the course. Supervisors and graduates will provide different
information. Therefore, these interviews should be conducted separately and the
evaluation instruments should be developed with the intended audience in mind
(graduate or supervisor). Although site visits are ideal for conducting such
interviews, interviews can also be conducted over the phone or by email. Refer to
Section 5603 for how to prepare for an interview. Section 5302 provides guidance
on quantifying and interpreting data.


EVALUATION INSTRUMENTS OF THE EVALUATION PHASE

Topic: Course Materials
  - Instructional Rating Form (IRF): Instructor distributes at the beginning of
    each lesson to a percentage (at a minimum, 10%) of students determined by
    local SOP. Completed at the end of each lesson by the student.
  - Observation Checklist: Used to review course materials during
    implementation; normally completed by sitting in the back of the classroom
    with all of the paper-based course materials on hand. Conducted during the
    lesson by the curriculum developer/academics.
  - End of Course Critique (ECC): Instructor/academics distributes the ECC to
    100% of the class at the end of the course. Completed at the end of each
    course by the student.

Topic: Instructor
  - Instructor Evaluation Checklist: Instructor is evaluated using the
    checklist; normally the evaluator sits at the back of the class to minimize
    distractions. Conducted during the lesson by academics.
  - Instructional Rating Form (IRF): Instructor distributes at the beginning of
    the class to a percentage (at a minimum, 10%) of students determined by
    local SOP. Completed at the end of each lesson by the student.
  - End of Course Critique (ECC): Instructor/academics distributes the ECC to
    100% of the class at the end of the course. Completed at the end of each
    course by the student.
  - Safety Questionnaire: Instructor distributes questionnaires at the
    beginning of the lesson. Completed at the end of the lesson by the student.

Topic: Instructional Environment
  - Instructional Rating Form (IRF): Instructor distributes at the beginning of
    the class to a percentage (at a minimum, 10%) of students determined by
    local Standing Operating Procedures (SOP). Completed at the end of each
    lesson by the student.
  - Observation Checklist: Used to review the instructional environment during
    implementation; normally completed by sitting in the back of the classroom
    with all of the paper-based course materials on hand. Conducted during the
    class by the curriculum developer/academics.
  - Environmental Checklist: Used by the instructor to review the environment
    prior to conducting the class; a good device for classroom management. An
    observer uses this instrument during the class. Completed prior to and
    during the lesson by the instructor/classroom observer.

Figure 5-14. Evaluation Instruments of the Evaluation Phase.


EVALUATION INSTRUMENTS OF THE EVALUATION PHASE (Continued)

Topic: Instructional Environment (cont.)
  - Safety Checklist: Instructor/observer completes to ensure that the training
    facility presents a safe learning environment. Completed prior to and
    during the class by the instructor/classroom observer.
  - End of Course Critique (ECC): Instructor/academics distributes the ECC to
    100% of the class at the end of the course. Completed at the end of each
    course by the student.

Topic: Student Performance
  - Examination Rating Form (ERF): Distributed by the instructor after the exam
    is complete to a percentage (at a minimum, 10%) of students. ERFs should
    not be viewed until after all students have received scores to eliminate
    any chance of bias during grading. Completed immediately following the
    exam by the student.
  - Practical Application/Lesson Exercise (approved lesson plan): A part of the
    standard curriculum that provides instructors an opportunity to informally
    evaluate how well the class is learning the material. Conducted during the
    lesson, as determined by the curriculum; the instructor can note
    observations on the After Instruction Report (AIR) if needed.
  - Tests: Instructor administers the test in accordance with the Program of
    Instruction (POI) and local Standing Operating Procedures (SOP). Conducted
    as determined by the curriculum; completed by the student.
  - End of Course Critique (ECC): Instructor/academics distributes the ECC to
    100% of the class at the end of the course. Completed at the end of each
    course by the student.

Topic: Other Data Collected
  - Student Data Form: Instructor distributes to 100% of students to be
    completed at the beginning of the course. Completed on day one of the
    course by the student.
  - After Instruction Report (AIR): Consolidated report of student reaction,
    instructor reaction, and test scores; completed for every class. Completed
    after each lesson by the instructor.

Topic: Graduate Performance
  - Post Graduate Survey: Mailed or emailed to 100% of graduates. Conducted 3
    months after each graduation; completed by the graduate.
  - Site Visits: Interviews, surveys, and observation checklists can be
    completed during the site visit. Conducted anytime by the instructor,
    curriculum developer, or academics.

Figure 5-14. Evaluation Instruments of the Evaluation Phase (cont.).


5300. ANALYZE AND INTERPRET DATA SECTION


Evaluations involve data analysis and interpretation to produce meaningful
results. Data analysis reduces and combines information to make it easier to
make comparisons for drawing conclusions. Interpretation involves making
sense of the data so outcomes and relationships can be described, conclusions
drawn, and recommendations made concerning any element of an instructional
program. The decisions for creating, revising, maintaining, continuing, and
improving an instructional program rest on sound data collection methods
and thorough data analysis and interpretation. This section takes the
evaluator through the steps of organizing, quantifying, interpreting, and
summarizing data so that information supporting changes can be presented in
a Course Content Review Board (CCRB). Figure 5-15 provides a diagram
showing the process and steps of analyzing and interpreting data.

Step 1: Collect Data → Organize Data

Step 2: Quantify & Interpret Data

Step 3: Summarize → Course Content Review Board

Figure 5-15. Process of Analyzing and Interpreting Data.


5301. ORGANIZE DATA

Data must be compiled and organized before it can be analyzed and interpreted.
The focus of the evaluation will guide what data should be compiled. Data needs to
be organized by topic. The organization of the data will depend upon the questions
that need to be answered. For example, an evaluator might organize data into
topics of “Course Data,” “Instructor Data,” “Student Performance Data,” etc. Figure
5-14 at the end of Section 5200 identifies the instruments that provide information
for each of the categories. Organizing the compiled data into topic areas further
isolates data pertaining to the questions that need to be answered. Data is also
organized so that categories can be established for data comparison.

ESTABLISH CATEGORIES FOR DATA COMPARISON

Determine what comparisons will need to be made to provide meaning to the
data. It is necessary to determine which comparisons will provide results that
can reliably identify both strong and weak areas within the training program.
Evaluators should compare data from several different sources. Categories are
established for data comparisons so that these comparisons can be made when
interpreting data. Such comparisons minimize decisions being made based upon
one data source. Some examples of possible comparisons are given in Figure 5-16.

Examples of Possible Comparisons

1. Percent of students accomplishing an objective with a previously
   established standard or with performance of previous classes on the same
   objective.
2. Job performance data with class performance data.
3. Job performance before and after attending instruction.
4. The frequency of responses on different Instructional Rating Form (IRF)
   items, on different test items, or within multiple-choice items.
5. Student opinions about the course with their test performance.
6. Student comments about the course with those of the school staff.
7. Final test scores.
8. Number of remedial instruction sessions per iteration of the course over a
   period of a year or more.

Figure 5-16. Examples of Possible Comparisons.


5302. QUANTIFY AND INTERPRET DATA

Quantifying data means systematically assigning numbers to data so that
statistical analysis can be performed and trends and relationships can be
identified and interpreted. Quantifying the data makes interpretation possible.
For test items, item analysis is used to quantify data so that defective test items
are identified. Another way that data is quantified and interpreted is through
descriptive statistics. Some of the data may need to be coded prior to performing
statistics. In these cases, it is important to understand the scales of
measurement: nominal, ordinal, interval, and ratio. The scales of measurement
provide an understanding of what statistical procedures can be performed for
different types of instruments. Item analysis, descriptive statistics, and assigned
numbers allow the evaluator to pinpoint trends. A trend can be defined as a
pattern or prevailing theme. These trends can reveal strengths and weaknesses
within the instructional program. Interpreting data also involves analyzing test
results for validity and reliability. This section discusses what an evaluator
looks for in the test results to find out if the test is valid and reliable. The use of
computer programs can make the process of data interpretation an easier task.
Data must be interpreted to identify problems so that recommendations can be
made and solutions generated.

Quantify and Interpret Data:
  - Item Analysis
  - Descriptive Statistics
  - Scales of Measurement
  - Interpreting Quantified Data
  - Test Reliability and Validity

Figure 5-17. Quantify and Interpret Data.


1. Item Analysis Item analysis provides information about the reliability
and validity of test items. Reliability and validity are discussed later in this
section. There are two purposes for doing an item analysis. First, the analysis
identifies defective test items. Second, it indicates areas where learners have
not mastered the learning objective(s). Through item analysis, trends are
identified regarding which test items are problematic. An example of one way to
determine item difficulty and item discrimination can be found in Figure 5-18.

a. Item Difficulty The frequency of students who answered an item correctly
determines the level of difficulty. For example, if 45 of 50 students answer an
item correctly, then the level of difficulty is low (.90) since 90 percent were able
to answer correctly. However, if 10 out of 50 students answer correctly, then the
level of difficulty is high (.20). The Individual Response Report in MCAIMS
provides the number and percentage of students who answered each item
correctly. This makes it easy to determine the level of difficulty of an item
through percentages. The difficulty index is calculated below.

Difficulty Index (p). Proportion of students who answered the item correctly.

p = (Number of students selecting the correct answer) /
    (Total number of students attempting the test item)

p = 45/50 = .90

When the Difficulty Index (p level) is less than about .25, the item is considered
relatively difficult. When the Difficulty Index (p level) is above .75, the item is
considered relatively easy. Test construction experts try to build tests with an
average p level (difficulty) of about .50 for the test.
b. Item Discrimination A percentage of high-test scorers (U) is compared
to a percentage of low-test scorers (L) to determine how both groups of test
scorers performed on the same item. To perform item discrimination, a
percentage of high-test scorers and low-test scorers must be designated.
(Example: Compare the top 10% of test scorers to the bottom 10% of test
scorers who answered the test item correctly.) If a high percentage from both
groups missed the item, then more extensive evaluation of the test item and/or
instructional process is needed.

Item Discrimination Index (D). Measure of the extent to which a test
item discriminates or differentiates between students who perform well on
the overall test and those who do not perform well on the overall test.

D = (Number who got the item correct in the upper group -
     Number who got the item correct in the lower group) /
    (Number of students in either group)

Some experts insist that D should be at least .30, while others believe that as
long as D has a positive value, the item's discrimination ability is adequate.


There are three types of discrimination indexes:

1) Positive discrimination index - those who did well (U) on the overall
test chose the correct answer for a particular item more often than those
who did poorly (L) on the overall test.
2) Negative discrimination index - those who did poorly (L) on the
overall test chose the correct answer for a particular item more often than
those who did well (U) on the overall test.
3) Zero discrimination index - those who did well (U) and those who did
poorly (L) on the overall test chose the correct answer for a particular
item with equal frequency.

Item   U (10 stu)   M (10 stu)   L (10 stu)   Difficulty (U+M+L)   Discrimination (U-L)
  1        7            4            3                14                     4
  2       10           10            9                29                     1
  3        8            6            4                18                     4
  4        4            4            6                14                    -2
  5        6            7            6                19                     0
  6        8            7            4                19                     4
  7        3            0            0                 3                     3
  8       10            7            5                22                     5
  9        1            2            8                11                    -7
 10        8            5            3                16                     5

Figure 5-18. Item Analysis: Number of Learners Giving Correct Response in
Each Criterion Group.

The table above shows a simple analysis using a percentage of 33 percent to
divide a class into three groups – Upper (U), Middle (M), and Lower (L). For
instance, if you have a class of 30 students, then the students would be divided
by test scores into the following groups: 10 (U) students (33 percent), 10 (M)
students (33 percent), and 10 (L) students (33 percent).

Using the above table, a measure of item difficulty is obtained by adding Upper
(U) + Middle (M) + Lower (L). The difficulty index for item 2 is found by dividing
29 by 30, equaling .97 (97% of students answered correctly). Either the material
is covered extremely well in the class or the question does not have convincing
distracters. MCAIMS’ Individual Response Report provides a look at the
distracters and is discussed in the next section. On item 7, only 3 of 30 students
answered the question correctly. This is an indicator that the material has not
been covered adequately, the test question is poorly written, or the answer is
miskeyed.

A rough index (ratio) of the discriminative value (Upper test scorers compared to
the Lower test scorers) of each item can be provided by subtracting the number
of individuals answering an item correctly in the Lower (L) group from the
number of individuals answering an item correctly in the Upper (U) group (Ex: U-
L). Negative numbers indicate that there were more students from the Upper (U)
group who missed the question. Positive numbers indicate that more students in
the Lower (L) group missed the item. Zero indicates that there was no difference
between the Upper (U) group and the Lower (L) group.
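The two indexes described above can be sketched in a few lines of Python. This is a minimal illustration using the counts from Figure 5-18; the function names are hypothetical helpers, not part of MCAIMS:

```python
# Item analysis sketch: difficulty index (p) and discrimination index (D).

def difficulty_index(correct, total):
    """Proportion of students who answered the item correctly."""
    return correct / total

def discrimination_index(upper_correct, lower_correct, group_size):
    """(U - L) / group size: compares upper and lower test scorers."""
    return (upper_correct - lower_correct) / group_size

# Item 2 from Figure 5-18: U=10, M=10, L=9 correct out of 30 students.
p = difficulty_index(10 + 10 + 9, 30)
print(round(p, 2))  # 0.97 -> above .75, a relatively easy item

# Item 9 from Figure 5-18: U=1 and L=8 correct, 10 students per group.
d = discrimination_index(1, 8, 10)
print(d)            # -0.7 -> negative discrimination; review the item
```

A p above .75 flags an easy item and a negative D flags an item that the lower group answered correctly more often than the upper group, matching the interpretations given in the text.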


2. Descriptive Statistics
a. Frequency of Collection Descriptive statistics should be calculated every
time a test, questionnaire, survey, etc., is administered. Even if these data are not
used immediately to summarize results in a report or to provide feedback to
respondents, these data can be useful for future analysis to identify trends or
relationships among groups.
b. Types of Descriptive Statistics This section presents information and
detail concerning descriptive statistics.

1) Frequency Frequencies are determined by counting the number of
occurrences. As an example, in Figure 5-20 the score 75 has a frequency of
3 because it occurs 3 times. Frequency counts are used to describe data
(e.g., responses, scores, factors, variables) in raw numbers. Arranging
variables into a frequency distribution makes the description of the
variables easier than it would be if the scores were just listed in order.
To illustrate, Figure 5-20 presents ten scores on a test and the same ten
scores listed in a frequency distribution below.

2) Uses Frequency counts are useful for counting the number of students
who took a particular test, the number of students who passed a
particular test, the number of students who selected answer A on item 13
of a test, the number of people who responded to a survey questionnaire,
the number of people who rated an instructional program as effective, etc.

FREQUENCY

Test Scores: 75, 75, 85, 90, 60, 65, 65, 75, 100, 85

Frequency Distribution

Score    Frequency
 100         1
  90         1
  85         2
  75         3
  65         2
  60         1

Figure 5-20. Frequency Distribution.

3) Appropriate Scale of Measurement Frequency counts can be
performed on data represented by nominal, ordinal, interval, and ratio
scales (scales of measurement will be discussed in detail later in this
section).
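The frequency distribution in Figure 5-20 can be produced with a few lines of Python's standard library; this is a sketch for illustration, not an official MCAIMS report:

```python
from collections import Counter

# Test scores from Figure 5-20.
scores = [75, 75, 85, 90, 60, 65, 65, 75, 100, 85]

# Tally how many times each score occurs, then list highest score first.
distribution = Counter(scores)
for score in sorted(distribution, reverse=True):
    print(score, distribution[score])
# Matches Figure 5-20: 100 1, 90 1, 85 2, 75 3, 65 2, 60 1
```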


a. Graphic Representation Frequency distribution data can be readily
interpreted by the use of graphs.

1) The simplest graph, known as a frequency polygon, involves
representing the frequency count (expressed in raw numbers or by
percent) on the Y-axis (vertical). Test scores should be divided into
equal intervals and plotted on the X-axis (horizontal). Using the data
in Figure 5-20 and grouping the test scores in three intervals, Figure
5-21 displays the frequency distribution in graphic form. A frequency
polygon is useful for displaying data within a group or data across
groups. An example of data within a group is student scores on a
test. Subsequent class scores can be plotted on the same graph to
display data across groups.

FREQUENCY POLYGON

[Graph: frequency count (Y-axis) plotted against test scores grouped in equal
intervals (X-axis), using the data from Figure 5-20.]

Figure 5-21. Frequency Polygon.

2) Figure 5-22 presents different frequency distributions in graphic form.
A frequency distribution is said to be "normal" when it represents a
bell-shaped curve. It is important to graph data to see if it is "normal"
before performing any statistical analyses. A frequency distribution in
which scores trail off at either the high end or the low end of the
spectrum is said to be skewed. Where these scores trail off is
referred to as the "tail" of the distribution. If the tail of the
distribution extends toward the low or negative end of the scale, the
distribution is considered negatively skewed; if the tail extends toward
the high or positive end of the scale, the distribution is positively
skewed.
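A quick, rough check for skew (a rule of thumb, not a substitute for graphing the data) is to compare the mean with the median: the tail of a skewed distribution drags the mean toward it. The score sets below are hypothetical examples:

```python
import statistics

# Positively skewed set: most scores cluster low, one tail score pulls right.
positive_skew = [55, 60, 60, 65, 70, 100]
# Negatively skewed set: most scores cluster high, the tail pulls left.
negative_skew = [40, 85, 90, 90, 95, 100]

# Mean above the median suggests positive skew; below it, negative skew.
print(statistics.mean(positive_skew) > statistics.median(positive_skew))  # True
print(statistics.mean(negative_skew) < statistics.median(negative_skew))  # True
```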


Frequency Distributions

[Graph: three frequency curves plotted over scores - normal (bell-shaped),
negatively skewed (tail toward low scores), and positively skewed (tail toward
high scores).]

Figure 5-22. Frequency Distributions.

b. Measures of Central Tendency While frequency distributions typically
represent a breakdown of individual scores or variables among many, it is often
useful to characterize a group as a whole. Measures of central tendency are
measures of the location of the middle or the center of a distribution. The definition
of “middle” or “center” is purposely left somewhat vague so that the term “central
tendency” can refer to a wide variety of measures. Three measures of central
tendency are the mode, median, and mean. The mean is the most commonly used
measure of central tendency. Figure 5-23 provides a description and sample of how
to determine each.

Measures of Central Tendency

Mode: The mode is the most frequently occurring response or score.
    Sample Test Scores: 52, 78, 85, 88, 90, 93, 93, 100
    Mode = 93
    NOTE: More than one mode can exist in a set of data.

Median: The median is the score above and below which 50 percent of the
scores in the sample fall. It is sometimes referred to as the "breaking point."
    1. Place the numbers in order from least to greatest.
    2. If the number of scores is odd, then the median is the central number
       or midpoint.
    3. If the number of scores is even, then add the two middle scores and
       divide by two.
    Sample Test Scores: 52, 78, 85, 88, 90, 93, 93, 100
    88 + 90 = 178 / 2 = 89
    Median = 89

Mean: The mean is the "average" score.
    Sample Test Scores: 52, 78, 85, 88, 90, 93, 93, 100
    52+78+85+88+90+93+93+100 = 679 / 8 = 84.875
    Mean = 84.875

Figure 5-23. Measures of Central Tendency.
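The three measures in Figure 5-23 can be computed directly with Python's standard library. This is a sketch; note that `statistics.mode` returns a single mode even when more than one exists:

```python
import statistics

# Sample test scores from Figure 5-23.
scores = [52, 78, 85, 88, 90, 93, 93, 100]

print(statistics.mode(scores))    # 93
print(statistics.median(scores))  # 89.0 -> (88 + 90) / 2
print(statistics.mean(scores))    # 84.875
```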


Figure 5-24 provides the scales of measurement (to be discussed next), the
data types (e.g., test items, questionnaires), and how the measures of
central tendency can be used for each.

1) Mode As the most frequently occurring response, the mode is simple
to compute. The mode is not affected by extreme values.
However, it is usually not very descriptive of the data, so it is
important that other measures of central tendency are used to
describe the data.

a) Mode is useful for determining what most students score on a
given test or test item.

b) Mode is particularly useful for determining what response most
students select in a multiple-choice test item, thereby allowing
analysis of the item's ability to clearly discriminate between correct
and incorrect responses (a good multiple-choice test item has a
clear "correct" response and several plausible distracters).

2) Median The median is useful for splitting a group into halves. The
median is the middle of a distribution; half the scores are above the
median and half are below the median. The median is less
sensitive to extreme scores than the mean, and this makes it a
better measure than the mean for highly skewed distributions. For
example, the median income is usually more informative than the
mean income.

a) The median is not affected by extreme values and it always
exists.

b) Though the median is easy to compute, the numbers must be
properly ordered to compute the correct median.

3) Mean The mean is the "average."

a) The mean is calculated to produce an average response per test
item across a class or to produce an average response per
respondent.

b) The mean is also useful for determining overall attitudes toward a
topic when using a Likert rating scale. For example, using a five-
response Likert scale, a student rates the overall effectiveness of a
course by answering 20 questions concerning course content,
instructor performance, use of media, etc. The value circled for
each response can then be summed for a total score. This score is
then divided by the number of questions (20) to come up with the
mean. In this case, the mean is a total rating of course
effectiveness.


c) The mean is generally the preferred measure of central tendency because
it is the most consistent or stable measure from sample to sample. The
mean is a good measure of central tendency for roughly symmetric
distributions but can be misleading in skewed distributions since it can
be greatly influenced by extreme scores. For example, ten students
score the following: 20, 86, 88, 94, 92, 90, 40, 88, 76, 83. Although the
mean is 76, it hardly reflects the typical score in the set. The mode or
median may be more representative of that group's performance as a
whole. When the distribution of scores is widely dispersed, the median is
the most appropriate measure of central tendency. For example, if five
students achieved test scores of 60, 65, 70, 72, and 74, and three
students achieved scores of 90, 95, and 100, the overall class score
should be reported as a median score. Since the scores achieved by the
second group of students are much higher than those of the first group,
calculating a mean score would inflate the apparent performance of the
lower scoring group. In this example, the mean score is 78, while the
median score is 73. When a distribution is extremely skewed, it is
recommended that all three measures be reported and the data be
interpreted based on the direction and amount of skew.
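The skewed-class example above can be verified in a few lines (a sketch; the manual rounds the mean to 78):

```python
import statistics

# Five lower scorers and three much higher scorers (a skewed set).
scores = [60, 65, 70, 72, 74, 90, 95, 100]

print(statistics.mean(scores))    # 78.25, pulled up by the high scorers
print(statistics.median(scores))  # 73.0, more representative of the group
```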

Measure of    Measurement      Instrument        Type of Data
Central       Scale            Type              Measured
Tendency
-----------------------------------------------------------------------------
Mode          Nominal Scale    Student Data      Most frequent score
                               Test Data         Most frequent answer
                               Questionnaires
                               Interview
Median        Ordinal Scale    Test Data         Useful for splitting groups
              Interval Scale                     in halves, i.e., Mastery
              Ratio Scale                        and Non-Mastery
Mean          Ordinal Scale    Test Data         Avg. response per test item
                               Questionnaires    Avg. response per respondent
                               Interview         Overall attitudes toward
                                                 topic/total rating of course
                                                 effectiveness
              Interval Scale   Test Data         Allows comparisons of
              Ratio Scale      Questionnaires    individuals to overall class
                               Interview         mean (test scores, responses
                                                 to particular items)

Figure 5-24. Type of Data Measured by Central Tendency.


c. Variability The variability of a set of scores is the typical degree of
spread among the scores. Range, variance, and standard deviation are used to
report variability.

1) Range Range is the difference between the highest and the lowest
scores in the set. Range is typically not the best measure of variability
because it is dependent upon the spread in a set of scores, which can
vary widely. For example, 10 students take a test and score as
follows: 100, 92, 94, 94, 96, 100, 90, 93, 97, and 62. The scores
vary from 62 to 100, so the range is 38 (100-62 = 38). If the
lowest score were dropped, the range would be 10 (100-90 = 10),
which more accurately reflects the sample. Range serves as a rough
index of variability and can be useful to report when the mean of a set
of scores is not really representative due to a wide range of scores.

2) Variance Variance is a more widely accepted measure of variability
because it measures the average squared distance of the scores from
the mean of the set in which they appear. An example of how to
determine the variance for a population is shown in Figure 5-25. The
variance (136) is the average of the squared deviations of the scores
and is used to calculate standard deviation, which is the most widely
accepted measure of variability.

Student Scores (X)    X - Mean         (X - Mean)^2
       100           100-88 =  12          144
        90            90-88 =   2            4
        70            70-88 = -18          324
        80            80-88 =  -8           64
       100           100-88 =  12          144
Sum:   440                                 680

Number of Scores = 5

Mean = Sum of X / Number of Scores = 440 / 5 = 88

Variance = Sum of (X - Mean)^2 / Number of Scores = 680 / 5 = 136

Standard Deviation = √136 = 11.7

Figure 5-25. Variance and Standard Deviation of Test Scores.


3) Standard Deviation Standard deviation is the square root of the
variance for a set of variables. Standard deviation reflects the
amount of variability among a set of variables, responses,
characteristics, scores, etc. In Figure 5-25, the variance is
136. When the square root of 136 is taken, the standard deviation
is 11.7. This means that the average distance of the students'
scores from the class mean is 11.7. As another example, the mean
score on a test is 70 with a standard deviation of 10. Thus, the
average amount students deviated from the mean score of 70 is 10
points. If student A scored a 90 on the test, 20 points above the
mean score, we interpret this as a very good score, deviating from
the norm twice as much as the average student. This is often
referred to as deviating from the mean by 2 standard deviation
(SD) units (z score or standard score). If student B scored a 30 on
the test, 40 points below the mean score, we interpret this as a
very bad score, deviating from the norm four times as much as the
average student.
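The calculations in Figure 5-25, and the SD-unit (z score) interpretation just described, can be sketched as follows (population variance, dividing by N as the figure does):

```python
import math

# Test scores from Figure 5-25.
scores = [100, 90, 70, 80, 100]

mean = sum(scores) / len(scores)                               # 88.0
# Population variance: average squared deviation from the mean.
variance = sum((x - mean) ** 2 for x in scores) / len(scores)  # 136.0
std_dev = math.sqrt(variance)                                  # about 11.7

# z score: how many SD units a score of 100 deviates from the mean.
z = (100 - mean) / std_dev
print(round(std_dev, 1), round(z, 2))  # 11.7 1.03
```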

3. Scales of Measurement Scales of measurement specify how the
numbers assigned to variables relate to what is being evaluated or measured.
The scale tells whether a number is a label (nominal), a ranking order (ordinal),
represented in equal intervals (interval), or describing a relationship between
two variables (ratio). The type of measurement scale used affects the way
data is statistically analyzed. Scales of measurement represent the varying
degree of a particular variable. Figure 5-30 provides the types of statistical
analysis that can be performed for different instruments using the scales.
Sample questions illustrating the use of the following scales can be found in
Section 5603, Design Evaluation Instruments.

a. Nominal Scale A nominal scale measurement is simply a
classification system. For instance, observation data can be labeled and
categorized into mutually exclusive categories. Nominal numbering involves
arbitrarily assigning labels to whatever is being measured. Assigning a 1 to a
"yes" response and a 0 to a "no" response is an example of nominal
numbering; so is assigning a 1 to "male" respondents and a 0 to "female"
respondents. Quantification of data by nominal numbering should be done
only when an arbitrary number is needed to distinguish between groups,
responses, etc. Characteristics of a nominal scale are listed in Figure 5-26.


Characteristics of a Nominal Scale

1. Characterized by a lack of degree of magnitude. In other words,
   assigning a 1 to a variable does not mean that it is of a greater
   value than a variable assigned a 0. Using the example below,
   answering "yes" is not of greater value than answering "no.” The
   numbers serve only to distinguish among different responses or
   different characteristics.
       YES = 1
       NO = 0

2. Does not reflect equal intervals between assigned numbers. For
   example, the numbers distinguishing the military branches are
   just data labels.
       Air Force = 1
       Army = 2
       Navy = 3
       Marine Corps = 4

3. Does not have a true zero; because a variable is assigned a 0
   does not mean that it lacks the property being measured. Using
   the example below, assigning the number "0" to those who
   answered female on a student data sheet does not mean that the
   participant lacks gender.
       MALE = 1
       FEMALE = 0

Figure 5-26. Characteristics of a Nominal Scale.


b. Ordinal Scale The ordinal scale permits a "ranking" between values.
Differences cannot be “quantified” between two ordinal values. A Likert scale is an
example of an ordinal scale. For example, rating the effectiveness of instruction
from 1 (ineffective) to 5 (very effective) permits comparisons to be made
regarding the level of effectiveness; a larger number indicates more of the
property being measured. Characteristics of an ordinal scale are listed in Figure
5-27.

Characteristics of an Ordinal Scale

    Strongly                                              Strongly
    Disagree      Disagree      Neutral       Agree        Agree
       1             2             3            4            5

1. Degree of magnitude exists in ordinal numbers because each higher
   rating indicates more of the property being measured. Above, the level
   of agreement is being measured. A 5 indicates a higher level of
   agreement than a 2.

2. Equal intervals do not exist between ordinal numbers. For example, a
   rating of 4 in the above example is not twice as effective as a rating of
   2. Numbers used in an ordinal scale should not be added or multiplied
   because this can produce misleading results [i.e., two ratings of 2
   (disagree) do not equal a single rating of 4 (agree)]. A 4 means
   something totally different than a 2.

3. There is no true zero in an ordinal scale. In the above example, it is
   meaningless to assign a 0 to a variable to indicate a lack of effectiveness
   because a rating of 1 indicates "ineffective."

Figure 5-27. Characteristics of an Ordinal Scale.

c. Interval Scale Interval numbering allows comparisons about the extent
of differences between variables. For example, on test X, student A scored 20
points higher than student B. An example of an interval numbering system is a
response to a question asking the respondent's age, the number of years in grade,
etc. Characteristics of an interval scale are listed in Figure 5-28. This will help the
evaluator determine when to quantify data using an interval scale.

Characteristics of an Interval Scale

1. Degree of magnitude exists in interval numbers because each higher rating
indicates more of the property being measured. For example, a score of 95
is better than a score of 90 on a test.

2. Equal intervals exist between interval numbers. For example, 30 years in
service is 10 more years than 20 years in service, which is 10 more years
than 10 years in service.

3. There is no true zero in an interval scale. Temperature is an example
because temperatures can dip below 0 degrees and a temperature of 0
degrees does not indicate an absence of temperature.

Figure 5-28. Characteristics of an Interval Scale.


d. Ratio Scale A ratio scale has equal intervals and a meaningful zero
point. Point values assigned to responses to score a test are an example of a
ratio scale. Ratio numbering permits precise relationships among variables to
be expressed. For example, student A received a score of 40 on the test, which is
twice as good as student B's score of 20. Characteristics of a ratio scale are
listed in Figure 5-31. This will help the evaluator determine when to quantify
data using a ratio scale.

Characteristics of a Ratio Scale

1. Degree of magnitude exists in a ratio numbering scale. Test scores are
an example of a ratio scale illustrating degree of magnitude (e.g., a
score of 80 is better than a score of 70).

2. Equal intervals exist on a ratio numbering scale (e.g., a score of 90 is
twice as good as a score of 45).

3. A true zero exists in a ratio numbering scale (e.g., a score of 0 indicates
no score). A ratio numbering system is typically used to quantify
pass/fail data on a performance checklist, with "pass" quantified by a 1
and "fail" quantified by a 0.

Figure 5-31. Characteristics of a Ratio Scale.

GUIDE TO QUANTIFYING DATA TO PERMIT STATISTICAL ANALYSIS

The following is presented to aid the evaluator in quantifying data and selecting
appropriate statistical analyses based on the evaluation instrument being used
(see Figure 5-30).


GUIDELINES FOR QUANTIFYING DATA TO PERMIT STATISTICAL ANALYSIS

Multiple-Choice Test Item
   Scale of Measurement: Nominal
   Example of Quantifying Data: A = 1, B = 2, C = 3, D = 4, etc.
   Analyses that can be performed:
      - Frequency counts of responses per test item
      - Mode (most frequently selected response per test item)
      - Item analysis (when used in conjunction with a ratio scale)
   Analyses that cannot be performed:
      - Mean (average response per test item or per student)
      - Median
      - Overall test score per student
      - Variability (range, variance, standard deviation)

Multiple-Choice Test Item
   Scale of Measurement: Ratio
   Example of Quantifying Data: Point system: 1 = correct answer, 0 = incorrect answer
   Analyses that can be performed:
      - Frequency counts for correct/incorrect responses, per test item and per student
      - Mean (calculated to produce item difficulty)
      - Median (score for overall test which splits class in half)
      - Item analysis (but cannot determine where problems lie)
      - Overall test score per student
      - Variability of overall test scores
   Analyses that cannot be performed:
      - Frequency counts for all incorrect responses (distracters)
      - Mean (average response per test item or per student)
      - Mode (most frequently selected response per test item)
      - Variability of responses per test item

True/False Test Item
   Scale of Measurement: Ratio
   Example of Quantifying Data: Point system: 1 = correct answer, 0 = incorrect answer
   Analyses that can be performed:
      - Frequency counts for correct/incorrect responses, per test item and per student
      - Mean (calculated to produce item difficulty)
      - Median (score for overall test which splits class in half)
      - Item analysis
      - Overall test score per student
      - Variability of overall test scores
   Analyses that cannot be performed:
      - Mean (average response per test item or per student)
      - Mode (most frequently selected response per test item)

Fill-in-the-Blank/Short-Answer Test Item
   Scale of Measurement: Ratio
   Example of Quantifying Data: Point system: points for correct response and partial credit
   Analyses that can be performed:
      - Frequency counts of responses per test item and per student
      - Mean score per test item and per student
      - Mode (most frequently scored points per test item)
      - Median (score for overall test which splits class in half)
      - Preliminary item analysis
      - Overall test score per student
      - Variability of overall test scores and points scored per test item

Figure 5-30. Guidelines for Quantifying Data to Permit Statistical Analysis.


GUIDELINES FOR QUANTIFYING DATA TO PERMIT STATISTICAL ANALYSIS (cont.)

Fill-in-the-Blank/Short-Answer Test Item (cont.)
   Scale of Measurement: Ratio (cont.)
   Example of Quantifying Data: Point system: 1 = correct answer, 0 = incorrect answer
   Analyses that can be performed:
      - Frequency counts of correct/incorrect responses, per test item and per student
      - Mean (calculated to produce item difficulty)
      - Median
      - Preliminary item analysis
      - Overall test score per student
      - Variability of overall test scores
   Analyses that cannot be performed:
      - Frequency counts for all incorrect responses
      - Mean (average response per test item or per student)
      - Mode (most frequent response per test item)
      - Variability of responses per test item

Fill-in-the-Blank/Short-Answer Test Item (cont.)
   Scale of Measurement: Nominal
   Example of Quantifying Data: Categorize responses and assign a number to each response
   Analyses that can be performed:
      - Frequency counts of all responses per test item
      - Mode (most frequently occurring response)
      - Item analysis
   Analyses that cannot be performed:
      - Mean (average response per test item or per student)
      - Median
      - Overall test score per student
      - Variability of responses per test item

Performance-Based Test Item
   Scale of Measurement: Ratio
   Example of Quantifying Data: Point system: 1 = pass, 0 = fail
   Analyses that can be performed:
      - Frequency counts of pass/fail, per test item and per student
      - Mean (calculated to produce item difficulty)
      - Median
      - Preliminary item analysis
      - Overall test score per student
      - Variability of overall test scores
   Analyses that cannot be performed:
      - Mean (average response per test item or per student)
      - Mode (most frequent response per test item)
      - Variability of outcomes per test item

Figure 5-30. Guidelines for Quantifying Data to Permit Statistical Analysis (cont.).


GUIDELINES FOR QUANTIFYING DATA TO PERMIT STATISTICAL ANALYSIS (cont.)

Interview/Survey Questionnaire
   Scale of Measurement: Nominal
   Example of Quantifying Data: Categorize responses and assign a number to each response
   Analyses that can be performed:
      - Frequency counts of responses per item
      - Mode (most frequently occurring response)
      - Variability of responses per item
   Analyses that cannot be performed:
      - Frequency counts per student
      - Mean response per item and per student
      - Median

Interview/Survey Questionnaire
   Scale of Measurement: Ordinal
   Example of Quantifying Data: Likert scale
   Analyses that can be performed:
      - Frequency counts of responses per item
      - Mean response per item
      - Mean response per student (assuming scale is same throughout survey)
      - Median (response per item which splits respondent group in half)
      - Mode (most frequently occurring response per item)
      - Variability of responses per item

Interview/Survey Questionnaire
   Scale of Measurement: Interval
   Example of Quantifying Data: Response serves as the code when response is
   numerical (e.g., age, years in service)
   Analyses that can be performed:
      - Frequency counts of responses per item
      - Mean response per item
      - Median (response per item which splits respondent group in half)
      - Mode (most frequently occurring response)
      - Variability of responses per item
   Analyses that cannot be performed:
      - Mean response per student

Figure 5-30. Guidelines for Quantifying Data to Permit Statistical Analysis (cont.).


4. Interpreting Quantified Data

a. Multiple-Choice Test Item Both nominal and ratio scales can be used
for multiple-choice test items. Using these scales to analyze multiple-choice test
items is explained below.

1) Nominal Scale Labels are assigned to different responses. For
example, in a 4-choice item, answer "a" is coded as 1, answer "b" as
2, answer "c" as 3, and answer "d" as 4.

a) A nominal scale permits frequency counts, mode, and item
analysis of individual test items to be performed. Figure 5-34
presents data from three students who took the same 10-item
test and their responses to each question. Next to each response
is the number assigned to categorize the response (an asterisk
indicates an incorrect response). Nominal numbers can be added
across test items to calculate frequency counts (e.g., two out of
three students selected response "a" on test item 1; all three
students selected response "b" on test item 2). Mode can be
determined for an item by looking for the most frequently
occurring response to an item (e.g., the mode for test item 1 is "a").

Test Item   Student #1   Student #2   Student #3
  1.          a  1         *d  4        a  1
  2.          b  2          b  2        b  2
  3.         *a  1         *d  4       *b  2
  4.          c  3          c  3       *a  1
  5.          d  4          d  4        d  4
  6.          a  1          a  1        a  1
  7.          b  2         *d  4        b  2
  8.          d  4         *a  1        d  4
  9.          c  3          c  3        c  3
 10.          d  4          d  4        d  4

Figure 5-34. Student Test Data.
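The frequency-count and mode calculations described above can be sketched in Python. This is a minimal illustration using the coded responses to test item 1 from the student data table (a = 1, b = 2, c = 3, d = 4).

```python
from collections import Counter

# Coded responses to test item 1 from the student data table above:
# student #1 = a (1), student #2 = d (4), student #3 = a (1).
# Nominal codes support frequency counts and mode, but not sums or means.
item_1 = [1, 4, 1]

counts = Counter(item_1)
mode, frequency = counts.most_common(1)[0]
print("Mode for item 1:", mode, "selected by", frequency, "students")
```

The mode comes out as code 1 (response "a"), matching the discussion above.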

b) Nominal numbers cannot be summed to provide an overall score
on the test for each student because a nominal scale only assigns
labels to responses and does not reflect degree of magnitude (a
higher score does not reflect a better score). In Figure 5-34, it
would be incorrect to sum the coded responses to provide an
overall score of 25 for student #1, 30 for student #2, and 24 for
student #3. In actuality, student #1 performed the best with
only one incorrect answer, student #3 performed second best with
two incorrect answers, and student #2 had four incorrect
answers.


c) A nominal scale cannot be used to calculate mean, median, or
variability (range, variance, and standard deviation) because these
data are meaningless in this context. For example, in Figure 5-34, a
calculated mean or average response to test item #1 [(1 + 4 + 1)
divided by 3 = 2] is meaningless because it would reflect that the
average response to item #1 is "b." It would also be incorrect to
calculate a mean by, for example, adding student #1's scores for
each item and dividing by the number of items (25 divided by 10) to
produce a mean response of 2.5. To interpret this would mean that
the average response is halfway between a response of "b" and a
response of "c," which is a meaningless calculation.

2) Ratio Scale A ratio scale can be used in conjunction with a nominal
scale when quantifying responses to multiple-choice test items or it may
be used as the only means of quantifying the data.

a) If an evaluator is solely interested in how many questions a student
answers correctly, a simple scoring system is needed to count the
number of correct and incorrect responses so a total score for the test
can be calculated for each student. To do this, multiple-choice test
items can be quantified using a ratio scale (e.g., 1 point is given to
each correct answer and a 0 is given to each incorrect answer). This
numbering system permits some frequency count data to be gathered
(e.g., 22 of 50 students answered test item #1 correctly), but it does
not permit frequency counts to be made across responses. This is
because every incorrect response is assigned a 0, making it
impossible to discern how many students selected any response other
than the correct response. This numbering system permits
preliminary item analysis to be performed (e.g., determining the
percentage of students who got the answer right and those who did
not), but it does not permit further item analysis to determine the
item difficulty level of each response.

b) The evaluator can code the data using a ratio scale by assigning point
values for correct responses and no points for an incorrect response.
This allows the calculation of an overall test score per student by
summing the point values for each question. A median (i.e., score for
overall test which splits the class in half) can also be calculated, as
can the variability of overall test scores.


c) A ratio scale can also enable the calculation of a mean to produce
an item difficulty rating. When responses are quantified with either
of two numbers (e.g., 0 and 1), the evaluator can sum the
responses to get a frequency count. The frequency counts relate
to the number of correct and incorrect answers. The frequency
count is then used to calculate item difficulty. Item difficulty is
calculated by dividing the number of students who got the item
correct by the total number of students taking the test.
Therefore, if 20 students answered a test item correctly and five
answered incorrectly, the item difficulty would be .80.

Item Difficulty = (# of Students Who Answered Correctly) / (# of Students Taking the Test) = 20 / 25 = .80
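This calculation can be expressed directly in code. With 0/1 ratio coding, the mean of the responses is the item difficulty; the response list below simply recreates the 20-correct/5-incorrect example from the text.

```python
# Item difficulty from ratio-coded responses (1 = correct, 0 = incorrect).
# Hypothetical responses for one test item from 25 students, matching
# the worked example above (20 correct, 5 incorrect).
responses = [1] * 20 + [0] * 5

# With 0/1 coding, the sum is a frequency count of correct answers,
# so the mean of the responses IS the item difficulty.
item_difficulty = sum(responses) / len(responses)
print(item_difficulty)  # 0.8
```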

d) Quantifying data using a ratio scale does not, however, permit
calculation of a mean response per student or per test item.
Variability is not calculated for the same reason.

e) Mode (i.e., most frequently selected response per test item) is not
calculated when using a ratio scale on a multiple-choice test item.
This is because test data are coded as incorrect or correct rather
than labeling all of the responses as is done with a nominal scale.

b. True/False Test Items A true/false test item is typically quantified using
a ratio scale (1 point for a correct response, 0 points for an incorrect response).
This allows frequency counts, mean (calculated to produce item difficulty), median
(overall test score that splits the class in half), an overall test score per student,
and variability of overall test scores to be calculated. However, a mean response
per test item or per student and a mode cannot be calculated because the actual
response of "true" or "false" is not quantified; the correctness of the answer is.

c. Fill-in-the-Blank and Short-Answer Test Items Fill-in-the-blank and
short-answer test items can be quantified using a ratio scale and a nominal scale.

1) One method for quantifying this type of data is to devise a scoring
system so that answers are given points based on the "correctness" of
the response. This is typically done by creating an answer key that
details the levels of acceptable responses to each question. For
instance, a test question may require the student to list, in order, the
seven essential qualities of leadership. The answer key may be
established so that the student receives 1 point for each correct quality
listed and another 3 points if they are listed in correct order. This
creates a scale of measurement that ranks performance on each item
by the response's level of correctness. This is a good scale of
measurement if there is some flexibility in the answers so that partial
credit may be given for partially correct information.
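An answer key of this kind might be sketched as follows. The quality names and point values below are placeholders for illustration, not an official key; the structure mirrors the "1 point per quality, 3 bonus points for correct order" example above.

```python
# Hypothetical answer key for a "list in order" item: 1 point per
# correct quality listed, plus 3 bonus points if the order matches.
# The quality names are placeholders, not an official list.
KEY = ["quality_1", "quality_2", "quality_3"]
ORDER_BONUS = 3

def score_item(answer):
    # Credit each distinct quality that appears in the key.
    points = len(set(answer) & set(KEY))
    # The bonus is earned only when the order matches exactly.
    if answer == KEY:
        points += ORDER_BONUS
    return points

print(score_item(["quality_2", "quality_1", "quality_3"]))  # 3 (no bonus)
print(score_item(["quality_1", "quality_2", "quality_3"]))  # 6 (with bonus)
```

Because the point values can differ per response, this is the "partial credit" ratio scoring described above rather than simple 1/0 coding.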


a) This type of scoring system permits frequency counts of responses
per test item and per student, a mean score per test item, a mode
(most frequently scored points) per test item, a median test score
that splits the class in half, preliminary item analysis, an overall test
score per student, the variability (range, variance, and standard
deviation) of overall test scores, and the variability in the point
spread among students per their overall test scores and per test
item.

b) Item difficulty and item discriminability may be calculated per test
item to determine the percentage of students who answered
correctly and the percentage who did not. However, an analysis of
responses to determine if students responded incorrectly, but in
similar ways, cannot be performed. For instance, it may be useful to
know that students who missed a particular test question all
responded with the same "wrong" answer. These data would help
determine if the question was worded poorly so that it may be
reworded in the future to remove any uncertainty or
misinterpretation of its meaning. This can only be accomplished
through use of a nominal scale.

2) Another ratio scale involves establishing a scale of measurement with
equal intervals and a true zero. Unlike the previous example where
each response is keyed to a point system that may or may not be the
same for each response, this method uses a point system that is the
same for all responses. Such a system may be as simple as assigning a
1 to a correct response and a 0 to an incorrect response. This scale of
measurement is only useful if there is a clearly defined correct and
incorrect response for the item. This scoring system permits the same
statistical analyses to be performed that a ratio scale for a
multiple-choice test item permits.

3) Fill-in-the-blank and short-answer test items can also be quantified using
a nominal scale, although this can be time consuming. To quantify data
using a nominal scale, the responses must first be categorized into same
or like responses. This can be difficult if the responses in the group vary
greatly. If the responses can be categorized, the data are then
quantified by assigning a number to each category through use of a
nominal scale. Frequency counts, mode, and item analysis can be
calculated. Mean (i.e., average response per test item or per student),
median, an overall test score per student, and variability cannot be
calculated.

d. Performance-Based Test Items Performance-based test items are
typically pass/fail items quantified as either a 1 (pass) or a 0 (fail). This scoring
system permits the same statistical analyses to be performed that a ratio scale for
a multiple-choice test item permits.


e. Interview Data/Survey Questionnaires Interview data and survey
questionnaires are structured to collect data through fill-in-the-blank/short-answer
questions, multiple-choice items, and Likert rating scales.

1) Nominal

a) Fill-in-the-Blank/Short-Answer Response Survey and
interview data of this nature can be difficult to quantify because
they require a subjective judgment by the evaluator to categorize
responses into meaningful groups of like responses. Unlike test
data, survey and interview data are not quantified by "points" that
can be added up for a total score but, rather, by using numbers to
assign labels to responses (nominal scale). The difficulty lies in
grouping the responses because an open-ended question can
produce a multitude of different responses. For example, Figure
5-32 presents an open-ended question. Just below the question are
the categories of responses identified during analysis of the
responses. The responses should be categorized into the smallest
number of groups possible. In this example, all responses were easily
categorized into one of five groups and quantified accordingly.
Care should be taken when constructing a survey questionnaire to
minimize fill-in-the-blank/short-answer items so the data can be
easily quantified and analyzed (see Section 5603). In this example,
the question was better suited to be a multiple-choice item that
could have been quantified readily by allowing respondents to
select their responses.

CATEGORIZING RESPONSES TO AN OPEN-ENDED QUESTION

How often did you receive hands-on training with the equipment while attending
the Radio Repairman Course?
______________________________

Less than once a week = 1
Once a week = 2
Twice a week = 3
Three times a week = 4
More than three times a week = 5

Figure 5-32. Categorizing Responses to an Open-Ended Question.

b) Multiple-Choice Response Survey and interview data that use a
multiple-choice response format can be quantified like their
counterpart knowledge-based test items by using a nominal scale to
assign labels to responses.


2) Ordinal An ordinal scale is used to measure responses gathered using a
Likert rating scale. A Likert rating scale is the primary data collection tool that
employs an ordinal scale. Typically, responses to a subject are rated across a
continuum using a scale that varies from three to seven possible responses.
The bottom of the scale typically represents a low amount of the property
being measured while the top of the scale typically represents a high amount
of the property being measured.

a) Unlike knowledge- and performance-based test items and other types of
survey/interview questions, a Likert rating scale is a measure for which the
mean response per respondent can be calculated. When
using a Likert scale, it is appropriate to add the responses and divide by
the number of questions per student to produce a student's overall
response or attitude toward a subject. For example, a survey evaluating the
improvements made to a training program uses a 3-point Likert scale.
Respondents answer questions concerning the improvements made with 1
= "not improved," 2 = "improved," and 3 = "greatly improved." In this
example, it would be appropriate to calculate a mean response to the
survey per student. A student's mean response might be 2.5, which could
be interpreted to mean that, overall, the student considers the training
program to be improved.

b) A mean is calculated using a Likert scale only if the same scale is used
throughout the survey and the whole survey measures the same topic.
For example, half of a survey measures the effectiveness of graduate job
performance on a 5-point Likert scale from "ineffective" to "very effective."
The other half of the survey measures graduate training in terms of
effectiveness by using the same 5-point scale. It would be inappropriate
to calculate an average response per respondent to the overall survey
when the survey is measuring two different topics.
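The per-student mean described above can be sketched in a few lines. The six responses below are hypothetical, all on the same 3-point scale (1 = not improved, 2 = improved, 3 = greatly improved), which is the condition under which a per-student mean is meaningful.

```python
# Hypothetical responses of one student to a six-question survey, all
# on the same 3-point Likert scale and all on the same topic
# (1 = not improved, 2 = improved, 3 = greatly improved).
student_responses = [2, 3, 2, 3, 2, 3]

# A per-student mean is appropriate only because the same scale and
# topic run through the whole survey.
mean_response = sum(student_responses) / len(student_responses)
print(mean_response)  # 2.5
```

A mean of 2.5 on this scale would be read as the student rating the program somewhere between "improved" and "greatly improved."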

3) Interval Responses to a survey questionnaire or interview that are
numerical in nature (e.g., respondent's age, years in service) are quantified
using an interval scale. An interval scale quantifies the responses by the
value of the response. If a respondent answers 23 to a question asking his
age, his response is coded as 23. An interval scale permits the following
statistics to be performed on a per-item basis only: frequency counts, mean
response, mode (most frequently occurring response), median (the response
that splits the respondent pool in half), and variability (range, variance, and
standard deviation). Unlike a Likert scale that may be the same scale used
throughout a survey, an interval scale is not usually the same throughout a
survey. A survey is usually designed with interval questions to gather
primarily demographic data. Therefore, it is not appropriate to sum
responses in an interval scale to calculate the above descriptive statistics for
the overall survey.
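The per-item statistics permitted by an interval scale can be sketched with Python's statistics module. The ages below are hypothetical responses to a single demographic item.

```python
from statistics import mean, median, mode

# Hypothetical answers to one interval-scale survey item asking the
# respondent's age; each response is coded as its own value.
ages = [19, 23, 23, 26, 31]

# These statistics are computed on a per-item basis only.
print(mean(ages))             # mean response: 24.4
print(median(ages))           # median: 23
print(mode(ages))             # mode: 23
print(max(ages) - min(ages))  # range: 12
```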


5. Test Reliability And Validity The reliability and validity of a test
provide the foundation for effective evaluation of student performance. Both the
reliability and validity of a test should be assessed to identify the appropriateness
of the test as an accurate measure of instructional effectiveness.

a. Reliability Reliability refers to the ability of an instrument to measure
skills and knowledge consistently. The reliability of a test is determined based
on the calculation of a reliability coefficient (r). It is recommended that this
coefficient be computed using a computer statistical analysis software package.
A reliability coefficient is the correlation, or degree of association, between two
sets of scores. Correlation coefficients range from -1.0 to +1.0. The closer a
coefficient gets to -1.0 or to +1.0, the stronger the relationship. The sign of the
coefficient tells whether the relationship is positive or negative.

Coefficient Strength Direction


r = - .85 Strong Negative
r = +.82 Strong Positive
r = +.22 Weak Positive
r = +.03 Very Weak Positive
r = - .42 Moderate Negative

The different methods of estimating reliability fall within three categories:
determining the internal consistency of a test, determining the stability of a test
over time, and determining the equivalence of two forms of a test.

1) Test-Retest Test-retest is a method of estimating reliability by
giving the test twice and comparing the first set of scores with the
second set of scores. For example, suppose a test on Naval
correspondence is given to six students on Monday and again on the
following Monday without any teaching between these times. If the
test scores do not fluctuate, then it is concluded that the test is
reliable. The problem with test-retest reliability is that there is usually
some memory or experience involved the second time the test is
taken. Generally, the longer the interval between test administrations,
the lower the correlation.

First Second
Administration Administration
Student Score Score
1 85 87
2 93 93
3 78 75
4 80 85
5 65 61
6 83 80


2) Alternate Forms If there are two equivalent forms of a test, these
forms can be used to obtain an estimate of the reliability of the test.
Both forms of the test are administered to the same group of students
and the correlation between the two sets of scores is determined. If
there is a large difference in a student's score on the two forms of the
test that are supposed to measure the same behavior, then it indicates
that the test is unreliable. To use this method of estimating reliability,
two equivalent forms of the test must be available and they must be
administered under conditions as nearly equivalent as possible.

3) Split-Half Method If the test in question is designed to measure a
single basic concept, then the split-half method can be used to determine
reliability. To find the split-half (or odd-even) reliability, each item is
assigned to one half or the other. Then, the total score for each student
on each half is determined and the correlation between the two total
scores for both halves is computed. Essentially, one test is used to make
two shorter alternate forms. This method has the advantage that only
one test administration is required, so memory or practice effects are not
issues. However, because each half contains only half as many items as
the full test, this method underestimates what the actual reliability of the
full test would be.

b. Interpreting Reliability

1) Scoring reliability limits test reliability If tests are unreliably scored,
then error is introduced that limits the reliability of the test.

2) The more items included in a test, the higher the test's reliability
When more items are added to a test, the test is better able to sample
the student's knowledge or skill that is being measured.

3) Reliability tends to decrease when tests are too easy or too difficult
When tests are too easy or too difficult, score distributions become
similar, which makes it difficult to know whether the instrument is
measuring knowledge and skills consistently. When tests are too
difficult, guessing is also encouraged, which creates a source of error
in the test results.

c. Validity The term validity refers to how well an instrument measures what
it is supposed to measure. Validity can be assessed for tests, questionnaires,
interviews, etc. However, validity is most often calculated for tests. Without
established validity, a test is of questionable use since the evaluator does not know
for certain whether the test is measuring the concepts it is intended to measure.
There are several types of validity that can be determined.

1) Content Validity Content validity assesses the relevance of the test
items to the subject matter being tested. Content validity is established
by examining an instrument to determine whether it provides an
adequate representation of the skills and knowledge it is designed to
measure. No statistical test is used to establish content validity. To
determine whether a test has content validity, SMEs review the test items
and make a judgment regarding the validity of each item. For this
approach to be effective, two major assumptions must be met. First, the
SMEs must have the background and expertise to make a judgment
regarding the content of the test. Second, the objectives to which the
test is compared must be valid.


2) Criterion-Related Validity Criterion-related validity is established
when test scores are compared to a criterion (such as graduate
performance on the job) to determine how well a test predicts the
criterion. For example, the validity of a test on map reading can be
determined by comparing the scores students received on the test with
their performance on a field exercise in land navigation. The test will
have criterion-related validity if a student who received a high score on
the map reading test receives a high score on the map reading portion of
the land navigation exercise. Criterion-related validity is usually
expressed as a correlation. There are two types of criterion-related
validity: concurrent and predictive validity.

a) Concurrent Validity To establish the criterion-related validity of a
test, it is often faster to test people already considered successful on
the criterion (e.g., individuals who were rated highly on their job
performance). If successful individuals are used, valuable time is
saved since they already have job performance scores ready for
comparison with the test scores. If the test correlates highly with the
job performance data, then the test has concurrent validity. In other
words, if the successful job performers also score highly on the test,
then the test is shown to be related to the criterion (successful job
performance). The test is able to identify which individuals are doing
well in their jobs. Once a test has been determined to possess
concurrent validity, predictive validity is often tested or inferred.

b) Predictive Validity Predictive validity refers to how well the test
predicts some future behavior of the student. This form of validity is
particularly useful for aptitude tests, which attempt to predict how
well the test taker will do in some future setting. The predictive
validity of a test is determined by administering the test to a group of
subjects, then measuring the subjects on whatever the test is
supposed to predict after a period of time has elapsed. Often an
instructor will want to design a test to try to predict how well students
will perform on a criterion. If the test is able to predict the student's
scores on the criterion with a good deal of accuracy, then the test has
predictive validity. Predictive validity is very useful to instructors. If
an instructor is able to predict future performance with a good deal of
accuracy, he/she can identify students who need more attention in
order to succeed.


d. Using A Computer To Perform Statistical Analysis


Use of computer statistical programs enables an evaluator to perform data analysis
quickly and to generate a variety of statistics based on the specific requirements of
the evaluation. Statistical analysis is currently not a discipline required of Marine
Corps evaluators; however, it can greatly improve the evaluator's ability to analyze
and interpret evaluation data by providing the tools to describe and define
outcomes, compare relationships, and identify trends. Skill in statistical analysis is
generally acquired through training or schooling. However, it can be learned and
practiced, particularly if the evaluator has a computer statistical package. Along
with learning the computer program, the key to performing statistical analysis on a
computer is understanding the different statistical procedures, when to use them,
how to use them, and how to interpret their results. Throughout this section,
specific statistical analysis procedures have been discussed. Many of these statistics
can be calculated by hand (e.g., frequency, mean, mode, median, range, item
analysis). However, many of the more complex statistics are time consuming to
calculate and leave greater room for human error in their calculation. An easier way
to calculate these is through use of a computer statistical program.
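As an illustration, the hand-calculable statistics named above (frequency, mean, mode, median, range) can also be computed in a few lines with Python's standard library; the class scores are hypothetical:

```python
from collections import Counter
from statistics import mean, median, mode

# Hypothetical written-exam scores for a class of ten students
scores = [65, 70, 70, 80, 85, 85, 85, 90, 95, 100]

freq = Counter(scores)             # frequency distribution of each score
stats = {
    "mean": mean(scores),          # arithmetic average
    "median": median(scores),      # middle value of the ordered scores
    "mode": mode(scores),          # most frequently occurring score
    "range": max(scores) - min(scores),
}
print(freq, stats)
```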

1) Use of Computer Programs There are many statistical programs that
run on standard personal computers. Most of these programs are
designed to allow the user to enter data from tests, questionnaires, etc.,
and select the type of statistics desired. The use of statistical software
packages enables the user to perform data analysis quickly and
efficiently and generate a variety of statistics based on the specific
requirements of the evaluation. One of the most widely available
computer programs is SPSS (Statistical Package for the Social Sciences).
SPSS is a powerful tool that allows the calculation of all the statistics
discussed in this Manual. Additionally, SPSS allows the calculation of
several other higher-order statistics too complicated to discuss here.

2) Automated Instructional Management Systems The Marine Corps
is using the TECOM Integrated Management System (TIMS) to manage
students’ attendance and performance during courses. The student
evaluation module of TIMS can produce reports and statistics. For
instance, TIMS can retrieve the class test results, an individual response
report, an incorrect response report, an absentee report, and GPA/class
standings reports for use by administrators. Within test statistics, TIMS
automatically configures the mean, median, mode, and standard
deviation. It also provides the number of perfect scores, number tested,
number passed, and number failed. Refer to the TIMS help screens for
more information and guidance.


5303. SUMMARIZE DATA

After data is assimilated, it should be summarized for ease of interpreting the
results. Decisions must be made regarding how the data should be summarized.
Data may be summarized in paragraph form and/or a table, graph, chart, or
matrix. Strengths and problem areas are identified so that solutions can be
formed and recorded.

1. Identify Strengths Identifying and documenting strengths helps
prevent changes from being made to components of the program that
already work well.

2. Identify Problem Areas The evaluator should identify any problem
areas found during the interpretation of data. This step identifies where
changes may be necessary or which areas need to be reviewed for trends.
Problem areas should be identified within the summarized data. Descriptive
statistics, graphic summarization, and paragraph form are three ways that data
can be summarized.

a. Descriptive Statistics Descriptive statistics are ideal for summarizing
evaluation results. Descriptive statistics can be used to present evaluation
results in paragraph form. Some examples:

1) 80 out of 100 students passed the written exam resulting in a pass
rate of 80%.
2) Scores on the test ranged from a low of 65 to a high of 100, with a
class mean of 92.5.
3) Students were asked to complete a comprehensive questionnaire
rating the effectiveness of the instructional program. Students
indicated responses on a scale of 1 to 5, 5 representing extremely
effective. The mean value of class responses was 4.1, indicating an
overall impression that the instructional program was very effective.
4) Of the 125 graduates surveyed, only 3 felt the instructional program
did not prepare them for performance on their current job.
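A short script can generate this kind of paragraph-form summary directly from raw scores. The scores below, and the assumed passing score of 70, are hypothetical:

```python
from statistics import mean

# Hypothetical exam results; 70 is assumed to be the passing score
scores = [65, 72, 88, 90, 95, 100, 68, 85, 77, 93]

passed = sum(1 for s in scores if s >= 70)
pass_rate = 100 * passed / len(scores)

# Build a descriptive summary sentence from the computed statistics
summary = (f"{passed} out of {len(scores)} students passed "
           f"({pass_rate:.0f}%); scores ranged from {min(scores)} "
           f"to {max(scores)} with a mean of {mean(scores):.1f}.")
print(summary)
```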

b. Graphic Summarization of Evaluation Results Graphs, tables, and
charts can be used to summarize evaluation results so that they are easily
understood. Many types of data can be easily plotted on bar charts or line
graphs to show relationships, indicate trends, or explain results. To provide
comprehensive information, the results may need to be explained in paragraph
form.
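Charting is normally done with a spreadsheet or graphics package; as a minimal illustration, even a plain-text bar chart can show relationships at a glance. The per-class pass rates below are hypothetical:

```python
# Hypothetical pass rates for three course iterations
pass_rates = {"Class 01": 80, "Class 02": 90, "Class 03": 85}

# One '#' per 5 percentage points gives a quick text bar chart
lines = [f"{name}: {'#' * (rate // 5)} {rate}%"
         for name, rate in pass_rates.items()]
chart = "\n".join(lines)
print(chart)
```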


3. Determine Solutions Decisions must be made based upon the
interpretation of the data. Any recommended solution should consider future
goals and the feasibility of the change within the school. A plan of action should
be formed. If revisions can be made to correct the identified problems, they
should be made in a timely manner. A Course Content Review Board can be held
at any time (not just annually) if major changes are necessary.

4. Record Solutions Evaluation results must always be documented in some
form. Evaluation results are used to inform personnel about the findings resulting
from the collection, analysis, and interpretation of evaluation information. Once
evaluation information is interpreted, there are three courses of action that can
be taken:

a. All evaluation data are recorded and preserved for future use if it is
determined that no revisions to the course are required.

b. Evaluation is continued through the collection of additional data by the
Formal School/Detachment. The focus of this evaluation is targeted at the
suspected deficiency in the instructional program.

c. Revisions to course materials are identified and presented at a CCRB.


SECTION 5400. MANAGE EVALUATION DATA

The next step in the evaluation process is to manage the documentation of
evaluation results and recommendations for revising or refining an instructional
program. These documents and reports are prepared to serve as a historical
record of the evaluation, provide an audit trail for the continuing development
and improvement of instruction, and direct the activities for implementing
changes to the instructional program. Efficient data management, therefore,
requires that the information presented in these documents be clear, concise, and
accurate. This chapter provides guidance concerning the documentation of
evaluation results.

5401. MARINE CORPS AUTOMATED INSTRUCTIONAL MANAGEMENT SYSTEM (MCAIMS)
A CDD and POI are maintained by each formal school/detachment in MCAIMS for
every formal course of instruction. MCAIMS can also track the drops, progress,
and absenteeism of students through the use of the Student Module. Once test
items/questions for questionnaires are entered into MCAIMS, then the test
data/questionnaire data can be scanned or manually entered. Using MCAIMS for
Tests/Questionnaires is optional. MCAIMS has the capability to print reports that
can be used for statistical analysis for the test/questionnaires entered into the
data system. Refer to the MCAIMS User Manual for specific guidance.

5402. DATABASES/SPREADSHEETS

To meet specific school needs in maintaining and managing data, some schools
develop databases or spreadsheets to assist in conducting analysis and
interpreting data. Specific reports can be generated from databases that compile
entered data for easy interpretation. Prior to building such a database, the focus
of the evaluation and the development of evaluation instruments should be
complete. Formulas can be applied so that the database/spreadsheet will provide
statistical data. Users skilled with both MCAIMS and standard spreadsheet or
database applications can benefit from MCAIMS’ ability to produce Student and
Evaluation Data Export files for use in other applications.
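As an illustration of processing such an export, the sketch below reads a hypothetical comma-separated student export and computes a class statistic. The column names and layout are assumptions for the example, not the actual MCAIMS export format:

```python
import csv
import io
from statistics import mean

# Hypothetical export: one row per student (not the real MCAIMS layout)
export = "student,score\nSmith,85\nJones,90\nLee,70\n"

# Parse the export and pull the score column
rows = list(csv.DictReader(io.StringIO(export)))
scores = [int(r["score"]) for r in rows]

class_mean = mean(scores)
print(f"{len(scores)} students, mean score {class_mean:.1f}")
```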


5403. COURSE HISTORY FOLDERS

Course history folders are the paper-based version of maintaining data. Schools
must maintain course booklets at the schoolhouse for at least five years. The data
placed in the course history folders can be paper-based, printed out of databases,
MCAIMS, or other computer programs. However, by maintaining a folder for each
iteration of a course, all data regarding a particular class can be easily assessed for
reviews, inspections, or Course Content Review Boards. The following
documentation, at a minimum, should be maintained in course history folders:

1. Enrollment rosters
2. Student data sheet information
3. Test results (i.e., reports, statistics, item analysis)
4. After Instruction Reports
5. End of Course Critique Summary
6. Graduation Roster

5404. RECORD OF PROCEEDINGS

The Record of Proceedings (ROP) is generated from the Course Content Review
Board (CCRB). CCRBs are discussed in detail in the next section. An ROP
documents evaluation results and recommendations for revising an instructional
program identified during the conduct of a CCRB. Within the formal
school/detachment, the ROP not only documents an evaluation, it also functions as
a record submitted to higher headquarters for implementing changes to an
instructional program. If there are no recommended changes, then the ROP is
simply maintained for documentation purposes at the formal school/detachment.
The ROP also serves to initiate action at higher headquarters to address
requirements outside the scope of the formal school. To ensure that changes to
instruction are implemented properly and that recommendations for changes
outside the scope of the formal school are addressed, the standard ROP format is
located in APPENDIX G.


SECTION 5500. COURSE CONTENT REVIEW BOARD

MCO 1553.2 mandates that formal schools/detachments convene a Course
Content Review Board (CCRB) every three years to ensure the efficiency and
effectiveness of an instructional program. However, a CCRB can be held more
frequently if the need arises. Figure 5-36 provides a Conduct a CCRB Checklist to
assist the host in the preparation and conduct of a CCRB.

Examples of When to Conduct CCRB

1. Biennially (every two years) for quality control.

2. When doctrine is updated or new requirements surface.

3. If evaluation results reveal a need to revise some facet of an
instructional program, then a CCRB is held.


5501. CCRB FUNCTIONS

A CCRB functions to record information and make recommendations to improve
the effectiveness and efficiency of an instructional program. The Record of
Proceedings (ROP) must provide justification and recommended courses of action
for implementing any revisions to the instructional program. All recommended
lesson and/or course revisions must be reflected in the ROP.

CCRBs include, but are not limited to:

1. Review of formative and summative evaluation data.

2. Review of higher headquarters policy change or direction, which
affects a course of instruction.

3. Review of recommended lesson/course modifications to instructional
materials.

4. Review of lesson additions/deletions to instructional materials.

5. Review of appropriate doctrinal publications, Individual Training
Standards (ITS), and/or task list.

6. Review of recommended changes to ITSs.

5502. CCRB USES

Changes pertaining to instructional time, resources, or the ITSs that form the basis
for the course may be identified by a CCRB. Training and Education Command
must approve any recommended changes that fall outside of content revisions. A
CCRB’s ROP can be used to effect changes in an instructional program that the
formal school/detachment cannot perform or is not authorized to perform.

1. Revise ITS Task List A CCRB is the ideal means to record recommended
changes to ITS events and task lists so that the findings can be presented to
Training and Education Command.

2. Revising the Instructional Setting A CCRB is the means for
recommending revisions to the instructional settings identified in the ITSs. For
example, if a school does not have the resources to teach an ITS task that is
designated as “core,” the school can submit a recommendation to change the
instructional setting to “core plus” so that the task is taught to standard through
managed on-the-job training (MOJT) and vice versa.


3. Revising Instructional Resources A CCRB is the means to record and
present recommended changes to resources. All recommended changes to
training time, personnel, equipment, facilities, or budget must be submitted to
Training and Education Command with justification. Training and Education
Command will review and staff the changes, providing the formal
school/detachment with guidance.

5503. CCRB PREPARATION

Initial planning for a CCRB should be conducted three months prior to the CCRB.
Regardless of CCRB composition, all proceedings and findings are formally
recorded by the CCRB Recorder (discussed later in this section). In addition, a
CCRB can be videotaped. Figure 5-34 lists the formal school/detachment
responsibilities and Figure 5-35 lists the CCRB member responsibilities.

Formal School/Detachment Responsibilities

1. Scheduling the time, date, and location for the CCRB.

2. Providing guidance to members in advance of the CCRB so they will be
prepared to discuss agenda items. This guidance includes the agenda,
evaluation data, and any directions from higher headquarters. The
CCRB agenda is based on a review of evaluation data and focuses on
identified or perceived instructional deficiencies.

3. Assigning a Recorder to record the CCRB minutes. These minutes are
used to develop the Record of Proceedings (ROP).

4. Assigning a facilitator to manage and guide the CCRB.

5. Arranging funding for CCRB participants (the responsibility of the
sponsoring school).

Figure 5-34. Formal School/Detachment Responsibilities.

Assigned CCRB Member Responsibilities

1. Study all collected evaluation data and directions from higher
headquarters that are related to the agenda items.

2. Be prepared to discuss recommended changes to instructional materials.
If revisions are necessary, determine the specific changes and discuss
how they should be made and how they will affect the instructional
program. Recommendations must be specific and comprehensive and
they must detail how changes should be implemented to best meet
instructional needs.

Figure 5-35. Assigned CCRB Member Responsibilities.


CCRB MEMBERS

A CCRB will consist of:

1. A formal committee with representation from instructors/curriculum
developers.

2. Subject matter experts.

3. School supervisors.

4. Occupational Field (OccFld) Specialist and Task Analyst
representation (if possible).

5. Operating Force Representation.

1. Appointment CCRB members are appointed by the Director of the formal
school/detachment or as directed by Standing Operating Procedures (SOP).
Potential members may be contacted either by phone or correspondence. A CCRB
should have representatives from each phase of the instructional program such as
SMEs, curriculum developers, and instructors. Should major changes to the
instructional program (e.g., resources, ITSs) be foreseen, a Training and Education
Command representative should be invited to attend the CCRB.

2. Facilitate Discussion Of Agenda Items

a. Facilitator This is the individual who controls the meeting, ensures that
all agenda items are discussed, and that recommendations are recorded. The
facilitator will establish guidelines or parameters for making decisions. This may
include the number of sources and type of evaluation information that will be
reviewed and analyzed. This may also include the order/priority of agenda items
and any imposed time constraints of the proceedings. Additionally, the facilitator
should encourage and promote participation by all CCRB members. Contributions
from all CCRB members should be treated respectfully and discussed.

1) Specific recommendations to each agenda item must be made. It is not
necessary for all CCRB members to agree initially on a decision or an
approach, but all final recommendations must be reached by consensus.
This is accomplished by weighing all information from all sources,
dismissing unnecessary information, further analyzing points for
clarification, and assimilating information into a solid recommendation.


2) Recommendations should be detailed. They must provide justification
and they should include methods for implementing revisions to the
instructional program. Recommendations for each agenda item should
be reviewed and summarized by the facilitator.

3) All CCRB members must understand the recommendations and a
consensus must be reached before the next agenda item is addressed.

b. Recorder Under the guidance of the facilitator, the recorder should
record all recommendations legibly. The ROP must clearly state CCRB findings
and recommended courses of action in a detailed, concise format. The required
format for the ROP can be found in APPENDIX G.

3. Record of Proceedings (ROP) MCO 1553.2 mandates that
evaluation results and recommendations for revising instruction at formal
schools/detachments be documented through the publication of an ROP. ROPs
are generated based on CCRBs and are used to revise instructional materials,
provide information and judgments about the effectiveness of an instructional
program, and effect changes in a program beyond the scope of the formal
school/detachment. The ROP provides a summary of evaluation results,
recommendations, and justification for revising training.

a. Format The required format for the ROP can be found in APPENDIX G.
It must contain justification for any recommended revisions to instruction. The
CCRB may make recommendations on revising instruction where appropriate.
No changes may be made to the instruction unless supported by evaluation data
or direction from higher headquarters.

b. Members Review The ROP must be checked and approved by CCRB
members to ensure that specific recommendations have been made for each
issue encountered in the instructional program. This review also ensures that
each recommendation is documented with solid justification and that the content
is an accurate reflection of the conduct of the CCRB.

c. Member Certification The members of the CCRB then sign the ROP
certifying the accuracy of the content.

Recommendations should be detailed.
They must provide justification.
They should include methods for implementing
revisions to the instructional program/POI.


5504. SUBMITTING THE ROP

Submit the final ROP to the CO/Director of the formal school/detachment for
approval. A copy of the ROP will be sent to CG, TECOM (GTB/ATB) for review and
a copy will remain on file at the school/detachment. If the CCRB has identified a
required change to the ITS Order or T&R Manual, then the ROP must be submitted
to TECOM with justification and supporting documentation.

See Figure 5-36 on the next page for a checklist of how to conduct a CCRB.


CONDUCT A CCRB CHECKLIST

PREPARE FOR THE CCRB YES NO


1. Study evaluation data and directions from higher HQ
2. Set a time and date
3. Ensure members are appointed
4. Provide guidance to members (agenda, data, etc.)
5. Assign a recorder
CONDUCT THE CCRB YES NO
1. Open meeting on time
2. Explain purpose of meeting
3. Avoid stating preferences as to outcomes
4. Explain ground rules:
a. Establish discussion method(s)
b. Establish decision making method(s)
c. Establish time limits
5. Employ effective group communication techniques:
a. Promote systematic problem solving
b. Keep group focused on problem solving
c. Create/maintain suitable atmosphere
d. Show respect and interest in group members
e. Demonstrate sensitivity to attitudes
f. Maintain impartiality
g. Encourage balanced participation
h. Refrain from dominating the group
i. Deal with conflict effectively
j. Consider several courses of action
k. Consider drawbacks of preferred course of action
l. Consider problems of implementation
m. Provide “second chance” to air remaining doubts

CLOSE THE CCRB YES NO


1. Review minutes
2. Seek approval from members concerning the minutes
3. Close CCRB on time

COMPLETE THE CCRB YES NO


1. Write the Record of Proceedings (ROP) based on the minutes
2. ROP reviewed and certified by all CCRB members
3. Submit Record of Proceedings to director for approval and to CG, TECOM
4. Evaluate the conduct of the CCRB
Figure 5-36. Conduct a CCRB Checklist.


5600. ADMINISTRATION SECTION


This section provides the evaluation requirements as stated by various Marine Corps
Orders and Publications. These documents provide guidance to formal
schools/detachments on requirements in training. With the requirements
understood, personnel working in academics at the formal schools/detachments need
to carefully consider the approach to evaluation. This is done through an evaluation
plan for the school. The evaluation plan discusses how, where, and when to conduct
evaluation, the types of data retrieved, and what to do with the data. Types of
sampling are covered in detail so that they can be addressed in the plan. In
addition, specific information on how to design questionnaires, interview questions, and
evaluation checklists is covered so that schools are able to ensure that the instruments
used are meeting the needs of the school. Instruments should be designed with ease
of data compilation and interpretation in mind.


5601. EVALUATION REQUIREMENTS

Schoolhouse administration needs to be familiar with the requirement for
evaluation. The first step in evaluation planning involves the identification of an
evaluation requirement. The source and scope of this requirement will drive
subsequent evaluation activities. Establishing this requirement ensures that
personnel and resources are allocated appropriately and effectively in support of
an instructional program. This section provides direction and guidance in
identifying an evaluation requirement and focusing on the source of this
requirement: Marine Corps doctrinal publications and the formal
schools/detachments.

1. Marine Corps Requirement For Evaluation Marine Corps
doctrine or local SOP mandate the conduct of certain evaluations including their
frequency, the type of evaluation to be conducted, and the specific issues to be
evaluated. The following subparagraphs briefly describe the 1553 series of
Marine Corps Orders (MCO) and Marine Corps Reference Publications (MCRP) as
they pertain to instructional evaluation. The evaluator should be familiar with
the effect of these orders on the organization's evaluation activities. In addition
to these documents, Marine Corps Training and Education Command (TECOM)
may be contacted for guidance concerning the conduct of evaluation.

a. MCO 1553.1 MCO 1553.1_, Marine Corps Training and Education
System, establishes CG, Training and Education Command as the organization
that evaluates Marine Corps training and education policy, plans, concepts, and
programs; conducts and reviews evaluations of training and education
performed in units and institutions; and resolves emergent issues.

b. MCO 1553.2 MCO 1553.2_, Management for Marine Corps Formal
Schools and Training Centers, addresses Course Content Review Board (CCRB)
requirements, curriculum assistance visits conducted by Training and Education
Command, and the conduct of a Training Situation Analysis (TSA) to assess a
formal school's philosophy, management, facilities, staffing, curriculum, and
instructional support.

c. MCO 1553.3 MCO 1553.3_, Marine Corps Unit Training Management
(TM), establishes a Marine Corps-wide Training Management (TM) process
wherein all individual and collective training conducted by units within the
operating forces and supporting establishment shall be performance-oriented
and prioritized by the commander relative to assigned missions. Additionally,
the Marine Corps Combat Readiness Evaluation System (MCCRES) evaluation
process is identified as the training management and diagnostic tool to improve
training.


d. MCO 1553.5 MCO 1553.5_, Marine Corps Training and Education
Evaluation, establishes an evaluation policy and requirement to provide feedback
on training and education programs from all Marine Corps activities. This order is
the most comprehensive in setting forth guidelines for conducting, monitoring, and
reporting evaluation.

e. MCO 1553.6 MCO 1553.6_, Development, Management, and Acquisition
of Interactive Courseware (ICW) for Marine Corps Instruction, establishes policy,
prescribes requirements, and assigns responsibilities for the development,
management, and acquisition of ICW for Marine Corps instructional programs.

f. MCO 1553.7 MCO 1553.7_, Using the By Name Assignment (BNA)
System, provides information, guidance, and responsibilities concerning the use
of the By Name Assignment (BNA) system. BNA is the Marine Corps Class I System used
to collect training workload data, to include entrants and graduates.

g. MCRP 3-0A, Unit Training Management/MCRP 3-0B, How to
Conduct Training MCRP 3-0A and MCRP 3-0B set forth evaluation requirements
for unit training. These manuals provide guidance to plan, prepare, and evaluate
training conducted at battalion or squadron level units. These manuals help
evaluators determine if unit training produces technically and tactically proficient
Marines capable of accomplishing their assigned missions.

2. Formal School/Training Center Evaluation Requirement Evaluation is a
continuous process whereby information is gathered to assess the value, worth, or
merit of a program. A formal school/detachment may conduct an evaluation any
time it is deemed necessary to verify the effectiveness of an instructional program,
identify instructional deficiencies, or determine the most efficient allocation of
instructional resources.

5602. PREPARE AN EVALUATION PLAN

After an evaluation requirement has been identified, a plan for conducting the
evaluation is developed to ensure that no important steps in the process are
overlooked. This section presents the critical elements of an evaluation plan,
including supporting data, sources of data, sampling, an evaluation schedule, and
data collection, analysis, and interpretation.

ELEMENTS OF AN EVALUATION PLAN

Whether the evaluation will be formative or summative, the planning topics
discussed below will help ensure effectiveness. The evaluator must be prepared to
modify the plan as required during the conduct of the evaluation if new issues are
identified or events mandate revision of the plan. Any changes to the plan should
be carefully documented. A sample evaluation plan is provided in APPENDIX H.


1. Data Required to Support the Evaluation This element of the
evaluation plan is a clear and detailed statement of the data required to support
the evaluation. For example, if the evaluation focuses on student mastery of
learning objectives, student performance (test) data must be collected. If the
focus concerns whether course graduates meet the needs of using commands,
graduate on-the-job performance data are required. Throughout the planning
process and during data collection, the evaluator should review this portion of
the plan to ensure the appropriate data are collected to support conclusions and
recommendations concerning the revision, maintenance, or termination of an
instructional program.

2. Sources of Data As part of the evaluation plan, the evaluator must
determine who will provide the data and what sources of information will be
used. Sources include existing data, instructors and other school personnel,
students, graduates, SMEs, and/or using commands.

a. Existing data include all task and course materials (e.g., ITS, T&R
Manual, POI, lesson plans), documentation from higher headquarters that may
change the course requirements, and previous evaluation data (e.g., CCRB or
SME Conference reports, test data).

b. Data from individuals include student performance data (test results),
instructor performance data, and graduate performance data.

3. Sampling This element of the evaluation plan should identify, when
applicable, the sampling procedure including sample size and sampling
technique to be used. Sampling is discussed later in this section.
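As a sketch of one common technique, simple random sampling can be performed with Python's standard random module. The roster and sample size below are hypothetical:

```python
import random

# Hypothetical roster of 125 course graduates
roster = [f"graduate_{i:03d}" for i in range(1, 126)]

# Draw a simple random sample of 30 for a follow-up questionnaire;
# a fixed seed makes the draw repeatable for documentation purposes
random.seed(1)
sample = random.sample(roster, k=30)
print(len(sample), sample[:3])
```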

4. Evaluation Schedule The evaluation plan should indicate when the
evaluation will take place. In addition, the evaluation plan should include a
schedule for each evaluation task or event. The schedule should be developed
to ensure the evaluation is conducted when the most reliable data can be
collected.

a. Timely Evaluation An evaluation should be planned to ensure timely
collection of data. For example, if the evaluation focuses on graduate job
performance, the graduates should have been on the job for at least 30 days,
but less than three months to ensure valid data can be collected. Graduates
new on the job may not have had the opportunity to perform certain tasks; and
if they have been on the job longer than three months, they may have trouble
separating what they learned in school from what they learned on the job. As
an additional example, if the evaluation is being conducted to determine the
consistency of instructional results, the instructional program must have been in
place through several iterations. This will ensure the data collected will provide
a comprehensive basis for decision making about an instructional program.
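The 30-day to three-month window described above can be applied mechanically when selecting graduates to survey. The names and dates in this sketch are hypothetical:

```python
from datetime import date

# Hypothetical survey date and graduate course-completion dates
today = date(2004, 6, 1)
graduates = {
    "Smith": date(2004, 5, 20),   # about 12 days on the job: too new
    "Jones": date(2004, 4, 15),   # about 47 days: within the window
    "Lee":   date(2004, 1, 10),   # well past 90 days: too long
}

# Survey only graduates on the job between 30 and 90 days
eligible = [name for name, grad_date in graduates.items()
            if 30 <= (today - grad_date).days <= 90]
print(eligible)
```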


b. Schedule of Evaluation Events The evaluator must determine the time
required to complete each evaluation activity (e.g., instrument design, data
collection, data analysis) and prepare a schedule so each activity is conducted at
the proper time. This involves factoring in the personnel and resources available
for the evaluation. The use of a milestone chart to schedule the beginning and
ending date for each event is recommended. This schedule should include the
time allocated for the analysis, interpretation, and reporting of results. The
evaluator should keep in mind that changes to the evaluation plan must be
reflected in the schedule and may affect one or more evaluation activities.
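A milestone schedule of the kind recommended above can be sketched by chaining activity durations, so each activity begins when the previous one ends. The activities and durations below are hypothetical:

```python
from datetime import date, timedelta

# Hypothetical evaluation activities and their durations in days
start = date(2004, 7, 1)
activities = [("Instrument design", 10), ("Data collection", 30),
              ("Data analysis", 14), ("Report results", 7)]

# Build a milestone list of (activity, begin date, end date)
schedule = []
begin = start
for name, days in activities:
    end = begin + timedelta(days=days)
    schedule.append((name, begin, end))
    begin = end          # next activity starts when this one ends
print(schedule[-1])
```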

5. Methods of Data Collection This part of the evaluation plan should specify
how the data will be collected, what personnel will collect the data, and under
what conditions the data will be collected.

a. How Data Will Be Collected Selecting the appropriate evaluation
instruments is a critical step in planning, since the instrument controls the type
and validity of data collected. These instruments are discussed in section 5200.
Designing evaluation instruments is discussed later in section 5603.

b. Who Will Conduct the Evaluation The goal of the evaluation should
always be considered when determining whether an internal evaluator (one who
works within the formal school/detachment) or an external evaluator (one not
associated with the formal school/detachment) will collect evaluation data. Due to
time and budgetary constraints, most evaluations will be conducted by internal
evaluators. However, if resources are available to permit a choice, the following
should be considered when determining whether to use an internal or external
evaluator. Figure 5-37 asks questions that will help decide who conducts the
evaluation.

1) Internal Evaluator An internal evaluator is familiar with the
instructional program and is able to provide immediate feedback during
the evaluation. He/she is generally able to devote more time to an
evaluation and at less cost than an external evaluator because he/she
will not incur any Temporary Additional Duty (TAD) costs. However, an
internal evaluator may lack the experience or expertise required to
conduct the evaluation. In addition, the evaluator may be biased,
especially if the evaluator has personal involvement in the instructional
program.


2) External Evaluator An external evaluator is more likely to be
impartial because he/she has no vested interest in the program's
success or failure. His/her findings may be viewed as more credible,
especially if the program is controversial and evaluation findings are
to be used in settling a dispute. In addition, personnel associated
with an instructional program are often more willing to reveal
sensitive information to an external evaluator (since an internal
evaluator may inadvertently breach their confidentiality). On the
other hand, an external evaluator may be unfamiliar with the
instructional program, requiring him/her to devote time to learn about
it, and he/she may not have the ability to identify subtle issues or
concerns related to the instructional program. If possible, an
organization should use an external evaluator when the answer to any
of the following questions is no.

Internal vs. External

 Are technically qualified internal evaluators available to effectively and
competently evaluate the program?

 Can internal evaluators be fully committed to the evaluation? That is,
are they without additional duty responsibilities that would hinder the
evaluation effort?

 Will there be sufficient internal evaluators to sustain an evaluation?

 Will the internal evaluator have the credibility to perform the evaluation
objectively?

Figure 5-37. Internal and External Evaluators.

c. Plan for Briefing/Training Data Collectors Once personnel
requirements have been identified, a plan for briefing and training data
collectors should be developed.

1) Personnel Brief The brief should include the intent of the
evaluation, the role of the data collectors, when and how they will
collect the data, how to monitor the process, and how to ensure that
the data collected are complete.

2) Personnel Training A relatively simple orientation for data
collection personnel is all that will be needed for most evaluation
instruments. However, if interview or observation instruments are to
be used, personnel may need training sessions on their use, including
written instructions, job aids, and/or practice. Procedures (including
timetables) for this training should be included in the evaluation plan.

d. Conditions Under Which Data Will Be Collected The plan should also
specify the appropriate conditions for data collection. For example, will students
be observed during a class? Will they be tested in groups or individually? Will
graduate performance on the job be assessed? Will evaluation instruments be
mailed, emailed, or administered in person? Planning the data collection effort
ensures that valid data can be collected under the conditions specified.

e. Data Collection Arrangements The evaluation plan should also specify
the administrative requirements to support data collection. Depending on the
evaluation to be conducted, these requirements may include contacting school or
command personnel to schedule visits, making travel reservations, ensuring that
evaluation instruments are duplicated and mailed on schedule (if not carried by the
evaluator), etc.

6. Method for Data Analysis and Interpretation The evaluation plan should
specify the method for data analysis and interpretation. This includes formatting,
coding, organizing, storing, and retrieving the data along with the statistical
techniques used to analyze the raw data and methods for interpreting results.
Refer to Section 5302 for information on the analysis and interpretation of
evaluation data.

7. Method for Reporting The evaluation plan should specify the method for
making recommendations and reporting evaluation results.

5603. SAMPLING

It is not always feasible to survey or test every member of a specific population,
e.g., every Marine in the Marine Corps. Therefore, a sample representative of the
population is selected for evaluation. When selecting a sample, the larger the
sample, the more precise the estimate of the characteristic in the population.
Sampling techniques are particularly common when conducting surveys or
interviews; they are not used for individual performance testing in school or on
the job, where it is important to test everyone. Often the target population (the
people or events that are of interest) is too large to survey practically, so an
evaluator focuses instead on a subset of the population known as a sample.

1. Sampling Techniques When a sample is selected, it is important that the
sample be unbiased or truly representative of the whole population to provide the
highest degree of reliability and validity with respect to making conclusions and
recommendations regarding an instructional program. There are two basic ways
to achieve a representative sample: simple random sampling and stratified
random sampling.

a. Simple Random Sample A simple random sample is one in which every
member of the population has an equal chance of being selected for the sample
and the selection of any one member of the population does not influence the
chances of any other member being selected.


b. Stratified Random Sample A stratified random sample involves
dividing the population into two, three, or more strata [e.g., rank, military
occupational specialty (MOS)] and then randomly sampling from each stratum.
"Strata" refers to subpopulations. This method of sampling allows the evaluator
to generalize results to the population as a whole, particularly if the population is
not homogeneous. A stratified random sampling procedure ensures that segments
of the population having a low frequency of occurrence (e.g., female Marines) are
represented in the sample.
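The two sampling techniques can be sketched with Python's standard library. The population, strata, and sample sizes below are hypothetical:

```python
import random

random.seed(0)  # fixed seed so the sketch is repeatable

# Hypothetical population of 200 graduates tagged by a stratum (here, rank).
population = [{"id": i, "rank": "Sgt" if i % 4 else "Cpl"} for i in range(200)]

# Simple random sample: every member has an equal chance of selection.
simple = random.sample(population, k=20)

# Stratified random sample: divide the population into strata, then draw
# randomly from each stratum so low-frequency groups are still represented.
strata = {}
for member in population:
    strata.setdefault(member["rank"], []).append(member)
stratified = {rank: random.sample(group, k=min(10, len(group)))
              for rank, group in strata.items()}

print(len(simple))         # 20
print(sorted(stratified))  # ['Cpl', 'Sgt']
```

Note that the stratified draw guarantees members of each stratum appear in the sample, which a single simple random draw cannot promise for small subpopulations.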

2. Process For Selecting A Sample Size The selection of a sample size
need not be a subjective process. In the absence of any other method, evaluators
can rely on past experience to select a sample size; however, there is a
standardized method that can be used to determine an appropriate sample size.
To calculate sample size, an expected response rate and confidence level must be
identified. The expected response rate is the proportion of responses expected
from the population being sampled. For example, if a survey is sent to 100
Marines and 30 Marines are expected to return it, the expected response rate is
30%. The confidence level corresponds to the degree of assurance that a given
value occurs other than by chance. The most commonly used confidence levels
are 95% and 99%: a 95% confidence level means that the likelihood of a value
occurring by chance is 5 in 100, and a 99% confidence level corresponds to a
likelihood of a chance occurrence of 1 in 100.

a. Determining Sample Size for a Random Sample APPENDIX I
provides a sampling table and formula for determining sample size. For example,
for a population of 4,200 course graduates, an estimated (desired) return rate of
85%, and a confidence level of 95%, sample size would be determined using the
following procedure:

1) Using APPENDIX I, locate the number corresponding to the
population size. Since 4,200 is not provided in the table,
round the number up or down to the nearest value. For
example, the population value of 4,200 would be rounded
down to 4,000.

2) Locate the value corresponding to the 95% confidence level
with a population size of 4,000. Using APPENDIX I, this value
is 364 (meaning that 364 questionnaires are required). This
figure should be 85% of the questionnaires mailed out.

3) To determine the number of questionnaires that need to be
mailed out to obtain 364 usable questionnaires, substitute
the values in the formula provided in APPENDIX I. Using our
example, for a population of 4,200 and an expected return
rate of 85%, the desired sample size would be 364.
Therefore, to obtain 364 responses at an 85% return rate,
428 questionnaires need to be mailed out.
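The arithmetic in step 3 can be checked directly. The table value of 364 required responses is taken as given from APPENDIX I (not reproduced here); the mail-out count is the desired number of responses divided by the expected return rate:

```python
# Worked example from the text: population of 4,200, 95% confidence
# level, an APPENDIX I table value of 364 required responses (taken as
# given), and an expected return rate of 85%.
required_responses = 364
response_rate = 0.85

# Number of questionnaires to mail so that an 85% return yields 364
# responses. Simple rounding matches the figure of 428 in the text;
# rounding up (math.ceil) would be the more conservative choice.
mailed = round(required_responses / response_rate)
print(mailed)  # 428

# Sanity check: 85% of the mail-out covers the required responses.
print(round(mailed * response_rate))  # 364
```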


b. Determining Sample Size for a Stratified Sample If an evaluator
wishes to divide a population into several strata (such as rank or MOS) and select
sample sizes based on these strata, sample size is determined in the same way
described above. In a stratified sample, population size corresponds to the
number of individuals within each stratum. For example, given a graduating class
of 200 students in which 160 are male and 40 are female, two sample sizes would
be calculated, one for a population size of 160 and another for a population size of
40.

5604. DESIGN EVALUATION INSTRUMENTS

The evaluation instrument is the tool that elicits information to accurately assess
the effectiveness and efficiency of an instructional program. An evaluation
instrument controls the nature and type of information collected and the reliability
and validity of that information. This section provides additional guidance on the
design of evaluation instruments such as survey questionnaires and interviews,
and the use of evaluation checklists. Particular emphasis is placed on guidelines
and considerations for developing and using standardized evaluation instruments,
stressing the importance of clarity, consistency, and brevity in their design.

1. Survey Questionnaires A survey questionnaire must be well-organized
and easy to read to be an effective data collection tool. When selecting or
designing survey questionnaires, the following guidelines should be followed:

a. Format Format is important in gaining the cooperation of respondents,
analyzing the data, and interpreting the results. Design the layout or structure of
a questionnaire so that it is attractive and uncluttered, permitting the respondent
to readily determine what types of questions are being asked and how to record
responses. A respondent should be able to complete the questionnaire within a
short period; respondents will often put aside and fail to complete a questionnaire
that requires more than 20 minutes of their time.

b. Instructions To ensure that the questionnaire is completed properly,
clear, concise instructions should be included at the beginning of the
questionnaire. These should include a brief explanation of the purpose of the
questionnaire, how it is organized, and how responses should be recorded. If the
questionnaire is mailed or distributed for later return by respondents, instructions
for its return should be provided and a metered return envelope should be
included.

c. Questionnaire Items Questions should be grouped by topic or subject
and presented in a logical format. For example, in a questionnaire administered to
graduates of a Basic Rifleman course covering both the M16A2 Service Rifle and
the M203 Grenade Launcher, all questions pertaining to the Service Rifle should be
grouped together and all questions pertaining to the Grenade Launcher should be
grouped together.


d. Response Format When possible, the method for responding to
questionnaire items should be consistent to avoid confusion and facilitate the
recording of accurate responses. If a variety of answer formats must be used,
group items with the same answer format together. Survey questionnaires
involve self-reporting by respondents and, therefore, provide qualitative data.
For those data to be scored for later analysis and interpretation, they must be
quantified. The response format of the questionnaire controls the way the data
are gathered, how they can be quantified, and the ease or difficulty of their
quantification. Response formats include open-ended and fixed alternative (or
closed) questions. The fixed alternative format, which includes nominal, ordinal,
and interval scale responses, provides data that are more easily quantified for
later scoring and analysis. Open-ended responses may also be quantified for
data analysis, although it is a much more time-consuming process. Figure 5-39
provides examples of questionnaire response formats.

1) Open-ended An open-ended question has no pre-determined
response category. It allows the respondent to answer the question in
his/her own words without restricting the kind of answer he/she can
give. Data collected using open-ended questions can be quantified by
categorizing the responses and assigning a number to each category.
Open-ended questions in survey questionnaires or interviews allow
respondents to provide additional comments, descriptions, and rationale
or explanation for their answers. They are useful for collecting
information pertaining to perceived effectiveness of a particular course
of instruction. Unlike rating scales and checklists, information gathered
from open-ended questions can be difficult to collate, analyze, and
quantify because scores or ratings are not assigned to responses.
However, an answer key can be made to allow open-ended (e.g.,
essay) questions to be scored for partial and full credit through the
assignment of point values. Refer to Section 5302 for information on
quantifying data.
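Assigning numbers to response categories and tallying them can be sketched as follows; the categories and responses are hypothetical:

```python
from collections import Counter

# Hypothetical open-ended responses already sorted into categories by an
# evaluator; quantification assigns a number to each category and tallies.
categorized = ["maintenance", "supply", "maintenance", "leadership",
               "maintenance", "supply"]

category_codes = {cat: code for code, cat in
                  enumerate(sorted(set(categorized)), start=1)}
counts = Counter(categorized)

print(category_codes)         # {'leadership': 1, 'maintenance': 2, 'supply': 3}
print(counts.most_common(1))  # [('maintenance', 3)]
```

The time-consuming part in practice is the evaluator's judgment in sorting free-text answers into categories; the coding and counting afterward are mechanical.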

2) Nominal Scale A nominal scale response format is used primarily to
elicit information that falls within a single measurement dimension in
which responses can be easily categorized such as sex (e.g., male,
female) or rank (e.g., corporal, sergeant, captain). This type of scale is
particularly appropriate for gathering demographic information.

3) Ordinal Scale A Likert rating scale is an example of an ordinal scale
response format and is most commonly used to measure respondents'
attitudes, preferences, or feelings about a topic. A Likert rating may
involve a 1-3, 1-4, 1-5, 1-6, or 1-7 scale. Level of agreement, level of
preparedness, and level of ability are a few examples of what the scale
can measure. Each statement requires only one judgment and carefully
avoids ambiguity in expression or interpretation. Figure 5-38 provides
more information on the Likert rating scale.


Likert Rating Scale

 Method of recording responses to a question.

 Scale that represents a spectrum of responses (e.g., behavioral
ratings, frequency ratings, attitudinal ratings) concerning a certain
topic.

 Respondents check the response that corresponds to the intensity of
their judgment of the topic.

 Ideal for obtaining varying judgments or scores on a topic by using a
number of statements on the same subject and giving an intensity
value for each.

Figure 5-38. Likert Rating Scale.

QUESTIONNAIRE RESPONSE FORMATS

Open-Ended
1. What do you feel is the most important information you received while attending the Supply Officer Course?
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
Nominal Scale
2. Which of these qualities do you feel is the most important for an instructor to possess? (Circle the
appropriate number below.)
1. In-depth knowledge of subject matter
2. Professionalism
3. Sincerity
Ordinal Scale
3. The Supply Officer School's minimum rank requirement for attendance is Major. Which of the following
expresses your opinion concerning this statement? (Circle the appropriate number below.)
1. Strongly disagree
2. Disagree
3. Agree
4. Strongly agree
Interval Scale
4. How many personnel are assigned to your unit? (Circle the appropriate number below.)
1. 25 or fewer
2. 26-50
3. 51-75
4. 76-100
5. Over 100

Figure 5-39. Questionnaire Response Formats.
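Fixed alternative responses such as the ordinal (Likert) item above quantify directly for scoring and analysis. A minimal sketch, using hypothetical responses on a 1 (Strongly disagree) to 4 (Strongly agree) scale:

```python
# Hypothetical responses to an ordinal questionnaire item, recorded on a
# 1 (Strongly disagree) to 4 (Strongly agree) scale.
responses = [4, 3, 3, 2, 4, 3, 1, 4]

# Because fixed alternative answers are already numeric, summary
# statistics fall out directly.
mean_rating = sum(responses) / len(responses)
agree_or_better = sum(1 for r in responses if r >= 3) / len(responses)

print(mean_rating)      # 3.0
print(agree_or_better)  # 0.75
```

Open-ended answers require the extra categorization step described earlier before any such statistics can be computed.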


4) Interval Scale An interval scale response format elicits information that
is quantifiable in terms of absolute or continuous values such as age,
years of service, time in billet, etc. This type of question can be designed
to require the respondent to either write in his response or select a
particular interval in which a value falls.

5) Development of Questionnaire Items Questionnaire items should be
short, direct, and written at a reading level appropriate to the respondent
population. The evaluator should adhere to the following guidelines
when developing questionnaire items. Figure 5-40 provides examples of
good and poor questions.

GUIDELINES FOR WRITING QUESTIONNAIRE ITEMS

1. Avoid the use of negatives.

POOR: The instructor was not available to answer my questions. (Yes/No)
GOOD: The instructor was available to answer my questions. (Yes/No)

2. Use short, common words; avoid jargon.

POOR: Does the AIR include IRFs?
GOOD: Does the After Instruction Report (AIR) include Instructional Rating
Forms (IRF)?

3. Do not combine two issues in one questionnaire item.

POOR: Was the instructor knowledgeable and effective?
GOOD: Was the instructor knowledgeable?
Was the instructor effective?

4. Avoid leading questions.

POOR: Do you feel the school needs to lengthen the course to better equip the
graduates?
GOOD: Are there changes the school can make to the course to better equip
the graduates?

5. Ensure the question can be answered by the respondent.

POOR: Was your knowledge comparable to the previous students' knowledge
when you entered the class?
GOOD: Do you feel you had the prerequisite knowledge and skills to succeed in
this course?

6. Avoid the use of emotionally-tinged words and embarrassing questions.

POOR: Did you have difficulty understanding the materials?
GOOD: Were the materials presented in a manner easy to understand?

Figure 5-40. Guidelines for Writing Questionnaire Items.


6) Distribution In addition to well-written questions, valid results from
survey questionnaires depend on the selection of respondents. A
representative sampling is essential. Variations in job requirements occur
because of command, geographic location, organizational level, etc.
Therefore, the sample should include respondents assigned to each using
command location in the population. Section 5603 provides detailed
information on sampling.

a) When to Send Questionnaires Proper timing is important when
sending questionnaires. For example, questionnaires should be in
graduates' hands one to three months after graduation and
assignment to the using command. Beyond three months, it may be
difficult to determine whether the graduate learned a skill from the
instructional program or on the job. If the questionnaire is distributed
too soon after course completion, the graduate may not have had time
or occasion to perform all of the tasks taught. However, the optimum
time for questionnaire distribution is also dependent on the complexity
of the job/tasks the instruction covered.

b) Follow-up Follow-up can ensure the return of a sufficient number of
completed questionnaires to support valid and reliable data analysis.
Procedures for appropriate follow-up should be included in the
evaluation plan. These would include the timing of the follow-up, a
method for identifying non-respondents, and the method of follow-up
(e.g., phone, mail). When the date for follow-up arrives, reminder
calls or notices to non-respondents should be made to encourage their
completion of the questionnaire. It is also a good practice to thank
respondents for their participation. Sending a simple thank-you form
requires little time but can be very worthwhile in motivating
respondents to cooperate in future surveys.
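Identifying non-respondents for follow-up amounts to a set difference between the mail-out roster and the returns received. A minimal sketch with hypothetical rosters:

```python
# Hypothetical mail-out roster and the returns received by the
# follow-up date.
sent_to = {"Cpl Adams", "Sgt Baker", "Cpl Chavez", "Sgt Davis"}
returned = {"Sgt Baker", "Cpl Chavez"}

# Non-respondents are everyone surveyed who has not yet returned the
# questionnaire; they receive the reminder call or notice.
non_respondents = sorted(sent_to - returned)
print(non_respondents)  # ['Cpl Adams', 'Sgt Davis']
```

This presumes the evaluation plan's method for identifying non-respondents (e.g., numbered questionnaires); a fully anonymous survey cannot target reminders this way.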

2. Interviews Although interviews may be structured or unstructured, the
collection of reliable data for evaluation purposes is best obtained from structured
interviews. The following are guidelines that can be used when conducting
interviews. The advantages and disadvantages of interviews are listed in Figures
5-42 and 5-43.

a. Introductory Statement The interview should always begin with an
introductory statement that outlines the purpose and structure of the interview.
The purpose should be explained in terms the respondent can understand and
should identify what types of questions will be asked. The introductory statement
should also provide a clear transition to the interview itself.

b. Conducting the Interview The goal of the interviewer is to maximize
the flow of information from the respondent.


Conducting the Interview

 Keep the language pitched to the level of the respondent. Do not use
technical terms or acronyms unless the respondent is familiar with
them.

 Choose words that have the same meaning for everyone.

 Do not assume the respondent has factual or firsthand information.

 Establish the frame of reference for the questions being asked. For
example, to narrow a respondent's comment on the effectiveness of
testing, the interviewer may ask the respondent to focus on
performance testing during the last three weeks of a course.

 If asked, either suggest all possible responses to a question or do not
suggest any.

 If unpleasant questions must be asked, give the respondent a chance
to express his positive feelings first by structuring the interview so the
unpleasant questions are asked last.

 Speak clearly and slowly and listen to the respondent's answer before
recording the response.

 Include a closing statement to let the respondent know the interview is
concluded.

Figure 5-41. Conducting the Interview.

c. Types of Interview Questions The type of interview questions
developed should be based on the objective of the interview.

1) Open-ended Questions A question that asks for narrative responses
and allows respondents to respond in their own words is an open-ended
question. Open-ended questions are used when a discrete
answer is not desired or possible (i.e., there is no yes/no or categorical
response possible). These questions often rely on the respondent's
opinion and judgment rather than the respondent's knowledge of
information or facts.

2) Probing or Clarifying Questions Ask probing or follow-up questions
to confirm a respondent's answer or to clarify what the respondent has
said. The respondent's statements should be included in the probe to
provide a point of reference and elicit elaboration or clarification of a
topic.

3) Closed Questions A question that limits respondents' answers to
predetermined response categories is a closed-ended question.
Multiple choice and yes/no questions are examples of closed-ended
questions. Closed questions employ a nominal, ordinal, or interval
scale response format. Closed questions are used to elicit information
that is easily categorized or to elicit specific factual information such as
rank, age, etc. Closed questions restrict the range of responses
received.


d. Recording Responses For open-ended questions or questions in which
probing or clarifying responses have been provided, the interviewer should:

1) Record responses using the exact words and phrases used by the
respondent.

2) Use key words or phrases to further clarify a response or as a reminder
of what was said.

Advantages of Interview

 If the questions are few and easy to answer, the interview method results
in a higher percentage of responses and, therefore, better sample results
than a survey questionnaire.

 The interview method ensures that the targeted audience answers the
questions. The individuals required to answer the questions can be
pre-selected, ensuring the evaluation information is obtained.

 An interviewer can judge the sincerity of the respondent as he gives his
answers.

 An interview can be conducted simultaneously with observation of
performance. Observation of performance adds merit to the interview
information obtained.

Figure 5-42. Advantages of Interview.

Disadvantages of Interview

 Face-to-face interviews can be expensive and time consuming based on
the time required to conduct the interview and the location of the
interview.

 Interviews do not allow respondents to remain anonymous, which can
affect their responses.

 Interviews preclude the respondent from returning to a question at a
later date.

 If a respondent cannot be present during the scheduled time, it can be
difficult to reschedule the interview.

 An interviewer can introduce bias into the study by suggesting a
possible answer to a question when the respondent has difficulty giving
one. This produces questionable evaluation results.

Figure 5-43. Disadvantages of Interview.


3. Evaluation Checklists Checklists are typically used when the
evaluation consists of a review of documentation, course materials, etc., or an
observation of performance. Checklists that are used as evaluation instruments
are not simply lists of items that can be "checked off" as they are identified or
located. These checklists consist of carefully worded questions that the
evaluator answers by his review of course materials or observation of course
components (e.g., graduate or instructor performance, conduct of a class). If
existing materials will be reviewed as part of the evaluation, data are collected
via checklists as the evaluator reviews the applicable documents. To perform an
evaluation of an instructional program, two or more of these checklists may be
used, as required. Checklists can be used to conduct both formative and
summative evaluations of an instructional program.

a. Use of Checklists During Formative Evaluation During instructional
program development, checklists can be used to ensure instructional
development is proceeding according to plan. Checklists are also used to assess
and validate instructional materials. The use of checklists helps the evaluator
ensure that the materials being developed (e.g., learning objectives, test items,
lesson plans, student materials, instructional setting, media) will result in an
effective and efficient course of instruction. Using evaluation checklists as a
systematic method for validating instruction ensures:

1) The instruction does not contain unnecessary information, maximizes
the use of instructional time and media, follows the SAT process, and
prepares graduates to perform their job tasks to the specified standard.

2) An audit trail is created that enables evaluators to track each
component of the instructional program to the ITS or T&R event it
supports and to document the SAT methodology followed. To create
an audit trail, a progress or process method can be used.

a) Progress Method This method is used to keep management
informed of the progress of the course development effort. In
consultation with the course manager(s), the evaluator should
identify what information the manager needs to make effective
decisions concerning the course and how frequently it is needed. A
recommended approach is to report on the completion of key
checkpoints in the course development (See Figure 5-44 for a
portion of a sample project schedule). Often, managers need only
to know that an activity was completed on time. If deviations
occur, they should be explained and discussions held to produce an
acceptable solution. When the development effort is complete, the
project schedule will provide one form of an audit trail that can
later be reviewed when evaluating an instructional program.


                             Estimated     Actual
Activity                     Completion    Completion    Note

Develop Course Schedule      20 Sep        20 Sep

Develop Lesson Plan          20 Dec        20 Jan        (1)

Develop Student Guide        20 Jan        20 Feb        (2)

Develop Media                30 Jun        30 Jun

Notes:
(1) Delay of travel funds caused site visit to be postponed.
(2) Development was dependent on completed lesson plan.

Figure 5-44. Sample Project Schedule.

b) Process Method This method uses a checklist to describe and
document the actual development process of a specific course. A
recommended approach is to list every major activity of the
course development process. Changes to the standard SAT
procedures as well as steps or processes not completed should
be documented. Figure 5-45 illustrates a sample process
checklist, although any suitable form can be used. The important
information to be captured is the explanation of any deviations so
that future managers will know what was done during course
development.

                                  Completed
Development Activity              YES   NO     Explanation

Develop Course Schedule            X
Review Source Documents            X
Determine Course Structure         X
Organize TLO's and ELO's           X
Assign Lesson Titles               X           Used existing titles
Assign Lesson Designators          X
Estimate Instructional Hours       X
Organize Information               X

Figure 5-45. Sample Process Checklist.
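A process checklist of this kind can also be kept as structured data so that deviations are easy to query when the audit trail is reviewed later. The entries below are a hypothetical subset:

```python
# Process checklist entries in the style of Figure 5-45: each development
# activity with its completion status and an explanation for any
# deviation from standard SAT procedures.
process_log = [
    {"activity": "Develop Course Schedule", "completed": True,
     "explanation": ""},
    {"activity": "Assign Lesson Titles", "completed": True,
     "explanation": "Used existing titles"},
    {"activity": "Organize Information", "completed": True,
     "explanation": ""},
]

# The audit trail of interest is every deviation from standard procedure.
deviations = [entry for entry in process_log if entry["explanation"]]
print([entry["activity"] for entry in deviations])  # ['Assign Lesson Titles']
```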

b. Use of Checklists During Summative Evaluation During a summative
evaluation, checklists provide the evaluator with a systematic method for
examining an instructional program to ensure it prepares graduates to perform
their job tasks to the specified standard. Checklists can be used to evaluate the
following:


1) Student Performance A pass-fail checklist is commonly used in
performance tests where students are rated on mastery of learning
objectives. A typical approach to this type of checklist is to list the
learning objective behaviors (although it can also be detailed enough to
list performance steps) on one half of the page. On the other half,
present the checklist in two columns, one to be checked if the student
successfully accomplishes the learning objective (Pass) and one column
to be checked if the student does not accomplish the learning objective
(Fail). This checklist is easy for an instructor to complete while observing
student performance during a performance test. If an evaluation includes
visits to using commands to evaluate graduate on-the-job performance, a
very similar checklist may be used. Changes to the checklist may be
required to account for differences between the instructional environment
and that of the "real world."
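A pass-fail checklist can be represented the same way: a student masters the performance test only when every learning objective is marked Pass. The learning objectives below are hypothetical:

```python
# Hypothetical pass-fail checklist for one student: learning objective
# behaviors on one side, a Pass/Fail mark on the other.
checklist = {
    "Disassemble the service rifle": "Pass",
    "Perform a function check": "Pass",
    "Correct a stoppage": "Fail",
}

# Mastery requires every learning objective to be marked Pass; failed
# objectives identify where remediation is needed.
mastered = all(result == "Pass" for result in checklist.values())
failed_objectives = [lo for lo, result in checklist.items()
                     if result == "Fail"]

print(mastered)           # False
print(failed_objectives)  # ['Correct a stoppage']
```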

2) Instructor Performance Instructors are commonly evaluated and
rated by students through Instructional Rating Forms (IRF) and Course
Critique questionnaires. An evaluator can use a checklist during
observation of a class to record data on the instructor's ability to
effectively present the materials in the lesson plan (See APPENDIX E for a
sample checklist). The checklist can also be used to assess the
instructor's qualifications.

3) Course Materials Course materials (e.g., lesson plans, student
materials, media, test items) should be reviewed and updated regularly.
The evaluator should ensure that current materials are being used as
planned and in accordance with an approved POI (see APPENDIX C for
the Master Lesson File checklists). In addition, a review of course
materials should include course control documents including the POI,
record of proceedings (ROP), etc. Course control documents provide an
administrative check of how the course is being implemented in support
of the ITS or T&R.

4) Instructional Environment and Instructional Equipment An
evaluator can use checklists in determining whether existing instructional
facilities are meeting the requirements of the instructional program (see
APPENDIX E for a sample checklist). The evaluator should first review
the course requirements for instructional equipment and facilities.
Evaluation of the instructional environment should include appearance
and cleanliness, condition, adequacy of space, and environmental factors
(e.g., noise, lighting, distractions). The condition, operation, and
appropriateness of instructional equipment should also be evaluated. A
preventive maintenance plan should be followed to ensure training
devices, simulators and computer equipment remain operable.

Chapter 5 5-83
Systems Approach To Training Manual Adult Learning

ADULT LEARNING

In Chapter 6:

6000 INTRODUCTION                           6-1

6100 PEDAGOGY TO ANDRAGOGY                  6-2
     The Marine Corps Student               6-2

6200 CHARACTERISTICS OF THE ADULT LEARNER   6-3
     Purpose                                6-3

6300 LEARNING STYLES                        6-6
     Purpose                                6-6
     Instructional Preference Model         6-6
     Accommodating Learning Styles          6-8
     Cone of Learning                       6-8

6400 HOW ADULTS LEARN                       6-9
     Purpose                                6-9

6500 DOMAINS OF LEARNING                    6-10
     Purpose                                6-10
     Cognitive Domain                       6-11
     Affective Domain                       6-12
     Psychomotor Domain                     6-13
     Using Domains of Learning              6-14

6600 GROUP DYNAMICS                         6-17
     Purpose                                6-17

6700 MOTIVATION                             6-19
     Purpose                                6-19

6800 CONSTRUCTIVIST LEARNING ENVIRONMENTS   6-20
     Purpose                                6-20


SECTION 6000. INTRODUCTION

The literature on adult education generally supports the idea that adults should be taught
differently than children and adolescents. However, until late in the 20th century, all students
were treated alike. The instructor was at the center of instruction and considered to have all
the answers, while students were merely the passive receptors of what the instructor
delivered. Little thought was given to the experiences and knowledge that students brought
to the learning environment, especially what adult learners had to offer. Fortunately, the
past 50 years have seen rapid growth in adult learning theory and the adoption of its principles
in the military training environment. The U.S. military trains more adults than any other
institution, and increasingly we are incorporating ideas to improve training. How can Marines
assist in their own learning? What motivates adults to want to learn? Is there a shared
responsibility between the instructor and the student in learning? These questions and many
more will be addressed in this chapter. It will introduce and discuss pedagogy and
andragogy, the Marine Corps student, learning styles, how adults learn, the
domains of learning, motivation techniques, and group dynamics. The chapter will culminate
with a discussion on the application of these principles to the SAT process.

Figure 6-1. Progression from entry-level through career-level training.


SECTION 6100. PEDAGOGY TO ANDRAGOGY

Formal educational institutions in modern society were initially established
exclusively for the education of children. As such, the theory that dominated
education was that of pedagogy, the art and science of teaching children. The
growth through the centuries of centers for higher education was not accompanied
by the development of any new theories on how education in these institutions
should be addressed. The theories that had been applied for centuries to the
education of children were simply applied to anyone pursuing education or
training, regardless of age or experience. Early in the 1960s, however, adult
educators in Europe began to focus their research on the education of adults.
These educators, led by Malcolm Knowles, adopted the term andragogy, the art
and science of teaching adults, to describe their emerging theories about adult
learners. Adult learning theory, andragogy, provides some basic assumptions that
should be considered when preparing to teach or train adults.

6101. THE MARINE CORPS STUDENT

The nature of our organization is such that everyone we train is an adult learner.
However, not all adult learners or learning environments are alike. Before
attempting to design, develop, or implement training for Marines, it is necessary to
know more about who Marine students are and how they learn.

One of the factors that curriculum developers and instructors consider when
creating or implementing training is establishing whom they will teach. For
example, consider how drill instruction differs for a recruit at MCRD from that of a
Sergeant going to Drill Instructor School. Both are considered adults, but the
approach to training each is completely different (see figure 6-1). Entry-level
Marines learn in a very structured, teacher-centered environment because they
lack experience or knowledge of the Marine Corps. The Marine Corps is a new
world to them. More structure must be provided for instruction to be efficient and
effective. At the same time, however, it is important to treat them like adults.
They do bring life experience into the classroom and they will exhibit some
characteristics of adult learners. Young Marines will be more motivated and more
apt to take responsibility for their learning if they are respected as adults.

By contrast, senior and career-level Marines bring a wide range of knowledge and
experience into the instructional environment. As such, more learner-centered
activities are needed to allow the students to use and build upon the knowledge
and experience they already possess. This chapter will discuss adult learning
theories in broad terms and how they apply when designing, developing and
implementing instruction for different populations of Marine Corps students.


SECTION 6200. CHARACTERISTICS OF THE ADULT LEARNER

Most adults would not want to sit through a class on the alphabet, taught by a
drill instructor who screamed at them for no apparent reason. This is because
most adults have developed a sense of self that they expect will be respected
and appreciated in a learning environment. These and many other
characteristics have been found to be somewhat universal amongst adult
learners. These characteristics should be studied and carefully considered when
designing, developing and implementing instruction for adults.

SOME CHARACTERISTICS OF ADULT LEARNERS

1. They prefer self-direction.
2. They have experience that should be used and built upon.
3. Their readiness to learn depends on their needs.
4. Their orientation to learning is life- or problem-centered.
5. They often learn best in small groups.
6. They need a supportive and challenging environment.

1. Self-Direction Adults avoid, resist, and resent situations where they are
not respected as adults. They desire to be treated by others as capable of self-
direction.

a. Adults need a learning climate that provides them with a sense of
acceptance, respect and support. Those who have a positive self-image are
likely to be better learners. Criticizing or judging adult learners can quickly shut
down the learning process. When necessary, instructors must correct the adult
learner in a supportive and respectful manner.

b. Any student's ability to learn is directly proportional to the degree of
emotional safety he or she feels. Anxiety, fear, and lack of confidence are
emotions that can negatively affect a student's ability and willingness to learn.
Well designed and delivered instruction that considers the potential for anxiety
can reduce or eliminate fears. An example is the Marine Corps Combat Water
Survival School (CWSS). Marines at CWSS are trained in stages that progress
from the shallow end of a pool to the high dive platform. Because they are able
to succeed at simple tasks before moving on to more difficult ones, fear is
minimized.

c. Students and instructors have a shared responsibility for learning. The
instructor provides the atmosphere, resources and guidance the students require
for success; the student is responsible for the learning.

d. Instructors take on the role of facilitator, mentor, or coach, providing
scaffolding and “just-in-time” assistance to guide the student in their quest to
build knowledge and gain skills. Activities that have students reflect upon their
learning and self-evaluate can be very effective for adult learners because it
gives them “ownership” of the problem and the solution.


2. Learner Experience Adults possess a large repertoire of previous learning
comprised of formal education, training, culture, and life experience. Based on this
prior learning, adult learners formulate assumptions about the world. Their
assumptions can either help or hinder the learning of new material. Learning new
concepts is more difficult for students whose assumptions differ from what is being
taught. Adults enter the learning environment with a wide range of experiences.
The older the learners, the more experience they have and the more varied the
group. People attach more meaning to what they gain from experience than what
they acquire passively, thus it is critical that instructors and curriculum designers
consider students’ experiences during the instructional process. Some
instructional techniques that can be used to capitalize on students’ experiences are
problem solving, case studies, small and large group discussions, role-playing, and
simulation exercises.

a. Effective questioning techniques (refer to Chapter 4, Section 4401) are one
way to uncover student experiences that may have bearing on a lesson. Allow
students to provide real-world examples to help anchor and solidify instruction.

b. Group and individual projects involving open-ended and/or real-world
problems can be used to allow students to apply what they have learned and to
hone their problem solving skills.

c. Exposure to multiple perspectives and experiences will challenge the
students to review their previous experiences and question their assumptions.
Learning is accomplished when the recognition of and reflection upon differing
experiences and assumptions forces students to change their view. Open
discussions and journaling can assist in this endeavor.

3. Readiness to Learn Adults are motivated to learn when they feel the
learning is relevant to their jobs or their personal lives. They need to know why
information or skills are important to them, what they can anticipate learning, and
how it will be taught. It is important to provide this information in the introduction
to the lesson. Conversely, they are not usually motivated to learn what they will
have little or no use for. However, there are times when Marines must attend
training regardless of their motivation to do so. The implication for curriculum
developers is that they must know their audience so they can choose subject
matter and appropriate delivery methods, and also effectively explain their
relevance.

4. Orientation to Learning is Life or Problem-Centered Training
must be attuned to the concerns of the students. Adults are motivated to learn to
the extent that they perceive the new knowledge or skills will help them perform
tasks or deal with problems that they confront in their daily lives. Lesson plans
should include materials that address real life concerns. Case studies, simulations,
and practical applications using realistic settings provide a problem-centered
orientation. Instructors can also demonstrate the relevance of concepts by relating
them to the experiences of their students.

5. Small Groups Research on adult learning has shown that most adults
learn best in small groups. This makes students responsible not only for their own
learning, but for the learning of the group. Students who grasp concepts faster
help those who do not, and the collective experience of the group adds to the
process of learning. Further, working in small groups forces students to hear and
consider multiple perspectives and requires them to make concessions to
accomplish the mission of the group. Small groups (fire teams, squads, etc.) are
the backbone of the organizational structure of the Marine Corps.

6. Supportive and Challenging Environment Being openly criticized
by an instructor is a sure way to stop the learning process. Instructors must
provide and maintain a learning environment that assists students in meeting
goals and objectives. Training and supervision of newly assigned instructors will
help reduce these barriers to learning. Instructors of adults must become
proficient in the use of constructive feedback and positive reinforcement.
Instructors can remove or lessen anxieties by spelling out clearly up front
expectations for participants, and setting up group norms, for example, letting
participants know that active participation is encouraged, divergent opinions are
welcomed, and that you are there to help them learn. Further, instructors must
learn how to be effective facilitators, encouraging groups to discuss their solutions
to problems and facilitating the interaction between group members, groups
themselves, and the class as a whole. Additionally, curriculum developers and
instructors must strive to create learning environments that build upon the
experience of the students and challenge them to go beyond what they know or
can do. Two approaches to creating such an environment are “without the
information given” (WIG) and “beyond the information given” (BIG). “WIG”
environments provide the students with little guidance, which forces them to
discover solutions to the given problem on their own. “BIG” environments provide
the students with a scenario and a possible solution, and they must delve deeper
and find other, better solutions. Teaching senior SNCOs how to read a map or
use a compass will not challenge them, but telling them to lead a convoy through
enemy territory – where they would have to employ previously learned skills and
problem-solve – will test their mettle and challenge their ability. Scaffolding
(providing supports and gradually taking them away as students progress),
mentoring, and coaching are other effective instructional techniques.


SECTION 6300. LEARNING STYLES

A learning style refers to an individual's preferred way of gathering, interpreting,
organizing, and thinking about information. Some students need to see the
information on a chart, screen, or paper; others may need to hear it explained or
discussed; and many need to perform tasks themselves in order to learn. There
are at least sixteen models of learning styles and 20 cognitive dimensions that
have been proposed in the literature (Boylan, 1989). This section provides a summary
explanation of learning styles and preferences. Understanding these preferences
will impact the way schools plan for and implement instruction. Figure 6-2
provides the characteristics of various learning styles and instructional tips that
apply to each.

6301. Instructional Preference Model

This model focuses on the medium by which information is presented. It
assumes that students have a preferred method for receiving information.
Because classes are diverse, and thus have a variety of preferred methods within
each, instruction that addresses all three learning styles will be the most
effective.

1. Visual Learners Visual learners tend to learn better when they see the
subject matter to be learned. They like to learn with photos, diagrams, charts,
physical objects, or demonstrations. To teach a visual learner how to swim, do a
demonstration or use a video.

2. Auditory Learners Auditory learners tend to learn best when they hear
the subject matter to be learned. To teach an auditory learner how to swim, give
verbal instructions prior to getting in the pool.

3. Kinesthetic Learners Kinesthetic learners tend to learn better by
performing the new task. Although they may benefit from other methods, they
learn best when they perform a task. When teaching a kinesthetic learner how to
swim, a lecture is less useful than a practical application session.


LEARNING STYLES

Visual
  Characteristics:
  - Needs to see it
  - Strong sense of color
  - Trouble following lectures
  - Misinterpretation of words
  Instructional Tips:
  - Use graphics to reinforce learning (e.g., charts, graphs, pictures)
  - Use written directions
  - Use flow charts and diagrams for note-taking
  - Use videos

Auditory
  Characteristics:
  - Prefers to get information by listening
  - Difficulty following written directions
  - Difficulty with reading and writing
  Instructional Tips:
  - Read directions aloud
  - Use audio
  - Have students participate in discussions

Kinesthetic
  Characteristics:
  - Prefers hands-on learning
  - Learns better when physical activity is involved
  Instructional Tips:
  - Use experiential learning (e.g., role play)
  - Have students do as much as possible (practical application)
  - Provide frequent breaks in study periods
  - Recommend students memorize or drill facts to be learned while walking or running
  - Recommend students write out facts to be learned several times

Figure 6-2. Learning Styles
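The tips in Figure 6-2 lend themselves to a simple lookup, so a lesson can be assembled to address every style present in a class. This is a minimal sketch; the dictionary and function names are illustrative only, not drawn from the SAT Manual:

```python
# Instructional tips keyed by learning style (abridged from Figure 6-2).
INSTRUCTIONAL_TIPS = {
    "visual": [
        "graphics (charts, graphs, pictures)",
        "written directions",
        "flow charts and diagrams for note-taking",
        "videos",
    ],
    "auditory": [
        "directions read aloud",
        "audio recordings",
        "student participation in discussions",
    ],
    "kinesthetic": [
        "experiential learning (role play)",
        "practical application",
        "frequent breaks in study periods",
    ],
}

def plan_activities(styles_in_class):
    """Gather tips so every learning style present is addressed."""
    activities = []
    for style in styles_in_class:
        activities.extend(INSTRUCTIONAL_TIPS.get(style, []))
    return activities

# A mixed class draws activities from every style present.
mixed = plan_activities(["visual", "kinesthetic"])
```

Because classes are diverse, a real lesson would normally pass all three styles to such a planner, reflecting the guidance above that instruction addressing all three styles is the most effective.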


6302. Accommodating Learning Styles

It is not feasible for instructors to prepare individual lesson plans for each of the learning
styles described above. Therefore, curriculum developers must commit themselves to
developing curriculum that appeals to a variety of learning styles. This takes thought and
creativity, but the effort will help to make instruction effective for all students. It is also
helpful for curriculum developers and instructors to take a learning style inventory so they
are aware of their own learning style, as their preference can affect the way instruction is
designed, developed, and/or implemented. Knowing their own learning preferences will
help them to overcome the tendency to tailor instruction to meet their own needs. As
stated earlier, it is best to choose a variety of teaching methods and media to meet the
needs of as many students as possible. Drs. Richard Bandler and John Grinder, in the
field of Neuro-Linguistic Programming, categorized learning styles into four modalities:
students may prefer a visual (seeing), auditory (hearing), kinesthetic (moving), or
tactile (touching) way of learning.
The Four Modalities:
Those who prefer a visual learning style:
1.) Look at the teacher's face intently
2.) Like looking at wall displays, books etc.
3.) Often recognize words by sight
4.) Use lists to organize their thoughts
5.) Recall information by remembering how it was set out on a page

Those who prefer an auditory learning style:


1.) Like the teacher to provide verbal instructions
2.) Like dialogues, discussions and plays
3.) Solve problems by talking about them
4.) Use rhythm and sound as memory aids
Those who prefer a kinesthetic learning style:
1.) Learn best when they are involved or active
2.) Find it difficult to sit still for long periods
3.) Use movement as a memory aid
Those who prefer a tactile way of learning:
1.) Use writing and drawing as memory aids
2.) Learn well in hands-on activities like projects and demonstrations

What teaching methods and activities suit the different learning styles of the Four Modalities?
Visual
1.) Use many visuals in the classroom. For example, wall displays, posters, flash
cards, graphic organizers, etc.
Auditory
1.) Use audio tapes and videos, storytelling, songs, jazz chants, memorization and drills
2.) Allow learners to work in pairs and small groups regularly.
Kinesthetic
1.) Use physical activities, competitions, board games, role plays, etc.
2.) Intersperse activities which require students to sit quietly with activities that allow
them to move around and be active
Tactile
1.) Use board and card games, demonstrations, projects, role plays, etc.
2.) Use while-listening and reading activities. For example, ask students to fill in a
table while listening to a talk, or to label a diagram while reading


SECTION 6400. HOW ADULTS LEARN

To provide effective instruction, curriculum developers must understand how
adults learn. Adapting instruction to the stages of learning will improve the
effectiveness of instruction and enhance knowledge transfer. To help adult
students learn, instructors should: (1) put the task into context, (2) divide
information into manageable chunks, and (3) afford students the opportunity to
practice using new knowledge and skills.

1. Context It is very important to let students know how a task fits into the
“big picture” when they begin learning. First, explain to the students how a task
relates to the whole job, and then provide the details. For example, an instructor
can explain the importance of preventive maintenance on a weapon (increased
readiness, longer life, etc.) before teaching the details of disassembly and
cleaning. This simple process orients the student to the learning, shows the
relevance of a task, and prepares the student to learn.

2. Manageable Chunks Breaking information into manageable chunks
means dividing the instruction into small, logical pieces and identifying the critical
points. For example, when teaching preventive maintenance on a weapon, one
manageable chunk of instruction would be disassembly. A critical point of
disassembly is clearing the weapon. Before moving from one chunk to the next,
the instructor must verify that the student understands what has been taught.
The verification of understanding can be written into a lesson plan as questions
or practical applications.

3. Practice The best way to learn how to do something is to do it. Once
students have been introduced to a new concept or task, allow them time to
practice what they have learned. As they practice, instructors stay near to assist
them and coach them through the process. As they progress, instructors provide
less and less assistance until students are finally able to perform the task on their
own (scaffolding). The following steps describe the modeling technique:

a. Demonstrate a task at full speed.

b. Demonstrate a task slowly, emphasizing critical points.

c. Allow the students to perform the task with you.

d. Allow the students to do the task on their own.


SECTION 6500. DOMAINS OF LEARNING

Learning objectives can be categorized into three domains or general areas:
cognitive, affective, and psychomotor. Classifying instruction into a domain allows
curriculum developers and instructors to design, develop and select activities and
strategies that match objectives. The cognitive domain includes all intellectual
processes, from knowing to evaluating. The affective domain includes values,
attitudes, beliefs, emotions, motivation, and interests. This domain includes
emotional responses rather than intellectual ones; therefore, it is the most difficult
to describe and assess. The psychomotor domain includes physical
performance of a task. Many military training objectives are in this domain;
however, all three domains of learning are usually addressed in a learning
objective. For example, consider the learning objective behavior, “Clean the M-
16A2.” Immediately you can see that “clean” is a psychomotor skill – it is a
physical performance of a task. But you must also realize that in order to clean
the weapon, a Marine must know how to disassemble it (cognitive domain) and
must understand the importance of maintaining the weapon (affective domain).
The predominant domain is used to classify objectives. In this example, it is the
psychomotor domain.

Classification schemes have been developed by educators for defining and
categorizing the type of learning that occurs within each domain. These schemes
or categories are referred to as taxonomies, which are organized from the simplest
to the most complex. Before students can perform at the most complex level, they
must master the knowledge and skills of the lower levels. This section will
describe the three domains and the levels of learning for each. It will also describe
how instructors and designers can use this information to maximize the
effectiveness of instruction.

A simple way to remember the three domains is to use the acronym ASK:

Attitudes (affective domain)
Skills (psychomotor domain)
Knowledge (cognitive domain)


6501. Cognitive Domain (Bloom)

Cognitive learning is demonstrated by recall of knowledge and other intellectual
skills such as applying knowledge in a new situation, displaying comprehension of
information, problem solving, organizing information, analyzing, synthesizing, and
evaluating ideas or actions. The lower levels of this domain require a student to
recall, comprehend, or apply knowledge. In the higher levels, students must
analyze, synthesize or evaluate. Refer to Chapter 2, Section 2206, for the verbs
that can be used for writing objectives in the cognitive domain. Figure 6-4
provides definitions and examples of the behavior for each level of the cognitive
domain.

COGNITIVE DOMAIN

Level 6: Evaluation
    Making judgments about the value of ideas, works, solutions, methods,
    materials, etc. Judgments may be either quantitative or qualitative.
    Examples: To argue, to decide, to compare, to consider, to contrast.

Level 5: Synthesis
    Putting together elements and parts to form a new whole.
    Examples: To write, to produce, to plan, to design, to derive, to combine.

Level 4: Analysis
    Breaking down material or ideas into their constituent parts and detecting
    the relationship of the parts and the way they are arranged.
    Examples: To distinguish, to detect, to employ, to restructure, to classify.

Level 3: Application
    Knowing an abstraction well enough to apply it without being prompted or
    without having been shown how to use it.
    Examples: To generalize, to develop, to employ, to transfer.

Level 2: Comprehension
    Understanding the literal message contained in a communication.
    Examples: To transform, to paraphrase, to interpret, to reorder, to infer,
    to conclude.

Level 1: Knowledge
    Remembering an idea, material, or phenomenon in a form very close to that
    in which it was originally encountered.
    Examples: To recall, to recognize, to acquire, to identify.

Adapted from Taxonomy of Educational Objectives: Handbook I: Cognitive Domain (pp. 201-207), by B.S.
Bloom (Ed.), M.D. Englehart, E.J. Furst, and D.R. Krathwohl, 1956, New York: David McKay Co.

Figure 6-4. Cognitive Domain
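Because each cognitive level carries its own sample verbs, the taxonomy in Figure 6-4 can be treated as a lookup when drafting or reviewing learning objective behavior statements. The sketch below is illustrative only; the function name is hypothetical, and the authoritative verb list remains Chapter 2, Section 2206:

```python
# Cognitive-domain levels with sample behavior verbs (abridged
# from Figure 6-4; "employ" is omitted because the figure lists
# it at both the Application and Analysis levels).
COGNITIVE_LEVELS = {
    1: ("Knowledge", {"recall", "recognize", "acquire", "identify"}),
    2: ("Comprehension", {"transform", "paraphrase", "interpret", "infer"}),
    3: ("Application", {"generalize", "develop", "transfer"}),
    4: ("Analysis", {"distinguish", "detect", "restructure", "classify"}),
    5: ("Synthesis", {"write", "produce", "plan", "design", "derive"}),
    6: ("Evaluation", {"argue", "decide", "compare", "contrast"}),
}

def classify_verb(verb):
    """Return (level, name) for the lowest cognitive level whose
    sample verbs include the given behavior verb, else None."""
    for level in sorted(COGNITIVE_LEVELS):
        name, verbs = COGNITIVE_LEVELS[level]
        if verb.lower() in verbs:
            return level, name
    return None
```

A curriculum developer could use such a lookup to confirm that an objective's verb sits at the intended level, e.g., that "design" places an objective at the Synthesis level rather than a lower one.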


6502. Affective Domain (Krathwohl & Bloom)

Objectives written in this domain are intended to change attitudes that affect
behavior. The Affective Domain of learning deals with learning objectives on an
emotional level, to include feelings, appreciation, enthusiasm, attitudes, and
motivation. Figure 6-5 provides definitions and examples of the behavior for each
level of the affective domain.

AFFECTIVE DOMAIN

Level 5: Characterization by Value or Value Set
    Acts consistently in accordance with the values he or she has internalized.
    Examples: To revise, to require, to be rated high in the value, to avoid,
    to resist, to manage, to resolve.

Level 4: Organization
    Relates the value to those already held and brings it into a harmonious
    and internally consistent philosophy.
    Examples: To discuss, to theorize, to formulate, to balance, to examine.

Level 3: Valuing
    Willing to be perceived by others as valuing certain ideas, materials, or
    phenomena.
    Examples: To increase measured proficiency in, to relinquish, to
    subsidize, to support, to debate.

Level 2: Responding
    Committed in some small measure to the ideas, materials, or phenomena
    involved by actively responding to them.
    Examples: To comply with, to follow, to commend, to volunteer, to spend
    leisure time in, to acclaim.

Level 1: Receiving
    Being aware of or sensitive to the existence of certain ideas, material,
    or phenomena and being willing to tolerate them.
    Examples: To differentiate, to accept, to listen (for), to respond to.

Adapted from Taxonomy of Educational Objectives: Handbook II: Affective Domain (pp. 176-185), by D.R.
Krathwohl, B.S. Bloom, and B.B. Masia, 1964, New York: David McKay Co.

Figure 6-5. Affective Domain


6503. Psychomotor Domain (Simpson’s Taxonomy)

The psychomotor domain includes physical movement, coordination, and mental
skills such as speaking. This is the domain in which most Marine Corps training
objectives occur. As an example, Marine Combat Training (MCT) is primarily
designed to transfer physical combat skills to new Marines. Objectives in this
domain require physical motion or manipulation of an object (e.g., “fire a
weapon”). Some psychomotor skills are inherently more complex than others.
An example is land navigation, a skill that requires more thought and planning for
success than “fire a weapon.” Figure 6-6 provides definitions and examples of
the behavior for each level of the psychomotor domain.

PSYCHOMOTOR DOMAIN

Level 7: Origination
    The ability to develop an original skill that replaces the skill as
    initially learned.
    Examples: Create, design, originate, arrange, compose, construct.

Level 6: Adaptation
    Can modify motor skills to fit a new situation.
    Examples: Adapt, change, modify, revise, alter, rearrange.

Level 5: Complex Overt Response
    The ability to perform the complete psychomotor skill correctly.
    Examples: Carry out, operate, perform.

Level 4: Mechanism
    The ability to perform a complex motor skill; the intermediate stage of
    learning a complex skill.
    Examples: Attempt, imitate, try, assemble, build, construct, dismantle,
    disassemble, display, fasten, fix, mend, organize, work.

Level 3: Guided Response
    The early stage of learning a complex skill; includes imitation; can
    complete the steps involved in the skill as directed.
    Examples: Attempt, imitate, try, assemble, build, construct, dismantle,
    disassemble, display, fasten, fix, mend, organize, work.

Level 2: Set
    The readiness to act; requires the learner to demonstrate an awareness or
    knowledge of the behaviors needed to carry out the skill.
    Examples: Assume a position, demonstrate, show, display, move, respond,
    start.

Level 1: Perception
    The ability to use sensory cues to guide physical activity.
    Examples: Distinguish, identify, select, choose, describe, detect,
    isolate.

Adapted from Simpson, E.J. (1972). The Classification of Educational Objectives in the Psychomotor
Domain: The Psychomotor Domain. Vol. 3. Washington, DC: Gryphon House.

Figure 6-6. Psychomotor Domain


6504. USING DOMAINS OF LEARNING


The domains and levels of learning are extremely useful to the SAT process.
Understanding domains can assist the curriculum developer in writing learning
objectives, selecting test questions, developing lesson materials and choosing
instructional methods. Instructors who comprehend the domain in which they are
teaching can adopt appropriate strategies for reaching the objective.

1. Writing Learning Objectives The domains and levels of learning can
be used when writing learning objective behavior statements. If students are new
to the information, they must start at the lower levels of the domain. Entry-level
Marines need to learn the parts of an M16A2 rifle (cognitive) prior to learning how
to assemble or disassemble the rifle (psychomotor). Whether writing objective
behavior statements for the cognitive domain or the psychomotor domain, the
verbs used will be from the lower levels of the domain for new knowledge/tasks.
Higher levels of the domain are considered when the students already have a
foundation for the information/tasks. However, if new information or tasks are
being taught to career or advanced level Marines, the lower levels of the domain
are considered when developing the objectives. When developing the learning
objectives, the verb list in Chapter 2, section 2206, can help to ensure that the
appropriate level is used.

2. Tests The domains of learning can be used when deciding how to test
(figure 6-7). If knowledge is being tested, then the level of the cognitive domain
will indicate what type of test items are appropriate. If attitude is being tested,
then the level of the affective domain will be referenced. The psychomotor
domain is referenced when students are required to perform a task to a specific
level of proficiency after instruction. Figure 6-7 shows the types of tests that are
appropriate for each level in these domains.

3. Methods  One of the factors for selecting an appropriate instructional
method is the domain and level of the learning objective. When considering the
method to use for a particular objective, also consider the method of testing.
Methods of instruction are chosen that will enable students to perform at the
specified level. For example, if students are being tested on their analytical
abilities, then a case study may be an appropriate method to use in the classroom.
It provides the student with practice in analyzing a real-life case, and applying
rules to the scenario. Figure 6-8 provides a list of some methods that can be
used to teach an objective, based on its domain and level. The list is not
exhaustive, but provides many instructional methods that are appropriate.


APPROPRIATENESS OF TESTING TECHNIQUES IN COGNITIVE DOMAIN

                        |------ KNOWLEDGE-BASED ------|------ PERFORMANCE-BASED ------|
LEVELS OF               True/  Multiple Matching Short  Essay  Oral  Checklist Rating
DOMAIN                  False  Choice            Answer Test   Test            Scale
Knowledge               Yes    Yes      Yes      Yes    No     No    Maybe     No
Comprehension           Yes    Yes      Yes      Yes    No     No    Maybe     No
Application             Yes    No       No       Yes    Maybe  Maybe Maybe     Maybe
Analysis                Maybe  No       No       Maybe  Yes    Yes   No        Maybe
Synthesis               No     No       No       No     Yes    Yes   No        Maybe
Evaluation              No     No       No       No     Yes    Yes   No        Maybe

APPROPRIATENESS OF TESTING TECHNIQUES IN AFFECTIVE DOMAIN

LEVELS OF               True/  Multiple Matching Short  Essay  Oral  Checklist Rating
DOMAIN                  False  Choice            Answer Test   Test            Scale
Receiving               Yes    Maybe    Maybe    Yes    No     No    Yes       No
Responding              Yes    No       No       Maybe  Maybe  Maybe Yes       No
Valuing                 Maybe  No       No       No     Yes    Yes   Yes       Yes
Organization            No     No       No       No     Yes    Yes   No        Yes
Characterization        No     No       No       No     Yes    Yes   No        Yes

APPROPRIATENESS OF TESTING TECHNIQUES IN PSYCHOMOTOR DOMAIN

LEVELS OF               True/  Multiple Matching Short  Essay  Oral  Checklist Rating
DOMAIN                  False  Choice            Answer Test   Test            Scale
Perception              No     No       No       No     No     No    Maybe     Maybe
Set                     No     No       No       No     No     No    Maybe     Maybe
Guided Response         No     No       No       No     No     Maybe Maybe     Maybe
Mechanism               No     No       No       No     No     Maybe Maybe     Maybe
Complex Overt Response  No     No       No       No     No     Maybe Maybe     Yes
Adaptation              No     No       No       No     No     Maybe No        Yes
Origination             No     No       No       No     No     Maybe No        Yes

Yes = Appropriate   Maybe = Can be Appropriate in Some Situations   No = Never Appropriate

Adapted from Planning Instruction for Adult Learners by P. Cranton, 1989, Toronto, Ontario: Wall & Emerson, Inc.
Figure 6-7. Using Domains to Determine Test Item Type.
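For schools that keep these mappings in electronic form, the matrix above is, in effect, a lookup table. The following is a minimal sketch that encodes only the cognitive-domain rows of Figure 6-7; the data is transcribed from the figure, but the function and variable names are illustrative and not part of any Marine Corps system.

```python
# Sketch: the cognitive-domain rows of Figure 6-7 as a lookup table,
# so a developer can query which test techniques suit a given level.
# Data transcribed from the figure; names are illustrative only.

TECHNIQUES = ["True/False", "Multiple Choice", "Matching", "Short Answer",
              "Essay Test", "Oral Test", "Checklist", "Rating Scale"]

# Yes = appropriate, Maybe = sometimes appropriate, No = never appropriate
COGNITIVE = {
    "Knowledge":     ["Yes", "Yes", "Yes", "Yes", "No", "No", "Maybe", "No"],
    "Comprehension": ["Yes", "Yes", "Yes", "Yes", "No", "No", "Maybe", "No"],
    "Application":   ["Yes", "No", "No", "Yes", "Maybe", "Maybe", "Maybe", "Maybe"],
    "Analysis":      ["Maybe", "No", "No", "Maybe", "Yes", "Yes", "No", "Maybe"],
    "Synthesis":     ["No", "No", "No", "No", "Yes", "Yes", "No", "Maybe"],
    "Evaluation":    ["No", "No", "No", "No", "Yes", "Yes", "No", "Maybe"],
}

def appropriate_tests(level, include_maybe=False):
    """Return the test techniques rated Yes (optionally also Maybe) for a level."""
    allowed = {"Yes", "Maybe"} if include_maybe else {"Yes"}
    return [t for t, r in zip(TECHNIQUES, COGNITIVE[level]) if r in allowed]

print(appropriate_tests("Analysis"))
# ['Essay Test', 'Oral Test']
print(appropriate_tests("Knowledge"))
# ['True/False', 'Multiple Choice', 'Matching', 'Short Answer']
```

The affective and psychomotor tables could be added as further dictionaries in the same shape.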


Domain        Level of Learning         Most Appropriate Methods

COGNITIVE     KNOWLEDGE                 Lecture, Programmed Instruction, Drill and Practice
DOMAIN        COMPREHENSION             Lecture, Modularized Instruction, Programmed Instruction
              APPLICATION               Discussion, Simulations and Games, CAI, Modularized
                                        Instruction, Field Experience, Laboratory
              ANALYSIS                  Discussion, Independent/Group Projects, Simulations,
                                        Field Experience, Role Playing, Laboratory
              SYNTHESIS                 Independent/Group Projects, Field Experience,
                                        Role Playing, Laboratory
              EVALUATION                Independent/Group Projects, Field Experience, Laboratory

AFFECTIVE     RECEIVING                 Lecture, Discussion, Modularized Instruction, Field
DOMAIN                                  Experience
              RESPONDING                Discussion, Simulations, Modularized Instruction,
                                        Role-Playing, Field Experience
              VALUING                   Discussion, Independent/Group Projects, Simulations,
                                        Role-Playing, Field Experience
              ORGANIZATION              Discussion, Independent/Group Projects, Field Experience
              CHARACTERIZATION BY       Independent Projects, Field Experience
              A VALUE

PSYCHO-       PERCEPTION                Demonstration (lecture), Drill and Practice
MOTOR         SET                       Demonstration (lecture), Drill and Practice
DOMAIN        GUIDED RESPONSE           Peer Teaching, Games, Role-Playing, Field Experience,
                                        Drill and Practice
              MECHANISM                 Games, Role-Playing, Field Experience, Drill and Practice
              COMPLEX OVERT RESPONSE    Games, Field Experience
              ADAPTATION                Independent Projects, Games, Field Experience
              ORIGINATION               Independent Projects, Games, Field Experience

Figure 6-8. Using Domains to Determine Method of Instruction


SECTION 6600. GROUP DYNAMICS

Problem solving exercises, practical applications, and classroom layouts are
frequently designed for small group work. Using small groups in the learning
process allows experiences to be shared, tasks to be performed, and productive
relationships to be established. In many cases, groups help reduce anxiety and
increase learning for individuals. Individuals generally prefer to work in smaller
groups (3-4 participants). Individual participation tends to decrease with
increasing group size – a group leader is more likely to emerge and individuals
are likely to conform to the majority opinion. The learner with more experience
related to the task tends to make more contributions and to have more influence
on the group.

1. Handling Group Disruption  Though groups can be effective, there are
times when a group member hinders others from learning. This may be through
over-aggressiveness, lack of participation, negative attitude, or anxiety.
Regardless of the behavior, the method of dealing with the behavior is the same.
The instructor must first identify the problem behavior and then privately address
the problem with the individual. Problems should be addressed with
understanding and directness (focus on the problem behavior, not the
personality), and a solution should be offered. The instructor and learner can set
goals to eliminate the problem behavior for the next group session. Assigning
roles to each member of the group is often an effective way to avoid or eliminate
problems.

2. Handling Lack of Progress  A prime problem with group work is the
amount of time that can be spent off task. As the group members build
relationships, the discussions often digress. There are several methods
instructors can use to ensure all members contribute and that the group stays on
task:

a. Have group members conduct an evaluation of how well the group, and
individuals within the group, are progressing towards their goal. Provide
members with a rubric for the evaluation. The results will assist you with your
assessment of the learners as well as provide feedback to the group so they can
work on any deficiencies.

b. Structure group activities. This can be done through time constraints,
assigning roles (e.g., leader, reporter, briefer), providing a specific structure for
the activity, or by limiting the number of tasks to be accomplished at one time.
For example, if a group must complete three tasks that should take about 10
minutes each, assign them individually. Allow students 10 minutes to complete
the first task. That task can be discussed, if necessary, and a new task assigned
with a new time constraint. This will help students stay focused. The curriculum
developer must make instructor notes in the lesson plan. If instructors format
the class to accommodate group behavior, then they must note the changes in
the After Instruction Report. A permanent change to the lesson plan may need
to be made.


c. Reassigning groups may be considered, but should be avoided if possible.
Make this decision only after other remedies have been exhausted, and weigh it
against the possible negative consequences.

3. Working With Resistant Learners  There are many reasons why
learners may resist training. Regardless, instructors must be able to deal with
them as adults. Below is a list of some common reasons why adults resist
training:

a. Unsuccessful in previous learning environments.

b. Hard time adapting to change. (Learning is change.)

c. Unsure of the expectations.

d. Individual learning styles may be different from instructors’ styles.

e. The individual may see the class or activities as irrelevant to him/her.

f. Knowledge/Skills are new and the learner does not want to appear
ignorant or incapable.

g. Learning level may be inappropriate for the individual.

Instructors will, at times, encounter resistant adult learners. However, an effective
instructor can deal with resistant learners by involving them in the learning
process, providing feedback, and encouraging peer interactions. It is up to the
instructor to create an environment conducive to learning, to clarify what is
expected, to provide authentic activities, and to create an atmosphere that allows
the students to make their experiences part of the instructional setting.
Sometimes, students’ prior experiences will hinder their progress. For example, in
a career level course, students may have been taught to perform a task in the
Operating Forces differently from the way it is taught at the school. This may be
a source of frustration for the student, and the instructor must be prepared to
provide the reasoning for the method taught in the course. The instructor must
establish his or her credibility early as the subject matter expert to deal
effectively with adult learners.


SECTION 6700. MOTIVATION

Students must be motivated to learn for learning to be effective. Curriculum
developers can plan motivational activities for the instructor. However, there is a
shared responsibility for motivation between the instructor and the student. The
learner controls the desire to learn and the instructor controls the stimulation.
Below are some ways that instructors can stimulate motivation.

1. Give Recognition  When students do something worthy of recognition,
instructors need to give positive feedback. Such recognition makes the student
feel that his or her contribution to learning is significant. Recognition can
encourage further participation and enhance learning.

2. Serve as a Good Model  Instructors have a considerable influence on
student motivation. As an instructor, you are the model to be emulated:
your uniform, treatment of students, demonstration of desired behavior, and your
enthusiasm must be beyond reproach. Research indicates that teachers with low
self-esteem tend to have students with lower self-esteem.

3. Stimulate Cooperation Among Students  Society places a great deal
of emphasis on competition, and Marines tend to be competitive by nature. While
competition among students can lead to improved performance, it can also cause
stress and poor performance for those students who cannot keep up. When
students compete against a standard, rather than with each other, all students can
experience success. While learners must accomplish some objectives individually,
working in cooperative, collaborative teams (when and where appropriate)
ensures success for all.

4. Consider Mastery Learning  Mastery is defined in terms of a specific set
of objectives that students are expected to meet. When using this approach,
student performance is measured against the objectives, not against the
performance of other students.

5. Have High but Reasonable Expectations  There is a considerable
amount of research that suggests students will perform up to the expectations
that instructors have for them. Marines expect training to present a challenge.
When the standard for performance is high, students will be motivated to reach
that level. Ensure the challenge is not beyond the abilities of your students.
Consider what has been taught, the amount of practice allowed, and the
environment in which the student must perform. Most students who put forth
effort can meet high, but reasonable, expectations. A few will find the task
simple, while some will find it too difficult. Success will come at various paces,
but it will come to most of the students with the guidance of a patient,
understanding instructor.

6. Recognize Potential in Students  Behavioral scientists have concluded
that humans function at 10 percent or less of their potential. Negative self-
esteem can stifle the potential of students. Instructors who recognize true
potential in students who are struggling can motivate them to continue by
recognizing prior successes. For example, a rifle range coach can quickly point
out the marksmanship fundamentals that a shooter is properly applying before
correcting the shooter’s problems. When students know that their instructors see
potential for success, they are motivated to meet the instructors’ expectations.


SECTION 6800. CONSTRUCTIVIST LEARNING ENVIRONMENTS (CLEs)

Until recently, most instructional design efforts were based on objectivist
conceptions of learning, which assume that knowledge can be transferred from
teachers or transmitted by technologies and acquired by learners. Constructivist
conceptions of learning, by contrast, assume that knowledge is individually
constructed and socially co-constructed by learners based on their interpretations of
experiences in the world. Constructivist Learning Environments (CLEs) are an
attempt by instructional designers to build environments that allow students to
explore and discover meaning for themselves in a specified domain. This section
will highlight the key components of a CLE and present some examples of various
kinds of CLEs. This information is essential for Marines involved in designing
instruction for advanced schools and senior Marines.

6801. Designing CLEs

The model for designing CLEs (Figure 8-1) illustrates their essential components.
The model conceives of a problem, question, or project as the focus of the
environment, with various interpretative and intellectual support systems
surrounding it. The goal of the learner is to interpret and solve the problem or
complete the project.

Adapted from Designing Constructivist Learning Environments, D. Jonassen,
2002, Hillsdale, New Jersey: Lawrence Erlbaum Associates.

Figure 8-1. Model for designing CLEs


6802. The Problem

Since the key to meaningful learning is ownership of the problem or learning goal,
you must provide interesting, relevant, and engaging problems to solve. The
problem should be ill-structured or ill-defined, so that some aspects of the
problem are embedded within the problem and must be discovered by the
learners. Here are some key aspects of ill-structured problems:

- Unstated goals and constraints
- Multiple solutions, solution paths, or no solutions at all
- Multiple criteria for evaluating solutions
- Present uncertainty about which concepts, rules, and principles are
  necessary for the solution, or how they are organized
- Provide no general rules for predicting the outcome of most cases
- Require learners to make judgments about the problem and to defend
  their judgments by expressing personal opinions or beliefs (Jonassen).

Additionally, problems need to include three integrated components: the problem
context, the problem representation or simulation, and the problem manipulation
space (the environment in which the students will work towards their solution).
The context refers to the physical, organizational, and socio-cultural atmosphere
in which problems occur. The problem representation/simulation must be
interesting, appealing, and engaging; but perhaps most importantly, it must be
authentic. Authentic means that learners should engage in activities that present
the same type of cognitive challenges as those in the real world, that is, tasks that
replicate the particular activity structures (goals of the activity, physical setting
that constrains/fosters certain actions, and the tools required) of a context
(Savery & Duffy, 1996). Other key elements of CLEs (Figure 8-1) are access to
ample, pertinent resources; the use of collaboration/corroboration between
students and instructors; and a student-centered atmosphere where the instructor
facilitates student learning through the use of modeling, coaching, and
scaffolding.

6803. Examples of CLEs

This section provides just a few examples of the vast array of Constructivist
Learning Environments. The goal is to stimulate your curiosity and to encourage
you to explore the many available resources in this area so you can create a
learning environment that best suits the needs of your students.

1. Situated Learning or Anchored Instruction  Situated learning
promotes authentic activities to ensure that learning is situated in contexts that
reflect the way the knowledge will be useful in real-life situations. Situated
learning environments provide instruction through the exploration of authentic
scenarios, cases, or problems that allow students to experience the complexity
and ambiguity of the real world without real-world consequences. Learners work
in small groups while the teacher provides structured collaborative activities,
learning resources, and instructional support.


Instructional characteristics of situated learning environments:

- Promote authentic learning through coherent, meaningful, and purposeful
  activities that represent the ordinary practices in real-life situations and
  contexts.
- Provide opportunities for learners to internalize learning and develop self-
  monitoring and self-correcting skills.
- Support exploration and interaction within a real-world context.
- Provide multiple perspectives through the different roles depicted in the
  scenario and the different strategies presented by individuals and groups.
- Promote articulation, reflection, and critical thinking skills (decision making
  and problem solving).
- Student-centered: the teacher asks questions, facilitates discussions, provides
  resources, and encourages critical thinking, but does not provide solutions or
  impose procedures.

2. Problem-Based Learning  Problem-Based Learning engages the learner
in a complex problem-solving activity in which the problem drives all learning. No
prior learning is assumed. Learning begins with a complex, ill-structured, real-world
problem to be solved, rather than content to be mastered. Students, in groups of
4-6, take ownership of the problem and construct their own understanding of the
situation by identifying what the problem is, identifying learning needs, determining
a plan of action, and eventually finding a sensible, workable solution. Tutors or
teachers are assigned to each group to act as mentors and coaches, facilitating the
problem-solving process and providing appropriate resources. The primary goals of
Problem-Based Learning are to help students develop collaborative learning skills,
reasoning skills, and self-directed learning strategies.

Instructional characteristics of Problem-Based Learning:

- Promotes ownership of the learning process (the context motivates students
  to “own” the problem, and students must define the problem).
- Assumes no prior knowledge in the content area(s) for which the problem is
  intended.
- Promotes a student-centered, group learning environment and self-directed
  learning (students must set their own learning goals, generate hypotheses,
  develop strategies, and search for/identify relevant resources to accomplish
  goals).
- Promotes authentic learning through real-world, ill-structured problems
  (multiple solutions and solution paths).
- Problem solving is the primary learning goal; self-reflection is the primary
  assessment.
- Supports recursive, iterative cycling through a reasoning process to reach the
  goal.
- Allows learners to integrate, use, and re-use newly learned information in
  context.
- Promotes facilitation and scaffolding through instructor guidance.

3. Cognitive Apprenticeship  Cognitive apprenticeship is very much like
situated learning. The key difference is that in a cognitive apprenticeship, learners
are invited into the actual practices of a knowledge domain and are asked to
perform these practices as an apprentice or intern. Students interact with experts
who model and explain the strategies being used to solve problems in their
domain of knowledge. This is very much like the military’s practice of on-the-job
training.

Instructional characteristics of cognitive apprenticeships:

- Promote a mentoring and coaching relationship between novice learner and
  expert practitioner.
- Support modeling and explaining of expert performance (the teacher models
  the activity by making tacit knowledge explicit through think-aloud
  procedures and worked examples).
- Focus on mastery of performance within the context of the knowledge
  domain.
- Encourage collaborative learning such as collective problem solving,
  developing teamwork skills, experiencing multiple roles, and confronting
  misconceptions.
- Support learning strategies such as articulation of understanding and
  reflection on performance.
- Promote the enculturation of students into authentic practices through
  activity and social interaction (apprentice-type learning, introducing students
  into the community of practice).

(Note: The information in this section was collected from a variety of sources, to
include Chapter 5, Pedagogical Models for Online Learning, from an unpublished
manuscript by Dr. Nada Dabaugh, George Mason University)

4. Conclusion  The constraints of this manual allowed for the explanation of
just a few models of constructivist learning environments. As the developers of
educational programs and learning environments for Marines, we must take the
time to explore all of the learning environments available. By building our
understanding, we will be better able to provide effective instruction. Below is a
list of other constructivist learning environments that can be further researched at
the Theory Into Practice Database: tip.psychology.org.

- Cognitive Flexibility Hypertexts
- Communities of Practice/Learning Communities
- Computer Supported Intentional Learning Environments (CSILEs)
- Microworlds, Simulations, Virtual Worlds

SCHOOL ADMINISTRATION

[SAT model: Analyze - Design - Develop - Implement - Evaluate]

In Chapter 7:

7000  INTRODUCTION

7100  TRAINING INFORMATION MANAGEMENT SYSTEM (TIMS)
      - Purpose
      - TRRMS
      - BNA
      - MCAIMS

7200  STAFF AND FACULTY DEVELOPMENT
      - Purpose
      - Elements of a Staff & Faculty Development Plan
      - Additional Elements
      - Staff Certification Policy
      - Staff Selection Policy
      - Training Resources


SECTION 7000. INTRODUCTION

Computer-based management systems are used in the Marine Corps to assist
the formal schools/detachments and operational forces in tracking training,
identifying quotas, assigning seats, managing formal school curriculum, and
performing student administration. Three key systems within the Training
Information Management System (TIMS) are Training Requirements and
Resources Management System (TRRMS), By Name Assignment (BNA), and
Marine Corps Automated Instructional Management System (MCAIMS).
Administrators in the formal school/detachment must be aware of the functions
and uses of each of these systems. Additionally, this chapter explains the
importance of a well thought-out staff and faculty development plan, the steps
for developing one, and the rewards gained by a school that implements one.


SECTION 7100. TRAINING INFORMATION MANAGEMENT SYSTEM (TIMS)
TIMS (https://tims.tecom.usmc.mil/) is a system maintained by the Formal
Schools Training Branch that provides training information to the individual
Marine and command training chiefs, feedback opportunities for users of the
training information systems, the status of ongoing projects for training
information systems, a link to 3270 for BNA access, and general user
information. TIMS is made up of three
main systems: Training Requirement Resource Management System (TRRMS),
Marine Corps Automated Instructional Management System (MCAIMS), and By
Name Assignment (BNA). Each of these systems is discussed in more detail
below.

1. Training Requirement Resource Management System
(TRRMS)  TRRMS is the relational database that produces the TIP and TQM.
It is also the primary source of data for developing the POM and future budget
submissions for formal training. TRRMS is also used to develop and report the
Marine Corps portion of the Military Manpower Training Report (MMTR) to the
Department of Defense (DoD) and the Institutional Training Readiness Report to
Congress. The two main components of TRRMS are the Training Input Plan (TIP)
and the Training Quota Memorandum (TQM).

a. Training Input Plan (TIP) The TIP is produced in relation to the Fiscal
Year (FY) - 1 October to 30 September XX - and covers one year for execution
and four “out years” for planning. Only approved Formal Courses of Instruction
are included. In general, the TIP represents centrally controlled training courses
that lead to an MOS, provide MOS-related skills, or which are deemed relevant to
the overall Marine Corps mission.

1) Components of the TIP  The combined TIP is composed of a cover
letter from the Commander Training Command, a TIP/TQM Reference
Guide, the MOS Training Tracks, the FY XXXX-XXXX Requirements
Plan, and Appendices. The cover letter provides information and
guidance for commands and schoolhouses, and it solicits feedback,
correction, and course scheduling information. The MOS Training
Tracks consist of the approved course(s) required to obtain the given
MOS in accordance with the MOS Manual and Individual Training
Standards (ITSs)/Training and Readiness (T&Rs). The most
substantive section of the TIP deals with the FY XXXX-XXXX plan.
This section is presented in two formats on the CD ROM. One format
is organized by Service (Army, Navy, Air Force, Marine Corps, Civilian)
and the other by Sponsor (i.e., MPP-20, C473, POE-21). All courses
are listed by Course Identifier (CID), the schoolhouse number, and
service school code (SSC). The following appendices are also
included for reference purposes: 1) Sponsor Codes, 2) MOS List, 3)
School Codes/Locations, and 4) Student Type Codes. The CD ROM
also includes a Course List sorted alphanumerically by the CID for
easy reference. The only means of accessing the TIP is via TRRMS.
The TIP can be viewed online.


2) Cycle of the TIP The TIP is produced annually in May for the
upcoming fiscal year and four out-years. First the training tracks are
routed for updating. Then the worksheets are routed to the OccFld
sponsor for non-MOS training requirements. Manpower Plans and
Policies (MPP) produces the Manpower TIP based on the latest Grade
Adjusted Recapitulation (GAR). These inputs are entered into the
Training Requirements and Resource Management System (TRRMS),
which is a computer-based model for development of the TIP. TRRMS
provides automation of the numerical decision making process and
provides analysis for resolving conflicts and shortfalls. It also does
training/workload calculations and budget costing for planning. After the
TIP is produced, it is routed for comment and appropriate adjustments
are made before final publication and distribution. Once published, the
TIP acts as the source document for assignment of students through the
Training Quota Memorandum (TQM) process.

3) Reading the TIP Along with course dates, alphanumeric codes are used
in the TIP to identify the course and the student type.

a) Course Identifier (CID)  The Course Identifier (CID) is a unique
alphanumeric code composed of several identifying elements:
Service, Location, Service School Code (SSC), and the School. There
is a table in front of each TIP, which breaks down the service and
location codes. Figure 7-1 identifies the information provided by the
seven-character alphanumeric known as the CID.

COURSE IDENTIFIER (CID)
CID = M03SBC2

  M   = Service
  03  = Location
  SBC = Service School Code (SSC)
  2   = School

Figure 7-1. Course Identifier (CID)

b) Student Type  Training requirements and seat assignments are
broken down into various student categories. Each student type is a
two to five character alphanumeric code, indicating status; whether
Enlisted, Officer, or Warrant Officer; and special type. There is a
table in front of each TIP that breaks down the Student Type Codes.
Figure 7-2 identifies what the three-character alphanumeric code
means.

STUDENT TYPE
Student Type = 0EE

  0 = Status
  E = Enlisted, Officer, or Warrant Officer
  E = Special Type

Figure 7-2. Student Type.

b. Training Quota Memorandum (TQM)  Training Quota Memorandums
(TQMs) are documents produced to translate annual TIP requirements into actual
class seat assignments, and they form the basis for order writing. They cite
funding authority for travel and per diem while providing a breakdown of seats
by student type. A TQM is produced for each course on the TIP.

1) Components of TQM  A TQM is comprised of a cover letter and an
enclosure. The cover letter provides execution instructions in several
standard paragraphs. The first paragraph requests the ordering of
students to school. The second paragraph provides funding data in
the form of a five-character alphanumeric field to identify the service,
the course, and the authorization. Figure 7-3 identifies the meaning
of the alphanumeric code. The third paragraph contains special
instructions provided by the school or the OccFld sponsor, if
applicable. This is followed by a paragraph containing prerequisites
provided by the MOS Manual, the school, or the OccFld sponsor.
Next comes a paragraph of administrative instructions, then a
paragraph listing class capacity obtained from the Course Descriptive
Data (CDD). If revisions have been made to the TQM, a justification
paragraph will be included to describe what caused the revision. The
enclosure to the TQM is the actual breakdown of class schedule and
seat allocations by student type or command.


TQM FUNDING DATA
Funding Data = AKN9R

  A   = Service
  K   = Authority
  N9R = Service School Code

Service Codes:
  A - Army      C - Civilian   F - Air Force
  M - Marine    N - Navy       O - Foreign

Authority Codes:
  Ground:    K - Travel   L - Per Diem/Other   X - Per Diem/Other
  Aviation:  M - Travel   N - Per Diem/Other   Y - Per Diem/Other

Figure 7-3. TQM Funding Data.
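The five-character funding data field breaks down positionally: one character each for service and authority, followed by the three-character service school code. Below is a minimal sketch using the code tables printed with Figure 7-3; the data is transcribed from the figure, and the function name is illustrative only, not part of any USMC system.

```python
# Sketch: decoding the five-character TQM funding data field from
# Figure 7-3, using the service and authority code tables printed with
# the figure. Names are illustrative only.

SERVICE_CODES = {"A": "Army", "C": "Civilian", "F": "Air Force",
                 "M": "Marine", "N": "Navy", "O": "Foreign"}

AUTHORITY_CODES = {"K": "Ground - Travel", "L": "Ground - Per Diem/Other",
                   "X": "Ground - Per Diem/Other",
                   "M": "Aviation - Travel", "N": "Aviation - Per Diem/Other",
                   "Y": "Aviation - Per Diem/Other"}

def decode_funding_data(field):
    """Decode a funding data field such as 'AKN9R'."""
    if len(field) != 5:
        raise ValueError("funding data is a five-character field")
    return {
        "service": SERVICE_CODES.get(field[0], "Unknown"),
        "authority": AUTHORITY_CODES.get(field[1], "Unknown"),
        "service_school_code": field[2:],
    }

print(decode_funding_data("AKN9R"))
# {'service': 'Army', 'authority': 'Ground - Travel', 'service_school_code': 'N9R'}
```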

2) Cycle of the TQM  TQMs are produced annually in consonance
with the fiscal year and are revised as necessary to reflect schedule
changes, seat allocations, and other changes. In most cases, TQMs
are produced at least 60 days prior to the first class; however, some
revisions require short-fused telephonic coordination.

2. By Name Assignment (BNA)  The BNA system is the Marine Corps'
Class I system used to collect enrollment, graduation, and non-graduation
training data. BNA is used to track Marine Corps sponsored students through
their training pipeline. It provides training seat coordination among the various
personnel assignment agencies and aids in evaluating annual training plan
performance. MCO 1553.7 mandates BNA as the official training reservation
and reporting system of the Marine Corps. The order directs reporting of student
enrollment, completion status, and class validation in BNA. The BNA User's
Manual can be downloaded or viewed on-line. It is suggested that anyone who
uses BNA on a regular basis download the user's manual and print it as a
reference.

a. Impact of BNA. BNA lists all formal training opportunities in the Marine
Corps. It provides the user with a roster and other reports. BNA interfaces with
many USMC systems and other service training systems as indicated in Figure 7-
4 and Figure 7-5. Data from BNA is reported to a variety of agencies, including
DoD and Congress through the Military Manpower Training Report (MMTR) and
the Institutional Training Readiness Report (ITRR). The MMTR reports to
Congress all formal training of Marines conducted at each school for all the
services. This report has to match budget and manpower reports. The ITRR
identifies to Congress all formal training conducted at USMC schools.


Figure 7-4. USMC Systems Interfaced With By BNA.

 Marine Corps Total Forces System (MCTFS) - single source for USMC student information
 Recruit Distribution Model (RDM) - Schedules initial MOS training
 Unit Diary/Marine Integrated Personnel System (UD/MIPS) - Schedules recruit training & MCT
 Automated Orders Writing Process (AOWP) - Orders have class dates and course information
 Training Requirements and Resource Management System (TRRMS) - Quota management
 Marine Corps Automated Instructional Management System (MCAIMS) - Student progress and rosters
 Reserve Affairs Personnel Entry Level Assignment System (RAPELLA) - Schedules training for resources

Figure 7-5. Other Service Training Systems Interfaced With By BNA.

 United States Navy - Navy Integrated Training Administrative System (NITRAS)
 United States Army - Army Training Requirements and Resources System (ATRRS)
 United States Air Force - Air Force Military Modernization Program/Oracle Training Administration (MILMOD/OTA)

b. Process of BNA. BNA is loaded with students by monitors and using units.
It monitors school throughput and unfilled quotas. The information produced by
BNA impacts the budget, quotas, and the schedules of courses. Proper and timely
class validation triggers BNA to send an Administration Instruction Manpower
Management System (AIMMS) transaction to Marine Corps Total Force System
(MCTFS) to update the Marine's Basic Training Record (BTR). A transaction is sent
to the Sailor Marine Academic Record Transcript (SMART) to update the civilian
equivalency transcript of a Marine for courses completed at formal schools.
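The validation-triggered updates described above can be sketched as a simple event handler. This is purely illustrative: the system interfaces named here are not public APIs, and the function and record fields below are hypothetical stand-ins.

```python
def on_class_validation(student_name: str) -> list[dict]:
    """Sketch of the transactions BNA generates after class validation.

    Per the paragraph above: an AIMMS transaction updates the Basic
    Training Record (BTR) in MCTFS, and a second transaction updates
    the civilian-equivalency transcript in SMART.
    """
    transactions = []
    # AIMMS transaction to MCTFS to update the Marine's BTR.
    transactions.append({"system": "MCTFS", "type": "AIMMS",
                         "update": "BTR", "student": student_name})
    # Transcript transaction to SMART for the completed formal course.
    transactions.append({"system": "SMART", "type": "transcript",
                         "student": student_name})
    return transactions
```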

3. Marine Corps Automated Instructional Management System (MCAIMS).
The Marine Corps is using the TECOM Integrated Management System (TIMS),
a government-owned software application that is available for use
throughout the Marine Corps. TIMS is the Marine Corps' standard
automated system for instructional management and school administration. All
Course Descriptive Data (CDD) and Programs of Instruction (POI) must be
produced and submitted using TIMS. For maximum effectiveness and flexibility,
TIMS is operated and supported at individual formal schools/detachments. School
administrators rely on MCAIMS to manage students, instructors, resources,
reporting requirements, etc. Instructional staff members (curriculum developers,
instructors, testing officers, etc.) use TIMS as a tool for automating key functions
of the Design, Development, Implementation, and Evaluation phases of SAT.
Refer to the TIMS help screens for more information and guidance.


SECTION 7200. STAFF AND FACULTY DEVELOPMENT

Staff/Faculty are those individuals who directly or indirectly contribute to the
primary mission of the school. A school staff encompasses the instructors,
curriculum developers, instructional systems designers, Academic Chiefs and
Officers, Executive Officers, Commanding Officers, the secretary, the clerk
who handles BNA, and the Information Systems Coordinator (ISC). This list is by no means
exhaustive and can encompass a myriad of other personnel depending on the
mission of the school. In order to foster continuous improvement in the school,
the development of faculty/staff needs to be considered and planned. Needs may
be specific to instruction, curriculum development, administration, or within the
occupational specialty.

7201. PURPOSE

Over the past few years, a number of factors have driven the movement
toward staff development. Some of these factors include the diverse
student population, demands for accountability, and the information/technology
explosion. More and more institutions of learning are becoming aware of the
direct positive correlation between effective faculty development programs and
improved student performance. The benefits realized from staff development are
not solely limited to student outcomes; a good plan can also bring about
organizational growth. The purpose of having a staff/faculty development plan or
program is to impact the following areas:

Curriculum Implementation
Instructional Improvement
Professional Development
School/Organizational Development

1. Curriculum Implementation. The staff/faculty plan needs to be
designed to influence curriculum implementation by increasing the knowledge
base within the subject area, providing diversification and exposure, providing a
buy-in to new program objectives, and supporting the ability to model any new
initiatives. Staff/Faculty can be exposed to a variety of mechanisms that can be
used to increase transfer of learning. Developing staff/faculty also increases the
cooperation in the implementation of essential programs for improving curriculum.
For instance, if the school is experimenting with writing new instructional
strategies into the curriculum to increase transfer of learning, staff/faculty
will be more supportive if they have been educated on, and sold on, its
effectiveness. A well-developed staff/faculty is provided the training and
education to successfully model implementation of new programs and is
prepared to provide coaching and technical support when problems arise
during implementation.


2. Instructional Improvement. The staff/faculty development plan needs
to provide the instructional staff with the means of improving the technical skills
for teaching, exposure to the different range of instructional strategies to use, and
ways to develop their individual strengths within instruction. Individual strengths
can be used throughout the staff/faculty to build the competencies of others.
This team orientation of staff/faculty assisting and grooming staff/faculty develops
collaborative structures and supportive working relationships.

3. Professional Development. The staff/faculty development plan needs
to contribute to professional development by establishing norms of continuous
adult learning, experimentation, openness to new ideas, and feedback. It is best
if the plan encourages responsible, autonomous decisions reflecting school goals,
values, and student-centered instruction. Professional development contributes to
the ability to develop skills of reflective self-analysis, self-assessment, and goal
setting for improvement. By implementing a plan that fosters and supports
continuous improvement and a focused mission, the school is able to build a
collaborative work culture. Effectiveness will be apparent in the school's product.

4. School/Organizational Development. On a grander scale, the
staff/faculty development plan creates clarity, integration, and commitment to
goals at all levels. It improves the quality of interactions and relationships in the
school and organization, thereby improving effectiveness. The plan needs to be developed
with an overall objective of improving the professional work climate and
advocating structures that facilitate improvement. For example, a plan that
represents training practices that advocate "we" versus "they" would be counter-
productive. The plan needs to unify staff/faculty toward the mission. Any
dysfunctional structures and/or practices must be eliminated. With every section
of the plan, there should be a mechanism in the design to provide feedback for
renewal. The outcomes of the staff/faculty development plan must be reviewed
and evaluated as to whether the desired outcomes are being achieved.

7202. ELEMENTS OF A STAFF AND FACULTY DEVELOPMENT PLAN

Before developing a plan, it is important to note that the plan or program adopted
must address the development of all staff members within the organization.
Additionally, it should be flexible enough to be tailored to each individual. An
effective staff development plan should consider and include the following
elements:


1. Training Policy/Mission Statement
2. Job Definition of Staff/Faculty
3. Staff Certification/Training Continuum
4. Levels of Competencies for Each Job/Billet During Assignment
5. Staff Recognition

This list is by no means exhaustive of what can and should be addressed in a
staff development plan. However, it is reflective of some of the more significant
considerations in devising a plan. Most importantly, an effective plan should
take into account the nature of adults as learners (see Chapter 6, Adult
Learning) and the importance of making development options accessible to
them, while allowing them to take responsibility for their own learning. The goal is to
create a facilitative, collaborative environment, where staff has a sense of
freedom and opportunity to grow, experiment, and take risks.

1. TRAINING POLICY/MISSION STATEMENT. This should state the
scope of the school's charter. The extent of the school's mission, and how the
staff and faculty development plan supports it, will determine what is included in
this section of the plan. This mission statement should also include goals of the
development plan as they pertain to professional growth.

2. JOB DEFINITION OF STAFF/FACULTY. In order to develop a plan
for each member of the school's staff, their roles must be clearly defined.
If performance requirements and job criteria, knowledge, skills and
characteristics required to optimally perform on the job are clear, then both
leaders and their staff have an indisputable basis for establishing performance
expectation. If the scope of the job is clear, it also provides clarity as to how
and if the job relates to organizational goals. If the school has curriculum
developers, instructors, and administrators, Marines or civilians, their position
description or scope of duties must be clearly articulated to them and in writing.
Collateral duties must be included in the job definition as required. Figure 7-6
provides an example of a job definition.

Figure 7-6. Sample Job Definition.

Course Manager - Personnel responsible for training in a specific course or
for specific areas of training in several courses. Examples include Chief
Instructors, Program of Instruction (POI) Managers, or Instructional Systems
Specialists. It is the job of the course manager to ensure that policy provided
by the training manager and higher authority is carried out at the course
level. The duties of the course manager include:

 Coordinates the training program for all assigned personnel and maintains
instructor training records.
 Conducts scheduled and unscheduled instructor evaluations.
 Manages the instructor certification program and coordinates instructor of
the quarter certification.
 Compiles evaluation and course feedback from both students and instructors
and provides it to the Academics Officer and Director.


3. STAFF CERTIFICATION/TRAINING CONTINUUM. Staff
certification outlines the training tracks, requirements, and certification procedures
for the school’s staff, both military and civilian. It states how the training
requirements will be accomplished and the timeframe for completion. The
selected training should enable selected personnel to work independently to stay
current in the duties of the assigned position. It is here that the Instructor,
Curriculum Developer, or Administrator Training and Qualification Process is
outlined.

4. COMPETENCIES. Significant creativity and thought are required
to define the levels of competencies expected of personnel from the time that
they are assigned to each billet to the end of their assignment in a formal
school/detachment. However, it is important for personnel to understand the
expectations and how those expectations increase with experience and time in a
billet. While it is easy to define the instructor training requirement, which some
schools implement in the form of a Master Instructor and Instructor of the Year
Orders, it is more difficult to define the requirements for those who function
strictly as curriculum developers or administrators.

5. STAFF RECOGNITION. This section of the plan should identify what
constitutes certification. Once the levels of competencies are met, how will they
be documented? Will the individual gain an entry in their instructor folder or
service record book?

7203. ADDITIONAL ELEMENTS/CONSIDERATIONS

When considering implementing or revising a staff development program,
examine how well the above elements are addressed. It is also important to
consider that the program is being developed for adults who have a wealth of
experience, which can impact the extent of their participation in faculty
development. Any faculty development program should have a "What's In It
For Me" (WIIFM) for the participants.

The plan needs to be developed based upon the average individual who arrives
at your school. It should outline a development program for an individual from
the day he/she arrives to the day he/she leaves. At times, a new person with
prior experience may not require the first year of training as defined in the
school's plan. The opposite can also be true, where an individual needs more
than what has been defined in the plan to get him/her to where he/she needs to be.


1. MENTORSHIP/COACHING. This addresses the need for assigning
new individuals to a master or experienced individual to effectively assist/guide
the less experienced in working in the formal schools. This mentor would
perform a myriad of roles, primarily aimed at assisting the new member in
dealing with issues related to teaching and other aspects of the new
environment.

2. OVER-ARCHING GOAL OF THE PLAN. It is important that the
content of the plan is aimed at developing specific skills related to the job. It is
also imperative that the plan reflects clear organizational and operational
objectives. There needs to be a clear definition of how the participants will use
the new information they have acquired, and it must fit in with the
organizational goals. Sending staff to classes to learn just in case they need the
information is not a good use of resources. For example, sending an instructor
to Navy Interactive Courseware Development and Multimedia Tool Book is not a
good idea unless this person will be performing a task specific to the training in
the immediate future.

3. OWNERSHIP OF DEVELOPMENT EXPERIENCES. This addresses
the actual types of staff development activities. Here are some considerations
to assist this process.

a. Ensure content of the plan is presented in a variety of modes and activities.

b. When new information is provided, demonstrate application for current
use, reinforce the new information, and provide feedback.
c. Provide opportunities to practice and experiment with learned information
in a non-threatening environment.
d. Provide coaching/mentoring - the opportunity to learn from others.

4. REWARDING EXCELLENCE. This is an often overlooked yet essential
element of a development program. Examine it carefully to find
creative ways to reward those on your staff who, through their own initiative,
meet and exceed the staff development criteria.

7204. STAFF CERTIFICATION POLICY

As discussed earlier, staff certification requirements must be clearly detailed in
the plan. When tailoring the plan for staff members, it is essential that
individual development take place in more than one event. The plan must
cover the duration that the staff member will be assigned to that position or
school. It should ensure that opportunities for development take place often
enough, with the goal of ensuring that the participants progressively gain
knowledge, skill, and confidence. Even with a plan in place, as training
managers, we need to identify our staff's potential and performance and
determine if the plan in place should be personalized based on skills, or lack
thereof, for specific individuals.


1. ACTIONS BASED UPON PERFORMANCE/POTENTIAL. Based upon
personnel's performance and potential, administrators must be prepared to take
actions to maintain or increase the level of performance. Below is a list of ways to
encourage improvement and motivate personnel to exceed expectations.

High Performance/High Potential.
 Assign more responsibility.
 Provide more decision-making power.
 Provide exposure to other course areas.
 Provide opportunity to manage.
 Increase the span of supervision.
 Assign high-risk assignments.
 Cross-train.
 Assign important special projects.
 Provide educational experiences that prepare for future advancement.
 Provide anything upward.

High Performance/Low Potential.
 Provide opportunities to update skills or knowledge.
 Provide experiences that enable instructors to stay abreast of their field of
expertise.
 Provide opportunities to attend professional conferences.
 Encourage and support involvement in professional groups.
 Select to represent organization to others inside and outside the
organization.
 Provide opportunities for advanced training.
 Give public recognition.

Low Performance/High Potential.
 Provide opportunities for skills training.
 Quick job move to match skills.
 Change in supervision.
 Provide opportunity to gain additional knowledge of the school.

Low Performance/Low Potential.
 Provide another chance in same or different job under same or different
supervisor.
 Downgrading.
 Outplacement.

2. Medium Performance/Medium Potential. Personnel at this rating
should initially be treated as those with high performance or potential.
Re-assess after some time and then rate again. If no improvement is shown,
treat as those with low performance or potential.
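The decision rules above can be sketched as a small lookup. This is a hypothetical helper for illustration, with a representative subset of the actions listed above; it is not part of any Marine Corps system.

```python
def development_actions(performance: str, potential: str) -> list[str]:
    """Return sample development actions for a performance/potential rating.

    Per the guidance above, a medium/medium rating is initially treated
    as high performance/high potential, then re-assessed later.
    """
    performance, potential = performance.lower(), potential.lower()
    if (performance, potential) == ("medium", "medium"):
        return development_actions("high", "high")
    actions = {
        ("high", "high"): ["Assign more responsibility",
                           "Provide more decision-making power",
                           "Cross-train",
                           "Assign important special projects"],
        ("high", "low"): ["Provide opportunities to update skills or knowledge",
                          "Provide opportunities to attend professional conferences",
                          "Give public recognition"],
        ("low", "high"): ["Provide opportunities for skills training",
                          "Quick job move to match skills",
                          "Change in supervision"],
        ("low", "low"): ["Provide another chance in same or different job",
                         "Downgrading",
                         "Outplacement"],
    }
    return actions[(performance, potential)]
```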


7205. STAFF SELECTION POLICY

In most Marine Corps schools, there is a limited ability to select the staff. Staff
members are primarily assigned to the school based largely on the needs of the
Marine Corps and not necessarily on the extent of their teaching or training
experience. Administrators in Marine Corps schools do have some flexibility
in who is assigned to critical positions in the school, and as such, screening and
selection is an important aspect of the plan. Staff selection and screening may
or may not be formally outlined in the plan, but a process should exist. The
selection portion of the plan is where the individual is selected to match the job
previously defined. Some schools are working closely with the monitors to
ensure instructors go through a selection program.

1. Components of a Good System. A good selection system provides
the tools with which to judge a potential staff member's skills, experience,
qualifications, and knowledge. This is related to having the job clearly
defined, an extremely important step. Once the extent of the job is known, it
becomes easier to match the individual to the job. It is recommended to
require the following competencies, when possible, for selecting instructors:

a. Physically, psychologically, and temperamentally suited for
instructor duty.
b. Knowledge of and expertise in the subject area to be taught.
c. Good communication skills or the potential to develop them.
d. Maturity.
e. Emotional stability and the ability to maintain self-control
under all circumstances.
f. Positive role model.
g. People-oriented.
h. Desire to teach.

2. Developing Individual Plan. After the screening and selection
process has been concluded, it becomes necessary to assess individual
performance and potential, and devise appropriate developmental steps for each
staff member. It is important to look at what competencies the new staff
member already possesses and determine where he/she falls in the staff
development continuum. It is here the individual training track is outlined, a
billet description is given, and if required, a mentor is assigned.


7206. TRAINING RESOURCES

Training resources are available to formal schools/detachments for personnel.
However, the school has to plan both time and money for the implementation of
such resources. The school needs to identify money in the budget for faculty
development. Resources that should be considered include, but are not limited
to, the Instructional Management School, professional level schools, scholarly
publications, a school library, and the internet.


APPENDIX A

DESIGN WORKSHEETS

APPENDIX A comprises the following worksheets.

a. Learning Analysis Worksheet. This worksheet can be used to brainstorm knowledge
and skills for each performance step during the Design Phase.

b. Learning Objective Worksheet. This worksheet can be used to develop learning
objectives, test items, and to determine the delivery system to be used.

c. Media Matrix. This matrix can be used by the curriculum developer to select an
instructional method.


LEARNING ANALYSIS WORKSHEET

T&R Event or ITS Duty Description:                      Date:
T&R Event or ITS Duty Code:
Task:                                                   Task Code:
Condition(s):
Standard(s):

Performance Steps:          Knowledge, Skills, Attitudes (KSA):


LEARNING OBJECTIVE WORKSHEET


Use a Learning Objective Worksheet for each learning objective.

Task Behavior:
Event Code (ITS #):                     Date:
Downgrade Justification (ITS ONLY):
TLO/ELO (Circle one):

Test Item/Evaluation:

Method/Media:


METHOD SELECTION GRID

GRID KEY
HR - Highly Recommended
R - Recommended
NR - Not Recommended
NI - Comprehension Level (minimum instructor expertise)
EI - Higher Level (minimum instructor expertise)
LG - Large Class
SM - Small Class
Indiv - Individual
Class size: 1 = individual; 2-12 = small; 13-24 = medium; 20+ = large

METHODS RATED
Presentation Methods: Lecture (Formal, Informal Briefing, Student Speech);
Reading (Books, Periodicals, Microforms, Manuals, Handouts); Self-Paced
(Programmed, Modular, Computer Assisted, Mediated); Demonstration
(Operation of Equipment or System); Indirect Discourse (Panel Discussion,
Dialogue, Teaching Interview)
Student Verbal Interaction Methods: Questioning (Socratic Method, Student
Query); Guided Discussion (Instructor Controlled); Discussion-Non Directed
(Peer-Controlled, Seminar, Free Discussion)
Application Methods: Practical Application (Individual or Group); Simulations
(Role-Playing, Games); Case Study; Field Trips; Coaching

RATINGS
For each method, the grid assigns an HR/R/NR rating for each learning domain
and level (Cognitive: Knowledge, Comprehension, Higher Levels; Psychomotor:
Lower Level, Higher Level; Affective: Lower Levels, Higher Levels) and records
factors and constraints: the minimum level of instructor expertise (NI/EI),
suitable class size, whether evaluation is inherent in the method, and whether
the method is responsive to individual needs.

** Consider breaking the class into small groups if the number of students is
large and there is instructional staff to support it.


APPENDIX B

DEVELOP PHASE TEMPLATES

APPENDIX B comprises the following items developed during the Develop Phase.

a. Paper-based Concept Card. This worksheet can be used to consolidate information
[i.e., lesson designator, lesson title, hours, method, training support equipment, Terminal
Learning Objectives (TLO), Enabling Learning Objectives (ELO), and the references] for each
lesson prior to entering the information into the automated instructional database.

b. Operational Risk Assessment Worksheet (ORAW). The ORA worksheet documents
the 5-step Operational Risk Management process as it relates to the lesson.

c. Lesson Plan. The lesson plan template provides the format for writing a lesson plan.

d. Student Outline. Two examples of formats for student outlines are included. There is
no standard format for a student outline. It should be developed with the student in mind.

e. Instructor Preparation Checklist. This checklist is a required element in the Master
Lesson File (MLF). It provides the instructor with information that is critical to the preparation for
implementation of the lesson.

f. Course Descriptive Data (CDD)/Program of Instruction (POI). The CDD provides
a detailed summary of the course including instructional resources, class length, and curriculum
breakdown. The POI describes the course in terms of structure, delivery methods and media,
length, intended learning objectives, and evaluation procedures.


CONCEPT CARD

COURSE TITLE:                                           DATE:
ANNEX:                  ANNEX TITLE:
LESSON ID:              LESSON TITLE:
METHOD(S):      HOURS:      S:I RATIO:      MEDIA:
TOTAL HOURS:

TERMINAL LEARNING OBJECTIVE(S) OR LESSON PURPOSE STATEMENT:

ENABLING LEARNING OBJECTIVE(S):

AMMUNITION REQUIREMENT(S):
                            Expended                Unexpended
DODIC   NOMENCLATURE   UNITS PER  UNITS FOR    UNITS PER  UNITS FOR   UNIT OF
                        STUDENT    SUPPORT      STUDENT    SUPPORT     ISSUE

NOTES:

(This is created as a table in Microsoft Word. If more space is needed in a section, just continue
to press return and the form will expand.)

REFERENCES:                                             Reference #
SCHOOL:                         COURSE:
LESSON TITLE:                   LESSON DESIGNATOR:
PREPARED BY:                    DATE:

IDENTIFY HAZARDS | ASSESS HAZARDS | MAKE RISK DECISIONS | IMPLEMENT CONTROLS | SUPERVISE

List Learning Objective Behaviors | Sub-steps (If applicable) | List Hazards |
Initial RAC | Develop Controls | Residual RAC | How to Implement | How to Supervise

Cease Training Criteria (CTC): (During training, instructors may identify other
hazards that require a decision to CT.)

Approving Signature:                    Date:



UNITED STATES MARINE CORPS


(SCHOOL NAME)
(LOCAL COMMAND)
(COMMAND)
(SCHOOL ADDRESS)

(For USMC in the heading, the font size is 14 and bold. For the rest of the heading, the font size is 10.)

(The address should be ALL CAPS)

(1-inch margins: top, bottom, left, and right)

LESSON PLAN
(Courier New, font size 18 for LESSON PLAN only. All other font size is 12.)

LESSON TITLE

LESSON DESIGNATOR

COURSE TITLE

COURSE ID

REVISED MM/DD/YYYY

(If lesson plan is the original version, then type the MM/DD/YYYY the lesson plan originated. If
lesson plan is a revised version, then type REVISED MM/DD/YYYY).

APPROVED BY _________________________ DATE _______________


INTRODUCTION ( MIN)
(Time cues are explained in the SAT Revision 2002 and in the Curriculum Developer Course at IMS.)

(ON MEDIA # ) (Media may consist of PowerPoint slides, transparencies, turn charts, etc.
If using PowerPoint, then ON SLIDE #1, ON SLIDE #2 etc. If using turn charts, then TURN CHART
#1, etc. NOTE: Media cues are placed wherever they fall, even if it is within the text.)

(ON VIDEO "TITLE", VIDEO COUNTER #/SCENE #) (Provide the video counter
number if a VHS tape, or the scene number if a DVD, in the media cue. If neither is
available, provide a brief description of the segment of the video to be shown in an
INSTRUCTOR NOTE.)
(OFF VIDEO "TITLE", VIDEO COUNTER #/SCENE #)

1. GAIN ATTENTION.
(A gain attention is provided along with other possible ideas. Lines are provided so that the
instructor can personalize the gain attention to fit his/her personality.)
_________________________________________________________________________
_________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________

(ON SLIDE # )
2. OVERVIEW. Good morning/afternoon class, my name is _______.
[Overview contains the conceptual framework (outline/main ideas) that will be covered in the
lesson. If applicable, it can also contain a statement that relates the lesson to previous
learning/another lesson.]

INSTRUCTOR NOTE
Introduce learning objectives.

3. LEARNING OBJECTIVES.

a. TERMINAL LEARNING OBJECTIVE. (List as on concept card. List ITS


designator after each TLO.)

b. ENABLING LEARNING OBJECTIVES. (List as on concept card.)

(1) (List ITS designator after each ELO.)

(2)

4. METHOD/MEDIA. (Describe the delivery system that will be used.)

INSTRUCTOR NOTE
Explain Instructional Rating Forms to students.

5. EVALUATION. (Provide how, when, and where the students will be tested.)

6. SAFETY/CEASE TRAINING (CT) BRIEF. (If applicable) Lessons that
involve risk of injury or damage to equipment must include a safety brief. This is explained
more thoroughly in the SAT Manual.

(ON SLIDE # )


TRANSITION: (Ensure understanding of what is being taught, how it is being taught, and the
expectations. Then introduce first main idea. Provide a Transition for the instructor to use and
provide lines for instructor personalization.)
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________

(ON SLIDE # )

BODY ( MIN)
(MAIN HEADING time cues are found at the INTRODUCTION, BODY, and SUMMARY. The MAIN HEADING time
cues are right justified of the MAIN HEADING, bold, uppercase, and in parenthesis. Time cues are
explained in the SAT Revision 2002 and in the Curriculum Developer Course at IMS.)

1. MAIN IDEA #1. ( Min) (Main idea time cues are located 2 spaces to the right of the
main idea, in bold letters with parentheses. Main ideas are bold, underlined, and uppercase.)

a. Paragraph Heading. [If no paragraph heading, then use natural case
(meaning regular sentence text - first word capitalized). Paragraph headings are bold,
underlined, and title case (meaning the first letter of each word is capitalized).]

(1) Paragraph Heading. [If no paragraph heading, then use natural
case (meaning regular sentence text - first word capitalized). Paragraph headings are underlined
and title case per the example above.]

(a) Paragraph Heading.

1 Paragraph Heading. (To minimize confusion in following
the outline, it is recommended that the use of these subparagraphs be minimized.)

a Paragraph Heading. (To minimize confusion in
following the outline, it is recommended that the use of these subparagraphs be minimized.)

(ON TURN CHART # )

INTERIM TRANSITION: (Thus far, we've discussed main idea #1. Do you have any
questions? Let's move on to a demonstration of ….)
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________

INSTRUCTOR NOTE
Perform the following demonstration.
(Guidance on the instructional method being used should be in a box and
shaded at 12.5% shading. There should be one space between the instructor
note and the teaching method.)
DEMONSTRATION. [Provide general information to include group size (if applicable), time,
and setup (handouts, turn charts, actual items to distribute, etc.), along with the number of
instructor(s) required based upon the concept card. Provide the purpose of the demonstration.]

STUDENT ROLE: (Provide detailed, step-by-step instructions describing the student's role
during the demonstration.)

INSTRUCTOR(S) ROLE: (Describe each Instructor's role.)


1. Safety Brief: (If applicable) (Brief students on safety precautions and what
to do if there is a mishap.)
2. Supervision and Guidance: (Describe a detailed script of exactly what the
instructor is doing during the demonstration.)

3. Debrief: (If applicable) (Allow students the opportunity to comment on what


they experienced and/or observed. Provide overall feedback, guidance on any misconceptions, and
review the learning points of the demonstration.)

TRANSITION: (Review, Probe, and Introduce next main idea. More explanation on how to write
transitions is provided in the SAT Revision 2002 and in the Curriculum Developer Course. Provide
a Transition for the instructor to use, but also provide lines for personalization.)
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________

(ON SLIDE # )

2. MAIN IDEA #2. ( Min)

(ON SLIDE # )

INTERIM TRANSITION: (So far, we've discussed main idea #2. Do you have any questions?
If not, let's move on to the practical application of ….)
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________

INSTRUCTOR NOTE
Introduce the following practical application.

PRACTICAL APPLICATION. [Provide general information to include group size (if
applicable), time, and setup (handouts, turn charts, actual items to distribute, etc.), along
with the number of instructor(s) required based upon the concept card. Provide the purpose of
the practical application.]

PRACTICE: (Provide detailed, step-by-step instructions describing the student's role in the
practical application.)

PROVIDE-HELP: (Describe each Instructor's role.)


1. Safety Brief: (If applicable) (Brief students on safety precautions and
what to do if there is a mishap.)
2. Supervision and Guidance: (Describe what the instructor is doing during the PA,
i.e., moving about the room, assisting students, answering questions.)
3. Debrief: (If applicable) (Allow participants the opportunity to comment on what
they experienced and/or observed. Provide overall feedback, guidance on any misconceptions, and
review the learning points of the PA.)

TRANSITION: (Review, Probe, and Introduce next main idea. More explanation on how to write
transitions is provided in the SAT Revision 2002 and the Curriculum Developer Course. Provide a
Transition for the instructor to use, but also provide lines for personalization.)
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________

(BREAK – 10 Min) (Break cues are explained in the SAT Revision 2002 and in the Curriculum
Developer Course at IMS.)

(ON SLIDE # )
3. MAIN IDEA #3. ( Min)

(ON HANDOUT # )


INTERIM TRANSITION: (Thus far, we've discussed main idea #3. Do you have any
questions? If not, let's move on to the case study of ….)
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________

INSTRUCTOR NOTE
Introduce Case Study.

CASE STUDY. [Provide general information along with setup (handouts, turn charts, actual
items to distribute, etc.) and the number of instructor(s) required based upon the concept card.
Provide the purpose of the case study.]

STUDENT ROLE: (Provide detailed, step-by-step instructions describing the student's role
during the case study.)

INSTRUCTOR(S) ROLE: (Describe each Instructor's role.)


1. Safety Brief: (If applicable) (Brief students on safety precautions and what
to do if there is a mishap.)
2. Supervision and Guidance: (Describe a detailed script of exactly what the
instructor is doing during the case study.)
3. Debrief: (If applicable) (Allow participants the opportunity to comment on what
they experienced and/or observed. Provide overall feedback, guidance, and review the learning
points of the case study.)

(ON SLIDE # )

TRANSITION: (Review, Probe, and Introduce next main idea. More explanation on how to write
transitions is provided in the SAT Revision 2002 and the Curriculum Developer Course. Provide a
Transition for the instructor to use, but also provide lines for personalization.)
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________

4. MAIN IDEA #4. ( Min)

(ON SLIDE # )

TRANSITION: (Review, Probe, and Introduce next main idea. More explanation on how to write
transitions is provided in the SAT Revision 2002 and the Curriculum Developer Course. Provide a
Transition for the instructor to use, but also provide lines for personalization.)
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________

INSTRUCTOR NOTE
Introduce the following practical application.

PRACTICAL APPLICATION. ( Min) (This is a special case where a method has its
own time cue. Time cues are explained in the SAT Revision 2002 and the Curriculum Developer
Course at IMS.)
[Provide general information to include group size (if applicable) and setup (handouts,
turn charts, actual items to distribute, etc.), along with the number of instructor(s) required
based upon the concept card. Provide the purpose of the practical application.]

PRACTICE: (Provide detailed, step-by-step instructions describing the student's role in the
practical application.)


PROVIDE-HELP: (Describe each Instructor's role.)


1. Safety Brief: (If applicable) (Brief students on safety precautions and
what to do if there is a mishap.)
2. Supervision and Guidance: (Describe what the instructor is doing during the PA,
i.e., moving about the room, assisting students, answering questions.)
3. Debrief: (If applicable) (Allow participants the opportunity to comment on what
they experienced and/or observed. Provide overall feedback, guidance on any misconceptions, and
review the learning points of the PA.)

TRANSITION: (Review, Probe, and Introduce next main idea. More explanation on how to write
transitions is provided in the SAT Revision 2002 and the Curriculum Developer Course. Provide a
Transition for the instructor to use, but also provide lines for personalization.)
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________

SUMMARY ( MIN)
[Provide overview of main ideas covered (no questions should be asked here). Provide closure
(relevance to job) and administrative instructions (IRFs, break).]


Develop a Concept Card


Introduction

We will be discussing the information necessary to develop concept cards. We
will do this by covering the purpose, categories, and elements of a concept
card, and the steps in developing a concept card utilizing the Marine Corps
Automated Instructional Management System (MCAIMS).

Importance

A Concept Card provides continuity of instruction by identifying the method,
media, hours allowed, student/instructor ratio, TLOs/ELOs, references, and/or
any notes pertinent to the conduct of the lesson. By accounting for all of the
aspects associated with a given course on its respective concept cards, a
school accurately reflects the resources required to successfully conduct the
course.

Learning Objectives

TERMINAL LEARNING OBJECTIVE: Given a course structure and learning
objectives, develop a concept card per the SAT Guide and MCAIMS User's
Manual.

ENABLING LEARNING OBJECTIVES:
With the aid of references, given a course structure and learning objectives,
record the required elements to establish a task oriented concept card in
accordance with the SAT Guide, the MCAIMS User’s Manual and the IMS
Concept Card Checklist.

With the aid of references, given a course structure and learning objectives,
record the required elements to establish an exam concept card in accordance
with the SAT Guide, the MCAIMS User’s Manual and the IMS Concept Card
Checklist.

With the aid of references and given a course structure, working as a group,
record the required elements to establish an administrative concept card in
accordance with the SAT Guide, the MCAIMS User's Manual and the IMS
Concept Card Checklist.

With the aid of references and given a course structure, working as a group,
record the required elements to establish a lesson purpose concept card in
accordance with the SAT Guide, the MCAIMS User's Manual and the IMS
Concept Card Checklist.

STUDENT OUTLINE EXAMPLE


UNITED STATES MARINE CORPS


Instructional Management School
Marine Corps Combat Service Support Schools
PSC Box 20041
Camp Lejeune, North Carolina 28542-0041

DEVELOP A CONCEPT CARD


STUDENT OUTLINE

CD0203
SEP 99

What Will I Learn From This Class?

1. Terminal Learning Objective. Given a course structure and


learning objectives, develop a concept card per the SAT Guide and
MCAIMS User's Manual.

2. Enabling Learning Objectives

a. With the aid of references, given a course structure and


learning objectives, record the required elements to establish a task
oriented concept card in accordance with the SAT Guide, the MCAIMS
User’s Manual and the IMS Concept Card Checklist.

b. With the aid of references, given a course structure and


learning objectives, record the required elements to establish an exam
concept card in accordance with the SAT Guide, the MCAIMS User’s
Manual and the IMS Concept Card Checklist.

c. With the aid of references and given a course structure


working as a group, record the required elements to establish an
administrative concept card in accordance with the SAT Guide, the
MCAIMS User’s Manual and the IMS Concept Card Checklist.

d. With the aid of references and given a course structure


working as a group, record the required elements to establish a lesson
purpose concept card in accordance with the SAT Guide, the MCAIMS
User’s Manual and the IMS Concept Card Checklist.

Let’s Get Started!

1. Purpose. Concept cards have both a primary and a secondary


purpose. The primary purpose is to provide the school with a…


UNITED STATES MARINE CORPS


(SCHOOL NAME)
(LOCAL COMMAND)
(COMMAND)
(SCHOOL ADDRESS)

INSTRUCTOR PREPARATION GUIDE


LESSON TITLE: (As on concept card)

LESSON DESIGNATOR: (As on concept card)

TOTAL LESSON TIME: (As on concept card)

REFERENCES: (List references from concept card)

LOCATION OF TEST: (List where the test is located)

PERSONNEL REQUIRED: (List as shown on the concept card,
i.e., instructors, support personnel, Corpsman)

FACILITIES: (List as shown on the concept card, i.e., classroom,
laboratory, ranges, etc.)

REVIEW COURSE MATERIALS:


 Review the course/training schedule, administrative requirements, student
background information, lesson plans, student materials, media, and
evaluations (tests).

ADD PERSONALIZATION:
 Personalize the lesson plan by adding subject matter detail, relating
personal experiences, providing examples, questions, and/or interactive
techniques.

MATERIALS/EQUIPMENT: Make a checklist of items that the instructor needs for the
lesson (i.e. Models, Mock-ups, training aids, audio-visual equipment).
Example:
 Video Cassette

EXERCISE SETUP AND PLANNING: List exercises (i.e., Demonstrations, Practical
Applications) and the setup and planning involved for each, specific to the lesson. Describe as a
step-by-step process.
Example:
Demonstration



SAFETY:
 Review ORA in Master Lesson File
 Reassess the environment for changes that affect the original ORA.
Document any additional considerations/controls on the After Instruction
Report (AIR) for future reference.

APPROVING SIGNATURE _________________________DATE ___________


Date: 20020313

INSTRUCTOR ORIENTATION COURSE PROGRAM OF INSTRUCTION

PREFACE

This course is designed to train Marines in the formal preparation and delivery of instruction at
the Marine Corps School of Infantry (SOI) and Marine Combat Training (MCT) as defined by the SAT
Guide. Comments/recommendations related to this POI may be sent to:

Director, Instructional Management School


Marine Corps Combat Service Support Schools
PSC Box 20041
Camp Lejeune, NC 28542

iii


Date: 20020313

INSTRUCTOR ORIENTATION COURSE PROGRAM OF INSTRUCTION

SECTION I - COURSE DESCRIPTIVE DATA

1. COURSE TITLE. INSTRUCTOR ORIENTATION COURSE

2. LOCATION. Instructional Management School


Marine Corps Combat Service Support Schools
PSC Box 20041
Camp Lejeune, North Carolina 28542-0041

This course is taught at the School of Infantry, Camp Lejeune, NC.

3. COURSE ID. M03H4UA

4. OTHER SERVICE COURSE NUMBER. N/A

5. MILITARY ARTICLES AND SERVICE LIST NUMBER. N/A

6. PURPOSE. The purpose of this course is to train Squad Instructors assigned to the School of
Infantry in the formal preparation and delivery of instruction in accordance with the Systems
Approach to Training (SAT). Students work in small groups with the emphasis on employing the
demonstration, coaching, and practical application methods of teaching.

7. SCOPE. This course provides the skills required for Marines to succeed as Squad Instructors
at the School of Infantry. The course includes rehearsal techniques, coaching techniques, lesson
presentation, student management techniques, administration of performance evaluations, and
refinement of the basic speaking/listening skills.

8. LENGTH (PEACETIME). 5 Training Days

9. CURRICULUM BREAKDOWN (PEACETIME).

35.75 Academic Hours


12.00 Practical Application (Individual)
1.50 Demonstration
9.50 Lecture
0.50 Instructional Videotape
10.50 Performance Exam
1.25 Remedial Performance Exam
0.50 Written Exam
1.50 Administrative Hours
1.00 Graduation Exercise/EOC
0.50 IOC Course Overview
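The curriculum breakdown above can be verified with simple arithmetic: the hours listed for each training situation must sum to the stated academic and administrative totals. As an illustrative sketch only (hypothetical Python, not part of the POI format):

```python
# Curriculum breakdown (peacetime) from the POI above, in hours
academic = {
    "Practical Application (Individual)": 12.00,
    "Demonstration": 1.50,
    "Lecture": 9.50,
    "Instructional Videotape": 0.50,
    "Performance Exam": 10.50,
    "Remedial Performance Exam": 1.25,
    "Written Exam": 0.50,
}
administrative = {
    "Graduation Exercise/EOC": 1.00,
    "IOC Course Overview": 0.50,
}

academic_hours = sum(academic.values())              # 35.75, matching item 9
administrative_hours = sum(administrative.values())  # 1.50, matching item 9
print(academic_hours, administrative_hours)
```

A check like this catches the most common POI error, a method-hours line that no longer matches the summary totals after a curriculum change.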

10. LENGTH (MOBILIZATION). 5 Training Days

11. CURRICULUM BREAKDOWN (MOBILIZATION). Same as Peacetime.

12. MAXIMUM CLASS CAPACITY. 12

13. OPTIMUM CLASS CAPACITY. 12

14. MINIMUM CLASS CAPACITY. 6

15. CLASS FREQUENCY. 7

16. STUDENT PREREQUISITES. This course is appropriate for Sergeants and below who are assigned as
a Squad Instructor at the School of Infantry (Infantry Training Battalion (ITB)) or Marine Combat
Training (MCT).

17. MOS RECEIVED. None.

I-1


Date: 20020313

INSTRUCTOR ORIENTATION COURSE PROGRAM OF INSTRUCTION

SECTION I - COURSE DESCRIPTIVE DATA

18. QUOTA CONTROL. CG MCCDC (C463 FT)

19. FUNDING. CG MCCDC (463 FM)

20. REPORTING INSTRUCTIONS. Report to the Academics Officer, School of Infantry, Camp Geiger,
Bldg TC855. Report time is no later than 0715 on the course convening day. Messing and
billeting are available for students.

Instructional Management School


DSN: 750-0941
COMM: 910-450-0941

School of Infantry
DSN: 750-0118/0134
COMM: (910) 450-0118/0134

21. INSTRUCTOR STAFFING REQUIREMENTS. See Appendix A for Instructor Computation Worksheet.

1. Request T/O 7551 be modified to reflect the addition of 2 instructors to support


this course per the Instructor Computation Worksheet. Current T/O does not provide
enough instructors to effectively support IMS's current tasking and the additional
requirements for this course.

2. Instructors assigned to most Marine Corps formal schools are subject matter
experts and only require training to hone their instructional techniques. This is
not the case with instructors assigned to the Instructional Management School.
Instructors are assigned from a wide range of occupational fields and have no
previous experience in the areas of Instructional Systems Design, instructional
delivery techniques, adult learning theories, education processes, and the Systems
Approach to Training. The staff development process required to train an IMS
instructor is extremely extensive. It takes approximately eighteen to twenty-four
months for an instructor to gain proficiency in one IMS Program of Instruction.
This is comparable to the eighteen months required to train a Marine Enlisted
Education Staff Non-Commissioned Officer, MOS 9917, to become Individual Standards
Designers, or Professional Development Education Analysts. As the organization
responsible for training the trainers, curriculum developers and school
administrators to support all Marine Corps Formal Schools and Training Centers, it
is imperative that instructors assigned to IMS are duty experts in the areas of
training and education. Instructors of the highest caliber, knowledgeable in
education processes are absolutely essential and a requirement in order to
effectively train the trainers who "Sustain the Transformation."
3. IMS has a small staff (School Director and 9 instructors) responsible for
developing, managing, and implementing its seven formal courses of instruction (both
resident and MTT). Because of this limited structure, IMS must have instructors who
are capable of effectively teaching each course if it is to successfully train all
Marine Corps formal school faculty. The only way to successfully accomplish this
mission and meet the student numbers identified in the Training Input Plan (TIP) is
to cross-utilize instructors. Several factors dictate this cross-utilization.
These factors include: number of instructors on staff, number of courses taught,
student to instructor ratio for each course, and the number of classes taught for
each course during the year. Because of course scheduling during the year, each
instructor may be used in every course. For example, an instructor may teach an
Instructor Orientation Course (IOC) one week, teach and act as faculty advisor for
the Curriculum Developers Course (CDC) students the next, and then be sent on a
Formal School Instructor Course (FSIC) Mobile Training Team (MTT) for the following
two weeks. When looking at the IMS as a whole, the difficulties of associating
specific T/O line numbers to a course become evident. If this formula is applied,
instructor usage would not be adequately reflected in the CDD. Therefore, all nine
instructor billets are listed below with the asterisk identifying the Course Chief.

I-2


Date: 20020313
INSTRUCTOR ORIENTATION COURSE PROGRAM OF INSTRUCTION

SECTION I - COURSE DESCRIPTIVE DATA

LN# GRADE MOS BILLET DESCRIPTION REQUIRED

52 E8 9917 Chief Instructor 1


54 E7 3529 Instructor 1
55 E7 3529 *Instructor 1
56 E7 1371 Instructor 1
57 E7 0369 Instructor 1
57A E7 1371 Instructor 1
58 E7 0193 Instructor 1
58A E7 0369 Instructor 1
58B E7 3537 Instructor 1

22. SCHOOL OVERHEAD REQUIREMENTS.

LN# GRADE MOS BILLET DESCRIPTION REQUIRED


51 O4 9602 Director 1
51A O3 9910 Course Coordinators 2
51B G12 1750 Instructional System Specialist 1
59 G04 0318 Secretary 1

Comments Line# 51B: GS-12 Instructional System Specialist. This line number is currently
unfunded. The need for a professional level educator on the staff of IMS is imperative. IMS
(East) has the role of lead school, responsible for developing curriculum to train instructors,
administrators, and curriculum developers for Marine Corps formal schools and training centers
both on the east and west coast. The transient nature of the IMS staff has resulted in a serious
erosion in the area of curriculum development. The turnover time for instructors assigned to IMS
averages eighteen to twenty-four months. This presents a severe problem for the school in the
areas of continuity, instructor development and certification. A resident professional level
civilian educator would alleviate the problems associated with the lack of staff continuity. If
the GS-12 billet continues to remain unfilled, the quality and content of IMS curricula and
instruction will deteriorate. This will adversely impact the quality of instruction provided to
all Marine Corps Formal Schools and Training Centers. Additionally, if the quality of IMS
instruction is deficient, this will have a direct negative effect on the quality of the Marines
provided to the Operating Forces. This billet is absolutely essential in order to execute the
dictates of the IMS charter and support the Marine Corps Training Modernization Initiative. If
this line number remains unfunded, it will have a severe negative impact on IMS (East)'s ability to
execute its tasking as lead school.

Comments Line# 59: This billet number should be upgraded to GS-07,


Education Technician. This would provide continuity within the instructor staff,
and address the deficiency in the level of expertise resulting from the highly
transient nature of the IMS instructors. Additionally, an Education Technician
would support the offering of this course, the Instructor Orientation Course, the
Curriculum Developer Course and Administrators Course.

23. TRAINING/EDUCATION SUPPORT REQUIREMENTS. The following are training and education
support requirement shortfalls:

Video Camera/Tripod: Funding for 2 video cameras at a cost of $600.00 per camera,
totaling $1,200.00, and 2 tripods at a cost of $100.00 each, totaling $200.00. These
represent costs for the 2 cameras needed to support the new requirements of this course.
The course is structured to provide the ability to record student presentations for
review and critique.

The following facility requirements are identified for one iteration of this course:
FACILITY FACILITY ID SQ FT REQ'D ON HAND SHORT
CLASSROOMS NA 150 3 3 0

The following materiel requirements are identified for one iteration of this course:

NOMENCLATURE NSN UNIT OF ISSUE REQ'D ON HAND SHORT


BOOKS NA EACH 12 12 0
VIDEO CAMERA 671000C002007 EACH 2 2 0
VIDEO CAMERA TRIPOD 676001C003271 EACH 2 2 0

I-3


Date: 20020313

INSTRUCTOR ORIENTATION COURSE PROGRAM OF INSTRUCTION

24. TASK LIST. See Appendix B.

CDD NOTES: IMS's current T/O does not provide enough instructors to effectively
execute this course's requirement and the concurrent training IMS provides.

I-4


Date: 20020313

INSTRUCTOR ORIENTATION COURSE PROGRAM OF INSTRUCTION

SECTION II - SUMMARY OF HOURS

PEACETIME (5 TRAINING DAYS)

ACADEMIC TIME

TITLE HOURS ANNEX

PREPARE FOR INSTRUCTION 20.00 A

IMPLEMENT INSTRUCTION 15.75 B

TOTAL ACADEMIC HOURS: 35.75

ADMINISTRATIVE TIME

GRADUATION EXERCISE/EOC 1.00 Z

IOC COURSE OVERVIEW 0.50 Z

TOTAL ADMINISTRATIVE HOURS: 1.50

SUMMARY (PEACETIME)

ACADEMIC TIME 35.75

ADMINISTRATIVE TIME 1.50

TOTAL ACADEMIC AND ADMINISTRATIVE TIME: 37.25

MOBILIZATION (5 TRAINING DAYS)

During Mobilization, the length of training days would not change.

II-2


Date: 20020313

INSTRUCTOR ORIENTATION COURSE PROGRAM OF INSTRUCTION

SECTION I - COURSE DESCRIPTIVE DATA

APPENDIX A - INSTRUCTOR COMPUTATION WORKSHEET (LOCKSTEP)


________________________________________________________________________________________
SECTION I COURSE DATA___________________________________________________________________

COURSE: M03H4UA INSTRUCTOR ORIENTATION COURSE

LOCATION: Instructional Management School


Marine Corps Combat Service Support Schools
PSC Box 20041
Camp Lejeune, North Carolina 28542-0041
This course is taught at the School of Infantry, Camp Lejeune, NC.

PROGRAMMED ANNUAL INPUT (FY 02): 84 LENGTH (AVG CAL DAYS): 5

PROGRAMMED NUMBER OF CLASSES/YEAR: 7 LENGTH (TRAINING DAYS): 5

SYLLABUS HOURS: 35.75


________________________________________________________________________________________
SECTION II CURRICULUM BREAKOUT__________________________________________________________
(A) (B) (C) (D) (E) (F)
MAX MAX
TRAINING CLASS RATIO INST SYLLABUS INST
SITUATION SIZE (X:1) REQ HOURS MANHOURS
Practical Application
(Individual) 12 ÷ 6.00 = 2.00 x 12.00 = 24.00
Demonstration 12 ÷ 12.00 = 1.00 x 1.50 = 1.50
Lecture 12 ÷ 12.00 = 1.00 x 9.50 = 9.50
Instruct Videotape 12 ÷ 12.00 = 1.00 x 0.50 = 0.50
Performance Exam 12 ÷ 6.00 = 2.00 x 10.50 = 21.00
Remedial Perf Exam 12 ÷ 9.00 = 1.33 x 1.25 = 1.67
Written Exam 12 ÷ 12.00 = 1.00 x 0.50 = 0.50

TOTAL INSTRUCTOR MANHOURS PER CLASS(G): 58.67


________________________________________________________________________________________
SECTION III INSTRUCTOR COMPUTATION______________________________________________________

TOTAL INSTRUCTOR PROGRAMMED NUMBER ANNUAL INSTRUCTOR


MANHOURS PER CLASS x OF CLASSES = CONTACT HOURS 410.67

ANNUAL INSTRUCTOR ANNUAL INSTRUCTOR


CONTACT HOURS x 1.26 = HOURS 517.44

ANNUAL INSTRUCTOR MONTHLY INSTRUCTOR


HOURS ÷ 12 = HOURS 43.12

MONTHLY INSTRUCTOR
HOURS ÷ 145 = INSTRUCTORS REQUIRED 0.297 = 1

ICW NOTES: According to the ICW worksheet, 1 instructor is required to teach this
course. IMS utilizes a faculty advisor concept with a 6 to 1 student-to-instructor
ratio. The actual instructor requirement to support a 6:1 ratio is 2.
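The worksheet computation in Sections II and III above reduces to a few lines of arithmetic. The following is an illustrative sketch only (hypothetical Python, not part of MCAIMS), using the 1.26 overhead factor and the 145 available instructor hours per month exactly as the worksheet does:

```python
import math

# (training situation, max class size, max student:instructor ratio, syllabus hours)
curriculum = [
    ("Practical Application (Individual)", 12, 6.00, 12.00),
    ("Demonstration",                      12, 12.00, 1.50),
    ("Lecture",                            12, 12.00, 9.50),
    ("Instructional Videotape",            12, 12.00, 0.50),
    ("Performance Exam",                   12, 6.00, 10.50),
    ("Remedial Performance Exam",          12, 9.00,  1.25),
    ("Written Exam",                       12, 12.00, 0.50),
]

# Columns (D) and (F): instructors required = class size / ratio;
# instructor manhours = instructors x syllabus hours, summed over all situations.
manhours_per_class = sum((size / ratio) * hours for _, size, ratio, hours in curriculum)

classes_per_year = 7
annual_contact_hours = manhours_per_class * classes_per_year
annual_instructor_hours = annual_contact_hours * 1.26       # overhead factor, Section III
monthly_instructor_hours = annual_instructor_hours / 12
instructors_required = math.ceil(monthly_instructor_hours / 145)  # 145 hours available/month

print(round(manhours_per_class, 2))    # 58.67
print(round(annual_contact_hours, 2))  # 410.67
print(instructors_required)            # 1
```

Carrying full precision through the chain (rather than re-entering rounded intermediate values) reproduces the worksheet's figures of 58.67 manhours per class, 410.67 annual contact hours, and 1 instructor required.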

I-A-1


Date: 20020313

INSTRUCTOR ORIENTATION COURSE PROGRAM OF INSTRUCTION

APPENDIX B – TASK LIST

DUTY: 9806.03 IMPLEMENT INSTRUCTION

TASKS: (S) 9806.03.01 Prepare for instruction


(S) 9806.03.02 Conduct a lesson
(P) 9806.03.03 Administer tests

TASK LIST NOTES: None.

I-B-1


Date: 20020313

INSTRUCTOR ORIENTATION COURSE PROGRAM OF INSTRUCTION

SECTION III - SCOPE OF ANNEXES

A. PREPARE FOR INSTRUCTION. These lessons address the preparation skills and techniques
required to effectively implement instruction.

B. IMPLEMENT INSTRUCTION. These lessons address the actual performance and delivery
of instruction.

Z. ADMINISTRATIVE. This annex addresses the commencement exercise.

III-1


Date: 20020313

INSTRUCTOR ORIENTATION COURSE PROGRAM OF INSTRUCTION

SECTION IV - CONCEPT CARDS

1. A concept card is developed to describe each academic or administrative block of time during a
course. These concept cards are then grouped into subject areas, called annexes, which are summarized in Section III.
Annexes A through Y are reserved for academic lessons and exams. Annex Z is reserved for
administrative time.

2. The following information is contained on each academic concept card in Section IV:

a. Heading. The heading listed at the top of the concept card includes the name of the
course, the section of the POI, and the letter and title of the annex to which the lesson or exam
is assigned.

b. Lesson/Exam ID. This designator is a unique code assigned to this specific


lesson or exam within this course.

c. Hours. This number (carried to the second decimal place) depicts the amount of time
required to conduct the lesson or exam once, even if it is presented multiple times to smaller
groups of students.

d. Title. This is the title assigned to this lesson or exam. It should refer to the subject
matter covered in the lesson or exam when possible.

e. Phase (optional). This is a code depicting the phase (e.g., week, month, etc.) of the
course during which this lesson or exam takes place.

f. Group (optional). This is a code depicting the instructional group or section


responsible for teaching or developing this lesson or exam.

g. Methods, Hours, S:I Ratio. Displayed on the concept card are codes which


symbolize the methods of instruction used to present this lesson or exam. Following each method
code is the time (in hours) allocated to that method and the student to instructor ratio
associated with that period of time. (The hours and ratios depicted on the concept card are used
to determine instructor staffing requirements.) The following is a comprehensive list of methods
used in this course and their respective codes:

Method Code
Practical Application (Individual) A(I)
Administrative ADMIN
Demonstration D
Lecture L
Instructional Videotape VT
Performance Exam X(P)
Remedial Performance Exam X(P) INDV
Written Exam X(W)

h. Media. Displayed on the concept card are codes which symbolize the media used to support
this lesson or exam. The following is a comprehensive list of media used in this course and their
respective codes:

Medium Code
Chart C
Computer Aided Graphics CAG
Handout HO
Model M
Mockup MU
None N
Overhead Projector OH
Printed Materials PM
Slides S
Slide Projector SP
Transparency Projector T-P
Transparencies TP
Television TV
Videotape VT
Dry Erase Board WB
IV-1


Date: 20020313

INSTRUCTOR ORIENTATION COURSE PROGRAM OF INSTRUCTION

SECTION IV - CONCEPT CARDS

i. Learning Objective(s)/Lesson Purpose. Academic concept cards contain either learning


objectives or a lesson purpose statement, but not both.

(1) Learning Objective. A learning objective describes a behavior that


students are expected to perform following instruction, not necessarily identical to a behavior
performed on the job. It also details the conditions under which that behavior is performed and
the minimum standards of acceptable performance. A student masters the objective when his or her
performance equals or exceeds the standard. (Information concerning student evaluation and
mastery is contained in Section V of this POI.)

(a) Terminal Learning Objective (TLO). One, and only one, TLO is written for each
task in Section I-B of the POI. The behavior in the TLO duplicates the actual behavior required
on the job, modified only if the constraints of the academic environment will not allow it. A
TLO should only appear on a concept card for a lesson or exam during which students actually
perform the TLO. Each TLO is assigned a numeric designator identical to the designator of its
corresponding task in Section I-B, which is identical to the designator of the Individual
Training Standard (ITS) from which the task was derived. This designator is located in
parentheses at the end of the TLO.

(b) Enabling Learning Objective (ELO). ELOs are designed to teach students the
knowledges and skills required for successful performance of the TLOs. Each ELO is placed only
on concept cards for lessons or exams during which students actually perform the ELO. Many
introductory lessons will contain only ELOs. Each ELO is assigned the same numeric designator as
the TLO it supports, followed by a unique combination of one or two letters. This designator is
located in parentheses at the end of the ELO. (The first 26 ELOs are assigned the letters "a"
through "z" consecutively. If there are more than 26 ELOs, they are assigned the letters "aa"
through "az," then "ba" through "bz," etc.)
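The lettering scheme described above (a-z, then aa-az, ba-bz, and so on) is equivalent to bijective base-26 numbering. As an illustrative sketch only (the function name is hypothetical, not part of the SAT Manual), the designator suffix for the nth ELO could be computed as:

```python
def elo_suffix(n):
    """Illustrative sketch: letter designator for the nth ELO (1-based).

    1-26 -> 'a'-'z'; 27 -> 'aa' ... 52 -> 'az'; 53 -> 'ba'; and so on,
    matching the a-z, aa-az, ba-bz scheme described above.
    """
    letters = ""
    while n > 0:
        # Bijective base-26: subtract 1 first so there is no "zero" letter.
        n, rem = divmod(n - 1, 26)
        letters = chr(ord("a") + rem) + letters
    return letters
```

Under this scheme, for example, the 27th ELO supporting a TLO designated 9806.03.01 would carry the designator (9806.03.01aa).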

(2) Lesson Purpose. A lesson purpose statement is recorded on a concept card where no
learning objectives are appropriate (e.g., overview, orientation, or enrichment lesson) and the
lesson is not to be evaluated. The lesson purpose statement clearly describes the rationale for
presenting the lesson.

j. Ammunition Requirements. Whenever a lesson requires the use of ammunition by students or by the instructional staff in support of the lesson, the concept card for that lesson will
include a table depicting those requirements. Included for each type of ammunition will be its
Department of Defense Identification Code (DODIC), its nomenclature, the average number of rounds
used by each student, and the number of support rounds.

k. Notes (optional). This section of the concept card contains any information pertinent to
the lesson. Examples of items which may be addressed here are instructor requirements, scheduling
notes, special prerequisites, references to tests on which material will be evaluated, etc.

l. References. This section contains the source documents used for development of the
lesson or other references that relate to the lesson. At a minimum, it must contain all documents
referenced in the learning objectives included on the concept card.

3. The following information is contained on each administrative concept card in Section IV:

a. Heading. The heading listed at the top of the concept card includes the name of the
course, the section of the POI, and the fact that this concept card is part of Annex Z,
Administrative Time.

b. Event ID. This designator is a unique code assigned to this administrative event within
the course.

c. Hours. This number (carried to the second decimal place) depicts the amount of
administrative time required for this event. If this is a repeating event, one concept
card may indicate the cumulative hours associated with this event throughout the course.

d. Event. This is a short description of the administrative event.

e. Notes (optional). This section of the concept card contains any information pertinent to
the administrative block of time.

4. The following pages contain useful information for locating the learning objectives and
lessons that make up this course.

a. Location of Learning Objectives Report. This report lists, by learning objective designator, all learning objectives developed for this course. It also identifies every
concept card on which each learning objective is included.

b. Academic and Administrative Summaries. These reports list, by annex, all academic and
administrative concept cards in Section IV. Within each annex the concept cards are listed in
lesson identifier order. The information provided for each entry includes Identifier, Title,
Hours, and Type [Task-oriented lesson (T), Lesson Purpose lesson (LP), Exam (E), or
Administrative Time (ADM)]. A subtotal of hours is provided for each annex and for all academic
and administrative concept cards. Total POI hours are listed at the end of the Administrative
Summary.

INSTRUCTOR ORIENTATION COURSE

SECTION IV - CONCEPT CARDS

ANNEX A - PREPARE FOR INSTRUCTION

LESSON ID: IT04A HOURS: 5.25

TITLE: Demonstration Rehearsal

METHOD HOURS S:I RATIO
A(I) 5.25 6:1

MEDIA: N

TERMINAL LEARNING OBJECTIVE(S):

1. With the aid of references and given instructional materials and the requirement to present a
lesson, prepare for instruction per the SAT Guide. (9806.03.01)

2. With the aid of references and given instructional materials, a time, place, students, and a
time limit, conduct a lesson per the SAT Guide. (9806.03.02)

ENABLING LEARNING OBJECTIVE(S):

1. With the aid of references and given instructional materials and the requirement to present a
lesson, rehearse a lesson per the IMS Conduct a Lesson Checklist and the SAT Guide.
(9806.03.01i)

2. With the aid of references and given instructional materials and the requirement to present a
lesson, prepare instructional aids per the IMS Instructional Aids Checklist and the SAT Guide.
(9806.03.01c)

3. Given instructional materials and the requirement to present a lesson, employ communication
techniques per the IMS Conduct a Lesson Checklist and the SAT Guide. (9806.03.01g)

4. Given instructional materials and the requirement to present a lesson, employ questioning
techniques per the IMS Conduct a Lesson Checklist and the SAT Guide. (9806.03.01h)

5. With the aid of references and given instructional materials and the requirement to present a
lesson, prepare instructional environment per the IMS Conduct a Lesson Checklist and the SAT
Guide. (9806.03.01j)

6. Given instructional materials, a time, place, students, and a time limit, present the
introduction per the IMS Conduct a Lesson Checklist and the SAT Guide. (9806.03.02a)

7. Given instructional materials, a time, place, students, and a time limit, present a
demonstration per the IMS Conduct a Lesson Checklist and the SAT Guide. (9806.03.02b)

8. Given instructional materials, a time, place, students, and a time limit, present the summary
per the IMS Conduct a Lesson Checklist and the SAT Guide. (9806.03.02e)

9. Given instructional materials, a time, place, students, and a time limit, employ classroom
management techniques per the IMS Conduct a Lesson Checklist and the SAT Guide. (9806.03.02f)

10. Given instructional materials and the requirement to present a lesson, display instructional
aids per the SAT Guide and AFMAN 36-2236. (9806.03.02h)

NOTE(S):

Individual practical application allows each student 45 minutes to rehearse his or her 30-minute presentation in the classroom environment.

Each presentation takes 45 minutes, including setup, breakdown, and critique time:
45 min x 6 students = 270 min (4.5 hrs) + 45 min (breaks) = 5.25 hrs
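The arithmetic in the note generalizes to any individually timed practical application. A minimal sketch, with an illustrative helper name that is not part of the POI:

```python
def concept_card_hours(minutes_per_student, students, break_minutes=0):
    """Total concept-card hours, carried to two decimal places as on the card."""
    total_minutes = minutes_per_student * students + break_minutes
    return round(total_minutes / 60, 2)

# 45 min per student x 6 students + 45 min of breaks = 5.25 hours
hours = concept_card_hours(45, 6, break_minutes=45)
```

The same helper reproduces other cards' figures, e.g. 15 minutes per student for six students yields 1.50 hours.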

REFERENCE REFERENCE #

1. Handbook for Air Force Instructors AFMAN 36-2236

2. DOD Handbook MIL-HDBK-1379-2

3. Systems Approach to Training Manual SATMANUAL

4. Navy Education and Training Manual NAVEDTRA 130 SERIES

5. Instructional Management School Conduct a Lesson Checklist IMS CLC

6. Instructional Management School Instructional Aids Checklist IMS IAC

INSTRUCTOR ORIENTATION COURSE

SECTION IV - CONCEPT CARDS

ANNEX A - PREPARE FOR INSTRUCTION

EXAM ID: IT03 HOURS: 2.50

TITLE: MLF Review

METHOD HOURS S:I RATIO
A(I) 1.50 6:1
L 1.00 12:1

MEDIA: CAG, HO, TV

TERMINAL LEARNING OBJECTIVE(S):

1. With the aid of references and given instructional materials and the requirement to
present a lesson, prepare for instruction per the SAT Guide. (9806.03.01)

ENABLING LEARNING OBJECTIVE(S):

1. With the aid of references and given a lesson plan, a learning objective checklist,
and the requirement to present a lesson, review learning objectives per the IMS
Learning Objective Checklist and the SAT Guide. (9806.03.01a)

2. With the aid of references and given a lesson plan, lesson plan checklist, and the
requirement to present a lesson, review a lesson plan per the IMS Lesson Plan
Checklist and the SAT Guide. (9806.03.01b)

3. With the aid of references and given a student outline, a student outline
checklist, and the requirement to present a lesson, review student outline per the
IMS Student Outline Checklist and the SAT Guide. (9806.03.01d)

4. With the aid of references and given supplemental student material(s) and the
requirement to present a lesson, review supplemental student material(s) per the
IMS Supplemental Student Material Checklist and the SAT Guide. (9806.03.01e)

5. Without the aid of references and given testing material(s) and the requirement to
present a lesson, state in writing the process to review testing material(s) per
the IMS Testing Material Checklist and the AFMAN 36-2236. (9806.03.01f)

NOTE(S):

Individual practical application provides the instructor 15 minutes to review each master lesson file with the students. 15 min x 6 students = 90 min = 1.50 hrs

REFERENCE REFERENCE #

1. Handbook for Air Force Instructors AFMAN 36-2236

2. DOD Handbook MIL-HDBK-1379-2

3. Systems Approach to Training Manual SATMANUAL

4. Instructional Management School Supplemental Student Materials Checklist IMS SSMC

5. Instructional Management School Learning Objective Checklist IMS LOC

6. Instructional Management School Lesson Plan Checklist IMS LPC

7. Instructional Management School Student Outline Checklist IMS SOC

8. Instructional Management School Testing Material Checklist IMS TMC

9. Instructional Management School Media Checklist IMS MC


INSTRUCTOR ORIENTATION COURSE

SECTION IV - CONCEPT CARDS

ANNEX B - IMPLEMENT INSTRUCTION

LESSON ID: IT06 HOURS: 1.50

TITLE: Administer Tests

METHOD HOURS S:I RATIO

L 1.50 12:1

MEDIA: CAG, HO

TERMINAL LEARNING OBJECTIVE(S):

1. Given the requirement to evaluate the student's learning, state in writing how to administer a
test per the AFMAN 36-2236. (9806.03.03)

DOWNGRADE JUSTIFICATION: This task is being taught to a preliminary level at the school.
Due to time constraints, it is not feasible to effectively evaluate all students
administering norm referenced and criterion referenced tests during the course of
instruction.

2. With the aid of references and given instructional materials and the requirement to present a
lesson, prepare for instruction per the SAT Guide. (9806.03.01)

ENABLING LEARNING OBJECTIVE(S):

1. Given the requirement to evaluate the student's learning, state in writing how to prepare the
testing environment per the AFMAN 36-2236. (9806.03.03a)

2. Given the requirement to evaluate the student's learning, state in writing how to conduct
testing per the AFMAN 36-2236. (9806.03.03b)

3. Without the aid of references and given testing material(s) and the requirement to present a
lesson, state in writing the process to review testing material(s) per the IMS Testing
Material Checklist and the AFMAN 36-2236. (9806.03.01f)

NOTE(S):

Task 9806.03.03. The standard has been modified to accurately reflect the current
standard used by IMS. The SAT Manual does not go into great detail on this subject and
it is not feasible for every student to properly administer tests in their groups in the
time allotted for the course.

REFERENCE REFERENCE #

1. Handbook for Air Force Instructors AFMAN 36-2236
PAGE/CHAPTER: Chapters 20 thru 24.

2. Systems Approach to Training Manual SATMANUAL
PAGE/CHAPTER: Pages 2-23 thru 2-33.

3. Instructional Management School Testing Material Checklist IMS TMC

4. DOD Handbook MIL-HDBK-29612-2A

INSTRUCTOR ORIENTATION COURSE

SECTION IV - CONCEPT CARDS

ANNEX Z - ADMINISTRATIVE

EVENT ID: IT00 HOURS: 0.50

EVENT: IOC Course Overview

METHOD HOURS S:I RATIO

L 0.50 12:1

MEDIA: CAG

NOTE(S):

No student handouts are required for this class. The student will be required to
complete a NAVSO 5724/1 (Fleet Home Town News Release Form) and a locally produced
Student Data Sheet.

REFERENCE REFERENCE #

1. Systems Approach to Training Manual SATMANUAL

2. Academic SOP MCCSSSO P5000.1_

3. Standing Operating Procedures for Instructional Management School IMS SOP

INSTRUCTOR ORIENTATION COURSE PROGRAM OF INSTRUCTION

SECTION V - STUDENT PERFORMANCE EVALUATION

1. SCOPE. This course utilizes performance examinations requiring students to duplicate job performance requirements.

2. MASTERY LEARNING. The evaluation philosophy used in this course requires student
mastery of 100% of the Terminal Learning Objectives and 80% of the Enabling Learning
Objectives, and an overall passing score of 80%.
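The mastery-learning rule above amounts to three conditions that must all hold. A hedged sketch of that rule (the function and parameter names are illustrative only, not drawn from the POI):

```python
def meets_mastery(tlos_mastered, tlos_total, elos_mastered, elos_total, overall_score):
    """Illustrative check of the course's evaluation philosophy:
    100% of TLOs mastered, at least 80% of ELOs mastered,
    and an overall passing score of at least 80%."""
    return (tlos_mastered == tlos_total            # every TLO mastered
            and elos_mastered >= 0.80 * elos_total  # at least 80% of ELOs
            and overall_score >= 80.0)              # overall score threshold
```

For example, a student who masters all TLOs and 8 of 10 ELOs with an overall score of 85 would pass, while any unmastered TLO fails the check regardless of the other scores.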

3. EVALUATION OF STUDENTS

a. Exams. Each student is evaluated on the Learning Objectives completed during each testing
period before proceeding to the next. All Learning Objectives are tested utilizing
performance-based examinations.

b. Remedial Training. In accordance with school policy, remedial training will be conducted one
time for each exam failed or for students experiencing difficulty mastering objectives.


APPENDIX C

MASTER LESSON FILE (MLF) CHECKLISTS

APPENDIX C comprises the following checklists. Additional items may be added to the checklists as
required.
a. Master Lesson File Checklist. This checklist provides a list of documents and checklists that
can be placed in the Master Lesson File (MLF).
b. Learning Analysis Worksheet (LAW) Checklist. This checklist is used during the Design
Phase to evaluate the products of the learning analysis. The LAW is a required item for the MLF.
c. Learning Objective Worksheet (LOW) Checklist. This checklist is used during the Design
Phase to evaluate the learning objectives. The LOW is a required item for the MLF.
d. Test Item Checklist. This checklist is used during the Design Phase to evaluate the test items.
The test itself is usually kept in a secure place. However, the test item checklist can be placed in the
MLF. It is not a required item.
e. Delivery System Checklist. This checklist is used during the Design Phase to ensure that the
delivery system selected is compatible. The checklist is placed in the MLF, but it is not a required item.
f. Concept Card Checklist. This checklist is used during the Develop Phase to evaluate the items
on the concept card. The concept card is a required item to be placed in the MLF.
g. Operational Risk Assessment (ORA) Worksheet Checklist. This checklist is used during the
Develop Phase to evaluate the items on the ORA Worksheet. The ORA Worksheet is a required item to
be placed in the MLF.
h. Lesson Plan Checklist. This checklist is used during the Develop Phase to evaluate the lesson
plan. The lesson plan is a required item to be placed in the MLF.
i. Student Outline/Student Supplementary Materials Checklist. This checklist is used during
the Develop Phase to evaluate the student outline and student supplementary materials (if applicable).
The student outline and all supplementary materials are required to be placed in the MLF.
j. Media Checklist. This checklist is used during the Develop Phase to evaluate the applicability of
the method and media. The media is required to be placed in the MLF.
k. Instructor Preparation Guide (IPG) Checklist. This checklist is used during the Develop
Phase to evaluate whether the necessary elements have been included in the Instructor Preparation
Guide. It also ensures that the information provided is in accordance with the Concept Card. The IPG
is a required item to be placed in the MLF.

OTHER COURSE DEVELOPMENT CHECKLIST

l. Construct a Test Checklist. This checklist is used during the Develop Phase to evaluate the
construction of a test. The test itself must be kept in a secure place. This checklist is not a required item
for the MLF.

m. Course Descriptive Data (CDD)/Program of Instruction (POI) Checklist. This is a
checklist used during the Develop Phase that evaluates the information on the CDD/POI. The CDD/POI is
a separate product from the MLF. This checklist is maintained with the CDD/POI.


MASTER LESSON FILE CHECKLIST


MASTER LESSON FILE REQUIRED ITEMS
1. Learning Analysis Worksheet Checklist YES NO
2. Learning Analysis Worksheets YES NO
3. Learning Objective Worksheet Checklist YES NO
4. Learning Objective Worksheet YES NO
5. Test Item Checklist YES NO
6. Method and Media Selection Checklist YES NO
7. Concept Card Checklist YES NO
8. Concept Card YES NO
9. Operational Risk Assessment Worksheet (ORAW) Checklist YES NO
10. Operational Risk Assessment Worksheet (ORAW) YES NO
11. Lesson Plan Checklist YES NO
12. Lesson Plan YES NO
13. Student Outline/Student Supplementary Materials Checklist YES NO
14. Student Outline YES NO
15. Student Supplementary Materials YES NO N/A
16. Media Checklist YES NO
17. Media (paper copy or explanation of where the media is located) YES NO
18. Instructor Preparation Guide (IPG) Checklist YES NO
19. Instructor Preparation Guide (IPG) YES NO
OTHER COURSE DEVELOPMENT ITEM/CHECKLIST
20. Test* YES NO
21. Test Checklist YES NO
* The test needs to be maintained in a secure place with limited access. Therefore, it is recommended
that this item be placed somewhere other than the MLF.
Remarks:


LEARNING ANALYSIS WORKSHEET CHECKLIST


1. Is the LAW dated when analysis occurred? YES NO

2. Is the Task Designator/Event Code recorded from the ITS/T&R? YES NO

3. Is the Task Behavior/Event Description recorded from the ITS/T&R? YES NO


4. Is the Condition recorded from the ITS/T&R (paper-based only)? YES NO N/A

5. Is the Standard recorded from the ITS/T&R (paper-based only)? YES NO N/A
6. Are the performance steps verbatim from the ITS/T&R? YES NO

7. Is there at least one KSA for each performance step? YES NO


8. Are all knowledge and skills worded properly? (“Know how to…,” “Be able to…”) YES NO

9. Are the grouped KSAs appropriate? (If the grouping is appropriate, the student will exhibit a single behavior that proves mastery of all the KSAs for that group.) YES NO
10. Are all groupings annotated with a designator and a draft behavior statement? YES NO
11. Which of the following methods were used to sequence the groups? (Circle all that
apply.)
a. Whole to part

b. Part to whole

c. Simple to complex

d. Complex to simple

e. Chronological (History)

f. Sequential

g. Cause and Effect order

h. Critical

i. Known to unknown

Remarks:


LEARNING OBJECTIVE WORKSHEET CHECKLIST


1. Are the LOW’s dated (if using paper-based LOW)? YES NO NA

TERMINAL LEARNING OBJECTIVE


2. Was a TLO developed from the ITS/T&R? YES NO

3. Are the TLO condition(s) valid for the school? YES NO


4. Is the condition realistic? (Can school provide?) YES NO
5. Is the TLO behavior verbatim from ITS/T&R? YES NO

6. If the TLO behavior has been modified, is there a downgrade justification explaining why (e.g., due to resource constraints)? YES NO N/A

7. Are the TLO standard(s) valid? YES NO


8. Is the standard realistic? (Considering what is taught, can the student perform at this level?) YES NO
ENABLING LEARNING OBJECTIVES

9. Was an ELO developed for each group of KSA’s on the LAW? YES NO

10. Is an alpha designator assigned to all ELO’s? YES NO

11. Is the task behavior verbatim from the LAW? YES NO

12. Is the task designator verbatim from the LAW? YES NO

13. Are the LO’s written using proper grammar? YES NO

14. Do LO’s possess a single action verb? YES NO

15. Do LO’s possess a single object? YES NO

16. Is the behavior observable and measurable? YES NO
17. If not, is there a modifier/qualifier ensuring an observable/measurable product? YES NO N/A
18. Is the condition consistent with the behavior? YES NO
19. Does the condition describe the environment? YES NO N/A
20. Does the condition describe aiding/limiting factors? YES NO N/A

21. Is the standard consistent with the behavior? YES NO

22. Does the standard tell how well the student has to perform? YES NO

23. Are the LO‘s clear and concise? YES NO


Remarks:


TEST ITEM CHECKLIST


1. Do the test items replicate the conditions of the LO? YES NO

2. Do the test items replicate the standards of the LO? YES NO

3. Do the test items avoid measuring common knowledge? YES NO

4. Do the test items avoid common sense answers? YES NO

5. Does the student perform the behaviors as they are stated in the LO’s? YES NO N/A

6. Are the test items constructed in the proper format? YES NO

7. Are the test items free from ambiguity? YES NO

8. Are the test items closed to interpretation? YES NO

9. Do the test items avoid opinions? YES NO N/A

10. Are the test items free of repeated words or phrases? YES NO

11. Does the test item avoid the use of absolutes (e.g. never, always)? YES NO

12. Are the test items written using proper grammar? YES NO

13. Is the test item as concise as possible? YES NO

14. Are the test items or their location annotated on the LOW’s? YES NO

15. Are the answers (or their location) to the test items annotated on the LOW’s? YES NO N/A

16. If a performance-based test item, are there detailed instructions to the evaluator? YES NO N/A

17. If a performance-based test item, are there detailed instructions to the student? YES NO N/A
18. If a performance-based test item, do the detailed instructions to the evaluator identify
YES NO N/A
the characteristics of a good product?

Remarks:


METHOD AND MEDIA SELECTION CHECKLIST


1. Is the method selection consistent with the learning objective behavior? YES NO

2. Is the media selection consistent with the learning objective behavior? YES NO

3. Is the method selection consistent with the level of learning? YES NO

4. Is the media selection consistent with the level of learning? YES NO


5. Is the media selected appropriate for the class size? YES NO

6. Is the method appropriate for the class size? YES NO

7. Is the method appropriate for the task(s)/topic? YES NO


8. Are the method and media (combined) appropriate for the target population? YES NO
9. Do the method and media (combined) complement different learning styles (auditory, visual, and tactile)? YES NO
10. Does the method provide students the opportunity to practice their skills in a safe environment? YES NO N/A
11. Is the method reflective of how the student will be evaluated? YES NO N/A
12. Are the resources required to implement the method available to the school? YES NO
13. Are the resources required to implement the media available to the school? YES NO
Remarks:


CONCEPT CARD CHECKLIST


1. Is the course title listed? YES NO

2. Is the appropriate annex listed? (check school S.O.P.) YES NO

3. Is the lesson ID correct? YES NO

4. Is the lesson title the same as listed on the course structure document? YES NO
5. Do the total hours for the concept card match the course structure document? YES NO
6. Are all methods and hours listed? YES NO

7. Is the student:instructor ratio appropriate for the method? YES NO


8. Are all forms of media listed? (not required for admin concept card) YES NO N/A
9. Are ammunition requirements listed? YES NO N/A
10. Are there explanatory notes that clarify information on the concept card or list additional resources required? YES NO
11. Are ALL the references used to write the lesson listed? YES NO
ACADEMIC CONCEPT CARD ONLY
(Task oriented, lesson purpose or exam)
12. Does the concept card contain ALL objectives or a lesson purpose statement? YES NO
13. Are the learning objectives listed in the order that they will be taught or evaluated? (not applicable if lesson purpose) YES NO N/A
14. If this is an exam concept card, is there a re-test concept card as well? YES NO N/A
ADMINISTRATIVE CONCEPT CARD ONLY

15. Does the concept card describe the event in sufficient detail (e.g., check in, check out, graduation)? YES NO
16. Is this concept card in Annex Z? YES NO

Remarks:


OPERATIONAL RISK ASSESSMENT (ORA) WORKSHEET CHECKLIST


1. Are the lesson title and lesson designator on the ORA worksheet? YES NO

2. Are all learning objective behaviors listed? YES NO


3. Are sub-steps to the learning objective behaviors listed? YES NO N/A

4. Have hazards been listed for the learning objective behaviors/sub-steps? YES NO N/A

5. Have realistic controls been formulated for all listed hazards? (Available resources must be considered.) YES NO N/A

6. Do the controls change the RAC code to an acceptable level? YES NO N/A

7. Is it explained on the ORA worksheet how to implement the controls? YES NO N/A


8. Is it explained on the ORA worksheet how to supervise? YES NO N/A
9. Are the Cease Training Criteria (CTC) provided? YES NO N/A
10. Is there an approving signature and date on the ORA Worksheet? YES NO

Remarks:


LESSON PLAN CHECKLIST


TITLE PAGE

1. Is the school name and address present? YES NO

2. Is the document’s title (lesson plan) present? YES NO

3. Is the lesson title present? YES NO

4. Is the lesson designator (ID) present? YES NO

5. Is the course title present? YES NO

6. Is the course identification number present? YES NO

7. Is the originating/revision date of lesson present? YES NO

8. Is an approval signature present? YES NO

INTRODUCTION

1. Does the Gain Attention:

a. Relate to the lesson? YES NO

b. Detract from the lesson? YES NO

c. Provide WIIFM for student? YES NO

d. Establish rapport? YES NO

2. Does the Overview:

a. Contain the conceptual framework? YES NO

b. Describe the purpose of the lesson? YES NO

c. Relate to other instruction? (Recall previous learning) YES NO


3. Are the Learning Objectives:
a. Noted in an instructor note for the instructor to introduce? (i.e., students read the LO’s to themselves) YES NO
b. Verbatim from the Concept Card? YES NO
4. Does the Method/Media:

a. Tell how the class will be presented? YES NO


b. Is there an instructor note mentioning the Instructional Rating Form (IRF)? YES NO


5. Does the Evaluation:


a. State how the student will be evaluated (tested)? YES NO
b. Tell the student when they will be evaluated (tested)? YES NO
6. Are safety issues explained? YES NO
7. Is there a transition to the body? YES NO
BODY

1. Are the main ideas in the same sequence as the learning objectives? YES NO N/A

2. Do methods, other than lecture, provide:


a. General information about the method, including the amount of time that the method will take to execute? YES NO
b. Detailed instructions for what the student’s role will be? YES NO

c. The instructor’s role. YES NO


• Provide Safety Brief (if applicable) to inform students of any safety precautions related to the exercise and what to do if there’s a mishap. YES NO N/A
• Provide Supervision and Guidance instructions to describe what the instructor is to be doing (i.e., moving about the room, assisting students, answering questions). YES NO
• Provide Debrief (if applicable) instructions to comment on what was observed, provide overall feedback, additional guidance, and review learning points. YES NO N/A
3. Do Time Cues:

a. Exist for each Main Heading? (Intro, Body, Summary) YES NO

b. For the Main Headings add up to the lesson time on the concept card? YES NO

c. Exist for each Main Idea? YES NO


d. Exist for methods that are not executed within a main idea? (There is no time cue required for a method that is within a main idea.) YES NO N/A
e. For the main ideas, methods (if the method is not part of a main idea), and breaks (that fall between main ideas), add up to the time cue for the Body? YES NO N/A
f. Stand out from normal text? YES NO N/A

4. Do/Are Media Cues:

a. Stand out from normal text? YES NO N/A


b. Identified by a number designator? YES NO N/A

5. Are Break Cues written into the lesson? YES NO N/A

6. Are Instructor Notes:

a. Placed where needed throughout the lesson? YES NO

b. Clear and concise? YES NO

7. Are Transition(s):

a. Between each main idea? YES NO

b. Does the transition summarize the last main idea, probe, and introduce the next main idea? YES NO
c. Between the last main idea and summary? YES NO N/A

8. Are Interim Transition(s):

a. Between the demo/practical application sessions? YES NO N/A

b. Before and after breaks? YES NO N/A


SUMMARY

1. Does the Summary review the main ideas? YES NO

2. Does it review each main idea without re-teaching? YES NO

3. Does it refrain from presenting any new material? YES NO

4. Does it provide closure? (Reaffirm importance of content.) YES NO

5. Does it provide closing instructions? YES NO

6. Does it contain instructions for Instructional Rating Forms (IRF)? YES NO


ENTIRE CLASS

1. Is the lesson detailed enough that all information can be covered by a first-time instructor? YES NO
Remarks:


STUDENT OUTLINE CHECKLIST


1. Learning Objectives - Are they verbatim from the concept card? YES NO

2. Outline – Does it follow conceptual framework? YES NO


3. References – Are all the references used in the lesson annotated on the last page? YES NO
4. Is the student outline written as if addressing the student? YES NO

5. Is the font size at least 10? YES NO

6. Is the text easy to read? YES NO

7. Is there ample white space (margins) for the student to take notes? YES NO

8. Do exercises or activities match those in the lesson plan? YES NO N/A


SUPPLEMENTAL STUDENT MATERIALS
1. Is the material relevant to the learning objectives? YES NO
2. If intended as a job aid, is it durable (e.g. laminated)? YES NO

Remarks:


MEDIA CHECKLIST
1. Does the media enhance the information in the lesson plan? YES NO

2. Is the alignment used appropriate to the type of media? YES NO


3. Is the use of upper-case lettering minimized (only used for titles or to highlight text)? YES NO
4. Are the images used related to the content? YES NO

5. Does the page or frame refrain from clutter/image overload? YES NO

6. Do the colors contrast well? YES NO N/A

7. Is the appropriate level of vocabulary used? YES NO

8. Has the media been checked for spelling and grammar? YES NO

9. Do terms in the media match terms in the student outline? YES NO

PRINT MEDIA

10. Is the font size at least 10? YES NO

11. Is the text style consistent (headings, text, etc.)? YES NO

12. Is the format/layout consistent (spacing)? YES NO

13. Is there ample white space (margins)? YES NO

14. Is the text easy to read? YES NO

PROJECTED MEDIA

15. Are sans serif fonts (without finishing strokes) used? YES NO

16. Is the font size large enough to be seen by all (at least 24 for projected media)? YES NO

17. Is information bulleted using key words and phrases? YES NO

18. Are no more than six words a line and six lines (6X6 rule) per visual used? YES NO

19. Is the animation distracting within the presentation? YES NO N/A

20. Is the sound distracting within the presentation? YES NO N/A

21. Is the layout consistent throughout the presentation? YES NO

Remarks:


INSTRUCTOR PREPARATION GUIDE (IPG) CHECKLIST


1. Is the lesson title the same as on the concept card? YES NO

2. Is the lesson designator the same as on the concept card? YES NO

3. Is the total lesson time the same as on the concept card? YES NO

4. Are all references the same as on the concept card? YES NO

5. Is the location of tests identified? YES NO

6. Are all personnel required the same as on the concept card? YES NO

7. Are all facilities required the same as on the concept card? YES NO

8. Are all course materials that need to be reviewed listed? YES NO

9. Is there a step to personalize the lesson plan? YES NO

10. Are all materials and equipment needed to conduct the lesson listed? YES NO
11. Are there detailed instructions for the setup and planning of each exercise? YES NO
12. Are all safety precautions related to lesson listed? YES NO

Remarks:


TEST CHECKLIST

1. Does it contain detailed instructions to the instructor? YES NO N/A

2. Are there instructions to the evaluator concerning scoring? YES NO

3. Does it contain detailed instructions to the student? YES NO

4. Are there instructions covering the consequences of cheating? YES NO

5. Does it state the safety precautions? YES NO

6. Is the purpose of this test clear? YES NO

PERFORMANCE-BASED TEST

7. Does it identify the task to be completed? YES NO


8. Does it contain a checklist of steps to be evaluated OR criteria of a good product? YES NO
KNOWLEDGE-BASED TEST

9. Are there an appropriate number of test items for each objective? YES NO
10. Are all like test items grouped by type (fill-in-the-blank, multiple choice, etc.)? YES NO
11. Have they been compared to the rest of the test to ensure they are:

a. Free of hints? YES NO

b. Not repeated elsewhere? YES NO

c. Consistent in format with like test items? YES NO

12. Are items on the test taken verbatim from the LOWs? YES NO

13. Is this test valid? (Does it measure what it is supposed to measure?) YES NO

14. Is this test usable? (Is it easy to administer, score, and interpret the results?) YES NO

Remarks:


COURSE DESCRIPTIVE DATA (CDD)/PROGRAM OF INSTRUCTION (POI)


1. Does the POI contain a title page? YES NO

2. Does the POI contain a certification page? YES NO

3. Does the POI contain a record of changes page? YES NO

4. Does the POI contain a preface page? YES NO

5. Does the preface have a narrative of the purpose for the course? YES NO

6. Does the preface include information regarding graduates? YES NO

7. Does the preface include a point of contact for recommended course changes? YES NO

8. Does the POI contain course descriptive data (CDD)? YES NO

9. Does the POI contain a table of contents? YES NO

10. Does the CDD reflect the Course Title? YES NO

11. Does the CDD reflect the school name and address? YES NO

12. Does the CDD reflect the Course ID? YES NO

13. Does the CDD purpose identify the course intent? YES NO

14. Does the CDD scope identify all areas of instruction? YES NO

15. Is the mobilization length justified? YES NO

16. Does the course length equate to the curriculum breakdown? YES NO

17. Does the CDD reflect the Max Class Capacity? YES NO

18. Does the CDD reflect the Min Class Capacity? YES NO

19. Does the CDD reflect the Optimum Class Capacity? YES NO

20. Does the CDD reflect the Class Frequency? YES NO

21. Does the CDD reflect Student Prerequisites? YES NO

22. Does the CDD reflect the Quota Control? YES NO


23. Does the CDD reflect the Funding Agency? YES NO

24. Do Reporting Instructions contain Messing and Billeting notes? YES NO

25. Are the Supervisor and Instructor billets identified? YES NO

26. Are Instructor Staffing notes present? YES NO

27. Are the School Overhead billets identified? YES NO

28. Are School overhead notes present? YES NO

29. Does the CDD contain Appendix A-Instructor Computation Worksheet (ICW)? YES NO

30. Does the CDD’s Appendix A contain accurate information? YES NO

31. Does the ICW require any further explanatory notes? YES NO

32. Are the notes clear and concise? YES NO

33. Does the POI contain Section II-Summary of Hours? YES NO

34. Is the Summary of Hours broken down correctly? YES NO

35. Does the total time justify the total number of training days? YES NO

36. Does each annex have its appropriate total time? YES NO

37. Does the POI contain Section III-Scope of Annexes? YES NO

38. Does the Scope of Annexes define the purpose of each? YES NO

39. Does Section IV contain a location of Learning Objectives Report? YES NO

40. Does Section IV contain an Academic Summary? YES NO

41. Does the Academic Summary justify the total academic/administrative time? YES NO

42. Does section IV contain all the concept cards? YES NO

43. Is a concept card developed for each lesson, administrative event, and exam? YES NO

44. Do the concept cards contain notes that clarify activity? YES NO

45. Does the POI contain a Section V-Student Performance Evaluation? YES NO

46. Does Section V contain statements that describe the purpose of the student’s evaluation? YES NO


47. Does Section V contain statements that describe the method(s) of the student’s evaluation? YES NO

48. Does Section V contain statements that describe the remediation? YES NO

49. Does Section V contain statements that describe what happens if the student fails remediation? YES NO

50. Does the POI contain a Section VI-Distribution List? YES NO

51. Does the Distribution List contain the agencies to which the POI is to be distributed? YES NO

Remarks:


APPENDIX D

SAMPLE QUESTIONNAIRES

APPENDIX D comprises the following evaluation questionnaires. Additional items may be added to the questionnaires as required.

a. Instructional Rating Forms (IRF's). This questionnaire is a student reaction form completed by at least 10 percent of the students immediately following each lesson.

b. Examination Rating Forms (ERF's). This questionnaire is a student reaction form completed by at least 10 percent of the students immediately following each examination.

c. End of Course Critiques (ECC's). This questionnaire is a student reaction form completed by 100 percent of the students at the end of a course.

d. Post Graduate Survey. This questionnaire is sent (i.e. emailed, mailed, available online)
to course graduates approximately 3 months following completion of the course.

e. Post Graduate Supervisor Survey. This questionnaire is sent (i.e. emailed, mailed,
available online) to the supervisors of course graduates approximately 3 months following the
graduate's completion of the course.

f. Safety Questionnaire. This questionnaire is a student reaction form that provides the
student with an opportunity to assess whether he/she has been informed about safety issues.


INSTRUCTIONAL RATING FORM


One way instruction is improved is by sampling student reaction to it. To assist the school in improving this lesson and our courses, please answer the following questions.
Instructor: Date:
Course: Lesson:
INSTRUCTIONS: Circle the answer that indicates your level of agreement or disagreement as follows: Strongly Disagree = 1, Disagree=2,
Agree=3, and Strongly Agree=4. Please explain in the section labeled comments any ratings of 1 or 2. If the question is not applicable,
then circle NA.
1. INSTRUCTOR

a. The instructor showed a thorough knowledge of the lesson material. 1 2 3 4 NA


b. The instructor communicated the lesson material in a way that could be easily understood. 1 2 3 4 NA
c. The instructor gave precise instructions concerning in-class exercises. 1 2 3 4 NA
d. The instructor encouraged student participation. 1 2 3 4 NA
e. Students' questions were answered in a professional (not demeaning to the student) manner. 1 2 3 4 NA
2. LESSON CONTENT
a. The content was presented at the right pace. 1 2 3 4 NA
b. The student outline aided my understanding of the content covered. 1 2 3 4 NA
c. The environment of the class was interactive. 1 2 3 4 NA
3. SAFETY
a. Lesson related safety to job performance. 1 2 3 4 NA
b. Cease Training procedures were adequately explained. 1 2 3 4 NA
c. Safety precautions were reemphasized prior to commencing tasks. 1 2 3 4 NA
d. Safety was paramount at all times. 1 2 3 4 NA
e. Equipment/material was safe for use. 1 2 3 4 NA
4. METHODS/MEDIA:
a. The in-class exercises required in the course were worthwhile learning experiences. 1 2 3 4 NA
b. The way that the class material was presented enhanced my ability to learn/perform the concept/task. 1 2 3 4 NA
   I especially liked the ___________________________ method.

c. The media complemented instruction. 1 2 3 4 NA


5. STUDENT: Circle the answer that best describes your knowledge level.
a. My knowledge of the content prior to this class was: None / Very Little / Average / Above Average / Expert
b. My knowledge of the content after completing the class was: None / Very Little / Average / Above Average / Expert
Name___________________________ Parent Unit: ___________________________________
Overall Comments/Suggestions for the Class (use back of form if more space is needed):
____________________________________________________________________________
____________________________________________________________________________
_________________________________________________________________________


EXAMINATION RATING FORM


One way that we improve the examination process is by sampling student reaction to the examination. To assist in improving this
process, please answer the following questions. These forms will not be viewed until after all tests have been scored and returned.

Name: Date:
Course: Exam:
A. INSTRUCTIONS: Circle the answer that indicates your level of agreement or disagreement as follows: Strongly Disagree = 1,
Disagree= 2, Agree=3, and Strongly Agree=4. Please explain in the section labeled comments any ratings of 1 or 2. If the question
is not applicable, then circle NA.
PRIOR TO TEST:

1. Test instructions were clear and concise. 1 2 3 4 N/A

2. I was allowed the opportunity to ask questions. 1 2 3 4 N/A

3. The time allowed for testing was indicated prior to the start of the test. 1 2 3 4 N/A

4. The instructor indicated what materials could be used during testing. 1 2 3 4 N/A

DURING THE TEST:


5. Distractions were minimal. 1 2 3 4 N/A

6. I was aware of the time remaining to complete the test. 1 2 3 4 N/A

7. Unfair advantage was not given to any other student during the test. 1 2 3 4 N/A

8. A monitor was present at all times during the test. 1 2 3 4 N/A


B. INSTRUCTIONS: If you have taken a written test, please answer questions 9-11. If you have taken a performance
test, please answer questions 12-18. If you are unsure of your test type, ask the test proctor.
WRITTEN TEST ONLY:
9. All materials (pen, paper, etc.) necessary for the test were available. 1 2 3 4 N/A

10. Questions were written in a way that I could understand. 1 2 3 4 N/A

11. The information I was tested on was covered in class. 1 2 3 4 N/A

PERFORMANCE TEST ONLY:
12. I had sufficient practice time prior to the test. 1 2 3 4 N/A
13. All equipment necessary for the test was accessible. 1 2 3 4 N/A

14. The skills/information I was tested on were covered sufficiently in class. 1 2 3 4 N/A

15. Performance task requirements were effectively communicated. 1 2 3 4 N/A

16. Safety precautions were reemphasized prior to commencing tasks. 1 2 3 4 N/A

17. Equipment/material was safe for use. 1 2 3 4 N/A

18. Cease Training procedures were adequately explained. 1 2 3 4 N/A

Circle your answer.


19. Prior to the test, I studied: Less Than 1 Hour / 1-2 Hours / 2-3 Hours / More than 3 Hours / Did Not Study
20. Was there any portion of the test that you believe should have been covered more thoroughly during class/practical application? Check (X) yes or no; if yes, please indicate the subject areas. YES NO
________________________________________________________________________
________________________________________________________________________
Other Comments (Please explain any questions rated 1 or 2): (REMARKS ON BACK)


POST GRADUATE SURVEY


_______________________________________Course

Instructions: This questionnaire is designed to gather information to evaluate the effectiveness of the _______________ Course in preparing you for your current duty assignment. Please respond to all questions and return the completed questionnaire by (email or mail).

SECTION I. PERSONAL DATA - Please fill in appropriate data.

Name Rank Graduation Month/Year

Billet MOS
DSN number for contact Email

SECTION II: TASK TRAINING

The tasks listed below presently receive some emphasis in the course. Please rate each task/knowledge on the scales at the right in terms of how often you perform it on your current job and how well the training prepared you, by bolding or highlighting the most appropriate number. (The Level of Preparedness scale may be skipped if the task has never been performed on the job.)

TASK/KNOWLEDGE                  FREQUENCY                  LEVEL OF PREPAREDNESS

INSTRUCTIONS: BOLD or highlight the number that applies.

FREQUENCY: 1- Daily  2- Weekly  3- Monthly  4- Never
LEVEL OF PREPAREDNESS: 1- Not at all prepared  2- Somewhat prepared  3- Prepared  4- Well-prepared  5- Very well prepared

(List tasks required in the course HERE) 1 2 3 4    1 2 3 4 5
1 2 3 4 1 2 3 4 5

1 2 3 4 1 2 3 4 5

1 2 3 4 1 2 3 4 5

1 2 3 4 1 2 3 4 5

1 2 3 4 1 2 3 4 5

1 2 3 4 1 2 3 4 5

1 2 3 4 1 2 3 4 5

1 2 3 4 1 2 3 4 5

1 2 3 4 1 2 3 4 5
(Add additional pages if needed)


(Additional sections may be added to provide specific information for the school,
i.e. type of equipment being used in the FMF, procedures being followed in the
FMF)
SECTION III: COMMENTS AND RECOMMENDATIONS

This section allows you to provide additional information and comments regarding
the effectiveness of the course in preparing you for your current job. Please
record your response in the spaces provided. (Attach additional sheets if more
space is required.)

1. Are there tasks you are required to perform on your job that were not covered in the course?
If so, list the tasks and briefly describe your duties.

2. What recommendations do you have for training tasks you feel were not covered adequately in
the course?

3. If you feel some tasks listed need not be trained in the formal school, please list them here and
explain your reasons.

4. Do you believe you benefited from this course? If so, how? If not, why not?

5. How can we improve this course for future students? (Consider present/future procedure and
equipment changes.)

Additional Comments:


POST GRADUATE SUPERVISOR SURVEY

____________(Course Name)___________Course

Instructions: This questionnaire is designed to gather information to evaluate the effectiveness of the ____(Course Name)________ Course in preparing graduates for future duty assignments. Please respond to all questions and return the completed questionnaire by (email or mail).

SECTION I. PERSONAL DATA - Please fill in appropriate data.


Graduate's Name Rank Graduation Month/Year

Graduate's Billet Type of Unit MOS

DSN number for contact Email


How long have you served in your current billet? (Bold or highlight one)  0-6 mths / 7-12 mths / 13-18 mths / Over 18 mths

SECTION II: TASK TRAINING

The tasks listed below presently receive some emphasis in the course. Please rate each task/knowledge on the scales at the right in terms of how often the graduate performs it on the current job and how well the training prepared him/her, by bolding or highlighting the most appropriate number. (The Level of Preparedness scale may be skipped if the task has never been performed on the job.)
TASK/KNOWLEDGE                  FREQUENCY                  LEVEL OF PREPAREDNESS

INSTRUCTIONS: Bold or highlight the number that applies.

FREQUENCY: 1- Daily  2- Weekly  3- Monthly  4- Never
LEVEL OF PREPAREDNESS: 1- Not at all prepared  2- Somewhat prepared  3- Prepared  4- Well-prepared  5- Very well prepared
(List tasks required in the course HERE)
1 2 3 4 1 2 3 4 5

1 2 3 4 1 2 3 4 5

1 2 3 4 1 2 3 4 5

1 2 3 4 1 2 3 4 5

1 2 3 4 1 2 3 4 5

1 2 3 4 1 2 3 4 5

1 2 3 4 1 2 3 4 5

1 2 3 4 1 2 3 4 5

1 2 3 4 1 2 3 4 5

(Add additional pages if needed)


(Additional sections may be added to provide specific information for the school,
i.e. type of equipment being used in the FMF, procedures being followed in the
FMF)
SECTION III: COMMENTS AND RECOMMENDATIONS

This section allows you to provide additional information and comments regarding
the effectiveness of the course in preparing the graduate for his/her current job.
Please record your response in the spaces provided. (Attach additional sheets if
more space is required.)

1. What recommendations do you have for training tasks you feel were not covered
adequately in the course?

2. If you feel some tasks listed need not be trained in the formal school, please list them
here and explain your reasons.

3. Do you believe the graduate benefited from this course? If so, how? If not, why not?

4. How can we improve this course for future students? (Consider present/future
procedure and equipment changes.)

Additional Comments:


END OF COURSE CRITIQUE


The End of Course Critique provides the school with your reaction to the course you just completed.
The information you provide us is treated confidentially and is used to improve the quality of
instruction for the overall course. Thank you for your input.

COURSE: DATE:
STUDENT NAME:

A. Circle or highlight the rating that indicates your level of agreement or disagreement. Please comment on all ratings of 1 or 2. All comments are encouraged regardless of whether you agreed or disagreed. (Strongly Disagree = 1, Disagree = 2, Neither Disagree nor Agree = 3, Agree = 4, Strongly Agree = 5)
1. I had a clear understanding of what I would be required to learn or do in this course. (The learning objectives were clearly stated.) 1 2 3 4 5
COMMENTS:

2. I am confident that I have learned or can perform the tasks required by the learning objectives. 1 2 3 4 5
COMMENTS:

3. The written and performance exams tested my knowledge and/or ability to perform the learning objectives. 1 2 3 4 5
COMMENTS:

4. The quizzes/puzzles/games/review sessions, when used, increased my knowledge of the subject and prepared me for the tests. 1 2 3 4 5 N/A

COMMENTS:

5. Class time was used to achieve the learning objectives. 1 2 3 4 5


COMMENTS:

6. The time allotted to cover each lesson was appropriate for what I was expected to learn. 1 2 3 4 5
COMMENTS:


7. Course length was appropriate for what was expected. 1 2 3 4 5


COMMENTS:

8. The overall schedule for the course flowed logically and was well-organized. 1 2 3 4 5
COMMENTS:

9. Student outlines, training aids (i.e. internet sites, graphs, charts, maps), and/or references were available. 1 2 3 4 5 N/A
COMMENTS:

10. The student outlines, training aids (i.e. internet sites, graphs, charts, maps), and/or references used supported instruction. 1 2 3 4 5 N/A
COMMENTS:

11. Student outlines aided my understanding of the material. 1 2 3 4 5 N/A


COMMENTS:

12. Student outlines were easy to follow. 1 2 3 4 5 N/A


COMMENTS:

13. The media (i.e. PowerPoint, models, posters) used supported instruction. 1 2 3 4 5 N/A
COMMENTS:

14. Considering the amount of material covered during the course, there was sufficient time available for both in-class and out-of-class (if applicable) work. 1 2 3 4 5
COMMENTS:

15. The methods (i.e. lecture, demonstration, practical application, case study, group exercises) used to present course information helped me to understand the course material. 1 2 3 4 5
COMMENTS:

16. Instructors were knowledgeable and well-prepared. 1 2 3 4 5


COMMENTS:

17. The instructors responded effectively to questions and input. 1 2 3 4 5 N/A


COMMENTS:


18. The instructors were professional. 1 2 3 4 5


COMMENTS:

19. The overall course gave me a thorough understanding of my duties in the operating forces and sufficient knowledge and skills to perform those duties. 1 2 3 4 5 N/A
COMMENTS:

20. Instructors followed safety precautions at all times. 1 2 3 4 5 N/A


COMMENTS:

21. Lessons on safety were included as applicable. 1 2 3 4 5 N/A


COMMENTS:

22. Lessons related safety to job performance as applicable. 1 2 3 4 5 N/A


COMMENTS:

23. Cease Training procedures were adequately explained as applicable. 1 2 3 4 5 N/A


COMMENTS:

24. Emergency action procedures were adequately explained as applicable. 1 2 3 4 5 N/A


COMMENTS:

25. Safety precautions were put in place prior to each event as applicable. 1 2 3 4 5 N/A
COMMENTS:

B. Answer the following questions.

26. Were there any lessons/blocks of instruction that were particularly confusing or could be improved? YES/NO -- If you responded yes, please explain.


27. Were there any portions of the course where there was idle time (i.e. standing around, not focused)? YES/NO -- If
you responded yes, please explain.

28. What is your overall evaluation of the instructors?

29. What is your overall evaluation of the course?


SAFETY QUESTIONNAIRE
INSTRUCTIONS: This checklist is to ensure that you, the student, have been properly advised
of safety issues specific to this training. Your comments will help this school provide safe
training, improved guidance to the instructional staff, and to address your concerns regarding
safety measures.

LESSON TITLE/PRACTICAL APPLICATION:

INSTRUCTOR: DATE:

A. Check the appropriate answer. YES NO

1. Did instructors follow safety precautions at all times?

2. Were safety precautions explained prior to training?

3. Were safety precautions reemphasized prior to practical applications and/or the performance exam?

4. Were Cease Training procedures adequately explained?

5. Did the instructor explain the procedure to be taken in the event of a mishap?

6. Was a safety brief included as applicable?

7. Did the lesson relate safety to job performance?

8. Were the tools and equipment in good working condition and safe to use?

9. Was supervision available when performing potentially dangerous tasks?

10. Was there encouragement to report any unsafe or unhealthy conditions?


B. Circle the rating that indicates your level of agreement or disagreement (Strongly Disagree = 1, Disagree = 2, Agree = 3, Strongly Agree = 4, N/A).

11. I felt my safety was always a primary concern of the instructor. 1 2 3 4 N/A

12. I felt that the training environment was both safe and non-hazardous. 1 2 3 4 N/A

Additional Comments (Write number of reference and then comment):

STUDENT NAME: _____________________________________ DATE: ___________________


APPENDIX E

SAMPLE CHECKLISTS

APPENDIX E comprises the following evaluation checklists. Additional items may be added to the checklists as required.

a. Standing Operating Procedures (SOP) Checklist. This checklist is used as a job aid
for writing an SOP.

b. Instructor Evaluation Checklist. This checklist is used during the Implement Phase to
evaluate the instructor.

c. Observation Checklist. This checklist is used during the Implement Phase to evaluate
the effectiveness of the course materials during the class.

d. Environment Checklist. This checklist is used during the Implement Phase to evaluate
the instructional environment.

e. Safety Review Checklist. This checklist is used during the Implement Phase to evaluate
how well safety measures have been employed.


SCHOOL SOP CHECKLIST


(The Marine Corps Directives System is the final authority on the requirements for writing an order.)

1. Determine Contents
a. Usability Information
( ) Purpose ( ) Locator Sheet
( ) Scope ( ) Record of Changes
( ) Background ( ) Table of Contents
( ) Recommendations ( ) Appendices
( ) Certification ( ) Index
( ) Distribution

b. Non-Academic Information
( ) Mission
( ) Organization
( ) Operations & General Information
( ) Facilities
( ) Billeting
( ) Messing
( ) Discipline
( ) Staff Development
( ) Turnover Files
( ) Transportation
( ) Safety/Operational Risk Management
( ) Administration and Logistics
( ) Field Exercises
( ) Inspections
( ) Daily Routine
( ) Physical Fitness

c. Academic Information
( ) Job Analysis
( ) Design
( ) Development
( ) Methods and Media Selection
( ) Scheduling
( ) Master Lesson File
( ) Formats
( ) Validation
( ) Implementation
( ) Evaluation
( ) Course Content Review Board (CCRB)
( ) Mastery
( ) Graduation Requirements
( ) Remediation
( ) Programs of Instruction (POI)
( ) Course Description Data (CDD)
( ) Homework


INSTRUCTOR EVALUATION CHECKLIST


NAME: RANK: DATE:

COURSE: LESSON TITLE:

EVALUATION: Rehearsal 1 2 3 Presentation Certification (if applicable)


Quarter 1 2 3 4
INSTRUCTIONS: Evaluate each item on the checklist as YES, NI, (Needs Improvement), NO or NA (Not
Applicable).
1. INTRODUCTION YES NI NO NA COMMENTS
a. Gain Attention. Presented effectively; relates to LO's
b. WIIFM. Established need for students to listen.
c. Overview. Identified purpose of lesson and main points.
d. Learning Objectives. Introduced the learning
objectives.
e. Method/Media. Identified specific method(s)/media
used.
f. Administrative Instructions. IRF's, any other rules, etc.
g. Evaluation. Identified how and when evaluation would
occur.
h. Transition. Closed introduction and opened main idea.
i. Safety Brief (if applicable). Safety
precautions/controls and Cease Training Criteria are
explained.
2. BODY YES NI NO NA
a. Lesson Plan. Followed lesson plan.
b. Examples. Clarified teaching points through use of
examples.
c. Transitions. Closed main idea and opened next.
d. Probed. Used questions to check learning in transitions.
3. SUMMARY YES NI NO NA
a. Review Main Ideas. Reviewed conceptual framework.
b. Provide Closure. Reaffirm importance of content.
c. Closing Instructions. Clear and concise.
4. PROBING/QUESTION & ANSWER TECHNIQUES YES NI NO NA
a. Probing. Used probing questions throughout.
b. Response. Responded effectively to student's
questions/inputs.
c. Response. Responded to areas of confusion.
d. Questioning Techniques. Assessed student learning.
5. FACILITATION TECHNIQUES YES NI NO NA
a. Real World. Provided real world relevancy.
b. Participation. Encouraged student participation.
c. Interaction. Provided intellectual stimulation with
students.
d. Motivation. Used motivation techniques to monitor
activity progress toward meeting lesson purpose.
e. Focus. Established and maintained student attention.
f. Instructions. Clear and concise for exercises/PA's
6. METHOD YES NI NO NA
a. Method employment. States purpose and desired
outcome. Employs effectively.
b. Safety Brief (if applicable). Safety precautions, Cease
Training Criteria, and emergency action procedures are
explained.
c. Supervision and Guidance. Instructions and guidance to students are adequately provided.
d. Debrief (if applicable). Overall feedback; review of
learning points.


7. MEDIA YES NI NO NA
a. Set-up. Able to use equipment. Ensured students were
able to see media.
b. Employment. Media employed at the appropriate time.
8. COMMUNICATION – Nonverbal YES NI NO NA
a. Eye Contact. Evenly distributed, creating a "connection"
with all students.
b. Movement. Natural, smooth and coordinated with
dialogue.
c. Gestures/Mannerism. Avoided distracting mannerisms.
d. Facial Expressions. Varied with mood and content,
sincere, showed concern, reinforced and expressed
pleasure.
e. Appearance. Well-groomed, professional appearance.
f. Nervousness. Controlled nervousness and anxiety.
g. Barriers. Body language displays interest. Avoided
emotionally-laden words.
h. Enthusiasm. Displayed excitement.
9. COMMUNICATION - Verbal YES NI NO NA
a. Volume, Rate, Force, Inflection, and Pause. Natural
and appropriately varied.
b. Pronunciation, Articulation, Dialect. Easy to
understand.
c. Pet Words. Minimized.
10. SAFETY YES NI NO NA
a. Cease Training. Procedures were adequately explained.
b. Safety Precautions. Followed safety precautions at all
times.
c. Equipment/Material. Safe for use.
d. Safety Practices. Monitored students for good safety
practices.
11. CLASSROOM MANAGEMENT YES NI NO NA
a. Classroom Arrangement. Properly arranged classroom.
b. Time. Ideal use of time available.
12. OVERALL INSTRUCTION
SATISFACTORY UNSATISFACTORY
SATISFACTORY, BUT NEEDS IMPROVEMENT

OTHER REMARKS COMPLETED BY THE EVALUATOR:


All behaviors evaluated as "NI" or "NO" will be explained under this section. Also include any
comments of an outstanding nature.

SIGNATURE AND TITLE OF THE EVALUATOR


DATE
PRINTED NAME:


INSTRUCTOR IMPROVEMENT PLAN


I have been debriefed on this evaluation. I understand the areas that need improvement and will
take the following action:

SIGNATURE AND TITLE OF INSTRUCTOR


DATE


OBSERVATION CHECKLIST
OBSERVATION CHECKLIST: An observer completes this checklist while observing the lesson. This checklist is designed as a source of quality control as well as a means to evaluate the effectiveness of the materials during implementation.
INSTRUCTIONS: Check the appropriate box: YES, NO, or N/A. If you answer "NO" to a question, note the item
number with a comment for clarification and state a recommendation.
COURSE:
OBSERVER/TITLE: DATE:

A. COURSE MATERIAL YES NO N/A


1. Is the lesson plan the instructor uses the same as in the MLF?
2. Is the student outline the student uses the same as in the MLF?
3. Is the media the same as in the MLF? (Any modifications should be noted.)
4. Are all supplemental student materials used the same as in the MLF?
5. Are adequate directions for all supporting materials used located in the MLF?
6. Has the Operational Risk Assessment Worksheet (ORAW) been updated within the last year?
7. Is the ORAW still accurate?
8. Is the Instructor Preparation Guide still accurate?
B. LESSON PLAN YES NO N/A
9. Is the Gain Attention relevant to the learning objectives?
10. Does the lesson being taught reflect a logical sequence of the material?
11. Based upon viewing the lesson, are the lesson plans written with sufficient content
so that any instructor can teach the class if needed?
12. Do the instructor notes provide sufficient directions for the instructor?
13. Are the activities/exercises in the lesson meaningful? (Do students seem to be
learning from them?)
14. Are the activities/exercises appropriately placed in the lesson?
15. Are the method(s) effective for teaching the lesson content?
16. Is the method used to teach students reflective of how students will be evaluated?
C. STUDENT MATERIALS YES NO N/A
17. Are the student materials easy to read?
18. Are the student materials easy to follow?
19. Can the students take the materials home?
20. Are the students using the materials?
D. MEDIA YES NO N/A
21. Is the media visually appealing?
22. Is the media large enough for all to see?
23. Does the media complement the lesson?
E. FACILITIES/EQUIPMENT
24. Are the facilities used conducive to the type of training?
25. Is the equipment used adequate for the purposes of training?

Comments/Recommendation (Write item number and then comment/recommendation):

OBSERVER SIGNATURE: _____________________________________ DATE: ______________


LESSON TITLE: ________________________________________________________________


ENVIRONMENT CHECKLIST
ENVIRONMENT CHECKLIST: An instructor or an observer can complete this checklist. An
instructor may use the checklist to ensure classroom management. An observer may use it
to evaluate the management of the instructional environment.
INSTRUCTIONS: Check the appropriate box: YES, NO or N/A (not applicable). If
you answer "NO" to a question, note the item number with a comment for clarification and
state a recommendation.
COURSE:

INSTRUCTOR/OBSERVER: DATE:

A. TRAINING ENVIRONMENT YES NO N/A


26. Is the instructional area well-ventilated (e.g., free of excess heat and hazardous
fumes)?
27. Is the lighting sufficient in the instructional area for the
instruction and/or task?
28. Is the temperature comfortable?
29. Is noise minimized?
30. Are distractions minimized?
31. Are safety signs (e.g., hard hat area, welding in progress)
visibly posted?
32. Is safety equipment available and/or being used?
33. Is the training facility clean?
34. Is there adequate space for planned activities?
35. Is the facility set up so that all students can view media,
demonstrations, etc?
B. TRAINING CONDITIONS YES NO N/A
36. Are training aids and equipment operating effectively?
37. Do training support personnel perform their duties properly?
38. Is the support appropriate to requirements?

C. VISITOR/OBSERVER PREPARATION YES NO N/A


39. Is there a designated place for an observer station?
40. Is there a Visitor/Observer Folder available in accordance with
SOP policy?
Comments/Recommendations (Write item number and then comment/recommendation):

INSTRUCTOR/OBSERVER SIGNATURE: ______________________________ DATE_______________

LESSON/PRACTICAL APPLICATION TITLE: _______________________________________________

E-7

SAFETY REVIEW CHECKLIST


SAFETY REVIEW CHECKLIST: An instructor or an observer can complete this checklist. An
instructor may use it in preparation for a lesson/practical application. An observer may use it to
ensure that safety concerns are addressed appropriately and in accordance with Operational Risk
Management. (Safety procedures/measures include, but are not limited to, heat stress control
procedures, respiratory protection, sight protection, hearing protection, hand protection, head
protection, foot protection, etc.)
COURSE:

INSTRUCTOR/OBSERVER: DATE:

INSTRUCTIONS: Check the appropriate box: YES, NO, or N/A. If you answer "NO" to a question,
note the item number with a comment for clarification and state a recommendation.

COMPLETE AS APPLICABLE YES NO N/A


41. Instructor training completed.
42. Instructors are present in sufficient numbers to prevent accidents
during potentially hazardous or dangerous situations.
43. Facilities ensure a safe working environment.
a. Inspections of fire extinguishers are up to date.

b. Exits are labeled and accessible.

c. Area has appropriate ventilation for fumes.

44. An Operational Risk Assessment Worksheet (ORAW) has been


completed for the lesson and is located in the Master Lesson File
(MLF).
45. Hazard controls to eliminate or minimize potential risks are
included in the instructor preparation guide and/or the detailed
outline (lesson plan) for the lesson.
46. Cease Training Criteria and procedures are thoroughly explained
on the ORAW.
47. Tools and equipment are in good working condition and safe to
use.
48. Training evolutions that require students to perform hazardous
tasks are essential to accomplish learning objectives.
49. Applicable safety procedures/protective measures are in place.

50. A Training Safety Officer (TSO) has been assigned to high risk
training events.
51. Setback information (academic or personal issues) on students is
available to the instructor.

Comments/Recommendation (Write item number and then comment/recommendation):

INSTRUCTOR/OBSERVER SIGNATURE: ___________________________________ DATE: ____________


LESSON/PRACTICAL APPLICATION TITLE: __________________________________________________

E-8

APPENDIX F

AFTER INSTRUCTION REPORT (AIR)

APPENDIX F consists of a sample After Instruction Report (AIR). The AIR is used to summarize
and compile information from Instructional Rating Forms, Examination Rating Forms, End of Course
Critiques, and instructor comments regarding one lesson.

F-1

AFTER INSTRUCTION REPORT

INSTRUCTOR: DATE:

LESSON TITLE: COURSE NUMBER:

NUMBER OF STUDENTS: NUMBER OF IRFs:

INSTRUCTIONS TO INSTRUCTOR: The Instructional Rating Form (IRF) allowed students to use a 1 to 4 rating
scale with the level of agreement or disagreement as follows: Strongly Disagree = 1, Disagree = 2, Agree = 3, or
Strongly Agree = 4. NA is on the IRF as an option for statements that are not applicable. For the After Instruction
Report (AIR), calculate how many student(s) circled "1" and place that number in the blank under "1" beside the
corresponding question. Follow the same procedure for the ratings of "2", "3", and "4". The instructor should address
all negative responses ("1's" or "2's") under the instructor comments.
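The tallying procedure described above can be sketched in Python. This is purely illustrative; the response values below are hypothetical, and the statement labels would come from the IRF itself.

```python
from collections import Counter

# Hypothetical IRF responses for one statement (e.g., question 1a): each entry
# is the rating one student circled -- 1 to 4, or "NA" if not applicable.
responses = [4, 3, 3, 2, 4, 4, "NA", 3, 1, 3]

# Tally how many students circled each rating, as the AIR instructions direct:
# the count for "1" goes in the blank under "1", and so on.
tally = Counter(responses)
for rating in (1, 2, 3, 4, "NA"):
    print(f"{rating}: {tally[rating]}")

# Negative responses ("1's" or "2's") that the instructor must address
# under the instructor comments.
negatives = tally[1] + tally[2]
```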

1. INSTRUCTOR: Questions related to the instructor. 1 2 3 4 NA

a. The instructor showed a thorough knowledge of the course material.


b. The instructor communicated the lesson material in a way that could
be easily understood.
c. The instructor gave precise instructions concerning in-class
exercises.
d. The instructor encouraged student participation.
e. Student’s questions were answered in a professional (not demeaning
to the student) manner.

2. LESSON CONTENT: Questions related to the lesson content. 1 2 3 4 NA

a. The content was presented at the right pace.


b. The student outline aided my understanding of the content covered.
c. The environment of the class was interactive.

3. SAFETY: Questions related to safety. 1 2 3 4 NA

a. Lesson related safety to job performance.


b. Cease Training procedures were adequately explained.
c. Safety precautions were reemphasized prior to commencing task.
d. Safety was paramount at all times.
e. Equipment/material was safe for use.

4. METHODS/MEDIA: Questions related to Methods/Media. 1 2 3 4 NA


a. The in-class exercises required in the course were worthwhile learning
experiences.
b. The instructional method(s) used in presenting the class material
enhanced my ability to learn/perform the concept/task.
c. The media complemented instruction.

5. STUDENT: Questions indicating the student's perspective of any noted increase in his/her knowledge level.
Refer to questions 5a and 5b to answer the questions below. Place the number of students who indicate an
increase in knowledge level in "a" and the number of students indicating NO increase in knowledge level in "b".

F-2

a. How many students increased their knowledge to "Average", "Above Average", or "Expert"?
(For example, if a student answers 5a on IRF as "None" and answers
5b as "Average", then the student perceives an increase in his/her
knowledge level).

b. How many students indicated that there was no change in their knowledge level?
(For example, if a student answers 5a as "Average" and answers 5b as "Average", then no change has
occurred in knowledge level).
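The comparison of 5a and 5b answers can be sketched as follows. The level names and their ordering are assumptions for illustration, since the IRF itself is not reproduced here; use the levels printed on your form.

```python
# Assumed knowledge-level scale, lowest to highest.
LEVELS = ["None", "Below Average", "Average", "Above Average", "Expert"]
rank = {level: i for i, level in enumerate(LEVELS)}

# Hypothetical (5a, 5b) pairs: self-rated knowledge before and after the lesson.
answers = [("None", "Average"), ("Average", "Average"),
           ("Below Average", "Above Average"), ("Average", "Expert")]

# Entry for block "a": students whose 5b answer outranks their 5a answer.
increased = sum(1 for before, after in answers if rank[after] > rank[before])
# Entry for block "b": students reporting no change in knowledge level.
no_change = sum(1 for before, after in answers if rank[after] == rank[before])
print(increased, no_change)
```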

AFTER INSTRUCTION REPORT


Comments (as noted by students on Instructional Rating Forms (IRFs)):

Instructor Comments:

Reassessment of ORA:
(Comment on ORA, recommend additional safety considerations to ORA, provide lessons
learned, etc.)

____________________________________
Instructor Signature/Date

Course Chief Comments:

Course Chief Signature/Date

F-3

F-4

APPENDIX G

SAMPLE RECORD OF PROCEEDINGS (ROP)

APPENDIX G consists of a sample record of proceedings (ROP). The information provided in this
plan is hypothetical.

G-1

Sample ROP
1500
TRNG
Date

From: GySgt I.M. Design 123-45-6789/9917 USMC


To: Commanding Officer

Subj: RECORD OF PROCEEDINGS: COURSE NAME, COURSE CONTENT REVIEW


BOARD (CCRB)

Ref: MCO 1553.2_

Encl: (1) Any Material to substantiate the proposed change(s)

1. As per the reference, a CCRB was conducted for The Name of Your School or Course on
Date of CCRB. The members of the CCRB were:

Capt Yohoo Company Commander Board Inbrief


Capt Training Task Analyst, MCCDC TECOM SME
Msgt Chair PLT Commander Board Chairman
Msgt Education MCI Representative Board SME
GySgt Design Curriculum Developer Board SME
Ssgt Leader Company Gunny Board SME
Ssgt Man 1st MARDIV Rep Board SME
Ssgt Benifit 2nd MARDIV Rep Board SME
Sgt Kamp 3rd MARDIV Rep Board SME
Sgt Hill Instructor Board Recorder

2. The key function of the CCRB was to formally record information and make recommendations to
improve the effectiveness and efficiency of the course’s Program of Instruction (POI).

3. Areas reviewed and briefed included: (List all topics that were discussed in your CCRB)

a. The principal parameters guiding the board were:

(1) Topic items (contained in paragraph 3) were briefed, discussed and voted on
when action to change/correct a topic item was deemed necessary.

(2) Dissenting opinions were encouraged. The board believed that there are
numerous ways to execute any segment of the course. However, it was the board’s responsibility to
identify and recommend the best solutions possible.

4. Areas of discussion.

a. Topic: Master Training Schedule

Discussion:

Recommendations:

b. Topic: Rewriting Individual Training Standards (ITS):

G-2

Discussion:

Recommendation:

c. Topic: Implementation of the AN/VIC-3(V) Intercommunication Set.

Discussion:

Recommendation:

d. Topic: Standardizing AFVID

Discussion:

Recommendation:
5. Again, all recommendations were voted on by every member of the board. For any questions or concerns, the point of
contact is GySgt Design at DSN 999-8888 or commercial (345) 432-9879.

I.M. Design

_________________________________ __________________________________
SIGNATURE SIGNATURE

_________________________________ __________________________________
SIGNATURE SIGNATURE

_________________________________ __________________________________
SIGNATURE SIGNATURE

_________________________________ __________________________________
SIGNATURE SIGNATURE

_________________________________ __________________________________
SIGNATURE SIGNATURE

G-3

G-4

APPENDIX H

SAMPLE EVALUATION PLAN

APPENDIX H consists of a sample evaluation plan. The information provided in this plan is
hypothetical. Any similarity to a military occupation specialty (MOS) or formal school is
coincidental.

H-1

SAMPLE EVALUATION PLAN

GRADUATE JOB PERFORMANCE

1. Purpose and Data Required. The purpose of this evaluation is to determine the effectiveness of
the XYZ Course in adequately preparing graduates to perform the duties of MOS XXXX. The
following data will be required to determine the effectiveness of the course.

a. Individual Training Standards (ITS) for MOS XXXX.

b. On-the-job performance data for graduates of Class XX-X from graduates and graduates'
supervisors.

c. Applicable technical and doctrinal references.

2. Sources of Data

a. XYZ course materials [lesson plans, student materials, supplemental student materials,
media, tests, Program of Instruction (POI), Record of Proceedings (ROP)]

b. Graduates of Class XX-X (30 students).

c. Supervisors of graduates from Class XX-X.

3. Schedule

a. Class XX-X will graduate on 30 July 2001. To ensure valid data can be collected, the
evaluation is scheduled for 30 August allowing graduates to have been on the job at least 30 days.
The principal evaluator has determined that it will require 30 days to complete an evaluation of the
XYZ Course. The evaluation is scheduled to be completed 1 October 2001. However, any
unforeseen delays or changes to the schedule will affect the completion date.

b. One principal evaluator and one part-time evaluator will be required to properly collect,
analyze, and interpret data and report the results. The time and resources required are based on
the following evaluation activities.

(1) Collect and review course materials in preparation for survey design - 1 day

(2) Design and validate survey questionnaires - 4 days

(3) Conduct survey (mail/email questionnaires; receive/monitor responses; follow-up) - 20


days

(4) Train part-time evaluator in data analysis and interpretation - 1 day

(5) Data analysis and interpretation - 3 days

(6) Prepare report of findings and recommendations - 1 day
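As a quick check, the activity durations above sum to the 30 days the principal evaluator estimated:

```python
# Evaluation activities and their durations in days, taken from the sample
# plan above.
activities = {
    "Collect and review course materials": 1,
    "Design and validate survey questionnaires": 4,
    "Conduct survey (mail/email, monitor responses, follow up)": 20,
    "Train part-time evaluator in data analysis": 1,
    "Data analysis and interpretation": 3,
    "Prepare report of findings and recommendations": 1,
}

total_days = sum(activities.values())
print(total_days)  # 30, matching the estimate in paragraph 3a
```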

4. Data Collection Methods

a. XYZ Course materials will be reviewed to gather data to develop relevant survey questions.

b. Survey questionnaires will be used to collect graduate job performance data. The survey
questionnaires will be designed using a Likert rating scale to allow quantification and analysis of the

H-2

data. A small number of questionnaire items will be designed for open-ended responses to solicit
recommendations and other comments. The survey will be validated using SMEs assigned to the
school.

c. Because of time and resource constraints, this evaluation will be conducted by personnel
assigned to the XYZ school. One evaluator will be assigned as the principal evaluator during data
analysis and interpretation. The part-time evaluator will be trained to format and code data to assist
in performing the data analysis. A copy of the survey questionnaire containing hypothetical data
will be used as a training aid.

5. Method for Data Analysis and Interpretation. The following analyses will be conducted using
data from the returned questionnaires.

a. For all responses concerning how well the course prepared students for subsequent job
duties:

(1) Descriptive statistics for graduate and supervisor responses.

(2) Comparison between graduate and supervisor ratings of course effectiveness.

b. Descriptive statistics for graduate and supervisor responses concerning the importance of
each ITS task trained and how well each ITS task was trained.

c. The qualitative data collected by open-ended responses will be categorized and analyzed to
identify trends that may affect the structure of the course.

d. The results of these analyses will be interpreted to determine the extent to which training
prepared graduates to perform the duties of MOS XXXX and the importance of each task trained.
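A minimal sketch of the descriptive statistics and group comparison described in paragraph 5, using hypothetical Likert data; the response values and group sizes are invented for illustration.

```python
from statistics import mean, median, mode

# Hypothetical Likert responses (1-4) to one survey item, split by
# respondent group, as described in paragraph 5a.
graduates = [3, 4, 3, 2, 4, 3, 3, 4]
supervisors = [3, 3, 2, 3, 4, 2]

# Descriptive statistics for each group (paragraph 5a(1)).
for group, ratings in (("graduates", graduates), ("supervisors", supervisors)):
    print(group, round(mean(ratings), 2), median(ratings), mode(ratings))

# Comparison between graduate and supervisor ratings (paragraph 5a(2)).
# A large gap may signal that the two groups perceive course
# effectiveness differently.
gap = mean(graduates) - mean(supervisors)
```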

6. Method for Reporting. A preliminary report of evaluation results will be presented and reviewed
at the CCRB, scheduled for 20 October 2001. Based on this meeting, an ROP will be prepared
documenting evaluation results and any recommended revisions to the course.

H-3

H-4

APPENDIX I

SAMPLING TABLE

APPENDIX I comprises a sampling table that can be used to determine the required sample size for
a desired confidence level and how many questionnaires must be sent out to achieve it.

I-1

SAMPLING TABLE
Population 95% Confidence 90% Confidence 80% Confidence

10 10 10 9
20 19 19 18
40 36 35 32
60 52 49 44
80 67 62 54
100 80 73 62
120 92 83 69
160 114 101 81
200 133 115 90
250 154 130 99
300 171 142 106
350 187 153 112
400 200 161 116
450 212 169 120
500 222 176 123
600 240 186 129
700 255 195 133
800 267 202 136
900 277 208 139
1,000 286 213 141
1,500 316 229 148
2,000 333 238 151
2,500 345 244 154
3,000 353 248 155
3,500 358 251 157
4,000 364 253 157
4,500 367 255 158
5,000 370 257 159
10,000 383 263 161
25,000 394 268 163
100,000 398 270 164

HOW TO USE THIS TABLE

Example: For a population of 4,200 course graduates, an estimated (desired) return rate of 85%, and a confidence level
of 95%, sample size would be determined using the following procedure:

1. Locate the number corresponding to the population size. Since 4,200 is not provided in the table, round the number
up or down to the nearest value. For example, the population value of 4,200 would be rounded down to 4,000.

2. Locate the value corresponding to the 95% confidence level with a population size of 4,000. Using the table above,
this value is 364 (meaning that 364 usable questionnaires are required). This figure should be 85% of the questionnaires
mailed out.

3. To determine the number of questionnaires that must be mailed out to obtain 364 usable questionnaires, substitute
the values in the formula below. For our example population of 4,200 (table value 4,000), a 95% confidence level, and an
expected return rate of 85%, the required sample size is 364. Therefore, 428 questionnaires need to be mailed out:

(364 × 100) ÷ 85 = 428
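The mail-out computation can be expressed as a small function. The rounding to the nearest whole questionnaire follows the manual's worked example; the function name is our own.

```python
def questionnaires_to_mail(sample_size, expected_return_rate_pct):
    """Questionnaires to mail so that, at the expected return rate,
    approximately `sample_size` usable questionnaires come back.

    Rounded to the nearest whole questionnaire, matching the manual's
    worked example (364 x 100 / 85 = 428).
    """
    return round(sample_size * 100 / expected_return_rate_pct)

# Worked example from the text: population 4,200 rounds to 4,000 in the
# table, which gives a required sample of 364 at the 95% confidence level.
print(questionnaires_to_mail(364, 85))  # 428
```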

I-2
Systems Approach To Training Manual Acronyms

AIMMS - Administration Instruction Manpower Management System

AIR - After Instruction Report

AFTMS - Air Force Training Management System

AOWP - Automated Orders Writing Process

ATRRS - Army Training Requirements and Reserve System

BNA - By Name Assignment

BTR - Basic Training Record

CBT - Computer Based Training

CCRB - Course Content Review Board

CDD - Course Descriptive Data

CDI - Compact Disc Interactive

CID - Course Identifier

CMC - Computer Mediated Conferencing

CT - Cease Training

CTC - Cease Training Criteria

DOD - Department of Defense

DODIC - Department of Defense Identification Code

DVC - Desktop Video Conferencing

EDCOM - Education Command

ECC - End of Course Critique

ELO - Enabling Learning Objective

ERF - Examination Rating Form

FEA - Front-End Analysis

FMF - Fleet Marine Force

GAR - Grade Adjusted Recapitulation

ICM - Interactive Courseware Multimedia

IRF - Instructional Rating Form

J-1

ISD - Instructional System Development

IT - Interactive Television

ITRR - Institutional Training Readiness Report

ISC - Information Systems Coordinator

ITS - Individual Training Standard

IVD - Interactive Video Disc

KSA - Knowledge, Skills, and Attitudes

LAW - Learning Analysis Worksheet

LOW - Learning Objective Worksheet

MCAIMS - Marine Corps Automated Instructional Management System

MCCDC - Marine Corps Combat Development Command

MCTFS - Marine Corps Total Forces System

MILMOD/OTA - Air Force Military Modernization Program/Oracle Training Administration

MLF - Master Lesson File

MMTR - Military Manpower Training Report

MOS - Military Occupation Specialty

MPP - Manpower Plans and Policies

MPS - Mission Performance Standard

NITRAS - Navy Integrated Training Administrative System

ORM - Operational Risk Management

PAT - Process Action Teams

POI - Program of Instruction

RAPELLA - Reserve Affairs Personnel Entry Level Assignment System

RDM - Recruit Distribution Model

ROP - Record of Proceedings

SAT - Systems Approach to Training

J-2

SMART - Sailor Marine Academic Record Transcript

SME - Subject Matter Expert

SOP - Standing Operating Procedures

SSC - Service School Code

TRNGCOM - Training Command

TECOM - Training and Education Command

TIMS - Training Information Management System

TIP - Training Input Plan

TLO - Terminal Learning Objective

TPD - Target Population Description

TQM - Training Quota Memorandum

T&R - Training and Readiness

TRRMS - Training Requirement Resource Management System

UD/MIPS - Unit Diary/Marine Integrated Personnel System

VC - Virtual Conferencing

VR - Virtual Reality

VTC - Video Teleconference

VTT - Video Teletraining

J-3
Systems Approach To Training Manual Glossary

Academic Time. Academic time includes curriculum hours dedicated to lecture, practical application, performance
examination, written examination, remedial instruction, review, and tutoring within the Program of Instruction (POI).

Actual Item/Object. Equipment or devices that are actually utilized in the performance of the task or job.

Administrative Time. Administrative time consists of curriculum hours committed to in- and out-processing,
commanding officer's time, graduation, physical training (when it does not have TLO's or ELO's associated with it and
does not affect the student's GPA), inspections, and field days in a Program of Instruction (POI).

Affective Domain. A taxonomy for classifying objectives that deals with feelings, attitudes, values, and other
indicators of emotionally-based behavior.

After Instruction Report (AIR). An evaluation tool that summarizes one iteration of a lesson by documenting the
students' assessment of the lesson (Instructional Rating Form) and examination (Examination Rating Form),
instructor comments, test results related to the instruction, and any end of course critique data related to the
specific lesson.

Aiding Conditions. Any information or resource that is available to the student and identified in the learning objective.

Analysis. Level of cognitive domain (Bloom, 1956) in which students are able to break down complex organizational
structures into their component parts.

Analyze Phase. Initial phase of the Systems Approach to Training (SAT) process. The purpose of the analyze phase is
to determine what the job holder must know or do on the job.

Andragogy. Literally means the art and science of teaching adults.

Application. Level of cognitive domain (Bloom, 1956) in which students are able to use learned material in new and
concrete situations.

Attitudes. An acquired mental state that influences choices for personal action, such as preferences, avoidance, or
commitment.

Audiotapes. Magnetic media that present audio to strengthen the learning of languages or other materials
that require verbal repetition.

Auditory. Learners who tend to learn better by hearing.

Background Knowledge. The knowledge a student possesses prior to the start of instruction. Research
suggests that, outside of socio-economic factors, the best predictor of student learning is the student's background
knowledge. Transference of knowledge from one domain to another is more likely to succeed if connections can
be made between what we want the student to know and what the student already knows.

Basic Fundamental Movement. Level of psychomotor domain (Simpson, Harrow, & Simpson) in which students can
perform inherent movement patterns by combining reflex movements which are the basis for complex skilled
movements.

Behavior. Any activity, overt or covert, capable of being measured. Also, any activity the student is expected to
exhibit after instruction and the primary component of a learning objective.

Body. Major section of a lesson in which learning is developed through support material and various teaching exercises
to achieve instructional objectives; preceded by an introduction and followed by a conclusion.

Break Cues. Remind the instructor when to provide students with a break.

Briefing. A briefing is a formal or informal presentation in which a variety of significant facts is presented as concisely
as possible. The briefing is rarely concerned with material beyond the knowledge level and is almost always
accompanied by visual representation of the material in the form of charts, graphs, slides, and other aids. Strictly
speaking, the briefing is not a teaching method, but it is sometimes used in school situations.

Case Study. The case study is a learning experience in which students encounter a real-life situation in order to
achieve some educational objective. By studying realistic cases in the classroom, students develop new insights into the
solution of specific on-the-job problems and also acquire knowledge of the latest concepts and principles used in
problem solving.

Cease Training (CT). An agreed upon verbal and/or non-verbal signal used to temporarily cease all training when, in
the opinion of the signaler, a serious hazard exists or an individual is experiencing serious problems.

J-4

Cease Training Criteria (CTC). Conditions or hazards that, when present, require Cease Training (CT).

Central Tendency. A single number that best represents a distribution of a set of numbers. The three most common
measures of central tendency are the mode, median, and mean.
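As a minimal illustration of the three measures, using a hypothetical set of ratings:

```python
from statistics import mean, median, mode

scores = [2, 3, 3, 4, 4, 4, 5]  # hypothetical set of ratings

# mean: arithmetic average; median: middle value when sorted;
# mode: most frequently occurring value.
print(mean(scores), median(scores), mode(scores))
```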

Characterization. Highest level of affective domain (Krathwohl, 1956) in which students integrate values or value
systems into their own life style or philosophy of life.

Checklists. A checklist consists of carefully worded questions that the evaluator answers through his review of course
materials or observation of course components (e.g., graduate or instructor performance, conduct of a class).

Clarifying Questions. Follow-up questions to confirm a respondent's answer or to clarify what the respondent has
said.

Closed-ended Question. A question that limits respondents' answers to predetermined response categories. Multiple
choice and yes/no questions are examples of closed-ended questions.

Closure. The final segment of a lesson during which instruction is appropriately ended by reemphasizing how the
lesson presented will be meaningful to the student.

Coaching. An intensive learning experience for individuals or small groups, characterized by significant student
involvement and immediate instructor feedback. A videotape of student performance is an excellent teaching aid when
supplemented by an instructor's analysis and critique. This technique is particularly effective in instructor training.

Coding. Coding data is the process of organizing data into sets of categories to capture the meaning or main themes in
the data. Coding is usually done in the analysis of qualitative data, but quantitative data can also be grouped into code
categories.

Cognitive Domain. A taxonomy for classifying objectives that deal with verbal knowledge and intellectual skills such
as concept learning and procedural skills.

Collaborative Learning or Cooperative Learning. An instructional approach in which students of varying abilities
and interests work together in small groups to solve a problem, complete a project, or achieve a common goal.

Collective Training Standard (CTS). Measures of mission performance used to determine whether units can or
cannot perform an assigned task. Collective training standards equate to Mission Performance Standards (MPS)
contained in the MCCRES and consist of a minimum of three components: task, condition, and standard.

Compact Disc Interactive (CDI). An interactive multimedia system combining moving and still video, audio, and
program content on a compact disc, which can be played back in a dedicated CD-I player. It operates on its own and
can be connected to a standard TV set for displaying pictures and sound, and optionally to a stereo system.

Comprehension. Level of cognitive domain (Bloom, 1956) in which students begin to develop understanding and are
able to translate, interpret, and extrapolate subject matter under study.

Computer-Assisted Instruction. The use of computers to aid in the delivery of instruction. A variety of interactive
instructional modes are used including tutorial, drill and practice, gaming, simulation, or combinations.

Computer-Based Training (CBT). An instructional methodology in which students interact individually with
instruction presented through a variety of media, controlled and monitored by a computer.

Computer Mediated Conferencing (CMC). Conferencing using the personal computer and telephone line as the
communication vehicles. It provides instructor-student and student-student interaction in both asynchronous and
synchronous modes.

Concept. A class of people, objects, events, ideas, or actions which are grouped together on the basis of shared critical
attributes or characteristics, and are called the same name.

Concept Card. Provides formal schools/training units with a snapshot of individual lessons.

Concurrent Validity. The degree to which a new exam agrees with a previously recognized exam already accepted as valid.

Condition. That portion of the learning objective that describes the situation/environment in which the students
perform the specified behavior. Conditions include any pertinent influence upon task performance, including any or all
of the following: location of performance, environment, equipment, manuals, or supervision required.

J-5

Cone of Learning. The Cone of Learning shows the progression from reading to doing and how it correlates to what is
remembered over time.

Conflicting Relationships. Conflicting relationships exist between learning objectives that involve opposite responses
to the same cue in a different context.

Consistency. Describes the results of a reliable evaluation instrument which remain similar given similar testing
conditions (similar students, knowledge base, physical testing situation, etc.) over a period of several uses.

Construct. Exists only in the mind. Examples are love and hate.

Content Validity. A test with high content validity measures the material covered in the curriculum or unit being
tested, as defined in the objective(s). In other words, the test questions should refer to the subject matter covered.

Course Content Review Board (CCRB). A formal review of course materials to determine the validity of course
topics and make recommendations to MCCDC (C 461) for changes, revisions, or deletions of the content of a course.

Course Descriptive Data (CDD). A report which documents course description, resource requirements, and
justification for the development or refinement of formal programs of instruction (POI) taught at Marine Corps training
and education institutions.

Courseware. Paper-based, audiovisual, and electronically stored instructional material necessary to deliver a lesson,
instructional module, or course.

Creativity. The imaginative recombination of known elements into something new and useful.

Criterion-Related Validity. The extent to which a test of carefully written, measurable objectives yields data that
compare student performance levels with those specified in the objectives.

Criterion-Referenced Assessment. An assessment that measures what a student understands, knows, or can
accomplish in relation to specific performance objectives. It is used to identify a student’s specific strengths and
weaknesses in relation to skills defined as the goals of the instruction but it does not compare students to other
students. (Compare to norm-referenced assessment.)

Cues. Markings graphically placed in the body of the lesson to assist the instructor in the presentation of
instruction.

Curriculum. All instruction conducted within a school, outlined into specific topics, along with detailed learning
objectives, to include behavior, conditions, and standards.

Curriculum Validation Team. A method of validation in which an experienced jobholder, novice, supervisor,
instructor, and instructional designer meet to review the instructional material.

Delivery System. The instructional method and media used to present the instruction.

Demonstration. A teaching method in which students observe and then practice a sequence of events designed to
teach a procedure, a technique, or an operation. It combines oral explanation with the operation or handling of systems,
equipment, or materials.

Dependent Relationships. Dependent relationships exist between learning objectives that are prerequisite to other
learning objectives.

Design Phase. The second phase of the Systems Approach to Training (SAT) process, which defines the course
learning objectives, test, and delivery system, and from which instruction is developed.

Diagnostic Test. The purpose of a diagnostic test is to measure the achievement of the supporting skills and
knowledge that contribute to the ability to perform the criterion objective.

Dialogue. Interaction between two or more persons, one of whom may be the instructor, generally to present sharply
opposing points of view for students. The dialogue is often highly structured towards preplanned goals and may take
the form of questions and answers between the participants.

Dichotomous Variable. A variable with only two possible responses.

Differentiation. A characteristic of evaluation which requires that tests and rating instruments be capable of making
distinctions between selected groups; usually masters or non-masters of specific instructional objectives in criterion-
referenced testing or high and low overall test performers in norm-referenced testing.

J-6
Systems Approach To Training Manual Glossary

Directed Discussion. A discussion method that involves initiating discussion and channeling students' thinking and
responses along predetermined lines.

Direct Question. A question with a specific answer, directed at an individual or a group.

Discussion Non-Directed Method. A group interactive process in which task or objective-related information and
experiences are evoked from the student. The instructor normally plays a very limited or passive role.

Distracters. Incorrect alternative responses to questions. Distracters should be worded so they are believable, but
clear enough so the student is never presented with a choice between several correct answers.

Distributed Practice Session. Based on time constraints of the course, the instructional developer divides practice
periods into segments. This permits more efficient learning of the psychomotor skills.

Domains of Learning. A broad classification of learning types. The three widely accepted domains that are used in
this manual are the cognitive (thinking, understanding), affective (attitudes, values), and psychomotor (physical skills).

Dress Rehearsals. A process in which an instructor delivers a lesson plan in its entirety to a group prior to the actual
class.

Duty. A duty (primary skill) consists of one or more tasks performed in one functional area. A duty is the major
subdivision of the work performed by one individual. It is recognized as being one of the position incumbent's principal
responsibilities. A set of operationally related tasks within a given job.

Enabling Learning Objective (ELO). A subordinate learning objective which describes the behavior for prerequisite
knowledge and skills necessary for a student to perform a TLO or steps of the ITS.

End of Course Critique. Evaluation instrument completed by the student after a course so that the student can assess
the overall course.

Environment. The physical conditions and surroundings in which a job is performed, or in which learning takes place,
including tools, equipment, and job aids.

Environment Checklist. Evaluation instrument used to assess physical conditions and training conditions.

Environmental Conditions. A physical or social condition in which the behavior of a learning objective must be
performed.

Evaluate Phase. The fifth phase of the SAT process during which the formal school/detachment determines value,
worth, or merit of the instructional program.

Examination Rating Form (ERF). A reaction form completed by students upon completion of an examination.

External Evaluator. In either formative or summative evaluations, external evaluators, individuals not responsible for
the instructional program, conduct the evaluations. External evaluators normally include Mobile Training Teams (MTTs)
from higher headquarters and site visit teams from other schools.

Extrapolation. A type of learning at the comprehension level (Bloom, 1956) in which students develop sufficient
understanding to estimate trends or predict outcomes regarding the subject matter under study.

Field Trips. A field trip is an out-of-classroom experience where students interact with persons, locations, and
materials or equipment for the attainment of instructional objectives. An important aspect of the field trip is the student's
encounter with real settings.

Formal Lecture. A structured and often rehearsed teaching lecture with no verbal participation by students.

Formal Training. Training (including specialized training) in an officially designated course conducted or administered
in accordance with appropriate course outline and training objectives.

Formative Evaluation. Form of evaluation designed to collect data and information that is used to improve the
activities and products of the ISD/SAT process while the system is still being developed.

Free Discussion. Akin to the “bull session” or the “war story” hour, free discussion can be a valuable adjunct to
participatory management or brainstorming but, by its very nature, it seldom supports measurable objectives.

Front-End Analysis (FEA). A systematic process in which: (1) A job is analyzed to determine its component tasks
and the knowledge and skills necessary to perform these tasks; (2) tasks are selected for training based on the
determination of which knowledge and skills are not already in the students' repertoire; and (3) job related performance
criteria are developed to measure trainees' ability to satisfy job requirements.

Gain Attention. An approach that stimulates student curiosity and describes the benefits students will obtain by paying
attention to the instruction.

Guest Lecture. A guest lecture is a presentation by a person other than the instructor who is usually an expert. It is
used to give variety to the class period or to supply information in an area where the instructor is not an expert.

Guided Discussion. An instructional method in which the students participate in an instructor-controlled, interactive
process of sharing information and experiences related to the achievement of one or more learning objectives.

Hazard. A condition with the potential to cause personal injury or death, property damage or mission degradation.

Hierarchy. The characteristic of a domain of learning that rank orders the levels-of-learning of which it is composed.
See Taxonomy of Educational Objectives and Domain of Learning.

Higher Levels of Learning. Those levels of learning above the comprehension level (Bloom, 1956) which may be
considered as the practical application of concepts and principles to complex, real problems.

High Risk Training. Basic or advanced individual or collective training, essential for preparing Marines and units for
combat, that exposes students and instructors to the risk of death or permanent disability despite the presence of and
adherence to proper safety controls.

Implement Phase. The fourth phase of the SAT process during which instruction is delivered to the students.

Independent Relationships. Skills and knowledge in one learning objective are unrelated to those skills and
knowledge in another learning objective.

Indirect Discourse. Indirect discourse involves verbal interaction among two or more persons, which is seen and
heard by students. Some examples include: dialogue, teaching interview, panel, skits, playlets, and other
dramatizations.

Individual Rehearsals. A process in which an instructor practices a lesson plan without any assistance from other
instructors.

Individual Training Standard (ITS). The standards used to specify individual training proficiency requirements
(tasks) that support unit mission performance. They include a task (behavior), condition, proficiency standards (often
steps), and references. ITSs are generally derived from collective training standards. ITSs constitute the basis for
design, development, implementation, and evaluation of all individual training conducted in units and institutions.

Informal Lecture. A conversational teaching lecture with considerable verbal interaction between instructor and
students employing questions and discussion.

Instruction. The delivery of information to enable learning. The process by which knowledge and skills are transferred
to students. Instruction applies to both training and education.

Instructional Aids. Materials used in teaching that remain the property of the instructor; the students do not take the
materials with them.

Instructional Design. An area of theory and practice that forms a knowledge base in the field of instructional
technology. Processes for specifying conditions for learning.

Instructional Environment. Instructional environment refers to the instructional setting, media/equipment, support
personnel, student materials, and the administrative functions the instructor must perform.

Instructional Material. All items of material prepared, procured, and used in a course or programs as part of the
teaching or general learning process.

Instructional Method. The means used to present information to the student.

Instructional Rating Form (IRF). A reaction form (questionnaire) submitted to students following completion of a
period of instruction that provides feedback on instructor performance, course materials, and instructional environment.

Instructional Setting. The location and physical characteristics of the area in which instruction takes place.

Instructional System Development (ISD). Identical to definition for "systems approach to training."

Instructor. The individual, military and/or civilian, assigned the responsibility of providing instruction.

Instructor Notes. Includes any information pertinent to the conduct of the lesson and can appear throughout the
lesson plan.

Instructor Preparation Guide. A checklist of essential data that the instructor can quickly review when preparing the
lesson to get an idea of lesson content, duration, method, location, instructors required, references, and necessary
instructional aids/equipment.

Interactive Courseware Multimedia (ICM). A set of commercially produced, computer-based, multimedia
instructional modules, which comprise a full credit-bearing course. This courseware contains text, computer graphics,
photographic stills, animation, sound and motion video. It offers highly interactive learning functionality for the learner,
and contains instructional support and student support systems. The courseware is integrated into the delivery of
courseware by the faculty. It is not intended for use as a "bolt on" attachment to a traditional lecture based course.

Interactive Multimedia Instruction (IMI). A group of predominantly interactive, electronically delivered training
and education support products. IMI products include instructional software and software management tools used in
support of instructional programs.

Interactive Television. Literally, it combines traditional TV watching with the interactivity of the Internet and the
personal computer. Programming can include richer graphics, links to Web sites through TV Crossover Links, electronic
mail, chat room activity, and online commerce through a back channel (T-commerce).

Interactive Video Disc (IVD). Computer-controlled laser disc player used to present segments of video in a course or
lesson.

Internal Evaluator. In either formative or summative evaluations, individuals working within the organization
responsible for the instructional program conduct the evaluation.

Interpretation. A type of learning at the comprehension level (Bloom, 1956) in which students develop and
understand relationships among the various aspects of a communication and are able to perform such activities as
making inferences, generalizing, and summarizing.

Interval Scale. Consists of mutually exclusive, exhaustive categories arranged in a hierarchical order. The intervals
between numbers that represent categories are equal, but there is no true zero on the scale.

Interview. A set of structured questions used to gather information from respondents. Conducted in person or over
the telephone.

Item Analysis. A set of methods used to evaluate the effectiveness of test items.

Item Difficulty. The proportion of people who get a particular test item correct, generally expressed as a percentage.

Item Discrimination. A comparison between people who have done well on a test and people who have not done
well, used to determine whether a test item distinguishes between the two groups.
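
The two item analysis indices defined above can be illustrated with a short, hypothetical Python sketch. This is not
part of the SAT Manual; all data and function names below are invented for illustration, and the discrimination index
shown is the simple upper/lower-half variant.

```python
# Hypothetical example: item difficulty and a simple upper/lower
# item discrimination index for a single test item.

def item_difficulty(responses):
    """Percentage of test takers who answered the item correctly
    (1 = correct, 0 = incorrect)."""
    return 100.0 * sum(responses) / len(responses)

def item_discrimination(responses, total_scores):
    """Difference in proportion correct on this item between the
    top-scoring and bottom-scoring halves of the group."""
    # Order the item responses by each person's overall test score.
    ranked = [r for _, r in sorted(zip(total_scores, responses))]
    half = len(ranked) // 2
    lower, upper = ranked[:half], ranked[-half:]
    return sum(upper) / half - sum(lower) / half

item_responses = [1, 1, 0, 1, 0, 1, 1, 0]           # one item, eight students
overall_scores = [90, 85, 40, 75, 35, 80, 95, 50]   # overall test scores

print(item_difficulty(item_responses))                      # 62.5 (percent correct)
print(item_discrimination(item_responses, overall_scores))  # 0.75
```

A discrimination index near 1.0 indicates the item separates high and low overall performers well; values near zero or
negative suggest the item should be reviewed.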

Introduction. Major section of a lesson designed to establish a common ground between the instructor and students,
to capture and hold attention, to outline the lesson and relate it to the overall course, to point out benefits to the
students, and to lead the students into the body of the lesson; usually contains gain attention, motivation, and overview
steps.

Job. The duties, tasks, and task elements performed by one individual. The job is the basic unit used in carrying out
the personnel actions of selection, training, classification, and assignment.

Job Aid. Any item developed or procured for the purpose of assisting in the conduct of instruction and the process of
learning. Examples of job aids include checklists, procedural guides, worksheets, etc.

Job Performance Measure. An instrument used to evaluate proficiency of a job holder on each task performed.

Job Task Analysis. A process of examining a specific job to identify all the duties and tasks that are performed by the
job incumbent at a given skill level.

Kinesthetic. The ability to learn through the use of motion, movement, or the performance of the required activity.
Related to task requirements, one of the criteria for selection of delivery systems.

Kinesthetic Learner. Learners who tend to learn better by doing.

Knowledge. Information required to develop the skills for effective accomplishment of the jobs, duties, and tasks.

Knowledge-Based Tests. A knowledge-based test measures cognitive skills.

Knowledge Level. The lowest level of the cognitive domain (Bloom, 1956) in which students have the ability to recall
or recognize material in essentially the same form as it was taught.

Learning. A change in a person's behavior as a result of stimulus or experience. The behavior can be physical and
overt, or it can be intellectual or attitudinal.

Learning Analysis. A procedure to identify a task's related knowledge and skills that must be learned before a student
can achieve mastery of the task itself.

Learning Analysis Worksheet (LAW). Worksheet used during the learning analysis to generate knowledge and skills
related to the task and its performance step(s).

Learning Objective. A statement of the behavior or performance expected of a student as a result of a learning
experience, expressed in terms of the behavior, the conditions under which it is to be exhibited, and the standards to
which it will be performed or demonstrated.

Learning Objective Worksheet (LOW). Worksheet used to generate learning objectives, test items, and the delivery
system to be used.

Learning Style. An individual's preferred ways of gathering, interpreting, organizing, and thinking about information.

Lesson Plan. An approved plan for instruction that provides specific definition and direction to the instructor on
learning objectives, equipment, instructional media material requirements, and conduct of the training. Lesson plans are
the principal component of curriculum materials in that they sequence the presentation of learning experiences and
program the use of supporting instructional material.

Lecture. A formal or informal presentation of information, concepts, or principles by a single individual.

Likert Rating Scale. A rating system that allows data to be evaluated on a quantitative scale.

Limiting Conditions. Any information or resource that is not available to the student, as identified in the learning
objective.

Main Points. The primary, logical break out of subject matter to support an instructional objective.

Managed On-The-Job Training (MOJT). Training conducted in the unit environment which utilizes a combination of
classroom instruction and practical application. The classroom instructor is also the work supervisor of the trainee.
Evaluation of the students is based upon the capability to demonstrate specific training standards.

Management-Oriented Evaluation. Approach to evaluation that entails collecting information to aid management
decision-making as an instructional program operates, grows or changes.

Massed Practice Session. The instructional developer plans one continuous practice session due to time constraints
of the course.

Master Lesson File (MLF). A compilation of documents that contain all the materials necessary to conduct a period of
instruction or lesson.

Mastery. The achievement of the prescribed learning objective.

Mastery Learning. Criterion-referenced testing is the preferred method of testing for learning objectives taught in the
formal school/training center. The criteria for test mastery are established by the learning objectives. The student,
when completing a test, receives either a master (pass) or non-master (fail) result for each learning objective. The
student may be assigned an overall score, but this does not remove the responsibility of mastering each learning
objective. Students who do not master a learning objective receive remedial instruction and retesting until they reach
the standard for mastery.

Measurement. The act of acquiring data in the educational environment without making value judgments regarding
the relative or absolute merits of those data.

Measurement Error. The extent to which a score has been influenced by irrelevant or chance factors such as fatigue,
practice, time between the instruction and the administration of the instrument, etc. Also, every test contains errors of
measurement. No one test accurately measures a student’s achievement or ability. Carefully designed standardized
tests may have measurement errors of 5-10 percent. Teacher-designed tests typically have larger measurement errors.
A test
result shows that a student falls into a range of scores and not just a single reported score. Focusing on a single score
and ignoring the score range is among the most serious of score reporting errors.

Media. Means of presenting instructional materials to the learner; for example, filmstrips, videotapes, slides, wall
charts, etc.

Media Cues. Used to remind instructors what media to use and when to present it during the lesson plan.

Median. The score above and below which 50 percent of the scores in the sample fall. Median is sometimes referred
to as the "breaking score".

Mean. Arithmetic average of all scores.

Mediated Instruction. Includes such devices as slides, films, tapes, and cassettes used to present the planned course
of instruction to the learner.

Mental Skill. Cognitive ability involving the processing, synthesis, and analysis of information.

Military Occupational Specialty (MOS). A four-digit code that describes a group of related duties and job
performance tasks that extend over one or more grades. It is used to identify skill requirements of billets in T/Os, to
assign Marines with capabilities appropriate to required billets, and to manage the force. It is awarded when
performance-based criteria have been met as set forth in ITS/T&R Orders.

Mission Performance Standards (MPS). Criteria that specify mission and functional area unit proficiency standards
for combat, combat support, and combat service support units. They include tasks, conditions, standards, evaluator
instructions, and key indicators.

Mode. The most frequently occurring score.
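
The measures defined above (mean, median, and mode), along with the range defined later in this glossary, can be
computed directly with Python's standard library. This hypothetical sketch is not part of the SAT Manual; the scores
are invented for illustration.

```python
# Hypothetical example: descriptive statistics for an invented set of
# student test scores, using Python's standard library.
from statistics import mean, median, mode

scores = [70, 85, 85, 90, 60, 85, 75]

print(mean(scores))               # arithmetic average of all scores: about 78.6
print(median(scores))             # middle score of the sorted list: 85
print(mode(scores))               # most frequently occurring score: 85
print(max(scores) - min(scores))  # range, largest minus smallest score: 30
```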

Models & Mock-ups. A model is a copy of a real object. It can be an enlargement, a reduction, or the same size as
the original. The scale model represents an exact reproduction of the original, while simplified models do not represent
reality in all details. Some models are solid and show only the outline of the object they portray, while others can be
manipulated or operated. Still others, called mock-ups, are built in sections and can be taken apart to reveal the internal
structure. Whenever possible, the various parts should be labeled or colored to clarify relationships.

Modular Instruction. A prepackaged unit of instruction which typically contains a clear statement of objectives and all
necessary learning resources to permit the learner to achieve these objectives. A module can be a complete unit or part
of a course.

Motivation. Motivation interests the learner and focuses the learner's attention on the lesson. The motivation for a
lesson may be intrinsic or extrinsic. Intrinsic motivation refers to topics that students like or enjoy. Extrinsic motivation
focuses on external rewards for good work or goal attainment.

Nominal Scale. Consists of descriptive categories. The number represents different categories in the set but has no
mathematical meaning.

Non-Discursive Communication. Level of psychomotor domain (Simpson, Harrow, & Simpson) in which students
communicate through bodily movements ranging from facial expressions to sophisticated choreographics; going from
one movement to another in a specified order.

Norm-Referenced Assessment. An assessment designed to discover how an individual student's test scores compare
to scores on the test earned by a group of individuals who represent the target audience. Prevalent in aptitude and
achievement tests that relate scores to a percentile. (Compare to criterion-referenced assessment.)

Objectives-Oriented Evaluation. Approach to evaluation that determines the extent to which learning objectives
have been achieved (see criterion-referenced testing).

Objectivity. A characteristic of evaluation which requires that measurement in an educational environment be correct
and factual and be free from instructor bias.

Observation. A form of evaluation conducted during practical applications, performance test, or on the job, where
evaluators, instructors, or supervisors can observe the students’ performance.

Observation Checklist. Evaluation instrument used to provide quality control and review effectiveness of instruction
through the review of the Master Lesson File and the effectiveness of the lesson, activities, student materials, media,
etc. as observed during a convening lesson.

Occupational Field (OCCFLD). A range of related military occupational specialties (MOS's) that share the same first
two digits (e.g., 0300, 0311).

Open-Ended Question. A question that asks for narrative responses and allows respondents to respond in their own
words.

Operational Risk Management (ORM). The process of dealing with risks associated with military operations. It
includes risk assessment, risk decision-making, and the implementation of risk controls.

Operational Test and Evaluation. Approach to evaluation that determines whether a product represents a significant
improvement or benefit over alternative products.

Ordinal Scale. Consists of categories arranged in a hierarchical order. The intervals between numbers that represent
categories are not equal.

Organization. Level of affective domain (Krathwohl, 1956) in which students compare, relate, and synthesize new
values into their own value systems.

Overhead Question. A question not specifically related to the subject matter, but one that solicits a general response
to the lesson.

Panel. A structured or unstructured discussion between two or more experts (generally excluding the regular
instructor), presented in a variety of ways, such as constructive arguments followed by debate, response to questions
from the instructor or the students, a preplanned agenda, a fixed or a random order of speakers, or free discussion.

Part Practice Session. A method of teaching that breaks down a task into parts. Used when tasks do not have highly
interrelated subtasks.

Pedagogy. Literally means the art and science of teaching children.

Peer Teaching. A method in which instructors allow students to teach other students, with the instructor available to
clarify material presented unclearly.

Perceptual. Level of psychomotor domain (Simpson, Harrow, & Simpson) in which students interpret various stimuli
(something that directly influences action) and make adjustments to the environment. Suggests cognitive as well as
psychomotor behavior.

Performance. Part of a criterion objective that describes the observable student behavior (or the product of that
behavior) against an established standard of performance as proof that learning has occurred.

Performance Checklist. The breakdown of a task into elements that must be correctly performed to determine
whether each student satisfactorily meets the performance standards described in the objective.

Performance Measure. The absolute standard by which job performance is judged. It includes behaviors, results,
and characteristics that can be observed and scored to determine if a student has performed a task correctly.

Performance-Based Test. Sample work situation that measures how well the student has mastered the psychomotor
(physical) and cognitive (mental) skills required for task or job performance.

Physical Activities. Level of psychomotor domain (Simpson, Harrow, & Simpson) in which students perform activity
requiring endurance, strength, vigor, and agility.

Physical Skill. Directly observable behavior requiring the movement of body muscles. Also referred to as psychomotor
skill.

Pilot Course. A validation method in which instructional materials in final form are presented to a target population
group.

Population. A well-defined group of subjects, things, or characteristics from which measurements are taken (for
example, all students 6 feet or taller represents a specific population).

Post-Graduate Survey. Evaluation instrument to collect data from the graduates regarding a course previously
attended.

Posttest. A test administered after the completion of instruction to assess whether a student has mastered the
objectives of the class, lesson, course or other unit of instruction (see summative evaluation).

Practical Application. A technique used during an instructional session which permits students to acquire and practice
the mental and physical skills necessary to perform successfully one or more learning objectives.

Practice and Provide-Help Cues. Practice cues describe the student’s role in the practical application portions of a
lesson, while provide-help cues describe the instructor’s role.

Predictive Validity. We can establish predictive validity for a criterion-referenced test (CRT) in much the same
fashion as we can determine concurrent validity. When we have two CRT measurements of what we believe to be the
same skill or knowledge taken at a considerable length of time from each other, we may wish to determine how well the
first CRT predicted success on the second CRT. We may wish to see how our school posttest predicts success on the job
as measured by supervisor ratings. Or we may wish to determine how well a pencil-and-paper test can be used to
predict future success on a performance exam. In these and similar situations, we can use various statistics to establish
predictive validity between two CRTs as long as they are both scored on a pass or fail basis and the tests are separated
by a substantial period of time.

Prerequisite. A requirement the student must possess before being able to receive instruction. It covers what a
student must know before taking a lesson of instruction.

Pretest. A test administered prior to instruction to determine how much the student already knows (see formative
evaluation).

Primacy. Material presented earlier or first.

Printed Materials. A form of visual information media that includes flat pictures, charts, diagrams, and graphs.

Probe. An unplanned instructor-initiated question used to seek clarification, probe for understanding, or to control the
direction of the discussion; may be either a direct or an overhead question.

Process Method. Method used by evaluators to describe and document the actual development process of a specific
course by use of a checklist.

Process Testing. Testing where the procedure or steps (tasks) used to get to the end result are used to evaluate the
student.

Product Testing. Testing where the characteristics of a good product are used to evaluate the student.

Program of Instruction (POI). A training management document that describes a formal course in terms of
structure, delivery systems, length, intended learning outcomes, and evaluation procedures.

Programmed Instruction. A method of instruction that usually includes a carefully planned sequence of small units of
instruction which require the learner to respond to cues and receive immediate feedback. Various media (books,
teaching machines, and computers) are used to deliver the programmed instruction to the learner.

Progress Method. Method used by evaluators to provide an audit trail that keeps management informed of the
progress of the course development effort.

Progress Test. Tests administered throughout a course to evaluate student progress and to determine the degree to
which students are accomplishing the learning objectives (see formative evaluation).

Projected Still Images. A form of visual information media that includes overhead transparencies and slides.

Psychomotor Domain. A major area of learning which deals with acquiring the ability to perform discrete physical
skills requiring dexterity, coordination, and muscular activity.

Psychomotor Skills. Motor action directly proceeding from mental activity. Also referred to as physical skill.

Qualitative data. Qualitative data are subjective in nature. They are gathered through methods such as participant
observation, interviews, and open-ended questions when evaluating training/education outcomes.

Quantitative data. Quantitative data are objective in nature. They emphasize standardization, precision, and
reliability of measures of efficiency and are gathered through standard, structured methods.

Questioning Method. Method used to emphasize a point, stimulate thinking, keep students alert, check
understanding, review material, and seek clarification.

Questionnaire. A data collection instrument consisting of a printed form containing a set of questions used to gather
information from respondents.

Range. The difference between the largest and smallest scores occurring in a distribution.

Rating Scales. Any number of instruments upon which instructors record their assessments of student performance
through a process of observation or measurement and judgment.

Ratio Scale. Consists of categories arranged in hierarchical order that has equal intervals between categories (i.e., any
two adjoining values in a ratio measure are the same distance apart). A true zero anchors the scale of a ratio measure.

Reading Method. Reading is the assignment to a student of printed materials including books, periodicals,
microforms, manuals and regulations, and handouts (instructor-produced).

Receiving. Lowest level of affective domain (Krathwohl, 1956) in which students become aware of and pay attention
to someone or something.

Recency. Material presented most recently or last.

Record of Proceedings (ROP). The evaluation results and recommendations that result from the Course Content
Review Board.

Reflex Movements. Level of psychomotor domain (Simpson, Harrow & Simpson) in which students perform an action
without learning it in response to some stimuli (something that directly influences the activity).

Reliability. An indicator of score consistency over time or across multiple evaluators. Reliable assessment is one in
which the same answers receive the same score regardless of who performs the scoring or how or where the scoring
takes place. The same person is likely to get approximately the same score across multiple test administrations.

Remedial Instruction. Supplemental instruction designed to correct student misunderstanding of course material or a
student learning deficiency. A sequence that provides an alternative, more basic approach to meeting the same
instructional objective.

Responding. A level of the affective domain (Krathwohl, 1956) in which students act or comply with the instructor's
expectations by performing an act and obtain satisfaction from it.

Risk. An expression of possible loss in terms of severity and probability.

Risk Assessment. The process of detecting hazards and assessing associated risks.

Role-playing. Students project themselves into simulated interpersonal situations and act out the parts of the persons
and situations assigned by the instructor. Role-playing is generally limited to practice of the skills involved in
interpersonal relations, such as counseling, interviewing, and conference leadership.

Safety Brief. A brief provided to make students aware of the identified hazards and the controls implemented to
minimize risks.

Safety Checklist. Evaluation instrument used by instructors or the administration to ensure that proper safety
procedures have been adhered to.

Safety Questionnaire. Student reaction form used to provide evaluation feedback on safety within the instructional
environment.

Scales of Measurement. Methods of measurement that specify how numbers assigned to variables relate to the
property being evaluated or measured.

Self-Paced Instruction. Instructional method which permits a student to progress through a course of instruction at
the student's own rate.

Simulation. An actual piece of equipment, or a mock-up of one, that allows duplication of job performance.

Site Visit. Visit by formal school personnel to the Fleet Marine Force to observe and interview graduates.

Skill. The ability to perform a job related activity that contributes to the effective performance of a task.

Skilled Movements. Level of the psychomotor domain (Harrow, 1972) in which students perform a complex task
with a degree of efficiency.

J-14
Systems Approach To Training Manual Glossary

Slides. Individual frames of 35-millimeter film that are projected in sequence. Some slide sets and filmstrips are
accompanied by a tape or disc that contains narration and a signaling device that indicates when to advance
to the next frame. Depending on the type of projector, the film advances either manually or automatically.

Small Critical Audience Rehearsals. A process in which an instructor delivers a lesson plan in its entirety to a small
group of instructors/peers who evaluate the delivery of the lesson.

Socratic Method. A conversation or discussion wherein two or more people assist one another in finding the answers
to difficult questions. The method may resemble a guided discussion, but the goal is often to obtain specific answers to
specific questions and not to stimulate discussion. This method facilitates the student’s quest for understanding by
requiring the student to answer questions on his/her own, to ponder the validity of what others have said or written, and
to give reasoned support of his/her opinion to the other students in the group.

Standard. Part of a learning objective, the standard establishes a criterion for how well the task or learning objective
must be performed.

Standard Deviation. Describes the amount of variability in a group of scores.
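
The following Python sketch computes a population standard deviation for a hypothetical score set (mean of 80), using the standard library's `statistics` module:

```python
import statistics

# Hypothetical test scores (mean = 80).
scores = [70, 75, 80, 85, 90]

# Population standard deviation: the square root of the average squared
# deviation from the mean (statistics.stdev gives the sample estimate).
sd = statistics.pstdev(scores)
print(round(sd, 2))  # 7.07
```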

Standing Operating Procedure (SOP). A document that outlines the policies and procedures of an organization.

Stem and Responses. The two parts of a multiple-choice test item. The stem presents a problem, question,
statement, or situation and contains all the information needed to answer the item. The responses are several
possible answers, only one of which is correct.

Storyboard. A script sheet that shows key visualization points with accompanying video information.

Student. The individual receiving instruction, the individual learning from the interactive courseware, or an individual
who has been placed in a learning situation to acquire knowledge and skills required for accomplishment of specific
tasks.

Student Data Form. Form used to collect personal data from the student upon arrival at a course.

Student Materials. Additional facts and information given to the students as a study guide that can be referred to
during the course and as a job aid that students can take back to their unit following completion of the course. There
are two types of student materials, student outlines and supplemental student materials.

Student Outline. Student material which provides the student with a general structure to follow during the class and a
conceptual framework that highlights the main ideas of the class.

Student Query. “Students asking questions” is often used in combination with other methods such as the lecture, the
panel discussion, or the teaching interview, but it could be used by itself, either on a one-to-one basis in tutoring or
coaching or as part of small or large groups. The method is student controlled, although the responder can also control
the session to a certain extent if skillful enough. Students’ questions may often be a measure of the degree of their
understanding of a particular matter, that is, they “know enough to ask the right questions.”

Subject Matter Expert (SME). An individual who has a thorough knowledge of a job, duties/tasks, or a particular
topic, which qualifies him to assist in the training development process (for example, consultation, review, analysis,
advice, critique).

Summary. A major section of a lesson, which follows the introduction and body. It should contain a review of the
main points, closure, and administrative directions.

Summative Evaluation. Used to make judgments and determinations concerning student achievement and the
effectiveness of the instructional program. Summative evaluations lead to grades, to reports about a student’s relative
level of competence, and to alterations of instructional programs. Also designed to collect data and information during
the operational (field) tryouts of equipment/system in order to determine the effect of the instruction under operational
conditions and to make any changes or revisions to the system prior to becoming operational.

Supplemental Student Materials. Any handout, other than the student outline, given to the students to support the
instruction.

Supportive Relationships. Skills and knowledge in one learning objective have some relationship to those in another
learning objective.

Survey Test. A survey test is designed to determine what prospective students already know and can do before
receiving the instruction.


Synthesis. Level of cognitive domain (Bloom, 1956) in which students are able to put parts together to form new
patterns or structures.

Systems Approach to Training (SAT). An orderly process for analyzing, designing, developing, implementing, and
evaluating an instructional program which ensures personnel acquire the knowledge and skills essential for successful
job performance.

Target Population Description (TPD). The TPD provides a general description of the target population and
establishes administrative, physical, and academic prerequisites that students should possess to be assigned to a formal
school of instruction. The level of experience the average student will bring into the classroom must be considered. Due
to their lack of experience, entry-level students may not be able to comprehend multiple objectives in a single lesson.

Task. A unit of work usually performed over a finite period of time, which has a specific beginning and ending, can be
measured, and is a logical and necessary unit of performance.

Task List. The sequential, component steps in a larger task; represented by achievement of a criterion objective.

Taxonomy of Educational Objectives. A systematic classification scheme for sorting learning outcomes into three
broad categories (cognitive, affective, and psychomotor) and rank ordering these outcomes in a developmental hierarchy
from least complex to most complex.

Teaching Interview. The instructor questions a visiting expert and follows a highly structured plan, which leads to
educational objectives. The advantage of the teaching interview over the guest lecture is that the instructor controls the
expert’s presentation. The expert normally requires little or no advance preparation, but responds extemporaneously
from general experience. When a question-and-answer period follows the interview, students can interact with the
expert.

Terminal Learning Objective (TLO). A TLO is a statement of what a student is expected to perform upon
completion of a lesson, topic, major portion of a course, or the entire course.

Test. Any device or technique used to measure the performance, skill level, or knowledge of an individual.

Time Cues. Approximations for the amount of time required for presenting each lesson component. Each component
and main idea of a lesson plan has a time cue. The sum of all the main idea time cues equals the time cue for the body.

Training. Instruction and applied exercises for the attainment and retention of skills, knowledge, and attitudes required
to accomplish military tasks.

Training & Readiness (T&R) Event. An individual or collective training standard.

Transfer of Learning. The extent to which what the student learned during instruction is used on the job.

Transparencies. An overhead transparency is usually made from acetate or plastic that has been prepared for use on
an overhead projector. If hand drawn transparencies are needed, materials such as heavy-duty, clear plastic bags,
document protectors, and reprocessed x-ray film can be used in place of commercially produced acetate. In addition,
transparencies can be made from existing printed material by using a thermoprocess machine and special film.

Transitions. Statements used by the instructor to move from the introduction of a lesson to the body, between main
points, between subpoints within each main point, and from the body to the conclusion of the lesson. These statements
show a logical relationship between the lesson segments they connect.

Validation. The process by which the curriculum materials and instruction media materials are reviewed by the
contractor for instructional accuracy and adequacy, suitability for presentation, and effectiveness in providing for
trainees' accomplishment of the learning objectives. Validation is normally accomplished in tryouts with a representative
target population. The materials are revised as necessary as a result of the validation process.

Validity. A characteristic of evaluation that requires testing instruments to measure exactly what they were
intended to measure. A test with high content validity measures the material covered in the curriculum or the unit being
tested. A test with high criterion validity successfully predicts the ability to do other work. For example, an auto
mechanic test with high criterion validity will successfully predict who will become a good mechanic.

Variability Attributes. Characteristics shared by some, but not all, members of a class of people, objects, events,
ideas, or actions which are grouped together on the basis of shared critical attributes and called by the same concept
name.


Variance. The average squared deviation from the mean. Variance is useful for determining how far from the mean
students score on a particular test item or test.
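
To make the arithmetic concrete, this sketch computes the variance of a hypothetical score set by hand:

```python
# Hypothetical test scores; mean = 80.0.
scores = [70, 75, 80, 85, 90]
mean = sum(scores) / len(scores)

# Average of the squared deviations from the mean:
# ((-10)**2 + (-5)**2 + 0**2 + 5**2 + 10**2) / 5 = 250 / 5 = 50.0
variance = sum((s - mean) ** 2 for s in scores) / len(scores)
print(variance)  # 50.0
```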

Valuing. Level of the affective domain (Krathwohl, 1964) in which students accept, prefer, or commit themselves to an
object or behavior because of its perceived worth or value; to appreciate.

Video Tele-Training (VTT). A technology that supports distance learning and video teleconferencing. It allows
presentations to be sent and received, lets students interact with instructors and students at distant sites, and can
connect more than 20 classrooms/sites around the world with one instructor teaching them all. This technology also
has the capability of connecting to almost any kind of broadcast format.

Virtual Conferencing. Video teleconferencing that allows instructors to send and receive presentations and gives
students the opportunity to interact with instructors at distant sites.

Virtual Reality (VR). Virtual reality is the computer-generated simulation of a real or an imagined environment or
world. It can be graphics-based (e.g., a walk-through of a building) or text-based (e.g., a description of a city where
participants can interact with one another).

Visual Learners. Learners who tend to learn better by seeing.

Whole Practice Session. A method of teaching an entire task at once. Used when tasks have highly interrelated subtasks.

Systems Approach To Training Manual References

Bloom, B.S. (Ed). (1956). Taxonomy of Educational Objectives. New York: Longman.

Bloom, B.S. (Ed.), Englehart, M.D., Furst, E.J., and Krathwohl, D.R. (1956). Taxonomy of
Educational Objectives: Handbook I: Cognitive Domain. New York: David McKay Co.

Caffarella, R. S. (1994). Planning Programs for Adult Learners. San Francisco: Jossey-Bass
Publishers.

Cranton, P. (1989). Planning Instruction. Toronto: Wall & Emerson, Inc.

Cranton, P. (1992). Working with Adult Learners. Toronto: Wall & Emerson, Inc.

Dale, E. (1969). Audio-Visual Methods in Teaching. (3rd Ed.) New York: Holt, Rinehart, and Winston.

Davis, B.G. (1993). Tools for Teaching. San Francisco: Jossey-Bass Publishers.

Gagne, R.M., Briggs, L.J., and Wager, W.W. (1992). Principles of Instructional Design (4th Ed.).
New York: Harcourt Brace Jovanovich College Publishers.

Gronlund, N. E. (1998). Assessment of Student Achievement. (6th Ed.) Boston: Allyn and Bacon.

Harrow, A.J. (1972). Taxonomy of Psychomotor Domain. New York: David McKay Co.

Heinich, R., Molenda, M., Russell, J.D., and Smaldino, S.E. (1999). Instructional Media and
Technologies for Learning. (6th Ed.) Upper Saddle River, New Jersey: Prentice-Hall, Inc.

Kirkpatrick, D.L. (1998). Evaluating Training Programs. (2nd Ed.) San Francisco: Berrett-Koehler
Publishers, Inc.

Krathwohl, D.R., Bloom, B.S., and Masia, B.B. (1964). Taxonomy of Educational Objectives:
Handbook II: Affective Domain. New York: David McKay Co.

Knowles, M.S., Holton III, E.F., and Swanson, R.A. (1998). The Adult Learner. (5th Ed.) Houston:
Gulf Publishing Company.

Mager, R.F. (1984). Preparing Instructional Objectives. (2nd Ed.) Belmont, California: David S.
Lake Publishers.

Powers, B. (1992). Instructor Excellence: Mastering Delivery of Training. San Francisco: Jossey-
Bass Publishers.

Reiser, R.A. and Gagne, R.M. (1983). Selecting Media for Instruction. Englewood Cliffs, New
Jersey: Educational Technology Publications.

Seels, B. and Glasgow, Z. (1998). Making Instructional Design Decisions. (2nd Ed.) Upper Saddle
River, New Jersey: Prentice-Hall, Inc.

Smith, P.L. and Ragan, T.J. (1999). Instructional Design. (2nd Ed.) New York: John Wiley & Sons,
Inc.

