Monday, April 30, 2012

ADDIE Process- Evaluation

This post is part 5 in a 5-part series on the Phases of the ADDIE Process of Instructional Design
<< Previous post: Implementation



A critical phase in the ADDIE Process of instructional design is evaluation. To evaluate means to judge effectiveness. Calling this a "phase" is a little misleading, because in reality you should be continually evaluating your work at every phase through formative evaluation, described below.

Summative evaluation, by contrast, is performed to determine how effective a given piece of instruction is at improving student learning, and it usually takes place after the instruction has been implemented.

Gathering Formative Feedback
A formative evaluation is performed to solicit feedback and guidance during the design process, with the purpose of improving and refining the instruction. As you gather formative feedback, there are several important things to remember:
  • Gather data iteratively and at every step in the design process. You should make sure you are on the right track as you move through the design phases.
  • Focus on refining and improving instruction. Your goal should be to make the instruction more effective, more efficient, and more engaging for the students.
  • Gather feedback from many different sources, including:
    1. Experts. Ask people who know the content to review your instruction. This will help ensure that the content is accurate.
    2. Designers. Ask other instructional designers to review your instruction and give you feedback on how you could make it more clear and effective. Instructional design is the creative application of research-based principles, and creativity can be enhanced through collaboration and brainstorming.
    3. Editors. Have a course editor review your materials to help improve their quality. This could be seen as part of the design process, but I mention it here because it is formative in nature and improves the quality of the instruction.
    4. Learners. Test your instruction on your learners to see how it is received. Your instruction should be tailored to the needs of your target audience, and their feedback will help you improve the quality of your instruction.
In our MS degree program at Franklin University, we follow a rigorous formative evaluation process to improve the quality of each of our courses. Each course undergoes at least two revisions based on data gathered from each of the sources described above. We have seen a dramatic increase in student satisfaction with our courses as we continually refine and improve them.


Here is an excellent video providing more ideas on evaluation in instructional design:




Dr. Richard Clark - HPT Conference Interview


In recent posts I reported my experience at the ISPI 2012 Performance Improvement Conference. While there, Dr. Richard Clark, one of the keynote speakers, spoke on the importance of using research-based practices to improve human learning and performance. At the conference, Dr. Clark was interviewed by Guy Wallace about his experiences and insights on Human Performance Technology and ISPI. I've embedded the video below.



You can learn more about Dr. Clark's work at http://www.cogtech.usc.edu/.

Wednesday, April 25, 2012

The 2012 ISPI Conference - Ideas and Experiences

As you may have read in my earlier posts, I recently attended the ISPI Performance Improvement Conference in Toronto, Canada. I had an outstanding experience and blogged some of what I learned throughout the week. If you missed the conference, these links share my experiences.

Before the Conference
Here are a few posts from before the conference and during the trip to Canada.
Principles and Practices of HPT Workshop
In Toronto, I first attended the Principles and Practices of HPT Workshop, which was outstanding. Here are my notes and thoughts from the workshop:
Podcast:
While I was at the conference, I had a few minutes to talk with my brother about some of the things I had learned and describe some of the basic concepts of the field of Human Performance Technology.
ISPI Performance Improvement Conference
After the workshop, I attended the ISPI Performance Improvement Conference. The ideas and the knowledge presented were astonishingly powerful. Here are my reactions from the conference.

Monday, April 23, 2012

Podcast: 2012 ISPI Conference, Toronto, Canada

This post is part of a multi-part series on the 2012 Performance Improvement Conference.
#ISPI2012

Here is a podcast on my experiences at the ISPI Performance Improvement Conference. It was a conversation between my brother and me over the phone. (I was in Toronto, Canada, and he was in Idaho Falls, Idaho.) We touch on some of the foundations of Human Performance Technology and how it applies to businesses and organizations. Hope you enjoy it!

http://www.edtechdojo.com/110-ispi-conference-report.html

ISPI Performance Improvement Conference - Day 3

This post is part of a multi-part series on the 2012 Performance Improvement Conference.
#ISPI2012

I have been attending ISPI's Performance Improvement Conference 2012, and today was the final day of the conference. I had to leave in the afternoon to get to the airport, so I missed the afternoon presentations.


Morning Session - Eileen Maeso, CPT - How Do You Apply HPT?

This was the last session I attended at the conference, and it helped me step back and see things from the big-picture view. Eileen went over some of the key terminology in the field, described some of its key models, and then had us practice applying the "ISPI-adopted HPT Model" to a simple case study. It was nice to work through a sample situation, and it really helped me tie everything back together as I completed my time at the conference.

This model seems somewhat overwhelming when you look at the big picture, but it really follows the same general phases as the ADDIE Process for designing instruction. Once you think of it in these terms, you can follow each of the phases and use the HPT model as a guide. It really is powerful. The funny thing is that each of these strategies on its own seems practical and straightforward, but it is the systematic application of all of them together that really creates powerful results.

*     *     *     *     *

Additional Themes from the Conference

Communicating HPT Ideas Effectively 

It is crucial that knowledge be communicated clearly. I've recently come from the academic world, where researchers use very methodical steps for presenting their work; in the world of business, that kind of rigor and extraneous detail wastes a great deal of time. Here are a few ideas for communicating problems, solutions, and results:
  • Present your ideas in clear, simple terms. The goal is to help others understand the problem and solution very quickly. Keep your deeper analysis on hand, but only share it if it is requested.
  • Use the fewest words possible. If you can say it with less, do it.
  • Use effective visuals to communicate complex things. Visual presentation can be very powerful as well. I find that I naturally think this way, and I plan to use this approach more deliberately in my work. This can be as easy as using simple graphics or laying out a page in a visually appealing way.
Systems-thinking

Everything operates within a system in which the parts are interrelated. Focusing on a minor component might improve that component while leaving the overall system unaffected, and sometimes changing one component can negatively impact the system. We should be aware of the larger system in which we work and live. For example, we have limited resources on earth, and we must figure out how to align our lives, our communities, and our businesses with that larger system or there will be terrible consequences.

Working Systematically

This is related to systems thinking but is somewhat different. In systems thinking, one looks at things as a whole interacting system. Working systematically, on the other hand, means working in an objective-oriented manner: thinking critically and using proven processes to identify and solve problems. When we are working systematically, we analyze and design before we begin implementing solutions. We gather data and consider our needs before we act. There are several things that make it difficult to work systematically:
  • Human nature. It is efficient to rely on knowledge we already have, and if we didn't have this capacity, we would have to relearn almost everything we do. The problem is that relying on existing knowledge alone is often not effective.
  • Lack of time. We are often so busy that we do not have time to do a proper analysis and to really identify an overall goal, to discover what is getting in the way of reaching that goal, and to identify the most effective path for reaching that goal.
  • Overload. We are so often buried by the constant flow of information and pressure that our minds become overloaded, and we are unable to take the time to do what is effective.
In a future post, I will share a podcast in which I discuss my experience at this conference with my brother, instructional designer J. Clark Gardner, on the EdTech Dojo.

Sunday, April 22, 2012

ISPI Performance Improvement Conference - Day 2

This post is part of a multi-part series on the 2012 Performance Improvement Conference.
#ISPI2012

I have been attending ISPI's Performance Improvement Conference 2012, and today was the second day of the conference. I missed the keynote speaker this morning to attend church. (I normally attend the LDS church but missed the bus to the chapel, so I attended Mass at St. Michael's Cathedral, which is right on Bond Street. A beautiful cathedral and a very nice service.)

Morning Session - Tim Brock, PhD, CPT, Peggy Meli, PhD - HPT Backyard Research: Tales From the Frontline

I attended a presentation by Tim Brock and Peggy Meli. (By the way, Dr. Brock is an adjunct faculty member of the Instructional Design and Performance Technology master's degree program at Franklin University.) They described how they have applied a sort of action-research approach to their own HPT work. Their ideas were really practical and showed how practitioners can reflectively apply their knowledge and iteratively improve their work as they do it. A great presentation.

Afternoon Session - Debunking Common Myths

The afternoon session had a format I had never seen at a conference. Several researchers presented for 20 minutes each on erroneous ideas that are often heard in the field and debunked them with solid research. The idea is that basing our practice on strategies supported by research is the most logical and effective route. It was very insightful; here are some highlights:
  1. Designing instruction based around Learning Styles does not work. (I have written about how to combat this error in a previous post.)
  2. "Digital Natives" do not learn any differently than older learners.
  3. The medium used doesn't increase learning; it is the instructional strategies used that improve learning.
  4. IQ is actually a very good predictor of success at many things.
  5. Learners need guidance to learn most effectively - that is, minimal guidance works very poorly.
  6. Learner reactions are a very poor way to evaluate learning.
Some More Notes From the Conference

At the Principles and Practices of Human Performance Technology Workshop, Jim Hill visited with us for a few minutes and gave some really sound advice for beginning performance consultants (in my own words):
  • Apply the HPT tools that you are learning to yourself first. This is a good way to get started.
  • Start on smaller projects so that you can manage them easily. You can later start to expand to bigger projects.
  • When talking with people, make the complex seem simple. Instead of saying you will increase sales by X%, say that the goal is to have "one more deal per sales rep." This is very simple, easy to understand, and seems easy to apply.
  • Don't be afraid to use big numbers, though. People like to invest in big ideas.

*     *     *     *     *

This conference has been an incredible introduction to HPT tools and practices. The Principles and Practices Workshop was a great introduction, and I am now having that knowledge reinforced and expanded as I attend the presentations.

I feel like I am reaching cognitive overload. My mind has gained about all that it can, and I will need to reflect on and begin to apply what I have learned over the coming weeks and months. I am thinking more long-term now. It will likely take another couple of years of conferences, practice, and learning before I feel I have a real level of expertise. That is frustrating, because I have gained expertise in other (related) fields, but I must remember that that expertise also took years of study and practice.

It's hard to believe that tomorrow is the last day of the conference; it has been quite an experience.

Saturday, April 21, 2012

ISPI Performance Improvement Conference - Day 1

This post is part of a multi-part series on the 2012 Performance Improvement Conference.
#ISPI2012

I have been attending ISPI's Performance Improvement Conference 2012. Yesterday was the final day of the 3-day Principles and Practices of Human Performance Technology workshop, and today was the first day of the conference.

Keynote Speaker - Dr. Richard Clark

I was excited to see that Dr. Clark was the keynote speaker - I have heard him speak before, and in the past he has complimented me on some of my writing. Today he presented on the need to use evidence-based practices. He stated that, as a society, ISPI should be willing to research what the evidence shows, to promote research, and to share it more completely.

After his presentation, I overheard someone saying that Dr. Clark is disconnected from the real world, and that in business we just have to work with what we have due to pressure and constraints. I agree that practitioners must act quickly; however, I don't believe that Dr. Clark was saying that practitioners need to do research. He was saying that ISPI as a society should create, promote, and share research, and that there should be a conscious effort toward making this happen. Is he disconnected from the real world? Yes, and I think he is consciously disconnected in the right direction. We should base our actions on the evidence, on what we know works.

As a side note, in his presentation, Dr. Clark mentioned First Principles of Instruction as a research-based framework for research and practice. I have previously written and published an article about how these principles can be used: Applying Merrill's First Principles of Instruction

You can also view a recorded interview with Dr. Clark at the ISPI Conference.

Morning Session - Patti Phillips, CPT, PhD - ROI Basics

Dr. Phillips presented on Return on Investment (ROI) in a performance improvement project. This topic is new to me (since I am an academic with no financial background to speak of), and it was eye-opening. She provided three great resources: (1) a guide for the workshop, (2) a nice fold-out of a very clear model demonstrating how to measure ROI in a performance improvement project, and (3) her book The Bottomline on ROI.

I was really surprised at how practical and logical her ideas were. She and her husband Jack have created a very powerful set of tools. For some reason, thinking in this way is totally new to me. I want to apply the knowledge, but I realize that as a novice it will be difficult. I think I will start small by simply gathering data on what is happening with my program so that I have more information with which to make decisions. The more performance data I have available, the greater the opportunity to calculate ROI and make useful decisions.
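
For readers as new to this as I am, the core arithmetic behind ROI reporting is simple. The sketch below shows only the standard benefit-cost ratio and ROI formulas with made-up numbers; it is not the full Phillips methodology, which also isolates the program's effects and converts benefits to monetary values before these calculations are made:

```python
def benefit_cost_ratio(total_benefits, total_costs):
    """Benefit-cost ratio (BCR): monetary benefits returned per dollar spent."""
    return total_benefits / total_costs

def roi_percent(total_benefits, total_costs):
    """ROI (%): net benefits expressed as a percentage of program costs."""
    return (total_benefits - total_costs) / total_costs * 100

# Hypothetical example: a program costing $80,000 that produces $240,000
# in monetary benefits.
print(benefit_cost_ratio(240_000, 80_000))  # 3.0   -> a 3:1 benefit-cost ratio
print(roi_percent(240_000, 80_000))         # 200.0 -> a 200% return on investment
```

The hard part, of course, is not the arithmetic but gathering credible performance data and converting it to monetary values, which is exactly where I plan to start.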

I also attended several afternoon sessions that were very powerful. My mind is filled with some really great knowledge, but new exposure to an entire field can be overwhelming. Several things give me an advantage in understanding the principles taught - my ISD background, my experience as a corporate trainer, and my understanding of the goal-setting process - but it is still a lot to take in.

I am excited to see what I learn tomorrow. I am confident it will be worthwhile...

HPT Workshop: Day 3

This post is part of a multi-part series on the 2012 Performance Improvement Conference.
#ISPI2012

I have been attending ISPI's Performance Improvement Conference 2012. Yesterday was the final day of the 3-day Principles and Practices of Human Performance Technology workshop. In the previous two days, we spent a great deal of time discussing different analysis methods, which are designed to help identify what is contributing to performance needs or gaps.

On this final day of the workshop, Dr. Addison introduced us to several models and tools for selecting and implementing an intervention. The first tool he gave us was what he calls the Performance Map Quick Check (by Roger Addison). This tool can be used when working with an organization to help clients understand that there are multiple solutions to a performance problem and that the solution they are requesting might not fit their needs. Here is a simplified version of the Performance Map Quick Check:

Vertical axis - Competence: Do the performers have the knowledge, skills, and abilities? (rated Low to High)
Horizontal axis - Confidence: Is there confidence that they will perform? (rated Low to High)

The four quadrants of the map correspond to four categories of solutions:
  • Motivation-based solutions (the why) - feedback, consequences, incentives, coaching, etc.
  • Environment-based solutions (the where) - physical layout, equipment needed, furniture, lighting, etc.
  • Structure-based solutions (the what) - mission, vision, values, goals, job functions and tasks, talent recruitment, etc.
  • Learning-based solutions (the how) - job aids, training, e-learning, information, etc.

In this basic plot diagram there are two continua to consider - the Competence Continuum and the Confidence Continuum. When using this tool, first seek to determine whether the group has the competence (the knowledge, skills, and abilities) required to do the work. Next, determine whether there is confidence that the learners can apply what they have learned.

Depending on how you might rate the performers on competence and confidence, you would end up with the following possibilities:
  1. High Competence and High Confidence: environment-based solutions may be appropriate
  2. High Competence and Low Confidence: motivation-based solutions may be appropriate
  3. Low Competence and High Confidence: learning-based solutions may be appropriate
  4. Low Competence and Low Confidence: structure-based solutions may be appropriate
Once the client sees that training may not be the answer, you have a greater opportunity to start diagnosing the problem and ultimately implement the appropriate solution. I particularly liked one thing Dr. Addison said:
"It's the questions that drive the solutions... The best thing a performance consultant can do is ask the right questions and then observe to confirm the answers."
Very insightful - I have learned this as a researcher and as an instructor, and I look forward to applying the same principle when analyzing a performance problem.
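
To make the quadrant logic concrete, here is a minimal sketch of how the two ratings map to the solution categories above. This is my own illustration of the mapping described in this post, not an official implementation of Addison's tool:

```python
def suggest_solution_category(competence_high: bool, confidence_high: bool) -> str:
    """Map Performance Map Quick Check ratings to a solution category.

    An illustrative sketch of the quadrant logic described above, not an
    official implementation of Addison's Performance Map Quick Check.
    """
    if competence_high and confidence_high:
        return "Environment-based solutions (the where)"
    if competence_high and not confidence_high:
        return "Motivation-based solutions (the why)"
    if not competence_high and confidence_high:
        return "Learning-based solutions (the how)"
    return "Structure-based solutions (the what)"

# Example: performers know how to do the work but doubt they will perform.
print(suggest_solution_category(competence_high=True, confidence_high=False))
# -> Motivation-based solutions (the why)
```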

*      *      *
So, as the workshop has come to a close, here are my final thoughts.

This workshop did a great job introducing key concepts related to performance technology. I really appreciated the stories that were shared and really felt like I learned a great deal from my peers in the workshop. They each had unique perspectives that were refreshing, sometimes challenging, and always insightful.

I think I would have liked a few more stories relating what we were learning to the models themselves. I find that listening to stories linked to the tools is very helpful. The presenters did a good job with this, and I found that I wanted more. This is probably a good thing - I am left with a strong desire to learn as much as I can, and I am already seeing that the ISPI Performance Improvement Conference is giving me much of what I am looking for.

I would highly recommend this workshop and this conference to anyone who would like to enter the world of Human Performance Technology. The people are wonderful, the knowledge is powerful, and I really feel like I am gaining meaningful knowledge.

Thanks to Dr. Addison and Dr. Lane for a great workshop!

Thursday, April 19, 2012

HPT Workshop: Day 2

This post is part of a multi-part series on the 2012 Performance Improvement Conference.
#ISPI2012

I have been attending ISPI's Performance Improvement Conference 2012. Today was the second day of the 3-day Principles and Practices of Human Performance Technology workshop. Yesterday we spent a lot of time getting oriented to the field and doing some initial analysis activities. (I talk more about Day 1 here.) Today we focused much more on analysis, which is one of three major tasks that a performance consultant performs: (1) analysis of problems, needs, or gaps, (2) implementation of solutions to those problems, and (3) evaluation of the effectiveness of those solutions.

We learned about and applied three fundamental models for analyzing the performance needs and situations of an organization:
1. The Total Performance System - this is a model that allows the performance consultant to analyze the system in which the performance takes place. The model highlights several key components to the system and helps the performance consultant get a feel for what is happening in an organization related to the performance in question.
2. The Behavior Engineering Model - this is a model (developed by Thomas Gilbert) that helps the performance consultant identify what is needed to enable correct performance in an organization. These factors can be categorized using the following table:

Environment
  • Information: Data - receives description of, guides for, and feedback on performance
  • Instrumentation: Resources - time, materials, tools
  • Motivation: Incentives - pay, benefits, opportunities, consequences for poor performance
Worker
  • Information: Knowledge - appropriate placement, training that matches performance needs
  • Instrumentation: Capacity - physical capacity, visual aids, adaptation, selection
  • Motivation: Motives - assessment of worker motives, recruitment of matching people

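As a simple way to keep the six cells of the model straight, it can be represented as a small lookup structure like the sketch below (my own illustration, not an artifact from the workshop):

```python
# An illustrative representation of Gilbert's Behavior Engineering Model,
# keyed by (level, category). My own sketch, not an official artifact.
behavior_engineering_model = {
    ("Environment", "Information"): "Data - descriptions of, guides for, and feedback on performance",
    ("Environment", "Instrumentation"): "Resources - time, materials, tools",
    ("Environment", "Motivation"): "Incentives - pay, benefits, opportunities, consequences",
    ("Worker", "Information"): "Knowledge - appropriate placement, training matched to performance needs",
    ("Worker", "Instrumentation"): "Capacity - physical capacity, visual aids, adaptation, selection",
    ("Worker", "Motivation"): "Motives - assessment of worker motives, recruitment of matching people",
}

# Example: what should we examine for an environment-level information gap?
print(behavior_engineering_model[("Environment", "Information")])
```
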
We did an interesting exercise with this model - Dr. Addison asked each of us to identify which of these cells represented the biggest barrier to performing our own work most effectively. We all responded independently, and the following numbers of people ended up in each category:


Environment
  • Information: 6
  • Instrumentation: 17
  • Motivation: 2
Worker
  • Information: 1
  • Instrumentation: 2
  • Motivation: 2

So, most people felt that they just needed more resources in order to do their work more effectively. Interestingly, Dr. Addison noted that Thomas Gilbert found this to be the case in most situations - people rarely need training to improve their performance; they more often need support in the other categories, and most often they simply need the right resources!

3. The final analysis tool we used was the Driver System Analysis Matrix. To be honest, I really struggled with this tool. It requires the performance consultant to look at many elements at the organizational/administrative level. We attempted to use the matrix to analyze performance at a fictitious organization, and I had a hard time with this for several reasons: (1) I am accustomed to thinking at the Work and Worker levels and found the shift to the Workplace level difficult; (2) I was unfamiliar with the case study and had a hard time understanding the specific context; (3) I didn't sleep well the night before and was having a hard time focusing; and (4) I was so enthralled by the other two models that I found myself looking back at them.

Really, this is a slightly more complex way to look at a complex system - I am certain that it is a great tool; I just need to revisit it sometime soon.

*     *     *
I really liked today's experience, and I already have ideas for analyzing my own organization and work. As I wrote in a previous post, humans have a tendency to jump to conclusions. I see this happen all the time in the workplace among my peers, and I know that I often do the same thing. I have to force myself to slow down, and it really does take more time to think clearly. I like the models presented in the workshop today because they help an individual focus on the things that are most important and really understand what is contributing to their issues and problems. I am excited to test these models out in my own work!

I look forward to day 3 of the workshop. The presenters have done an incredible job of creating meaningful experiences so that we can really grasp the content and consider how to use these tools in our own work.

Wednesday, April 18, 2012

HPT Workshop: Day 1

This post is part of a multi-part series on the 2012 Performance Improvement Conference.
#ISPI2012

I have been attending ISPI's Performance Improvement Conference 2012. Today was the first day of the 3-day Principles and Practices of Human Performance Technology workshop. The workshop is being facilitated by Roger Addison, EdD, CPT, and Miki Lane, PhD, CPT. Both have a great deal of experience as performance consultants and share meaningful stories that help make the content feel more relevant. I also met several excellent co-attendees. We have worked in teams for several exercises, and I am getting to know and appreciate the knowledge and experience of my teammates. I'll write more about them in a later post.

We spent the morning getting an overview of the field of human performance. We then spent much of the afternoon learning important techniques for analyzing and identifying business needs and opportunities, and it sounds like a significant part of tomorrow's workshop activities will also focus on analysis.

Performance Definitions

One of the things I found interesting was the distinction between Performance Improvement and Performance Technology. In a previous post, I described The Difference Between Instructional Design, Instructional Science, and Instructional Technology. I will build on what I learned in today's workshop using similar categories. Here are the four components I think are relevant to understanding the field of human performance:
  • Performance Improvement is the GOAL of the performance consultant. 
  • Performance Science is what we know about what works in improving human performance. These findings can be called "best practices" or "evidence-based practices."
  • Performance Technology is the means for reaching the goal of improving performance.
  • Performance Consulting is the use of performance technology (based on performance science) to reach the goal of performance improvement. It usually involves the following major steps: (1) the diagnosis of performance problems and opportunities, (2) the implementation of research-based strategies for improving performance related to those problems and opportunities, and (3) the evaluation and follow-up to see how effective those strategies have been. 

Levels of Performance Needs

When diagnosing performance problems and needs, it can be useful to identify the level of the performance problem, need, or gap. These levels are:
  1. Worker (individual level)
  2. Work (process level)
  3. Workplace (organizational level)
  4. World (society level) - a fourth level that can also be added

All About the Money?

One of the most important things I learned this first day was that when we are analyzing a problem or a need, we should always link that need to how it affects the financial success of the organization. If an organization has profit as a goal, then all activity should be focused on increasing profits in sustainable ways. For some reason this seemed new to me, something I had never considered - possibly because I work as an academic in higher education, where making a profit does not seem to be at the forefront of my (or my peers') thinking. But the discussions and activities in our workshop today helped me realize the importance of this kind of thinking.

Higher Education

I wonder when higher education institutions will finally realize the importance of linking performance to the bottom line. And if/when they do, how will they deal with the potential conflict between profitability and academic freedom? It's clear that some for-profit organizations are focused on financial gains and have had some success at being very profitable, but at what point does this compromise the mission of the organization? I have a close friend who works at a for-profit university, and he describes increasing pressure to allow students to pass poorly designed courses, even when he knows the students do not have the skills needed to move on. Even if the institution reaches its goal of profit, it abandons its goal to provide quality education to its students, thereby failing to fulfill its mission as an institution of higher education.

 What About the World Level?

The other issue is the World (society) level, which is now being considered by many in the HPT field. We live in a society in which there are limited natural resources, and without responsible use of these resources, we might find them damaged or totally depleted. The famous Dodo bird provided very valuable feathers centuries ago, but those in the feather trade did not consider the World level when following their business plan and eventually eliminated all Dodo birds, thereby halting their own performance. What if this same thing happened with ore, water, or fuel? The broader mission of an organization should (in my opinion) consider the impact of its work on a global scale, including the impact on natural resources and on things like international relationships and society in general.

*     *     *

So, there are my thoughts and reflections from day 1 of the Principles and Practices of HPT workshop. I will continue to write as I progress through the workshop and the conference. (Here are my thoughts on day 2).

Tuesday, April 17, 2012

What is Human Performance Technology?

#ISPI2012 
I am on my way to ISPI's Performance Improvement Conference 2012. What will I be learning at the upcoming Principles and Practices of Human Performance Technology workshop? The ISPI website defines Human Performance Technology as follows:
"Human Performance Technology (HPT), a systematic approach to improving productivity and competence, uses a set of methods and procedures -- and a strategy for solving problems -- for realizing opportunities related to the performance of people. More specific, it is a process of selection, analysis, design, development, implementation, and evaluation of programs to most cost-effectively influence human behavior and accomplishment. It is a systematic combination of three fundamental processes: performance analysis, cause analysis, and intervention selection, and can be applied to individuals, small groups, and large organizations."

I have a pretty solid grasp of the basic steps and phases described in this definition. What will likely be new to me are the specific methods and procedures used. I have spent many years working to improve my own performance in my personal and professional life, and I am also very familiar with systematic processes for achieving results. So I think I have a good foundation for what I am about to learn.

The Bond Place Hotel

This post is part of a multi-part series on the 2012 Performance Improvement Conference.
#ISPI2012

Well, I just arrived at the Bond Place Hotel in Toronto, Canada. It's not the official conference hotel, but I am close by - exactly 1 kilometer from the Sheraton Centre Hotel. I think I am also fairly close to Lake Ontario, probably within a mile's walk. I'll have to walk down and check it out. Here is a map of where I am, including walking directions to the conference hotel:



This hotel is pretty groovy - the decor is modern, and there is a nice view of part of downtown Toronto. Here are some pictures of my hotel room:




I'd better get to bed. I'll be getting up early tomorrow for the first day of the Principles and Practices of HPT workshop. I'm excited to see what I learn. (Oh, by the way, none of my flight worries actually happened. My bag was not stolen, I did not get queasy from flying, and I didn't sit next to the "stinky guy." Just thought I would let you know...)

Traveling to Toronto for ISPI Conference

This post is part of a multi-part series on the 2012 Performance Improvement Conference.
#ISPI2012

As promised in my previous post, I will be blogging about my experience at the ISPI Conference, so the next several posts will follow a slightly different format than my normal approach.

Right now I am at the airport in Columbus, Ohio, waiting for my flight to Chicago and then on to Toronto for the conference. As recommended, I am here two hours early, so I get to spend my time in an uncomfortable seat among people I don't know. I am looking forward to the conference, though I am usually less than comfortable on the flight. I find that I subconsciously worry about three general things when flying:
  1. Someone might steal my bag. I know this is really a pretty irrational fear, but I once had a really cool backpack stolen off a bus when I lived in Guatemala, and since then I have become what I call "cleptophobic."
  2. Getting sick on the flight. Not that I will actually puke or anything, but I always feel a little queasy. One time I was sitting on a flight next to a youth whose quartet had just won an award at an international barbershop quartet competition (not kidding). I felt awful the whole trip as he chattered continuously about the award, singing in a quartet, creating "overtones," the importance of the bass voice in a quartet, and his quartet's ambitions for taking the entire competition next year.
  3. Sitting next to the "stinky guy." I have been lucky with this one and usually sit with clean-smelling passengers, although once I did sit next to the "stinky guy" on a flight from California. The one thing that does worry me is that, since I never sit by the stinky guy, I might actually be the stinky guy...
So, we will see how the flights go. I am definitely excited for the conference and for the opportunity to expand my knowledge, skills, and abilities. And who knows, maybe I'll run into a member of an internationally acclaimed barbershop quartet....

The Upcoming ISPI 2012 Conference

This post is part of a multi-part series on the 2012 Performance Improvement Conference.
#ISPI2012

This week I am attending the 2012 Performance Improvement Conference, which is presented by the International Society for Performance Improvement. Since I am fairly new to the field of performance improvement (though I have some practical experience and related knowledge), I plan to post every day or two about what I learn at this conference, including my reflections on why it is important and how it might be applied.

As is apparent from my previous blog posts, my education, my research, and my career have been focused on instructional design strategies and effectiveness. However, as I have adapted to my new role here at Franklin, it has become clear that I need to build my knowledge and capacity in the important field of Performance Technology. To this end, my administration has graciously funded my attendance at this conference, including a three-day workshop entitled Principles and Practices of Performance Improvement.

I look forward to the learning experience and hope to be able to share helpful insights as the week progresses.

Friday, April 13, 2012

Creating Results in Life

I've been thinking a lot about achieving success and happiness in life. I believe that there are some fundamental processes that shape our reality, and we have the power to use those processes to create happiness, success, and peace. Here is how I believe our reality and the results we get in life are shaped:


In life, we receive endless information input - what we sense with our bodies. These inputs have a massive influence on our thoughts and on how we perceive reality. This perception of reality shapes our attitudes, or how we feel about and relate to reality. Our attitudes have a huge influence on our actions - what we do in the world - which ultimately drive our results and shape our reality. This reality then becomes input, which perpetuates the cycle.

Using The Cycle for Success and Happiness
We have the power to take advantage of this cycle and use it to achieve goals of happiness, success, and peace. Each of these elements - inputs, thoughts, attitudes, and actions - can be used to our advantage:
  • Inputs - We can choose to place information that is Positive and True into our minds. It should be positive, providing motivation to act in a proactive manner. And it should be true, or based on what is real, to be useful. Tips include:
    • Do: Read motivational, empowering books by authors such as Stephen Covey, Zig Ziglar, and Anthony Robbins. Read sacred literature, scripture. Surround yourself with positive people who have similar goals and ambitions in life. Listen to music and spoken words that are positive and uplifting.
    • Don't: Spend time with people who are negative or who complain excessively. Watch mindless television or waste time excessively on the internet.
  • Thoughts - While our thoughts are heavily influenced by Inputs, we can still choose what we think about. We are constantly "talking to ourselves" in our minds. Our minds are always thinking, and we can choose both what we think about and how we think about it. Tips for creating empowering thoughts include:
    • Do: Choose thoughts and beliefs that empower you, that bring success and happiness to you and to others. These include thoughts that are positive, true, and hopeful. 
    • Don't: Dwell on negative things. Acknowledge that they are there, to be sure, but live proactively, focusing your thoughts on the things you can control and on things that will bring positive results.
  • Attitudes - Although attitudes are heavily influenced by our thoughts, we have the capacity to consciously choose a positive, self-sustaining attitude. By controlling our attitudes through consistent effort (and trial and error), we will find that our actions are more proactive and bring greater results.
    • Do: Have a positive attitude about life and about yourself. This means exposing ourselves to people and media that infuse a healthy, positive attitude into our minds.
    • Don't: Focus on the negative or on things that you cannot control. Avoid watching too much news or being exposed to people and media that are overtly negative.
  • Disciplined Action - Our actions are the most vital component of success and happiness in life. We must align our actions and habits with the results we want, and we have the power to control them. Habits are the vehicle through which success is attained.
    • Do: Focus your actions on those that will bring you positive, lasting results in your life. This usually entails work of some kind.
    • Don't: Waste time or energy on habits that limit personal success, growth and happiness. This can include addictive habits and gross time-wasters such as mindless television or internet.
By taking control of each of these components in our lives, we have the power to create the Results we desire - success, happiness, peace, or whatever outcome we seek.

This view certainly isn't absolute, and outside influences (input from the environment and the people around us) on our thoughts, attitudes, and actions are massive. But viewing the world according to the process described above focuses our energy on the things that we can control. It enables the individual to take control of personal habits and begin moving toward desired goals.

What do you think? Any insights? Does this seem right? I appreciate all critiques and feedback.

Friday, April 6, 2012

Putting Wikipedia in its Place: Problems and Pitfalls

I was recently introduced to a graphic (shown below) that apparently summarizes some of the latest information related to Wikipedia, the massive online community-created encyclopedia. In the knowledge society, information and knowledge are paramount, and having access to current, accurate, useful knowledge is absolutely vital. So, how important is a site like Wikipedia in today's society? Here are my thoughts.

Information is Not Instruction
Wikipedia provides its readers with information, but as the great learning researcher Dr. David Merrill often says, "information is not instruction." As a summarizing source, Wikipedia provides readers with surface-level, basic knowledge. This can be seen as a lower level of learning - we can remember what we learn, and perhaps understand it, but we may not be able to apply, analyze, or evaluate what we learn from a Wikipedia page. There is a fair amount of educational research indicating that to achieve these higher levels of learning, people need specific instructional experiences that will help them gain more meaningful learning.


Wikipedia Does Not Encourage Critical Thinking
As members of society, we have to be very careful about what we believe - we must be willing to check the facts, go deeper, look at multiple perspectives, and draw conclusions. Accepting things in Wikipedia as fact without thinking is dangerous, particularly because (in my experience) the information is often inaccurate.

Wikipedia is Not Peer-Reviewed  
Clearly there is some level of "peer review" in Wikipedia, but with this review process, one is left to ask who the peer is. (I discuss this further in point 1 below.)
Is Wikipedia Bad?
No, Wikipedia is very good, a tool that I sometimes use. Wikipedia is the premier example of user-created content, a very powerful democratic and pragmatic movement. My thoughts in this post are not intended to slam the site, just to put it in its place. Remember, "information is not instruction," and if we want members of our society to really understand and be able to use knowledge in meaningful ways, we must provide experiences (in the form of well-designed instruction) that will help them gain the necessary knowledge and skills.

My Thoughts on the Graphic
As I mentioned above, I was directed to a graphic that summarizes some information about Wikipedia. It is an attractive graphic, but I find it misleading in some ways - not that the information is necessarily inaccurate (though it doesn't provide sources for what it presents), but the way it is organized might imply certain things that are not accurate. I will share the graphic below and then share my critique.

Wikipedia
Via: Open-Site.org
 Here are some observations and critiques of the image above:
  1. Comparing college textbooks and Wikipedia implies that Wikipedia is almost as good, but the comparison is on accuracy and not on the actual content. A textbook is likely much more effective at teaching than Wikipedia because of the instructional strategies that can be built into a textbook. Also, which college texts and Wikipedia articles are being compared? Which subjects? What level of education (Associate, Bachelor, Master, Doctorate)? This information is not provided, and I am confident that there was no comparison of all Wikipedia articles and all college textbooks. I know for a fact that the Wikipedia articles that summarize key knowledge in the field of instructional design (you can review it here) and instructional technology (you can view it here) have fundamental inaccuracies, and when I attempt to change the pages, the changes are sometimes rejected by someone who is not a scholar in the field. This is a huge problem.
  2. Comparing Wikipedia and libraries is also a poor comparison. Because Wikipedia is in a sort of encyclopedia format, it is meant to be a summary of knowledge. This kind of summary can be useful, but without deeper resources like those found in a library, it remains surface-level information that is likely not deep enough to take you into a field at any meaningful level.
  3. There is probably not a lot of evidence that Wikipedia alone has "stopped the presses" for Britannica. There are countless other sites on the internet providing information.
  4. Comparing the number of visits to libraries and to Wikipedia is not a valid comparison. When I go to the library, I skim through many different books and interact with the online database for several minutes, learning a great deal from that interaction alone. I usually check out several books and read them for hours, obtaining deep knowledge about the subject I am studying - all resulting from just one visit to the library. A visit to Wikipedia, on the other hand, is probably tracked as simply a user loading a page, which could last three seconds. This is a very poor comparison that does not measure the right thing (time spent on learning or amount of knowledge gained).
  5. The graphic provides no references. Everything is presented as fact, but we are not told where the facts come from. This is a big problem - if I am going to put stock in the stats and research results being cited, I need to be able to review the studies and find out what they actually mean. If a site visit to Wikipedia is tracked as a click on a page, that is not very robust tracking. Every stat and every piece of research has more to it than just the numbers, and not being able to check the references and verify what actually happened in a study is a serious weakness.
I think my reactions above reveal my worry that people think they can get all the information they need from Wikipedia and a quick internet search. This is absolutely not true. I personally use Wikipedia to get a basic introduction to a topic, but to really get meaningful content, I find that I must locate and study resources that are more robust, more credible, and that provide much deeper knowledge than a simple Wikipedia page. Without such things as books and libraries, scholarly journals and professional associations, we would live in a world of surface-level knowledge and surface-level thinkers. Wikipedia is useful for some basic knowledge-gathering, but to gain meaningful knowledge the individual must be willing to (1) not accept the article as absolute truth, (2) check the facts in the article for accuracy, and (3) research more credible, robust sources for deeper knowledge.

Wikipedia is certainly a meaningful phenomenon, but it needs to be put in its place.

The Problem of Perfection in Instructional Design

Many instructional designers want to create the most effective, engaging product possible. I know I do, but deep down somewhere in my psyche, I believe that I subconsciously want to create a "perfect" unit of instruction: something that will help each and every student learn, something that manages cognitive load effectively, and something that helps students acquire knowledge and skills that transfer fully to real-world situations. Unfortunately, I have seen some instructional designers become obsessive about the quality of their work, to the point of losing sleep and generally becoming unhappy and unpleasant people.

I have realized over several years of teaching, training and designing instruction that instructional perfection is not possible. I may be burned at the scholarly stake for saying this, but here are a few considerations:
  1. We design under constraints. We all have limitations of time, resources, money, expertise, tools, etc. Given these constraints, we will likely never have everything we need to achieve instructional perfection.
  2. All learners are different. Even if we design and develop a fabulous piece of instruction, our learners all have different backgrounds, different experiences, and different skill-levels. This means that what works "perfectly" for one learner likely won't for another.
  3. Perfection is a mirage. The idea of perfection is not necessarily a reality - even when we meet 100% of the criteria for an instructional product, there is always more that could be done. Perfection is, from a design perspective, unattainable.
Now, having made this (perhaps heretical) assertion, I believe that as designers we should strive for excellence in our work but must learn to be satisfied with what is possible given our constraints. The classic quote attributed to Theodore Roosevelt rings true: "Do what you can, with what you have, where you are."

In striving for excellence, it should be noted that a great deal of joy can be derived from continually refining and improving a piece of instruction. As we develop our curriculum in the IDPT Master's Program at Franklin University, we formatively evaluate each course and determine how we can refine and improve it. This means restructuring sequences and assignments, eliminating or clarifying confusing materials, and providing worked examples, supportive multimedia, templates, and tips to guide students toward learning success.

Working to improve a course should be seen as a pleasure, done for the love of instruction and not as the result of some compulsion to work for instructional perfection. Instructional design is a creative endeavor - the creative application of research-based principles toward the goal of helping learners learn.

Let us move in our design toward excellence, avoid the mirage of instructional perfection, and find satisfaction in the principles and practices of instructional design.

Monday, April 2, 2012

Article: Applying Merrill’s First Principles of Instruction: Practical Methods Based on a Review of the Literature

Pre-publication draft, reference information at end of post.
To access PDF copies of this and other articles, visit my Academia.edu page.

 Abstract:
Research has shown that when Merrill’s First Principles of Instruction are used as part of an instructional strategy, student learning increases. Several articles describe these principles, including specific methods for implementing the theory. However, because teachers and designers often have little time to design instruction, it can be difficult to implement a comprehensive theory like First Principles of Instruction. Therefore, this article provides basic methods for applying First Principles, including several examples from the literature. It also provides a basic template for organizing a module or lesson plan using First Principles of Instruction.
Keywords: instruction; instructional theory; First Principles of Instruction; applying theory; instructional design

Introduction
The issue of transferring theory into teaching practice is often discussed in the field of education (De Corte, 2000; Defazio, 2006; Randi & Corno, 2007). For example, a recent study found that most courses in higher education, even those that are award-winning, do not effectively use First Principles of Instruction in their teaching strategy (Cropper et al., 2009). Without sound theory in educational practice, instruction can fall short of its power to increase student learning.
Research has shown that the use of First Principles of Instruction in education improves student learning and satisfaction (Frick et al., 2007; Merrill, 2006; Thomson, 2002). However, although several articles describe First Principles of Instruction (Merrill, 2002, 2006), including methods for implementing and evaluating these principles (Merrill, 2009), experience has shown that it can be difficult to translate this theory into educational practice. This article therefore describes basic ways for instructional designers and educators to begin using Merrill’s First Principles of Instruction, including a template for designing instruction. Specific methods for applying each principle are also provided.
First Principles of Instruction
A principle describes a relationship that is always true under appropriate conditions, regardless of program or practice (Merrill, 2002). Principles are different from methods, which are “ways to facilitate learning” (Reigeluth, 1999). For methods to effectively bring about student learning, they must be based on principles that describe a true relationship.
Principles are often included in instructional theory, which “offers explicit guidance on how to better help people learn and develop” (Reigeluth, 1999). Merrill emphasizes that instructional design theory should address what actions to take and how and why we should take those particular actions (Merrill & Twitchell, 1994). Instructional theories describe how to teach effectively. They identify methods of instruction, which can be broken into detailed steps, and the situations in which those steps should be taken (Driscoll, 2005, p. 352; Reigeluth, 1999).
In presenting First Principles of Instruction, Merrill (2002, 2006) provides very powerful instructional methods based on five foundational principles of instruction. He writes that learning is promoted when: 
  • Instruction is in the context of real-world problems or tasks and students are engaged in solving a sequence of increasingly complex problems or tasks. 
  • Students activate relevant cognitive structures and recall or acquire a structure for organizing new knowledge, which structure is used for instruction, coaching, and reflection activities. 
  • Students observe a demonstration of skills to be learned that is consistent with the content type, guides students to relate general information to specific instances, and uses media that is relevant to the content and appropriately used. 
  • Students engage in application of new knowledge that is consistent with the type of knowledge being taught, receive intrinsic or corrective feedback, and receive coaching that is gradually withdrawn for each subsequent problem or task. 
  • Students integrate their new knowledge or skill by reflecting on, discussing, or defending the new knowledge or skill, and exploring personal ways to use it and displaying it publicly.
These five principles can be converted into four phases of instruction, occurring in the context of a real-world problem or task. See Figure 1. This four-phase process guides instructional designers and educators to bundle their teaching and learning activities in a way that improves student learning and that makes it easy to incorporate new methods within that process. The process begins with activation of students’ prior learning, followed by demonstration of new knowledge, student application of knowledge, and student integration of knowledge, all based on the real-world problem or task.

Figure 1. Merrill’s First Principles of Instruction.

Several research articles provide significant empirical and anecdotal support for First Principles of Instruction (Frick et al., 2007; Merrill, 2006; Thomson, 2002). Thomson (2002) showed how these principles were used to teach a potentially drab spreadsheet course. By using real-world scenarios and following the four-phase cycle of instruction, students achieved a 30% performance improvement over the traditional instruction, including a 41% improvement in time performance (p. 8). In another study, Frick et al. (2007) found a strong correlation between the use of Merrill’s First Principles and student satisfaction and perceived and actual performance in the class. In addition, several authors show how these principles have been applied in educational and corporate settings (Collis & Margaryan, 2005; Gardner et al., 2008; Gardner & Jeon, in press; Mendenhall et al., 2006).
While the results of these studies and cases are impressive, it can be difficult to apply this theory. To help transfer these principles from theory into educational practice, this article will provide prescriptions based on a review of several instructional theories and case descriptions.

Applying First Principles of Instruction
This section describes instructional methods and strategies based on First Principles of Instruction. Several questions are posed, each followed by practical answers for applying First Principles of Instruction. Figure 2 is a worksheet for planning how to use these principles in a lesson or unit.
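
As a rough illustration of the kind of information such a planning worksheet captures, the sketch below organizes a module plan around the real-world task and the four phases. This is my own hypothetical example, not a reproduction of the worksheet in Figure 2:

```python
# A hypothetical sketch of a First Principles module-planning template
# (illustrative only; the article's actual worksheet appears in Figure 2).
from dataclasses import dataclass, field
from typing import List

@dataclass
class FirstPrinciplesModulePlan:
    real_world_task: str                                    # authentic problem or task anchoring the module
    activation: List[str] = field(default_factory=list)     # recall or structure relevant prior knowledge
    demonstration: List[str] = field(default_factory=list)  # show and model the new knowledge or skill
    application: List[str] = field(default_factory=list)    # practice with coaching and feedback
    integration: List[str] = field(default_factory=list)    # reflect, discuss, defend, and use publicly

plan = FirstPrinciplesModulePlan(
    real_world_task="Build a monthly budget spreadsheet for a small business",
    activation=["Discuss students' own budgeting experiences"],
    demonstration=["Model building a simple budget while thinking aloud"],
    application=["Students build increasingly complex budgets with coaching"],
    integration=["Students present their spreadsheets and defend their design choices"],
)
print(plan.real_world_task)
```

Filling in a structure like this before development begins helps ensure that every phase is anchored to the same real-world task.
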
How can I base my instruction on real-world problems or tasks?
Real-world experience is the bedrock of all learning (Dale, 1996). The goal of the instruction should be to have students solve problems (Jonassen, 1999), so have students do performances that matter in the real world (Gardner, 1999). Make sure the problems are authentic (Nelson, 1999), useful (Dale, 1996), meaningful (Mayer, 1999), and intrinsically motivating to the student (Schank et al., 1999). The challenges should be easy at first (Burton et al., 1984) but become increasingly difficult as you move through the materials (Gardner et al., 2008; Perkins & Unger, 1999; Schwartz et al., 1999). Make sure your problems and tasks safely allow the practice of skills and subskills (Burton et al., 1984), and try to make them physical, tangible activities (Collis & Margaryan, 2005).
How do I activate my students’ prior knowledge? 
  • Have your students relate or recall what they already know about the subject (Gardner et al., 2008; McCarthy, 1996). Try to choose subjects the students will relate to (Schank et al., 1999) and build on your students’ relevant prior knowledge (Collis & Margaryan, 2005).
  • Allow your students to “look ahead” and preview what they will learn (Schwartz et al., 1999). Let them see the problem(s) to be solved and the subjects they will learn (Mendenhall et al., 2006). Also, show them the process they will go through to solve these problems (Nelson, 1999). Try to make the structure of the information and knowledge obvious by using a model to organize instructional materials (Darabi, 2002).
  • Give your students a foundation on which to build new knowledge. Give them a good reason for engaging in the problem (Jonassen, 1999). Tell stories, share statistics, and provide hands-on activities (Gardner, 1999) upon which to build new knowledge. Discuss the fundamentals of the topic or offer a simple analogy as a foundation for the new material (Mayer, 1999).
How can I effectively demonstrate new knowledge to the students? 
  • Your students will learn a lot by watching you work, so model performance of the task as you teach (Collins et al., 1991). Teach and model the entire task (Gardner et al., 2008; Mendenhall et al., 2006). Give varied examples of the topic (Gardner, 1999), and related cases and information sources (Jonassen, 1999), including multiple expert perspectives (Schank et al., 1999; Schwartz et al., 1999). This will broaden your students’ understanding.
  • Be sure to make the structure of the knowledge clear (Mayer, 1999) by following and referring to the organizing model for the knowledge. Be sure to have students relate new knowledge to old knowledge (McCarthy, 1996) to promote encoding of new knowledge.
  • Encourage your students to ask questions during the demonstration (Gardner et al., 2008) and give them instruction as they request or need it (Nelson, 1999).
  • To broaden student understanding, use analogies and metaphors (Gardner, 1999). Focus your students’ attention by asking questions (Nelson, 1999) and by summarizing your instruction (Mayer, 1999). Make your thinking visible to the student (Collins et al., 1991).
How do I have students apply this new knowledge effectively? 
  • Have your students use what they have been taught (McCarthy, 1996) and spend significant time on practice (Schank et al., 1999). Their activities should include problem solving, and you should have them apply new skills in a realistic setting as soon as possible (Keller, 1987). Students should solve as much of the problem or task as possible at each stage (Mendenhall et al., 2006). Problems and tasks should be done in the actual environment using real workplace situations and resources (Collis & Margaryan, 2005).
  • Help your students be cognitively active (Mayer, 1999) by having them solve problems. You can even encourage them to solve problems as a group (Nelson, 1999). Have them recognize and articulate the elements that are common across the differing problems and tasks you have them solve (Collins et al., 1991).
  • Give your students coaching and feedback (Burton et al., 1984; Collins et al., 1991; Nelson, 1999; Perkins & Unger, 1999). Feedback should occur frequently (Darabi, 2002; Perkins & Unger, 1999). Your feedback may need to be visually demonstrated so that students can see their error (Burton et al., 1984) and should be based on clear criteria (Perkins & Unger, 1999). Help your students use your feedback to bring their performance closer to the level of an expert (Collins et al., 1991) and to plan future performance (Perkins & Unger, 1999). When you praise successful work, be sure to attribute the students’ success to their effort, not luck or ease (Keller, 1987). Provide more guidance initially, reducing it as expertise is developed (Gardner et al., 2008).
How do I encourage students to integrate this new knowledge into their everyday life? 
  • Have your students reflect on what they learn (Collis & Margaryan, 2005; Gardner et al., 2008; Jonassen, 1999; Nelson, 1999; Perkins & Unger, 1999; Schwartz et al., 1999), describing their experiences and challenges in applying what is taught (Darabi, 2002). Have students relate their new knowledge to future goals (Keller, 1987). Also have students leave tips and ideas for future students (Schwartz et al., 1999).
  • Have your students take part in a culminating performance for an audience that includes students and parents from outside the class (Perkins & Unger, 1999). Afterwards, show the students a recording of their performance (Jonassen, 1999).
Figure 2. A worksheet for planning how to use First Principles of Instruction in a lesson or unit. For each principle, answer the guiding questions and record your instructional plan.

Subject of Lesson or Module:

Problem-Centered
  • What real-world, relevant problem or task will the learner be able to perform when we finish this lesson or unit?

Activation
  • How will you activate the learner’s prior knowledge about this subject and prepare them to learn?
  • How will your students preview what they will learn?

Demonstration
  • How will you show the learner how to perform the real-world problem or task?
  • What various examples of the problem or task will you give your students?

Application
  • How will your learner practice solving the problem or task?
  • How will you give them feedback on their performance?

Integration
  • How will you encourage your learner to integrate this new knowledge and skill into their life?
  • How will they reflect on, discuss, or debate this new knowledge?

Discussion and Summary
The diverse ways in which First Principles of Instruction are used in these theories and cases are refreshing, and one can recognize the abundant theoretical and anecdotal support for the principles in the articles cited. By understanding the purpose of each principle and using it in a way that matches design style and personal preference, instructional designers can apply these principles in natural, meaningful ways.
This article is designed to provide teachers and instructional designers with ideas for creating effective instruction. The goal is to provide a framework for organizing teaching and learning activities in a way that is easy to implement and beneficial to students. Designers are encouraged to use the worksheet provided in Figure 2 to plan how to apply these principles. By doing so, one can expect an increase in student learning and satisfaction.
It is often difficult to transfer theory into instructional design practice. This description of how several theorists and designers use Merrill’s First Principles of Instruction should generate ideas for applying the theory in design practice. In addition, the template found in Figure 2 provides structured, basic methods for application.
Merrill has synthesized and distilled First Principles of Instruction through a lifetime of research and practice. Using these principles increases the efficiency and effectiveness of instruction. Most importantly, instructional designers and educators who use these principles will increase student learning and satisfaction by engaging students in solving meaningful problems and tasks.


 
References
Burton, R. R., Brown, J. S., & Fischer, G. (1984). Skiing as a model of instruction. In B. Rogoff & J. Lave (Eds.), Everyday cognition: Its development in social context (pp. 139-150). Cambridge, MA, and London: Harvard University Press.
Collins, A., Brown, J. S., & Holum, A. (1991). Cognitive apprenticeship: Making thinking visible. American Educator, 15(3), 6-11.
Collis, B., & Margaryan, A. (2005). Merrill plus: Blending corporate strategy and instructional design. Educational Technology, 45(3), 54-58.
Cropper, M., Bentley, J., & Schroder, K. (2009). How well do high-quality online courses employ Merrill's first principles of instruction? In M. Orey, V. J. McClendon, & R. Branch (Eds.), Educational media and technology yearbook (Vol. 34, pp. 121-140). New York: Springer Publishing Company.
Dale, E. (1996). The "cone of experience." In D. P. Ely & T. Plomp (Eds.), Classic writings on instructional technology (Vol. 1, pp. 169-180). Englewood, CO: Libraries Unlimited.
Darabi, A. (2002). Teaching program evaluation: Using a systems approach. American Journal of Evaluation, 23(2), 219.
De Corte, E. (2000). Marrying theory building and the improvement of school practice: A permanent challenge for instructional psychology. Learning and Instruction, 10(3), 249-266.
Defazio, J. (2006). Theory into practice: A bridge too far? AACE Journal, 14(3), 221-233.
Driscoll, M. P. (2005). Psychology of learning for instruction (3rd ed.). Needham Heights, MA: Pearson Education, Inc.
Frick, T., Chadha, R., Wang, Y., Watson, C., & Green, P. (2007, December 11). College student perceptions of teaching and learning quality. Educational Technology Research and Development. Retrieved June 19, 2009, from http://www.springerlink.com/content/722jm250401j7l77/fulltext.pdf.
Gardner, H. E. (1999). Multiple approaches to understanding. In C. M. Reigeluth (Ed.), Instructional-design theories and models: A new paradigm of instructional theory (Vol. 2, pp. 69-89). Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Gardner, J., Bentley, J., & Cropper, M. (2008). Evaluating online course quality: Teaching evaluation using first principles of instruction. Midwest Journal of Educational Communication and Technology, 2(2), 1-7.
Gardner, J., & Jeon, T. K. (in press). Creating task-centered instruction for web-based instruction: Obstacles and solutions. Journal of Educational Technology Systems.
Glesne, C. (2006). Becoming qualitative researchers: An introduction (3rd ed.). Boston, MA: Pearson Education, Inc.
Jonassen, D. (1999). Designing constructivist learning environments. In C. M. Reigeluth (Ed.), Instructional-design theories and models: A new paradigm of instructional theory (Vol. 2, pp. 215-239). Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Keller, J. M. (1987). Development and use of the ARCS model of instructional design. Journal of Instructional Development, 10(3), 2-10.
Mayer, R. H. (1999). Designing instruction for constructivist learning. In C. M. Reigeluth (Ed.), Instructional-design theories and models: A new paradigm of instructional theory (Vol. 2, pp. 141-159). Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
McCarthy, B. (1996). About learning. Barrington, IL: Excell, Inc.
Mendenhall, A., Buhanan, C. W., Suhaka, M., Mills, G., Gibson, G. V., & Merrill, M. D. (2006). A task-centered approach to entrepreneurship. TechTrends, 50(4), 84-89.
Merrill, M. D. (2002). First principles of instruction. Educational Technology Research and Development, 50(3), 43-59.
Merrill, M. D. (2006). First principles of instruction: A synthesis. In R. A. Reiser & J. V. Dempsey (Eds.), Trends and issues in instructional design and technology (2nd ed., Vol. 2). Upper Saddle River, NJ: Merrill/Prentice-Hall, Inc.
Merrill, M. D. (2009). Finding e3 (effective, efficient, and engaging) instruction. Educational Technology, 49(3), 15-26.
Merrill, M. D. & Twitchell, D. (1994). Instructional design theory. Englewood Cliffs, NJ: Educational Technology Publications.
Nelson, L. M. (1999). Collaborative problem solving. In C. M. Reigeluth (Ed.), Instructional-design theories and models: A new paradigm of instructional theory (Vol. 2, pp. 241-267). Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Perkins, D. N., & Unger, C. (1999). Teaching and learning for understanding. In C. M. Reigeluth (Ed.), Instructional-design theories and models: A new paradigm of instructional theory (Vol. 2, pp. 91-114). Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Randi, J., & Corno, L. (2007). Theory into practice: A matter of transfer. Theory into Practice, 46(4), 334-342.
Reigeluth, C. M. (1999). What is instructional-design theory and how is it changing? In C. M. Reigeluth (Ed.), Instructional-design theories and models: A new paradigm of instructional theory (Vol. 2, pp. 5-29). Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Schank, R. C., Berman, T. R., & Macpherson, K. A. (1999). Learning by doing. In C. M. Reigeluth (Ed.), Instructional-design theories and models: A new paradigm of instructional theory (Vol. 2, pp. 169-181). Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Schwartz, D. L., Lin, X., Brophy, S., & Bransford, J. D. (1999). Toward the development of flexibly adaptive instructional designs. In C. M. Reigeluth (Ed.), Instructional-design theories and models: A new paradigm of instructional theory (Vol. 2, pp. 183-213). Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Thomson. (2002). Thomson job impact study: The next generation of learning [electronic version]. Retrieved June 13, 2009, from http://www.delmarlearning.com/resources/job_impact_study_whitepaper.pdf.


This is a pre-publication draft of an article published in Educational Technology Magazine in 2010. Please feel free to refer to and use these materials; just be sure to use the reference below when citing the publication:

Gardner, J. (2010). Applying Merrill’s First Principles of Instruction: Practical methods based on a review of the literature. Educational Technology Magazine, 50(2), 20-25.