Simulation games have been used to teach management for the last 50 years, and by 1998 were used in 97.5% of all AACSB-accredited business programs (Faria, 1998). Since then, there has been a strong trend to develop new games and retool older games to take advantage of the World Wide Web as a way for students to interact with sophisticated games. Now most AACSB-accredited business programs are using online games. Although there are online games for every business discipline, operations is particularly suited to games because operations is concerned with the management of dynamic systems and processes. Such systems arise in different contexts, such as inventory maintenance, quality control, service delivery, and production. However, they all run more or less continuously, requiring monitoring and remediation, where future events depend on past interventions as well as exogenous events. Games are better suited to capturing the dynamic nature of systems management than, for example, static case studies or problem sets.
Over the last decade, I have been an active participant in the trend towards greater use of online games: as a professor in a traditional MBA program using online games, as a dean of an online business school, and as a designer, developer, and marketer of online games. This guest editorial presents a taxonomy for online games that I developed and use to evaluate games and plan their use in courses. Table 1 summarizes the taxonomy. The taxonomy categorizes games by pedagogical objective instead of by structure, as is done, for example, by Aldrich (2005). This editorial also references several examples of commercially available online games used to teach operations management topics.
Table 1. A Taxonomy of Online Games
Because most business students lack work experience in operations, they tend to lack a framework for the concepts covered in an Operations Management course. For example, when discussing queueing, students may imagine a line that stays roughly the same length. That is, many new students tend to think only in terms of means instead of means and variances. Thinking in terms of systems instead of isolated resources is a challenge when learning about managing processing networks. Inventory control and supply chain management concepts may be difficult to appreciate without having experienced inventory management in the presence of random demand.
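The gap between thinking in means and experiencing variance can be made concrete with a short simulation. The sketch below is my own illustration, not drawn from any particular game, and the arrival and service rates are hypothetical; it simulates a single-server queue with random (exponential) interarrival and service times and shows that even when the long-run average wait is modest, individual waits swing widely:

```python
import random

random.seed(42)

def simulate_mm1_queue(arrival_rate, service_rate, n_customers):
    """Simulate an M/M/1 queue; return each customer's wait in queue."""
    t_arrival = 0.0
    t_free = 0.0  # time at which the server next becomes free
    waits = []
    for _ in range(n_customers):
        t_arrival += random.expovariate(arrival_rate)
        wait = max(0.0, t_free - t_arrival)   # wait only if server is busy
        waits.append(wait)
        t_free = t_arrival + wait + random.expovariate(service_rate)
    return waits

waits = simulate_mm1_queue(arrival_rate=0.9, service_rate=1.0, n_customers=100_000)
mean_wait = sum(waits) / len(waits)
print(f"mean wait: {mean_wait:.1f}, max wait: {max(waits):.1f}")
# At 90% utilization the theoretical mean wait is rho/(mu - lambda) = 9,
# yet individual waits range from zero to several times the mean.
```

A student who thinks only in means would predict a line of constant length; the spread between the mean and the maximum wait is exactly the variance-driven behavior the text describes.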
Insight games allow students to quickly gain a conceptual background or key insights that can then serve as context for subsequent lectures or class discussion. These games typically take several minutes and are often played in class. Most of these games are played without the use of computers. For example, the original Beer Game (Sterman, 1992) developed at MIT allows students to experience the bullwhip effect and how misperceptions of feedback exacerbate the effect. The classic version of the popular game is played with pads of paper to place inventory and production orders among team members, coins representing inventory, and cards representing demand. Students track their own orders, inventory levels, and scores.
Even though most Insight games can be played without computers, online Insight games have begun to emerge (Jacobs, 2007a). For example, there are now online versions of the Beer Game that are either free (Jacobs, 2007b; MIT Forum, 2005; MASystem, 2007) or priced similarly to published case studies (Darden Publishing, 2007; Responsive Learning Technologies, 2007a). What is the motivation for an online Beer Game given the pedagogical success of the physical Beer Game, including the excitement of physically placing orders and moving inventory? Possible answers illustrate more general motivations for online Insight games:
- Online games can require less setup than their offline counterparts. It can take three people as much as an hour to set up Beer Game boards for a class of 60 students. The setup time for an online Beer Game is almost zero.
- Well-designed online games make it easier for an instructor to review and present results from multiple games. Many online versions of the Beer Game allow the instructor to view the results of the games as they are played and then display them to students during subsequent lectures and discussion. When students see the results of multiple games presented next to each other, they can begin to draw more general lessons from the results.
- Online games can be used to reduce students' coordination requirements. Coordination activities can distract students from the more important activities that support the game's learning objectives. For example, one of the challenges with the physical Beer Game is simply keeping score. I have administered the physical Beer Game more than a dozen times to more than 1000 students and I cannot remember ever seeing a score sheet that turned out to be completely accurate when entered and checked by a spreadsheet program. Students make mistakes during the game and sometimes never quite get the rules right. Online versions of the Beer Game can practically eliminate such mistakes and speed up games by enforcing the rules, coordinating dynamics and keeping score.
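For readers unfamiliar with the dynamic the Beer Game teaches, the order-variance amplification can be sketched in a few lines. This is my own toy model, not the Beer Game's actual rules (which include shipping and information delays): each stage simply ships what it is asked for and reorders to chase a fixed inventory target, a naive policy that is enough to amplify a small demand step into large upstream swings:

```python
def simulate_chain(demand, n_stages=4, target_inventory=12):
    """Toy multi-echelon chain. Each stage ships what it is asked for
    (inventory may go negative) and reorders to chase a fixed inventory
    target; upstream delivery is assumed instantaneous, unlike the real game."""
    inventory = [target_inventory] * n_stages
    orders_placed = [[] for _ in range(n_stages)]
    for d in demand:
        incoming = d
        for s in range(n_stages):
            inventory[s] -= incoming            # ship what was ordered from this stage
            gap = target_inventory - inventory[s]
            order = max(0, incoming + gap)      # naive policy: cover demand plus the gap
            orders_placed[s].append(order)
            inventory[s] += order               # assume instant delivery from upstream
            incoming = order                    # this order is demand on the next stage up
    return orders_placed

# A single step in end-customer demand (4 -> 8) produces ever-larger
# order swings at each stage further from the customer.
orders = simulate_chain([4] * 4 + [8] * 20)
for s, o in enumerate(orders):
    print(f"stage {s}: orders range from {min(o)} to {max(o)}")
```

Even this stripped-down model reproduces the bullwhip effect: the spread between the smallest and largest order grows at every echelon, which is the phenomenon students experience viscerally when playing the game itself.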
Imagine using lectures or written material to teach how to ride a bicycle. The material could successfully teach required concepts such as the location and purpose of the brakes and pedals, as well as the rules of the road. However, a student probably wouldn't be able to ride a bicycle well on a first attempt following such instruction, even if the student had performed well on a written exam. Clearly there is a component of bicycle riding that is a skill, consisting of a sophisticated mental model for how to respond quickly to a variety of inputs and requirements such as starting to tilt in one direction, going too slowly, needing to stop, and so on.
Operations management has a skill component as well. Even though a student may know how to calculate utilization, average queue length, or required safety stock in a written exam, the student may nevertheless find it difficult to apply that knowledge in a realistic scenario where decisions interact and the environment does not perfectly match the assumptions behind learned formulas.
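The formulas in question are standard textbook results, and a short worked example (with made-up numbers) shows how mechanical the exam version of the calculation is, in contrast to applying it inside a running system where the assumptions rarely hold exactly:

```python
import math

# Standard textbook calculations a student might perform on a written exam.
# All parameter values below are hypothetical.

# Utilization of a single server: rho = arrival rate / service rate.
arrival_rate = 45          # jobs per day arriving at a station
service_rate = 60          # jobs per day the station can process
rho = arrival_rate / service_rate
print(f"utilization: {rho:.0%}")          # prints: utilization: 75%

# Safety stock under a continuous-review policy with normal daily demand:
# SS = z * sigma_d * sqrt(L), reorder point R = mu_d * L + SS.
z = 1.645                  # ~95% cycle service level
mu_d, sigma_d = 12.0, 3.0  # mean and std dev of daily demand
L = 4                      # replenishment lead time, in days
safety_stock = z * sigma_d * math.sqrt(L)
reorder_point = mu_d * L + safety_stock
print(f"safety stock: {safety_stock:.1f}, reorder point: {reorder_point:.1f}")
# prints: safety stock: 9.9, reorder point: 57.9
```

The arithmetic is trivial; the skill the editorial describes is recognizing, inside a live game, when these formulas apply and when their assumptions have broken down.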
Skills are often acquired through iterations of hypothesis, trial, and assessment. Continuing the bicycle example, the beginning bicycle rider may be wobbly or fall down as she develops a mental model of how to control the bicycle, including how the ride feels when all is going well, precisely how to respond if the bicycle starts to wobble, and how to execute actions like stopping or turning while managing interactions associated with tilt, momentum, inertia, and so on. The student acquires the skills by making mistakes (falling down on a turn), forming hypotheses (tilt slightly while making a turn), trial (attempting the same turn while tilting), and assessment (did it help?). In subsequent iterations skills become more sophisticated as more extensive hypotheses are formed (e.g., tilt at a greater angle at greater velocities), tried, and assessed.
The objective of Analysis games is the acquisition of skills. An Analysis game provides an environment in which students can form hypotheses, make decisions, and assess consequences over many iterations, building and refining a well-defined skill set. By definition, Analysis games require time for forming hypotheses, running trials, making assessments, and applying conceptual tools from class. As a result, while an Insight game typically leads the presentation of the material it covers, an Analysis game typically follows the presentation of any conceptual material that would be used in the game. Offline versions of Analysis games can be implemented as stand-alone software distributed via CD-ROM or the web, or by an instructor collecting student decisions, processing those decisions using software, and distributing results back to students over several iterations. However, there are now online Analysis games as well to teach traditional production and control topics (MBE Simulations, 2007; Responsive Learning Technologies, 2007b) and supply chain management (Harvard Business School Publishing, 2004; Responsive Learning Technologies, 2007c).
Littlefield Technologies is an example of an online Analysis game. The game was developed by Sam Wood and Sunil Kumar beginning in 1997 at Stanford University (Miyaoka, 2005; Wood, 2004; Responsive Learning Technologies, 2007b). Students are divided into teams, and each team manages an online simulated factory that runs continually over the span of the assignment. Versions of the game have been played over durations from 2 hours to about a month, but one week is the most common duration. The factory consists of a single continuous-review stock replenishment system that supplies a network of three stations performing a four-step process. Decisions on different aspects of the operation are enabled or disabled in different assignments depending on learning objectives. Decisions can include changing an inventory order quantity and order point, buying and selling machines at each station, changing a queue sequencing rule, and splitting production lots. Students compete to make the most money by the end of the game by meeting lead time contracts and managing capital and inventory expenditures during the game. Students base their decisions on a wide variety of metrics that evolve over time in response to their decisions. Metrics include inventory level, station utilizations and queue lengths, job arrivals and completions, and average lead time. The game seeks to use competition to engage students in traditional operations management topics and to develop student skills associated with monitoring a system, diagnosing problems, and improving performance.
To better understand how students play the game, software recorded every button clicked by students in a few games. Those logs provide anecdotal evidence of skill acquisition through successive iterations of more sophisticated analysis and decision making:
- Earlier login sessions tended to be much longer than later sessions, which suggests students became better at quickly assessing the status of the factory.
- In later login sessions where the factory was running well, teams were more likely to view only the plots that were indicators of global performance, such as the overall lead time of the factory, and ignore indicators of local problems, such as average queue length in front of a particular station. In contrast, data monitoring was much less focused in earlier sessions.
- It was common for teams to make early decisions that improved factory performance even though the decisions were suboptimal, such as increasing the reorder point but not enough to cover variation in demand or increasing capacity to cover demand but not enough to reduce queueing to required levels. Subsequent decisions appeared to reflect more sophisticated mental models as students made more refined decisions such as fine tuning capacity or order points.
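The first pattern above, raising the reorder point but not by enough to cover demand variation, is easy to reproduce in a toy continuous-review model. The sketch below is far simpler than Littlefield's factory and its parameters are hypothetical; it only illustrates that a reorder point near the mean lead-time demand still stocks out often, while one that includes adequate safety stock rarely does:

```python
import random

random.seed(7)

def stockout_fraction(reorder_point, order_qty, lead_time, days=10_000):
    """Fraction of days with unmet demand under a continuous-review (R, Q)
    policy; daily demand ~ Normal(12, 3), unmet demand is lost."""
    on_hand = reorder_point + order_qty
    pipeline = []               # (arrival_day, quantity) of outstanding orders
    stockout_days = 0
    for day in range(days):
        on_hand += sum(q for d, q in pipeline if d == day)   # receive deliveries
        pipeline = [(d, q) for d, q in pipeline if d > day]
        demand = max(0.0, random.gauss(12, 3))
        if demand > on_hand:
            stockout_days += 1                               # partial fill, rest lost
        on_hand = max(0.0, on_hand - demand)
        position = on_hand + sum(q for _, q in pipeline)     # inventory position
        if position <= reorder_point:
            pipeline.append((day + lead_time, order_qty))
    return stockout_days / days

# Mean demand over the 4-day lead time is 48, so R = 48 carries no safety
# stock; R = 58 adds roughly z * sigma * sqrt(L) of safety stock.
for R in (48, 58):
    print(f"R={R}: stockout fraction {stockout_fraction(R, 120, 4):.3f}")
```

A team that raises R from 48 to, say, 52 sees performance improve and may conclude the problem is solved; only further iterations of hypothesis, trial, and assessment reveal that covering demand variation requires the larger adjustment.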
To provide a level of complexity that makes analysis challenging and a duration that allows multiple iterations of sophisticated analysis, Analysis games typically require time outside the classroom. Although online Analysis games typically require more time and cost to develop than their offline counterparts, there are specific benefits to putting Analysis games online:
- Traditional motivations for the Application Service Provider (ASP) model apply to game software. The problems of software installation and maintenance are removed to a remote service provider. Platform compatibility problems are reduced when the games run on standard browsers and plug-ins. Technical problems can be detected and fixed on the fly without new releases.
- Geographic dispersion is more easily accommodated. An instructor can monitor, assess, and assist teams as they play even though the game is played outside of class. Collaboration among geographically dispersed team members also becomes easier when the game can be accessed online.
- Scores can be compared fairly and continuously. Online games can be synchronized in a way that allows meaningful rank-ordered scores to be always available, providing a constant source of excitement and a means of self-assessment. For example, in the Littlefield Technologies game, students can view their team's cash position ranked against all the other teams at any time while the game is running or after it ends. Computer logs show that students will click on the button showing ranked scores almost as much as all the other buttons in the game combined.
Because Analysis games are often used to teach skills rather than facts or concepts, it can be challenging to assess the pedagogical success of the games. However, self-reported assessments of learning can at least suggest success. When I taught the Operations Management core course at Stanford University, students rated Littlefield Technologies as the most educationally valuable assignment in the course. Based on similar reports from faculty at other universities, as well as other professors' reported beliefs that students were learning from the experience, Littlefield Technologies was awarded the POMS Wickham Skinner Award for Teaching Innovation in 2004.
While both Analysis and Capstone games seek to foster skill development through iterations of hypothesis, trial, and assessment, Analysis games typically focus on specific disciplinary topics like "adjust capacity to meet growing demand with a lead time objective" while Capstone games provide a more comprehensive, less focused experience such as "set financial, marketing and operational policies to achieve long-run profitability." Capstone games typically involve more decisions, a wider scope of decisions, and a more complex set of instructions.
Analysis games are typically used to cover specific topics in a similar way as a case study or a reading might be used. In contrast, Capstone games are typically a pervasive element of a capstone course designed specifically to integrate a wide variety of disciplines from other prerequisite courses. It is not unusual for an entire course to be developed around a capstone game.
There is a wide variety of commercially available online Capstone games, including several with a strong operations management component (BPG Simulation, 2007; Management Simulations Inc., 2007; Links Simulations, 2007; Intopia Inc., 2007; Smartsims International, 2007). A large MBA program at Pepperdine (Pepperdine, 2007) illustrates how Capstone games are typically used to provide integrative and capstone experiences. Once at the beginning of the second year and then again at the end of the year, MBA students play The Business Policy Game (BPG Simulation, 2007) over a weekend. The game is not a component of a larger course. Instead the simulation experience, including faculty guidance and debriefing, constitutes a standalone one-unit course. The first one-unit course, named "Integration in Business Operations: Core Operations," integrates the business disciplines from the first year with an emphasis on the operational level. The second one-unit course, named "Integration in Business Operations: Strategic Management," serves as a capstone experience for the entire MBA program, placing additional emphasis on strategic-level concerns.
The motivations for placing Capstone games online are similar to the motivations for placing Analysis games online: transferring many technical issues to a service provider; accommodating geographic distribution; and constant availability of meaningful scoring.
My own experience, along with discussions with other users and developers of online games, has yielded some principles that, at least for me, were not intuitive.
When designing an online game, it is very tempting to specify the context (e.g., a factory or a distribution network with a realistic set of decision parameters) before specifying the learning objectives (e.g., experience how information availability can mitigate the bullwhip effect, or develop expertise at forecasting random demand that has growth and seasonality components), but this is a mistake. It is essential to rigorously define learning objectives before designing the game. The learning objectives in turn determine the game's decision parameters, mission, and context. If the game design precedes the definition of the learning objectives, then specific learning objectives must be "force fit" to the design afterwards, which is much more challenging. As a result, learning objectives can be vague or ill-defined, reducing the pedagogical value of the game.
Another temptation when designing online games is to create a diverse multitude of decision parameters along with an extremely complex environment, all with the intent of providing an experience that is as authentic as possible. In my experience, when students cannot map their decisions to subsequent results, they cannot assess the success of their decisions and thus cannot refine their understanding or achieve much learning. The more decisions that must be made at once, and the wider the variety of results presented in response to those decisions, the harder it is to understand the consequences of any one decision. Instead, the design of an online game should include the aggressive elimination of any decision parameters, data, or supporting materials that do not support the game's learning objectives.
Particularly for academics, I believe another difficult and overlooked challenge when designing online games is the design of the user interface. Based on their experience with commercial software packages, many students now expect to use software without reading a manual, and the longer the manual, the harder it is to get students to invest time in studying it. As a result, student interfaces should be developed so that the game elements, including context, dynamics, data, and decision parameters, are as intuitive as possible. Such an objective is more easily achieved using a well-designed graphical interface that reflects the game's elements, instead of simple online forms.
As described above, keeping score is an essential part of an online game. Scores keep students engaged and enthusiastic, and they provide an important means of self-assessment. Another design challenge is to make the score both authentic (e.g., cash position or total cost) and reflective of the game's pedagogical objectives (e.g., better demand forecasts drive better decisions that in turn lower costs, improving the score in a transparent way). In contrast, scores in a poorly designed game are determined more by which random numbers happen to be drawn during the game or by how well students can tease out the settings and logic of the underlying simulation.
Performance benchmarks are helpful. For example, allowing the students to see not only a "top scores" list, but also the score that would result from doing nothing can both aid students' self assessment and make the game more fun.
Of course, there are a host of technical issues involved in designing online games, such as compatibility with ever-evolving web browsers and accommodation of ever-changing firewall requirements. These challenges are a significant part of delivering online games to a diverse group of students.
Several best practices have emerged from my discussions with the hundreds of instructors who have used my company's online games to teach operations topics.
First, Analysis and Capstone games should be graded assignments. Reflection on the outcome of the game, including how decisions determined the outcome, is an important part of the learning process with online games. Graded debriefs in the form of short written assignments or class presentations require such reflection. In addition, a small part of the assignment grade should be based on the final score in the game so that students take the competition seriously. This also implies the game should be fair and consistent across teams. On the other hand, Insight games should probably not be graded if the games are used to motivate course material as opposed to demonstrating mastery of course material.
Second, students should play Analysis and Capstone games on teams instead of as individuals. An important part of the skill-building process is discussing or justifying decisions to peers in language those peers can understand. Furthermore, when a decision affects multiple students' grades, more care may be taken when making that decision.
Third, debriefs in class should be used with all three types of games. Once a game concludes, students will be very interested in an explanation of how the winning team won. That level of interest provides a terrific learning opportunity. The debrief typically begins by recognizing the winning team. Ideally the winning team then explains how they made their decisions. If the game was well-designed, the winners' decisions will have applied course concepts, so a winning group of students can increase the credibility of the course concepts as useful and valuable. The debrief should conclude by explicitly stating the learning objectives of the assignment and relating the game results to those objectives.
Fourth, encourage student teams to define team roles and protocols in advance of the game, especially for Analysis games with durations on the order of hours. For example, how will the team decide when some action needs to be taken? In addition to defining team roles, some faculty find it useful to show students the starting conditions of the game so that they can form initial tactics and strategies before the game begins. This "pre-game" preparation is especially important if there is unlikely to be time to perform sufficient analysis during the game, or if the game is played outside class and students are unaccustomed to time-critical projects.
Games are an ideal way to engage students in material and develop skills through practice. A variety of affordable online games to teach operations topics are already available. Successful use of an online game begins with identifying its general pedagogical objective: insight, analysis, or capstone. That in turn drives decisions of where to place the game in class, how to grade it, and how to debrief the results.
I am grateful to Prof. Armann Ingolfsson for valuable editorial guidance and Prof. Ken Ko for discussions concerning the capstone course at Pepperdine. Any errors or misrepresentations are only my own.
Aldrich, C. (2005), Learning by doing; A comprehensive guide to simulations, computer games, and pedagogy in e-learning and other educational experiences, Pfeiffer, San Francisco, CA.
BPG Simulation (2007), "The Business Policy Game: An International Strategy Simulation," (last accessed October 29, 2007).
Darden Business Publishing (2007), "The Channel Coordination Simulation," (eLearning tab) (last accessed October 29, 2007).
Faria, A.J. (1998), "Business Simulation Games: Current Usage Level — An Update," Simulation & Gaming, Vol. 29, No. 3, September 1998, pp. 295-308.
Harvard Business School Publishing (2004), "Global Supply Chain Management," (search for "Global Supply Chain Simulation") (last accessed October 29, 2007).
Intopia, Inc. (2007), "INTOPIA B2B," (last accessed October 29, 2007).
Jacobs, F. R. (2007a), "The E-OPS Game," (last accessed October 29, 2007).
Jacobs, F. R. (2007b), "Beer Distribution Game," (last accessed October 29, 2007).
Links Simulations (2007), "Links Enterprise Management, Services Operations Management, and Supply Chain Management Simulations," (last accessed October 29, 2007).
Management Simulations, Inc. (2007), "Foundations and Capstone Business Simulations," (last accessed October 29, 2007).
MASystem (2007), "BeerGame," (last accessed October 29, 2007).
MBE Simulations Ltd. (2007), "MERP," (last accessed October 29, 2007).
MIT Forum for Supply Chain Innovation (2005), "The MIT Beer Game," (last accessed October 29, 2007).
Miyaoka, J. (2005), "Making Operations Management Fun: Littlefield Technologies," INFORMS Transactions on Education, Vol. 5, No. 2.
Pepperdine University (2007), "Fully Employed MBA Curriculum," (last accessed October 29, 2007).
Responsive Learning Technologies (2007a), "eBeer," (last accessed October 29, 2007).
Responsive Learning Technologies (2007b), "Littlefield Technologies" and "Littlefield Labs," (last accessed October 29, 2007).
Responsive Learning Technologies (2007c), "The Supply Chain Game," (last accessed October 29, 2007).
Smartsims International (2007), "MikesBikes," (last accessed October 29, 2007).
Sterman, J. D. (1992), "Teaching Takes Off: Flight Simulators for Management Education," (last accessed October 29, 2007).
Supply Chain Redesign, LLC (2007), "SCLE: Supply Chain Learning Environment," (last accessed October 29, 2007).
Wood, S. C. (2004), "2004 Skinner Teaching Award Winner: Littlefield Technologies," POMS Chronicle, Vol. 11, No. 3-4, 2004, pp. 7-8.
To reference this paper, please use:
Wood, S.C. (2007), "Online Games to Teach Operations," INFORMS Transactions on Education, Vol. 8, No. 1.