How Managers’ Minds Work

A common topic in management literature over the past few years has been the difference between managers and management scientists, usually in relation to the argument that their association has not been a productive one. For example, a recent article by C. Jackson Grayson, Jr., likens the situation to C.P. Snow's famous notion of the two cultures of science and the humanities.

Perhaps this is an overpessimistic viewpoint, but it is one that is expressed often and by individuals who have substantial experience with the use of analytic methods in management.

Management science techniques have been very successful in such areas of business as logistics planning, resource allocation, financial forecasting, and so forth. It appears that, on the whole, these techniques have found the applications for which they are best suited, and managers make substantial and continued use of them.

However, in other areas of business they have been unable to gain any real foothold. Most obviously, they have had little impact on areas of decision making where the management problems do not lend themselves to explicit formulation, where there are ambiguous or overlapping criteria for action, and where the manager operates through intuition.

The major issue for management science as a discipline now seems to be to get managers in such situations to make use of the formal techniques that can clearly be so helpful to them but have not yet been so in practice. There seem to be two main factors affecting this problem.

One concerns the actual techniques available. Obviously, process chemists use linear programming because it suits the constraints and nature of the problems they deal with.

The primary factor, however, is the differences in approach and behavior between the two cultures. A feature under little control by either manager or scientist is that each has a distinctive style of thinking and problem solving. In its own context, each style is highly effective but not easily communicated to the other. The differences in thinking are neither “good” nor “bad”; they simply exist.

In a way, it is platitudinous to state that managers and scientists are different, but a reason for focusing explicitly on this factor is to examine the argument, maintained by management writers, that to bridge the gap between the two groups each should become a little more like the other. In this view, the differences themselves are the problem, and education is generally recommended as the solution: the manager should be trained in elementary quantitative techniques, and the scientist, in interpersonal and managerial skills.

Yet it is this very differentiation in thinking style that makes each of them successful in his chosen specialization. The cost of differentiation, however, is the increased difficulty of integration. The issue for both manager and scientist is therefore complex: how to communicate with each other, and how to complement each other's strengths without sacrificing too much of one's own.

In this article, we are explicitly concerned with these differences in thinking between the two cultures. We shall offer suggestions as to how the manager and the scientist can best work together in the development and use of analytic models and decision aids.

We suggest that such aids must be designed to amplify the user’s problem-solving strategies. Thus it seems that the central factor determining whether a manager will use a model to reach a decision is the extent to which it “fits” his style of thinking. The main body of this paper largely defines what we mean by “fit.”

Over the past four years, we have developed and tested a model of cognitive style, drawing on the developmental psychology that has in recent years reinvigorated the whole study of thinking and problem solving.2 Our main aim has been to better understand the cognitive aspects of the decision-making process.

In the first section of this article, we shall provide a statement of our model in terms applicable to problem solving and decision making in general, rather than just to analytic techniques. Next, we shall discuss the experimental data we have gathered in validating the model. Finally, we shall extend our findings to the implications of cognitive style for implementing formal analytic models.

We view problem solving and decision making in terms of the processes through which individuals organize the information they perceive in their environment, bringing to bear habits and strategies of thinking. Our model is based on the dual premise that consistent modes of thought develop through training and experience and that these modes can be classified along two dimensions, information gathering and information evaluation, as shown in Exhibit I.

Exhibit I Model of cognitive style

Information gathering relates to the essentially perceptual processes by which the mind organizes the diffuse verbal and visual stimuli it encounters. The resultant “information” is the outcome of a complex coding that is heavily dependent on mental set, memory capacity, and strategies—often unconscious ones—that serve to ease “cognitive strain.” Of necessity, information gathering involves rejecting some of the data encountered, and summarizing and categorizing the rest.

Preceptive individuals bring to bear concepts to filter data; they focus on relationships between items and look for deviations from or conformities with their expectations. Their precepts act as cues for both gathering and cataloging the data they find.

Receptive thinkers are more sensitive to the stimulus itself. They focus on detail rather than relationships and try to derive the attributes of the information from direct examination of it instead of from fitting it to their precepts.

Each mode of information gathering has its advantages in specific situations; equally, each includes risks of overlooking the potential meaning of data. The preceptive individual too easily ignores relevant detail, while the receptive thinker may fail to shape detail into a coherent whole. In management positions, the former will be most successful in many marketing or planning roles, and the latter in tasks such as auditing.

Information evaluation refers to processes commonly classified under problem solving. Individuals differ not only in their method of gathering data but also in their sequence of analysis of that data. These differences are most pronounced in relation to formal planning.

Systematic individuals tend to approach a problem by structuring it in terms of some method which, if followed through, leads to a likely solution.

Intuitive thinkers usually avoid committing themselves in this way. Their strategy is more one of solution testing and trial-and-error. They are much more willing to jump from one method to another, to discard information, and to be sensitive to cues that they may not be able to identify verbally.

Here again, each mode of information evaluation has advantages and risks. In tasks such as production management, the systematic thinker can develop a method of procedure that utilizes all his experience and economizes on effort. An intuitive thinker often reinvents the wheel each time he deals with a particular problem. However, the intuitive person is better able to approach ill-structured problems where the volume of data, the criteria for solution, or the nature of the problem itself do not allow the use of any predetermined method.

Most modern theories of the decision process stress “rationality.” Mathematical decision theory and game theory, for example, are both mainly concerned with defining the basics of rational behavior. Accounting for the discrepancies between it and observed behavior is only a secondary aim. Other theories, particularly those concerning organizational decision making, include factors of motivation, personality, and social forces but still treat decision making as essentially equivalent to problem solving.

In our model of cognitive style, we focus on problem solving, but our central argument is that decision making is above all situational and, therefore, includes problem finding. The manager scans his environment and organizes what he perceives. His efforts are as much geared to clarifying his values and intents as to dealing with predefined problems.

Obviously, some problems do force themselves on his awareness; this is particularly true in crisis situations. Nonetheless, he generally has some discretion in the selection of problems to deal with and in the level of aspiration he sets for himself. (His aspiration often determines the extent to which he involves himself in terms of effort and risk.)

The manager’s activities are bounded not only by the formal constraints of his job, but also by the more informal traditions and expectations implicit in his role. Because of this, the decision-making activity is strongly influenced by his perception of his position. A decision “situation” exists when he sees some event or cue in his environment that activates him into a search-analyze-evaluate sequence that results in a decision. This sequence is initiated by and depends on his environment assessment.

Our cognitive-style model provides some explanation of the processes affecting the manager’s assessment of his environment. It thus includes an important aspect of behavior omitted in most theories on decision making—namely, that of problem finding, problem recognition, and problem definition. Generally, other theories assume that the situation has already been defined; the manager is presented with a neatly packaged problem and instructions on what he should try to do.

Implicit in the focus on problem finding is the concept that particular modes of cognition are better suited to certain contexts than others. As we mentioned earlier, the central argument of our study is that there needs to be a fit between the decision maker’s cognitive style and the information-processing constraints of his task. Given this fit, the manager is more likely to gather environmental information that leads to successful (or at least comfortable) problem finding. He should also be able to evaluate that information in a way that facilitates successful problem solving. Perhaps the implications of a misfit are easier to indicate.

We mentioned earlier that a receptive thinker focuses on detail rather than pattern. But a receptive field sales manager who receives a wide range of information may well be flooded by it. He probably cannot examine all the sales reports, orders, phone calls, and so on. Instead, he should try to filter his information and be alert to trends and discrepancies. Thus a combination of the sales pattern in a particular region and a salesman's recent report of several customers' comments may lead him to recognize signs of change in consumer taste.

The preceptive individual is particularly suited to those tasks where he must have a concept of his environment. A preceptive manager would not be very successful in a task such as editing.

Similarly, it is easy to envisage tasks in which the intuitive thinker cannot come to terms with the data that are required in his decision making because he is unable to think in terms of a methodical sequence of analysis.

We have chosen the term “style” rather than the more common one of “structure” to stress the fact that modes of thinking relate more to propensity than to capacity. An individual’s style develops out of his experience. For example, there is a tendency, particularly in late high school and college, for a student to increasingly choose courses that build on his strengths. This reinforcing pattern further develops those strengths and perhaps atrophies the skills in which he is less confident.

This suggests not only that tasks exist that are suited to particular cognitive styles, but also that the capable individual will search out those tasks that are compatible with his cognitive propensities. In addition, he will generally approach tasks and problems using his most comfortable mode of thinking.

Our model indicates some important differences in the ways in which individuals of particular styles approach problems and data. The accompanying list summarizes the main characteristics of each style:

Systematic thinkers tend to—

Intuitive thinkers tend to—

Receptive thinkers tend to—

Preceptive thinkers tend to—

Our research supports the concept that particular tasks and roles are more suited to one cognitive style than to another. Exhibit II shows careers that seem especially compatible with the skills and predispositions implicit in each cognitive style.

Exhibit II Tasks and roles compatible with each cognitive style

We have carried out a range of experiments over the past four years aimed at validating the assertions made in the preceding statements.3 The main effort in the experiments has been to identify and measure cognitive style. In the spring of 1972, a set of 12 standard reference tests for cognitive factors, developed by the Educational Testing Service, was administered to 107 MBA students. Each test was specifically chosen to fit one particular mode of style. The results confirmed most of the main characteristics of each style summarized earlier.

In our first set of experiments, 70% of the sample showed distinct differences in performance level between the systematic and the intuitive tests or between the receptive and the preceptive. This supports our basic contention that individuals tend to have a definite style.

We chose a conservative approach for our tests, classifying a subject as “intuitive,” “systematic,” and so on, only when the scores on tests requiring, say, an intuitive response were substantially different from those measuring capacity for the other mode of style along the same dimension. The comparisons focused on relative, not absolute, performance. The numeric scores were converted to a 1 to 7 scale, with a “1” indicating that the subject scored in the lowest seventh of the sample and a “7” corresponding to the top seventh.
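The scoring procedure described above can be sketched in code. This is only an illustration of the relative-ranking idea, not the authors' actual protocol: the function names and the two-point gap used as the threshold for "substantially different" are assumptions.

```python
# Sketch of the relative-scoring scheme: raw scores are converted to a
# 1-7 scale by rank within the sample, and a subject is labeled only
# when one mode's score clearly dominates the other's.
# Assumption (not from the article): a gap of 2+ scale points counts
# as "substantially different."

def to_septiles(scores):
    """Map raw test scores to a 1-7 scale by rank within the sample:
    a 1 means the subject scored in the lowest seventh, a 7 the top seventh."""
    ranked = sorted(range(len(scores)), key=lambda i: scores[i])
    n = len(scores)
    septiles = [0] * n
    for rank, i in enumerate(ranked):
        septiles[i] = min(7, rank * 7 // n + 1)
    return septiles

def classify(systematic_score, intuitive_score, gap=2):
    """Label a subject only when one mode of style clearly dominates."""
    if systematic_score - intuitive_score >= gap:
        return "systematic"
    if intuitive_score - systematic_score >= gap:
        return "intuitive"
    return "unclassified"
```

Note that the comparison is relative, not absolute: a subject with two middling but unequal scores is classified, while a subject with two high, similar scores is not.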

From our main sample of 107 MBA students, we selected 20 whose test results indicated a distinct cognitive style for a follow-up experiment. This made use of a “cafeteria” set of 16 problems from which the subjects were asked to choose any 5 to answer. In individual sessions, which were tape recorded, the subjects were invited, though not required, to talk aloud as they dealt with each problem. The results pointed to distinct differences in the ways in which individuals of particular styles respond to problems.

As expected, the systematic subjects tended to be very concerned with getting into a problem by defining how to solve it. They were conscious of their planning and often commented on the fact that there were other specific ways of answering the problem.

In contrast, the intuitive subjects tended to jump in, try something, and see where it led them. They generally showed a pattern of rapid solution testing, abandoning lines of exploration that did not seem profitable.

More important, each mode of response was effective in solving different kinds of problems. In one instance, which required the decoding of a ciphered message, the intuitive subjects solved the problem—sometimes in a dazzling fashion—while none of the systematics were able to do so. In this particular case, there seemed to be a pattern among the intuitives: a random testing of ideas, followed by a necessary incubation period in which the implications of these tests were assimilated, and then a sudden jump to the answer.

There were often unexplained shifts in the reasoning of the intuitives, who were also much more likely to answer the problems orally. The latter tendency provided some confirmation for the idea that intuitive individuals use their own talking aloud to cue their activities and to alert themselves to possible lines of analysis.

There were distinct differences in the problems chosen by each of the groups, and their ratings of which problems they enjoyed most were remarkably consistent. The systematics preferred program-type problems, while the intuitives liked open-ended ones, especially those that required ingenuity or opinion.

The overall results of the initial experiments provided definite evidence to support both our model of cognitive style and the classification methods we developed through the main-sample test scores. The verbal answers in particular highlighted the degree to which these subjects consistently and distinctively respond to problems. There seems little doubt that, in these extreme cases at least, the individual maps himself onto the problem, rather than matching his behavior to the constraints and demands of the particular task.

In another set of tests, again using the main sample of 107 subjects, we examined the relationship between cognitive style and personality. We did this through comparisons of our test results with the Myers-Briggs scales used to classify individuals in relation to Jungian theories of psychological type.4

The most striking result of our experiment was that, while the scores on the Myers-Briggs scales showed virtually no correlation with absolute performance on our tests, there was a relationship between cognitive style and those scales. In particular, the systematic subjects were very likely to be of the “thinking” type and the intuitives much more likely to be at the other end of the scale, “feeling.” R.O. Mason and I.I. Mitroff provide a useful summary of the difference between the thinking-feeling types:

“A Thinking individual is the type who relies primarily on cognitive processes. His evaluations tend to run along the lines of abstract true/false judgments and are based on formal systems of reasoning. A preference for Feeling, on the other hand, implies the type of individual who relies primarily on affective processes. His evaluations tend to run along personalistic lines of good/bad, pleasant/unpleasant, and like/dislike. Thinking types systematize; feeling types take moral stands and are interested in and concerned with moral judgments.”5

We found a more modest relationship between systematic style and “introversion” and, similarly, between intuitive style and “extroversion.” Thus our findings mesh well with Mason and Mitroff’s predictions (they did not report any experimental data) about psychological type and information systems.

A year after the first two sets of experiments, we examined the relationship between style and career choice, using a sample of 82 MBA students. The results showed consistent differentiations between systematic and intuitive subjects. We compared the career preferences of the two groups and also looked at the test scores of those individuals who showed strong preference for particular careers.

In this experiment, the systematic students were attracted to administrative careers, to the military, and to occupations involving production, planning, control, and supervision. The intuitive group’s choices centered around the more open-ended business functions; they preferred careers in psychology, advertising, library science, teaching, and the arts.

The overall results of the three sets of student experiments support the validity of our conceptual model as a useful and insightful framework for examining the role of cognitive processes in decision making. More important, with this support established, we plan to extend our research to the study of business managers and especially to model builders and model users.

One of our major conjectures, which partly underlay the whole development of our model, has been that computer systems in general are designed by systematic individuals for systematic users. Although management science has lost its early tones of missionary zeal, of bringing “right” thinking to the ignorant, the implementation of analytic techniques not unreasonably reflects the scientist’s own distinctive approach to problem solving.

Model building, from the viewpoint of the management scientist, involves making the causal relationships in a particular situation explicit and articulating the problem until he gets a reasonably predictive model; he will then generally refine that model. He has a faith in his own plan and process, and his specialized style of thinking enables him to literally build a model, shaping ideas and concepts into a methodological whole, and above all articulating relationships that the manager may understand but may not be able to make explicit.

The management scientist’s skill is indeed a specialized one; the powerful organizing and systematizing capacity he brings to model building is his special contribution. But, obviously, that can be a vice rather than a virtue in specific situations. What Donald F. Heany calls the “have technique, will travel”6 banner really amounts to the rigorously systematic individual’s preference for a methodical approach to all problems in all contexts.

Fortunately, there are many systematic managers. Our assumption is that most general managers who use management science techniques are likely to be systematic in style. The techniques match their own innate approach to problems, and they gravitate to occupations that are suited to their style.

For example, since inventory control is a task that can be systematized, it will attract systematic managers, and it will therefore be an area in which management science techniques will find fruitful ground.

However, there are just as many management positions not filled by systematic thinkers. For example, advertising, which is not so easily systematized, will attract intuitive people. If management scientists want their techniques used in these more loosely structured business areas, they must try both to make their models less awesome to the intuitive managers they will be working with and to support the managers in their decision-making processes.

This requires understanding the intuitive approach to problem solving in general and developing models which will amplify and complement that approach.

We have found it useful to categorize tasks—and problems in general—in terms of the problem solver’s assessment of his ability to first recognize and then act on relevant information.7 This process provides four basic classes of problems, as in Exhibit III.

Exhibit III Classification of tasks and problems

The classes are easily illustrated. If, for example, a manager encounters a problem of inventory control in which he feels that he knows both what data are relevant and what mental operations and analysis are required to deal with that data, the problem is one of planning (Type 1 in Exhibit III). His whole effort then involves merely arranging the data into a form which can be used as input to a defined sequence of evaluation.

Another class of problem (Type 2) exists when the required operations and methods are known, but the data involved are not. Price forecasting in complex markets is an example of this situation. Before a forecast can be made, a mass of data on economic, price, and market variables must be organized and sifted. Once this has been done, the forecasting procedure is simple.

A very different state of affairs exists when the individual understands the data but does not know how to manipulate them. Many production-scheduling problems fall into this class, invention (Type 3). The relevant data are known and the problem consists of finding a way to achieve the desired end.

The fourth class of problem exists when both information and operations are unknown. In this situation, there is a conscious search for cues and a generation of explanatory concepts, together with the development of a method for manipulating the data thus organized. The development of new products is a typical research problem.
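The four classes turn on two yes-or-no questions, so the classification can be written as a short sketch. The labels for Types 1, 3, and 4 (planning, invention, research) come from the text above; the descriptive label for Type 2, which the text leaves unnamed, is our own gloss.

```python
# Sketch of the four-way classification of tasks in Exhibit III.
# The problem solver asks two questions of himself: does he feel he
# knows what data are relevant, and does he know what operations
# are required to act on them?

def classify_task(data_known: bool, operations_known: bool) -> str:
    if data_known and operations_known:
        return "Type 1: planning"                 # e.g., routine inventory control
    if operations_known:
        return "Type 2: data unknown"             # e.g., price forecasting in complex markets
    if data_known:
        return "Type 3: invention"                # e.g., production scheduling
    return "Type 4: research"                     # e.g., new-product development
```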

Many management-science projects start as research. For example, modeling a complex environment such as the housing market in order to make industry or demand forecasts generally requires a complicated first step in which two areas of the problem are worked on in parallel: (1) the generation of concepts to “explain” reality and identify the most relevant variables, and (2) the definition of the outputs, aims, and implementation of the model.

In our cafeteria experiment, the one problem rated most enjoyable by well over half the systematic group was a basic planning task. The systematic management scientist can often take a research problem and shift it to one of planning. The methodological formalization he provides helps translate unknown states of perception and conception into known ones.

However, there is sometimes the danger that he will force the translation; he may insist on some objective function that does not really fit the situation, partly because his preference for planning leaves him unwilling to accept “unknown” states. He needs to make the implicit explicit.

Just as the systematic management scientist's specialized style of thinking provides very definite strengths in specialized tasks, so too does the intuitive manager's. It is important to stress again that the intuitive mode is not sloppy or loose; it seems to have an underlying discipline at least as coherent as the systematic mode's, but one that is less apparent because it is largely unverbalized.

There are many situations where the volume of information, the lack of structure in the task, and the uncertainty of the environment defy planning and programming. In such situations the intuitive manager’s style can be highly effective.

For example, there is no way for any manager to systematically forecast consumer tastes for furniture styles. He can, however, build a set of cues and flexible premises that may alert him to shifts in taste. He may also use the rapid scanning and testing (the main characteristic of the intuitive) for a sense of fit among disparate items of information. More important, he need never make his concepts and methods explicit.

Unlike the model builder, the intuitive manager can act without making any conscious articulation of his premises. An amusing instance of this fact occurred in many of the early efforts to use process-control computers in paper making. The computer experts “knew” that paper makers knew how to make paper; the experts’ only problem was articulating the decision processes that the paper makers used, which turned out to depend mainly upon the operators’ “tasting the broth” and controlling the paper flow.

For a long time, this well-established and highly effective human decision process defied conversion into formal and explicit terms. The operators were not too helpful. They “knew” what worked; they had built up out of their experience a clear but not conscious sense of the process, but this sense often varied with the individual. Thus, when a shift changed, the new crew chief, for example, might reset the valves and modify the whole operation, asserting that the changes were needed because of the time of day. There was no articulated set of concepts or methods by which this assertion could even be tested.

The decision makers here—and they merit the term, since controlling the paper-making process is a constant series of evaluations, assessments, and actions—were able to act efficiently even though they could not articulate their own procedures. This lack of articulation became a problem only when it was necessary for the computer experts to build a model of that process.

Systematic and intuitive individuals often treat the same project as two entirely different problems. The systematic management scientist may try to structure the problem to reduce the unknowns and to define very explicitly all the constraints in the situation. He aims at a model that is complete and has predictive power, which he can then improve and refine. That, essentially, is how he regards problem solving.

However, consciously or not, the intuitive manager is most concerned with using the model to give him a better sense of the problem. He focuses on and enjoys playing with the unknowns until he gets a feeling for the necessary steps for completion. Then he is ready to delegate the process of dealing with the problem to some individual in his organization who can systematically handle it in a more routine fashion.

The intuitive manager may also approach a task for which a model is to be built not with a need to understand the analytic process, but with a desire to discover what he can trust in order to make useful predictions. This can be of value to the systematic scientist, in that, if he can build a model which “works,” the manager may well be ready to use it even though he does not understand it.

The central issue, however, is the validation of the model. The scientist validates his model formally and methodologically; he can test it in relation to known inputs and outputs. In general, he will have faith in his plan and in his own systematic process. The manager will validate the model experientially and test it against some of his own concepts and expectations. He places much less faith in external “authority.”

If our line of argument is valid, it is clear that the solution to the difficulties intuitive managers and systematic management scientists have in working together will not be obtained by trying to blur the differences. The intuitive manager may learn what network optimization is, but that is unlikely to make him think in the same systematic mode as the management scientist, who, in turn, is unlikely to develop intuitive responses through any form of education.

(This is not to assert that cognitive style is fixed, but to reinforce the point that individuals with very distinctive styles in specialized areas of activity have strengths that are directly related to their styles. It seems unlikely that the cognitive specialist will change easily—or that he should do so in any case.)

The real solution seems to lie in two areas: (1) in defining the model’s role within the larger decision-making process of the particular situation, and (2) in determining how to validate the model.

From this, the manager and scientist together can better control both the process of building the model structure and their mutual expectations and actions. At the root of both these areas of concern is the whole question of trust and communication, less in the interpersonal than in the cognitive sense.

The management scientist’s role can be one of either product or service. It is important that he decide which it is in a particular situation.

On the one hand, if his model will mainly help clarify a manager’s sense of the issues and options, then there is no point in the scientist’s trying to provide a meticulous and complex simulation. The manager does not intend to use the model as the basis for any decision. In fact, the model may simply help him decide what the problem is and can then be thrown away.

On the other hand, the manager may need a product rather than a service; for example, a financial forecasting model, once validated, may be used by a manager as the main basis for ongoing decisions.

The degree and direction of the scientist’s efforts will be very different, depending on how he perceives the manager’s needs in the situation. The scientist can only identify those needs by asking questions: How does this manager approach problems? How does he define his problem, given the four different classifications in Exhibit III? Does he want the model to further his own learning or to help him make a specific decision?

The answer to each question has distinct consequences. For example, if the manager’s response to problems is systematic, the model should explicitly reflect this fact. The scientist should explain to him the underlying assumptions as to method; the two can afford to invest substantial time and discussion on how to deal with the problem. Here, the manager is essentially looking for a technique and the scientist is the expert, with a catalog of methods.

However, if the manager is intuitive in style, the scientist should recognize that the model must allow the manager to range over alternatives and test solutions in the fashion that fits his natural mode of problem solving.

In this context, J.W. Botkin has used the paradigm of cognitive style in designing an interactive computer system for intuitive subjects.8 He has identified five necessary features for such a model:

1. The user should have the ability to create an arbitrary order of processing; the system should not impose a “logical” or step-by-step sequence on him. In Botkin’s words, “This lack of set sequence allows the intuitive user to follow his instinct for developing his ill-defined information plan directly from environmental cues.”

2. The user should be able to define, explore, and play out “scenarios” that may either generate cues or test solutions.

3. The user should be able to shift between levels of detail and generality.

4. The user should have some control over the forms of output and should be able to choose visual, verbal, and numeric displays at varying levels of detail.

5. The user should be able to extend his programming, providing input in an irregular and unspecific form (i.e., he should be able to provide commands such as, “Repeat the last step, increasing X by 10%”).
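In a modern idiom, the five features above might be sketched as a small interactive scenario explorer. Every name below (`Explorer`, `play`, `repeat_scaled`) and the toy 5%-growth projection are illustrative assumptions for the sketch, not details of Botkin’s actual system.

```python
# A hypothetical sketch of Botkin's five features for intuitive users.

class Explorer:
    def __init__(self):
        self.scenarios = {}        # feature 2: named scenarios to play out
        self.detail = "summary"    # feature 3: level of detail or generality
        self.form = "numeric"      # feature 4: choice of display form
        self.last = None           # remembered step, used for feature 5

    # Feature 1: no fixed sequence -- the user may call any method at any time,
    # in any order, following environmental cues rather than a set procedure.

    def define(self, name, **variables):
        """Create or replace a scenario from arbitrary named variables."""
        self.scenarios[name] = dict(variables)

    def play(self, name, periods=1, growth=1.05):
        """Project a scenario forward and display it per the current settings."""
        self.last = (name, periods, growth)
        projected = {k: round(v * growth ** periods, 2)
                     for k, v in self.scenarios[name].items()}
        if self.detail == "summary":                   # feature 3
            projected = {"total": round(sum(projected.values()), 2)}
        if self.form == "verbal":                      # feature 4
            return ", ".join(f"{k} is {v}" for k, v in projected.items())
        return projected

    def repeat_scaled(self, factor):
        """Feature 5: 'Repeat the last step, increasing X by 10%'
        becomes repeat_scaled(1.10)."""
        name, periods, growth = self.last
        self.scenarios[name] = {k: v * factor
                                for k, v in self.scenarios[name].items()}
        return self.play(name, periods, growth)
```

The design point is that nothing forces an order of operations: the intuitive user may define a scenario, play it, scale it, and switch display forms in whatever sequence his exploration suggests.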

Botkin’s experiment showed fairly clearly that intuitive and systematic subjects used his model in greatly differing ways. The differences corresponded on the whole to those found in our cafeteria experiment. The intuitive group seemed to learn from the system and to enjoy using it as much as the systematic group.

Even though Botkin’s model was a special case, his results suggest that an effort on the part of the model builder to consider how the manager will use the model—in terms of process rather than output—will pay large dividends.

Here again, there is a distinction between service and product. Where the manager is most concerned with the recommendations he can derive from the model, the sort of cognitive amplifiers Botkin provides are unnecessary. However, where the manager wants the model to help him clarify his own understanding of the situation, it may well be essential to build them into the formal structure of the model.

Thus the management scientist needs to consider what a “good” model is. For himself, goodness is largely a quality of predictive power and technical elegance. For the manager, it is more a concern of compatibility and comfort—that is, the fit between how he approaches the problem and how the model allows him to do so.

Perhaps even more important than either recognizing the relevance of the user’s own problem-solving process or determining how that person will use the model is the whole question of trust. Often, the manager does not get involved in the model itself; he simply asks for the outputs. He may well wish to validate the model by testing out some scenarios for which he has some expectations of the outcome.

However, John S. Hammond suggests that the model builder should recognize that in a large and complex model the user will have neither the desire nor the ability to understand its mechanics. The designer must, therefore, provide the user with some other way of testing out—of building trust in—the model. Hammond thus recommends that the management scientist aim—

“…to get something simple and useful up and running as soon as possible. By skillfully manipulating the resultant model, the management scientist should be able to obtain results that will give great insights about the problem, its nature, and its alternatives to the manager. These insights should cue the mind of the manager and cause him to perceive the problems and alternatives differently, which will in turn affect the priorities and direction of the management science effort…

“Thus the management scientist, too, will learn about the nature of the problem and also about the nature of the manager’s perception of it.”9

This recommendation seems particularly relevant in cases where the manager’s cognitive style is highly intuitive. For relatively little effort and minimal commitment to a particular definition and design, the manager can obtain the initial exploration and trial testing that may enable him to articulate his assessments of the problem—or, better, that may enable the scientist to deduce them for him.

Our recommendations are fairly modest. Essentially, they argue that if manager and scientist alike will look at the process instead of the output, the techniques will look after themselves. It seems of central importance for the manager and scientist to recognize that each has a distinctive style of problem solving, and that each should accept the other’s difference.

If the management scientist can anticipate the fact that the manager may not use in his decision-making process the conscious planning that is so natural for the scientist himself, he will be less likely to assume that the manager’s reluctantly given statement of what the problem is has any permanent force. The intuitive manager can recognize a good plan, if he can validate it at some point on his own terms; the scientist’s responsibility is to provide the plan and also the validation.

The manager’s responsibility is to make very clear, first to himself and then to the scientist, what he wants the model to do and to be. If he asks for an optimization program for a facilities planning project, he should decide well in advance what he will do with the results. If he knows that he will not make his decision on the basis of the model’s output, he should make sure that the design process and the model structure allow him to use the model to amplify his own thinking.

The intuitive manager is very happy to relinquish the mechanics of formal analytic techniques to the expert, but only after he has developed confidence and trust in that expert. It is in this sense that the common recommendation of educating the manager in quantitative skills seems so inadequate. The intuitive manager will learn to make use of these skills supplied by others; but this learning is internal, experiential, and informal.

More than anything, the manager needs to learn how to tell a good model from a bad one. For him, a good model is one that he can make sense of by testing his own scenarios. However sloppy this may seem to the systematic scientist, his model will be used only if it allows the manager to make such tests, or if the design process itself has provided such validation along the way.

People in general tend to assume that there is some “right” way of solving problems. Formal logic, for example, is regarded as a correct approach to thinking, but thinking is always a compromise between the demands of comprehensiveness, speed, and accuracy. There is no best way of thinking. If the manager and the management scientist can recognize first that each has a different cognitive style, and thus a different way of solving the same problem, then their dialogue seems more likely to bear fruit.

Our model of cognitive style is not necessarily either complete or precise. We suggest, however, that it does provide a useful way of focusing on the implementation of analytic models for decision making and of developing strategies of action that are much more likely to succeed than those based on concepts of technique, education, and salesmanship.

1. “Management Science and Business Practice,” HBR July–August 1973, p. 41.

2. See Jerome S. Bruner, Jacqueline J. Goodnow, and George A. Austin, A Study of Thinking (New York, John Wiley & Sons, 1956).

3. These experiments are described in detail in Peter G.W. Keen, “The Implications of Cognitive Style for Individual Decision Making,” unpublished doctoral dissertation, Harvard Business School, 1973.

4. See Isabel Briggs Myers and Katharine C. Briggs, “The Myers-Briggs Type Indicator,” Educational Testing Service, New Jersey, 1957.

5. “A Program for Research on Management Information Systems,” Management Science, January 1973, p. 475.

6. See “Is TIMS Talking to Itself?” Management Science, December 1965, p. B-156.

7. See James L. McKenney, “A Taxonomy of Problem Solving,” working paper, Harvard Business School, 1973.

8. “An Intuitive Computer System: A Cognitive Approach to the Management Learning Process,” unpublished doctoral dissertation, Harvard Business School, 1973.

9. “The Roles of the Manager and Analyst in Successful Implementation,” paper presented to the XX International Meeting of the Institute of Management Sciences, Tel Aviv, Israel, 1973.
