
"The ability to learn faster than your competitors may be the only sustainable competitive advantage." (de Geus, 1988)

The QFD structure for knowledge mapping

Greald Henstra *, Jo M.L. van Engelen *^
Rijksuniversiteit Groningen © 26 May 2000

Re-edited. Previous version published in:
Transactions from the eleventh Symposium on Quality Function Deployment, 1999, Ann Arbor, MI : QFD Institute

This paper focuses on knowledge as a constituent of core competence (Prahalad, 1990) and core capability (Leonard, 1995). Core capability, however, often comes together with core rigidity, which in turn hampers a firm's striking power (Leonard, 1995). These opposing effects should be balanced.

The nature of knowledge and of learning (how knowledge is created) provides a basis for collective learning and learning organisations (Weggeman). Based on the anatomy of learning organisations (Senge), the learning process is analysed, resulting in a model for knowledge creation in the organisation (Nonaka).

In this model 'tacit' knowledge plays an essential role. 'Tacit' knowledge runs the risk of being wasted, a risk that is expected to decrease as the knowledge is revealed. The first step in revealing it may be to monitor the process in which it is created. The approach proposed here, referred to as LPMQ (Learning Process Monitor applying QFD), uses a QFD-like structure as a tool for this monitoring.

To validate LPMQ, the steps in QFD (Akao, Hauser) are compared to Nonaka's.

1. Introduction and research objectives

The development of new products is a process of strategic importance to a firm. After all, it contributes directly to the firm's market position.

But in parallel with the product itself, the product development process also yields knowledge. It is this kind of knowledge that Coca-Cola cherishes in the recipe for its beverage. Generally, though, much subtler forms of knowledge form the backbone of a business, such as a plumber's knowledge of where exactly to kick in order to repair a heating system (Glashow, 1988).

This knowledge, though originally a side effect, contributes to the commonly shared expertise in the organisation and might be of even more strategic importance to the firm than the product itself.

The core-competence and core-capability approaches confirm this importance. In line with these approaches, this paper is intended to serve the ability to control the knowledge evolution that accompanies product development processes. To this end, the theoretical feasibility of monitoring this kind of learning process by means of a Quality Function Deployment (QFD) structure is studied.

This monitoring ability is assumed to go hand in hand with accessibility to and from a company's knowledge base. After all, if one knows where certain problems have occurred (i.e. to whom), that they apparently have been solved, and what lessons have been learned, one may expect to be fruitfully advised in a similar case.

In fact this is common practice in business; there is heavy reliance on informal contacts through which this kind of information is transferred.

The reason to give this field more structure, however, also stems from practice: these informal networks consist of people linked together, and people happen to be virtually unable to reproduce information exactly.

Core competence and core capability.

"A company's competitiveness derives from its core competencies and core products. [] Core competence is the collective learning in the organisation, especially the capacity to co-ordinate diverse production skills and integrate streams of technologies." (Prahalad and Hamel, 1990)

In Prahalad's view a tree-like model is constructed around this concept, where core competencies, dispersed over the company, form the roots that nourish core products that in their turn "engender business units, whose fruits are end products".

Leonard (1995) makes a distinction between "competencies" and "capabilities". The term "technological capabilities" is used "to encompass the system of activities, physical systems, skills and knowledge bases, managerial systems of education and reward, and values that create a special advantage for a company [.] Such systems may be supplemental, enabling, or core [all original emphases, G.H.]." Core capabilities are here restricted to those that "[] provide a competitive advantage [and, having] been built up over time [] cannot be easily imitated" (Leonard, 1995).

These core capabilities in their turn may create a starting point for further developments, and thus keep the wheel in motion.

Knowledge connected to this notion may help the firm distinguish itself from the competition; it can therefore be assimilated into core capability.

"At least three kinds of skills and knowledge [constitute] this dimension of core capability: (1) scientific (public), (2) industry-specific, and (3) firm-specific[;] moving from 1 to 3 [] increasingly less codified and transferable" (Leonard, 1995). It's knowledge, and especially the 'firm specific' type, that will be the central focal point for this study.

Evolution and core rigidity.

Enterprises generally do not establish themselves in their businesses in an orderly fashion, according to preconceived plans (Collins and Porras, 1998). In fact they struggle and stumble, missing chances, occasionally picking up opportunities, adapting to changes, and then, almost suddenly, find themselves in a certain market position.

In this environment of semi-consciously directed business development, the basic product concept is continually being changed incrementally; it evolves.

One of the characteristics of an evolution is 'path dependency'.

"It could be argued that the major thrust of 'innovation studies' is predicated on the ideas enshrined in the concept of 'path dependency'- namely that innovation is [] not random; that it is [] dependent on a variety of factors [that] determine a 'trajectory'." (Coombs, 1998)

Path dependency "refers to the [] course of the evolution of any [concept]. The choice of the [alternative] in the beginning of the path, which may have satisfied the needs of its time, may lead to an irreversible course of development that locks out superior opportunities that might have been attained if the course had been different.

[A] famous example of path dependency is the organisation of the letters on the typewriter keyboard, QWERTY [(David, 1985)] which has become standardised and fixed even in the face of more efficient alternatives." (Henstra, 2000, p.25-26)

"Decisions and events from the past intrude on the present and shape the future. [] The disciplinary and functional specialties that grew up to create and control the knowledge critical to [create a] product [] inevitably experience difficulty integrating their different knowledge bases." Consequently core capabilities go together with "core rigidities" (Leonard, 1995), or "weaknesses in the strengths" (Trout, 1986) that offer opportunities to competitors for attack.

1.1 Research objectives

The reason for this research. "The rationale [for managing core capability activities] is that they provide the best hope for successfully address[ing] the central paradox [] that every core capability also inherently is a core rigidity." (Leonard, 1995)

A study by Coombs et al. (1998) set out to "explore the [] potential for active 'knowledge management' in order to modify the limits on innovation imposed by path dependency".

These surmises are in line with the rationale for this paper: it is intended to serve the need to control knowledge evolution, by establishing a balance between "core capability" and "core rigidity", or by dealing with "path dependency".

Revelation of 'tacit' knowledge.

Usually knowledge is gained at an operational level, but generally it is not systematically recorded in the firm's corporate information systems. It may be disregarded, for instance because it is not directly oriented towards strengthening the market position. Alternatively, being newly acquired knowledge, it may not fit the database structures at hand. It remains tacit.

The tacitness of this knowledge may eventually turn out to be a source of waste. It is not unusual for solutions to be reinvented over and over again, even within the same firm, in successive product generations. More sadly, solutions once found but not (yet) opportune may simply be forgotten.

Certainly, if an invention is reinvented in later product generations its objectives have been realised (provided that it is satisfactory and still relevant).

Hence the process in itself is effective. (After all, reinventing is common practice, and even so, technology in general is progressing.)

It is the efficiency that suffers. 'Tacit' knowledge runs the risk of being wasted. On the other hand it forms a concealed source of competitive advantage. One way to create a strategic head start in this field is to find this source and reveal it.

Here, in this study, 'tacit' knowledge is taken to cover not only the 'unspeakable', which would be in accordance with Polanyi (1966), but also the (so far) 'unsaid'.

It may be consciously known by one individual but simply not communicated in the organisation. It might be perceived as too trivial, or the organisation might not be able to detect it. Another cause for knowledge to remain 'tacit' may be isolation. "The fragmentation of core competencies becomes inevitable when a diversified company's information systems [] do not transcend SBU lines." (Prahalad, 1990)

Knowledge remains 'tacit' if it is not communicated: codified, transmitted, received and decoded (Shannon, 1959).

Our problem, to reveal 'tacit' knowledge, therefore boils down to two conditions.

One condition is assumed to be monitoring its creation process, because (as indicated before) traces of knowledge left in the monitoring structure are expected to be easily tracked down.

The other condition is the facilitation of the communication process, by providing 'two-way accessibility': messages codified and transmitted into the structure should be easily retraced and decoded. Codification and decoding thus play a key role in this communication. The 'codes' of both actors involved need to coincide; a real problem, especially in the hazy starting phases of product development.
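
A minimal sketch, in Python with purely illustrative names and an invented code book (not taken from any of the cited sources), of why coinciding 'codes' matter: an observation only enters the structure, and can only be retraced from it, if sender and receiver share the same code book.

# Illustrative sketch only; the code book entries are invented examples.
SHARED_CODEBOOK = {
    "customer complains about noise": "CA-ACOUSTICS",
    "pump vibrates at start-up": "EC-VIBRATION",
}

def codify(observation):
    """Sender side: translate an informal observation into a shared code.
    Returns None when the observation is not in the shared code book,
    i.e. when the 'codes' of the two actors do not coincide."""
    return SHARED_CODEBOOK.get(observation)

def decode(code):
    """Receiver side: retrace the observations filed under a shared code."""
    return [obs for obs, c in SHARED_CODEBOOK.items() if c == code]

print(codify("pump vibrates at start-up"))  # 'EC-VIBRATION'
print(decode("EC-VIBRATION"))               # ['pump vibrates at start-up']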

Practices.

In empirical research on prevailing "knowledge management practices" within innovation processes, Coombs et al. (1998) found that these practices could be clustered into five groups, "in terms of the relationships between particular practices and [] functional activities within innovation processes":

A. R&D management (writing technical reports, periodic reviews, physical clustering of projects in cognate technological areas, fixing project experience)

B. Mapping knowledge relationships (mapping knowledge and person-embodied skill both internally and externally, market and inter-company relationships)

C. Human resource management (secondments, encouraging consultancy activities, skills matrices, dual project-line matrix management, courses)

D. Managing intellectual property (more pro-active, early involvement)

E. R&D IT management (enabling new practices and supporting prevailing practices)

Purpose of the paper.

This paper aims at an instrument connected to Coombs' (1998) 'knowledge management practices'. The very fact that these fields of practice have been recognised suggests that they fulfil at least some need.

The paper is furthermore intended to propose an approach, with 'two-way accessibility', to monitor the knowledge creation process in product development, and to establish a theoretical foundation for its validity.

Solution proposal.

In developing products a general problem to overcome, perhaps the central one, is the communication between successive phases or consecutive disciplines. "Core competence is communication, involvement, and deep commitment to working across organisational boundaries." (Prahalad, 1990)

QFD is a method optimised for this purpose. Today, in product development projects, QFD generally plays the role of an instrument for the transfer of information.

Our problem, the revelation of tacit knowledge, can be recognised as akin to this. One way or another, employees do not express their findings. This is not very surprising on reflection: apart from incentives, they need a medium of communication to do so.

"[S]enior management should spend a significant amount of its time [to] a strategic architecture [:] a road map [] that identifies which core competencies to build and their constituent technologies." "The strategic architecture should make resources transparent to the entire organisation. [I]t helps lower level management to understand the logic of allocation priorities and disciplines senior management to maintain consistency" (Prahalad, 1990).

Since a step in this direction was assumed to be the monitoring of the knowledge creation process in the company, the question in this paper is whether QFD can play this role.

2 Conceptual framework

Some prevailing theories are examined in order to draw a line of thought from our basic starting point in the field of "core rigidity" or "path dependency", to the "knowledge creation process", and eventually to the proposed QFD-structured solution.

Knowledge. . .

Weggeman (1997) describes knowledge (which we found to be connected to core capability and consequently to core rigidity) as the "metaphoric product" of information and tacit knowledge. "Tacit knowledge" is defined as: "personal, context-specific, and therefore hard to communicate" (Polanyi, 1966) and, in accordance with Galbraith (1973), information is data that "serve a meaningful purpose in a certain setting []. Information is a means of reducing uncertainty." "However, information can only reduce uncertainty if it adds something to the knowledge of the recipient of the information" (Boersma, 1996). Consequently it takes an individual to convert "data" into "information" and "information" into "knowledge".

Knowledge itself therefore cannot be recorded, and consequently neither can competencies and capabilities; only the data associated with the information component can be.

Concerning the location of knowledge, Boersma and Stegwee (1996) "distinguish between four forms of knowledge (Van der Zwaan & Boersma, 1993): human knowledge [], mechanised knowledge [incorporated in the hardware of a machine], documented knowledge [] and automated knowledge []."

Boersma's "human knowledge" **^^, but even the recorded or stored forms of 'knowledge' may be forgotten if they are not maintained **^*. "Core competence does not diminish by use. Unlike physical assets, which deteriorate over time, competencies are enhanced as they are applied and shared. But competencies still need to be nurtured and protected; knowledge fades if it's not used." (Prahalad, 1990)

This may be illustrated by the worldwide 'Y2K' or 'millennium bug' consternation that is topical at the time of writing (June 1999).

. . . and learning

Learning collectively, being a main component of (core-)competence, can be defined as "going through a process of knowledge creation" according to Weggeman. Collective learning may lead to "shared mental models" (Senge, 1990) that "can enhance synergy and speed of progress [] drastically."

Learning organisations.

"Organisational learning", reasoning this way, coming together with "collective learning in the organisation" is described (Senge 1990) to be developed in the five "disciplines": "personal mastery", "mental models", "building shared vision", "team learning" and "systems thinking[,] the fifth discipline [] fusing [the other four] to a coherent body of theory and practice."

It's "systems thinking", that integrates the others. Without the emphasis on The System the interaction between the other disciplines would not be clear. "The essence of [] systems thinking lies in a shift of mind: seeing interrelationships rather than linear cause-effect chains, and seeing processes of change rather than snapshots."

Senge's five-disciplines model appears to indicate what forces to mobilise in organisational learning and how to arrange them. But understanding their operation requires a closer look. The focus should shift from 'why organisations learn' to 'what is created' in the process. In other words, the "collective" or "organisational learning" process considered so far will be described as the process that creates its result: knowledge.

Knowledge creation.

Nonaka and Takeuchi (1995) already recognised the relation of Senge's "five disciplines" to the constituents for their model: "Judging from the entire argument of [Senge's] book, more specifically from such terms as "mental models", "shared vision", "team learning" [] his practical model of "learning organisation" has some affinity with our theory of knowledge creating []".

Knowledge can be labelled "tacit" or "explicit"; "tacit" knowledge is again defined following Polanyi (1966), and "explicit" (synonymous with "codified") knowledge as "transmittable in formal, systematic language". "Tacit" and "explicit" knowledge can be converted into each other.

The conversion from "tacit knowledge" to "explicit knowledge" is referred to as "externalisation", a part of the "knowledge spiral" and resulting in "conceptual knowledge". It seems clear that here a strong connection can be made with Senge's "shared mental models". Both notions refer to a commonly felt understanding or view on a certain subject.

"Combination [i.e. conversing "explicit knowledge" into "explicit knowledge" in another place] is a process of systemising [explicit] concepts into a knowledge system", converting them to "systemic knowledge". Seen in this light the "co-ordinating-production-skill-and-integrating-streams-of-technology" component of Prahalad's "core competence", might be recognised as a "combination" mode of "collective learning in the organisation". Technologies and production skills from all company branches may be combined to new "core products".

In our earlier statements about making 'tacit' knowledge "explicit", the "combination" mode applies as well, following the description given above. In Nonaka's terms, 'revealing tacit knowledge' as we use it would thus be referred to as "externalisation" as well as "combination".

Conversion from "tacit" to "tacit" is referred to as "socialisation"; the case that "explicit" knowledge is converted to "tacit" knowledge is indicated as "internalisation": learning by doing. Nonaka & al. incorporated these modes into the steps of their "five phase model of the organisational knowledge creating process". This model, here presented as the outcome of several theories on knowledge and learning, will serve as a touchstone for our proposal to monitor the learning process.

QFD

Akao (1990) defines QFD as "converting the customers' demands into "quality characteristics" and developing a design quality for the finished product by systematically deploying the relationships between the demands and the characteristics, starting with the quality of each functional component and extending the deployment to the quality of each part and process."

Akao bases his philosophy on Feigenbaum's notion of a quality system: "a system of administrative and technical procedures required to produce and deliver a product of specified quality standards". Combined with Juran's definition of a 'quality function', "a function [e.g. planning, design, trial, manufacturing; i.e. a production function, not a product function] that forms quality", he argues that "a quality system [] really [is] a logical arrangement, or sequence, of "quality functions". He concludes, using Shigeru Mizuno's (1978) definition of deployment of quality functions as a "step-by-step deployment in greater detail of the functions or operations that form quality systematically []", that a "quality system can be based on the deployment of quality functions".

The core of the QFD approach might be expressed by the statement that "product [-or more in general: system-] quality can be assured through the quality of the subsystems."

Akao operationalised this approach with a series of "quality charts", from one "quality function" to the next, also visualised in "the House of Quality" (Hauser, 1988):

"[] a chart that illustrates the relationship between [] quality (as demanded by the customer) systemised according to function, and the quality characteristics as counterpart characteristics."

"[It's] a graphic device that enables [among other things] to develop a design quality. Quality design is the entire process of converting the quality demanded by the customer into counterpart characteristics by means of reasoning, translating and transferring. The quality chart therefore is the basis for quality design."

For our purpose, QFD is proposed as a framework for communication (a "transmission" channel, not only in space but also in time) encasing the product development process as a whole (the scheme that rules individual projects), rather than finding a place within individual product development projects.

For the latter part this is much in line with Kyeong's (1998) proposal, whose "main idea [] is as follows:

1. Similar products have similar attributes of HOQ [= House of Quality] charts []. If similar HOQ charts are built into a same class, managing the HOQ charts is more efficient.

2. HOQ charts in the same class are classified into a rule tree according to their similarity degree. The main reason is to locate more similar charts nearby for the efficient selection.

3. IF-THEN typed knowledge retrieves the more similar HOQ chart from the selected class base for a new product. Based on the retrieved HOQ charts, human experts can modify the chart with ease. []

4. More QFD analysis is performed, the knowledge base case becomes more richer. []"
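
Point 3 of this list can be pictured with a rough retrieval sketch; the similarity measure (Jaccard overlap of customer attributes) and all names are our own assumptions and do not reproduce Kim et al.'s rule-tree algorithm.

def jaccard(a, b):
    """Overlap between two sets of customer attributes."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def retrieve_similar_chart(chart_library, new_attributes):
    """Return the stored HOQ chart whose attributes best match the new product."""
    return max(chart_library,
               key=lambda chart: jaccard(chart["attributes"], new_attributes))

library = [
    {"name": "kettle mk1", "attributes": {"boils fast", "quiet", "safe to touch"}},
    {"name": "toaster mk2", "attributes": {"even browning", "safe to touch"}},
]
best = retrieve_similar_chart(library, {"boils fast", "safe to touch"})
print(best["name"])   # kettle mk1 -- the human expert then modifies this chart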

3. The 'LPMQ' model outline (Learning Process Monitor applying QFD)

Now, suppose a structure like QFD's were used universally within the company as an administration file for development projects.

Like an umbrella covering all projects, rather than playing more humble roles within several dispersed projects, this structure might (in addition to Kyeong's argument) also serve as a communication platform.

If, in running a project, a new relationship is learned, it can simply be added to the system. If a new characteristic is discovered in an individual project, a new 'row or column' can be added to the system. (Compare Kyeong, 1998: "[] the base becomes [] richer.")

Of course the new addition will be available to the project at hand in the first place, but the structure may also give the entire organisation the opportunity to apply this knowledge, in real time if necessary. (Compare Prahalad, 1990: "strategic architecture".)

Incidentally, only this 'row or column' would have to be balanced and judged, requiring a fraction of the time needed to fill out complete matrices over and over again, and yet yielding a full overview of characteristics and relations. (Let it grow.)

The QFD structure can be concluded to have fairly good 'two-way accessibility': first for locating where to store certain data, second for retrieving them. Both ways are guided by the functional structure, since for every function standardised entrance routes can be made available. In its role as a source of knowledge the structure could give projects a quick start-up, by simply neglecting non-applicable items, prior to, or instead of, analysing the situation at hand. (Compare Kyeong, 1998: "IF THEN typed knowledge [&c.]")
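
What such a 'universal QFD-like structured system' could amount to is sketched below: one shared, growing relation store with entrance routes from either side, instead of per-project matrices. Class and method names are illustrative assumptions only.

from collections import defaultdict

class LPMQStore:
    """Company-wide store of learned attribute/characteristic relations."""

    def __init__(self):
        self.by_attribute = defaultdict(dict)       # attribute -> {characteristic: record}
        self.by_characteristic = defaultdict(dict)  # characteristic -> {attribute: record}

    def add_relation(self, attribute, characteristic, strength, project, note=""):
        """A project adds one newly learned relation ('row or column');
        only this addition needs to be judged, the rest stays untouched."""
        record = {"strength": strength, "project": project, "note": note}
        self.by_attribute[attribute][characteristic] = record
        self.by_characteristic[characteristic][attribute] = record

    # 'Two-way accessibility': enter the structure from either side.
    def lookup_attribute(self, attribute):
        return dict(self.by_attribute.get(attribute, {}))

    def lookup_characteristic(self, characteristic):
        return dict(self.by_characteristic.get(characteristic, {}))

store = LPMQStore()
store.add_relation("quiet operation", "pump speed (rpm)", 9,
                   project="heater-2000", note="resonance above 2800 rpm")
# Any later project can retrace who learned what, in real time.
print(store.lookup_characteristic("pump speed (rpm)"))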

Especially the "externalisation" and the "combination" modes in Nonaka's (1995) learning cycle, i.e. conversion of 'tacit' knowledge as we consider it, may be cut short in time (and in location if also the experts are mapped). As long as hard accessibility to available knowledge is a serious friction, the performance of the development process may be expected to improve significantly by the application of such an 'universal QFD-like structured system'.

4 Evaluation

To validate LPMQ, i.e. to assess whether a QFD-like structure is appropriate for monitoring the learning process, the steps in QFD, as described by Akao, are compared to Nonaka's "five phase model".

Nonaka's five-step model starts with "sharing tacit knowledge". This "socialisation" of "tacit" knowledge, as well as the collecting of 'tacit' knowledge, provides the raw material for the learning process. So does the first step in developing a "quality plan and quality design": "[to] survey the expressed and latent quality demands of consumers []" (Akao, 1990).

This raw material is transformed in "creating concepts [where] the [] team articulates [] a shared mental model [] in the form of collective reflection. The shared tacit mental model is verbalised []. In this sense this phase corresponds to externalisation" (Nonaka, 1995).

This appears to be a stage where QFD usually finds its application.

It can be recognised in the setting up of a "House of Quality" as listing and organising the "voice of the Customer" (Hauser, 1988) or "sampling [] consumer comments [] or 'verbations'" (Akao, 1990).

For instance, the "Customer Attributes" list (consisting partly of latent, i.e. "tacit", requirements) can be considered 'tacit' knowledge made explicit. After all, the attributes are expressed during listing and transformed into 'concepts', reflected by the "grouping in bundles of attributes that represent an overall customer concern" (Hauser, 1988).

Next the "new concepts [] need to be justified" *^^*^ [], in order to become "justified true belief" (Nonaka, 1995). "Justification involves the process of determining if the newly created concepts are truly worthwhile for the organisation and society. It is similar to a screening process. Individuals seem to be [] screening [] concepts [] continuously and unconsciously [] throughout the entire process. The organisation however [] must conduct this [] in a more explicit way to check if [] the [] intention is still in tact and [] if the concepts [] meet [their] needs []".

In QFD this screening process is conducted explicitly. The "Customer Attributes", now "grouped in bundles", are 'justified' by weighing their relative importance and comparing them with competitors' performances in 'perceptual maps' (Hauser, 1988).

Subsequently, in Nonaka's "five step model", an "archetype is built" by "combining several elements of explicit knowledge", both already existing and just created.

In building the "House of Quality" this step can be recognised in the implications drawn from the 'perceptual map' which provides "a natural link from the 'product concept' to the 'company's strategic vision'". (Hauser, 1988)

The "archetype" here can be identified as the strategic positioning and the innovation direction, resulting from the 'relative importance weights' and from the 'perceptual map'.

"Organisational knowledge creation is a never ending process that up-grades itself continuously. [] The new concept [] moves on to a new cycle of knowledge creation at a different ontological level [i.e. the spectrum: individual - group - organisation]" (Nonaka, 1995).

In their picture Nonaka and Takeuchi associate this "cross levelling knowledge" phase with "internalisation", where (new) explicit knowledge triggers the emergence of new "tacit" knowledge (within the organisation as well as outside it: users, suppliers, etc.).

In QFD this phase appears in the centre of the "House of Quality": the "relationship matrix".

This transition gives rise to the emergence of the "Engineering Characteristics", which in the next cycle will be sampled ("sharing tacit knowledge"), listed and organised ("creating concepts"), "justified" in filling out "the Roof of the House", "archetyped" in "objective measures", and "cross-levelled" in the second relationship matrix.
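
Our reading of this comparison can be condensed into a simple correspondence table; the right-hand side is a paraphrase of the QFD steps discussed above, not a quotation from Akao or Hauser.

# Nonaka's five phases against the House of Quality steps, as read above.
NONAKA_TO_QFD = {
    "sharing tacit knowledge": "survey expressed and latent customer demands",
    "creating concepts": "list and group the voice of the customer into attribute bundles",
    "justifying concepts": "weigh relative importance; compare with competitors (perceptual maps)",
    "building an archetype": "derive strategic positioning and innovation direction",
    "cross-levelling knowledge": "fill the relationship matrix; carry the engineering characteristics into the next chart",
}

for phase, qfd_step in NONAKA_TO_QFD.items():
    print(f"{phase:26} -> {qfd_step}")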
 
 

5 Conclusions

In this paper a first step was taken towards answering the question of how to reveal 'tacit' knowledge. One condition assumed to be fulfilled is the congruency of QFD with Nonaka's model (with 'tacit' knowledge, also in our sense, as an essential element).

As demonstrated here, the QFD method appears to be quite congruent with the five-step model; it might therefore be expected to be able to serve to monitor the steps in knowledge creation as Nonaka described them. This also applies to the LPMQ proposal, since it constitutes merely a special case of QFD.

For further research, a next step could be to support this conjecture with empirical evidence or simulation. If it then still appears to be valid, a "(tacit) knowledge map" might be constructed as a QFD-like structure, which could be researched as a format for the strategic architecture that Prahalad (1990) proposed: a map for core competence.

REFERENCES

Akao, Y., Mazur, G. (transl.),
Quality Function Deployment-integrating customer requirements into product design, Productivity Press, 1990, Cambridge, Mass

Boersma, Jacques S.K.Th., Stegwee, Robert A.,
Exploring the issues in Knowledge Management, In: Khosrowpour, M. (ed.), pp.217-222., 1996

Collins, J.C., Porras, J.I.,
Built to last : successful habits of visionary companies, Century Business, 1998, London UK

Coombs, R., Hull, R.,
Knowledge management practices' and path dependency in innovation, Research Policy 27 (1998)

David, Paul A.,
Clio and the Economics of QWERTY, American Economic Review, 75/2, 1985

de Geus, Arie,
Planning as Learning, Harvard Business Review, 1988 mar/apr (in: Senge 1990)

Fahey, L.,
Competitors, outwitting, outmanoeuvring and outperforming, John Wiley and Sons, 1998, New York, NY

Feigenbaum, A.V.,
Total quality control, McGraw-Hill, 1991, New York, NY

Fauconnier, G.,
Algemene communicatietheorie: een overzicht van de wetenschappelijke theorieen over communicatie, Martinus Nijhoff Wetenschappelijke en Educatieve Uitgevers, 1986, Leiden

Galbraith, J.,
Designing complex organizations, Addison Wesley, 1973 (in: Boersma, 1996)

Glashow, Sheldon L.,
Interactions, Warner Books, 1988, New York, NY

Hauser, J.R., Clausing, D.,
The house of Quality, Harvard Business Review, May-June 1988

Henstra, D.J.,
The evolution of the money standard in medieval Frisia, PhD-thesis, University of Groningen, 2000, Groningen NL

Hollins, W., Pugh, S.,
Successful product design: what to do and when, Butterworth, 1990, London, UK

Juran, J.M., Gryna, F.M., Bingham R.S.,
Quality control handbook, McGraw-Hill, 3rd ed., New York, NY

Khosrowpour, M. (ed.),
Information Technology Management and Organizational Innovations, Proceedings of the 1996 IRMA International Conference, May 19-22, 1996, Washington D.C., U.S.A

Kyeong Jae Kim, Chang Hee Han, Sang Hyun Choi, Soung Hie Kim,
A knowledge-based approach to the quality function deployment, Computers & Industrial Engineering, Vol 35, No 1-2 pp.233-236, 1998

Leonard, D.,
Wellsprings of knowledge, Harvard Business School Press, 1995, Boston, Mass

Mizuno, Shigeru, Akao, Y.,
Quality Function Deployment: Approach for Total Quality Control, JUSE, 1978, (in: Akao, 1990)

Nooteboom, B.
Towards a cognitive theory of the firm. Issues and a logic of change. AFEE Conference Proceedings, 1996, San Francisco (in: Boersma 1996)

Nonaka, I., Takeuchi, H.,
The knowledge-creating company, Oxford Univ. Press, 1995, New York NY

Omta, S.W.F., Fortuin, F.,
Benchmarking of world class R&D, University of Groningen, 1998, presented at IAMOT Conference, 15-17 March 1999, Cairo

Polanyi, M.,
The tacit dimension, Doubleday, 1966, New York (in: Nonaka, 1995)

Prahalad, C.K., Hamel, G.,
The core competence of the corporation, Harvard Business Review, May-June 1990

Senge, P.,
The fifth discipline, Currency Doubleday, 1990, New York NY

Shannon, C.E., Weaver, W.,
The mathematical theory of communication, University of Illinois Press, 1959, Urbana Ill., (in Fauconnier, 1986)

Trout, J., Ries, A.,
Marketing Warfare, McGraw Hill, 1986, New York NY

Van der Zwaan, A.H., Boersma, S.K.Th.,
Kennismanagement [Knowledge management], Bedrijfskunde, vol. 65, no. 4, pp. 401-411, 1993 (in: Boersma, 1996)

Weggeman, M.,
Organiseren met kennis, Scriptum Management, 1997, Schiedam NL

