
JISTEM JOURNAL OF INFORMATION SYSTEMS AND TECHNOLOGY MANAGEMENT REVISTA DE GESTÃO DA TECNOLOGIA E SISTEMAS DE INFORMAÇÃO

www.jistem.fea.usp.br

ISSN: 1807-1775

Volume 8 : Number 3 : 2011

Available Online / Disponível Online
Apoio / Support: USP


JISTEM Journal of Information Systems and Technology Management Revista de Gestão da Tecnologia e Sistemas de Informação Vol.8, No.3, Sept/Dec, 2011, pp. 513-764 ISSN online: 1807-1775

Volume 8: Number 3 / Volume 8: Número 3

2011

Content / Índice 513-514

1. ICTS – new organizational form linkage in the Australian context: Theoretical model and research instrument
Ahmad Abareshi, Bill Martin, Alemayehu Molla, RMIT University, Melbourne, Australia 515-538

2. Semantic wikis and the collaborative construction of ontologies: case study
Fernando Hadad Zaidan, Marcello Peixoto Bax, Universidade Federal de Minas Gerais – UFMG, Brazil 539-554

3. Aligning information security with the image of the organization and prioritization based in fuzzy logic for the industrial automation sector
André Marcelo Knorst, Coester Automação Ltda, São Leopoldo, RS, Brazil; Adolfo Alberto Vanti, UNISINOS, São Leopoldo, RS, Brazil; Rafael Alejandro Espín Andrade, Universidade Técnica de Havana, Cuba; Silvio Luiz Johann, Fundação Getúlio Vargas, Rio de Janeiro, Brazil 555-580

4. Understanding the subject's behavior in the interaction with a decision support system under time pressure and missing information
Kathiane Benedetti Corso, Mauri Leodir Löbler, Federal University of Rio Grande do Sul, RS, Brazil 581-604

5. Risk analysis in outsourcing of information technology and communication
Edmir Parada Vasques Prado, EACH, University of São Paulo – USP, Brazil 605-618

6. Metamodels of information technology best practices frameworks
Arthur Nunes Ferreira Neto, João Souza Neto, Catholic University of Brasília, Brazil 619-640

7. XLDM: an Xlink-based multidimensional metamodel
Paulo Caetano da Silva, Federal University of Pernambuco, Brazil; Mateus Silqueira Hickson Cruz, Ceu System, São Paulo, Brazil; Valéria Cesário Times, Federal University of Pernambuco, Brazil 641-662

8. Estrategias de decisión en sistemas dinámicos: aplicando mapas cognitivos difusos; aplicación a un ejemplo socio-económico
Lisandro Curia, Andrea Lavalle, Universidad Nacional del Comahue, Buenos Aires, Argentina 663-680

9. Recommender Systems in Social Networks
Cleomar Valois B. Jr, Marcius Armada de Oliveira, Centro Universitário da Cidade do Rio de Janeiro, RJ, Brazil 681-716

R. Gest. Tecn. Sist. Inf. /JISTEM Journal of Information Systems and Technology Management, Brazil


10. Managing IT as a business: The Lutchen's Gap in the 100 Top Organizations based in Brazil
Sergio Alexandre Simões, Leonel Cezar Rodrigues, Emerson Antonio Maccari, Nove de Julho University – UNINOVE, Brazil; Mauricio Fernandes Pereira, Federal University of Santa Catarina – UFSC, Brazil 717-748

Events / Eventos 749

Contributions / Submissão de Artigos 750-751

Editorial Information / Ad Hoc Reviewers / Content 2011 – Informações Editoriais / Avaliadores Ad Hoc / Índice 2011 752-764

JISTEM, Brazil Vol.8, No. 3, Sept/Dec. 2011, pp. 513-514

www.jistem.fea.usp.br


JISTEM Journal of Information Systems and Technology Management Revista da Gestão da Tecnologia e Sistemas de Informação ISSN online: 1807–1775

Every four months/Quadrimestral

Universidade de São Paulo – FEA USP
Prof. Dr. João Grandino Rodas – Reitor/Rector
Prof. Dr. Hélio Nogueira da Cruz – Vice-Reitor/Vice-Rector
Prof. Dr. Reinaldo Guerreiro – Diretor da FEA/Dean of FEA

Editor: Prof. Dr. Edson Luiz Riccio, University of São Paulo – FEA, Brazil
Assistant Editor: Marici Gramacho Sakata, TECSI, University of São Paulo – FEA, Brazil

Editorial Board – Comitê de Política Editorial
Armando Malheiro da Silva, University of Porto, Porto, Portugal
Christophe Benavent, Université Paris Ouest Nanterre La Défense, Paris, France
Henrique Freitas, Federal University of Rio Grande do Sul, Rio Grande do Sul, Brazil
JaeJon Kim, Chonnam National University, Gwangju, Korea
Luc Marie Quoniam, University Paris 8, Paris, France
Michael D. Myers, University of Auckland, Auckland, New Zealand
Miklos Vasarhelyi, Rutgers Business School, New Jersey, USA
Rejane Maria da Costa, University of Brasilia, DF, Brazil
Robert D. Galliers, Bentley College, Massachusetts, USA

Editorial Review Board – Comitê Científico Editorial
Adam Mazurkiewicz, Instytut Technologii Eksploatacji, Poland
Adalberto A. Fischmann, University of São Paulo, São Paulo, Brazil
Antonio Carlos dos Santos, Federal University of São Carlos, São Carlos, Brazil
Birger Hjørland, Royal School of LIS, Copenhagen, Denmark
Burak Arzova, Marmara University, Istanbul, Turkey
Dennis F. Galletta, University of Pittsburgh, Pittsburgh, USA
Emerson Maccari, Uninove, São Paulo, Brazil
Fabio Frezatti, University of São Paulo, São Paulo, Brazil
Fernando Colmenero Ferreira, University of Madeira, Madeira, Portugal
Geraldo Lino de Campos, University of São Paulo, São Paulo, Brazil
Gilson Schwartz, University of São Paulo, São Paulo, Brazil
Guilherme Ari Plonski, University of São Paulo, São Paulo, Brazil
Jan Capek, Univerzita Pardubice, Pardubice, Czech Republic
Jose Dutra de Oliveira Neto, University of São Paulo, São Paulo, Brazil
José Rodrigues Filho, Universidade Federal da Paraíba, Paraíba, Brazil
Miguel Juan Bacic, University of Campinas, Campinas, Brazil
Napoleão Verardi Galegale, Centro Paula Souza and Galegale Associados, São Paulo, Brazil
Rosana Grillo Gonçalves, University of São Paulo, Brazil
Salvador Ruiz-de-Chavez, APCAM, Ciudad de Mexico, Mexico

Published by TECSI – Laboratório de Tecnologia e Sistemas de Informação – EAC FEA USP
Av. Prof. Luciano Gualberto, 908, FEA 3, Cidade Universitária, São Paulo/SP 05508-900, Brasil
Fone: 55-11-3091-5820 r.190 Fax: 55-11-3091-5820 Email: jistem@usp.br

Indexation/Directories: SciELO, Latindex, ProQuest, Ulrich's Periodical Directory, DOAJ, The Index of Information Systems Journals, ACPHIS, Dialnet, EBSCO, Gale Infotrac, Portal de Periódicos USP, CAPES

Webmaster: jistem@usp.br
Technical Support: Equipe TECSI, pesquisatecsi@usp.br

Terms and Conditions: The license lets others distribute, remix, tweak, and build upon your work, even commercially, as long as they credit you for the original creation. This is the most accommodating of the licenses offered, and is recommended for maximum dissemination and use of licensed materials.

Direitos e Permissão: Os artigos são de total responsabilidade dos autores e todos os direitos reservados ao TECSI. Esta licença permite que outros distribuam, remixem, adaptem e criem a partir da sua obra, mesmo comercialmente, desde que lhe deem crédito pela criação original.

Support: USP


JISTEM - Journal of Information Systems and Technology Management Revista de Gestão da Tecnologia e Sistemas de Informação Vol. 8, No. 3, Sept/Dec. 2011, pp. 515-538 ISSN online: 1807-1775 DOI: 10.4301/S1807-17752011000300001

ICTS – NEW ORGANIZATIONAL FORM LINKAGE IN THE AUSTRALIAN CONTEXT: THEORETICAL MODEL AND RESEARCH INSTRUMENT Ahmad Abareshi Bill Martin Alemayehu Molla RMIT University, Melbourne, Australia _____________________________________________________________________________________

ABSTRACT

Since the publication of the seminal article ‘Management in the 1980s’ (Leavitt and Whisler, 1958), the relationship between Information and Communications Technology (ICT) and organizations has been one of the most challenging issues for management scholars and researchers. Despite a long tradition of research into the relationship between ICTs and organizations, the findings remain inconclusive. In particular, the specific mechanisms by which new information technologies affect and are affected by organizational forms have not been described in any systematic manner. This paper aims to contribute to addressing this gap by developing an instrument and a theoretical model that relate ICT to the attributes of new organizational forms (NOFs). Likert-type scales were used for all items. Of the 3770 emails sent to top Australian managers, 312 questionnaires were completed and returned. An Exploratory Factor Analysis was used to identify the underlying constructs in this research. The findings provide an instrument whose properties have been validated and which is ready for use in future research.

Keywords: IT strategic alignment, new organizational forms, information and communication technologies, Australia

_____________________________________________________________________________________
Manuscript first received/Recebido em: 14/05/2011 Manuscript accepted/Aprovado em: 22/09/2011

Address for correspondence / Endereço para correspondência

Ahmad Abareshi, PhD. Lecturer in the School of Business IT and Logistics, RMIT University, Australia. He holds a Bachelor degree in management, an MBA in Operations Research, and a PhD in Information Technology from RMIT University. School of Business IT and Logistics, RMIT University, Bld 108, Level 16, Room 65, Bourke Street, Melbourne 3001, Australia. Phone: +61 3 9925 5918 Fax: +61 3 9925 5850 Email: ahmad.abareshi@rmit.edu.au

Bill Martin, PhD. Research Director in the School of Business IT, PhD supervisor, and researcher and consultant in the area of business models for digital publishing. School of Business IT, RMIT University, Melbourne, Australia. Phone: +61 3 9925 5843 Email: bill.martin@rmit.edu.au

Alemayehu Molla, Associate Professor, School of Business IT & Logistics, RMIT University. He holds a Bachelor degree in management, a Master degree in information science, and a PhD in information systems. School of Business IT and Logistics, RMIT University, Bld 108, Level 17, Room 54, Bourke Street, Melbourne 3001, Australia. Phone: +61 3 9925 5803 Fax: +61 3 9925 5850 Email: alemayehu.molla@rmit.edu.au

Published by/Publicado por: TECSI FEA USP – 2011 All rights reserved.


516

Abareshi, A., Martin, B., Molla, A.

1. INTRODUCTION

Organizational form refers to the combination of strategy, structure, internal control, and coordination systems that provides an organization with its operating logic, resource allocation rules, and corporate governance mechanism (Creed and Miles, 1996). Since the emergence of conventional organizational forms such as the hierarchy and the bureaucracy, organizations have been continuously transforming into newer forms. New organizational forms (NOFs) have acquired a variety of labels, including the fluid form (Schreyögg and Sydow, 2010), the network organization (Ghoshal et al., 1999; Maguire and Hardy, 2006), and the virtual organization (Davidow and Malone, 1992). Previous research has therefore focused on identifying the contexts, processes and variables that are associated with the emergence of these organisational forms. In terms of context, Beugre et al. (2006: 52) believe that the volatility of the external environment influences how organizations restructure themselves to cope with changes or to anticipate them. More recently, the contingency approach (soft determinism) proposes that a set of factors has determined new forms of organizations. Globalization, deregulation, the convergence of industries, and rapid technological advancements, particularly in Information Technology (IT) and telecommunications, are the contexts through which NOFs are emerging (Fulk and Desanctis, 1999). In terms of process, the progression toward new organizational forms has been gradual in most firms, dramatic in some, and nonexistent in others. A number of variables have also been associated both with the shape of NOFs and with the underlying processes that have produced them. For instance, Fulk and Desanctis (1999: 5) showed how advances in IT could influence socioeconomic systems and facilitate the evolution of NOFs. Developing a process-oriented model to assess the impacts of IT on critical business activities, Tallon et al. (2000: 145) propose strategic intent for IT and management practices as two variables that influence the organization. An important new stream of thought stressing the importance of organizational fluidity has emerged in recent years; it represents a reaction to the increasing complexity and environmental turbulence that organizations have to master (Schreyögg and Sydow, 2010). This paper falls in the tradition of research that has been looking into the variables that contribute to NOFs. Since the publication of the seminal article ‘Management in the 1980s’ (Leavitt and Whisler, 1958), the relationship between Information and Communications Technology (ICT) and organizations has been one of the most challenging issues for management scholars and researchers. What makes the ICT-organization relationship so thought-provoking is that it touches not only the complex combinations of knowledge and ICT synergies, but also their implications for a range of variables including cost, quality, accuracy, risk, efficiency and productivity (Sauer and Willcocks, 2004). Most, if not all, of the previous research (see, for example, Winter and Taylor, 1999) argued for a positive link between changes in ICTs and changes in some individual dimensions of the organization. This includes the effect of ICTs on a firm's strategy, organization structure, internal control and operating systems, jobs and skills, and behaviour, values and norms (Garicano, 2000; Panayides, 2004; White et al., 2005; Rajan and Wulf, 2006). Generically, ICTs are also associated with



517 ICTs –New Organizational Form linkage in the Australian Context: Theoretical Model and Research Instrument

the emergence of new forms of organisations that operate a digital business, in the digital space, and market digital products and services (Apigian et al., 2006). Despite a long tradition of research into the relationship between ICTs and organizations, the findings remain inconclusive. In particular, the specific mechanisms by which new electronic technologies affect and are affected by organizational form have not been described in any systematic manner (Henderson and Venkatraman, 1999). In addition, there is limited research into the shape and form of the organisation that has emerged as a result of many years of ICT assimilation. Further, although the impact of ICT at a very generic level is known, there is much less research that relates the specific attributes of ICTs (upstream factors) to the attributes of the NOFs (downstream factors) that emerge as a result of the impacts of ICTs. This paper aims to make a contribution to addressing this gap. Hence, the purposes of the study are (1) to identify the features of new organisational forms; (2) to develop an underlying model that relates these features to attributes of ICT; (3) to develop and test an instrument that aids in operationalising the model; and (4) to discuss the theoretical and practical utility of the model and instrument.

2. CONCEPTUAL FRAMEWORK

Previous research on the effects of ICTs on the various dimensions of an organization has covered a number of variables. While some studies focus on the effect of ICTs on organization size, scope and product, others specifically look at the effect of ICTs on vertical and horizontal control mechanisms. Following the advent of the Internet, the effect of ICTs on the quality of an organization's connections has also received some research attention. Table 1 summarizes a sample selection of this literature.

Traditionally, organizational forms have mainly been designed for coordination and control purposes in the presence of time and distance barriers. According to Dutton (1999), technological innovations have led to changes in organizational forms, offering new capabilities for overcoming such constraints. For example, telephone, telegraph, and mail systems have enabled organizations to build better organizational and interorganizational communication systems. New ICTs, too, have provided modern capabilities that influence organizational processes (Huber, 1993). The combination of hardware, applications, and infrastructure forms the capabilities of ICTs; as each of these dimensions develops, the concept, design and capabilities of ICTs change dramatically. ICT resources cover a wide range of services such as e-mail, voice mail, teleconferencing, desktop videoconferencing, computer-aided design (CAD), discussion lists, information databases, groupware, intranets, e-procurement, e-logistics, and e-government. A number of researchers have focused on the impact of these capabilities on organizational dimensions in general and on the dimensions of NOFs in particular. For instance, Marschak (2004: 473) believes that these capabilities have overcome traditional communication difficulties and affected the role of middle managers and the organizational hierarchy. In another study, Panayides (2004: 35) pointed out how advanced ICT capabilities could decrease the size of the organization. What can clearly be deduced from this perspective, therefore, is the influencing power of ICT infrastructures and capabilities on organizational dimensions.




Table 1. ICT-Organization Literature

Vertical Control
(Finnegan and Longaigh, 2002) Focus: Role of IT in organizational control and coordination. Findings: IT facilitates centralization of control.
(Holland and Lockett, 1997) Focus: Mixed-mode network structures. Findings: IT facilitates the development of mixed-mode network structures.
(Argyres, 1999) Focus: IT impact on coordination. Findings: IT can facilitate coordination within and between organizations.
(Mukherji, 2002) Focus: IT impacts on structure. Findings: A natural compatibility between IT and organization structure.
(Marschak, 2004) Focus: IT and degree of decentralization. Findings: Improved IT lowers the decentralization penalty.

Horizontal Coordination
(Baker, 2002) Focus: The effect of IT on the quality of decision making. Findings: Significant improvement in the quality of teams' strategic decisions.
(Finnegan and Longaigh, 2002) Focus: Role of IT in organizational control and coordination. Findings: IT facilitates depersonalization of coordination mechanisms.
(Symon, 2000) Focus: Assessment of new ways of organizing and new technologies. Findings: Ambiguity on the ability of IT to support the new ways of organizing.

Type of Connection
(Chae et al., 2005) Focus: IT and supply chain collaboration. Findings: The effect of IT on supply chain collaboration is determined by the interplay between IT and existing relationships between partners.
(White et al., 2005) Focus: IT impact on supply chain management. Findings: Increased levels of integration between partners' information systems, and agility in the supply chain.

As ICT requires a huge amount of organizational capital, there has always been concern about the effectiveness of ICT investment. Successful ICT investment cannot be achieved without commitment to change in general and to IT projects in particular. Since top management has a broader view of internal and external organizational issues, its role in taking full advantage of ICT capabilities is irrefutable. In their process-oriented model, Tallon et al. (2000: 145) incorporated management practices as the key determinant of IT capabilities, showing how top executives with more focused goals for IT could bring more IT capabilities into action. Top management makes a variety of organizational decisions, including planning, design, resource




allocation, and implementation activities (Thong et al., 1996). This in turn affects ICT structure and capabilities. In another study, Luftman and Kempaiah (2008: 99) identified senior executive support for IT as one of the key factors in successful IT investment.

Control

The nature of control is one key attribute of NOFs. It relates to the extent of centralization or decentralization of decision making. Centralization refers to the extent to which decision-making authority is dispersed or concentrated in an organisation (Richardson et al., 2002). Traditionally, external and internal information are the domain of top management; however, owing to increased global competition in recent decades, many organisations have moved decision making to lower levels of the organisation to take advantage of specialized staff (Fulk and Desanctis, 1999). There are no simple answers as to the forms of control employed in NOFs. Kotorov (2001: 55) showed how spatial decentralization is prevalent in the virtual organization. Using the organization of the research and development (R&D) function, Hill, Martin et al. (2000: 563) supported the relevance of theories of increased decentralization in post-bureaucratic forms. Kartseva, Gordijn et al. (2005: 57) argued that an important aspect of network organizations is that controls are typically not imposed on the network by a central authority but are negotiated by the network partners.

Coordination

Surviving in the current uncertain environment requires a high level of coordination among the different parts of the organisation. In terms of concurrent engineering, it is worth noting that instead of employing sequential processing, NOFs have been employing parallel and concurrent processing. Since zero inventory can reduce the total cost of production, Piore (1994) argued that the elimination of inventory would lead to greater interdependency among organisational units, to greater lateral communication, and to less hierarchy.
The last area of coordination focuses on the attributes of virtual organisations. Among the different dimensions of virtual organisations, the electronic rather than material nature of data, structurelessness, and ambiguous external boundaries are of high importance (Nohria and Berkley, 1994). Therefore, a coordination-intensive structure is another attribute of NOFs.

Communication

More importantly, surviving in new volatile environments requires organisations to continuously pursue organisational innovation. New organizational forms depend on communication for dealing with ever more complex interorganizational relationships. Communication plays an important role in facilitating innovation processes and is




considered a core feature of NOFs. Virtual organizations, as a new form of organization, depend on communication among organisationally and geographically dispersed employees who work in virtual team settings (Workman, 2007). Heckscher (1994: 14) argues that in NOFs, relationships depend on trust, a high degree of shared vision, and broad communication about corporate strategy. Communication technologies can therefore play an important role in facilitating the information flow among people, many of whom may never have met each other; consequently, ‘weak ties’ among organisational staff are supported. In other words, in such forms coordination is accomplished by individuals and teams with cross-functional, computer-mediated jobs. Free communication flows and shared access to information and knowledge are regarded as essential in NOFs (Cairncross, 2001). Thus, contrary to classic theories of organization, information should be available to all members of the organization, irrespective of specialization and/or hierarchical position (Levine et al., 1999). In NOFs the demands for rapid action have increased, so it is increasingly necessary to have people at all levels who are empowered to act on their first-hand knowledge of an issue and on their capacities for exercising judgment and initiative; the information should therefore be accessible to all. In NOFs there should be sufficient autonomy, and a clear organizational vision must be articulated and committed to (Chowdhury, 2003). The above review indicates that by looking at the nature of control, coordination and communication, it is possible to identify whether an organisation demonstrates NOF features. Therefore, based on the above discussion of some dimensions of NOFs, for the purposes of this study we define an NOF as a decentralized organizational form with horizontal coordination and a high level of communication.
We therefore posited that a multi-perspective view of technological and managerial issues can provide meaningful indicators of the ICTs-NOFs linkage. We use the concept of ICT Assets (ICTA) to capture all the ICT capabilities and functionalities that affect organizational dimensions, and ICT Management (ICTM) to represent the management of ICT in organizations. NOF represents some of the features and attributes of new organizations. We identified three constructs, ICT Assets, ICT Management, and NOFs, as shown in Figure 1.

Figure 1. The Initial Research Model (ICTA and ICTM linked to NOF)

3. RESEARCH METHODS

This study is deductive in nature, gathering data and using descriptive statistics to conceptualize the theoretical research model being proposed. The main research objective is to develop a theoretical model explaining how, and to what extent, ICT can contribute to the evolution of new organizational forms. The survey strategy was employed because it enables researchers to work with a large amount of data in a highly economical way. Prior to designing the research instrument,




the existing instruments were considered. The STROBE (Venkatraman, 1989) and STROEPIS (Chan et al., 1997) instruments were chosen and revised. In designing the final instrument, the development procedure suggested by Churchill (1979: 64) was followed, which involved specifying the domain of the constructs, generating a sample of items, collecting data, and assessing the validity and reliability of the measure.

3.1 Domain of constructs

Clearly, research constructs are needed to investigate any linkage between ICTs and NOFs, but there is the immediate issue of which are likely to be the most relevant. It is difficult to conceptualize all dimensions of NOFs and to determine all variables that could affect their evolution. Therefore, only those variables that are, to some extent, related to information and communication technologies are considered. Organizations attempt to use systems that support their strategic orientations. However, for different reasons, such as resource constraints or internal stability, some organizations are more successful at developing appropriate systems than others. What is ignored in the ICT literature is the synergetic effect of ICTs when combined with other organizational potentials such as business strategy (Byrd et al., 2006). Hence, the alignment between IT and business strategy is of high importance. Alignment involves “applying information technology (IT) in an appropriate and timely way and in harmony with business strategies, goals, and needs” (Luftman and Brier, 1999). Although several factors are thought to contribute to the evolution of NOFs, a major force lies in the capabilities provided by ICTs.
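The reliability step of Churchill's instrument development procedure is commonly operationalized with Cronbach's alpha, computed over each construct's Likert items. The sketch below is illustrative only: the respondent data and the 0.7 retention threshold are conventional assumptions, not figures from this study.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of Likert scores."""
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical data: 6 respondents answering 4 five-point Likert items
scores = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 5, 5, 4],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
    [1, 2, 1, 2],
])

alpha = cronbach_alpha(scores)
# Scales with alpha >= 0.7 are conventionally judged reliable (Nunnally's rule of thumb)
print(round(alpha, 3))
```

Items that depress a construct's alpha are candidates for deletion before validity assessment, which is the purification logic of the procedure.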
Over recent years, several different arguments have been offered to highlight the potential of ICTs to enable and shape an organizational form (Rajan and Wulf, 2006). On the basis of the literature, along with a focus group discussion consisting of three IT managers and two academics, we identified one variable related to the ICT Assets construct, that is, ICT Dynamics, and two variables related to ICT Management, that is, IT strategic alignment and management support. A number of characteristics of NOFs were also identified. In summary, the variables are:

• IT Strategic Alignment (ITSA)
• ICT Capabilities and Potentials (ICTC)
• Management Support (MS)
• Characteristics of NOFs (NOFC)

In the following section, we discuss each variable briefly.

IT Strategic Alignment (ITSA) refers to the extent to which the IT mission, objectives, and plans support and are supported by the organization's mission, objectives, and plans (Hirschheim and Sabherwal, 2001). While there is widespread consensus that organizational and IT strategies should be linked (Niederman and Brancheau, 1991), such an alignment has not been easily and clearly understood by practicing managers (Hirschheim and Sabherwal, 2001; Chan, 2002). Among the different approaches to investigating the mutual interaction between ICTs and the organisation, the IT strategic alignment model seems most likely to describe this linkage precisely. A successful strategic alignment is unlikely without advanced communication systems, as such systems enable organisations to share the required real-time information with each other. The interlinking of organisational relationships across a wide range of industries, from banking to insurance, is resulting in complex alliance webs in which




one organisation can serve simultaneously not only as a supplier, but also as a competitor, customer, and consultant. The result is a circular value chain and new forms of interdependence (Fulk and Desanctis, 1995). Based on the need to achieve alignment across business and IT areas, Henderson and Venkatraman (1999: 472) proposed an IT strategic alignment model. In their model, business strategy refers to the realized business strategy and focuses on the “resource deployment patterns” that organizations employ to achieve their objectives. It is defined at the business unit level and contains the dimensions of Aggressiveness, Analysis, Defensiveness, Innovativeness, and Proactiveness. Five dimensions parallel to business strategy measure IT strategy, focused on the capabilities provided by IT to support the different business strategies. Among the available models for investigating IT strategic alignment, the one proposed by Henderson and Venkatraman was employed. The Strategic Alignment Model provides a clear and concise basis for evaluating the strategic fit and functional integration of an organisation's business and IT strategies and their effect on its structure. Thus, it is argued that there is an underlying relationship between the strategy (whether business or IT strategy) an organisation pursues and the resulting structure. Two previously validated scales were used to measure the alignment between business strategy and IT strategy: STROBE (Venkatraman, 1989) and STROEPIS (Chan et al., 1997). Of the two methods for calculating alignment, the moderation model was employed, as it consistently outperforms the matching model. While ICTs differ, and thus create different challenges, over time organisations have developed unique sets of IT capabilities. Owing to these differences, some organisations have been able to develop more successful IT infrastructures than their competitors (Wade and Hulland, 2004).
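The contrast between the two alignment-scoring methods mentioned above can be sketched as follows: the moderation model scores alignment as the product (interaction) of the business-strategy and IT-strategy ratings on each dimension, whereas the matching model scores the (negated) absolute difference between them. The dimension names follow the five parallel STROBE/STROEPIS dimensions; the ratings below are hypothetical, not data from this study.

```python
from typing import Dict

# The five parallel strategy dimensions named in the text
DIMENSIONS = ["aggressiveness", "analysis", "defensiveness",
              "innovativeness", "proactiveness"]

def moderation_alignment(business: Dict[str, float], it: Dict[str, float]) -> float:
    """Interaction model: average of business x IT rating products across dimensions."""
    return sum(business[d] * it[d] for d in DIMENSIONS) / len(DIMENSIONS)

def matching_alignment(business: Dict[str, float], it: Dict[str, float]) -> float:
    """Matching model: smaller absolute differences mean higher alignment."""
    return -sum(abs(business[d] - it[d]) for d in DIMENSIONS) / len(DIMENSIONS)

# Hypothetical 1-5 ratings for one firm
biz = {"aggressiveness": 4, "analysis": 5, "defensiveness": 2,
       "innovativeness": 4, "proactiveness": 3}
it_strategy = {"aggressiveness": 3, "analysis": 5, "defensiveness": 2,
               "innovativeness": 5, "proactiveness": 3}

print(moderation_alignment(biz, it_strategy))  # higher = stronger joint emphasis
print(matching_alignment(biz, it_strategy))    # closer to 0 = closer profiles
```

The design difference matters: a firm rating high on both business and IT aggressiveness scores higher under moderation than one that merely matches two low ratings, which the matching model would treat as perfectly aligned.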
ICT capabilities that have received research attention include technical skills, IT management skills, and relationship assets (Piccoli and Ives, 2005). ICT Capabilities and Potentials (ICTC) refers to the ICT capabilities and functionalities that can influence organizational dimensions, including the amount of investment in ICT, the variety in ICT usage, and the sourcing structure for ICT. Schilling and Steensma (2001: 1149) provided a causal model indicating how technological changes could influence the evolution of modular organizational forms in the US manufacturing sector. Stiroh (2002: 1559) likewise provided evidence of how information technology can influence a firm's boundaries and enhance communication and coordination both between the firm and its partners and within the firm itself. The technical IT infrastructure encompasses the physical IT and communications resources of an organization, along with its shared services and business applications; it covers an organization's network, storage, data, and application assets as well as the network-critical physical infrastructure (Byrd and Turner, 2000). As noted earlier, ICT resources cover a wide range of services, from e-mail, voice mail, teleconferencing and desktop videoconferencing to computer-aided design (CAD), discussion lists, information databases, groupware, intranets, e-procurement, e-logistics, and e-government, and researchers have examined the impact of these capabilities on organisational dimensions in general and on the dimensions of NOFs in particular, including the role of middle managers and the organisational hierarchy (Marschak, 2004: 473) and the size of the organisation (Panayides, 2004: 35). Put differently, several factors are thought to contribute to the evolution of

JISTEM, Brazil Vol.8, No. 3, Sept/Dec. 2011, pp. 515-538

www.jistem.fea.usp.br


523 ICTs –New Organizational Form linkage in the Australian Context: Theoretical Model and Research Instrument

NOFs, but the capabilities provided by ICTs are a major influence. Over recent years, several different arguments have been offered to highlight the potential of ICTs to enable and shape organisational forms (Rajan and Wulf, 2006). The combination of hardware, applications, and infrastructure shapes the capabilities of ICT, and as each of these dimensions develops, the concept, design, and capabilities of ICT change dramatically. ICT has decreased the size of middle management in organisations, especially with the advent of centralized decision-making authority (McNulty and Ferlie, 2004). The diminishing role of middle management in organisations leads toward a networked, flat organisational hierarchy. Miles and Snow (2005: 162) claim that the new 'spherical' organisational form is based on "leadership as a shared responsibility among colleagues, not as superior-subordinate relationship". This outcome is supported by the shift in the role of ICT from a back-office function to a more influential one. It is precisely because of this powerful capacity of ICTs that many human contributions have been substituted by ICTs (Fulk and Desanctis, 1999). ICTs are seen to provide organisational employees with global data that permit them to make local decisions consistent with overall organisational goals. As the flexibility of ICT enables it to handle a huge amount of processing work, these technologies can be configured to substitute for some traditional managerial roles. Management Support (MS) refers to the extent of management support for ICT promotion in organizations. As ICT requires the commitment of a large amount of organisational capital, there has always been concern about the effectiveness of ICT investment. Successful ICT investment cannot be achieved without commitment to organisational change.
Since top management has a broader view of the different internal and external organisational issues, its role in making the most of ICT capabilities is indisputable. In their process-oriented model, Tallon et al. (2000: 145) incorporated management practices as the key determinant of IT capabilities. They showed how top executives with more focused goals for IT could bring more IT capabilities into play, which in turn affects ICT structure and capabilities. Luftman and Brier (1999: 109) identified senior executive support for IT as one of the key factors in successful IT investment. Management support was operationalized by assessing top management attitudes regarding ICT capabilities and the extent of top management support for technological innovation (Luftman and Kempaiah, 2008). Brynjolfsson and Hitt (2002: 23) emphasized the importance of management in the success of ICT investments, believing that any relationship between ICTs and organizational change is due to the temperament of management rather than their economic capabilities. Characteristics of NOFs refer to the combination of strategy, structure, internal control, and coordination systems that provides an organization with its operating logic, resource allocation rules, and corporate governance mechanisms. Table 2 summarizes the definitions of the variables for each construct used in the proposed model. 3.2 Instrument Development A pilot study was conducted with industry practitioners to examine the external validity of the constructs. The purpose of the pilot study was to ensure the robustness


Abareshi, A., Martin, B., Molla, A.

of the model and the initial instrument. The pilot panel consisted of five people: two academics, one participant from a government organization, and two from the private sector. They were provided with the initial research model and the proposed variables to establish the basic soundness of the model. No modification was made at the end of this stage. For the research constructs, we then developed the questionnaire items based on the literature as well as on comments obtained from the pilot study. The survey questionnaire was revised by the members of the pilot panel. In formulating the initial instrument for determining the level of IT strategic alignment, Venkatraman's STROBE (Strategic Orientation of Business Enterprises) instrument (Venkatraman, 1989) and Chan's STROEPIS (STRategic Orientation of the Existing Portfolio of Information Systems) instrument (Chan et al., 1997) were used and revised. Venkatraman's STROBE instrument seeks to develop valid measurements of the key dimensions of the business strategy construct. It focuses on the "resource deployment patterns" that organizations employ to achieve their objectives. Venkatraman proposed the STROBE scale using eight quantified characteristics of business strategy: aggressiveness, analysis, internal defensiveness, external defensiveness, futurity, proactiveness, riskiness, and innovativeness. Chan's STROEPIS instrument operates in parallel to STROBE, i.e., for each individual STROBE variable there is a parallel variable in STROEPIS. For instance, aggressiveness is one of the dimensions used to measure business strategy; the parallel STROEPIS variable is IT support for aggressiveness. Table 3 presents the reliability of six constructs in the STROBE and STROEPIS instruments. For parsimony, we combined the Riskiness and Futurity dimensions into a revised dimension named Innovativeness.
The items designed for the new dimension cover both futurity and riskiness. The items for the remaining dimensions were employed as given for both STROBE and STROEPIS.


Table 2. Research Variables

Business Strategy (Venkatraman, 1985; Chan et al., 1997; Hussin et al., 2002; Pierce, 2002; Chan et al., 2006)
- Aggressiveness (5 items): The ways in which businesses implement resource allocation in pursuing aggressive strategies.
- Analysis (2 items): The tendency to search for and develop the best possible alternatives in organizational decision-making.
- Defensiveness (4 items): Maintaining the current position and defending the right to play in the market, e.g. by employing cost-reduction strategies.
- Innovativeness (2 items): The development and early adoption of innovations.
- Proactiveness (5 items): Active participation in emerging industries and a continuous search for market opportunities to anticipate and predict the future in both business and technology markets.

IT Strategy (Chan et al., 1997; Pierce, 2002; Kearns and Lederer, 2003; Kearns and Sabherwal, 2006)
- IT for aggressiveness (4 items): The capabilities provided by IT to support an aggressiveness strategy.
- IT for analysis (3 items): The capabilities provided by IT to support an analysis strategy.
- IT for defensiveness (3 items): The capabilities provided by IT to support a defensive strategy.
- IT for innovativeness (3 items): The capabilities provided by IT to support an innovative strategy.
- IT for proactiveness (4 items): The capabilities provided by IT to support a proactiveness strategy.

ICT Dynamics (3 items; Fulk and Desanctis, 1995; Tallon et al., 2000; NOIE, 2005): The varieties of IT usage, the different courses of action an organization uses in meeting its IT needs, and the division of labour and responsibility for managing IT activities in the organization.

Management Support (5 items; Thong et al., 1996; Luftman, 1998; Tallon et al., 2000; NOIE, 2005): The extent of management support for ICT promotion in the organization.

NOFs (20 items; Heydebrand, 1989; Nohria and Berkley, 1994; Bowman et al., 1999; Fulk and Desanctis, 1999; Moller and Rajala, 1999; Dewett and Jones, 2001): Covers organization structure, internal control, resource allocation rules, corporate governance mechanisms, division of labour, coordination systems, communication, and centralization of decision-making.


3.2 Panel of Experts To pre-test the relevance and reliability of the instrument, we conducted a survey with a panel of experts. The initial questionnaire was sent to 307 internationally recognized academics in the field of ICT and organization, and to practitioners randomly selected from a rented database. Although some of the items were pooled from previously validated instruments, we surveyed the panel of experts to establish the current relevance of these items. The participants were asked to judge the degree of relevance of individual items on a five-point Likert scale ranging from not relevant (1) to extremely relevant (5). Of the 307 questionnaires sent, we received 34 responses (an 11.1 % response rate), a reasonable rate for this purpose (Wang and Tang, 2001). Intra-class (inter-rater) reliability, which is usually reported as a correlation coefficient, provides a measure of how well two or more raters agree in their assessment of a variable (Litwin, 1995). At p = 0.01 all correlation coefficients between raters were significant, indicating high reliability of the judgements. The Mean Relevance Score (MRS) was employed to determine which items should be dropped from the initial instrument (Molla and Licker, 2005). The MRS exceeded the scale midpoint of 2.5 for all items, so no items were removed from the final instrument. The internal consistency of the instrument was then assessed using Cronbach's alpha. Values ranged from 0.71 for ICT Dynamics to 0.82 for IT Strategy, indicating a high level of internal consistency. Table 4 summarizes the Cronbach's alpha coefficients and inter-rater reliability.

Table 4. Cronbach's alpha and inter-rater reliability of the initial instrument

Construct           Cronbach's alpha
Business Strategy   .76
IT Strategy         .82
ICT Dynamics        .71
NOFs                .81

Inter-observer reliability: .70 (F = 4.945, Sig. = .000)
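The MRS screening rule described above (drop any item whose mean relevance falls at or below the 2.5 scale midpoint) can be sketched with toy ratings; the data here are invented for illustration, and in the study itself no item fell below the cut-off:

```python
import numpy as np

# Toy relevance ratings: rows = 5 expert raters, columns = 4 candidate items,
# on the study's 1 (not relevant) to 5 (extremely relevant) scale.
ratings = np.array([
    [5, 4, 2, 3],
    [4, 5, 1, 4],
    [5, 4, 2, 3],
    [4, 3, 3, 4],
    [5, 5, 2, 5],
])

# Mean Relevance Score per item; items at or below the 2.5 midpoint are
# flagged for removal from the instrument.
mrs = ratings.mean(axis=0)
dropped = [f"item{i + 1}" for i, m in enumerate(mrs) if m <= 2.5]

print("MRS per item:", mrs.tolist())  # [4.6, 4.2, 2.0, 3.8]
print("dropped:", dropped)            # ['item3']
```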

3.3 Full Study The research is aimed at a broad sample of private and public sector organizations in Australia from seven selected industry groups of the Australian and New Zealand Standard Industrial Classification (ANZSIC). The sectors covered are Government Administration and Defence; Manufacturing; Electricity, Gas and Water Supply; Construction; Communication Services; Finance and Insurance; and Health and Community Services. These groups obtained the highest mean overall business value from ICT among the 17 groups in the ANZSIC classification (NOIE, 2005). All Chief Executive Officers (CEOs) were contacted and asked to complete a web-based questionnaire on


the basis of a five-point Likert scale. The use of email and websites allowed us to reach a broad audience (Apigian et al., 2006). Hence, an online questionnaire was located on a web server, and a web link to this server was provided in the invitation emails. Stevens (2002) suggested a case-to-variable ratio of 5:1 to guarantee a reliable principal component analysis (PCA) procedure; however, some researchers have worked with ratios as low as 2:1. Therefore, in an effort to achieve an acceptable case-to-variable ratio, we utilized all 3770 email addresses rented from the Impact List Company. Of the 3770 emails sent to top Australian managers, 312 questionnaires were completed and returned, and 682 emails were returned from incorrect or otherwise invalid and undeliverable addresses. As the emails had an opt-out feature, 206 participants decided not to participate. Hence, the overall response rate for this study was 10.1 %. While a higher response rate is desirable in any research endeavour, this rate is reasonable given the comprehensiveness and length of the instrument. Moreover, for quantitative analysis, samples in excess of 30 are considered adequate for most exploratory research (Bergeron and Raymond, 1997). Table 5 contains a summary of the research data.

Table 5. Summary of data collected

Job Title                           Frequency   Percentage
Chief Executive Officer             208         67.8 %
Chief Information Officer           30          9.8 %
Other                               69          22.5 %
Total                               307         100 %

Type of Industry                    Frequency   Percentage
Communications Services             29          9.39 %
Electricity, Gas, and Water Supply  7           2.27 %
Construction                        37          11.97 %
Government Administration           39          12.62 %
Finance and Insurance               52          16.50 %
Health and Community Services       39          12.62 %
Manufacturing                       83          26.54 %
None                                26          8.09 %
Total                               312         100.00 %

Annual revenue (Aus$)               Frequency   Percentage
Less than 10 Million                117         37.9 %
Between 10 M and 100 M              88          28.5 %
Between 100 M and 500 M             82          26.2 %
More than 500 M                     14          4.5 %
Not Known                           11          2.9 %
Total                               312         100.0 %

Number of Employees                 Frequency   Percentage
Less than 999                       181         58.6 %
1000 to 9999                        96          31.1 %
10000 to 99999                      26          8.1 %
Not Known                           9           2.3 %
Total                               312         100.0 %

3.4 Construct Validity To ensure the validity of the constructs used to test the research model, the relationships between the constructs and their items should be examined. Because each construct is measured by multiple items, Principal Component Analysis (PCA) with varimax rotation was employed to test the validity of the instrument.


Using an iterative sequence of factor analyses, B15, B16, I13, N12, and N15 were eliminated from the instrument. The remaining items loaded on their hypothesized variables. Hence, the final instrument was reduced to 44 items. Appendix B presents the final factor loadings. 4. RESULTS AND DISCUSSION First, the initial reliability of the instrument was tested to ensure its soundness. The recommended measure of the internal consistency of a set of items is coefficient alpha, which results directly from the assumptions of the domain sampling model (Churchill, 1979). Coefficient alpha and item-scale correlation coefficients were used to identify items that did not share a common core. The threshold for the cut-off point is quite judgemental (Molla and Licker, 2005); however, to ensure that the items were adequate measures of the constructs, a high cut-off value of 0.4 was used, and items with a correlation coefficient below 0.4 were dropped from the instrument. Accordingly, B14, B17, and B18 from the Business Strategy construct, I4, I5, I10, and I12 from the IT Strategy construct, and N11, N13, N16, N17, N18, and N19 from the NOFs construct were dropped. All remaining corrected item-scale correlations (r ≥ 0.4) were significant at p = 0.05. All Cronbach alphas exceeded 0.80, confirming that the measures were reliable (see Table 6). 4.1 Testing the Measurement Model Convergent validity refers to a situation where items that measure the same factor correlate highly with one another (Litwin, 1995). Two tests were employed to assess convergent validity. The first is item reliability, measured by the factor loading of the item on the construct. The second is the composite reliability of each construct. There is no generally accepted level of what constitutes an acceptable factor loading (Thong et al., 1996).
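The reliability screening just described (coefficient alpha plus corrected item-total correlations with a 0.4 cut-off) can be sketched as follows, using synthetic data in place of the survey responses:

```python
import numpy as np

def cronbach_alpha(items):
    """Coefficient alpha for a respondents x items score matrix."""
    k = items.shape[1]
    return (k / (k - 1)) * (1 - items.var(axis=0, ddof=1).sum()
                            / items.sum(axis=1).var(ddof=1))

def corrected_item_total(items):
    """Correlation of each item with the sum of the remaining items."""
    return np.array([
        np.corrcoef(items[:, j], np.delete(items, j, axis=1).sum(axis=1))[0, 1]
        for j in range(items.shape[1])
    ])

# Synthetic data: three items driven by one latent trait plus one pure-noise
# item, standing in for the survey items of one construct.
rng = np.random.default_rng(42)
latent = rng.normal(size=(200, 1))
items = np.hstack([latent + 0.5 * rng.normal(size=(200, 3)),
                   rng.normal(size=(200, 1))])

alpha = cronbach_alpha(items)
r_it = corrected_item_total(items)
keep = r_it >= 0.4            # the study's cut-off for a "common core"
print(f"alpha = {alpha:.2f}, kept items: {np.flatnonzero(keep)}")
```

With this construction the three coherent items pass the 0.4 cut-off and the noise item is dropped, mirroring how B14, B17, B18 and the other low-correlation items were removed.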
Hair (2010) recommended that a loading should be at least 0.55, which explains at least 30 percent of the variance in the construct. However, many IS researchers have used the 0.50 level (Rivard and Huff, 1988; Amoroso and Cheney, 1991; Thong et al., 1996). Nunnally and Bernstein's (1994) guideline of 0.80 for assessing reliability coefficients was used to assess composite reliability. Table 6 presents the assessment of the measurement model. The results suggest that the convergent validity of the measurement is adequate. The item-total correlation coefficients of business strategy (0.41 to 0.74), IT strategy (0.40 to 0.76), ICT dynamics (0.65 to 0.85), Management Support (0.70 to 0.78), and NOFs (0.40 to 0.85) are also high. Figure 2 presents the final research model. Discriminant validity is the degree to which items differentiate between constructs, or measure different constructs (Thong et al., 1996). The test used to ensure discriminant validity was to examine whether each item loads more highly on its associated construct than on other constructs (Thompson, 2004). Appendix 2 presents the factor pattern matrix showing the loadings of each item on all constructs. Except for four items (B15, B16, N12, and N15, with loadings of 0.47, 0.47, 0.15, and 0.44 respectively), all loadings were greater than 0.50, and the items loaded more highly on their hypothesized constructs than on any other construct.
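The two validity checks above can be sketched numerically; the loadings below are invented for illustration, not taken from the study's pattern matrix:

```python
import numpy as np

# Composite reliability of one construct from its standardized loadings:
# (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
# where each item's error variance is 1 - loading^2.
loadings = np.array([0.72, 0.68, 0.81, 0.59])   # illustrative values
squared_sum = loadings.sum() ** 2
error_var = (1 - loadings ** 2).sum()
composite_reliability = squared_sum / (squared_sum + error_var)
print(f"composite reliability: {composite_reliability:.2f}")

# Discriminant-validity rule used in the paper: each item should load more
# highly on its hypothesized construct than on any other construct.
pattern = np.array([        # rows = items, columns = constructs
    [0.78, 0.21, 0.10],
    [0.66, 0.30, 0.25],
    [0.15, 0.72, 0.18],
    [0.22, 0.19, 0.47],     # borderline: highest on its construct, below 0.50
])
hypothesized = [0, 0, 1, 2]
passes = [int(pattern[i].argmax()) == c for i, c in enumerate(hypothesized)]
print("items passing discriminant check:", passes)
```

The borderline row illustrates how an item like N15 can still load highest on its own construct while falling under the 0.50 convergent-validity level.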


Given that item N12 falls far below the threshold while the other three items are close to it, only one item appears to actually violate discriminant validity. The relevant item loadings were statistically significant at 0.05. Hence, all items except those noted above passed the test for discriminant validity.

[Figure 2. The Proposed Model with the Validated Instrument: a structural model in which IT Alignment (indicators STROBE and STROEPIS), Management Support (MS1-MS3), and ICT Dynamics (ICTC1-ICTC3) relate to NOFs (NOF1-NOF3), with error terms e1-e11 attached to the indicators.]

5. CONCLUSION Various internal and external variables have contributed to the evolution of new organizational forms, and ICT has been one of the most important variables in the transformation process toward new forms of organization. These approaches are discussed around three main areas: context, process, and variables. Cumulative results from previous studies that examined the relationship between ICT and NOFs have been plagued by ambiguities and inconsistencies. At the same time, understanding the clear-cut impacts of ICT on the organization requires a mechanism in which the various internal and external domains of ICT are considered. We tried to develop a model to address this gap. It is a working model of the ICT-NOF linkage and does not claim to be comprehensive. Business strategy, IT strategy, ICT Dynamics, and management support provide meaningful indicators of ICT impacts on the evolution of NOFs. Therefore, both researchers and practitioners can benefit from the research instrument and model. Researchers can use the model and instrument in future research endeavours. Managers can chart the transformation of their organizations using the common variables identified in this paper and benchmark their organizations against both historical data and industry best practices. Finally, an important issue that should be considered is that the effect of environmental conditions on the pervasiveness of ICT impacts was not


examined in this study, as its scope was limited to internal, organisational factors. In particular, the study, although a cross-sectional survey, was limited to Australian organizations; therefore, the effects of macroeconomic conditions were not investigated. We suggest that future research include influential external factors and macroeconomic conditions in relation to ICTs. Another important issue is the lack of attention to the information intensity of the industry. Some industries, such as insurance, banking, and finance, are more information intensive than others; hence the speed and effectiveness of ICT impacts can vary depending on the type of industry. As this study was conducted as a cross-sectional survey of several industries, such effects were not monitored. Longitudinal investigation may further augment the empirical validity and generalizability of the proposed model and research instrument. In general, it is worth noting that including environmental factors and the type of industry, and conducting a longitudinal study, can promote the comprehensiveness and generalizability of the proposed model.

Table 6. Summary of the measurement model

BUSS (Cronbach's alpha = 0.89)
Item   Mean    Std. Dev.   Item-total r
B1     72.68   40.06       0.72
B2     72.39   40.31       0.74
B3     72.37   40.88       0.70
B4     72.30   40.34       0.77
B5     72.47   40.31       0.74
B6     72.49   42.02       0.49
B7     72.39   41.96       0.52
B8     72.66   40.31       0.66
B9     72.29   41.46       0.06
B10    72.27   41.55       0.62
B11    72.30   40.57       0.74
B12    72.63   41.74       0.51
B13    72.29   42.41       0.48
B15    72.29   42.75       0.45
B16    72.31   43.02       0.41

ITS (Cronbach's alpha = 0.87)
Item   Mean    Std. Dev.   Item-total r
I1     70.96   25.83       0.76
I2     70.94   26.13       0.75
I3     70.86   26.76       0.63
I6     70.96   27.10       0.51
I7     70.95   27.83       0.46
I8     70.84   27.20       0.54
I9     70.86   27.94       0.44
I11    70.88   27.81       0.45
I13    70.71   28.28       0.40
I14    70.89   26.90       0.49
I15    70.76   27.17       0.57
I16    70.96   27.38       0.55
I17    70.74   27.85       0.46

NOF (Cronbach's alpha = 0.84)
Item   Mean    Std. Dev.   Item-total r
N1     65.61   68.89       0.77
N2     66.36   66.35       0.77
N3     65.58   70.30       0.67
N4     66.33   66.95       0.70
N5     66.36   66.04       0.79
N6     67.23   69.96       0.65
N7     66.41   66.35       0.78
N8     67.20   70.70       0.59
N9     66.43   67.74       0.69
N10    66.43   65.44       0.85
N14    66.53   70.07       0.52
N15    66.51   71.39       0.40

MANS (Cronbach's alpha = 0.88)
Item   Mean    Std. Dev.   Item-total r
M1     17.40   3.94        0.71
M2     17.11   4.05        0.72
M3     17.09   4.20        0.70
M4     17.01   4.17        0.70
M5     17.19   3.92        0.78

ICTD (Cronbach's alpha = 0.80)
Item   Mean    Std. Dev.   Item-total r
ICT1   8.63    1.56        0.58
ICT2   8.21    1.60        0.75
ICT3   7.88    1.81        0.65



REFERENCES

(NOIE) National Office of the Information Economy (2005) Achieving Value from ICT: Key Management Strategies.
Amoroso, D. L. and Cheney, P. H., (1991) "Testing a Causal Model of End User Application Effectiveness," Journal of Management Information Systems 8(1), 63-89.
Apigian, C. H., Ragu-Nathan, B. S. and Ragu-Nathan, T. S., (2006) "Strategic profiles and Internet Performance: An empirical investigation into the development of a strategic Internet system," Information & Management 43(4), 455-468.
Argyres, N. S., (1999) "The Impact of Information Technology on Coordination: Evidence from the B-2 "Stealth" Bomber," Organization Science 10(2), 162-181.
Baker, G., (2002) "The Effects of Synchronous Collaborative Technologies on Decision Making: A Study of Virtual Teams," Information Resources Management Journal 15(4), 78-94.
Bergeron, F. and Raymond, L., (1997) "Managing EDI for corporate advantage: A longitudinal study," Information & Management 31(6), 319-333.
Beugre, C. D., Acar, W. and Braun, W., (2006) "Transformational leadership in organizations: an environment-induced model," International Journal of Manpower 27(1), 52-62.
Bowman, E. H., Singh, H., Useem, M. and Bhadury, R., (1999) "When Does Restructuring Improve Economic Performance?," California Management Review 41(2), 33-54.
Brynjolfsson, E. and Hitt, L. M., (2002) "Beyond Computation: Information Technology, Organizational Transformation and Business Performance," Journal of Economic Perspectives 14(4), 23-48.
Byrd, A., Lewis, B. R. and Bryan, R. W., (2006) "The Leveraging Influence of Strategic Alignment on IT Investment: An empirical examination," Information & Management 43(3), 308-321.
Byrd, T. A. and Turner, D. E., (2000) "Measuring the Flexibility of Information Technology Infrastructure: Exploratory Analysis of a Construct," Journal of Management Information Systems 17(1), 167-208.
Cairncross, F., (2001) The Death of Distance: How the Communications Revolution Is Changing Our Lives, Harvard Business School Press, Boston, MA.
Chae, B., Yen, H. R. and Sheu, C., (2005) "Information Technology and Supply Chain Collaboration: Moderating Effects of Existing Relationships Between Partners," IEEE Transactions on Engineering Management 52(4), 440-448.
Chan, Y. E., (2002) "Why Haven't We Mastered Alignment? The Importance of the Informal Organization Structure," MIS Quarterly Executive 1(2), 97-112.
Chan, Y. E., Huff, S. L., Barclay, D. W. and Copeland, D. G., (1997) "Business Strategic Orientation, Information Systems Strategic Orientation, and Strategic Alignment," Information Systems Research 8(2), 125-151.


Chan, Y. E., Sabherwal, R. and Thatcher, J. B., (2006) "Antecedents and Outcomes of Strategic IS Alignment: An Empirical Investigation," IEEE Transactions on Engineering Management 53(1), 27-47.
Chowdhury, S., (2003) Organization 21C: Someday All Organizations Will Lead This Way, Pearson Education, London.
Churchill, G. A., (1979) "A Paradigm for Developing Better Measures of Marketing Constructs," Journal of Marketing Research 16(1), 64-73.
Creed, W. E. D. and Miles, R. E., Eds. (1996) Trust in Organizations: A Conceptual Framework Linking Organizational Forms, Managerial Philosophies and the Opportunity Costs of Control, Sage, London, UK.
Davidow, W. H. and Malone, M. S., (1992) The Virtual Corporation, Harper Business, New York.
Dewett, T. and Jones, G., (2001) "The role of information technology in the organization: a review, model, and assessment," Journal of Management 27(3), 313-346.
Dutton, W. H. (1999). The Virtual Organization: Tele-Access in Business and Industry. in G. Desanctis and J. Fulk (eds.), Shaping Organization Form, Sage Publications, California.
Finnegan, P. and Longaigh, S. N., (2002) "Examining the effects of information technology on control and coordination relationships: an exploratory study in subsidiaries of pan-national corporations," Journal of Information Technology 17(3), 149-163.
Fulk, J. and Desanctis, G., (1995) "Electronic Communication and Changing Organizational Forms," Organization Science 6(4), 337-349.
Fulk, J. and Desanctis, G. (1999). Articulation of Communication Technology and Organization Form. in G. Desanctis and J. Fulk (eds.), Shaping Organization Form, Sage Publications, California: 5-31.
Garicano, L., (2000) "Hierarchies and the Organization of Knowledge in Production," Journal of Political Economy 108(5), 874-904.
Ghoshal, S., Bartlett, C. and Moran, P., (1999) "A New Manifesto for Management," Sloan Management Review 15(4), 9-20.
Hair, J. F., (2010) Multivariate Data Analysis, Pearson Prentice Hall, Upper Saddle River, N.J.
Heckscher, C. (1994). Defining the Post-Bureaucratic Type. in C. Heckscher and A. Donnellon (eds.), The Post-Bureaucratic Organization, Sage, Thousand Oaks, CA: 14-62.
Henderson, J. C. and Venkatraman, N., (1999) "Strategic alignment: Leveraging information technology for transforming organizations," IBM Systems Journal 38(2/3), 472-485.
Heydebrand, W. V., (1989) "New Organizational Forms," Work & Occupations 16(3), 323-346.


Hill, S., Martin, R. and Harris, M., (2000) "Decentralization, Integration and the Post-Bureaucratic Organization: The Case of R&D," Journal of Management Studies 37(4), 563-579.
Hirschheim, R. and Sabherwal, R., (2001) "Detours in the Path toward Strategic Information Systems Alignment," California Management Review 44(1), 87-108.
Holland, C. P. and Lockett, A. G., (1997) "Mixed Mode Network Structures: The Strategic Use of Electronic Communication by Organizations," Organization Science 8(5), 475-489.
Huber, G., (1993) "How Continental Bank Outsourced Its 'Crown Jewels'," Harvard Business Review, January/February, 121-129.
Hussin, H., King, M. and Cragg, P., (2002) "IT alignment in small firms," European Journal of Information Systems 11(2), 108-127.
Kartseva, V., Gordijn, J. and Tan, Y.-H., (2005) "Toward a Modeling Tool for Designing Control Mechanisms for Network Organizations," International Journal of Electronic Commerce 10(2), 57-84.
Kearns, G. S. and Lederer, A. L., (2003) "A Resource-Based View of Strategic IT Alignment: How Knowledge Sharing Creates Competitive Advantage," Decision Sciences 34(1), 1-29.
Kearns, G. S. and Sabherwal, R., (2006) "Strategic Alignment Between Business and Information Technology: A Knowledge-Based View of Behaviors, Outcome, and Consequences," Journal of Management Information Systems 23(3), 129-162.
Kotorov, R. P., (2001) "Virtual organization: conceptual analysis of the limits of its decentralization," Knowledge and Process Management 8(1), 55-63.
Leavitt, H. J. and Whisler, T. L., (1958) "Management in the 1980's," Harvard Business Review 36(6), 41-48.
Levine, R., Locke, C., Searls, D. and Weinberger, D., (1999) The Cluetrain Manifesto: The End of Business as Usual, Perseus Books, New York.
Litwin, M. S., (1995) How to Measure Survey Reliability and Validity, SAGE Publications, London.
Luftman, J., (1998) "Enablers & Inhibitors," InformationWeek (700), 283-286.
Luftman, J. and Kempaiah, R., (2008) "Key Issues for IT Executives 2007," MIS Quarterly Executive 7(2), 99-112.
Luftman, J. N. and Brier, T., (1999) "Achieving and sustaining business-IT alignment," California Management Review 42(1), 109-122.
Maguire, S. and Hardy, C., (2006) "The emergence of new global institutions: A discursive perspective," Organization Studies 27(1), 7-29.
Marschak, T., (2004) "Information Technology and the Organization of Firms," Journal of Economics & Management Strategy 13(3), 473-515.
McNulty, T. and Ferlie, E., (2004) "Process Transformation: Limitations to Radical Organizational Change within Public Service Organizations," Organization Studies 25, 1389-1399.


Miles, R. and Snow, C. C., (2005) Collaborative Entrepreneurship: How Communities of Networked Firms Use Continuous Innovation to Create Economic Wealth, Stanford University Press, Stanford, CA.
Molla, A. and Licker, P. S., (2005) "eCommerce adoption in developing countries: a model and instrument," Information & Management 42(6), 877-899.
Moller, K. and Rajala, A., (1999) "Organizing Marketing in Industrial High-Tech Firms: The Role of Internal Marketing Relationships," Industrial Marketing Management 28(5), 521-535.
Mukherji, A., (2002) "The evolution of information systems: their impact on organizations and structures," Management Decision 40(5/6), 497-507.
Niederman, F. and Brancheau, J. C., (1991) "Information systems management issues for the 1990s," MIS Quarterly 15(4), 475-500.
Nohria, N. and Berkley, J. D. (1994). The Virtual Organization: Bureaucracy, Technology, and the Implosion of Control. in C. Heckscher and A. Donnellon (eds.), The Post-Bureaucratic Organization: New Perspectives on Organizational Change, Sage, Thousand Oaks, CA: 108-128.
Nunnally, J. C. and Bernstein, I. H., (1994) Psychometric Theory, McGraw-Hill, New York.
Panayides, A., (2004) "Do the Advances in Communications Technology Affect City Size? A Theoretical Investigation," Journal of Business & Economic Studies 10(1), 35-42.
Pierce, A. C. (2002). The Effect of Business and Information Technology Strategic Alignment on Information Technology Investment Returns and Corporate Performance. The Wayne Huizenga School of Business and Entrepreneurship, Nova Southeastern University.
Piore, M. (1994). Corporate Reform in American Manufacturing and the Challenge to Economic Theory. in T. J. Allen and M. S. Scott Morton (eds.), Information Technology and the Corporation of the 1990s, Oxford University Press, New York: 43-60.
Rajan, R. G. and Wulf, J., (2006) "The Flattening Firm: Evidence from Panel Data on the Changing Nature of Corporate Hierarchies," Review of Economics & Statistics 88(4), 759-773.
Richardson, H. A., Vandenberg, R. J., Blum, T. C. and Roman, P. M., (2002) "Does Decentralization Make a Difference for the Organization? An Examination of the Boundary Conditions Circumscribing Decentralized Decision-Making and Organizational Financial Performance," Journal of Management 28(2), 217-244.
Rivard, S. and Huff, S. L., (1988) "Factors of Success for End-User Computing," Communications of the ACM 31(5), 552-561.
Sauer, C. and Willcocks, L., "Strategic Alignment Revisited: Connecting Organizational Architecture and IT Infrastructure," Proceedings of the 37th Hawaii International Conference on System Sciences.
Schilling, M. A. and Steensma, K. H., (2001) "The use of modular organizational forms: an industry-level analysis," Academy of Management Journal 44, 1149-1168.

JISTEM, Brazil Vol.8, No. 3, Sept/Dec. 2011, pp. 515-538

www.jistem.fea.usp.br


535 ICTs –New Organizational Form linkage in the Australian Context: Theoretical Model and Research Instrument




536

Abareshi, A., Martin, B., Molla, A.

Appendix A - The Final Instrument

Item ID  Description
B1    We prefer to cut prices to increase market share.
B2    Our investments are generally aimed at increasing our sales growth rate.
B3    We have attempted to be among the top five firms in our industry.
B4    Our operations can generally be characterized as high risk.
B5    We have a conservative view when making major decisions (rev. scored).
B6    We emphasize effective coordination between different functions (e.g. marketing, manufacturing).
B7    We emphasize the use of planning techniques in our decision-making processes.
B8    In developing business strategy, the emphasis is on cost reduction and efficiency-seeking methods.
B9    We have strong ties with our major customers.
B10   We have strong ties with our major suppliers.
B11   Our philosophy is to defend our present market position before expanding into new markets.
B12   We seek to use the latest technological innovations.
B13   We pursue the generation of innovative solutions to organizational problems.
I1    Information Technology (IT) helps the organization to be among the top five firms in the industry.
I2    Information Technology (IT) helps the organization to reach a high growth rate.
I3    Information Technology provides the organization with the relevant information to take risks.
I6    During decision-making processes, computer applications are available to managers.
I7    Information Technology enables the organization to develop detailed analyses of problems.
I8    Information Technology improves the overall efficiency of operations in the organization.
I9    Information Technology enables the organization to have strong ties with major customers.
I11   Information Technology provides innovative solutions in solving our organizational problems.
I14   Information Technology enables organizations to find new business opportunities.
I15   Information Technology helps the organization to determine the stages of the life cycle for individual operations.
I16   Information Technology helps us more with long-term rather than short-term planning.
I17   Information Technology provides the firm with relevant information on different scenarios.
M1    Organization management embraces technological initiatives in our organization.
M2    In our competitive environment, corporate success requires special attention to ICT capabilities.
M3    In formulating the organizational strategy, we pay special attention to the capabilities provided by ICT.
M4    ICT competencies, such as system reliability, can contribute to achieving competitive advantages.
M5    ICT capabilities help organizations to enter new product areas.
ICT1  ICT capabilities are a contributing factor in promoting inter-organizational cooperation.
ICT2  Our organization makes widespread use of computers.
ICT3  Electronic-based communication is a major form of communication in our organization.
N1    How frequently are staff asked to participate in hiring or promotion of staff?
N2    How frequently are staff asked to participate in approval of the budget?
N3    How frequently are staff asked to participate in approval of new policies?
N4    How frequently are staff asked to participate in decisions on critical issues?
N5    Management structure is functionally decentralized.
N6    Most decisions made by staff are reviewed by top management.
N7    Division of labour is flexible in our organization.
N8    There is a large number of written rules and policies in our organization.
N9    Employees are encouraged to make minor decisions on their own.
N10   A manual containing rules and procedures is available in our organization.
N14   I acquire knowledge of how the business environment is changing through periods of formal education.




Appendix B: The Final Factor Loadings

Factor 1: B1 = 0.78, B2 = 0.78, B3 = 0.76, B4 = 0.85, B5 = 0.80, B6 = 0.55, B7 = 0.58, B8 = 0.73, B9 = 0.69, B10 = 0.73, B11 = 0.83, B12 = 0.58, B13 = 0.56
Factor 2: I1 = 0.79, I2 = 0.84, I3 = 0.72, I6 = 0.58, I7 = 0.54, I8 = 0.61, I9 = 0.51, I11 = 0.51, I14 = 0.60, I15 = 0.67, I16 = 0.67, I17 = 0.57
Factor 3: N1 = 0.88, N2 = 0.86, N3 = 0.78, N4 = 0.82, N5 = 0.88, N6 = 0.77, N7 = 0.87, N8 = 0.68, N9 = 0.80, N10 = 0.94, N14 = 0.58
Factor 4: M1 = 0.80, M2 = 0.81, M3 = 0.80, M4 = 0.81, M5 = 0.86
Factor 5: ICT1 = 0.75, ICT2 = 0.86, ICT3 = 0.83




Appendix C - Factor pattern matrix for discriminant validity

Item   BUSS    ITS     ICTD    MANS    NOF
B1     0.78    0.02    0.05   -0.09    0.12
B2     0.78    0.06   -0.03   -0.01    0.13
B3     0.76    0.03    0.06    0.00    0.01
B4     0.85    0.01    0.06    0.07    0.08
B5     0.80    0.02    0.01    0.01    0.04
B6     0.55    0.14    0.06   -0.04    0.15
B7     0.58    0.08    0.06   -0.03    0.09
B8     0.73    0.01    0.06   -0.14    0.16
B9     0.69    0.00    0.12    0.06    0.14
B10    0.73    0.01    0.08    0.06    0.14
B11    0.83    0.02    0.06    0.06    0.06
B12    0.58    0.05    0.00   -0.08    0.01
B13    0.56    0.04   -0.07    0.03    0.03
B15    0.47    0.07   -0.05    0.00    0.05
B16    0.47    0.15   -0.02   -0.03    0.02
I1    -0.04    0.79    0.05   -0.14    0.05
I2    -0.06    0.84    0.01    0.04    0.02
I3     0.06    0.72    0.00   -0.07    0.01
I6     0.04    0.58    0.14   -0.01    0.04
I7     0.01    0.54    0.08    0.06    0.13
I8     0.04    0.61   -0.04   -0.10    0.10
I9    -0.07    0.51   -0.08   -0.14    0.04
I11    0.00    0.51    0.06   -0.06    0.06
I13   -0.04    0.48   -0.06   -0.05    0.06
I14    0.08    0.60    0.12    0.13    0.08
I15   -0.04    0.67    0.00    0.11    0.03
I16   -0.06    0.67   -0.03    0.08    0.01
I17   -0.10    0.57    0.06    0.07    0.05
N1    -0.01    0.05    0.88    0.00    0.03
N2     0.02    0.02    0.86    0.00    0.06
N3     0.04    0.08    0.78   -0.07    0.02
N4     0.02    0.04    0.82   -0.05    0.04
N5     0.02    0.02    0.88   -0.01    0.03
N6    -0.03    0.01    0.77    0.00    0.03
N7     0.01    0.01    0.87   -0.01    0.05
N8    -0.03    0.04    0.68    0.01    0.02
N9    -0.01    0.05    0.80   -0.02    0.00
N10    0.02    0.02    0.94   -0.01    0.02
N12    0.10    0.01    0.15    0.06    0.11
N14   -0.04    0.03    0.58    0.01    0.01
N15    0.03    0.01    0.44    0.15    0.03
M1     0.04    0.10   -0.03    0.80    0.10
M2     0.02    0.04   -0.02    0.81    0.09
M3     0.02    0.02    0.00    0.80    0.05
M4    -0.01    0.02   -0.01    0.81    0.03
M5    -0.01    0.00   -0.03    0.86    0.06
ICT1   0.03    0.04   -0.01    0.17    0.75
ICT2  -0.03    0.07    0.02    0.11    0.86
ICT3  -0.07    0.04   -0.02    0.02    0.83



JISTEM - Journal of Information Systems and Technology Management Revista de Gestão da Tecnologia e Sistemas de Informação Vol. 8, No. 3, Sept/Dec. 2011, pp. 539-554 ISSN online: 1807-1775 DOI: 10.4301/S1807-17752011000300002

SEMANTIC WIKIS AND THE COLLABORATIVE CONSTRUCTION OF ONTOLOGIES: A CASE STUDY Fernando Hadad Zaidan Marcello Peixoto Bax Universidade Federal Minas Gerais – UFMG, Brazil _____________________________________________________________________________________

ABSTRACT

Ontologies are complex artifacts. They should seek consensus on the use of a set of modeled concepts. Some authors propose that these devices would be more beneficial if they were built collaboratively. This article addresses the use of a semantic wiki as an alternative for the collaborative construction of ontologies, and describes its ontological structure. Wikis are known as tools for the collaborative construction of content; the semantic wiki is a research effort to integrate wiki concepts with the semantic web. The case study presented shows an implementation in Semantic MediaWiki, the semantic wiki best known and most used by the academic community and in organizational environments.

Keywords: Semantic Web, Semantic Wikis, Ontology, Collaboration.

1. INTRODUCTION

Everyone agrees that user interaction building social networks is the cornerstone of "Web 2.0". Applications offer dynamic content and rich interfaces, providing the means to add or edit content. Blogs, wikis and photo/video/text sharing sites have increased user sharing and participation. Wikipedia, a successful example of web technology, supports the sharing of knowledge between people, leaving individuals free to create and modify its contents. But Wikipedia is designed to be used by people: software cannot understand or automatically process its contents. In parallel, the "semantic web", a set of technologies that enable knowledge sharing across the web among different applications, is gaining momentum. Recently, the concept of the "semantic wiki" has emerged, integrating the advantages of the wiki with semantic web technologies. A semantic wiki is one that has an underlying knowledge model described on its pages. Classic or syntactic wikis are made up of text and untyped hyperlinks. Semantic wikis, on the other hand, allow their users to identify information about the data described
_____________________________________________________________________________________
Manuscript first received/Recebido em 15/01/2011 Manuscript accepted/Aprovado em: 03/06/2011
Address for correspondence / Endereço para correspondência
Fernando Hadad Zaidan, Professor Doutor, Universidade Federal Minas Gerais - UFMG - Escola de Ciência da Informação ECI, MG, Brasil, E-mail: fhzaidan@ufmg.br
Marcello Peixoto Bax, Professor Doutor, Universidade Federal Minas Gerais - UFMG - ECI, MG, Brasil, UFMG Escola de Ciência da Informação - Depto TGI Av. Antônio Carlos 6627 - Pampulha - CEP 31.270-901, E-mail: bax@eci.ufmg.br
Published by/ Publicado por: TECSI FEA USP – 2011 All rights reserved.


540 Zaidan, F. H., Bax, M. P.

on the pages, and relations between pages, so that they may be inspected or exported as a database. In contrast, the contents of classical wikis are organized in a narrative, non-structured form. It is known that structured information assists information-processing software, for example by eliminating ambiguities. The semantic web reaches maturity by using ontologies to guide semantic markup, decreasing the understanding gap between machines and humans. Semantic wikis were proposed in the early 2000s and began to be seriously implemented around 2005. In 2010, the best known semantic wiki software seems to be Semantic MediaWiki, whilst the best known semantic wiki site is Freebase [1]. There is a wide variety of application scenarios for semantic wikis, to name a few: the engineering of ontologies, knowledge management and educational environments. This paper presents a semantic wiki and illustrates how such a wiki can be used for the collaborative engineering of ontologies. Semantic wikis allow knowledge models to be shared in a formally structured way, so that software can process them properly. Semantic wikis are presented and analyzed in a case study.

2. METHODOLOGICAL PROCEDURES

As stated earlier, the objective of this paper is to show how a semantic wiki can be used for collaboration in the engineering of ontologies. To that end, we conducted exploratory, qualitative research, in which ideas were refined through the selection and reading of seminal books supporting the main constructs elucidated below. To fill the gap between the content of those books and the state of the art in semantic wikis, several major journals were consulted, resulting in the extensive list of technical and scientific papers referenced at the end of this study. To give the paper the appropriate depth, we also present a practical case study of a semantic wiki implementation. Section 5 reports the experience, the situations observed and how they reflect on the semantic web.

3. EXPLANATION OF CONCEPTS

For the theoretical foundation, some concepts are explained in light of the literature on the semantic web, collaboration and cooperation. Then semantic wikis are analyzed, along with their ontological structure.

3.1 Collaboration and Cooperation on the Web

[1] http://wiki.freebase.com/wiki/Main_Page

JISTEM, Brazil Vol.8, No. 3, Sept/Dec. 2011, pp. 539-554

www.jistem.fea.usp.br


541 Semantic Wikis and the Collaborative Construction of Ontologies:case study

According to Choo (1997), the tacit and explicit knowledge of the members of organizations is shared collectively and collaboratively. Driven by social and economic needs, such organizations create, store and disseminate knowledge. Piaget (1995) suggests that group interaction stimulates human beings to share knowledge otherwise restricted to individuals, making it collective and expanding it. In learning processes, collaboration and cooperation are crucial to a community: to cooperate is to work with someone, to operate together with them, and to build something with others. According to Zaidan and Bax (2010), there is the possibility of creating a new relationship between information and its users, in which technological devices stimulate individual interaction with information. Vygotsky (1998), in turn, explains that collaboration between people (peers) is an essential action for the learning process, because it reveals the heterogeneity of groups, helping to preserve the virtue of the cognitive process implicit in interactions and communications. In each member of a community are found the sources that stimulate group life, regulating individual actions. The emphasis lies in the essential condition of the stimulation of group life, which nevertheless permeates its controls (Piaget, 1995). We must also introduce the more contemporary concept of mass collaboration, discussed by Tapscott and Williams (2006). According to these authors, due to fundamental changes in technology, demographics, business, economics and the world, we are in a new era in which people participate in the economy like never before. In the past, collaboration existed on a small scale, occurring only among relatives, friends and associates, in homes, communities and workplaces.

3.2. Brief History of the Evolution of the Web

The web is creating new spaces and contexts for the construction of knowledge.
Its first generation was characterized by the huge amount of information made available; but its regular users, mere readers, could not change or edit the content themselves (Recuero, 2009). Web 2.0 adds principles that characterize it as a platform not only for the consumption but also for the production of information shared by its users. Content can be published by users in a simple and straightforward way, increasing collaboration. Also notable is that most available systems are free of charge, usually supported by advertising. According to Zaidan and Bax (2010, p. 9), "technology is not neutral and with the advent of Web 2.0, society undergoes several changes, and among them, some may be considered fundamental." Perhaps the most significant is the possibility of computer-mediated socialization. Web 3.0 can be characterized as a semantic social web; Web 4.0 lies further in the future, when artificial intelligence will be put in place (Spivack, 2007).




3.3. Wikis as collaborative tools

With the use of IT, society experiences significant impacts. In this research the focus is placed on the possibility of the socialization of knowledge through the use of communication tools. Indeed, collaborative tools are able to provide web users not only with cooperation and collaboration, but also with knowledge construction in an interactive manner (Recuero, 2009). Hypertext on the Internet emerged as a paradigm of social construction, as users negotiate, reconstruct and share knowledge (Majchrzak; Wagner; Yates, 2006). In the context of enterprise collaboration, according to Tapscott and Williams (2006), the world is moving towards dispersed and decentralized knowledge. As is well known, the term wiki comes from the Hawaiian language and was coined by Ward Cunningham, who created the WikiWikiWeb in 1995. Wikis are used for different purposes, among them:
● Maintenance of knowledge networks;
● Construction of knowledge in communities;
● Cooperation in the construction of knowledge;
● Knowledge management systems.

Wikis, as tools of knowledge construction, enable collaboration in the knowledge society (Majchrzak; Wagner; Yates, 2006; Ramalho; Vidotti; Fujita, 2007; Tapscott; Williams, 2006). They exist not only on the Internet, but also on the intranets of organizations.

3.4. Semantic Web

It is widely known that the semantic web is the result of applying knowledge representation technologies to distributed systems in general, to fill the communication gap between humans and machines. Indeed, today knowledge is implicit in web pages, and so it is difficult for machines to extract and process (Breitman, 2005; Mika, 2007). In the classic article "The semantic web" (2001), the semantic web is described as an extension of the current web, developed so that machines can serve humans more efficiently. For that, it is necessary to build instruments that provide a logical (syntactic) and semantic sense to computers (Berners-Lee, 2001). Different fields collaborate in the building of the semantic web: computational linguistics, databases, knowledge representation, knowledge-based systems and service-oriented computing.

3.4.1 Metadata

Metadata allow the tagging of data and information for various purposes, such as content identification, description and location. The realization of the semantic web depends on the joint construction of a worldwide network of metadata; in this context, they are also referred to as controlled vocabularies.




3.4.2 Taxonomy

Taxonomy is a scientific classification, proposed by a group of experts in a particular field of knowledge. In IT, it is simply a hierarchical classification of entities that mirrors relationships established in the real world.

3.4.3 Ontologies and languages

In information systems, an ontology is a formal, consensual (within a group) conceptualization of domain knowledge. Ontologies are conceptual models that capture, in an explicit way, the shared vocabulary used by applications, guaranteeing communication with controlled ambiguity; they make up a sort of lingua franca of the semantic web. Building an ontology implies making a conceptual model of a domain in a formal language, enabling inferences by computers. This modeling activity is a formal description of some aspects of the physical (real) world, aiming at its representation (Breitman, 2005; Mika, 2007).

3.4.5 XML, RDF and OWL

Created in 1998, XML (eXtensible Markup Language) has constituted the serialization syntax for various other formalisms. RDF (Resource Description Framework) is a standard recommended by the W3C in 2004, whose history began in 1995, with a first proposal in 1999. RDF represents metadata in the form of statements about properties and relationships between resources on the web (Breitman, 2005). It provides simplified semantics with a good representation for the treatment of metadata, but it does not provide the expressiveness required for an ontology (Mika, 2007). OWL (Web Ontology Language) is more comprehensive and expressive than RDF. Mika (2007) proposes, for teaching purposes, a comparison between the formalisms entity-relationship diagram (E/R), UML (Unified Modeling Language), XML and RDF/OWL.

Table 1: Comparison between semantic and non-semantic languages

            Origin   Primitive   Expressivity   Distributed representation   Formal semantics
  E/R       1976     Relation                   No                           No
  UML       1995     Objects     ••             No                           Yes [2]
  XML       1998     Entity      ••             Yes                          No
  RDF/OWL   2004     Resource    ••••           Yes                          Yes

Source: Mika, 2007.

[2] With UML 2.0.
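The RDF statement model described above can be illustrated with a minimal sketch in plain Python (no RDF library is used, and the resource names are hypothetical examples, not taken from the paper): each fact is a (subject, predicate, object) triple about web resources.

```python
# RDF-style statements as (subject, predicate, object) triples.
# Names such as "ex:Contecsi2010" are hypothetical illustration only.
triples = {
    ("ex:Contecsi2010", "rdf:type", "ex:Congress"),
    ("ex:Contecsi2010", "ex:city", "ex:SaoPaulo"),
    ("ex:Contecsi2010", "ex:eventDate", "2010-05"),
}

def objects(subject, predicate):
    """Return every object asserted for the given subject and predicate."""
    return {o for s, p, o in triples if s == subject and p == predicate}

print(objects("ex:Contecsi2010", "ex:city"))  # {'ex:SaoPaulo'}
```

The point of the design is that a statement is data, not markup: any application that understands the triple structure can query or merge these facts, which is what RDF adds over plain XML.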




3.5 Semantic Wikis

Semantic wikis extend traditional wikis by allowing content to be annotated explicitly, making it more readable for machines. Some limitations of traditional wikis can thus be addressed (Krötzsch; Vrandečić; Völkel, 2005; Krötzsch et al., 2007):
● Content consistency: in traditional wikis, the same information often appears on several pages. Semantic markup allows for greater information consistency, avoiding ambiguity at the time users insert content.
● Content access: big wikis have many pages, making it a challenge to search for and compare information. Using queries with a syntax close to SQL [3], semantic wikis can return exactly the desired content.
● Content reuse: the motivation of wikis is to provide information, but the unstructured content of traditional wikis only allows reading through browsers.

Semantic wikis enable semantic enrichment of content that remains easily manipulated by the user. They allow:
● Classification and annotation of links;
● Dynamic presentation of contents;
● Richer navigation;
● Metadata;
● Data in triples;
● Semantic search;
● Embedded queries.

3.5.1 Examples of semantic wikis

Nowadays, several semantic wikis are available, many of them as free software. Among them: AceWiki [4], KiWi [5], Knoodl [6], OntoWiki [7] and Semantic MediaWiki (SMW) [8]. Semantic MediaWiki (SMW) is an extension of MediaWiki [9], which is also free. The research began in 2005 in Germany at the AIFB (Informatics Institute of the Faculty of Economics and Business Engineering [10]) in cooperation with KIT (Forschungszentrum

[3] Language for data manipulation in relational databases.
[4] http://attempto.ifi.uzh.ch/acewiki/
[5] http://www.kiwi-project.eu/
[6] http://knoodl.com
[7] http://ontowiki.net
[8] http://semantic-mediawiki.org
[9] MediaWiki was created in 2002 by Magnus Manske. It is free software, and its installation requires a web server (Apache), a database (MySQL or PostgreSQL) and PHP. In 2003 Jimmy Wales chose MediaWiki as the tool to support Wikipedia.
[10] http://www.aifb.kit.edu/web/Hauptseite/en




Karlsruhe GmbH and Universität Karlsruhe [11]). The researchers responsible are Markus Krötzsch, Denny Vrandečić and Max Völkel. The main features of SMW are:
● It is a MediaWiki extension;
● Nothing in MediaWiki is overwritten;
● Its functions are called only when necessary;
● Simple semantic marks;
● No deep knowledge of ontologies is needed.
The technologies required for installation are the same as for the aforementioned MediaWiki (Krötzsch; Vrandečić; Völkel, 2005).

4. SEMANTIC WIKI ANALYSIS AS A COLLABORATIVE BASIS FOR ONTOLOGY CONSTRUCTION

Collaboration aspects were embryonic and only partially explored by the semantic web community in the first half of the 2000s. Tolksdorf and Simperl (2006) report that research efforts were focused on issues related to knowledge representation. Correndo and Alani (2007) explain that the collaborative construction of ontologies was little supported by ontology editors. In fact, some ontologies need to be agreed upon by user communities, and reaching this agreement requires tools and methodologies that allow users to express and record, collaboratively, their points of view. To address these requirements, many tools have evolved, primarily from Web 2.0 and Web 3.0. In state-of-the-art semantic approaches, ontologies have been sustained as the key to the most advanced technologies supporting knowledge workers. Still, due importance is often not given to the conscious elaboration of ontology projects, and they are built without the minimum requirements for construction. In fully operational semantic web systems, the concept of collaboration seen in the previous section is strongly compromised by this deficiency. Braun, Schmidt and Zacharias (2007) show that the main challenge is to turn implicit, informal ontologies into the explicit, formal models needed for semantic approaches, including semantic wikis.
These authors state that, in order to integrate ontology construction with the natural (implicit) emergence of vocabularies, the following is needed:

● Ontology construction integrated with the usual tasks of users, for example annotation or navigation;
● It should not be assumed that modeling users are experts; the complexity of ontology-editing tasks should be reduced to a minimum. This also means balancing the expressiveness of ontologies with the usability of the editing program, for example by reducing constructions to taxonomic structures;

[11] http://www.kit.edu/kit/english/index.php




● Finally, strong collaboration, not only because ontologies share concepts, but also because it is necessary to disseminate them to various communities.

Kousetti, Millard and Howard (2008) corroborate this and add that semantic wikis are the perfect combination of collaboration and semantic expressiveness. Analyzing semantic wikis as a basis for collaborative ontology construction, these authors explain that the creation of ontologies has traditionally been in the hands of experts, just as knowledge management has been left to specialists in that area. For the semantic web to be realized, all participants in the process must have the opportunity to contribute and, therefore, to collaborate. The ease of authorship in wiki pages is a great motivator for the integration and interoperability of the semantic web. Knowledge users who are not ontology specialists need only a minimal understanding of how to operate a semantic wiki, without deepening their knowledge of ontological concepts. Concluding this analysis of semantic wikis for the collaborative construction of ontologies, it is worth turning to Lim and Ko (2009), who show that ontology construction is the main part of semantic web applications. These authors propose a semantic wiki in which domain experts, not ontology technicians, can easily and collaboratively organize, evaluate and refine semantically marked content. When a new wiki page is generated, the templates with semantic markup on the pages are also generated, and the collaboratively created ontology is then produced automatically. In the same vein, Kasisopha and Wongthongtham (2009) demonstrate ontology evolution based on semantic wikis. They start from the assumption that maintaining an ontology and introducing new versions to users is time-consuming and costly, with the added risk of new versions being built before the previous ontology is put into production. Semantic wikis support the management of ontologies in that they are characterized by ease of creation and editing, semantic search, authentication and quality management.
The semantic wiki chosen in this study complies with these aspects to the extent that users add semantic structures to the text in a collaborative environment on the Internet or an intranet. The same cannot be said when users build and edit their ontology in a stand-alone [12] editor, to be imported later into the semantic web application. The next section confirms these concepts through a practical case study.

5. SUPPORT OF THE ONTOLOGY COLLABORATIVE CONSTRUCTION

Let us illustrate, with a practical example, how a semantic wiki can serve to support the ontology engineering process. In the engineering of ontologies, domain experts and knowledge engineers work together to create a formal ontology. Semantic wikis can support this process:

● Domain experts (non-technical) have an easy way to explain their knowledge;

[12] A term meaning that the application can be run and controlled by an operator independently.




● Domain experts and knowledge engineers can work together, each bringing their personal experience; and
● Knowledge can be incrementally formalized in an evolutionary process, starting with informal texts.

At first, it is observed that some semantic wikis, such as Knoodl and OntoWiki, which are not part of this case study, allow the import of an ontology that has already been built; in this sense they are oriented not so much towards supporting its engineering as towards its use. In such cases, the ontology provides the basis for annotating content included in the wiki. SMW, the semantic wiki chosen in this study, does not natively allow the import of a previously built ontology without installing extensions (Vrandečić; Krötzsch, 2006). This can be considered a limitation (Auer, 2006; Dietzold et al., 2010); however, it should be noted that the SMW scenario is one in which the semantic structure is built collaboratively rather than merely imported (Krötzsch; Vrandečić; Völkel, 2005). The simplicity and collaborative ease of use of SMW, inherent in its nature as a wiki tool, make clear its appeal for certain projects, especially in support of the collaborative engineering of ontologies. As an example, consider a researcher who wants to create a semantic portal for a given congress (FIG. 1). He begins to fill the semantic wiki with texts, images, and perhaps even audio content, available through an easy-to-use editor. He can also describe simple relationships between contents. At some point, his experience with the technology is no longer enough; then a knowledge engineer, with experience in the semantic web, comes into play. The knowledge engineer, without specific knowledge of the topic (the congress), includes the simplest relationships needed to form an ontology. At the same time, the researcher can continue to fill the system with more content.
The semantic structure of SMW includes:

● Categories and subcategories (classes, for meta-classification);
● Properties (annotated links);
● Data types (to type property values);
● Instances (the wiki pages themselves);
● URIs (Uniform Resource Identifier: a character string that identifies a web resource).

Figure 1 illustrates the class hierarchy that will be used in the case study. When inserted into SMW, these classes become categories and subcategories, the basis for the ontology development. In the class hierarchy of Figure 1, the root class is "Academic".
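As a minimal sketch (assuming SMW's standard markup of the period; the page and property names follow the case study), these elements appear in wiki text roughly as follows:

```wikitext
<!-- An instance is simply a wiki page, e.g. "Contecsi 2010" -->
[[Category:Congresses]]                       <!-- category: a class for meta-classification -->
The event took place in [[City::São Paulo]].  <!-- property: an annotated link -->

<!-- The page "Property:City" declares the property's data type -->
[[has type::String]]
```

Internally, each page, category and property is identified by a URI, which is what allows the wiki content to be exported as RDF.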

JISTEM, Brazil Vol.8, No. 3, Sept/Dec. 2011, pp. 539-554

www.jistem.fea.usp.br


548 Zaidan, F. H., Bax, M. P.

Figure 1. Class hierarchy (categories and subcategories).

It is noteworthy that, as in any engineering project, the use of a support tool obviously does not exempt one from a prior study of the project requirements, nor from planning and preparing the ontological structure, at least initially, before starting to use the semantic wiki. The figures that follow represent the ontological structure in SMW, ending with a comparison with a typical RDF triple (subject – predicate – object), the basis for all ontologies. We begin the presentation of the case study by showing, in Figure 2, a wiki page called "Contecsi 2010," developed in a traditional wiki. Note its text-only form and unstructured content.

“The 7th edition took place in the City of São Paulo in May 2010”

Figure 2. Traditional wiki page. Source: Research data.

Figure 3 shows a piece of the code of this page, seen through the wiki's code-editing option. Observe that there are two links:

● Internal link: syntax [[City of São Paulo (city)]], which refers to an internal wiki page;
● External link: syntax [http:// ... Contecsi 2010].
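Putting the two link syntaxes together, the source of the traditional page in Figure 2 would look roughly like the following sketch (the external URL here is a placeholder, since it is elided in the original):

```wikitext
The 7th edition took place in the [[City of São Paulo (city)|City of São Paulo]]
in May 2010. Official site: [http://www.example.org/contecsi Contecsi 2010].
```

Nothing in this markup is machine-interpretable beyond the link targets themselves, which is the limitation the next section discusses.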


Figure 3. Piece of code of the traditional wiki page. Source: Research data.

Using only links in the format of traditional wikis does not allow the formalization of the content structure for programmatic use by the tool itself, so the software can do nothing to support users with regard to ease of navigation, content retrieval, collaboration, reuse, and other resources for manipulating content. These and other limitations are the ones that semantic wikis aim to reduce (Schaffert, et al., 2009; Bao, et al., 2010). Table 2 presents the design of a semantic structure, greatly simplified for instructional purposes. It uses only one instance (or wiki page), the Contecsi 2010 event page, and one category, Congresses, with its properties: City, Classification, Event Date and Publication type.

| Instance | Classes or Categories (category) | Attributes or Properties (property) | Data types (type) |
|---|---|---|---|
| Contecsi 2010 | Congresses | City | String |
| | | Classification | String |
| | | Event Date | Date |
| | | Publication type | String |

Table 2: The planning of the semantic structure of the example.

To formalize the category, following SMW syntax, the following line has to be added to the page: [[Category:Congresses]]. Individual pages have to be created for the property settings. The figure below illustrates the page for the Classification property. Note that it is possible to declare a list of accepted values ("Allows value" here accepts only the values Excellent, Good and Bad); if a value outside the list is entered, an error will appear.
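Following SMW's syntax for special properties (a sketch consistent with the description above, not a reproduction of Figure 4), the page "Property:Classification" could contain:

```wikitext
<!-- Wiki page "Property:Classification" -->
[[has type::String]]          <!-- values of this property are character strings -->
[[Allows value::Excellent]]   <!-- the list of accepted values -->
[[Allows value::Good]]
[[Allows value::Bad]]
```

An annotation such as [[Classification::Average]] on a content page would then be flagged, since "Average" is not among the allowed values.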


Figure 4: SMW code for the "Classification" property. Source: Research data.

In the code above one finds the data type of this property, described in SMW syntax as [[has type::String]]. Finally, the whole semantic structure of the simplified case study is displayed on the Contecsi 2010 page, shown in Figure 5.

Figure 5: Semantic page for “Contecsi 2010”. Source: Research data.

Attention is drawn to a feature of SMW called "Facts about" (Figure 5), which explicitly shows the semantic relationships present in the content provided to SMW. This feature allows navigation through the semantic content: for example, a click on "Excellent" displays all the content of the "Congresses" type (Figure 6).


Figure 6: Result of a click on the word "Excellent" in Figure 5. Source: Research data.

This return from a click on the properties has advantages for the end user. Moreover, it is noteworthy that, for the semantic wiki administrator, all the navigation over pages and data can be built from more complex sentences, also called inline queries or embedded queries. As an example, with the code {{#ask: [[Category:Congresses]]}} embedded in a wiki page, any maintenance done on pages whose category is Congresses will be automatically reflected, resulting in what is shown in Figure 7.
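An embedded query can also select pages by property values and choose which properties to print; a sketch using the property names from Table 2 (the exact columns shown are an assumption, not taken from Figure 7):

```wikitext
{{#ask: [[Category:Congresses]] [[Classification::Excellent]]
 |?City
 |?Event Date
 |?Publication type
 |format=table
}}
```

The result is a table of all congresses rated Excellent, kept up to date automatically as pages are edited.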

Figure 7: Result of an embedded query. Source: Research data.

It is interesting to note that a typical RDF triple (subject – predicate – object) corresponds to the "Classification" property of "Contecsi 2010", as shown in Figure 5, and could be expressed as follows: <Contecsi 2010, Classification, Excellent>.

6. FINAL CONSIDERATIONS

It is known that the construction of an ontology is a complex task that should preferably be performed by multi-disciplinary teams; however, the most common tools to build ontologies (e.g. Protégé13) do not yet allow for remote collaborative teamwork. After a brief presentation of the Semantic Web key concepts, the article presented an instructional case study aimed at illustrating how a semantic wiki tool could be used as an interesting support option for the collaborative construction of ontologies. The capacity for collaboration and cooperation becomes rich and effective with an easy and simple documentation process.

13 http://protege.stanford.edu/


It can be said that the article illustrated how what we call "the wiki philosophy," enhanced by the methods and techniques of the Semantic Web approach, gives rise to a simple but effective tool that proved useful in certain phases of ontology engineering, or even as support throughout the process. The support for ontology documentation is noteworthy: an ontology, once represented in a wiki, already constitutes its own documentation. The article makes explicit the semantic structure underlying the SMW tool and shows how it deals with some major problems of today's wikis, such as inconsistency of content, access and knowledge reuse. The usability of semantic wikis, and of Semantic MediaWiki in particular, was verified: even people who are not specialists in logic or ontology could use it. However, it is important to point out the limitations of this textbook case study, with its very simplified knowledge domain, in representing all the SMW features. As noted above, the SMW tool is not the only one; future work could make more comparisons with others such as Knoodl and OntoWiki.

It is believed that the use of a semantic wiki as a tool to support remote collaborative work has advantages. One can cite, for example, the flexibility to absorb, in a simple way, proposals from other team members while preserving the formal structure that organizes the content. That is, it is believed that the use of semantic wikis to support the remote collaborative construction of ontologies is a good compromise between flexibility (necessary for humans) and formality (necessary for machines).

REFERENCES

Auer, S.; Dietzold, S.; Riechert, T. OntoWiki – A Tool for Social, Semantic Collaboration. In: International Semantic Web Conference, 2006.

Bao, J.; et al. Development of a controlled natural language interface for Semantic MediaWiki. In: N. E. Fuchs (ed.), Proceedings of the Workshop on Controlled Natural Language, Lecture Notes in Computer Science. Springer-Verlag, Heidelberg, Germany, 2010.

Berners-Lee, T.; Hendler, J.; Lassila, O. The Semantic Web. Scientific American, (5), 2001.

Boulos, M. N. K. Semantic Wikis: A Comprehensible Introduction with Examples from the Health Sciences. Journal of Emerging Technologies in Web Intelligence, Vol. 1, No. 1, Aug., 2009.

Braun, S.; Schmidt, A.; Zacharias, V. Ontology maturing with lightweight collaborative ontology editing tools. In: Gronau, N. (ed.), 4th Conference Professional Knowledge Management – Experiences and Visions (WM '07), Potsdam. Volume 2. Berlin: GITO-Verlag, 2007.

Breitman, K. K. Web semântica: a internet do futuro. Rio de Janeiro: LTC, 2005.

Choo, C. W. The knowing organization: how organizations use knowledge to construct meaning, create knowledge, and make decisions. Oxford: Oxford University Press, 1997.


Correndo, G.; Alani, H. Survey of tools for collaborative knowledge construction and sharing. In: Workshop on Collective Intelligence on Semantic Web (CISW 2007), Fremont, CA, USA, 2-5 Nov., 2007.

Cunningham, W. Wiki design principles. <http://c2.com/cgi/wiki?WikiDesignPrinciples>. Access date: Jan. 6th, 2011.

Dietzold, S.; et al. Making the Semantic Data Web easily writable with RDFauthor. In: Proceedings of EKAW 2010 – Knowledge Engineering and Knowledge Management by the Masses. Lisbon, Portugal, 2010.

Hoobie, D. Enterprise 2.0 at Goodwin Procter. KMPro Journal, v. 6, n. 1, 2009.

Krötzsch, M.; Vrandečić, D.; Völkel, M. Wikipedia and the Semantic Web: the Missing Links. In: Proceedings of the 1st International Wikipedia Conference, Wikimania 2005.

Kasisopha, N.; Wongthongtham, P. Semantic Wiki-based Ontology Evolution. In: 3rd IEEE International Conference on Digital Ecosystems and Technologies, DEST '09, 2009.

Kousetti, C.; Millard, D. E.; Howard, Y. A Study of Ontology Convergence in a Semantic Wiki. In: Proceedings of WikiSym 2008. ACM, New York, NY, 2008.

Krötzsch, M.; et al. Semantic Wikipedia. Journal of Web Semantics. Elsevier, May, 2007.

Lévy, P. Collective Intelligence. New York and London: Plenum Trade, 1997.

Lim, S-K.; Ko, I-Y. Collaborative Ontology Construction Using Template-Based Wiki for Semantic Web Applications. In: Proceedings of the International Conference on Computer Engineering and Technology, ICCET '09, Volume 02, 2009.

Majchrzak, A.; Wagner, C.; Yates, D. N. Corporate wiki users: results of a survey. In: Proceedings of the 2006 International Symposium on Wikis. New York, NY: ACM Press, 2006.

Mika, P. Social networks and the semantic web. New York: Springer, 2007.

O'Reilly, T. What is Web 2.0? Design patterns and business models for the next generation of software. O'Reilly Media, Sebastopol, CA, 2005. <http://oreilly.com/web2/archive/what-is-web-20.html>. Access date: Jan. 6th, 2011.

Piaget, J. Sociological Studies. London: Routledge, 1995.

Ramalho, R. A. S.; Vidotti, S. A. B. G.; Fujita, M. S. L. Web semântica: uma investigação sob o olhar da Ciência da Informação. Datagramazero, Rio de Janeiro, 2007.

Recuero, R. Redes sociais na internet. Porto Alegre: Sulina, 2009.

Santos, M. L. B.; Franco, C. E.; Terra, J. C. C. Gestão de conteúdo 360°: integrando negócios e tecnologia. São Paulo: Editora Saraiva, 2009.

Schaffert, S. IkeWiki: A Semantic Wiki for Collaborative Knowledge Management. In: 1st International Workshop on Semantic Technologies in Collaborative Applications, STICA 06, Manchester, UK, Jun. 2006.


Schaffert, S.; et al. KiWi – A Platform for Semantic Social Software. In: Proceedings of the Workshop on Semantic Wikis, in conjunction with the 6th European Semantic Web Conference (ESWC09), 2009.

Spivack, N. How the WebOS Evolves? 2007. <http://www.novaspivack.com/technology/how-the-webos-evolves>. Access date: Jan. 4th, 2011.

Tapscott, D.; Williams, A. D. Wikinomics: How mass collaboration changes everything. New York: Portfolio, 2006.

Tolksdorf, R.; Simperl, E. P. B. Towards Wikis as Semantic Hypermedia. In: Proceedings of the 2006 International Symposium on Wikis, WikiSym '06. Denmark, 2006.

Vrandečić, D.; Krötzsch, M. Reusing ontological background knowledge in semantic wikis. In: Proc. of 1st Workshop From Wiki to Semantics (SemWiki'06), 2006.

Vygotsky, L. S. Mind in Society: The development of higher psychological processes. Cambridge, MA: Harvard University Press, 1978.

W3C. <www.w3c.org>. Access date: Jan. 6th, 2011.

Wazlawick, R. S. Metodologia de Pesquisa para Ciência da Computação. São Paulo: Campus, 2009.

Zaidan, F. H.; Bax, M. P. WIKI – enterprise collaboration tool of Web 2.0: case study. In: 7th CONTECSI International Conference on Information Systems and Technology Management. São Paulo: USP, 2010.



JISTEM - Journal of Information Systems and Technology Management Revista de Gestão da Tecnologia e Sistemas de Informação Vol. 8, No. 3, Sept/Dec. 2011, pp. 555-580 ISSN online: 1807-1775 DOI: 10.4301/S1807-17752011000300003

ALIGNING INFORMATION SECURITY WITH THE IMAGE OF THE ORGANIZATION AND PRIORITIZATION BASED ON FUZZY LOGIC FOR THE INDUSTRIAL AUTOMATION SECTOR

André Marcelo Knorst, Coester Automação Ltda, São Leopoldo, RS, Brazil
Adolfo Alberto Vanti, UNISINOS, São Leopoldo, RS, Brazil
Rafael Alejandro Espín Andrade, Universidade Técnica de Havana, Cuba
Silvio Luiz Johann, Fundação Getúlio Vargas, Rio de Janeiro, Brazil

ABSTRACT

This paper develops the strategic alignment of organizational behavior through the organizations' image, strategic prioritization and information security practices. To this end, information security is studied based on the business requirements of confidentiality, integrity and availability, by applying a tool which integrates the strategic, tactical and operational visions through the following framework: Balanced Scorecard – BSC (strategic) x Control Objectives for Information and Related Technology – COBIT (tactical) x International Organization for Standardization/International Electrotechnical Commission – ISO/IEC 27002 (operational). In parallel with this analysis, an instrument for the image of the organization is applied to identify and analyze performance involving the profiles of mechanistic, psychic prisons, political systems, instruments of domination, organisms, cybernetic, and flux and transformation (Morgan, 1996). Finally, a model of strategic prioritization based on compensatory fuzzy logic (Espín and Vanti, 2005) is applied. The method was applied to an industrial company located in southern Brazil. The results show two organizational images, "organism" and "flux and transformation", and the strategic priorities indicated a significant search for new business services and international markets. Regarding the protection of information, security was found to lie in the gap between "minimal" and "reasonable", and in domain 8 (HR) of the ISO/IEC 27002 standard, 71% of the protection was considered "inadequate" or "minimal" in the IT governance context.

Keywords: security, information, organizational culture, images, compensatory fuzzy logic

_____________________________________________________________________________________

Manuscript first received/Recebido em: 28/06/2010. Manuscript accepted/Aprovado em: 19/07/2011.

Address for correspondence / Endereço para correspondência

André Marcelo Knorst, Gerente de TI - Coester Automação Ltda. Mestre em Administração – UNISINOS, Bacharel em Sistemas de Informação – UNISINOS. Rua Jacy Porto, 1157, Bairro Vicentina, São Leopoldo, RS, Brasil. E-mail: amknorst@gmail.com

Adolfo Alberto Vanti, Doutor em Direção de Empresas, Universidade de DEUSTO – País Vasco – Espanha. Professor Titular do Programa de Doutorado e Mestrado em Administração e do Mestrado em Ciências Contábeis da Unisinos. Av. Unisinos, 950 – CEP 93022-000 – Cristo Rei – São Leopoldo-RS, Brasil – Ciências Econômicas. E-mail: avanti@unisinos.br

Rafael Alejandro Espín Andrade, Professor Titular do Centro de Estudos de Tecnologia de Direção da Universidade Técnica de Havana. Matemático e Doutor em Engenharia – Tribunal Nacional Científico de Cuba. Rua 114 número 11900 entre 127 e 119, Marianao, Havana, Cuba. E-mail: espin@ind.cujae.edu.cu

Silvio Luiz Johann, Fundação Getúlio Vargas – Rua Praia de Botafogo, 90 – Rio de Janeiro, RJ, Brasil. FGV Management. E-mail: silviojohann@terra.com.br

Published by/Publicado por: TECSI FEA USP – 2011. All rights reserved.


Knorst, A. M., Vanti, A. A., Andrade, R. A. E., Johann, S. L.

1. INTRODUCTION

The culture of a company is a type of collective personality and can be noticed in the form of the images it expresses. According to Morgan (1996), the proper interpretation and analysis of what is conventionally called the representation of organizational images allows for a proper reading of the dominant culture in the company, and this reveals some dysfunctional characteristics of each image. These organizational images have a certain influence on information technology (IT), and specifically on organizational information security.

The relationship between organizational culture and information technology is a complex one, in which each influences the other in illogical ways. Examples of this illogical influence are the resistance to the implementation of a new technological project, or a modernization project that imposes new behavior without respecting the identity of the organization or the way it operates internally. The consequences of behavioral and technological misalignment can generate significant losses for the company, from project delays and failures in implementation to leakage of strategic information.

The examples described above illustrate the theory of complexity, which admits that a small number of simple rules can generate extremely complex results, because they make a spontaneous order possible. This can be detrimental to the company and its main projects when the factors of flexibility and controllability are not respected (Adizes, 1989). When organizations are young they are very flexible and not always controllable but, as they age, this relationship changes: normally, over time, controllability increases and flexibility decreases, making an older company generally more controllable, but also more inflexible, with little propensity for change.
This type of situation can be evaluated and analyzed through images (Morgan, 1996), instrumented by Johann (2008), who operationalizes the images of mechanistic, psychic prisons, political systems, instruments of domination, organisms, cybernetic, and flux and transformation. Currently, for a company to survive and to be competitive, it must innovate, constantly changing its profile to a more flexible one. This consequently makes it less controllable and more vulnerable, which can mean problems in information security when behavior emerges that runs counter to the organizational culture. Thus, the problem-question of this study is defined as follows: how to align information security with the organizational image in the case of the industrial automation company studied?

To this end, an instrument was developed and utilized for (1) the evaluation and analysis of the organizational image, in conjunction with (2) an instrument for the evaluation and analysis of information security in the company, which integrates models such as the BSC, COBIT and ISO/IEC 27002 (Knorst, 2010). The two instruments (technical and behavioral) are combined because the specific security instrument (BSC + COBIT + ISO/IEC 27002) can only capture technical characteristics; on its own it does not address vulnerabilities of a personal or behavioral character, which are rooted in organizational culture and counterculture, an everyday occurrence in organizations. Finally, a model of strategic prioritization based on compensatory fuzzy logic (LDC) is applied along with the organizational images to identify the strategic variables in which the major vulnerabilities in information security may be encountered. A company in the industrial automation sector was selected to apply this alignment in practice.
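For reference, in the geometric-mean formulation of compensatory fuzzy logic usually associated with Espín and collaborators (a sketch from the published model, not reproduced from this paper), conjunction and disjunction over truth values $x_1,\dots,x_n \in [0,1]$ are defined so that low values can be partially compensated by high ones:

```latex
c(x_1,\dots,x_n) = \left( \prod_{i=1}^{n} x_i \right)^{1/n}
\qquad
d(x_1,\dots,x_n) = 1 - \left( \prod_{i=1}^{n} (1 - x_i) \right)^{1/n}
```

This compensatory behavior is what makes the operators suitable for ranking strategic priorities from expert judgments.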

2. ORGANIZATIONAL CULTURE AND COUNTERCULTURE THROUGH ORGANIZATIONAL IMAGE

According to Schein (1990), managers often work with new patterns and practices to establish a competitive and stable standard for the company, but fail to recognize the fundamental corporate culture, or the "foundation" of the corporate culture. This "foundation" is made up of the values shared by the employees, beliefs and behavior, which emerge from the success of the organization. When a culture becomes counterproductive, executives must work from a new cultural perspective, recovering residual elements of past successes, revitalizing productive habits and managing present conflicts and anomalies so that the organization can recover its essential purpose. In contrast, significant increases in a corporate counterculture, or behavior contrary to expected employee behavior, can be compromising, mainly through the leaking of strategic information.

For decades, entrepreneurs and top executives have repeated, ad nauseam, that the greatest assets of the organization are its people. This is a figurative way of enhancing the value of people, since they are only included on the balance sheet as a cost. Often the reality is that the company does not adequately recognize this value, and the employee may be treated as a simple resource, not a partner, replaceable at any given moment. According to Minarelli (1995), this is a very present reality; companies are aware of it, and its direct consequence can be a reduction in employee loyalty to the organization. This decrease in loyalty can result in different types of problems, including those related to information security, which is the focus of this paper.
When an organization has a high level of shared beliefs and values, it has a dense culture which, in the view of Freitas (1991), is based on the existence of few disagreements or ambiguities in personal attitudes and in decision making, as well as in the management of information security. On the other hand, when the culture of an organization is not dense, various different sub-cultures can coexist within the organization. These sub-cultures can be specific to departments, units, etc., and can differ very significantly from the core of the corporate culture. In some cases, sub-cultures that are too different from the central core of the company can potentially compromise corporate sustainability (Freitas, 1991).

The counterculture binds groups or subgroups that reject what the organization represents, or it tries to become a covert opposition to the dominant values and/or power structure of the company. This counterculture often arises during times of stress or during major transformations within the company. For Freitas (1991, p. 77), “these forms of resistance and conflict express breaches in the system of formal power, […] that include: negation or hiding of information; the boycott of or resistance to innovations; and failing to cooperate amongst workgroups”.


In the central nucleus of organizational culture is the "self", which represents the integration of conscious and unconscious systems. These systems are a set of actions repeated over time, establishing a form: the guidelines adopted by the company. To identify the unconscious systems of an organization is, in truth, to recognize its grey areas; their images can be identified as taboos or prejudices that somehow obscure the central idea of the organization and of information security. This "self" can be represented through organizational images (Morgan, 1996) such as mechanistic, psychic prisons, political systems, instruments of domination, organisms, cybernetic, and flux and transformation. What follows is a synthesis that develops the understanding needed to apply the specific images instrument (Johann, 2008):

Mechanistic: Organizations that impose rigid routines and patterns, hierarchically distributed. Dealings are impersonal and control of the organization is bureaucratic. Because it is very predictable, it is no longer regarded as ideal, even in stable and authoritarian institutions. This style also presents difficulties for innovation.

Psychic Prisons: Inflexibility is a characteristic of this image; the organization becomes a prisoner of past events, allied to the founding attitudes of its idealizers. Some of its traps are false assumptions, unquestioned rules and fanaticism around the charisma of the leader.

Political Systems: This view does not often serve the interest of the group and frequently favors authoritarian executives. It also encompasses companies with participatory management, because although there is a certain distribution of power, the central objective is set by both subordinates and the owners of the capital.

Instruments of Domination: In organizations viewed as instruments of domination, employees and managers need to dedicate themselves completely to the company. They feel insecure about their employment and experience a lot of stress on the job.

Organisms: The fundamental principle of organisms is that the organization is based on the employees' intellectual capital, and motivation is a substantial factor. Because of constant innovation and deadlines, employees tend to obey a biological clock, as there are targets to reach and innovations constantly to develop.

Cybernetic: Intellectual capital is highly valued and is constantly being stimulated to improve. Decision-making needs to be done "through formal or temporary processes, producing policies and plans that offer a point of reference or a structure for information processing" (Johann, 2008, p. 33). The label cybernetic is due to the permanent presence of information technology, which ensures better conditions for reviewing policies, norms and procedures, in addition to learning how to absorb changes in the environment.

Flux and Transformation: The organizations that best mirror flux and transformation are those that modify and evolve to conform to change and evolution in the environment; their survival depends on their internal and external environments.

The images described above led to the creation and implementation of an instrument to identify these types in different companies. This instrument is presented in the following pages and completes the behavioral analysis, which is complemented by the analysis of information security in the context of IT governance, integrating BSC x COBIT x ISO/IEC 27002. This information security instrument was detailed in Knorst (2010).


3. IT GOVERNANCE

ITG can be considered the way in which decisions and responsibilities are directed towards desirable behavior in the use of IT (Weill and Ross, 2006). Another definition states that ITG is an integral part of corporate governance, established by management, organizational structures and processes, to ensure the expansion of strategies and objectives through IT practices (ITGI, 2009). ITG can also be considered the organizational capacity, exercised by the board and executive management, to control the formulation and implementation of IT strategies so as to ensure the integration between business and IT (Van Grembergen and De Haes, 2005).

Models of ITG such as the BSC (Kaplan and Norton, 1996), COBIT (ITGI, 2009) and ISO/IEC 27002 (ISO/IEC, 2005) assist with more effective management of IT resources. This work focuses on these models of IT governance as a continuation and expansion of the work of Knorst (2010); here they are analyzed with a behavioral approach and strategic prioritization for assessing information security.

4. INFORMATION SECURITY

Confidentiality, integrity and availability are basic requirements for business information security and support the maintenance of the business (ITGI, 2009; Kwok and Longley, 1999; Fitzgerald, 2007; Sêmola, 2003; Dias, 2000; Moreira, 2001). An organization's dependency on its IT infrastructure, combined with the neglect of security requirements, can put the entire information system at risk. A brief description of these requirements follows:

Confidentiality (C): All information must be protected according to the degree of secrecy of its content, limiting its access to, and use by, only the people for whom it is intended.

Integrity (I): All information must be kept in the same condition in which it was released by its owners, in order to protect it from tampering, whether intentional or accidental.

Availability (D): All information generated or acquired by an individual or institution should be available to its users at the moment they need it, for any purpose.

The ISO/IEC 27002 standard emphasizes the fact that information is an important asset of the organization, and the three elements described above are essential to preserve the competitiveness of the company. A summary of the goals of the areas of the standard follows, starting at 5 in order to preserve the numbering of the chapters (chapters 0 through 4 are introductory: 0 = introduction; 1 = scope; 2 = terms and definitions; 3 = structure of the standard; 4 = risk assessment and treatment). Information Security Policy (PI) is chapter 5 and Compliance (CF) is chapter 15.

Information Security Policy (PI): Provide guidance and direction for information security in accordance with business requirements and the relevant laws and regulations.

Organization of Information Security (OI): Manage information security within the organization, as well as maintain the security of information processing resources that are accessed, processed, communicated or managed by external parties.

Asset Management (GA): Achieve and maintain appropriate protection of organizational assets and ensure that information receives an adequate level of protection.


Human Resources Security (HR): To ensure that employees, suppliers and third parties understand their responsibilities and are suited to their roles, reducing the risk of theft, fraud and misuse of resources. They should be aware of the threats and concerns related to information security, as well as of their responsibilities and obligations; be prepared to support the organization's information security policy during their work, so as to reduce the risk of human error; and exit the organization, or change employment, in an orderly manner.

Physical and Environmental Security (SA): Prevent unauthorized physical access, damage and interference with the organization's facilities and information. Prevent loss, damage, theft or compromise of assets and interruption of the organization's activities.

Communications and Operations Management (GO): Ensure the correct and safe operation of information processing resources and minimize the risk of systems failures. Protect the integrity of software and information, and maintain the integrity and availability of information and information processing resources, as well as ensuring the protection of information networks and of the supporting infrastructure. Prevent unauthorized disclosure, modification, removal and destruction of assets, and interruption of business activities. Guarantee the security of electronic commerce services and their safe use, and detect unauthorized information processing activities.

Access Control (CA): Control access to information, ensure access for authorized users and prevent unauthorized access to information systems. Prevent access by unauthorized users, the compromise or theft of information and of information processing resources, unauthorized access to network services, and unauthorized access to operating systems. Prevent unauthorized access to information contained in application systems, ensuring information security when using mobile computing resources and remote work.

Information Systems Acquisition, Development and Maintenance (AQ): Prevent errors, loss, unauthorized modification or misuse of information in applications. Protect the confidentiality, authenticity and integrity of information by cryptographic means, ensuring the security of system files. Maintain the security of application systems and information, reducing the risk of exploitation of known technical vulnerabilities.

Information Security Incident Management (GI): Ensure that vulnerabilities and security events associated with information systems are disclosed, allowing corrective actions to be taken in time. Ensure that a consistent and effective approach is applied to the management of information security incidents.

Business Continuity Management (GC): Do not allow the interruption of business activities; protect critical processes from the effects of significant failures or disasters and ensure their timely resumption.

Compliance (CF): Avoid the violation of any criminal or civil statutes, regulations or contractual obligations, and of any information security requirements. Guarantee that systems comply with organizational information security policies and standards, maximizing the effectiveness of, and minimizing interference in, the information systems audit process.

With the objective of verifying compliance with the areas of the standard, Eloff and Eloff (2003) proposed a framework for information security governance with four levels of protection for security practices. This allows organizations to take a


holistic view and verify the current stage of development of each business area. The stages are as follows:

Inadequate protection: There is no organizational effort to implement any of the controls recommended for the organization's specific needs. Certified products and equipment have no influence on the classification of the sections at this level.

Minimal protection: The organization demonstrates minimal effort in adopting some of the recommended controls. Certified products and equipment have no influence on the classification of the sections at this level.

Reasonable protection: Most controls are implemented and meet the requirements, based on written procedures, with processes running at a reasonable level. Certified products and equipment are preferred for use.

Adequate protection: All controls recommended for the area are implemented. Wherever possible, the use of certified products and equipment is obligatory.

These evaluation metrics, in conjunction with the recommendations of the standard, permit the formulation of an instrument to assess information security, identifying the stage at which the organization's security practices stand.

5. MODEL INTEGRATION AND CREATION OF A RESEARCH INSTRUMENT TO EVALUATE INFORMATION SECURITY

It is possible to establish a strategic relationship using the security criteria and integrating them through the BSC, COBIT and ISO/IEC 27002 models, as depicted in Figure 1. These models facilitate the mapping of the generic IT objectives for the business, from the perspectives of the BSC, to the overall IT goals for the business suggested by ITGI (2009). Following Knorst (2010), it is then possible to map the generic IT goals for the business to the COBIT processes involving the requirements of confidentiality, integrity and availability. Finally, to reach a technical and operational level, this alignment also includes the mapping of COBIT processes to the practices of ISO/IEC 27002. The mapping process results in a framework involving the following steps: 1) Identification of business objectives in the perspectives of the BSC; 2) Identification of IT objectives; 3) Mapping of IT goals to business goals; 4) Mapping of IT goals to COBIT processes with respect to the security requirements of Confidentiality, Integrity and Availability; 5) Mapping of COBIT processes to the security practices proposed by ISO/IEC 27002.
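As an illustration of steps 1 to 5, the mapping chain can be represented as linked lookup tables. This is only a sketch: the objective, process and practice identifiers below are hypothetical one-path examples, not the study's actual mapping data.

```python
# The five mapping steps as linked lookup tables. All identifiers below are
# hypothetical one-path examples, not the study's actual mapping data.
bsc_to_it = {  # step 3: business goal (BSC perspective) -> generic IT goals
    "Customer: improve service orientation": ["IT-G1 Ensure service continuity"],
}
it_to_cobit = {  # step 4: IT goal -> COBIT processes (C/I/A requirements)
    "IT-G1 Ensure service continuity": ["DS4 Ensure continuous service"],
}
cobit_to_iso = {  # step 5: COBIT process -> ISO/IEC 27002 practices
    "DS4 Ensure continuous service": ["ISO/IEC 27002 14.1 Business continuity"],
}

def trace(business_goal):
    # Walk the chain: BSC objective -> IT goal -> COBIT process -> ISO practice
    for it_goal in bsc_to_it.get(business_goal, []):
        for process in it_to_cobit.get(it_goal, []):
            for practice in cobit_to_iso.get(process, []):
                yield (it_goal, process, practice)

for chain in trace("Customer: improve service orientation"):
    print(chain)
```

Each traced tuple makes one alignment path from a business objective down to an operational security practice explicit.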


[Figure 1 is a diagram showing BSC objectives mapped to COBIT processes and then to ISO/IEC 27002 practices, under the security requirements Confidentiality, Integrity and Availability.]

Figure 1: Integrating the BSC model, COBIT and ISO/IEC27002. Source: Knorst (2010, p. 49)

Knorst (2010) identifies, in detail, the BSC x COBIT x ISO/IEC 27002 mapping in relation to the generic IT requirements that impact the security requirements of Confidentiality, Integrity and Availability. This constitutes the first model and instrument in information security.

5.1. Compensatory Fuzzy Logic (CFL) Model for Prioritization

To define strategic prioritization, the study elaborated a strategic map based on Compensatory Fuzzy Logic (CFL); it is necessary to cross various sets of data in order to form the guiding actions of the organization. These sets of data include items such as strengths, weaknesses, opportunities, threats, objectives and actions. In the following, the principal points within the items mentioned above are presented. This is also a way to define which information is strategic for the control of internal security aspects. The model is of a qualitative and quantitative type, based on CFL, in which the strategic variables related to the SWOT analysis are first defined, with the addition of Strategic Objectives and Actions (SWOT-OA) (Vanti et al., 2006). The model is defined by structuring matrices, and the relations between the variables, based on compensatory fuzzy logic, were validated by the manager of the company. CFL, according to Espin and Vanti (2005), aims to compensate Boolean logic, which uses only the decision extremes 0 or 1 ({0,1}); it works with the principle of gradualism within the interval [0,1] in order to measure the truthfulness of its predicates, considering 0 and 1 as the extremes of falsity (completely false) and truthfulness (completely true). The midpoint (0.5) represents complete uncertainty or maximum vagueness. This representation is shown in the scale of truthfulness found in Table 1.


| Truth Value | Category |
| 0 | False |
| 0.1 | Almost false |
| 0.2 | Slightly false |
| 0.3 | Somewhat false |
| 0.4 | Falser than true |
| 0.5 | As true as false |
| 0.6 | Truer than false |
| 0.7 | Somewhat true |
| 0.8 | Slightly true |
| 0.9 | Almost true |
| 1 | True |

Table 1: Scale of Truth. Source: The Authors
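A small helper, as a sketch, for reading a computed truth value back onto the linguistic scale of Table 1; rounding to the nearest tenth is an assumed convention, not one stated by the authors.

```python
# Sketch: map a truth value in [0, 1] to the linguistic category of Table 1.
# Rounding to the nearest tenth is an assumed convention.
CATEGORIES = ["False", "Almost false", "Slightly false", "Somewhat false",
              "Falser than true", "As true as false", "Truer than false",
              "Somewhat true", "Slightly true", "Almost true", "True"]

def category(v):
    # The eleven-point scale is indexed by the value rounded to one decimal
    if not 0.0 <= v <= 1.0:
        raise ValueError("truth values live in [0, 1]")
    return CATEGORIES[round(v * 10)]

print(category(0.5))   # → As true as false
print(category(0.83))  # → Slightly true
```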

The truth values in Table 1 serve as data input for the matrices and are also used to calculate the results of compound predicates, which are sensitive to changes in the truth values of the basic predicates and to their verbal meaning, calculated as shown below. This proposal uses the geometric mean as the conjunction operator, the classical function n(x) = 1 - x as negation, and the dual of the geometric mean as the disjunction operator. Over a universe U with n elements, the universal and existential quantifiers are introduced in the following way:

$$
\forall_{x \in U}\, p(x) \;=\; \left(\prod_{x \in U} p(x)\right)^{1/n} \;=\;
\begin{cases}
\exp\left(\dfrac{1}{n}\sum_{x \in U}\ln p(x)\right) & \text{if } p(x) \neq 0 \text{ for all } x \in U\\
0 & \text{otherwise}
\end{cases}
\qquad (1)
$$

$$
\exists_{x \in U}\, p(x) \;=\; 1 - \left(\prod_{x \in U}\bigl(1 - p(x)\bigr)\right)^{1/n} \;=\;
\begin{cases}
1 - \exp\left(\dfrac{1}{n}\sum_{x \in U}\ln\bigl(1 - p(x)\bigr)\right) & \text{if } p(x) \neq 1 \text{ for all } x \in U\\
1 & \text{otherwise}
\end{cases}
\qquad (2)
$$

Accordingly, the n-ary conjunction and disjunction are

$$
v(p_1 \wedge p_2 \wedge \dots \wedge p_n) = \bigl(v(p_1)\,v(p_2)\cdots v(p_n)\bigr)^{1/n}
$$

$$
v(p_1 \vee p_2 \vee \dots \vee p_n) = 1 - \bigl((1 - v(p_1))(1 - v(p_2))\cdots(1 - v(p_n))\bigr)^{1/n}
$$

This formulation transfers the classical linguistic knowledge of SWOT to the calculation of strategic priorities, compensating for the lack of associativity of the conjunction and disjunction operators. It also amplifies its framework to join


strategic objectives and actions, in order to turn the SWOT analysis into an alignment ranging from threats and opportunities to the objectives and actions that employees should perform. In this way, the formulation is applied repeatedly to the matrices, computing geometric means as conjunctions and then carrying out disjunctions, until the limit multiplication of the objective x objective matrix is reached (to minimal error, via a Matlab function), followed by the final calculation of the sum of the variables. The model was programmed in the Delphi language and is used in research and in academic exercises in business administration and accounting courses for defining strategic prioritizations. For the data input, the manager of the company followed a sequence of questions to quantify each crossing among the variables. These questions are:

• How certain is it that each characteristic of the company is recommendable to propose each objective?
• How certain is it that each characteristic of the environment is recommendable to propose each objective?
• How certain is it that each characteristic of the company, together with each characteristic of the environment, must be considered to choose the strategies that lead to the company's vision?
• How certain is it that each characteristic is a characteristic of the company? This represents the evaluation of the presence of each characteristic of the company.
• How certain is it that each characteristic is a characteristic of the environment? This represents the evaluation of the presence of each characteristic of the environment.
• How certain is it that the fulfillment of each objective has influence on (or great importance to) the fulfillment of each of the other objectives?
• How certain is it that the performance of each action has influence on (or great importance to) the fulfillment of each of the objectives?
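A minimal sketch of the CFL operators defined above: geometric-mean conjunction, its dual disjunction, classical negation, and the quantifiers they induce over a finite universe. The example row of truth values is hypothetical.

```python
import math

def neg(x):
    # Classical negation n(x) = 1 - x
    return 1.0 - x

def conj(values):
    # CFL conjunction: the geometric mean of the truth values
    if any(v == 0 for v in values):
        return 0.0
    return math.exp(sum(math.log(v) for v in values) / len(values))

def disj(values):
    # CFL disjunction: the dual of the geometric mean
    return 1.0 - conj([neg(v) for v in values])

# Over a finite universe, the universal and existential quantifiers
# reduce to the same two operators
forall = conj
exists = disj

row = [0.7, 0.6, 0.9]          # hypothetical truth values from one matrix row
print(round(forall(row), 3))   # → 0.723
print(round(exists(row), 3))   # → 0.771
```

Note how, unlike the Boolean min/max pair, a single low value drags the conjunction down gradually rather than dominating it outright, which is the compensatory behavior the text describes.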

After the generation of the matrices, it was possible to process them, along with the structured equations, through computer systems. Thus, the relative importance of each variable was generated on the interval between 0 and 1, the same Scale of Truth (Table 1). To identify the data, a questionnaire was applied to the technical team to check the security practices involving information systems. The practices identified were based on ISO/IEC 27002 and on the integration of the BSC x COBIT x ISO/IEC 27002 models mapped in Figure 1, considering the security requirements of Confidentiality, Integrity and Availability. Eloff and Eloff (2003) proposed a classification of protection practices with the criteria "Inadequate", "Minimal", "Reasonable" and "Adequate"; after the pre-test with the information technician, it was suggested to include a category called "Not applicable". This proved appropriate because the standard suggests some practices that do not apply in the context of the business. For reasons of space, this instrument is shown in the present study already filled in with the results of the case study at the Coester Automação Industrial Company.

5.2. Model of Images of the Organization

The instruments for the identification of images of the organization were developed based on Morgan (1996) and Johann (2008). This instrument performs linear summations


to identify which image or images are most predominant in each organization to which it is applied. To this end, an evaluation criterion (1 through 4) is used, as presented in the following:

1 = Practically nonexistent in my organization.
2 = Low incidence in my organization.
3 = Reasonable occurrence in my organization.
4 = Strong presence in my organization.

The 35 questions developed to identify the images are presented next and, at the end, summarized in a table with an evaluation section for each question. The questions relate to the different images of the organization, which are Mechanistic (M), Psychic Prisons (PP), Political Systems (PS), Instruments of Domination (ID), Organisms (O), Cybernetic (C), and Flux and Transformation (FT). Questions:

1) Procedures, operations and processes are standardized.
2) Changes in the organization are normally a reaction to changes that have already occurred in the macro business environment.
3) Administrators frequently talk about authority, power and superior-subordinate relationships.
4) Flexible and creative action.
5) Working in inadequate circumstances and conditions is considered a proof of loyalty to the organization.
6) The organization sees itself as part of a larger system in which there is an interdependence involving the community, suppliers and the competition.
7) People and groups tend to display infantile behavior.
8) Past achievements are constantly cited as references and as examples of how to deal with present situations and how to face future adversities.
9) The organization evolves in harmony and balance with its macro environment.
10) People act under constant stress and pressure.
11) There is constant questioning and redirection of actions.
12) Power serves to provide discipline and achieve order in conflicts of interest.
13) The organization considers the motivations and needs of people.
14) There are rigid patterns and uniformity in people's behavior.
15) The company has and utilizes a great number of rules, norms and regulations about operational aspects of the business.
17) The delegation of power to operational levels tends to be very restricted.
18) Negative feedback is encouraged to correct the organizational direction.
19) The organization expects complete devotion and dedication from its employees.


20) The company benefits more from external events (environmental, etc.) than from strict planning.
21) There are many taboos and prejudices in the organization.
22) The relationships between superiors and subordinates tend to contain elements of love and hate.
23) Long-term achievements will be attained in partnership with the forces acting in the macro-environment, not against them.
24) Dismissing people and streamlining activities are part of the game.
25) Most people think about and influence the destiny of the company.
26) Interpersonal gossip consumes energy and diverts attention from productivity.
27) Organizational objectives and people's needs can be met simultaneously.
28) The organization is a realm of bureaucracy.
29) The organization is expected to operate in a routine, efficient, reliable and predictable manner.
30) Employees are seen as valuable resources who can offer rich and varied contributions to the organization's activities, provided that the organization attends to their needs and motivations.
31) Rumors and gossip are frequent.
32) The organization tends to offer quick answers to changes in its macro-environment.
33) The organization values executives who appear framed by, and faithful to, the company's way of being.
34) In strategic decision making the company normally abandons the simple view and prefers to take into account the complexity of the situation.
35) People are dedicated to the organization because they feel they belong to something greater, which transcends their existence and individual limitations.

The questionnaire used for data collection was developed on the basic premise that organizations project images which can, in principle, be perceived by the people who work in them, especially their employees.

In constructing the questionnaire, the theoretical approach of Morgan (1996) was used as a reference, with its eight types of metaphors - or images - that characterize the culture of each organization. This theoretical approach was applied to the field of organizational practice, considering how people who work in the same organization perceive the characteristics of their business's organizational metaphors - or images - as described by the author. In constructing the questionnaire it was also considered that the organizational images are not uniformly present in companies and that they vary in intensity or presence according to the company under review; that is, some images are more present in a particular organization, while others are less intense. Morgan (1996) outlined eight possible types of images (metaphors); in the research - and consequently in the construction of the questionnaire - "organizations as culture" was


not included among the metaphors, because in reality all of the other seven images can comprise features of the culture of a company. Johann (2008) notes that a company's culture is a kind of collective personality, and the images of Morgan may serve to characterize the organizational "self" of this collective personality. The organizational "self", within Johann's definition, is the central complex and nucleus of the culture; it is repeated in the interaction between people and in the consolidation of a set of attitudes that act within the consciousness and the unconsciousness (shadow zones), reflecting the organizational values and the rules of the game. Morgan's approach emphasizes that the organization projects multifaceted images - which can therefore be regarded as images of the organizational "self" - that are perceived by employees. These images take on different facets according to the nature of the culture of the company. So, in the construction of the questionnaire, we worked with the ability to check people's perceptions about the following images of Morgan - or organizational "selves":

M - organizations perceived as Machines;
O - organizations perceived as Organisms;
PS - organizations perceived as Political Systems;
C - organizations perceived as Brains (Cybernetic);
ID - organizations perceived as Instruments of Domination;
FT - organizations perceived as Flux and Transformation;
PP - organizations perceived as Psychic Prisons.

The questionnaire comprises a set of 35 questions in total. Each of the seven images above is addressed through five questions for the consideration of respondents. The questions were distributed throughout the instrument so that their dispersal would make the logic of the questionnaire less obvious from the perspective of the respondents.

5.3. Application of the Instrument to Identify Images of the Organization

The automation of industrial processes involves an extensive chain of activities that starts in scientific research and ends when put into operation in a productive unit. Technological evolution in the industry is constant; plants are not alike, and thus their automation is unlikely to be standardized. Legacy (pre-existing) systems are different and always require adjustments to make new equipment, infrastructure and communications applications compatible with existing ones. This causes the activities related to the automation of industrial processes to demand highly qualified manpower and investments in research and development. According to the company's quality manual, Coester was founded in 1963 and was originally dedicated to communication equipment for businesses and offices. In the mid-60s, beginning with the first Brazilian naval construction program, the company directed its activities to the design and manufacture of control systems for ships. The company's current focus is industrial automation. From an interview with the directors, it was possible to identify the images of the organization, as shown in Table 2. The instrument computes totals for each one of the images, composed from the 35 questions, as presented in Table 2.


Test results (each cell shows a question number and its rating; columns are the seven images):

| M | O | PS | C | ID | FT | PP |
| 01: 3 | 02: 4 | 03: 1 | 04: 3 | 05: 2 | 06: 3 | 07: 1 |
| 14: 2 | 13: 3 | 12: 3 | 11: 2 | 10: 2 | 09: 3 | 08: 4 |
| 15: 3 | 16: 3 | 17: 2 | 18: 2 | 19: 3 | 20: 4 | 21: 2 |
| 28: 2 | 27: 3 | 26: 1 | 25: 2 | 24: 2 | 23: 4 | 22: 1 |
| 29: 2 | 30: 3 | 31: 2 | 32: 2 | 33: 3 | 34: 3 | 35: 2 |
| Total: 12 | 16 | 9 | 11 | 12 | 17 | 10 |

Table 2: Results from the Application of the Organizational Image Instrument. Source: The Authors
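The linear summation the instrument performs can be sketched as follows, using the Table 2 ratings; the question-to-image grouping is the one implied by the table's columns, and the totals reproduce the table's bottom row.

```python
# Aggregating the 35 questionnaire responses (rated 1-4) into per-image totals,
# using the question-to-image grouping shown in Table 2.
responses = {1: 3, 2: 4, 3: 1, 4: 3, 5: 2, 6: 3, 7: 1, 8: 4, 9: 3, 10: 2,
             11: 2, 12: 3, 13: 3, 14: 2, 15: 3, 16: 3, 17: 2, 18: 2, 19: 3,
             20: 4, 21: 2, 22: 1, 23: 4, 24: 2, 25: 2, 26: 1, 27: 3, 28: 2,
             29: 2, 30: 3, 31: 2, 32: 2, 33: 3, 34: 3, 35: 2}

image_questions = {
    "M":  [1, 14, 15, 28, 29],   # Mechanistic
    "O":  [2, 13, 16, 27, 30],   # Organisms
    "PS": [3, 12, 17, 26, 31],   # Political Systems
    "C":  [4, 11, 18, 25, 32],   # Cybernetic (Brains)
    "ID": [5, 10, 19, 24, 33],   # Instruments of Domination
    "FT": [6, 9, 20, 23, 34],    # Flux and Transformation
    "PP": [7, 8, 21, 22, 35],    # Psychic Prisons
}

totals = {img: sum(responses[q] for q in qs) for img, qs in image_questions.items()}
print(totals)  # → {'M': 12, 'O': 16, 'PS': 9, 'C': 11, 'ID': 12, 'FT': 17, 'PP': 10}

# The two highest totals identify the predominant images
top = sorted(totals, key=totals.get, reverse=True)[:2]
print(top)     # → ['FT', 'O']
```

The two leading totals, "flux and transformation" (17) and "organisms" (16), match the analysis in the next subsection.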

5.4. Analysis of the Models Used (Images and Information Security)

The analyses resulting from the implementation of the organizational image instrument show a predominance of the image of "flux and transformation" (17) and of the image of "organisms" (16). Flux and transformation organizations are those that best change and evolve along with a changing and evolving environment; their survival depends on their internal and external environments. Organisms, on the other hand, rest principally on their employees, who form the intellectual capital, with motivation as a substantial indicator, since such organizations tend to follow a biological clock in which there are deadlines to be met and constant innovations. So we can say that the company is "dynamic": it follows the innovations of the external environment and, as part of these same changes, allies itself to a significant investment in the functional area. This latter aspect allows us to understand that the workforce can be easily aligned to a dense principal organizational culture, with little existence of a counterculture.

5.5. Application of Compensatory Fuzzy Logic (CFL) for Prioritization

First, to identify the strategic priorities, the variables of the SWOT analysis (Wright et al., 1998) were defined with the board of directors. Later these variables were expanded to include the Objectives and Actions, giving the SWOT a more realistic sense of the business process. This generates the mathematical and computational SWOT-OA model, sustained by Compensatory Fuzzy Logic (CFL), which crosses all the variables together and processes them to generate the final prioritization, or differentiated level of importance. In the following, all the strategic variables, the filling of the matrices and, finally, the result of the processing to direct the funds to be invested in information security are described.

Primary strengths within the company: a) Proprietary technology; b) Local engineering;

JISTEM, Brazil Vol.8, No. 3, Sept/Dec. 2011, pp. 555-580

www.jistem.fea.usp.br


569 Aligning Information Security with the Image of the Organization and Prioritization Based in Fuzzy Logic for the Industrial Automation Sector

c) New product line; d) Lean structure; e) Flexibility; f) Structure of local service; g) Employee qualifications; h) Financial balance.

Primary weaknesses within the company: a) Structure of deficient services; b) Very diversified product; c) Small scale production; d) Lack of incentives or appreciation for employees; e) Communication failures between sectors.

Opportunities for the company: a) Currency appreciation; b) Economic growth; c) Increased competition amongst competitors; d) Predatory competition; e) Specialists; f) Turn Key Solutions.

Threats to the company: a) Economic Expansion; b) Service Market; c) Prospective international vendors.

Objectives listed: a) Develop skills for services management; b) Rearrange structure and commercial operations; c) Enhance the development of overseas business; d) Intensify the dissemination of the new product line; e) Make communication effective; f) Develop internal marketing actions; g) Alignment and cooperation amongst areas; h) Migrate management controls to SAP.

Actions: a) Establish a methodology for project management;


b) Hire an HR consulting company; c) Design a program for jobs and wages; d) Hire a commercial manager; e) Consolidate the network of sales representatives in the country; f) Develop commercial representatives for Mexico and the USA; g) Design advertising material; h) Restructure the company's website; i) Communication training for leaders; j) Create a process map; k) Hire a certified SAP consultant.

The SWOT matrix (Strengths and Weaknesses x Opportunities and Threats) follows, in which the "Presence" column and row express how true it is that each variable actually exists in the organization or in the environment. The geometric mean of each row is multiplied by its Presence. Subsequently, all the united matrices are processed by the Matlab function in the SWOT-OA system. The full equations were presented in Espin and Vanti (2005). See Table 3.

Opportunity/threat columns: C1 Currency Appreciation; C2 Economic Bubble; C3 Increased Competition Amongst Competitors; C4 Predatory Competition; C5 Specialists; C6 Turn Key Solutions; C7 Economic Expansion; C8 Service Market; C9 Prospective International Vendors. P4 = Presence (Question 4); the last row is Presence (Question 5).

| Strength/Weakness | C1 | C2 | C3 | C4 | C5 | C6 | C7 | C8 | C9 | P4 |
| Proprietary Technology | 0.7 | 0.6 | 0.7 | 0.9 | 0.8 | 0.8 | 0.9 | 0.9 | 0.3 | 0.8 |
| Local Engineering | 0.6 | 0.5 | 0.6 | 0.8 | 0.7 | 0.7 | 0.8 | 0.8 | 0.7 | 1 |
| New Product Lines | 0.7 | 0.6 | 0.7 | 0.9 | 0.8 | 0.8 | 0.9 | 0.9 | 0.6 | 1 |
| Lean Structure | 0.6 | 0.5 | 0.6 | 0.8 | 0.7 | 0.7 | 0.8 | 0.8 | 0.7 | 0.7 |
| Flexibility | 0.7 | 0.6 | 0.7 | 0.9 | 0.8 | 0.8 | 0.9 | 0.9 | 0.6 | 0.7 |
| Local Service Structure | 0.7 | 0.6 | 0.7 | 0.9 | 0.8 | 0.8 | 0.9 | 0.9 | 0.7 | 0.7 |
| Employee Qualifications | 0.8 | 0.7 | 0.8 | 1 | 0.9 | 0.9 | 1 | 1 | 0.7 | 0.8 |
| Financial Equilibrium | 0.6 | 0.5 | 0.6 | 0.8 | 0.7 | 0.7 | 0.8 | 0.8 | 0.8 | 0.8 |
| Structure of Deficient Services | 0.7 | 0.6 | 0.7 | 0.9 | 0.8 | 0.8 | 0.9 | 0.9 | 0.6 | 0.9 |
| Very Diversified Product | 0.5 | 0.4 | 0.5 | 0.7 | 0.6 | 0.6 | 0.7 | 0.7 | 0.7 | 1 |
| Small Scale Production | 0.5 | 0.4 | 0.5 | 0.7 | 0.6 | 0.6 | 0.7 | 0.7 | 0.5 | 1 |
| Lack of Incentives or Appreciation for Employees | 0.8 | 0.7 | 0.8 | 1 | 0.9 | 0.9 | 1 | 1 | 0.5 | 0.7 |
| Communication Failures Between Sectors | 0.6 | 0.5 | 0.6 | 0.8 | 0.7 | 0.7 | 0.8 | 0.8 | 0.8 | 0.7 |
| Presence (Question 5) | 1 | 0.7 | 0.8 | 0.7 | 1 | 1 | 1 | 0.8 | 0.8 | - |

Table 3: SWOT Matrix. Source: The Authors
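The row aggregation described above - the geometric mean of each row multiplied by its Presence - can be sketched as follows, using the first Table 3 row (Proprietary Technology) as the example.

```python
import math

def geometric_mean(values):
    # Geometric mean; zero if any entry is zero (matching the CFL conjunction)
    if any(v == 0 for v in values):
        return 0.0
    return math.exp(sum(math.log(v) for v in values) / len(values))

# The Proprietary Technology row of Table 3 and its Presence (Question 4) value
row = [0.7, 0.6, 0.7, 0.9, 0.8, 0.8, 0.9, 0.9, 0.3]
presence = 0.8

score = geometric_mean(row) * presence
print(round(score, 3))  # → 0.561
```

Each strength or weakness thus receives a single truth value that already discounts how certain its presence in the company is.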


The subsequent matrix (Matrix 2) compares Strengths and Weaknesses x Strategic Objectives, shown in Table 4.

Strategic objective columns, in order: 1 Develop Skills for Services Management; 2 Rearrange Structure and Commercial Operations; 3 Intensify the Development of Overseas Business; 4 Intensify the Dissemination of the New Product Line; 5 Make Communication Effective; 6 Develop Internal Marketing Actions; 7 Alignment and Cooperation Amongst the Areas; 8 Migrate Management Controls to SAP.

| Strength/Weakness | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 |
| Proprietary Technology | 0 | 0.6 | 0 | 0.5 | 0.2 | 0.7 | 0.2 | 0 |
| Local Engineering | 0.7 | 0.7 | 0.8 | 0.8 | 0.7 | 0 | 0.2 | 0 |
| New Product Line | 0.7 | 0.9 | 0.9 | 0.9 | 0.9 | 0.8 | 0.9 | 0.9 |
| Lean Structure | 0 | 0.9 | 0.8 | 0.3 | 0.9 | 0 | 0.8 | 0.8 |
| Flexibility | 0.8 | 0.8 | 0.8 | 0 | 0.8 | 0 | 0.8 | 0.3 |
| Local Service Structure | 1 | 0.9 | 0.9 | 0.5 | 0.3 | 0 | 0.7 | 0.3 |
| Employee Qualifications | 1 | 0.8 | 1 | 1 | 0.8 | 0.8 | 0.8 | 0.9 |
| Financial Balance | 0.8 | 0.5 | 0.8 | 0.8 | 0 | 0 | 0.7 | 0.6 |
| Structure of Deficient Services | 1 | 0.9 | 0.9 | 0.5 | 0.3 | 0.7 | 0.8 | 0.7 |
| Very Diversified Product | 0.2 | 0.2 | 0 | 0.5 | 0 | 0 | 0.8 | 0.8 |
| Small Scale Production | 0.2 | 0.2 | 0 | 0 | 0 | 0 | 0.9 | 0.8 |
| Lack of Incentives or Appreciation for Employees | 0.8 | 0.8 | 0.8 | 0.7 | 0.3 | 0.8 | 0.7 | 0.8 |
| Communication Failures Between Sectors | 0.2 | 0.2 | 0 | 0.8 | 1 | 1 | 1 | 1 |

Table 4: Quantification: Strategic Objectives x Strengths and Weaknesses. Source: The Authors


The subsequent matrix (Matrix 3) compares Opportunities and Threats x Strategic Objectives, shown in Table 5.

Strategic objective columns, in order: 1 Develop Skills for Services Management; 2 Rearrange Structure and Commercial Operations; 3 Intensify the Development of Overseas Business; 4 Intensify the Dissemination of the New Product Line; 5 Make Communication Effective; 6 Develop Internal Marketing Actions; 7 Alignment and Cooperation Amongst the Areas; 8 Migrate Management Controls to SAP.

| Opportunity/Threat | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 |
| Currency Appreciation | 0 | 0.6 | 1 | 0.8 | 0.1 | 0 | 0 | 0 |
| Economic Bubble | 0 | 0.4 | 0.8 | 0.2 | 0 | 0 | 0 | 0 |
| Increased Competition Amongst Competitors | 0.9 | 0.8 | 0.8 | 0.8 | 0.2 | 0 | 0.1 | 0.6 |
| Predatory Competition | 0.8 | 0.8 | 0.8 | 0.1 | 0 | 0 | 0.1 | 0 |
| Specialists | 1 | 0.8 | 0.9 | 0.1 | 0 | 0 | 0.8 | 0 |
| Turn Key Solutions | 1 | 0.8 | 0.9 | 0.2 | 0 | 0 | 0.8 | 0 |
| Economic Expansion | 1 | 0.9 | 0.9 | 0.9 | 0.6 | 0 | 0.3 | 0 |
| Services Market | 1 | 1 | 0.8 | 0.2 | 0.8 | 0 | 0.9 | 0.8 |
| Prospective International Vendors | 0.2 | 0 | 1 | 0.2 | 0.2 | 0.3 | 0.8 | 0.8 |

Table 5: Quantification: Strategic Objectives x Opportunities and Threats. Source: The Authors


The subsequent matrix compares Strategic Objectives x Strategic Objectives, shown in Table 6.

Strategic objective columns, in order: 1 Develop Skills for Services Management; 2 Rearrange Structure and Commercial Operations; 3 Intensify the Development of Overseas Business; 4 Intensify the Dissemination of the New Product Line; 5 Make Communication Effective; 6 Develop Internal Marketing Actions; 7 Alignment and Cooperation Amongst the Areas; 8 Migrate Management Controls to SAP.

| Strategic Objective | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 |
| Develop Skills for Services Management | 1 | 0.8 | 0.9 | 0.6 | 0.8 | 0.5 | 0.8 | 0.9 |
| Rearrange Structure and Commercial Operations | 0.8 | 1 | 0.8 | 0.3 | 0.3 | 0.7 | 0.8 | 0.8 |
| Intensify the Development of Overseas Business | 0.9 | 0.8 | 1 | 0.8 | 0.8 | 0.8 | 0.9 | 0.6 |
| Intensify the Dissemination of the New Product Line | 0.7 | 0.8 | 0.8 | 1 | 0.8 | 0.9 | 0.9 | 0.2 |
| Make Communication Effective | 0.8 | 0.8 | 0.7 | 0.7 | 1 | 0.9 | 0.8 | 0.9 |
| Develop Internal Marketing Actions | 0.9 | 0.7 | 0.7 | 0.8 | 0.9 | 1 | 0.9 | 0.8 |
| Alignment and Cooperation Amongst the Areas | 0.9 | 0.9 | 0.9 | 0.5 | 1 | 0.8 | 1 | 0.9 |
| Migrate Management Controls to SAP | 0.8 | 0.6 | 0.7 | 0.2 | 0.8 | 0.8 | 0.8 | 1 |

Table 6: Quantification: Strategic Objectives x Strategic Objectives. Source: The Authors
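The text describes iterating the objective x objective matrix to a "limit multiplication". One plausible reading, sketched here with the Table 6 values, is a fuzzy transitive closure via max-min composition iterated to a fixed point; this is only an interpretation, and the actual SWOT-OA system (implemented in Delphi with a Matlab function) may use the CFL operators instead.

```python
# Table 6 (objective x objective) truth values, columns in the order 1-8
R = [
    [1.0, 0.8, 0.9, 0.6, 0.8, 0.5, 0.8, 0.9],
    [0.8, 1.0, 0.8, 0.3, 0.3, 0.7, 0.8, 0.8],
    [0.9, 0.8, 1.0, 0.8, 0.8, 0.8, 0.9, 0.6],
    [0.7, 0.8, 0.8, 1.0, 0.8, 0.9, 0.9, 0.2],
    [0.8, 0.8, 0.7, 0.7, 1.0, 0.9, 0.8, 0.9],
    [0.9, 0.7, 0.7, 0.8, 0.9, 1.0, 0.9, 0.8],
    [0.9, 0.9, 0.9, 0.5, 1.0, 0.8, 1.0, 0.9],
    [0.8, 0.6, 0.7, 0.2, 0.8, 0.8, 0.8, 1.0],
]

def compose(A, B):
    # Max-min fuzzy matrix "multiplication"
    n = len(A)
    return [[max(min(A[i][k], B[k][j]) for k in range(n)) for j in range(n)]
            for i in range(n)]

limit = R
while True:
    step = compose(limit, limit)
    if step == limit:  # fixed point: further multiplication no longer changes it
        break
    limit = step

# Row sums of the limit matrix as a rough importance signal per objective
print([round(sum(row), 2) for row in limit])
```

Because the diagonal is 1, each composition step is monotone non-decreasing and the iteration reaches its fixed point in finitely many steps.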


The subsequent matrix compares Actions x Strategic Objectives, shown in Table 7. Each of the matrices was filled in by the IT Manager and subsequently entered into the computer system that calculates importance, which defines the ranking of strategic priorities. The result of this processing is presented in Table 8.

Strategic objective columns, in order: 1 Develop Skills for Services Management; 2 Rearrange Structure and Commercial Operations; 3 Intensify the Development of Overseas Business; 4 Intensify the Dissemination of the New Product Line; 5 Make Communication Effective; 6 Develop Internal Marketing Actions; 7 Alignment and Cooperation Amongst the Areas; 8 Migrate Management Controls to SAP.

| Action | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 |
| Establish a methodology for project management | 0.9 | 0.7 | 0.6 | 0.6 | 0.6 | 0.8 | 0.7 | 0.8 |
| Hire an HR consulting company | 0.9 | 0.9 | 0.8 | 0.7 | 0.8 | 0.8 | 0.8 | 0.7 |
| Design a program for jobs and wages | 0.7 | 0.8 | 0.6 | 0 | 0.6 | 0.7 | 0.3 | 0 |
| Hire a commercial manager | 0.7 | 1 | 0.9 | 0.7 | 0.5 | 0.7 | 0.3 | 0.7 |
| Consolidate the network of sales representatives in the country | 0.5 | 1 | 0.5 | 0.8 | 0 | 0.6 | 0.3 | 0 |
| Develop commercial representatives for Mexico and the USA | 0.5 | 0.9 | 1 | 0.8 | 0 | 0.8 | 0 | 0 |
| Design advertising material | 0 | 0.4 | 1 | 1 | 0.5 | 0.5 | 0 | 0 |
| Restructure the company's website | 0.5 | 0.7 | 0.8 | 0.8 | 0.8 | 0.7 | 0.6 | 0.8 |
| Communication training for leaders | 0.3 | 0.8 | 0.8 | 0.5 | 1 | 0.9 | 0.9 | 0.8 |
| Create a process map | 0.5 | 0.8 | 0.3 | 0 | 0.8 | 0.5 | 0.9 | 0.9 |
| Hire a certified SAP consultant | 0.3 | 0.8 | 0.3 | 0 | 0.6 | 0 | 0.8 | 1 |

Table 7: Quantification: Strategic Objectives x Actions. Source: The Authors

Table 8 presents the prioritization based on Compensatory Fuzzy Logic (CFL). The results highlighted in red are those that should receive more attention from security management in a more strategic way, that is, involving not only internal variables of the organization but also external variables such as Opportunities and Threats. The strategic priorities indicate a significant search for new business in services and international markets, which implies hiring manpower to meet this demand.
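As a rough illustration of how a CFL engine can turn truth-value matrices like Tables 5-7 into an importance ranking like Table 8, consider the sketch below. The geometric-mean conjunction and its dual are standard CFL operators (Espín and Vanti, 2005); the row-wise aggregation rule is only an assumption for illustration, since the paper does not spell out the exact formula its system applies.

```python
# Sketch: the two basic Compensatory Fuzzy Logic (CFL) operators and a
# simple importance score for each strategic objective, computed from a
# truth-value matrix like Table 6. Aggregating each row by the CFL
# conjunction of its off-diagonal entries is an illustrative assumption,
# not the authors' published formula.
from math import prod

def cfl_and(values):
    """CFL conjunction: geometric mean of the truth values."""
    return prod(values) ** (1.0 / len(values))

def cfl_or(values):
    """CFL disjunction: dual of the conjunction."""
    return 1.0 - prod(1.0 - v for v in values) ** (1.0 / len(values))

def importance(matrix):
    """Score each row by the conjunction of its off-diagonal entries."""
    return [
        cfl_and([v for j, v in enumerate(row) if j != i])
        for i, row in enumerate(matrix)
    ]
```

Because the geometric mean is compensatory, one low truth value lowers a score without zeroing it out, which is what lets the method rank rather than merely filter the objectives.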



Organizational Characteristics

Strengths:
st7 Employee Qualifications 1
st6 Local Service Structure 0.84096875
st3 New Line of Products 0.83740745
st5 Flexibility 0.83740745
st1 Proprietary Technology 0.82798177
st8 Financial Balance 0.79322776
st2 Local Engineering 0.78434395
st4 Lean Structure 0.78434395

Weaknesses:
we4 Lack of Incentives or Appreciation for Employees 1
we1 Structure of Deficient Services 0.83206217
we5 Communication Failures Between Sectors 0.78978055
we3 Small Scale Production 0.73074916
we2 Very Diversified Product 0.72858576

Environmental Characteristics

Opportunities:
op2 Local Engineering 0.84948461
op3 Services Market 0.84948461
op1 Economic Expansion 0.73077657

Threats:
th3 Increased Competition Amongst Competitors 0.84948461
th1 Currency Valuation 0.79417651
th2 Economic Bubble 0.79417651
th4 Predatory Competition 0.73738553
th6 Turn Key Solutions 0.73738553
th5 Specialists 0.68366691

Strategic Objectives:
ob1 Develop Skills for Services Management 1
ob3 Intensify the Development of Overseas Business 1
ob4 Intensify the Dissemination of the New Product Line 1
ob7 Alignment and Cooperation Between Areas 0.77453586
ob6 Develop Internal Marketing Actions 0.75263601
ob5 Make Communication Effective 0.74630662
ob2 Rearrange Structure and Commercial Operations 0.73503024
ob8 Migrate Management Controls to SAP 0.72180293

Actions:
ac6 Develop commercial representatives for the U.S. and Mexico 1
ac7 Design advertising material 1
ac2 Hire an HR consultancy 0.83621855
ac9 Communication training for leaders 0.79504613
ac4 Hire a commercial manager 0.79498275
ac1 Implement a project management methodology 0.79233469
ac8 Restructure the company website 0.78601955
ac10 Conduct process mapping 0.68926213
ac5 Consolidate the network of sales representatives in the country 0.65111586
ac3 Design a program of jobs and wages 0.623552
ac11 Hire a SAP certified consultant 0.61280733

Table 8: Prioritization based on Compensatory Fuzzy Logic (CFL). Source: The Authors

5.6 Application of the Information Security Model

The instrument that generated the integration of the BSC (strategic) x COBIT (tactical) x ISO/IEC 27002 (operational) model allowed us to observe that, in the case studied, information security at the "Minimal to Reasonable" technical level is treated through isolated actions based on technical knowledge. This confirms what the theoretical references point out as a constraint on models for IT management in organizations: there is no executive view of the aspects surrounding information security. Posthumus and Solms (2004) confirm that information security should be incorporated into corporate governance and handled at the highest management levels. For information security to be effective, it has to be incorporated into the culture of the organization through techniques such as the establishment of policies, training, awareness, and the application of disciplinary practices (Solms and Solms, 2004). In the application of the norm instrument, domain 8 (Human Resources) concentrated 71% of its protection between "inadequate" and "minimal", which demonstrates a potential risk: a knowledgeable but dissatisfied employee can harm the company through theft, fraud and improper use of information-related resources. The same argument can be extended to domain 9 (physical and environmental security), with 77% of inadequate protection, where damage to and interference with the facilities and information of the organization may cause loss, damage, theft or compromise of assets and disruption of the organization's activities.
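The per-domain percentages discussed above follow directly from counting how many of a domain's controls fall at each protection level. The sketch below shows that aggregation; the control ratings in the example are invented sample data for one domain, not the study's raw assessments.

```python
# Sketch: deriving a per-domain percentage profile like Table 9 from
# individual ISO/IEC 27002 control ratings. The ratings below are
# hypothetical sample data, not the study's assessments.
from collections import Counter

LEVELS = {1: "Inadequate", 2: "Minimal", 3: "Reactive", 4: "Adequate"}

def domain_profile(ratings):
    """Percentage of a domain's controls at each protection level."""
    counts = Counter(ratings)
    total = len(ratings)
    return {name: 100.0 * counts[level] / total
            for level, name in LEVELS.items()}

# A hypothetical Human Resources Security domain where most controls sit
# at "Minimal" protection, the kind of concentration the text flags:
hr_ratings = [2, 2, 2, 1, 2, 3, 2, 4]
profile = domain_profile(hr_ratings)
at_risk = profile["Inadequate"] + profile["Minimal"]  # 75.0 for this sample
```

Summing the "Inadequate" and "Minimal" shares, as done for `at_risk` here, reproduces the kind of 71% figure the text reports for the HR domain.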




Table 9 presents the results of the protection levels from the qualitative fuzzy assessment of information security (norm control). Protection levels: 1 - Inadequate, 2 - Minimal, 3 - Reactive, 4 - Adequate.

Areas of the standard                                                          1        2        3        4
5 - (PL) Information Security Policy                                        0.00%   50.00%   50.00%    0.00%
6 - (OI) Organizing Information Security                                   15.79%   47.37%   36.84%    0.00%
7 - (GA) Asset Management                                                   0.00%   62.50%   25.00%   12.50%
8 - (RH) Human Resources Security                                          14.29%   57.14%   14.29%   14.29%
9 - (SA) Physical and Environmental Security                                4.17%   70.83%   16.67%    8.33%
10 - (GO) Operations and Communications Management                         16.07%   32.14%   42.86%    8.93%
11 - (CA) Access Control                                                   18.60%   27.91%   39.53%   13.95%
12 - (AQ) Acquisition, Development and Maintenance of Information Systems  15.63%   56.25%   28.13%    0.00%
13 - (GI) Information Security Incident Management                         63.64%   18.18%   18.18%    0.00%
14 - (GC) Business Continuity Management                                    0.00%   50.00%   50.00%    0.00%
15 - (CF) Compliance                                                       31.58%   26.32%   42.11%    0.00%

Table 9: Results of protection with qualitative fuzzy results (norm control). Source: The Authors

6. CONCLUSION

IT Governance enables the expansion of compliance and Corporate Governance through the use of different models, such as the auditing of IT processes and information security. This provides more robust systems and internal controls; however, limitations have been identified in achieving strategic alignment between IT and Business. These limitations may be related to aspects of organizational behaviour and strategic prioritization, covered in this work, and they create vulnerabilities in information, reducing information security, especially with respect to business requirements. This paper develops the strategic alignment of organizational behavior through the organization's image, prioritization and information security practices. To this end, information security is studied based on the requirements of confidentiality, integrity and availability by applying a tool that integrates the strategic, tactical and operational visions through the following framework: Balanced Scorecard - BSC (strategic) x Control Objectives for Information and Related Technology - COBIT (tactical) x ISO/IEC 27002, a standard of the International Organization for Standardization and the International Electrotechnical Commission (operational). The mapping of this alignment is presented in Knorst (2010). Another instrument, of the image of the organization, is applied in parallel with this analysis to identify and analyze performance involving profiles related to mechanistic organizations, psychic prisons, political systems, instruments of domination, organisms, cybernetics, and flux and transformation (Morgan, 1996; Johann, 2008). Finally, a model of strategic prioritization based on Compensatory Fuzzy Logic (CFL) is applied (Espín and Vanti, 2005). The method was applied to an industrial company located in southern Brazil.

Many of the problems related to information security are related to behavior that is not suited to the culture of the organization. This behavior can be characterized as a counterculture that brings together small groups in companies that do not accept the dominant culture. The consequence of maintaining these small groups may be directly related to the denial of information, boycotts of information relating to business innovation, and the delivery of information to third parties through the release of passwords or even by sending it out of the organization via scanned files. A common business response to this type of problem is to implement a greater number of IT solutions. This work, however, presented a solution that aligns the technical answers (IT models/IT governance) with models of behavior and business culture through the study of their images. This indicates that, depending on the results evaluated by the applied instruments, one can find good solutions and invest resources correctly in information security, with a major investment in staff training, awareness, and adherence to the dominant culture of the company. In the case studied, security requirements were aligned with the images of "organism" and "flux and transformation".
Thus, by definition, these two images have a significant functional character and a dynamism that follows the movement of the market, meaning that they can significantly facilitate consistent work between the technical and behavioral areas. For information security, it was concluded that the technical perspective ranges from "Minimal" to "Reasonable", a dimension based significantly on isolated actions grounded in technical knowledge. Thus, it was possible to analyze Confidentiality, Integrity and Availability by applying an instrument that integrates the strategic, tactical and operational visions through the BSC (strategic) x COBIT (tactical) x ISO/IEC 27002 (operational) models. The organizational profile revealed the images of "organisms" and "flux and transformation". The area of Human Resources Security (RH) was assessed as minimal, and actions are suggested based on security policies and training so that employees, suppliers and third parties understand their responsibilities and are aware of the threats and concerns surrounding information security. The controllability of the company studied, diagnosed by the technology instrument, also allows a certain flexibility for the image of "flux and transformation" aligned to "organisms". Thus, the company adapts to the external environment and has a high level of control of strategic information, while innovating with adaptations to environmental changes, which are typical of the industry it serves. Finally, the application was presented in a real case and, in order to preserve the integrity of the company's private information, some results were analyzed in a more synthetic way.




With the CFL model, it was also possible to find the information security variables that were related to the SWOT analysis and, consequently, to the external environment. This means that the company has a strategic positioning on information security that considers not only the company's internal environment (strengths and weaknesses), but also the external environment (opportunities and threats), which contains the variables of "competitors," "government regulations" and "technological trends", the ones presenting the greatest vulnerability relations.

REFERENCES

Adizes, I. Corporate Lifecycles: How and Why Corporations Grow and Die and What to Do About It. Paramus, NJ, 1989. ISBN 0-13-174400-3.

Dias, C. Segurança e Auditoria da Tecnologia da Informação. Rio de Janeiro: Axcel Books, 2000.

Eloff, J.; Eloff, M. Information Security Management: A New Paradigm. Proceedings of the 2003 Annual Research Conference of the South African Institute of Computer Scientists and Information Technologists on Enablement through Technology, Republic of South Africa, pp. 130-136, 2003.

Espín, R.; Vanti, A. A. Administración Lógica: Un estudio de caso en empresa de comercio exterior. Revista BASE, São Leopoldo, RS, Brasil, 1(3), Aug. 2005, pp. 4-22.

Fitzgerald, T. Clarifying the Roles of Information Security: 13 Questions the CEO, CIO, and CISO Must Ask Each Other. Information Systems Security, Volume 16, Issue 5, pp. 257-263, 2007.

Freitas, M. E. de. Cultura organizacional: formação, tipologias e impactos. São Paulo: Makron, McGraw-Hill, 1991.

Van Grembergen, W.; De Haes, S. Measuring and Improving Information Technology Governance through the Balanced Scorecard. Information Systems Control Journal, Vol. 2, 2005, ITGI.

ITGI: Information Technology Governance Institute. COBIT 4.1: Framework, Control Objectives, Management Guidelines, Maturity Models. Available at: http://www.isaca.org/, accessed Feb. 24, 2009.

ISO/IEC 27002:2005. Information technology - Code of practice for information security management. Switzerland: International Organization for Standardization (ISO), 2005.

Johann, S. L. Gestão da cultura corporativa: como as organizações de alto desempenho gerenciam sua cultura organizacional. 2nd edition. Porto Alegre: Saraiva, 2008.

______. Gestão da cultura organizacional. Working paper - SIGA (Sistema de Informação e Gestão). Rio de Janeiro: Acadêmica, Fundação Getúlio Vargas (FGV), 2008.

Kaplan, R. S.; Norton, D. P. The Balanced Scorecard: Translating Strategy into Action. Boston, MA: Harvard Business School Press, 1996.




Knorst, A. M. Strategic alignment between business goals and information security in the IT governance context: a study in the automation industry. Master's thesis. Unisinos, São Leopoldo, 2010.

Kwok, L.; Longley, D. Information security management and modelling. Information Management & Computer Security, Volume 7, Issue 1, pp. 30-40, 1999.

Minarelli, J. A. Empregabilidade - o caminho das pedras. São Paulo: Ed. Gente, 1995.

Moreira, N. S. Segurança Mínima - Uma Visão Corporativa da Segurança de Informações. Rio de Janeiro: Axcel Books, 2001.

Morgan, G. Images of Organization. London: SAGE Publications, 1996.

Posthumus, S.; Solms, R. A framework for the governance of information security. Computers & Security, Amsterdam, no. 23, pp. 638-646, 2004.

Schein, E. H. Organizational Culture. American Psychologist, San Francisco, v. 45, n. 2, Feb. 1990, pp. 109-119.

Sêmola, M. Gestão da Segurança da Informação: Visão executiva da Segurança da Informação. Rio de Janeiro: Campus, 2003.

Solms, R.; Solms, B. From policies to culture. Computers & Security, Volume 23, Issue 4, 2004, pp. 275-279.

Vanti, A. A.; Espín, R.; Goyer, D.; Schripsema, A. The importance of objectives and strategic lagging and leading indicators definition in the chain import and export process in the light of strategic planning through the use of a Fuzzy Logic System. ACM SIGMIS Computer Personnel Research Conference (CPR), April, Claremont, 2006.

Vanti, A. A.; Espín, R. Metodologia Multivalente para Priorização Estratégica em Construção de Balanced Scorecard (BSC). Revista CCEI, vol. 11, no. 20, Aug. 2007, pp. 54-67.

Weill, P.; Ross, J. Governança de Tecnologia da Informação. São Paulo: M. Books do Brasil Editora Ltda.

Wright, P.; Kroll, M.; Parnell, J. Strategic Management: Concepts. New Jersey: Prentice Hall, 1998.



JISTEM - Journal of Information Systems and Technology Management
Revista de Gestão da Tecnologia e Sistemas de Informação
Vol. 8, No. 3, Sept/Dec. 2011, pp. 581-604
ISSN online: 1807-1775
DOI: 10.4301/S1807-17752011000300004

UNDERSTANDING THE SUBJECT'S BEHAVIOR IN THE INTERACTION WITH A DECISION SUPPORT SYSTEM UNDER TIME PRESSURE AND MISSING INFORMATION

Kathiane Benedetti Corso
Mauri Leodir Löbler
Federal University of Rio Grande do Sul, Brazil

__________________________________________________________________________________

ABSTRACT

The article seeks to determine how time pressure and missing information in decision making affect the behavior of decision makers. Data was collected through an experimental task simulating the purchase of a car, structured with the AHP (Analytic Hierarchy Process) multi-criteria method in a Decision Support System. When pressured by time, the experimental subjects focused on the car of their choice, whereas with no time pressure some of them rationalized more, used the information, and did not agree with the chosen car. Assumptions of Image Theory justified some findings, indicating that images previously structured in the mind of the decision maker are a way to cope with time pressure. Given the missing information, the use of background knowledge and individual experience was the most prominent coping strategy.

Keywords: Behavior of decision makers; Time pressure; Missing information; Decision Support Systems; AHP multi-criteria method.
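For readers unfamiliar with AHP, the sketch below shows how pairwise comparisons of decision criteria yield a priority vector. The criteria and comparison judgments are hypothetical (the paper does not publish its comparison matrices), and the priority vector uses the common geometric-mean approximation rather than the exact principal-eigenvector computation.

```python
# Sketch: structuring a purchase decision with AHP. The criteria and the
# pairwise-comparison judgments (Saaty's 1-9 scale) are hypothetical,
# and normalized row geometric means approximate the priority vector.
from math import prod

def ahp_priorities(pairwise):
    """Approximate AHP priority vector: normalized row geometric means."""
    n = len(pairwise)
    geo_means = [prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(geo_means)
    return [g / total for g in geo_means]

# Hypothetical criteria for a car purchase: price, safety, fuel economy.
pairwise = [
    [1.0, 3.0, 5.0],   # price    vs. price, safety, fuel economy
    [1/3, 1.0, 3.0],   # safety
    [1/5, 1/3, 1.0],   # fuel economy
]
weights = ahp_priorities(pairwise)  # price receives the largest weight
```

Each alternative car would then be scored against each criterion in the same way, and the criterion weights combine those scores into an overall ranking, which is how such a task can be embedded in a Decision Support System.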

_____________________________________________________________________________________ Manuscript first received/Recebido em 09/02/2010

Manuscript accepted/Aprovado em: 22/07/2010

Address for correspondence / Endereço para correspondência Kathiane Benedetti Corso, Assistant Professor at UNIPAMPA – Campus Santana do Livramento/RS, Doctorate student at PPGA/EA/UFRGS - Post graduation Program in Administration in The School of Administration at the University of Rio Grande do Sul, RS, Brazil – E-mail: kathianecorso@unipampa.edu.br Rua Barão do Triunfo, 1048 - Santana do Livramento/RS - CEP: 97573590. Universidade Federal do Pampa - Telephone: (55) 3243-4540 Mauri Leodir Löbler, Assistant Professor at the Department of Administrative Sciences - Post Graduation Program in Administration – PPGA/UFSM, PhD in Administration from the Administration Post Graduation Program at the School of Administration at the University of Rio Grande do Sul - E-mail: lobler@ccsh.ufsm.br Rua Marechal Floriano Peixoto, 1184 –2. andar - Santa Maria/RS - CEP 97015-372. Universidade Federal de Santa Maria – Telefone: (55) 3220-9296 Published by/ Publicado por: TECSI FEA USP – 2011 All rights reserved.


582 Corso, K. B., Löbler, M. L.

1. INTRODUCTION

Whether operational or strategic, decisions are part of any human task. When specifically examining the process of decision making in organizations, it is possible to note that it has been changing quickly in recent years, particularly with the speed of advancement of information and communication technology. The changeability and dynamics of the environment in which companies act result in a new business environment: greater preparation for the establishment of strategies and decision making is required from the manager and the chief executive (Berto, 2004). Therefore, when studying the decision-making process one should not fail to analyze the influences experienced by the decision maker during this process, since there are several behavioral factors that influence those expected to make the decision.

The study of decision making in companies has been the subject of theoretical and management research, and much has been discovered and analyzed on the subject. Simon (1965) reminds us that in order to understand management, it is important to perceive how people actually solve problems and make decisions. Accordingly, Löbler & Hoppen (2005) add that research on the decision maker's behavior, human judgment, and choices currently aims at understanding how the human mind works in different situations and with different information, in addition to observing the results.

According to Fisher, Chengalur-Smith & Ballou (2003), it has been acknowledged that the effectiveness of decision making is influenced by several factors, including the time available before a decision is made. Many decisions in business, especially in economics and finance, have to be made under severe time pressure (Kocher & Sutter, 2006). According to Ahituv, Igbaria & Sella (1998), several studies have reported the negative effect of time pressure on the effectiveness of decision making, and the pattern of results is fairly consistent. Studies suggest that time pressure results in reduced choice and information processing, as well as a decrease in the number of alternatives considered.

Availability of information is also influential in the decision process. Many day-to-day decisions can involve situations where there is missing information. According to Körner et al. (2007), the topic of missing information has received great attention in the decision research literature. Concerning missing information in decision making, Jagacinski (1991) notes that a common problem people face daily when making decisions is that important pieces of information are missing.

Given the scenario presented so far, the research question of this study is: "How do time pressure and missing information in decision making affect the behavior of the decision maker?" Thus, this paper compares two influential factors in decision making - time pressure and missing information - to examine how they can affect the behavior of the decision maker. The development of the research, based on a theoretical framework, is followed by data collection from individuals characterized as decision makers for the task that guided the study: the simulation of a car purchase, with subjects submitted or not to time pressure and missing information, covered in the method section. After that, the results obtained with the experiment are presented, and these results are further




discussed and confronted with the literature that supports this theme. It is noteworthy that, to date, no studies have addressed these concepts applied to decision making in the Brazilian reality; this study therefore depicts a challenging idea that is relevant to study and potentially significant for future theoretical and practical contributions.

2. THE DECISION-MAKING PROCESS AND ITS INFLUENCES

Organizational tasks are essentially decision-making and problem-solving activities (Simon, 1979); they occur all the time, at all levels, and they directly influence organizational performance. Decision making involves a process: a sequence of steps or stages that succeed one another, called the decision-making process. Making a decision is an answer to problems, where problems include a set of choices among alternatives (Kingma, 1996 cited in Fisher, Chengalur-Smith & Ballou, 2003).

Simon (1978) stated that decision makers were limited because they were not aware of the total number of consequences of their decisions, and were also limited by social and personal pressures. Decision making is not always optimal, but it is satisfactory in a certain situation or at a certain moment. Simon's research aimed at simplifying and understanding complex decision-making situations in order to meet the needs of a large number of researchers who had been struggling with the same problem, both in the economic field and in other areas. Simon's great contribution was to bring about the study of the decision-making process in organizations, since the rudimentary theory of the firm, found in traditional economics, was only good as a basis for studies of market behavior as a whole, and not of the behavior of the firm individually. Thus, the study of the decision-making process in the firm, conducted by Simon, brought a major breakthrough for Economic Science.

Tversky & Kahneman (1974), along similar lines, showed that people are unable to analyze complex situations correctly when future consequences are uncertain. In these circumstances, according to the authors, individuals seek shortcuts called heuristics. Thus, Tversky & Kahneman (1974) demonstrated that in situations of uncertainty, human judgment is based on subjective rules, which systematically contradict fundamental propositions of probability. Continuing in the same line of research, Kahneman and Tversky (1979) showed that individuals reason differently depending on how the problem is presented. Prospect Theory then emerged, in which risk aversion only occurs in the field of gains; when the possibility of loss is presented to individuals, they tend to be prone to risk. This finding was contrary to Expected Utility Theory. According to Markman and Medin (2001), there is now a large number of demonstrations of the limitations of the human decision-making process, and the publication of Prospect Theory by Kahneman and Tversky (1979) marks a turning point in the growing interest in and influence of psychological models on the decision-making process.

Bazerman (2004) also shows that heuristics and biases influence the decision-making process; he worked on a reinterpretation of the findings of Tversky & Kahneman and uses applied cases to illustrate the influence of the above variables. According to the author, the rational model cannot explain how the decision-making process actually happens, since it is based on a set of assumptions that determine how a decision should be made, rather than how a decision is actually made. Although the decision-making process has problem solving as its scope, individual subjectivity in decisions is considerable. Accordingly, several other authors (Simon, 1965; Payne, Bettman & Johnson, 1993; Svenson, 1996; Pereira & Fonseca, 1997) address aspects taken by some as factors, and by others as variables or constraints, which influence the decision maker at the moment of decision making, or even throughout the decision-making process. Such constraints may be individual factors that limit the quality and quantity of the process, such as limits on reflexes and habits, values, and level of knowledge (Simon, 1965), or stress, time pressure, involvement with the task, and mood (Svenson, 1996). The present study addresses time pressure and missing information as factors that influence the decision-making process.

2.1 Influence of Time Pressure in Decision Making

The time factor is a variable that changes in the new dynamics of companies; it is increasingly necessary to make decisions in a short time, i.e., decision makers should grasp this complex picture quickly, assessing the results of their decisions to the greatest possible extent. Pereira & Fonseca (1997) state that when people are rushed, they tend to act impulsively, and the idea that there is not enough time to reach their goals or needs gives rise to general chaos:

Hurrying and speeding produce confusion, and even longer time becomes necessary to solve the problem. The self-imposed illusion that there is no time produces an enormous pressure that leads to panic and its consequences.
(Pereira & Fonseca, 1997, p. 204).

Weber, Smith & Ram (1987 cited in Smith & Hayne, 1997) ensure that time pressure is commonly implemented through the use of time limits imposed on completing a task. Thus, time pressure is experienced whenever feasible time for completion of the task is perceived to be shorter than normally required by the activity (Svenson & Edland, 1987 cited in Fisher, Chengalur-Smith & Ballou, 2003). This effect of time pressure may result in the "closing of the mind" (Kruglanski & Freund, 1983), meaning that people seek cognitive closure and stop considering important aspects of multiple alternatives, engaging superficially rather than processing information in a complete and systematic manner. Table 1 summarizes the studies on time pressure in a decision-making process, showing the main research contributions on this topic, as well as the theme and context in which they were conducted:




Payne, Bettman & Luce (1996). Theme: decision making in opportunity-cost (risk) environments. Main contribution: individuals speed up their processing of information, become more selective, and change strategies from a deeper pattern of processing (based on alternatives) to a wider one (based on attributes).

Smith & Hayne (1997). Theme: business decision with a Group Decision Support System. Main contribution: time pressure changes the difficulty of the task, influences the decision process, and reduces the time available to make a decision.

Ordóñez & Benson (1997). Theme: risk decision making. Main contribution: subjects change their decision strategies in response to time pressure.

Ahituv, Igbaria & Sella (1998). Theme: decision making in the Air Force. Main contribution: time pressure generally impairs the performance of the decision maker.

De Dreu (2003). Theme: negotiation tasks. Main contribution: time pressure produces the "closing of the mind", resulting in unfounded perceptions and poor motivation to encode new and relevant information about the preferences and priorities of the opponent.

Kocher & Sutter (2006). Theme: decision making in economics. Main contribution: time pressure leads to worse decisions.

Rieskamp & Hoffrage (2007). Theme: probabilistic inference task with business profit. Main contribution: under high time pressure, compared to low time pressure, individuals accelerate the search for information, use less information, and stay focused on the most important features.

Table 1 - Summary of studies on Time Pressure in decision making in different contexts. Source: created by the authors

Table 1 shows that the majority of studies on this subject contribute findings on the negative effects of time pressure on the decision-making process. Under time pressure, individuals tend to change their decision strategies (Payne, Bettman & Luce, 1996; Ordóñez & Benson, 1997). Moreover, time pressure impairs the performance of the decision maker (Ahituv, Igbaria & Sella, 1998), closes the mind, resulting in unfounded perceptions (De Dreu, 2003), and leads to worse decisions (Kocher & Sutter, 2006).

2.2 Influence of Missing Information in Decision Making

Information is a basic and indispensable resource for decision making. However, in most cases, the information available for such judgments is limited or incomplete (Sanbonmatsu et al., 1997). This happens, for example, when a subject is asked to evaluate alternatives on a set of dimensions but is not given complete information about the values of each alternative across those dimensions (Payne, Bettman & Johnson, 1993). Birgelen, Ruyter & Wetzels (2000) state that missing information is unlikely to be ignored or denied by decision makers, and several studies have shown that decision makers tend to infer the missing information by assumption, extrapolation, or prediction from the other information available. Table 2 summarizes the studies on Missing Information in decision making, with the main research contributions on this topic, as well as the theme and context in which they were conducted:

AUTHORS | THEME | MAIN CONTRIBUTIONS
Ahituv, Igbaria & Sella (1998) | Decision Making in the Air Force | Full information improves performance.
Ebenbach & Moore (2000) | Judgments of environmental projects | When there is missing information, subjects infer the missing information from the available information.
Kivetz & Simonson (2000) | Decisions to purchase a laptop or yogurt, and choice of a leisure club | Missing information affects the preference for the options being considered, as well as tastes and preferences in subsequent choices.
Körner et al. (2007) | Choice of a qualified student to receive a scholarship | With missing information, decision makers give more weight to common dimensions (with values available for all alternatives) and examine them before unique dimensions (with a missing value).

Table 2 - Summary of studies on Missing Information in decision making in different contexts. Source: created by the authors

Table 2 shows that Ebenbach & Moore (2000) and Körner et al. (2007) contribute findings on how subjects react when faced with a task with missing information: individuals infer this information from the information already available, giving more weight to common dimensions (with information available) than to unique dimensions (with missing information). Kivetz & Simonson (2000) contribute the finding that missing information affects the preference of choice, as well as tastes in individuals' later choices. Ahituv, Igbaria & Sella (1998) observed that full information improves performance, implying that missing information compromises the individual's performance on the task.

JISTEM, Brazil Vol.8, No. 3, Sept/Dec. 2011, pp. 581-604

www.jistem.fea.usp.br


Understanding the subject's behavior in the interaction with a decision support system under time pressure and missing information


3. METHOD OF RESEARCH

This study was supported by experimental research, which, according to Fachin (2002), is the research method in which variables are manipulated in a predetermined manner and their effects are sufficiently controlled and known by the researcher, in order to establish causal relationships. Experimental control is achieved by treating the people of all groups in the experiment identically, the only difference among the groups being the manipulated variable (Cozby, 2006). For the purposes of this study, two variables were manipulated - time pressure and missing information - the remainder being homogenized across the experimental groups.

The experiment was structured as a 2x2 matrix (with missing information versus no missing information X with time pressure versus no time pressure) to obtain the different experimental groups. When combined, the missing information and time pressure variables presented to the decision maker are structured as follows: (1) task with missing information and with time pressure; (2) task with missing information and with no time pressure; (3) task with no missing information and with time pressure; and (4) task with no missing information and with no time pressure.

The experimental task consisted of choosing a car: a decision task whose alternatives are popular car models and whose attributes, or criteria, are their characteristics. By comparison, the user was asked to choose and rank the alternatives and attributes according to their preference, and also to state the relevance of each choice. The methodology used for the decision-making task was the AHP (Analytic Hierarchy Process) multi-criteria method, which structures the decision in hierarchical levels and, by synthesizing the judgments of the decision makers, determines a global measure for each alternative, sorting and prioritizing them at the end of the method (Saaty, 1991).
For the task, we created a Decision Support System (DSS) in order to determine how decision makers react to time pressure and missing information. DSS are currently used as a tool in academic research that seeks to study the behavior of decision makers. Payne, Bettman & Johnson (1993) state that one of the key points of systems that support decisions is the identification of the strategy employed by each individual, that is, what information the individual seeks, the sequence of information acquisition, the amount of information acquired, and how long each piece of information was examined.

At the end of the decision-making task, the mapping of the choices made during the task was analyzed, and a questionnaire was administered after the experiment to the four experimental groups to examine the behavior of the decision makers. For a better understanding of the research method steps, Figure 1 shows a schematic of the route taken, which starts with the construction of the DSS AHP MAKH-ER, goes through its validation (mathematical, by users, and of the mapping), then the experiment itself, and ends with the mapping and the post-experiment questionnaire, for the subsequent analysis of the behavior of the decision maker.


Figure 1 - Schematic of the research method steps. Source: created by the authors

Next, the operation and interface of the AHP MAKH-ER system are presented, followed by the application of the independent variables - Time Pressure and Missing Information - and then the research subjects and the experimental design.

3.1 Operation of the AHP MAKH-ER Decision Support System

The software AHP MAKH-ER, built for this research and validated by Corso and Löbler (2010), allows any kind of decision-making task to be created. The underlying mathematical model was based on the AHP (Analytic Hierarchy Process) multi-criteria method, which, according to the judgments and choices of the decision makers, weighs the relevance of each alternative and criterion by using a


matrix. The system supports individuals in the decision task and helps the researcher map the decision-making process through computer logs, i.e., the recording of the movements and accesses made by the decision makers.

In the layout of the experimental task (Figure 2), the lines contain the three alternatives of choice, that is, the three car models available to the decision maker, and the columns contain the attributes of each car, i.e., the criteria relevant when buying a car, drawn from Löbler's research (2005). Once clicked, the information in a cell remains available. When the decision maker has examined all the values of the alternatives for a given criterion, a pop-up screen opens automatically (Figure 2) asking the decision maker to make a pairwise comparison of the elements, in accordance with the AHP methodology. The decision maker then expresses, on a predefined scale, their preference among the elements compared. Saaty (1991) recommends that the pairwise comparison process be implemented using verbal questions.

Figure 2 - Task screen of AHP MAKH-ER – Task with the opening of the judgment screen on the first level. Source: System developed for the research

In the example of Figure 2, the decision maker will have to compare Maintenance/Gol City x Maintenance/Palio Fire, Maintenance/Palio Fire x Maintenance/Celta Life, and Maintenance/Gol City x Maintenance/Celta Life, by


answering the question: "Regarding the Maintenance Criterion, which is the most relevant?"; in other words, they will judge which one contributes more to the maximization of the criterion, in their opinion. Then they are asked "What is the relevance of this choice?", that is, how many times the Maintenance of the Celta Life is more relevant than the Maintenance of the Gol City, as in the example. The pairwise comparison of alternatives uses its own linear scale, ranging from 1 to 9, called the Fundamental Scale of Saaty, shown in Table 3. The numbers of the scale are not shown to the decision maker, because Saaty (1991) has previously determined that each semantic intensity has a corresponding numerical value, so that the individual only needs to answer semantically.

Scale | Intensity of Relevance | Description
1 | Same relevance | Both tasks contribute equally to the purpose.
3 | A little more relevant | Experience and judgment favor one task in relation to another.
5 | Considerably more relevant | Experience and judgment strongly favor one task in relation to another.
7 | A lot more relevant | One task is very strongly favored in relation to another.
9 | Extremely relevant | The evidence favoring one task over another has the highest level of reliability.

Table 3 - Fundamental Scale of Saaty. Source: Saaty (1991), adapted by the authors

Thus, according to the AHP Method, the answers are converted into numbers within a matrix, according to the degree of relevance and priority they express, ranging from 1 to 9 (Table 3), which represents the number of times an alternative dominates or is dominated by another (Saaty, 1991). The same automatic pop-up process applies to the other criteria, whenever the decision maker opens all the cells of the alternatives of a criterion. After this procedure (Figure 2) has been completed for all the criteria of a Set of Criteria, a new comparison pop-up screen opens at a higher hierarchical level, to compare the Sets of Criteria (criteria of the Economic Group and criteria of the Quality Group), and finally the last hierarchical level: the comparison between the Economic and Quality Criteria Groups. Finally, the weights are obtained and the consistency of the matrix is checked, a process called matrix normalization (Saaty, 1991). The final decision should be based on the alternative with the highest score (percentage weight). The final screen for each user presents the choice of car as a vertical bar chart, based on the alternative with the highest score, with the percentages of each attribute judged relevant during the task.
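As an illustration of the computation just described, the sketch below derives priority weights from a reciprocal pairwise-comparison matrix and checks its consistency. It uses the row geometric-mean approximation to the principal eigenvector, a common AHP shortcut; the judgment values are hypothetical and this is not the actual MAKH-ER implementation.

```python
import numpy as np

def ahp_priorities(matrix):
    """Approximate the AHP priority vector of a reciprocal
    pairwise-comparison matrix via row geometric means."""
    m = np.asarray(matrix, dtype=float)
    gm = np.prod(m, axis=1) ** (1.0 / m.shape[0])
    return gm / gm.sum()

def consistency_ratio(matrix):
    """Saaty's CR = CI / RI; CR <= 0.10 is usually deemed acceptable."""
    m = np.asarray(matrix, dtype=float)
    n = m.shape[0]
    w = ahp_priorities(m)
    lam = float((m @ w / w).mean())             # principal eigenvalue estimate
    ci = (lam - n) / (n - 1)                    # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90}[n]  # random index (Saaty)
    return ci / ri if ri else 0.0

# Hypothetical judgments on one criterion: alternative 1 is "a little
# more relevant" (3) than alternative 2, "considerably more" (5) than
# alternative 3, etc.; reciprocals fill the lower triangle.
M = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]
w = ahp_priorities(M)  # percentage weights of the three alternatives
```

The alternative with the largest weight in `w` would be the one displayed with the highest score on the final screen.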


3.2 Application of the Independent Variables: Time Pressure and Missing Information

To characterize time pressure, a few concepts already established in the literature were used. Among them, most importantly: time pressure is experienced through time limits imposed on the completion of a task (Smith & Hayne, 1997); time pressure is experienced whenever the feasible time for task completion is perceived to be shorter than the time normally required by the task (Fisher, Chengalur-Smith & Ballou, 2003); and such time restriction creates a feeling of stress and a need to deal with it (Ordóñez & Benson, 1997).

For the task under study, no literature was found specifying the particular time limit that causes the feeling of stress in individuals. Maule & Hockey (1993) state that it is difficult for researchers to determine how to vary the time given so as to provide a comprehensive estimation of the effects of time pressure on decision making. They show that most studies have operationalized time pressure by adopting a given time, some fraction of the usual time to complete the task, without clearly justifying that particular fraction. Based on these considerations and on the times individuals took to complete the task during the user validation of AHP MAKH-ER, we chose to set the pressure time using separatrices, i.e., the quartiles of the list of recorded times. To determine the time that would put pressure on the individual, the first quartile was taken, i.e., the 25% shortest completion times found, and the mean of this quartile was used. This procedure yielded an average of 4 min and 45 s. In addition, to create the intended scenario and induce a feeling of stress, it was decided that, besides the reduced time to perform the task, three warnings would be given to subjects under time pressure.
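The time-limit derivation above (the mean of the first quartile of the validation completion times) can be sketched as follows; the sample times are hypothetical, whereas the study's own validation data yielded 4 min 45 s.

```python
import statistics

def pressure_time(times_s):
    """Return the mean of the first quartile, i.e., the mean of the
    25% shortest completion times, used as the imposed time limit."""
    ts = sorted(times_s)
    q1 = ts[:max(1, len(ts) // 4)]  # the 25% fastest runs
    return statistics.mean(q1)

# Hypothetical completion times (in seconds) from a user validation.
times = [250, 270, 300, 320, 335, 350, 390, 420, 460, 500, 540, 600]
limit = pressure_time(times)  # mean of the three fastest runs
```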
For the task performed with no time pressure, no time limit was set.

A task with missing information was considered to be one in which the values of certain alternative x criterion cells were removed, since the literature states that missing information occurs when not all information is made available to the individual (Körner et al., 2007). Thus, for the version of AHP MAKH-ER with missing information, the values of certain car criteria were removed at random (by drawing). In the decision-making task interface of this version, the decision maker finds "No Value" in those cells. Even so, they must proceed with their judgment, having to choose between the two alternatives offered.

3.3 Subjects and Experimental Design

As the decision-making task is the choice of a car, the individuals selected as experimental subjects should be knowledgeable about the subject of the decision, that is, individuals who often buy cars, subscribe to car magazines, are car enthusiasts, or work in the field, such as mechanics, drivers, and car salesmen - people with one or more of these characteristics. The criterion of background knowledge was used for the selection of subjects because Löbler (2005) found it to be an influential variable. After 20 participants were selected for the experiment, they were randomly divided into four groups of five individuals for the application of the decision-making task, according to the inclusion or not of the research's independent variables in the task (Figure 3).


TASK | With missing information | With no missing information
With time pressure | GROUP 1 | GROUP 2
With no time pressure | GROUP 3 | GROUP 4 (Control)

Figure 3 - Standardization of experimental group denomination
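The random allocation of the 20 subjects across the four cells of the 2x2 design could be sketched like this; the snippet is an illustration, not the authors' actual procedure.

```python
import random
from itertools import product

# The four cells of the 2x2 design: missing information x time pressure.
CONDITIONS = list(product(["with MI", "no MI"], ["with TP", "no TP"]))

def assign_groups(subjects, seed=None):
    """Shuffle subjects and split them evenly across the four conditions."""
    pool = list(subjects)
    random.Random(seed).shuffle(pool)
    size = len(pool) // len(CONDITIONS)
    return {cond: pool[i * size:(i + 1) * size]
            for i, cond in enumerate(CONDITIONS)}

groups = assign_groups(range(1, 21), seed=42)  # 20 subjects, 5 per cell
```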

Thus, the research design comprised four groups of experimental subjects; each group performed the same task of choosing a car, differentiated by experiencing, or not, missing information and time pressure.

4. RESULTS: THE BEHAVIOR OF THE DECISION MAKER

This section presents the results of the mapping of the decision-making process, obtained through computer logs of the search for information, and the analysis of the questionnaires administered to the decision makers at the end of the experiment. These results, contrasted with the literature, are described separately by experimental group, in order to examine the differences among the groups submitted to the different variables.

4.1 Process Mapping

By using the logs of the decision makers' actions, it was possible to identify aspects of their behavior while judging which car to choose, as well as the manner in which they searched for information, i.e., the sequence the decision maker followed during the process: checking whether they gave priority to opening cells by criterion or by alternative. The mapping shows differences in the manner of searching for information among the groups.

In Group 1, which performed the task with missing information and with time pressure, all experimental subjects opened the information cells of the AHP MAKH-ER by criterion (Figure 4), and in the order in which the criteria were presented, i.e., starting in the first column and proceeding from there. It is believed that, because of time pressure, individuals opened the cells by criterion, and in the order in which they were available, so that upon clicking the three alternatives of each criterion the pop-up screen would already open, optimizing the time to complete the task.
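The distinction between criterion-wise and alternative-wise search that the logs reveal can be made concrete with a small sketch; the log format and the classification function are hypothetical illustrations, not the actual MAKH-ER logging code.

```python
def search_pattern(log):
    """Classify a sequence of (alternative, criterion) cell openings by
    counting transitions: staying on one criterion across alternatives
    indicates criterion-wise search, and vice versa."""
    same_crit = same_alt = 0
    for (a1, c1), (a2, c2) in zip(log, log[1:]):
        if c1 == c2:
            same_crit += 1  # moved to another car, same criterion
        elif a1 == a2:
            same_alt += 1   # moved to another criterion, same car
    if same_crit > same_alt:
        return "by criterion"
    if same_alt > same_crit:
        return "by alternative"
    return "mixed"

# Group 1 style: open all three cars on each criterion, column by column.
log = [("Gol City", "Price"), ("Palio Fire", "Price"), ("Celta Life", "Price"),
       ("Gol City", "Maintenance"), ("Palio Fire", "Maintenance"),
       ("Celta Life", "Maintenance")]
```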


Figure 4 - Search for Information by Criterion. Source: System developed for the research

In Group 2, which also dealt with time pressure but with no missing information, two (2) of the five (5) individuals did not follow this optimization pattern, analyzing the information by alternative, according to the sequence shown in Figure 5. However, one of them, after a certain time, perhaps feeling that time was short, switched to opening the cells by criterion, in order.

Figure 5 - Search for Information by Alternative. Source: System developed for the research


In Groups 3 and 4, which did not perform the task under time pressure, some decision makers opted to search for information by alternative (Figure 5), some in the order presented and others randomly, or even by criterion (Figure 4) but out of order. It is inferred that, not having time pressure as a factor, these individuals searched for information as they preferred, whether by criterion or by alternative.

4.2 Analysis of the Post-Experiment Questionnaires

The questionnaire administered to the experimental subjects at the end of the experiment sought to examine their agreement or disagreement with the final car choice, and especially issues related to missing information and time pressure. These open-ended questions sought to determine whether, in the face of such influences, decision makers used any technique or strategy to make the task easier, and whether, had they not had such an experience, their decision would have been better. The questionnaires consisted of the same two (2) questions for all groups, while the other questions differed across the four groups (GROUP 1, GROUP 2, GROUP 3 and GROUP 4), totaling three (3) questions for G4, the control group, five (5) for G2 and G3, and seven (7) for G1, which was influenced by both variables of the study. The answers were tabulated and grouped according to similarities; categories were then created to represent each group of answers. The questions are presented below with the answers by experimental group, to highlight the differences found.

When asked whether they agreed with the car chosen, shown in the graph of AHP MAKH-ER, and whether they believed they had made a good choice, the subjects gave answers in the five categories shown in Table 1. Overall, of the 20 respondents, the vast majority (17) agreed with the final choice shown in the AHP MAKH-ER graph.
Of these, it is noteworthy that eleven (11) agreed because the chosen car was the one of their preference, and six (6) because they believed it was the best choice based on the available information. Only two (2) individuals did not agree: one because they did not agree with the weights of the criteria, and the other because that car model had not been a priority in their judgments.


Categories | GROUP 1 (With MI and With TP) | GROUP 2 (With no MI and With TP) | GROUP 3 (With MI and With no TP) | GROUP 4 (With no MI and With no TP) | Total
Yes, because the car chosen represents my preferences | 3 | 4 | 2 | 2 | 11
Yes, because I believe I chose the best vehicle, according to the information given | 1 | 1 | 2 | 2 | 6
Yes, because with the information available and my background knowledge, it is the best car | 1 | 0 | 0 | 0 | 1
No, because I do not agree with the weights of the criteria | 0 | 0 | 1 | 0 | 1
No, because in my answers I tried not to give priority to the chosen car | 0 | 0 | 0 | 1 | 1

Table 1 - Answers related to agreement with the vehicle chosen, by experimental group

Analyzing by experimental group, it appears that Groups 1 and 2, submitted to time pressure, agreed with the final choice, in great majority, because it was their car of preference, while Groups 3 and 4, with no time pressure in the task, also agreed with the choice but split their reasons between preference for the car and the information presented. Those who did not experience time pressure were able to analyze the available information more thoroughly than Groups 1 and 2.

All groups were asked whether they had any difficulty using the system during the process of choosing the car. The three answer categories were: no difficulty (13), time was short (5), and some information was missing (2). It is noteworthy that the majority (13) of the 20 experimental subjects said they had no difficulty using the system during the task, five (5) of them being respondents from the control group (Group 4). It should also be noted that in Group 1, subjected to both time pressure and missing information, decision makers reported that only time pressure had been a difficulty factor. This may be because, of the two variables applied to this group's task, individuals were more significantly affected by time pressure than by missing information.

Questions 4 to 7 appeared in the questionnaire according to each group's exposure to a particular variable of the study; therefore, only the questions of the respondent groups are presented here. Groups 1 and 2, submitted to the time pressure variable during the experimental task, were asked the following question: Knowing that you had to


choose under time pressure, did you use any specific technique / strategy to facilitate the task? Four categories of answers were obtained, as described in Table 2:

Categories | GROUP 1 (With MI and With TP) | GROUP 2 (With no MI and With TP) | Total
No technique / strategy | 1 | 1 | 2
Yes, I conducted the task in the quickest manner | 2 | 0 | 2
Yes, my knowledge about cars | 2 | 0 | 2
Yes, I directed my choice to the criteria / car of my preference | 0 | 4 | 4

Table 2 - Answers on the techniques and strategies used to deal with time pressure, by experimental group

It is observed that the largest number of answers (4) regarding techniques and strategies to deal with time pressure refers to directing the choice to preferred criteria and/or cars. That is, to deal with the short period of time, the subjects drew on previously established preferences. Notably, this strategy was only used by Group 2, which faced time pressure but not missing information. For those in the group subject to both variables (Group 1), the techniques to deal with time were to perform the task more quickly (2) and to draw on their knowledge about cars (2).

Groups 1 and 2 were also asked: if you had had more time to choose, would you have chosen better, and why? There were four categories of answers among the 10 subjects. The results show that most of them (6) believe that, with more time to perform the task, their choices would have been better, because they would have been able to analyze the criteria more closely and/or give different answers. For the remaining respondents (4), more time would not have helped them choose better, because some were already inclined toward a particular car model (2), or worked with cars, making the choice easy (1), or had given priority to the analysis of the criteria that pleased them (1).

Groups 1 and 3, with missing information, were asked: Knowing you had to choose with missing information, did you use any specific technique / strategy to facilitate the task? Table 4 shows that most (6) of the ten (10) experimental subjects in the groups that performed the task with missing information drew on background knowledge about the object (cars), or made their choices based on their experience in the automobile industry.


Categories | GROUP 1 (With MI and With TP) | GROUP 3 (With MI and With no TP) | Total
Yes, between choosing a vehicle about which I have information and another about which I don't, I have always chosen the first | 1 | 1 | 2
Yes, what I know about cars and / or the experience I have with the subject | 4 | 2 | 6
Yes, the knowledge I have about cars, because I did not use the data offered at all | 0 | 1 | 1
Yes, I chose the car that seemed good to me considering other characteristics | 0 | 1 | 1

Table 4 - Answers on the techniques and strategies used to deal with missing information, by experimental group

Another strategy, used by two (2) respondents, was to choose the vehicle that had information available on the criterion in question, always discarding the one that did not. As for the difference between the two groups that dealt with missing information, in Group 1, also submitted to time pressure, the strategy of four (4) out of five (5) decision makers was to use their knowledge or experience of cars. In Group 3, which did not suffer time pressure, the techniques used to deal with missing information included using previous knowledge / experience (2), choosing the vehicle that had information available (1), choosing based on what they knew, without using any of the data available (1), and choosing the car that appealed to their preferences on other criteria (1). It was noted in Groups 1 and 3, therefore, that when the subjects also suffered time pressure (Group 1), they relied on their knowledge and experience to deal with missing information and complete the task, whereas the group with no preset time (Group 3) was able to select its strategies for dealing with the issue more carefully.

Groups 1 and 3, with missing information, were also asked whether the missing information, if available, could have helped improve their choice, and why. The majority of respondents (7) believe that the missing information could have helped improve their choice if it had been available; however, two (2) of them said it could have improved the process of choice, but would not have changed the chosen model. Three (3) decision makers responded that the missing information would not have made their decision any better, because of the knowledge they have of the automobile industry. These were subjects who dealt with missing information but with no time pressure (Group 3), which may


explain that, when only missing information is present, the gap is supplied by the individual's background knowledge.

Now that the results obtained by experimental group have been presented, in order to better understand the behavior of the decision maker, we may proceed to the next section, where the main findings are discussed and referenced against the literature.

5. DISCUSSION: CONCLUSIONS REGARDING THE FINDINGS OF THE EXPERIMENT

5.1 Time Pressure Behavior

It was found that the experimental subjects who experienced time pressure, in most cases, searched for information by criterion and not by alternative, as observed in most of the group with no time pressure. Those who started accessing the information by alternative, after a certain time, changed to opening the cells by criterion, in order to make the most of the restricted period of time. Edland & Svenson (1993) state that when time becomes excessively short, the third step in dealing with the situation (after acceleration and selection) is to change strategies, as Ordóñez & Benson (1997) have observed. Accordingly, Payne, Bettman & Luce (1996) report that, under severe time pressure, people accelerate the processing of information, become more selective, and change strategies from a deeper pattern of processing, based on alternatives, to a broader pattern, based on attributes (criteria).

It is possible to note that, under time pressure, individuals make their choices based on their previous preferences, i.e., on choices they already have in mind. This occurs at several moments. Initially, it may be observed that these subjects identify with the chosen car because it represents their priorities. Subsequently, most decision makers say that, when experiencing time pressure, they use the strategy of directing their choices to the criteria / car of their preference.
These findings can be justified by the Theory of Images, which asserts that decision makers already have representations and schemes in mind, called images (Seidl & Traub, 1998), and from there lead their decision-making process toward a goal that has been preset. When Löbler (2006b) verified through an experiment how individuals deal with information in a decision process, he showed that the transgressions committed by the experimental subjects are driven by prior expectations and goals. His findings show that decision makers use the strategic image from the Theory of Images to achieve a certain goal. The strategic image, according to Seidl & Traub (1998), refers to the tactics and plans of the decision maker, that is, the means they use to try to achieve the goals of the trajectory image. For this same reason, the Theory of Images can also be seen, perhaps quite clearly, when some decision makers, if not most of them, say that more time to accomplish the task would not have helped them make a better choice, because they were already inclined toward a certain car model. Thus, the Theory of Images can explain the findings of this study regarding the use of previous preferences to face time


pressure.

Other techniques for dealing with time pressure were also observed. Some individuals stated that, when faced with such a situation, they used their knowledge about the object. These respondents belonged to Group 1, which, in addition to time pressure, also faced missing information; indeed, subjects in Groups 1 and 3 revealed that the use of knowledge was their main strategy, as discussed later. In an experiment evaluating the information processing of individuals with different levels of knowledge, Löbler (2005) found that those with more knowledge about the object in question tend to disregard the objective information provided by the system and use their prior knowledge of the subject. These findings justify the use of prior knowledge as a strategy for dealing with time pressure, since the respondents were individuals with a good understanding of the subject matter. A further explanation comes from Tversky & Kahneman (1974), according to whom individuals seek shortcuts called heuristics. The rational model cannot explain how the decision-making process actually happens, since it is based on a set of assumptions that determine how a decision should be made, rather than how decisions are actually made.

Another strategy of the decision makers was to accomplish the task more quickly, which speeds up the processing of information (Payne, Bettman & Luce, 1996), since time pressure arises when the available time is perceived as shorter than required (Fisher, Chengalur-Smith & Ballou, 2003) or when a time limit is imposed on the task (Smith & Hayne, 1997). In this regard, De Dreu (2003) asserts that, under time pressure, individuals are less motivated to process information systematically, being more influenced by cognitive heuristics and thus spending less time making decisions.
This is corroborated by Rieskamp & Hoffrage (2007), who state that under high time pressure individuals accelerate the search for information, use less information, and stay focused on the most important features. It was observed that respondents who experienced both time pressure and missing information were more affected by the former when asked about the difficulties encountered in using the AHP MAKH-ER in the selection process. This finding may be due to the negative effects that time pressure has on individuals, such as the closing of the mind (Kruglanski & Freund, 1983) and the feeling of stress (Ordóñez & Benson, 1997), which lead the individual to faulty judgments (Ahituv, Igbaria & Sella, 1998) and prevent the complete and profound processing of information (Kocher & Sutter, 2006). Such feelings can thus be more noticeable to individuals than missing information in a decision-making situation.

5.2 Missing Information Behavior

The answers of the subjects who experienced missing information while performing the task show that background knowledge, together with the individual's experience regarding the object of choice, is the most frequently used coping technique. The strategy of using prior knowledge and experience is briefly discussed by Hsu & Mo (2009). The authors sought to examine how consumers perceived missing information in print ads and in fashion magazines, trying to identify how missing information affected their decisions. In general, consumers with high levels of involvement with the advertising tended to give more attention to missing information and were more likely to seek it; consumers with lower levels of involvement, on the other hand, tended to ignore it. Regarding this, Hsu & Mo (2009) argue that missing information can be inferred from previous decision-making experiences. This strategy was also demonstrated in this study as a way of addressing missing information. Another coping strategy, verified twice in the post-experiment questionnaire, was that, given missing information, decision makers prefer the alternative for which information on a criterion is available, totally discarding the one in which the information is missing. This finding concurs with Körner et al. (2007), who confirmed that individuals give more weight to the dimensions in common (with available data) than to the unique dimensions (with missing information). Even though verified by only one experimental subject, it is important to underline that, to deal with the missing information, one subject used knowledge of the object and totally rejected the data available in the AHP MAKH-ER during the task, showing extreme self-confidence. The same individual showed similar behavior when reporting difficulties with the system, which was considered confusing and not designed by a person knowledgeable about cars. This self-confidence is a concern highlighted by Löbler (2006b): decision makers hold values so heavily rooted in their knowledge structure that these values guide all subsequent processes, causing them to dismiss information that does not suit their beliefs or to overvalue information that does.
In this study, when dealing with missing information, the subject was confident enough to disregard information, even information available in the system, trusting their prior knowledge. The explanation for this behavior can be found in Tversky and Kahneman (1974) and Bazerman (2004), who explain that some individuals' decisions are based on the availability heuristic, which leads to the ease-of-recall bias: individuals base their decisions on their most vivid memories while disregarding those that are less available but that may be important for the decision. Self-confidence was also observed in the question that asked whether the missing information would help improve the choice. Some decision makers said that it could even improve the choice, but that it would not change the chosen car. Note that, had the information not been missing, it could have qualified the decision-making process, but it would not have changed the final choice of individuals who are confident in their judgments. The same applies to those who responded that the missing information would not improve their choice because they are experts in the field of the object in question, which shows overconfidence in their own knowledge. Again, Löbler (2005) may supply the justification for the decision maker's self-confidence: individuals with more knowledge, grounded in what they know about the object of the decision, rely more on their own knowledge of the subject, causing them to disregard even more of the information provided by the DSS.

JISTEM, Brazil Vol.8, No. 3, Sept/Dec. 2011, pp. 581-604

www.jistem.fea.usp.br



Independent variable: TIME PRESSURE

Decision maker behavior: theoretical basis
- Searches information by criterion: Edland & Svenson (1993); Ordóñez & Benson (1997); Payne, Bettman & Luce (1996)
- Changes strategy (from alternative to criterion): Seidl & Traub (1998); Löbler (2006b) - Theory of Image
- Chooses based on previous preferences: Seidl & Traub (1998); Löbler (2006b) - Theory of Image
- Uses background knowledge about the object: Löbler (2005)
- Accelerates the processing of information: Payne, Bettman & Luce (1996); De Dreu (2003); Rieskamp & Hoffrage (2007)
- Feels time pressure more than missing information: Kruglanski & Freund (1983) - closing of the mind; Ordóñez & Benson (1997) - feeling of stress

Independent variable: MISSING INFORMATION

Decision maker behavior: theoretical basis
- Uses background knowledge and experience: Hsu & Mo (2009)
- Chooses the alternative in which information is available: Körner et al. (2007)
- Shows self-confidence: Löbler (2005); Löbler (2006b)

Table 4 - Summary of results of the experiment with theoretical underpinning and references

Table 4 is presented in order to give a short view of this study, with the two independent variables of the experiment and the main findings about the behavior of the decision maker, verified in the mapping process and in the open-question questionnaire applied after the experiment, as well as the theoretical foundations that underpin these findings.

6. FINAL REMARKS

Although this may be considered an essentially exploratory work, since it is the first time these variables have been tested in Brazil, it is important to note some conclusions that could guide research along the same lines in the near future. The results of this study provide evidence that decision makers have specific strategies for dealing with time pressure and missing information. Under time pressure, individuals sought information by criteria, or they changed their strategy: starting by alternative and then by criterion. When under time pressure, the experimental subjects focused on the car of their choice, whereas with no time pressure some of them rationalized more, used the information, and did not agree with the chosen car. Assumptions of the Theory of Images also justified some findings, indicating that previously structured images in the mind of the decision maker, which represent their preferences, are a way of coping with time pressure. Still regarding this variable, it was found that decision makers speed up the processing of information, performing the task faster; when subjected to both variables, they felt the time pressure, in the form of closing of the mind and a feeling of stress, more intensely than the missing information. Given missing information, the use of background knowledge and individual experience was the most prominent coping strategy. Less frequently, but worth emphasizing, self-confidence was also noted in some subjects experiencing missing information during the task, as was the choice of the alternative for which information was available (when compared to another alternative with no information). In this regard, given the time pressure and missing information that decision makers face daily, the strategies that emerged in this study may facilitate the understanding of future actions of decision makers.

Among all the limitations related to scientific work, the main limitation of this study is that it is experimental research in the field of social sciences, where total control of the dependent and independent variables is virtually impossible (Triviños, 1995). However, we sought to address this issue properly, within the limits of social science research. Another limitation is the low number of experimental subjects, which opens the possibility of replicating the study with a larger sample. A suggestion for future research is to evaluate more carefully some issues that emerged in this study and seem to be related to the influence of the heuristics and biases addressed by Tversky and Kahneman (1974) and Bazerman (2004), for example, disregard for the information system and use of prior knowledge on the subject.
Another approach may be to include prospect theory in this type of experiment, because it may bring some explanations about the behavior of decision makers based on how information is presented. It could also include feelings of loss and gain in decisions under time pressure and missing information, to see how decision makers would react to changes in these two variables, based on the theory proposed by Kahneman and Tversky (1979). Regarding collection techniques for studying time pressure and missing information in decision making, a further study with qualitative methods such as interviews and observation is suggested, which would allow individuals to evoke revealing features. The use of protocol analysis is also proposed, a technique known as "the think-aloud method", which consists of collecting information about people's thoughts during a variety of cognitive tasks. The verbalization process reveals the assumptions, conclusions, misconceptions and problems that users face when solving problems or performing tasks (Ericsson & Simon, 1993), which in the context of this study would provide a better understanding of the behavior of the decision maker, allowing, perhaps, identification of the best strategies to cope with time pressure and missing information.




REFERENCES

Ahituv, N., Igbaria, M., & Sella, A. (1998). The Effects of Time Pressure and Completeness of Information on Decision Making. Journal of Management Information Systems, 15(2), 153-172.

Berto, A. R. Jogos de Empresas: Avaliação da cognição em relação ao processo de tomada de decisão e formação de estratégia. Congresso Virtual Brasileiro de Administração. Retrieved July 29, 2007, from http://www.convibra.com.br/pdf/66.pdf

Birgelen, M., Ruyter, K., & Wetzels, M. (2000). The Impact of Incomplete Information on the Use of Marketing Research Intelligence in International Service Settings. Journal of Service Research, 2(4), 372-387.

Corso, K. B., & Löbler, M. L. (2010). AHP MAKH-ER: Validação de um sistema de apoio à decisão para estudar a influência da pressão do tempo e da falta de informação no processo decisório. Produto e Produção, 11(2), 45-58.

Cozby, P. C. (2006). Métodos de pesquisa em ciências do comportamento. São Paulo: Atlas.

De Dreu, C. K. W. (2003). Time pressure and closing of the mind in negotiation. Organizational Behavior and Human Decision Processes, 91, 280-295.

Ebenbach, D. H., & Moore, C. F. (2000). Incomplete Information, Inferences, and Individual Differences: The Case of Environmental Judgments. Organizational Behavior and Human Decision Processes, 81(1), 1-27.

Edland, A., & Svenson, O. (1993). Judgment and Decision Making under Time Pressure: Studies and Findings. In Svenson, O., & Maule, J. (Eds.), Time pressure and stress in human judgment and decision making. New York: Plenum.

Ericsson, K. A., & Simon, H. A. (1993). Protocol analysis: verbal reports as data. MIT Press.

Fachin, O. (2002). Fundamentos de Metodologia (3. ed.). São Paulo: Saraiva.

Fisher, C. W., Chengalur-Smith, I. S., & Ballou, D. P. (2003). The impact of experience and time on the use of data quality information in decision making. Information Systems Research, 14(2).

Hsu, J. L., & Mo, R. H. C. (2009). Consumer responses to incomplete information in print apparel advertising. Journal of Fashion Marketing and Management, 13(1), 66-78.

Jagacinski, C. M. (1991). Personnel Decision Making: The Impact of Missing Information. Journal of Applied Psychology, 76(1), 19-30.

Kivetz, R., & Simonson, I. (2000). The Effects of Incomplete Information on Consumer Choice. Journal of Marketing Research, 37, 427-448.

Kocher, M. G., & Sutter, M. (2006). Time is money: time pressure, incentives, and the quality of decision-making. Journal of Economic Behavior and Organization, 61, 375-392.

Körner, C., Gertzen, H., Bettinger, C., & Albert, D. (2007). Comparative judgments with missing information: A regression and process tracing analysis. Acta Psychologica, 125, 66-84.

Kruglanski, A. W., & Freund, T. (1983). The freezing and unfreezing of lay-inferences: effects of impressional primacy, ethnic stereotyping, and numerical anchoring. Journal of Experimental Social Psychology, 19, 448-468.

Löbler, M. L. (2005). Processamento da Informação: Uma avaliação dos diferentes níveis de conhecimento no processo de decisão. Tese (Doutorado em Administração), Escola de Administração, Programa de Pós-Graduação em Administração, Universidade Federal do Rio Grande do Sul, Porto Alegre.

Löbler, M. L. (2006b). A Teoria da Imagem como Explicação para Violação do Método Multicritério de Decisão. Salvador/BA. Anais do 30º ENANPAD, ANPAD.

Löbler, M. L., & Hoppen, N. (2005). Uso da Informação e Estratégias de Decisão na Interação com um SAD. Brasília/DF. Anais do 29º ENANPAD, ANPAD.

Maule, A. J., & Hockey, G. R. (1993). State, Stress, and Time Pressure. In Svenson, O., & Maule, J. (Eds.), Time pressure and stress in human judgment and decision making. New York: Plenum.

Maule, A. J., & Svenson, O. (1993). Theoretical and empirical approaches to behavioral decision making and their relation to time constraints. In Svenson, O., & Maule, J. (Eds.), Time pressure and stress in human judgment and decision making. New York: Plenum.

Ordóñez, L., & Benson III, L. (1997). Decisions under Time Pressure: How Time Constraint Affects Risky Decision Making. Organizational Behavior and Human Decision Processes, 71(2), 121-140.

Payne, J. W., Bettman, J. R., & Johnson, E. J. (1993). The adaptive decision maker. Cambridge University Press.

Payne, J. W., Bettman, J. R., & Luce, M. F. (1996). When Time Is Money: Decision Behavior under Opportunity-Cost Time Pressure. Organizational Behavior and Human Decision Processes, 66(2), 131-152.

Pereira, M. J. L., & Fonseca, J. G. M. (1997). Faces da decisão: as mudanças de paradigmas e o poder da decisão. São Paulo: Makron Books.



JISTEM - Journal of Information Systems and Technology Management
Revista de Gestão da Tecnologia e Sistemas de Informação
Vol. 8, No. 3, Sept/Dec. 2011, pp. 605-618
ISSN online: 1807-1775
DOI: 10.4301/S1807-17752011000300005

RISK ANALYSIS IN INFORMATION TECHNOLOGY AND COMMUNICATION OUTSOURCING
ANÁLISE DE RISCO NA TERCEIRIZAÇÃO DA TECNOLOGIA DE INFORMAÇÃO E COMUNICAÇÃO

Edmir Parada Vasques Prado
Escola de Artes, Ciências e Humanidades - University of São Paulo, Brazil

ABSTRACT

This research aims at evaluating the risk analysis process in Information and Communication Technology (ICT) outsourcing conducted by organizations in the private sector. The research is a descriptive, quantitative and cross-sectional study using the survey method. Data were collected through a questionnaire; the sample is non-random and was obtained by convenience sampling. The research contributes to understanding the risk analysis process in ICT services outsourcing, and identified statistically significant relationships between risk analysis and the organization's size and industry, and between risk analysis and the diversity of outsourced services.

Keywords: Information Technology; Outsourcing; Risk Analysis; Private Sector; Survey.

1 INTRODUCTION

In today's globalized world, organizations make use of ICT to optimize processes, save costs and gain competitive advantage. Within this context, outsourcing is particularly important because it allows for global sourcing of ICT resources as a way to extend the possibilities of sourcing (Goodman & Ramer, 2007). The importance of ICT outsourcing is also a reality in Brazil. Moreover, Brazil holds a special position, because its ICT market is the largest among Latin American countries, with an annual income of about $21 billion in products and services (King, 2008). Its domestic market is dynamic and, despite Brazil being a developing country, the income generated by services reaches 38% of gross domestic product (GDP), comparable to developed-country markets. Other publications, such as Computerworld (2008), have highlighted the growth of the outsourcing market and the emergence of new services with higher added value. Constant change in organizations' environment, innovation and the shortening of technological cycles have become a risk factor of great influence in the business

_____________________________________________________________________________________
Manuscript first received/Recebido em: 21/12/2009   Manuscript accepted/Aprovado em: 16/05/2011

Address for correspondence / Endereço para correspondência
Edmir Parada Vasques Prado, Master and PhD in Business Administration with emphasis on Quantitative Methods and Informatics (FEA/USP, 2000 and 2005). Professor at the Universidade de São Paulo (USP), Escola de Artes, Ciências e Humanidades (EACH), Information Systems program. Rua Arlindo Béttio, 1000 - Ermelino Matarazzo - CEP: 03828-000. Tel. (11) 3091-8893. Email: eprado@usp.br. Lattes CV: http://lattes.cnpq.br/2091731281771940

Published by/Publicado por: TECSI FEA USP - 2011. All rights reserved.



environment (Sauso, 2003). Risks have also been studied in relation to outsourcing. According to Willcocks and Lacity (1999), the growth in importance and size of outsourcing deals has resulted in increased concern with the management of ICT service providers, and most notably with the issue of risk mitigation. Therefore, the study of outsourcing risk is important because it reduces material and monetary loss in organizations. For Cohen and Young (2006), organizations need to understand the strategies of ICT outsourcing and their importance in the overall outsourcing process, which can provide a competitive advantage. The risk analysis process matters both for organizations that purchase ICT services and for those that provide them. Analyzing this process helps to identify specific organizational characteristics that are associated with inadequate risk assessment, which increases the probability of organizations not obtaining the desired return. This study aims at evaluating the risk analysis process in ICT services outsourcing, conducted by organizations that purchase outsourcing services. To achieve this overall objective, two specific goals were defined: to identify relationships between risk analysis and specific organizational characteristics, and between risk analysis and heterogeneity, represented by the diversity of services, components and activities outsourced.

2 LITERATURE REVIEW

Organizations are motivated to outsource ICT services by factors such as cost saving, focus on the organization's core business, improvement of technology and service quality, and access to knowledge and technology that the organization does not have, among others (Prado & Takaoka, 2002). As a result, outsourcing projects have become more common, and currently a project may involve several globally distributed organizations. This has increased the risk involved in outsourcing processes, since outsourcing has increased the number of people and computerized networks that store and manipulate the organization's information. Given this scenario, the risks involved in managing outsourcing projects have grown in importance in recent years (Goodman & Ramer, 2007; Taylor, 2007). This literature review addresses three topics commonly presented in the outsourcing literature: outsourcing services, types of outsourcing models, and risks related to ICT outsourcing.

2.1 Outsourcing Services

There are several ways of classifying ICT resources and activities, because ICT is present in most organizational activities and is an integral part of their processes. Additionally, increasing technological development has made new technology available to organizations and, as a consequence, suppliers have diversified the services they offer. This diversity allows the services to be classified from several viewpoints. Many authors have classified ICT services (Kliem & Ludin, 2000; Leite, 1995). The Outsourcing annual edition (2009) classifies ICT services based on the ICT supply market. Smith, Mitra and Narasimhan (1996) classify services in terms of resources used and project features, and Looff (1997) classifies services as a function of information systems, their components and activities. Table 1 shows, based on these studies, the classification of ICT services adopted in this research.


Table 1. Scope of outsourcing based on services offered by the ICT market

Category: Infrastructure
• Data center / ASP (1)  • Facilities maintenance  • Helpdesk  • Networks and servers
• Storage  • Network security  • Print  • Technical support

Category: Systems and services
• Applications / SaaS (2)  • Legacy systems  • System development and maintenance  • Business process outsourcing

Category: Planning and organization
• Methodology implementation  • Contingency plans  • Consulting and troubleshooting

Category: Miscellaneous
• Typing services  • Employee training  • Desktop publishing  • Others

Legend: (1) ASP - application solution provider. (2) SaaS - software as a service.

Source: Adapted from Kliem and Ludin (2000), Leite (1995), Outsourcing (2009), Smith et al. (1996) and Looff (1997).

2.2 Types of Outsourcing Models

The outsourcing models can be classified in several ways. Leite (1994) classified the models of outsourcing based on the number of suppliers involved. For this author, depending on the outsourcing strategy, the organization may choose to outsource ICT to a single vendor (homogeneous model) or to multiple vendors (heterogeneous model). In the first model the organization is very dependent on the supplier, thereby increasing its vulnerability. On the other hand, it is easier to integrate the various outsourced services, and the cost of coordination is reduced, since only one vendor is managed. The second model, called heterogeneous, consists of contracting multiple vendors. In this model the organization seeks to gain access to better skills and abilities. For this reason, it delegates the management of ICT services to many suppliers, selecting those that offer better conditions for each activity. Although this choice may seem beneficial, it can reach a level of great diversity, making it difficult to manage technical and administrative activities.

Cohen and Young (2006) identified eight different models of outsourcing, which are shown in Figure 1:

a) Internal delivery. The ICT service is provided by the organization's internal staff, and can also be considered homogeneous;

b) Shared services. It creates, in essence, an internal department to provide services to the organization as a whole;

c) Independent company. This model represents a step forward compared to the Shared Services model, because a new company is created that will offer ICT services not only to the corporation but also to the ICT market;


Figure 1 arranges the outsourcing models along two axes: access to best-in-class capabilities (higher for the external, heterogeneous models) and control of service delivery (higher for the internal, homogeneous models).

Outsourcing model          Type of management   Number of suppliers
Selective outsourcing      External             Heterogeneous
Best-of-breed consortium   External             Heterogeneous
Prime contractor           External             Heterogeneous
Total outsourcing          External             Homogeneous
Independent company        Internal             Homogeneous
Shared services            Internal             Homogeneous
Internal delivery          Internal             Homogeneous
Joint venture

Figure 1. Outsourcing models. Source: Adapted from Cohen and Young (2006, p. 91)

d) Total outsourcing. In this model the organization outsources most ICT activities through a single contract with a single outside vendor;

e) Prime contractor. In this model the organization hires one vendor to provide a range of services, but allows this vendor to subcontract other providers that have better skills to deliver specific services;

f) Best-of-breed consortium. In this model, unlike the Prime Contractor model, the client chooses the best providers for each ICT service and, after that, chooses a vendor to manage all the suppliers;

g) Selective outsourcing. In this case, the organization selects and manages all suppliers, choosing the most appropriate supplier to perform each necessary service;

h) Joint venture. The creation of a new business organization by two or more partners.

2.3 Risks Related to Information and Communication Technology (ICT)

One of the most important factors to consider in the outsourcing process is risk. There are many definitions of risk in the literature, but this work adopts the definition of The Institute of Risk Management (IRM, 2002), which defines risk as the combination of the probability of an uncertain event and its consequences. Every risk has a cause, or a risk factor. Similar definitions can be found in other studies (Aron, Clemons & Reddi, 2005; PMI, 2004; Westerman & Hunter, 2008). There are different types of risks, which may originate from internal organizational activities or from the organization's environment. Environmental risk factors such as the changing business environment and the shortening of technological cycles have become risk factors of great influence on organizations (Rovai, 2005).

The understanding that man has of the universe that surrounds him is limited and imperfect. His perception depends on cultural factors, knowledge and accumulated life experience, which change over time, among other factors. When appraisers have no previous experience with a certain risk factor, they tend to overestimate its probability of occurrence, as well as its impact on the organization. This shows the importance of risk analysis within the context of organizations. This view is also shared by Willcocks, Lacity and Kern (1999) when it comes to the outsourcing of ICT. According to these authors, the growth in importance and size of outsourcing deals has resulted in increased concern with outsourcing and, especially, with the issue of reducing the associated risks. This reality makes risk management an important element in the organizational context. For IRM (2002), risk management should be a continuous process, because it is necessary to analyze all the risks inherent to past and present activities, and especially to the future activities of an organization. Figure 2 illustrates the process of risk management applied to business risk analysis.

In the risk management process of Figure 2, the organization's strategic objectives feed risk assessment (a: risk analysis; b: risk evaluation); this is followed by risk reporting of threats and opportunities, a decision, risk treatment, residual risk reporting and monitoring, with modification feeding back into the cycle and a formal audit overseeing it.

Figure 2. Risk Management Process. Source: Adapted from IRM (2002, p. 5)

The risk management process comprises several sub-processes, which are executed sequentially. One of these sub-processes is Enterprise Risk Analysis, which consists of three steps:

a) Risk identification. Identifies and classifies an organization's exposure to a risk factor. Identifying the risk factor requires deep knowledge of the organization and the market in which it operates. After identification, the risk should be classified as: (1) strategic, dealing with long-term objectives; (2) operational, related to the day-to-day issues the organization faces as it strives to achieve its strategic objectives; (3) financial, related to the effective management and control of the organization's finances; (4) knowledge management, regarding the management and control of resources related to organizational knowledge, such as intellectual property, competitive technologies and loss of key personnel; and (5) compliance, related to security, environment, data protection and regulatory issues, among others;

b) Risk description. Presents identified risks in a structured format. It facilitates risk assessment by describing possible improvement actions, the nature of the risk, the identification of protocols and the monitoring of the risk;

c) Risk estimation. In this step a qualitative and quantitative risk analysis is performed. Once the risk estimation is complete, risks can be prioritized, emphasizing those with the greatest exposure.

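The estimation and prioritization step described above can be sketched as a small computation. This is an illustrative sketch only, not part of the paper or the IRM standard: it assumes the common convention that risk exposure is probability times impact, and the risk names and numbers are hypothetical.

```python
# Illustrative sketch (hypothetical data): rank risks by exposure, where
# exposure = probability x impact, as in the risk estimation step above.
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    category: str       # strategic, operational, financial, knowledge, compliance
    probability: float  # estimated probability of occurrence, 0.0 to 1.0
    impact: float       # estimated impact on a chosen scale

    @property
    def exposure(self) -> float:
        # Conventional exposure measure: probability times impact.
        return self.probability * self.impact

def prioritize(risks):
    """Return risks sorted by exposure, highest first."""
    return sorted(risks, key=lambda r: r.exposure, reverse=True)

# Hypothetical risks echoing Leite's (1994) categories of outsourcing risk.
risks = [
    Risk("Loss of autonomy and control", "strategic", 0.2, 9.0),
    Risk("Rising costs", "financial", 0.5, 6.0),
    Risk("Confidentiality breach", "compliance", 0.1, 10.0),
]

for r in prioritize(risks):
    print(f"{r.name}: exposure {r.exposure:.1f}")
```

With these made-up numbers, "Rising costs" ranks first (0.5 x 6.0 = 3.0), illustrating why a frequent moderate risk can outrank a rare catastrophic one under this measure.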



The lack of an adequate analysis of the risks involved in outsourcing can have serious consequences for organizations. Leite (1994) published one of the first works addressing the risk of outsourcing in Brazilian organizations. His study showed some of the risks involved in the process and their consequences:

a) Loss of autonomy and control. The outsourcing process can lead the organization to a lean structure. However, the organization runs the risk of losing part of its autonomy, since it may no longer have an internal team to discuss technical issues related to technology strategies. In extreme cases, there may be a total loss of control, with no way to validate suppliers' bids. Moreover, regaining control of an unsuccessful ICT outsourcing is difficult, time consuming and expensive;

b) Rising costs. In some cases, the costs of outsourcing ICT services can rise to very high levels, turning the organization into a supplier's hostage;

c) Confidentiality. This is one of the most serious risks facing organizations, because confidential information can compromise the organization's success if it becomes public or if the competition gains access to it.

3 THEORETICAL RESEARCH MODEL

The research model is shown in Figure 3 and is designed to meet the objectives of the research. The constructs in the research model were defined from the literature review in section two of this work and aim to provide conceptual and operational definitions that allow the measurement of variables. The model consists of three constructs:

a) Organizational characteristics. This construct describes the general organizational characteristics that are relevant to the analysis of outsourcing (Prado & Takaoka, 2006);

b) Heterogeneity. Describes the degree of diversity and complexity of outsourced services (Kliem & Ludin, 2000; Leite, 1995; Looff, 1997);

[Figure 3 diagram: the construct Organizational characteristics (V1 Industry; V2 Size; V3 Degree of investment) and the construct Heterogeneity (V4 Outsourced services; V5 Components; V6 Activities) are linked by hypotheses H1 and H2, respectively, to the construct Risk analysis (V7 Degree of outsourcing; V8 Contribution to operation; V9 Contribution to strategy; V10 Contractual formality; V11 Trust).]

Figure 3. Research Model

c) Risk analysis. This construct represents the process of risk analysis considered in



611 Risk analysis in information technology and communication outsourcing

outsourcing, according to the work of Schmidt and Prado (2008). The constructs presented in the research model comprise 11 variables, which are described in Table 2. Two hypotheses were developed based on the research model. The first hypothesis (H1) establishes a relationship between organizational characteristics and outsourcing risk analysis, and can be stated as follows: "organizational characteristics, such as size, industry and level of ICT investment, are associated with appropriate risk analysis of ICT outsourcing." The second hypothesis (H2) establishes a relationship between heterogeneity and risk analysis in ICT outsourcing, and is stated as follows: "a high degree of heterogeneity is associated with a more formal analysis of risks in outsourcing."

Table 2. Summary of research variables

Organizational characteristics:
- Industry (Nominal): manufacturing industry; service industry
- Size (Ordinal): medium; large; very large
- Degree of investment (Ordinal): very little; little; medium; high

Heterogeneity:
- Outsourced services (Ratio): amount of outsourced services
- Components (Ratio): number of components used in outsourcing (hardware, software, data, processes and people)
- Activities (Ratio): amount of outsourcing activities (planning, development, deployment, maintenance and operation)

Risk analysis:
- Degree of outsourcing (Ordinal): up to 20%; up to 40%; up to 60%; up to 80%; more than 80%
- Contribution to operation (Ordinal): Likert scale (1 to 5)
- Contribution to strategy (Ordinal): Likert scale (1 to 5)
- Contractual formality (Ordinal): Likert scale (1 to 5)
- Trust (Nominal): was not considered; level of control and penalties; prior knowledge between people; common goals; past success

4 RESEARCH METHOD

This section describes the procedures and methods used in the research. The first item classifies the type of research. The following items describe aspects related to population and sampling, data collection and the procedures for data analysis.

4.1 Type of Research

The research proposed in this paper is characterized as a descriptive study, according to Wrightman, Cook and Selltiz (1976). This type of study aims to determine the frequency with which some phenomenon occurs and to discover or verify links between variables. It is a quantitative, cross-sectional study, because the information was collected at a single point in time (Malhotra, 2009).




4.2 Population and Sampling

The definition of the target population should contain information about the elements of the sample and its scope (Aaker, Kumar & Day, 2004). In this research the unit of analysis is the risk assessment made by the organizations, and the unit of observation is the organizations' ICT department. The research scope covers large and medium-sized private organizations that have at least one outsourced ICT service. We adopted Fisher's exact test to analyze the data. This technique applies to contingency tables with sparse or unbalanced cells and is therefore suitable for small samples. We opted for a non-random sample, obtained through a convenience sampling procedure. According to Aaker et al. (2004), these characteristics are suitable for obtaining information at lower cost. We obtained a sample of 54 organizations.

4.3 Data Collection

The data collected were classified as primary data, i.e., data that had not been collected before. The people interviewed were those responsible for the ICT department in the organizations. A structured questionnaire was adopted as the data collection instrument. The advantages of this instrument are the reduced cost of the research and the uniformity of measurement. Malhotra (2009) also points to the questionnaire as the best way of gathering information from a large number of respondents. Data were collected in the second half of 2008.

4.4 Data Analysis Procedures

Data analysis was performed in three stages. In the first stage we used descriptive statistics, including frequency and contingency tables, to describe the sample and learn about the features presented in the research model. In the second stage we analyzed the reliability of the measurement scales using factor analysis. In the last stage we used Fisher's exact test to verify the research hypotheses.

5 DATA ANALYSIS AND RESULTS

Data analysis and results are presented in two topics: (1) reliability of scales, and (2) test of research hypotheses. The sample consisted of 54 organizations, whose characteristics are presented in Table 3. The size of the organization was measured by its number of employees. The sample comprised 77.8% (46.3% + 31.5%) of large organizations with more than 500 employees, and 96.3% (74.1% + 22.2%) of the organizations invest more than 1.0% of gross annual revenue in ICT. The sample is therefore composed mostly of large companies with substantial ICT investments, which is appropriate for the purposes of this research.
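The descriptive stage (frequency and contingency tables) can be reproduced with a simple cross-tabulation. In the sketch below the individual records are reconstructed from the percentages reported in Table 3, not taken from the study's raw data:

```python
from collections import Counter

# Reconstructed sample: (industry, size class) pairs whose counts match
# the percentages in Table 3 (40 manufacturing + 14 service = 54 orgs).
sample = (
    [("Manufacturing", "100-499")] * 12
    + [("Manufacturing", "500-999")] * 20
    + [("Manufacturing", "> 999")] * 8
    + [("Service", "500-999")] * 5
    + [("Service", "> 999")] * 9
)

table = Counter(sample)   # contingency table cells: (industry, size) -> count
n = len(sample)           # 54 organizations

# Share of large organizations (more than 500 employees), as in the text above
large = sum(c for (ind, size), c in table.items() if size != "100-499")
print(f"{100.0 * large / n:.1f}% have more than 500 employees")  # prints: 77.8% ...
```

The cell counts recover the paper's figures: 20/40 = 50.0% of manufacturing firms fall in the 500-999 band, and 42/54 = 77.8% of the whole sample has more than 500 employees.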




Table 3. Sample characteristics

Manufacturing (40 organizations, 74.0% of the sample):
- Number of employees: 100-499: 30.0%; 500-999: 50.0%; more than 999: 20.0%
- ICT investment (% of annual revenue): 0 to 1%: 5.0%; 1 to 2%: 82.5%; more than 2%: 12.5%

Service (14 organizations, 26.0% of the sample):
- Number of employees: 100-499: 0.0%; 500-999: 35.7%; more than 999: 64.3%
- ICT investment (% of annual revenue): 0 to 1%: 0.0%; 1 to 2%: 50.0%; more than 2%: 50.0%

Total (54 organizations, 100.0%):
- Number of employees: 100-499: 22.2%; 500-999: 46.3%; more than 999: 31.5%
- ICT investment (% of annual revenue): 0 to 1%: 3.7%; 1 to 2%: 74.1%; more than 2%: 22.2%

5.1 Reliability of Scales

We conducted an analysis to improve the existing scales by evaluating the reliability of the scales defined in the research model. Heterogeneity is represented by three variables (V4, V5 and V6). The application of factor analysis reduced these three variables to only two: diversity of services (V12) and number of suppliers (V13). These two new variables account for 90.47% of the variance of the three previous variables. With this change, Cronbach's alpha increased from 0.539 to 0.642.
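The reliability statistic reported above can be computed directly from item scores. A minimal standard-library sketch, using invented Likert-type scores rather than the study's data:

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a scale, given one list of scores per item
    (all lists cover the same respondents in the same order)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]   # per-respondent sums
    item_var = sum(pvariance(col) for col in items)    # sum of item variances
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Three Likert items (rows = items, columns = 6 respondents); invented values.
items = [
    [4, 3, 5, 2, 4, 3],
    [4, 2, 5, 3, 4, 2],
    [3, 3, 4, 2, 5, 3],
]
alpha = cronbach_alpha(items)  # higher alpha = more internally consistent scale
```

Dropping or merging items that covary weakly with the rest raises the statistic, which is the effect the factor analysis above produced (0.539 to 0.642 for heterogeneity).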

[Figure 4 diagram: the construct Organizational characteristics (V1 Industry; V2 Size; V3 Degree of investment) and the construct Heterogeneity (V12 Diversity of services; V13 Number of suppliers) are linked by hypotheses H1 and H2, respectively, to the construct Risk analysis (V7 Degree of outsourcing; V14 Selection process; V15 Hiring process).]

Figure 4. Result of Factor Analysis

Likewise, risk analysis is represented by five variables (V7, V8, V9, V10 and V11). The application of factor analysis reduced these five variables to only three: degree of outsourcing (V7), selection process (V14) and hiring process (V15). The two new variables account for 82.14% of the variance of the four previous variables. The research model, simplified by the application of factor analysis, is shown in Figure 4. With this change, Cronbach's alpha increased from 0.510 to 0.640.




5.2 Test of Research Hypotheses

We analyzed the relationships between the variables of the research model according to the research hypotheses. Table 4 presents the results of applying Fisher's exact test and highlights the relationships considered statistically significant, i.e., those with a significance level less than or equal to 5%.

Table 4. Statistically significant relationships (H0 = independence between variables)

Hypothesis H1 (organizational characteristics × risk analysis):
- V1 Industry × V7 Degree of outsourcing: 0.567
- V1 Industry × V14 Selection process: 0.030 (significant)
- V1 Industry × V15 Hiring process: 0.360
- V2 Size × V7 Degree of outsourcing: 0.180
- V2 Size × V14 Selection process: 0.221
- V2 Size × V15 Hiring process: 0.039 (significant)

Hypothesis H2 (heterogeneity × risk analysis):
- V12 Diversity of services × V7 Degree of outsourcing: 0.021 (significant)
- V12 Diversity of services × V14 Selection process: 0.000 (*) (significant)
- V12 Diversity of services × V15 Hiring process: 0.004 (significant)
- V13 Number of suppliers × V7 Degree of outsourcing: 0.144
- V13 Number of suppliers × V14 Selection process: 0.165
- V13 Number of suppliers × V15 Hiring process: 0.148

Legend: (*) Fisher's exact test using the Monte Carlo algorithm.
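For a 2×2 contingency table, the Fisher's exact test used throughout Table 4 can be computed from hypergeometric probabilities with the standard library alone. The counts below are invented for illustration; they are not the study's data:

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's exact test p-value for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, col1 = a + b, a + c

    def hyper(x):
        # Probability of a table with x in the top-left cell, margins fixed
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = hyper(a)
    lo, hi = max(0, row1 - (n - col1)), min(row1, col1)
    # Sum the probabilities of every table as likely or less likely than
    # the observed one (small tolerance absorbs floating-point ties).
    return sum(p for p in map(hyper, range(lo, hi + 1)) if p <= p_obs + 1e-12)

# Hypothetical counts: appropriate vs. inappropriate risk analysis in the
# selection process, split by industry (invented numbers).
p = fisher_exact_2x2(3, 17, 10, 10)
```

For these invented counts the p-value is about 0.04, so H0 (independence) would be rejected at the 5% level, the criterion applied in Table 4.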

A total of 12 relationships were analyzed; each research hypothesis involved the analysis of six relationships. The results showed that hypothesis H1, which suggests a relationship between organizational characteristics and the risk analysis process, had two statistically significant relationships: (1) between industry and risk analysis in the suppliers' selection process; and (2) between organizational size and risk analysis in the suppliers' hiring process. Analyzing hypothesis H2, which suggests a relationship between heterogeneity and the risk analysis process, we found three statistically significant relationships between the diversity of services and: (1) risk analysis associated with the degree of outsourcing, (2) risk analysis in the vendor selection process, and (3) risk analysis in the hiring process. We made a detailed analysis of the statistically significant relationships and the results are presented in Table 5: a) Selection process and industry. Manufacturing companies perform an inadequate risk analysis in the suppliers' selection process, while service companies perform a good analysis of this type of risk. b) Hiring process and size of organization. Medium-sized companies perform almost no risk analysis in the suppliers' hiring process, while companies with more than 1,000 employees and annual revenues exceeding $1 billion, classified as corporations, perform a good analysis of this type of risk. c) Degree of outsourcing and risk analysis. The greater the degree of diversity of




outsourced services, the more elaborate the risk analyses in the suppliers' selection and hiring processes. d) Suppliers' selection process and service diversity. As in the previous relationship, the greater the diversity of outsourced services, the more elaborate the risk assessments associated with the suppliers' selection process. This was the relationship with the highest level of statistical significance, as shown in Table 4. e) Suppliers' hiring process and service diversity. This was the only relationship in which the association was negative, i.e., services with lower diversity were associated with more elaborate risk analysis in the suppliers' hiring process.

[Table 5. Analysis of statistically significant relationships: cross-tabulation of the risk analysis variables (degree of outsourcing, selection process, hiring process) against organizational size (medium, large, very large), industry (manufacturing, services) and diversity of services (none, little, medium, large, very large). Legend: (*) values greater than 1.96 are statistically significant at the 5% level.]

6 CONCLUSIONS AND RECOMMENDATIONS

The aim of this paper was to evaluate the process of risk analysis in the outsourcing of ICT services. This goal was achieved through a survey of 54 private sector organizations. We tested two hypotheses regarding risk analysis in ICT outsourcing. Both hypotheses were confirmed and constitute the main contributions of this research. a) Association between risk analysis in ICT outsourcing and organizational characteristics. The organizational characteristics considered in this study were the





organization's size and its industry. The analysis showed that large organizations perform better risk analysis of ICT service outsourcing, and that organizations in the service industry perform better analyses than those in the manufacturing industry. This result is consistent with the case studies conducted by Schmidt and Prado (2008), who found that large corporations in the service industry perform better outsourcing risk analyses. b) Association between outsourcing risk analysis and heterogeneity. This association was the strongest evidence found in this study. The diversity of outsourced services, one characteristic of heterogeneity, had a significant association with all the variables that represent risk analysis: degree of outsourcing, selection process and hiring process. The association between the degree of outsourcing and the selection process was positive, i.e., the greater the diversity of outsourced services, the greater the risk analyses conducted by organizations in the suppliers' selection process and in highly outsourced environments. This result also concurs with other studies (Leite, 1994; Schmidt & Prado, 2008). However, the diversity of outsourced services had a negative association with risk analysis in the hiring process, that is, the greater the diversity of outsourced services, the lesser the risk analyses in the hiring process. This result may be associated with the degree of formality, which is a component of risk analysis in the hiring process. Kishore, Rao, Nam, Rajagopalan and Chaudhury (2003) found that organizations with a stronger outsourcing culture establish relationships with suppliers based more on trust than on formal contracts. The concept of heterogeneity is similar to the concept of idiosyncrasy used in Transaction Cost Theory (TCT), described by Williamson (1975). For TCT, organizations should outsource assets that are frequently used and have high specificity (idiosyncratic).
This research supports the recommendations of TCT in the specific case of ICT outsourcing, showing that ICT services with a high degree of heterogeneity (idiosyncratic) should be outsourced, but with an appropriate risk assessment. The results of this study should be considered in light of its limitations. Among them is the limitation associated with the research method: a non-random sample obtained through a convenience sampling process does not allow generalization of the results. We therefore recommend further research based on the propositions of this study. Possible alternatives include replicating this research with a random sample and extending the study to the commercial industry, to allow a better comparison across industries.

REFERENCES

Aaker, D. A., Kumar, V., & Day, G. S. (2004). Marketing research (7th ed.). New York: John Wiley & Sons.

Alencar, J. A., & Schmitz, E. A. (2006). Análise de risco em gerência de projetos. Rio de Janeiro: Brasport.

Aron, R., Clemons, E. K., & Reddi, S. (2005). Just right outsourcing: understanding and managing risk. In Proceedings of the 38th Hawaii International Conference on System Sciences, Hawaii, 2005.




Cohen, L., & Young, A. (2006). Multisourcing: moving beyond outsourcing to achieve growth and agility. Boston: Harvard Business School Press.

Goodman, S. E., & Ramer, R. (2007). Identify and mitigate the risks of global IT outsourcing. Editorial preface. Journal of Global Information Technology Management, 10(4), 1-6.

IDG Now (2008). Terceirização de infra-estrutura de TIC no Brasil triplicará até 2012. Retrieved June 15, 2009, from http://idgnow.uol.com.br/computacao_corporativa/2007/04/18/idgnoticia.2007-0418.7746081676/

IRM (2002). The risk management standard. Retrieved April 18, 2008, from http://www.theirm.org/publications/PUstandard.html

King, M. (2008). Brazil information technology report Q3. Retrieved April 22, 2008, from http://www.companiesandmarkets.com/Summary-Market-Report/Brazil-Information-Technology-Report-Q3-2008-49414.asp

Kishore, R., Rao, H. R., Nam, K., Rajagopalan, S., & Chaudhury, A. (2003). A relationship perspective on IT outsourcing. Communications of the ACM, 46(12), 87-92.

Kliem, R. L., & Ludin, I. S. (2000). The essentials for successful IT outsourcing. In J. Butler (Org.), Winning the outsourcing game (pp. 57-65). New York: Auerbach Publications.

Lacity, M. C., Willcocks, L. P., & Feeny, D. (1995). IT outsourcing: maximize flexibility and control. Harvard Business Review, May/June, 84-93.

Leite, J. C. (1994). Terceirização em informática. São Paulo: Makron Books.

Leite, J. C. (1995). Terceirização em tecnologia no Brasil: investigação sobre a situação da terceirização em informática no contexto brasileiro. Núcleo de Pesquisa e Publicações e Relatórios de Pesquisa, relatório nº 13. São Paulo: Fundação Getúlio Vargas.

Looff, L. (1997). Information systems outsourcing decision making: a managerial approach. Hershey: Idea Group Publishing.

Malhotra, N. K., & Birks, D. F. (2009). Marketing research: an applied orientation (6th ed.). England: Pearson Education.

Outsourcing (2009). Série estudos, edição anual, 6(6). Retrieved September 15, 2009, from http://www.serieestudos.com.br/EstudosMercado/PublicacaoDetalhe.aspx?id=2

Prado, E. P. V., & Takaoka, H. (2002). Os fatores que motivam a adoção da terceirização da tecnologia de informação: uma análise do setor industrial de São Paulo. Revista de Administração Contemporânea, 6(3), 129-147.

Prado, E. P. V., & Takaoka, H. (2006). Arranjos contratuais na terceirização de serviços de TI em organizações do setor privado. Anais do 30º Encontro Anual da Associação Nacional dos Programas de Pós-graduação em Administração, Salvador.

PMI (2004). A guide to the project management body of knowledge – PMBOK Guide (3rd ed.). Newtown Square: PMI.





Rovai, R. L. (2005). Modelo estruturado para gestão de risco em projetos: estudo de múltiplos casos. Tese de Doutorado, Escola Politécnica, Universidade de São Paulo, São Paulo.

Sauso, R. (2003). Business and information technology alignment: research propositions related to enterprise architecture frameworks. Helsinki University of Technology.

Schmidt, S. O., & Prado, E. P. V. (2008). Modelos organizacionais da terceirização da tecnologia de informação: um estudo de múltiplos casos. Monografia, Escola de Artes, Ciências e Humanidades, USP, São Paulo.

Smith, A. M., Mitra, S., & Narasimhan, S. (1996). Offshore outsourcing of software development and maintenance: a framework for issues. Information & Management, 31, 165-175.

Taylor, H. (2007). Outsourced IT projects from the vendor perspective: different goals, different risks. Journal of Global Information Management, 15(2), 1-27.

Westerman, G., & Hunter, R. (2008). O risco de TI: convertendo ameaças aos negócios em vantagem competitiva. São Paulo: M. Books.

Willcocks, L. P., & Lacity, M. C. (1999). IT outsourcing in insurance services: risk, creative contracting and business advantage. Information Systems Journal, 9, 163-180.

Willcocks, L. P., Lacity, M. C., & Kern, T. (1999). Risk mitigation in IT outsourcing strategy revisited: longitudinal case research at LISA. Journal of Strategic Information Systems, 8(3), 285-314.

Williamson, O. E. (1975). Markets and hierarchies. New York: The Free Press.

Wrightman, L. S., Cook, S. W., & Selltiz, C. (1976). Research methods in social relations. New York: Holt, Rinehart & Winston.



JISTEM - Journal of Information Systems and Technology Management Revista de Gestão da Tecnologia e Sistemas de Informação Vol. 8, No. 3, Sept/Dec. 2011, pp. 619-640 ISSN online: 1807-1775 DOI: 10.4301/S1807-17752011000300006

METAMODELS OF INFORMATION TECHNOLOGY BEST PRACTICES FRAMEWORKS

Arthur Nunes Ferreira Neto
João Souza Neto
Catholic University of Brasília, DF, Brazil
_____________________________________________________________________________________

ABSTRACT This article deals with the generation and application of ontological metamodels of frameworks of best practices in IT. The ontological metamodels represent the logical structures and fundamental semantics of framework models and constitute adequate tools for the analysis, adaptation, comparison and integration of the frameworks of best practices in IT. The MetaFrame methodology for the construction of the metamodels, founded on the discipline of the conceptual metamodelling and on the extended Entity/Relationship methodology is described herein, as well as the metamodels of the best practices for the outsourcing of IT, the eSCM-SP v2.01 (eSourcing Capability Model for Service Providers) and the eSCM-CL v1.1 (eSourcing Capability Model for Client Organizations), constructed according to the MetaFrame methodology. Keywords: Framework; Best Practices in IT; Metamodels; eSCM.

1. INTRODUCTION

According to the IT Governance Institute (2005), "the survival and success of an organization in light of the new globalized market, where time and distance are suppressed, depend on the effective management of information and related technology". In this context, in which IT (Information Technology) plays a decisive role within organizations, the models or frameworks of best practices in IT have emerged over the last two decades. These frameworks are the response of enterprises and academia to the challenges of IT management and governance, functioning as instruments for promoting the alignment between IT processes and the strategic objectives of the organization.

_____________________________________________________________________________________
Manuscript first received/Recebido em: 01/10/2010. Manuscript accepted/Aprovado em: 02/02/2011.
Address for correspondence / Endereço para correspondência: Arthur Nunes Ferreira Neto, Master in Information Technology and Knowledge Management – MGCGI/UCB, Catholic University of Brasília, Campus Avançado, SGAN 916 Asa Norte - Modulo B Sala A111 - CEP: 70.790-160 Brasília – DF, Brasil - Telefone: (61) 3338-6534 - E-mail: arthurnetobsb@gmail.com. João Souza Neto, Doctor of Science in Electrical Engineering, University of Brasília – UNB, Professor at Catholic University of Brasília, Master's degree Program in Information Technology and Knowledge Management, Campus Avançado, SGAN 916 Asa Norte - Modulo B - Sala A121 - CEP: 70.790-160 Brasília – DF, Brasil - Telefone: (61) 3448-6534 - E-mail: joaon@ucb.br.
Published by/Publicado por: TECSI FEA USP – 2011. All rights reserved.



Ferreira Neto, A. N., Souza Neto, J.

According to Johannsen and Goeken (2007), the frameworks for best practices in IT “describe organizational objectives, processes and aspects of the management and control of IT”. Among the main frameworks of IT best practices currently used in organizations, we have the eSCM (eSourcing Capability Model), the CobiT (Control Objectives for Information and Related Technology), CMMI (Capability Maturity Model Integration), PMBoK (Project Management Body of Knowledge) and ITIL (Information Technology Infrastructure Library). The effective implementation of an IT best practices framework is a complex activity that demands planning and that normally brings significant changes in the organization and in its processes. The challenge then arises to understand, in depth, the structure of the framework so that a preliminary study of its adoption on the processes of the organization can be done. Besides this, it is noted that the adoption of just one of these IT best practices frameworks may not be sufficient for a particular organization. Despite the different focuses and the conceptual and structural differences, the IT best practices frameworks are not, in principle, incompatible, and can be used concomitantly to promote the improvement of the organization’s information technology management. However, one of the challenges currently faced in IT management is how to analyze, adapt, compare and integrate the different frameworks of IT best practices. It is understood, consequently, that the first step towards solving these problems is by understanding the logical structures and generating semantics of the IT best practices frameworks. This can be achieved by generating ontological metamodels of these frameworks. The metamodels of the ontological type were identified and defined in the studies by Atkinson and Kühne (2003a and 2003b) and will be considered in the theoretical reference of this article. 
We highlight that the creation of a domain ontology for IT best practices frameworks is not part of the present work; it will be the object of further studies. Among the main approaches used so far to analyze and compare IT best practices frameworks are high-level classifications, based on diverse comparison criteria, and the detailed mapping of functions and processes between frameworks (ITGI, 2006, 2008). However, the application of these two approaches alone does not contribute significantly to solving the problem of comparing and integrating the frameworks. The high-level classifications, based on comparison criteria, are not detailed enough to detect correspondences or incoherencies between different areas of the IT best practices frameworks. At the other extreme, the detailed mapping of the functions and processes of the frameworks has a high level of detail, but provides little information for understanding the conceptual and logical structures, which are important for planning an effective integration. In an effort to fill this gap, the MetaFrame methodology, which brings together procedures, strategies and instructions for the creation of ontological metamodels of IT best practices frameworks, is presented in this article. An example of the application of the MetaFrame methodology is also presented: the generation of the metamodels of the eSCM-SP v2.01 (Hyder et al., 2006) and the eSCM-CL v1.1 (Hefley & Loesche, 2006).

JISTEM, Brazil Vol.8, No. 3, Sept/Dec. 2011, pp. 619-640

www.jistem.fea.usp.br


621 Metamodels of information technology best practices frameworks

2. THEORETICAL REFERENCE

2.1 Ontological Metamodels

The management of the elements of an organization increasingly relies on models, and on modeling tools and environments of ever greater complexity. For Karagiannis (2002), the state of the art in the area of organizational modeling is based on metamodels.

One can begin a literal analysis of what metamodel means with the prefix "meta". In Greek, "meta" means "that which is beyond", "that which encompasses", "that which supersedes", "that which transcends". We use the prefix "meta" when a certain operation is performed twice: a dialogue about how to conduct a dialogue, for example, is a metadialogue. In short, "meta" is placed before an operation f to indicate that f is applied twice. Instead of writing ff, as in modelmodel, metaf is used, that is, metamodel. For yet another application of the operation, another prefix "meta" is added, as in metametamodel. For Kühne (2005), author of a semantic formalization for metamodels, a model can be thought of as a projection: something (the original) is projected, and part of the information is lost during the projection, in the activity called abstraction. The part that is retained depends on the purpose for which the model will be used. The author defines a model as an abstraction of a system (real or based on a language) that permits predictions or inferences to be made. According to the OMG (Object Management Group), the open consortium responsible for the MDA (Model-Driven Architecture, 2003) and UML (Unified Modeling Language, 2004) specifications, a model is an instance of a metamodel, which implies that a metamodel is a model of another model. An important contribution to the studies on the subject of this article was provided by Atkinson and Kühne (2003a and 2003b), who identified two dimensions of metamodeling, giving rise to two distinct forms of instantiation of the objects of the metamodel (linguistic and ontological). One dimension is related to the definition of the language and makes use of linguistic instantiation, used, for example, in the MDA architecture, the basis of the UML language.
The other dimension concerns the definition of the domain or type of the object and uses the ontological instantiation employed, in this study, in the creation of the metamodels of frameworks of best practices in IT. Both forms occur simultaneously and serve to precisely locate an element of the model in the linguistic-ontological space. In Figure 1, the OMG MDA architecture with four layers of abstraction (M0 to M3) is used, also followed by the UML 2.0 and MOF 2.0 linguistic modeling standards. We have the visualization of a linguistic metamodel with four horizontal layers, from M0, the lowest level, to M3, the highest level of abstraction. At the same time, we have the visualization of the ontological metamodel, represented by different areas separated by a dashed vertical line at the M1 level. By making the two metadimensions explicit, Figure 1 also illustrates the relationship between the elements of the model and the real world. The dog and the lamp (mental concept) at the M0 level are the elements of the real world to be modeled. The real Lassie is "represented" by the object Lassie, not by an 'instance of' Collie. The abstraction level M1 contains the first level of abstraction of an object of the real world, together with the type of which the object is an ontological instantiation. The Lassie object (O0)





is an ontological instantiation of the type Collie (O1). From M1 upwards, each level is a model expressed in the language defined at the level above. In M2, the Lassie object is a linguistic instantiation of the Object type, which, in M3, is a linguistic instance of the Class type.

Figure 1: The Linguistic Metamodel (adapted from Atkinson and Kühne, 2003b)

The ontological metamodels use the 'instance of' relationship to relate concepts with their types or metatypes. In Figure 2, we extend the ontological levels by rotating Figure 1 to the right and adding level O2; in this way, the ontological metalevels are arranged horizontally. For Atkinson and Kühne (2003b), the two points of view are equally valid and useful.

Figure 2: The Ontological Metamodel (adapted from Atkinson and Kühne, 2003b)




The utility of using metaconcepts has been recognized for a long time. For example, using metaconcepts such as races and species brings enormous advantages. Figure 3 shows one of the most mature ontological metamodels, the biological taxonomy of living beings. One notes that the Lassie object is an ontological instance of the Collie type or class, which is at the O1 level, the level of the model. The Collie type, in turn, is an ontological instance of the Race type, which is in O2, the metamodel level. One can see, from the UML notation, that the Collie type is also a specialization of the Canis Familiaris type (domestic dog) at the same ontological level O1. The presence of other metaconcepts, such as Species, Genus etc., is perceived in O2.

Figure 3: Biological Classification (Adapted from Atkinson and Kühne, 2003b).

From a linguistic or grammatical perspective, Lassie is a noun or an object; that is, noun and object are linguistic classifiers of Lassie. From a semantic or ontological point of view, the word Lassie can be understood as a type of dog or an animal film character; that is, 'type of dog' and 'animal film character' are ontological classifiers of Lassie. The first kind of classification refers to the form of the element and the second to its content. These two dimensions of classification can be expressed graphically. In visual models, linguistic metamodels classify the elements of the model with respect to their form (Object, Class, Association, Attribute). Ontological metamodels, on the other hand, classify the content of the elements of the model (Collie, Race etc.). According to Atkinson and Kühne (2003b), despite the validity and utility of ontological metamodels of types, for tool builders and members of standardizing consortia such as the OMG, the term metamodel typically refers only to the linguistic kind. From the perspective of the language user, however, the hierarchy of types formed by the ontological levels is much more relevant. In other words, ontological metamodels are metamodels for users, focused on content, while linguistic metamodels are a metamodeling standard focused on form.
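The two classification dimensions discussed above can be illustrated with Python's own class machinery, which happens to expose both an ontological and a linguistic instance-of relation. This sketch is ours, not part of the original article; the names mirror the Lassie example, and the mapping onto Python metaclasses is an illustrative assumption.

```python
class Race(type):
    """O2 metaconcept: every dog race is an ontological instance of Race."""

class CanisFamiliaris(metaclass=Race):
    """O1 concept: the domestic dog."""

class Collie(CanisFamiliaris):
    """O1 concept: Collie specializes CanisFamiliaris at the same level."""

lassie = Collie()  # O0: the real-world Lassie, modeled as an object

# Ontological dimension (content): 'instance of' across the O-levels.
assert isinstance(lassie, Collie)   # Lassie is an instance of Collie
assert type(Collie) is Race         # Collie is an instance of Race

# Linguistic dimension (form): every element is also classified by the
# modeling language itself -- here, Python's own Class machinery.
assert isinstance(Collie, type)     # Collie is, linguistically, a Class
```

Note how the same element (Collie) is classified twice at once: by its content (Race, the ontological metatype) and by its form (Class, the linguistic type), just as the article describes.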


Strahringer (1996) studied how the level hierarchies of models are built and coined the term 'metaization principle' to designate an operation that is repeatedly applied from one level to the next; that is, the primary abstraction mechanism for structuring objects into hierarchical levels. Kühne's analysis (2006) is similar to Strahringer's (1996), but uses a different distribution of elements across the levels and a different terminology. The MetaFrame methodology, presented in this article, uses the metaization principle to verify and inform users how the metamodel components of IT best practices frameworks were built.

The metaization principle most used in information systems is linguistic metamodeling. For example, the syntax of modeling languages sits at the M2 level, such as the well-known E/R (Entity/Relationship) methodology of Chen (1976), used to represent part of the objects of the real world (M0) in an E/R model (M1), where only the components of the language (entity types, relationship types, attributes etc.) can be used. By this principle, an M2 level structures the representation of the objects of the M0 level at the M1 level. In ontological metamodeling, metatypes are defined at level Mx that describe the concepts existing at level Mx-1.

2.2 Metamodel Principles and Instructions

The traditional focus of quality evaluation is on the final product; however, defects in the final product often have their roots in the initial planning and conception phases. This suggests that greater efficiency and effectiveness would be achieved if efforts were made to evaluate the quality of conceptual models. For Moody (2005), the current state of practice in evaluating the quality of conceptual models has more the character of an art than of engineering.
For conceptual modeling to progress from art to engineering, quality standards need to be defined, agreed upon and used in practice. Schütte (1998) is one of the authors who contribute to this research, through the modeling guidelines contained in the GoM (Guidelines of Modelling). The GoM is a framework for the development and evaluation of conceptual models composed of six general principles, described as follows:

1. Construction adequacy principle: a consensus must exist among specialists and users on what type of construction of a model is adequate for the problem and its purpose.

2. Language adequacy principle: the language used to create the metamodel fulfills its purpose. This principle refers to the completeness of, and the consistency between, the model and the metamodel: the model should not contain any symbol or item that has not been specified in the metamodel.

3. Economic efficiency principle: this principle places economic restrictions on the modeling task. The cost of developing a model should not surpass the gains from its use.

4. Clarity principle: this principle deals with the comprehensibility and expressiveness of the model. Among the objectives of clarity are hierarchical decomposition, the formatting (arrangement of the elements) of the model and the filtering of information. Criteria and objectives for the quality of the graphic layout of a model were defined by Tamassia (1988).


5. Systematic conception principle: this principle deals with the consistency of construction between models and is also important for the integration of models.

6. Comparability principle: this principle deals with the semantic comparison of two models according to their correspondence or similarity. This is one of the most important principles in a metamodeling environment, as metamodels are frequently used to compare and integrate models.

Goeken (2009) proposes using the principles defined by Schütte (1998) to evaluate metamodels as well, and adds three new instructions specific to evaluating metamodel quality:

Instruction 1: a metamodel reveals its metaization principle. It is important for the user of the metamodel to know which rules were used to construct the metamodel levels.

Instruction 2: a metamodel should possess a clear mapping between the universe of discourse and the words and symbols that name and describe it. Users should have no doubts about the meaning of the concepts in the metamodel.

Instruction 3: a metamodel must have semantically rich connections. The relationships between the metamodel components must be relevant and described in an expressive way.

The metamodels created with the MetaFrame methodology are verified against the principles and instructions described above.

2.3 Applications of the Metamodels

Ontological metamodels can be applied to support the analysis, adaptation, comparison and integration of IT best practices frameworks. Once the components of the metamodels are extracted, the frameworks can be examined and analyzed to understand the characteristics of their structure. This analysis contributes to the evaluation of a framework and helps in its implementation and adaptation within the organization. Other possible applications of the metamodels of IT best practices frameworks are the comparison and integration of different frameworks.
Using the same construction methodology or, in the terms of Strahringer (1996), the same metaization principle, the representation of the metamodels allows frameworks to be compared at a high, abstract level. This comparison process is an important step in the integration of frameworks. The integration of the metamodels can then guide the integration of the frameworks at a concrete, low level. However, despite the advantages of comparison and integration through metamodels, when models with different concepts are compared, difficulties caused by differences in language, such as synonyms and homonyms, arise. To resolve these problems, solutions can be drawn from database research on the comparison and integration of schemes. For Zaniolo (1982), "the mapping between different models is a formidable problem to be resolved". The author points out that the most promising approach consists of using a metamodel at the conceptual scheme level.


According to Heuser (1998), "the description of a model is called, in database terminology, the database scheme". One can regard the scheme as a textual representation of the model or metamodel, while the diagram of the E/R methodology is a graphic representation. Schemes, as textual representations of a model, have their own language, whose syntax is given by a particular grammar. A formal description of a scheme for the extended E/R methodology, which is used in this research, is given by Engels et al. (1992). Various authors have researched the comparison and integration of models and metamodels; notable among them are the works of Batini et al. (1986), Spaccapietra (1992), Teorey (1999), Conrad (2002), Rizopoulos (2005), Magnani et al. (2005), Kurpjuweit (2007) and Karagiannis (2008). According to Karagiannis (2008), the integration of models created from different metamodels can be approached through mapping on the metamodel layer, or meta2 layer. The metamodel acts as a translator between the models that were instantiated from their metamodels. For this author, the use of metamodels is the most adequate way to integrate models; however, the approaches so far have not been capable of achieving full semantic integration and interoperability, which requires the use of explicit semantic descriptions, very frequently provided in ontological form. Karagiannis (2008) cites the Gartner Group, which reports that more than 40% of companies' IT expenses go to integration problems (be they syntactic, structural or semantic). Of these expenses, around 60 to 80% of the workforce dedicated to resolving integration problems is spent on reconciling semantic heterogeneity (be it in databases, information systems etc.). The author notes that, whatever the integration problem is, it has to be represented adequately.
Diagrammatic languages such as UML and the E/R methodology, combined with metamodel concepts, are capable of expressing the syntactic, structural and semantic aspects in question. Karagiannis (2008) describes the process of lifting, or ontology anchoring, as the essence of the semantic integration of models. For the author, lifting is what is called ontological metamodeling, which is not limited to the meta1 layer but can be applied to the meta2 layer and beyond. The author observes that when two models whose forms were created by different metamodels are integrated, their semantically related components have to be found. According to the author, the combination of metamodels and ontologies is an excellent way to resolve the task of integration and interoperability, addressing all syntactic, structural and semantic heterogeneity.

2.4 Extended E/R Methodology

The Entity/Relationship (E/R) methodology, proposed by Chen (1976), was developed for the creation of conceptual and semantic models. The metamodels constructed with the MetaFrame methodology presented in this study follow the concepts and notation of an extension of the E/R methodology, formalized by Engels et al. (1992), with the objective of improving metamodel expressiveness. Figure 4 presents the main components and their notation, according to the authors cited above.


Figure 4: Components and notation of the extended E/R methodology (Engels et al., 1992).

Depending on the quantity and complexity of the objects (entity types, relationship types, attributes, constructor types), the use of a modeling strategy is important to help organize and develop the work of finding and defining the metamodel components. A modeling strategy for the extended E/R methodology is a sequence of steps that repeat themselves, producing small transformations from the initial model to the final model. The choice of strategy for constructing the model is influenced by the main source of information of the modeling process. In the literature there are four basic modeling strategies (Top-Down, Bottom-Up, Inside-Out or Middle-Out, and Mixed); however, there is no consensus among authors on which of these is the best technique. These strategies are described here following Heuser (1998) and Atzeni et al. (1999). In the Top-Down strategy, an initial model is created in which the most abstract concepts ('from above') are represented first. Afterwards, intermediary models are created gradually through the refinement of these concepts into more specific ones. The Bottom-Up strategy (from below to above) is the inverse of Top-Down (from above to below), and consists of starting from the most elementary and detailed concepts in order to construct more abstract and complex concepts. The Inside-Out (from inside


to out) or Middle-Out (from the middle out) strategy consists of considering the most important, or central, concepts first (inside) and gradually adding peripheral concepts related to them (out). The Mixed strategy is a combination of the other strategies. None of the modeling strategies presented is universally accepted; authors prescribe the use of a certain strategy, or a combination of them, depending on the specific source of information. Figure 5 shows some sources of information and recommendations on strategy use.

Figure 5: Modeling strategies by source of information. Source: the authors.

The complexity of the model depends on the types of sources of information and on the quantity of entity types to be represented. In more complex models, with more than 20 entity types, various concomitant strategies are normally used. In these cases, a high-level model is divided so that each partition can be modeled separately.

2.5 The IT best practices frameworks eSCM-SP and eSCM-CL

The eSCM-SP 2.01 (ITSqc/CMU, 2006) and eSCM-CL 1.1 (ITSqc/CMU, 2006) models were created by the ITSqc (Information Technology Services Qualification Center) at Carnegie Mellon University. The two models address 26 questions critical to the success of IT outsourcing, from the points of view of the service provider and the client organization. These critical questions are the result of literature reviews and interviews with IT service providers and clients. The eSCM-SP and eSCM-CL models are broad-ranging sets of best practices developed exclusively for managing the outsourcing of IT services. Their objective is to offer guidance to IT clients and service providers for evaluating and improving the organization's capability throughout the sourcing life cycle. The eSCM-SP also offers a standard by which service providers can differentiate themselves from their competitors. The structure of the models is composed of three dimensions: Sourcing Life Cycle, Capacity Areas and Capacity Levels. The sourcing life cycle is divided into four phases in the eSCM-SP model (Continuum, Initiation, Delivery and Closure) and five in the eSCM-CL model, with the addition of the Analysis phase before services are contracted. The phases group the practices that occur in a given part of the outsourcing life cycle.
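The three structural dimensions just described can be captured as plain data. This is a hedged sketch using the counts and phase names given in the text (the article's translated phase names, with the Analysis phase added in eSCM-CL); the dictionary layout and key names are our own illustration.

```python
# Illustrative summary of the two models' three dimensions, as stated in
# the text. Phase names follow the article's terminology; the placement of
# "Analysis" first in the eSCM-CL list is our assumption.
ESCM = {
    "eSCM-SP": {
        "practices": 84,
        "capacity_areas": 10,
        "lifecycle_phases": ["Continuum", "Initiation", "Delivery", "Closure"],
        "capacity_levels": 5,
    },
    "eSCM-CL": {
        "practices": 95,
        "capacity_areas": 17,
        "lifecycle_phases": ["Analysis", "Continuum", "Initiation",
                             "Delivery", "Closure"],
        "capacity_levels": 5,
    },
}

# The client model differs from the provider model only by the Analysis phase.
extra = set(ESCM["eSCM-CL"]["lifecycle_phases"]) - set(ESCM["eSCM-SP"]["lifecycle_phases"])
```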


The eSCM-SP 2.01 has 84 outsourcing best practices, grouped into ten capacity areas (personnel management, knowledge, performance, relationships, threats, technology, contracting, service transference, service planning and implementation, and service delivery), and the eSCM-CL model has 95 best practices grouped into 17 capacity areas (outsourcing strategy management, governance, relationship, value, organizational changes, people, knowledge, technology, threats, outsourcing opportunities, outsourcing approach, planning, evaluation of service providers, outsourcing agreements, service transference, service sourcing and conclusion of outsourcing). The capacity levels show the evolution path of the service provider's and the client's capability. They are divided into five levels in both models: 1, providing services / completing the outsourcing; 2, consistently meeting requirements / consistently managing the outsourcing; 3, managing organizational performance of outsourcing; 4, proactively enhancing value; and 5, maintaining excellence. The eSCM-SP and eSCM-CL are third-generation best practices models; that is, they were designed to be articulated with other models. In this way, they complement best practices models such as COBIT, ITIL and CMMI, which do not address in a comprehensive way all of the critical questions regarding IT outsourcing. For this reason, the eSCM-SP and the eSCM-CL were selected as the research theme of this article, which aims to create a metamodel of these frameworks using the MetaFrame methodology.

3. METHODOLOGY

For the development of this research, the hypothesis considered was that ontological metamodels, presented in section 2.1, facilitate the analysis of IT best practices frameworks, based on the restructuring, at a higher level of abstraction, of their components and on a rich logical structure and semantics of their relationships.
To test the hypothesis of this study, the data for the creation of the metamodels were collected, cleaned, organized, analyzed and presented. The official guides of the IT best practices frameworks presented in section 2.5 were used as sources of information. The process of collecting data from the official documents is similar to the data-gathering technique used in systems analysis for modeling information systems. The extended Entity/Relationship methodology of Engels et al. (1992) and the conceptual modeling strategies presented in section 2.4 were used for the organization, analysis and representation of the data as entity types, relationship types, attributes and constructors. The final purpose of this data gathering was to elaborate the conceptual metamodel of the framework. All of the procedures described above are included in the methodology created in this research, called MetaFrame, which describes a detailed process for creating and verifying the quality of metamodels of IT best practices frameworks. The objective of the MetaFrame methodology is to guarantee the quality of the metamodel and to create useful products, such as metamodel data dictionaries, to be used in the


applications of metamodels, for example in the comparison and integration of frameworks.

3.1 The MetaFrame Methodology

The aim of the MetaFrame methodology presented in this article is to create a metamodel of an IT best practices framework based on collecting and analyzing the data contained in the framework's official guides. The methodology has an iterative process for constructing the components of the metamodel, using information systems modeling and documentation techniques, and prescribes the verification of the results against quality criteria. The metamodel documentation generated by the MetaFrame methodology is important for analyzing, adapting, comparing and integrating IT frameworks, as it contains a data dictionary with the definitions of the components represented. Phase 1 of the MetaFrame methodology encompasses the preparation of the study. In this phase, objectives are defined, professionals are selected and their roles assigned, and training and the distribution of support materials to the participants are performed. Phase 2 is the execution phase, in which the data collection and the iterative process of constructing and documenting the metamodel using modeling techniques are performed. Phase 3 verifies the quality of the metamodel according to the principles and instructions presented in section 2.2, and also covers the correction and updating of the documentation generated by the methodology. A summary of the methodology is presented in Figure 6.

Figure 6: MetaFrame methodology for the creation of metamodels of IT frameworks.

As an example, some steps and procedures of Phase 1 of the MetaFrame methodology are described in the following table:


Table 1. Some steps and procedures of Phase 1 of the MetaFrame methodology.

Phase 1 (Preparation): defines the objectives of the study; preparation and allocation of resources; task planning; material distribution.

Step 1 (Definitions): defines the framework and the objectives of the study; selection of the professionals and definition of their roles; selection of the official guides.

Procedure 1 (Framework): defines the IT best practices framework that will be used in the study and its characteristics (name, work area, version, version date, etc.).

Procedure 2 (Objectives): defines the objectives of the study for creating a metamodel (to aid the installation, analysis, customization, comparison, integration or fusion of the model, creation of application systems, etc.).

Procedure 3 (Participants): selects the participants (systems analysts, business analysts, framework specialists, etc.).

Procedure 4 (Roles): defines the participants' roles or functions. The suggested roles are: systems and business analysts, documenter, framework specialist, data administrator, etc.

Procedure 5 (Official Guides): selects the official framework guides and registers their bibliographic information (title, authors, year of publication, number of chapters, number of pages, etc.) on the bibliographic data form for official guides.

Step 2 (Training): training of the participating professionals in the tasks necessary for constructing the metamodel of a framework.

Procedure 1 (E/R Methodology): training of the participating professionals in the extended E/R methodology of Engels et al. (1992).

Procedure 2 (Metamodeling): training of the participating professionals in the modeling strategies, the concepts of ontological metamodeling and the metamodel quality criteria presented in sections 2.1 and 2.2.

Procedure 3 (MetaFrame): training of the participating professionals in the MetaFrame methodology.
Procedure 4 (IT Framework): training of the participating professionals in the IT framework that will be metamodeled. This training may be a course, a lecture, a written text, etc.


The figures that follow display some of the forms and models of the MetaFrame methodology: the form for collecting component candidates, in Figure 7; the form for the official guides and metamodel components, in Figure 8; the model for verification of the metalevel of the components, in Figure 9; and the model for verification of the quality of the metamodel, in Figure 10.

Figure 7: Form for collecting the component candidates.


Figure 8: Form for the official guides and components of the metamodel.

Figure 9: Model of verification of the metalevel components.


Figure 10: Model for verification of the quality of the metamodel.

With the conclusion of the verification phase of the MetaFrame methodology, the results, or products, are ready to be published within the organization or externally. The metamodel and the explanatory summary should be released together, so that users have no questions regarding the components represented. Once the products of the methodology are ready, they can be used in the applications defined in the objectives of Phase 1, Step 1, Procedure 2.

4. RESULTS AND DISCUSSION

4.1 The Metamodel of the eSCM-SP v2.01 and eSCM-CL v1.1

Figure 11 presents the metamodel of the eSCM-SP v2.01 and eSCM-CL v1.1 frameworks, created by applying the MetaFrame methodology. Although the two frameworks have practically the same ontological metamodel, they were built separately. After the building process, since the metamodels were found to be almost identical, they were integrated into one single ontological metamodel, with the maximum cardinality represented by two numbers. For example, 84 and 95 represent the number of practices in the eSCM-SP and eSCM-CL models, respectively.


Figure 11: Metamodel of the eSCM-SP v2.01 and eSCM-CL v1.1, MetaFrame methodology. Source: the authors.

The metalevel of each entity type of the metamodel was verified using the metalevel verification form. For example, the Practice entity type, which is in O2, classifies the "pp101" concept at the O1 level of the model. In the real world, in O0, "pp101" represents the document of the organization's policy for encouraging innovation, required by that practice of the model. Using the metamodel quality verification form, it was verified that the metamodel satisfies all of the quality principles and instructions. For example, the Comparability principle, related to the semantic comparison of models according to their correspondences or similarities, is addressed: the metamodels created with the MetaFrame methodology follow the same construction mechanism and are comparable because their documentation includes a metamodel data dictionary, which allows an effective comparison of the concepts involved. The explanatory summary, provided for by the MetaFrame methodology in Procedure 2 of Step 4 of the second phase, is intended to interpret the metamodel clearly for the user. The definitions presented here were selected from the official guides of the frameworks. In the eSCM-SP and eSCM-CL models, the central entity is the Practice. A Practice in these models corresponds to a set of actions that must be completed by the provider/client of IT services so that the outsourcing relationship is successful. The eSCM-SP model has 84 practices and the eSCM-CL has 95. A Practice can depend on, or be the progression of, another Practice. In the dependence relationship, a Practice depends on the completion of another in order to be initiated. In the progression relationship, a Practice is an advanced or deeper development of another Practice at an inferior capacity level.
Each Practice contains exactly three main activities. The Activity entity represents the main activities of the models, which support, document and


implement each Practice. An Activity has one or more required activities. The Required Activity entity represents the steps necessary to constitute the Activity. Some practices (support practices) support the institutionalization of the activities required by other Practices. A Required Activity can suggest recommended activities. The Recommended Activity entity represents procedures suggested for the organization to perform, though they are not mandatory for certification in the models. A Practice develops one or more work products. The Work Product entity represents any type of documentation, tool or software created by a Practice; that is, it is a result of the Practice. A Practice employs one or more resources. The Resource entity represents all of the people, financial resources, implicit and explicit knowledge, infrastructure, systems, networks etc. A Practice involves the stakeholders that participate in its realization. The Stakeholder entity represents the staff, clients, end users, partners, suppliers, vendors and all of the people affected by the practices. A Practice assigns one or more roles. The Role entity represents the accountabilities, authorities or responsibilities attributed to a certain Practice. Organizations are certified at just one capacity level, which can range from 1 to 5. The Organization entity represents the provider organizations or IT service clients. A capacity level is granted to these organizations after the Complete Evaluation for Certification, which is the only capability determination method that grants certification. The Capability Determination Method entity thus determines the Capacity Level of the organization. Each Practice implements just one of the capacity levels. The Capacity Level entity groups from zero to, at most, 84 and 95 practices of the eSCM-SP and eSCM-CL models, respectively. At capacity level 1, the organization may have none of the model's practices implemented.
However, to receive capacity level 2 in the eSCM-SP model, for example, the organization must have at least 48 practices implemented. The practices of the models are completed within a single IT outsourcing life cycle. In the eSCM-SP model this life cycle is composed of four phases, and in the eSCM-CL model, which includes an Analysis phase, there are five. The Life Cycle entity executes at minimum one and at maximum 84 and 95 practices of the eSCM-SP and eSCM-CL models, respectively. These values are theoretical, since in practice each life cycle phase executes more than one and fewer than the total practices of the models. In both models, the practices form groupings called capacity areas. There are ten capacity areas in the eSCM-SP model and 17 in the eSCM-CL model. The Capacity Area entity groups, theoretically, from at least one to at most 84 and 95 practices of the eSCM-SP and eSCM-CL models, respectively. Each Capacity Area deals with at least one critical question of IT outsourcing. The Critical Question entity represents the critical questions of IT outsourcing, which are addressed by one or more capacity areas of the models. There are 23 questions in the eSCM-SP model and three additional questions in the eSCM-CL.


5. CONCLUSION AND FUTURE RESEARCH

The objective of this article was to present the MetaFrame methodology for creating metamodels of IT best practices frameworks and to exemplify its application with the metamodels of the eSCM-SP and eSCM-CL frameworks. The approach of creating ontological metamodels can contribute significantly to the analysis of the frameworks through their entities and relationships. Other possible applications, merely cited in this article and emerging from the analysis of the metamodels, concern the adaptation, comparison and integration of frameworks. The present research had certain limitations, such as a scope circumscribed to the identification and definition of ontological metamodels, without dealing with the creation of a domain ontology for IT best practices frameworks. Another relevant limitation is the metamodeling of two similar frameworks, eSCM-SP and eSCM-CL. In the future, we intend to apply the MetaFrame methodology to other IT best practices frameworks. In future studies, the metamodels developed with the MetaFrame methodology will be used in diverse applications. The metamodels can be used to analyze the general structure of a framework as well as its scope, completeness and coherence in relation to its stated objectives. The metamodels can also offer methodological support for the adaptation or customization of a framework to the processes and structures of an organization. For example, the metamodel can support the adaptation or implementation of a new process or practice within the framework by exhibiting the associated entity types and relationship types. Future research intends to study the comparison of IT best practices frameworks through their metamodels, which can be very useful for analyzing eventual complementary functionality.
For example, one can observe, through the metamodel and the documentation generated by the MetaFrame methodology, that ITIL does not offer metrics or other control components to the same extent as COBIT. In this case, the metamodel dictionary generated by the MetaFrame methodology would be a prerequisite for comparing the structures of two or more frameworks and for dealing with the question of synonymous and homonymous concepts. We also intend to research the possibility of applying the metamodels of IT best practices frameworks to the problem of framework integration. The term integration is used here when one wants to maintain the characteristics of each framework but, at the same time, wishes to create a common area among them. After the processes of analysis and comparison of the metamodels are performed, connections between the components of the frameworks can be found through an integration metamodel. Entity types such as Processes, Activities, Resources and Products are present in many IT best practices frameworks, with similar meanings and attributes. Other components, despite having different names, have the same meaning and can also be integrated.


Ferreira Neto, A. N., Souza Neto, J.

REFERENCES

Atkinson, C., & Kühne, T. (2003a). Model-Driven Development: A Metamodeling Foundation. IEEE Software, 20(5), 36-41.

Atkinson, C., & Kühne, T. (2003b). Calling a Spade a Spade in the MDA Infrastructure. International Workshop Metamodeling for MDA, York.

Atzeni, P., Ceri, S., Paraboschi, S., & Torlone, R. (1999). Database Systems: Concepts, Languages and Architectures. McGraw-Hill.

Batini, C., Lenzerini, M., & Navathe, S. B. (1986). A Comparative Analysis of Methodologies for Database Schema Integration. ACM Computing Surveys, 18(4), 323-364.

Chen, P. P. S. (1976). The Entity-Relationship Model: Toward a Unified View of Data. ACM Transactions on Database Systems, 1(1), 9-36.

Conrad, S. (2002). Schemaintegration - Integrationskonflikte, Lösungsansätze, aktuelle Herausforderungen. Informatik - Forschung und Entwicklung, Springer-Verlag, 17(3), 101-111.

Engels, G., Gogolla, M., Hohenstein, U., & Hulsmann, K. (1992). Conceptual Modelling of Database Applications Using an Extended ER Model. North-Holland, Amsterdam, pp. 157-204.

Goeken, M., & Alter, S. (2009). Towards Conceptual Metamodeling of IT Governance Frameworks: Approach - Use - Benefits. 42nd Hawaii International Conference on System Sciences (HICSS), pp. 1-10.

Hyder, E. B., Heston, K. M., & Paulk, M. C. (2006). The eSourcing Capability Model for Service Providers (eSCM-SP), v2.01. Pittsburgh: ITSqc. Available at <http://itsqc.cmu.edu/downloads>. Accessed June 10, 2009.

Hefley, W. E., & Loesche, E. A. (2006). The eSourcing Capability Model for Client Organizations (eSCM-CL), v1.1. Pittsburgh: ITSqc. Available at <http://itsqc.cmu.edu/downloads>. Accessed June 10, 2009.

Heuser, C. A. (1998). Projeto de Banco de Dados (6th ed.). Editora Bookman. ISBN 979-85-7780-3828.

IT Governance Institute (2005). COBIT 4.1. Available at: http://www.isaca.org/Template.cfm?Section=COBIT6&Template=/TaggedPage/TaggedPageDisplay.cfm&TPLID=55&ContentID=7981. Accessed June 14, 2009.

IT Governance Institute (2006). COBIT Mapping: Overview of International IT Guidance, 2nd Edition. ISBN 1-933284-31-5.

IT Governance Institute (2008). COBIT Mapping: Mapping of ITIL V3 with COBIT 4.1. ISBN 1-933284-31-5.


Karagiannis, D., & Höfferer, P. (2008). Metamodeling as an Integration Concept. Software and Data Technologies, Springer Berlin Heidelberg, pp. 37-50.

Karagiannis, D., & Kühn, H. (2002). Metamodeling Platforms. In A. Min Tjoa & G. Quirchmayer (Eds.), Lecture Notes in Computer Science: Vol. 2455. Proceedings of the Third International Conference EC-Web, Springer, pp. 451-464.

Kühne, T. (2006). Matters of (Meta-)Modelling. Journal on Software and Systems Modeling, 5(4), 369-385.

Kühne, T. (2005). What is a Model? In Seminar 04101: Language Engineering for Model-Driven Software Development, Dagstuhl Seminar Proceedings.

Kurpjuweit, S., & Winter, R. (2007). Viewpoint-based Meta Model Engineering. In M. Reichert, S. Strecker, & K. Turowski (Eds.), Enterprise Modelling and Information Systems Architectures - Concepts and Applications, Proceedings of the 2nd International Workshop on Enterprise Modelling and Information Systems Architectures (EMISA), pp. 143-161.

Johannsen, W., & Goeken, M. (2007). Referenzmodelle für IT-Governance. dpunkt.verlag GmbH, Heidelberg.

Magnani, M., Rizopoulos, N., McBrien, P., & Montesi, D. (2005). Schema Integration Based on Uncertain Semantic Mappings. 24th International Conference on Conceptual Modeling (ER05), Klagenfurt, Austria. Lecture Notes in Computer Science, Vol. 3716, pp. 31-46.

Moody, D. L. (2005). Theoretical and Practical Issues in Evaluating the Quality of Conceptual Models: Current State and Future Directions. Data & Knowledge Engineering, 55, 243-276.

OMG (2003). MDA Guide, Version 1.0.1. OMG document omg/03-06-01.

OMG (2004). UML - Unified Modeling Language Infrastructure Specification, Version 2.0. OMG document ptc/03-09-15.

Rizopoulos, N., & McBrien, P. (2005). A General Approach to the Generation of Conceptual Model Transformations. In Proc. CAiSE, LNCS, Springer-Verlag.

Schütte, R., & Rotthowe, T. (1998). The Guidelines of Modeling - an Approach to Enhance the Quality in Information Models. In Ling, Ram, & Lee (Eds.), Conceptual Modeling - ER '98, Singapore, November 16-19, 1998, pp. 240-254.

Spaccapietra, S., Parent, C., & Dupont, Y. (1992). Model Independent Assertions for Integration of Heterogeneous Schemas. VLDB Journal, 1(1), 81-126.

Strahringer, S. (1996). Metamodellierung als Instrument des Methodenvergleichs. Shaker Verlag, Aachen.

Teorey, T. J. (1999). Database Modeling and Design (3rd ed.). University of Michigan, Lecture Notes.


Zaniolo, C. (1982). A Formal Approach to the Definition and the Design of Conceptual Schemata for Database Systems. ACM Transactions on Database Systems, 7(1), 24-59.



JISTEM - Journal of Information Systems and Technology Management Revista de Gestão da Tecnologia e Sistemas de Informação Vol. 8, No. 3, Sept/Dec. 2011, p. 641-662 ISSN online: 1807-1775 DOI: 10.4301/S1807-17752011000300007

XLDM: AN XLINK-BASED MULTIDIMENSIONAL METAMODEL

Paulo Caetano da Silva, Federal University of Pernambuco, Brazil
Mateus Silqueira Hickson Cruz, Ceu System - São Paulo, Brazil
Valéria Cesário Times, Federal University of Pernambuco, Brazil
_____________________________________________________________________________________

ABSTRACT

The growth of data available on the Internet and the improvement of the means to handle them are important issues in designing a data model. In this context, XML provides the formalism needed to establish a standard for representing and exchanging data. Since data warehouse technologies are often used for data analysis, it is necessary to define a data cube model for XML. However, data representation in XML may give rise to syntactic, semantic and structural heterogeneity problems in XML documents, which are not considered by related approaches. Solving these problems requires the definition of a data schema. This paper proposes a metamodel to specify XML document cubes, based on relationships between elements and between XML documents. This approach solves the XML data heterogeneity problems by taking advantage of data schema definitions and of relationships defined with XLink. The methodology used provides formal rules to define the proposed concepts. This formalism is then instantiated using XML Schema and XLink. The paper also presents a case study in the medical field and a comparison with XBRL Dimensions, a financial multidimensional data model that uses XLink.

Keywords: XLDM, XML, XLink, XBRL, Multidimensional Data Metamodel

_____________________________________________________________________________________ Manuscript first received/Recebido em 28/09/2010 Manuscript accepted/Aprovado em: 16/11/2011

Address for correspondence / Endereço para correspondência

Paulo Caetano da Silva, graduated in Chemical Engineering at Bahia Federal University (1985), Master's degree in computer networks at Salvador University - UNIFACS (2003), PhD in Computer Science (database - XML area) at Pernambuco Federal University (2010). Currently, he is a professor at Salvador University - UNIFACS, in the master's course of Systems and Computation, and an analyst at the Brazil Central Bank. He has experience in computer science, focused on Software Engineering, Databases and XML, working mainly on the following themes: XBRL, OLAP for XML, information systems, web and financial information. E-mail: paulo.caetano@bcb.gov.br

Mateus Silqueira Hickson Cruz, graduated in Computer Science at Bahia Federal University (2009). Has interest in the following themes: databases, mobile application development, XML and XBRL. E-mail: mateus@ceusystem.com.br

Valéria Cesário Times, graduated at Pernambuco Catholic University (1991), Master's degree in Computer Science at Pernambuco Federal University (1994) and PhD in Computer Science at Leeds Metropolitan University (1999). Currently an adjunct professor I at Pernambuco Federal University. She has experience in Computer Science, focused on Information Systems, working mainly on the following themes: data warehouses, geographic databases, geographic information services, geographic information systems and OLAP tools. E-mail: vct@cin.ufpe

Published by/ Publicado por: TECSI FEA USP – 2011 All rights reserved.


1. INTRODUCTION

Data are usually available in several formats. XML (eXtensible Markup Language) is used to integrate them in order to achieve efficient data interchange and handling. XML, being an extensible meta-language, allows new markup languages to be defined for specific domains. Due to this extensibility, XML is used for the integration of heterogeneous data sources, which makes XML documents a rich source of information for organizational decision makers. Similarly, the use of Data Warehouse systems (Kimball & Ross, 2002) allows the identification of trends and patterns that help companies conduct their businesses better. However, the integrated use of these technologies is still under development. In order to turn XML into a technology that supports the decision-making process, its use is integrated with Data Warehouse systems. Applications and technologies derived from XML use XLink (XML Linking Language) (XML Linking, 2001) as an alternative for representing the semantics and structure of information, expressing relations between concepts that are usually defined in a schema based on XML Schema (XML Schema, 2004). Among the works that use XLink to represent data semantics, XBRL (eXtensible Business Reporting Language) (XBRL International, 2008) is an international standard for representing and publishing financial reports that uses extended links for modeling financial concepts (e.g. arithmetic operations between accounting facts). On the other hand, XML presents some problems due to its flexibility in data representation, known as heterogeneities: (i) semantic, in which similar information is represented by different names (e.g. enterprise and company) or dissimilar information is represented by the same name (e.g. virus in the informatics area and in the medical area); (ii) syntactic, where semantically equal content is represented in several ways, for example, in different languages or in several measurement units (e.g. meters and feet); and (iii) structural, where the data are organized in several structures (e.g. in different kinds of hierarchies, attributes or elements) (Näppilä, Järvelin & Niemi, 2008). This flexibility of representation is important, though it makes the usage of XML data a complex task. XLink has been used to represent semantic and structural information, expressing relations among concepts that are normally defined through XML Schema. Based on the combined use of XML Schema and XLink, this paper presents a multidimensional metamodel, entitled XLDM (XLink Multidimensional Data Metamodel), which solves the heterogeneity questions in XML and specifies data cube models for several semi-structured data applications. In the analysis of related work, no model was found that solves the heterogeneity problems in XML and can be applied to multiple domains of human knowledge. Thus, the development of a metamodel based on XML documents and relationships to solve these questions is the motivation for this paper. Elements, attributes and relationships were defined for the XLDM specification, allowing greater expressivity and enabling its applicability in different domains. The formalization of XLDM was created to allow the multidimensional data schema to be based on the definitions of XML Schema and XLink. This paper is organized as follows: Section 2 discusses the main proposals for defining XML data models, including a description of the most important dimensional


models and other approaches whose definitions have influenced the development of our work. The contributions are presented in Section 3, which includes the XLDM specification and its formalization. Section 4 presents a case study in the medical area. A comparison against a multidimensional metamodel that uses XLink for the financial area is given in Section 5, showing the breadth of the proposed solution. Finally, conclusions are presented in Section 6.

2. DATA WAREHOUSE FOR XML DATA

A Data Warehouse system architecture for complex data, using XML documents, is proposed by Boussaid, Messaoud, Choquet and Anthoard (2006). Based on the database view concept, Baril and Bellahséne (2003) present an architecture for XML data integration and a formalism for DW specification. A data model is defined, using DTDs (Document Type Definitions), to represent each view, aiming at semi-structured organized data; a Data Warehouse based on these views is then proposed. Trujillo, Luján-Mora and Song (2004) use UML class diagrams to represent Data Warehouse systems at a conceptual level. From the definition of a DTD, which represents the same multidimensional model specified by the class diagram, XML documents are generated for data exchange. Pokorny (2001) describes how to represent a star model (Kimball & Ross, 2002) in XML, proposing the XML-Star schema and using DTDs to make dimension hierarchies explicit. A dimension is modeled as a logically associated DTD sequence, resembling referential integrity in a relational database. The dimensional structures are not defined in the XML schema, leaving the understanding of data multidimensionality to the software applications. Golfarelli, Rizzi and Vrdoljak (2001) discuss a multidimensional model represented as attribute trees. They use XML Schema to express the multidimensional model through the relation between sub-elements. Nassis, Rajugan, Dillon and Rahayu (2004) propose an object-oriented approach to develop a conceptual model for DWs, named XML Document Warehouses (XDW). They also define dimensions by using XML and UML package diagrams, in order to contribute to the construction of hierarchical conceptual views. An XML repository, named xFACT, was built from the integration of object-oriented concepts with XML Schema. Jensen, Moller and Pedersen (2001) present an architecture in which data in XML and relational formats are the information sources. The data schema is represented in UML class diagrams and subsequently mapped into a relational structure. This process is defined by the authors as logical data integration. It is then possible to use OLAP query tools, saving time, since there is no physical data integration. Pedersen, Riis and Pedersen (2001) describe the benefits of combining data handled by OLAP and XML tools. Examples are also presented of queries to OLAP cubes whose dimensions have been complemented with new information acquired from XML files. This addition is made using links from the OLAP cube to XML files. The linking process between OLAP and additional data in XML builds a logical integration, avoiding the reprocessing effort caused by the inclusion of external XML data. Hümmer, Bauer and Harde (2003) define XCube, a Data Warehouse metamodel for XML documents, in which data schemas based on XML Schema are used to represent dimensions, facts and cubes. This approach has the advantages of a standardized environment, making documents (e.g. dimension documents) easier to reuse in different domains. It also allows integration with Web Services, as well as the insertion of comments in almost all elements of the documents, in different languages and with terms from specific areas of human knowledge. XCube


solves part of the heterogeneity problems in XML: (i) document structure, which is defined in schemas based on XML Schema; (ii) syntactic difference of content, which is partially solved through the units attribute, used to represent the same information in different units; nevertheless, the treatment of elements with similar contents written in different languages is not discussed; and (iii) semantic heterogeneity, which is not approached in XCube. Hernández-Ros and Wallis (2006) specified XBRL Dimensions to model hypercubes of financial data. This model is based on a vocabulary definition, specified through XML Schema, and on relationships based on XLink. These relationships express the hierarchical structure of XML elements, dimensions and their members, thus dealing with the structural heterogeneity issue. Another type of relationship found in XBRL allows the creation of labels in several languages for each vocabulary element, solving the semantic heterogeneity problem. Syntactic heterogeneity is solved through the use of a unit attribute to define the data measurement unit and through the definition of identical labels for elements that represent the same information written in different languages. However, this solution is limited to financial data representation. Even though these works discuss Data Warehouse models for XML, they differ, for several reasons, from the data metamodel specified in this paper. In Boussaid et al. (2006), a Data Warehouse system based on an XML architecture is presented, but it lacks a detailed multidimensional data model. Pokorny (2001) and Golfarelli et al. (2001) specify the multidimensional model through software applications. The papers by Baril et al. (2003), Gottlob, Koch and Pichler (2003), Pokorny (2001) and Trujillo et al. (2004) are based on DTDs or on the object-oriented paradigm. Finally, there are proposals of logical integration with the relational model: Baril et al. (2003), Jensen et al. (2001), and Pedersen et al. (2001). Besides, none of these papers has considered the use of XLink to define dimensional structures for XML. The works that can be compared to the data metamodel presented here are XCube and XBRL Dimensions. Although the former does not consider the use of XLink, it represents the cube, the dimensions and the facts using distinct schemas. The latter, the only one of the evaluated papers that uses XLink for the definition of the multidimensional model, is a solution restricted to a specific domain. Gottlob et al. (2003) define a data model based on XML documents and a set of binary relations to propose an algorithm that evaluates XPath (XPath, 2007) expressions and optimizes the queries on these documents regarding the time and storage space needed to perform them. An XML document is described as a non-classified tree, i.e., a tree with an arbitrary number of children, ordered and labeled, in which each child node is ordered and each node has a label. The document tree is represented by a set of binary relations, whose axes are the ones from the XPath language (e.g. self, child, parent, descendant). As a result, the defined data model allows navigation in XML documents, performed by the XPath language, which is the core mechanism for XML node addressing in other technologies, such as XQuery (XQuery, 2007) and XPointer (Grosso, 2003). Motivated by the use of XML applications, Barceló and Libkin (2005), Libkin and Neven (2003) and Libkin (2006) analyze query languages for XML trees, based on the same document definition given before, and present a group of definitions to handle XML data. These authors refer to several other proposals that consider XML data naturally modeled as non-classified trees, and also conceptualize an XML document. Boussaid et al. (2006) propose a technology that specifies data warehouses for star and snowflake


models' logical definition, in which the data model is based on a mathematical formalization performed through XML Schema. The concepts presented by Barceló et al. (2005), Boussaid et al. (2006), Gottlob et al. (2003), Libkin et al. (2003) and Libkin (2006) define the XML document, navigation functions and a formalization for the typical data warehouse models, i.e., the star schema and the snowflake schema. These definitions are taken into account in this paper for the multidimensional metamodel based on XML Schema and XLink. They were extended to include the relations among two or more documents through links, and new definitions are given to represent the existence of relationships between XML documents. The proposal of this paper: (i) is a metamodel that solves the heterogeneity problems in XML data; (ii) uses linkbases, i.e., sets of links, for the definition of relationships among XML document elements, in order to specify the possible data cubes; and (iii) is a metamodel that can be used in a variety of domains. In the next section, our data cube metamodel, which considers the use of the XML, XML Schema and XLink technologies, is presented.

3. A MULTIDIMENSIONAL METAMODEL BASED ON LINKS

This section presents a multidimensional metamodel for applications that use XML as a source of information. Initially, mathematical definitions are given, which allow a non-ambiguous metamodel specification (see Section 3.1). Then the set of XML documents, based on XML Schema and XLink, that composes the XLDM specification proposed here is discussed in Section 3.2.

3.1 Formal Definitions for XLDM

Definition 1: A rooted, ordered, non-classified tree is a tree with an unbounded number of children, in which each node is assigned a unique label that is an element of N* (i.e., a finite string of natural numbers). A rooted, ordered, labeled and non-classified tree T is then defined as (D, <pre), in which:

1. The element ε ∈ D (the empty string) is the root;
2. D is the set of nodes, named the tree domain, which is a subset of N* such that g ∈ D implies b ∈ D if, and only if, b <pre g. The relation <pre defines the document order, i.e., it is the prefix relation on the elements of D, with b <pre g if, and only if, the only path from the root to g goes through b.

Besides the relationships between nodes of a given XML document, relationships between nodes of two different documents are also defined. This is why the inclusion of Rσ is necessary in the XML document definition, a data structure that can be defined as follows:

Definition 2: An XML document d is represented as a 5-tuple (T, β, λ, Rχ, Rσ), in which:

1. T = (D, <pre) is the rooted, ordered, labeled and non-classified tree;
2. β is a set of tags (XML elements);
3. λ: D → β is a function that assigns an XML tag to each node of T;


4. Rχ is a set of binary relations on β, e.g. parent, child and sibling;
5. Rσ is a set of binary relations on β´ × β´´, where β´ and β´´ are groups of tags from distinct XML documents, d´ and d´´, respectively, with d´ ≠ d´´, d´ ⊂ D and d´´ ⊂ D.

Definitions 1 and 2 are exemplified in Figure 1, assuming that the trees shown in this figure represent the documents d´ and d´´. The nodes ε and b are elements which have a binary relationship χε-b ∈ Rχ, such as a parent-child relationship, and between the documents there is a relationship σb-a´ ∈ Rσ, established by the elements b and a´.

Definition 3: Let d be an XML document (T, β, λ, Rχ, Rσ), χ ∈ Rχ a relation, and ρ(β) a subset of β. The function fχ: ρ(β) → P(β) is defined by fχ(X) = {y ∈ β | ∃x ∈ X such that (x, y) ∈ χ}. The relation name can then subscript the function, as in fchild.
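To make Definitions 1-3 concrete, consider this minimal, hypothetical XML document (the element names are illustrative assumptions, not taken from the XLDM specification), annotated with its tree domain:

```xml
<!-- Hypothetical document d: its tree domain is D = {ε, 1, 2, 2·1},
     where each label is a string of natural numbers (Definition 1)
     and ε labels the root. Under Definition 3, the navigation function
     gives f_child({ε}) = {context, item} and f_child({2}) = {value}. -->
<cube>           <!-- node ε: the root -->
  <context/>     <!-- node 1: first child of the root -->
  <item>         <!-- node 2: second child of the root -->
    <value/>     <!-- node 2·1: first child of node 2 -->
  </item>
</cube>
```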

Figure 1. Relationships between the documents d´ and d´´ and among their elements.

Definition 4: Let d´ and d´´ be two XML documents (T´, β´, λ´, Rχ´, Rσ´) and (T´´, β´´, λ´´, Rχ´´, Rσ´´), respectively. For a relation σ ∈ Rσ´ in d´, the function gσ: P(β´) → P(β´´) is defined by gσ(X) = {y ∈ β´´ | ∃x ∈ X such that (x, y) ∈ σ}.

An XML document consists of element structures, which contain sub-elements and attributes. The attributes are added to the elements in opening declarations (tags). Between an opening and a closing tag there may be any number of sub-elements. Attributes can be used to make references among elements or between elements and other XML documents. According to these properties, the following definitions are used to represent the data cube metamodel proposed in this paper.

Definition 5: Let (F, S) be a data warehouse (DW) schema, where F is a set of facts having m measures, {F.Mq, 1 ≤ q ≤ m}, and S = {Ss, 1 ≤ s ≤ r} is a set of r independent dimensions, where each Ss contains a group of i domains, {Ss.Ij, 1 ≤ j ≤ i}, and each Ij contains a group of n members, {Ij.Np, 1 ≤ p ≤ n}. The (F, S) schema is composed of schema and linkbase documents:

1. F defines a set of fact tags;
2. S defines a set of dimension tags;
3. I defines a set of domain tags of a dimension;
4. N defines a set of member tags of a domain;
5. H defines a set of hypercube tags;
6. M defines a set of measures for a fact;
7. L is a set of linkbase documents, each represented as a 5-tuple (T, β, λ, Rχ, Rσ);
8. ∀s ∈ {1,...,r}, Ss defines elements associated to facts f ∈ F;


9. ∀s ∈ {1,...,r} and ∀j ∈ {1,...,i}, Ss.Ij defines relationships between dimensions and domains;
10. ∀j ∈ {1,...,i} and ∀p ∈ {1,...,n}, Ij.Np defines relationships between domains and members.

Since XLink allows types of relationships between elements to be expressed through the attribute xlink:arcrole, this attribute is used in the following definition of the relationships between elements representing members, domains, dimensions, facts and cubes.

Definition 6: Let l ∈ L be a linkbase (T, β, λ, Rχ, Rσ). The following relations are defined in l:

1. domain-member: ∀n ∈ N, fχ(n) = {i ∈ I | ∃n ∈ N such that (n, i) ∈ χ};
2. dimension-domain: ∀i ∈ I, fχ(i) = {s ∈ S | ∃i ∈ I such that (i, s) ∈ χ};
3. hypercube-dimension: ∀s ∈ S, fχ(s) = {h ∈ H | ∃s ∈ S such that (s, h) ∈ χ};
4. all and not-all: ∀n ∈ N, fχ(n) = {f ∈ F | ∃n ∈ N such that (n, f) ∈ χ}.

Figure 2 shows how the relationships of Definition 6 can be established. Each circle represents a node, which corresponds to an XML element. The lines that connect the nodes represent the possible relationship types that can exist between the elements. The domain-member relationship connects elements that are part of a domain. The dimension-domain relationship links the domain to the dimension. The hypercube-dimension relationship connects the fact to the dimension. Lastly, the relationships all and not-all state, for a given fact, whether or not all domain members are part of the hypercube. These relationships are established in the XLDM metamodel through linkbases.
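As a sketch only, the arcs of Definition 6 could be serialized in an XLink linkbase along the following lines. All concrete names (salesCube, time, yearDomain, year2011, the href targets, the definitionLink/definitionArc element names and the arcrole URI prefix) are assumptions for illustration, not taken from the XLDM specification:

```xml
<linkbase xmlns:xlink="http://www.w3.org/1999/xlink">
  <definitionLink xlink:type="extended">
    <!-- locators pointing at vocabulary elements declared in the schemas -->
    <loc xlink:type="locator" xlink:href="hypercube-schema.xsd#salesCube" xlink:label="cube"/>
    <loc xlink:type="locator" xlink:href="dimension-schema.xsd#time" xlink:label="dim"/>
    <loc xlink:type="locator" xlink:href="dimension-schema.xsd#yearDomain" xlink:label="dom"/>
    <loc xlink:type="locator" xlink:href="dimension-schema.xsd#year2011" xlink:label="mem"/>
    <!-- hypercube-dimension: salesCube is analyzed along the time dimension -->
    <definitionArc xlink:type="arc" xlink:from="cube" xlink:to="dim"
                   xlink:arcrole="http://example.org/arcrole/hypercube-dimension"/>
    <!-- dimension-domain: time draws its values from yearDomain -->
    <definitionArc xlink:type="arc" xlink:from="dim" xlink:to="dom"
                   xlink:arcrole="http://example.org/arcrole/dimension-domain"/>
    <!-- domain-member: year2011 is a member of yearDomain -->
    <definitionArc xlink:type="arc" xlink:from="dom" xlink:to="mem"
                   xlink:arcrole="http://example.org/arcrole/domain-member"/>
  </definitionLink>
</linkbase>
```

Chaining further domain-member arcs from one member to another is what builds the dimension hierarchies discussed below.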

Figure 2. Relationships between the trees that represent the documents d´ and d´´.

Definition 7: Let d be an XML document (T, β, λ, Rχ, Rσ). A hypercube is defined as {∀h ∈ H | H ⊆ d} and {∀f ∈ F | f ⊆ d}, and there is a function fχ(f) = {h ∈ H | ∃f ∈ F such that (f, h) ∈ χ}.

The XML formalism allows the insertion of multi-level sub-elements in an XML element and the establishment of relationships, through XLink, which define hierarchies among elements. Thus, the definition of the domain-member relationship allows hierarchies to be built in a dimension. For example, a dimension country can have a domain-member relationship with the element brazil, which, in turn, can


have the same kind of relationship with other elements, e.g. brazil and south, brazil and northeast. Based on the definitions discussed in this section, the documents that compose the XLDM metamodel specification were created. The XLDM document specifications are discussed next.

3.2 XLDM Documents

Due to the inherent flexibility of the XML technology, different data cube metamodels for data warehouses based on XML can be specified, which makes the XML data heterogeneity problems evident. However, the use of XLink and XML Schema can solve such problems through the specification of dimensions, facts and cubes. The proposed multidimensional metamodel is based on the definitions presented in Section 3.1. To this end, the instance-schema.xsd and linkbaseschema.xsd documents have been specified based on these definitions and are available at http://www.cin.ufpe.br/~pcs3/XLDM/Spec. An XML database that uses XLink is made of schemas, linkbases and XML instances, i.e., XML documents containing the data. The schemas specify the elements that represent the facts, the dimensions, the dimension members and the cubes. The linkbases define the relationships between members, dimensions and facts, establishing the combinations of cubes that can exist. In the instance, the facts occur and, combined with dimension members, determine a data cube. The XML instance document, which may contain one or more cubes, has a dimensional structure, in which the contexts are presented with the dimension members. There is also a non-dimensional structure, with the measures of the facts. Figure 3 shows the UML component diagram (Unified Modeling Language, 2005) for the XLDM metamodel proposed in this section. This figure illustrates the data organization according to this model. Based on XML Schema, the vocabulary, i.e. the set of elements to be used in the XML instance, is specified. The relationships among the instance elements, and between them and other resources, are expressed in linkbases. The specified data types are common to a variety of domains; this was done in order to broaden the model's applicability. However, it is possible to create types for a specific domain. The declarations of the attributes and elements that can be found in XLDM instances are made in instance-schema.xsd. An element declaration of particular importance, the instance root element, is shown in Listing 1. The presence of this element and its children in the instance is based on Definitions 1, 2 and 3. Initially, the element naming is performed. In XLDM, two alternatives are given for the identification of this element, so that it can be named according to the domain to which it is being applied: (1) changing its declaration in the instance-schema.xsd document; or (2) without changing its declaration in the instance-schema.xsd document, a label can be created for this element, specifying it in the Label linkbase and creating a relationship between two documents (Rσ, Definitions 2 and 4). Thus, the instance xldm element has a label, for the application domain, specified in the Label linkbase. Next, there is the element description. After this, the declarations of the references to schemas, linkbases, roles and arcroles are performed. To do so, the schemaRef, linkbaseRef, roleRef and arcroleRef elements were specified to allow the establishment of relationships among documents (Rσ), as shown in Definitions 2 and 4. A significant characteristic is the obligatoriness of two linkbases (entitled Definition and Label, which will be discussed later). Thus, the

JISTEM, Brazil Vol.8, No. 3, Sept/Dec. 2011, pp. 641-662

www.jistem.fea.usp.br


XLDM: An XLink-Based Multidimensional Metamodel

minimum number of occurrences of the linkbaseRef element is defined as two. Finally, the elements that can occur in the instance are declared, such as item and tuple. The Fact Schema and the Hypercube Schema are created from the instance-schema.xsd definitions and define, respectively, the elements that represent the facts and the dimension members. For organization purposes, these schemas can be specified in the same document or in distinct documents. In the instance documents, the presence of the contextRef and unitRef attributes is mandatory in the elements that represent the facts. They refer to the dimensional context and to the fact unit, represented by the context and unit elements found in the multidimensional structure. The declaration of these two attributes and elements establishes the binary relations (Rχ) described in Definitions 2 and 3, and their use is illustrated in the extract of the XLDM instance document shown in Listing 11.
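To make the role of contextRef and unitRef concrete, the following Python sketch resolves a fact against its context using the standard library's ElementTree. The element names mirror Listing 11, but the fragment is a simplified, namespace-free illustration, not part of the XLDM specification.

```python
import xml.etree.ElementTree as ET

# A minimal XLDM-like instance fragment (element names modeled on
# Listing 11; namespaces omitted for brevity).
INSTANCE = """
<xldm>
  <context id="Patient1_Hypertension">
    <entity>
      <segment>
        <explicitMember dimension="PatientDimension">Patient1</explicitMember>
        <explicitMember dimension="DiseaseDimension">Hypertension</explicitMember>
      </segment>
    </entity>
  </context>
  <unit id="mg_day"/>
  <Dosage contextRef="Patient1_Hypertension" unitRef="mg_day">50</Dosage>
</xldm>
"""

def resolve_fact(root, fact_tag):
    """Return (value, unit id, {dimension: member}) for a fact element,
    following its contextRef attribute to the matching context element."""
    fact = root.find(fact_tag)
    context = root.find(".//context[@id='%s']" % fact.get("contextRef"))
    members = {m.get("dimension"): m.text
               for m in context.iter("explicitMember")}
    return fact.text, fact.get("unitRef"), members

root = ET.fromstring(INSTANCE)
value, unit, members = resolve_fact(root, "Dosage")
print(value, unit, members)
```

This illustrates the binary relations (Rχ) of Definitions 2 and 3: the fact carries only references, and the dimensional coordinates are recovered from the context.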


Silva, P. C. da, Cruz, M. S. H., Times, V. C.

Figure 3. The XLDM Multidimensional Metamodel.

Listing 1. Definition of an XLDM Root Element

<element name="xldm">
. . .
<complexType>
  <sequence>
    <element ref="link:schemaRef" minOccurs="1" maxOccurs="unbounded"/>
    <element ref="link:linkbaseRef" minOccurs="2" maxOccurs="unbounded"/>
    <element ref="link:roleRef" minOccurs="0" maxOccurs="unbounded"/>
    <element ref="link:arcroleRef" minOccurs="0" maxOccurs="unbounded"/>
    <choice minOccurs="0" maxOccurs="unbounded">
      <element ref="xdmi:item"/>
      <element ref="xdmi:tuple"/>
      <element ref="xdmi:context"/>
      <element ref="xdmi:unit"/>
      <element ref="link:footnoteLink"/>
    </choice>
  </sequence>
</complexType>
</element>
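A minimal structural check of the root-element rule above can be sketched in Python. The unqualified element names and the check itself are illustrative assumptions (a real instance would use the qualified names of Listing 1):

```python
import xml.etree.ElementTree as ET

# Hypothetical check of the structural rule from Listing 1: an XLDM
# instance root must carry at least one schemaRef and at least two
# linkbaseRef children (the Definition and Label linkbases are mandatory).
def check_root(xml_text):
    root = ET.fromstring(xml_text)
    n_schema = len(root.findall("schemaRef"))
    n_linkbase = len(root.findall("linkbaseRef"))
    return n_schema >= 1 and n_linkbase >= 2

ok = check_root("""<xldm>
  <schemaRef href="treatment.xsd"/>
  <linkbaseRef href="definition.xml"/>
  <linkbaseRef href="label.xml"/>
</xldm>""")

bad = check_root("""<xldm>
  <schemaRef href="treatment.xsd"/>
  <linkbaseRef href="definition.xml"/>
</xldm>""")
print(ok, bad)
```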

Linkbases are defined to conform to the relationships that can be present in a great variety of domains. The definitions occur by adding role specifications for the arcrole attribute, besides the inclusion of elements and attributes. Listing 2 shows the multiplication-item arcrole specification and, in Listing 3, the Description linkbase


definition is illustrated. The linkbases proposed in XLDM will be discussed next, and their uses are illustrated in Section 4.

Listing 2. The multiplication-item Arcrole definition

<arcroleType id="multiplication-item" cyclesAllowed="undirected"
    arcroleURI="http://www.example.br/arcrole/multiplication-item">
  <definition>Target (a primary item declaration) is a multiplication factor which composes the value of the source (a primary item declaration)</definition>
  <usedOn>calculationArc</usedOn>
</arcroleType>

Listing 3. The Description Linkbase definition

<element name="descriptionLink" substitutionGroup="xl:extended">
  <annotation>
    <documentation>descriptionLink element definition.</documentation>
  </annotation>
  <complexType>
    <complexContent>
      <restriction base="xl:extendedType">
        <choice minOccurs="0" maxOccurs="unbounded">
          <element ref="xl:title"/>
          <element ref="link:documentation"/>
          <element ref="link:loc"/>
          <element ref="link:descriptionArc"/>
        </choice>
        . . .
      </restriction>
    </complexContent>
  </complexType>
</element>

1. The Definition linkbase, which is mandatory, allows the creation of the hierarchical element structure and thus solves the problem of structural heterogeneity. For the structural aspect, the definition of the relationships among dimensions, members and cubes is performed by the Definition linkbase. The arcrole domain-member lists the possible members of a domain, which is associated with its dimension through the arcrole dimension-domain. The cross product between the dimensions and the facts, establishing the possible cubes to be used in the instance, is defined by the arcrole hypercube-dimension. To include the measures in the cube, the arcrole all is used, and to exclude a member of a domain from a cube specification, the arcrole notAll is used. These relations, based on Definitions 5, 6 and 7, determine the relationships among members, dimensions, facts and cubes and the possible values for the arcrole attribute, so that a multidimensional data model may be created. These relationships are shown in Figure 2. Besides these relationships, illustrated in Listing 6, other relationships are also specified. They are defined with the following arcroles: (a) main, which defines the relationship between one concept and another as the main one, e.g., in a model for disease treatment, a medical procedure is defined as the main one for treating a disease; (b) secondary, which defines a relationship as secondary. In the disease treatment example, there may be the main procedure and secondary ones; the attribute order then indicates the order in which the secondary procedures should be performed; (c) substitution, which determines the possibility of substituting one concept for another, e.g., a medical procedure can be replaced by another one. The attribute order indicates the order in which the concepts can be replaced, e.g., the order in which the procedures replace the main one;


2. The Label linkbase allows the use of different labels for the same element, which can be specified in different languages through the attribute xml:lang. This linkbase is mandatory, so that semantic and syntactic heterogeneities can be avoided;
3. The Ordering linkbase is an optional linkbase defined to determine not only the elements' presentation order in the instance, but also their processing order, which can differ from the presentation order. For the definition of links that specify presentation orders, the extended link element presentationLink is used; for processing purposes, the element processingLink is used;
4. The Description linkbase is another optional linkbase, introduced in this metamodel in order to supply a textual description for a relationship. For example, the relationship between a disease and its description is represented by arcs of this linkbase. Placing the descriptions in a separate linkbase contributes to the modularity of the model;
5. The Reference linkbase is also optional and is used to define elements that represent references;
6. The Calculation linkbase expresses arithmetic relations. It was defined so that, besides the sum operation, the arithmetic operations of multiplication, division, exponentiation and n-th root can be specified. To do so, values are defined for the arcrole attribute. Listing 4 shows the possible arcroles for this linkbase, which depend on the arithmetic operation. Attributes are also defined, with proper domains, for each kind of operation. For example, the attribute weight changes its domain according to the operation. For sums, the domain varies from -1 to 1, which means that the value is completely or partially used in the addition that results in the parent element value; for multiplication, the domain of this attribute is the set of real numbers. For the exponentiation and n-th root operations, there are the attributes exponent and index, whose domain is the set of natural numbers.
For division, only the arcrole values are used in the numerator and denominator specification. By using XLDM, the heterogeneity questions of XML data mentioned in Section 1 are solved as follows: (i) for semantic heterogeneity, the Label linkbase establishes one or more names for an element defined in the schema. Therefore, a single element can have several names, and distinct elements in different domains can have the same name, which allows an application to perform the processing through the element itself or through its label; (ii) for syntactic heterogeneity, the Label linkbase allows the definition of names in different languages for the same element, and the attribute unit allows the unit of the measure to be informed; and (iii) for structural heterogeneity, the schema defines the elements, their attributes and child elements, while the Definition linkbase specifies the hierarchy among the elements, determining the XML document structure.

Listing 4. The arcrole attribute in the Calculation Linkbase

<link:calculationArc xlink:type="arc"
  xlink:arcrole="http://www.example.br/arcrole/summation-item"
  xlink:from="A" xlink:to="B" weight="1.0"/>
Meaning: the total value of concept B contributes to the formation of concept A.

<link:calculationArc xlink:type="arc"
  xlink:arcrole="http://www.example.br/arcrole/multiplication-item"
  xlink:from="A" xlink:to="B" weight="3.0"/>
Meaning: three times the value of B is a multiplication factor to compose the value of A.

<link:calculationArc xlink:type="arc"
  xlink:arcrole="http://www.example.br/arcrole/numerator-item"
  xlink:from="A" xlink:to="B"/>
Meaning: the value of B is the numerator of the division that composes the value of A.

<link:calculationArc xlink:type="arc"
  xlink:arcrole="http://www.example.br/arcrole/denominator-item"
  xlink:from="A" xlink:to="C"/>
Meaning: the value of C is the denominator of the division that composes the value of A.

<link:calculationArc xlink:type="arc"
  xlink:arcrole="http://www.example.br/arcrole/exponentiation-item"
  xlink:from="A" xlink:to="B" exponent="3.0"/>
Meaning: the value of A corresponds to the value of B raised to the third power.

<link:calculationArc xlink:type="arc"
  xlink:arcrole="http://www.example.br/arcrole/nthroot-item"
  xlink:from="A" xlink:to="B" index="2.0"/>
Meaning: the value of A corresponds to the n-th root of B.

4. CASE STUDY

In order to demonstrate the applicability of the proposed metamodel, a case study is presented in which XLDM is applied to medical data represented in XML documents. A different example, dealing with sales data, is also available at www.cin.ufpe.br/~pcs3/XLDM.

4.1 XLDM Application

Figure 4 shows the UML component diagram for the data cube model used in this case study. The TreatmentCube hypercube has four dimensions: Patient, Procedure, Medication and Disease. They are related to the hypercube through hypercube-dimension arcroles. There is also the relationship between the cube and the Dosage measure, made with the use of the arcrole all. The schema created is shown in Listing 5. It contains the definition of the data cube TreatmentCube, the dimension PatientDimension, the domain PatientDomain, and a member of this domain (Patient1). Finally, the specification of the Dosage measure is made. The use of the attribute abstract in some elements indicates that they are used only for structural organization purposes and cannot appear in the instance.

Listing 5. Schema Document

<element id="TreatmentCube" name="TreatmentCube" type="xldmi:stringItemType" abstract="true" substitutionGroup="xldmdt:hypercubeItem"/>
<element id="PatientDimension" abstract="true" name="PatientDimension" type="xldmi:stringItemType" substitutionGroup="xldmdt:dimensionItem"/>
<element id="PatientDomain" name="PatientDomain" type="xldmi:stringItemType" abstract="true" substitutionGroup="xldmi:item"/>
<element id="Patient1" name="Patient1" type="xldmi:stringItemType" substitutionGroup="xldmi:item"/>
<element id="Dosage" name="Dosage" type="xldmi:stringItemType" substitutionGroup="xldmi:item"/>


Figure 4. The Data Model for the Treatment Cube.

Listing 6 illustrates some elements of the Definition linkbase, in which the hierarchical relations are defined. The arcrole all is used for the relationship between the TreatmentCube cube and the Dosage measure. The dimensions are linked to the cube through the arcrole hypercube-dimension, and each dimension is related to its domain through the arcrole dimension-domain. Finally, the arcrole domain-member provides the representation of the hierarchies in the dimensions.

Listing 6. Definition Linkbase

<definitionLink xlink:type="extended" xlink:role="http://www.example.br/xldm/treatment">
  <loc xlink:type="locator" xlink:label="lbl_TreatmentCube" xlink:href="treatment.xsd#TreatmentCube"/>
  <loc xlink:type="locator" xlink:label="lbl_Dosage" xlink:href="treatment.xsd#lbl_Dosage"/>
  <definitionArc xlink:type="arc" xlink:arcrole="http://www.example.br/arcrole/all" xlink:from="lbl_Dosage" xlink:to="lbl_TreatmentCube"/>
  <loc xlink:type="locator" xlink:label="lbl_PatientDimension" xlink:href="treatment.xsd#lbl_PatientDimension"/>
  <definitionArc xlink:type="arc" xlink:arcrole="http://www.example.br/arcrole/hypercube-dimension" xlink:from="lbl_TreatmentCube" xlink:to="lbl_PatientDimension"/>
  <loc xlink:type="locator" xlink:label="lbl_PatientDomain" xlink:href="treatment.xsd#lbl_PatientDomain"/>
  <definitionArc xlink:type="arc" xlink:arcrole="http://www.example.br/arcrole/dimension-domain" xlink:from="lbl_PatientDimension" xlink:to="lbl_PatientDomain"/>
  <loc xlink:type="locator" xlink:label="lbl_Patient1" xlink:href="treatment.xsd#lbl_Patient1"/>
  <definitionArc xlink:type="arc" xlink:arcrole="http://www.example.br/arcrole/domain-member" xlink:from="lbl_PatientDomain" xlink:to="lbl_Patient1"/>
</definitionLink>
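The arcs of a Definition linkbase such as Listing 6 can be read programmatically by grouping definitionArc elements per arcrole. The sketch below is a simplified illustration (unqualified definitionArc elements, concept names instead of label/locator indirection), not a complete linkbase processor:

```python
import xml.etree.ElementTree as ET

XLINK = "{http://www.w3.org/1999/xlink}"

# A reduced Definition linkbase in the spirit of Listing 6.
LINKBASE = """
<definitionLink xmlns:xlink="http://www.w3.org/1999/xlink">
  <definitionArc xlink:arcrole="http://www.example.br/arcrole/hypercube-dimension"
                 xlink:from="TreatmentCube" xlink:to="PatientDimension"/>
  <definitionArc xlink:arcrole="http://www.example.br/arcrole/dimension-domain"
                 xlink:from="PatientDimension" xlink:to="PatientDomain"/>
  <definitionArc xlink:arcrole="http://www.example.br/arcrole/domain-member"
                 xlink:from="PatientDomain" xlink:to="Patient1"/>
</definitionLink>
"""

def arcs_by_role(root):
    """Group (from, to) pairs of definitionArc elements by the last
    segment of their xlink:arcrole URI."""
    out = {}
    for arc in root.iter("definitionArc"):
        role = arc.get(XLINK + "arcrole").rsplit("/", 1)[-1]
        out.setdefault(role, []).append(
            (arc.get(XLINK + "from"), arc.get(XLINK + "to")))
    return out

arcs = arcs_by_role(ET.fromstring(LINKBASE))
print(arcs)
```

Traversing hypercube-dimension, then dimension-domain, then domain-member reconstructs exactly the cube-to-member hierarchy described in the text.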

The Label linkbase, used to create labels for the members, can be seen in Listing 7. In this case, the label specifications are used to represent the medicine


commercial names and to supply the ICD-10 code, an international standard for diseases. In the example illustrated in Listing 7, the attribute xml:lang indicates that the drug name is specified in English. These relations use the arcrole concept-label.

Listing 7. Label Linkbase

<labelLink xlink:type="extended" xlink:role="http://www.example.br/xldm/treatment">
  <loc xlink:type="locator" xlink:label="lbl_Captopril" xlink:href="treatment.xsd#Captopril"/>
  <label xlink:type="resource" xml:lang="en" xlink:label="lbl_Captopril_Commercial" xlink:role="http://www.example.br/role/label">Capoten</label>
  <labelArc xlink:type="arc" xlink:arcrole="http://www.example.br/arcrole/concept-label" xlink:from="lbl_Captopril" xlink:to="lbl_Captopril_Commercial"/>
</labelLink>

In the medical area, it is necessary to specify an order for performing certain procedures during the treatment of a disease. In the proposed data model, this can be expressed by representing the ordering with the Ordering linkbase. This linkbase can be seen in Listing 8, which uses a processingLink element to define that the first element to be processed is the cube, followed by the dimension, the domain and the member. This is done through the numerical value of the attribute order.

Listing 8. Ordering Linkbase

<processingLink xlink:type="extended" xlink:role="http://www.example.br/xldm/treatment">
  <loc xlink:type="locator" xlink:label="lbl_TreatmentCube" xlink:href="treatment.xsd#lbl_TreatmentCube"/>
  <loc xlink:type="locator" xlink:label="lbl_PatientDimension" xlink:href="treatment.xsd#lbl_PatientDimension"/>
  <processingArc xlink:type="arc" xlink:arcrole="http://www.example.br/arcrole/parent-child" xlink:from="lbl_TreatmentCube" xlink:to="lbl_PatientDimension" order="1"/>
  <loc xlink:type="locator" xlink:label="lbl_PatientDomain" xlink:href="treatment.xsd#lbl_PatientDomain"/>
  <processingArc xlink:type="arc" xlink:arcrole="http://www.example.br/arcrole/parent-child" xlink:from="lbl_PatientDimension" xlink:to="lbl_PatientDomain" order="2"/>
  <loc xlink:type="locator" xlink:label="lbl_Patient1" xlink:href="treatment.xsd#lbl_Patient1"/>
  <processingArc xlink:type="arc" xlink:arcrole="http://www.example.br/arcrole/parent-child" xlink:from="lbl_PatientDomain" xlink:to="lbl_Patient1" order="3"/>
</processingLink>
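Recovering the processing sequence declared in such an Ordering linkbase amounts to sorting the processingArc elements by their order attribute. A minimal Python illustration, with the loc elements omitted and concept names used directly for brevity:

```python
import xml.etree.ElementTree as ET

XLINK = "{http://www.w3.org/1999/xlink}"

# Arcs deliberately out of document order: the order attribute,
# not document position, drives the processing sequence.
LINKBASE = """
<processingLink xmlns:xlink="http://www.w3.org/1999/xlink">
  <processingArc xlink:from="PatientDimension" xlink:to="PatientDomain" order="2"/>
  <processingArc xlink:from="TreatmentCube" xlink:to="PatientDimension" order="1"/>
  <processingArc xlink:from="PatientDomain" xlink:to="Patient1" order="3"/>
</processingLink>
"""

def processing_sequence(root):
    """Return arc targets sorted by the numeric order attribute."""
    arcs = sorted(root.iter("processingArc"), key=lambda a: int(a.get("order")))
    return [a.get(XLINK + "to") for a in arcs]

seq = processing_sequence(ET.fromstring(LINKBASE))
print(seq)
```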

The Description linkbase is used to provide textual descriptions related to a certain concept. Listing 9 shows the description made for an element that represents the disease Rheumatic Fever.


Listing 9. Description Linkbase

<descriptionLink xlink:type="extended" xlink:role="http://www.example.br/xldm/treatment">
  <loc xlink:type="locator" xlink:label="lbl_RheumaticFever" xlink:href="treatment.xsd#RheumaticFever"/>
  <description xlink:type="resource" xlink:role="http://www.example.br/role/description" xlink:label="desc_RheumaticFever">
    Rheumatic fever is an inflammatory disease that may develop two to three weeks after a Group A streptococcal infection (such as strep throat or scarlet fever). It is believed to be caused by antibody cross-reactivity and can involve the heart, joints, skin, and brain. Acute rheumatic fever commonly appears in children ages 5 through 15, with only 20% of first-time attacks occurring in adults.
  </description>
</descriptionLink>

Listing 10 shows the composition of a medicine, making evident the relationship between the formula components and the medicine by using the Calculation linkbase. This linkbase states that the medicine Hydralazine is composed of 5% Sodium Nitroprussiate and 25% Isotonic Glucose Solution.

Listing 10. Calculation Linkbase

<calculationLink xlink:type="extended" xlink:role="http://www.example.br/treatment">
  <loc xlink:type="locator" xlink:label="lbl_Hydralazine" xlink:href="treatment.xsd#Hydralazine"/>
  <loc xlink:type="locator" xlink:label="lbl_SodiumNitroprussiate" xlink:href="treatment.xsd#SodiumNitroprussiate"/>
  <calculationArc xlink:type="arc" xlink:arcrole="http://www.example.br/arcrole/summation-item" xlink:from="lbl_Hydralazine" xlink:to="lbl_SodiumNitroprussiate" weight="0.05"/>
  <loc xlink:type="locator" xlink:label="lbl_IsotonicGlucoseSolution" xlink:href="treatment.xsd#IsotonicGlucoseSolution"/>
  <calculationArc xlink:type="arc" xlink:arcrole="http://www.example.br/arcrole/summation-item" xlink:from="lbl_Hydralazine" xlink:to="lbl_IsotonicGlucoseSolution" weight="0.25"/>
</calculationLink>
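Evaluating the summation-item arcs of such a Calculation linkbase is a weighted sum over the arc targets. The sketch below illustrates this reading of the semantics with hypothetical component quantities; it is an illustration, not part of XLDM:

```python
import xml.etree.ElementTree as ET

XLINK = "{http://www.w3.org/1999/xlink}"

# Reduced Calculation linkbase in the spirit of Listing 10
# (locator indirection dropped; concept names used directly).
LINKBASE = """
<calculationLink xmlns:xlink="http://www.w3.org/1999/xlink">
  <calculationArc xlink:arcrole="http://www.example.br/arcrole/summation-item"
                  xlink:from="Hydralazine" xlink:to="SodiumNitroprussiate" weight="0.05"/>
  <calculationArc xlink:arcrole="http://www.example.br/arcrole/summation-item"
                  xlink:from="Hydralazine" xlink:to="IsotonicGlucoseSolution" weight="0.25"/>
</calculationLink>
"""

def weighted_sum(root, source, values):
    """Each summation-item arc contributes weight * target value
    to the source concept."""
    total = 0.0
    for arc in root.iter("calculationArc"):
        if (arc.get(XLINK + "from") == source and
                arc.get(XLINK + "arcrole").endswith("summation-item")):
            total += float(arc.get("weight")) * values[arc.get(XLINK + "to")]
    return total

# Hypothetical component quantities, just to exercise the arcs.
total = weighted_sum(ET.fromstring(LINKBASE), "Hydralazine",
                     {"SodiumNitroprussiate": 100.0,
                      "IsotonicGlucoseSolution": 100.0})
print(total)
```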

An extract of the instance document is illustrated in Listing 11, showing a data cube in which the context Patient1_Hypertension is defined. In this context, members of the dimensions PatientDimension, ProcedureDimension, MedicationDimension and DiseaseDimension are given. The temporal view of the fact is established by the element period, with sub-elements for the starting and ending dates for which the fact is valid; a temporal hierarchy can be established by other sub-elements, e.g., elements for year or semester. The element unit defines the measure unit. In the example, the unit of the fact is milligrams per day, i.e., for Patient1, who is undergoing the fundoscopy procedure for the disease hypertension during the period from January 1st, 2007 to December 31st, 2007, the daily dose of the medicine Captopril is 50 mg.

Listing 11. XLDM Instance

<context id="Patient1_Hypertension">
  <entity>
    <segment>
      <xldmdi:explicitMember dimension="treat:PatientDimension">treat:Patient1</xldmdi:explicitMember>
      <xldmdi:explicitMember dimension="treat:ProcedureDimension">treat:Fundoscopy</xldmdi:explicitMember>
      <xldmdi:explicitMember dimension="treat:MedicationDimension">treat:Captopril</xldmdi:explicitMember>
      <xldmdi:explicitMember dimension="treat:DiseaseDimension">treat:Hypertension</xldmdi:explicitMember>
    </segment>

  </entity>
  <period><startDate>2007-01-01</startDate><endDate>2007-12-31</endDate></period>
</context>
<unit id="mg_day">
  <divide>
    <unitNumerator>mg</unitNumerator>
    <unitDenominator>day</unitDenominator>
  </divide>
</unit>
<treat:Dosage contextRef="Patient1_Hypertension" unitRef="mg_day">50</treat:Dosage>

5. A COMPARATIVE ANALYSIS BETWEEN XLDM AND XBRL DIMENSIONS

Since XLDM and XBRL Dimensions use the same technologies, XML Schema and XLink, it is important to highlight the differences between them. This section compares an application of XLDM to financial indexes with XBRL Dimensions. Besides having broader applicability than XBRL Dimensions, XLDM also has greater expressivity and is easier to use. An example showing that the use of XLDM in the financial field can be more expressive than XBRL is given by the arcroles proposed for the Calculation linkbase. Listing 12 shows the specification of a financial index named ExposureRiskIndex, formed by the ratio between ExposureValue and CreditRiskCapitalRequirements. This element was extracted from the XBRL taxonomy of the COREP project (Boixo & Flores, 2005), an initiative of the CEBS (Committee of European Banking Supervisors) to provide a framework of financial reports for some institutions of the European Union. It is not possible to express this kind of relationship in XBRL Dimensions, because it has no relationship for the division operation. In addition, by using the Description linkbase, it is possible to give a description to a concept: Listing 13 shows part of the Description linkbase describing the ExposureRiskIndex concept. As a result, the XLDM contribution in the financial field extends the possibilities offered by XBRL Dimensions; consequently, besides the proposed generalization, a wider expressivity is assured. A taxonomy for some financial indexes is available at www.cin.ufpe.br/~pcs3/XLDM/FinancialDataModel.
Listing 12. Financial Index Specification

<calculationLink xlink:type="extended" xlink:role="http://www.example.br/financial">
  <loc xlink:type="locator" xlink:label="lbl_ExposureValue" xlink:href="financial.xsd#ExposureValue"/>
  <loc xlink:type="locator" xlink:label="lbl_ExposureRiskIndex" xlink:href="financial.xsd#ExposureRiskIndex"/>
  <calculationArc xlink:type="arc" xlink:arcrole="http://www.example.br/arcrole/numerator-item" xlink:from="lbl_ExposureRiskIndex" xlink:to="lbl_ExposureValue"/>
  <loc xlink:type="locator" xlink:label="lbl_CreditRiskCapitalRequirements" xlink:href="financial.xsd#CreditRiskCapitalRequirements"/>
  <calculationArc xlink:type="arc" xlink:arcrole="http://www.example.br/arcrole/denominator-item" xlink:from="lbl_ExposureRiskIndex" xlink:to="lbl_CreditRiskCapitalRequirements"/>
</calculationLink>
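The numerator-item and denominator-item arcroles of such an index declaration can be evaluated as a ratio. The following Python sketch illustrates that semantics under simplifying assumptions (one numerator and one denominator arc per source concept, concept names used directly instead of locators):

```python
import xml.etree.ElementTree as ET

XLINK = "{http://www.w3.org/1999/xlink}"

# Reduced linkbase in the spirit of Listing 12.
LINKBASE = """
<calculationLink xmlns:xlink="http://www.w3.org/1999/xlink">
  <calculationArc xlink:arcrole="http://www.example.br/arcrole/numerator-item"
                  xlink:from="ExposureRiskIndex" xlink:to="ExposureValue"/>
  <calculationArc xlink:arcrole="http://www.example.br/arcrole/denominator-item"
                  xlink:from="ExposureRiskIndex" xlink:to="CreditRiskCapitalRequirements"/>
</calculationLink>
"""

def ratio(root, source, values):
    """Evaluate source as numerator / denominator, as declared
    by its calculation arcs."""
    num = den = None
    for arc in root.iter("calculationArc"):
        if arc.get(XLINK + "from") != source:
            continue
        target = values[arc.get(XLINK + "to")]
        if arc.get(XLINK + "arcrole").endswith("numerator-item"):
            num = target
        elif arc.get(XLINK + "arcrole").endswith("denominator-item"):
            den = target
    return num / den

# Hypothetical figures, just to exercise the arcs.
index = ratio(ET.fromstring(LINKBASE), "ExposureRiskIndex",
              {"ExposureValue": 120.0, "CreditRiskCapitalRequirements": 40.0})
print(index)
```

This is exactly the kind of relationship the text notes cannot be expressed with XBRL Dimensions, which lacks a division relation.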

Listing 13. Debt Composition Index Description

<descriptionLink xlink:type="extended" xlink:role="http://www.example.br/financial">
  <loc xlink:type="locator" xlink:label="lbl_ExposureRiskIndex" xlink:href="financial.xsd#ExposureRiskIndex"/>
  <description xlink:type="resource" xlink:role="http://www.example.br/role/description" xlink:label="desc_ExposureRiskIndex">
    <desc:Description>Ratio between Exposure Value and Credit Risk Capital Requirements.


    </desc:Description>
  </description>
  <descriptionArc xlink:type="arc" xlink:arcrole="http://www.example.br/arcrole/concept-description" xlink:from="lbl_ExposureRiskIndex" xlink:to="desc_ExposureRiskIndex"/>
</descriptionLink>

An application of XLDM can be found in LMDQL (Link-based and Multidimensional Query Language) (Silva & Times, 2009). LMDQL is a language that provides operators for the multidimensional analysis of data represented in XML documents interconnected through XLink. LMDQL has an operator, OperatorDefinition, which allows users to create new operators through mathematical relations expressed in the Calculation linkbase. When an operator is created through OperatorDefinition, the Calculation, Definition and Label linkbases and the operator's schema (XML Schema) are generated. In such cases, the solution for the XML data heterogeneity problems is also achieved. The operator specification is represented in the Calculation linkbase, which defines the arithmetic relation that constitutes the new operator. The LMDQL language processor was incorporated into the OLAP server Mondrian (Mondrian, 2008), so that analytical queries based on this language could be performed on XML documents. After this implementation, the creation and use of the operator that represents the ExposureRiskIndex index became possible. Figure 5 illustrates the creation of this operator with the LMDQL operator OperatorDefinition and Figure 6 demonstrates its use. Figure 7 shows the Calculation linkbase, stored in the DBMS DB2 Express C (IBM – DB2 Express C, 2006), generated by the operator OperatorDefinition, which represents the index created.

6. CONCLUSION

The metamodel proposed in this article solves the heterogeneity problems of XML data through the specification of data schemas, using XML Schema, and of the relationships between them, through linkbases. The specification of this metamodel allows its use in different domains, because its linkbases capture relationships common to several knowledge areas, e.g., ordering, hierarchy, element naming, description and reference. For arithmetic relationships, the Calculation linkbase comprises all kinds of basic arithmetic operations, making various mathematical expressions possible. An important characteristic is that the metamodel, being based on XLink, can be extended to represent relationships that were not foreseen. For this reason, XLDM eases the development of tools for processing XML data that use XLink (Silva & Times, 2009), (Silva, Santos & Times, 2010). This paper presents the formalism for XML cubes based on XLink, thus allowing the specification of the proposed metamodel based on a set of XLDM document definitions. The case study shows its applicability in the medical and financial areas. Another application of the XLDM metamodel was made in the field of sales organization, which can be seen at http://www.cin.ufpe.br/~pcs3/XLDM/foodMartXML. As future work, it is intended to use this metamodel in other domains and to validate it in other contexts with the LMDQL language, developed for the analytical processing of XML data that uses XLink. A CASE tool that uses XLDM to define data models for specific domains is another direction for future work.


Figure 5. Index creation from the operator OperatorDefinition.

Figure 6. Use of the operator ExposureRiskIndex.


Figure 7. The Calculation linkbase created by the LMDQL operator OperatorDefinition.

REFERENCES

Barceló, P. and Libkin, L. (2005). Temporal Logics over Unranked Trees. Proceedings of the 20th Annual Symposium on Logic in Computer Science.

Baril, X. and Bellahsène, Z. (2003). Designing and Managing an XML Warehouse. In XML Data Management: Native XML and XML-Enabled Database Systems. Addison Wesley Professional, pp. 455–474.

Boussaid, O., Messaoud, R. B., Choquet, R. and Anthoard, S. (2006). X-Warehousing: An XML-Based Approach for Warehousing Complex Data. East-European Conference on Advances in Databases and Information Systems (ADBIS 06).

Boixo, I. and Flores, F. (2005). New Technical and Normative Challenges for XBRL: Multidimensionality in the COREP Taxonomy. The International Journal of Digital Accounting Research, v. 5, n. 9, pp. 79-104. ISSN: 1577-8517.

Golfarelli, M., Rizzi, S. and Vrdoljak, B. (2001). Data Warehouse Design from XML Sources. Proceedings of the 4th ACM International Workshop on Data Warehousing and OLAP (DOLAP 2001), Atlanta, Georgia, USA, ACM Press, pp. 40–47.

Gottlob, G., Koch, C. and Pichler, R. (2003). XPath query evaluation: improving time and space efficiency. 19th International Conference on Data Engineering.

Grosso, P. (2003). XPointer Framework. W3C Recommendation. Retrieved from http://www.w3.org/TR/2003/REC-xptr-framework-20030325/


Hernández-Ros, I. and Wallis, H. (2006). XBRL Dimensions 1.0. Retrieved from www.xbrl.org/Specification/XDT-REC-2006-09-18.htm

Hümmer, W., Bauer, A. and Harde, G. (2003). XCube – XML for Data Warehouses. Proceedings of the 6th ACM International Workshop on Data Warehousing and OLAP, pp. 33–40.

IBM – DB2 Express C (2006). Retrieved from http://www-01.ibm.com/software/data/db2/express

Jensen, M. R., Moller, T. H. and Pedersen, T. B. (2001). Specifying OLAP Cubes On XML Data. Technical Report 01-5003. Department of Computer Science, Aalborg University.

Kimball, R. and Ross, M. (2002). The Data Warehouse Toolkit. John Wiley and Sons.

Libkin, L. and Neven, F. (2003). Logical Definability and Query Languages over Unranked Trees. LICS 2003, Ottawa, Canada. IEEE Computer Society, pp. 178-187.

Libkin, L. (2006). Logics for Unranked Trees: An Overview. Logical Methods in Computer Science, Vol. 2 (3:2), pp. 1–31.

Mondrian (2008). Retrieved from http://mondrian.pentaho.org

Näppilä, T., Järvelin, K. and Niemi, T. (2008). A tool for data cube construction from structurally heterogeneous XML documents. Journal of the American Society for Information Science and Technology (JASIST), Vol. 59, Issue 3, pp. 435-449.

Nassis, V., Rajugan, R., Dillon, T. S. and Rahayu, W. (2004). Conceptual Design of XML Document Warehouses. Data Warehousing and Knowledge Discovery, 6th International Conference, DaWaK 2004, pp. 1–14.

Pedersen, D., Riis, K. and Pedersen, T. B. (2001). XML – Extended OLAP Querying. Technical Report 02-5001. Department of Computer Science, Aalborg University.

Pokorny, J. (2001). Modeling Stars Using XML. The 4th ACM Workshop on Data Warehousing and OLAP (DOLAP01), Atlanta, USA, pp. 24–31.

Silva, P. C. and Times, V. C. (2009). XPath+: A Tool for Linked XML Documents Navigation. XSym 2009 - Sixth International XML Database Symposium at VLDB'09, Lyon, France.

Silva, P. C. and Times, V. C. (2009). LMDQL: Link-based and Multidimensional Query Language. DOLAP 2009 - ACM Twelfth International Workshop on Data Warehousing and OLAP, Hong Kong, China.

Silva, P. C., Santos, M. M. and Times, V. C. (2010). XLPATH: XML Linking Path Language. IADIS WWW/Internet 2010 (ICWI 2010) Conference.

Trujillo, J., Luján-Mora, S. and Song, I. (2004). Applying UML and XML for Designing and Interchanging Information for Data Warehouses and OLAP Applications. Journal of Database Management, 15(1), pp. 41–72.

Unified Modeling Language (2005). Retrieved from http://www.uml.org

XBRL International (2008). Retrieved from http://www.xbrl.org

XQuery 1.0 (2007). Retrieved from http://www.w3.org/TR/xquery


XML Linking (2001). Retrieved from http://www.w3.org/TR/xlink

XPath language (2007). Retrieved from http://www.w3c.org/tr/xpath20

XML Schema (2004). Retrieved from http://www.w3.org/TR/xmlschema-1


JISTEM - Journal of Information Systems and Technology Management Revista de Gestão da Tecnologia e Sistemas de Informação Vol. 8, No. 3, Sept/Dec. 2011, pp. 663-680 ISSN online: 1807-1775 DOI: 10.4301/S1807-1775201100030008

ESTRATEGIAS DE DECISIÓN EN SISTEMAS DINÁMICOS: APLICANDO MAPAS COGNITIVOS DIFUSOS APLICACIÓN A UN EJEMPLO SOCIO – ECONÓMICO DECISION STRATEGIES IN DYNAMIC SYSTEMS USING FUZZY COGNITIVE MAPS. APPLICATION TO A SOCIO - ECONOMIC EXAMPLE Lisandro Curia Andrea Lavalle Universidad Nacional del Comahue, Buenos Aires, Argentina _____________________________________________________________________________________

ABSTRACT

The situations in which human beings develop their daily tasks are extremely complex and dynamic. In each field, the analysis of the variables of the system under consideration can be simplified if the system is conceived as a set of concepts, where a change in any of them will cause changes in the remainder. In the analysis of a particular problem, the representation of concepts in the form of a map helps to synthesize information by detecting the main concepts linked to the problem. In these cases, Fuzzy Cognitive Maps (FCM) are able to synthesize much of this information. This technique also makes it possible to follow the evolution of the concepts to a state of equilibrium, and therefore to study the dynamics of the passage from one state to another in the situation under review. This paper describes the construction and analysis of the Fuzzy Cognitive Maps technique through an application example from economics. The aim is to present a methodology supported by FCM to analyze the evolution of the system and the impact caused by the change in the value of one or more of the concepts involved.

Keywords: Fuzzy logic, dynamic qualitative systems, cognitive maps.

_____________________________________________________________________________________ Manuscript first received/Recebido em 05/03/2011 Manuscript accepted/Aprovado em: 17/11/2011 Address for correspondence / Endereço para correspondência Leopoldo Lisandro Horacio Curia, Ingeniero Mecánico por la Universidad Tecnológica Nacional Regional Santa Fe. Magíster en Enseñanza de las Ciencias Exactas y Naturales, orientación Matemática por la Universidad Nacional del Comahue. Profesor Adjunto Regular del Departamento de Matemática de la Universidad Nacional del Comahue. Dpto. de Matemática Facultad de Economía y Administración. Universidad Nacional del Comahue. Buenos Aires 1400, (8300) Neuquén. Argentina. +54-299-4490300. lcuria@uncoma.edu.ar Andrea Lina Lavalle, Profesora en Matemática por la Universidad Nacional del Comahue. Magíster en Enseñanza de las Ciencias Exactas y Naturales, orientación Matemática por la Universidad Nacional del Comahue. Profesora Adjunta Interina del Departamento de Estadística de la Universidad Nacional del Comahue. Dpto. de Estadística. Facultad de Economía y Administración. Universidad Nacional del Comahue. Buenos Aires 1400, (8300) Neuquén. Argentina. +54-299-4490300. lavalleandrea@yahoo.com.ar Published by/ Publicado por: TECSI FEA USP – 2011 All rights reserved.



RESUMEN

Las situaciones que abarcan el contexto en que el ser humano desarrolla sus tareas cotidianas son extremadamente complejas y dinámicas. En cada campo, el análisis de las variables que componen el sistema objeto de estudio puede simplificarse si se lo concibe como un conjunto de conceptos donde un cambio en cada uno de ellos provocará cambios en los restantes. Cuando se trata de analizar la problemática de un área particular, la representación de los conceptos en forma de un mapa permite sintetizar la información aislando los principales conceptos que están vinculados en el problema. En estos casos, los Mapas Cognitivos Difusos (MCD) logran sintetizar gran parte de la información presente. Además, con esta técnica es posible seguir la evolución de los conceptos hasta un estado de equilibrio y, por lo tanto, estudiar la dinámica que lleva el pasaje de un estado a otro determinado en la situación que se analiza. En este trabajo se expone la técnica de construcción y análisis de Mapas Cognitivos Difusos a través de un ejemplo de aplicación a la economía. El objetivo consiste en presentar una metodología de trabajo apoyada en los MCD que permite analizar la evolución y el impacto que provoca en el sistema el cambio en el valor de uno o varios de los conceptos intervinientes.

Palabras clave: Lógica difusa, sistemas cualitativos dinámicos, mapas cognitivos.

1. INTRODUCCIÓN

La toma de decisiones en ámbitos como la política y la educación, pasando por la economía y la salud, constituye un serio desafío. Normalmente, establecer prioridades y pronosticar determinadas situaciones puede no ser una tarea sencilla aun disponiendo de personal idóneo y con conocimiento y experiencia suficientes como para llevar adelante esa tarea (Carlsson, 1996; Wheeldon & Faubert, 2009). La propia dinámica de los acontecimientos y el número de variables que componen el sistema objeto de estudio añaden una complejidad adicional a la situación que se quiere analizar.

Desde el punto de vista estadístico y computacional, modelar un sistema que evoluciona en el tiempo (sistema dinámico) es una tarea compleja; por ejemplo, los modelos matemáticos para analizar el comportamiento de este tipo de sistemas se desarrollan en base a ecuaciones diferenciales o mediante sistemas de ecuaciones diferenciales (Jang & Mitsutani, 1997). Aun así, en estos modelos determinísticos debe conocerse completamente la expresión matemática que liga las variables que componen el sistema. Este es un hecho poco común en muchos sistemas o situaciones en las que el ser humano desarrolla sus actividades. Si bien existen modelos matemáticos que describen fenómenos físicos, sistemas mecánicos, velocidades de reacción, transmisión de calor, etc., existen otras situaciones, principalmente aquellas en las que interviene el factor humano, donde es imposible establecer un modelo matemático.

En estos ambientes donde interviene el ser humano es frecuente calificar situaciones y reacciones con expresiones lingüísticas y no numéricas. Surgen así sistemas donde se describe la realidad en términos cualitativos y que por ello se denominan sistemas cualitativos (Carvalho & Tomé, 1999). De este modo, se entiende que los sistemas cualitativos son difíciles o imposibles de describir con leyes matemáticas y están rodeados de factores como la incertidumbre y la imprecisión.
Estos sistemas abarcan una amplia gama de situaciones que van desde el comportamiento y actitud individual que puede exhibir, por ejemplo, el presidente de una compañía que elabora un producto, hasta conductas más generalizadas presentes en agrupamientos sectoriales de una sociedad que engloba un determinado número de personas, y que puede llegar a generar conflictividad social, procesos inflacionarios, crecimiento económico, inestabilidad en el gobierno, etc. (Lin et al., 2010; Stylios et al., 2005). Otros conceptos sustentados por la sociedad como poder adquisitivo, riqueza, pobreza, libertad, bienestar, etc., son también ejemplos de conceptos que componen sistemas cualitativos. Como puede verse,


los sistemas socio-económicos son sistemas cualitativos por excelencia y, en realidad, debido a los continuos cambios que se suceden en estos ámbitos, corresponde hablar de sistemas cualitativos dinámicos.

Planteo del problema

El problema de investigación que se aborda en el presente trabajo se relaciona con la necesidad de contar con herramientas que, mediante un tratamiento cualitativo y cuantitativo, permitan predecir el comportamiento que tendrán las componentes de un sistema cualitativo dinámico. Como puede observarse, el problema es de tipo metodológico y, para dar respuesta al mismo, se propone la utilización combinada de un mapa cognitivo con técnicas de la lógica difusa, ya que aportan elementos que permiten traducir expresiones verbales y dichos ambiguos en valores numéricos concretos. Se analiza un caso concreto de aplicación a la economía, aunque la metodología propuesta puede extenderse con facilidad a otras áreas relacionadas con la toma de decisiones y estrategias políticas y de producción.

Objetivos

• Presentar una metodología de trabajo apoyada en los Mapas Cognitivos Difusos que permite analizar la evolución de un sistema cualitativo dinámico.

• Ilustrar la técnica de construcción y análisis de Mapas Cognitivos Difusos a través de un ejemplo de aplicación a una problemática socio-económica.

Este trabajo está compuesto de cuatro secciones. En la primera se establecen los lineamientos teóricos que dan sustento a la investigación, se ejemplifican los sistemas cualitativos y se definen diferentes tipos de mapas: conceptuales, cognitivos y cognitivos difusos, como formas de representación del conocimiento. En la segunda sección se describe la propuesta metodológica, donde se recurre a un ejemplo concreto para mostrar cómo se construye y se procesa un mapa cognitivo difuso. En la tercera sección se muestran los resultados de la implementación de un mapa cognitivo difuso aplicado a un problema de toma de decisiones. En la última sección se detallan las conclusiones del trabajo y las líneas futuras de investigación.

2. MARCO TEÓRICO

Antes de pasar a describir la técnica que involucra la implementación y el análisis de un Mapa Cognitivo Difuso, se presentará brevemente una situación que pone de manifiesto cómo los conceptos que intervienen en un sistema cualitativo se ven afectados por los cambios que puedan presentar uno o varios de los conceptos restantes. Considérese, por ejemplo, un pasajero que debe viajar por una autopista. Los conceptos a incluir para conformar el sistema pueden ser: mal tiempo, aversión a conducir, congestión de tránsito, estado de la autopista, estado del automóvil, velocidad a la que se conduce, entre otras variables que puedan tener relevancia en el sistema. Las variables del sistema generalmente son seleccionadas por uno o varios especialistas que van a analizar la problemática determinada. Estas variables pasarán a ser luego los conceptos del mapa cognitivo. Por ejemplo, otro especialista perteneciente al área en la que se está analizando el problema del tránsito podría incluir como variables la frecuencia de control de la policía, el día de la semana, etc. Los conceptos se influyen mutuamente (Papageorgiou et al., 2006; Miao et al., 2006); esta influencia puede ser positiva o negativa y su intensidad puede ser alta o baja.
Por ejemplo, el mal tiempo


conduce a un aumento del congestionamiento de tránsito y determina una disminución de la velocidad. Si la frecuencia del control policial aumenta, el número de accidentes baja. El número de accidentes no afecta directamente la propia velocidad del que conduce, pero si este número aumenta, el propio sentido de riesgo del conductor condiciona su aversión a conducir y, como la aversión aumenta, el conductor tiende a retrasarse. Estos escenarios y otros similares donde se describen situaciones económicas, políticas y sociales pueden representarse gráficamente por medio de conceptos en un papel y luego implementarse en una computadora a través de los denominados Mapas Cognitivos, donde los conceptos reciben el nombre de nodos y las aristas que conectan los conceptos se denominan eventos causales (Calais, 2008; Kosko, 1986).

Existen diversos tipos de mapas que pueden utilizarse para representar información. Los mapas conceptuales permiten la organización jerárquica del conocimiento de un tema y pueden utilizarse para analizar y sintetizar información (Novak & Gowin, 1988). Un mapa conceptual representa una jerarquía de diferentes niveles de generalidad e inclusividad conceptual y se compone de conceptos, proposiciones y palabras enlace. Los conceptos se refieren a objetos, eventos o situaciones y se representan en círculos. Existen tres tipos de conceptos: supraordinados (mayor nivel de inclusividad), coordinados (igual nivel de inclusividad) y subordinados (menor nivel de inclusividad). Las proposiciones representan la unión de dos o más conceptos relacionados entre sí mediante una palabra enlace. Las palabras enlace expresan el tipo de relación existente entre dos o más conceptos y se representan a través de líneas rotuladas (Carvalho & Tomé, 1999; Mellado et al., 2002). Las técnicas de construcción de mapas conceptuales son muy utilizadas para la enseñanza en los diferentes niveles educativos.
Los mapas cognitivos permiten analizar contenidos temáticos de una disciplina, situaciones personales o grupales y también los procesos que intervienen en la formación de estrategias operativas. Asimismo, proporcionan una imagen mental personal tomada del ambiente, a la que puede recurrirse para analizar informalmente los cambios que se producen en el medio que se está analizando cuando se modifican los estímulos ambientales originales (Carlsson, 1996; Axelrod, 1976). Los mapas cognitivos relacionan, de una forma parcialmente jerarquizada, unidades de información con un sentido más amplio que los mapas conceptuales tradicionales de Novak. Aunque tanto la representación de conceptos por medio de mapas cognitivos como a través de un mapa conceptual permiten una visión global y no fragmentada de las concepciones que cada experto tiene del área de estudio, los primeros han sido utilizados mayormente en estrategias de planeamiento y los mapas conceptuales se utilizan principalmente en educación y en el estudio del comportamiento humano; pero ambos atienden a situaciones estáticas, dejando de lado las componentes dinámicas que implican la evolución y el posterior desenvolvimiento de una situación (Kardaras & Karakostas, 1999). Es por ello que los mapas cognitivos se utilizan en un marco donde la dinámica del sistema y la jerarquización de los conceptos se dejan de lado. Los mapas cognitivos admiten diferentes formas de representación: tipo sol, de nubes, de ciclos y de secuencias, entre otras. Por ejemplo, en la representación tipo sol, el concepto central se representa con un círculo en el medio del esquema y el resto de los conceptos se muestran con varios rayos dirigidos hacia este centro, que dan a entender cómo influyen las ideas representadas en los rayos sobre el concepto central.


En el mapa cognitivo en forma de nubes, se coloca en la nube central el tema principal y alrededor de ella se colocan otras nubes que contienen subtemas. Estas llevan información adicional que se desea aportar. Un mapa cognitivo muy utilizado es el denominado de satélites. En la parte central se coloca el tema de interés o el concepto principal enmarcado en un círculo y en rectángulos periféricos se distribuyen los subtemas simulando satélites que giran en torno al tema central. De los rectángulos salen flechas que indican la incidencia de los subtemas en el tema central (Pimienta, 2005). Los mapas cognitivos difusos (MCD) constituyen una herramienta desarrollada por Kosko (1986) que tiene por finalidad expandir el horizonte de trabajo que supone la implementación de los mapas cognitivos tradicionales y los mapas conceptuales de Novak (Carlsson et al., 2006; Kosko, 1997). Estos mapas, en versiones más avanzadas, pueden incluir la combinación de técnicas pertenecientes a la Soft Computing, tales como las Redes Neuronales Artificiales y la Lógica Difusa (Jang et al., 1997; Kosko, 1997). Si bien el método de construcción de un MCD no puede considerarse una técnica infalible, puesto que algunos conceptos pueden dejarse de lado y finalmente su ausencia podría ser un factor crítico en el análisis final de la estrategia a seguir, constituyen una herramienta de gestión muy útil y de gran impacto visual a la hora de representar y resumir la información (Cai et al., 2006; Chandana et al., 2007). Aun así, los MCD pertenecen a un área en constante expansión y con un campo de aplicación cada vez mayor, donde actualmente se investigan alternativas de mejora, tal es el caso de los MCD con base de reglas borrosas (BR-MCD) (Carvalho & Tomé, 1999) cuya construcción se basa en la aplicación de reglas difusas un tanto más complejas que las detalladas en este trabajo. 
Estos sistemas muestran ser un excelente complemento a los MCD tradicionales para estudiar la evolución de los conceptos en aquellos casos en que los MCD tradicionales no presenten buenas características de convergencia o cuando se quiera representar el conocimiento impreciso o el lenguaje ambiguo de los especialistas con una base de reglas operativas llamadas reglas borrosas (Carvalho & Tomé, 1999). Este aspecto es un elemento a tener en cuenta a la hora de modelar sistemas en los que intervienen los seres humanos, ya que el modo de comunicar sus ideas sobre hechos, situaciones o acontecimientos determinados siempre contiene información expresada con ambigüedad e imprecisión, lo que dificulta o imposibilita disponer de modelos que trabajen con valores numéricos concretos.

Formalmente, un mapa cognitivo difuso consiste en un grafo dirigido con varios nodos que representan los conceptos causales que surgen del tema a tratar y de arcos dirigidos conectados a los nodos que representan las relaciones causales entre los mismos (Stach et al., 2010). Inicialmente cada concepto toma valores 0 ó 1: el valor 0 indica la ausencia del concepto en el instante en que se analiza el problema, mientras que el valor 1 indica su presencia. Existen otros MCD donde los nodos, llamados nodos difusos, toman valores pertenecientes al intervalo [0, 1]. Las aristas que conectan los nodos tienen asociado un peso cuyo valor puede pertenecer al conjunto {-1, 0, 1} o al intervalo [-1, 1]. Estos valores indican el peso o la intensidad con que un concepto influye en el otro. El peso será positivo si representa una relación causal incremental y negativo si la conexión implica una relación causal decreciente (Carvalho & Tomé, 1999). El término difuso proviene del hecho de que estos valores son asignados por uno o varios expertos y, por lo tanto, conllevan algún grado de información cualitativa propia de la interpretación del experto que es introducida al mapa como un valor concreto.
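La definición formal anterior puede traducirse en una estructura de datos mínima. El siguiente esbozo en Python es puramente ilustrativo (los conceptos y el peso del ejemplo son supuestos, no provienen del artículo): nodos con valores 0 ó 1 y arcos dirigidos con pesos en el intervalo [-1, 1].

```python
# Esbozo mínimo de la estructura de un MCD: nodos con valor 0 ó 1
# y arcos dirigidos (origen, destino) con peso en [-1, 1].
class MapaCognitivoDifuso:
    def __init__(self):
        self.conceptos = {}   # nombre -> valor 0 ó 1
        self.arcos = {}       # (origen, destino) -> peso en [-1, 1]

    def agregar_concepto(self, nombre, valor=0):
        assert valor in (0, 1), "el concepto toma valores 0 ó 1"
        self.conceptos[nombre] = valor

    def conectar(self, origen, destino, peso):
        # Peso positivo: relación causal incremental;
        # peso negativo: relación causal decreciente.
        assert -1.0 <= peso <= 1.0, "el peso debe pertenecer a [-1, 1]"
        self.arcos[(origen, destino)] = peso

# Ejemplo hipotético con dos conceptos del escenario de la autopista
mcd = MapaCognitivoDifuso()
mcd.agregar_concepto("mal tiempo", 1)
mcd.agregar_concepto("congestión", 0)
mcd.conectar("mal tiempo", "congestión", 0.8)
```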


Con la finalidad de ilustrar los tres tipos de mapas enunciados, se presenta un ejemplo tomado del texto de Kandasamy y Smarandache (2003) en el que, en un escenario de una dinámica compleja de mercado, se establecen los factores que influyen en el mismo, clasificados como factores incontrolables, controlables y semicontrolables. Sobre los mismos conceptos se realizan tres representaciones: un mapa conceptual, un mapa cognitivo y un mapa cognitivo difuso. En el caso del mapa conceptual representado en la Figura 1, se parte del concepto más inclusivo, que es el mercado, y se han puesto en un mismo nivel jerárquico los tres tipos de factores; sin embargo, puede resultar de interés distinguir los órdenes jerárquicos de los ejemplos de cada uno de los tres casos. Así, en una situación especial de mercado, para el que diseña el mapa conceptual puede ser de mayor importancia incluir el concepto “demanda” en un nivel jerárquico más alto, porque podría hacer referencia, por ejemplo, a una situación especial de alta demanda del producto por condiciones específicas del mercado.

Figura 1: Representación de conceptos en un Mapa Conceptual

En la Figura 2 se representa la misma problemática a través de un mapa cognitivo. En este caso los conceptos no están ordenados jerárquicamente y hay libertad para disponerlos en distintos sectores del mapa. El mapa cognitivo es del tipo satélite y contiene una idea central, que es el mercado; los temas secundarios están distribuidos en rectángulos con flechas orientadas hacia la idea central.


[Contenido de la Figura 2 — idea central: Mercado. Factores controlables: control de calidad, precio de venta, productividad. Factores incontrolables: condiciones económicas, actividad de la competencia. Factores semicontrolables: demanda, competitividad.]

Figura 2: Conceptos en un Mapa Cognitivo

En la comparación entre un mapa conceptual y un mapa cognitivo debe mencionarse un aspecto de fundamental importancia que los distingue. El mapa conceptual presenta una estructura lógica aceptada socialmente por los expertos del tema. En cambio, el mapa cognitivo tiene una estructura con un sesgo hacia el campo de la psicología y pretende formar una representación idiosincrásica personal (Axelrod, 1976). Finalmente, en la Figura 3 se utiliza un Mapa Cognitivo Difuso. Aquí los conceptos están relacionados con flechas que tienen adicionado un signo más o un signo menos de acuerdo al caso.

Figura 3: Conceptos en un Mapa Cognitivo Difuso

Para este ejemplo particular, el experto que evalúa la incidencia que tienen entre sí los conceptos considera, por ejemplo, que el control de calidad tiene una incidencia positiva en la demanda de mercado y el precio de venta una incidencia negativa sobre la misma. Lo que se pretende lograr con el estudio de un MCD es básicamente obtener la respuesta a un cuestionamiento de la forma “qué ocurre si”. Por ello, dentro de un


conjunto de conceptos reunidos se pretende analizar qué le sucede al sistema si algún estímulo cambia el valor de los conceptos, o bien si se introducen nuevos conceptos o se quitan algunos existentes. Este análisis de la dinámica del sistema puede lograrse mediante un MCD que, operando de manera iterativa y actualizando secuencialmente los valores de los conceptos, llegue luego de cierto número de iteraciones a un nuevo estado de equilibrio que permita concluir qué consecuencias trajeron aparejadas los cambios realizados en los valores de los conceptos (Hilera & Martínez, 2000). La estrategia a seguir consiste en comenzar con un cierto valor inicial de los conceptos y, manteniendo fijos los valores de las conexiones entre ellos, analizar el cambio que opera en los valores de los mismos de acuerdo con la dinámica propia del sistema.

Las posibilidades de aplicación de los mapas cognitivos difusos se están extendiendo en la actualidad a diversos campos disciplinares y a diferentes ámbitos de la toma de decisiones. En el trabajo de Stach et al. (2010) se resalta el amplio uso de los mapas cognitivos difusos, debido en gran medida a su sencillez y facilidad de uso. Estos autores resaltan los avances que ha habido en los últimos años en la construcción prácticamente automática de estos mapas. Sin embargo, cuando el número de conceptos que contiene el mapa es elevado, su construcción se torna compleja, inclusive si el desarrollo del mapa recae sobre un experto. Por tal motivo, los autores proponen recurrir a otra rama de la inteligencia artificial, como son los algoritmos genéticos, con objeto de depurar las variables que contiene el problema. Los algoritmos genéticos fueron ampliamente utilizados en problemas de optimización y reducción de variables.
En este caso los autores presentan el método de construcción de un mapa cognitivo difuso partiendo de un gran número de mapas y optimizando las variables que intervienen en cada subconjunto hasta quedarse con un modelo definitivo. Estos trabajos, basados en rutinas de optimización con base en los algoritmos genéticos, están ocupando un sector importante en la divulgación científica de la actualidad. Los algoritmos genéticos, junto con las redes neuronales artificiales y la lógica difusa, constituyen las bases de la denominada soft computing y, a su vez, son una rama importante de la inteligencia artificial, la cual cuenta con un enorme desarrollo tanto teórico como práctico. El empleo de estas rutinas de optimización en la construcción de un mapa cognitivo difuso sólo se justifica en estos casos, donde hay un gran número de conceptos. Si bien la inclusión de algoritmos genéticos en su forma netamente práctica no exige grandes complicaciones y existe software específico para su implementación, cuando el número de variables no es elevado no se justifica su empleo. Con un número reducido de conceptos, los mapas cognitivos difusos ya han mostrado dar buenos resultados y su novedad consiste no tanto en optimizar su funcionamiento sino en dar una respuesta concreta, ágil y flexible a un problema de aplicación específico.

En medicina, los mapas cognitivos difusos también lograron encontrar su lugar, ya que pueden colaborar en la toma de decisiones en diagnósticos médicos. Kannappan et al. (2011) proponen el empleo de mapas cognitivos difusos para modelar y predecir conductas en niños autistas. Su trabajo descansa en la combinación de modelos de redes neuronales con las técnicas clásicas de construcción de mapas cognitivos difusos. La importancia de su trabajo radica más en la utilidad que prestan los mapas cognitivos difusos en la predicción de trastornos en niños con capacidades especiales que en la efectividad de los mapas en sí.


Por otro lado, en el trabajo de Papageorgiou et al. (2011) se realiza un estudio de la predicción del rendimiento de cultivos de algodón. Su estrategia vuelve a ser el empleo de la lógica difusa para modelar, mediante una base de reglas difusas, el conocimiento del experto en la temática. Ellos presentan un enfoque clásico del uso de los mapas cognitivos difusos, similar al que se emplea en este trabajo. Utilizan como nodos la textura del algodón, su pH, la materia orgánica, la producción de algodón y otros elementos químicos que forman parte del algodón. Su objetivo es presentar una herramienta que sea útil para predecir el rendimiento del algodón. Este trabajo pertenece a un área específica de la producción y está destinado concretamente a la mejora en la gestión de un cultivo.

Si bien en sus comienzos los mapas cognitivos difusos surgieron con el objetivo de ayudar a resolver problemas de estrategias políticas y de mercado, posteriormente, tal como muestran los trabajos citados, su empleo fue ramificándose hasta cubrir un gran número de aplicaciones, aunque siempre relacionadas con estrategias de planeamiento y ayuda en la toma de decisiones.

3. MATERIALES Y MÉTODOS

Técnicas para la construcción de Mapas Cognitivos Difusos

Muchos fenómenos del mundo real resultan complicados de representar por medio de un modelo matemático. En ocasiones, para poder comprender la distribución de los principales conceptos y las relaciones que los vinculan, resulta de utilidad incorporar mapas cognitivos (Carlsson, 1996; Kosko, 1986; Kosko, 1997; Peláez & Bowles, 1995). Cuando se requiera analizar la evolución de uno o varios conceptos intervinientes en el mapa al sufrir éste variaciones o cambios, resulta de interés estudiar el proceso dinámico que permita interpretar el estado al que evolucionó el sistema y las consecuencias que el estado inicial provocó.
Pero no sólo captar la dinámica del sistema puede resultar complicado: un problema central que se plantea en la construcción de un sistema que permita representar conocimientos es el de atribuirle algún valor a la intensidad o al grado de influencia con la que un concepto provoca efectos o causalidades en otros. En general, esta relación viene expresada de manera ambigua, pudiendo interpretarse de distintas formas como resultado natural de las expresiones lingüísticas. Es completamente natural que sobre un mismo fenómeno varios especialistas tengan conocimientos y fuentes de valoraciones distintas. Con el fin de obtener una única valoración para la relación causal entre los conceptos se utilizan reglas de la lógica difusa. Más adelante se analizará esta situación con un caso concreto.

Si se está interesado en modelar un fenómeno o evento determinado que puede provenir del campo de la economía, las relaciones sociales, la salud, la educación, etc., en primer lugar deben elegirse los conceptos que intervienen en la descripción del modelo y establecer las relaciones entre ellos. De acuerdo a Peláez y Bowles (1995), el proceso de construcción del mapa cognitivo difuso de un sistema puede ser dividido en tres etapas: 1) escoger los conceptos variables, 2) determinar los arcos que muestran las relaciones entre los conceptos y 3) asignar apropiadamente los signos y la intensidad lingüística para describir las relaciones. Luego se debe elegir un método que permita simular el fenómeno, es decir, adoptar las herramientas adecuadas que permitan analizarlo y, en lo posible, predecir el comportamiento ante nuevas situaciones.
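Como esbozo ilustrativo de la combinación de valoraciones de varios especialistas puede usarse un simple promedio, elemento a elemento, de las matrices individuales. Nótese que se trata de una simplificación supuesta, no de las reglas de la lógica difusa a las que alude el texto, y que las matrices del ejemplo son hipotéticas:

```python
def matriz_consenso(matrices):
    # Promedia, elemento a elemento, las matrices de conectividad
    # propuestas por distintos expertos (simplificación ilustrativa).
    k = len(matrices)
    n = len(matrices[0])
    return [[sum(m[i][j] for m in matrices) / k for j in range(n)]
            for i in range(n)]

# Dos valoraciones hipotéticas del mismo sistema de dos conceptos
experto1 = [[0.0, 0.8], [-0.2, 0.0]]
experto2 = [[0.0, 0.4], [-0.6, 0.0]]
print(matriz_consenso([experto1, experto2]))  # aprox. [[0.0, 0.6], [-0.4, 0.0]]
```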


La Figura 4 relaciona dos conceptos A y B con un tercero C. Este es el modo de construir una simple relación causal entre dos conceptos A y B que influyen positiva o negativamente en otro concepto C.

Figura 4: Red neuronal para inferir el valor de C

En este caso, puede considerarse que existe una relación causal entre los nodos A-C y B-C, de manera que un incremento en el nodo A causa un incremento en el nodo C y, a su vez, un incremento en el nodo B incrementa el valor del concepto C (si ambos pesos son positivos). Si se designan los pesos (intensidades) de estas conexiones con wAC y wBC respectivamente, el objeto del estudio consiste en implementar un procedimiento que permita establecer un nuevo valor de C y encontrar el estado de equilibrio mediante un proceso iterativo que garantice que, al continuar el procedimiento, los valores de todos los conceptos A, B y C ya no sufrirán cambios. Es decir, se espera que después de un cierto número de iteraciones el sistema se estabilice y converja a un estado fijo o bien a un ciclo de estados. Como se mencionó, la técnica de los MCD no es infalible y puede ocurrir que no se llegue a un estado de equilibrio. Sin embargo, los MCD constituyen una herramienta muy flexible que permite incorporar o quitar nodos y volver a analizar la presencia de estabilidad en el proceso iterativo.

La metodología propia de la construcción y el procesamiento de un MCD se describe e ilustra a continuación mediante un problema concreto de formación estratégica presentado por Hilera y Martínez (2000) que puede ser tratado con un MCD. En el mismo se considerarán los siguientes conceptos que componen el MCD: Inversión Extranjera (IE), Leyes Regulatorias (LR), Empleo (E), Conflictividad Social (CS) y Estabilidad de Gobierno (EG). Las relaciones causales entre estos conceptos se ilustran en la Figura 5.
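La regla de inferencia de la Figura 4 (suma ponderada de los antecedentes seguida de una función de salida) puede esbozarse en unas pocas líneas de Python; los valores concretos de wAC y wBC son supuestos, sólo a título de ejemplo:

```python
def salto_unitario(x):
    # Función de salida: 0 si el argumento es menor que cero, 1 en caso contrario
    return 1 if x >= 0 else 0

def inferir_C(A, B, w_AC, w_BC):
    # Nuevo valor de C a partir de la suma ponderada de sus antecedentes A y B
    return salto_unitario(A * w_AC + B * w_BC)

# Pesos ilustrativos (no provienen del artículo): ambos positivos, de modo
# que un incremento en A o en B tiende a incrementar C
print(inferir_C(A=1, B=1, w_AC=0.7, w_BC=0.4))  # imprime 1
```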

Figura 5: MCD para relacionar cinco conceptos


Los MCD conllevan una técnica iterativa según la cual cada concepto va modificando su valor. En cada iteración se simula el paso correspondiente a un intervalo de tiempo determinado y el valor de cada concepto en la iteración actual es calculado a partir de los valores de sus conceptos antecedentes en la iteración anterior. Debido al carácter iterativo del procedimiento, el sistema representado por un MCD evoluciona a lo largo del tiempo hacia nuevos estados; es decir que, a partir de un estado inicial dado por un vector C cuyas componentes son los valores de los conceptos, C = [IE, E, L, CS, EG], se llega a un nuevo estado y, a semejanza del sistema real, podrá o no converger a un estado o ciclo de estados de equilibrio. De esta forma, en un MCD el valor de cada concepto representado en el mapa depende del nivel de sus antecedentes en la iteración anterior, y es calculado por medio de una suma de productos a la que se le aplica una función de salida. La relación entre un concepto y sus antecedentes es modelada por un arco que los vincula y que tiene asignado un peso que establece la intensidad de la influencia de un concepto sobre el otro. En este trabajo los nodos adoptarán sólo dos valores posibles, 0 ó 1, reservándose el término difuso para los valores de las aristas, donde cada arco tiene asociado un peso cuyos valores se distribuyen en el intervalo [-1, 1].

Procesamiento de un mapa cognitivo difuso

La representación del nivel de causalidad entre conceptos queda mejor organizada si previamente se dispone de una tabla denominada tabla de causalidad o conectividad. La Tabla 1 relaciona los conceptos de la Figura 5 mediante los pesos wij; cada valor representa la intensidad con la que un concepto i influye sobre otro concepto j.

         IE    E     L     CS    EG
   IE    w11   w12   w13   w14   w15
   E     w21   w22   w23   w24   w25
   L     w31   w32   w33   w34   w35
   CS    w41   w42   w43   w44   w45
   EG    w51   w52   w53   w54   w55

Tabla 1: Nivel de causalidad entre los conceptos de la Figura 5

A su vez, la representación tabular es llevada a una matriz para su procesamiento. Si, por ejemplo, el concepto C1 se relaciona con el concepto C2 mediante el peso w12 de valor 0.6, significa que la Inversión Extranjera (IE) favorece el Empleo (E) con un factor de 0.6, en el sentido de que un incremento en la inversión provoca un aumento en el empleo. La matriz completa se representa como sigue:

 w11   w21 w =  w31   w41 w  51

w12 w22 w32

w13 w23 w33

w14 w24 w34

w42 w52

w43 w53

w44 w54

JISTEM, Brazil Vol.8, No. 3, Sept/Dec. 2011, pp. 663-680

w15   w25  w35   w45  w55 

www.jistem.fea.usp.br


674 Curia, L., Lavalle, A.

The iterative procedure makes it possible to follow the evolution of the model through steps that successively deliver sequences of states with updated values of the concept vector C, which contains all the concept values: C = [IE, E, L, CS, EG]. Each new state of the vector C is obtained by applying the function f to the product of the vector C and the connectivity matrix w. The new values of C at step t+1 are obtained from the values of C at step t through equation (2):

    C(t+1) = f(C(t) · w)   (2)

In this expression:

C(t) is the vector with the concept values at step t.
C(t+1) is the vector containing all the concept values at stage t+1 of the iteration.
w denotes the connectivity matrix.
f is the transfer function applied to the product C(t) · w.

In the simplest case, f is the unit-step function, which takes the value zero if its argument is less than zero and one if the argument is greater than or equal to zero. The choice of this function depends both on how simple the computation should be and on the convergence characteristics the method exhibits. In many cases the unit-step function suffices for the computation, while the identity function is used to analyze the evolution of the concepts along the successive iterations. This last aspect provides an additional analysis feature of FCMs: plotting the values the concepts take during the computation gives a concrete idea of the oscillations and the situations the system will pass through before reaching an equilibrium state.

The matrix w is central to the construction of an FCM. Since each element of w contains a weight giving the intensity with which one concept affects another (tending to increase or decrease it), this matrix of connections between concepts is frequently built with the help of several experts in the area of interest, and very dissimilar results are commonly found among specialists assessing the same fact or situation. Such situations arise, for example, when a physician must establish a diagnosis based on a patient's symptoms, or when a judge must set a penalty based on the offenses committed and the interpretation of current legislation. If the opinion of several experts is to be considered when building the connectivity matrix w, as many matrices as participating experts must be included, and from them a new consensus matrix is obtained by operating on the individual matrices based on each specialist's opinion.
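The update rule of equation (2) with the unit-step transfer function can be sketched in a few lines of Python. This is an illustrative sketch, not code from the article; the 3×3 weight matrix is hypothetical and serves only to show the mechanics.

```python
import numpy as np

def step(x):
    """Unit-step transfer function: 1 if x >= 0, else 0 (applied elementwise)."""
    return (np.asarray(x) >= 0).astype(float)

def fcm_update(c, w, f=step):
    """One FCM iteration: C(t+1) = f(C(t) . w)."""
    return f(np.asarray(c) @ np.asarray(w))

# Hypothetical 3-concept connectivity matrix (illustrative values only).
w = np.array([[ 0.0, 0.6, -0.4],
              [ 0.0, 0.0,  0.7],
              [-0.5, 0.0,  0.0]])

c0 = np.array([1.0, 0.0, 0.0])
c1 = fcm_update(c0, w)
print(c1)  # → [1. 1. 0.]
```

Swapping `step` for the identity function (`lambda x: x`) lets one trace the raw activations, as the article does when plotting the concept oscillations.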
In this case, to determine a weight matrix w that takes into account the opinion of all the specialists consulted, the fuzzy logical sum (fuzzy union) is used. The final value of the weight wij used in the study of the evolution of the concepts is given by equation (3):

    wij = U_{k=1..M} wij^k = max(wij^1, wij^2, ..., wij^M)   (3)

where the opinion of M experts was considered, which is why the superscript of the weights wij ranges from 1 to M.

Estrategias de decisión en sistemas dinámicos: aplicando mapas cognitivos difusos aplicación a un ejemplo socio – económico

During the computation of the fuzzy logical sum that determines the consensus matrix w, each weight wij is taken as the element with the greatest absolute value for those subscripts. This operation is performed on the corresponding elements of the M matrices suggested by the experts. Fuzzy logic provides many tools for dealing with this type of problem, in which different people hold their own degree of valuation of each aspect or situation. Suppose the value of w12 (level of causality of C1 on C2) is to be determined; the question is how foreign investment influences employment. One specialist may state that this influence has a value close to 1, say 0.9, while another may attribute a degree of influence of 0.7. Proceeding according to equation (3), the value of w12 is the maximum of the two values proposed by the specialists. The maximum rule is one of the simplest operations of fuzzy logic, but it provides a solution to the problem of finding the connectivity matrix w. The literature includes works with a stronger emphasis on applying fuzzy rules to build the adjacency matrix of an FCM (Hilera & Martínez, 2000; Carvalho & Tomé, 1999).

4. RESULTS

The strategy followed for the construction of an FCM provides a way to obtain the evolution of the concepts from a given initial state. The following are the results of the analysis of the problem relating the five selected concepts, whose causal relations are established through the FCM represented graphically in Figure 4. These concepts are given by the vector C = [IE, E, L, CS, EG].
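The consensus rule of equation (3), combined with the greatest-absolute-value criterion used for signed weights, can be sketched as follows. This is an illustrative Python sketch, not from the article; the 2×2 expert matrices are hypothetical, reusing the 0.9/0.7 example given for w12.

```python
import numpy as np

def consensus(matrices):
    """Fuzzy union of M expert weight matrices: for each position (i, j),
    keep the entry with the greatest absolute value (sign preserved)."""
    stack = np.stack([np.asarray(m, dtype=float) for m in matrices])  # shape (M, n, n)
    idx = np.abs(stack).argmax(axis=0)                                # winning expert per cell
    i, j = np.indices(idx.shape)
    return stack[idx, i, j]

# Two hypothetical 2x2 expert matrices (illustrative values only).
w1 = [[0.0, 0.9], [-0.3, 0.0]]
w2 = [[0.0, 0.7], [-0.8, 0.0]]
print(consensus([w1, w2]).tolist())  # → [[0.0, 0.9], [-0.8, 0.0]]
```

For nonnegative weights this reduces to the plain elementwise maximum of equation (3).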
If the opinions of the two experts are collected in the matrices w1 and w2, the resulting adjacency matrix is obtained by applying the union operator U which, as already mentioned, consists of taking, for each position, the larger of the two values appearing in that position in each specialist's matrix:

    w1 = [5×5 weight matrix elicited from expert 1]
    w2 = [5×5 weight matrix elicited from expert 2]
    w = w1 U w2

This matrix w is kept constant throughout the entire iterative process that drives the evolution of the concepts at each iteration.




The process starts from an initial state that fixes the values of the five concepts according to the vector C(t=0) = [0, 0, 0, 1, 0]. The first step is the product of the row vector C and the matrix w, which yields the vector R = [0, 0.6, -0.2, 0, -0.8]. According to equation (2), the next step is to apply the function f to this result. In this work the unit-step function is adopted for the iterative computation and the identity function for following the evolution of the concepts (Carlsson et al., 2006; Peláez et al., 1995).

Table 2 contains the values obtained for the successive steps t, from 0 to 3. In the lower right part it can be seen that the last two lines are identical: the state obtained at t=3 equals the one obtained at t=4, that is, the system has reached an equilibrium from which it will no longer change. The remaining partial computations are detailed in the table. This situation can be interpreted as follows: starting from the initial state C(t=0) = [0, 0, 0, 1, 0], after three iterations the values of the concepts composing the map stabilize, i.e., C(t=3) = C(t=4) = [1, 1, 1, 0, 1]. Therefore, starting from an initial situation C(t=0) of full social conflict, the system reaches an equilibrium state in which the conflict has been eliminated through Foreign Investment (IE), increasing Employment (E) within a framework of regulatory Laws (L), and also achieving Government Stability (EG).

t | C(t)            | R = C(t)·w                | C(t+1) = f(R)
0 | [0, 0, 0, 1, 0] | [0, 0.6, -0.2, 0, -0.8]   | [1, 1, 0, 1, 0]
1 | [1, 1, 0, 1, 0] | [0, 0.6, -0.2, -1, -0.6]  | [1, 1, 0, 0, 0]
2 | [1, 1, 0, 0, 0] | [0, 0.6, 0, -1, 0.2]      | [1, 1, 1, 0, 1]
3 | [1, 1, 1, 0, 1] | [1.8, 1, 0, -0.4, 0.2]    | [1, 1, 1, 0, 1]

Table 2: Concept values during the evolution of the FCM

Figure 6 represents the evolution of the concepts during the iterative process.
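The procedure traced in Table 2 — multiply, threshold, repeat until two consecutive states coincide — can be sketched in Python as follows. This is an illustrative sketch with a hypothetical 3-concept matrix (the article's 5×5 matrix is not reproduced here); equilibrium is detected when the new state equals the previous one.

```python
import numpy as np

def step(x):
    """Unit-step transfer function applied elementwise."""
    return (np.asarray(x) >= 0).astype(float)

def fcm_run(c0, w, f=step, max_iter=50):
    """Iterate C(t+1) = f(C(t) . w) until a fixed point (or max_iter).
    Returns the list of visited states."""
    states = [np.asarray(c0, dtype=float)]
    for _ in range(max_iter):
        nxt = f(states[-1] @ w)
        states.append(nxt)
        if np.array_equal(nxt, states[-2]):
            break  # equilibrium: two consecutive states are identical
    return states

# Hypothetical 3-concept connectivity matrix (illustrative values only).
w = np.array([[0.0, 0.6,  0.2],
              [0.0, 0.0, -0.5],
              [0.4, 0.0,  0.0]])

for s in fcm_run([1.0, 0.0, 0.0], w):
    print(s)
```

With the identity function instead of `step`, the raw activations can be logged and plotted, reproducing the kind of oscillation curves shown in Figures 6 and 7.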


Figure 6: Oscillations of the concepts during the evolution of the FCM

On the other hand, starting from the initial state C(t=0) = [0, 0, 1, 1, 0], stability in the concept values is again reached after three iterations.

t | C(t)            | R = C(t)·w                   | C(t+1) = f(R)
0 | [0, 0, 1, 1, 0] | [0.8, 0.4, -0.2, 0.6, -0.8]  | [1, 1, 0, 1, 0]
1 | [1, 1, 0, 1, 0] | [0, 0.6, -0.2, -1, -0.6]     | [1, 1, 0, 0, 0]
2 | [1, 1, 0, 0, 0] | [0, 0.6, 0, -1, 0.2]         | [1, 1, 1, 0, 1]
3 | [1, 1, 1, 0, 1] | [1.8, 1, 0, -0.4, 0.2]       | [1, 1, 1, 0, 1]

Table 3: Concept values during the evolution of the FCM

Table 3 shows the iterative process and Figure 7 the evolution of the concept values along it.

Figure 7: Oscillations of the concepts corresponding to Table 3




In this case, the starting point is an initial situation of full social conflict, but within a framework of labor laws. After the iterations, an equilibrium state is reached which shows that, to eliminate Social Conflict (CS) and guarantee Government Stability (EG), Foreign Investment (IE) that increases Employment (E) is needed.

The analysis of the results obtained in the two concrete cases shows that the constructed FCM provides clear and easily interpretable results. Since the number of concepts is low, it is not necessary to resort to complementary techniques such as those proposed by Stach et al. (2010) and Kannappan et al. (2011). It is therefore worth noting that these results were obtained solely by building the adjacency matrix from the opinion of experts highly qualified in the specific problem under analysis, and at no point was it necessary to resort to additional techniques to optimize the FCM construction procedure.

5. CONCLUSIONS

FCMs, like other maps such as concept maps and traditional cognitive maps, serve to represent knowledge through concepts joined by links. However, FCMs have additional features that make it possible to detect implicit knowledge about causal situations and their evolution when starting from given external situations, which are reflected in the concept values and in the values with which the causal relations influence them. This work shows the iterative procedure that constitutes the operational core of an FCM and how this technique makes it possible to follow the dynamics of the concepts gathered in the map.

Through a concrete example from the field of economics, the sensitivity of the map's evolution is made evident: starting from a given initial situation, the successive states through which the concepts pass can be analyzed, also making it possible to interpret these gradual changes as states the system will experience in the real context in which the problem is being studied. FCMs prove to be a directly applicable and efficient tool, offering a fast and easy-to-use means of support for a specialist in a subject area who wants to investigate a concrete problem and obtain a course of action to follow. The FCM technique makes it possible to comfortably follow the evolution of the concepts gathered in the map, preserving great visual impact and joining quantitative analysis techniques to phenomena that are often described only in qualitative terms. The analysis performed suggests keeping the design of the FCM simple when the number of concepts is low, and attaching more specific refinement algorithms to the construction of a classical FCM only when a large number of concepts is involved in the problem under analysis.

Research on fuzzy cognitive maps is moving towards integrating fuzzy logic techniques with the other two branches of soft computing: neural networks and genetic algorithms. Of particular interest is the case in which, owing to the specialists' opinions, information is integrated through a large number of concepts that often form thoughts or points of view that are contradictory or conflict with one another. In such cases it is useful to resort to multicriteria decision methods, whose basic approach is to use cognitive maps and fuzzy logic techniques to help groups of people make a decision when the problem can be analyzed from several angles (Rodriguez Bello, 2009). Although this work employs neither genetic algorithms nor neural networks, a future line of research could seek solutions to these problems and analyze the results obtained by incorporating such artificial intelligence techniques, comparing them with more basic solution routines.

REFERENCES

Axelrod, R. (1976). Structure of Decision: The Cognitive Maps of Political Elites. Princeton, NJ: Princeton University Press.
Cai, Y., Miao, Ch., Tan, A. & Shen, Z. (2006). Context Modeling with Evolutionary Fuzzy Cognitive Map in Interactive Storytelling. School of Computer Engineering, Nanyang Technological University.
Calais, G. (2008). Fuzzy Cognitive Maps Theory: Implications for Interdisciplinary Reading: National Implications. FOCUS On Colleges, Universities, and Schools, 2(1), 1-16.
Carlsson, C. (1996). Knowledge formation in strategic management. HICSS-27 Proceedings, IEEE Computer Society Press, Los Alamitos.
Carlsson, C., Ramírez, L., Mora, M. & Terán, A. (2006). Adaptive Fuzzy Cognitive Maps for Hyperknowledge Representation in Strategy Formation Process. Proceedings of the International Panel Conference on Soft and Intelligent Computing, Technical University of Budapest. http://citeseer.ist.psu.edu.
Carvalho, J.P. & Tomé, J.A. (1999). Rule Based Fuzzy Cognitive Maps and Fuzzy Cognitive Maps - A Comparative Study. Proceedings of the 18th International Conference of the North American Fuzzy Information Processing Society, NAFIPS99, New York.
Chandana, S., Leung, H. & Levy, J. (2007). Disaster management model based on Modified Fuzzy Cognitive Maps. Systems, Man and Cybernetics, ISIC, IEEE International Conference.
Hilera, J.R. & Martínez, V. (2000). Redes Neuronales Artificiales: Fundamentos, modelos y aplicaciones. Madrid: RA-MA Editorial.
Jang, R., Sun, C. & Mizutani, E. (1997). Neuro-Fuzzy and Soft Computing: A Computational Approach to Learning and Machine Intelligence. New Jersey: Prentice Hall.
Kannappan, A., Tamilarasi, A. & Papageorgiou, E. (2011). Analyzing the performance of fuzzy cognitive maps with non-linear hebbian learning algorithm in predicting autistic disorder. Expert Systems with Applications, 38(3).




Kardaras, B. & Karakostas, B. (1999). The use of fuzzy cognitive maps to simulate the information systems strategic planning process. Information and Software Technology, 41, 197-210.
Kosko, B. (1986). Fuzzy Cognitive Maps. International Journal of Man-Machine Studies, 24, 65-75.
Kosko, B. (1997). Fuzzy Engineering. Upper Saddle River, NJ: Prentice-Hall.
Lin, Ch., Lu, P., Wu, W., Chen, Ch., Chiang, H. & Huang, Y. (2010). Utilizing a concept map as the teaching strategy based on conceptual change theory for the course Information Technology and Society. Joint International IGIP-SEFI Annual Conference 2010, 19th-22nd September 2010, Trnava, Slovakia.
Mellado, V., Peme-Aranega, C., Redondo, C. & Bermejo, M.L. (2002). Los mapas cognitivos en el análisis gráfico de las concepciones del profesorado. Campo Abierto, 22, 37-58.
Miao, Ch., Yang, Q., Fang, H. & Goh, A. (2006). A cognitive approach for agent-based personalized recommendation. Knowledge-Based Systems, 20, 397-405.
Novak, J.D. & Gowin, D.B. (1988). Aprendiendo a aprender. Barcelona: Martínez Roca.
Papageorgiou, E., Stylios, Ch. & Groumpos, P. (2006). Unsupervised learning techniques for fine-tuning fuzzy cognitive map causal links. International Journal of Human-Computer Studies, 64, 727-743.
Papageorgiou, E., Markinos, A. & Gemtos, T. (2011). Fuzzy cognitive map based approach for predicting yield in cotton crop production as a basis for decision support system in precision agriculture application. Applied Soft Computing, 11(4).
Peláez, C.E. & Bowles, J.B. (1995). Applying Fuzzy Cognitive Maps Knowledge-Representation to Failure Modes Effects Analysis. IEEE Proceedings Annual Reliability and Maintainability Symposium. 0149-144X/95.
Pimienta, J. (2005). Metodología constructivista: Guía para la planeación docente. Pearson/Prentice-Hall. http://slideshare.net/eus/mapas-cognitivos-y-construccin-delconocimiento
Rodriguez Bello, S. (2008). Toma de decisión multi-criteria con AHP, ANP y Lógica Difusa. Maestría en Ingeniería de Sistemas y Computación, Universidad Nacional de Colombia.
Stach, W., Kurgan, L., Pedrycz, W. & Reformat, M. (2005). Genetic learning of fuzzy cognitive maps. Fuzzy Sets and Systems, 153(3), 371-401.
Stach, W., Kurgan, L. & Pedrycz, W. (2010). A divide and conquer method for learning large Fuzzy Cognitive Maps. Fuzzy Sets and Systems, 161(19).
Stylios, Ch., Georgopoulos, V. & Groumpos, P. (2005). The use of fuzzy cognitive maps in modeling systems. School of Electrical Engineering and Computer Science, Ohio University.
Wheeldon, J. & Faubert, J. (2009). Framing Experience: Concept Maps, Mind Maps, and Data Collection in Qualitative Research. Int10.



JISTEM - Journal of Information Systems and Technology Management Revista de Gestão da Tecnologia e Sistemas de Informação Vol. 8, No. 3, Sept/Dec. 2011, pp. 681-716 ISSN online: 1807-1775 DOI: 10.4301/S1807-1775201100030009

RECOMMENDER SYSTEMS IN SOCIAL NETWORKS
SISTEMAS DE RECOMENDAÇÃO EM REDES SOCIAIS

Cleomar Valois B. Jr
Marcius Armada de Oliveira
Centro Universitário da Cidade do Rio de Janeiro, RJ - Brazil
_____________________________________________________________________________________

ABSTRACT The continued and diversified growth of social networks has changed the way in which users interact with them. With these changes, what once was limited to social contact is now used for exchanging ideas and opinions, creating the need for new features. Users have so much information at their fingertips that they are unable to process it by themselves; hence, the need to develop new tools. Recommender systems were developed to address this need and many techniques were used for different approaches to the problem. To make relevant recommendations, these systems use large sets of data, not taking the social network of the user into consideration. Developing a recommender system that takes into account the social network of the user is another way of tackling the problem. The purpose of this project is to use the theory of six degrees of separation (Watts 2003) amongst users of a social network to enhance existing recommender systems. Keywords: Recommender Systems, Social Networks.

1. INTRODUCTION

1.1. Position and Explanation

In the past, the study of knowledge discovery in databases, and more specifically of recommender systems (systems that filter information relevant to a specific user according to his/her profile), was limited to the data available to researchers. With the Internet explosion, new opportunities and challenges have arisen in this research field. More recently, the online social network phenomenon has enabled access to users' profiles and preferences. This has allowed several databases available on the Internet to be collected and used in experiments by researchers (GroupLens Research 2010). The importance of research on recommender systems arises from the wide scope of the available information, which makes it difficult for the user to access relevant items and raises the need for tools that help perform this task.

_____________________________________________________________________________________
Manuscript first received/Recebido em 05/03/2011 Manuscript accepted/Aprovado em: 17/11/2011
Address for correspondence / Endereço para correspondência
Cleomar Valois Batista Jr., Bachelor of Science in Computer Science (BSc CS), Centro Universitário da Cidade do Rio de Janeiro, NUPAC - Núcleo de Projetos e Pesquisa em Aplicações Computacionais, Rio de Janeiro, RJ - Brazil, +55(21)3593-6186, E-mail: jr@muitobom.com.br
Marcius Armada de Oliveira, Bachelor of Science in Computer Science (BSc CS), Centro Universitário da Cidade do Rio de Janeiro, NUPAC - Núcleo de Projetos e Pesquisa em Aplicações Computacionais, Rio de Janeiro, RJ - Brazil, +55(21)3495-3917, E-mail: marcius@marcius.com.br
Published by/ Publicado por: TECSI FEA USP – 2011 All rights reserved.


682 Valois B Jr., C. Armada, M.

For this same reason, the generation of good recommendations has been commercially exploited by large companies. Although these recommendations are acceptable, they still present several problems. The data used often do not correctly reflect the client's preferences: the items purchased are not always intended for the buyers themselves. They may be gifts, which makes the data unreliable (Gupta, Jain and Song 2008).

Most recommender systems researched and in operation make recommendations without taking into consideration any knowledge about the recommended items. For that reason, they must take into account the entire data space known to the system. However, because of the easy access to large quantities of information, usually beyond what can be processed in a reasonable response time, recommender systems not only need to deal with the quality of the recommendations, but must also filter the available base they are going to work with.

In order to verify a possible solution to this scalability problem (section 2.4.4.7) and reduce the volume of information, this article uses the concept that people closer to a person in a social network have more influence on his/her opinions (Mendes 2008). In this way, the existing data space may be divided according to the degree of separation among the individuals within the social network and still generate relevant recommendations from a database that would otherwise not be processed because of its size.

1.2. Objective

Recommender systems are widely used in several different domains for the recommendation of articles, music, movies, and even people. Portals such as Amazon and Submarino use recommender systems to suggest products to their customers, while social networks such as LinkedIn and Facebook use them to suggest new contacts. The techniques most commonly employed in recommender systems (section 2.4.3) are collaborative filtering and content-based filtering. Collaborative filtering takes into account neither the type of the items nor their attributes; it relies exclusively on the opinions expressed about other items in order to make recommendations. Content-based filtering, in turn, uses the knowledge it has of the items and their attributes to make recommendations. These techniques perform well, but they employ cluster1 solutions to solve the scalability problem (section 2.4.4.7) and process large volumes of data. This article looks for a new solution, different from the one normally employed. The objectives of this article are:

• From an evaluation database, similar to GroupLens' (GroupLens Research 2010) initiative, complemented with the relations among the participants, draw the graph of the social network;
• Evaluate the benefits to recommender systems of the knowledge provided by the social network. This database has missing

1 A cluster is composed of a group of linked computers running a special type of operating system, called a distributed system. It is often built from traditional computers (PCs) connected to a network; they communicate with each other through the system, operating as a single big machine.

JISTEM, Brazil Vol.8, No. 3, Sept/Dec. 2011, pp. 681-716

www.jistem.fea.usp.br


683 Recommender Systems In Social Networks

values that are treated as non-evaluated items. Recommender systems try to predict the user's evaluation of an item that has not yet been evaluated. Based on the concept that people who are more closely connected have more influence on each other's opinions (Mendes 2008), this article will try to predict these missing values based on relations, so as to generate more relevant results.
• Evaluate the applicability and efficiency of imputing the missing evaluations with refeeds (section 2.3), using the identified degrees of separation among participants, and verify, in contrast to the traditional approach, whether there are scenarios in which this application is more useful.

At the end of this article, results will be presented which indicate whether dividing the database by degrees of separation (section 2.2) has a positive effect on the results.

2. FOUNDATIONS AND TECHNOLOGIES

2.1. Social Networks

Social networks are structures of nodes (individuals or organizations) connected by social ties. The organization of each network depends on the surroundings in which it was created and operates. Each network has a particular organization of its members and, especially, of its facilitators' political culture and shared objectives (Amaral 2004). A social network represents a set of independent members joining ideas and resources around shared values and interests (Marteleto 2001). It is important to understand how networks are classified, as well as how they are formed. In the "Small World" model, ties are established randomly and some people are able to transform the network into a small world. In the book "Six Degrees of Separation - Small World", Watts (Watts 2003) discusses the idea that the average distance between two people on the planet does not exceed approximately six people, given that there are some random ties among groups. This topic is discussed in more detail in section 2.2.
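Viewing the social network as a graph, the degree of separation between two members is the length of the shortest chain of ties between them, which a breadth-first search computes directly. The following is a minimal Python sketch (not from the article); the toy network and user names are hypothetical.

```python
from collections import deque

def degrees_of_separation(graph, source):
    """BFS over an undirected friendship graph: shortest number of ties
    from `source` to every reachable user."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        user = queue.popleft()
        for friend in graph.get(user, ()):
            if friend not in dist:
                dist[friend] = dist[user] + 1
                queue.append(friend)
    return dist

# Hypothetical toy network (adjacency lists).
network = {
    "ana": ["bob"],
    "bob": ["ana", "carla"],
    "carla": ["bob", "dan"],
    "dan": ["carla"],
}
print(degrees_of_separation(network, "ana"))  # → {'ana': 0, 'bob': 1, 'carla': 2, 'dan': 3}
```

A recommender could use such distances to restrict the data space to users within a chosen degree of separation, as proposed in section 1.1.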

2.2. Six degrees of separation theory

Six degrees of separation refers to the idea that any person is only six "steps" away from any other person on Earth. In a social network, connecting two people would therefore require a chain of six or fewer links; that is, it is believed that two individuals can be connected through a maximum of five intermediaries (Figure 2.1).




Figure 2.1 Six Degrees of Separation

Source: (Watts 2003)

In his doctoral thesis, Michael Gurevich (Gurevich 1961) produced seminal empirical studies of the structure of social networks. The Austrian mathematician Manfred Kochen, involved in an urban statistics project, extended these empirical results in a mathematical manuscript (Sola Pool and Kochen 1978) and concluded that, in a population as big as that of the USA and without social structure, "it's virtually correct to say that two individuals can get in touch with each other through at least two intermediaries. In a socially structured population this is less probable, but it still can happen, and maybe if we consider the whole world population, probably only one more transition individual is needed". Simulations performed in 1973, using relatively limited computers, produced more realistic predictions of three degrees of separation among the US population, anticipating the results of the American psychologist Stanley Milgram, whose study (Milgram 1967) showed that "people in the US seem to be connected on average by a chain of three acquaintances. The simulations did not, however, speculate on the planet interactions". In 2001, Duncan Watts (Watts 2003), a Columbia University professor, tried to recreate Milgram's experiment on the Internet, using an email message as a "package" to be delivered. The experiment involved 48,000 senders and 19 intended recipients (in 157 countries). Watts found that the average number of intermediaries remained around six. Note that this was not the highest number of intermediaries.

2.3. Sequential Imputation

Imputation is any automatic or semi-automatic procedure capable of filling in missing values in databases (Goldschmidt and Passos 2005). Imputation methods, once restricted to the statistics domain, have since evolved and now include implementations based on Artificial Intelligence (AI) or even hybrid constructions (Farhangfar, Kurgan and Pedrycz 2007) (Lakshminarayan, Harp, and Samad 1999).
Imputation processes can be distinguished by whether the missing values occur in a univariate or multivariate way. Imputation is univariate when the missing values are present in only one attribute. In this scenario, several techniques are commonly applied, such as:




• Replacement with a central trend value: the missing data of quantitative variables are replaced with the variable's mean. This mean may be the average of the observed data or the average of a group whose characteristics most closely resemble the missing data, identified by one or more categorical variables found in the database.
• "Hot deck": the individual whose observed data most closely resemble those of the individual with the missing data, in relation to the auxiliary variables, is located; the missing data is then replaced with the corresponding matched value.
• Regression: the imputed values are predicted through regression, simply using one or more existing variables to predict the missing value of another variable. Two types of regression may be used for imputation: plain regression and regression with an additive component of the error variance.

However, with the increased complexity and size of databases in recent years, research and development of mechanisms for multivariate imputation, in which missing values occur in two or more attributes, has intensified (Schafer 1997). There are two ways of approaching the solution of multivariate problems (Vanbuuren, et al. 2006): joint modeling or fully conditional specification. Joint modeling consists of using statistical models (such as a Bayesian network) to estimate missing values in all attributes at once. Conditional specification techniques are generalized in the literature as sequential imputation (Commission and Europe 2000), in which the missing values to be imputed are processed sequentially, based on attributes (Lepkowski, et al. 2001) (Oudshoorn, Buuren, and Rijckevorsel 1999) or records (Kim, Kim and Yi 2004) (Verboven, Branden and Goos 2007).
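The first two univariate techniques above can be sketched in a few lines of Python. This is an illustrative sketch, not from the article; the record layout and field names are hypothetical, and `None` marks a missing value.

```python
def impute_mean(values):
    """Central-trend imputation: replace None entries with the mean of the observed values."""
    observed = [v for v in values if v is not None]
    mean = sum(observed) / len(observed)
    return [mean if v is None else v for v in values]

def impute_hot_deck(record, donors, missing_key, aux_keys):
    """'Hot deck': copy the missing field from the donor record most similar
    to `record` on the auxiliary variables (squared-distance match)."""
    def distance(d):
        return sum((d[k] - record[k]) ** 2 for k in aux_keys)
    donor = min((d for d in donors if d[missing_key] is not None), key=distance)
    return donor[missing_key]

print(impute_mean([4.0, None, 2.0]))  # → [4.0, 3.0, 2.0]
```

Sequential imputation, as described next, would apply such univariate routines attribute by attribute (or record by record) over a multivariate database.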
The main characteristic of imputing missing values sequentially is the breakdown of a multivariate problem into several univariate problems, which can be solved by the well-known traditional techniques of univariate imputation (Oudshoorn, Buuren, and Rijckevorsel 1999) (Gelman and Hill 2006).

2.4. Recommender Systems

2.4.1. Introduction

The explosive growth of the World Wide Web (WWW) and the emerging popularity of e-commerce and social networks have provided access to a large quantity of information that was previously inaccessible. Gathering data is no longer a problem; extracting useful information and presenting it to the user in a relevant way is. Recommender systems have been developed to help fill the gap between information collection and analysis, by filtering all available information and presenting the most relevant items to the user (Resnick and Varian 1997). The recommender system helps enhance the capacity and efficiency of this process. The biggest challenge of this type of system is finding the perfect match between those



686 Valois B Jr., C. Armada, M.

recommending and those receiving the recommendation; that is, defining and discovering the relation between their interests.

Information systems that filter relevant information for a specific user based on his/her profile are known as recommender systems. A recommender system usually compares the user profile with some reference characteristic and attempts to predict the evaluation a user would give to a particular item that has not yet been considered. E-commerce websites are currently the main interest group for recommender system usage, employing different techniques to find the most appropriate products for their clients and to raise sales volume.

There are two main approaches to recommender systems: collaborative filtering and content-based filtering, which will be explained in detail in section 2.4.3. Some authors, like Montaner (Montaner, Lopez, and de La Rosa 2003), highlight the existence of a third type of information filtering, called demographic filtering, which is outside the scope of this article. This approach uses people's descriptions to determine the relation between a specific item and the type of individual who may be interested in that particular item. The user profile is created by grouping users into stereotype classifications that represent the characteristics of a class of users. Personal data are usually requested from the user through registration forms and used to create a characterization of the user and his/her interests.

One of the algorithms commonly used in recommender systems is k-NN (section 2.4.5.1). In a social network, we may find neighbors of a specific user with the same tastes or interests.
To do this, the Pearson correlation coefficient is calculated over the preference data of the selected top-N neighbors of a specific user (weighted similarity), and specific techniques are used to calculate whether the user's preference can be predicted.

2.4.2. Evaluations

Users may make explicit or implicit evaluations. Explicit evaluations are usually a discrete value that belongs to a limited set of possible numerical values for each item (Resnick and Varian 1997), like a Likert-type scale. Implicit evaluations offer the advantage of reducing the user's workload and are usually extracted from the buying history or from browsing behavior on sites that require some type of evaluation.

2.4.2.1. Likert Scale

On a Likert-type scale (Table 2.1), the answers to each item vary according to a degree of intensity. This scale, with sorted, equally spaced categories and the same number of categories in all items, is widely used. It is formed by a set of phrases or sentences expressing positive or negative opinions. The evaluator (user) has to rate his/her degree of agreement, from "I strongly disagree" (level 1) to "I totally agree"


(level 5, 7 or 11), depending on the number of levels on the scale² (Chisnall 1973) (Likert 1932).

Table 2.1 Likert scales

Value   Graphic representation   Textual representation
5       (image)                  Excellent
4       (image)                  Very good
3       (image)                  Good
2       (image)                  Fair
1       (image)                  Poor

2.4.3. Collaborative filtering and content-based filtering

Balabanovic and Shoham (1997) defined two main approaches to recommender systems: collaborative filtering and content-based filtering. In collaborative filtering, the recommendation is based on the analysis of similar users to indicate items of preference. In content-based filtering, the recommendation is made through the analysis of items that are similar to the ones the user has already seen and evaluated.

2.4.3.1. Collaborative Filtering

In this type of filtering, recommendations are made based on predictions of user preferences resulting from interactions between other users. This type of filtering usually offers a higher degree of surprise to the user along with good recommendations and, in some cases, may offer totally irrelevant content. Collaborative filtering systems try to include people in the filtering process, since they can assess documents better than any computational task (Resnick and Varian 1997).
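As a toy illustration of collaborative filtering (hypothetical users, products and counts, not the paper's code or data), the sketch below recommends products bought by users whose purchases overlap the target user's, ranked by how many similar users bought them:

```python
# Collaborative filtering by co-purchase counts: a toy sketch (hypothetical data).
from collections import Counter

def recommend(target, purchases, top_n=2):
    """Recommend products bought by users similar to `target`.

    `purchases` maps each user to the set of products he/she bought.
    "Similar" users are simply those sharing at least one product with `target`.
    """
    mine = purchases[target]
    counts = Counter()
    for user, items in purchases.items():
        if user != target and items & mine:  # overlapping buying pattern
            for p in items - mine:           # products the target has not bought
                counts[p] += 1
    return [p for p, _ in counts.most_common(top_n)]

data = {
    "x": {"book"},
    "u1": {"book", "pen"},
    "u2": {"book", "pen", "lamp"},
    "u3": {"mouse"},
}
print(recommend("x", data))  # ['pen', 'lamp']: pen bought by 2 similar users
```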

² Likert originally recommended a scale of 5 points; scales of 5, 7 or 11 points are currently recommended, based either on individuals' lack of discriminatory capability when a scale has many answer possibilities, or on the fact that only a scale with many possibilities approximates the continuum of opinion.


Figure 2.2 Collaborative Filtering

Source: (Sarwar, et al. 2001)

A first approach to this type of filtering (Resnick and Varian 1997) establishes recommendations based on items consumed by users with the same consumption pattern as the current user (Figure 2.2). It is used mainly in e-commerce systems, such as Amazon and Submarino. To make this approach easier to understand, let's consider a user x and a set U consisting of the top users whose buying pattern is most similar to x's (since they bought some of the same products x has bought). Now consider vectors (p, n), where p is a product and n is the number of times p was purchased by a user in U. If the vector set (p, n) is sorted in descending order by n, the result is an ordered product recommendation list for x. A variation may apply different weights to the users in U, based on their relation to x.

The collaborative filtering approach solves a problem found in the content-based recommendation approach (section 2.4.3.2), where the user only receives items with similar contents. However, it does not solve other problems, such as the insertion of new items into the base, which will only be recommended after a certain number of users have read and assessed them. Another issue is handling users who do not share interests with other members of the population; such a unique user will have no peers on which the collaborative recommender system can base itself. These and other issues are described in section 2.4.4.

2.4.3.2. Content-Based Filtering

The content-based filtering approach (Figure 2.3) is based on the premise that the user would like to see items similar to those he/she has previously seen (Balabanovic and Shoham 1997). With information on a specific content, and data about a specific user that can be related to this information, it is possible to define the relation between user and content.
This approach employs content-based filtering techniques, for example, filtering by keyword and latent semantic analysis.


Figure 2.3 Content-based Filtering

Source: (Lorenzi 2006)

For example, a book about programming could be recommended to students of computer-related courses. This is the basis of content-based recommendation which, in contrast to collaborative filtering, does not use the relations among users to define the content. For this reason, content-based recommendation usually does not surprise the user, since the relation and the means used by the system to recommend can be inferred directly by the user, even if unconsciously.

2.4.4. Recommender Systems Issues

2.4.4.1. Cold Start

The Cold Start issue is among the most prevalent in recommender systems. The problem occurs at the start of the system, when there are no assessments from other users. A recommender system usually compares the user profile with some reference characteristics, which can be based on content (content-based approach) or on the user's social environment (collaborative filtering approach). In the content-based approach, the system should be able to match the characteristics of an item to relevant characteristics in the user's profile. To do that, a model with sufficiently detailed information on the user, including his/her tastes and preferences, must first be built. This can be done explicitly (by consulting the user) or implicitly (by observing the user's behavior). In both cases, the Cold Start issue requires the user to dedicate effort to creating his/her profile before the system can make any relevant recommendation. Because of the Cold Start issue, items not previously assessed would be ignored in the collaborative filtering approach.


2.4.4.2. Gray Sheep

If a user has rare tastes, the recommendation may not be accurate, as there are no "close neighbors". This problem is called gray sheep (Resnick and Varian 1997) (Lorenzi 2006). In a collaborative filtering system, a user with this profile is not easily related to other users, making it difficult to recommend items. In a content-based filtering system, even if the user has a rare profile, recommending items related to this profile is not an issue, since recommendations are more generic. For example, if the system identifies that a user is interested in technology and oceanography, it will easily recommend these items to the user, even if only unpopular items have been evaluated.

2.4.4.3. Early Rater

When a new item emerges, it cannot be recommended to a user before someone assesses it (Resnick and Varian 1997) (Lorenzi 2006). This issue is clearly identified in collaborative filtering: a newly inserted item with no user assessments or recommendations cannot be recommended. In content-based filtering, knowing the contents of an item is enough to enable a recommendation to a user.

2.4.4.4. Sparse Evaluations

When there are few users and many items, the evaluations may become sparse and it becomes difficult to find similar users (Resnick and Varian 1997) (Lorenzi 2006). In collaborative filtering, this issue is easily identified because the filtering is completely based on users' assessments of items. In content-based filtering, the recommendation does not depend on the number of users and items, but rather on their profiles and contents.

2.4.4.5. Super-Specialization

Only items that are similar to those previously evaluated by the user will be recommended; exploring new item categories is not possible (Resnick and Varian 1997) (Lorenzi 2006). In content-based filtering, this issue is clearly identified.
A user whose profile has been defined will always receive items related to this profile, and any personal profile modification (outside the system) will not be reflected in the system. In collaborative filtering, item recommendation is not based on the user's initial profile, but rather on his/her actions and relation to other users.

2.4.4.6. Serendipity

This issue is related to the lack of surprise in the recommendation: products that are not related to the user's profile may never be recommended (Resnick and Varian 1997) (Lorenzi 2006). This problem occurs in content-based filtering, since the recommended contents will always belong to the same group related to the user profile. Meanwhile, in collaborative filtering, surprise occurs more frequently, since similar users may have evaluated items completely different from those seen by the original user.


2.4.4.7. Scalability

When the quantity of users, items and evaluations is too large, a system that executes real-time calculations of the relations among users may have a very long response time and may need computational resources that are not available. This is a common problem in both approaches. However, in collaborative filtering, this issue is more evident, as the calculations are done using all users and all items. In content-based filtering, calculations are done using only one user and all related items, considering all attributes (Resnick and Varian 1997) (Kajimoto, et al. 2007).

2.4.5. Recommender System Techniques

The objective of recommender systems is to provide recommendations based on recorded information on the users' preferences. These systems use information filtering techniques to process information and provide the user with potentially more relevant items. This section presents collaborative filtering techniques based on users and on items (J., et al. 1999) (Sarwar, et al. 2001). For K users and M items, the users' evaluations are represented in a K x M user-item matrix (Figure 2.4). Each element r_{k,m} > 0 indicates that user k evaluated item m with rating r_{k,m}, that is, the item has been evaluated; r_{k,m} = 0 indicates that the evaluation is unknown.

Figure 2.4 User x Item Matrix

Source: (Wang, P. and J.T. 2005)

The user-item matrix (Figure 2.4) may be decomposed into row vectors, R = [u_1, u_2, ..., u_K]^T, where ^T stands for transposition. Each row vector u_k^T corresponds to a user profile and represents that user's evaluations of particular items. This decomposition leads to collaborative filtering based on users. Alternatively, the matrix may also be represented by column vectors:


where each column vector corresponds to a specific item, as evaluated by all K users. This representation results in the item-based recommendation algorithm.

2.4.5.1. Based on Users (k-NN)

The k-nearest-neighbors algorithm (k-NN), originally proposed in (Cover and P. E. 1974), is based on the concept of similarity: it builds a group of objects (the closest users) from which candidates are extracted in order to impute the evaluation of an item. The similarity concept is based on the idea of distance. The Euclidean distance is perhaps the most popular in the literature, either because it was the first to be proposed or because of the simplicity of its calculation (Ferlin 2008). Collaborative filtering based on users predicts a user's interest in an item based on the evaluations of similar users (J., D. and C. 1998) (J., et al. 1999). As shown in Figure 2.5,

each user profile is classified based on its similarity to the profile of the user for which the prediction is being made. The evaluations performed by the most similar users have more influence on the prediction of the item evaluation for the relevant user. The list of the most similar users may be identified by using a cut factor or by selecting the most similar top-N users.

Figure 2.5 Evaluation prediction based on user similarities

Source: (Wang, P. and J.T. 2005)

The Euclidean distance and the Pearson correlation are common similarity measures in collaborative filtering (P., et al. 1994) and will be discussed in sections 2.4.6.1 and 2.4.6.2, respectively. Existing methods differ in the way they handle unknown evaluations. Unknown evaluations may be interpreted as a zero-value evaluation (P., et al. 1994), or filled by interpolation³ between the mean of the user's evaluations and the mean of similar users' evaluations (Xue, et al. 2005). After the most similar users are defined, the evaluation of the specific user is estimated using only the known evaluations of these users (P., et al. 1994).
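A minimal sketch of this estimation step (hypothetical data and ratings, not the authors' code): the unknown evaluation is taken as the similarity-weighted mean of the neighbors' known evaluations, using the 1 / (1 + Euclidean distance) similarity discussed in section 2.4.6.1.

```python
# Predicting one user's unknown evaluation from similar users' known
# evaluations, weighted by similarity. Hypothetical data.
from math import sqrt

def similarity(a, b):
    """1 / (1 + Euclidean distance) over the items both users rated."""
    common = a.keys() & b.keys()
    d = sqrt(sum((a[i] - b[i]) ** 2 for i in common))
    return 1.0 / (1.0 + d)

def predict(user, item, ratings):
    num = den = 0.0
    for other, r in ratings.items():
        if other != user and item in r:     # use only known evaluations
            s = similarity(ratings[user], r)
            num += s * r[item]
            den += s
    return num / den                        # similarity-weighted mean

ratings = {
    "u": {"101": 4.0, "102": 3.0},
    "v1": {"101": 4.0, "102": 3.0, "103": 5.0},   # identical profile, s = 1.0
    "v2": {"101": 1.0, "103": 2.0},               # distant profile, s = 0.25
}
print(predict("u", "103", ratings))  # (1.0*5.0 + 0.25*2.0) / 1.25 = 4.4
```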

³ Through interpolation, we can create a function that approximately "fits" the specific data, thus giving them the desired continuity.


The user-based recommendation algorithm (Algorithm 1) can be described as the process of recommending items to a user u in the following way:

for each user v (that is not u)
    compute the similarity between u and v
    add the most similar users to the list L of u's "neighbors"
for each item i evaluated by a user in L but not evaluated by u
    for each user v in L that has evaluated i
        compute the similarity s between u and v
        multiply the evaluation of v for item i by weight s (a_{v,i} = a_{v,i} * s)
        incorporate the weighted evaluation into the mean of the evaluations for item i
    add the mean of the evaluations of item i to list A
return the top-N items from list A, sorted by the average of the evaluations in descending order

Algorithm 1 Recommendation algorithm based on user

Source: (Owen, et al. 2010)

First, the most similar users are identified in order to know which items they find interesting. These items are considered candidates for recommendation to the intended user.

2.4.5.2. Slope-One

This approach pre-computes the average difference between the evaluations of each pair of items previously evaluated by users. The Slope-One system is recommended as the new reference system for recommender systems in production by (Lemire and Maclachlan 2005) for the following reasons:

• Supports dynamic updates: the addition of new evaluations to the system changes all predictions instantaneously;
• Efficient when consulted: the searches are fast, even though the system requires larger storage capacity than other approaches;
• A user with few evaluations should still receive relevant recommendations;
• Reasonably precise: several approaches "compete" for the most accurate prediction; however, even the smallest gain in precision is not always worth the sacrifice in simplicity or scalability.

As an example, let's suppose that people who enjoyed the movie "Carlito's Way" apparently also liked another movie starring Al Pacino, "Scarface", but they seem to like "Scarface" more. Let's suppose that on a five-star scale (section 2.4.2.1), most people who watched "Carlito's Way" gave the movie 4 stars and "Scarface" 5 stars. According to this reasoning, if another person gave "Carlito's Way" 3 stars, it would be possible to assume that this same person would give "Scarface" 4 stars: one more star. Furthermore, suppose people who evaluated "Scarface" gave "The


Godfather" the same rating. Subsequently, a user who evaluated "Carlito's Way" with 2 stars and "The Godfather" with 4 stars would have his/her estimated evaluation for "Scarface" calculated the following way: based on the evaluation of "Carlito's Way", 2.0 + 1.0 = 3.0; based on the evaluation of "The Godfather", 4.0 + 0.0 = 4.0. The simple average of these two evaluations gives an estimated evaluation of (3.0 + 4.0) / 2 = 3.5. This is the essence of the Slope-One approach.

The name Slope-One originates from the fact that the recommendation algorithm is based on the assumption that there is a linear relation between the evaluation values of two items, so that it is usually possible to estimate the evaluations of an item Y based on the evaluations of an item X using a linear function of the form Y = mX + b. Slope-One further simplifies this assumption by fixing m at 1. Therefore, it is enough to find b = Y − X by computing the average difference of the evaluation values for each pair of items. This represents a significant preprocessing phase (Algorithm 2), in which all differences are computed:

for each item i
    for each item j (not i)
        for each user u that evaluated i and j
            add the difference (b = a_{u,i} − a_{u,j}) to an average
        add the average difference d_{i,j} to list D

Algorithm 2 Slope-One pre-processing

Source: (Owen, et al. 2010)

After the preprocessing phase, the recommender system that uses Slope-One is able to make recommendations:

for each item i not evaluated by user u
    for each item j evaluated by user u
        find the average difference d_{i,j} between i and j in list D
        add this difference to u's evaluation of j (a_{u,j} + d_{i,j})
        add this value to an average
return the top-N items sorted by these averages

Algorithm 3 Slope-One processing

Source: (Owen, et al. 2010)

Slope-One performance does not depend on the number of users in the K x M matrix (Figure 2.4). It depends exclusively on the average difference between every pair of items, which can be pre-computed. Furthermore, this structure can be efficiently updated: simply update the average difference whenever there is a new evaluation or a change in an existing preference (Owen, et al. 2010).
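The two Slope-One phases can be sketched as follows, using the movie example from the text (hypothetical ratings; a sketch, not the authors' code):

```python
# Slope-One pre-processing and prediction. Ratings follow the movie example:
# "Carlito's Way" (CW), "Scarface" (SF), "The Godfather" (GF).
from collections import defaultdict

def diffs(ratings):
    """Average difference d[(i, j)] = mean(a_{u,i} - a_{u,j}) over co-raters."""
    total, count = defaultdict(float), defaultdict(int)
    for r in ratings.values():
        for i in r:
            for j in r:
                if i != j:
                    total[(i, j)] += r[i] - r[j]
                    count[(i, j)] += 1
    return {p: total[p] / count[p] for p in total}

def predict(user_ratings, item, d):
    """Estimate `item` from each rated item j via a_{u,j} + d[(item, j)].

    Assumes every needed (item, j) pair was seen during pre-processing.
    """
    ests = [user_ratings[j] + d[(item, j)] for j in user_ratings]
    return sum(ests) / len(ests)

history = {"a": {"CW": 4.0, "SF": 5.0}, "b": {"SF": 3.0, "GF": 3.0}}
d = diffs(history)                                # SF-CW = +1.0, SF-GF = 0.0
print(predict({"CW": 2.0, "GF": 4.0}, "SF", d))   # (2+1 + 4+0) / 2 = 3.5
```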


However, please note that the memory required to store all these differences grows quadratically with the number of items (items²) (Owen, et al. 2010).

2.4.6. Similarity Metrics

Similarity metrics are functions that numerically measure the degree of similarity between two entities; in the case of recommender systems, items or users. Metrics are usually needed in recommender systems in which entities are compared one by one, and the similarity measure is the instrument used to distinguish, among entities, the similar from the non-similar candidates (Webber 1998). Similarity metrics summarize, through a measure of importance, the similarity at the level of each attribute, and are used to model the relevance among attributes. Consequently, any mathematical aggregation is valid, such as the weighted average. It is also possible to view similarity evaluation as a problem of pattern recognition within the entity space (Cover and P. E. 1974).

2.4.6.1. Euclidean Distance

In mathematics, the Euclidean distance (or metric distance) is the distance between two points, which can be derived by repeated application of the Pythagorean theorem. If this formula is used as a distance, the Euclidean space becomes a metric space. This idea makes sense for recommender systems if users are represented as points in a space with one dimension per item, where the coordinates are the evaluation values; the metric then computes the Euclidean distance between two such points (users). This value by itself is not a valid similarity metric, because higher values mean bigger distances and lower similarity, whereas the values should be higher when the users are more similar.
Therefore, in a system implementation, the similarity measure is calculated as 1 / (1 + d) (Table 2.2): when the distance is zero (identical evaluations by both users), the calculated similarity measure is 1, and it decreases toward zero as the distance increases (Owen, et al. 2010).

Table 2.2 Euclidean distance and similarity measure calculated in relation to user 1

        Item 101  Item 102  Item 103  Distance  Similarity related to user 1
User 1  5.0       3.0       2.5       0.000     1.000
User 2  2.0       2.5       5.0       3.937     0.203
User 3  2.5       -         -         2.500     0.286
User 4  5.0       -         3.0       0.500     0.667
User 5  4.0       3.0       2.0       1.118     0.472

source: (Owen, et al. 2010)


The formula for calculating the Euclidean distance, considering x and y as data vectors, is:

d(x, y) = √( Σᵢ (xᵢ − yᵢ)² )
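The distance and similarity values of Table 2.2 can be reproduced with a short sketch (not the authors' code), computing the distance over the items both users evaluated:

```python
# Euclidean distance and the 1/(1+d) similarity measure from Table 2.2,
# computed over the items both users evaluated.
from math import sqrt

def euclidean_similarity(a, b):
    common = a.keys() & b.keys()
    d = sqrt(sum((a[i] - b[i]) ** 2 for i in common))
    return d, 1.0 / (1.0 + d)

user1 = {101: 5.0, 102: 3.0, 103: 2.5}
user2 = {101: 2.0, 102: 2.5, 103: 5.0}
d, s = euclidean_similarity(user1, user2)
print(round(d, 3), round(s, 3))  # 3.937 0.203, matching the User 2 row
```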

2.4.6.2. The Pearson Correlation Coefficient

The Pearson correlation coefficient is a measure of the degree of linear relation between two variables. The coefficient varies between -1 and 1: a value of 0 means that there is no linear relation; a value of 1 indicates a perfect positive linear relation; and a value of -1 indicates a perfect, but reverse, linear relation, that is, when one of the variables increases the other decreases. The closer to 1 or -1, the stronger the linear association between the two variables. The Pearson correlation coefficient ρ between two data vectors x and y is calculated as:

ρ = Σᵢ (xᵢ − x̄)(yᵢ − ȳ) / ( √( Σᵢ (xᵢ − x̄)² ) · √( Σᵢ (yᵢ − ȳ)² ) )

ρ = 1 means a perfect positive correlation between the two variables; ρ = -1 means a perfect negative correlation, that is, if one increases, the other always decreases; ρ = 0 means the two variables do not depend on each other linearly. However, another kind of (non-linear) dependency may exist, so the result ρ = 0 must be investigated by other means. The Pearson correlation coefficient measures the tendency of two series of numbers, paired one by one, to move together (Owen, et al. 2010).

Table 2.3 The Pearson correlation related to user 1

        Item 101  Item 102  Item 103  Correlation with user 1
User 1  5.0       3.0       2.5       1.000
User 2  2.0       2.5       5.0       -0.764
User 3  2.5       -         -         -
User 4  5.0       -         3.0       1.000
User 5  4.0       3.0       2.0       0.945
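The User 5 entry of Table 2.3 can be reproduced with a short sketch (not the authors' code), applying the formula above to the co-rated items:

```python
# Pearson correlation over co-rated items, reproducing the User 5 row of Table 2.3.
from math import sqrt

def pearson(a, b):
    common = sorted(a.keys() & b.keys())
    n = len(common)
    xs, ys = [a[i] for i in common], [b[i] for i in common]
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

user1 = {101: 5.0, 102: 3.0, 103: 2.5}
user5 = {101: 4.0, 102: 3.0, 103: 2.0}
print(round(pearson(user1, user5), 3))  # 0.945, matching Table 2.3
```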

source: (Owen, et al. 2010)

2.4.6.3. Log-Likelihood

The log-likelihood similarity metric is similar to the Tanimoto coefficient, since the evaluations given by the users are not taken into consideration, but it is more difficult to understand intuitively. The mathematics involved in this metric is beyond this


paper's scope. Although it is also based on the number of items common to two users, this measure takes into consideration how rare the intersection of the evaluated items is (Owen, et al. 2010). To illustrate this, let's consider two movie fans who both evaluated the movies "Star Wars" and "Casablanca". If they evaluate hundreds of movies, the fact that they evaluated these two is not relevant, because many people have watched them. If both have evaluated only a few movies, the fact that they have watched these two movies is considered relevant, because it is not usual for a fan of "Star Wars" to also be a fan of "Casablanca".

Table 2.4 Similarities using Log-Likelihood related to user 1 (an X marks an evaluated item)

        Item 101  Item 102  Item 103  Item 104  Item 105  Item 106  Item 107  Similarities with user 1
User 1  X         X         X         -         -         -         -         0.90
User 2  X         X         X         X         -         -         -         0.84
User 3  X         -         -         X         X         -         X         0.55
User 4  X         -         X         X         -         X         -         0.16
User 5  X         X         X         X         X         X         -         0.55

source: (Owen, et al. 2010)

For the purpose of the experiments, log-likelihood will be used as the metric that does not take into consideration the evaluations given by the users.

2.4.7. Evaluation of Recommender Systems

In characterizing recommender systems as scientific research, it is vital to understand the methodologies for system evaluation. An efficient way to evaluate recommender systems is through the comparison of the generated predictions with the real evaluations made by the user: a certain evaluation is suppressed, the recommender system is used to predict it, and finally both values are compared. Obtaining metrics to evaluate the performance of a recommender system before wide commercial usage is vital to check whether the predictions will be appropriate for the specific purpose. The metrics most used in the literature for the evaluation of recommender systems are presented below. It is important to highlight that, for each dataset or business domain, a specific recommender system may be more appropriate than others; it is only possible to define which recommender system best applies to a domain through experimentation and analysis of the results.


2.4.7.1. Root Mean Square (RMS) and Mean Absolute Error (MAE)

In statistics, the Root Mean Square (RMS) error is one of several ways to measure the difference between an estimated value and the actual value. The RMS value calculated for a set of differences is the square root of the arithmetic mean of the squares of those differences. The Mean Absolute Error (MAE) is another way to quantify the difference between an estimated value and the actual value: as the name suggests, it is the mean of the absolute errors.

Table 2.5 exemplifies the differences between RMS and MAE.

Table 2.5 Difference between MAE x RMS

           Item 1  Item 2  Item 3
Actual     3.0     5.0     4.0
Estimated  3.5     2.0     5.0
Difference 0.5     3.0     1.0

MAE = (0.5 + 3.0 + 1.0) / 3 = 1.5
RMS = √((0.5² + 3.0² + 1.0²) / 3) = 1.8484
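The two error measures of Table 2.5 can be computed directly (a sketch, not the authors' code):

```python
# MAE and RMS for the actual vs. estimated evaluations in Table 2.5.
from math import sqrt

def mae(actual, estimated):
    return sum(abs(a - e) for a, e in zip(actual, estimated)) / len(actual)

def rms(actual, estimated):
    return sqrt(sum((a - e) ** 2 for a, e in zip(actual, estimated)) / len(actual))

actual = [3.0, 5.0, 4.0]
estimated = [3.5, 2.0, 5.0]
print(round(mae(actual, estimated), 4))  # 1.5
print(round(rms(actual, estimated), 4))  # 1.8484
```

Note how the single large error on Item 2 inflates RMS relative to MAE, since the errors are squared before averaging.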

2.4.7.2. Precision, Recall, and Fall-Out

Precision measures the accuracy of the recommendations made by a recommender system: the quantity of recommended items that are actually interesting to the user, compared with the set of all recommended items (Figure 2.6). The precision of a system shows how close the prediction is to the actual evaluation made by the user.


Figure 2.6 Precision

source: (Owen, et al. 2010)

Recall indicates the quantity of items interesting to the user that appear in the recommendation list, compared with all relevant items.

Figure 2.7 Recall

Fall-out is the proportion of non-relevant items that are recommended, compared with all non-relevant items.

Figure 2.8 Fall-Out
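These three measures can be computed as simple set ratios (a sketch with hypothetical item sets, not the paper's code):

```python
# Precision, recall, and fall-out as set ratios over a small hypothetical catalog.
def metrics(recommended, relevant, catalog):
    non_relevant = catalog - relevant
    precision = len(recommended & relevant) / len(recommended)
    recall = len(recommended & relevant) / len(relevant)
    fall_out = len(recommended & non_relevant) / len(non_relevant)
    return precision, recall, fall_out

catalog = {"a", "b", "c", "d", "e", "f"}
relevant = {"a", "b", "c"}       # items the user actually finds interesting
recommended = {"a", "b", "d"}    # items the system recommended
print(metrics(recommended, relevant, catalog))
# 2 of 3 recommendations are relevant (precision 2/3), 2 of 3 relevant items
# were recommended (recall 2/3), 1 of 3 non-relevant items leaked in (fall-out 1/3)
```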


3. SUGGESTED SOLUTION

3.1. Introduction

One of the aims of a recommender system is to generate relevant recommendations. To accomplish this, collaborative filtering uses massive datasets, no matter the item types, and the recommendations are calculated based on these data (Figure 3.1). But working with a massive data volume represents a scalability problem that should not be underestimated. This project intends to test a way to decrease the user x item matrix space and to check whether, in this way, the recommender system is able to improve the relevance of the recommended items and minimize the issues of sparsity, super-specialization and lack of surprise (section 2.4.4).

Figure 3.1 Top-Down view of traditional recommender systems

3.2. Separation Degree

As mentioned in the social networks basics, social network members and the relationships established among them form a graph from which it is possible to extract their separation degree or, in graph theory terms, their distance. In this paper, the separation degree is taken as a natural grouping factor, defined by the members themselves, as people tend to get closer to others who share common interests (Mendes 2008). Following this reasoning, the smaller the distance between network members, the bigger the similarity of their interests. This way, we can represent this closeness among the members in a distance matrix (Figure 3.2) and group the people who are most similar to each user.


Distance matrix and heat map (values are separation degrees between members):

       u1  u2  u3  u4  u5  u6  u7  u8  u9  u10
u1      0   1   6   2   1   3   1   2   5   3
u2      1   0   3   1   4   5   2   6   3   4
u3      6   3   0   1   4   2   5   3   2   4
u4      2   1   1   0   3   2   5   3   5   3
u5      1   4   4   3   0   5   4   1   2   4
u6      3   5   2   2   5   0   2   6   3   1
u7      1   2   5   5   4   2   0   2   3   6
u8      2   6   3   3   1   6   2   0   2   1
u9      5   3   2   5   2   3   3   2   0   5
u10     3   4   4   3   4   1   6   1   5   0

Figure 3.2 Distance matrix
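Grouping members by their distance to a given user, as the matrix suggests, can be sketched as follows (using the u1 row of Figure 3.2; the function name is illustrative):

```python
def group_by_distance(dist_row):
    # dist_row: distances from one user to every member, e.g. the u1 row of
    # the distance matrix; returns {separation degree: sorted member list}
    groups = {}
    for member, d in dist_row.items():
        if d > 0:  # skip the user itself (distance 0)
            groups.setdefault(d, []).append(member)
    return {d: sorted(groups[d]) for d in sorted(groups)}
```

For u1 this yields {u2, u5, u7} at degree 1, {u4, u8} at degree 2, and so on, which is the partitioning used in Figure 3.3.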

These groups, naturally formed by the separation degree, are used to partition the massive data, thus decreasing its space. Combining these two ideas, we obtain a user x item matrix (U x I) for u1, partitioned as follows (Figure 3.3):

Partitioned U x I matrix:

       l1  l2  l3  l4  l5  l6  l7  l8  l9  l10
u1      3   5   2   5   4   2   5   4   5   3
u2      4   5   5   2   4   2   2   3   3   1   (1st separation degree)
u5      1   3   4   2   4   4   5   2   3   5   (2nd separation degree)
u7      5   4   1   4   5   4   3   1   2   4   (3rd separation degree)
u4      2   2   1   3   2   4   4   2   5   4   (4th separation degree)
u8      2   1   4   1   2   3   2   4   2   1   (5th separation degree)
u6      4   5   2   3   1   1   1   2   2   1   (6th separation degree)
u10     5   1   1   2   4   3   4   2   3   5
u9      2   4   2   4   3   3   1   4   5   2
u3      4   5   1   4   4   4   3   1   5   3

Figure 3.3 Top-down matrix U x I, with evaluations, partitioned according to the u1 separation degrees

But the exclusion of items that have been evaluated only by more distant users could lead to super-specialization or lack of surprise in the generated recommendations (section 2.4.4). In an attempt to solve these problems, we add another concept used in this solution, explained in the next section.

3.3. Sequential Imputation

Another problem of recommender systems (section 2.4.4) derives from the sparsity of the evaluations in the U x I matrix. Among the possible solutions (Soares 2007), the imputation of missing values may create noise and end up compromising the recommendations instead of improving them. But if the imputation is done in a reduced space, among grouped users that supposedly have high similarity, it may be possible to impute values with a reduced noise level. To ease the sparsity, super-specialization and lack-of-surprise problems (section 2.4.4), this solution combines the idea of partitioning through the separation degree of the social network members with the idea of sequential imputation, as follows (Figure 3.4, Figure 3.5):

1. One of the studied collaborative filtering algorithms is chosen to estimate user evaluation values for an item. From now on, it is called the auxiliary collaborative filtering of this system.
2. Each time the system reaches a separation degree, it tries to impute the missing evaluations of the users at that distance from the user to whom the system is recommending. For the imputation, only the evaluations in this space and the auxiliary collaborative filtering are used.
3. The system is then fed with the users of the following degree, their evaluations, and the evaluations imputed at the previous degree.
4. Steps 2 and 3 are repeated until the desired separation degree is reached.
5. With the resulting matrix, containing the actual evaluations and the ones imputed by the system, the auxiliary collaborative filtering is used to create the recommendations for the user.
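The steps above can be sketched as follows. Here `predict` stands for the auxiliary collaborative filtering and `recommend` for the final recommendation step, both passed in as functions; all names are illustrative, not the paper's actual implementation (which uses Mahout):

```python
def social_based_recommend(ratings, degrees, target, max_degree, predict, recommend):
    # ratings: {user: {item: value}}; degrees: {user: separation degree from target}
    pool = {target: dict(ratings.get(target, {}))}
    for d in range(1, max_degree + 1):
        layer = [u for u, deg in degrees.items() if deg == d]
        for u in layer:
            pool[u] = dict(ratings.get(u, {}))          # feed this degree's users in
        items = {i for r in pool.values() for i in r}   # item space seen so far
        for u in layer:                                 # impute this layer's gaps
            for i in items - pool[u].keys():
                est = predict(pool, u, i)               # auxiliary collaborative filtering
                if est is not None:
                    pool[u][i] = est
    # final pass: actual plus imputed evaluations feed the recommendations
    return recommend(pool, target)
```

The imputation at each degree only sees the reduced space built so far, which is the point of the sequential scheme.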


Figure 3.4 Top-Down view of the recommender system based on separation degree to do sequential imputation

With the suggested solution defined, we move on to the implementation phase, the tests and the result analysis.

Figure 3.5 Process of SocialBased Recommendation


4. TESTS AND RESULTS

4.1. Data Summary

From the massive volume of data, 100 users have been randomly selected to form the group of root users of the social network.

Table 4.1 User set (total of users)

Groups        edges     nodes
Root          -         85
1st degree    3,156     3,140
2nd degree    9,427     6,557
Total         12,583    9,782

Total of books: 441,311
Total of evaluations: 2,128,782
Average quantity of books/evaluations per user: 218

To enable experiment processing, the data space had to be reduced: only the 500 most popular books have been selected. This defines the data space used for the tests, as follows:

Table 4.2 Reduced user set (total of filtered users)

Groups        edges     nodes
Root          -         85
1st degree    2,927     2,911
2nd degree    8,587     5,914
Total         11,514    8,910

Total of books: 500
Total of evaluations: 338,690
Average quantity of books/evaluations per user: 38

4.2. Summary of experiments

4.2.1. Tests Using RMS and MAE

For each recommender system to be tested, the following parameters are alternated:

• The recommender system itself: GenericItemBased, GenericUserBased, KnnItemBased, SlopeOne, SVD, SocialBased;
• The database percentages used for training and testing: 60%-40% and 70%-30%;
• The quantity of root users: 10, 25, 40, 55, 70 and 85 users;
• The quantity of books: 50, 100, 150 and 200 books. The books used were the ones most evaluated by the root users and their descendants, therefore the most popular books;
• Three rounds of tests have been run for each combination.

The values used have been chosen because they are the most common ones found in the literature, following our advisor's suggestions.

4.2.2. Tests Using Precision, Recall and Fall-Out

These tests try to find a relation between the books relevant to the user and the ones recommended by the system. The calculated measures are precision, recall and fall-out, which have been explained in the basics of this paper. For these tests, each recommender system has been used to recommend 10 books to each of the root users, varying the quantities of books and users in the same way as in the previous tests. The list of items considered relevant to each tested user is formed by the books whose evaluation value is greater than a defined threshold, calculated as the average plus one standard deviation. The quantity to be recommended to each tested user was set at 10 books because it is the most common one found in several different sources (newspaper, radio, television, internet etc.); a Top-10 list is therefore an easily understandable measure for the tests that have been run. It is worth clarifying that the training percentages are not taken into account in this type of test, since the evaluation is done as if the system were fully operational. Therefore, all other users' evaluations and all of the tested user's evaluations are used, except for the evaluations of books considered by the system as relevant to the user and books that have been set aside for system evaluation.

4.2.3. Summary of test result analysis

In addition to analyzing the general results of all tested recommender systems, we separately compared UserBased (k-NN) against SocialBased (using UserBased (k-NN) itself as the auxiliary recommender system), since this is the most frequently used implementation (Ferlin 2008), to check the benefits brought by the suggested solution to traditional implementations.

4.3. Generic User Based (k-NN) x Social Based

With just a few users, the SocialBased recommender system implemented in this project suffers a drawback when compared against the basic implementation of UserBased (k-NN). In all other experiments, SocialBased has been able to enhance the precision of the recommendations. The result of the 85x200 combination contradicts the other results and needs to be checked with a higher number of rounds.

4.3.1. Root Mean Square

Figure 4.1 RMS measure with Euclidean distance (a)
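Figure 4.1 reports the RMS measure. RMS and MAE are the standard error measures between the actual evaluations and the values estimated by a recommender; a minimal sketch (names illustrative):

```python
from math import sqrt

def rms_mae(actual, predicted):
    # paired errors between actual ratings and the recommender's estimates
    errors = [actual[k] - predicted[k] for k in actual if k in predicted]
    rms = sqrt(sum(e * e for e in errors) / len(errors))
    mae = sum(abs(e) for e in errors) / len(errors)
    return rms, mae
```

Lower values mean the imputed evaluations are closer to the actual ones.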


Figure 4.2 RMS measure with Euclidean distance (b)


4.4. Performance evaluation

As Figure 4.3 shows, the implementations of Slope-One and UserBased (k-NN) proved more efficient than the other recommender systems. However, the response times of the other recommender systems would not preclude their usage in a production environment.

Figure 4.3 Response time for each Recommender system

Figure 4.4 and Figure 4.5 present the measured average times for the recommendation of 10 books for each tested user-book combination.


Figure 4.4 Average time for the recommendation of 10 books (a)

Figure 4.5 Average time for the recommendation of 10 books (b)


The performance advantage of Slope-One and UserBased (k-NN) is undermined by the memory usage shown in Figure 4.6: the recommender systems with the best response times are the ones that consume the most memory. On this measure, the memory savings of SocialBased were greater than those of the traditional implementation.

Figure 4.6 Used memory by each Recommender system

Figure 4.7 and Figure 4.8 show the average memory consumption for each tested user-book combination. The average memory consumption of the recommender systems was around 185 Mbytes, with values varying between 100 Mbytes and 225 Mbytes.


Figure 4.7 Memory consumption (a)

Figure 4.8 Memory consumption (b)

4.5. Discussion of the results

After analyzing the data, we concluded that the bigger the number of users and books, the bigger the drop in the quality of the recommendations; this result depends neither on the recommender system nor on the type of evaluation done. This behavior is caused by the increase in data sparsity: as the U x I (user x book) matrix grows, the Precision and Recall results drop.

However, the recommended items are not necessarily irrelevant to the user. The evaluation algorithm used by Precision and Recall needs to select relevant items already evaluated by the user (section 4.2.2), which reduces the set of relevant items to a small number. If the recommender system suggests items from this list, it is working perfectly; if it does not, that does not mean the suggested items are uninteresting to the user, only that they were not among the test data selected to evaluate the system.

In several combinations of users and books, SocialBased managed to improve on the results of the traditional implementation, using sequential imputation and the partitioning of the social network through the separation degree up to the second degree. Although the SocialBased response time was higher than that of most of the tested systems, this does not preclude its use. On the other hand, the gains from its low memory consumption are encouraging: around 17.5% in comparison with the traditional implementation (k-NN). It was also possible to conclude that varying the user quantity does not influence the quality of the results as significantly as increasing the item quantity does.

The results achieved by RMS and MAE measure the error between the actual evaluations and the evaluations imputed by the recommender systems. However, the goal of a recommender system is not to impute values close to the actual ones, but to recommend the items most relevant to a specific user (section 2.4). What matters for the final quality of a recommender system is that the Precision, Recall and Fall-out results are the best possible ones.
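The relevance threshold used in this Precision/Recall evaluation (section 4.2.2), the average rating plus one standard deviation, can be sketched as follows (assuming the population standard deviation, since the paper does not specify which variant was used):

```python
import statistics

def relevant_items(user_ratings):
    # an item is "relevant" when its rating exceeds mean + one standard deviation
    values = list(user_ratings.values())
    threshold = statistics.mean(values) + statistics.pstdev(values)
    return {item for item, rating in user_ratings.items() if rating > threshold}
```

For a user who rated one book 5 and three books 1, only the 5-rated book clears the threshold and counts as relevant in the test.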
In the tests run, SocialBased improved on the UserBased (k-NN) results when using the Log-Likelihood similarity metric. For k-NN-based recommender systems that use this metric, the use of SocialBased could be beneficial.

5.

FINAL CONSIDERATIONS

The focus of this paper was the study of recommender systems in social networks. Users need tools to help them handle the great deal of information they receive. In this context, recommender system technologies have been presented and discussed, along with the way they are implemented and evaluated and other knowledge needed for their use, such as several similarity metrics and evaluation metrics. Studies about the way social networks are formed and organized have also been discussed. We also presented the small-world theory and the idea that the people who are closer to an individual have more influence over that individual's opinions. Finally, we presented the idea of sequential imputation, the reuse of previously imputed values in a new iteration. Based on this knowledge, we built a "Hot-Deck" solution combining these isolated ideas, in an attempt to add value to current recommender systems.


Through the implementation of a testing environment and the analysis of the obtained results, we could assess the advantages and disadvantages of the suggested solution; these conclusions are found in section 4.5. With this paper, we noticed how data about the members of social networks influence recommender systems: it is a new way of partitioning users and of defining who the people closest to an individual are in a recommender systems context.

One of the contributions of this paper is a new dataset, which can be used to continue this line of research. Unlike other available datasets, this one has information on the relationships among users, which can be used to build the graph of a social network.

The natural evolution of this paper would be the expansion of the database to reach a third degree of separation between users, in order to check the behavior of the suggested solution with a higher number of iterations. There is also the possibility of changing the suggested proposition to distribute the processing over several computers, including in a cloud computing environment; this is possible because the Mahout framework used here is prepared for future interconnection with the Apache Hadoop framework (Apache Software Foundation 2008). This paper focused on collaborative filtering; for future papers, we recommend content-based implementations of recommender systems that can take into account other data, such as the age, sex and location of the participants.


REFERENCES

Amaral, Viviane. (2004) Redes sociais e redes naturais: a dinâmica da vida. http://www.rits.org.br/redes_reste/rd_tmes_fev2004.cfm (visited October 17th, 2009).
Apache Software Foundation. (2008) Apache Hadoop! July, 2008. http://hadoop.apache.org/ (visited December 12th, 2010).
Balabanovic, M., and Y. Shoham. (1997) "Fab: Content-based, Collaborative Recommendation." Communications of the ACM.
Breese, J. S., D. Heckerman, and C. Kadie. (1998) "Empirical Analysis of Predictive Algorithms for Collaborative Filtering." In Proc. of UAI.
Chisnall, P. (1973) Marketing Research: Analysis and Measurement. McGraw-Hill.
Cover, T. M., and P. E. Hart. (1974) "Nearest Neighbor Classifiers." IEEE Transactions on Computers.
de Sola Pool, Ithiel, and Manfred Kochen. (1978) "Contacts and Influence." Social Networks.
Farhangfar, A., L. Kurgan, and W. Pedrycz. (2007) "A Novel Framework for Imputation of Missing Values in Databases." IEEE Transactions on Systems, Man, and Cybernetics. 692-709.
Ferlin, Claudia. (2008) "Imputação Multivariada: Uma Abordagem em Cascata." Rio de Janeiro, RJ.
Gelman, A., and J. Hill. (2006) Data Analysis Using Regression and Multilevel/Hierarchical Models. Cambridge University Press.
GroupLens Research. (2010) http://www.grouplens.org/ (visited June 17th, 2010).
Gurevich, Michael. (1961) "The Social Structure of Acquaintanceship Networks." Cambridge.
Herlocker, J. L., J. A. Konstan, A. Borchers, and J. Riedl. (1999) "An Algorithmic Framework for Performing Collaborative Filtering." In Proc. of SIGIR.
Kajimoto, Allan Panossian, Renato Shirakashi de Sousa, Sidney Eduardo Serra Zanetti, and Victor Miranda Cirone. (2007) "Sistemas de recomendação de notícias na Internet baseados em filtragem colaborativa." São Paulo: IME.
Kim, K. Y., B. J. Kim, and G. S. Yi. (2004) "Reuse of Imputed Data in Microarray Analysis Increases Imputation Efficiency." BMC Bioinformatics, 5:160.
Lakshminarayan, K., S. Harp, and T. Samad. (1999) "Imputation of Missing Data in Industrial Databases." Applied Intelligence. 259-275.
Lemire, Daniel, and Anna Maclachlan. (2005) "Slope One Predictors for Online Rating-Based Collaborative Filtering." February 7th.
Lepkowski, J., T. Raghunathan, P. Solenberger, and J. Van Hoewyk. (2001) "A Multivariate Technique for Multiply Imputing Missing Values Using a Sequence of Regression Models." Canada, 85-95.
Likert, Rensis. (1932) "A Technique for the Measurement of Attitudes." Archives of Psychology.
Lorenzi, Fabiana. (2006) "Sistemas de Recomendação: Filtragem Colaborativa e Baseada em Conteúdo." Rio Grande do Sul: Universidade Federal do Rio Grande do Sul, Instituto de Informática.
Marteleto, M. R. (2001) Análise de Redes Sociais. Brasília.
Mendes, Regina. (2008) "Ação de Professores em Contexto de Globalização: um estudo a partir do grupo de educação Sócio-ambiental da Pampulha." Doctorate Thesis in Education, UFMG. Belo Horizonte, MG. 67-74.
Milgram, Stanley. (1967) "The Small World Problem." Psychology Today.
Montaner, M., B. Lopez, and J. L. de La Rosa. (2003) "A Taxonomy of Recommender Agents on the Internet." Artificial Intelligence Review. June.
Oudshoorn, C., S. Van Buuren, and J. Van Rijckevorsel. (1999) "Flexible Multiple Imputation by Chained Equations." Netherlands Organization for Applied Scientific Research.
Owen, Sean, Robin Anil, Ted Dunning, and Ellen Friedman. (2010) Mahout in Action. Greenwich, CT: Manning.
Resnick, P., N. Iacovou, M. Suchak, P. Bergstrom, and J. Riedl. (1994) "GroupLens: An Open Architecture for Collaborative Filtering of Netnews." In Proc. of ACM CSCW.
Resnick, Paul, and Hal R. Varian. (1997) "Recommender Systems." Communications of the ACM, March, v40 n3 p56(3).
Sarwar, Badrul, George Karypis, Joseph Konstan, and John Riedl. (2001) "Item-Based Collaborative Filtering Recommendation Algorithms." GroupLens Research Group/Army HPC Research Center. Minneapolis: Department of Computer Science and Engineering, University of Minnesota.
Schafer, J. L. (1997) Analysis of Incomplete Multivariate Data. Vol. 1. Chapman and Hall-CRC.
Soares, Jorge Abreu. (2007) "Pré-Processamento em Mineração de Dados: Um Estudo Comparativo em Complementação." Rio de Janeiro, RJ, May 2007.
Van Buuren, S., J. Brand, C. Groothuis-Oudshoorn, and D. Rubin. (2006) "Fully Conditional Specification in Multivariate Imputation." Statistical Computation and Simulation. 1049-1064.
Verboven, S., K. V. Branden, and P. Goos. (2007) "Sequential Imputation for Missing Values." Comput. Biol. Chem. 320-327.
Wang, Jun, Arjen P. de Vries, and Marcel J. T. Reinders. (2005) "Information and Communication Theory Group." Amsterdam, The Netherlands.
Watts, D. J. (2003) Six Degrees: The Science of a Connected Age. 1st ed. New York: W.W. Norton & Company.


Webber, Rosina Lee. (1998) "Pesquisa Jurisprudencial Inteligente." Engineering Doctorate Thesis, Universidade Federal de Santa Catarina. Florianópolis, May 14th.
Xue, G. R., C. Lin, Q. Yang, W. Xi, J. Zeng, and Z. Chen. (2005) "Scalable Collaborative Filtering Using Cluster-Based Smoothing." In Proceedings of the 2005 ACM SIGIR Conference, Salvador, Brazil.

JISTEM - Journal of Information Systems and Technology Management Revista de Gestão da Tecnologia e Sistemas de Informação Vol. 8, No. 3, Sept/Dec. 2011, pp. 717-748 ISSN online: 1807-1775 DOI: 10.4301/S1807-17752011000300010

MANAGING IT AS A BUSINESS: THE LUTCHEN'S GAP IN THE 100 TOP ORGANIZATIONS BASED IN BRAZIL Sergio Alexandre Simões Leonel Cezar Rodrigues Emerson Antonio Maccari Nove de Julho University – UNINOVE, Brazil Mauricio Fernandes Pereira Federal University of Santa Catarina – UFSC, Brazil ______________________________________________________________________ ABSTRACT A common problem in IT management involves the lack of business vision on the part of IT executives, who align IT with the strategic assumptions of the company but forget the tactical functions of IT, which could be managed as a business. This is known as the Lutchen's gap. Because of its importance in the context of IT management, this paper is a complementary approach to the preceding one by Rodrigues et al. (2009) and aims at identifying the presence of the Lutchen's gap in the IT management of the 100 top companies located in Brazil. To do this, we surveyed the profile of IT management in these companies, as seen by their executives, using a questionnaire filled out by IT executives and containing 77 questions covering the four functions of IT in a company. The main results indicate that: (a) in the IT Alignment function, the design of IT reveals dichotomies between business and IT objectives; (b) in the IT Management function, the IT budget is oriented toward ensuring the "delivery" capacity of IT services, with differing cost-control methods ranging from apportionment by volume to overheads and ABC costing; (c) in the IT Delivery function, evidence suggests that IT is managed as a business enabler; and (d) in the IT Quality and Safety Assurance function, IT executives monitor quality and safety events, but are limited to IT's basic operations. In conclusion, in the companies surveyed, IT is seen much more as an "on-demand solution provider" than as an instrument of innovation and a competitiveness enabler for organizations.
Keywords: IT Strategy; Innovation; IT as business; Lutchen Gap. _____________________________________________________________________________________ Manuscript first received/Recebido em: 14/05/2011 Manuscript accepted/Aprovado em: 22/07/2011 Address for correspondence / Endereço para correspondência Sergio Alexandre Simões, Master's degree from Universidade Nove de Julho – UNINOVE, sergioalexandre.simoes@gmail.com Leonel Cezar Rodrigues, PhD from Vanderbilt University, Director of the Inter-institutional Graduate Program in Administration and Professor of the Master's and Doctoral Program in Administration at UNINOVE, Av. Dr. Adolpho Pinto, 109 Barra Funda - São Paulo-SP, Brasil, E-mail: leonel@uninove.br Emerson Antonio Maccari, Doctorate from Universidade de São Paulo - FEA/USP, Director of the Professional Master's Program in Project Management – UNINOVE - Universidade Nove de Julho, E-mail: maccari@uninove.br Mauricio Fernandes Pereira, Post-doctorate in Economic Sociology and Organizations from Universidade Técnica de Lisboa, Professor at the Department of Administration Sciences of the Universidade Federal de Santa Catarina – UFSC, E-mail: mfpcris@gmail.com Published by/ Publicado por: TECSI FEA USP – 2011 All rights reserved.




1. INTRODUCTION

Currently, Information Technology plays a key role in generating improvements in the productivity, competitiveness and profitability of organizations. In this context, the design of IT management can facilitate or limit the achievement of organizational goals and objectives. Lutchen (2003) states that, to be useful to corporate objectives, IT must be managed as a business. The problem, according to the author, is that in most organizations IT management seems to pay attention to other aspects of organizational dynamics. According to Lutchen, Chief Information Officers (CIOs) direct their actions towards two opposite ends. At one end, they target the IT operating functions of the company, that is, the planning and implementation of IT infrastructure. At the other end, they focus on the strategic interests of the company, looking at the alignment of IT with business strategy. However, IT should stand in the middle of these two extremes, as a mechanism to improve the enterprise's business performance.

According to Lutchen (2003), by exercising these dichotomous roles in the business structure, IT leaves a large functional gap: ensuring the integration and differentiation processes of the organization's business. Lutchen calls the failure of this function the IT delivery gap. The author argues that, as an inductor of IT solutions and business process flexibility, IT itself should be managed as a business. To achieve this, IT should at the same time determine the profile of the business and be fully integrated into the business transactional processes. Because it is a function that adjusts designs and ways to optimize resources and capabilities at the service of business processes, it is in fact a corporate function, but one with a fundamental role in the efficiency and effectiveness of business processes.


Recently, Craig and Tinaikar (2006), based on the results of their research in the world's largest organizations, showed that IT is simply viewed, by most IT and technology executives, as a solution provider demanded by business people. Companies where IT is used as a tool for business innovation and as a means of redesigning business are rare, almost nonexistent. Craig and Tinaikar's results also indicate that the majority of organizations still view IT as a mere instrument for delivering solutions. To manage IT as a tool for innovation, managers need to address or support a particular innovation strategy that is central to the dynamics of the enterprise. Having a clear innovation strategy, that is, setting technology as a basis for the business strategy, is a quest that many organizations find difficult to design and implement. Either way, Craig and Tinaikar's (2006) results show that the functional performance of IT in the organization confirms Lutchen's (2003) thesis about the gap of IT as a tool for business differentiation. On the one hand, business executives want and need IT to help them achieve business goals. On the other hand, IT executives, more subservient to the structures and procedures of the organization, understand business goals as the only thing to be done. This ultimately determines a format of IT management that focuses on operational functions and defines them as the main focus of IT executives. Additionally, this focus limits the role of IT regarding business fundamentals. In a broader context, yet to a lesser degree, IT is also used by business executives to support the formulation of their own competitive strategies.
This is noticed in several works of renowned researchers, such as Abreu and Fernandes (2008), Rao (2002), and Prahalad and Krishnan (2002) that show the potential of IT to play a more critical role in the design of business that could be turned into a better macro performance for the organization. The findings of these researchers indicate that there seems to be a clear bias regarding the priorities of IT management and the implicit priorities in business management. Major national organizations, because of their good performance, seem to have adjusted their business models using IT in efficient ways and giving the impression of absence or at least, the minimization of the functional dichotomy of IT in the organization. This problem, present in Lutchen (2003) "IT Delivery Spectrum" also shows to be a fact in these organizations, especially when one notes a great dispersion of IT’s management and control methodologies (COBIT, ITIL, ISO etc.) in use. Using a variety of management methods does not imply that such methods are harmful to the IT’s performance, but each method has a bias of specific administrative interest and does not necessarily fill the managerial gap above mentioned. 1.1 Problem and Reseach Objectives The efficiency of IT management cannot be measured only by control parameters, applied to or used by IT executives. Instead, the performance of IT is also a function of perception of how useful and how compatible IT services are to their users and to the business of the organization as a whole. The vision of the role of IT in the organization, however, is not only a responsibility of business executives; it is mainly an IT executives´ responsibility. The importance of IT regarding business efficiency has grown a lot in recent decades, and no leading organization can afford to disregard the role of IT to business performance. This issue shows how critical the relationship between business needs and the functions of IT in organizations is. 
Thus, would this also be the case of IT management among the 100 top organizations based in Brazil? In what

JISTEM, Brazil Vol.8, No. 3, Sept/Dec. 2011, pp. 717-748

www.jistem.fea.usp.br


720 Simões, S. A., Rodrigues, L. C., Maccari, E. A., Pereira, M. F.

ways are the IT executives of these organizations working on the management of their areas in order to align, manage, deliver and assure the quality of IT services to their organizations' business? In this context, the objective of this article, which is complementary to the article by Rodrigues et al. (2009), is to identify Lutchen's (2003) IT delivery gap in the IT management of the 100 top companies headquartered in Brazil, distributed among the ten most representative economic sectors in the country. Lutchen's (2003) IT delivery gap will be characterized through the IT management maturity criteria set out in COBIT (2010), involving the managerial design and the role of IT related to the needs of business organizations.

2. LITERATURE REVIEW

2.1 IT Functions and Lutchen's IT Delivery Gap

The literature review in this article involves the interpretation of Miles and Huberman (1994) on Lutchen's (2003) IT delivery gap associated with the institutional functions of IT. To discuss this more clearly, we propose a conceptual model, represented in Figure 1, which positions IT in the organization's administrative structure based on the functions that integrate the IT Delivery Spectrum (IT Enabling, M&A Dues, and Leadership), proposed by Lutchen (2003) as the practical execution of IT functions. Here, we adapted Lutchen's concept of practical execution using surrogate activities in the spectrum (Alignment, Management, Services, and Quality and Safety) as key functions of IT. The way IT delivers in these functions reveals the presence, or even the origins, of the IT services or delivery gap.

Figure 1: Structural Functions of IT
Source: Adapted from the IT Delivery Spectrum (LUTCHEN, 2003).

To understand the concepts in Figure 1, one must consider, by definition, the involvement of three distinct contextual areas: strategy, technology, and the IT gap.



721 Managing IT as a business: The Lutchen’s Gap in the 100 Top Organizations based in Brazil

These areas determine the profile of IT management in the majority of organizations:

(1) Strategies Area - Hierarchically above the "IT Delivery Spectrum" occurs the formulation of IT strategies, which should be aligned with the major corporate strategies, but mainly with the business strategies. By aligning with the corporate strategy and the company's business, IT can play a key role in delivering applications for internal demands and for the flexibility of business processes. IT strategy as a business, however, is motivated primarily by an internal view of IT management, aimed at reducing costs and increasing process efficiency and service levels.

(2) Technology Base - The efficient delivery of IT services is conditioned by the technological composition of the organization, that is, by the technology available to IT in the company. To perform its functions, IT needs to manage technological and computing resources (hardware, related devices and peripherals, software, telecommunications, systems, and data and information management) in an efficient and focused manner. Using the technology base allows IT to perform its functions and respond effectively to internal demands.

(3) IT Gap - The execution of IT functions (the IT Delivery Spectrum) is where the translation and connection of IT solutions to business needs happen. Given its criticality to the business of the company, this is a strategic area in the context of IT management. It is in this area that the integration of systems, the optimization of their applications, the development of processes and controls, management, performance indicators, etc. should happen, ensuring that the alignment of IT with business strategies is not casual, but causal. Most often, however, this integration between solutions and the ability to meet business needs does not happen in the best format, and IT loses its integrating and strategic role in the business, leaving an open functional gap, the Lutchen's gap.
The IT functions pointed out by Lutchen (2003) are where one can observe IT playing its role in directing changes and in the performance of organizational dynamics. Therefore, it is important to review each of the functions, here called Alignment, Management, Services and Quality & Assurance, discussing their concepts and accumulated knowledge and understanding their implications for the business.

2.1.1 Alignment

Although IT executives in general assert the alignment of IT with business strategies, the results of Prahalad and Krishnan's (2002) research, among 500 business executives from organizations around the globe, argue against the existence of alignment between IT and the strategic interests of the business. This research evaluated the alignment of IT under five parameters intrinsic to the nature of the business (degree of change in the industry, strategic management, ability to change the organization, quality of infrastructure, and people's collaborative capacity within the organization)


on a scale ranging from 0 to 5, where 5 means fully aligned. The responses of the business executives demonstrated a low degree of alignment (a mean of 2.5), indicating that the alignment of IT with business strategies, in the companies studied, is far from desirable. The intense competition among companies requires from IT special attention to aligning its goals and activities with strategic business objectives. This alignment is an important driver of organizational success. It is only through this alignment that the mission, the goals and the business strategic plan (BSP) can be effectively shared and supported by the IT strategic plan (ITSP). The BSP is a set of objectives and goals that help organizations use and optimize the allocation of resources to develop special skills and competencies that make them particularly competitive (LOBLER et al., 2008). In addition, Rezende (2002) observes that the BSP is a dynamic and iterative process for setting the objectives, policies and strategies of the organization. The ITSP, according to Weill and Ross (2004), provides an overview of the concepts, methods, processes, technologies and tools necessary to facilitate the implementation of the business strategy. The ITSP also supports the decisions, the business actions and the respective processes, generating direct benefits for the business itself. The alignment between the ITSP and the BSP can be decisive for the competitiveness of the company. Thus, according to Luftman (2003), Zorello (2005) and Abreu and Fernandes (2008), a good level of alignment means that the organization applies IT resources in a timely and adequate manner, consistent with the goals, the needs and the business strategy of the company. Craig and Tinaikar's (2006) research shows that IT plays different roles in organizational processes, especially in cases where the ITSP is aligned with the BSP to achieve specific objectives.
The authors grouped the influence of IT on business into three roles, based on the relationship and support it provides to products and processes: (a) service provider, (b) tool for innovation, and (c) business rule-breaker.

(a) IT as a service provider. This is the most common role that IT plays within the organization. As a service provider, IT primarily focuses on meeting the organization's needs for administrative tasks and information. IT keeps itself apart from the business, without a clear and direct involvement in the processes and in their final performance.

(b) IT as a tool for innovation. This is a less common role played by IT, yet a more desirable one. It is a very difficult role to play because it requires a combination of capabilities and IT resources that give the organization a greater ability to innovate. In this case, IT aligns well with other internal functions, such as knowledge management, organizational learning and corporate entrepreneurship, and becomes a tool to support process and product innovation and flexibility in transactional business processes.

(c) IT as a competitive differentiator. This is the rarest and most unusual role of IT. It happens when IT, through new solutions, forces the redesign of the business base (technological and transactional), distinctively from the usual format in which a business is built in a sector. In this case, IT is what determines the size, shape, products and processes of the business. IT also becomes the base for


the business model, based on its own technological capabilities, taking a proactive role in determining a new business format. It is clear that the alignment of IT with business strategy gives IT a more proactive and innovative role in the business. It is clear from Lutchen's (2003) arguments that IT must be managed as a business, requiring CIOs to facilitate business strategies, as suggested by Prahalad (2006), and from the observations of Craig and Tinaikar (2006) that IT has distinct roles in the model and competitiveness of a business. However, the difficulty of transforming CIOs, who usually act as expert advisors, into business executives, as suggested by Lutchen (2003), remains unresolved. More recently, Takanen (2008) also points out, as an additional difficulty, the fact that CIOs do not participate in the planning and decision-making process of strategy formulation in the organization, because of the underprivileged hierarchical position of IT in organizations. A position close to top management would actually help in more efficient alignment actions to achieve the company's strategic goals, as well as in a more efficient execution of IT's functions in the organization.

2.1.2 IT Management

Good IT management is associated with the dynamic synchronism between IT and business strategies. As pointed out by Rodrigues et al. (2009), the dynamic synchronism of strategies, however, is not enough to ensure the effectiveness of IT functions regarding business interests, according to Shpilberg et al. (2007). It is necessary to beware of the alignment traps. By alignment one means the degree of commitment of the IT group to business priorities, resource allocation and execution of projects, as well as the delivery of solutions consistent with business objectives. The alignment traps pointed out by Shpilberg et al. (2007), however, are associated more with problems of inefficiency due to incompetence in the IT group than with problems of alignment itself.
For example, the most frequent cases of delay in IT projects usually relate more to problems associated with the incapacity of the staff than to problems of IT solution design. In these cases, effectively executed IT management, observing the proper complementarity of administrative functions, should ensure the effectiveness of IT's institutional functions. In addition to the normal aspects of managing IT, management processes must consider investments in IT as a management tool to drive and support the alignment of IT with business strategies. According to Craig and Tinaikar (2006), as pointed out by Rodrigues et al. (2009), one of the most efficient ways to identify this alignment is to consider whether investments aim to "stay in the race," to "win the race" or to "radically differentiate" in the business. To stay in the race, investments simply focus on the capacity of IT to respond to local demands. To win the race, investments consider transforming IT into an innovation tool. And to radically differentiate, investments attempt to make IT a base for radical or even disruptive innovation in the business. Even coming from distinct approaches, the understanding of the strategic management of IT analyzed by Craig and Tinaikar (2006) keeps similarities with the understanding of IT management by Lutchen (2003), in the sense that, for these authors, IT must be managed efficiently regarding its functions in the business, independently of the role


intended for it in the context of the organization. The fact is that, as shown by Craig and Tinaikar (2006), IT should be playing a much more prominent role in business innovation than it actually does. Research results show that in the majority of firms IT plays a secondary role, being a mere supplier of solutions. As such, IT works as a common department of internal routine, much more concentrated on avoiding and explaining its operating costs than on distributing them over the demanding business departments (LUTCHEN, 2003). IT seems less concerned with demonstrating its value, based on the return to the business from its innovative role, than with justifying its costs (CRAIG; TINAIKAR, 2006). On the other hand, the research results of Marwaha et al. (2005), from responses of 9,345 IT executives worldwide, indicate that 53% of these executives recognize innovation as the most important ability for growing their business. There seems to be a dichotomy between the actions and the managerial thinking of IT executives. Despite claiming that IT constitutes a powerful tool for business innovation, IT executives act, routinely, as if the only IT function were to provide automation and productivity solutions to the enterprise. Although these actions may have explanations in their respective contexts, the dynamics of the business concept, which requires constant innovation, seem not yet to have touched IT executives in their day-to-day managerial practice.

2.3 Services

Speed, productivity and innovation in delivery are characteristics that are hard to build together because of their conflicting natures. Speed and productivity are congruent, but adding the ability to innovate, simultaneously, seems to be dichotomous. Marwaha and Willmott (2005) suggest a steady migration of levels. Once identified, innovation needs to migrate quickly to business standards. Thus, innovation as a solution to a process or a new product must keep a strategy aligned with the business strategy to ensure the achievement of goals.
The adoption of innovative processes to help business performance by giving higher efficiency to IT services, however, must be orchestrated in a consistent manner across the organization. According to Correa (2006), IT governance is the mechanism that can provide a framework linking IT processes, resources and strategies to the strategic objectives of the organization. IT governance is therefore essential to ensure efficiency improvements in the processes of the organization aimed at speed, productivity and innovation. Once the governance cycle and best practices are provided by the management of IT processes, it is necessary, according to Weill and Ross (2004), to explore the potential of the macro processes of demand and portfolio management, which is essential to establish the governance cycle and evaluate the results of IT service delivery. The IDC study (2006) concluded that organizations that are part of the Global 2000 group are using management and prioritization of IT projects and programs to gain a competitive edge and increase efficiency. According to IDC (2006), IT portfolio management is used to define, assess, control, monitor and optimize the tasks and resources needed to plan and complete a project, in addition to managing the portfolio of all projects of the organization, including "what if" analyses of projects proposed or approved but not yet running. This is the process of collectively analyzing the costs, risks and benefits of new projects in the context of ongoing investments


in order to make better decisions and investments. In this phase, the definition of the roles and responsibilities of each player is important (Borland, 2006). In short, what can be inferred from the observations and findings of the authors cited is that, in terms of Services, the focus is on how to accelerate and increase the innovative capabilities of organizations through the consistent delivery of IT services. This generates the context for the argument of this article. That is, the speed and innovative capacity of the business reflect how IT should be managed as a business, as argued by Lutchen (2003), involving documented processes and rules, automated and disseminated throughout the company, in accordance with best practices, bolstered by parameters, indicators and measurable metrics.

2.4 Quality & Security Assurance

Quality and security in the processes of IT management are associated with efficiency and efficacy in Service Level Agreements, known as SLAs. For a better understanding, it is important to clarify the difference between the concepts of efficiency and efficacy in the context of SLAs, according to Albertin and Sanches (2008):

• Efficiency: related to the provision of IT services and the costs associated with planned deadlines. It concerns the way IT services are delivered to users, according to agreed quality standards. It involves the search for alternative solutions to meet demands within the level of pre-planned resources. Efficiency is usually measured using unit costs and the service levels offered by IT. It involves obtaining economies of scale and the use of standardized solutions versus specific solutions.


• Efficacy: the ability to provide effective solutions to demands. It means being able to meet the needs of users, regardless of the level of difficulty of the demand. Efficacy is linked to the priorities of IT investments and relates to the evolution of IT solutions. Efficacy also relates to the benefits obtained after the implementation of solutions that meet or exceed original expectations. The model of IT services is tailored to the needs and goals of users.

It is also important to define here the understanding of Service Level Agreements, which seek to parameterize the quality of IT products and services. Albertin and Sanches (2008) define the Service Level Agreement as a contract that defines the parameters of the service a provider makes available to clients, specifying the performance measures for the hired services. The SLA also defines the commitments, responsibilities and limits applicable to both parties. According to Abreu and Fernandes (2008), it concerns the management of IT performance, the setting of performance objectives, the creation and implementation of indicators, the monitoring, the decisions considering the measured results, and the continuous improvement actions arising from the process of monitoring and control. In practical terms, Weill and Ross (2004) suggest that the management of IT performance should be done based on four main factors and their relative importance for each


organization: the use of IT for efficiency and cost control; the effective use of IT assets; the effective use of IT for business growth; and the effective use of IT for business flexibility. Standardizing the measurement of IT performance, however, typically follows the COBIT version 4 performance framework (ITGI, 2010). This is a structure recognized and used by organizations as a model for the management of IT performance. It contains processes and control objectives for measuring and monitoring IT in organizations, which can eventually be measured (or calculated) through the balanced scorecard (BSC) and/or specific audits, such as the Statement on Auditing Standards No. 70 (SAS 70). According to Brege, Brehmer and Rehme (2008), the use of the BSC as a tool for performance measurement and control is highly useful in IT management, especially in cases of outsourcing encompassing more strategic and long-range functions. The main reason for using the BSC in these cases is that long-term contracts can have a direct relation with, and affect, activities related to value creation, innovation and product development. Therefore, it is essential for executives to have a more corporate and strategic vision of the performance of these contracts. The application of the BSC can help obtain and evaluate the usefulness of these indicators. Finally, to ensure the quality of IT services, it should be noted that the performance of IT in terms of quality and safety of services can be measured by combining the ability of IT governance mechanisms, which enable the desirable behavior of IT in the organization, with the level or degree to which the performance objectives for quality and safety requirements are met in the business arena.

3. METHODOLOGY

The research framework that guided the questionnaire followed the premises of Lutchen's (2003) spectrum of IT delivery, expressed as: alignment, management, services and quality & security.
The design followed the assumption that a strategic group of leading organizations is responsible for the performance profile and trends in its economic sector. Porter (1986) defines a strategic group as a set of organizations that are similar within a sector, and different from others outside this group, in one or more key dimensions of strategy. Thus, the sample was intentionally chosen to form a strategic group representing ten different sectors of the Brazilian economy. To select the research sample, we used the biggest and best companies located in Brazil, listed in the Guia Exame das Maiores e Melhores, published in July 2008. A disguised, semi-structured questionnaire with 77 questions was sent, via the Internet, to the CIOs of the 100 selected companies. The objective was to characterize the management of IT in terms of its alignment with business strategy, its management profile, the ways it delivers IT services, and its quality and safety assurance of IT services. The characterization of these four functions was studied, for this article, in terms of the maturity of IT management, according to the parameters and indicators listed in version 4 of COBIT (ITGI, 2010). In COBIT, the resultant of the four functions of IT represents the strategic alignment of IT with the business, expressed as levels of IT maturity (Lobler et al., 2008). Thus, by knowing the functions and expressing


them as the level of alignment of IT with the business objectives, we are expressing the COBIT maturity level and, at the same time, the profile (or presence) of the Lutchen's gap. We utilized the parameters of maturity in IT management as defined in COBIT version 4 (ITGI, 2010), since COBIT has been adopted as a reference model in business worldwide, is recognized internationally, and helps to understand and manage IT risks. The levels of maturity in IT management listed in COBIT are classified by Lobler et al. (2008) as levels of alignment of IT to the business, in five levels. As stated above, the levels of alignment are actually a result of the combined performance of the four functions cited by Lutchen (2003). Thus, the criteria indicate that organizations at maturity level 1 are those with a zero level of alignment, that is, virtually nonexistent alignment with respect to business strategies. At level 2, organizations are classified in a state of incipient maturity in terms of alignment with business strategies. Level 3 represents organizations with a focused and established maturity of IT alignment. At level 4, organizations are characterized by a maturity managed at the strategic level. At level 5, organizations are classified with a level of IT maturity excellence; that is, at this level are classified those organizations whose IT strategies are fully aligned with business strategies. IT keeps all service processes fully integrated and documented in accordance with the market's best practices. For a better understanding of the maturity levels resulting from the survey, we simplified the levels of maturity into three distinct groups: (a) incipient, (b) intermediate and (c) mature. Thus, organizations classified with maturity levels in IT management between 1 and 2, in the classification of Lobler et al. (2008), are here considered incipient in IT management. That is, in these companies, IT would not be managed as a business.
Those classified with maturity level 3 are considered intermediate and have been shown to have elements of managing IT as a business. Organizations that show a maturity level between 4 and 5 are hereby classified as mature in IT use, because they are managing IT as a business. Thus, the methodological design allows obtaining information about the profile of IT management in the 100 biggest and best organizations located in Brazil. It is possible, therefore, to preview the profile of IT management regarding the presence of the Lutchen's gap, through the analysis and characterization of the managerial practices used by IT to deliver services.

4. RESULTS

As indicated in the methodology, the organizations surveyed belong to Brazil's 500 top companies, classified in the Guia Exame – Maiores e Melhores/2008. From this list, we selected the 10 largest companies in the 10 most economically representative industrial sectors of the country, that is, the 10 sectors with the highest contribution to the gross domestic product of Brazil. In the characterization of the respondents, the degree of economic importance of the surveyed organizations is represented by the fact that 60% of them report annual revenues over US$ 1 billion and only 3%, under US$ 200 million.


The survey data also indicate that, in terms of the "momentum" of the business, 74% of respondents report being in the process of expanding their markets, that is, directly increasing their relative share in a market segment, or increasing their product portfolio to increase their presence in specific segments. To provide focus and clarity to the great number of research findings, and in keeping with the purposes of this article, we decided to organize the results under the four functions of IT (Alignment, Management, Services and Quality & Security). The data analysis considers only the main evidence of the functions of IT, to classify the management of IT in the organizations surveyed according to the degree of maturity, using the parameters of COBIT (ITGI, 2010). Since the characteristics of maturity in IT management are based on the same elements as managing IT as a business, we can establish a direct relationship between the degree of maturity in IT management and the Lutchen's IT delivery gap.

4.1 Function Alignment

When questioned whether the IT plan is formal, current, disseminated and aligned with the business, the responses of IT executives indicated that 63% have formal plans and IT aligned with the business strategic plan (BSP). It should be noticed, however, that 37% of IT executives did not execute a formal alignment or did not recognize the alignment as relevant, thus generating a premise of significant misalignment of IT management with the mission and business objectives. It is presumed, in these cases, that IT does not support the business strategic plan (BSP). The evidence is the lack of use of management tools (the Balanced Scorecard, for instance) that determine the adherence of IT goals and actions to business goals and actions. However, despite the evidence of misalignment, IT can be exercising very well its unique role as a solution supplier through the implementation of operational activities and routine tactics.
In this case, the focus may be on the management processes appropriate to technological and computational resources. Chart 1 shows that just under 40% of IT executives understand the importance of indicators and make use of them. For most IT executives (61%), the Balanced Scorecard is not yet part of their management, nor does it constitute an instrument for aligning IT with the business. It should be noted, additionally, that about one fifth of the executives surveyed (18%) do not even see it as a relevant tool for aligning IT with business strategy.

Chart 1 – IT alignment based on Balanced Scorecard


We also investigated the hierarchical location of IT in the organizational structure as an important factor in characterizing the strategic role of IT. Takanen (2008) has pointed to the hierarchical position of IT as a barrier to IT taking an innovative role in the organization. The results indicate (Chart 2) that the largest share of IT executives (42%) is linked and reports directly to the CEO (Chief Executive Officer) in their organizations. This situation gives IT a strategic participation in the decision-making process of the organization, as IT leaders are at the same hierarchical level as business leaders. This positioning also helps IT to exercise an innovative role more effectively, or even to be a business rule-breaker, making more relevant the value that IT can add to the products and services of the organization (CRAIG; TINAIKAR, 2006).

Chart 2 – Localization of IT in the Organizational Structure (Report to CEO or Board: 42; Report to CTO or CFO: 33; Report to CIO Global or Regional: 25)

With regard to the leadership provided by IT, the survey shows that IT normally does not assume the leadership of internal changes, but shares the leadership of change initiatives (65%) with other areas of the organization. In other words, only in about one third of the organizations surveyed does IT lead innovation initiatives. On the other hand, added together, the two percentages (28% and 65%) indicate that IT executives actively participate in over 90% of business change initiatives. This, in theory, allows them to practice the alignment of IT with the business, offsetting possible problems arising from a less privileged hierarchical location of IT (Chart 3).

Chart 3 – Leadership in Innovation by IT


Despite of IT executives share the leadership in internal changes and despite they indicate that IT plans are aligned with the mission and strategic business objectives, the survey results indicate that 43% of respondents do not use a performance measuring instrument (e.g. BSC) for IT. Considering that 61% (Chart 1) of IT executives do not use a performance measuring instrument, one can imagine that IT does not have performance standardized indicators and metrics common to the business area. On the other hand, 57% of IT executives say they have operational indicators in use (indicators which are documented, formalized, and disclosed as a routine), may not be feeding the performance indicators of the business properly. This might have strong implications for not alignment of IT with business objectives. To characterize the alignment of IT management in organizations surveyed in relation to the degree of maturity according to COBIT (IGI, 2010), we considered relevant the positive responses to the existence of the following four essential elements: (1) IT Planning is aligned to the Business, (2) performance of IT measured by a standard instrument, such as the BSC, (3) subordination of IT to the CEO or Board, and (4) indicators and metrics to evaluate operating performance in use. For each element, we identified the percentage of positive responses in the survey. For conversion to the scale of COBIT maturity level, which ranges from 1 to 5, we used a normalization factor 20 for convenience. Thus, if all the survey responses were positive, we would have a level 5 maturity (100/20 = 5), indicating that in the whole survey, organizations demonstrate to have IT strategies fully aligned with business strategies. For this research, the results of the Alignment function are shown in Table 1. Elements of Alignmet

Elements of Alignment                        Results   Normalization   Maturity
Vector 1 – IT Alignment Plan                 63%       3.2             3
Vector 2 – IT Efficiency Measured by BSC     39%       2.0             2
Vector 3 – IT Reporting – CEO or Board       42%       2.1             2
Vector 4 – Performance Indicators/Metrics    57%       2.9             3

Table 1 – Level of Maturity Related to the Alignment Function

Just as a reminder, the COBIT (ITGI, 2010) maturity scale was associated by Lobler et al. (2008) with the IT functions proposed by Lutchen (2003). In this study the scale was simplified to three levels: incipient, intermediate, and mature. To fill the Lutchen's gap, the classification of the COBIT maturity level in IT management follows these criteria: Incipient: maturity level between 1 and 2; in this case, organizations do not run IT as a business. Intermediate: maturity level 3; in this case, organizations show clear indications that they are managing IT like a business. Mature: maturity level between 4 and 5; in this case, IT is managed like a business.
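The conversion just described can be sketched in a few lines. This is a minimal illustration assuming only the stated normalization factor of 20 and the simplified three-level classification; the function and variable names are ours, not the paper's.

```python
# Sketch of the paper's normalization: the share of positive survey
# responses (0-100%) is divided by 20 to map onto COBIT's 1-5 maturity
# scale; the rounded level is then bucketed into the simplified
# three-level scale (incipient / intermediate / mature).

def cobit_maturity(positive_pct: float) -> int:
    """Convert a % of positive responses into a COBIT maturity level (1-5)."""
    return max(1, min(5, round(positive_pct / 20)))

def classify(level: int) -> str:
    """Simplified three-level scale used in the study."""
    if level <= 2:
        return "incipient"
    if level == 3:
        return "intermediate"
    return "mature"

# The four Alignment vectors and their survey results (Table 1)
alignment_vectors = {
    "IT Alignment Plan": 63,
    "IT Efficiency Measured by BSC": 39,
    "IT Reporting - CEO or Board": 42,
    "Performance Indicators/Metrics": 57,
}
for name, pct in alignment_vectors.items():
    level = cobit_maturity(pct)
    print(f"{name}: {pct}% -> level {level} ({classify(level)})")
```

Applied to the Table 1 results, this reproduces the reported levels (63% and 57% map to level 3; 39% and 42% map to level 2).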



Managing IT as a business: The Lutchen's Gap in the 100 Top Organizations based in Brazil

For the results of this research, the Alignment function has no clear positioning, lying between maturity levels 2 and 3. That is, the organizations surveyed are partly incipient (performance measurement and subordination) and partly intermediate (IT plan alignment and operational performance indicators).

4.2 IT Management Function

Within Lutchen's (2003) concept, the management functional gap appears essentially in the managerial practices of IT management. Managerial practices related to the logic of budgeting are the most critical. They involve the division of costs among user units, so that IT is not seen as a mechanism taken for granted, but as an essential management tool to increase the performance of the organization. As a mechanism that performs tasks for the distinct sectors (decreasing operating costs and increasing efficiency), IT costs must be paid by the user units. The results of the research on IT budgeting practices show a certain managerial pattern, centered on three pieces of evidence. First, there seems to be a preference among the majority of IT executives for allocating resources (up to 10%) to technologies in progress, updating the infrastructure, training, and technological innovation (Chart 4).

Chart 4 – IT Budgeting Practices

Second, a smaller set of executives spends on average 20% of their budget on systems and technologies in progress, on updating the IT infrastructure, and on personnel communication. Third, a small group (between 11% and 25%) does not want to invest, or has a zero budget, for training/qualification in IT. Given that the numbers refer to the 100 top organizations based in the country, this percentage is significant. New projects receive the highest level of investment (over 25%) in the budget of almost 50% of the IT executives surveyed. One can notice that IT budgets are strongly focused on managing the IT day-to-day routines (maintenance of the installed base, telecom, personnel costs, and the like). This form of resource allocation, as indicated by Craig and Tinaikar (2006), suggests that resources serve only to


maintain the basic services of IT. In other words, IT management and the organization see IT as a supplier of solutions to the computational demands of their work routine. In our view, this format indicates that IT is being managed to implement a business strategy based on the simple idea that the organization must just stay in the race (survive or maintain the position of the business). Also relevant to the context of the IT Management function is the way outsourcing is handled by IT executives (Chart 5). As it plays an important supporting role, both in ready-to-use solutions and at the level of costs for IT management, outsourcing cannot be ignored or treated as a minor matter in the administrative process. Despite the pronounced tendency of major international organizations to adopt outsourcing as a strategic alternative for the effective management of IT (Kaplan et al., 2006), we notice in this research a certain conservatism in the allocation of resources for outsourcing (45% allocate up to 10% of the budget to outsourcing).

Chart 5 – Outsourcing Budget

It is also noticeable that a significant share of the organizations surveyed (13% of IT executives) allocates zero budget for outsourcing. The largest group allocates up to 10%, while 42% allocate over 10% of the budget to this activity. The analysis of the activities contemplated by outsourcing shows that the biggest chunk of the budget goes to systems. The apparent reason may lie in the fact that IT, in search of greater responsiveness to business demands, considers coding more application systems a priority in order to meet business needs. At the opposite end, there may also be greater "commoditization" of some technologies for system development, which leads organizations to reduce their personnel, thus reducing costs, but forcing them to seek knowledge cooperation for specific software that, at scale, seems to be less expensive.


Chart 6 – Budget Distribution of Outsourcing

It is also interesting to compare the behavior of IT investments shown in the survey results of Weill and Aral (2006), covering the 140 largest organizations in the world, with the results of this survey. For Weill and Aral (2006), the areas of greatest investment in IT are, in order: infrastructure, transactional, informational, and strategic. In this research, although not all percentages match those of Weill and Aral (2006), the results confirm the same order of investment priorities (Chart 7). Also, according to the Weill and Aral research, the second area of investment (26%), the informational area, coincides with the percentage of investments in Brazil. This area serves tactical systems operations and has become a major focus of investment, behind only investments in infrastructure, possibly because of organizations' concern with information systems. More recently, such systems have been refined to better serve information needs about the competitive environment and to support decision making in organizations. Finally, it is also important to note that the strategic area of IT, the one that should be concerned with value generation and the incorporation of technological innovations, has received the least investment in recent years, in both surveys.

Chart 7 – IT Investments: World vs. Brazil

In assessing IT costs, one can notice that 31% of respondents indicate that IT costs required to respond to the demands of a business area are attributed directly to


that area, while other costs are prorated based on outcome criteria, user volume, or a price list. Another 30% of executives say that IT costs are recorded in a cost center and distributed as overhead. Another 27% indicate that their IT costs are prorated yearly or monthly, according to the consumption of IT resources by the business areas; these costs are measured by cost drivers. Chart 8 shows the different formats for allocating IT costs.

Chart 8 – IT Costs Allocation (charged directly: 31%; overhead: 30%; prorated: 27%; price list and measured make up the remaining 12%)

To summarize the Management function, we considered the same criterion as for the previous function. Among the various findings of the research, some aspects are relevant to demonstrate the performance of IT regarding its role for the business: (1) the practices of project management; (2) Business Process Management (BPM); (3) the adoption of the Capability Maturity Model (CMM); and (4) the adoption of Service-Oriented Architecture (SOA) (Table 2).

Elements of IT Management Function           Results   Normalization   Maturity
Vector 1 – Practices of Project Management   64%       3.2             3
Vector 2 – Adoption of BPM                   53%       2.7             3
Vector 3 – Adoption of SOA                   37%       1.9             2
Vector 4 – Adoption of CMM                   53%       2.7             3

Table 2 – IT Management Function: Elements of Managing IT as a Business

The analysis of Table 2, containing the elements of COBIT maturity (ITGI, 2010) for the IT Management function, indicates that three of the four elements of this function tend to be performed in a more formalized way, complying with the parameters of maturity level 3, intermediate.


4.3 IT Services Function

IT services, also known simply as delivery, are analyzed based on five vectors: (1) Architecture and Systems, (2) Infrastructure Services, (3) Innovation, (4) ITIL Management, and (5) Portfolio Management. The first three vectors relate to the delivery processes of systems and applications, that is, meeting the needs of the business. The last two vectors relate to the standards and practices of IT governance. In these two vectors one can check whether deliveries are structured around accepted models of IT management, such as ITIL, COBIT, and PMI, among others. For the first vector, Architecture and Systems, it becomes evident that the majority of IT executives have documented, monitored, and measured the tasks inherent to this vector (architecture, operations analysis, and systems development), as shown in Chart 9. Intuitively, documenting, monitoring, and evaluating these tasks is the minimum required to ensure that IT, as a whole, executes the role expected of it by the business. The survey results indicate that, regardless of the organization's economic segment, size, or time in business, more than 70% of the organizations surveyed demonstrated maturity levels greater than 3 on our measuring scale. Therefore, there is evidence that the Architecture and Systems vector in the organizations surveyed is at the mature stage.

Chart 9 – Vector 1 – Architecture and Systems

For the second vector, Infrastructure Services, the data are shown in Chart 10. The results indicate that only a small portion (11% to 13%) of IT executives apply the best market practices in an automated way. A few (2% to 4%) are still unstructured for efficient management of Infrastructure Services. In relation to general IT services, the pattern is basically the same, that is, IT documents, monitors, and measures, but seems to have some problems adopting best practices. Chart 10 illustrates this information.


Chart 10 – IT Infrastructure Services

What we note, similarly to the first vector, is that the delivery of these services has, in more than 70% of organizations, a maturity level greater than 3. Therefore, there is evidence (Chart 10) that the management practices of Infrastructure Services processes are also mature in the surveyed organizations. The third vector of IT Services researched was Innovation. Here, we sought to know the level of concern of IT executives regarding innovation, that is, how IT plays the role of innovating and delivering innovation for the business. The survey showed that about 20% dedicate no more than 5% of their time to innovation projects. Just over one third (35%) of IT executives spend between 5% and 10% of their time on innovation, and another third devote between 10% and 20% of their time to it. When looking at the role of IT in the innovation process (Chart 3), one can see that, for almost two thirds of IT executives, the role of innovation in business processes, products, or services appears more as a partnership with business executives than as leadership actions in innovation, as already mentioned in Chart 3. Still, in relation to the projects and technologies that IT is delivering to the organization, the research shows that most IT executives (51%) have a plan for the future or simply do not plan to deliver innovation projects. Lutchen (2003) already alerted to the fact that organizations need to invest not only to maintain good IT services and projects, but also to follow best practices, in any case, to ensure innovation in products or processes that help open new markets. The fourth vector of IT Services refers to the models of best practices and methodologies used by IT to deliver services and ensure quality. This vector was verified through IT governance in the organizations surveyed.
According to Correa (2006), IT Governance is essential to ensure efficiency improvements in the processes of the organization. IT Governance practices seek to ensure that the organization's expectations regarding IT are met, performance is measured, resources are managed within accepted standards, and risks are mitigated. Hence the use of standards and practices for IT


Governance structured around models of planning, organizing, and managing IT, such as ITIL, COBIT, and PMI, among others. Overall, ITIL, PMI, COBIT, and ISO 27001 enjoy the preference of the majority of IT executives. One of the latest methodologies for the management of IT, The Open Group Architecture Framework (TOGAF, 2009), is virtually ignored by IT executives (3% of respondents use TOGAF). The low use of TOGAF may be due to the fact that it is still a recent practice for architectures and process management, though already in its ninth version. The main methods for the management of IT used by respondents are shown in Chart 11.

Chart 11 – Most Common Methodologies for IT Management

There is a noticeable concern among IT executives to ensure that the adoption of IT delivery standards is made through the tactical and operational discipline of IT-related tasks. Hence, perhaps, the fact that ITIL and PMI are the most widely used governance methods in the organizations surveyed, in order to ensure speed and performance. There is also a clear concern among executives about adopting controls in line with corporate objectives, in order to add value and keep risk controlled, through the use of management structures such as COBIT, BS, and ISO. As the fifth vector of analysis in the Services function, we evaluated the processes of demand and portfolio management, essential to establish the governance cycle. The responses showed that the majority of IT executives (80%) capture the demand for IT projects and services in a structured and documented way (Chart 12). In this context, project and systems portfolios present, in most organizations surveyed (63%), a maturity level above 3, i.e., they are documented and disseminated, and performance is monitored and measured continuously. Additionally, 16% of the respondents state that the monitoring of these processes is made possible through the adoption of best market practices and automated tools.


Chart 12 – Demands and Portfolio

It is important to reflect on these numbers by comparing them to the survey conducted by Jeffery and Leliveld (2004) with 130 organizations of the Fortune 1000, which identified four stages of maturity in IT portfolio management. Mapping the vectors of the IT Services function onto the COBIT maturity levels (ITGI, 2010), and considering the same criteria, the maturity level for this function corresponds to level 4. The maturity levels obtained for this function within the parameters of COBIT are shown in Table 3:

Elements of Services                     Results   Normalization   Maturity
Vector 1 – Architecture and Systems      80%       4.0             4
Vector 2 – Infrastructure Services       76%       3.8             4
Vector 3 – Innovation                    %
Vector 4 – Managerial Method (ITIL)      81%       4.0             4
Vector 5 – Portfolio of Services         80%       4.0             4

Table 3 – Services Function


The interpolation of the results shown in Table 3 indicates that the maturity of the elements of the Services function is at level 4, mature, according to the convention adopted here. This indicates that the organizations surveyed make massive use of the best market practices to manage IT in an appropriate manner and with higher standards of control.

4.4 Quality and Security Function

The Quality and Security function is analyzed here in terms of the main results obtained for four basic vectors: (1) communication in IT and its metrics and indicators related to the business; (2) the risk management framework; (3) the structure and use of internal controls; and (4) the quality management of third parties and the respective contracts. We examine the suitability of these vectors for the specific regulatory environment in which each organization operates. The goal is to ensure that services keep their quality, with the necessary adherence of IT to business objectives, considering costs and deadlines for tasks and project demands. We also examine whether IT products, whether produced internally or purchased in the market, are delivered to users and evaluated in terms of services and their expected quality. It is assumed that the rigor in monitoring quality should be the same for internal and external service providers, and that the user is always seen as a customer. In terms of vector 1, communication between IT and the business areas (Chart 13) seems to be rather inward-looking, with little socializing. The evidence appears in the results showing that just over 40% of CIOs responded that they have quality indicators that support business processes. This figure rises slightly (44%) for the function of monitoring attackers and communicating quality to users, but it remains below 40% when looking at the performance monitoring of IT services in order to generate daily indicators for the business areas (39%).


Chart 13 – Communication of IT / Business

The inward-looking nature of IT management in the organizations surveyed becomes most apparent when comparing these data with the results on knowledge of the indicators of IT business goals. In around 76% of the organizations surveyed, CIOs say IT knows the indicators. However, such knowledge is not accompanied by IT quality criteria that match the business: about 60% of organizations do not have such indicators. This fact has a strong implication for the issue of aligning IT with the business. In fact, if the results generated by business indicators were shared together with IT quality indicators, a better strategic alignment between both areas could emerge, since the socialization of results establishes a strong commitment and inspires greater cooperation between areas. Regarding the second vector of analysis, risk management involving the maintenance of proper internal controls in IT, we observe that internal controls of IT are formalized in 76% of the organizations surveyed (Chart 15). This probably happens because of the adoption and use of the COBIT, ISO, and BS management methodologies (34%, 32%, and 30%, respectively, seen in Chart 11) by IT executives aiming to improve the governance environment, and because more than 50% of the organizations surveyed are subject to market rules for publicly traded companies. It is remarkable that more attention is being given to internal controls in order to reach maturity in process control. The research shows that 64% of IT executives (Chart 14) seek to understand the maturity level of their internal control processes through internal and external benchmarking (other organizations, or organizations in the same conglomerate) and independent audits (52%). The evaluation of internal controls aims to identify potential risks and to verify the utility and effectiveness of the controls, in order to redirect the IT plan and improve the quality of the services provided.


Although the share (64%) of those who deem it important to reach maturity in internal controls is significant, this rate still appears relatively low given its importance as a factor in ensuring the quality of IT services.

Chart 14 – Internal Control Management

The third vector of the Quality and Security Assurance function refers to the quality management of third parties and their respective contracts. The profile of this vector is verified by analyzing the management of outsourcing. The results indicate that just over 30% of respondents do not manage such contracts (Chart 15), or do not even consider them important. One of the reasons for this failure in outsourcing management perhaps lies in the fact that this management belongs not to IT but to another area, such as the Legal Department of the organization. Or, less likely, IT executives assume the self-sufficiency of their systems in use, with new applications developed internally, on demand, without external help. We also asked IT executives about their concern with the contract management of outsourced projects and activities, the fourth vector of this function. We found that about 50% of them do, in fact, manage these contracts, but only 20% give it proper attention, maintaining and updating their outsourcing processes.


Chart 15 – IT Management of Outsourced Contracts and Projects

In relation to Service Level Agreements with providers, the survey results showed that, to ensure the quality of services, 54% of IT organizations establish Service Level Agreements with third-party providers. These are formal agreements, and they reflect the negotiation between the requirements of the clients (users) and IT's operational capability for a particular service. Although their use is a trend in the IT area, 36% of respondents declared that this is an important type of agreement which, however, they have not yet implemented. The identification of the maturity level according to COBIT (ITGI, 2010) for the vectors of the Quality and Security Assurance function was made considering the same criteria used in the previous functions, taking into account the vectors: (1) IT communication with the business; (2) risk management structure; (3) structure of internal controls; and (4) contract management of third parties. The maturity levels obtained for this function are shown in Table 4:

Elements of Quality and Security Function    Results   Normalization   Maturity
Vector 1 – IT Communication                  76%       3.8             4
Vector 2 – Risk Management                   54%       2.7             3
Vector 3 – Internal Control Management       76%       3.8             4
Vector 4 – Third Parties Agreement Manag.    69%       3.5             3

Table 4 – Quality and Security Assurance Function

The analysis of the data in Table 4 indicates a fairly balanced dispersion of maturity between the intermediate and mature levels across the vectors of this function. While communication of IT with the business areas of the organization and


management of internal control are already reaching maturity, risk management and third-party contracts remain at intermediate levels of maturity.

5. FINAL REMARKS AND CONCLUSIONS

The main objective of this research was to determine whether IT in the 100 top organizations based in Brazil, classified in the Guia Exame – Maiores e Melhores/2008, is managed as a business, from the perspective of the IT executives. To this end, the research studied how IT executives are managing the IT functions in which the Lutchen's (2003) gap can be detected. Thus, we characterized a conceptual scheme placing the key functions of this research (Alignment, Management, Services, and Quality and Security Assurance) as the elements of the IT Delivery Spectrum. We then identified the Lutchen's gap in those functions by determining the level of IT maturity for each function, using criteria adapted from version 4 of COBIT (ITGI, 2010), as described in the methodology. Data collection was quantitative, covering the Top 100 organizations based in Brazil, distributed over the ten most representative economic sectors. Analysis and interpretation of the survey data enabled us to reach some basic conclusions, which show a context determined by the answers of IT executives of large global organizations, very close to what is observed in the literature. A careful analysis of the results shows that the IT executives in this study seem willing to align their IT areas with the business they support. But there is a significant difference between reality and their wishes. While the vast majority (63%) claims to have a plan or to comply with alignment between the IT (ITSP) and Business (BSP) plans, a significant portion (37%) does not recognize alignment as an important factor in the management of IT activities.
This dichotomy between expected and observed alignment is also noted in the research of Luftman (2003), Zorello (2005), and Fernandes and Abreu (2008), whose findings indicate alignment problems and point out that, as a result, organizations may be allocating IT resources inappropriately. Given the less representative positioning of IT in the organizational hierarchy, IT strategies do not have the expected reach and effectiveness, and are often misaligned with business strategies. The subordination of IT to tactical rather than strategic levels does not contribute to the adequate alignment of IT and its control through commonly shared business indicators, such as the Balanced Scorecard, as noted by Fernandes and Abreu (2008). In terms of maturity in the use of processes that ensure the alignment of IT with business strategies, the results of this research are also consistent with the evidence described above, as they indicate that the level of IT maturity lies between maturity levels 2 and 3, under the COBIT (ITGI, 2010) criteria. This means that the Alignment function (with business strategies) shows evidence that it is still incipient for some vectors and intermediately mature for others. With respect to the Management function, one can notice that, in terms of IT budget investments, the results of this survey are consistent with other research, such as that conducted at MIT (Weill et al., 2002; Weill and Aral, 2006). Results of the


research are fairly consistent regarding the financial allocation to technologies and systems, infrastructure upgrades, communication, training, and technological innovation. Still, from the point of view of the Management function, the survey results show that the elements of the governance cycle are documented, automated, and disseminated at an intermediate level, considering best practices. Thus, we can assume that the maturity of IT among the organizations researched lies at the intermediate level (3) of the COBIT (ITGI, 2010) maturity scale. This means that the Management function shows evidence that its processes are more mature than those of the Alignment function, but still at an intermediate level. Concerning the Services function, despite evidence of some discrepancies with the results of other global surveys already mentioned, there are clear indications that the organizations surveyed seem to be running IT like a business. A major piece of evidence is the administrative function of planning and controlling IT activities. It may be noted that the organizations surveyed, in general, apply processes, management rules, budgeting and costing, and use parameters, measurable indicators, and control metrics proper to the administrative procedures of any business. Lutchen (2003) believes that the main contribution of the Services function is associated with how to accelerate and innovate organizations through consistent IT delivery. Data from this study show that all the IT executives have adopted a model of management practices (whether ITIL, COBIT, PMI, ISO 27001, or similar) to manage their services, demands, operations, and assets. Among the listed practices, ITIL is the most widely adopted (by almost 50% of IT executives). This shows that IT documents internal processes following standard models, but neglects the creation of automated structures in IT based on the best market practices.
It is also worth noting that the data for the Services function, interpolated on the COBIT (ITGI, 2010) management maturity scale, indicate that the organizations surveyed are mature, at maturity level 4. Finally, the Quality and Security Assurance function shows that a significant portion of IT executives are monitoring events, but not measuring the performance of IT as a whole. There are clear signs of individual controls, such as the monitoring and reporting of disruptive events (such as incident and problem management) to business managers (41%), validation and periodic auditing of internal controls (50%), and the requirement of SLAs (49%) and SOWs (27%) from suppliers. However, such controls are still "compartmentalized" among the vectors that compose the function. They are not integrated and do not coexist in a coherent way that would allow indicating the overall performance of IT. Assessing the maturity of the existing quality processes in the organizations surveyed, according to the COBIT (ITGI, 2010) maturity scale, there is evidence that the Quality and Security function is located between maturity levels 3 and 4. In other words, the quality function of IT management can be ranked between the intermediate and mature levels. In general terms, as it is designed, IT management tends to consume more of IT executives' time with routine tasks than with innovation, which seems not to be a priority among them. The results indicate that most executives plan innovation projects for the future rather than for the present. IT executives prefer a partnering leadership in innovation to a proactive leadership role.


Considering that three of the four functions evaluated in this study show maturity rates between incipient and intermediate, and only one function shows a higher level of maturity, we can say that IT is managed, by the majority of IT executives in the organizations located in Brazil, much more as a "solution provider" than as an innovation tool, and even less as a competitive differentiator. IT is thus dedicated only to fulfilling a functional task in the structure of organizations. For better clarity, we designed a chart showing the level of maturity of the functions analyzed in this investigation. Figure 1 shows iconographically the specific levels of maturity for the IT functions Alignment, Management, Services, and Quality & Security, summarizing the Lutchen's gap.

[Figure 2 – Summarizing maturity in IT functions: chart mapping the macro factors Alignment, Management, Services and Quality & Security onto the maturity bands 1 and 2, 3, and 4 and 5]

A summary of the results shown in Figure 2 demonstrates that the profile of IT management in the top 100 companies based in Brazil does not have the maturity level needed to systemically integrate objectives, goals and business processes. The "operational view of IT" described by Lutchen (2003) emerges as the emphasis of the IT management model among the organizations researched. Thus, measured against the requirements for effective IT management discussed by Shpilberg et al. (2007) and Lutchen (2003), there is evidence that, in the daily life of these organizations, IT plays a purely operational role, that of a service supplier. In the context of the results discussed so far, it must also be noted that, although management tools exist and are being used, the Management function, the necessary link between the Alignment and Services functions, does not show the maturity expected to raise the maturity level of the Alignment function, the most incipient of the functions evaluated. It is evident, therefore, that alignment remains the main bottleneck among the IT functions, characterizing Lutchen's (2003) gap. This IT functional gap obliterates efforts to manage IT as a business, in which, according to Lutchen, IT should have greater influence on business processes and on the overall performance of the organization. The results of this research reveal several inefficiencies in management, governance, control and the use of quality indicators, creating barriers that prevent IT from playing its influencing role in the overall business performance of the organization.
Despite the higher maturity levels of the other functions, the results for the Services function, when compared with the Alignment function, again confirm Lutchen's (2003) gap, given the dichotomy in the relationship between IT and business managers at the intermediate level. It seems clear, therefore, that aligning IT



Simões, S. A., Rodrigues, L. C., Maccari, E. A., Pereira, M. F.

processes with business needs, delivering quality services under best practices and distributing costs across users' departments is a less common managerial practice in the larger organizations based in the country. Finally, the results of the survey lead to the conclusion that the IT managerial gap pointed out by Lutchen is present in the organizations surveyed. The limitations of this research relate to the fact that the survey was answered only by IT executives, and not also by business executives; considering both sides of IT management (providers and users) would have allowed a more complete understanding of the associated problems. For future work, therefore, it is suggested that the research also be applied to business executives, to verify whether the weaknesses pointed out by IT people are similar to the ones eventually pointed out by business people.

REFERENCES

Albertin, A. L.; Sanchez, Otávio Próspero (Org.). (2008). Outsourcing de TI: impactos, dilemas, discussões e casos reais. Rio de Janeiro: FGV.

Borland. (2006). A comprehensive technology portfolio management process. Borland IT Management & Governance Solution, p. 3-21. Available at: http://www.borland.com/resources/en/pdf/white_papers/technology_portfolio_management.pdf. Accessed: 10 Mar. 2010.

Brege, Staffan; Brehmer, Per-Olof; Rehme, Jakob. (2008). Managing supplier relations with balanced scorecard. Int. J. Knowledge Management Studies, v. 2, n. 1, p. 147-161.

Corrêa, P. M. (2006). Um estudo sobre a implantação da governança de TI com base em modelos de maturidade. Centro Estadual de Educação Tecnológica Paula Souza. Available at: http://www.centropaulasouza.sp.gov.br/Posgraduacao/Trabalhos/Dissertacoes/DM_Tecn_Paulo_Correa.pdf. Accessed: 10 Mar. 2010.

Craig, David; Tinaikar, Ranjit. (2006). Divide and conquer: Rethinking IT strategy. McKinsey Quarterly. Available at: http://www.mckinseyquarterly.com/Divide_and_conquer_Rethinking_IT_strategy. Accessed: 26 Feb. 2010.

Fernandes, Aguinaldo Aragon; Abreu, Vladimir Ferraz de. (2008). Implantando a governança de TI: da estratégia à gestão de processos e serviços. 2. ed. Rio de Janeiro: Brasport.

IDC. (2006). IT project and portfolio management and the application life cycle: understanding the market and enabling IT/business coordination. Framingham. Available at: http://www.cio.co.uk/whitepapers/3684/it-project-and-portfolio-management-and-the-application-life-cycle/. Accessed: 08 Sep. 2010.

ITGI – IT Governance Institute. (2010). COBIT Framework for IT Governance and Control, version 4. Rolling Meadows. Available at: http://www.isaca.org/Knowledge-Center/cobit/Pages/Overview.aspx. Accessed: 12 Mar. 2010.

Lobler, Mauri Leodir; Bobsin, Débora; Visentini, Monize Sâmara. (2008). Alignment between the strategic business plan and the plan of information technology at




companies: the comparative analysis through the maturity level and critical success factors. Journal of Information Systems and Technology Management, v. 5, n. 1, p. 37-60.

Luftman, J. N. (2003). Managing the information technology resource: leadership in the information age. Rio de Janeiro: Prentice Hall.

Lutchen, Mark. (2003). Managing IT as a business: a survival guide for CEOs. New York: John Wiley and Sons.

Marwaha, Sam; Seth, Parul; Tanner, David W. (2005). What global executives think about technology and innovation. McKinsey Quarterly. Available at: http://www.mckinseyquarterly.com/What_global_executives_think_about_technology_and_innovation_1653. Accessed: 25 Feb. 2010.

Marwaha, Sam; Willmott, Paul. (2005). Managing IT for scale, speed and innovation. McKinsey Quarterly, September, p. 15-21. Available at: http://www.mckinseyquarterly.com/Managing_IT_for_scale_speed_and_innovation_1848. Accessed: 25 Feb. 2010.

Miles, M. B.; Huberman, A. M. (1994). Qualitative data analysis: an expanded sourcebook. 2 ed. Thousand Oaks: Sage.

Prahalad, C. K.; Krishnan, M. S. (2002). The dynamic synchronization of strategy and information technology. MIT Sloan Management Review, v. 43, n. 4, p. 24-33.

Prahalad, C. K. (2006). CIOs hold key to operational excellence. Optimize, v. 5, n. 5, p. 66.

Porter, M. E. (1986). Estratégia competitiva: técnicas para análise da indústria e da concorrência. Rio de Janeiro: Campus.

Rezende, D. A. (2002). Tecnologia da informação integrada à inteligência empresarial: alinhamento estratégico e análise da prática nas organizações. São Paulo: Atlas.

Rodrigues, L. C.; Maccari, E.; Simões, S. A. (2009). O desenho da gestão da tecnologia da informação nas 100 maiores empresas na visão dos executivos de TI. Revista de Gestão da Tecnologia e Sistemas de Informação, v. 6, n. 3, p. 483-506.

Shpilberg, David; Berez, Steve; Puryear, Rudy; Shah, Sachin. (2007). Avoiding the alignment trap in information technology. MIT Sloan Management Review, v. 49, n. 1, p. 51-58.
Takanen, Tiina. (2008). The changing role of the CIO – Is the CIO an IT expert or a business executive? Available at: http://hsepubl.lib.hse.fi/FI/ethesis/pdf/12001/hse_ethesis_12001.pdf. Accessed: 06 Nov. 2010.

Weill, Peter; Aral, Sinan. (2006). Generating premium returns on your IT investments. MIT Sloan Management Review, v. 47, n. 2, p. 39-48.

Weill, Peter; Ross, Jeanne. (2004). IT Governance. Boston: Harvard Business School Press.

Weill, Peter; Subramani, Mani; Broadbent, Marianne. (2002). Building IT infrastructure for strategic agility. MIT Sloan Management Review, v. 44, n. 1, p. 57-65.

Zorello, G. Metodologias COBIT e ITIL e as perspectivas do Modelo de




Alinhamento Estratégico de TI. XII SIMPEP, p. 2-4. Available at: http://www.consulting.com.br/edsonalmeidajunior/admin/downloads/cobit.pdf. Accessed: 20 Dec. 2010.



Revista de Gestão da Tecnologia e Sistemas de Informação Journal of Information Systems and Technology Management Vol. 8, No. 3, 2011, p. 749 ISSN online: 1807-1775

Conference / Congresso

9th CONTECSI – International Conference on Information Systems and Technology Management
May 30 to June 1st, 2012 – FEA USP, São Paulo, SP, Brazil

The 9th International Conference on Information Systems and Technology Management – CONTECSI is an event focusing on Information Systems and Technology Management from a multidisciplinary perspective. CONTECSI aims to bring together academics and professionals involved in IT and systems management for a state-of-the-art discussion. International researchers are expected to contribute to the integration between the academic and professional communities. The Conference welcomes paper submissions for presentation and panel discussions. Major topics of interest include, but are not limited to: Information Society, Open Systems, Systems Interfacing and Integration, Wireless Computing, Entrepreneurship in IT and IS, Accounting Information Systems, E-Commerce / E-Business, Software Engineering, ERP Systems, Financial Management in Information Systems, IT Strategic Management, etc. All papers will be subject to a blind review process, and full papers will be published in the Conference Proceedings (CD).

Deadline: January 31st, 2012

More information: http://www.tecsi.fea.usp.br/eventos/contecsi
Chair: Prof. Edson Luiz Riccio, PhD – FEA USP and TECSI
Contact: contecsi@usp.br

9º CONTECSI – Congresso Internacional de Gestão da Tecnologia e Sistemas de Informação
30 de Maio a 1 de Junho de 2012 – FEA USP, São Paulo, SP, Brasil

O 9º Congresso Internacional de Gestão da Tecnologia e Sistemas de Informação – CONTECSI visa reunir acadêmicos e profissionais envolvidos com a temática de gestão para discussão do estado da arte deste campo. Atualmente este campo encontra-se disperso em áreas específicas, carecendo de uma visão holística e integrada do assunto. O CONTECSI contará com a presença de palestrantes de renome, bem como estará aberto para a recepção de trabalhos para serem apresentados em sessões paralelas e painéis. Assim como nos anos anteriores, são esperados personalidades, professores e pesquisadores do Brasil e do exterior, principalmente de universidades da França, Inglaterra, Espanha, México, Portugal, Chile, Argentina, Colômbia, Uruguai e Venezuela, entre outras. Os focos de interesse deste congresso incluem todas as abordagens referentes à gestão da Tecnologia e dos Sistemas de Informação nas instituições públicas e privadas e na sociedade em geral.

Data final para submissão de artigos: 31 de Janeiro de 2012

Mais informações no site: http://www.tecsi.fea.usp.br/eventos/contecsi Coordenação: Prof. Dr. Edson Luiz Riccio – FEA USP e TECSI Contato: contecsi@usp.br


JISTEM Journal of Information Systems and Technology Management Revista de Gestão da Tecnologia e Sistemas de Informação ISSN online: 1807–1775

Every four months/Quadrimestral

1) Paper Submission Guidelines

Register at "Online Submissions" and submit your paper according to the JISTEM guidelines.

a) Manuscript style

Articles must be submitted in English, Spanish, Portuguese or French, in MS-Word format. Authors must translate the final version of the article into English. The first page must present: title of the article, authors' full names, affiliations, full address, telephone, e-mail, fax and a brief curriculum vitae. Limit of 3 co-authors per article. The second page must present: title of the article, an abstract of about 100 words in the original language of the article, the area and 5 keywords (if accepted, an abstract and keywords in English will be required). Articles must be limited to 30 double-spaced pages, in Arial or Times New Roman, 12 points. Figures and graphics must be included in high resolution, 300 dpi (jpg or gif), numbered in Arabic numerals and with complete titles; each table or figure must be referenced in the text. Authors must submit questionnaires and research results to the editor for review purposes. Acknowledgments to institutions for financial support may be included only in the final accepted version.

b) Structure and style

Articles should clearly present: abstract, introduction, objectives, justification, research question, literature review, research method, results, conclusion, recommendations and limitations, plus references. References must follow the American Psychological Association (APA) guidelines; more detailed explanations and examples can be found at http://www.apastyle.org/faqs.html or in the Publication Manual of the American Psychological Association (6th ed., 2010). The list of references must be presented in alphabetical order. A glossary may be included at the end of the article if needed.

2) Book Review

Book reviews should be sent to Prof. Edson Luiz Riccio at jistem@usp.br

R. Gest. Tecn. Sist. Inf. /JISTEM Journal of Information Systems and Technology Management, Brazil


Contributions / Submissão de Artigos


1) Instruções para submissão de artigos

a) Quanto à formatação

Os artigos submetidos para publicação, em inglês, espanhol, português ou francês, devem ser enviados em formato MS-Word. Após o aceite, os autores devem traduzir o artigo para o idioma inglês. Na primeira página do artigo devem constar: título, subtítulo (se houver), tema, nome, instituição, departamento, endereço, telefone, fax e e-mail do autor e co-autores (máximo de 3 co-autores) e breve curriculum que indique sua formação, instituição/empresa a que pertence e sua área atual de trabalho. Na segunda página do artigo devem constar: título, subtítulo (se houver), tema e resumo na língua original do artigo, com aproximadamente 100 palavras, e 5 (cinco) palavras-chave. Se o artigo for aceito para publicação, será solicitado o envio do título, abstract e palavras-chave em inglês. Os artigos deverão ter no máximo 30 páginas em espaço duplo, fonte Arial ou Times New Roman, tamanho 12. As figuras e gráficos devem estar em alta qualidade, com resolução de 300 dpi e extensão jpg e/ou gif. Cada ilustração deve conter numeração e legenda, e deve ser feita referência à figura ou tabela no corpo do texto. Questionários e resultados da pesquisa devem ser enviados para a avaliação do Editor e dos pareceristas. Agradecimentos a órgãos de financiamento da pesquisa devem ser incluídos apenas na versão final do artigo, após o aceite.

b) Quanto à estrutura

Os artigos enviados devem conter em seus tópicos os seguintes itens: Resumo, Introdução, Objetivos, Justificativa, Problema/Questão, Revisão da Literatura, Metodologia, Resultados, Conclusão, Recomendações, Limitações e Referências Bibliográficas. As citações e referências devem seguir o estilo da APA (http://www.apastyle.org/). As referências deverão ser apresentadas no corpo do texto, incluindo o sobrenome do autor, a data de publicação e o número de página (se for o caso), conforme normas da APA. As referências bibliográficas completas do(s) autor(es) citados deverão ser apresentadas em ordem alfabética, no final do texto, de acordo com as normas da APA. Para maiores informações: American Psychological Association (APA). (2010). Publication Manual of the American Psychological Association (6th ed.). Washington, DC. Poderá ser incluído um glossário ao final do artigo, caso o autor julgue necessário.

2) Sugestões de livros para resenha

Resenhas devem ser enviadas para o Prof. Edson Luiz Riccio pelo e-mail: jistem@usp.br

Vol.8, No. 3, 2011, p.750-751



Editor: E. L. Riccio (2004-presente/present)

Avaliadores Ad Hoc – 2011 / Ad Hoc Reviewers – 2011:
Adilson Luiz Pinto, Adonai Teles, Alberto Medeiros, Alexandre Graeml, Alexander Brem, Andrea Padovan Jubileu, Andressa Pacheco, António Espírito Santo, Adolfo Alberto Vanti, Carlos Fernando Jung, Carlos Henrique Medeiros de Souza, Cristiano Das Neves Almeida, Cristina Dai Prá Martens, Daniel Coronel, Daniel Estima Carvalho, Elaine Tavares, Emerson Antonio Maccari, Emilio Jose Montero Arruda Filho, Evandro Marcos Saidel Ribeiro, Fernando Hadad Zaidan, Fernando Santos Portugal, Feruccio Bilich, George Leal Jamil, Gilberto Perez, Gregório Varvakis, Humberto Rito Ribeiro, Ildeberto Aparecido Rodello, Ivam Ricardo Peleias, Jaime Crozatti, Joana Coeli Ribeiro Garcia, Joanna Kamiska, João Leomar Todesco, José Amaro dos Santos, José Carlos Cavalcanti, Jose Rodrigues Filho, Julio Araujo Carneiro da Cunha, Leandro Faria Lopes, Leandro Liberio da Silva, Leomar dos Santos, Lillian Maria Araújo de Rezende Alvares, Luciano Augusto Toledo



Editorial Information 2011


Luiz Antonio Pereira Neves, Malik Faisal Azeem, Marcos de Moraes Sousa, Maria Alexandra Viegas Cortez da Cunha, Marici C. Gramacho Sakata, Maurício Rissi, Miguel Juan Bacic, Octavio Ribeiro de Mendonca Neto, Paulo Caetano da Silva, Patricia de Sá Freire, Pedro Luiz Côrtes, Raquel Janissek Muniz, Renato de Campos, Ricardo Matheus, Rogerio Lacerda, Rosangela Mesquita Ayres, Saulo Barbará de Oliveira, Serje Schmidt, Susana Azevedo, Teodoro Guimaraes, Theodoro Peters, Turkka Näppilä, Valdemar Setzer, Valter de Assis Moreno Jr., Yóris Linhares Souza

Editorial Information - Informações Editoriais – 2011

Total of Published Texts / Total de Textos Publicados: 33
Articles / Artigos Inéditos: 32
Research Communication / Comunicação: 0
Conference Reports / Relatórios de Conferências: 1
Authors from Brazil / Autores do Brasil: 63
Authors Outside Brazil / Autores do Exterior: 19
Articles in English / Artigos em Inglês: 26
Articles in Spanish / Artigos em Espanhol: 3
Articles in Portuguese / Artigos em Português: 2
Articles in French / Artigos em Francês: 1

Time from submission to feedback including review / Tempo de submissão à resposta aos autores com revisão:
1-16 semanas/weeks: 10%
17-40 semanas/weeks: 75%
41+ semanas/weeks: 15%
Tempo/Range: 1-42; Média/Average: 29 semanas/weeks
Índice de rejeição / Rejection rate: 60%

JISTEM, Brazil Vol. 8, No. 3, Sept/Dec. 2011, pp. 752-764

www.jistem.fea.usp.br



Proofreading Services / Revisão: Contact Language

Index per author 2011 / Índice por autor 2011

ABARESHI, Ahmad. ICTS – new organizational form linkage in the Australian context: Theoretical model and research instrument. Volume 8, Número 3 – Set – Dez, 2011
AGUILAR, Gustavo Cossío, Universidad Autónoma de Puebla, Mexico. Aplicación de las tecnologías de la información para la creación de arte de nuevos medios: bio-lencia (in guns we trust) como caso de estudio. Volume 8, Número 3 – Set – Dez, 2011
ALONSO, Luiza Beth Nunes, Catholic University of Brasília, Brazil. Certificação Digital no Governo Eletrônico Brasileiro / Digital Certification in the Brazilian E-government. Volume 8, Número 2 – Set – Dez, 2011
ANDRADE, Helga Patricia Bermeo, Universidad de Ibagué, Colombia. Exploring Business Competitiveness in High Technology Sectors: An Empirical Analysis of the Mexican Software Industry. Volume 8, Número 2 – Set – Dez, 2011
ANDRADE, Rafael Alejandro Espín, Universidade Técnica de Havana, Cuba. Aligning information security with the image of the organization and prioritization based in fuzzy logic for the industrial automation sector. Volume 8, Número 3 – Set – Dez, 2011
ANGELONI, Maria Terezinha, Pierre Mendès France University, France. CRM as a Support for Knowledge Management and Customer Relationship. Volume 8, Número 1 – Set – Dez, 2011
ARMADA de Oliveira, Marcius, Centro Universitário da Cidade do Rio de Janeiro, Brazil. Recommender Systems in Social Networks. Volume 8, Número 3 – Set – Dez, 2011
ARROYO, Cristiane Sonia, University of São Paulo, Brazil. Use and Development of Health Information Systems: The Experience of an Organizational Unit Responsible for the Technological Services at Public Hospitals. Volume 8, Número 1 – Set – Dez, 2011
BAÑALES, Dora Luz González, Instituto Tecnológico de Durango, México. Exploring Business Competitiveness in High Technology Sectors: An Empirical Analysis of the Mexican Software Industry. Volume 8, Número 2 – Set – Dez, 2011




BARBOSA, Eduardo Batista de Moraes, National Institute for Space Research (INPE), Brazil. Data Information System to Promote the Organization Data Collections – Modeling Considerations by the Unified Modeling Language (UML). Volume 8, Número 1 – Set – Dez, 2011
BAX, Marcello Peixoto, Universidade Federal Minas Gerais – UFMG, Brazil. Semantic wikis and the collaborative construction of ontologies: case study. Volume 8, Número 3 – Set – Dez, 2011
BECKER, Fábio Dídimo, Universidade de Caxias do Sul, Brazil. Application de la Veille Anticipative Strategique pour le suivi de l'environnement et la Production de Connaissances Actionables / The Use of Anticipative and Strategic Intelligence for Scanning the Environment and to Produce Actionable Knowledge. Volume 8, Número 2 – Set – Dez, 2011
BRAGA, Lamartine Vieira, University of Brasilia, Brazil. Certificação Digital no Governo Eletrônico Brasileiro / Digital Certification in the Brazilian E-government. Volume 8, Número 2 – Set – Dez, 2011
CAPELLINI, Gustavo de Almeida, University of São Paulo, Brazil. Gestão do conhecimento em transnacionais: o ambiente organizacional como instrumento disseminador / Knowledge Management in Transnational Organizations: The Organizational Environment as a Scatter Instrument. Volume 8, Número 1 – Set – Dez, 2011
CARVALHO, Frederico A. de, Federal University of Rio de Janeiro, Brazil. Evaluation of the Perceived Quality of the Website of an Online Bookstore: an Empirical Application of the Carnes and Vidgen Model. Volume 8, Número 1 – Set – Dez, 2011
CONICET, Maria Verónica Alderete, Instituto de Investigaciones Económicas y Sociales del Sur (IIESS), Argentina. Networks Versus ICT Use: The Case of SME from Bahía Blanca, Buenos Aires (Argentina). Volume 8, Número 2 – Set – Dez, 2011
CORSO, Kathiane Benedetti, Universidade Federal do Rio Grande do Sul, RS, Brasil. Understanding the behavior of the subject in interaction with a decision support system under time pressure and missing information. Volume 8, Número 3 – Set – Dez, 2011
CÔRTES, Eliana Golfette de Paula, University Gama Filho, Brazil. Hospital Information Systems: a Study of Electronic Patient Records. Volume 8, Número 1 – Set – Dez, 2011
CÔRTES, Pedro Luiz, University Nove de Julho, Brazil.




Hospital Information Systems: a Study of Electronic Patient Records. Volume 8, Número 1 – Set – Dez, 2011
CRUZ, Mateus Silqueira Hickson, Ceu System - São Paulo, Brazil. XLDM: An XLink-Based Multidimensional Metamodel. Volume 8, Número 3 – Set – Dez, 2011
CUNHA, Julio Araújo Carneiro da, University of São Paulo, Brazil. Gestão do conhecimento em transnacionais: o ambiente organizacional como instrumento disseminador / Knowledge Management in Transnational Organizations: The Organizational Environment as a Scatter Instrument. Volume 8, Número 1 – Set – Dez, 2011
CURIA, Lisandro, Universidad Nacional del Comahue, Buenos Aires, Argentina. Estrategias de decisión en sistemas dinámicos: aplicando mapas cognitivos difusos – aplicación a un ejemplo socio-económico. Volume 8, Número 3 – Set – Dez, 2011
DOLCI, Pietro Cunha, Federal University of Rio Grande do Sul, Brazil. The Dimensions of IT Portfolio Management (ITPM): An Analysis Involving IT Managers in Brazilian Companies. Volume 8, Número 2 – Set – Dez, 2011
ESTIVALETE, Vania de Fátima Barros, Federal University of Santa Maria, Brazil. A Vision of Orkut's Users: Studying this Phenomenon Through Cognitive Absorption. Volume 8, Número 1 – Set – Dez, 2011
FERNEDA, Edilson, Catholic University of Brasília, Brazil. Certificação Digital no Governo Eletrônico Brasileiro / Digital Certification in the Brazilian E-government. Volume 8, Número 2 – Set – Dez, 2011
FERREIRA, Alessandra Henriques, University of São Paulo, Brazil. Use and Development of Health Information Systems: The Experience of an Organizational Unit Responsible for the Technological Services at Public Hospitals. Volume 8, Número 1 – Set – Dez, 2011
FERREIRA Neto, Arthur Nunes, Catholic University of Brasilía, Brazil. Metamodels of information technology best practices frameworks. Volume 8, Número 3 – Set – Dez, 2011
FETZNER, Maria Amélia de Mesquita, Federal University of Rio Grande do Sul, Brazil. Business Intelligence (BI) Implementation from the Perspective of Individual Change. Volume 8, Número 1 – Set – Dez, 2011
FREITAS, Henrique, Federal University of Rio Grande do Sul, UFRGS, Brazil. Business Intelligence (BI) Implementation from the Perspective of Individual Change.




Volume 8, Número 1 – Set – Dez, 2011
FREITAS, Henrique, Federal University of Rio Grande do Sul, UFRGS, Brazil. Application de la Veille Anticipative Strategique pour le suivi de l'environnement et la Production de Connaissances Actionables / The Use of Anticipative and Strategic Intelligence for Scanning the Environment and to Produce Actionable Knowledge. Volume 8, Número 2 – Set – Dez, 2011
GIARDINO, Adriano, Institute of Nuclear and Energy Research (IPEN), São Paulo, Brazil. The Development of an Enterprise Resource Planning System (ERP) for a Research and Technology Institute: the case of the IPEN. Volume 8, Número 1 – Set – Dez, 2011
GRIPE, Fernando Gustavo dos Santos, University of São Paulo, Ribeirão Preto, USP, Brazil. A theoretical analysis of key points when choosing open source ERP systems. Volume 8, Número 2 – Set – Dez, 2011
HERRERO, Concepción Pérez de Celis, Universidad Autónoma de Puebla, Mexico. Aplicación de las Metodologías Ágiles en el Proceso de Producción de Piezas de Arte de Nuevos Medios: Bio-Lencia como Caso de Estudio / Application of agile software methodologies in new media art: bio-lencia as a study case. Volume 8, Número 2 – Set – Dez, 2011
JANISSEK-MUNIZ, Raquel, L'Ecole de Administration, UFRGS, Brazil. Application de la Veille Anticipative Strategique pour le suivi de l'environnement et la Production de Connaissances Actionables / The Use of Anticipative and Strategic Intelligence for Scanning the Environment and to Produce Actionable Knowledge. Volume 8, Número 2 – Set – Dez, 2011
JOHANN, Silvio Luiz, Fundação Getúlio Vargas, Rio de Janeiro, Brazil. Aligning information security with the image of the organization and prioritization based in fuzzy logic for the industrial automation sector. Volume 8, Número 3 – Set – Dez, 2011
KARIM, Akram Jalal, Manama, Kingdom of Bahrain. The Significance of Management Information Systems for Enhancing Strategic and Tactical Planning. Volume 8, Número 2 – Set – Dez, 2011
KNORST, André Marcelo, Coester Automação Ltda - São Leopoldo, RS, Brazil. Aligning information security with the image of the organization and prioritization based in fuzzy logic for the industrial automation sector. Volume 8, Número 3 – Set – Dez, 2011
LAVALLE, Andrea, Universidad Nacional del Comahue, Buenos Aires, Argentina. Estrategias de decisión en sistemas dinámicos: aplicando mapas cognitivos difusos – aplicación a un ejemplo socio-económico. Volume 8, Número 3 – Set – Dez, 2011




LEONIDIO, Ueliton da Costa, Catholic University of Petrópolis, Brazil. Evaluation of the Perceived Quality of the Website of an Online Bookstore: an Empirical Application of the Carnes and Vidgen Model. Volume 8, Número 1 – Set – Dez, 2011
LESCA, Humbert, Université Pierre Mendès France (UPMF), Grenoble, France. Application de la Veille Anticipative Strategique pour le suivi de l'environnement et la Production de Connaissances Actionables / The Use of Anticipative and Strategic Intelligence for Scanning the Environment and to Produce Actionable Knowledge. Volume 8, Número 2 – Set – Dez, 2011
LÖBLER, Mauri Leodir, Federal University of Rio Grande do Sul, Brazil. A Vision of Orkut's Users: Studying this Phenomenon Through Cognitive Absorption. Volume 8, Número 1 – Set – Dez, 2011
LÖBLER, Mauri Leodir, Federal University of Rio Grande do Sul, Brazil. Understanding the behavior of the subject in interaction with a decision support system under time pressure and missing information. Volume 8, Número 3 – Set – Dez, 2011
LUCIANO, Edmara Mezzomo, Pontific Catholic University of Rio Grande do Sul, Brazil. Controles de Governança de Tecnologia da Informação para a terceirização de processos de negócio: Uma proposta a partir do COBIT / Controls of Information Technology management for business processes outsourcing based on COBIT. Volume 8, Número 1 – Set – Dez, 2011
MAÇADA, Antônio Carlos Gastaud, Brazil. The Dimensions of IT Portfolio Management (ITPM): An Analysis Involving IT Managers in Brazilian Companies. Volume 8, Número 2 – Set – Dez, 2011
MACCARI, Emerson Antonio, UNINOVE, São Paulo, SP, Brazil. Managing IT as a business: The Lutchen's Gap in the 100 Top Organizations based in Brazil. Volume 8, Número 3 – Set – Dez, 2011
MARTIN, Bill, RMIT University, Australia. ICTS – new organizational form linkage in the Australian context: Theoretical model and research instrument. Volume 8, Número 3 – Set – Dez, 2011
MARTÍNEZ, María Teresa Gutiérrez, Universidad Autónoma de Puebla, Mexico. Aplicación de las Metodologías Ágiles en el Proceso de Producción de Piezas de Arte de Nuevos Medios: Bio-Lencia como Caso de Estudio / Application of agile software methodologies in new media art: bio-lencia as a study case.



Volume 8, Número 2 – Set – Dez, 2011
MATALLANA, Adriana Maritza, Corporación Universitaria Minuto de Dios, Colombia. Teach-me: implementation of mobile environments to the teach-learning. Volume 8, Número 1 – Set – Dez, 2011
MOLLA, Alemayehu, RMIT University, Australia. ICTS – new organizational form linkage in the Australian context: Theoretical model and research instrument. Volume 8, Número 3 – Set – Dez, 2011
MONTEZANO, Roberto Marcos da Silva, Faculty IBMEC, Brazil. Evaluation of the Perceived Quality of the Website of an Online Bookstore: an Empirical Application of the Carnes and Vidgen Model. Volume 8, Número 1 – Set – Dez, 2011
MOTA, Flávio Perazzo Barbosa, Universidade Federal da Paraíba, Brazil. Public E-Procurement and the Duality of Technology: A Comparative Study in the Context of Brazil and of the State of Paraíba. Volume 8, Número 2 – Set – Dez, 2011
NOVI, Juliana Chiaretti, Universidade Federal de São Carlos – UFSCar, Brazil. E-SCM and Inventory Management: A Study of Multiple Cases in a Segment of the Department Store Chain. Volume 8, Número 2 – Set – Dez, 2011
OLIVEIRA, Marcio Mattos Borges de, University of São Paulo, Ribeirão Preto, SP, Brazil. Use and Development of Health Information Systems: The Experience of an Organizational Unit Responsible for the Technological Services at Public Hospitals. Volume 8, Número 1 – Set – Dez, 2011
OLIVEIRA, Marcio Mattos Borges de, University of São Paulo, Ribeirão Preto, SP, Brazil. E-SCM and Inventory Management: A Study of Multiple Cases in a Segment of the Department Store Chain. Volume 8, Número 2 – Set – Dez, 2011
OLIVEIRA, Sonia Valle W. Borges de, University of São Paulo, Brazil. Use and Development of Health Information Systems: The Experience of an Organizational Unit Responsible for the Technological Services at Public Hospitals. Volume 8, Número 1 – Set – Dez, 2011
PACAGNELLA Junior, Antonio Carlos, Universidade Federal de São Carlos – UFSCar, Brazil. E-SCM and Inventory Management: A Study of Multiple Cases in a Segment of the Department Store Chain. Volume 8, Número 2 – Set – Dez, 2011

JISTEM, Brazil Vol. 8, No. 3, Sept/Dec. 2011, pp. 752-764

www.jistem.fea.usp.br



JISTEM Journal of Information Systems and Technology Management Revista de Gestão da Tecnologia e Sistemas de Informação

PEREGRINO, Luis Eduardo Pérez, Corporación Universitaria Minuto de Dios, Colombia. Teach-me: implementation of mobile environments to the teach – learning. Volume 8, Número 1 – Set – Dez, 2011
PEREIRA, Mauricio Fernandes. Managing IT as a business: The Lutchen’s Gap in the 100 Top Organizations based in Brazil. Volume 8, Número 3 – Set – Dez, 2011
PÉREZ, Frey Rodríguez, Corporación Universitaria Minuto de Dios, Colombia. Teach-me: implementation of mobile environments to the teach – learning. Volume 8, Número 1 – Set – Dez, 2011
PINILLA, Alejandro Moreno, Corporación Universitaria Minuto de Dios, Colombia. Teach-me: implementation of mobile environments to the teach – learning. Volume 8, Número 1 – Set – Dez, 2011
PRADO, Edmir Parada Vasques, Escola de Artes, Ciências e Humanidades – USP, Brazil. Risk analysis in outsourcing of information technology and communication. Volume 8, Número 3 – Set – Dez, 2011
RICCIO, Edson Luiz, Universidade de São Paulo, Brazil. Resultados do 8º. CONTECSI / Outcomes of the 8th CONTECSI – International Conference on Information Systems and Technology Management. Volume 8, Número 2 – Set – Dez, 2011
RODELLO, Ildeberto Aparecido, Faculdade de Economia, Administração e Contabilidade de Ribeirão Preto, USP, Brazil. A theoretical analysis of key points when choosing open source ERP systems. Volume 8, Número 2 – Set – Dez, 2011
RODRIGUES, Leonel Cezar, UNINOVE, São Paulo, SP, Brazil. Managing IT as a business: The Lutchen’s Gap in the 100 Top Organizations based in Brazil. Volume 8, Número 3 – Set – Dez, 2011
RODRIGUES Filho, José, Universidade Federal da Paraíba, Brazil. Public E-Procurement and the Duality of Technology: A Comparative Study in the Context of Brazil and of the State of Paraíba. Volume 8, Número 2 – Set – Dez, 2011
ROSES, Luís Kalb, Catholic University of Brasília, Brasília, Brazil. Antecedents of End-User Satisfaction with an ERP System in a Transnational Bank: Evaluation of User Satisfaction with Information Systems. Volume 8, Número 2 – Set – Dez, 2011
SAKATA, Marici Gramacho, TECSI/University of Sao Paulo, Brazil.





Resultados do 8º. CONTECSI / Outcomes of the 8th CONTECSI – International Conference on Information Systems and Technology Management. Volume 8, Número 2 – Set – Dez, 2011
SALGADO Junior, Alexandre Pereira, Universidade Federal de São Carlos – UFSCar, Brazil. E-SCM And Inventory Management: A Study of Multiple Cases in a Segment of the Department Store Chain. Volume 8, Número 2 – Set – Dez, 2011
SENA, Galeno José de, Paulista State University, Brazil. Data Information System to Promote the Organization Data Collections – Modeling Considerations by the Unified Modeling Language (UML). Volume 8, Número 1 – Set – Dez, 2011
SILVA, Paulo Caetano da, Federal University of Pernambuco, Brazil. XLDM: An XLink-Based Multidimensional Metamodel. Volume 8, Número 3 – Set – Dez, 2011
SIMÕES, Sergio Alexandre. Managing IT as a business: The Lutchen’s Gap in the 100 Top Organizations based in Brazil. Volume 8, Número 3 – Set – Dez, 2011
SOUSA, Willy Hoppe de, Institute of Nuclear and Energy Research (IPEN), São Paulo, Brazil. The Development of an Enterprise Resource Planning System (ERP) for a Research and Technology Institute: the case of the IPEN. Volume 8, Número 1 – Set – Dez, 2011
SOUZA Neto, João, Catholic University of Brasília, Brazil. Metamodels of information technology best practices frameworks. Volume 8, Número 3 – Set – Dez, 2011
TESTA, Mauricio Gregianin, Pontifical Catholic University of Rio Grande do Sul, Brazil. Controles de Governança de Tecnologia da Informação para a terceirização de processos de negócio: Uma proposta a partir do COBIT / Controls of Information Technology management for business processes outsourcing based on COBIT. Volume 8, Número 1 – Set – Dez, 2011
TIMES, Valéria Cesário, Federal University of Pernambuco, Brazil. XLDM: An XLink-Based Multidimensional Metamodel. Volume 8, Número 3 – Set – Dez, 2011
TORIANI, Silvana, University of Southern Santa Catarina, Brazil. CRM as a Support for Knowledge Management and Customer Relationship. Volume 8, Número 1 – Set – Dez, 2011
TREZZA, Maria Aparecida H., Institute of Nuclear and Energy Research (IPEN), São Paulo, Brazil.






The Development of an Enterprise Resource Planning System (ERP) for a Research and Technology Institute: the case of the IPEN. Volume 8, Número 1 – Set – Dez, 2011
VALENTE, Nelma Terezinha Zubek, Universidade Estadual de Ponta Grossa, Brazil. Resultados do 8º. CONTECSI / Outcomes of the 8th CONTECSI – International Conference on Information Systems and Technology Management. Volume 8, Número 2 – Set – Dez, 2011
VALOIS B. Jr, Cleomar, Centro Universitário da Cidade do Rio de Janeiro, RJ, Brazil. Recommender Systems In Social Networks. Volume 8,