
Uses of management control and performance measurement systems by

Deans and Heads in Australian universities: their effects on research,

teaching and networking capabilities

Dennis Taylor

School of Accounting, RMIT University

and

Belete J. Bobe

*School of Accounting, Economics, and Finance, Deakin University

Contact details:

Professor Dennis Taylor

School of Accounting, RMIT University

Building 108, 239 Bourke St

Melbourne 3000

Email: [email protected]

Submitted to First RMIT Accounting Educators’ Conference, Melbourne, November,

2010


Abstract

Invoking a resource-based view (RBV), this study investigates relationships between

management control systems (MCSs) use, including information use from performance

measurement systems (PMSs), and organisational capabilities in the context of academic

units of Australian universities. Increased competition and attention to distinctive capabilities amongst universities, particularly at their strategic operating unit level of a Faculty¹ or School², provide the setting for applying this theoretical perspective.

Based on a questionnaire survey of all Faculty Deans and Heads of Schools in all 39

universities in Australia, evidence is provided on relationships between diagnostic and

interactive use of MCSs, attention given to imposed and discretionary types of PMS

information, the strength of capabilities of the academic unit and, in turn, overall

performance of the academic unit. Highlights of findings are that Heads/Deans conceived

capabilities of their unit in functional dimensions, not in generic dimensions as found in

prior literature; interactive MCS use and imposed performance measures, respectively,

direct relate to several types of capabilities and indirectly to performance of the academic

unit, but diagnostic MCS use does not. The findings have practical implications for styles

of control systems use and performance information use by management in universities.

Keywords: management control systems use, diagnostic, interactive, performance

measurement systems use, resource-based view, organisational capabilities,

performance, universities.

1 Faculty may also be called College, Division or Portfolio. It is headed by an Executive Dean or Pro-Vice-Chancellor. It comprises a set of academic Schools, Departments and/or Centres from cognate fields of study.

2 School may also be called Department. It is headed by a Head of School or Head of Department. It comprises one or more academic disciplines. It will normally be part of a Faculty.


1. Introduction

Over the past 15 years, the resource-based view (RBV) of the firm has become an influential

framework in the strategic management literature (2001, p. 537; Hoopes, Madsen, and Gordon

2003). The RBV conceptualises organisations as bundles of resources that can be used to

implement value-creating strategies (Barney 1991; Eisenhardt and Martin 2000) together with

capabilities that forge a link between resources and permit their strategic deployment (Day

1994). In the management accounting research literature, Henri (2006) has been the only study

to pursue research at the interface between MCS and strategy with the application of an RBV

framework. A large body of management accounting research has explored the relationship

between strategy and MCS in the absence of the RBV. These studies take strategy as given and treat MCS in a structural form, whereby the perspective is static and the focus is placed on such issues as the presence or absence of specific MCSs, their technical properties and their design (Chapman 1997, 1998; Dent 1987). Other studies, however, have considered the role of

MCS in the process of formulation of strategy. The focus is on such issues as the dialogue and

interaction surrounding the use of MCS (e.g., Bisbe and Otley 2004; Davila 2000; Kloot 1997).

The findings provided by the MCS-strategy stream of research remain ambiguous. These

ambiguous results, according to Henri (2006), can be attributed in part to (i) the absence of a

theoretical framework founded on the resource-based view, and (ii) the limited attention devoted

to the different ways management makes use of MCSs, including their uses of performance

measurement systems.

It is postulated in this study that the use of MCSs needs to be studied in terms of its effects

on the process of developing and leveraging the firm’s specific capabilities, a process that, under the RBV, is a fundamental factor in achieving strategic success. That is, the link between MCS and

strategy-management is likely to occur at the capabilities level rather than the strategic-choice

level. Capabilities are deemed to support strategic choices “by providing the competitive

advantage necessary to materialize these choices” (Henri 2006, p. 530). Hence, “the notion of

strategic choice itself may not be directly traceable to MCS. Instead, the relationship should be

examined between capabilities and MCS, rather than between strategic choice and MCS” (Henri

2006, p. 531).

How should the role of MCS be conceptualized in developing capabilities and enabling

capabilities to leverage the firm’s resources towards implementation of strategies that produce

superior organizational performance? Following the work of Simons (1990; 1991; 1994; 1995),

several studies have examined a more active role of MCS in the formulation of strategy and the

implementation of strategic change (e.g., Abernethy and Brownell 1999; Bisbe and Otley 2004;


Chenhall and Langfield-Smith 2003). Another line of research describes how organizations

balance the traditional and more active roles of MCS (e.g., Ahrens and Chapman 2004;

Chapman 1998; Dent 1987; Haas and Kleingeld 1999) or the tension resulting from the use of

MCS in various ways (e.g., Chenhall and Morris 1995; Henri 2006; Marginson 2002).

Building on the MCS-use framework of Simons, this study seeks to examine, from an RBV,

how the style of use of MCSs by heads of university strategic sub-units (i.e., academic faculties

and schools) can affect the strength of capabilities of their organizational sub-unit and, in turn,

the performance outcomes resulting from the successful implementation of strategies in a

competitive environment. Specifically, this study investigates the traditional feedback style of

use of MCS (Simons’ ‘diagnostic use’) and the more proactive discussant style of use of MCS

(Simons’ ‘interactive use’) in developing and shaping capabilities and consequently in affecting

performance. It also investigates the attention given by Heads/Deans to types of information

from the performance measurement system and how this relates to developing different

capabilities.

Hence, the primary research questions addressed in this study are: (1) to what extent do the diagnostic and interactive uses of either broader MCSs or narrower PMSs by management of a strategic sub-unit of an organization contribute to the strength of development of various capabilities of that organizational sub-unit? and (2) to what extent does the diagnostic or interactive style of use of the broader MCSs or narrower PMSs have a significant indirect effect on the performance of the organisational sub-unit through its impact on capabilities?

These research questions have not been addressed outside the work of Henri (2006), which was set in the context of private sector manufacturing companies. This study makes a contribution to the

advancement of literature on MCS use from a resource-based view by choosing the setting of

public universities. The nature of public universities, with their tension between the ethos of new

public management’s (NPM) managerialism and traditional collegiality amongst scholars,

coupled with tension within the management structure between professionals and

administrators, provides a complex, but relevant, setting for the application of these research questions.

2. The Setting for this Study

The empirical research for this study is set in the not-for-profit sector, specifically public

universities in Australia. The current study extends Henri’s (2006) private sector model and

findings by modifying the categories and scales used in the definition and measurement of MCS


use, PMS use, organisational capabilities, and organisational performance in order to take

account of the unique practices, policies and environments that apply in public universities.

The case for the application of the RBV theory to higher-education institutions in the UK has

been made by Lynch & Baines (2004) based on increasing competition and attention to

distinctive capabilities amongst universities. Strong competition for resources and the relevance

of capabilities for the strategic deployment of resources are also present amongst public universities in Australia due to “deregulation and globalization having resulted in greater

competition for domestic and international students (as well as) research talent and funding…

(and increased) pressure for commercialisation and innovation” (Chung, Harrison, and Reeve

2009, p. 56-57).

In Australia, universities have experienced on-going government-imposed reforms under the

banner of NPM since the initial Dawkins (1987, 1988) restructuring of the higher/further

education sector. Due to the application of NPM doctrines, such as private sector management

practices and organisational forms, value-for-money ethos, and efficiency targets, universities in

Australia have operated under several constraints and challenges. These relate to reduced

government funding, increased dependence on fee-paying students, intensified competition for

international students, pressure through government quality audits and funding incentives to

continuously improve the quality of teaching and research outputs, increased demands both on-

shore and off-shore for flexible and multiple delivery platforms, and heightened compliance and

legitimacy demands to provide public accountability for their plans, activities, achievements and

financial performance to government agencies, students, alumni, staff, industry alliances,

international educational agencies and partners, local communities and the media (Biggs, 2003;

Davies and Thomas 2002; Deem 1998, 2004; McLaughlin, Osborne, and Ferlie 2002).

There has been heated debate about whether corporate-style management is appropriate for

the management of public universities (Davies and Thomas 2002; Deem 1998, 2004; Lafferty and

Fleming 2000; Parker 2002; Pick 2006; Roberts 2004). Some argued that under new

managerialism, universities are forced to become what they are not (Jones 1986). Management

systems, processes and expectations that are claimed to now pervade universities due to NPM

include a shift from input-focus to output-focus, target-setting, benchmarking (Parker 2002);

devolved budgeting, performance budgeting, cost centres, responsibility accounting,

quantification (Lapsley 1999; Parker 2002); reduced public funding (Deem 2004; Parker 2002);

and competition, incentivization, managerial enterprise, economic rationality, marketisation,

modernization (Broadbent and Guthrie 2008). These features make universities a particularly

relevant organisational setting for the current study on MCS/PMS use and capabilities.


Another noteworthy feature of the setting for this study is the incongruence between

scholarly work with its inherent role of contributing to society’s intellectual, economic, cultural

and social developments (DEEWR 2007), on the one hand, and academic management with its

push into commoditization and commercialization of programs and research outputs on the

other hand. This incongruence is seen in the fact that academic managers (i.e., Dean of Faculty

and Head of School) will usually maintain their academic career in their discipline. Often they

will continue their research career, and possibly their teaching career, when holding a

managerial position. This can result in a Head of School and a professor in the school working

together on a research project. The research partner might be academically senior to the Head

of School. Their relationship cannot be described as a normal supervisor-subordinate

relationship. It can best be described as more of a collegial than a managerial relationship,

notwithstanding the managerialism demanded in the NPM era. Hence the style of use of MCSs

by Deans and Heads is likely to be different from other commercial organisational settings

(Pettersen and Solstad 2007).

3. Literature Review and Hypotheses Development

3.1. Resource-based View

In the RBV literature, various terms have been used for the notion of ‘resources’: firm

resources, capabilities, internal capabilities, dynamic capabilities, distinctive internal capabilities,

competencies, and distinctive competencies (e.g., Barney 1991, 2001; Clardy 2007; Conner

1991; Day 1994; Eisenhardt and Martin 2000; Ethiraj, Kale, Krishnan, and Singh 2005; Henri

2006; Simons 2000; Tripsas and Gavetti 2000; Wernerfelt 1984). Authors have not necessarily

used these terms interchangeably. For example, Day (1994) uses the term competencies to

refer to well-defined routines that are combined with firm-specific assets to enable distinctive

functions to be carried out whereas capabilities are the mechanisms and processes by which

new competencies are developed. On the other hand, Ethiraj et al. (2005) describe resources

as factor inputs which are available to be acquired by any organisation, unlike capabilities, which are developed within an organisation over a long period of time.

Following Daft (1983), Barney (1991, p. 101) uses the term ‘firm resources’ to include “…all

assets, capabilities, organisational processes, firm attributes, information, knowledge, etc. controlled by a firm that enable the firm to conceive of and implement strategies that improve its efficiency and effectiveness”. Barney further explains that in traditional strategic analysis,

resources will include internal strengths which firms develop to gain competitive advantage.


Similarly, Simons (2000, p. 23) uses ‘distinctive internal capabilities’ to refer to internal strengths

in SWOT analysis and defines it as “…the special resources and know-how possessed by a firm

that give it competitive advantage in the marketplace”. Simons provides examples of distinctive

capabilities as being the ability to perform world-class research, excellence in product design,

superior marketing skills, the ability to manage costs, proprietary information technology, and

proprietary manufacturing skills. He classifies these capabilities into three types – functional

skills, market skills, and embedded resources.

Prior to Lynch and Baines (2004), the application of RBV had not been explored in the

higher education sector. Lynch and Baines (2004) employed RBV to explore if British

universities possess capabilities that give them sustainable competitive advantages. By

applying RBV application-based concepts to evidence from Quality Assurance Agency’s (QAA)

reports (1996 – 2002) and the Research Assessment Exercise’s (RAE) (1996 and 2001)

reports, Lynch and Baines (2004) proposed five dimensions of competitive resources (or

capabilities) of universities. These are (1) reputation (i.e., capabilities that enable an

organisation to communicate favourable information about itself to its stakeholders), (2)

architecture (i.e., the network of relationships, contracts, and alliances), (3) innovativeness (i.e., the ability to undertake totally new initiatives that go beyond the current strategy), (4) core

competencies (i.e., the group of production skills and technologies that enable an organisation

to provide a particular benefit to customers), and (5) knowledge-based advantages (i.e., tacit

and explicit proprietary knowledge possessed by an organization).

This study adopts the dimensions proposed by Lynch and Baines (2004) in defining

capabilities. The higher education sectors in the UK and Australia are broadly similar in structure and operations, and Lynch and Baines’ (2004) study is so far the only empirical work

on the application of RBV in universities. However, they use secondary data to provide evidence

in support of their five generic capabilities appropriate to universities. This study will obtain

primary data from a survey of Deans and Heads to test whether these five dimensions are the ones that resonate with them or whether they hold an alternative perspective on

capabilities. Hence, the following hypothesis will be tested:

Hypothesis 1: Heads and Deans will perceive capabilities in the same dimensions as

identified from secondary data by Lynch and Baines (2004).

3.2. Management Control Systems


MCS have been investigated from several perspectives and under different contexts. For

example, there is a substantial body of literature on the relationships between MCSs and

strategy, which was outlined in the introduction. Moreover, there are alternative frameworks to

understanding MCSs. Collier (2005) distinguishes formal, systems-based approaches, including

those by Otley & Berry (1980), Daft and Macintosh (1984), Simons (1995), Merchant (1998),

Otley (1999), and Ferreira and Otley (2005), from informal, social or cultural forms of control

including those of Trist and Bamforth (1951), Ansari (1977), Ouchi (1979), Hofstede (1981),

Euske, Lebas, and McNair (1993), and Ditillo (2004).

Of interest to the current study is a broader-based, formal, systems-rooted framework. Such frameworks have been developed by Simons (1995) and by Ferreira and Otley (2005).

Simons (1995) defined MCSs as comprising four types of systems, which he called ‘levers of

control’: belief systems, boundary systems, diagnostic control systems and interactive control

systems. Ferreira and Otley (2005) developed a ‘performance management and control’

framework comprising twelve key planning, control and performance evaluation elements

which arose from integrating five issues proposed by Otley (1999) with the Simons (1995)

framework.

This study will partially adopt Simons’ (1995) framework. Simons (1995) has provided the

framework for many MCS empirical studies (e.g., Abernethy and Brownell 1997, 1999; Bisbe

and Otley 2004; Henri 2006; Kober, Ng, and Paul 2003, 2007; Widener 2007). He argues that

his ‘levers of control’ (LOC) provide a framework for controlling business strategy. The

framework’s key focus is strategy implementation rather than strategy formulation. Two LOCs,

diagnostic control systems and interactive control systems, are deemed to be especially

relevant to the context of this study. Prior studies have chosen to include the diagnostic and

interactive LOCs alone in their empirical model, and exclude the belief and boundary LOCs.

Abernethy & Brownell (1997) and Naranjo-Gil & Hartmann (2006) have both justified this choice

in the context of public hospitals; Kober et al. (2003; 2007) have justified it in the context of

research and development organisations.

Simons (1995) defines diagnostic control systems as the use of formal feedback systems by

management to monitor outcomes and correct deviations from preset standards of performance.

Interactive control systems are defined as the use of diagnostic control systems by management in a way that regularly and personally involves them in the decision activities of subordinates.

Simons (1995) explains that a diagnostic control system can be made interactive by continued

and frequent management attention and interest that facilitates organisational learning and

deals with strategic uncertainties.


In terms of their effect on the development of organizational capabilities, Henri (2006) has

found that diagnostic use of PMS has a negative effect whereas interactive use of PMS has a

positive effect. According to Simons (1995), diagnostic and interactive uses of MCS represent

countervailing forces used to balance the inherent organizational tension. Use of MCS in a

diagnostic manner tends to reflect single-loop negative feedback and focuses on intended

strategies. The use of MCS in an interactive way tends to reflect double-loop positive feedback

and focuses on emergent strategies (English 2001).

Diagnostic use of MCS, as described by Simons (1995), is a negative force that creates

constraints and ensures compliance with orders: “[Diagnostic systems] constrain innovation and

opportunity-seeking to ensure predictable goal achievement needed for intended strategies”

(Simons 1995, p. 91). Diagnostic use of MCS signals when productivity and efficiency have fallen and when innovation needs to be curbed (Miller and Friesen 1982). Henri (2006, p.

537) conjectures that PMS tends to be used diagnostically by management “to limit the

deployment of capabilities by providing boundaries and restricting risk-taking.” Alternatively,

interactive use of MCS supports the development of ideas and creativity. According to Simons

(1995, p. 93) “senior managers use interactive control systems to build internal pressure to

break out of narrow search routines, stimulate opportunity-seeking, and encourage the

emergence of new strategic initiatives.” By fostering organizational dialogue and debate, an

interactive use of MCS contributes to expanding the organization’s information processing

capacity and consequently the deployment of organizational capabilities (Haas and Kleingeld

1999).

These arguments lead to the following hypotheses:

Hypothesis 2a: Interactive use of MCSs positively influences the extent of the

development of capabilities in an academic unit.

Hypothesis 2b: Diagnostic use of MCSs negatively influences the extent of the

development of capabilities in an academic unit.

3.3. Performance Measurement Systems

PMSs have traditionally been used as a means of maintaining organizational control and

pursuing financial goals that condense hierarchically into measures such as EPS and ROI (Hussain 2005).

Subsequent studies have concluded that traditional accounting-based PMSs are inadequate in

a business environment of economic uncertainty and complex competition (Ballantine and Brignall 1995; Bromwich and Bhimani 1989; Govindarajan and Shank 1992). The need for a broader range of performance measures, including non-financial and financial, internal and


external, qualitative and quantitative, and comparative and absolute measures, has been demonstrated by researchers such as Johnson (1990) and Kaplan (1991).

Subsequent research on diversity in the design of PMSs has used a contingency approach.

This has involved analysis of the effects of various organizational and external factors, including

strategy, structure, size, technology, and environmental uncertainty, on specific mixes of

financial and non-financial measures (e.g. Baines and Langfield-Smith; Hall; Hoque 2004;

Hoque and James 2000; Maurice 2005; Said, Elnaby, and Wier 2003; Stede, Chow, and Lin

2006; Widener 2006).

Another line of study has examined the use of diversely designed PMSs (e.g. Bisbe and Otley

2004; Henri 2006; Simons 2000). According to Henri (2008, p. 252), the use of PMS refers to

“the nature and purpose of the use of performance indicators by managers.” The nature and

purpose of use of diverse PMSs in an organizational setting have been classified in several

ways. For example, Simon, Guetzkow, Kozmetsky, and Tyndall (1954) classify accounting

information use as scorecard, problem-solving, and attention-directing. Atkinson, Waterhouse,

and Wells (1997) present three uses of PMS, namely, monitoring, diagnostic, and coordination.

Vandenbosch (1999) groups PMS uses into four categories, namely, score-keeping, problem-solving, attention-focusing, and legitimizing. Ittner, Larcker, and Randall (2003) present four

uses of PMSs, namely, problem identification, capital investment, performance evaluation, and

external disclosure. Finally, Henri (2008) classifies PMSs into four main uses – monitoring,

strategic decision-making, attention-focusing, and legitimization.

In the current study, the classification of nature and purpose of use of PMS departs from the

above suggestions. The public-private sector nature of universities gives rise to unique PMSs.

On the one hand, universities in Australia have a suite of standardized key performance

measures imposed by the federal government department and its agencies, many of which

are tied to funding formulas. On the other hand, universities have considerable operating

autonomy in the way they manage the efficiency, quality and strategic differentiation of their

Faculties and Schools. At the level of the academic unit, this suggests the need to give attention

to both centrally-imposed key performance indicators (KPIs) and locally-determined

performance measures that relate to the academic unit’s flexibility at the margin for developing

its distinctive capabilities and achieving its competitive advantage. Therefore, in the specific

context of use of PMSs by Deans and Heads of academic units, this study seeks to empirically

validate a two-fold classification scheme. This two-fold classification distinguishes between the

use of PMS for the purpose of meeting centrally-set standardized measures of performance and

for the purpose of setting and meeting locally-determined discretionary measures of


performance. In brief, the distinction is between central standard performance and discretionary

performance.

Whether Deans and Heads make greater use of indicators of core (centrally-imposed) or discretionary performance, or a balance of both, is expected to have an influence on the way their academic unit’s capabilities are nurtured and enhanced.

First, consider a high focus on centrally-imposed performance indicators. This would imply

that the Dean or Head is strongly concerned about the phenomenon of relative performance

evaluation. As explained by Johansson and Siverbo (2009), relative performance evaluation

arose in local government organizations after the arrival of new public management (NPM).

With NPM-inspired accounting innovations and new performance measurement tools, the

observation by Johansson and Siverbo (2009) is that many performance evaluations in Swedish

municipalities involve comparison with other municipalities. For universities, if core performance

(in any one of the core fields of teaching, research or networks) is relatively superior to that of

competitor Faculties or Schools, then it would be expected that the competitive strength of

capabilities associated with that core field of performance would be superior.

Second, consider a high level of attention to locally-developed measures of discretionary

performance. This would imply that the Dean or Head is focused on aspects of efficiency or

capabilities development that are of specific importance to that academic unit. Such attention would be expected to have the effect of strengthening the Faculty’s or School’s capabilities.

Based on these arguments, the following hypothesis is put forward:

Hypothesis 3a: Attention to use of centrally-imposed measures in the PMS by Heads

and Deans positively influences the extent of development of capabilities in an

academic unit.

Hypothesis 3b: Attention to use of self-determined discretionary measures in the PMS

by Heads and Deans positively influences the extent of development of capabilities in

an academic unit.

3.4. Relationships between MCS/PMS use, Capabilities and Organizational Performance

The RBV of the firm suggests that distinctive capabilities lead to a sustained competitive

advantage, which in turn contributes to organizational performance. The generic capabilities of

universities which can give competitive advantage are identified by Lynch and Baines (2004) as

reputation, architecture (external relationships), innovativeness, core competencies (in

production technology and skills) and knowledge-based advantages (expertise). They are considered to be

vital for leveraging resources in a way that deploys them for new value-creating strategies.


Empirically, previous studies on the private sector show that different sets of capabilities

contribute positively to performance (e.g. Henri 2006; Hult and Ketchen 2001; Lee, Lee, and Pennings

2001; Spanos and Lioukas 2001).

In section 3.2, diagnostic use of MCS and interactive use of MCS are linked to capabilities, leading to H2. In section 3.3, it is argued that the attention given to both centrally-imposed core metrics and locally-developed discretionary indicators in the PMS is linked to capabilities, leading to H3. Since these capabilities are expected to lead to organizational performance, the

use of MCS and PMS, respectively, can be expected to have an indirect effect on performance

through their influence on capabilities and resources deployment.

Therefore, the following hypotheses are proposed:

Hypothesis 4a: The interactive use of MCSs by Heads and Deans has a positive and

significant indirect effect on performance of an academic unit through its impact on

capabilities.

Hypothesis 4b: The diagnostic use of MCSs by Heads and Deans has a negative and

significant indirect effect on performance of an academic unit through its impact on

capabilities.

Hypothesis 4c: The use of key performance indicators by Heads and Deans has a

positive and significant indirect effect on performance of an academic unit through its

impact on capabilities.

Hypothesis 4d: The use of non-key performance indicators by Heads and Deans has a

negative and significant indirect effect on performance of an academic unit through its

impact on capabilities.

4. Method

4.1. Sample selection and instrument preparation

This study collects primary data through a questionnaire survey of Deans of Faculties and

Heads of Schools in Australian universities. The unit of analysis is an academic unit (Faculty

and School). These academic units are deemed equivalent to strategic business units since

they have sufficient responsibility in generating and using operating budgets, recruiting staff and

initiating their teaching and learning programs and research activities in their fields of scholarly

discipline.

The survey instrument was developed in three steps, before being finally administered. The

first step involved the preparation of a draft questionnaire in reference to previously validated


scales in the literature and website content of five universities for contextualization purposes.

The second step involved pre-testing in which the questionnaire was reviewed by three

accounting professors, one senior lecturer of accounting, one associate professor of information

systems, and three accounting/finance Heads of Schools for wording, clarity, ambiguity, and

relevancy of the individual items. Several non-fundamental revisions to the instrument resulted

from these experts’ comments. The third step involved piloting the instrument with three deans

of science and business faculties in two Australian universities. These deans completed the

survey in their own time followed by semi-structured interviews. The interviews aimed at

ensuring that definitions of MCS, capabilities and organisational performance would resonate

with respondents. The final questionnaire consisted of questions on the demographics of

respondents and their academic unit, together with multi-items measures of dimensions of use

of MCS, attention given to types of performance measures, extent of development of different

capabilities, and self-rating of the unit’s performance. Details from the questionnaire are

provided in Appendix B.

After ethics approval, the final questionnaire was mailed to 200 Deans and 679 Heads. This

sample was selected as a census of all Faculties and Schools in all 39 Australian universities.

The mailing lists of 200 Deans and 679 Heads were prepared from the websites of each

university’s Faculties and Schools. The survey package consisted of a covering letter addressed

to each Dean/Head by name and title, the questionnaire (four pages), and a reply paid

envelope. Two weeks after mailing the questionnaire, reminder phone calls were made, through their personal/executive assistants, to the Deans who had not responded or who had responded without identifying themselves.3 The Heads were contacted via direct email, encouraging them to complete the questionnaire and thanking those who had already responded. After a further two weeks, the Heads were sent a second reminder email.

After these follow-up phone calls and emails, 58 questionnaires (29%) were received from Deans and 168 (24.74%) from Heads. Two questionnaires from Deans and three from Heads were rejected due to missing data. Hence, the final usable responses were 56 from Deans, a response rate of 28%, and 165 from Heads, a response rate of 24.3%, giving a combined response rate of 25.14%. These response rates are at the higher end of the 15-25% range reported in similar studies (Henri 2006; Mahama 2006). To test for non-response bias, the means of the MCS use and performance variables were compared between the first 40 responses and the last 40 responses received (20 from Deans and 20 from Heads in each group). No significant

3 At the bottom of the first page of the questionnaire, respondents were given options of providing their

names and email addresses if they wished to receive summary of the findings of the study.


differences were found. A comparison of means of responses between the Deans and the

Heads was also undertaken, and again no significant differences were found.
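The early-versus-late responder comparison described above is, in essence, an independent-samples t-test on each variable's means. A minimal sketch of such a check is shown below; the data are simulated for illustration and are not the study's actual responses.

```python
# Sketch of a non-response bias check: compare the first 40 and last 40
# responses on one survey variable with an independent-samples t-test.
# The response data here are simulated, not from the study.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
early = rng.normal(loc=4.5, scale=1.2, size=40)  # first 40 responses received
late = rng.normal(loc=4.5, scale=1.2, size=40)   # last 40 responses received

t_stat, p_value = stats.ttest_ind(early, late)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A p-value above the chosen significance level indicates no significant
# difference between early and late responders on this variable.
```

In practice this test would be repeated for each MCS use and performance variable, as the authors describe.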

4.2. Variable measurement

Eleven constructs (latent variables) are used in the empirical model. Discussion of the

measures of each construct is presented below.

Management control systems use: The scales for interactive/diagnostic uses of MCSs

(MCSINTER and MCSDIAG) were generated in reference to instruments developed by

Abernethy and Brownell (1999); Henri (2006); Kober et al. (2003; 2007); Naranjo-Gil and

Hartmann (2006) and Simons (1987; 1990). Simons (1995) was used as the main reference.

They were measured using seven-point Likert scales anchored from 1 (strongly disagree) to 7 (strongly agree). The preamble to this part included a definition of MCS as covering

planning/budgeting, monitoring, and performance reporting and review systems. Both the

diagnostic use and interactive use constructs had six-item scales. The MCS diagnostic use

scales were reverse coded.

Performance indicators: The scales for performance indicators were developed in reference

to Naranjo-Gil and Hartmann’s (2006) distinction between financial and non-financial indicators.

The instrument contained six financial and six non-financial performance items. Respondents were asked to indicate the extent to which they keep a watch on, discuss and, if necessary, take action on the progressive reporting of each of these 12 performance measures. Items were measured on seven-point scales anchored from 1 (rarely or never) to 7 (very often).

Organisational (academic unit) capabilities: While Lynch and Baines (2004) sought to apply

the RBV to the higher education sector, their proposed set of capabilities was not validated but

based loosely on university data collected by government. The scales for organisational

capabilities for the current study were devised to capture the five 'competitive resources of a university' identified by Lynch and Baines (2004). Twenty-three items grouped into these five

dimensions were drafted and re-drafted following the pre-test and pilot test stages explained in

section 4.1. The final instrument provided a definition of academic unit capabilities as those

distinctive resources, expertise, networks or reputation that the unit had developed to make it

more competitive. It then presented Lynch and Baines’ (2004) five dimensions of relationship

with outside institutions4 (4 items), innovative capabilities (4 items), expertise5 (4 items),

4 Lynch and Baines (2004) use the term ‘architecture’.

5 Lynch and Baines (2004) use the term ‘knowledge-based advantages’.


reputation (7 items), and core competencies (4 items). Seven-point scales were anchored from 1 (not at all developed) to 7 (fully developed).

Organisational (academic unit) performance: Respondents were asked to give a subjective self-

rating of the overall performance of their Faculty or School out of 10. To guide respondents in

deciding on this score, they were asked to weigh up the following aspects of their Faculty’s or

School’s recent performance: teaching and learning output-based successes, research output-

based successes and reputation for attracting high quality students and staff.

5. Results

5.1 Descriptive Statistics

The profile of the respondents and their academic units is characterised as follows:

- The majority of Deans (57%) and Heads (64%) have no academic qualification in administration or management.
- Approximately half of the Deans and half of the Heads have held their current position for less than 3 years.
- Prior senior academic management positions of 4 years or more have been held by 70% of Deans and 49% of Heads.
- The majority of Deans (59%) and a substantial proportion of Heads (45%) have been in the higher education sector as academics and/or academic managers for more than 20 years.
- 48% of Deans and 50% of Heads had never held a management/leadership position (even short-term) in an organisation other than a university.
- No Dean or Head is below the age of 35, and about one third are female.
- Schools ranged in diversity from one discipline to nine disciplines, and in size from less than 1,000 to greater than 5,000 equivalent full-time undergraduate student load.

5.2 Confirmatory Factor Analysis to Test H1 on Capabilities

As mentioned, the instrument contained 23 scales for capabilities of the academic unit

devised to capture Lynch and Baines’ (2004) areas of capabilities in developing and nurturing

external relationships, innovation, expertise, reputation and core skills/technologies. These 23

items for capabilities were subjected to principal components factor analysis using SPSS.

Twenty of the 23 items loaded into three factors. The remaining three items had very small

factor loadings and were removed from further analysis. From a perusal of the nature of the

items in the three factors, three functional capabilities were labelled as follows: networking


capabilities (6 items); teaching capabilities (6 items); and research capabilities (8 items). Table 1

presents the results of this factor analysis.
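The extraction procedure underlying Table 1 (principal components followed by varimax rotation, as in SPSS) can be sketched as follows. The data here are simulated stand-ins for the survey responses, and the varimax routine is a standard implementation, not the authors' actual SPSS run.

```python
# Sketch of principal components extraction with varimax rotation,
# applied to simulated item responses rather than the study's 23 items.
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Rotate a loading matrix toward varimax simple structure (Kaiser)."""
    p, k = loadings.shape
    rotation = np.eye(k)
    prev = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        # SVD of the gradient of the varimax criterion.
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated ** 3 - (gamma / p) * rotated
                          @ np.diag((rotated ** 2).sum(axis=0))))
        rotation = u @ vt
        if s.sum() < prev * (1 + tol):
            break
        prev = s.sum()
    return loadings @ rotation

rng = np.random.default_rng(1)
X = rng.normal(size=(220, 8))            # 220 respondents, 8 survey items
X -= X.mean(axis=0)                      # centre before extracting components
# Principal components via SVD of the centred data matrix.
_, svals, vt = np.linalg.svd(X, full_matrices=False)
n_factors = 3
# Component loadings = eigenvectors scaled by sqrt of eigenvalues.
loadings = vt[:n_factors].T * (svals[:n_factors] / np.sqrt(len(X) - 1))
rotated = varimax(loadings)
print(rotated.shape)                     # one row of loadings per item
```

Because varimax is an orthogonal rotation, each item's communality (the row sum of squared loadings) is unchanged; only the distribution of loadings across factors is simplified, which is what makes the grouping of items in Table 1 interpretable.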

It is concluded from the results in Table 1 that Hypothesis 1 (i.e., that Deans/Heads perceive

capabilities in the same dimensions as identified from secondary data by Lynch and Baines,

2004) is not supported. This study finds that capabilities of academic units are perceived by

Deans/Heads in their functional dimensions. That is, capabilities are perceived as bundles of

relationships, innovations, skills, technologies and reputations that are differentially developed

into specific sets of teaching capabilities, research capabilities and network capabilities.

5.3 Partial least-squares analysis

The analysis proceeds by employing partial least squares (PLS) path modelling, using SmartPLS 2.0 (Ringle, Wende, and Will 2005). PLS provides two levels of analysis: it is used first as a 'measurement model' that relates observed variables (measures) to their respective unobserved variables (constructs or latent variables), and second as a 'structural model' that estimates relationships between constructs (latent variables) as hypothesised in the researcher's conceptual model (shown in Figure 1).

PLS is said to be a useful technique for causal-predictive analysis in situations of high

complexity but low theoretical support (Joreskog and Wold 1982). The advantages of PLS have

been reported in recent management accounting research (e.g. Abernethy et al. 2010; Mahama

2006). The particular features that have led to the choice of this methodology for the current

study are its handling of complex structural models, its acceptance of data that does not meet

parametric data distributional assumptions, its allowance for multicollinearity among endogenous

variables, and its creation of latent variable scores directly using cross products involving multi-

item measures (Chin and Newsted 1999).

5.3.1 The PLS measurement model for reliability and validity testing

The measurement model relates the measures (indicators) to their respective constructs.

As such, it evaluates the reliability and validity of the scale measures. Table 2 presents the

results for each of the multi-scale variables (constructs) in this study. The reliability of individual

items is assessed from the loading to their respective construct. The loading scores are the

correlations between the observable indicators and their associated constructs. An estimate of 0.70 or more is desirable (Nunnally 1978), although lower values are acceptable in causal modelling (Barclay, Higgins, and Thompson 1995).


Table 1: Factor analysis of capabilities for regrouping of items from five dimensions suggested by Lynch and Baines (2004) to three functional capabilities.

Rotated Component Matrix (a)

Component

Items in the survey questionnaire. 1 2 3 4 5

Description of items in the questionnaire Coded for PLS

Reputation 3 .900 -.040 -.081 -.003 .144 Reputation in research. CAPRES1

Expertise 2 .894 .013 -.069 .033 .102 A critical mass of internationally renowned researchers in focused research areas in the Faculty/School.

CAPRES2

Reputation 5 .861 .042 -.081 -.033 .123 Reputation for eminent professors. CAPRES3

Innovative capabilities 2

.849 .046 .003 .010 .026 Ability and experience in pursuing original research projects and generating publications.

CAPRES4

Reputation 6 .794 .080 -.016 -.058 .185 Reputation for renowned authors. CAPRES5

Expertise 3 .773 .120 .062 .145 .024 Expertise and support structures for the Faculty/School to seek linkage research grants.

CAPRES6

Innovative capabilities 3

.635 .171 .254 .188 -.198 Ability and experience in commercializing research through patents or consulting/training services.

CAPRES8

Reputation 1 .622 .024 .196 .238 .475 Recognised brand name in the higher education sector.

CAPRES7

Relationships with outside institutions 3

.586 .164 .340 .178 -.383 Professional connections/involvement of Faculty/School members with private and public sector organisations and government funding councils for research funding.

CAPNW5

Expertise 4 .554 .068 .227 .074 .070 Direct access to, and experience with, high quality database to use in empirical research.

Note 1.

Innovative capabilities 4

.481 .399 .191 .241 -.030 Overall talent and flexibility to pursue new initiatives in the Faculty/School that goes beyond the current strategies.

Note 2.

Innovative capabilities 1

.043 .845 .095 .038 -.084 Ability and experience in pursuing teaching and learning innovations (e.g., experiential learning, e-learning).

CAPTEACH3

Core competency 1 .060 .773 .192 .041 .181 Core competencies in the processes underpinning teaching, learning and assessment strategies (e.g. technology based teaching delivery; flexible delivery modes; diverse assessment structures).

CAPTEACH1

Expertise 1 -.052 .760 .041 .274 .172 Technology, processes, copyrighted materials and expertise to strongly underpin flexible teaching delivery, multimedia learning modes and diverse assessment structures.

CAPTEACH2

Core competency 2 .240 .514 .451 .009 .012 Core competencies in the application of theory to practical problems (vocation) for the development of teaching or consultancy products or research.

CAPTEACH6

Core competency 4 -.017 .124 .852 .135 .104 Expertise and necessary resources to enable students find jobs after graduation.

Note 3.

Core competency 3 -.099 .162 .835 .111 .043 Expertise and resources to place students for work experience while studying.

CAPNW4

Reputation 4 .390 .282 .434 .262 .247 Reputation with external communities/stakeholders/industry engagement.

CAPNW6

Relationships with outside institutions 4

.381 .232 .417 .276 -.386 Professional connections/involvement of Faculty/School members with government regulators, professional bodies, and media in general.

CAPNW2

Relationships with outside institutions 1

.072 .113 .043 .888 -.004 Collaboration with other local and international educational institutions and partners in delivery and articulation of academic programs.

CAPNW3

Relationships with outside institutions 2

.082 .129 .292 .806 .067 Relationships with local and international agents, partners, secondary schools, TAFE colleges, and alumni for attracting and recruiting students.

CAPNW1

Reputation 2 .293 .281 .149 .156 .702 Reputation in teaching and learning. CAPTEACH5

Reputation 7 .371 .427 .184 -.153 .492 Reputation for distinguished teachers. CAPTEACH4

Extraction Method: Principal Component Analysis. Rotation Method: Varimax with Kaiser Normalization.

a. Rotation converged in 10 iterations.

Note 1: Removed due to low factor loading and lack of face validity with the other research capability items.

Note 2: Removed due to low factor loading and lack of face validity with the other research capability items.

Note 3: Removed due to lack of face validity with the other network capability items.


Table 2 first shows that, of the 20 scales validated in Table 1 as the items measuring capabilities, three items from network capability and one item from teaching capability have low loadings.6 Consequently, these four items are removed from further PLS analysis, leaving the final measurement model with three scales for network capability, five for teaching capability, and eight for research capability.

Second, Table 2 shows an unanticipated result for the use of PMS information. As

previously mentioned, the survey instrument contained scales for performance indicators based

on Naranjo-Gil and Hartmann’s (2006) distinction between financial and non-financial indicators.

However, in the two ‘performance indicators’ factors in Table 1, the items consist of a mixture of

financial and non-financial loadings. (See Appendix B for a description of the items). Three

items were dropped due to low loading values and the final measurement model consisted of

four items in factor 1 and five items in factor 2. These sets of items were reviewed to establish if

they could be interpreted in any conceptually meaningful way. The answer appeared to lie in the source of determination of these performance measures. All four performance indicators in

factor 1 were found to be of a type that is widely imposed on Faculties/Schools by their

university chancellory because they are sought by federal government departments or agencies

and are tied to either funding or benchmarking of universities by government. In contrast, all five

items in factor 2 are performance measures that are typically associated with discretionary

spending or fund raising activities at the Faculty/School level. Hence, the concept of PMS

information use is found to fall into two dimensions of attention given to centrally-imposed

performance indicators (PIIMPOSED) and attention given to locally-activated discretionary

performance indicators (PIDISCRETION).

Third, the measures for academic unit performance had two scales dropped because of low

loadings – one scale from teaching performance and one from reputation performance. The

inclusion of the overall performance measure, where respondents are asked to rate the overall

performance of their Faculty or School, provides another way of validating the other

performance indicators (e.g. Abernethy and Brownell 1997).

Turning to the overall results in Table 2, it is found that most measures (39 out of 43) are

above or close to 0.70. All loadings are statistically significant at the 0.001 level (one tailed).

This suggests high reliability across the measures. Further, Table 2 gives the composite

reliability and Cronbach’s alpha as measures of the internal consistency (reliability) of the set of

6 Loading value of 0.70 and greater are desirable although lower values are acceptable in early stages of

research.


scales associated with each construct. The composite reliability scores are all greater than the

recommended 0.70 (Fornell and Larcker 1981), with all constructs ranging from 0.82 to 0.95.

Similarly, the Cronbach’s alpha scores are all greater than the recommended 0.70 except for

the teaching performance construct. Given that this construct has a 0.84 for composite

Table 2: Reliability, convergent validity, and descriptive statistics.

Variables  Composite reliability  AVE  Cronbach alpha  R2  Min  Max  Mean  S.D.  Loading*

Network capability 0.85 0.65 0.74 0.12

CAPNW2Q4 2 7 4.88 1.28 0.82

CAPNW5Q3 1 7 4.83 1.42 0.83

CAPNW6Q16 2 7 5.15 1.18 0.77

Research capability 0.94 0.67 0.93 0.15

CAPRES1Q15 1 7 4.89 1.61 0.91

CAPRES2Q10 1 7 4.45 1.74 0.90

CAPRES3Q17 1 7 4.62 1.71 0.88

CAPRES4Q6 1 7 5.43 1.42 0.85

CAPRES5Q18 1 7 4.40 1.61 0.82

CAPRES6Q11 1 7 4.37 1.60 0.79

CAPRES7Q13 2 7 5.44 1.29 0.71

CAPRES8Q7 1 7 3.22 1.62 0.64

Teaching capability 0.83 0.50 0.76 0.14

CAPTEACH1Q20 1 7 5.20 1.08 0.72

CAPTEACH3Q5 1 7 5.35 1.01 0.60

CAPTEACH4Q19 2 7 5.04 1.27 0.77

CAPTEACH5Q14 2 7 5.41 1.19 0.72

CAPTEACH6Q21 1 7 4.79 1.27 0.71

Centrally-imposed

performance indicators

0.82 0.54 0.72 -

PIIMPOSED4 2 7 5.87 1.02 0.78

PIIMPOSED5 1 7 5.14 1.39 0.67

PIIMPOSED6 2 7 5.78 1.03 0.69

PIIMPOSED7 1 7 5.54 1.33 0.78

Discretionary

performance indicators

0.84 0.52 0.77 -

PIDISCRETION1 1 7 4.35 1.62 0.62

PIDISCRETION2 1 7 5.25 1.51 0.70

PIDISCRETION3 1 7 5.25 1.45 0.85

PIDISCRETION4 1 7 4.61 1.43 0.63

PIDISCRETION5 1 7 5.01 1.44 0.76

MCS diagnostic use 0.85 0.54 0.79 -

MCSDIAG1 1 7 3.65 1.65 0.69

MCSDIAG2 1 7 3.17 1.53 0.72

MCSDIAG3 1 7 3.40 1.54 0.74

MCSDIAG5 1 7 3.63 1.65 0.84

MCSDIAG6 1 7 3.69 1.58 0.67

MCS interactive use 0.95 0.76 0.94 -

MCSINTER1 1 7 4.25 1.74 0.86

MCSINTER2 1 7 4.62 1.60 0.91

MCSINTER3 1 7 4.60 1.70 0.91

MCSINTER4 1 7 4.32 1.65 0.88

MCSINTER5 1 7 3.62 1.60 0.80

MCSINTER6 1 7 4.48 1.62 0.88

PERFOVERALL - - - 0.50 2 10 7.20 1.12 -

*All item loadings are statistically significant (p < 0.001, one-tailed)


reliability, it is concluded that its scales have sufficient internal consistency.

presents the average variance extracted (AVE) for each construct. AVE is used to measure the

convergent validity of the scales to their associated constructs. It measures the amount of

variance extracted by an underlying factor in relation to the amount of variance due to

measurement error. The AVE for all constructs is greater than 0.50 (except for the teaching capability construct, which scored exactly 0.50) and satisfies the suggested criterion in the literature (Chin 1998).

Lower values are acceptable in early stages of studies that are not yet fully established

(Abernethy et al. 2010).
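The composite reliability and AVE figures reported in Table 2 can be reproduced from the standardised item loadings using the standard formulas (assuming those are the formulas applied). For example, the network capability construct's three loadings in Table 2 yield its reported values:

```python
# Composite reliability and AVE computed from standardised loadings,
# using the network capability loadings reported in Table 2.
loadings = [0.82, 0.83, 0.77]               # CAPNW2, CAPNW5, CAPNW6

sum_l = sum(loadings)
sum_l2 = sum(l ** 2 for l in loadings)
error = sum(1 - l ** 2 for l in loadings)   # indicator error variances

# CR = (sum of loadings)^2 / [(sum of loadings)^2 + sum of error variances]
composite_reliability = sum_l ** 2 / (sum_l ** 2 + error)
# AVE = mean of squared standardised loadings
ave = sum_l2 / len(loadings)

print(round(composite_reliability, 2))      # 0.85, matching Table 2
print(round(ave, 2))                        # 0.65, matching Table 2
```

That both figures match the table supports reading the loadings, CR, and AVE columns of Table 2 as internally consistent.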

Table 3 provides an assessment of the discriminant validity of the scales by comparing the square root of each construct's AVE to the correlations among the constructs. Table 3 shows that all of the square roots of the AVEs are greater than their corresponding correlation values, confirming that adequate discriminant validity exists for all latent variables (Fornell and Larcker 1981). An alternative measure of discriminant validity

based on the matrix of the cross-loadings of the observable indicators to their latent variables

was run. Results are not presented in this paper, but they also confirm that there is adequate

discriminant validity in the scales.

Table 3: Discriminant Validity Test

1 2 3 4 5 6 7 8

1. CAPNW 0.81

2. CAPRES 0.53 0.82

3. CAPTEACH 0.47 0.42 0.71

4. PIIMPOSED 0.32 0.36 0.30 0.73

5. PIDISCRETION 0.21 0.07 0.19 0.45 0.72

6. MCSDIAG 0.19 -0.17 -0.27 -0.28 -0.23 0.73

7. MCSINTER 0.21 0.13 0.27 0.25 0.35 -0.66 0.87

8. PERFOVERALL 0.42 0.51 0.44 0.30 0.16 -0.13 0.15 1.00
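The Fornell-Larcker criterion applied in Table 3 can be checked mechanically: the square root of each construct's AVE (the table's diagonal) must exceed that construct's correlations with every other construct. The sketch below applies the check to the first three constructs, using the AVE values from Table 2 and the correlations from Table 3.

```python
# Fornell-Larcker discriminant validity check for three constructs,
# using AVE values from Table 2 and correlations from Table 3.
import math

ave = {"CAPNW": 0.65, "CAPRES": 0.67, "CAPTEACH": 0.50}
corr = {("CAPNW", "CAPRES"): 0.53,
        ("CAPNW", "CAPTEACH"): 0.47,
        ("CAPRES", "CAPTEACH"): 0.42}

for (a, b), r in corr.items():
    # The smaller of the two sqrt(AVE) values must still exceed |r|.
    sqrt_ave = min(math.sqrt(ave[a]), math.sqrt(ave[b]))
    print(f"{a} vs {b}: sqrt(AVE)={sqrt_ave:.2f} > |r|={abs(r):.2f}? "
          f"{sqrt_ave > abs(r)}")
```

All three comparisons pass, matching the paper's conclusion; note that sqrt(0.65), sqrt(0.67), and sqrt(0.50) round to the 0.81, 0.82, and 0.71 shown on the diagonal of Table 3.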

5.3.2 The PLS structural model for hypotheses testing

The PLS structural model provides the analysis for the hypothesized relationships

between the latent variables as presented in Table 4. The results for direct effects are depicted

in Figure 1. Table 4 has two panels. Panel A presents the results of the direct PLS structural

model relationships between the latent variables while Panel B provides the total effects (direct

and indirect) of the structural relationships between the latent variables. In all cases, the

statistical significances of the path coefficients have been tested using a bootstrapping

approach with 1,000 re-samplings.
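The bootstrap significance test works by re-estimating each coefficient on many resamples (with replacement) of the cases and dividing the original estimate by the bootstrap standard error. The sketch below illustrates the idea with simulated data and a simple one-predictor slope standing in for a PLS path coefficient; it is not the SmartPLS procedure itself.

```python
# Illustrative bootstrap test of a path coefficient's significance:
# 1,000 resamples, t = estimate / bootstrap standard error.
# Simulated data; OLS slope stands in for a full PLS path estimate.
import numpy as np

rng = np.random.default_rng(42)
n = 221                                   # matching the usable sample size
x = rng.normal(size=n)
y = 0.3 * x + rng.normal(size=n)          # true path coefficient of 0.3

def path_coef(x, y):
    # Slope of y on x: a one-predictor stand-in for a PLS path.
    return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

estimate = path_coef(x, y)
boot = np.empty(1000)
for i in range(1000):
    idx = rng.integers(0, n, size=n)      # resample cases with replacement
    boot[i] = path_coef(x[idx], y[idx])

t_value = estimate / boot.std(ddof=1)
print(f"coefficient = {estimate:.2f}, bootstrap t = {t_value:.2f}")
```

A |t| above the usual critical values then marks the path as significant, which is how the starred coefficients in Table 4 should be read.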


The results given in Panel A for direct relationships reveal some clear patterns about the effects

of MCS and PMS use on capabilities. First, in respect of H2, interactive MCS use is expected to

positively relate to the strength of capabilities (H2a), whereas diagnostic MCS use is expected

to negatively relate to capabilities in the academic unit (H2b). The results in panel A show only

one relationship between MCS use and capabilities is significant: from interactive MCS use to teaching capability. Therefore, there is partial support for Hypothesis 2a. Interestingly, diagnostic MCS use is not significantly related to any of the capabilities, so H2b is rejected. These results are not consistent with Henri (2006), who found significant positive effects of interactive MCS use and significant negative effects of diagnostic MCS use on capabilities in manufacturing companies. However, Henri's (2006) study may not be directly comparable because capabilities are conceived in different dimensions.

Table 4: PLS Structural Model Results: Regression Coefficients (and t-values) for relationships between MCS use, Capabilities and Performance

Panel A: Direct Effects

Path from: CAP NW

CAP RES

CAP TEACH

PERF OVERALL

R2

MCSINTER 0.10 (1.19) 0.02 (0.22) 0.13 (1.45)*

MCSDIAG -0.04 (0.49) -0.08 (0.96) -0.12 (1.24)

PIIMPOSED 0.26 (3.68)*** 0.39 (5.47)*** 0.22 (2.77)***

PIDISCRETION 0.05 (0.61) -0.13 (1.44)* 0.01 (0.18)

CAPNW 0.35 0.12

CAPRES 0.24 0.15

CAPTEACH 0.07 0.14

PERFOVERALL 0.50

Panel B: Total Effects

Path from: Path through:

CAPNW, CAPRES & CAPTEACH PERF

OVERALL

MCSINTER 0.03 (0.59) H4a

MCSDIAG -0.05 (1.17) H4b

PIIMPOSED 0.21 (5.04)*** H4c

PIDISCRETION -0.05 (1.25) H4d

Second, in respect of H3, attention to the use of centrally-imposed measures in the PMS by Deans/Heads is expected to relate positively to capabilities in an academic unit (H3a), as is attention to discretionary performance measures (H3b). The results in Panel A give full support for H3a and no support for H3b.

The main findings in Panel A of Table 4 need to be interpreted in the specific context of

universities. First, the result that interactive MCS use significantly positively affects teaching

capabilities could be understood by the fact that teaching capabilities are developed and


nurtured through collegial needs. Hence, capabilities in teaching and learning innovations,

flexible delivery technologies and reputations of teaching teams would be more strongly

supported when a Dean/Head uses MCSs interactively by stimulating dialogue about teaching

and learning quality and promoting adoption of exemplary teaching practices.

Figure 1: Direct Effects of MCS Use and PMS Use on Capabilities, Direct Effects of Capabilities on Performance, and Indirect Effects of MCS Use and PMS Use on Performance

[Path diagram: Interactive MCS use, Diagnostic MCS use, Imposed performance measures use, and Discretionary performance measures use link to Networking Capability (R2=0.12), Research Capability (R2=0.15), and Teaching Capability (R2=0.14), which in turn link to Overall Performance (R2=0.25).]

Total effects (= direct + indirect effects):
Interactive MCS use through Capabilities on Performance = 0.03 n.s.
Diagnostic MCS use through Capabilities on Performance = -0.05 n.s.
Imposed PMS use through Capabilities on Performance = 0.21***
Discretionary PMS use through Capabilities on Performance = -0.05 n.s.

* = significant at .05, ** = significant at .01


Second, the result that use of centrally-imposed performance indicators is significantly

positively related to all capabilities (teaching, research and networking capabilities) could be

evidence that these imposed performance indicators are treated as imperatives by those

Deans/Heads who use them the most, so these indicators will strongly guide their decisions

relating to capabilities of the academic unit. The centrally-imposed performance indicators

contain well-known metrics on research publications, external research income, course/program

surveys by students and external bodies. Hence, the strength of teaching capabilities, research

capabilities and networking capabilities will all be more fully pushed by a Dean/Head who gives

greater attention to these centrally-imposed performance metrics.

Third, Panel A shows that greater attention to discretionary performance indicators has a

negative relationship to the extent of development of research capabilities. This result could be

understood by the sensitivity of decisions made about allocations of discretionary resources like

travel funds and support staff by those Deans/Heads who give greater attention to discretionary

performance indicators. Such resource allocation decisions concerning the use of internal

discretionary funds of the academic unit may be perceived as unsupportive of the development

of research capabilities amongst research-active staff because the metrics in the centrally-

imposed performance indicators expect researchers to generate research funds and seek talent

external to the academic unit.

Finally, Panel A reveals significant positive relationships between the different

dimensions of capabilities (teaching, research and networking) and the different dimensions of

performance (teaching, research and reputation) of academic units. This finding supports the

RBV which conceptualises organisations as having distinctive capabilities that can enable

resources to be used in implementing value-creating strategies (Barney 1991; Day, 1994;

Eisenhardt and Martin 2000), resulting in sustained performance.

Panel B of Table 4 provides results for the tests of H4 concerning the indirect effects of

the uses of MCS and PMS on organisational performance through the intervening effects of

organisational capabilities. The PLS structural path model shows the result that the indirect

effect of interactive MCS use on performance (through organisational capabilities) is only

significant for the path to teaching performance. H4a is partially supported. For diagnostic MCS

use, Panel B shows that none of the path coefficients to organizational performance are

significant. Hence, H4b is rejected. These indirect effects of MCS use on performance are

entirely consistent with the direct effects of MCS use on capabilities. Turning to the PMS results,

all the indirect path coefficients from use of centrally-imposed PMS measures to organisational

performance are significantly positive, giving full support to H4c. Further, the path results for


discretionary PMS measures reveal a significant negative relationship to both research

performance and reputational performance. This result is opposite to the direction hypothesized.

Hence H4d is rejected. In summary, the path results in Panel B reinforce the relevance of RBV

as a theoretical perspective, and the inclusion of organizational capabilities as a factor, in

contingency-based functionalist research studies on MCS/PMS use.
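The indirect effects tested in Panel B are, in PLS path modelling, products of the corresponding direct path coefficients (e.g., MCS use → capability, then capability → performance), with significance commonly assessed by bootstrapping. A minimal sketch of this logic on synthetic data (the variable names and effect sizes are illustrative assumptions, not the study's survey data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative synthetic data (NOT the study's data):
# MCS use -> capability (path a) -> performance (path b), standardised variables.
n = 200
mcs_use = rng.normal(size=n)
capability = 0.4 * mcs_use + rng.normal(size=n)
performance = 0.5 * capability + rng.normal(size=n)

def paths(x, m, y):
    """OLS slopes for x -> m (a) and m -> y controlling for x (b)."""
    a = np.polyfit(x, m, 1)[0]
    X = np.column_stack([m, x, np.ones_like(x)])
    b = np.linalg.lstsq(X, y, rcond=None)[0][0]
    return a, b

a, b = paths(mcs_use, capability, performance)
indirect = a * b  # indirect effect = product of the two path coefficients

# Bootstrap a 95% confidence interval for the indirect effect.
boots = []
for _ in range(1000):
    idx = rng.integers(0, n, n)
    ab = paths(mcs_use[idx], capability[idx], performance[idx])
    boots.append(ab[0] * ab[1])
lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"indirect = {indirect:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

An indirect path is judged significant when the bootstrap interval excludes zero, which is the criterion applied to the H4 paths above.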

6. Conclusions

This study has extended Henri (2006) – the only prior survey-based MCS/PMS study that

has invoked the resource-based view – to the unique organisational setting of academic units of

universities. It is a setting which is characterized by tensions between managerial and collegial

approaches to management, performance criteria that are centrally directed but leave room for

some discretion at the academic unit level, and academic work that can give senior researchers

higher status than their administrative Dean or Head. In this setting, findings on the uses of

MCS and PMS and their impact on organisational capabilities and performance are expected to

be context specific. Hence, if they are found to be consistent with the findings of Henri (2006),

this would provide evidence of the power of RBV as a theoretical perspective in MCS/PMS

research.

Several results were produced that could only be explained by the specific context of

university academic units. In particular, Deans/Heads conceived of the capabilities of their unit in functional dimensions rather than the generic dimensions found in prior literature: they thought in terms of bundles of generic capabilities (e.g., expertise, innovation, reputation) differentiated into teaching, research and networking capabilities. They also thought in terms of

PMS information that was either imposed or discretionary performance measures, not in terms

of financial/non-financial information. In relation to path modelling results, highlights are that

MCS use has little relationship with the extent of development of capabilities. First, diagnostic

use of MCS is unrelated to the development of capabilities in any of the fields of teaching,

research or networking, and in turn does not indirectly affect overall performance. Second,

interactive MCS use is only related to teaching capabilities, probably because of the inherent

need to manage quality issues in teaching at the ground level of the academic unit. A further

highlight is that use of centrally-imposed measures from the PMS is positively related to

capabilities in all these fields and, in turn, indirectly positively affects overall performance. The

strong influence of these common centrally-imposed performance measures across the faculties and schools of universities suggests that the development of capabilities and, in turn, areas of attained


performance is converging rather than differentiating for the purpose of sustaining

competitiveness.

The findings are subject to the limitations arising from the field survey method. In particular,

data would be expected to contain respondent biases from the ‘halo effect’ or acquiescence.

Since some of the scales in the instrument were adapted from other instruments, applied for the first time to the higher education sector, and re-conceptualised after the data were factor analysed, replication studies are needed to fully establish the validity and reliability of

constructs used in this study. Further research could provide greater depth of understanding

about the use of MCS/PMS by Deans/Heads and the way capabilities are prioritised and

strengthened by undertaking qualitative case study research within selected university Faculties

and Schools.


Appendix A : Profile of respondents and their academic units

Profile of respondents                              Deans (N=56)           Heads (N=165)
                                                  Count    %    Mean     Count    %    Mean
Educational qualification in administration
and related fields of education:
  None                                               32   57               106   64
  1 to 2 years                                        3    5                13    8
  3 years                                             2    4                 6    4
  4 to 5 years                                        4    7                 8    5
  More than 5 years                                  12   21                24   15
  Missing                                             3    5                 8    5
Experience in current academic management position:
  Less than 3 yrs                                    30   54                83   50
  3 – 5 yrs                                          16   29                59   36
  Greater than 5 yrs                                 10   18                22   13
  Missing                                             -    -                 1    1
  Mean (yrs)                                                    3.12                  3.29
Experience in prior academic management positions:
  Less than 4 yrs                                    17   30                84   51
  4 – 10 yrs                                         27   48                65   39
  11 – 20 yrs                                        12   21                13    8
  Missing                                             -    -                 3    2
  Mean (yrs)                                                    6.54                  4.60
Total years of experience in higher education
(teaching, research, management, etc.):
  10 yrs or less                                      5    9                14    8
  11 – 20 yrs                                        15   27                74   45
  21 – 30 yrs                                        22   39                61   37
  31 – 40 yrs                                        11   20                13    8
  40 yrs or more                                      2    4                 1    1
  Missing                                             1    2                 2    1
  Mean (yrs)                                                   24.14                 21.00
Management/leadership positions in organisations
other than universities:
  None                                               27   48                82   50
  1 – 5 yrs                                          14   25                36   22
  6 – 10 yrs                                          7   13                22   13
  Greater than 10 yrs                                 6   11                15    6
  Missing                                             2    4                10    6
  Mean (yrs)                                                    3.59                  3.38
Gender:
  Male                                               37   66               114   69
  Female                                             19   34                51   31
  Missing                                             -    -                 -    -
Age:
  Below 35 yrs                                        -    -                 -    -
  35 – 44 yrs                                         3    5                19   12
  45 – 54 yrs                                        21   38                76   46
  55 – 64 yrs                                        30   54                67   41
  Older than 65 yrs                                   2    4                 2    1
  Missing                                             -    -                 1    1

Profile of the academic units
Academic schools/departments or disciplines
in the Faculty/School (number):
  3 and less                                         15   27                83   50
  4 – 5                                              13   23                44   27
  6 – 7                                              14   25                19   12
  8 and more                                         14   25                17   10
  Missing                                             -    -                 2    1
  Mean                                                          5.66                  4.12
Undergraduate students in the
Faculty/College/School/Department (EFTSL*):
  1,000 and less                                      7   13               110   67
  1,001 – 2,000                                      12   21                35   21
  2,001 – 3,000                                      11   20                 8    5
  3,001 – 5,000                                      11   20                 2    1
  Greater than 5,000                                 11   20                 1    1
  Missing                                             4    7                 9    5
  Mean (EFTSL)                                                 3,505                   911

*EFTSL stands for equivalent full-time taught student load which is 96 credit hours or 8 subjects of 12 credit hours.
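Given the footnote's definition, a unit's student load in EFTSL can be obtained by dividing total enrolled credit hours by the 96-hour full-time load. A minimal sketch of this conversion (function name is illustrative):

```python
def eftsl(enrolled_credit_hours: float, full_time_load: float = 96.0) -> float:
    """Convert total enrolled credit hours to EFTSL.

    1.0 EFTSL = 96 credit hours (8 subjects of 12 credit hours each),
    per the definition in the Appendix A footnote.
    """
    return enrolled_credit_hours / full_time_load

# A student taking 4 subjects of 12 credit hours carries a 0.5 EFTSL load.
print(eftsl(4 * 12))  # -> 0.5
```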


Appendix B: Survey instruments

Organisational distinctive capabilities
Respondents were asked to indicate the extent to which their academic unit has developed the following capabilities, ranging from 1 (not at all developed) to 7 (fully developed).

Network capabilities
2 Professional connections/involvement of staff members with government regulators, professional bodies, and media in general.
5 Professional connections/involvement of staff members with private and public sector organizations and government funding councils for research funding.
6 Reputation with external communities/stakeholders/industry engagement.

Teaching capabilities
1 Core competencies in the processes underpinning teaching, learning and assessment strategies (e.g., technology-based teaching delivery; flexible delivery modes; diverse assessment structures).
3 Ability and experience in teaching and learning innovations (e.g., experiential learning, e-learning).
4 Reputation for distinguished teachers.
5 Reputation in teaching and learning.
6 Core competencies in the application of theory to practical problems (vocation) for the development of teaching or consultancy products or research.

Research capabilities
1 Reputation in research.
2 A critical mass of internationally renowned researchers in focused research areas.
3 Reputation for eminent professors.
4 Ability and experience in pursuing original research projects and generating publications.
5 Reputation for renowned authors.
6 Expertise and support structures for the Faculty to seek linkage research grants.
7 Recognised brand name in the higher education sector.
8 Ability and experience in commercializing research through patents or consulting/training services.

Performance
Respondents were asked to give a subjective rating of the performance of their academic unit on the following criteria out of 10, where 10 is the highest.

Teaching and Learning
2 Graduate employment success rates
3 Student retention rates

Research
1 Research publications
2 Research income
3 Higher degree by research

Reputation
1 Ability to attract quality students
2 Ability to attract high-profile academic staff

Overall
Overall performance of the Faculty/School

Use of Management Control Systems
Respondents were asked to indicate the extent to which they agree or disagree with the following statements, anchored with 1 (strongly disagree) and 7 (strongly agree).


Performance measures
Respondents were asked to indicate the extent to which they keep a watch on, discuss and, if necessary, take action on the progressive reporting of the following performance measures, anchored with 1 (rarely or never) and 7 (very often):

Centrally-directed performance indicators
4 Research publications.
5 External course/program surveys carried out by government and other institutions, such as the CEQ and GDS.
6 Internal course/program surveys carried out by the University/Faculty.
7 Research income.

Discretionary performance indicators
1 Cost per equivalent full-time student.
2 Tuition income.
3 Staff salary cost by categories (e.g., full-time, part-time).
4 Travel costs.
5 Administrative expenditures other than salary cost.
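The scales above are 7-point Likert items, and Appendix B notes that the MCS diagnostic-use scale is reverse coded. On a 1–7 scale, reverse coding is conventionally 8 minus the response, applied before items are averaged into a construct score. A sketch under that conventional assumption (the helper names are hypothetical, not from the study):

```python
def reverse_code(response: int, scale_max: int = 7) -> int:
    """Reverse-code a Likert response on a 1..scale_max scale."""
    if not 1 <= response <= scale_max:
        raise ValueError("response outside scale")
    return scale_max + 1 - response

def construct_score(responses, reverse=False):
    """Average item responses into a single construct score."""
    items = [reverse_code(r) if reverse else r for r in responses]
    return sum(items) / len(items)

# Five diagnostic-use items answered 2, 3, 2, 1, 4 on the 1-7 scale:
print(construct_score([2, 3, 2, 1, 4], reverse=True))  # -> 5.6
```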

MCS interactive use
I often use management control systems (MCS) information as a means of questioning and debating the decisions and actions of heads of schools/departments and other managers in the Faculty.
I use the MCS to stimulate dialogue with heads of schools/departments and other managers in the Faculty.
The information generated by the MCS becomes an important and recurring agenda item addressed by the highest level of management of the Faculty.
The MCS involves a lot of interactions with all levels of managers.
The MCS is designed to respond to new and unplanned circumstances (e.g., new opportunities) in a flexible way.
MCS information generated by the system is often interpreted and discussed in face-to-face meetings.

MCS diagnostic use (reverse coded)
1 The monitoring and accomplishment of predetermined critical performance goals is central to the use of the MCS.
2 I heavily rely on staff specialists (i.e., finance and other administrative staff) to monitor the achievement of goals and strategies.
3 I mainly use MCS information reported through formal channels.
5 The MCS is primarily used to regularly track progress towards goals.
6 I give higher priority to accuracy and completeness of MCS information than to its timeliness.

References

Abernethy, M. A., Bouwens, J., and van Lent, L. 2010. Leadership and control system design. Management Accounting Research 21 (1):2-16.

Abernethy, M. A., and Brownell, P. 1997. Management control systems in research and development organizations: The role of accounting, behavior and personnel controls. Accounting, Organizations and Society 22 (3-4):233-248.

———. 1999. The role of budgets in organizations facing strategic change: an exploratory study. Accounting, Organizations and Society 24 (3):189-204.

Ahrens, T., and Chapman, C. S. 2004. Accounting for Flexibility and Efficiency: A Field Study of Management Control Systems in a Restaurant Chain. Contemporary Accounting Research 21 (2):271-301.

Ansari, S. L. 1977. An integrated approach to control system design. Accounting, Organizations and Society 2 (2):101-112.

Atkinson, A. A., Waterhouse, J. H., and Wells, R. B. 1997. A Stakeholder Approach to Strategic Performance Measurement. Sloan Management Review 38 (3):25-37.

Baines, A., and Langfield-Smith, K. 2003. Antecedents to management accounting change: a structural equation approach. Accounting, Organizations and Society 28 (7-8):675-698.


Ballantine, J., and Brignall, S. 1995. A taxonomy of performance measurement frameworks. Paper presented at the 18th annual congress of the European Accounting Association, Birmingham.

Barclay, D., Higgins, C., and Thompson, R. 1995. The partial least squares (PLS) approach to causal modeling: Personal computer adoption and use as an illustration. Technology Studies 2 (2):285-309.

Barney, J., Wright, M., and Ketchen Jr, D. J. 2001. The resource-based view of the firm: Ten years after 1991. Journal of Management 27 (6):625.

Barney, J. B. 1991. Firm Resources and Sustained Competitive Advantage. Journal of Management 17 (1):99.

———. 2001. Resource-based theories of competitive advantage: A ten-year retrospective on the resource-based view. Journal of Management 27 (6):643.

Biggs, J. 2003. Teaching for Quality Learning at University: What the Student Does. 2nd edn ed. Berkshire: Open University Press.

Bisbe, J., and Otley, D. 2004. The effects of the interactive use of management control systems on product innovation. Accounting, Organizations & Society 29:709-737.

Broadbent, J., and Guthrie, J. 2008. Public sector to public services: 20 years of "contextual" accounting research. Accounting, Auditing & Accountability Journal 21 (2):129-169.

Bromwich, M., and Bhimani, A. 1989. Management Accounting: Evolution not Revolution. London: CIMA Publications.

Chapman, C. S. 1997. Reflections on a contingent view of accounting. Accounting, Organizations & Society 22 (2):189-205.

———. 1998. Accountants in organisational networks. Accounting, Organizations and Society 23 (8):737-766.

Chenhall, R. H., and Langfield-Smith, K. 2003. Performance Measurement and Reward Systems, Trust, and Strategic Change. Journal of Management Accounting Research 15:117.

Chenhall, R. H., and Morris, D. 1995. Organic decision and communication processes and management accounting systems in entrepreneurial and conservative business organizations. Omega 23 (5):485-497.

Chin, W. W. 1998. The partial least squares approach to structural equation modeling. In Modern Methods for Business Research, edited by G. A. Marcoulides. NJ: Lawrence Erlbaum Associates.

Chin, W. W., and Newsted, P. R. 1999. Structural Equation Modeling: Analysis with Small Samples Using Partial Least Squares. In Statistical Strategies for Small Sample Research, edited by R. H. Hoyle. Thousand Oaks, CA: Sage, 307-341.

Chung, T., Harrison, G., and Reeve, R. 2009. Interdependencies in Organization Design: A Test in Universities. Journal of Management Accounting Research 21:55.

Clardy, A. 2007. Strategy, core competencies and human resource development. Human Resource Development International 10 (3):339-349.

Collier, P. M. 2005. Entrepreneurial control and the construction of a relevant accounting. Management Accounting Research 16:321-339.

Committee of Vice-Chancellors and Principals. 1985. Report of the Steering Committee for Efficiency Studies in Universities (Chairman A. Jarratt).

Conner, K. R. 1991. A Historical Comparison of Resource-Based Theory and Five Schools of Thought Within Industrial Organization Economics: Do We Have a New Theory of the Firm? Journal of Management 17 (1):121.

Daft, R. 1983. Organisation theory and design. New York: West.

Daft, R. L., and Macintosh, N. B. 1984. The Nature and Use of Formal Control Systems for Management Control and Strategy Implementation. Journal of Management 10 (1):43-66.

Davies, A., and Thomas, R. 2002. Managerialism and Accountability in Higher Education: the Gendered Nature of Restructuring and the Costs to Academic Service. Critical Perspectives on Accounting 13 (2):179-193.

Davila, T. 2000. An empirical study on the drivers of management control systems' design in new product development. Accounting, Organizations and Society 25 (4-5):383-409.

Dawkins, J. S. 1987. Higher Education - A Policy Discussion Paper: Australian Government Publishing Service, Canberra.

———. 1988. Higher Education - A Policy Statement: Canberra: Australian Government Publishing Service.


Day, G. S. 1994. The capabilities of market-driven organizations. Journal of Marketing 58 (4):37.

Deem, R. 1998. New managerialism and higher education: the management of performances and cultures in universities in the United Kingdom. International Studies in Sociology of Education 8 (1):47-70.

———. 2004. The Knowledge Worker, the Manager-academic and the Contemporary UK University: New and Old Forms of Public Management? Financial Accountability & Management 20 (2):107-128.

DEEWR. 2007. Higher Education Summary: Department of Education, Employment and Workplace Relations, Canberra, accessed on 25 November 2008, http://www.dest.gov.au/sectors/higher_education/default.htm

Dent, J. F. 1987. Tensions in the Design of Formal Control Systems: A Field Study in a Computer

Company. In Accounting and Management: Field Study Perspectives, edited by J. William J. Bruns and R. S. Kaplan. Cambridge, MA: Harvard Business School Press.

Ditillo, A. 2004. Dealing with uncertainty in knowledge-intensive firms: the role of management control systems as knowledge integration mechanisms. Accounting, Organizations and Society 29 (3-4):401-421.

Eisenhardt, K. M., and Martin, J. A. 2000. Dynamic Capabilities: What Are They? Strategic Management Journal 21 (10/11):1105-1121.

English, T. 2001. Tension analysis in international organizations: A tool for breaking down communication barriers. International Journal of Organizational Analysis (1993 - 2002) 9 (1):58.

Ethiraj, S. K., Kale, P., Krishnan, M. S., and Singh, J. V. 2005. Where do capabilities come from and how do they matter? A study in the software services industry. Strategic Management Journal 26 (1):25.

Euske, K. J., Lebas, M. J., and McNair, C. J. 1993. Performance management in an international setting. Management Accounting Research 4 (4):275-299.

Ferreira, A., and Otley, D. 2005. The Design and Use of Performance Management Systems: An Extended Framework for Analysis, Social Science Research Network. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1439503.

Fornell, C., and Larcker, D. F. 1981. Evaluating Structural Equation Models with Unobservable Variables and Measurement Error. Journal of Marketing Research (JMR) 18 (1):39-50.

Govindarajan, V., and Shank, J. K. 1992. Strategic Cost Management: Tailoring Controls to Strategies. Cost Management 6 (3):14.

Haas, M. d., and Kleingeld, A. 1999. Multilevel design of performance measurement systems: enhancing strategic dialogue throughout the organization. Management Accounting Research 10 (3):233-261.

Hall, M. 2008. The effect of comprehensive performance measurement systems on role clarity, psychological empowerment and managerial performance. Accounting, Organizations and Society 33 (2-3):141-163.

Henri, J.-F. 2006. Management control systems and strategy: A resource-based perspective. Accounting, Organizations and Society 31 (6):529-558.

———. 2008. Taxonomy of performance measurement systems. Advances in Management Accounting 17:247-288.

Henseler, J., Ringle, C. M., and Sinkovics, R. R. 2009. The use of partial least squares path modeling in international marketing. Advances in International Marketing 20:277-319.

Higgins, J. C. 1989. Performance measurement in universities. European Journal of Operational Research 38 (3):358-368.

Hofstede, G. 1981. Management control of public and not-for-profit activities. Accounting, Organizations and Society 6 (3):193-211.

Hoopes, D. G., Madsen, T. L., and Gordon, W. 2003. Guest Editors' Introduction to the Special Issue: Why Is There a Resource-Based View? Toward a Theory of Competitive Heterogeneity. Strategic Management Journal 24 (10):889-902.

Hoque, Z. 2004. A contingency model of the association between strategy, environmental uncertainty and performance measurement: impact on organizational performance. International Business Review 13 (4):485-502.


Hoque, Z., and James, W. 2000. Linking Balanced Scorecard Measures to Size and Market Factors: Impact on Organizational Performance. (cover story). Journal of Management Accounting Research 12:1-17.

Hult, G. T. M., and Ketchen Jr, D. J. 2001. Does Market Orientation Matter?: A Test of the Relationship between Positional Advantage and Performance. Strategic Management Journal 22 (9):899-906.

Hussain, M. M. 2005. Management accounting performance measurement systems in Swedish banks. European Business Review 17 (6):566.

Ittner, C. D., and Larcker, D. F. 2001. Assessing empirical research in managerial accounting: a value-based management perspective. Journal of Accounting and Economics 32 (1-3):349-410.

Ittner, C. D., Larcker, D. F., and Randall, T. 2003. Performance implications of strategic performance measurement in financial services firms. Accounting, Organizations and Society 28 (7-8):715-741.

Johansson, T., and Siverbo, S. 2009. Explaining the utilization of relative performance evaluation in local government: A multi-theoretical study using data from Sweden. Financial Accountability & Management 25 (2):197-224.

Johnson, H. T. 1990. Performance measures for competitive excellence. In Measures for Manufacturing Excellence, edited by R. S. Kaplan. Boston, MA: Harvard Business School Press.

Jones, C. S. 1986. Universities, on becoming what they are not. Financial Accountability & Management 2 (2):107.

Joreskog, K. G., and Wold, H. O. 1982. The ML and PLS technique for modeling with latent variables: Historical and comparative aspects. In Systems under indirect observation, Part I, edited by K. G. Joreskog and H. O. Wold. Amsterdam: North-Holland.

Kaplan, R. S. 1991. New Systems for Measurement and Control. The Engineering Economist: A Journal Devoted to the Problems of Capital Investment 36 (3):201 - 218.

Kloot, L. 1997. Organizational learning and management control systems: responding to environmental change. Management Accounting Research 8 (1):47-73.

Kober, R., Ng, J., and Paul, B. J. 2003. Change in strategy and MCS: A match over time. Advances in Accounting 20:199-232.

———. 2007. The interrelationship between management control mechanisms and strategy. Management Accounting Research 18 (4):425-452.

Lafferty, G., and Fleming, J. 2000. The Restructuring of Academic Work in Australia: Power, Management and Gender. British Journal of Sociology of Education 21 (2):257-267.

Lapsley, I. 1999. Accounting and the New Public Management: Instruments of substantive efficiency or a rationalising modernity? Financial Accountability & Management 15 (3/4):201.

Lee, C., Lee, K., and Pennings, J. M. 2001. Internal Capabilities, External Networks, and Performance: A Study on Technology-Based Ventures. Strategic Management Journal 22 (6/7):615-640.

Lynch, R., and Baines, P. 2004. Strategy development in UK higher education: towards resource-based competitive advantages. Journal of Higher Education Policy & Management 26 (2):171-187.

Mahama, H. 2006. Management control systems, cooperation and performance in strategic supply relationships: A survey in the mines. Management Accounting Research 17 (3):315-339.

Marginson, D. E. W. 2002. Management Control Systems and Their Effects on Strategy Formation at Middle-Management Levels: Evidence from a U.K. Organization. Strategic Management Journal 23 (11):1019-1031.

Maurice, G. 2005. An empirical study of performance measurement in manufacturing firms. International Journal of Productivity and Performance Management 54 (5/6):419.

McLaughlin, K., Osborne, S. P., and Ferlie, E., eds. 2002. New Public Management: Current trends and future prospects. 1st ed. London: Routledge.

Merchant, K. A. 1998. Modern Management Control Systems: Text and Cases. Upper Saddle River, NJ: Prentice-Hall.

Miller, D., and Friesen, P. H. 1982. Innovation in conservative and entrepreneurial firms: Two models of strategic momentum. Strategic Management Journal 3 (1):1-25.

Naranjo-Gil, D., and Hartmann, F. 2006. How Top Management Teams Use Management Accounting Systems to Implement Strategy. Journal of Management Accounting Research 18:21-53.

Nunnally, J. C. 1978. Psychometric Theory. New York: McGraw-Hill.

Otley, D. 1999. Performance management: a framework for management control systems research. Management Accounting Research 10 (4):363-382.


Otley, D. T., and Berry, A. J. 1980. Control, organisation and accounting. Accounting, Organizations and Society 5 (2):231-244.

Ouchi, W. G. 1979. A Conceptual Framework for the Design of Organizational Control Mechanisms. Management Science 25 (9):833-848.

Parker, L. D. 2002. It's been a pleasure doing business with you: a strategic analysis and critique of university change management. Critical Perspectives on Accounting 13 (5-6):603-619.

Pettersen, I.-J., and Solstad, E. 2007. The role of accounting information in a reforming area: A study of higher education institutions. Financial Accountability & Management 23 (2):133-154.

Pick, D. 2006. The Re-Framing of Australian Higher Education. Higher Education Quarterly 60 (3):229-241.

Ringle, C. M., Wende, S., and Will, S. 2005. SmartPLS 2.0 (M3) Beta, available from http://www.smartpls.de.

Roberts, R. W. 2004. Managerialism in US universities: implications for the academic accounting profession. Critical Perspectives on Accounting 15 (4-5):461-467.

Said, A. A., Elnaby, H. R. H., and Wier, B. 2003. An Empirical Investigation of the Performance Consequences of Nonfinancial Measures. Journal of Management Accounting Research 15:193.

Simon, H. A., Guetzkow, H., Kozmetsky, G., and Tyndall, G. 1954. Centralization vs decentralization in organizing the controller's department. New York: Controllership Foundation Inc.

Simons, R. 1987. Accounting control systems and business strategy: An empirical analysis. Accounting, Organizations and Society 12 (4):357-374.

———. 1990. The role of management control systems in creating competitive advantage: New perspectives. Accounting, Organizations and Society 15 (1-2):127-143.

———. 1991. Strategic Orientation and Top Management Attention to Control Systems. Strategic Management Journal 12 (1):49-62.

———. 1994. How New Top Managers Use Control Systems as Levers of Strategic Renewal. Strategic Management Journal 15 (3):169-189.

———. 1995. Levers of Control: How Managers Use Innovative Control Systems to Drive Strategic Renewal. Boston: Harvard Business School Press.

———. 2000. Performance Measurement & Control Systems for Implementing Strategy: Text & Cases. New Jersey: Prentice Hall.

Spanos, Y. E., and Lioukas, S. 2001. An Examination into the Causal Logic of Rent Generation: Contrasting Porter's Competitive Strategy Framework and the Resource-Based Perspective. Strategic Management Journal 22 (10):907-934.

Stede, W. A. V. d., Chow, C. W., and Lin, T. W. 2006. Strategy, Choice of Performance Measures, and Performance. Behavioral Research in Accounting 18:185.

Tripsas, M., and Gavetti, G. 2000. Capabilities, Cognition, and Inertia: Evidence from Digital Imaging. Strategic Management Journal 21 (10/11):1147-1161.

Trist, E. L., and Bamforth, K. W. 1951. Some Social and Psychological Consequences of the Longwall Method of Coal-Getting: An Examination of the Psychological Situation and Defences of a Work Group in Relation to the Social Structure and Technological Content of the Work System. Human Relations 4 (February):3-38.

Vandenbosch, B. 1999. An empirical analysis of the association between the use of executive support systems and perceived organizational competitiveness. Accounting, Organizations and Society 24 (1):77-92.

Wernerfelt, B. 1984. A Resource-based View of the Firm. Strategic Management Journal 5 (2):171-180.

Widener, S. K. 2006. Associations between strategic resource importance and performance measure use: The impact on firm performance. Management Accounting Research 17 (4):433-457.

———. 2007. An empirical analysis of the levers of control framework. Accounting, Organizations and Society 32 (7-8):757-788.