MIS40640 – Managing Technology & Change: Cultural & Political Perspectives – The IS design & development process

Readings of the Week:

Curtis, Krasner and Iscoe – A Field Study of the Software Design Process for Large Systems

Lyytinen & Robey – Learning failure in ISD

Chaos report summary 2009

Questions & Comments:

Lyytinen & Robey offer four key reasons why organisations fail to learn, and conclude that while “learning from experience can be a long and tortuous adventure”, it is entirely necessary. In their methods for overcoming these barriers to learning, they speak about knowledge management but give fairly generic advice on the issue of organisational intelligence, such as ‘performance metrics’. Is this really going to overcome the learning problems they write about in their paper? Would a more concrete knowledge management methodology (for example, a transactive memory system) be more useful in addressing those issues?

Both Curtis et al. and Lyytinen & Robey try to understand the problems faced during IS implementations and suggest frameworks to avoid the obvious pitfalls. While these frameworks promise no magic cure or silver bullet, they do provide a means to understand the complex landscape of IS implementations. But is that enough? IS implementations by their very nature remain complex propositions: difficult to fathom, almost impossible to predict, and liable to go off track. Given that, academic research should focus more on ways to bring off-track projects back on track, rather than on creating frameworks describing the implementation landscape as such. Perhaps the wisdom is that by understanding IS implementation failures, managers can somehow learn to avoid the highlighted shortcomings. But it would help far more if research were done on projects which went off track and were then brought back on schedule and under budget, whether through the sheer devotion and risk-taking of an individual or of a steering group. Such examples would provide more useful insights for the project management field and help produce action plans or frameworks to resolve the mystery of large IS implementations. The examples of MCC, Taurus and CompuSys just highlight the fact that, in spite of talented managers and available resources, things will go wrong. What does it take to bring things back in order, control the slippages and rein the projects back on track? I think that is the question academics and practitioners need to answer.

The three most salient problems, in terms of the additional effort or mistakes attributed to them, were the thin spread of application domain knowledge; fluctuating and conflicting requirements; and communication and coordination breakdowns. Curtis et al. explore the causes and consequences of these problems at five levels—individual, team, project, company, and milieu—and observe that problems caused at one level may have repercussions at another. Do you think these apply to agile projects? Should they vary between project development teams and product development teams?

Minutes of the Class:

On the topic of Evaluation, Seamas continued from the previous week’s class discussion

The key notion is that Evaluation is a “valuation” exercise

  • It is based on what you “value”. Even if this is a legitimate value, the evaluation is still a value judgement
  • Foucault said that we can never step out of a discourse. How we evaluate things will always be affected by the way we value things
  • Therefore evaluations can never be truly unbiased and we must always be sceptical when reading evaluations (critical consumers)

Evaluation Process as a political tool

  • We need to know and understand the politics behind the veil of rationality
  • Sometimes an evaluation is used to justify a decision rather than to genuinely evaluate it – as in last week’s papers, where the evaluation served as justification
  • But sometimes it is used as a calming tool when meeting uncertain outcomes or risks
  • Human behaviour has a tendency to lean towards league tables and rankings. But these only highlight certain selected features
  • Evaluation is a ritual that one follows – to perform due diligence

The language used in evaluations has the power of control

  • The metrics, KPIs and graphs all combine to convince readers. On the face of it this appears emotionally detached, but we know it is based on value judgements

The idea of the Overt and Covert perspective

  • We should be cautious about what evaluations highlight, but also about what is hidden
  • Metaphors always come from a perspective: drawing attention to one thing always draws attention away from something else. Therefore, we should be very critical consumers of evaluations, especially when reviewing things that are highlighted – looking for what is covered up or what we have been drawn away from

When is the best time to conduct an Evaluation?

  • Key notion here is that with the passing of time, things can change, and an evaluation done in the past might not hold up in the future – “Something that looked really good at one point can look really bad at another”
  • “It’s too early to say…” if something can be deemed great
  • Seamas suggested reading Joseph Stiglitz’s essay on the US Administration, describing the short-termism of the administration’s view: freely admitting to delivering policies that are impactful only in the short term, without knowing or understanding the impact in the medium to long term. With the estimated time-lag between implementing something and seeing a fruitful outcome at approx. 17 years, it didn’t make sense to pursue anything long-term because it would be long forgotten
  • However, note the example of Bill Clinton and his awareness of a long-term legacy
  • There will always be pressure to make a (short-term) marker in an organisation. But something that may look successful now may not do so in the long term (and vice versa, in that something that looks bad now can be of huge significance in the long term). Therefore managers need to be aware and think in the long-term

Interpretative approach to evaluation

  • Recast evaluation as an on-going dialogue, rather than reduced to a nice dashboard
  • Admit to the political nature of the process
  • Need to actively create the reality, to appreciate the value that is set. To then generate commitment and actions from an evaluation
  • Need to be sensitive to what’s going on, and to understand the idiocy of the current evaluation process. To find spots of resistance and work out ways to get buy-in, through promoting and shaping the interpretation, to motivate collective action
  • Be aware of the multiple views and multiple parties involved – need to understand how different groups can interpret an evaluation differently, with different assumptions and different meanings. To seek out these relevant social groups (RSGs) and listen to their views
  • Can’t always have consensus, but that doesn’t mean it is not valuable to engage in a dialogue process. Doing so can make the eventual executive decision more palatable to everyone (the Harmony perspective)

A Learning process

  • Use evaluations to find ways of understanding things. To see the evaluator as a collaborator in a teacher/learner relationship
  • Work on shaping this into reality and changing the current process / view of the process. The outcomes of which are drivers of actions
  • Evaluation as an ongoing sense-making process
  • Finding ways of mobilizing; finding ways of understanding things; finding ways to get consensus; finding ways to get collective action
  • Genuinely discuss what the best course of action is and create an environment to let people speak candidly
  • Doing this involves a huge amount of skill. It is a social shaping skill

The discussion then turned to Communication Skills – e.g. Alex Ferguson managed Manchester United through a million conversations

  • Committing to dialogue as a way of management
  • Developing a sensitivity to, and shaping, how people make sense of things
  • Framing language to get buy-in and mobilize collective actions

Need to develop the relationships to allow people to be candid

  • Otherwise things become very toxic if everyone is false and says things they don’t mean
  • Need to promote honest dialogue, that produces actual meaningful action
  • Listening / Giving feedback is a very powerful skill

Week 7 Readings and Key Learnings from the Group Presentation

Problems of IT Projects

  • Complex and time consuming, requiring good Stakeholder Management skill
  • Communications is key and need to manage up/down/across the organisation
  • Politics / Culture / Language issues encountered
  • Scope management/scope creep


  • No silver bullet – use different development methodologies where it makes sense
  • Building relationships is key, and requires a good balance of hard and soft skills
  • To learn at all levels – creating a learning culture / a collaborative environment, with candid dialogue and a supportive management
  • Confidence to know what is going on, to step back and say something different

Seamas’ view on problems of IT Projects and issues with Software Development

Notion of “software crisis”

  • Very similar issues have endured down the years. One explanation is that software development is different from other projects (such as building a bridge): there is a lot of flexibility in s/w development, and it is easily influenced by various stakeholders / senior management. The success / failure rate shown in the Chaos Report is therefore expected and normal
  • However, the other side of the argument says it shouldn’t be much different, in that specialist knowledge is required, and many of the issues mentioned match those in other disciplines

Key factor of success

  • The problem is managing the size of the project. It is said that s/w projects don’t scale well. If we want to minimise risk in s/w projects, the key lesson is to set a limit of 6 people per project part; above this scale things become problematic
  • The Chaos Report attributed the higher success rates of recent years to smaller agile projects, a decrease in waterfall projects and a decrease in project size. This reflects the above point about project size and how managing each smaller part is crucial

Domain knowledge

  • There is a thin spread of application domain knowledge. For s/w projects to succeed there is a need to know a lot about the domain in which the s/w will eventually operate. This knowledge is therefore critical
  • If the knowledge is scarce in your project, be very careful as this is a very risky project – “Writing code isn’t the problem, understanding the problem is the problem”

“Project Gurus” (Domain knowledge experts)

The existence of these experts is very real in practice. They are extremely familiar with the application domain and often integrate several knowledge domains. They are the focal point in a project and often adopt an unofficial management position, with most things needing to go through them


  • They can help identify things that are not written down, and see problems that have gone unnoticed or are about to happen. They can turn things around from going down a wrong path, and know right away what would not work
  • Usually a very technical person (even if not a coder), and therefore able to communicate well with software development teams
  • Large amounts of time are saved through the lessons they have learnt, gathered with time and experience. They are also very generous with their knowledge


  • They tend to internalise problems of the project, taking issues personally
  • They are a major risk factor for a project – if they leave, communications and the project itself can break down completely
  • They may have their own power/political agendas that can hold up project progress, or prevent things being done
  • They may dominate the design discussions – which can be good or bad
  • Application domain experts are very rare and very specific to their domain

What can be done?

  • Educating new people – through shadowing / listening, to get the knowledge spread around; developing the current learning process to bring people up to speed
  • Have these lessons taught to a core group – make changes to facilitate this, to mitigate risks tied to certain individuals
  • Show appreciation for the practices that exist, regardless of whether the s/w is elegant or not
  • There is no efficient, clear-cut solution – but there is a need to learn from the expert, likely over a prolonged period, through trial and error