MIS40910 – Skills for Business Enquiry – Research Paradigms

1.Levy, Yair and Ellis, Timothy J (2006) ‘A Systems Approach to Conduct an Effective Literature Review in Support of Information Systems Research’, Informing Science: International Journal of an Emerging Transdiscipline, 9: 181-212.

1.Levy2006-System Approach to Literature Review

2.Rosenzweig, P. (2007). Misunderstanding the nature of company performance: The halo effect and other business delusions. California Management Review, 49(4), 6-20.

2.Rosenzweig2007-Misunderstanding the Nature of Company Performance

It was our turn to present, and the link to our presentation is below.

Skills for Business Enquiry – Week5 Presentation

Posted in iBusiness | Tagged , , | Leave a comment

MIS40910 – Critique – Systems Approach to Literature Review & Misunderstanding Performance due to Halo Effect

A Systems Approach to Conduct an Effective Literature Review in Support of Information Systems Research – Yair Levy and Timothy J. Ellis

1.Levy2006-System Approach to Literature Review

Misunderstanding the Nature of Company Performance: THE HALO EFFECT AND OTHER BUSINESS DELUSIONS – Phil Rosenzweig

2.Rosenzweig2007-Misunderstanding the Nature of Company Performance

Though the two articles have little in common, there is a common thread: both deal with the human enquiry process. Levy & Ellis propose a systematic framework for enquiry in the realm of writing literature reviews, whereas Rosenzweig’s article also deals with enquiry but is more a critique of the research methodologies used by different authors who enquired into performance measures among business organisations.

They are in agreement in that both give due importance to the quality of the data used as input, whether for writing literature reviews or for researching company performance. Levy & Ellis’s article emphasises the importance of avoiding the garbage-in/garbage-out pitfall while producing an effective literature review. In the same breath, Rosenzweig stresses how dubious data and flawed analysis have led to questionable findings and to a general misunderstanding of company performance.

Where they differ is that Levy & Ellis totally ignore any discussion of the biases that might lead to skewed data processing. Rosenzweig, on the other hand, over-emphasises the impact of biases, suggesting that almost every magazine or business journal out there has fallen prey to the halo effect, deriving specific inferences on the basis of a general impression. In our view the solution lies somewhere in the middle: biases should not be discounted entirely, but accepting that biases for and against will always exist in any human enquiry process, and taking steps to dilute their impact rather than rejecting previous research outright, might benefit researchers by allowing them to build on the past conclusions and judgements of their compatriots.

Levy & Ellis over-simplify the knowledge creation process by using the analogy of Input, Processing & Output stages, as if the human brain could robotically process the tomes of data collected during the Input stage. The human brain is not a CPU in any real sense; thought processes are affected by emotions, likings, past experiences, intelligence, education and innumerable other variables. The brain can make unexpected connections using the data at hand, or can entirely miss the most important relevant fact. That is the reason multiple revisions, peer reviews and editorial reviews are so common in the literary domain. But there is no mention of this in Levy & Ellis’s article.

In contrast, Rosenzweig’s article takes a much more rational approach and critiques the research methodologies adopted by three well-reputed sets of authors. His arguments around those methodologies broadly make sense. Any discussion of the performance measures of individuals, organisations, or for that matter even nations and societies, can at most be a debatable topic. To make a sound argument about performance measures one has to formulate a process for quantifying the efficiency and effectiveness of past actions, which is very subjective. To overcome this constraint Rosenzweig recommends evidence-based management, as advanced by Pfeffer & Sutton. But relying on hard facts alone can make managers short-sighted; there is enough evidence to suggest that great managers have all been instinctive and insightful. To take an example from one of the greats, Steve Jobs famously remarked that customers don’t know what they want. In this context, searching for evidence in the business environment can be as futile as chasing a mirage in a desert.



MIS40670 – Buxton’s Avalanche Case

From Bill Buxton’s Sketching User Experiences (Buxton, 2007).

This case sets out the issues for high-tech design: design that works “in the wild”, that works for real people in real situations and facilitates achieving their human goals.

Bill Buxton sets the scene for us with his avalanche responder case. It is an incident experienced by Bill’s good friend Saul Greenberg and three friends when skiing in high mountainous terrain in Canmore and Kananaskis Country, Alberta, Canada.

The group was traversing a valley slope when a lethal avalanche fell across their path. The three skiers in the middle of the group were caught in the slide. The lead skier (Saul’s wife) and the last skier could only watch the disaster unfold as the other three were engulfed by the avalanche.

One was simply knocked down, one was buried up to her shoulders and the last, Saul, was missing.

The group was well prepared in that they carried transceivers, probe sticks and shovels.


But more than just the technology, it also requires knowledge, shared practices, skills, and analysis of a concrete situation (among others). When skiing in avalanche-prone conditions, you work one of a number of simple systems depending on the severity of the risk. The normal procedure when traversing is to spread out, with lookouts at either end, and traverse one by one, always retaining one lookout. Then triage: rescue the most able first (they may be able to assist later). Go to the approximate location, judge whether the victim was carried, then guide the search using the transceiver. Use the avalanche probe. When the victim is felt, you start to dig. And dig. Judy started digging. Steve arrived and asked if she had verified the spot with her probe; she hadn’t. Judy was confident that she had the right spot, but by this time she had had to dig so deep that her confidence was wavering…

What did Saul do? He tried to ski his way out but got caught in the trough, a ‘feature trap’ (avalanches can travel at up to 200 km/hour, whereas 40 km/hour is really fast for a skier), which also meant he was buried deep. But he had cupped his hand over his mouth and nose, preserving a small air space so he could breathe. He waited, buried under the weight of the snow, and tried to relax. He had to trust in his partners, their training, and his and their gear.

None of the participants had ever been in this situation before. The total time from slide to rescue was about 10 minutes. Under the conditions, after 20 minutes he would have been dead.

Ask yourselves: after the ‘who’, what saved Saul? If avalanches are so lethal, how or why did he survive? Is there a system here? There is, isn’t there? If so, what is it?

Well, I think the system is there, and it consists of a combination of human and technology factors. In this case the human factors consisted of

1) Ad-hoc Problem Solving: No one in the group had ever been in this situation before, but they devised ad-hoc methods to rescue Saul

    • Context
    • Ad-hoc
    • Adapting

2) Training: Saul used his training and put his hand over his mouth and nose to preserve a small air space; this proved crucial, as it took time to dig him out.

    • Rescue Training
    • Survival Training

And technological factors involved were

1) Procedures: The group spread out so that there would always be lookouts in such emergencies

    • Lookouts at either end
    • Traverse one-by-one.
    • Retain one lookout.
    • Triage; Rescue the most able first (and they may be able to assist later)

2) Equipment: The group had a transceiver, probe and shovel at their disposal for such situations

    • Transceiver
    • Shovel
    • Probe Stick

But still it was the combination of the two that saved Saul. Though the technology to locate him accurately was available, Judy relied on her eyesight and instincts and was confident she was digging in the right spot. When she could not locate Saul her confidence wavered; had she relied on the evidence, she would have been sure of the burial location. Conversely, had the transceiver not been there, with Saul buried so deep it would have been extremely difficult for the group to locate him.

We have our intentions for technology: it should just work. The system just works, in the wild. But what do we need to do to get it to work?

First, have trust in the technology, and then rely on it. And ensure that the system takes account of the context and environmental landscape in which it is supposed to function.


MIS40670 – Project View & Planning Poker

The class had an interesting discussion around the Project View, in which high-tech development projects characterise the work in terms of four key variables:

Quality, Cost, Time and Scope.

Allen initiated the discussion around how project success depends on these four variables.

He projected slides with his own hand-drawn graphs showing the impact of these variables on project success, and the whole class then discussed the graphs on each dimension.

The first variable picked up was Time. Time is a crucial dimension of production activity. It turns out that an appropriate timeline is a huge enabler for a project. However, too aggressive a time target dooms a project to undue haste, unrealistic delivery times and, potentially, failure. Similarly, an excessively long time frame can defocus a team’s attention and starve the project of valuable feedback and checkpoints.

“How does a project get to be a year late?… One day at a time.” (Brooks Jr., 1995)

The second variable on which project success depends is Scope. It was deduced during the discussion that scope management is crucial for project success. Typically, project scope expands over time to include more features in greater detail as you learn what the customer wants/needs/really needs. The feature list of a project should always be clear and concise. Too large a list of features, or feature creep, generates problems of priority and coherence. A smaller set of the most crucial features probably has a stronger (positive) influence on the underlying architecture of the product. And “less scope makes it possible to deliver better quality”.

“If you actively manage scope, you can provide managers and customers with control of cost, quality, and time.” (Beck, 2000)

The third variable is Quality, which was a bit contentious and generated some debate. However quality might be defined, we should keep in mind that defining it is a non-trivial exercise. Quality is usually highly contextual, situated in a prevailing culture of what constitutes good or bad quality. In the case of software, the product (or service) is not a physical good and so does not wear out in the way that hardware does. Hardware degrades over time due to physical wear and tear, breaking down through mechanical or physical failure. Software still fails, and so it undergoes maintenance work to fix or enhance it over its economic life. For the purposes of a particular project, the product’s quality is generally a negotiated concept. Don’t deliver something you know hasn’t been tested, or that fails the tests; quality should be used to set thresholds and targets, and using it as a control variable undermines and destroys the values we all aspire to.

“Quality is a terrible control variable.” (Beck, 2000)

The final and perhaps most important variable discussed was Cost. Project go/no-go decisions are generally taken on the basis of cost alone. “Estimation as practice” relies on the skill, knowledge, resources and contexts of those involved in a situation. Estimations are ‘situated’ in the same way that other kinds of knowledge work are situated amongst context, history, place and moments in time. Accepted wisdom suggests that you make a guess, double it and hope for the best. Why? Novel, valuable high-tech requirements are by definition unknowns. And estimating how to produce an as-yet unknown, unfinished thing is necessarily an art, not a science. But we don’t need to leave it there; there are approaches and practices that, combined, enable us to make informed ‘guesstimates’ or scientific wild-ass guesses (SWAGs) that help us overcome this bind.

“More software projects have gone awry for lack of calendar time than for all other causes combined… but adding manpower to a late software project makes it later.” (Brooks Jr., 1995)

For estimation, Allen introduced the class to the interesting notion of Planning Poker. Planning Poker is “played” by the team as part of the Sprint Planning meeting. A Planning Poker session begins with the customer or marketing representative explaining each requirement to the extended development team. We use the term extended development team (often called the “whole team” by agile software developers) to refer to all those involved in the development of a product, including product managers, project managers, software developers, testers, usability engineers, security engineers and others. In turn, the team discusses the work involved in fully implementing and testing a requirement until they believe they have enough information to estimate the effort. Each team member then privately and independently estimates the effort. The team members reveal their estimates simultaneously. Next, the team members with the lowest and highest estimates explain their estimates to the group. Discussion ensues until the group is ready to re-vote on their estimates. Further estimation rounds take place until the team comes to a consensus on an effort estimate for the requirement. Most often, only one or two Planning Poker rounds are necessary on a particular requirement before consensus is reached.
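The voting loop described above lends itself to a short sketch. This is purely illustrative: the card deck and the team member names below are hypothetical, not taken from the class session or the Scrum literature.

```python
# Illustrative sketch of the Planning Poker loop described above.
# The card deck and team member names are hypothetical examples.
DECK = [1, 2, 3, 5, 8, 13, 20, 40, 100]  # a commonly used estimation deck

def consensus_reached(votes):
    """Consensus means every team member revealed the same card."""
    return len(set(votes.values())) == 1

def poker_round(votes):
    """One round: simultaneous reveal, then either consensus or
    the lowest and highest estimators must explain themselves."""
    if consensus_reached(votes):
        return ("consensus", next(iter(votes.values())))
    lowest = min(votes, key=votes.get)
    highest = max(votes, key=votes.get)
    return ("discuss", lowest, highest)

# Round 1: estimates diverge, so the low and high voters explain.
print(poker_round({"Ann": 5, "Ben": 8, "Cara": 13, "Dev": 8}))
# Round 2: after discussion the team re-votes and converges.
print(poker_round({"Ann": 8, "Ben": 8, "Cara": 8, "Dev": 8}))
```

The “discuss” branch is where the value lies: the outliers surface hidden assumptions before the re-vote.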

Each group was instructed to use Planning Poker to estimate, in hours, the cost of building the Lego robot the class had built in an earlier session. Our group had four people, and we used the provided poker cards to derive the estimates. In most cases our estimates were quite close, but we still needed a few iterations of card play before building consensus around the estimates.


MIS40910 – Skills for Business Enquiry – Preliminary Research Question

Title:

Which contextual, behavioural & social factors are responsible for making an innovation technically efficient or inefficient?

Overview:

While reading Abrahamson’s article ‘Managerial fads and fashions: The diffusion and rejection of innovations’, a doubt came to our minds: while he uses the “Outside Influences” and “Imitation-Focus” dimensions to explain various perspectives on the diffusion and rejection of technically efficient innovations, the underlying assumption that an innovation is indeed “technically efficient” puts the whole research on shaky ground.

We would like to explore this conundrum of what makes an innovation technically efficient and beneficial for an organisation or a group of organisations. To answer this question we would like to dig deeper into various contextual, behavioural & social factors that might determine the technical efficiency of an innovation.

We would also like to explore if Pfeffer & Sutton’s ‘Evidence-based management’ can be used to formulate a generic framework to measure the technical efficiency of an innovation. We believe that such a measure can help an organisation make an informed decision in adopting technically efficient innovations or rejecting technically inefficient innovations.

Methodology:

We would work on the premise that innovations are generic in nature and share similar characteristics across any industry, geographical or social context. The methodology would be to explore which contextual, behavioural and social factors are responsible for making an innovation technically efficient. We propose to explore journal articles, news articles and books on innovation and evidence-based management to gain better insight into contemporary research in these areas. Any academic journal exploring a measure of the technical efficiency of an innovation will be of particular interest to us.

Research Design:

The research design into this topic would be broadly based on the following points

1) To list the different contextual, behavioural & social factors impacting the technical efficiency of an innovation within organisations or group of organisations

2) To explore the degree of impact of each of these different contextual, behavioural & social factors on the technical efficiency of an innovation within organisations or groups of organisations

3) To devise a model based on evidence management to determine “technical efficiency quotient” for any innovation which can then help an organisation make an informed decision on either adopting technically efficient innovations or rejecting technically inefficient innovations.

References:

1. Abrahamson, E. (1991) ‘Managerial fads and fashions: The diffusion and rejection of innovations’, Academy of Management Review, 16(3): 586-612.

2. Pfeffer, J. and Sutton, R. I. (2006) ‘Evidence-based management’, Harvard Business Review, 84(1): 62–74.

3. Drucker, P. F., Innovation and Entrepreneurship.

4. Tucker, R. B., Driving Growth Through Innovation.

5. Kelley, T., The Art of Innovation: Lessons in Creativity from IDEO, America’s Leading Design Firm.

6. Christensen, C. M., The Innovator’s Dilemma: The Revolutionary Book that Will Change the Way You Do Business.

7. Hamel, G., Leading the Revolution.


MIS40910 – Skills for Business Enquiry – Research Questions

1.  Abrahamson, E. (1991) ‘Managerial fads and fashions: The diffusion and rejection of innovations’, Academy of Management Review, 16(3): 586-612.

1.Abrahamson1991-Managerial Fads And Fashion

2(a).  Pfeffer, J. and Sutton, R. I. (2006) ‘Evidence-based management’, Harvard Business Review, 84(1): 62–74.

2a.Pfeffer2006-Evidence Based Management

Or

2(b).  Pfeffer, J., & Sutton, R. I. (1999). Knowing “what” to do is not enough: Turning knowledge into action. California Management Review, 42(1), 83-108.

2b.Pfeffer1999-Knowing What To Do

MIS40910 – Group ‘D’ questions/comments

Philip Burtenshaw, Conor Gleeson, Tarun Rattan, Thomas Joseph, Fiona Walsh

Q:1 – Managerial Fads and Fashions: The Diffusion and Rejection of Innovations:

Abrahamson argues in his article that, even though there is no quick answer to the diffusion of inefficient innovations and the rejection of efficient innovations, such a process will help organisations find and adopt new efficient innovations. Is this tendency good for all types of industries?

The differences outlined in the No Silver Bullet paper from this week’s Managing System Development readings are stark in comparison. Can we say that, while the scope of this paper is broad, it should not be applied to IT innovations? Companies that have shown themselves to have a pro-innovation culture have benefited greatly from this approach in the IT sector.

As Abrahamson himself concedes “The cost of adopting and rejecting multiple fads or fashions in order to find a technically efficient innovation may be much lower than the returns from using this innovation.” This would seem to be especially applicable to the IT sector.

Q:2 – Knowing “What” to Do is Not Enough:  Turning Knowledge into Action

How can we have so many books, articles, consultants, researchers and training courses, yet not make a dent in actual management practice? When one considers that working knowledge often falls outside knowledge management frameworks, and yet 70% of workplace knowledge is informal, there is a clear and logical reason why so much theory gets left as just that: theory.

The article makes a valid observation that firms have not done a good job of building knowledge into products and services, or of developing new products and services out of this knowledge. This is, after all, the age of the knowledge economy; or was that just a catchphrase amongst CEOs who seem to have trouble implementing what they say?

Companies overestimate the importance of the tangible, specific, programmatic aspects of what competitors, for instance, do, and underestimate the importance of the underlying philosophy that guides what they do and why they do it. Honda is a high-profile proponent of this, such that it permeates everything it does, including partner selection. The value of philosophy: the intangible glue in the management and utilisation of knowledge.


MIS40910 – Critique – Managerial fads & fashions, Evidence based management

This is a critique on the following articles on management strategies by Abrahamson and Pfeffer & Sutton, it was written as part of week 4 readings in Skills for Business Enquiry course

1.  Abrahamson, E. (1991) ‘Managerial fads and fashions: The diffusion and rejection of innovations’, Academy of Management Review, 16(3): 586-612.

1.Abrahamson1991-Managerial Fads And Fashion

2(a).  Pfeffer, J. and Sutton, R. I. (2006) ‘Evidence-based management’, Harvard Business Review, 84(1): 62–74.

2a.Pfeffer2006-Evidence Based Management

Or

2(b).  Pfeffer, J., & Sutton, R. I. (1999). Knowing “what” to do is not enough: Turning knowledge into action. California Management Review, 42(1), 83-108.

2b.Pfeffer1999-Knowing What To Do

Group D – Tarun Rattan, Philip Burtenshaw, Conor Gleeson, Thomas Joseph, Fiona Walsh

Abrahamson as well as Pfeffer & Sutton are trying to unravel decision-making processes within individual organisations and also within groups of organisations. They are questioning:

1) When & by what processes & contextual factors are technically inefficient innovations diffused or efficient innovations rejected within organisations or a group of organisations?

2) Is evidence based decision making better than experience based models while determining technical efficiency of an innovation or context?

Abrahamson approaches these questions from the academic and researcher standpoint, whereas Pfeffer & Sutton represent the industry practitioner viewpoint. Both are aware of pro-innovation or pro-evidence biases and strive to remain unbiased by suggesting perspectives countering these pro biases. While both agree that efficient choice is the dominant perspective in business decision making, they concur that it alone is not sufficient to explain either innovation diffusion processes or evidence acceptance decisions. To counter pro biases, Abrahamson uses the “Outside Influences” and “Imitation-Focus” dimensions to explain various perspectives on the diffusion and rejection of choices. Though Pfeffer & Sutton are not so methodical, they use similar arguments while promoting evidence-based decision making. Both agree that research into the decision-making process should explore other perspectives which nullify the pro biases.

Abrahamson argues that the assumption that “rational adopters make independent & technically efficient choices” in fact perpetuates pro-innovation biases and is also incorrect in most cases. Similarly, Pfeffer & Sutton contend that the assumption that medical practitioners use evidence as an efficient-choice method to “guide medical decisions” is not accurate. In fact the figures are more damning: “the recent studies show that only about 15% of their decisions are evidence based”.

Abrahamson’s research is at a more macro level in that his focus is on the organisation or group of organisations. Pfeffer & Sutton, on the other hand, approach these questions at a more micro level and focus on individuals within these organisations. If we superimpose Pfeffer & Sutton’s evidence-based management methods onto the “Outside Influences” and “Imitation-Focus” dimensions used by Abrahamson, the matrix looks like the one below:

 

Outside Influence Dimension: Organizations within a group determine the diffusion and rejection within this group

    • Imitation processes do not impel the diffusion or rejection:
      Abrahamson: Efficient Choice Perspective
      Pfeffer & Sutton: Evidence-Based Decision Making

    • Imitation processes impel the diffusion or rejection:
      Abrahamson: Fad Perspective
      Pfeffer & Sutton: Hype & Marketing / Ideology

Outside Influence Dimension: Organizations outside a group determine the diffusion and rejection within this group

    • Imitation processes do not impel the diffusion or rejection:
      Abrahamson: Forced Selection Perspective
      Pfeffer & Sutton: Forced Ranking

    • Imitation processes impel the diffusion or rejection:
      Abrahamson: Fashion Perspective
      Pfeffer & Sutton: Uncritical Emulation / Casual Benchmarking / Dogma & Belief

The various propositions Abrahamson raises around these perspectives can again be countered by the numerous contextual factors affecting organisations and the individuals within them. For example, the efficient-choice proposition that “performance gaps will prompt the diffusion of innovations in efficient organisations” is true only in certain cases: in most, the organisational culture and the personality traits of the leader override the acceptance of such innovations. Similarly, the forced-selection proposition that “political pressures are deemed to trigger diffusion or rejection of innovations” does not take into account that the political class is generally business- and technology-illiterate; forced selection is often nothing but the result of lobbying by influential organisations within an industry segment. The same argument holds for the propositions around the fashion perspective. Similarly, the evidence-based methods proposed by Pfeffer & Sutton discount human instinct, which has been key to many a great thing humans have achieved. Emerson’s quote “Trust your instinct to the end, though you can render no reason.” still resonates for most of us.

The two approaches put forward by Abrahamson and Pfeffer & Sutton might look similar, but there is an underlying fallacy if one digs deeper. If we look at the employee stock option references in the two articles, Abrahamson uses stock options to explain how the diffusion of a technically efficient innovation does take place among a group of organisations, while Pfeffer & Sutton use them to highlight that the evidence suggests employee stock options “had no consistent effects on financial performance”. So while Abrahamson treats employee stock options as a technically efficient innovation, they do not pass the test of evidence-based management. This dichotomy leads to a bigger question, bigger at least than what Abrahamson is trying to research.

Perhaps more important is not “how” technically efficient innovations are diffused or inefficient ones rejected, but rather “what” makes an innovation technically efficient or inefficient. Which contextual, behavioural and social factors are responsible? And if it could be proven that an evidence-based method is a useful measure of the technical efficiency of such innovations, it would have much more utility.


MIS40670 – Silver Bullet, Chaos Model, Scrum Methodology

The week three readings in Managing System Development course were articles on different software engineering methodologies.

Brooks Jr., F. P. (1987) ‘No Silver Bullet: Essence and Accidents of Software Engineering’, Computer, 20(4): 10-19

Brooks_NoSilverBullet_1987

An interesting analogy is drawn between software projects and werewolves, and from personal experience I can vouch that projects can turn overnight into monsters that are difficult to control and manage. A software project, like a werewolf, can destroy individuals, teams, even organisations. Once things start going bad, heads start rolling, beginning with Project Managers, Configuration Managers and development leads, in that order.

But to expect a Silver Bullet to kill the uncertainty and complexity is expecting the moon. As Brooks rightly mentions, this ambiguity and density is inherent in the software engineering realm: “No other technology since civilization began has seen six orders of magnitude in performance-price gain in 30 years.”

Brooks turns philosophical when he invokes Aristotle to divide the difficulties inherent in the nature of software into essence and the difficulties attending its production into accidents. But I don’t believe philosophy is going to help us here. The difficulties inherent in the nature of software, as well as those arising during its production, are varied in nature, and collapsing them into any single philosophical term over-simplifies the problem. In software development we are trying to describe a bit of our own intelligence, personality traits and inter-human dynamics, which is not easy.

I liked Brooks’s division of the essential difficulties in the nature of software into

1) Complexity

2) Conformity

3) Changeability

4) Invisibility

And I agree with him that past breakthroughs have partially solved some glaring issues. For example, High-Level Languages, Time Sharing and Unified Programming Environments have to an extent given us perhaps a Silver Pellet, if not a Bullet per se.

As the article is quite old, the hopes on the horizon at that time have since come and gone:

1) Ada

2) Object-Oriented Languages

3) Artificial Intelligence

4) Expert Systems

5) Automatic Programming

6) Graphical programming

Most of these programming constructs have since been used extensively and, in retrospect, most have helped in the advancement of software engineering.

The most effective statement in the article comes in the Incremental Development section: grow, don’t build, software. I agree that the industry has evolved from writing software to building software, and the next step in the evolution should be to grow software.


Racoon, L. B. A. (1995) The chaos model and the chaos cycle. ACM SIGSOFT Software Engineering Notes, 20, 12

Racoon_TheChaosModelAndChaosLifecycle_1995

This is the most insightful of the articles I’ve read on software engineering, and I fully concur that the developer’s viewpoint has been overlooked in traditional models.

Software engineering is complex in nature, and the Chaos model, “which combines a simple, people oriented, problem-solving loop with fractals”, is a good fit for it. Racoon’s analysis of how the Chaos model “improves our understanding of the contribution and limitations of users, developers, and technologies” is enlightening. And it does make sense when the Chaos model is used to “define the phases of the life cycle in terms of fractals and show that all phases occur throughout the life cycle”.

Now, when I look back and analyse poorly executed projects using the linear problem-solving loop, the failings can be traced to the Problem Definition phase. Most of the time, all resources and time are spent solving the wrong problems. Accepting that project life cycles are not linear but fractal in nature can definitely allow us to approach problems from a different viewpoint. Traditional software techniques do not accept that all levels of software development have the same value to the project as a whole. Part of the problem is that a uniform focus, instead of top management concentrating on a few large issues and working down from there, results in skewed projects.

I totally agree that software development is a human activity and that enough checkpoints should be put in place to ensure that

1) Developers don’t mistakenly identify the problem or ignore the problem definition

2) Wrong technology is not used or right technology is not misused

3) Miscommunication is avoided in the project team

Fractal phase definitions can be used as an alternative method during project management to get a different viewpoint on the phases:

1) Requirement Analysis

2) Design

3) Implementation

4) Maintenance

5) Prototyping

But it should be used with a caveat: it is a long-winded method and needs a strong PMO to ensure that the project does not get trapped in a fractal mess.
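The fractal idea above can be made concrete with a toy sketch. This is my own illustration, not anything from Racoon’s paper: every work item, at every scale, passes through the same phases, and a phase can spawn a smaller work item that repeats the whole cycle.

```python
# Toy model of a fractal life cycle: the same phases recur at every
# level, and sub-items repeat the entire cycle (names are my own).

PHASES = ["requirements", "design", "implementation", "maintenance", "prototyping"]

def develop(item, depth=0, max_depth=2):
    """Walk the phases; at 'design', spawn one fractal sub-item
    (purely for illustration) until max_depth is reached."""
    log = []
    for phase in PHASES:
        log.append(f"{'  ' * depth}{item}: {phase}")
        if depth < max_depth and phase == "design":
            log.extend(develop(f"{item}/sub", depth + 1, max_depth))
    return log

for line in develop("project", max_depth=1):
    print(line)
```

The point of the sketch is only that the phase list is not traversed once, linearly: it reappears, indented, inside itself, which is the caveat about needing a strong PMO in a nutshell.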


Laurie Williams, Gabe Brown, Adam Meltzer, Nachiappan Nagappan, (2010) Scrum + Engineering Practices: Experiences of Three Microsoft Teams

WilliamsEtAl_SCRUMExperience_2010

It is a good read on the Scrum methodology, an agile software development process that works as a project-management wrapper around existing engineering practices to develop software iteratively and incrementally.

The authors summarise their findings from three different Microsoft teams that implemented the Scrum methodology to execute projects. The teams varied in size and nature, and the projects also differed in a number of variables. For the uninitiated, Scrum is composed of the following project management practices:

• The Product Owner creates the requirements, prioritizes them, and documents them in the Product Backlog during Release Planning. In Scrum, requirements are called features.

• Scrum teams work in short iterations. When Scrum was first defined [16, 29], iterations were 30-days long. More recently Scrum teams often use even shorter iterations, such as two-week iterations. In Scrum, the current iteration is called the Sprint.

• A Sprint Planning Meeting is held with the development team, testers, management, the project manager, and the Product Owner. In the Sprint Planning Meeting, this group chooses which features (which are most often user-visible, user valued, and able to be implemented within one iteration) from the product backlog are to be included in the next iteration, driven by highest business value and risk and the capacity of the team.

• Once the Sprint begins, features cannot be added to the Sprint.

• Short, 10-15 minute Daily Scrum meetings are held. While others (such as managers) may attend these meetings, only the developers and testers and the Scrum Master (the name given to the project manager in Scrum) can speak. Each team member answers the following questions:

o What have you done since the last Daily Scrum?
o What will you do between now and the next Daily Scrum?
o What is getting in your way of doing work?

• At the end of a Sprint, a Sprint Review takes place to review progress and to demonstrate completed features to the Product Owner, management, users, and the team members.

• After the Sprint Review, the team conducts a Retrospective Meeting. In the retrospective Meeting, the team discusses what went well in the last Sprint and how they might improve their processes for the next Sprint.
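The Sprint Planning rule above — take the highest-business-value features from the Product Backlog that fit within the team’s capacity — can be sketched as a few lines of code. This is my own illustration (the feature names and numbers are invented), not anything from the paper:

```python
# Sketch of Sprint Planning: greedily pick the highest-business-value
# backlog features that fit the team's capacity for the Sprint.

def plan_sprint(backlog, capacity_hours):
    """backlog: list of (feature, business_value, estimate_hours)."""
    chosen, remaining = [], capacity_hours
    # Highest business value first, as in the Sprint Planning Meeting.
    for feature, value, estimate in sorted(backlog, key=lambda f: -f[1]):
        if estimate <= remaining:
            chosen.append(feature)
            remaining -= estimate
    return chosen

backlog = [("login", 8, 40), ("search", 9, 60), ("export", 5, 30)]
print(plan_sprint(backlog, capacity_hours=100))  # → ['search', 'login']
```

Real teams also weigh risk and dependencies when choosing, so a pure value-greedy pick like this is only a first approximation of the meeting.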

The Scrum-based process used by the teams was as follows:

1) Basic Scrum: All teams began with a four-week iteration of the Scrum process as described above. The teams performed “just-in-time” design of features before or during the iteration in which a feature was to be developed.

2) Planning Poker: The teams used Planning Poker to estimate the person hours required to complete functionality within an iteration.

3) Continuous Integration: The teams utilised the continuous integration practice, where members of a team integrate their work into the main build frequently.

4) Unit Test-Driven Development: With this practice, a software engineer cycles on a minute-by-minute basis between writing failing automated unit tests and writing implementation code to pass those tests.

5) Quality Gates: The Microsoft teams call their done criteria “quality gates”. The quality gates established for these teams included the following:

• All unit tests must pass
• Unit test code coverage must be at least 80% (for all teams except Team B)
• All public methods must have documentation
• All non-unit test code must not have any static analysis errors or warnings (see item 9 below)
• Build must compile with no errors or warnings on the highest level

6) Source Control: Source control is management of changes to documents, programs, and other information stored as computer files through a source control system. The Microsoft teams used the Visual Studio Team Foundation Server Version Control tool.

7) Code Coverage: Engineers were required to manage their automated unit test coverage and to monitor this coverage with each build. Two of the teams (A and C) followed the Microsoft Engineering Excellence recommendation of having 80% unit test coverage.

8) Peer Review: In each iteration, the teams conducted design reviews of architecture diagrams and of code when adding new features.

9) Static Analysis Tool: The use of static analysis tools can identify common coding problems [17] or unusual code [1] early in the development process [9]. The teams utilized the FxCop static analysis tool built into Visual Studio.

10) XML Documentation: The teams used .NET-style inline XML documentation on all public classes, properties, and methods. As a result, the code was self-documenting.
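The “quality gates” in item 5 amount to a boolean check on each build. As a small illustration — the function and field names are my own, and the teams’ real gates ran inside Visual Studio tooling in C#, not Python — the logic looks roughly like this:

```python
# Toy "done criteria" checker mirroring the quality gates in item 5.
# Field names and structure are invented for illustration.

def passes_quality_gates(build, team="A"):
    """build: dict with keys 'tests_passed', 'coverage',
    'undocumented_public_methods', 'analysis_warnings', 'build_warnings'."""
    gates = [
        build["tests_passed"],                       # all unit tests pass
        team == "B" or build["coverage"] >= 0.80,    # 80% coverage (waived for Team B)
        build["undocumented_public_methods"] == 0,   # public methods documented
        build["analysis_warnings"] == 0,             # static analysis clean
        build["build_warnings"] == 0,                # compiles clean at highest level
    ]
    return all(gates)

good = dict(tests_passed=True, coverage=0.85, analysis_warnings=0,
            build_warnings=0, undocumented_public_methods=0)
print(passes_quality_gates(good))                                  # True
print(passes_quality_gates({**good, "coverage": 0.5}, team="B"))   # True: gate waived
print(passes_quality_gates({**good, "coverage": 0.5}, team="A"))   # False
```

The value of expressing the gates this way is that “done” becomes a single yes/no answer computed the same way for every feature, rather than a judgement call per developer.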

The finding of the study was that the productivity of the teams temporarily dropped for three iterations as they transitioned to agile. The teams attributed this drop to their unfamiliarity with Scrum: they required a “gelling” period before they began delivering value under the new development process. From their fourth sprint on they experienced a significant improvement in productivity without an increase in defects. Teams transitioning to an agile software development process should plan for a similar temporary productivity decrease.

Teams that used Scrum together with sound engineering practices showed better quality, in terms of defect density, than similar non-Scrum teams, including data benchmarked across 40 projects from nine companies. These results indicate that Scrum combined with sound engineering practices has the potential to yield a higher-quality product. Team B, which followed Scrum but applied the engineering practices to a lesser degree than Teams A and C, had the highest defect density.
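Defect density, the quality measure used in the comparison above, is conventionally reported as defects per thousand lines of code (KLOC). A quick sketch of the arithmetic — all numbers below are made up for illustration and are not the paper’s data:

```python
# Defect density = defects per thousand lines of code (KLOC).
# The figures here are invented, purely to show the calculation.

def defect_density(defects, loc):
    """Return defects per KLOC for a codebase of `loc` lines."""
    return defects / (loc / 1000)

benchmark = 5.0                               # hypothetical industry defects/KLOC
team = defect_density(defects=40, loc=20000)  # hypothetical Scrum team
print(f"team: {team:.1f} defects/KLOC, "
      f"{(1 - team / benchmark):.0%} below benchmark")
# → team: 2.0 defects/KLOC, 60% below benchmark
```

Normalising by code size is what makes teams (and the 40 benchmarked projects) comparable at all; raw defect counts would simply favour smaller codebases.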


MIS40910 – Skills for Business Enquiry – Knowing, Believing and Acting

The following readings were discussed

1.  Brannick, T. & Coghlan, D. (2006). ‘To Know and to Do: Academics’ and Practitioners’ Approaches to Management Research’. Irish Journal of Management, 26(2), 1–22.

1.Brannick2006-To Know and To Do

2.  Huff, A.S. (2000). ‘Citigroup’s John Reed and Stanford’s James March on management research and practice’. Academy of Management Executive, 14(1), 52-64.

2.Huff2000-Citigroup March on Management

MIS40910 – Group ‘D’ questions/comments

Philip Burtenshaw, Conor Gleeson, Tarun Rattan, Thomas Joseph, Fiona Walsh

Question 1: Don’t you think that the relationship between research schools & business organisations is mutually conducive to each other? If the purpose of life as reiterated in both readings is primarily a “pursuit of knowledge” then isn’t it imperative to have research schools to collate that management knowledge. And taking the argument to the other side don’t you agree that knowledge in itself is not sufficient and that business organisations are required for the application of that knowledge so as to generate some material gains for the greater good of human kind?

In the article John Reed references a quote by James Atlas which forms the basis of this mutually beneficial aspect of a relevant business school in the eyes of a management practitioner. Atlas “portrays a movement toward viewing business schools as credentialing and contact-forming institutions”, which, as Reed puts it, “is not an unworthy vision; it is a vision with consequences for the development of ideas about management”.

Philip Burtenshaw, Conor Gleeson, Tarun Rattan, Thomas Joseph, Fiona Walsh

Question 2: Do you think that the issue at hand is that the dominant academic vocabulary is out of sync with that used in the real world, so that even good research on a topic is often seen as “not relevant, readable or reachable”? The content problem is better explained by the quote in the article which professes that “researchers should ensure that knowledge produced is relevant and transferred to practitioners in ways that enhance their capacity to use it”. But, on the contrary, care needs to be taken that any research which relies heavily on practitioner input does not get corrupted by the “contemporary enthusiasm for immediate relevance”.

So the right tone & balance between researcher & practitioner inputs should be a prime focus in collaborative research. The great debates are not always won by the side with the strongest arguments but by the side that transmits its message loud & clear.


MIS40670 – Enquiry methods

Allen gave the class a choice of different enquiry methods for a small research exercise. Elena & I picked Extreme User Interviews as our enquiry method for this exercise.

Extreme User Interviews is a very useful technique for highlighting the key issues of a design problem. In this technique, individuals are identified who are either extremely familiar or extremely unfamiliar with the product, and they are then asked to evaluate their experience of using it.

Elena & I decided to pick the smartphone as the product to research using this technique. We decided to interview two people each: people who either currently use smartphones or who have never used one before. We decided to conduct two interviews with each person, one preliminary and one in which we dug deeper into the product features.

The results are summarised below; we found the technique very useful for firming up the product design issues.

[Image: summary table of the Extreme User Interview results]
