Agile Business Analysis in Flow - The Work of the Agile Analyst (Part 2)


In Part 1 of "Business Analysis in Flow - The Work of the Agile Analyst," I talked about the new skills and attitudes business analysts need to bring to agile development. When your organization adopts this value-centered approach, you need to have, as I wrote, "a tolerance for ambiguity along with a concurrent drive for specificity and closure."

Now it's time to talk specifics. What exactly do BAs do in agile development? How will your activities differ from those of traditional development? Let's take a look at agile business analysis from the perspective of the activities that make up requirements development and management, comparing traditional with agile analysis.

Setting the stage: Requirements planning activities
To set the stage for requirements, the team strives to create a shared understanding of the product among all stakeholders.

Traditional Analysis: Attend project chartering sessions to define a vision, glossary, requirements risks, and product stakeholders.

Agile Analysis Adaptation:

  • Design, facilitate, or participate in product vision and roadmapping workshops.

  • Help your customer understand which roles and themes to best deliver in each product release.

  • Help your customer and team identify logical groupings of value-based requirements, and use these groupings to create a product roadmap showing incrementally delivered requirements over time. These requirements often take the form of minimally marketable features, stories, or epics (i.e., large stories that cross releases), use cases (high level only), events, or a combination.
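
To make these groupings concrete, here is a minimal Python sketch (the feature and release names are invented, not a prescribed format) of what a roadmap of incrementally delivered requirements might look like as data:

    from dataclasses import dataclass, field

    @dataclass
    class Feature:
        name: str
        kind: str  # e.g., "MMF", "story", "epic", "use case", or "event"

    @dataclass
    class Release:
        name: str
        theme: str
        features: list = field(default_factory=list)

    # A roadmap is an ordered list of releases, each delivering a logical,
    # value-based grouping of requirements over time.
    roadmap = [
        Release("Release 1", "Core ordering", [Feature("Place order", "MMF")]),
        Release("Release 2", "Self-service", [Feature("Track shipment", "epic")]),
    ]

    for release in roadmap:
        print(release.name, "-", release.theme, [f.name for f in release.features])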

Traditional Analysis: Review and modify a list of tasks, time, and delivery dates in a work breakdown structure plan developed by the project manager.

Agile Analysis Adaptation:

  • Design and facilitate (or participate in) release and iteration planning workshops.

  • Regularly prune the product backlog by collaborating with team members to generate a relative size estimate for backlog items.

  • Conduct analysis "spikes" (short, timeboxed research stories) to elaborate on backlog items that need more analysis, researching requirements and their priorities.
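
As a rough illustration of pruning, here is a minimal Python sketch, with hypothetical stories and a planning-poker-style scale, in which items that are oversized, or that nobody can size, are flagged for an analysis spike:

    # Relative sizes come from a planning-poker-style scale; None means the
    # team could not size the item with what it knows today.
    backlog = [
        {"story": "Customer resets password", "size": 3},
        {"story": "Nightly settlement export", "size": 21},
        {"story": "Fraud-score new orders", "size": None},
    ]

    MAX_COMMITTABLE_SIZE = 13  # assumption: anything larger needs more analysis

    for item in backlog:
        item["needs_spike"] = item["size"] is None or item["size"] > MAX_COMMITTABLE_SIZE

    print([item["story"] for item in backlog if item["needs_spike"]])
    # -> ['Nightly settlement export', 'Fraud-score new orders']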

Traditional Analysis: Generate a SWAG ("S#*&-Wild-Ass-Guess") estimate of time, effort, or cost for each requirement in the specification or user requirements document.

Agile Analysis Adaptation:

  • During iteration planning, together with the rest of the team, write down the needed tasks to deliver each user story, and estimate how many hours they will take.

  • Share actual time usage with your team so that the team can track progress via visual graphs ("information radiators") such as burndown, burn-up, or cumulative flow diagrams.
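
To show the arithmetic behind such a graph, here is a minimal sketch with invented numbers: a burndown compares the task-hours actually remaining each day against an ideal straight line from the committed total down to zero.

    committed_hours = 120          # task-hours committed at iteration planning
    iteration_days = 10
    actual_remaining = [120, 112, 101, 95, 80, 72, 61, 47, 30, 12]  # reported daily

    for day, remaining in enumerate(actual_remaining, start=1):
        ideal = committed_hours * (1 - day / iteration_days)
        status = "behind" if remaining > ideal else "on track or ahead"
        print(f"day {day:2d}: ideal {ideal:6.1f}h, actual {remaining:4d}h ({status})")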

Requirements elicitation activities
During requirements elicitation, the team identifies the sources of requirements and then discovers, derives, evokes, and elicits requirements from those sources.

Traditional Analysis: Plan how to elicit requirements using a variety of techniques.

Agile Analysis Adaptation:

  • Use face-to-face, collaborative elicitation techniques (workshops, prototypes) as much as possible, avoiding techniques (interviews, surveys, documentation study) that involve long turnaround times or after-the-fact interpretation.

Traditional Analysis: Plan, design, and facilitate requirements workshops over weeks (or months).

Agile Analysis Adaptation:

  • Plan and facilitate short, informal requirements modeling sessions throughout each iteration.

  • Plan and facilitate product vision and roadmapping workshops and release planning workshops.

  • Teach your customer about supplemental analysis models so that they can question, participate, critique, review, and approve them (this should be done in traditional projects as well).

  • Sketch out prototypes and identify user acceptance test data in real time, while a story is being designed, coded, and prepared for testing.

Requirements analysis activities
During analysis, the team seeks to understand and define requirements so that stakeholders can prioritize their needs and decide which requirements to build.

Traditional Analysis: Define the scope up front by using a set of requirements models as the basis for detailed modeling.

Agile Analysis Adaptation:

  • Help your customer define the vision and the scope up front, at a high level only.

  • Help your customer and team create lightweight models during product roadmapping and release planning. These models help customers carve out a value-based release schedule that balances business priority with architectural dependencies.

  • Collaborate with architects and developers on design to ensure that requirements include the technical aspects of the product.

Traditional Analysis: Develop analysis models for the entire set of requirements that are in scope.

Agile Analysis Adaptation:

  • Help your customer and team develop stories (user stories as well as stories that incorporate or separately define quality attributes).

  • Help your customer and team develop and extend analysis models that support understanding backlog items selected for delivery in an iteration, if and when needed.

Traditional Analysis: Ask the customer to prioritize requirements using a ranking scheme. If the customer is not available, do the ranking yourself.

Agile Analysis Adaptation:

  • Help your customer assign a business value and a ranking to each backlog item.

  • Help your customer understand requirements dependencies that might warrant adjustments to backlog rankings.

  • Question rankings based on the goals or themes for upcoming releases or iterations.

  • Assist your customer and team to right-size high-priority backlog items that are too big to deliver in combination with other high-priority backlog items in the next iteration.
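
The mechanics can be sketched minimally in Python (the items, values, and sizes are hypothetical): rank by business value per unit of size, then pull any prerequisite ahead of the item that depends on it.

    backlog = {
        "pay-invoice":   {"value": 90, "size": 5, "depends_on": ["login"]},
        "login":         {"value": 20, "size": 3, "depends_on": []},
        "export-report": {"value": 40, "size": 8, "depends_on": []},
    }

    # First pass: highest value-per-size first.
    ranked = sorted(backlog, key=lambda k: backlog[k]["value"] / backlog[k]["size"],
                    reverse=True)

    # Second pass: a dependency may warrant adjusting the ranking.
    for name in list(ranked):
        for dep in backlog[name]["depends_on"]:
            if ranked.index(dep) > ranked.index(name):
                ranked.remove(dep)
                ranked.insert(ranked.index(name), dep)

    print(ranked)  # -> ['login', 'pay-invoice', 'export-report']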

Requirements specification activities
Specification involves refining and organizing requirements into documentation (typically a software requirements specification). This includes the entire set of functional and nonfunctional requirements to be transformed into design, code, and tests.

Traditional Analysis: Write a requirements specification.

Agile Analysis Adaptation:

  • Help your customer and team write stories (or, if you're acting as proxy customer, write them yourself).

  • Create doneness criteria for stories so that each becomes a well-defined, small piece of valuable software for delivery in the next (or current) iteration.

  • Create user acceptance tests or sample input and output data for each story.

  • Determine the form and format of documentation that is necessary and sufficient for requirements-related work-in-progress, handover, or product documentation.
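
As an illustration only (the checklist items here are hypothetical), doneness can be expressed as a list of conditions that must all be true before a story counts as delivered; a minimal Python sketch:

    story = "As a clerk, I can void a duplicate invoice"

    doneness = {
        "user acceptance tests written and passing": True,
        "sample input/output data agreed with the customer": True,
        "code reviewed and integrated": True,
        "work-in-progress documentation updated": False,
    }

    if all(doneness.values()):
        print(f"{story!r} is done")
    else:
        unmet = [criterion for criterion, met in doneness.items() if not met]
        print(f"{story!r} is not done; blocked on: {unmet}")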

Requirements validation activities
During validation, the team assesses whether the product satisfies user needs and conforms to the requirements.

Traditional Analysis: Set up and run meetings to review and sign off on requirements documents, and help customers run acceptance tests after the entire product's code has been created.

Agile Analysis Adaptation:

  • Meet with the customer and some team members to prune the backlog (once or twice each week).

  • Participate in iteration demonstrations and listen to stakeholder feedback on the delivered requirements to learn the customer's real needs and determine how to adapt the evolving product.

  • Plan and facilitate, or participate in, iteration retrospectives, and learn from the customer how you can help deliver value faster.

Traditional Analysis: Communicate with developers or testers (or respond to their e-mails and calls) to explain information in the requirements document; attend or run formal requirements review meetings.

Agile Analysis Adaptation:

  • Conduct just-in-time analysis modeling with customers and your team to validate the business value of each story and to ensure it will be delivered to the customer's satisfaction.

  • Participate in daily stand-ups.

  • Sit with developers and testers as they are building code and tests to explain the story and its doneness criteria.

Traditional Analysis: Help testers create user acceptance tests, or run those tests, after the entire product has been designed, coded, and unit/system/integration tested.

Agile Analysis Adaptation:

  • Define input data and expected results or specific user acceptance tests as part of defining doneness for each user story, iteration by iteration.
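
As a minimal sketch (the "validate customer number" story and its data are invented), input data and expected results can be captured as a table-driven acceptance test:

    # Each row pairs an input with the result the customer expects.
    acceptance_cases = [
        ("C-10042", "valid"),
        ("C-99999", "unknown customer"),
        ("10042",   "malformed number"),
    ]

    KNOWN_CUSTOMERS = {"C-10042"}  # hypothetical reference data

    def validate(number):
        if not number.startswith("C-"):
            return "malformed number"
        return "valid" if number in KNOWN_CUSTOMERS else "unknown customer"

    for given, expected in acceptance_cases:
        actual = validate(given)
        assert actual == expected, (given, expected, actual)
    print("all acceptance cases pass")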

Requirements management activities
Requirements management involves monitoring the status of requirements and controlling changes to the requirements baseline ("a point-in-time view of the requirements that have been reviewed and agreed upon to serve as the basis for further development," Gottesdiener 2005).

Traditional Analysis: Establish the requirements baseline, document change control processes, and generate requirements traceability matrices.

Agile Analysis Adaptation:

  • Help the customer and team establish a product backlog and define the smallest necessary requirements attributes for each backlog item.

  • Help the customer and team define "just enough" requirements tracing needed to satisfy external regulatory body expectations.

  • Help the team determine simple, meaningful requirements mapping and organizing (features to stories, events to stories, etc.).

  • Define simple, unobtrusive ways to trace stories, with the aim of capturing metrics that will be useful for reuse and promoting development efficiencies.
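
A minimal sketch (the identifiers are invented) of what "just enough" tracing can amount to: a plain mapping from features to the stories that realize them, enough to answer an auditor's "which stories implemented this?" without a heavyweight matrix.

    trace = {
        "FEAT-7 (invoice approval)": ["ST-31", "ST-32", "ST-40"],
        "FEAT-9 (audit logging)":    ["ST-33"],
    }

    def stories_for(feature):
        # Answers the audit question: which stories implemented this feature?
        return trace.get(feature, [])

    print(stories_for("FEAT-7 (invoice approval)"))  # -> ['ST-31', 'ST-32', 'ST-40']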

Traditional Analysis: Attend or schedule change control meetings.

Agile Analysis Adaptation:

  • Help the customer and team prune the product backlog continually (reprioritize items, break down stories, assign rankings, estimate size, and explore requirements dependencies that will impact architecture and therefore release planning).

  • Help the customer maintain the product backlog items (on story cards on a wall, in a spreadsheet, or using an industrial-strength agile requirements management tool), or do this on behalf of the customer.

Learning: The heart of agile success
A mantra for agile teams is "inspect and adapt." This means regularly checking on the delivered product and the processes used. Continuous improvement (called "kaizen" in lean approaches) is essential to agile success. How do you inspect and adapt your business analysis work to learn and develop?

Traditional Analysis:

  • Participate in milestone or project "lessons learned" sessions to find out what went wrong, what went right, and who is responsible for the problems. The project manager fills out the lessons-learned template and writes the closeout document.

  • Sit with your manager once or twice a year for a performance review, and get feedback on your performance weeks or months later. Sometimes that feedback includes second-hand comments from your customer and team.

Agile Analysis Adaptation:

  • Use acceptance tests, examples, sketches, simple drawings, and face-to-face communication to get feedback on your understanding of requirements.

  • Participate in daily stand-up status meetings to hear the impact you are having on other people's ability to deliver.

  • On any given day, as an item you committed to deliver is deemed done, show it to the customer to get feedback on it and confirm that the conditions of satisfaction have been met.

  • Design and facilitate, or participate in, iteration and release retrospectives (every two or three weeks, depending on your iteration timebox) to learn what works, learn what to adapt, and collaboratively agree on one or two things to do differently in the next iteration or release. The goal is to learn, adapt, get better, and experience joy in your work.

The new world of agile analysis
So there you have it - a bird's-eye view of how business analysts operate and add value in agile projects. As you can see, this approach calls on you to stretch your analysis muscles.

As an agile analyst, you are deeply committed to delivering business value and building the right product as soon as possible. As a member of an agile team, you are less concerned with roles and job boundaries, and more concerned with delivering as a team.

You experience the rhythm of successive elaboration and product delivery. You thrive on feedback and small, continual improvements. What's more, you have an intense need to self-reflect, communicate transparently, improve your skills and abilities, and serve your team and customer. You thrive on the energy and joy of being in rhythm with an agile team.

References
Gottesdiener, Ellen. The Software Requirements Memory Jogger: A Pocket Guide to Help Software and Business Teams Develop and Manage Requirements. GOAL/QPC, 2005.

Thanks!
The author would like to thank Phil Abernathy, Susan Block, Mary Gorman, Kamal Singh, Norman Stang, and Stephanie Weiss for their helpful review and feedback on a draft of this article.

Author: Ellen Gottesdiener, Principal Consultant, EBG Consulting, helps you get the right requirements so your projects start smart and deliver the right product at the right time. Ellen's company provides high-value training, facilitation, and consulting services to agile and traditional teams. An agile coach and trainer with a passion for agile requirements, she works with large, complex products and helps teams elicit just enough requirements to achieve iteration and product goals.

Ellen's book Requirements by Collaboration: Workshops for Defining Needs describes how to use multiple models to elicit requirements in collaborative workshops. Her most recent book, The Software Requirements Memory Jogger, is the "go-to" industry guide for requirements good practices. In addition to providing training and consulting services and coaching agile teams, Ellen speaks at and advises industry conferences, writes articles, and serves on the Expert Review Board of the International Institute of Business Analysis (IIBA) Business Analysis Body of Knowledge™ (BABOK™).

You can subscribe to EBG Consulting's free monthly eNewsletter, "Success with Requirements," which offers practical guidance and requirements-related news. When you sign up, you'll receive a free article on essentials for scoping your requirements. You can follow Ellen on Twitter or contact her via email ([email protected]).

Copyright © 2009 EBG Consulting, Inc.



COMMENTS

Cara NZ posted on Sunday, August 9, 2009 8:51 PM
Thought this article was great... at work we've gone through the theory of Agile, however it's what it means from a practical day to day, on the floor approach that a lot of our crew are trying to get their heads around. Have forwarded with thanks!
Tony Markos posted on Monday, August 10, 2009 10:52 AM
Ellen:

The whole BA community really needs to get back to basics. While what you mention needs to be done, little of it is analysis. Go to any dictionary and look up the definition of analysis. Chances are that the first listed definition will state that analysis is about properly partitioning (i.e., breaking up) the whole into parts. Just as important, analysis is about determining how the parts interrelate.

Neither use cases nor user stories can be used to properly partition the business domain under question. And neither are any good at determining the essential interfaces between the parts. (I won't even talk about as-is vs to-be here.)

There is no divine intuition; somebody has to do the work that I mention. It can be done formally (i.e., the fast way) or informally (i.e., the slow way). From what I see, Agile is based on the assumption that someone, prior to project kick-off, has done this "real" analysis informally - probably, for larger-scale efforts, over many years of work experience. (I won't talk about the fallacy of this assumption here.)

What I see business analysts doing in the Agile world that you describe is rough-cut design, or design and testing coordination activities. Again, I am not disputing that what you mention, Ellen, needs to be done - just that it is not real business analysis.

Tony
Leslie posted on Tuesday, August 11, 2009 7:16 PM
I like this article and I do agree with Tony's sentiments about 'what is analysis', but Ellen could argue that it is captured by this statement:

- Develop analysis models for the entire set of requirements that are in scope.
----------------------------------------------------------------------------------------------
* Help your customer and team develop stories (user stories as well as stories that incorporate or separately define quality attributes).
* Help your customer and team develop and extend analysis models that support understanding backlog items selected for delivery in an iteration-if and when needed.

- Is there any reason why user stories cannot be captured with use cases?
- By 'extend analysis models', I can interpret this to mean that there are class, sequence, and state diagrams that support the user stories, and that each time a user story is added to the product backlog, these are updated.

Les.
zarfman posted on Wednesday, August 12, 2009 8:18 PM
My work background is in the areas of finance and accounting.

Speaking as an accountant, I don't understand why the iterative process is helpful. If I want an accounts payable system, I don't want part of it; I want the complete system. Anything else is useless. If I already have an accounts payable system, I'll tell you what I want to stay the same and what I want added.

Assign a business value: if I'm the chief financial officer and I want you to do something, that should be enough. If I tell you I want a customer number validated in three seconds and you tell me it can't be done, I'll try and find a consultant to help you meet that requirement. It can even come out of my budget.

Assist your customer and team to right-size: two buzzwords I hate are right-size and optimal. How does one actually prove something is right-sized or optimal? Easy to say, very hard to do.

It seems to me that agile analysis is just another attempt to solve the age-old problem of accountants and computer people not having a clue as to what the other is trying to say: the accountant struggling to explain what the computer should do, or the poor computer person trying to figure out what a debit or a credit is.

From my perspective, it would be to my benefit if some of my accountants understood business rules, interface design, middleware, and database design. Or if the BA had a minimum of 12 hours of accounting (not bookkeeping) at the university level.

Thoughts of an irritable old CFO
Ellen Gottesdiener posted on Saturday, August 22, 2009 10:22 PM
I agree that the BA community needs to get back to basics. And those basics are doing what's needed to deliver value. However it is accomplished, analysis serves the business or organization toward that one primary goal.

Tony, I agree it's about the work - the necessary analysis to deliver value as opportunistically as possible. And no, agile has no assumptions about analysis having been done. That is why engaging with product management or business management to define requirements-driven planning (product, release, and iteration) is essential to success on large agile projects.

When you experience working on an agile project, and I hope you do, you'll see that indeed analysis is necessary. It won't be the whole-hog, get-it-all-down-up-front analysis, though, as explained (I hope) in this two-part article.

Les asked about use cases. On an agile project, in my experience, use case specifications are excessive; you can use events or just use case names (and possibly brief descriptions) to arrive at user stories. You can also start with features and define use cases, or define events and features and then move into stories.

Writing a full spec? Nah! I do like use cases, though - well-named ones, and very brief ones. If your culture is familiar with use cases, list those well-named use cases and then write a brief description. From there, list variations in shorthand, or list them as stories or user acceptance tests. Your product owner then needs to decide which are most valuable to build. However, she needs to get good info from the team about relative size, dependencies, and risks; that's doing good analysis to help her make smart choices.

Extended analysis models might be whatever works for the team: maybe a class diagram or data model, maybe a sequence diagram, and I really like lightweight state diagrams - sure! But only if necessary to increase understanding on the part of the delivery team. The purpose of modeling is to communicate. In practice, most agile teams do model, but don't capture it in formal ways.

Do you save the models? Well, only if somebody needs them in some other form and if your customer/product owner is willing to pay for that work. Here's an example: with one agile team I coached, we kept what we called an 'organic' data model that grew as we built out stories from iteration to iteration. The product owner loved it: we began to discuss the stories all together by pointing to and adding attributes and relationships on the data model. Eventually, it was cleaned up a bit so other teams could 'grok' it (it got kind of messy), and it was used to help build the user manual, too.

Dear Irritable: I'm sorry for your soreness with the ideas of agile and iterative development, and the terms I used that made you unhappy. Agile is not new, so it's not another attempt to solve that age-old problem you speak about. It's just making a comeback with a bit more hype surrounding it.

It sounds like communication between "computer people" and "accountants" may have been awful for you. It's really a shame. Working on a collaborative team is wonderful; I regret you've not had that experience in your career thus far.

The idea with iterative development is to take the communication about your needs in bite-size pieces and not try to define those needs for months via long documents before building you something.

Rather, define a few pieces of what you need, and get something real (working software) in a few weeks. From there, you can better figure out what you really need. You get to test-drive your needs and continually improve your communication capabilities. In the end, you get the complete system sooner and with better quality. That's the idea.

Rarely is a fully defined specification ever delivered, let alone are all those requirements used.

Sorry Irritable, it sounds like the business analysts you worked with did not have sufficient domain knowledge. I wish you better experiences working with ‘computer people’ in the future.

Stay tuned for another article on agile analysis in the November issue of Modern Analyst, and thanks for reading!

Kind regards,
~ ellen

PS: Dear readers, please also read the comments I wrote in response to reader comments to Part I, at the bottom of this page: http://www.modernanalyst.com/Resources/Articles/tabid/115/ArticleType/ArticleView/ArticleID/913/Default.aspx
zarfman posted on Monday, August 24, 2009 10:05 PM
Ellen:

What we seem to have is a failure to communicate. I don't want a FEW working pieces of what I need. Nor do I need these FEW pieces to figure out what I REALLY need. I already know what I need. Accounting has been around for a very long time. We are bound by numerous ever-evolving rules. We have generally accepted accounting principles, i.e., GAAP. We have the Financial Accounting Standards Board, FASB. If we wish to provide our services to the general public, we must be licensed, e.g., as CPAs. Not only that, we have other CPAs called auditors looking over our shoulders to see if we are telling the truth and following the rules.

Given the foregoing I would not expect business analysts to have sufficient accounting domain knowledge.

I wonder if some form of cross-training would be helpful? Perhaps we need modern business systems analysis for accountants.

Further thoughts of an irritable old CFO.
Scott Sehlhorst posted on Saturday, August 29, 2009 3:43 PM
@ellengot - good article, thanks.

@everyone, this has drifted into an interesting discussion, almost a sidebar, about one of the tougher problems I've seen in helping companies discover ways to leverage an agile team's approach to developing software: finding not just a way to say "I can work with that", but a way to say "our business can take advantage of this."

@zarfman - I think you are raising some fantastic points, and I've been on the receiving end of some version of each of them more than once. I want to spend the most time on "I don't want part of what I need, I want _all_ of what I need." But let me address one of the other issues first.

You started with (paraphrasing) "I'm the CFO - if I want it, that should be enough."

Most of the time, a software dev team isn't working only for the CFO. For Ellen's audience here, most people will be supporting multiple departments. Even if originally allocated to one of your projects, they are absolutely at risk of getting an urgent request to build something that drives top-line growth, that the CMO cares about. So most IT people will have to deal with high-level prioritization trade-offs. I have worked with teams (usually called rogue IT departments) that have been hired by marketing VPs or directors, outside of IT, protected from these trade-offs. But that introduces another set of problems, as it is not globally optimal for the business to do that, however beneficial it may be to that group.

As a CFO, I would expect you to care about _having_ an accounts payable system. You probably should even care about that system being reliable, secure, and low-cost to use and maintain. If you're really hands-on, or don't have strong direct reports, you might also care that the system can do some interesting things, like issuing payments on the last allowable day (to maximize float) or automatically calculating the 'optimal' day to pay by comparing contractual terms (like early-payment discounts and late-payment fees).

As a CFO, I would be disappointed if you were personally spending time on things like "my accountants need to be able to visualize the chart of accounts in a tree view representing the hierarchy of accounts and sub-accounts" or "the system needs to be able to infer contract terms from electronic copies of the contracts, versus a clerk entering terms into the system manually."

Most of the "when or if" decisions revolve around these types of considerations. And that leads to the main issue - "all or nothing."

There is absolutely a minimum amount of functionality that is required to justify any investment. An accounts payable system has to protect information, has to allow you to schedule and process payments to creditors, and has to update the appropriate accounts for every transaction. No accounts payable system should ever be deployed that can't do those things. Depending on your disaster recovery policies, an AP system may also be required to have offsite redundancy and fail-over capabilities from the instant it is turned on.

Every accounts payable system that has ever been deployed as software has been replacing an existing accounts payable system. The existing system might have been computerized, or it might have been manual. It doesn't really matter - the point is that the business has an ongoing "process" for paying creditors. Maybe that process is inconsistent, inefficient, inaccurate, or too expensive. Some combination of those things has driven the desire for the "new" system to be implemented.

Your IT and accounting teams will introduce 'practical constraints' on how and when the business can move from the old system to the new system (including things like training employees, avoiding quarter or year-end peak effort seasons, data-center and hardware purchasing constraints, etc).

All of these considerations lead to a pretty no-nonsense definition of "minimal acceptable solution." And a CFO or VP shouldn't care about details beyond that.

Everything else is just ROI - "do a little bit more, if the payback meets our internal hurdle rates, and if nothing 'more valuable' comes up between now and then". As long as the team can add capabilities to the AP system that add more value than they cost, the team should keep working, right? Better equals better. The only reason to stop is if something even better than that comes along, or if you can't afford to invest more.

Ellen's advice and approach presented here applies _absolutely and effectively_ in this situation.

Here's an example - you mentioned business rules. Well, you're certainly going to have policies about making payments. And your accountants will enforce those policies when authorizing payments. Say you have a policy that says "when cash-flow goes negative for a quarter, make all payments 60 days late, and pay the fees." You may refine that policy to say "unless creditor relies on us for more than 50% of their receivables - then pay them on time." Then, you have to refine the policy for dealing with international payees - some you choose to delay longer, some not at all. Perhaps you also want to add controls based on the size of the invoice, and on the total amount due per payee.

A perfectly reasonable software AP system would allow your accountants to enforce these policies. Another option is to have these policies be automated - the AP system monitors cash accounts, determines when the cash-flow trigger should initiate the policy changes, and then enforces them.

Automating this stuff will have some value - in labor savings, reduced cost of audit, and reduced risk of mis-applied policy. But would you refuse to deploy a system if it didn't automate this policy (yet)? That would be a shame.

Perhaps your business is rapidly expanding globally, and as you enter new markets, you discover new regulations, requiring you to regularly update the policies your AP system has to enforce. Most software would require you to either change the software each time the policies change, or allow your team to manually enforce the policies - or both. You could build an AP system that allows your team to enter the policies as a set of rules, changing the rules whenever the policy changes, and having the system automatically enforce them.
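
To sketch that idea minimally (the rule fields and thresholds here are invented), the policies become data that the system evaluates rather than code that must be rewritten:

    rules = [
        ("negative quarterly cash flow",
         lambda invoice, ctx: ctx["quarter_cash_flow"] < 0,
         "delay payment 60 days and pay the late fee"),
        ("creditor relies on us for >50% of receivables",
         lambda invoice, ctx: invoice["share_of_creditor_receivables"] > 0.50,
         "pay on time"),
    ]

    invoice = {"payee": "Acme Corp", "share_of_creditor_receivables": 0.62}
    context = {"quarter_cash_flow": -125_000}

    # Later rules refine earlier ones, so the last rule that fires wins.
    action = "pay on standard terms"
    for description, applies, rule_action in rules:
        if applies(invoice, context):
            action = rule_action

    print(action)  # -> 'pay on time' (the refinement overrides the delay)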

Same question - would you refuse to deploy the system before the rules-enforcement and maintenance-automation is implemented? That would be a bummer.

Using your 'customer number validated in three seconds' example - ok, that's pretty easy (I used to write software, so I know). Perfectly reasonable to ask for it. Consider this though - is there any value in having the system be able to discover any time the customer number was "fat fingered" and two digits were transposed, or one was wrong - and still identify the right customer? That can be done too - but it isn't as easy. If you've got 1,000 people working in call centers on the phones, and this happens a lot, it might be worth doing. But you should probably do it "later." Where later is defined as "as soon as there isn't anything better to do."

So, certainly, there is a set of stuff that "must be" done, and there are things that you really should think about as "more is better" or "customer delight" features - and those are the things that you jockey around by priority, adding them as you go.

I know this is really really long - sorry about that. Normally, I have this conversation in person, not as part of a discussion thread.

Scott
@sehlhorst (on twitter)

zarfman posted on Wednesday, September 2, 2009 5:02 PM
More thoughts from an irritable old CFO aka zarfman.

I continue to have a problem with the idea that someone not skilled in the art or science can analyze same. I wonder if accounting should say: here is a black box; this goes in, this comes out. This is the interface I want, these are the checks and balances I require, and this is what I want the output to look like. Some of the better universities require a course in accounting information systems as part of their undergraduate degree in accounting. Over the years I've noticed programming languages come (F#) and go (ALGOL or LISP). Database technology changes, etc. With each successive iteration the languages and databases get more complex and more powerful. The number of practitioners who master this new complexity and power seems to be fewer and fewer. This concerns me.

Case in point, three or four months ago I was talking to the CFO of a credit card processing company. One of the credit card companies was complaining about not getting their money fast enough. After a few phone calls I put his IT guy in touch with a DBA I knew to be competent. In three days the DBA rewrote one of the processes. The processing time dropped from three and one half hours to just over four seconds.

Moving on;

Scott wrote: all of these considerations lead to a pretty no-nonsense definition of "minimal acceptable solution," and a CFO or VP shouldn't care about details beyond that. As long as the accountant responsible for A/P and the CFO or VP get to define the minimal acceptable solution, WE ARE IN AGREEMENT. Or, borrowing a concept from mathematics, control systems theory, and statistics: necessary and sufficient conditions. As long as the minimal acceptable solution meets accounting and auditing necessary and sufficient requirements, we have a deal; otherwise, no deal. If the CFO gets overruled, the CFO vows to get revenge: in the next audit report, poor internal controls result in a qualified or disclaimer opinion (not good). Office politics.

Scott wrote: most of the time, a software dev team isn't working only for the CFO. For Ellen's audience here, most people will be supporting multiple departments. Even if originally allocated to one of your projects, they are absolutely at risk of getting an urgent request to build something that drives top-line growth, that the CMO cares about.

The risk of being interrupted: to me this is the Achilles heel of most organizational units, including IT and an Agile development team. Suppose you're Agiling (sic) my A/P system and we suddenly discover that our manufacturing return authorization system has a major bug, and the powers that be take away two of the A/P programmers and the DBA, or at any rate severely deplete the Agile team. If no replacements are available, it seems to me the A/P team is minimally functional. Time goes by, the CFO is not happy and outsources the project or buys off the shelf, and the IT manager is not happy. Moreover, recent studies suggest that multitasking isn't the panacea it was once thought to be.

Automation of rule-based processes: it depends on the individual rule's complexity and the capability of the accounting staff. Those with an accounting degree get more latitude than an accounting clerk. For some rules I may say automate; for others, automate but let me know when the rule has been triggered and do not process the payable. Also, give me some form of (yet to be defined) manual override. For still other rules I may say too many judgment calls are needed to fix the problem; don't bother automating it. Moreover, I am reasonably sure that at some point in the past I've triggered someone's "bummer" or "that's a shame" response, and most likely I'm unrepentant.

One of the major problems in communicating between different disciplines is terminology, or jargon. A simple example is learning curve. As I recall, manufacturing was using that term 80+ years ago; more recent usage has a totally different meaning. The same goes for impedance mismatch. Scientists and engineers (maybe BAs) think it weird when an MBA uses that term in a management setting.

Scott wrote: everything else is just ROI - "do a little bit more, if the payback meets our internal hurdle rates, and if nothing 'more valuable' comes up between now and then". As long as the team can add capabilities to the AP system that add more value than they cost, the team should keep working, right? Better equals better. The only reason to stop is if something even better than that comes along, or if you can't afford to invest more.

What does return on investment (ROI) mean? Some say:

ROI = (Gain from investment – Cost of investment)/ Cost of investment

Seems simple enough, but it depends on what is included in returns/gains and costs. Some gains/returns are difficult to quantify in dollars. Moreover, ROI calculations are known to have been manipulated to suit management's purposes. If one is forced to consider the time value of money (e.g., NPV) and uncertainty, then the analysis becomes significantly more complex. If IRR is used, under certain conditions multiple roots can occur (not good). The project can flip from go to no-go, or vice versa, depending upon the technique and the assumptions used.
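
To make the multiple-roots point concrete, here is a minimal worked example with invented numbers: a cash-flow stream that changes sign twice can have two rates at which NPV is exactly zero, i.e., two IRRs.

    def npv(rate, flows):
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))

    flows = [-100, 230, -132]  # invest, receive, then pay out again

    for rate in (0.10, 0.20):
        print(f"NPV at {rate:.0%} = {npv(rate, flows):+.4f}")  # both are ~0: two distinct IRRs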

Many of my colleagues and I have noticed that unless the project is a financial or technical disaster rarely are the ROI estimates revisited. More likely a committee is formed to assign blame.

Reasons to stop work on a system: Might I suggest the law of diminishing marginal utility, perhaps rational or irrational management depending upon the actual situation. Also, check the financial or accounting definitions of payback versus hurdle rates (accounting jargon). We may have a units mismatch.

Ellen wrote about agile being a value-centered approach. The term value probably has 15 or 20 different meanings depending upon the context. Being the money guy, I tend to define value in dollars. If it will make the company money after EBITDA (accounting jargon) and keep the shareholders happy, then I tend to be happy.

The term optimal, or optimization, in my view is subject to great abuse and misuse. It is a mathematical technique for selecting an alternative from a set of available alternatives given certain constraints. This selection process involves working in n-dimensional space. Fortunately, there is software available that handles some of the complexity. Accordingly, how do Agile teams or organizations prove optimality or global optimality?

I used this technique when working for a bank, where it was used to allocate scarce resources. The same technique is used in minimizing stock feed costs while maintaining the proper nutritional requirements. The refining industry also uses this technique.

Keep publishing Modern Analyst; it's a good forum.

Regards,

Irritable