Lastly, projects involving multiple stakeholders with disparate interests (as occurs in many government projects) almost demand a more traditional approach.
Larimar
I guess there are several quite substantial and complex projects that have been successfully delivered via agile methods now, so it's not true to say it can't be done.
The question is: what are the right circumstances for picking structured analysis and design up front (eg waterfall) versus requirements discovery as you go (eg agile)? I think the answer lies in organisational culture and the market you are operating in, but I am open to other opinions.
Craig
And if BAs are charged with doing such reengineering, shouldn't they be most focused on as-is modeling, instead of jumping right into to-be modeling, as is typically the case with Use Cases?
Tony
I agree that an understanding of the current state is useful. Surely you could use use cases to model the current state if you wanted to? (Anyone else got an opinion on this?)
Unless a very detailed "AS-IS" is needed, I would use something like:
- Adrian
Adrian
For larger scale efforts - and maybe even not so large scale efforts - use cases are a poor as-is modeling technique. Can you imagine trying to do something really as-is-intensive like business process reengineering using use cases? We are talking largely about process discovery, and, as a process is defined by its data inputs and data outputs, unless we use a technique which formally identifies these, the result is going to be unsatisfactory. And only data flow diagrams rigorously capture these inputs and outputs (see my comments to Craig about Method H).
Also, imagine trying to scope a large scale system using a data model (ERD or class model). As a data model is, compared to a strong functional modeling technique, inert (i.e., it is not going to prod the analyst through discovery), functional analysis should lead entity modeling.
All:
This is an addendum to my above post. On another forum, Craig Brown asked about using Method H for modeling processes with respect to inputs and outputs. I cannot locate that posting, so here is my response.
In Method H, the inputs often come into the process from outer space and go from the process back out to outer space. Where is the litmus test of accuracy? How, for example, do I know that I have identified all inputs - not just some of a process's inputs? Answer: I don't. It is only by hooking the processes together via inputs and outputs that we can flow things through systematically and thereby ensure completeness. Only by following the flow of a data package through its string of processes can we ensure that we have discovered all essential processing on that data.
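The completeness check described above can be sketched mechanically: if every process declares its inputs and outputs, any input that no other process produces (and that no external entity supplies) is "from outer space", and any output nothing consumes is "to outer space". A minimal sketch, with invented process names and flows purely for illustration:

```python
# Hypothetical sketch of a dangling-flow check on a set of processes.
# All process names, flows, and external entities below are invented examples.

processes = {
    "Take Order":    {"inputs": {"customer request"},    "outputs": {"order"}},
    "Fill Order":    {"inputs": {"order"},               "outputs": {"shipment"}},
    "Invoice Order": {"inputs": {"order", "rate card"},  "outputs": {"invoice"}},
}

# Flows declared as crossing the system boundary (external entities).
external_sources = {"customer request", "rate card"}
external_sinks   = {"shipment", "invoice"}

all_outputs = set().union(*(p["outputs"] for p in processes.values()))
all_inputs  = set().union(*(p["inputs"]  for p in processes.values()))

# An input "comes from outer space" if no process produces it and it is not
# declared as arriving from an external entity.
dangling_inputs  = all_inputs - all_outputs - external_sources

# An output "goes to outer space" if no process consumes it and it is not
# declared as leaving to an external entity.
dangling_outputs = all_outputs - all_inputs - external_sinks

print("dangling inputs:", dangling_inputs)    # empty set -> every input accounted for
print("dangling outputs:", dangling_outputs)  # empty set -> every output accounted for
```

Modeling a flow in isolation can never surface these gaps; it is only once the processes are wired together, as here, that a missing producer or consumer becomes visible.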
The project I am currently on tried something almost exactly like Method H. I have taken things over. When looking at a given process in isolation, things kind of look OK. However, in trying to integrate the processes together, I found that the models were so disjointed that I have basically been starting from scratch with data flow diagrams. A false illusion of precision can be dangerous.