1 INTRODUCTION
Project failure is nothing new. Looking back over the past twenty years or so shows that failures have kept occurring and cost overruns have kept being reported, and this seems set to continue. Failed IT projects have received a good deal of attention from the press and the boardroom in recent years. Organizations conduct post-implementation reviews to analyze what went wrong and what they should consider in order to avoid such failures again.
Reports continue to show that completing IT projects has always been difficult, whether in terms of time, scope, or budget. Some are abandoned before completion in order to avoid further overruns. Rarely is a single factor to blame; a project’s failure usually results from multiple factors.
A common definition of “success” makes it easier to distinguish failure from success. According to Kimmons and Loweree (1989), a project can be considered successful if it meets these four criteria:
- Completed on or ahead of schedule
- Completed within the specified budget
- Client satisfaction
- Technical conformance
According to an often-quoted study from the Gartner Group, 75% of IT projects fail. The Standish Group conducts an annual survey of IT projects. Their latest report shows a decrease in project success rates.
- 32% of all projects succeeded: delivered on time, on budget, with required features.
- 44% were challenged: late, over budget, and/or with less than the required features.
- 24% failed: cancelled prior to completion or delivered and never used.
Looking back, we see not only failures but also the boom that success has brought the software industry. The worrying aspect is that similar failures recur every year, perhaps in different organizations but mostly with common causes.
2 THE UK E-BORDERS PROGRAM
In 2003, border control in the UK relied on procedures and a system that operated at the border itself. At that time, there was a realization that more checks had to be undertaken in advance, before passengers arrived in the UK, and ideally before they left their point of origin. The e-Borders project was initiated based on these projections (Vaz, 2009). The 2003 vision is quite similar to the current concept of the program, which seeks to make use of traveler information through data collected from passengers about to enter or leave the UK (Rees, Lomax and Boden, 2015). Analysing this data in advance of arrival or exit and presenting the results to officials at border control fosters informed decision making. The program also sought to create records with which authorities could identify “persons of interest”, establish whether they are in the country, and track their travel patterns. Raytheon Systems Limited was awarded the contract in 2007 to implement the system, with completion expected in 2011 (Vaz, 2009).
The Home Office terminated the contract in July 2010, claiming that significant milestones had not been delivered. Successor programs have since been commissioned as a result of the failure of the e-Borders project. These programs seek to realize the original e-Borders vision, even though the strategy for achieving that objective has evolved over time. This paper undertakes an in-depth analysis to determine the reasons behind the failure of e-Borders.
Upon its termination, the UK’s e-Borders project under Immigration Control cost the British Government $334 million in a settlement to Raytheon, the U.S. defense corporation that supplied the system. Raytheon had won the nine-year contract, but the project was reported as failing after three years. Binding arbitration in August 2014 settled the dispute between the two parties: the British government had to pay $334 million for previously completed work as well as settlement of claims, damages, and interest payments.
Reports show that the factors behind the failure include unclear requirements and the absence of milestones, which meant progress could not be tracked. The project was mistakenly treated as simple rather than complex. Procurement planning was inadequate and the project was poorly managed. Politics was another reason for this public sector project failure, which resulted in significant financial loss and lost time. Advance checks on arriving passengers are working in part, and a newer version of the current program is now under development.
3 THE FAILURE
The e-Borders project failed because it did not deliver its objectives in full, despite the UK government spending £830 million between 2003 and 2015. Completion was originally slated for 2011, yet the original vision of the project has still not been delivered (National Audit Office (NAO), 2015). The project has only managed to analyse 86 percent of the data collected, against a target of 95 percent.
According to the National Audit Office, the Home Office did not have a consistent strategy in place for delivering a project of this scale, and it failed in its efforts to develop an integrated system capable of processing all the collected information.
The importance of stakeholders was underestimated throughout the project, and unrealistic assumptions were made regarding delivery. This disregarded the need to manage a large portfolio of stakeholders, estimated at more than 600. Despite this, ferry, plane, and rail carriers were reported to have experienced improved relationships (BBC, 2015). The ability to make decisions was impaired by the gaps in capability and resourcing that existed. The National Audit Office determined that the e-Borders scheme had a total of eight program directors between 2003 and 2015.
The project relied solely on IT systems rather than also including staff who were resourceful and properly trained (Khan, 2015).
The design work showed some inconsistencies because the work of Raytheon’s sub-contractors was not sufficiently integrated. Programme delays also limited the availability of skilled staff, as phases began to overlap with subsequent phases of the project. The project faced commercial and contractual differences on a continual basis because the agreements were ill-conceived and because a high level of risk had been transferred to the contractor. An agreed working partnership was absent, and as a result the parties were not able to work together adequately (Vaz, 2010). The absence of an agreed programme plan between Raytheon and the UK government department contributed to the failure as well.
4 REQUIREMENTS ANALYSIS & THE FAILURE
Project management is more than a process. It is the ability to deliver in an ecosystem characterized by uncertainty, volatility, complexity, and unknowns. The e-Borders program failed because it did not survive the conditions of its ecosystem, failed to execute the delivery process, and had poor project management practice. This article, however, focuses on the requirements analysis aspect of the failure.
Despite project management knowledge and practices being extensively documented and promoted, projects often fail to apply these principles. In the case of the e-Borders program, requirements analysis and management was a key contributor to the failure.
4.1 Requirements
An inability to define exact needs often translates into inaccurate or ineffective requirements analysis, a deviation from the ideal scenario in which exact requirements can be determined. The Home Office had a concept and a vision but not well-developed requirements. According to the National Audit Office report (2015), “The Department has little clear idea of how it expects business processes to change.”
In traditional project management practice, requirements complexity is managed by investing a significant amount of time in the ‘Requirements Analysis’ phase, on the assumption that the time invested in analysis will reduce complexity. This provides more time to unravel the ‘unknowns’, permitting stakeholders to make informed decisions while the requirements are being understood and defined. “The Department had incorporated Raytheon’s proposed design within the contract with the company. But the proposals had been based on too high-level requirements, leading to disputes after contract award over whether proposals would meet actual needs” (NAO, 2015).
The program execution was based on a proposed blueprint design rather than real needs or a realistic and tested concept.
4.1.1 Blueprint Requirements
It is common for projects to be initiated based on blueprints. However, a blueprint is just a guide to the future state. Its intended purpose is to guide the subsequent analysis and design activities. It does not answer all the questions. The details of what, how, and why are left to requirements analysis.
The e-Borders program relied on a conceptual blueprint design to build a complex technology solution. As the blueprint had few details for the designers and developers to work with, Raytheon struggled to deliver a product that met the client’s needs.
Neither party saw the missing piece, which was the business requirements. The assumption that the blueprint was sufficient to develop such a complex solution was fatal. Relying solely on a high-level blueprint creates volatility of requirements. “The Department frequently found Raytheon’s solutions unconvincing; conversely, Raytheon felt that requirements were growing and shifting, leading to major disputes, including varying interpretations of different parts of the contract” (NAO, 2015).
4.2 Feasibility
The Home Office had a concept, not a well-developed set of requirements. Concepts need a reality check; otherwise, you could be chasing a dream! Even though the program ran a pilot to evaluate the feasibility of the concept, the National Audit Office report (2015) claims that it did not cover all aspects of the solution. Consequently, the programme was executed with an untested concept and unknown requirements, which led to disputes.
5 CONCEPT DEVELOPMENT
A concept is constructed from ideas, business needs, business vision, etc. By definition, a concept is untested. Hence, the concept must be tested with real facts and scenarios. It’s what we refer to as ‘feasibility’.
Feasibility’s purpose is to test the practicality of the concept and identify challenges, flaws, and areas in need of improvement. This is accomplished by challenging the concept with real-life scenarios. This testing phase is an opportunity to test and learn about the concept. The e-Borders program failed to test its concept thoroughly. As the NAO recommends, feasibility scope must cover both the functional and non-functional aspects of the concept. “Pilots should test not just whether technical challenges can be overcome but also whether the business changes required across participating government or non-government organizations are feasible” (NAO, 2015).
Requirements must be based on facts and real-life scenarios. When the concept is untested or only partially tested, the requirements are hypothetical. Moving into design and build with hypothetical requirements turns those phases into the testing ground for the concept. This may lead to clashes, as each party has different expectations of the outcome of the ‘hypotheses’. This is what the NAO report refers to when it notes that “the Department frequently found Raytheon’s solutions unconvincing.”
During concept testing, conclusions are drawn about the practicality of the concept, its challenges, and its flaws. The outcome of each test should be fed back into concept development so the concept can be adjusted accordingly. This is an iterative process in which each version of the concept is reviewed against the feedback from testing the previous version.
Testing the concept raises confidence in its practicality and reveals the unknowns. Once this confidence is established, the feasibility outcome is fed into requirements analysis to develop a set of business requirements. At that stage, facts about the concept have been established through testing, so the requirements are less hypothetical.
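To make the loop concrete, the short sketch below models the iterative test-and-refine cycle described above. It is purely illustrative: the names (Concept, run_feasibility_test, refine) and the sample unknowns are hypothetical and are not taken from any e-Borders artefact; they simply show feasibility findings being fed back into concept development until confidence is high enough to hand a tested concept to requirements analysis.

# Illustrative sketch only: a hypothetical model of the concept-development
# feedback loop described above, not an e-Borders artefact.
from dataclasses import dataclass, field

@dataclass
class Concept:
    version: int = 1
    open_unknowns: list = field(default_factory=lambda: [
        "carrier data feeds",
        "passenger data volumes",
        "business process changes",
    ])

def run_feasibility_test(concept: Concept) -> dict:
    """Challenge the concept with real-life scenarios and report findings."""
    # A real pilot would cover functional and non-functional aspects, as the
    # NAO recommends; here we pretend each round resolves one unknown.
    return {"resolved": concept.open_unknowns[:1], "flaws": []}

def refine(concept: Concept, findings: dict) -> Concept:
    """Feed test results back into concept development to produce the next version."""
    remaining = [u for u in concept.open_unknowns if u not in findings["resolved"]]
    return Concept(version=concept.version + 1, open_unknowns=remaining)

concept = Concept()
while concept.open_unknowns:          # iterate until confidence is established
    findings = run_feasibility_test(concept)
    concept = refine(concept, findings)

# Only now is the feasibility outcome handed to requirements analysis,
# so the business requirements are grounded in tested facts.
print(f"Tested concept v{concept.version} ready for requirements analysis")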
5.1 SO HOW DOES THIS FIT IN WITH BUSINESS ANALYSIS?
The business analyst is constantly required to educate his or her stakeholders and the project team on the business analysis practice. Requirements analysis and concept development fall within the practice of business analysis. Hence, it is the business analyst who is accountable for promoting good business analysis practice.
A blueprint cannot be prescribed as requirements. The inherent lack of practicality and realism will have a subsequent impact in the design and build phases of the project cycle, as the UK e-Borders project experienced.
6 CONCLUSIONS
As of March 2015, the Home Office had spent at least £830 million on the e-Borders project. Valuable capabilities have nonetheless been gained that are crucial for assessing the risk posed by a passenger before the border is reached, although there is limited evidence that these capabilities are effective. The Home Office has not yet implemented an integrated system, and as a result border processes are still inefficient. The government has so far been unable to take full advantage of the potential of the data it receives. The e-Borders project was deemed a failure because it had not achieved value for the money spent by the UK government. The failure is a learning opportunity from which business analysts can draw these conclusions:
- A concept needs to go through a development process to validate it and draw out realistic, tested requirements.
- Hypothetical requirements create uncertainty. The concept must be tested thoroughly to assess the practicality and weaknesses of the business concept.
- Good requirements management practice, good requirements management practice, and good requirements management practice!
Author: Adam Alami, Sr. Business Analyst
Adam Alami is a seasoned IT consultant with over 18 years’ experience. Business analysis and project management are his passions. His experience revolves around major business transformation projects. He is a versatile IT professional who has accumulated a wealth of cross-industry experience with Tier 1 businesses on major projects in the areas of Enterprise Transformation, Integration, Migration, and Systems Modernization.
Adam has a passion for research. His research interests include IT Offshoring, Global Project Management, Banking Technology, Business Analysis, Information Technology and Culture, Enterprise Innovation, and Business Solutions.
Email: [email protected]
Website: www.adamalami.com
References:
1. BBC, (2015). Home Office criticised over £830m 'failed' borders scheme. [online] BBC News. Available at: http://www.bbc.com/news/uk-34988913 [Accessed 5 Feb. 2016].
2. Foxton, W. (2014). Government IT projects fail because of politicians, not programmers. [online] Technology - Telegraph. Available at: http://blogs.telegraph.co.uk/technology/willardfoxton2/100014123/government-it-projects-fail-because-of-politicians-not-programmers/ [Accessed 5 Feb. 2016].
3. GOV.UK, (2014). Home Secretary letter on the e-Borders programme arbitration. [online] Gov.uk. Available at: https://www.gov.uk/government/news/home-secretary-letter-on-the-e-borders-programme-arbitration [Accessed 5 Feb. 2016].
4. Hampshire, J. (2009). The future of border control: risk management of migration in the UK. In: Migration and mobility in Europe: Trends, patterns and control, p.229.
5. Khan, A. (2015). Failure to Deal with the Issues: The e-Borders Award and ‘Serious Irregularity’ under the Arbitration Act 1996. Available at SSRN 2604612.
6. National Audit Office (NAO), (2015). E-borders and successor programmes. [online] Nao.org.uk. Available at: https://www.nao.org.uk/report/home-office-e-borders-and-successor-programmes/ [Accessed 5 Feb. 2016].
7. Press Association, (2015). E-Borders scheme to boost security 'wasted £830m'. [online] Mail Online. Available at: http://www.dailymail.co.uk/wires/pa/article-3343650/E-Borders-scheme-boost-security-wasted-830m.html [Accessed 5 Feb. 2016].
8. Rees, P., Lomax, N. and Boden, P. (2015). Alternative Approaches To Forecasting Migration: Framework And UK Illustrations.
9. Vaz, K. (2009). The e-Borders programme. London: Stationery Office.
10. Vaz, K. (2010). UK Border Agency. London: Stationery Office.