IT Requirements when Buying, not Making

As with most IT projects, capturing and documenting requirements is a foundation for success. However, with selection rather than construction – a project to buy, not make – you need requirements that are specifically engineered for this purpose. This is sometimes termed COTS-Aware Requirements Engineering (CARE) [5] or Procurement-Oriented Requirements Engineering (PORE) [6].

This article, as a practitioner guide, does not contain radically new techniques, but aims to direct your mind-set and the approach you should take if you are the requirements engineer, requirements analyst or business analyst on a software selection project. While it references many academic and practitioner articles, it is not a full survey of the literature.

  • It does aim to guide both your requirements elicitation and how you document the requirements. Both processes are different for a project to select an off-the-shelf solution rather than having software specially written.
  • When documenting the requirements, the requirements engineer should apply the ten guidelines described later in this article to ensure the requirements can, in due course, be used to score candidates in a straightforward manner. To select off-the-shelf software, the requirements must become an effective ‘yardstick’ against which candidate solutions can be measured for fit at different stages in the project, so that candidates are not compared to each other, but measured at each stage against the specific requirements of your organisation.
  • This article also addresses the fact that, when engineering requirements for an off-the-shelf selection, requirements documentation is itself pivotal. While ‘a good conversation’ (as Agilists often call it) is a pre-requisite, documenting the outcomes is essential because the same reference information must be supplied to multiple candidate suppliers [7]. Requirements might be ‘exposed’ incrementally to suppliers – for instance, by releasing a sub-set at the early Request for Information (RFI) stage, when reducing the longlist to a shortlist. However, this pre-supposes a comprehensive set that is documented before approaching the marketplace.

Your approach must also reflect the realities of a project that involves collaboration between different organisations (yours and multiple suppliers) that have different financial imperatives, perspectives and reporting structures. This means understanding where the organisational and personal interests align or diverge, and adopting appropriate and pragmatic techniques to manage stakeholders, expectations, requirements, documentation and meetings [8] [9] [10].


The terms artefact, completeness, priority, requirements document, scenario, stakeholder, supplier, user and validation are used with the meaning defined in the IREB Glossary [11]. The terms component and software component are used as defined in the IREB glossary, but a large solution like an off-the-shelf software product (whether on-premise or cloud) might also be regarded as a system in its own right.

COTS software means a Commercial Off-The-Shelf software component, which may be a cloud service or a more traditional on-premise package.

A candidate is the combination of one solution (software product) as put forward by one prospective supplier – this means that two candidates might be the same reseller putting forward two products, or the same product from two different resellers.


Requirements engineering has addressed the topic of selecting COTS software since the late 1980s. Of course, many standard techniques will apply, including Requirements Elicitation and Requirements Documentation, especially using Natural Language and Requirements Validation and Negotiation [12] [13]. Many common requirements engineering objectives apply, including the fact that the “requirements engineer is supposed [to] elicit requirements in a way that is both unbiased and neutral” [14]. There may also be a need for mediation techniques to resolve conflicts in requirements [15], especially if the project spans organisational silos and supply chain boundaries.

This article offers more specific and directed approaches to selecting off-the-shelf IT solutions by drawing on the author’s book [16] and experience as an IT practitioner specialising in selections.

Selection Method Context

There are many published methods for selecting off-the-shelf (ie COTS) software [8] [17] [18] [19] [20] [21] [22] [23] [29]. This article uses the Decision Evaluation Selection Method to set the context. After scoping and requirements capture, there will be phases of longlisting, RFI, detailed evaluation with scoring, demonstrations, reference sites, negotiation, contract and implementation. See Figure 1.

While this is not a full review of selection methods themselves, some of the main differences between the approach used here and some existing techniques for requirements-based COTS-selection are as follows.

  • A separation of powers, such that the progressive shortlisting happens during stages of formal decision-making. The role of the evaluation team is to marshal the facts (for instance, the supplier responses to requirements) and express them as scores, statistics, cost estimates or short text summaries. A separate project board (or decision-makers at normal senior management meetings) then consider these assembled facts to make an evidence-based decision at each stage.
  • Consciously aiming to expend supplier time sparingly, since candidates have limited resources and sales opportunities compete with revenue-earning work. Treating candidate suppliers as stakeholders will reduce the number of supplier withdrawals, ultimately improving the customer’s choice. The approach does not confuse a supplier’s willingness to feed an unlimited number of hours into the opportunity with signs of professionalism, commitment or service orientation. (Indeed, unconstrained effort by a supplier may indicate desperation because their product is nearly impossible to sell.)
  • Concentrating on face-to-face meetings later in the process, having reduced the number of candidates by rigorous but documentation-based shortlisting. The detailed evaluation meetings are held with 3-4 candidates; demonstrations involve only 1-2 highly credible candidates; reference sites are taken up for only 1-2 candidates.
  • There is no conventional Invitation to Tender document (ITT). The approach reflects the philosophy that reading and writing documentation to assess the ultimate fitness for purpose of something as complex and fluid as modern software is impossible to achieve and therefore foolish to try. “Talking with each other is often more effective than writing against each other” [24]. Once the shortlist is established, the main vehicles for exchanging information are meetings, telephone calls and video conferences. The documentation follows the discussion. However, process safeguards mean that performance assertions by suppliers have contractual significance.
  • There is no attempt to automatically measure either fit or gaps by sophisticated formulae or statistics. Evaluation team members, referring to requirements and notes from the supplier response meetings, work as a group to express the relative fit of each candidate to each requirement by using a simple but pragmatic scoring scheme.
  • An insistence that customisation or tailoring (modifying source code for a specific customer context) is a last resort. It is best avoided altogether, undertaken only after other measures have been exhausted, and preferably at least six months after implementation (not in preparation for it). The Decision Evaluation selection method consciously avoids the buy-and-adapt approach [25].

Requirements Have Multiple Roles Throughout the Selection

The agreed requirements document will almost certainly be the most important deliverable created during your selection project.

Requirements capture has several ‘general’ uses throughout the project, such as to engage stakeholders and to help familiarise all contributors with organisational processes.

Moreover, requirements specifically feed multiple stages of the evaluation, selection and procurement method. Figure 1 shows how the requirements, once elicited, feed stages further down the selection process. In some cases – see D and H – the main requirements document is provided to suppliers in its original form. In others – such as C and G – a sub-set of requirements is extracted and repurposed into documentation with a specific evaluation objective.

Figure 1: Overview of the selection approach showing requirements flows

A. Scope sets the overall context that affects all requirements. The scoping exercise will not only feed in critical requirements, but will also shape some of the weights for requirement importance or value.

B. The requirements statements provide the basis for the end-of-phase weighting exercise, which will distinguish between the must-have and nice-to-have requirements, to record the organisational value of each requirement.

C. The requirements are repurposed to derive your preliminary shortlisting questionnaire (the request for information, or RFI). One of the initial criteria for extracting the sub-set of requirements is to choose those requirements that have higher weights for importance.

D. The full requirements statement informs 3-4 shortlisted candidate IT suppliers of your organisational need. Candidates respond to the main part of the requirements document (the ‘scored requirements’) during detailed evaluation meetings.

E. The requirements form the basis for scoring shortlisted candidates after detailed evaluation.

F. Requirements feed your software gap analysis, because they create the yardstick that allows you to spot the requirements that are not met. (Note this gap analysis is subsequently one feed into both your questions for reference sites and your preparation for negotiations, but this extra use of requirements is indirect so it is not explicitly shown in Figure 1.)

G. Requirements feed your requested outline for the demonstrations (again, particularly those requirements that have higher weights for importance).

H. The full requirements document is one of three critical working documents that become an attachment to your contract with the successful candidate.
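As an illustration of stage C above, extracting the higher-weight requirements for the RFI amounts to a simple filter over the requirements catalogue. The `Requirement` structure, the reference codes and the weight threshold below are invented for illustration; the method itself does not prescribe them.

```python
from dataclasses import dataclass

@dataclass
class Requirement:
    ref: str        # unique reference, e.g. "FIN-03" (invented scheme)
    category: str   # business topic, e.g. "Finance"
    text: str
    weight: int     # importance 1-5 agreed at the weighting meeting

def rfi_subset(requirements, min_weight=4):
    """Extract the higher-weight requirements as RFI criteria."""
    return [r for r in requirements if r.weight >= min_weight]

# A tiny, invented catalogue for demonstration
catalogue = [
    Requirement("FIN-01", "Finance", "Produce aged-debtor reports", 5),
    Requirement("SCH-02", "Scheduling", "Colour-code overdue jobs", 2),
    Requirement("DOC-03", "Documentation", "Store documents against a client record", 4),
]

for r in rfi_subset(catalogue):
    print(r.ref, r.text)
```

The same filter, with a different threshold or criteria, could feed the demonstration outline at stage G.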

Managing Expectations During Requirements Capture

It is important to manage user expectations during requirements definition interviews or workshops.

Expectations About Meeting Requirements

Many software development projects assume all requirements will be met – even though not all requirements will translate into benefits [26]. With an off-the-shelf selection, the reverse is true – you assume you will never find an off-the-shelf product that fits perfectly. By definition, some requirements will never be met. In practice, most users recognise this is the case, but it is best to begin all requirements capture meetings with this warning.

When the requirement is recorded, there can be no guarantee that it will be met at procurement. The requirement may prove to be technically or economically unfeasible: the solution with the best overall fit may lack support for the requirement, or the desired facility may only appear in an extra module that is not cost-justified.

However, sufficiency beats perfection. Managing expectations at less than 100% fit is important to making progress. If users insist on a solution to meet every need, they will probably force a ‘shortlist of zero’ with all candidates disqualified. If their organisation lacks the appetite, time, funds or skills to build, they will miss out altogether on new software.

Requirements Cannot Be Ignored

At the other extreme, some organisations evidently believe it is not worthwhile to consider their own requirements because the project is selecting a pre-existing artefact. Because the software already exists and you probably cannot change its function, limitations, source code, documentation or schedule, they may think there is no point in defining requirements before studying candidates. This approach to evaluation is sometimes called I’ll Know It When I See It or IKIWISI [27].

However, while you might not be able to immediately change the content of a candidate software product, your requirements can (and should) shape the set of facilities you adopt during your selection. The point may be obvious but is sadly often ignored – requirements are essential if you are to find the candidate with the closest fit to your needs.

Project Philosophy Concerning Customisation

It is also important to have a clear view during the project about customisation or tailoring – paying for program modifications to adapt the off-the-shelf software to the specific context of use.

This is tempting, and some software suppliers push the idea – it flatters the customer to say they deserve special software, it increases customer lock-in and their software becomes a Trojan Horse to make the real money on services for the whole life of the installation.

However, customisation should be resisted. In some areas such as financial accounting, off-the-shelf software has been available for over 40 years, so organisations may be using their third generation solution. I find organisations that customised their older software tend to be less likely to tailor now, because they have learned their lesson. For organisations new to COTS software, tailoring as a first resort rather than a last resort is a common approach (and commonly regretted later).

I have met project managers who had multi-million budgets solely to ‘return a product to vanilla’ and I’ve been asked to help companies recover from excessively-modified software that was only one year old. In short, tailoring off-the-shelf software often indicates ineffective change management; it is a risky, costly, inefficient way to make a product fit; it can cut you off from supporting products and future releases; recruiting new staff with experience of the standard software will not be a benefit if they don’t recognise it. Also note that some software suppliers now refuse to entertain customisation and will only implement the standard product. Rather than customise the code base, far better to adopt alternative approaches such as fully exploiting the configuration options (sophisticated settings) or supplemental solutions [16].

Team Communications

Managing user expectations during requirements definition is only one dimension of managing change during the organisational stress of IT-enabled transformations. Stakeholder management and communication amongst all team members are important throughout [28]. While it is a much larger topic than can be covered here, communications are particularly important when the project policies may at first seem shocking to the users, such as the very need to define requirements, or the insistence on barring customisation until the dust has settled on implementation.

Organising Requirements (Cataloguing)

People at requirements definition interviews and workshops rarely present you with ‘pure’ IT requirements. You are presented with a mixture of information about the organisation, its objectives, processes and flaws, and with human aspects, such as power structures. Moreover, when attendees cover requirements, they cover items that are front of mind – they rarely give you a systematic trip through the organisation’s workflow. It is usual before documenting requirements to analyse, organise, classify and simply to think about them.

Accordingly, during the meetings and certainly before you document the requirements, you should organise your notes. If you have different notes sheets or online pages for different categories of requirements, you can capture some of them directly onto the relevant sheet during the interview or workshop. Examples include information on the current (incumbent) systems, glossary entries or business volumes.

You may maintain a separate section for project notes – project learning about the organisation that you don’t want to lose, but that is not a specific IT requirement that will make one candidate more suitable during selection. For instance, an interviewee may report that the organisation repeatedly under-estimates the importance of training. This is significant information to capture for later in the project, but it is an internal characteristic of the prospective customer rather than a criterion for evaluating candidate suppliers. (Contrast this with factors that are criteria because they may reveal one candidate to be superior to the others – for instance that the supplier is able to offer training, that there is an ‘ecosystem’ of trainers and training material, or that the solution is known for low training needs.) Such a risk statement may go into an appendix at the back of your requirements documentation, for internal circulation rather than release to suppliers.

You should reflect on your notes and ‘consolidate’ requirements where applicable. When cataloguing for a selection, it is especially important to group requirements into coherent subjects or business topics such as Finance, Scheduling or Documentation [16]. This will be important to allow specialism: subject matter experts in your organisation can review ‘their’ part of a wider requirement statement that cuts across organisational silos; internal specialists can attend meetings by exception, in order to weight requirements or to score candidates; the suppliers can put forward the relevant module expert to respond to a block of requirements.

You often get ‘fragments’ of the same requirement from different interviewees, and need to fit the jigsaw pieces together. The requirements need to be treated thematically because the processing that you purchase off-the-shelf will never be related directly to one interviewee or, indeed, department.

Of course, you may have conflicting requirements that need negotiation, mediation and reconciliation [15].

Adding Standard Requirements

There is also considerable scope for the requirements engineer to put forward requirements that have not been volunteered by business representatives. These may cover wider or specialist topics: technical issues, such as the preferred database technology; IT management dimensions, such as the mechanism for releasing updates; commercial dimensions, such as contractual stipulations; human and organisational change aspects, such as training; service dimensions, such as the types of consultancy available.

As part of compiling requirements, the requirements engineer can (and indeed should) add these ‘standard’ requirements – even though they were not explicitly reported during the requirements definition meetings. This is not prescribing the need to the user. If the requirement is inappropriate, it will be removed during the weighting meeting – see the section Weighting Requirements for Importance below.

Because the end-to-end requirements process will eliminate inappropriate requirements, it is safer to add an ‘unnecessary’ requirement than to risk something significant being missed. Often, requirements are not reported because of two extremes – at one end they are so ‘obvious’ that they are invisible to the organisation, or it is regarded as not even necessary to voice them. At the other extreme, the requirement is so sophisticated that it is over the horizon of current thinking.

This process of injecting requirements into the set means the requirements engineer can legitimately contribute subject matter: from other projects; from research into best practice; from the organisation’s library of standard requirements; from reverse engineering (studying software products to work backwards from features to potential requirements).

Later, this pool of requirements will be inspected by user representatives during the weighting meeting.

The Requirements Documentation

Remember that the fundamental objective is a requirement that can be scored later, meaning that (during the detailed evaluation stage) your evaluation team will examine three or four candidates against that requirement and (in a later, separate, scoring meeting) will allocate points for how well each solution meets each requirement. Each requirement needs to be expressed clearly enough that it exposes any significant differences between candidate solutions.

How you document your requirements is a type of decision-making. Decisions can be described as choosing consequences, so, as a practitioner, you should be aware of the consequences of certain styles of documentation, to avoid unintended consequences. Some of your decisions may need to be pragmatic and to actually disregard some of the guidelines of formal specification or quality methods, because of their impact upon the reader – a sales or technical representative at a commercial supplier.

Requirements must represent a blend of software facilities and supplier services. There will be a mix of functional requirements for software processing, plus quality requirements and constraints (such as the availability of documentation, the service desk hours, or commercial factors like the license price).

  • If some candidates are cloud providers, there will probably be a higher proportion of service requirements.
  • While you must accept that not all requirements can be known before approaching the market (meaning the requirements document is ‘incomplete’), an effective ‘sample’ of ‘good enough’ requirements can nonetheless identify the solution with the best fit.

It is important to write at the correct length and level of detail. High-level, imprecise mission statements do not form an effective yardstick to measure software products or supplier capabilities. Equally, avoid large, multi-paragraph requirements that contain several detailed statements of need, even if they are related. One requirement covering a single page (of European A4 or US Letter) is probably too large for a software selection – solid real-world experience indicates that 2–4 requirements per page is the appropriate level of granularity.

You should avoid overlap or redundancy – do not measure the same attribute multiple times. Also avoid correlated or co-dependent variables: each requirement should ideally test a different aspect of the software or supplier [29].

The Impact of Language on the Users and Suppliers

It is best if the same document is agreed by the user community and eventually provided without repurposing when you have the shortlist of 3-4 candidate IT suppliers. If you can engineer the document to make sense to both stakeholder groups (users and suppliers), this saves significant time and effort – and avoids the risk of translation error.

For a selection project, you are more likely to use natural language rather than an artificial specification language. You need to choose a format. The most important aspect of a specification format is that it is credible and engages the audience, so use what works.

You might write in a detailed requirements format, or in a shorter, more outcome-orientated scenarios format [30] [31]. Although user stories are associated with Agile and therefore software development projects, your organisation may be familiar with the format from other projects. If user stories are too high level to later probe the differences between candidates, they might serve as a precursor to the specification or (preferably) be made more specific, for instance by adding test notes or acceptance criteria. These would be clearly associated with each story, without disrupting the main story flow.

Figure 2 illustrates the same requirement expressed within three different formats – one longer requirements definition format, one scenario and one user story.

Figure 2: One requirement in requirements definition, scenario & user story formats

The Roles of Language and Attributes

One specific aspect of wording is important. IREB guidelines suggest two alternative ways of fixing supplier liability for delivery (sometimes called legal obligation). Inline means the “fixing of liability by using the verbs ‘shall’, ‘should’, ‘will’, ‘may’ can be made in the text of the requirement. If the liabilities change, then the requirements change too. The use of attributes is another possibility for documenting the liabilities of requirements.” [32]

For the following reasons, it is much better to use the second approach – attributes (see the section Weighting Requirements for Importance below).

  • The requirements document will be circulated to multiple organisations as suppliers put forward their response teams during the detailed evaluation. External change control is troublesome if you re-issue documentation because some of the requirements have changed their liability.
  • When suppliers respond, they should do so ‘blind’, in the sense of not knowing which are the critical requirements. You are assessing the fit of their standard product without modifications, and this fit is not affected by the importance of the requirement. The process would be biased if suppliers knew that requirements where they are weak are ‘deal breakers’. Likewise, by keeping importance out of the language, the internal evaluation team will later ‘score blind’, without reminders of the importance – or sight of the weights.
  • It is a mistake to assume that writing requirements as ‘demands’ in the format ‘The system shall…’ or ‘The supplier must…’ will automatically and effortlessly ensure compliance.
  • By setting an authoritarian tone with such demands, you erode the collaborative atmosphere – this is not a battle, but a dance. Your selection process is heavily dependent upon the expertise of highly-skilled knowledge professionals at each supplier. Using the vocabulary of an old-fashioned master-servant relationship will hinder their motivation – and your decision-making.
  • The terms must and will can be counter-productive. They can mean you risk losing control. Candidate suppliers might well respond with extras to give 100% compliance. You have lost the initiative in ‘value engineering’ – where the evaluation team or steering committee decide how much function you as customer can sacrifice to adopt the standard software without customisation (ie tailored code).

Ten Desired Characteristics of a Requirement for an Off-the-Shelf Selection

There are several general characteristics that are desired in any requirements statement – you always expect it to reflect all stakeholder groups, to be understandable, unambiguous, concise, traceable, consistent, quantified, value-driven, neutral/impartial and complete [14] [33] [34] [35] [36]. In addition, some specific characteristics are important when the requirements will be the input to selecting off-the-shelf solutions [16].

  1. Non-prescriptive: concentrates on business outcomes, without mandating the IT processing method, location, medium (such as paper) or person involved (except to set context). This is sometimes referred to as design-independent, as avoiding premature design decisions or avoiding architecture in requirement statements. Describe what the software needs to do, rather than how it should do it.
  2. Collected: brings together similar requirements from different conversations or project documents, from different times and, potentially, different requirements engineers.
  3. Thematic: avoids silos, is non-partisan, reflects systematic thinking.
  4. Unique: avoids duplication to prevent double counting during scoring (duplicated rewards or penalties for the same software processing or supplier attribute).
  5. Sequenced: placed in a logical order in the document flow, preferably within requirement ‘families’ or categories for a coherent response by module specialists at suppliers.
  6. Standalone: without concomitant peers that automatically score high/low in sympathy, so each requirement ideally measures a different dimension of the candidates. This is sometimes referred to as avoiding redundancy.
  7. Measurable: articulated with a contractual commitment to delivery in mind. This is sometimes referred to as verifiable.
  8. Shrewd: avoids vague (arguably naïve) terms such as friendly, flexible, fast, instant, immediate or ‘etc’. Because these are not enforceable, they are a poor basis for an evaluation or a contractual agreement.
  9. Positive: states future need, but avoids dogmatic vocabulary to better ‘manage the talent’ and to motivate suppliers.
  10. Referenced: such as by a number to tie back to the RFI, the detailed scoring matrix and the demonstration outline – manageably fine-grained, with at least 2–4 discrete requirements on a page.

Validate, Agree and Refine Requirements

As in any requirements capture, the requirements need to be validated and negotiated. Remind yourself – and remind users – that not all requirements can be met.

In practice, validating requirements – especially with a large consultation programme – means circulating drafts of the requirements document for comment. This allows you to consult additional people who could not attend the meetings. The fact that some document commentators are reading the requirements ‘cold’ is another reason for a natural language specification.

You may encourage a team to study the requirements document (or ‘their’ part of it) in a more social review during a meeting, with an appointed scribe to send back comments.

You might invite people to send back a document with tracked changes.

It is vitally important to organisational change management that you reflect the comments you receive in the next version [37]. Often, the consultation itself is more important to project success than the ‘concrete’ requirements derived.


Weighting Requirements for Importance

Requirements are not all equal – different requirements have different value to your organisation. Weighting requirements can be considered one part of validation. Weighting means deciding which requirements are must-have and which are nice-to-have – some projects use the MoSCoW approach [38]. While such terms are common, the best approach for a selection project is numeric, as follows.

  • Rather than using only two values, you should allocate a number, within the recommended range from 1 to 5, to record the spread of importance from trivial to mandatory.
  • During the weighting process itself, use the extra value of 0 to flag entries that are agreed on reflection not to be requirements at all, so they can be removed from the next (final) edition of the requirements document. This approach is an important contribution to the legitimacy of those requirements that remain.
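The two points above can be sketched as a small pruning step: record an agreed weight on the 0–5 scale against each requirement, then drop the weight-0 entries before the final edition of the requirements document. The reference codes and weights are invented for illustration.

```python
# Agreed weights from the weighting meeting (0 = not a requirement at all)
agreed_weights = {
    "FIN-01": 5,   # mandatory
    "SCH-02": 2,   # nice-to-have
    "DOC-09": 0,   # agreed on reflection not to be a requirement
}

# Only requirements with weight > 0 survive into the final document
final_document = {ref: w for ref, w in agreed_weights.items() if w > 0}
print(sorted(final_document))   # DOC-09 no longer appears
```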

Note that the term weight is equivalent to priority in the IREB Glossary. However, weight works better as the term for the number that indicates importance. It will be used on the Weighted Attribute Matrix for scoring candidates, where the weight for importance is multiplied by the score for fit [16]. During your selection project, you should avoid the term priority in case it suggests a premature indication of which facilities in the new software will be implemented first – when writing the requirements document, you do not yet know the facilities and constraints of the successful solution because you have not chosen it. Therefore, you cannot predict (and should not pre-empt) the sequence of implementing features. Priority in the sense of urgency will become relevant later – when planning the implementation of the successful candidate.

The two approaches to determining how important each requirement is can be summarised as ‘expert meeting’ or ‘mechanical’.

  • The expert meeting is the weighting workshop, which should use informal or formal voting [39]. A good decision-making cycle is to vote immediately on each requirement, because views may already align. Go quickly round the table so people can simply state their vote 0–5, without supporting argument. People with no strong view can abstain, allowing the average to be set by those with a clear opinion. Use a vote-debate-revote cycle only for those requirements where views on importance are split.
  • You can also determine weight by mechanical formulae. For instance, if you map each requirement onto the project’s use cases, those requirements which feature in a high proportion of use cases should be allocated a high weight.
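The mechanical approach can be sketched in a few lines of code. This is an illustrative assumption of one possible formula, not a prescription from the literature: here, the fraction of use cases in which a requirement features is mapped onto the recommended 1–5 weight scale, and the requirement IDs, use-case names and thresholds are invented for the example.

```python
# Hedged sketch: mechanical weighting by use-case coverage.
# A requirement featuring in a higher proportion of use cases
# receives a higher weight on the 1-5 scale. All data and
# thresholds below are illustrative assumptions.

def weight_from_coverage(coverage: float) -> int:
    """Map the fraction of use cases featuring a requirement to a 1-5 weight."""
    if coverage >= 0.8:
        return 5
    if coverage >= 0.6:
        return 4
    if coverage >= 0.4:
        return 3
    if coverage >= 0.2:
        return 2
    return 1

# Hypothetical mapping of use cases to the requirements they involve.
use_cases = {
    "UC1": {"R1", "R2"},
    "UC2": {"R1", "R3"},
    "UC3": {"R1", "R2", "R3"},
    "UC4": {"R1"},
}

requirements = {"R1", "R2", "R3"}
weights = {}
for req in sorted(requirements):
    coverage = sum(req in reqs for reqs in use_cases.values()) / len(use_cases)
    weights[req] = weight_from_coverage(coverage)

print(weights)  # R1 in 4/4 use cases -> 5; R2 and R3 in 2/4 -> 3
```

A formula like this is best treated as a first draft: the weighting workshop can still override the mechanical result where the experts disagree with it.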

Importance of the ‘Weight’ Attribute

The weight is arguably the most important requirement attribute in a selection project. It has multiple roles.

  • After the weighting meeting, any requirement with a weight above zero has been endorsed by the weighting team as having value to the organisation, and will feature in the final version of the requirements document. This avoids the problematic practice of circulating a large document with a covering note saying that, if no comments arrive within a week, the requirements will be regarded as acceptable. Because each individual requirement has been endorsed, there is no need to sign off the bulky document itself.
  • Weights will be an important part of deciding which requirements are turned into criteria on the Request for Information (RFI, the preliminary shortlisting questionnaire).
  • The Weighted Attribute Matrix (WAM) is the format that will be used for multi-criteria decision-making (MCDM). The weight ensures the mathematics of WAM (during the scoring for fit) will reward candidates that are strong in areas that are important to the organisation [16].
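The arithmetic behind the Weighted Attribute Matrix can be shown in a minimal sketch. The weight and fit scales, the candidate names and the scores below are illustrative assumptions; the only point carried over from the article is that each fit score is multiplied by the requirement's weight before summing [16].

```python
# Hedged sketch of a Weighted Attribute Matrix (WAM) total:
# each candidate's fit score per requirement is multiplied by the
# requirement weight (1-5) and the products are summed. The data
# and the 0-4 fit scale are illustrative assumptions.

weights = {"R1": 5, "R2": 3, "R3": 1}            # importance per requirement

scores = {                                       # fit score per candidate
    "Candidate A": {"R1": 4, "R2": 2, "R3": 3},
    "Candidate B": {"R1": 2, "R2": 4, "R3": 4},
}

totals = {
    name: sum(weights[r] * fit[r] for r in weights)
    for name, fit in scores.items()
}

print(totals)
# Candidate A: 5*4 + 3*2 + 1*3 = 29
# Candidate B: 5*2 + 3*4 + 1*4 = 26
```

Note how Candidate A wins despite a lower raw score on two of the three requirements: the multiplication rewards strength on the heavily weighted R1, which is exactly the behaviour the weight is there to produce.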

Exploiting the Requirements During the Subsequent Phases of Evaluation, Selection, Negotiation and Contracting

Requirements are so important that some repetition is warranted. Figure 1 has already shown the flows of requirements. It is worth repeating the subsequent or ‘downstream’ uses of requirements from the perspective of the document that ‘receives’ the requirements. Remember that directly and indirectly, requirements will feed evaluation, selection and decision-making at every subsequent phase. This is because software products are so complex you must avoid comparing them to each other. You must compare all the candidates to the same yardstick. At every major decision-making stage gate, the basis of decision is the result of comparing candidates to your requirements – or criteria, questions or prompts derived from these requirements [16].

  • The Request for Information (RFI) is the preliminary shortlisting questionnaire, sometimes termed the Pre-Qualifying Questionnaire (PQQ). This is an occasion when requirements must be repurposed. To transform user requirements into RFI criteria, first select a sub-set of requirements, normally those with the highest weights. You must also turn the selected requirements into closed questions that can be tested by sending out a document and assessing the written replies.
  • When you have a shortlist of three or four candidates for detailed evaluation, supply your requirements document, usually one to two weeks before the evaluation meetings. To avoid translation losses, this document is not re-purposed. However, it may be supplied without some of the appendices towards the end of the document, if these are only of internal interest.
  • After detailed evaluation and scoring, you will prepare for reference site interviews. One source of questions is those requirements that are important (high weight) but that proved to have a weak fit (low score). You can ask the reference sites if that gap has proved a tricky issue, and if they have found a workaround.
  • The demonstration outline is not a script. The outline is like the waypoints on an orienteering course – you want the demonstration to pass through them, but there are still choices about the route. You will shape the requested demonstration by referring to the high-weighted requirements. You will also refer to the requirement categories to ensure the demonstration shows off the right features by ‘passing through’ situations that resonate with the main stakeholders, departments, divisions and user communities.
  • Furthermore, the gap analysis after scoring will shape the negotiation agenda. You are aiming to influence the next version of the software specification and therefore the evolution of the standard product. You aim for the next release (or next-but-one) to be a closer fit to your requirements. Enhancements to the standard software at the supplier’s R&D cost can be the most significant negotiation objectives, and the most productive negotiation gains if you can secure them.
  • The full requirements document will become one of the attachments to contract, along with the candidate scoring spreadsheet and the document that defines the scores awarded.
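Two of the downstream uses above are simple filters over the weighted requirements, and can be sketched as follows. The thresholds, requirement texts and scores are illustrative assumptions; the logic simply applies the article's two rules, namely that high-weight requirements become RFI criteria, and that high-weight, low-fit requirements become questions for reference sites and items on the negotiation agenda.

```python
# Hedged sketch of two downstream uses of weighted requirements:
# (1) selecting high-weight requirements as RFI criteria, and
# (2) after scoring, flagging 'gaps' (high weight, low fit) to raise
# with reference sites and during negotiation.
# All thresholds and data below are illustrative assumptions.

requirements = {
    "R1": {"weight": 5, "text": "Import legacy customer records"},
    "R2": {"weight": 2, "text": "Export reports to PDF"},
    "R3": {"weight": 4, "text": "Role-based access control"},
}

# (1) RFI criteria: only the high-weight requirements go forward.
rfi_criteria = [r for r, d in requirements.items() if d["weight"] >= 4]

# (2) Gap analysis for one shortlisted candidate's fit scores (0-4).
fit_scores = {"R1": 1, "R2": 4, "R3": 3}
gaps = [
    r for r in requirements
    if requirements[r]["weight"] >= 4 and fit_scores[r] <= 1
]

print(rfi_criteria)  # ['R1', 'R3']
print(gaps)          # ['R1'] -> ask reference sites about this gap
```

In practice the selection of RFI criteria involves judgement as well as a weight threshold, but recording weights as data makes both filters repeatable and auditable across the project's stage gates.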


For many organisations and types of software, off-the-shelf is the only practical option. For a selection project, you need requirements and this means you need requirements documentation. This must be agreed and weighted internally and the same detailed document should be released to the candidate suppliers on the shortlist. The language you use is important, as you need to set a collaborative tone for the project. Your language also heavily shapes the supplier responses. The requirements, as originally recorded or as inputs to other documentation, consistently provide the yardstick against which to measure candidates during evaluation, demonstration, reference sites, negotiation and contracting.

References and Literature

  • [1] Morring Jr, F. (2014) Off The Shelf. Aviation Week & Space Technology, Vol. 176, Issue 27. 49-51.
  • [2] Nan, Z. and Jiamin, W. (2014) Reliability Analysis of COTS-based Software System. International Journal of Multimedia and Ubiquitous Engineering, Vol 9, No 8. 73-84.
  • [3] Carney, D. Hissam, S. and Plakosh, D. (2000) Complex COTS-based software systems: practical steps for their maintenance. Journal of Software Maintenance: Research and Practice, Volume 12. 357-376.
  • [4] Marchesi, M. Succi, G. and Russo, B. (2007) A model of the dynamics of the market of COTS software, in the absence of new entrants. Information Systems Frontiers, Volume 9. 257–265.
  • [5] Chung, L. and Cooper, K. (2004) Defining Goals in a COTS-Aware Requirements Engineering Approach. Systems engineering: the Journal of the International Council on Systems Engineering, Vol. 7, Part 1. 61-83.
  • [6] Maiden, N. A. and Ncube, C. (1998) Acquiring COTS Software Selection Requirements. IEEE Software, Vol. 15, No. 2. 46-56.
  • [7] Arthaud, R. (2015) Is requirements engineering still needed in agile development approaches? IREB Requirements Engineering Magazine (05 Aug 2015).
  • [8] Nettleton, D. (2003) How to Purchase COTS Software. BioPharm International, March. 26-27.
  • [9] Archer, D. and Cameron, A. (2014) ‘One city: Two stadiums. Lessons learned in megaprojects’. Project Manager Today, XXVI Issue 3 (Apr 2014), 16–19.
  • [10] Archer, D. and Cameron, A. (2013) Collaborative leadership: Building relationships, handling conflict and sharing control. Routledge: New York.
  • [11] Glinz, M. (2014) A Glossary of Requirements Engineering Terminology, Version 1.6. Requirements Engineering Research Group, Zurich (06 June 2015).
  • [12] IREB. (2015) Syllabus ‑ IREB Certified Professional for Requirements Engineering ‑ Foundation Level. International Requirements Engineering Board, Karlsruhe (06 June 2015).
  • [13] IREB. (2011) Syllabus ‑ IREB Certified Professional for Requirements Engineering ‑ Elicitation and Consolidation, Advanced Level. International Requirements Engineering Board, Karlsruhe (06 June 2015).
  • [14] Rupp, C. and Schöne, K. (2015) Requirements under construction: Agreed, unambiguous and based on inventions. Requirements Engineering Magazine, IREB (05 Aug 2015).
  • [15] Rupp, C. (2012) Chapter 18 – Requirements Engineering and Management – Mediation techniques. SOPHIST (05 Aug 2015).
  • [16] Tate, M. (2015) Off-The-Shelf IT Solutions: A practitioner’s guide to selection and procurement. BCS, The Chartered Institute for IT: Swindon. Especially see:
 Chapter 5 for guidelines on cataloguing, formulating requirements, requirements formats & weighting meeting;
 Chapter 7 for guidelines on selecting & formulating effective RFI questions;
 Chapter 9 for scoring, the Weighted Attribute Matrix and techniques to close gaps in fit without paying for customisation (tailoring).
  • [17] Lawlis, P. K. Mark, K. E. Thomas, D. A. and Courtheyn, T. (2001) A Formal Process for Evaluating COTS Software Products. Computer, IEEE, Vol. 34, Issue 5. 58-63.
  • [18] Schwittek, W. and Eicker, S. (2012) Decision Support for Off-the-Shelf Software Selection in Web Development Projects. Journal on Data Semantics, ICWE 2012 Doctoral Consortium, Berlin, Jul. 238-243.
  • [19] Chung, L. Cooper, K. Lee, S. Shafique, F. and Yi, A. (2003) ACASA – a framework for Adaptable COTS-Aware Software Architecting. Computer Standards and Interfaces, Vol. 25, No. 3. 223-231.
  • [20] Mohamed, A. Ruhe, G. and Eberlein, A. (2007) MiHOS: an approach to support handling the mismatches between system requirements and COTS products. Requirements Engineering, Vol. 12, No. 3. 127-143.
  • [21] Sai, V. Franch, X. and Maiden, N. (2004) Driving Component Selection through Actor-Oriented Models and Use Cases. Journal on data semantics, Issue 2959. 63-73.
  • [22] Ncube, C. and Dean, J. (2002) The Limitations of Current Decision-Making Techniques in the Procurement of COTS Software Components. Journal on data semantics, Issue 2255. 176-187.
  • [23] Ulfat-Bunyadi, N. Kamsties, E. and Pohl, K. (2005) Considering Variability in a System Family’s Architecture During COTS Evaluation. Journal on data semantics, ICCBSS 2005 (COTS-based software systems, Bilbao, Spain, Feb, 2005). 223-235.
  • [24] Rupp, C. (2010) Are System Engineers Complete Losers When it Comes to Communication? SOPHIST (05 Aug 2015).
  • [25] Vigder, M. Gentleman, W. and Dean, J. (1996) COTS software integration: state of the art. National Research Council Canada (NRC), 39198, January 1996.
  • [26] Thorp, J. (2003) The Information Paradox: Realizing the Business Benefits of Information Technology (New Edition). McGraw Hill Higher Education: Maidenhead.
  • [27] Boehm, B. (2000) Software Management. Requirements that Handle IKIWISI, COTS, and Rapid Change. Computer, IEEE, Vol. 33, Part 7. 99-103.
  • [28] Burden, P. and Tate, M. (2015) The softer side of choosing an off-the-shelf solution. BCS, The Chartered Institute for IT. See Articles (01 Aug 2015).
  • [29] Villalba, M. T. (2010) Software Quality Evaluation For Security COTS Products. International Journal of Software Engineering and Knowledge Engineering, Vol. 20, No. 1. 27–48.
  • [30] Alexander, I. F. and Maiden, N. (2004) Scenarios, Stories and Use Cases Throughout the Systems Development Life-Cycle. John Wiley & Sons: Chichester.
  • [31] See two downloadable formats (Detailed and Scenario requirements document) at Templates for stage: Requirement definition (chapters 4-5) (15 June 2015).
  • [32] See EU 5 in IREB. (2015) Syllabus ‑ IREB Certified Professional for Requirements Engineering ‑ Foundation Level. International Requirements Engineering Board, Karlsruhe (06 June 2015).
  • [33] Cadle, J. Paul, D. and Turner, P. (2014) Business analysis techniques: 99 essential tools for success, (2nd edition). BCS, The Chartered Institute for IT: Swindon.
  • [34] Alexander, I. and Stevens, R. (2002) Writing better requirements. Addison Wesley: Harlow.
  • [35] Brodie, L. (Ed) and Gilb, T. (2005) Competitive Engineering: A Handbook for Systems Engineering Requirements Engineering, and Software Engineering Using Planguage. Elsevier Butterworth-Heinemann: Oxford.
  • [36] Joppich, R. Rupp, C. and Wünch, C. (2010) Molecular RE – the Blueprint of a Perfect Requirement. SOPHIST (05 Aug 2015).
  • [37] Morgan, J. and Dale, C. (2013) Managing IT projects for business change: From risk to success. BCS, The Chartered Institute for IT: Swindon.
  • [38] DSDM Consortium. (2015) 10. MoSCoW Prioritisation. Dynamic Systems Development Method Limited (05 Aug 2015).
  • [39] See Conflict Resolution in Chapter 7 in Pohl, K. and Rupp, C. (2015) Requirements Engineering Fundamentals, 2nd Edition. Rocky Nook: Santa Barbara.