Streamlining Field Evaluation Trials

The Ministry of Defence (MoD) has reportedly identified the field evaluation trial (FET) as one of the stages responsible for delays in materiel procurement. The Defence Acquisition Procedure 2020 (DAP 2020), which regulates capital acquisitions, envisages 16–24 weeks for trials to be completed and an additional 12 weeks for winter trials, if required.[1]

The overall period of 16 to 36 weeks appears reasonable, but the prevailing impression is that it takes much longer. No reliable data is available publicly on the average time it takes to complete FET or why the trials tend to linger, though opinions on the subject abound. An empirical study to determine why steps taken in the past have not had the desired effect might have helped address the root cause of delay more effectively.

Be that as it may, a good starting point would be to acknowledge that trials, in whatever form they are conducted, cannot be eliminated, especially when equipment is being inducted into service for the first time. However, their purpose must be clear, as the type and extent of trials/tests to be carried out on the equipment depends on what is intended to be achieved.

There are many provisions scattered across DAP 2020 relating to various aspects of FET, and an appendix contains guidelines on how field trials are to be conducted.[2] Perhaps the only definitive indication of the purpose of FET is the following statement:

FET will not be conducted as a process of elimination, but to nurture competition. Therefore, the evaluation’s primary focus should be on testing the equipment based on anticipated employability.[3]

However, the primary objective of trials is to ensure that the equipment meets the operational requirements (ORs) for which it is being procured, not to nurture competition. The trials are carried out strictly according to the case-specific trial methodology and directives. The former is specified in the Request for Proposal (RfP), and the latter are formulated subsequently, in consultation with the successful vendors, after evaluation of their technical bids.

It is hard to imagine any trial team, essentially an ad hoc entity constituted whenever a trial is to be carried out, exercising its discretion to deviate from these specific directives. No officer or team of officers would dare to override the express checkpoints listed in the trial methodology and declare the equipment under trial acceptable for induction into service on the ground that it is fit for its anticipated employability, while overlooking non-compliance with one or more of those checkpoints.

This serious choke point traces its origins to how an acquisition proposal is conceived by the Service Headquarters (SHQ) concerned. At that stage, the trial methodology is formulated for inclusion in the RfP. Any deviation from it at the trial stage is potentially risky and legally challengeable.

If this reality is acknowledged, the focus must shift to formulating a pragmatic trial methodology that ensures the equipment under trial serves the operational purpose for which it is being procured. The problems inherent in the Services Qualitative Requirements (SQRs) are reflected in the trial methodology and, in turn, affect the FET. This calls for a review of the current procedure for formulating the SQRs, which form the basis of the trial methodology.

The SQRs are drafted by the User Directorate concerned primarily based on the response to the Request for Information (RfI), and in consultation with multiple agencies within the Services and MoD, including the HQ Integrated Defence Staff, Defence Research and Development Organisation (DRDO), Quality Assurance Agency, Directorate of Standardisation, and Additional Directors General (Acquisition–Technical) in the Capital Acquisition Wing.[4]

The SQRs are divided into Essential Parameters A and B, and Enhanced Performance Parameters, though not in all cases. These are approved by the multi-member Staff Equipment Policy Committee (SEPC) of the SHQ concerned, or by a Joint SEPC if the equipment is to be used by more than one Service.[5] Interestingly, the SEPC, which takes up the draft SQRs prepared by the SHQ concerned, is ‘authorised to consult Subject Matter Experts as deemed necessary while finalising SQRs’.[6] It is not known, however, how often this provision has been invoked.

The main issue with this arrangement is that the draft SQRs are prepared by a group of officers posted at the SHQ at a given point in time. As users, they have primacy in identifying the need for a specific operational capability and specifying the ORs. However, converting the ORs into SQRs calls for a kind of expertise that transcends the users’ bailiwick, and this gap has a direct bearing on the practicality of the resulting RfPs.

As per the 15th report of the Standing Committee on Defence (15th Lok Sabha), in the 18 months starting 1 September 2010, as many as 41 RfPs of a particular Service fell through because of QR-related problems. Several years ago, the late Manohar Parrikar, an IIT alumnus and then India’s Defence Minister, jokingly said that the military’s SQRs at times appeared to be straight out of “Marvel comic books”, as the technologies demanded were unrealistic and simply non-existent.[7]

More recently, in 2022, while withdrawing from the tender for new submarines required by the Indian Navy, the Russian designers said that the project was unrealistic as the desired technologies could not be made available within the stipulated timelines.[8] Three years later, the contract is yet to be finalised.

The SQR-related issues tend to dog the acquisition process not only at the pre-bid and bidding stages, but also at the trial evaluation stage, which typically comprises User Trials, Technical Trials (including environmental testing), Maintainability Evaluation Trials (MET), Electromagnetic Compatibility (EMC)/Electromagnetic Interference (EMI) Evaluation, and Secrecy Testing, where required.[9] Trials are carried out in different weather conditions and terrains, and trials for naval and airborne assets are conducted differently.

It is evident that this is a complex system. Provisions made in DAP 2020 and earlier manuals to ease the trial burden include accepting vendor self-certification, certification by accredited laboratories, trials by computer simulation/modelling, documented historical validation data, etc.[10]

Even so, it is difficult to say what impact these measures have had, or why they have not had the desired effect. It is equally difficult to say what can be done to rectify past mistakes, if any, or what new steps can be taken to simplify this complex system. One thing, however, is clear: like the formulation of SQRs, trials must be conducted by a permanent establishment, not by ad hoc trial teams, so that accumulated experience and confidence can be brought to bear on the trials.

Perhaps mindful of this functional imperative, the MoD had contemplated setting up Trial Wings at Military Training Institutions/Establishments so that trials could be conducted under their aegis.[11] This would have been a welcome, albeit sub-optimal, substitute for a composite organisation handling all aspects of defence capital acquisition, which a committee constituted by the MoD had recommended setting up about a decade ago.[12]

It would be helpful to implement this idea, taking inspiration from the successful examples of the UK’s Defence Equipment and Support and France’s Direction Générale de l’Armement (Directorate General of Armament), professional organisations responsible for handling all functions relating to defence acquisition.

An honest introspection is needed to determine why things have not worked out as expected, despite various measures taken to address the problem. This is a sine qua non for identifying appropriate and bold remedial measures for the future.

Views expressed are of the author and do not necessarily reflect the views of the Manohar Parrikar IDSA or of the Government of India.

[1] Defence Acquisition Procedure 2020, Chapter II, pp. 164–165.

[2] Ibid., Chapter II, Appendix G, pp. 146–149.

[3] Ibid., Chapter II, para 66, p. 44.

[4] Ibid., Chapter II, para 15.

[5] Ibid., Chapter II, paras 14 to 20.

[6] Ibid., Chapter II, para 18.

[7] Rahul Bedi, “Not Just the MoD, the Ministry’s QR Overreach is Also Culpable for Impeding Modernisation”, The Wire, 9 June 2021.

[8] Manu Pubby, “Indian P 75I Submarine Plan Unrealistic, Timelines Cannot Be Met: Russian Designers”, The Economic Times, 15 August 2022.

[9] DAP 2020, Chapter II, para 66.

[10] Ibid., paras 44 and 68.

[11] Ibid., para 70.

[12] Dinakar Peri, “A Shot in the Arm for Defence Acquisition”, The Hindu, 11 February 2017.

Keywords: Defence, Defence Acquisition