ICC to review independent study to see if there is a fine-edge advance to be made in DRS

Nobody will say so outright but the hope is that the results will be encouraging enough to nudge the Board of Control for Cricket in India (BCCI) – and other sceptics – to start reviewing their stance on DRS, writes Osman Samiuddin.

On Tuesday and Wednesday this week, the International Cricket Council (ICC) cricket committee has been presented with the first independent assessment of the technologies used in cricket's Decision Review System (DRS).

Engineers from the Massachusetts Institute of Technology (MIT) have been working with the ICC over the past year to test the various ball-tracking and edge-detection technologies that cricket uses; Sanjay Sarma, professor of mechanical engineering at MIT, and Dr Jaco Pretorius have headed the project and both were part of the presentations.

Such an assessment has been needed for some time, but was prompted by the events of the 2011 series between England and India and, more recently, the 2013 Ashes. In both, Hot Spot, the infrared imaging system employed to detect edges through the heat created by friction between bat and ball, suffered a couple of high-profile glitches. That led its creator, Warren Brennan, to concede that Hot Spot "will miss fine edges" on occasion.

The ICC were put in touch with Professor Sarma, who doubles as a cricket buff and has followed cricket’s dance with the DRS closely over the years. Sarma has led a team of MIT researchers and engineers on the project.

The first step for his team was to build apparatus suited specifically to testing the technologies. They began by creating a device for testing edge-detection technologies, which was completed and put to use in September last year at the England and Wales Cricket Board's National Cricket Performance Centre in Loughborough.

“One of the difficulties in testing these edge-detection products is that their performance is probably best assessed when there is really fine contact,” Geoff Allardice, the ICC’s general manager cricket, said earlier this month (Allardice has overseen the entire process).

“But to generate enough really fine contact repeatedly – if you have somebody throw a ball and somebody try to generate thin edges for a big enough sample, you’d have someone there for a week doing it – [they made] an apparatus with a swinging arm where we generate fine contact between ball and bat on a regular basis (see video). They can control the positioning of the ball to calibrate edges.”

*** WATCH: An explainer video of the technology the ICC are reviewing ***

The bat was instrumented with sensors so that vibration from even the thinnest contact with the ball registered on the MIT team's recording equipment; the sound the contact created was also recorded by the technology being tested – in the video, for instance, it was Hawk-Eye's Ultraedge (Ultraedge detects edges by automatically synching vision from ultra-motion cameras to the audio from stump mics). The resulting data – from the MIT device and from Ultraedge – were then cross-referenced and compared. The same method of testing was used to assess the other form of edge-detection, the heat-based Hot Spot.
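In essence, that cross-referencing step compares two lists of events: contacts registered by the bat's vibration sensors and edges flagged by the audio-based system. The sketch below illustrates the idea; the timestamps, the 50-millisecond tolerance and the function name are illustrative assumptions, not details of the MIT protocol.

```python
# Hypothetical sketch: check which contacts registered by the instrumented bat
# were also flagged by an audio-based edge-detection system.
# The timestamps (in seconds) and the tolerance are illustrative assumptions.

def match_detections(sensor_times, audio_times, tolerance=0.05):
    """Split sensor-registered contacts into those the audio system
    also detected (within the tolerance) and those it missed."""
    matched, missed = [], []
    for t in sensor_times:
        if any(abs(t - a) <= tolerance for a in audio_times):
            matched.append(t)
        else:
            missed.append(t)
    return matched, missed

# Five genuine contacts produced by the swinging-arm rig;
# the audio system flagged only four of them.
sensor_times = [1.20, 3.45, 5.10, 7.80, 9.05]
audio_times = [1.22, 3.44, 7.79, 9.06]

matched, missed = match_detections(sensor_times, audio_times)
print(f"detected {len(matched)} of {len(sensor_times)} contacts; missed at {missed}")
```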

“We went a couple of days later to Lord’s [for England’s ODI against Australia], where they set up the Ultraedge on a desk and the guys assessed that,” said Allardice. “They looked at the timing of the sounds, so the ball hits the middle of the bat and there is a sound that goes up and they make sure that appears at the moment the ball strikes the bat. So we do a closed session and a match session to see, practically, what it is like.”

Similar match assessments were held for Real-Time Snicko – a competitor product to Ultraedge but developed earlier – and Hot Spot during the Christchurch Test between New Zealand and Australia in February this year. In April more offline testing was carried out in Melbourne on the same technologies.

But given that the BCCI's long-standing objection has focused in particular on ball-tracking systems such as Hawk-Eye and Virtual Eye, it is the assessments of those that will be most keenly observed at the cricket committee meeting at Lord's in London.

The BCCI has not been alone in its reservations about the predictive elements of such technology, which draw up a trajectory of the ball after it strikes a bat or pad – Ian Taylor, the creator of Virtual Eye, has often expressed his unease at a projection being used as a tool of decision-making.

For this the MIT team created a square frame with a laser field inside it, so that whenever a ball passed through the frame its exact co-ordinates could be recorded. The frame was built in Boston, disassembled, shipped to Winchester, England, in late April and reassembled for the testing.

It was set up around the stumps. Hawk-Eye cameras were in place on temporary scaffolding in front of the wicket, behind it and on either side. An infrared camera hovered over a good-length area of the pitch where the ball would land – the testing used both real bowlers and a bowling machine to deliver a sample of 150-200 balls over a couple of days. They used a spinner and a fast bowler, as well as all three colours of ball: red, white and pink.

That left MIT and the ICC with data on where the ball pitched and where it passed over the stumps, which could then be compared with the same two points as recorded by Hawk-Eye.

Allardice acknowledged the importance and complexity of the ball-tracking assessments. “The edge-detection is a binary result – you can interpret a sound or you can’t, or you can interpret a mark or you can’t. How umpires interpret the images presented by those two systems is another point for consideration. Was a sound registered yes or no?

“This [ball-tracking] is more, how close to the two co-ordinates does the ball pass, what’s the standard deviation and what’s the average variation between the technology and the testing device?”
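That comparison amounts to measuring the distance between each co-ordinate reported by the ball-tracker and the corresponding laser-frame reading, then summarising the errors. The sketch below shows the arithmetic with invented numbers; the co-ordinates, units and variable names are assumptions, not the MIT team's data.

```python
import math
import statistics

# Hypothetical sketch: compare ball positions reported by a ball-tracker
# with reference co-ordinates from a laser-frame rig.
# Points are (x, y) in millimetres in the plane of the frame; all values invented.
reference = [(102.0, 451.0), (98.5, 447.2), (110.3, 460.1), (95.0, 455.5)]
tracker = [(103.1, 452.4), (97.9, 446.0), (112.0, 461.3), (94.2, 457.0)]

# Error for each delivery: distance between the tracker reading and the reference point.
errors = [math.dist(r, t) for r, t in zip(reference, tracker)]

print(f"average variation: {statistics.mean(errors):.2f} mm")
print(f"standard deviation: {statistics.stdev(errors):.2f} mm")
```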

The assessments of ball-trackers could have implications for one of the more contested elements of the DRS – the "umpire's call" in LBW decisions. It is the element that best captures cricket's prevarication over technology: caught between wanting to trust the technology implicitly and feeling the need, out of a sense of tradition, to respect the umpire's authority and retain faith in his judgment. It allows, for instance, a batsman to remain not out even if ball-tracking shows the ball would have hit the stumps (just not enough of the ball, as per a predefined parameter), as long as the umpire's original decision was not out.
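As a rough illustration of that principle, the logic below collapses the LBW review to a single hypothetical "fraction of ball hitting the stumps" parameter; the real playing conditions weigh several zones (pitching, impact, the wicket zone) and are not reproduced here.

```python
# Simplified, single-parameter sketch of the "umpire's call" principle.
# The threshold and function are illustrative; the real DRS protocol
# considers pitching, impact and wicket zones, not just one number.

UMPIRES_CALL_THRESHOLD = 0.5  # assumed: minimum fraction of the ball hitting the stumps

def lbw_review(fraction_hitting_stumps, on_field_decision):
    """Outcome of an LBW review under this one-parameter model."""
    if fraction_hitting_stumps == 0:
        return "not out"                 # tracking shows the ball missing entirely
    if fraction_hitting_stumps >= UMPIRES_CALL_THRESHOLD:
        return "out"                     # tracking is conclusive
    return on_field_decision             # marginal: the original decision stands

print(lbw_review(0.3, "not out"))  # clips the stumps, original not out stands
print(lbw_review(0.3, "out"))      # same projection, but original out stands
print(lbw_review(0.8, "not out"))  # conclusive: overturned to out
```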

The assessments will prompt a wider discussion around whether the "umpire continues to make decisions on-field and technology reviews those decisions, whether players continue to initiate [the] review or somebody else [does], and then the details," said Allardice.

"People had views on the 'umpire's call'. Lots of different schools of thought on that, but the current interpretation flows from the philosophy. Those are all elements of the discussion. We're looking for the cricket committee to provide some direction [on] how they see DRS being used in the future."

The results are unlikely to be made public and neither, Allardice said, will they be a ranking of technologies. “Most series are running around the world with good results using all the technologies so I don’t think the aim is for us to be saying yes or no to any technology or that they should be up here, but just more to understand their performance, their strengths and weaknesses.”

The significance of the other prominent figure in the process, Anil Kumble as head of the cricket committee, will not be lost on anyone. Kumble was India’s captain when DRS was used for the first time, in the 2008 series against Sri Lanka. He was not “convinced with the tools used and the accuracy of it” at the time, thus forming what has been the BCCI stance ever since.

There were suggestions of a softening earlier this year, when reports emerged that the BCCI was considering using elements of DRS in the Indian Premier League (IPL). That did not eventually materialise, though David Richardson, the ICC’s chief executive, has said in the past he hopes to convince the BCCI to change their stance by converting Kumble first.

He has time – Kumble has just been reappointed head of the committee for another three years. “Anil is very closely involved in this project,” Allardice said. “He has visited Boston and seen the work they are doing and he is very encouraged by the direction they are heading in.

“He has a very logical thought process and wants to see it independently assessed first. I think at this stage we’re just trying to get all the information to present to the cricket committee.

"And for [the] last couple of years, we've been trying to get DRS working as best as we possibly can. We get it working really well in certain games and it's still doing the job of increasing our percentage of correct decisions by 4-5%."

DRS technologies

• Hawk-Eye and Virtual Eye

The two main ball-tracking technologies in use throughout the cricket world, specifically for LBW decisions. The systems use several special cameras placed around the ground, all linked to a computer. The video from the cameras is triangulated and combined to produce an accurate 3D representation of the path of the ball.

• Hot Spot

First introduced to cricket in the 2006-07 Ashes by Channel Nine, Hot Spot adapts infrared technology commonly used by the military to track jet fighters and tanks. It uses two infrared cameras which sense and measure the heat from the friction generated by a collision – ball on pad, ball on bat, ball on ground or ball on glove – helping in decisions on nicks and bat-pad catches.

• Ultraedge and Real-Time Snicko

Both competing products use audio and video technology to help detect edges. RTS built on the original Snick-o-meter, which used audio from stump mics and video output to highlight edges. Both RTS and Ultraedge are far quicker, aligning ultra-motion cameras to the stump mics so that the process of synching audio with video is automated.
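A rough sense of what that automation involves: an audio spike from the stump mic is mapped onto the ultra-motion frame captured at the same instant. The sketch below uses assumed sample and frame rates purely for illustration.

```python
# Hypothetical sketch of audio-to-video alignment: map the time of a stump-mic
# spike to the nearest ultra-motion camera frame. Rates and values are assumed.

AUDIO_SAMPLE_RATE = 48_000  # stump-mic samples per second (assumed)
VIDEO_FRAME_RATE = 1_000    # ultra-motion frames per second (assumed)

def audio_sample_to_frame(sample_index):
    """Return the ultra-motion frame number shown at the instant of this audio sample."""
    seconds = sample_index / AUDIO_SAMPLE_RATE
    return round(seconds * VIDEO_FRAME_RATE)

# A sharp spike at audio sample 62,400 (1.3s into the clip) lines up with frame 1300.
spike = 62_400
print(f"audio spike at sample {spike} -> video frame {audio_sample_to_frame(spike)}")
```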

osamiuddin@thenational.ae
