Abstract
Background: In modern-day cricket, technology is used to track the trajectory of the ball in order to help with umpiring decisions, known as the Umpire Decision Review System (DRS). Multiple cameras are placed around the ground at different angles so that the ball's location can be calculated in three dimensions. This technology is very expensive and is therefore only available in elite-level matches. In this project, an affordable homemade version of the Umpire DRS is designed.
Methods: A stereo-camera setup was chosen to allow accurate distance measurements using triangulation in an affordable manner. Simulations were performed to determine the optimal parameters of the physical setup.
Results: Wait and see...
Conclusion: Wait and see...
Umpire DRS in Action - The ball trajectory of the infamous LBW appeal during the 2019 Ashes (3rd Test) against Ben Stokes, bowled by Nathan Lyon. It's worth a dive into the circumstances and consequences of the "Not Out" decision from this delivery if you are unfamiliar. It contributes to perhaps one of the most dramatic finishes to a modern Test match. Image is a screengrab from Sky Sports coverage.
What's in this post?
- A bit of background
- The Plan
- Approach
- Simulations
- Field-Test
- Camera Localisation
- Automated Ball Detection
- 3D Visualisation
- Final Product
A bit of background
In the game of cricket, one of the most difficult decisions for an umpire to correctly call is a leg-before-wicket (LBW) dismissal. When an LBW is given, it is because the umpire is convinced that, were the ball not intercepted by the batsman's pads, it would have continued on to hit the wicket (along with one or two other criteria). Despite the umpire standing in an ideal location to make a decision, in a sample of 912 LBW appeals from international games, just 77% were correctly called on-field (Shivakumar, 2016). I say just... This is an extremely difficult decision to make. Modern fast bowlers are slinging the ball at over 90 mph (145 km/h), giving the umpire around half a second to judge all aspects of the delivery and make a decision*.
In May 2001, during a Test match between England and Pakistan, "Hawk-Eye" was revealed to the world of cricket. Hawk-Eye is a piece of technology used to track the trajectory of a cricket ball once it has been released by the bowler. Using multiple cameras placed around the ground and capturing images at 240 frames per second, the location of the ball can be determined in 3D space relative to the wicket. This allows many interesting aspects of a delivery to be calculated, such as speed, swing, turn and many more. One very appealing piece of information it can provide is an approximation, using a modelled trajectory, of whether the ball would have continued on to hit the stumps had it not been intercepted, as this can help with those difficult LBW decisions.
Millions of pounds have been spent developing the technology to achieve this, as the basis for the system stems from missile detection and tracking. But picture the scene; you're bowling at your mate in the nets at your local village green. Your rhythm is good, the batter is struggling against your tricky length and you decide that now is the time to pin them. You release an absolute jaffa much straighter than your last few, and the ball is hurtling towards the top of middle stump. You hear the thud of the ball against the front pad as the panicked forward defence fails. You know it's plumb, but the batter is stubborn and is convinced it wasn't hitting. You need the satisfaction of the dismissal, but you didn't quite have a few million quid going spare to set up Hawk-Eye for the occasion (or Michael Gough's number).
Well, never again. In this post I'm going to investigate the possibility of developing your own Hawk-Eye using cheap and readily available components to track your cricket deliveries in order to dismiss your stubborn mates. It probably won't be able to detect missiles though.
* It is worth noting here the exceptional accuracy of umpire Michael Gough's on-field decisions. Since 28th September 2017, 95.1% of his on-field decisions have been upheld (ESPNCricInfo). To put this in perspective, the second highest is Kumar Dharmasena with an accuracy of 78.7%. Interestingly enough, he explains that the consistency of his decisions comes down to cleaning his ears often (BBC Sport). Who knew? Don't waste your reviews trying to prove Goughy wrong.
The Plan
This is a pretty hefty project, so I'm going to break it down into smaller chunks.
- The first step is to decide upon the approach that will be used to track the ball's trajectory. There are numerous methods that are feasible, but deciding upon one will require a small investigation.
- Once an approach has been chosen, the scene will be simulated using 3D modelling software. A virtual cricket ball will be placed around a computer generated three-dimensional environment and the measured location will be compared to the true location to understand the setup's accuracy and limitations.
- Using the optimal setup found from simulations, I will perform a field-test of the physical setup. This will involve checking that the calibration is successful, the cricket ball is clearly visible, and that distances can be measured accurately. Hopefully my local cricket club won't mind me borrowing their nets for a few hours...
- When I am happy with the physical setup, I will develop a method to localise the cameras to the wicket. This means making sure that the cameras are aware of where the stumps are so LBW decisions can be made.
- To speed up the process of ball tracking, I will try to implement automated ball detection within each frame to save manual identification. Choosing a method will require a look at what current techniques are most suited.
- Next, I will display the ball trajectory of deliveries using 3D Visualisation to mimic the graphics used by the Umpire DRS seen from the TV coverage.
- Finally, I will see my bowling figures increase as I get to review every LBW appeal that isn't given my way.
Approach
Cameras
The best place to start will be looking at the gold standard in ball tracking - Hawk-Eye Innovations. This technology is now used in over 20 sports, implemented in over 450 stadiums and covers over 7,200 games or events per year (Hawk-Eye Innovations), so it must be pretty good.
This method uses six strategically placed cameras, high above the ground and all centred on the wicket (Figure 1). They typically have high zoom, large resolution and a very high frame rate, capturing 240 images every second. They use a principle known as "triangulation", which is the process of calculating an object's location in three dimensions using multiple images from different angles. The same concept is used by the human eye to gauge depth. Have you ever tried catching a cricket ball with one eye closed? It's tough. It's made a whole lot easier with both eyes open, as this allows us to determine how far away it is from our face.
Figure 1 - The locations of the specialised cameras used to track the ball's trajectory, as implemented by Hawk-Eye Innovations in cricket.
Generally speaking, when it comes to triangulation, the more cameras the better. This does, however, make the mathematics behind triangulation harder. But this is a budget project, so we will be using the bare minimum of two cameras. When using two cameras, we can call the setup a "stereo-camera" system, and the calculations for triangulation become a whole lot easier. There are also freely available packages to help with stereo-camera systems.
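To make the triangulation idea a bit more concrete, here is a minimal sketch (in Python) of how an ideal, perfectly parallel and rectified stereo pair turns a pixel shift between the two images into a distance. The focal length, baseline and pixel values are placeholder numbers for illustration, not measurements from our setup.

```python
# A minimal sketch of depth from disparity for an ideal, rectified,
# perfectly parallel stereo pair. All numbers are placeholders,
# not measurements from the real setup.

def depth_from_disparity(x_left, x_right, focal_px, baseline_m):
    """Distance to a point seen at pixel column x_left in the left image
    and x_right in the right image of a rectified stereo pair."""
    disparity = x_left - x_right              # pixels
    if disparity <= 0:
        raise ValueError("expected the point to shift left in the right image")
    return focal_px * baseline_m / disparity  # metres

# Example: a ball whose image shifts by 40 px between cameras that are
# 0.5 m apart, with a focal length of 1400 px, is roughly 17.5 m away.
print(depth_from_disparity(x_left=960, x_right=920, focal_px=1400, baseline_m=0.5))
```

The important takeaway is that the disparity shrinks as the ball gets further away, so small pixel errors hurt more at long range, which is something to keep in mind when simulating the setup.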
Okay. So it makes sense financially and mathematically to use two cameras. Next we need to decide on the placement and orientation of the cameras. Realistically, we won't be able to afford cameras with very high (or any) zoom, so we will have to have the cameras fairly close to the wicket to see the ball. Also, previous work with stereo cameras encourages the cameras to be as parallel as possible, as straying from this increases the possible error of the measurements. And finally, the cameras will be placed behind the bowler's stumps to minimise motion blur so we can pinpoint the ball better. If the cameras were side-on, the pace of the ball across the screen would likely cause motion blurring (unless we had very high frame rate cameras, which we can't afford). Figure 2 shows the suggested location and direction of the stereo cameras. What we don't know yet is the distance between the cameras that will give us the best results, but we will investigate this later (a rough back-of-the-envelope estimate follows Figure 2). So in summary, the cameras in our setup will be
- close to the wicket
- placed behind the bowler's stumps
- as parallel as possible
Figure 2 - The location of the cameras for our homemade Umpire DRS project. The setup consists of two parallel cameras, close to the wicket and behind the bowler's stumps.
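As a rough guide to why the camera separation matters, a standard first-order approximation for a rectified stereo pair says that a disparity error of one pixel produces a depth error of roughly Z² / (f × B), where Z is the distance to the ball, f the focal length in pixels and B the baseline. The snippet below simply evaluates that relation for a few candidate separations; the numbers are illustrative placeholders rather than results from the simulations to come.

```python
# First-order depth-error estimate for a rectified stereo pair:
# a disparity error of delta_d pixels maps to a depth error of roughly
# Z**2 * delta_d / (f * B). All numbers are illustrative placeholders,
# not results from the simulations.

def depth_error(depth_m, focal_px, baseline_m, disparity_err_px=1.0):
    return depth_m ** 2 * disparity_err_px / (focal_px * baseline_m)

for baseline in (0.25, 0.5, 1.0):   # candidate camera separations in metres
    err = depth_error(depth_m=15.0, focal_px=1400, baseline_m=baseline)
    print(f"baseline {baseline:.2f} m -> ~{err:.2f} m depth error at 15 m")
```

In other words, doubling the baseline roughly halves the depth error, which is the trade-off the simulations will need to balance against keeping the rig compact and easy to set up.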
Calibration
The stereo-camera setup is very well understood, and techniques for getting it up and running are readily available in many different coding languages. The first step to tracking the ball trajectory using stereo cameras is camera calibration.
Camera calibration is the first step in any system that uses triangulation. It is a short test to measure all the properties of the cameras that can affect the precision of the ball tracking. Some important properties that need to be measured are the distance between the cameras, the precise orientation of the cameras, the focal length and the radial distortion of the image. The most popular calibration method was described by Zhang (2000) and was developed to be robust with low-cost cameras. It requires presenting both cameras with a black and white checkerboard in various positions around the frame*. After calibration, we will know the distance between the cameras and the focal length (both very important), and the images from the cameras will be aligned (rectified) and corrected to remove radial distortion. Below is a simulated example of camera calibration using the Zhang method, which removes the radial distortion often seen in inexpensive cameras (Figure 3).
Figure 3 - (Left) the captured image of the checkerboard with a grid placed behind to show the radial distortion. (Right) the same image corrected for radial distortion using the technique described by Zhang (2000). Note the grid lines are now straight, but at the cost of losing some of the image around the edges.
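The calibration itself does not need to be written from scratch: the freely available packages mentioned earlier include OpenCV, which implements Zhang's method. Below is a hedged sketch of calibrating a single camera from a handful of checkerboard photos; the file names, board dimensions and square size are placeholders for whatever the field test ends up using. Running the same procedure for the second camera and passing both sets of detected corners to cv2.stereoCalibrate is what would recover the rotation and translation (and therefore the baseline) between the two cameras.

```python
# A rough sketch of checkerboard calibration with OpenCV, which implements
# Zhang (2000). File names, board size and square size are placeholders.
import glob
import cv2
import numpy as np

BOARD = (9, 6)      # inner corners per row and column of the checkerboard
SQUARE = 0.025      # square size in metres (placeholder value)

# 3D positions of the corners on the flat board, reused for every view
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE

obj_points, img_points = [], []
for path in glob.glob("left_*.png"):      # calibration shots from one camera
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

# Intrinsics (focal length, principal point) and distortion coefficients
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("RMS reprojection error (pixels):", rms)

# Undistort an image with the recovered parameters (cf. Figure 3)
undistorted = cv2.undistort(cv2.imread("left_0.png"), K, dist)
```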
The size of the checkerboard and the number of squares will have to be optimised for the purpose of ball tracking; this will be discussed later in the simulations and field test.
Once the images from both cameras have been rectified and undistorted, we can start converting the location of the ball in the images into a three-dimensional location. But before we start filming ourselves bowling some nibbling seamers or dirty leggies, we need to determine the ideal setup via simulations.
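Before moving on, here is a quick preview of what that conversion looks like in practice. Once calibration has given us a projection matrix for each camera, a matched pair of ball detections can be triangulated into a 3D point with OpenCV. This is only a hedged sketch: the intrinsics, baseline and pixel coordinates below are made-up placeholders, and in the real system the projection matrices would come out of the calibration and rectification step.

```python
# A hedged sketch of turning a matched pair of ball detections into a 3D point
# with OpenCV. The intrinsics, baseline and pixel coordinates are placeholders;
# in practice P1 and P2 come from calibration and rectification.
import cv2
import numpy as np

# Shared intrinsics for an ideal rectified pair (focal length 1400 px,
# principal point at the centre of a 1920x1080 image)
K = np.array([[1400.0,    0.0, 960.0],
              [   0.0, 1400.0, 540.0],
              [   0.0,    0.0,   1.0]])

# Left camera at the origin; right camera shifted 0.5 m along x (the baseline)
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])

# Pixel coordinates of the ball centre in the left and right frames
pt_left = np.array([[980.0], [520.0]])
pt_right = np.array([[940.0], [520.0]])

# Triangulate; the result is homogeneous, so divide through by w
point_h = cv2.triangulatePoints(P1, P2, pt_left, pt_right)
x, y, z = (point_h[:3] / point_h[3]).ravel()
print(f"ball position relative to the left camera: ({x:.2f}, {y:.2f}, {z:.2f}) m")
```

Reassuringly, the depth that comes out (about 17.5 m with these placeholder numbers) matches the simple disparity calculation from earlier.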
* There are many ways to optimise camera calibration, and a helpful guide can be found here. This includes tips and tricks to make calibration as accurate as possible. It's worth a read if you are planning on doing this yourself.
Simulations
References
BBC Sport, 2020. The Secrets Behind The World's Best Umpire. [online] Available at: https://www.bbc.co.uk/sport/cricket/54464650 [Accessed 9 October 2020].
Espncricinfo.com, 2020. Which Umpire Fares The Best When Reviewed By DRS? [online] Available at: https://www.espncricinfo.com/story/_/id/28993631/which-umpire-fares-best-reviewed-drs [Accessed 5 October 2020].
Hawkeyeinnovations.com, 2020. Hawk-Eye. [online] Available at: https://www.hawkeyeinnovations.com/about [Accessed 5 October 2020].
Shivakumar, R., 2016. What Technology Says About Decision-Making. Journal of Sports Economics, 19(3), pp.315-331.
Zhang, Z., 2000. A flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(11), pp.1330-1334.