Explore the Power of Deep Learning-based Automated Palm Tree Counting in EOfactory
Accurately assessing palm tree plantations over a large region can have meaningful economic and ecological impacts. However, the enormous spatial scale and the variety of terrain features across regions make this a grand challenge, with limited solutions based on manual monitoring. Although deep learning-based algorithms have shown promise for automating this task in recent years, the labeling effort needed to cover the differing features of each region largely constrains their effectiveness at scale.
Palm oil, an economic perennial crop cultivated mostly across Southeast Asia, is an important source of edible oils and fats. It is also used to produce oleochemicals, the main ingredients of personal-care cosmetics and cleaning products. According to FAO (2019), over the two decades from 1997 to 2018, oil palm plantations expanded from 10 to 21 million hectares (Mha), and crude palm oil production increased by a factor of three, from 100 to 300 million tonnes (Mt). In recent years, palm oil production has grown rapidly, with almost 90% produced in Southeast Asia alone. Indonesia and Malaysia are by far the largest palm oil-producing nations, together accounting for 85-90 percent of total production of this vegetable oil.
Figure: Production of palm oil per year
Remote sensing, when coupled with deep learning, can deliver remarkable performance in object detection. EOfactory's well-established object detection models, built with computer vision for accurate Earth Observation (EO), can detect objects of interest such as oil palm trees.
Developing such a dataset of oil palm trees from the latest-vintage imagery could help bridge the gap between advancing technologies and the development of resource-management strategies by:
a) Evaluating greenhouse gas emissions and removals.
b) Establishing strategies regarding plantation management, e.g., with respect to the need for renewal of aging plantations.
c) Periodically monitoring the extent of oil palm plantations.
UAV-based Palm Tree Inventory
High-spatial-resolution images captured with UAVs offer a reliable way to detect palm trees by their characteristic crown formation. Template matching is a popular technique for detecting objects in an image, using the object's boundary as the matching criterion. In some cases, however, relying on the boundary can be misleading due to image distortion or occlusion. Furthermore, template matching is sensitive to the geometry and scale of the object in the image. To overcome these limitations, object-based image analysis was applied, in which object boundaries are defined through segmentation. Selecting segmentation parameters suited to the varying geometry and scale of the trees can yield accurate detection. For this reason, template matching and object-based image analysis were integrated into a single processing workflow to improve counting accuracy, and the result was compared with a second experiment that used template matching alone.
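To make the template-matching step concrete, here is a minimal NumPy sketch: normalized cross-correlation produces a score map, and greedy non-maximum suppression turns score peaks into a tree count. This is an illustrative toy, not EOfactory's implementation; the function names, score threshold, and suppression radius are assumptions, and a production pipeline would use an optimized routine (for example OpenCV's matchTemplate) on real imagery.

```python
import numpy as np

def match_template_ncc(image, template):
    """Slide `template` over `image` and return a normalized
    cross-correlation score map (values in [-1, 1])."""
    th, tw = template.shape
    ih, iw = image.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    scores = np.zeros((ih - th + 1, iw - tw + 1))
    for y in range(scores.shape[0]):
        for x in range(scores.shape[1]):
            patch = image[y:y + th, x:x + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            scores[y, x] = (p * t).sum() / denom if denom > 0 else 0.0
    return scores

def count_detections(scores, threshold=0.9, radius=3):
    """Greedy non-maximum suppression: take the best remaining peak,
    suppress its neighbourhood, repeat until scores fall below threshold."""
    s = scores.copy()
    peaks = []
    while True:
        y, x = np.unravel_index(np.argmax(s), s.shape)
        if s[y, x] < threshold:
            break
        peaks.append((y, x))
        y0, y1 = max(0, y - radius), min(s.shape[0], y + radius + 1)
        x0, x1 = max(0, x - radius), min(s.shape[1], x + radius + 1)
        s[y0:y1, x0:x1] = -1.0
    return peaks

if __name__ == "__main__":
    # Toy scene: two synthetic "crowns" stamped into a blank image.
    crown = np.array([[0., 1., 0.], [1., 1., 1.], [0., 1., 0.]])
    scene = np.zeros((20, 20))
    scene[2:5, 2:5] = crown
    scene[10:13, 12:15] = crown
    peaks = count_detections(match_template_ncc(scene, crown))
    print(f"Detected {len(peaks)} crowns at {peaks}")  # 2 crowns detected
```

The suppression radius plays the same role as the expected crown spacing: too small and one tree is counted twice, too large and neighbouring trees merge into one detection, which is exactly the kind of error the segmentation step described above helps correct.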
Advances in space science and computing have greatly improved farming methods, productivity, and yield. EOfactory demonstrates an efficient way of taking stock of oil palm stands from the comfort of the owner's desk, without the usual days of hard field labor. The approach significantly reduces counting error compared with the widely used template matching alone, as confirmed by a meticulous manual count. Reducing the estimation error from 790 tree stands to 582 (about 26%) is a substantial improvement for effective decision-making, resource allocation, and quantitative yield estimation. Packaging this process as a computer program simplified for non-technical users will make it a vital tool not only for farmers but also for policymakers and relevant government agencies in making informed decisions. EOfactory brings all of these tools together to support accurate decisions in the near future.
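As a quick sanity check on the error figures above, the relative reduction can be computed directly (the variable names below are illustrative):

```python
# Counting errors quoted in the text for the two workflows.
baseline_error = 790   # template matching only
combined_error = 582   # template matching + object-based analysis

reduction = (baseline_error - combined_error) / baseline_error
print(f"Relative error reduction: {reduction:.1%}")  # prints "Relative error reduction: 26.3%"
```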