• Cambridge Crop Research, NIAB, 93 Lawrence Weaver Road

    CB3 0LE Cambridge

    United Kingdom

Accepting PhD Students

PhD projects

Agri-Tech innovations; crop phenomics; AI in breeding; vision-based phenotypic analysis


Personal profile

Academic Background

At UEA, I work with the computer vision group led by Prof Graham Finlayson and Dr Michal Mackiewicz, and with Dr Edwin Ren on AI-driven Internet of Things, in areas such as remote sensing and vision-based feature extraction. Recently, Prof Tony Bagnall and I have started working on AI-driven crop performance and yield predictive modelling, in particular using time-series measures powered by the sktime library.

Presently, I lead the Data Sciences department at Cambridge Crop Research, the National Institute of Agricultural Botany (NIAB), jointly established with the University of Cambridge. My department and the AI lab I lead recruit scientific programmers and post-doctoral researchers for multi-scale phenotyping and big-data analytics, an ongoing collaboration with CMP. If you are interested in Agri-Tech or AI in crop research, please email me at Ji.Zhou@UEA.ac.uk, as we have several data scientist vacancies open.

I also hold a full professorship at Nanjing Agricultural University (NAU), China (a synergy post with the UK), where I lead the Zhou lab (China). I welcome UEA PhD students, post-docs, research fellows and lecturers to apply for NAU academic positions if they plan to explore AI in breeding and smart agriculture between China and the UK. Similarly, please email me at Ji.Zhou@UEA.ac.uk and we can discuss potential opportunities for applicants from UEA.

Education/Academic qualification

Doctor of Science, University of East Anglia

Sep 2006 – Mar 2011

Award Date: 14 Jul 2011

External positions

Head of Data Sciences, National Institute of Agricultural Botany

Jan 2020 → …
