A combined deep learning GRU-autoencoder for the early detection of respiratory disease in pigs using multiple environmental sensors

by Jake Cowton

16:00 (40 min) in USB 5.008

We design and evaluate an assumption-free, deep learning-based methodology for animal health monitoring, specifically for the early detection of respiratory disease in growing pigs based on environmental sensor data. Two recurrent neural networks (RNNs), each comprising gated recurrent units (GRUs), were combined to form an autoencoder (GRU-AE) through which environmental data, collected from a variety of sensors, was processed to detect anomalies.
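To make the architecture concrete, below is a minimal sketch of a GRU-based autoencoder in PyTorch. It is illustrative only: the hidden size, layer count, and the choice of repeating the latent state at every decoder step are assumptions for this sketch, not details taken from the talk.

```python
# Minimal sketch of a GRU autoencoder (GRU-AE) in PyTorch.
# Hidden size and decoding scheme are illustrative assumptions.
import torch
import torch.nn as nn

class GRUAutoencoder(nn.Module):
    def __init__(self, n_features: int, hidden_size: int = 32):
        super().__init__()
        # Encoder GRU compresses the input sequence into a latent state.
        self.encoder = nn.GRU(n_features, hidden_size, batch_first=True)
        # Decoder GRU reconstructs the sequence from the latent state.
        self.decoder = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.output = nn.Linear(hidden_size, n_features)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_features) of environmental sensor readings
        _, h = self.encoder(x)                     # h: (1, batch, hidden)
        latent = h[-1].unsqueeze(1)                # (batch, 1, hidden)
        repeated = latent.repeat(1, x.size(1), 1)  # feed latent at every step
        decoded, _ = self.decoder(repeated)
        return self.output(decoded)                # reconstructed sequence

def reconstruction_error(model: GRUAutoencoder, x: torch.Tensor) -> torch.Tensor:
    # Mean squared reconstruction error, one value per input sequence.
    with torch.no_grad():
        recon = model(x)
    return ((recon - x) ** 2).mean(dim=(1, 2))
```

Sequences that resemble the training data reconstruct well (low error), while unfamiliar patterns produce higher errors, which is the signal used for anomaly detection in the next step.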

An autoencoder is a type of network trained to reconstruct the patterns it is fed as input. Because the GRU-AE was trained only on environmental data that did not lead to an occurrence of respiratory disease, data that did not fit this pattern of "healthy" environmental data produced a greater reconstruction error. Reconstruction errors were labelled as either normal or anomalous using threshold-based anomaly detection optimised with particle swarm optimisation (PSO), from which alerts were raised. The GRU-AE method outperformed state-of-the-art techniques that raise alerts when predicted values deviate from the actual observations. The results show that a change in the environment can lead to pigs showing symptoms of respiratory disease within 1 to 7 days, meaning that there is a window of time in which their keepers can act to mitigate the negative effects of respiratory diseases such as porcine reproductive and respiratory syndrome (PRRS), a common and destructive disease endemic in pigs.
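As a rough illustration of the thresholding step, the sketch below selects an alert threshold over reconstruction errors with a hand-rolled particle swarm optimiser. The F1 objective, the use of a labelled validation set, and all PSO hyperparameters are assumptions made for this sketch rather than details from the talk.

```python
# Sketch: PSO-optimised threshold over reconstruction errors.
# Objective (F1 on a labelled validation set) and hyperparameters are assumptions.
import numpy as np

def f1_score(errors: np.ndarray, labels: np.ndarray, threshold: float) -> float:
    pred = errors > threshold                 # flag high-error windows as anomalous
    tp = np.sum(pred & (labels == 1))
    fp = np.sum(pred & (labels == 0))
    fn = np.sum(~pred & (labels == 1))
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

def pso_threshold(errors, labels, n_particles=20, n_iters=50, w=0.7, c1=1.5, c2=1.5):
    rng = np.random.default_rng(0)
    lo, hi = errors.min(), errors.max()
    pos = rng.uniform(lo, hi, n_particles)    # candidate thresholds
    vel = np.zeros(n_particles)
    pbest = pos.copy()
    pbest_val = np.array([f1_score(errors, labels, t) for t in pos])
    gbest = pbest[pbest_val.argmax()]
    for _ in range(n_iters):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([f1_score(errors, labels, t) for t in pos])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmax()]
    return gbest                              # errors above this threshold raise alerts
```

Any error exceeding the selected threshold would then be reported as an alert to the pigs' keepers.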