Joey Iverson Receives National Science Foundation Grant Under Algorithms for Threat Detection Program

Author: Anthony Palaszewski

Associate Professor Joey Iverson was awarded a grant from the National Science Foundation (NSF) under the Algorithms for Threat Detection (ATD) program.

This program supports research projects that develop the next generation of mathematical and statistical algorithms for the analysis of large spatiotemporal datasets, with application to quantitative models of human dynamics. It is jointly administered with the National Geospatial-Intelligence Agency (NGA).

Professor Iverson’s research focuses on harmonic analysis and its interactions with representation theory and algebraic combinatorics. Under the new award, he will receive $122,648 to study “Principled machine learning and packing subspaces for improved spatiotemporal data processing.” The project is a collaboration with Prof. Dustin Mixon of The Ohio State University and Prof. John Jasper of the Air Force Institute of Technology.


More details appear below. Congratulations, Joey!


Abstract:
For the sake of national security, it is imperative that US government agencies detect and classify potential threats as rapidly and accurately as possible. The project will contribute to this cause by delivering mathematical innovations that offer reliable predictions from machine learning algorithms, superior time resolution for surveillance data, and optimal sensor arrangements for robust dataset assembly. In the context of basic research, these contributions will further develop the mathematics of how machines sense and learn.

The project will focus on the following three objectives: (1) Develop theory and algorithms for transfer learning and neural networks. (2) Design optimal or nearly optimal line packings for extremely sparse signals. (3) Design optimal ensembles of equal-rank operators for dataset assembly. To this end, the research aims to solve various open problems from metric geometry and machine learning: (a) to develop new algorithms for transfer learning; (b) to explain emergent phenomena in neural networks using gradient descent trajectories; (c) to construct new line packings that achieve equality in the second Levenshtein bound; (d) to construct incoherent line packings from finite field objects; and (e) to construct highly symmetric subspace packings that are optimal with respect to spectral distance.
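As background for the terms above (this illustration is ours, not part of the award text): a line packing is a collection of unit vectors whose pairwise inner products are all small in magnitude; the largest such magnitude is called the coherence, and lower bounds such as the Welch bound (the simplest of the Levenshtein-type bounds) identify when a packing is optimal. The short Python sketch below computes the coherence of a standard textbook example, three equally spaced lines in the plane, and checks that it meets the Welch bound; the function name and example are illustrative assumptions, not drawn from the project itself.

```python
import numpy as np

def coherence(X):
    """Largest pairwise |<x_i, x_j>| among the unit-norm columns of X."""
    G = np.abs(X.conj().T @ X)   # Gram matrix of inner-product magnitudes
    np.fill_diagonal(G, 0.0)     # ignore the trivial <x_i, x_i> = 1 entries
    return G.max()

# Illustrative example: n = 3 unit vectors in R^2 at 120-degree spacing
# (three equally spaced lines in the plane).
angles = np.array([0.0, 2 * np.pi / 3, 4 * np.pi / 3])
X = np.vstack([np.cos(angles), np.sin(angles)])   # columns are the vectors

# Welch bound: coherence >= sqrt((n - d) / (d * (n - 1))) for n vectors in d dims.
n, d = 3, 2
welch_bound = np.sqrt((n - d) / (d * (n - 1)))

print(coherence(X), welch_bound)   # both equal 0.5: the packing is optimal
```

Packings that achieve equality in such bounds, in higher dimensions and for subspaces rather than single lines, are among the objects the project aims to construct.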