AI is set to transform the insurance industry, and Arbol is keen to stay at the cutting edge of this exciting field. Recently, researchers Michael Isakov, Igor Stankevich, and Yosheb Getachew attended the 40th International Conference on Machine Learning (ICML) in Honolulu. The event featured thousands of attendees, hundreds of papers, and numerous workshops, tutorials, and poster sessions on recent advances in machine learning. Reflecting on the experience, Michael Isakov shares three highlights that will help shape ML research at Arbol over the coming year.
Since the 1960s, weather modeling has been dominated by numerical weather prediction (NWP), which solves physics-derived equations at high resolution. In recent years, NWP has been increasingly challenged by data-driven approaches that promise greater efficiency and the flexibility to learn from long historical time series. The ML community has taken a strong interest in weather forecasting, with a multitude of ICML papers focused on extending computer vision techniques to global, spherical inputs. Indeed, one of the keynote speakers, Professor Shakir Mohamed, devoted most of his talk to applications of generative models and graph neural networks for short-term weather prediction. He urged the audience to embrace this challenging and important problem, noting that pushing the bounds of predictability by just a few days can have big societal benefits, like saving lives during a flood evacuation. One particularly interesting poster was ClimaX, which uses both physical simulations and historical data to train one of the first foundation models for weather data.
It’s exciting to see a growing body of research on weather forecasting that ties in well with internal R&D efforts. Arbol’s innovative AI underwriter incorporates the newest techniques from statistics, ML, and deep learning to build a detailed understanding of the weather patterns that affect our portfolio. We strive for continual improvement in our pricing and risk management by implementing and testing sophisticated spatio-temporal models, including reservoir computers, CNNs, and Transformers. By combining proprietary models with best practices and key ideas from academia, we aim to stay at the forefront of weather modeling.
One of the big themes at ICML this year was efficiency. If you’ve ever waited for the scrolling output of GPT-4 or stared at a blank screen while Midjourney generates your image, you know that the best AIs can be quite slow. While latency is secondary for plenty of applications, it is a key requirement for successfully deploying ML models in industry. For instance, Arbol’s insurance platform outputs customized pricing in just a few seconds; this combination of speed and accuracy sets us apart from the competition.
The conference featured many interesting ideas for speeding up both training and inference. For example, consistency models and speculative decoding promise faster output for image generation and LLMs, respectively. And while many large models take weeks to train on hundreds of GPUs, ideas like progressive stacking, which starts with a small model and increases its depth over the course of training, can lower barriers to model development. Improvements like these will enable us to deploy bigger and more robust models in time-sensitive areas like pricing, driving value for our customers and investors.
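To make the idea concrete, here is a minimal, framework-free sketch of progressive stacking: train a shallow model, then grow it by duplicating its trained layers so the deeper model starts from a warm state. The layer representation, `train_step`, and update rule below are all hypothetical stand-ins, not an actual training loop.

```python
def stack(layers):
    """Double the model depth by duplicating the trained layers."""
    return [dict(layer) for layer in layers] + [dict(layer) for layer in layers]

def train_step(layers):
    """Stand-in for one optimization pass over the current model."""
    for layer in layers:
        layer["weight"] += 0.1  # placeholder for a gradient update
    return layers

# Start small: a 2-layer "model" (each layer is just a parameter dict here).
model = [{"weight": 0.0}, {"weight": 0.0}]

# Train, then stack to 4 layers; train again, then stack to 8 layers.
for _ in range(3):
    model = train_step(model)
model = stack(model)          # 4 layers, each initialized from a trained one
for _ in range(3):
    model = train_step(model)
model = stack(model)          # 8 layers

print(len(model))  # 8
```

In a real system the stacked copies would be Transformer blocks rather than dicts, but the schedule is the same: most of the optimization happens while the model is still small and cheap.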
Outside of the numerous technical innovations presented at the event, one striking feature was the diverse applications of deep learning models. From improved code generation with LLMs to solving Sudoku with diffusion models, the ML community has taken the latest tools and run with them.
At Arbol, we seek to use the newest methods to go beyond predictive modeling and improve efficiency across the firm. For example, our structuring process focuses on creating policies that most closely match a client’s actual losses, reducing basis risk. By applying sophisticated tools like Bayesian optimization to key parts of this pipeline, we reduce error-prone manual processes and focus our efforts on tasks that require domain expertise. Through creative use of tools like LLMs, knowledge graphs, and optimization, our goal is to transform all aspects of our business operations using AI.
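As a toy illustration of how structuring can be framed as an optimization problem, the sketch below tunes a payout trigger and rate to minimize basis risk against a short history. All data and parameter names are invented, and a brute-force grid stands in for the Bayesian optimizer a real pipeline would use to propose candidates adaptively.

```python
# Hypothetical history: a rainfall index and the client's actual losses.
rainfall = [12.0, 8.0, 15.0, 5.0, 10.0, 3.0]
losses   = [0.0, 20.0, 0.0, 50.0, 10.0, 70.0]

def payout(index_value, trigger, rate):
    """Pay `rate` per unit the index falls below `trigger`."""
    return max(0.0, trigger - index_value) * rate

def basis_risk(trigger, rate):
    """Mean squared gap between policy payouts and actual losses."""
    gaps = [payout(x, trigger, rate) - loss for x, loss in zip(rainfall, losses)]
    return sum(g * g for g in gaps) / len(gaps)

# Stand-in for Bayesian optimization: score every candidate (trigger, rate)
# pair on the objective and keep the best. A BO loop would instead fit a
# surrogate model and propose promising candidates sequentially.
candidates = [(t, r) for t in range(5, 16) for r in (5.0, 10.0, 15.0)]
best = min(candidates, key=lambda p: basis_risk(*p))

print(best)  # (10, 10.0)
```

The point of the toy is the objective, not the search: once basis risk is written down as a function of the policy parameters, any optimizer can drive the structuring step, and Bayesian optimization is attractive because each evaluation of a realistic objective is expensive.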