Beyond S-curves: review of machine learning approaches for technology forecasting

The heterogeneity and complexity of the technological landscape make building accurate forecasting models a challenging endeavor. Due to their high prevalence in many complex systems, S-curves have been a popular forecasting approach in prior work. However, their forecasting performance has not been directly compared to other technology forecasting approaches. Additionally, recent developments in time-series forecasting that claim to improve forecasting accuracy are yet to be applied to technological development data. This work addresses both research gaps by comparing the forecasting performance of S-curves to baseline methods while also assessing autoencoder approaches that employ recent advances in machine learning and time-series forecasting. Experimental analysis shows that S-curve forecasts predominantly exhibit a mean absolute percentage error (MAPE) comparable to a simple ARIMA baseline. At the same time, on a minority of emerging technologies, the MAPE increases by up to two orders of magnitude. Additionally, the autoencoder approach improves the MAPE by 13.5% on average over the second-best method, showing promising results for machine learning and deep learning architectures. We hope this review paves the way for more extensive usage of diverse methods in the technology forecasting field.
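To make the evaluation setup concrete, the sketch below illustrates the kind of comparison the abstract describes: fitting a logistic S-curve and an ARIMA baseline on the same training window of a technology-indicator series and scoring both with MAPE. The synthetic data, the ARIMA(1, 1, 1) order, and the three-parameter logistic form are illustrative assumptions, not the exact models or datasets used in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit
from statsmodels.tsa.arima.model import ARIMA

def logistic(t, L, k, t0):
    """Three-parameter S-curve (logistic growth)."""
    return L / (1.0 + np.exp(-k * (t - t0)))

def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    actual, forecast = np.asarray(actual), np.asarray(forecast)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

# Toy cumulative series standing in for a technology indicator
# (e.g. cumulative patent counts); values are illustrative only.
t = np.arange(20)
y = 1000 / (1 + np.exp(-0.5 * (t - 10))) + np.random.default_rng(0).normal(0, 10, 20)
train, test = y[:15], y[15:]

# S-curve forecast: fit the logistic on the training window, then extrapolate.
params, _ = curve_fit(logistic, t[:15], train, p0=[y.max(), 0.5, 10], maxfev=10000)
scurve_fc = logistic(t[15:], *params)

# ARIMA baseline: a simple (1, 1, 1) model fitted on the same window.
arima_fc = ARIMA(train, order=(1, 1, 1)).fit().forecast(steps=5)

print(f"S-curve MAPE: {mape(test, scurve_fc):.2f}%")
print(f"ARIMA   MAPE: {mape(test, arima_fc):.2f}%")
```

The autoencoder approach discussed in the talk would slot into the same protocol as a third forecaster evaluated on the identical train/test split.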

About the speaker:

Angelika Romanou is a 4th-year Ph.D. candidate in Computer Science at the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland. Her research focuses on Artificial Intelligence and Natural Language Processing (NLP). She holds a Bachelor's in Management Science and Technology and a Master's in Data Science from the Athens University of Economics and Business, and has seven years of industry experience in R&D departments as a Data Engineer and Machine Learning Engineer. In her current role as a Ph.D. candidate, she explores how Large Language Models reason over cause-and-effect relationships. Besides her ongoing research, she also guest lectures in the Modern Natural Language Processing course at EPFL, covering recent advances in Large Language Models. When she is not in the lab, she likes to play tennis or relax with a nice book by the sea.