Ritchie currently leads artificial intelligence with his colleagues at Ensemble Capital, an AI hedge fund based in Singapore. He is also an NVIDIA Deep Learning Institute instructor, helping developers, data scientists, and researchers leverage deep learning to solve challenging problems. In his leisure time, he dives into deep learning research, in particular Visual Question Answering (VQA), with researchers at NUS and MILA.
Ritchie's passion for enabling anyone to leverage deep learning led to the creation of Deep Learning Wizard, where he has taught, and continues to teach, thousands of students in more than 60 countries around the world.
He previously conducted research in deep learning, computer vision, and natural language processing at the NExT Search Centre, led by Professor Tat-Seng Chua, which is jointly set up between the National University of Singapore (NUS) and Tsinghua University and is part of the NUS Smart Systems Institute. He published in top-tier conferences and workshops such as ICML.
Rapid Large Scale Fractional Differencing for Stationarizing Time Series Data
We typically try to achieve some form of stationarity by transforming our time series, most commonly via integer differencing. However, integer differencing removes more memory from the series than is necessary to achieve stationarity. An alternative, fractional differencing, allows us to achieve stationarity while preserving far more of the series' memory than integer differencing does. Existing CPU-based implementations are too slow to run fractional differencing on many large-scale time series, so our GPU-based implementation enables rapid fractional differencing.
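To make the idea concrete, here is a minimal CPU sketch of fixed-window fractional differencing in NumPy. It is an illustration of the general technique (not the GPU implementation the abstract describes): the weights follow the standard recursion w_0 = 1, w_k = -w_{k-1} * (d - k + 1) / k, and each output value is a weighted sum over a trailing window. The function names and the window-truncation choice are assumptions for this sketch.

```python
import numpy as np

def frac_diff_weights(d, num_weights):
    """Weights of the fractional differencing operator (1 - B)^d,
    truncated to num_weights terms. d = 1 recovers integer differencing."""
    w = [1.0]
    for k in range(1, num_weights):
        # Standard recursion: w_k = -w_{k-1} * (d - k + 1) / k
        w.append(-w[-1] * (d - k + 1) / k)
    return np.array(w)

def frac_diff(series, d, window):
    """Fixed-window fractional differencing of a 1-D array.
    The first (window - 1) outputs are NaN since the window is incomplete."""
    w = frac_diff_weights(d, window)[::-1]  # reverse: oldest observation first
    out = np.full(len(series), np.nan)
    for i in range(window - 1, len(series)):
        out[i] = np.dot(w, series[i - window + 1 : i + 1])
    return out
```

With d = 1 and window = 2 the weights are [1, -1], so the output reduces to ordinary first differences; a fractional d between 0 and 1 blends the current value with a slowly decaying tail of past values, which is exactly the memory-preserving behavior described above.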