Posts
RNNs strike back
Transformers have taken the field of sequence modelling with deep networks completely by storm, becoming the standard for text, video, and even images. RNNs, once a very active engineering field, have slowly faded into the void. All of them? No, some RNNs are bravely fighting back to claim state-of-the-art results on sequence tasks. The most surprising part? They are linear...

Low-rank RNNs in ten minutes
This post is an attempt at summarizing, in a ten-minute read, results obtained over five years of research in the laboratory of Srdjan Ostojic, at ENS Paris, on the uses of low-rank RNNs. Let's see if that's enough to catch your interest!

A buzzword tour of 2021 in Neuro and AI
Between self-supervised approaches for transfer learning, contrastive losses, and representational similarity analysis, the last few years were as rich in ideas as they were buzzing with confusing words. Here is a little dictionary to celebrate the end of 2021.

Adventures in Statsland: an encounter with CCA
Did you know PCA and Pearson's correlation had a child together? Let's meet this fascinating and multifaceted tool called CCA.

Notes on simple linear regression and 2D point clouds
Linear regression, despite being the most basic model in statistics, hides a lot of subtleties and elegant results.

Year in AI - 2020
As humans have been grappling with a pretty rough ride this year, how have machines been doing? Let's find out in this small review!

How to parallelize a Python loop in one minute
Having machines with many CPU cores is a great way to make all your computations run much faster, right? Unless... you have to modify the code yourself to take advantage of it? But parallelization seems complicated, involving forks and processes...

A quick git primer
Coming from the computer science community, I find git a very useful tool for handling projects, although it can be intimidating at first. This is my attempt at getting you going with git in a few minutes.