Welcome to my notebook! I’m Daniel, originally from Spain 🇪🇸 and now living in Paris 🇫🇷. I’m currently studying the MVA Master’s (Mathematics, Vision, Learning) at ENS Paris-Saclay. This blog documents my journey exploring machine learning and other topics that interest me. I believe that writing about what you learn can be not only useful for yourself but also for others following a similar path. Moreover, learning in public can spark discussions and positive exchanges.
Let's get into the probabilistic derivation of the Kalman filter and smoother. Kalman filters have become a fundamental tool in control systems and signal processing, and here we will explore their mathematical foundations and their applications in machine learning.
A deep dive into the foundational paper "Long Short-Term Memory" by S. Hochreiter and J. Schmidhuber, which introduced the LSTM architecture. We'll explore the motivation behind the model, its mathematical foundations and core intuitions, as well as its properties, achievements, and relevance today.
Building on the recently implemented and trained VGGNet, we will walk through and implement 'A Neural Algorithm of Artistic Style' by Gatys, Ecker, and Bethge, which makes it possible to transfer the style of one image onto the content of another. I will dive into some of the complexities of the implementation and show some beautiful examples of the results this technique can offer.
Thank you for visiting my site! Don’t hesitate to reach out to discuss further, share your ideas, or just introduce yourself. You can reach me either via the contact form below or by connecting with me through the links at the bottom of the page.