What do the Fibonacci sequence and staircases have in common?
A LeetCode exercise taught me that the number of ways to climb an n-step staircase is the (n + 1)-th term of the Fibonacci sequence, which is neat! »
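A minimal sketch of the idea (not the LeetCode solution itself): counting the ways to climb a staircase taking 1 or 2 steps at a time follows the Fibonacci recurrence.

```python
def climb_stairs(n):
    """Count the ways to climb an n-step staircase taking 1 or 2 steps at a time.

    The recurrence ways(n) = ways(n - 1) + ways(n - 2) is exactly the Fibonacci
    recurrence, so the answer is the (n + 1)-th Fibonacci number.
    """
    a, b = 1, 1  # ways to climb 0 steps and 1 step
    for _ in range(n - 1):
        a, b = b, a + b
    return b

# climb_stairs(4) == 5: the five orderings of 1s and 2s that sum to 4
print(climb_stairs(4))
```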
Sometimes you want to use estimators from one package but methods from another. Maybe, like me, you want to use scikit-learn's grid searching cross validation function with an estimator from statsmodels. These two don't work together straight out of the box, but by writing a quick wrapper, you can make a statsmodels estimator play nice with scikit-learn. »
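A rough sketch of such a wrapper, assuming an ordinary least squares model from statsmodels (the estimator and hyperparameters in the post may differ):

```python
import numpy as np
import statsmodels.api as sm
from sklearn.base import BaseEstimator, RegressorMixin
from sklearn.model_selection import GridSearchCV

class SMWrapper(BaseEstimator, RegressorMixin):
    """Minimal wrapper exposing a statsmodels model through the scikit-learn API."""

    def __init__(self, fit_intercept=True):
        self.fit_intercept = fit_intercept

    def fit(self, X, y):
        X_ = sm.add_constant(X) if self.fit_intercept else X
        self.results_ = sm.OLS(y, X_).fit()
        return self

    def predict(self, X):
        X_ = sm.add_constant(X) if self.fit_intercept else X
        return self.results_.predict(X_)

# Once wrapped, the estimator drops into GridSearchCV like any other.
X = np.random.rand(100, 3)
y = X @ np.array([1.0, 2.0, 3.0]) + np.random.rand(100)
search = GridSearchCV(SMWrapper(), {"fit_intercept": [True, False]}, cv=5)
search.fit(X, y)
```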
State space models are a suite of powerful time series analysis techniques that use the Kalman filter to model the seasonal, trend, and level components of a time series separately. The state space methodology gives the developer considerably greater control over how the series is modeled than most popular time series techniques, while also seamlessly allowing exogenous variables to be analyzed alongside autoregressive and moving average terms. »
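For a flavor of the approach, here is a minimal structural (unobserved components) fit with statsmodels on a simulated series, not the data from the post:

```python
import numpy as np
import statsmodels.api as sm

# Simulated monthly series with a slow trend and yearly seasonality.
rng = np.random.default_rng(0)
t = np.arange(120)
y = 0.05 * t + 2 * np.sin(2 * np.pi * t / 12) + rng.normal(scale=0.5, size=120)

# Structural time series model: local linear trend + seasonal component,
# each estimated separately by the Kalman filter.
model = sm.tsa.UnobservedComponents(y, level="local linear trend", seasonal=12)
results = model.fit(disp=False)
print(results.summary())
```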
Wrapping up the series on causal inference, this final post covers the essential topic of design sensitivity, which allows a statistician to derive actual insights from an observational study by making some necessary adjustments to the standard statistical inference used in randomized experiments. »
Continuing in the series on causal inference, this post discusses analyzing the results of a pair matched trial design with Wilcoxon's signed rank test and how to compute approximate p-values via normal approximation. »
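A small illustration of the normal approximation, using made-up paired data rather than the trial data discussed in the post:

```python
import numpy as np
from scipy import stats

def signed_rank_normal_approx(treated, control):
    """Wilcoxon signed rank test for paired data, p-value via normal approximation.

    Illustrative only: zero differences are dropped and ties share average ranks,
    with no tie correction applied to the variance.
    """
    diffs = np.asarray(treated) - np.asarray(control)
    diffs = diffs[diffs != 0]
    n = len(diffs)
    ranks = stats.rankdata(np.abs(diffs))
    t_plus = ranks[diffs > 0].sum()          # sum of ranks of positive differences

    mean = n * (n + 1) / 4                   # E[T+] under the null
    var = n * (n + 1) * (2 * n + 1) / 24     # Var[T+] under the null
    z = (t_plus - mean) / np.sqrt(var)
    p = 2 * stats.norm.sf(abs(z))            # two-sided p-value
    return t_plus, p

treated = np.array([7.1, 5.4, 9.8, 6.3, 8.0, 7.7, 6.9, 9.1])
control = np.array([6.0, 5.9, 8.1, 5.2, 7.4, 6.8, 6.1, 8.5])
print(signed_rank_normal_approx(treated, control))
```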
For a machine learning course, I had to write code implementing the ID3 algorithm to train decision trees from scratch. Writing recursive functions can be challenging and even frustrating, particularly when you are a math/stats master's student just beginning his foray into the world of devops and computer science. Each piece of my unoptimized recursion is laid out in gory detail here for your reading pleasure. »
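A condensed sketch of the ID3 recursion (not the course code itself), splitting on whichever attribute gives the highest information gain:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in Counter(labels).values())

def id3(rows, labels, attributes):
    """Recursively build an ID3 decision tree.

    rows       : list of dicts mapping attribute name -> categorical value
    labels     : list of class labels, parallel to rows
    attributes : attribute names still available for splitting
    Returns either a class label (leaf) or a dict {attribute: {value: subtree}}.
    """
    # Base cases: pure node, or no attributes left (fall back to majority vote).
    if len(set(labels)) == 1:
        return labels[0]
    if not attributes:
        return Counter(labels).most_common(1)[0][0]

    # Pick the attribute with the highest information gain.
    def info_gain(attr):
        remainder = 0.0
        for value in set(row[attr] for row in rows):
            subset = [lab for row, lab in zip(rows, labels) if row[attr] == value]
            remainder += len(subset) / len(labels) * entropy(subset)
        return entropy(labels) - remainder

    best = max(attributes, key=info_gain)

    # Recurse on each observed value of the chosen attribute.
    tree = {best: {}}
    remaining = [a for a in attributes if a != best]
    for value in set(row[best] for row in rows):
        sub_rows = [row for row in rows if row[best] == value]
        sub_labels = [lab for row, lab in zip(rows, labels) if row[best] == value]
        tree[best][value] = id3(sub_rows, sub_labels, remaining)
    return tree

# Toy usage with made-up weather data.
rows = [{"outlook": "sunny", "windy": "no"}, {"outlook": "rain", "windy": "yes"},
        {"outlook": "sunny", "windy": "yes"}, {"outlook": "rain", "windy": "no"}]
print(id3(rows, ["yes", "no", "yes", "yes"], ["outlook", "windy"]))
```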
Continuing in the causal inference series, this post discusses pair matched trial design via propensity scores and the "naive" model of observational studies. »
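As a rough sketch of the mechanics, here is greedy 1:1 nearest-neighbor matching on propensity scores estimated by logistic regression; the function names and simulated data are illustrative only, not the design discussed in the post:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def pair_match(X, treated):
    """Greedy 1:1 nearest-neighbor matching on estimated propensity scores.

    X       : (n, p) array of covariates
    treated : (n,) boolean array, True for treated units
    Returns a list of (treated_index, control_index) pairs.
    """
    # Step 1: estimate the propensity score P(treated | X) with logistic regression.
    scores = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]

    treated_idx = np.where(treated)[0]
    control_idx = list(np.where(~treated)[0])

    # Step 2: greedily pair each treated unit with the closest unmatched control.
    pairs = []
    for i in treated_idx:
        if not control_idx:
            break
        j = min(control_idx, key=lambda k: abs(scores[i] - scores[k]))
        pairs.append((i, j))
        control_idx.remove(j)
    return pairs

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
treated = rng.random(200) < 1 / (1 + np.exp(1 - X[:, 0]))  # treatment depends on X
pairs = pair_match(X, treated)
```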
This is my master's thesis broken into smaller, more digestible pieces. Causal inference is a fascinating (and relatively young) branch of statistics that seeks to establish causal relationships between variables. It turns out that establishing causality is far more demanding than establishing associations via traditional statistical inference methods. This post covers the groundwork needed to get started with causal inference, including essential background on randomized experiments and observational studies. »
The gradient descent algorithm turns up nearly everywhere in machine learning. It is enormously popular because it excels at solving certain types of optimization problems, but it must be used thoughtfully, since it is not guaranteed to converge to a global extremum. Understanding the mathematics of this ubiquitous algorithm is absolutely essential for machine learning engineers.
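A bare-bones sketch of the update rule on a toy quadratic objective (illustrative only, not tied to any particular model):

```python
import numpy as np

def gradient_descent(grad, x0, learning_rate=0.1, steps=1000):
    """Plain batch gradient descent: repeatedly step against the gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - learning_rate * grad(x)
    return x

# Minimize f(x, y) = (x - 3)^2 + (y + 1)^2, whose gradient is (2(x - 3), 2(y + 1)).
grad = lambda v: np.array([2 * (v[0] - 3), 2 * (v[1] + 1)])
print(gradient_descent(grad, x0=[0.0, 0.0]))   # converges toward (3, -1)
```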