For my IEOR 165 project, I explored the relationship between housing features and sales prices using regression-based machine learning models. I implemented and compared Ordinary Least Squares (OLS), Ridge Regression, and Nadaraya-Watson Kernel Regression, training each model on an 80/20 data split and tuning hyperparameters through cross-validation.
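Below is a minimal sketch of that comparison pipeline using scikit-learn, just to illustrate the setup rather than reproduce the project exactly: the `housing.csv` file name, the `price` target column, and the alpha grid for Ridge are placeholders, not the actual dataset schema or tuned values.

```python
# Sketch of the OLS / Ridge comparison: 80/20 split, CV-tuned Ridge, test RMSE.
# File name, column names, and the alpha grid are hypothetical placeholders.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.metrics import mean_squared_error

df = pd.read_csv("housing.csv")             # hypothetical file name
X = df.drop(columns=["price"]).to_numpy()   # assumed numeric feature columns
y = df["price"].to_numpy()                  # assumed target column "price"

# 80/20 train/test split, as described above
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# OLS baseline
ols = LinearRegression().fit(X_train, y_train)

# Ridge with the regularization strength chosen by 5-fold cross-validation
ridge_cv = GridSearchCV(
    Ridge(),
    param_grid={"alpha": np.logspace(-3, 3, 13)},
    cv=5,
    scoring="neg_root_mean_squared_error",
).fit(X_train, y_train)

# Compare held-out RMSE for both fitted models
for name, model in [("OLS", ols), ("Ridge", ridge_cv.best_estimator_)]:
    rmse = np.sqrt(mean_squared_error(y_test, model.predict(X_test)))
    print(f"{name}: test RMSE = {rmse:,.0f}")
```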
Among the three, OLS achieved the lowest test error (RMSE ≈ 1.18M), while Ridge handled multicollinearity among the features more gracefully, and Kernel Regression proved highly sensitive to the choice of bandwidth. This project strengthened my skills in predictive modeling, cross-validation, and evaluating trade-offs between algorithms.
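To make the bandwidth sensitivity concrete, here is a small, self-contained sketch of a Gaussian-kernel Nadaraya-Watson estimator run with several bandwidths; the synthetic quadratic data and the candidate bandwidths are illustrative only, not the project's dataset or tuned values.

```python
# Nadaraya-Watson kernel regression with a Gaussian kernel on toy data,
# showing how the test error swings with the bandwidth h. Data is synthetic.
import numpy as np

def nadaraya_watson(x_train, y_train, x_query, bandwidth):
    """Predict at each query point as a kernel-weighted average of y_train."""
    # Pairwise squared distances between query points and training points
    d2 = ((x_query[:, None, :] - x_train[None, :, :]) ** 2).sum(axis=-1)
    weights = np.exp(-0.5 * d2 / bandwidth**2)      # Gaussian kernel weights
    weights /= weights.sum(axis=1, keepdims=True)   # normalize per query point
    return weights @ y_train                        # weighted average of targets

# Toy 1-D example: noisy quadratic relationship
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=(200, 1))
y = 0.5 * x[:, 0] ** 2 + rng.normal(scale=2.0, size=200)
x_tr, x_te, y_tr, y_te = x[:160], x[160:], y[:160], y[160:]

# Same data, very different errors depending on the bandwidth h
for h in [0.1, 1.0, 10.0]:
    pred = nadaraya_watson(x_tr, y_tr, x_te, h)
    rmse = np.sqrt(np.mean((pred - y_te) ** 2))
    print(f"h = {h:5.2f}: test RMSE = {rmse:.2f}")
```

A too-small bandwidth chases noise while a too-large one over-smooths toward the global mean, which is why cross-validating the bandwidth mattered in the project.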
👉 Click here to view the full report and detailed analysis.
