
Binh Phan

Hayward, CA • binhphan@usc.edu • (510) 294-8627 • binhphan.com

Skills
• Machine Learning, Model Selection and Optimization, Deep Learning, Data Mining
• Languages and libraries: Python (1 year), NumPy (1 year), Pandas (1 year), scikit-learn (1 year), matplotlib (1 year), TensorFlow (<1 year), Keras (<1 year), C++ (2 years), MATLAB (4 years)

Professional Experience
Western Digital: ASIC Modeling Intern – Irvine, CA (Summer 2017)
• Developed and validated models of ASIC IPs in SystemC, TLM 2.0, and C++
• Developed scripts in Perl and Python to run Linux model regressions
• Developed targeted test cases to run on embedded ARM CPU models
• Improved the performance of Linux model regressions by converting logging commands to a more efficient model and using a smarter verbosity algorithm, then automated the process with a Python script
Pacific Gas and Electric Corporation: Substation Asset Strategy Intern – Fresno, CA (Summer 2016)
• Managed a database of transformer technical specifications used to predict the risk of transformer failures across PG&E territory
Pacific Gas and Electric Corporation: Electric Operations Intern – San Francisco, CA (Summer 2015)
• Managed an equipment database linked to solar panel projects in PG&E territory that served as a central interface between project managers, field engineers, and contractors, making it easier for all parties to access the same information

Projects
Decision Tree Classifier (Spring 2018) – Implemented a decision tree classifier using the ID3 algorithm. Trained and tested it on the discrete 2D Iris data set.
Boosted Classification Using AdaBoost and LogitBoost (Spring 2018) – Implemented a decision stump as a weak classifier, then implemented boosted classifiers using the AdaBoost and LogitBoost algorithms. Trained and tested them on the binarized Iris data set.
SVM Classifier (Spring 2018) – Implemented a linear SVM classifier using the primal formulation of the Pegasos algorithm (see the first sketch after this section). Trained and tested the model on the MNIST data set.
Convolutional Neural Network (Spring 2018) – Implemented a CNN with the following layers: input -> convolution -> relu -> max pooling -> flatten -> dropout -> linear -> softmax_cross_entropy loss (see the second sketch after this section). Trained, validated, and tested the model on the MNIST data set. Evaluated the model using different values of α and λ for momentum and weight decay, respectively.
MLP Classifier (Spring 2018) – Implemented an MLP classifier with the following layers: input -> linear -> relu -> dropout -> linear -> softmax_cross_entropy loss. Implemented dropout and evaluated the model using a range of dropout rates. Trained, validated, and tested the model on the MNIST data set. Extended the model by adding a series of convolution, relu, and max pooling layers to improve performance.
Perceptron and MSE Classifier Analysis (Spring 2018) – Implemented a perceptron classifier using sequential gradient descent and an MSE classifier using the pseudoinverse method on the UCI Machine Learning Repository Wine data set in Python. Compared the classification performance of both classifiers with normalized and unnormalized data.
Minimum-Distance-to-Class-Means Classifier (Spring 2018) – Created a Python implementation of a multiclass MDTCM classifier and tested it on the UCI Machine Learning Repository Wine data set.
K-Nearest Neighbor Classifier (Spring 2018) – Implemented a kNN classifier for binary classification using only NumPy. Trained and tested the model on a breast cancer data set. Selected the hyperparameter k by choosing the value that gave the highest F1 score on the validation set, then applied min-max scaling to improve the model's performance (see the third sketch after this section).
Logistic Regression Model (Spring 2018) – Implemented a binary logistic regression model using only NumPy and tested it on synthetic data and the MNIST data set. Extended it to a multiclass model using the one-vs-rest approach and multinomial logistic regression.
Linear Regression Model (Spring 2018) – Implemented a linear regression model using only NumPy and tested it on synthetic data. Evaluated the model's performance by plotting its predictions with Matplotlib. Added polynomial features and tested the model on nonlinear synthetic data. Added ridge regression and selected the best regularization hyperparameter based on validation-set performance.
Face Recognition Program (Winter 2017) – Created a Keras implementation of a face recognition system using the FaceNet algorithm and the Inception deep convolutional neural network.
DeepTraffic (Winter 2017) – Modified a deep neural network with a single hidden layer of 10 neurons to train a car to achieve optimal performance in traffic.
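Illustrative sketch for the SVM Classifier project: the entry above names the primal Pegasos algorithm, and the NumPy snippet below is only a minimal sketch of that update rule, not the project's actual code. The function names, the regularization constant lam, the iteration count, and the {-1, +1} label convention are assumptions.

    import numpy as np

    def pegasos_train(X, y, lam=0.01, n_iters=10000, seed=0):
        """Primal Pegasos for a linear SVM. X: (n_samples, n_features); y: labels in {-1, +1}."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        w = np.zeros(d)
        for t in range(1, n_iters + 1):
            i = rng.integers(n)            # pick one training example at random
            eta = 1.0 / (lam * t)          # Pegasos step-size schedule
            if y[i] * X[i].dot(w) < 1:     # hinge loss active: step on both loss and regularizer
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
            else:                          # hinge loss inactive: step on the regularizer only
                w = (1 - eta * lam) * w
        return w

    def pegasos_predict(X, w):
        # Sign of the linear score gives the {-1, +1} prediction.
        return np.sign(X.dot(w))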

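Illustrative sketch for the Convolutional Neural Network project: the entry above spells out the layer sequence input -> convolution -> relu -> max pooling -> flatten -> dropout -> linear -> softmax_cross_entropy loss. The Keras snippet below only mirrors that sequence; the project itself was implemented from scratch, and the filter count, kernel size, dropout rate, learning rate, momentum (α), and weight decay (λ) values here are assumptions.

    from tensorflow import keras

    # Layer sequence taken from the project description; all hyperparameter values are assumed.
    model = keras.Sequential([
        keras.layers.Conv2D(32, (3, 3), activation="relu", input_shape=(28, 28, 1),
                            kernel_regularizer=keras.regularizers.l2(1e-4)),  # convolution + relu, L2 term stands in for weight decay (λ)
        keras.layers.MaxPooling2D((2, 2)),   # max pooling
        keras.layers.Flatten(),              # flatten
        keras.layers.Dropout(0.5),           # dropout
        keras.layers.Dense(10),              # linear layer producing class logits
    ])

    # Softmax cross-entropy loss on the logits; momentum (α) is set on the SGD optimizer.
    model.compile(
        optimizer=keras.optimizers.SGD(learning_rate=0.01, momentum=0.9),
        loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        metrics=["accuracy"],
    )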

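Illustrative sketch for the K-Nearest Neighbor Classifier project: the entry above describes min-max scaling and choosing k by validation-set F1 score. The NumPy-only snippet below illustrates that procedure; the 0/1 label convention, Euclidean distance, function names, and candidate k values are assumptions.

    import numpy as np

    def min_max_scale(X_train, X_other):
        # Fit min-max scaling on the training split and apply it to another split.
        lo, hi = X_train.min(axis=0), X_train.max(axis=0)
        scale = np.where(hi > lo, hi - lo, 1.0)
        return (X_train - lo) / scale, (X_other - lo) / scale

    def knn_predict(X_train, y_train, X_query, k):
        # Euclidean-distance kNN with majority vote; labels assumed to be 0/1.
        dists = np.linalg.norm(X_query[:, None, :] - X_train[None, :, :], axis=2)
        nearest = np.argsort(dists, axis=1)[:, :k]
        return (y_train[nearest].mean(axis=1) >= 0.5).astype(int)

    def f1_score(y_true, y_pred):
        # F1 = harmonic mean of precision and recall for the positive class.
        tp = np.sum((y_pred == 1) & (y_true == 1))
        fp = np.sum((y_pred == 1) & (y_true == 0))
        fn = np.sum((y_pred == 0) & (y_true == 1))
        if tp == 0:
            return 0.0
        precision, recall = tp / (tp + fp), tp / (tp + fn)
        return 2 * precision * recall / (precision + recall)

    def select_k(X_train, y_train, X_val, y_val, candidates=(1, 3, 5, 7, 9)):
        # Choose the k with the highest F1 score on the validation split.
        scores = [f1_score(y_val, knn_predict(X_train, y_train, X_val, k)) for k in candidates]
        return candidates[int(np.argmax(scores))]
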
Education
University of Southern California
• M.S. Electrical Engineering (Grad. May 2019) – Probability and Statistics, Machine Learning, Mathematical Pattern Recognition
• B.S. Electrical Engineering, 3.56 GPA (Grad. May 2017) – Applied Linear Algebra for Engineering, Software Design and Machine Learning in C++ and Python, Digital Signal Processing, Linear Control Theory, Loudspeaker and Sound-System Design
Deeplearning.ai (Fall 2017)
• Deep Learning Specialization on Coursera
Machine Learning (Fall 2017)
• Machine Learning course taught by Andrew Ng on Coursera

Volunteer Experience
USC QuestBridge (Fall 2016 – Spring 2017) – Mentored incoming freshmen who were part of QuestBridge, an organization that helps socio-economically disadvantaged students apply to college
USC Joint Educational Project (JEP) (Winter 2015 – Spring 2017) – Tutored math for high school students at inner-city public schools in Los Angeles and designed curricula to help struggling students
USC Queers in Engineering, Science, and Technology (QuEST) (Fall 2016 – Spring 2017) – Member of QuEST, an organization that provides networking and assistance for LGBT students in STEM majors

Honors and Awards
Eta Kappa Nu Great Lakes National Scholar (Fall 2016)
IEEE PES Scholar (Fall 2016)
Laura Platt Scholar (Fall 2016)
