Chih-hsin Esa Chen
Computer Science & HCI Engineer


Building accessible technology that serves everyone

Columbia University, New York, NY
B.S. Computer Science • Aug 2024 - May 2026 (Expected)
GPA: 4.07/4.33 • Tau Beta Pi

Designing technology with people at the center


I'm a senior at Columbia University (Aug 2024 - May 2026) pursuing a B.S. in Computer Science with a focus on human-computer interaction and accessibility. With a 4.07/4.33 GPA and membership in Tau Beta Pi, I center my work on a simple question: how can technology better serve people?

From voice-controlled navigation for blind and low-vision users to real-time health monitoring for dementia care, I build systems that prioritize user needs. I believe the best engineering happens at the intersection of technical depth and human empathy.

Before Columbia, I studied at Colorado College (Aug 2021 - May 2024) through the 3+2 Combined Plan program, graduating with a perfect 4.0/4.0 GPA and earning Phi Beta Kappa honors. I bring a global perspective to my work, speaking Mandarin, Taiwanese, Japanese, and Korean fluently.

Currently, I'm researching at Columbia's Computer-Enabled Abilities Lab and Software Systems Lab, where I'm engineering solutions that make technology more accessible and intelligent for diverse communities.

Experience & Projects

Street View Navigation for BLV Users

Columbia University • Computer-Enabled Abilities Lab • Advisor: Prof. Brian Smith

May 2025 - Present

Led iOS development of a voice-controlled Street View navigation app for blind and low-vision users, validated with Helen Keller Services.

  • Shipped four navigation modes with spatialized audio cues and progress tracking
  • Engineered on-device ASR with homophone normalization to improve accuracy in outdoor settings
  • Coordinated TTS/ASR timing to remove self-echo and reduce announcement latency
Swift • iOS • ASR • TTS • Spatial Audio • Accessibility
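The homophone-normalization step can be sketched as a lookup pass over the ASR transcript. This is an illustrative sketch only; the mapping below is hypothetical, not the app's actual table, and the real app is written in Swift.

```python
# Illustrative homophone table for street-name ASR output.
# These entries are hypothetical examples, not the app's real data.
HOMOPHONES = {
    "forth": "fourth",
    "mane": "main",
    "beech": "beach",
}

def normalize_transcript(transcript: str) -> str:
    """Replace known homophones token by token, preserving word order."""
    tokens = transcript.lower().split()
    return " ".join(HOMOPHONES.get(tok, tok) for tok in tokens)
```

In the app, a pass like this runs between on-device recognition and command parsing, so "Forth Street" and "Fourth Street" resolve to the same navigation target.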

Engineering Advising Chatbot

Columbia University • Software Systems Lab • Advisor: Prof. Junfeng Yang

Oct 2025 - Present

Built a Flask-based chatbot that queries an AWS Bedrock Knowledge Base with metadata filtering and result reranking.

  • Engineered admin SPA for knowledge base management with search, metadata facets, and CRUD operations
  • Developed data pipeline tooling for FAQ normalization and S3 upload automation
  • Implemented per-turn logging for retrieval evaluation and performance monitoring
Flask • AWS Bedrock • Python • Knowledge Base • NLP
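The reranking idea can be shown with a minimal, self-contained sketch. Keyword-overlap scoring here stands in for whatever model-based reranker the production system uses; the function name and scoring rule are illustrative, not the project's actual code.

```python
def rerank(query: str, passages: list[str]) -> list[str]:
    """Order retrieved passages by token overlap with the query.

    A toy stand-in for a model-based reranker: passages sharing more
    query tokens sort first.
    """
    q_tokens = set(query.lower().split())

    def score(passage: str) -> int:
        return len(q_tokens & set(passage.lower().split()))

    return sorted(passages, key=score, reverse=True)
```

After retrieval returns candidate chunks from the knowledge base, a reranking pass like this reorders them before they reach the prompt, which is where per-turn logging becomes useful for evaluating retrieval quality.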

SYP - Cocktail Discovery Mobile App

Columbia Build Lab • View on App Store

May 2025 - Aug 2025

Full-stack developer on a social cocktail discovery platform with geospatial search and activity feeds.

  • Accelerated geospatial query performance by 5× (5s → 1s) by migrating to PostgreSQL + PostGIS
  • Engineered core social features: JWT authentication, follow graph, and paginated activity feed
  • Resolved AWS EC2 deployment blocker by diagnosing memory issues and packaging custom Conda environment
PostgreSQL • PostGIS • React • AWS EC2 • Geospatial
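The speedup came from pushing the radius filter into PostGIS, where `ST_DWithin` on a geography column can use a GiST index instead of a full-table distance scan. The query below sketches the shape of such a lookup; the table and column names are hypothetical, since the real schema isn't shown here.

```python
# Hypothetical table/column names for illustration only.
# ST_DWithin on geography(Point, 4326) is index-assisted (GiST), which is
# the kind of change that turns a multi-second scan into a sub-second lookup.
NEARBY_BARS_SQL = """
SELECT id, name
FROM bars
WHERE ST_DWithin(
    location,
    ST_SetSRID(ST_MakePoint(%(lon)s, %(lat)s), 4326)::geography,
    %(radius_m)s
)
LIMIT 20;
"""
```

The parameters (`lon`, `lat`, `radius_m`) would be bound by the application layer at query time.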

Dementia Care Platform

Columbia Build Lab

Jan 2025 - May 2025

Built real-time fall detection system with ML-powered health classification for dementia patients.

  • Developed real-time fall detection processing IoT data via AWS Lambda with WebSocket alerts
  • Created LSTM health classifier in TensorFlow on 5-second windows of HRV/stride data
  • Automated daily data pipeline using CloudWatch Events to process S3 sensor files
AWS Lambda • TensorFlow • LSTM • WebSocket • IoT
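The 5-second windowing over the sensor stream can be sketched in pure Python. This is illustrative only; the real pipeline fed these windows of HRV/stride data to a TensorFlow LSTM, and the function name and signature here are assumptions.

```python
def window(samples: list[float], rate_hz: int, seconds: int = 5) -> list[list[float]]:
    """Split a sensor stream into fixed-length, non-overlapping windows.

    Any trailing partial window is dropped so every window has exactly
    rate_hz * seconds samples, as an LSTM input layer would expect.
    """
    size = rate_hz * seconds
    return [samples[i:i + size] for i in range(0, len(samples) - size + 1, size)]
```

Fixed-size windows like these are what allow a recurrent classifier to run on a stream: each window becomes one sequence fed to the model.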

Speech Emotion Recognition

Columbia University • COMS 6998 (PhD-level)

Jan 2025 - May 2025

ML research project fusing acoustic features with transformer models for emotion classification.

  • Achieved 85.2% accuracy by fusing MFCC features with BERT on 2K+ utterances
  • Observed +2.2 SD pitch shifts in high-arousal emotions via speaker-wise z-normalization
  • Applied multimodal deep learning techniques for robust emotion detection
PyTorch • BERT • OpenSMILE • NLP • ML
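Speaker-wise z-normalization, which makes the pitch-shift comparison fair across speakers, is per-speaker standardization. A minimal sketch (function name and data layout are illustrative assumptions):

```python
from statistics import mean, stdev

def z_normalize_by_speaker(pitches: dict[str, list[float]]) -> dict[str, list[float]]:
    """Standardize each speaker's pitch values against that speaker's own
    mean and standard deviation.

    After this, a value of +2.2 means 2.2 SDs above that speaker's typical
    pitch, so shifts are comparable across low- and high-pitched voices.
    """
    out = {}
    for speaker, values in pitches.items():
        mu, sigma = mean(values), stdev(values)
        out[speaker] = [(v - mu) / sigma for v in values]
    return out
```

Normalizing per speaker rather than globally is what lets a "+2.2 SD shift in high-arousal emotions" be a statement about emotion, not about who happened to be speaking.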

Technologies & Expertise

Programming Languages

Python • Swift • JavaScript • TypeScript • C/C++ • SQL

Frontend & Mobile

React • Angular • iOS (Swift) • SwiftUI • HTML/CSS • Responsive Design

Backend & Cloud

Flask • Django • PostgreSQL • PostGIS • AWS (Lambda, S3, EC2, Bedrock, DynamoDB) • WebSocket • Docker

Machine Learning & NLP

TensorFlow • PyTorch • scikit-learn • BERT • Transformers • NLTK • OpenSMILE • Parselmouth

Specializations

Accessibility (a11y) • Human-Computer Interaction • Geospatial Systems • Real-time Systems • ASR/TTS • Knowledge Bases

Human Languages

English (Fluent) • Mandarin Chinese (Native) • Taiwanese (Native) • Japanese (Fluent) • Korean (Fluent)

Let's work together

I'm currently open to research opportunities, internships, and collaborations. Feel free to reach out if you'd like to discuss accessibility, HCI, or building technology that matters.

Chih-hsin Esa Chen

New York, NY 10027 • Available for opportunities