Autonomous Systems Engineer
I build autonomous systems and real-time control software that operate under hard constraints. I lead engineering teams, architect complex systems, and focus on deterministic, production-grade performance. Passionate about robotics and systems engineering.
About
I approach engineering from the system level down. Whether it is an autonomous robot navigating a field under time constraints or a perception system classifying objects at 30 fps, I focus on architecture that is robust, deterministic, and measurable.
My foundation is in competitive robotics, where I led the design and implementation of autonomous routines, sensor fusion logic, and real-time control loops for 120-pound robots built for the FIRST Robotics Competition. These systems operated under hard timing constraints with no margin for software failure.
I have extended this systems thinking into machine learning research at Georgia Tech and Bogazici University, and into production software development at Token Financial Technologies. I am driven by the intersection of perception, control, and decision-making in physical systems.
Projects
The dataset contained inconsistent categorical values, mixed data types, missing entries, and non-standard formats in critical columns (registration dates, numeric fields). The project required custom validation logic to detect and remove invalid records while preserving as much of the dataset as possible, and categorical features had to be encoded without compromising model generalization.
Implemented a 9-layer sequential neural network with 100 units per dense layer. Applied one-hot encoding to categorical features (model, color, transmission type, fuel type) and StandardScaler normalization to both features and target. Trained with an SGD optimizer (learning rate 0.00001, momentum 0.8) and weight decay for regularization. Tracked training and validation loss over 30 epochs to monitor convergence and prevent overfitting.
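A minimal sketch of this pipeline, assuming a scikit-learn implementation (the original framework is not specified here) and illustrative column names and toy data in place of the real car dataset:

```python
# Hedged sketch: one-hot encoding + scaling feeding a 9x100 dense network
# trained with SGD (lr=1e-5, momentum=0.8) and L2 weight decay, mirroring
# the hyperparameters described above. Data and column names are invented.
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer, TransformedTargetRegressor
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler
from sklearn.neural_network import MLPRegressor

# Illustrative toy data standing in for the car listings dataset.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "model": rng.choice(["A", "B", "C"], size=200),
    "fuel_type": rng.choice(["gas", "diesel"], size=200),
    "mileage": rng.uniform(0, 200_000, size=200),
})
price = 30_000 - 0.1 * df["mileage"] + rng.normal(0, 1_000, size=200)

preprocess = ColumnTransformer([
    # One-hot encode categoricals; standardize numeric features.
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["model", "fuel_type"]),
    ("num", StandardScaler(), ["mileage"]),
])

pipeline = Pipeline([
    ("prep", preprocess),
    # Nine hidden layers of 100 units; alpha is the L2 weight-decay term.
    ("mlp", MLPRegressor(hidden_layer_sizes=(100,) * 9,
                         solver="sgd", learning_rate_init=1e-5,
                         momentum=0.8, alpha=1e-4, max_iter=30)),
])

# Scale the target as well, as described, by wrapping the whole pipeline.
model = TransformedTargetRegressor(regressor=pipeline,
                                   transformer=StandardScaler())
model.fit(df, price)
preds = model.predict(df)
```

With a learning rate this small, 30 epochs mainly demonstrates the convergence-tracking setup; the loss curves are what reveal whether the network is actually learning.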
The robot needed to autonomously score game pieces during a 15-second period with no driver input. This required precise motion profiling, encoder-based dead reckoning, and fault-tolerant state machine logic to handle sensor noise and mechanical slop in real time.
Built on the WPILib command-based framework. Implemented PID controllers for drivetrain velocity and heading, a trapezoidal motion profiler for smooth acceleration, and a finite state machine for sequencing autonomous actions. Integrated gyroscope and wheel encoder fusion for continuous pose estimation.
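The actual implementation used WPILib's Java command-based framework; the control structure can be sketched in Python as a discrete PID loop plus a trapezoidal velocity profile. Gains, limits, and the loop period below are illustrative, not the robot's tuned values:

```python
# Hedged sketch of the control structure only: a discrete PID controller
# and a trapezoidal motion profile. Not the actual WPILib (Java) code.

class PID:
    """Discrete PID controller stepped at a fixed loop period dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def trapezoid_velocity(t, distance, v_max, a_max):
    """Target velocity at time t for a trapezoidal profile over `distance`."""
    t_ramp = v_max / a_max                 # time to reach cruise speed
    d_ramp = 0.5 * a_max * t_ramp ** 2     # distance covered while ramping
    if 2 * d_ramp > distance:              # too short to cruise: triangle
        t_ramp = (distance / a_max) ** 0.5
        v_peak = a_max * t_ramp
        t_total = 2 * t_ramp
    else:
        v_peak = v_max
        t_total = 2 * t_ramp + (distance - 2 * d_ramp) / v_max
    if t < 0 or t > t_total:
        return 0.0
    if t < t_ramp:                         # acceleration phase
        return a_max * t
    if t > t_total - t_ramp:               # deceleration phase
        return a_max * (t_total - t)
    return v_peak                          # cruise phase

# Each loop tick, the profile supplies the velocity setpoint and the PID
# closes the loop on the encoder-measured velocity.
pid = PID(kp=1.0, ki=0.0, kd=0.0, dt=0.02)   # illustrative gains, 50 Hz loop
output = pid.update(trapezoid_velocity(0.5, 4.0, 2.0, 2.0), measurement=0.8)
```

Feeding the profiled setpoint into the PID, rather than the final target directly, is what keeps acceleration bounded and the drivetrain from slipping during the autonomous period.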
The primary challenge was maintaining robust tracking across diverse environments and different rodent morphologies. Environmental variations (lighting conditions, cage setups, background textures) and rodent-specific factors (size, fur color, movement speed) required adaptive vision algorithms that could generalize without requiring manual tuning for each experiment.
Implemented a computer vision system in OpenCV for real-time rodent detection and tracking. Applied background subtraction, morphological operations, and contour analysis to isolate rodents from complex environments. Integrated machine learning classifiers to identify behavioral patterns from trajectory data. Developed a configurable software interface that lets researchers adapt detection parameters for different rodent species and experimental setups without programming knowledge.
Experience
FIRST Robotics Team Mostra, Istanbul Technical University
FIRST Robotics Team Flare, FMV Erenkoy Isik High School
Georgia Institute of Technology
Token Financial Technologies
Bogazici University
Technical Skills