Shot Velocity Computer
Turning raw sensor waveforms into validated velocity measurements — giving engineers confidence that their Mach 4+ projectiles hit target speeds.
Raw data doesn't answer the question
After each test fire, engineers need to know one critical thing: how fast did the projectile go? The sensors capture voltage waveforms from induction coils along the barrel — but those waveforms don't directly tell you velocity.
Previously, velocity calculation required manual analysis: downloading data, loading it into analysis tools, identifying signal peaks, measuring time deltas, doing the math. It worked, but it was slow and left room for error.
"We'd finish a test and then spend an hour figuring out if it actually worked. By then, the team had moved on mentally — we needed answers faster."
Automated analysis, instant answers
I built the Velocity Computer to automate the entire analysis pipeline — from raw waveform to validated velocity number. The system identifies signal peaks, calculates time-of-flight between sensor pairs, computes velocity, and flags any measurements that look suspicious.
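The core math is a time-of-flight calculation. Here is a minimal sketch in Python, assuming evenly sampled voltage traces and a known coil spacing; `COIL_SPACING_M`, `SAMPLE_RATE_HZ`, and `velocity_between_coils` are illustrative names and values, not the production code.

```python
import numpy as np

COIL_SPACING_M = 0.5        # assumed distance between adjacent coils (illustrative)
SAMPLE_RATE_HZ = 1_000_000  # assumed digitizer sample rate (illustrative)

def velocity_between_coils(trace_a: np.ndarray, trace_b: np.ndarray) -> float:
    """Time-of-flight velocity between two induction-coil waveforms."""
    # Simplest possible peak pick: the sample of maximum voltage in each trace.
    # The real detector is more careful (see Signal Processing Decisions below).
    dt = (np.argmax(trace_b) - np.argmax(trace_a)) / SAMPLE_RATE_HZ
    return COIL_SPACING_M / dt  # meters per second
```

Everything downstream (per-stage velocities, flags, export) is bookkeeping around this one number per sensor pair.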
Core Features
- Automatic peak detection in noisy signals
- Velocity calculation at each boost stage
- Visual waveform plots with detected points
- Validation flags (✓ confirmed, ? needs review)
- CSV export for further analysis (output shape sketched below)
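To make the last three items concrete, here is a hypothetical shape for the per-stage output, carrying the validation flag alongside each velocity and dumping the run to plain CSV. `StageVelocity` and `export_csv` are illustrative names, not the tool's actual API.

```python
import csv
from dataclasses import dataclass

@dataclass
class StageVelocity:
    stage: int            # boost stage index
    velocity_mps: float   # computed velocity, meters per second
    flag: str             # "✓" confirmed, "?" needs review

def export_csv(path: str, results: list[StageVelocity]) -> None:
    """Write per-stage results to a CSV file for further analysis."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["stage", "velocity_mps", "flag"])
        for r in results:
            writer.writerow([r.stage, f"{r.velocity_mps:.1f}", r.flag])
```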
Signal Processing Decisions
- Dual-threshold peak detection for robustness (sketched after this list)
- Smoothing filter tuned to expected signal shape
- Outlier detection based on physics constraints
- Color-coded waveforms per sensor pair
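A sketch of how the first and third decisions might look, under assumptions: "dual-threshold" is read here as hysteresis (a peak must cross a high threshold to count, and the detector re-arms only after the signal falls back below a low threshold, so noise riding near a single threshold cannot double-count), and the physics constraint is a plausible-velocity window. Thresholds and bounds are illustrative, not the tuned values.

```python
import numpy as np

def dual_threshold_peaks(signal: np.ndarray, high: float, low: float) -> list[int]:
    """Hysteresis peak picker: trigger above `high`, re-arm below `low`."""
    peaks: list[int] = []
    armed, start = True, 0
    for i, v in enumerate(signal):
        if armed and v >= high:
            armed, start = False, i   # crossed the high threshold
        elif not armed and v < low:
            # Peak index = maximum sample between the two crossings.
            peaks.append(start + int(np.argmax(signal[start:i])))
            armed = True              # re-armed for the next peak
    return peaks  # a peak still "open" at end-of-trace is dropped

def physically_plausible(v_mps: float,
                         v_min: float = 300.0,
                         v_max: float = 2500.0) -> bool:
    """Physics-constraint outlier check (assumed bounds): a velocity outside
    the range the launcher can produce gets flagged, not trusted."""
    return v_min <= v_mps <= v_max
```

The hysteresis gap between `high` and `low` is what buys robustness: a single threshold turns every noise wiggle at the trigger level into a spurious peak.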
From an hour to seconds
The Velocity Computer now runs immediately after each test. Engineers see computed velocities within seconds, with visual confirmation of the waveforms and detected peaks. Suspicious measurements are flagged for manual review, while the roughly 90% of measurements that come back clean are validated automatically.
What I learned
Signal processing in the real world is messy. The theoretical algorithm was straightforward — find peaks, measure time, calculate velocity. But real sensor data has noise, dropouts, and edge cases that break naive implementations.
I spent more time on validation and edge cases than on the core algorithm. The goal wasn't just to compute velocities — it was to compute velocities engineers could trust. That meant showing the work, flagging uncertainty, and making it easy to verify results visually.