When we first started selling to smartphone OEMs at Elliptic Labs, tuning our ultrasound virtual sensors was a significant bottleneck. Each new smartphone integration required our signal processing experts to travel onsite for days of manual parameter tuning and repetitive validation. This process was slow, expensive, and reliant on guesswork, pulling our top talent away from core research and development.
To solve this, I hired and led a team to build a scalable, automated infrastructure that transformed this manual effort into a data-driven process. The system had three core components:
Rich data collection: An Android app guided junior personnel through data capture. It recorded not only acoustic data but also critical metadata for tagging corner cases and diagnosing performance issues, along with ground truth from a custom-designed external sensor.
Data quality assurance: Before being used for ML training or testing, all incoming data was reviewed in a dedicated web app. The tool used heuristics to flag potential collection errors, enabling a quick and effective quality check before the data was committed.
Automated analysis: The quality-assured data was fed into a cloud platform that emulated the on-device environment, running the exact same signal processing libraries and ML models. The platform automatically computed performance metrics, which were then queried via a web interface to make objective, data-backed decisions on model releases (see case study).
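To make the data collection and QA steps concrete, here is a minimal sketch of what a recording record and its heuristic checks might look like. All field names, thresholds, and the `Recording` schema are illustrative assumptions, not the actual Elliptic Labs implementation:

```python
from dataclasses import dataclass, field

# Illustrative recording schema: acoustic samples plus the tagging
# metadata and external-sensor ground truth described above.
@dataclass
class Recording:
    device_model: str
    scenario: str                  # e.g. "on-table", "hand-approach" (assumed tags)
    samples: list[float]           # acoustic signal, normalized to [-1, 1]
    sample_rate_hz: int
    ground_truth: list[float]      # distance readings from the external sensor
    tags: list[str] = field(default_factory=list)

def quality_flags(rec: Recording) -> list[str]:
    """Return human-readable warnings for a reviewer to inspect."""
    flags = []
    duration_s = len(rec.samples) / rec.sample_rate_hz
    if duration_s < 1.0:
        flags.append(f"too short: {duration_s:.2f}s")
    # Clipped samples suggest a gain or hardware problem during capture.
    clipped = sum(1 for s in rec.samples if abs(s) >= 0.999)
    if clipped / max(len(rec.samples), 1) > 0.01:
        flags.append("possible clipping (>1% of samples at full scale)")
    if not rec.ground_truth:
        flags.append("missing ground truth from external sensor")
    if not rec.tags:
        flags.append("no scenario tags recorded")
    return flags
```

A reviewer would see these flags next to each incoming session and decide whether to accept, re-tag, or reject it before the data reaches training or testing.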
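The automated metric computation can be sketched in the same spirit: run the on-device model over each quality-approved recording and score its per-frame presence decisions against the external-sensor ground truth. The frame structure, distance threshold, and metric names below are assumptions for illustration:

```python
def presence_metrics(predictions: list[bool], ground_truth_mm: list[float],
                     near_threshold_mm: float = 100.0) -> dict[str, float]:
    """Compare per-frame model decisions with external-sensor distances.

    A frame counts as "near" when the ground-truth distance is within
    the (assumed) threshold; the model should predict True there.
    """
    assert len(predictions) == len(ground_truth_mm)
    truth = [d <= near_threshold_mm for d in ground_truth_mm]
    tp = sum(p and t for p, t in zip(predictions, truth))
    fp = sum(p and not t for p, t in zip(predictions, truth))
    fn = sum(t and not p for p, t in zip(predictions, truth))
    total = len(predictions)
    return {
        "accuracy": (total - fp - fn) / total,
        "false_trigger_rate": fp / total,
        "miss_rate": fn / total,
    }

# Example: 4 frames, one false trigger and one miss.
metrics = presence_metrics([True, True, False, False],
                           [50.0, 200.0, 300.0, 80.0])
# → {"accuracy": 0.5, "false_trigger_rate": 0.25, "miss_rate": 0.25}
```

Aggregating such per-recording metrics by device model and scenario is what lets a web interface answer release questions objectively rather than by feel.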
Impact: We replaced subjective guesswork with an efficient, reliable validation system. This infrastructure empowered our field engineers to manage customer projects autonomously, removing the R&D team from the tuning process entirely. As a result, our top experts stopped tweaking parameters and focused fully on designing the next generation of platforms and user experiences.