TalentRemedy · USA only
Description

This is a remote position.

The mission of the Processing Team is to build cross-domain systems to perform RF-based data collection and geolocation. The Processing Team includes experts across FPGA development, embedded software, software-defined radio, and cloud development, as well as deep knowledge of signal-of-interest (SOI) digital signal processing, RF communications systems, RF measurement systems, and geolocation. Our client is currently seeking a Senior Data Engineer who can help the Processing Team design, build, and deploy data pipelines for RF processing and geolocation.

As a Senior Data Engineer on the Processing Team, you will be responsible for designing and implementing distributed, reliable backend software systems that consume and leverage RF data at scale. You will need experience building and running data pipelines in production, along with a passion for robustness, observability, and monitoring. A successful data engineer will be expected to work closely with RF and geolocation domain specialists, data scientists, and analysts to deploy pipelines optimized for both performance and low latency.

We support a broad range of software to accomplish our mission, especially favoring Python and C++ for backend software; Kubernetes clusters on AWS; data pipelines orchestrated with Airflow; data storage in Amazon S3 and PostgreSQL as appropriate; and Elasticsearch and Kibana for log analytics and monitoring dashboards. (A minimal sketch of this pipeline style follows the responsibilities list below.)

Location: This position can be hybrid with work-from-home flexibility, or 100% remote.

Your main responsibilities will be:
  • Contribute to the design, implementation, and testing of the company's data platform and data pipelines, optimizing for scalable, low-latency deployment within a batch-processing cloud environment
  • Build, document, and support software systems & tools (data pipelines, utility libraries, core features, etc.) that enable high-quality research and production deployments throughout the team
  • Define scope, build consensus within the technical team, and drive new feature development with input from stakeholders throughout the company
  • Participate in collaborative, fast-paced software development practices, particularly performing merge request reviews and providing design feedback
  • Guide and mentor other individual contributors; work closely with RF & geolocation domain specialists to achieve the team mission
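
The stack described above can be illustrated with a short orchestration example. The following is a minimal sketch only, assuming Apache Airflow 2.x is available; the DAG name, task names, and S3 paths are hypothetical stand-ins for the collect-process-geolocate flow this role would own:

```python
# Minimal sketch of a batch RF-processing pipeline in Apache Airflow 2.x.
# All names (dag_id, task_ids, bucket paths) are hypothetical illustrations.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def list_captures(**context):
    # Placeholder: discover new RF capture files in object storage (e.g. S3).
    return ["s3://example-bucket/captures/2023-01-01/part-0000.bin"]


def run_geolocation(**context):
    # Placeholder: run the batch signal-processing / geolocation job.
    pass


with DAG(
    dag_id="rf_geolocation_batch",   # hypothetical pipeline name
    start_date=datetime(2023, 1, 1),
    schedule_interval="@hourly",     # batch cadence, for illustration
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="list_captures", python_callable=list_captures)
    geolocate = PythonOperator(task_id="run_geolocation", python_callable=run_geolocation)
    extract >> geolocate             # geolocation runs after extraction
```

In production, the placeholder callables would be replaced by real processing steps, with results landing in S3 or PostgreSQL and logs flowing to Elasticsearch/Kibana as described above.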

Requirements

Education and experience:
  • B.S. degree in Computer Science or a comparable field, or equivalent experience
  • 6+ years of professional experience
  • 3+ years of experience building data pipelines and other cloud-based infrastructure: workflow management (e.g. Airflow, Argo Workflows, AWS Step Functions), object storage, relational databases (specifically PostgreSQL and PostGIS, including experience writing and testing SQL), REST/GraphQL APIs, message passing (Kafka, SNS), etc.
  • Experience with data science and/or software development using Python, especially with industry-standard Python libraries: pandas, SciPy, scikit-learn, Dask/Ray, Flask, FastAPI, etc. (see the sketch after this list)
  • Experience building software and tools that facilitate effective research & development, with a passion for clean code, scalable architectures, test-driven development, and robust logging
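
To make the Python tooling expectation concrete, here is a minimal sketch, assuming FastAPI and pandas are installed; the endpoint, fields, and in-memory detections table are hypothetical stand-ins for a PostgreSQL/PostGIS-backed data product:

```python
# Hypothetical REST endpoint serving RF detection records with FastAPI + pandas.
import pandas as pd
from fastapi import FastAPI

app = FastAPI()

# Stand-in for a PostgreSQL/PostGIS-backed detections table.
detections = pd.DataFrame(
    {"signal_id": [1, 2], "lat": [38.96, 38.97], "lon": [-77.38, -77.40]}
)


@app.get("/detections/{signal_id}")
def get_detection(signal_id: int):
    # Filter the table and return matching rows as JSON-serializable records.
    rows = detections[detections["signal_id"] == signal_id]
    return rows.to_dict(orient="records")
```

Run locally with `uvicorn app_module:app` (module name assumed) and query `GET /detections/1`.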
Essential:
  • Familiarity with CI/CD best practices: automated testing, using a dev/prod workflow, publishing to Artifactory or another package repository, deploying containerized software, etc.
  • Track record of building and supporting mission-critical backend applications in production
Desirable:
  • Experience administering modern cloud applications and infrastructure running in Kubernetes on AWS or another cloud provider
  • Working knowledge of frontend development (React/Angular, JavaScript, WebAssembly, etc.), especially prior examples of building proof-of-concept applications to consume and interact with data products
  • Familiarity with the ELK stack (Elasticsearch, Logstash, Kibana) for aggregating logs, creating queries/dashboards, and monitoring production deployments in real time
  • Familiarity with software acceleration, including multi-core parallelism, cluster-based scaling (e.g. Dask, Spark), and/or GPUs, for bespoke applications (see the sketch after this list)
  • Familiarity with RF signal processing or geolocation algorithms and applications, particularly in a batch-processed cloud environment
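
As an illustration of the cluster-based scaling mentioned above, here is a minimal sketch assuming Dask is installed; `process_chunk` and the capture paths are hypothetical placeholders for a bespoke RF-processing step:

```python
# Hypothetical fan-out of a per-file RF-processing step across a Dask bag.
import dask.bag as db


def process_chunk(path):
    # Placeholder for a bespoke RF-processing function applied to one capture.
    return {"path": path, "detections": 0}


# Hypothetical capture file paths; a real job would list these from S3.
paths = [f"s3://example-bucket/captures/part-{i:04d}.bin" for i in range(8)]

# Partition the work and process chunks in parallel; the same code scales
# from a laptop's local scheduler to a distributed cluster.
results = db.from_sequence(paths, npartitions=4).map(process_chunk).compute()
```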
Company Overview:

Our client delivers a revolutionary source of global knowledge based on radio frequency (RF) geospatial analytics to those working to make the world a safer place. The company operates a commercial satellite constellation that detects, geolocates, and identifies a broad range of signals and behaviors, and employs cutting-edge AI techniques to equip its global customers with the high-impact insights needed to make decisions with confidence. The company is headquartered in Herndon, Virginia.

The client is committed to hiring and retaining a diverse workforce. They are proud to be an Equal Opportunity Employer, making decisions without regard to race, color, religion, sex, sexual orientation, gender identity, gender expression, marital status, national origin, age, veteran status, disability, or any other protected class.


Education

Bachelor's degree