


ROLE AND RESPONSIBILITIES:

Our team is looking for a senior engineer to help craft, build, and maintain an efficient data pipeline architecture. You will work closely with internal partners to identify data-related technical issues and support infrastructure needs. We are looking for an integral member of the Software team who will help define the Data Engineering function at PAX.


As the PAX Senior Software Engineer, you will:

  • Assemble large, sophisticated data sets that meet functional / non-functional business requirements;
  • Identify, design, and implement internal processes and process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc;
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Apache Airflow, AWS Athena, AWS EMR, Spark, and other AWS big data tools;
  • Build analytics tools that utilize the data pipelines to provide meaningful insights into customer acquisition, operational efficiency and other key business performance metrics;
  • Work with partners including the Executive, Software, Product, Data and Design teams to assist with data-related issues and support their data infrastructure needs;
  • Keep our data separated and secured within logical boundaries via strict role based access controls;
  • Maintain data tools such as Apache Superset and Zeppelin that assist analytics and data science team members in building and optimizing our product for future innovations;
  • Work with data scientists and analytics specialists to strive for greater functionality in our data systems.
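One of the responsibilities above is keeping data separated within logical boundaries via strict role-based access controls. As a minimal illustration of that idea, the sketch below models explicit role-to-dataset grants with deny-by-default lookups; the role and dataset names are hypothetical, not taken from PAX's actual systems:

```python
# Hypothetical role-to-dataset grants illustrating logical data boundaries.
# In practice these grants would live in a warehouse's ACLs or IAM policies,
# not in application code.
ROLE_GRANTS = {
    "analyst": {"sales_aggregates", "marketing_metrics"},
    "data_scientist": {"sales_aggregates", "raw_events"},
    "finance": {"sales_aggregates", "payroll"},
}

def can_read(role: str, dataset: str) -> bool:
    """Deny by default: access requires an explicit grant for the role."""
    return dataset in ROLE_GRANTS.get(role, set())
```

The key design choice is deny-by-default: an unknown role or an ungranted dataset simply returns `False`, so new datasets are invisible until someone deliberately grants access.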

QUALIFICATIONS: 

  • 7+ years of experience working as a Data Engineer (or equivalent);
  • Comfort working in an agile and/or lean development environment;
  • Experience with modern data pipelines, data streaming, and real time analytics using systems such as Apache Airflow, AWS Kinesis, AWS EMR, Spark, AWS Lambda, AWS Athena (Presto), Zeppelin, Jupyter Notebook, or similar technologies;
  • Strong programming abilities with Python, Scala, Java or other similar languages;
  • A strong ability to understand and organize data from disparate sources into structures that are easy to digest and query;
  • Advanced working knowledge of SQL, experience with relational databases and query authoring, and familiarity with a variety of databases and their tradeoffs;
  • Experience building and optimizing big data ETL pipelines that are idempotent, incremental and partitioned for performance and cost efficiency;
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and find opportunities for improvement;
  • A successful history of manipulating, processing and extracting value from large disconnected datasets;
  • Solid understanding of message queuing, stream processing, and highly scalable ‘big data’ data stores;
  • Strong project management and organizational skills;
  • The ability to write clear, concise and maintainable code;
  • A strong, self-motivated work style, with the ability to work with minimal day-to-day supervision and to objectively rank and prioritize development work;
  • Excellent English verbal and written communication skills.
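The qualifications above call for ETL pipelines that are idempotent, incremental, and partitioned. A minimal sketch of what that means, using in-memory stand-ins (the table names and partitioning scheme are hypothetical, not PAX's actual pipeline):

```python
from datetime import date, timedelta

# Hypothetical stand-ins for a source table and a date-partitioned target
# (e.g. S3 prefixes queried by Athena, keyed by partition date).
source_rows = [
    {"event_date": date(2023, 1, 1), "amount": 10},
    {"event_date": date(2023, 1, 1), "amount": 5},
    {"event_date": date(2023, 1, 2), "amount": 7},
]
target_partitions = {}  # partition_date -> aggregated result

def run_etl_for_partition(partition_date):
    """Process exactly one date partition.

    Idempotent: re-running overwrites the partition rather than appending,
    so duplicate runs (e.g. scheduler retries) cannot double-count.
    """
    rows = [r for r in source_rows if r["event_date"] == partition_date]
    target_partitions[partition_date] = {
        "total_amount": sum(r["amount"] for r in rows)
    }

def backfill(start, end):
    """Incremental: only partitions not yet present are computed."""
    d = start
    while d <= end:
        if d not in target_partitions:
            run_etl_for_partition(d)
        d += timedelta(days=1)

backfill(date(2023, 1, 1), date(2023, 1, 2))
run_etl_for_partition(date(2023, 1, 1))  # safe re-run: same result
```

Partitioning by date is what makes both properties cheap: each run touches one bounded slice of data, and overwriting that slice is safe no matter how many times the task retries.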

Experience with the following software/tools:

  • Big data systems: Hadoop, Spark, Sqoop;
  • Relational SQL and NoSQL databases, including MySQL, and distributed query engines such as Presto;
  • Data pipeline and workflow management tools: Apache Airflow;
  • AWS cloud services: EC2, EMR, Kinesis, Lambda, Redshift, Athena, Glue;
  • Stream-processing systems: Storm, Spark-Streaming, etc;
  • Object-oriented and functional scripting languages: Python, Java, Scala;
  • Bash scripting, Terraform, and machine learning tools and concepts are a plus.

Education:

  • Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field preferred, or equivalent experience.
 
PAX LABS PERKS & BENEFITS:

  • Competitive compensation, equity & bi-annual performance reviews
  • Fully funded comprehensive medical, dental, and vision coverage
  • 401K plan
  • Generous PTO policy 
  • Paid Parental Leave
  • Monthly wellness reimbursement
  • Cell Phone reimbursement
  • Employee Purchase Program for discounted PAX devices
  • Weekly catered lunch, endless snacks and beverages
  • Dog Friendly HQ in the Mission District of San Francisco
  • Employee Assistance Program including access to online legal support

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.