Remote Developer & Coding Jobs


Posted about 2 months ago

About you:

  • You care deeply about democratizing access to data.

  • You are passionate about big data and excited by seemingly impossible challenges.

  • At least 80% of the people who have worked with you would put you in the top 10% of people they have worked with.

  • You think life is too short to work with B-players.

  • You are entrepreneurial and want to work in a super fast-paced environment where the solutions aren’t already predefined.

  • You live in the U.S. or Canada and are comfortable working remotely.
About SafeGraph: 

  • SafeGraph is a B2B data company that sells to data scientists and machine learning engineers.

  • SafeGraph's goal is to be the place for all information about physical Places.

  • SafeGraph currently has 20+ people and has raised a $20 million Series A. The CEO was previously founder and CEO of LiveRamp (NYSE: RAMP).

  • The company is growing fast (over $10M ARR) and is currently profitable.

  • The company is based in San Francisco, but about 50% of the team is remote (all in the U.S.). The entire company gets together in the same place every month.
About the role:

  • Core software engineer.

  • Reporting to SafeGraph's CTO.

  • Work as an individual contributor.  

  • Opportunities for future leadership.
Requirements:

  • You have at least 6 years of relevant work experience.

  • You are proficient at writing production-quality code, preferably in Scala, Java, or Python.

  • You have strong familiarity with map/reduce programming models.

  • You have a deep understanding of all things “database”: schema design, optimization, scalability, etc.

  • You are authorized to work in the U.S.

  • You have excellent communication skills.

  • You are amazingly entrepreneurial.

  • You want to help build a massive company.
Nice to haves:

  • Experience using Apache Spark to solve production-scale problems.

  • Experience with AWS.

  • Experience with building ML models from the ground up.

  • Experience working with huge data sets.

Keywords: Python, Database and Systems Design, Scala, Data Science, Apache Spark, Hadoop MapReduce.