Portfolio Jobs

Looking for your next role? Take a look at these exciting jobs at Sapphire Ventures’ portfolio companies. Our Talent team is passionate about connecting you to your dream job!

Senior Software Engineer

Software Engineering
New York, NY, USA
Posted on Thursday, May 25, 2023

Who are we?

FalconX is one of the fastest-growing startups in FinTech. We are redefining prime brokerage from the ground up.

We are backed by some of the best investors in the world, including Accel, American Express, B Capital, Coinbase, Fidelity, Lightspeed Venture Partners, Fenbushi Capital, and Tiger Global Management, plus more yet to be publicly disclosed.

We deliver best-in-class trading, credit, custody, and structured products to institutional digital asset traders. We trade, lend, and secure tens of billions of dollars monthly; we are highly profitable and growing fast, so we need your help!

We are data-driven. Whether it's a growth or product decision, we believe data can always help us make more precise and informed choices.

We move fast. Speed of execution is essential for any startup, but we believe this is even more pertinent in our 24/7 industry.

We prioritize learning. Outcomes are mission-critical, but we also believe that learning from both success and failure will drive our continued growth. Our industry is emergent: there is no shortage of experiments to get involved with as we continue growing and learning together.

FalconX has offices in San Mateo, Chicago, New York, Bangalore, Malta, and Singapore.

Who is on the team?

We are entrepreneurs. Many in our company have been founders or have aspirations to eventually start their own company. We take these ambitions and experiences to bring a solutions-oriented mindset to the problems we encounter day-to-day.

We are experienced. We have been fortunate to have learned from mentors and peers at institutions such as Google, LinkedIn, JUMP Trading, Citadel, PEAK6 Investments, Goldman Sachs, Harvard Business School, Carnegie Mellon, IIT + more.

FalconX’s Data Infra team builds and operates systems that centralize internal and third-party data; make it easy for the engineering, data science, business intelligence, accounting, and compliance teams to transform and access that data for analytics and machine learning; and power end-user experiences. As a Data Engineer on the team, you will contribute to scalable batch and streaming ETL pipelines, data warehouse design and data modeling, data governance initiatives, and the tools and applications that make that data available to other teams and systems.

What you’ll be working on:

  • Provide technical and thought leadership for data engineering and business intelligence
  • Create, implement, and operate the strategy for robust and scalable data pipelines for business intelligence and machine learning
  • Develop and maintain the core data framework and key infrastructure
  • Design the data warehouse and data models for efficient, cost-effective reporting
  • Define and implement data governance processes covering data discovery, lineage, access control, and quality assurance

Skills you'll need:

  • Degree in Computer Science or a related field, or equivalent professional experience
  • 3+ years of strong experience with data transformation and ETL on large data sets using open technologies such as Spark, SQL, and Python
  • 3+ years of complex SQL, with strong knowledge of SQL optimization and an understanding of logical and physical execution plans
  • 1+ years working in an AWS environment, with familiarity with modern technologies such as AWS cloud services, MySQL, Redis caching, and messaging tools like Kafka/SQS
  • Experience with advanced data lake and data warehouse concepts and with data modeling (e.g. relational, dimensional, internet-scale logs)
  • Knowledge of Python, Spark (batch/streaming), Spark SQL, and PySpark
  • Proficiency in at least one object-oriented programming language: Python, Java, or C++
  • Effective craftsmanship in building, testing, and optimizing ETL/feature/metric pipelines
  • Experience with business requirements definition and management, structured analysis, process design, and use case documentation
  • A data-oriented mindset

Nice to haves:

  • Experience with AWS services, especially RDS, MSK, EMR, S3, Glue, and Kinesis
  • Prior experience with Databricks, Snowflake, Airflow, and Delta Lake

Base pay for this role is expected to be between $155,000 and $245,000 USD. This expected base pay range is based on information available at the time this post was generated. This role will also be eligible for other forms of compensation, such as a performance-linked bonus, equity, and a competitive benefits package. Actual compensation for a successful candidate will be determined based on a number of factors, such as skill set, experience, and qualifications.