We are looking for an experienced backend engineer to build high-throughput ingestion systems for market data. Our systems currently take in 10,000+ events per second, and we are looking to grow that ingestion flow tenfold. As a backend engineer, you will have an extensive impact on our market data infrastructure, scale up the ingestion pipelines by an order of magnitude, and work with highly complex data arriving at blazing speed.
Our data is both real-time and historical, each of which presents unique challenges in how we store it and provide access to it. Our APIs are written in GraphQL, REST, and gRPC depending on the use case. We like Go for its simplicity and maintainability, and Postgres for its features and reliability. We deploy using Kubernetes on Amazon Web Services. That being said, we're not dogmatic and always believe in using the best tool for the job.
What you'll do
- Streamline the ingestion of high throughput trade events we receive from dozens of sources around the clock
- Design and build new services to broaden the scope of the quantitative data, both financial and contextual, that we provide to our users and to our research team
- Get down to the nitty-gritty: how many bytes is each trade payload? How much RAM do we use per observable market?
- Have a high impact on the design direction and implementation details of the core pipelines that we use to stream to thousands of users around the world
- Scale up our systems by an order of magnitude
Who you are
- 2+ years of experience with modern backend technologies including Go, Postgres, and AWS
- A desire to work on big, complex projects with lots of creative freedom
- A passion for shipping high-quality products
- An interest in the crypto space, and a love for building tools that help users navigate the web3 world
- Empathy and a love of helping your teammates grow
Projects you could work on
- Data aggregation! Design and build new pipelines to provide metrics derived from other metrics, on-demand custom metrics, and more
- Trade ingesters! Build our fleet of microservices responsible for observing, standardizing, and submitting trades from their source to our core market data service
- Data integrity! Build tools for us to respond to data incidents, create redundancy, and validate our ever-growing database of all crypto market data going back to 2009
- Asset metrics! Create services responsible for the computation of our ever-growing list of asset metrics, from all-time highs to smart contract calls
What’s it like to work with our engineering team?
- A welcoming and open environment with people who love to collaborate on ideas and tackle complex problems
- Work on a small team of engineers with an outsized impact across product, research, and business development
- Participate in forums like Family Meals and Messari Lab to share ideas or technologies you’ve been exploring and tinkering with