Hibachi

Data Engineer

Lightning speed. Institutional-grade privacy. Secured by zk math.

$140,000 - $200,000 /year

Remote (Worldwide)

Data

Mid-Level

11-50 employees

Blockchain Fintech Infrastructure

Last updated 2026-04-02T17:16:48.909Z

About the role

Overview

Who We Are

Stablecoins are beginning to reshape the global FX market, where more than $10 trillion trades every day. Hibachi is building the exchange designed for that shift.

We are building a modern central limit order book for global currencies with transparent prices, direct access to liquidity, and infrastructure designed for continuous global markets. Our goal is to open FX trading beyond the traditional interbank system and create a venue where global money can move freely.

We are a small team of engineers and traders who have built market infrastructure at Tower Research, Citadel, Coinbase, and Bloomberg. We care deeply about performance, correctness, and building systems that operate at global scale.

Hibachi is backed by Dragonfly Capital, Electric Capital, Coinbase Ventures, and Circle Ventures.

About The Technology

Hibachi runs a high-performance, off-chain central limit order book built for fast, private trading and deep liquidity. Zero-knowledge proofs allow anyone to verify the exchange’s solvency on-chain without revealing user positions. The result is transparent infrastructure built for global markets.


The Role

We are seeking a Data Engineer with broad expertise in data modeling, advanced SQL, ETL/ELT development, and Change Data Capture (CDC). You will design and maintain end-to-end data solutions covering batch and streaming ingestion, Iceberg-backed data warehousing, and CDC with AWS DMS (or similar tools). This role requires strong communication skills to ensure data initiatives align with and drive business objectives.

You’ll Be Responsible for:

• Data Pipeline Development: Architect, build, and maintain batch and streaming data pipelines using PySpark, AWS Glue, and Airflow. Implement Change Data Capture (CDC) with AWS DMS (or comparable tools) to capture incremental updates from source systems.
• Data Modeling & Architecture: Design modular, reusable, and scalable data models adhering to best practices. Work with an Iceberg-backed data warehouse solution for performant storage, queries, and transformations. Ensure consistent data definitions and governance using frameworks like the AWS Glue Data Catalog.
• ETL/ELT: Manage ETL/ELT pipelines ensuring efficient data ingestion, cleansing, and aggregation. Monitor and debug performance bottlenecks, applying tuning techniques where necessary.
• Data Visualization & Analytics: Develop QuickSight dashboards (or similar BI tools) to surface actionable insights for stakeholders.
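To give a concrete flavor of the CDC work described above: at its core, applying a change stream to a warehouse table is an upsert/delete merge keyed on a primary key. The sketch below illustrates those merge semantics in plain Python, mirroring the `Op` column (`I`/`U`/`D`) that AWS DMS emits in its change records; in production this logic would typically be a `MERGE INTO` against an Iceberg table rather than an in-memory dict. The record shapes and field names here are hypothetical illustrations.

```python
# Plain-Python sketch of CDC merge semantics: the same logic an
# Iceberg MERGE INTO applied to a DMS change stream would implement.
# Record shapes and field names are hypothetical.

def apply_cdc_events(target: dict, events: list) -> dict:
    """Apply DMS-style change events to a target keyed by primary key.

    Each event carries an "Op" field: "I" (insert), "U" (update),
    or "D" (delete), mirroring the Op column DMS emits.
    """
    for event in events:
        op = event["Op"]
        key = event["id"]
        if op in ("I", "U"):
            # Upsert: last write wins for a given key.
            target[key] = {k: v for k, v in event.items() if k != "Op"}
        elif op == "D":
            target.pop(key, None)
    return target

# Hypothetical mini-stream: insert one row, update another, delete the first.
table = {1: {"id": 1, "balance": 100}}
events = [
    {"Op": "I", "id": 2, "balance": 50},
    {"Op": "U", "id": 1, "balance": 120},
    {"Op": "D", "id": 2},
]
table = apply_cdc_events(table, events)
```

After applying the stream, only the updated row for key 1 remains; the row inserted and then deleted for key 2 is gone, which is exactly the end state a warehouse merge must reproduce.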


You’ll Need to Have:

• Bachelor’s or Master’s Degree in Computer Science, Engineering, or a related field (or equivalent experience).
• 2+ years of hands-on experience with PySpark for batch and streaming pipelines. Familiarity with streaming ecosystems (Kafka, Kinesis, Spark Structured Streaming).
• Strong proficiency in AWS Glue, Apache Airflow, and Apache Iceberg.
• Experience with AWS DMS or other CDC tools to manage real-time or near-real-time data ingestion.
• Advanced SQL knowledge, including performance tuning and complex transformations.
• Proven background in data modeling and data architecture best practices (data warehouse/data lake).
• Experience with BI platforms (QuickSight, Tableau, Power BI, etc.) for dashboard development.
• Understanding of testing frameworks (e.g., Pytest) for data pipelines, unit testing, and QA processes.
• Excellent communication skills, with an ability to bridge technical and business requirements.
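On the pipeline-testing point above: a common pattern is to factor transformations into pure functions so Pytest can exercise them without spinning up a Spark cluster. A minimal sketch, where the `normalize_trade` transformation and its fields are hypothetical:

```python
# Sketch of unit-testing a pipeline transformation as a pure function;
# in a real repo this assertion would live in a Pytest test module.
# The normalize_trade transformation and its fields are hypothetical.

def normalize_trade(raw: dict) -> dict:
    """Cleanse one raw trade record: trim/upcase the symbol, coerce types."""
    return {
        "symbol": raw["symbol"].strip().upper(),
        "qty": int(raw["qty"]),
        "price": float(raw["price"]),
    }

def test_normalize_trade():
    raw = {"symbol": " eurusd ", "qty": "10", "price": "1.0845"}
    out = normalize_trade(raw)
    assert out == {"symbol": "EURUSD", "qty": 10, "price": 1.0845}

test_normalize_trade()
```

Keeping cleansing logic out of the Spark driver code like this makes both local testing and later reuse in batch and streaming paths straightforward.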


We’d Love to See:

• Background in trading, HFT, or capital markets infrastructure

Skills

Data Analysis, Communication, Dashboards, Data Modeling, ETL/ELT

Candidate requirements

These are the minimum requirements a candidate must meet to be considered for this role.

  • Bachelor’s or Master’s Degree in Computer Science, Engineering, or a related field (or equivalent experience).
  • 2+ years of hands-on experience with PySpark for batch and streaming pipelines.
