
Senior Software Engineer, Data Platform - Lakehouse
Software Engineering
Warsaw, Poland
Posted on Friday, November 10, 2023
Bolt's engineering teams work on unique product challenges: complex algorithms for demand prediction, optimal real-time pricing, routing, fraud detection, distributed systems, and much more. Data volumes are growing rapidly, and we are looking for an experienced engineer who is well versed in data technologies.

Your daily adventures will include

  • Designing, building, and optimizing elements of Bolt's Data Platform - Lakehouse. Main areas include developing storage and analytical systems (Data Lake, Data Warehouse) and building infrastructure for data pipelines and machine learning models
  • Investigating and prototyping new services to improve different aspects of our Data Platform: data quality, monitoring, alerting, performance, and cost efficiency
  • Coding mostly in Python, Java, and Scala (previous experience with these is not required), occasionally in other languages
  • Governing and optimizing SQL queries and data storage formats
  • Proactively solving technical challenges and fixing bugs
  • Contributing ideas to our product development roadmap

At Bolt, we use a modern data stack with a Data Mesh architecture, including Kafka, Presto, dbt, Databricks, Airflow, Looker, Mixpanel, and other relevant solutions, to serve thousands of internal customers and millions of external customers.

We are looking for language-agnostic generalists who can pick up new tools to solve the problems they face. Check out our blog to learn more about the exciting projects we are working on.

We are looking for

  • Experience in at least one modern object-oriented language (Python, Scala, Java, JavaScript, C++, etc.)
  • 7+ years of experience in software development
  • Excellent English and communication skills
  • Experience with microservices and distributed systems
  • Solid understanding of algorithms and data structures
  • Experience with Kubernetes and Docker
  • Good knowledge of SQL and experience with at least one popular online analytical processing (OLAP) technology (AWS Redshift, ClickHouse, Presto, Snowflake, Google BigQuery, Databricks, etc.)
  • A university degree in a technical subject (Computer Science, Mathematics, or similar)

You will get extra credits for

  • Experience in building and designing real-time and asynchronous systems
  • Familiarity with streaming data technologies for low-latency data processing (Apache Spark/Flink, Apache Kafka, RabbitMQ, Hadoop ecosystem)
  • Understanding of NoSQL databases (Redis, Elasticsearch, Apache Cassandra)
  • Experience in building systems based on cloud service providers (AWS, Azure, Google Cloud)