Remote allowed
Full time

Software Development / Data Engineer

Data Engineer

Ukraine, Poland

About the company

Quantum is a global technology partner delivering high-end software products that address real-world problems.
We advance emerging technologies for outside-the-box solutions. We focus on Machine Learning, Computer Vision, Deep Learning, GIS, MLOps, Blockchain, and more.
Here at Quantum, we are dedicated to creating state-of-the-art solutions that effectively address the pressing issues faced by businesses and the world. To date, our team of exceptional people has already helped many organizations globally attain technological leadership.
We constantly discover new ways to solve never-ending business challenges by adopting new technologies, even when there isn’t yet a best practice. If you share our passion for problem-solving and making an impact, join us and enjoy getting to know our wealth of experience!

About the position

Quantum is expanding the team and has a brilliant opportunity for a Data Engineer. The client is a technological research company that uses proprietary AI-based analysis and language models to provide comprehensive insights into global stocks in all languages. Our mission is to bridge the knowledge gap in the investment world and empower investors of all types to become “super-investors.”
Through our generative AI technology implemented into brokerage platforms and other financial institutions’ infrastructures, we offer instant fundamental analyses of global stocks alongside bespoke investment strategies, enabling informed investment decisions for millions of investors worldwide.


Must have skills:

  • 5+ years in a Data Engineer role or equivalent experience
  • Expertise with Spark/PySpark
  • Solid knowledge of Python (Java can be an option, but the codebase is in Python)
  • Knowledge of Spark deployment and resource management for Spark jobs
  • Understanding of the difference between OLAP and OLTP: operational systems vs. data processing systems
  • Knowledge of data model abstractions (data warehouse, lakehouse) and data models (Medallion architecture, star schema, wide tables)
  • Familiarity with SQL and NoSQL technologies
  • At least an Upper-Intermediate level of English (spoken and written)

Would be a plus:

  • Apache Storm, Apache Flink, GCP Dataflow, Apache Beam
  • Apache Hadoop, Apache Hive, Snowflake, Dremio, Redshift
  • Airflow, Prefect, Airbyte
  • Good working knowledge of AWS Glue or Amazon EMR and related tools
  • Experience with Apache Pulsar, Kafka, or Kinesis in the context of data analysis
  • Presto/Trino, Athena
  • Understanding of the difference between batches and streams, as well as between a queue and a data stream
  • Data quality control: data lineage, quality checks, etc.
  • Knowledge of architecture patterns such as Lambda, Kappa, and data mesh

Your tasks will include:

  • Designing architecture for production solutions and suggesting improvements
  • Building and maintaining data pipelines for the extraction, transformation, and loading of data
  • Managing databases, ensuring data consistency, integrity, and security
  • Detecting and resolving performance issues at the architectural, design, and code levels
  • Developing solid unit and integration tests according to organizational standards
  • Proposing technical solutions for tasks
  • Making project plans and estimating budgets
  • Collaborating with designers, developers, and product owners
  • Taking part in presales, analyzing tasks, and communicating with customers

We offer:

  • Delivering high-end software projects that address real-world problems
  • Being surrounded by experts who are eager to grow professionally
  • Professional growth plan and team leader support
  • Taking ownership of R&D and socially significant projects
  • Participation in worldwide tech conferences and competitions
  • Taking part in regular educational activities
  • Being a part of a multicultural company with a fun and lighthearted atmosphere
  • Working from anywhere with flexible working hours
  • Paid vacation and sick leave days

Join our team. Make a difference.

Take a step towards your data-driven future with Quantum