Data Engineer
Croatia, On-Site, Remote
Life at Q is anything but boring! We’re on a mission to find the As to the most challenging Qs of today. That makes our everyday lives more fun, our team more cohesive and our daily tasks more exciting. Are you ready for a challenge?
We are looking for a Data Engineer to work on a variety of projects with clients across many industries, each with different data-related problems to solve. The ideal candidate is adept at assembling data processing pipelines and ETLs, and has strong experience using a variety of data engineering patterns and tools to collect raw data and convert it into normalized datasets ready for dashboarding or further processing by data analysts and scientists. Following new trends and investing in your knowledge is a must.
What is it all about?
- Communication & coordination with partners and clients
- Participation in planning, evaluation, and project estimations
- Participation in the data architecture and design of the application
- Participation in auditing and analysis of project documentation and specifications
- Planning and setting up the data processing part of the project from scratch
- Using strong business context awareness, as well as an ability to communicate complex findings in a simple, actionable way
- Maintaining ongoing projects
- Proactively proposing pragmatic and cost-effective data processing solutions
- Close collaboration with other team members
- Participation in an agile development process with your project team
- Delivering solutions in compliance with prevailing data and ethical standards
- Developing custom data processing pipelines
What do we expect from you?
- 5+ years of experience in Data Engineering
- Advanced SQL skills with experience in NoSQL databases
- Proficient in Python with hands-on experience in PySpark
- In-depth knowledge of ETL/ELT processes
- Extensive experience with BigQuery and Google Cloud Platform (GCP)
- Skilled in data modeling and warehousing (e.g., star and snowflake schemas, data marts)
- Advanced English proficiency
- Strong analytical skills with the ability to solve complex business problems
- Proven experience leading the design and implementation of data pipelines
- Demonstrates a strong work ethic
- Committed to knowledge sharing, team collaboration, and offering support to colleagues
- Capable of working independently or as part of a team
- Skilled in task slicing, estimation, prioritization, and meeting deadlines
- Passionate about continuous learning and skill development
- Flexible and adaptable to new tools and workflows
- Ability to laugh at least a little bit at this: A SQL query walks into a bar, walks up to two tables, and asks, “Can I join you?”
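For the record, here is a minimal sketch of the kind of join the query in the joke is after: an inner join of two small in-memory "tables" on a shared key, in plain Python (the table names and contents are purely illustrative):

```python
# Two illustrative "tables" as lists of dicts, sharing the key customer_id.
orders = [
    {"order_id": 1, "customer_id": 10, "amount": 99.0},
    {"order_id": 2, "customer_id": 20, "amount": 42.5},
]
customers = [
    {"customer_id": 10, "name": "Ana"},
    {"customer_id": 20, "name": "Ivan"},
]

def inner_join(left, right, key):
    """Join two lists of dicts on `key`, like SQL's INNER JOIN."""
    # Build a hash index on the right table for O(1) lookups.
    index = {row[key]: row for row in right}
    # Merge each left row with its matching right row; drop non-matches.
    return [{**row, **index[row[key]]} for row in left if row[key] in index]

joined = inner_join(orders, customers, "customer_id")
print(joined[0]["name"])  # → Ana
```

The hash-index approach mirrors how engines often execute equi-joins (a hash join): index the smaller table once, then probe it per row of the other table.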
And it would be awesome if you have…
- Experience with Databricks
- Experience with Scala
What we bring to the table:
- The location choice is yours: remote, on-site or hybrid
- Flexible working hours
- Work with new technologies in a high-performance environment
- IT community involvement — Meetups, Workshops & Articles
- Internal workshops & personal development
- Educational budget
- 100% paid sick leave
- Paid health insurance
- Subsidized Multisport card
- Transport allowance & meal allowance
Salary range
Salary depends on your experience and level of knowledge, as assessed in the technical interview.