Big Data Engineer


We are seeking an experienced Cloud Data Engineer proficient in at least one hyperscale cloud platform (ideally Azure and/or AWS). The ideal candidate will have hands-on experience designing and implementing data pipelines, data warehousing solutions, and ETL workflows.


    Responsibilities:

  • Design and implement data pipelines and ETL workflows using Azure or AWS services (or comparable platforms).

  • Develop and maintain data warehousing solutions on AWS and Azure (or comparable platforms).

  • Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.

  • Build and maintain efficient, scalable, and reliable data architectures for analytics and reporting.

  • Work closely with the data analytics team to optimize data storage, processing, and retrieval for reporting and analysis purposes.

  • Implement data quality checks and data validation processes to ensure accuracy and consistency of the data.

  • Maintain and update data documentation and data dictionaries.

  • Troubleshoot and resolve data-related issues in a timely manner.

    Requirements:

  • Bachelor's degree in Computer Science, Information Systems, or a related field.

  • At least 3 years of hands-on experience in cloud data engineering (preferably Azure or AWS).

  • Strong proficiency in SQL and Python.

  • Solid understanding of data warehousing concepts and technologies.

  • Experience with dbt for data transformation and modelling.

  • Basic understanding of core AWS services (EC2, VPC, KMS, CloudWatch).

  • Knowledge of data quality checks and validation processes.

  • Experience using Docker and Airflow for deployment, automation, and workflow management (or comparable tools).

  • Excellent problem-solving skills and attention to detail.

  • Strong communication and collaboration skills to work with cross-functional teams.

  • Strong analytical skills for understanding statistical models when collaborating with Data Scientists.

  • Solid understanding of distributed computing frameworks and big data systems, along with hands-on object-oriented programming experience.

  • Demonstrated proficiency with Spark, Hadoop, MapReduce, HDFS, and related tools.

  • Excellent spoken and written English, enabling effective engagement with stakeholders from diverse fields across the organization.



    Nice to have:

  • Basic understanding of SAP integration interfaces (ODP/OData).

  • Experience implementing data pipelines and ETL workflows using AWS services such as S3, Glue, and Lambda.

  • Experience developing integration pipelines with AWS AppFlow.

  • Databricks deployment and development (Python/SQL).

    What we offer:

  • Stable employment. On the market since 2008, with 1,300+ talents currently on board across 7 global sites.

  • Flexibility regarding working hours. 

  • Great Place to Work® certified employer.

  • Comprehensive online onboarding program with a “Buddy” from day 1.

  • Cooperation with top-tier engineers and experts.

  • Unlimited access to the Udemy learning platform from day 1.

  • Certificate training programs. Lingarians earn 500+ technology certificates yearly.

  • Upskilling support. Capability development programs, Competency Centers, knowledge sharing sessions, community webinars, 110+ training opportunities yearly.

  • Grow as we grow as a company. 76% of our managers are internal promotions.

  • A diverse, inclusive, and values-driven community.

  • Autonomy to choose the way you work. We trust your ideas.

  • Create our community together. Refer your friends to receive bonuses.

  • Activities to support your well-being and health.

  • Plenty of opportunities to donate to charities and support the environment.

  • Modern office equipment. Purchased for you or available to borrow, depending on your location.  

