
Lead Big Data Engineer - GCP


About Lingaro:


Lingaro Group is the end-to-end data services partner to global brands and enterprises. We lead our clients through their data journey, from strategy through development to operations and adoption, helping them to realize the full value of their data.

Since 2008, Lingaro has been recognized by clients and global research and advisory firms for innovation, technology excellence, and the consistent delivery of the highest-quality data services. Our commitment to data excellence has created an environment that attracts the brightest global data talent to our team.

About Data Engineering:


Data engineering involves the development of solutions for the collection, transformation, storage, and management of data to support data-driven decision making and enable efficient data analysis by end users. It focuses on the technical aspects of data processing, integration, and delivery to ensure that data is accurate, reliable, and accessible in a timely manner. It also focuses on the scalability, cost-effectiveness, security, and supportability of the solution. Data engineering encompasses multiple toolsets and architectural concepts across on-premises and cloud stacks, including but not limited to data warehousing, data lakes, lakehouse, and data mesh, and includes extraction, ingestion, and synchronization of structured and unstructured data across the data ecosystem. It also covers the organization and orchestration of data processing, as well as its performance optimization.




Job Requirements:


  • At least 4 years of experience as a Data Engineer working with Python, GCP, and BigQuery.

  • Strong skills in object-oriented programming, Python, and SQL.

  • Strong knowledge of cloud computing platforms, especially Google Cloud: the candidate should be able to design, build, and deploy data pipelines in the cloud that ingest data from various sources such as databases, APIs, or streaming platforms.

  • Experience with Cloud Composer or Apache Airflow; knowledge of Dataplex is a plus.

  • Experience implementing data quality (DQ) checks with frameworks such as CloudDQ or PyDeequ.

  • Good knowledge of data quality dimensions.

  • Experience working with GCP cloud-based infrastructure and systems.

  • Programming skills in SQL, Python, and other scripting languages.

  • Proficient in data modelling techniques and database optimization.

  • Knowledge of query optimization, indexing, and performance tuning is necessary for efficient data retrieval and processing.

  • Proficient in SQL and NoSQL database management systems (BigQuery is a must).

  • Candidate should be able to design, configure, and manage databases to ensure optimal performance and reliability.

  • Experience with data integration tools and techniques such as ETL and ELT: the candidate should be able to integrate data from multiple sources and transform it into a format suitable for analysis.
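The data quality requirements above (CloudDQ, PyDeequ, DQ dimensions) come down to rules like the ones sketched below. This is a minimal, framework-free illustration of checks along three common DQ dimensions; the record fields and rules are illustrative assumptions, not part of any specific framework's API.

```python
# Framework-free sketch of data quality (DQ) checks along three common
# DQ dimensions: completeness, uniqueness, and validity. Tools like
# CloudDQ or PyDeequ express similar rules declaratively; the field
# names and thresholds here are purely illustrative.

def check_completeness(rows, field):
    """Fraction of rows where `field` is present and non-null."""
    if not rows:
        return 0.0
    present = sum(1 for r in rows if r.get(field) is not None)
    return present / len(rows)

def check_uniqueness(rows, field):
    """True if no two rows share the same non-null value for `field`."""
    values = [r.get(field) for r in rows if r.get(field) is not None]
    return len(values) == len(set(values))

def check_validity(rows, field, predicate):
    """Fraction of non-null values of `field` that satisfy `predicate`."""
    values = [r.get(field) for r in rows if r.get(field) is not None]
    if not values:
        return 0.0
    return sum(1 for v in values if predicate(v)) / len(values)

orders = [
    {"order_id": 1, "amount": 25.0},
    {"order_id": 2, "amount": -3.0},   # invalid: negative amount
    {"order_id": 3, "amount": None},   # incomplete: missing amount
]

completeness = check_completeness(orders, "amount")            # 2 of 3 rows
unique_ids = check_uniqueness(orders, "order_id")              # no duplicates
validity = check_validity(orders, "amount", lambda a: a >= 0)  # 1 of 2 values
```

In a production pipeline, results like these would be compared against thresholds and either fail the run or raise an alert.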


Tasks:


  • Modelling the data from various sources and technologies.

  • Troubleshooting and supporting the most complex and high-impact problems to deliver new features and functionalities.

  • Designing and optimizing data storage architectures, including data lakes, data warehouses, or distributed file systems.

  • Implementing techniques like partitioning, compression, or indexing to optimize data storage and retrieval.

  • Identifying and resolving bottlenecks, tuning queries, and implementing caching strategies to enhance data retrieval speed and overall system efficiency.

  • Identifying and resolving issues related to data processing, storage, or infrastructure.

  • Monitoring system performance, identifying anomalies, and conducting root cause analysis to ensure smooth and uninterrupted data operations.

  • Train and mentor junior data engineers, providing guidance and knowledge transfer.
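The partitioning technique mentioned in the tasks above usually means routing records into date-based partitions so that queries filtered on the partition column scan only the matching data (the idea behind BigQuery's column-based partitioning). Here is a small plain-Python sketch of that routing; the record layout and key format are illustrative assumptions.

```python
# Sketch of date-based partition routing, the idea behind partitioned
# storage (e.g., BigQuery time-unit column partitioning): queries that
# filter on the partition column only touch the matching partitions.
# The record layout and 'dt=...' key format are illustrative assumptions.
from collections import defaultdict
from datetime import date

def partition_key(record):
    """Derive a partition key like 'dt=2024-03-01' from the event date."""
    return f"dt={record['event_date'].isoformat()}"

def partition(records):
    """Group records by partition key, as a loader would before writing."""
    parts = defaultdict(list)
    for rec in records:
        parts[partition_key(rec)].append(rec)
    return dict(parts)

events = [
    {"event_date": date(2024, 3, 1), "user": "a"},
    {"event_date": date(2024, 3, 1), "user": "b"},
    {"event_date": date(2024, 3, 2), "user": "c"},
]

parts = partition(events)
# Partition pruning: a query filtered to 2024-03-01 reads one partition,
# not the whole table.
march_1 = parts["dt=2024-03-01"]
```

The same grouping idea underlies clustering and indexing choices: pick the key that the most frequent query filters use, so the engine can skip everything else.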



We Offer: 


  • Stable employment. On the market since 2008, 1300+ talents currently on board in 7 global sites.

  • 100% remote.

  • Flexibility regarding working hours.

  • Full-time position.

  • Comprehensive online onboarding program with a “Buddy” from day 1.

  • Cooperation with top-tier engineers and experts.

  • Unlimited access to the Udemy learning platform from day 1.

  • Certificate training programs. Lingarians earn 500+ technology certificates yearly.

  • Upskilling support. Capability development programs, Competency Centers, knowledge sharing sessions, community webinars, 110+ training opportunities yearly.

  • Grow as we grow as a company. 76% of our managers are internal promotions.

  • A diverse, inclusive, and values-driven community.

  • Autonomy to choose the way you work. We trust your ideas.

  • Create our community together. Refer your friends to receive bonuses.

  • Activities to support your well-being and health.

  • Plenty of opportunities to donate to charities and support the environment.

