The Big Data Engineer will join the team accountable for the design, modeling, and development of the entire GCP data ecosystem for one of our Clients (Cloud Storage, Cloud Functions, BigQuery).
Tasks:
- Gather, analyze, model, and document business and technical requirements.
- Model data from various sources and technologies.
- Troubleshoot and resolve the most complex, high-impact problems to deliver new features and functionalities.
- Design and optimize data storage architectures, including data lakes, data warehouses, and distributed file systems.
- Implement techniques such as partitioning, compression, and indexing to optimize data storage and retrieval (see the sketch after this list).
- Identify and resolve bottlenecks, tune queries, and implement caching strategies to enhance data retrieval speed and overall system efficiency.
- Identify and resolve issues related to data processing, storage, and infrastructure.
- Monitor system performance, identify anomalies, and conduct root cause analysis to ensure smooth and uninterrupted data operations.
- Train and mentor junior data engineers, providing guidance and knowledge transfer.
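For illustration, a minimal sketch of the partitioning task above using the google-cloud-bigquery client; the project, dataset, table, and column names are hypothetical assumptions, not part of this posting:

    from google.cloud import bigquery

    # Hypothetical identifiers for illustration only.
    table_id = "my-project.analytics.events"

    client = bigquery.Client()

    schema = [
        bigquery.SchemaField("event_date", "DATE"),
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("payload", "STRING"),
    ]

    table = bigquery.Table(table_id, schema=schema)
    # Partition by date so queries scan only the relevant days, and
    # cluster by a frequently filtered column to speed up retrieval.
    table.time_partitioning = bigquery.TimePartitioning(
        type_=bigquery.TimePartitioningType.DAY,
        field="event_date",
    )
    table.clustering_fields = ["customer_id"]

    table = client.create_table(table)
    print(f"Created {table.full_table_id}")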
Requirements:
- At least 4 years of experience as a Data Engineer, including a minimum of 3 years working with GCP cloud-based infrastructure and systems.
- Strong knowledge of cloud computing platforms, specifically Google Cloud. The candidate should be able to design, build, and deploy data pipelines in the cloud that ingest data from various sources such as databases, APIs, and streaming platforms (see the sketch after this list).
- Programming skills (SQL, Python, other scripting languages).
- Proficiency in data modeling techniques and database optimization. Knowledge of query optimization, indexing, and performance tuning is necessary for efficient data retrieval and processing.
- Proficiency in SQL and NoSQL database management systems (BigQuery is a must). The candidate should be able to design, configure, and manage databases to ensure optimal performance and reliability.
- Experience with data integration tools and techniques such as ETL and ELT. The candidate should be able to integrate data from multiple sources and transform it into a format suitable for analysis.
- Tools knowledge: Git, Jira, Confluence, etc.
- Excellent communication skills to collaborate effectively with cross-functional teams, including data scientists, analysts, and business stakeholders, and the ability to convey technical concepts to non-technical stakeholders clearly and concisely.
- Openness to learning new technologies and solutions.
- Experience in multinational environments and distributed teams.
- Nice-to-have skills: certifications in big data technologies and/or cloud platforms; experience with BI solutions (e.g., Power BI, Tableau); experience with ETL tools (e.g., Talend, Alteryx); experience with Azure cloud-based infrastructure and systems.
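As a rough illustration of the pipeline requirement above, a minimal batch-ingest sketch with the google-cloud-bigquery client, loading files from Cloud Storage into BigQuery; the bucket, dataset, and table names are hypothetical assumptions:

    from google.cloud import bigquery

    # Hypothetical source and destination for illustration only.
    gcs_uri = "gs://my-bucket/exports/orders_*.csv"
    table_id = "my-project.analytics.orders"

    client = bigquery.Client()

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # skip the CSV header row
        autodetect=True,       # infer the schema from the files
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    # Submit the load job and block until it finishes.
    load_job = client.load_table_from_uri(gcs_uri, table_id, job_config=job_config)
    load_job.result()

    table = client.get_table(table_id)
    print(f"Loaded {table.num_rows} rows into {table_id}")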
Offer:
- Stable employment. On the market since 2008, 1,400+ talents currently on board in 7 global sites.
- “Office as an option” model. You can choose to work remotely or in the office.
- Flexibility regarding working hours and your preferred form of contract.
- Comprehensive online onboarding program with a “Buddy” from day 1.
- Cooperation with top-tier engineers and experts.
- Unlimited access to the Udemy learning platform from day 1.
- Certificate training programs. Lingarians earn 500+ technology certificates yearly.
- Upskilling support. Capability development programs, Competency Centers, knowledge-sharing sessions, community webinars, and 110+ training opportunities yearly.
- Internal Gallup Certified Strengths Coach to support your growth.
- Grow as we grow as a company. 76% of our managers are internal promotions.
- A diverse, inclusive, and values-driven community.
- Autonomy to choose the way you work. We trust your ideas.
- Create our community together. Refer your friends to receive bonuses.
- Activities to support your well-being and health.
- Plenty of opportunities to donate to charities and support the environment.
- Great Place to Work Certified Employer in the Philippines.