Data Engineer (Databricks, AWS)
Libis, Quezon City, National Capital Region
Posted 4 days ago
- Company: Blaseek
- Company Description: Blaseek Human Resource Management Consultancy offers comprehensive recruitment services across IT, Accounting, Engineering, and Commercial Business Support. With a wealth of industry experience, we connect top talent with businesses, ensuring a seamless and successful hiring process.
- Contract Type: Full Time
- Experience Required: 5 to 10 years
- Education Level: Bachelor’s Degree
- Number of vacancies: 2
Job Description
We’re looking for a Data Engineer with solid experience in building and managing robust data pipelines, cloud data platforms, and large-scale data systems. This role focuses on ensuring the seamless movement, transformation, and accessibility of data across the organization. A strong background in Databricks, along with Python, SQL, and cloud infrastructure (particularly Azure), is essential. The ideal candidate understands both the technical and business sides of data and can translate requirements into scalable, efficient solutions that support analytics and decision-making.
Key Responsibilities
Design, develop, and maintain scalable ETL/ELT pipelines for ingesting and transforming data from diverse sources (see the illustrative sketch after this list).
Build and optimize data architectures using tools like Databricks, ensuring high performance and reliability.
Integrate data from APIs, cloud storage, third-party systems, and relational and non-relational databases.
Work with structured and semi-structured data formats, including Parquet, JSON, CSV, and SQL tables.
Collaborate with business and technology teams to understand data needs and deliver clean, accessible datasets.
Develop data validation processes to monitor quality, accuracy, and completeness.
Support data modeling, metadata management, and data governance initiatives.
Manage and optimize compute and storage on Azure, AWS, and Linux environments.
Automate data workflows and processes using Python (with Spark) and SQL.
Participate in testing and validation of data pipelines before deployment to production.
Document data pipelines and workflows for transparency and handover to IT or data operations teams.
Provide support for ad hoc data extraction and reporting needs using Power BI and DAX.
Contribute to Agile ceremonies and workflows (Scrum or Kanban) and help drive delivery in a collaborative environment.
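To give a concrete flavor of the pipeline work listed above, here is a minimal, purely illustrative PySpark sketch of an ETL flow: it ingests a CSV extract and a JSON feed, joins and aggregates them, applies a basic completeness check, and writes a curated Parquet dataset. The storage paths, column names, and validation rule are hypothetical and not part of this role's actual environment.

```python
# Minimal illustrative ETL sketch (hypothetical paths and columns).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Ingest: a CSV extract and a semi-structured JSON feed (hypothetical locations).
orders = spark.read.option("header", True).csv("s3://example-bucket/raw/orders.csv")
events = spark.read.json("s3://example-bucket/raw/events.json")

# Transform: join the two sources and derive a simple daily aggregate.
joined = orders.join(events, on="order_id", how="left")
daily = (
    joined
    .withColumn("order_date", F.to_date("order_timestamp"))
    .groupBy("order_date")
    .agg(F.count("order_id").alias("order_count"))
)

# Validate: basic completeness check before publishing the dataset.
null_dates = daily.filter(F.col("order_date").isNull()).count()
if null_dates > 0:
    raise ValueError(f"{null_dates} rows have a missing order_date")

# Load: write the curated output as Parquet for downstream analytics.
daily.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_orders/")
```

In a Databricks environment the same logic would more typically land in a Delta table and run as a scheduled job or workflow; plain Parquet is used here only to keep the sketch storage-agnostic.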
Qualifications
Bachelor’s degree in Computer Engineering, Data Science, Statistics, Physics, or related IT fields.
Strong experience with Databricks for data engineering and processing at scale.
Proficient in Python (especially with Apache Spark), SQL, and DAX.
Familiar with data pipeline development, ETL/ELT, data modeling, and integration.
Experience working with Azure cloud services; knowledge of AWS is a plus.
Hands-on with version control using Git and GitHub, and experienced in working in Agile teams.
Comfortable with data wrangling, performance tuning, and troubleshooting large datasets.
Familiar with reporting and visualization tools such as Power BI; experience with Tableau is a bonus.
Nice to have: exposure to LS Central, Dynamics 365 (Business Central), Microsoft SQL Server, JavaScript, HTML, CSS, or AL language.
- Salary: ₱130,000.00 Monthly