Location:
Pasay, National Capital Region
Contract Type:
Full Time
Experience Required:
2 years
Education Level:
Bachelor’s Degree
Job Description
We’re Hiring!
Nityo Infotech is looking for a talented Data Engineer to join our client in driving data excellence and innovation. If you’re passionate about data and ready to grow your career in a dynamic environment, this opportunity is for you.
Position: Data Engineer
Location: Bagumbayan, Quezon City (Onsite)
Schedule: Day Shift
Experience: 1–2 years (solid experience required)
Employment Type: Full-time
Job Summary:
We are hiring a Data Engineer with 1–2 years of strong experience in Databricks, AWS, and Python to join our client’s data team. You will be responsible for designing and maintaining scalable data pipelines, managing workflows, and ensuring data quality across systems.
Key Responsibilities:
- Design, develop, and maintain scalable ETL/data pipelines
- Collaborate with teams to gather data requirements and implement solutions
- Work with large datasets using Databricks and AWS tools
- Write efficient Python scripts for data processing and automation
- Monitor data flows to ensure consistency, accuracy, and reliability
Qualifications:
- 1–2 years of solid experience in Databricks, AWS, and Python
- Hands-on knowledge of AWS services such as S3, Glue, and Redshift
- Proficiency in Python for data manipulation and automation
- Strong problem-solving skills and attention to detail
- Bachelor’s degree in Computer Science, IT, or a related field (preferred)
Number of vacancies:
1
Company Description
Nityo Infotech is a global technology services company headquartered in the United States. The company offers services such as consulting, technology outsourcing, and staffing solutions across various industries, including banking, healthcare, and telecommunications. Nityo Infotech has a presence in several countries and serves clients worldwide.