224 Data Pipeline jobs in Singapore
Data Pipeline Architect
Posted today
Job Description
About the Role:
We are seeking a highly skilled Data Pipeline Architect to join our team. As a key member of our data engineering group, you will be responsible for designing and implementing efficient data pipelines that meet the needs of our organization.
Key Responsibilities:
- Design and develop scalable data pipelines using Python and AWS services (S3, Lambda, EC2, CloudWatch).
- Test and validate data pipelines for all new releases to ensure data integrity and completeness.
- Debug and troubleshoot pipeline issues, implementing fixes and documenting root causes.
- Collaborate with data scientists, engineers, and stakeholders to optimize pipeline performance and efficiency.
- Write clean, maintainable code with unit tests and follow best practices for code quality.
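A responsibility like "clean, maintainable code with unit tests" for a pipeline stage might look like this minimal sketch (the function, field names, and currency column are illustrative assumptions, not taken from the posting):

```python
def transform_record(record: dict) -> dict:
    """Normalise one raw event record for downstream loading."""
    return {
        "user_id": int(record["user_id"]),
        "amount_sgd": round(float(record["amount"]), 2),
        "source": record.get("source", "unknown").lower(),
    }

# Unit-test style check for the transform, runnable with pytest or plainly.
def test_transform_record():
    out = transform_record({"user_id": "42", "amount": "19.99", "source": "Web"})
    assert out == {"user_id": 42, "amount_sgd": 19.99, "source": "web"}

test_transform_record()
```

Keeping each stage a small pure function like this makes the "test and validate for all new releases" responsibility largely a matter of running the suite.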
Requirements:
- Proficient in Python programming language, including unit testing and clean code practices.
- Experience in data pipeline development and troubleshooting, with a strong understanding of data science workflows.
- Familiarity with AWS services, including S3, Lambda, EC2, and CloudWatch.
- Strong debugging and problem-solving skills, with attention to detail and focus on data quality.
What We Offer:
We offer a competitive salary and benefits package, as well as opportunities for professional growth and development.
How to Apply:
To apply for this role, please submit your resume and a cover letter explaining your qualifications and experience.
Data Pipeline Specialist
Posted today
Job Description
We are seeking a highly skilled data engineer with expertise in designing, building, and maintaining robust data pipelines. The ideal candidate will have a strong background in data architecture, ETL processes, and data governance.
Key Responsibilities:
- Develop and optimize data models, data warehouses, and ETL processes to support analytics and machine learning initiatives.
- Work closely with data scientists, analysts, and software engineers to understand data requirements and deliver tailored solutions.
- Ensure data integrity, implement validation checks, and enforce data governance policies.
Requirements:
- Bachelor's or master's degree in computer science, data science, information technology, or a related field, with 6+ years of experience.
- Proficiency in SQL, Python, and ETL/ELT tools.
- Experience in big data technologies (Hadoop, Spark) and cloud platforms (Azure, AWS, GCP).
- Strong understanding of data modeling, data warehousing, and stream processing.
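The "implement validation checks" responsibility above can start as a small composable rule set; the checks and field names here are assumptions for illustration, not this employer's actual rules:

```python
from typing import Callable, Optional

# Each check returns an error message, or None if the row passes.
Check = Callable[[dict], Optional[str]]

def not_null(field: str) -> Check:
    return lambda row: None if row.get(field) is not None else f"{field} is null"

def in_range(field: str, lo: float, hi: float) -> Check:
    return lambda row: (
        None if lo <= row[field] <= hi else f"{field}={row[field]} outside [{lo}, {hi}]"
    )

def validate(rows: list, checks: list) -> list:
    """Run every check against every row; collect all failures."""
    return [err for row in rows for c in checks if (err := c(row)) is not None]

errors = validate(
    [{"id": 1, "age": 34}, {"id": None, "age": 210}],
    [not_null("id"), in_range("age", 0, 120)],
)
# errors: ["id is null", "age=210 outside [0, 120]"]
```

Collecting failures rather than raising on the first one lets a pipeline quarantine bad rows while still loading the good ones.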
Data Pipeline Specialist
Posted today
Job Description
Are you passionate about building and optimizing data pipelines? Do you have a knack for problem-solving and ensuring data integrity?
We're seeking an experienced Data Engineer to join our team. As a Data Pipeline Specialist, you will be responsible for designing, developing, and deploying efficient data pipelines that meet the needs of our business.
**Job Responsibilities:**
- Design, develop, and deploy high-quality data pipelines using industry-standard tools and technologies.
- Collaborate with cross-functional teams to identify data requirements and ensure data integrity.
- Troubleshoot and resolve issues related to data pipeline performance and reliability.
- Stay up-to-date with the latest industry trends and best practices in data engineering.
**Requirements:**
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3+ years of experience in data engineering, including design, development, and deployment of data pipelines.
- Proficient in programming languages such as Python, Java, or C++.
- Strong understanding of data modeling, data warehousing, and ETL processes.
**Nice to Have:**
- Experience with cloud-based data platforms such as AWS, Azure, or Google Cloud.
- Knowledge of machine learning algorithms and their application in data engineering.
What We Offer:
- A competitive salary and benefits package.
- The opportunity to work on challenging projects and collaborate with a talented team.
- Professional growth and development opportunities.
Data Pipeline Specialist
Posted today
Job Description
Job Role Overview:
This data pipeline role involves testing and validating data pipelines for new releases, ensuring data integrity and completeness after deployments, debugging and troubleshooting pipeline issues, implementing fixes and documenting root causes, and optimizing pipeline performance and efficiency.
Additional responsibilities include writing clean, maintainable Python code with unit tests, collaborating with data scientists, engineers, and stakeholders, and supporting AWS-based pipeline infrastructure.
Key requirements for this position include proficiency in Python, experience in data pipeline development and troubleshooting, familiarity with AWS services, strong debugging and problem-solving skills, attention to detail and a focus on data quality, and a basic understanding of data science workflows, CI/CD, and version control.
Essential Skills:
- Pipeline Development
- Troubleshooting
- Data Science
- Python
- AWS Services
Chief Data Pipeline Architect
Posted today
Job Description
The role of Senior Data Engineer involves leading the design, implementation, and operation of large-scale data pipelines. This includes architecting data flow, building and maintaining data processing systems, and ensuring high-quality data delivery.
Key Responsibilities:
- Design and implement scalable data architectures using industry-standard tools and technologies.
- Develop and maintain high-quality code for data processing pipelines, ensuring reliability and performance.
- Collaborate with cross-functional teams to integrate data pipeline components and ensure seamless data flow.
- Monitor and troubleshoot data pipeline issues, implementing solutions to improve data quality and reduce errors.
Requirements:
- At least 3 years of hands-on experience in data engineering, designing and building ETL pipelines.
- Experience in project execution and demonstrated technical expertise.
- Proficient in programming languages such as Python, with strong SQL skills.
- Shell scripting skills (Linux) and knowledge of Airflow or a similar pipeline orchestration tool.
- Experience with AWS services such as S3, Glue, EMR, Lambda, Step Functions, AWS Batch, and Athena; familiarity with file formats such as Parquet and with NoSQL databases.
- The ability to work effectively in a team environment and communicate complex technical ideas to non-technical stakeholders.
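The Airflow requirement above might translate into a DAG along these lines. This is a sketch assuming Airflow 2.x with the TaskFlow API; the DAG name, tasks, and data are hypothetical, and the S3/warehouse steps are only indicated in comments:

```python
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def sales_pipeline():
    @task
    def extract() -> list:
        # In practice: read Parquet files from S3, e.g. via boto3 or AWS Glue.
        return [{"sku": "A1", "qty": "3"}]

    @task
    def transform(rows: list) -> list:
        # Coerce types so downstream consumers see consistent data.
        return [{**r, "qty": int(r["qty"])} for r in rows]

    @task
    def load(rows: list) -> None:
        # e.g. COPY into the warehouse, or write Parquet back to S3.
        print(f"loaded {len(rows)} rows")

    # Calling tasks like functions wires up the dependency graph.
    load(transform(extract()))

sales_pipeline()
```

Step Functions covers a similar orchestration role on AWS; the extract/transform/load decomposition is the same either way.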
Senior Data Pipeline Developer
Posted today
Job Description
We are seeking a skilled Data Engineer to build and maintain robust data pipelines that drive value creation through data-driven insights.
Senior Data Pipeline Architect
Posted today
Job Description
This is a highly specialized position that requires expertise in designing and implementing data pipelines, architecture, and infrastructure.
- Contribute to the full data engineering lifecycle, including conceptualization, data modeling, implementation, and operational management of data systems.
- Design and build robust, scalable data infrastructure, including real-time and batch ETL/ELT data pipelines, using orchestration tools like Apache Airflow.
- Support the development and management of the organization's data lake and data warehouse by implementing data models and applying modern data transformation practices with tools like dbt.
- Automate data processes by integrating different systems, APIs, and third-party services to ensure smooth data flow and synchronization.
- Collaborate with cross-functional teams to help translate business requirements into technical specifications and production-ready solutions.
- Develop, deploy, and maintain internal automation solutions and tools by applying AI, web scraping, and custom logic to create effective data products.
We're looking for a talented professional with a solid understanding of software engineering principles, system architecture, and object-oriented design, with experience building applications.
- Bachelor's degree in computer science, software engineering, or related field preferred.
- Minimum 2 years of similar experience.
- Strong proficiency in Python, including experience with data manipulation libraries (e.g., Pandas, NumPy) and frameworks for web scraping (e.g., Scrapy) or AI/ML (e.g., TensorFlow, PyTorch).
- Familiarity with workflow orchestration tools like Apache Airflow for managing data pipelines.
- Experience with data modeling techniques (conceptual, logical, physical) and data warehouse schemas.
- Familiarity with cloud platforms (AWS, GCP, Azure) and proficient in SQL for designing and optimizing relational databases (e.g., PostgreSQL, Oracle, SQL Server).
Senior Data Pipeline Specialist
Posted today
Job Description
A Data Engineer position is available in a dynamic organization. The ideal candidate will be responsible for designing and implementing scalable data pipelines that extract, transform, and load data from various sources into a centralized data storage system.
The successful candidate will also be tasked with integrating data from multiple sources and systems, developing data transformation routines to clean, normalize, and aggregate data, and contributing to common frameworks and best practices in code development.
Key Responsibilities:
- Design and implement scalable data pipelines using ETL processes.
- Integrate data from multiple sources and systems.
- Develop data transformation routines to clean, normalize, and aggregate data.
- Contribute to common frameworks and best practices in code development.
Proven Experience:
- At least 5 years of experience as a Data Engineer with a strong track record of delivering scalable data pipelines.
Technical Skills:
- Extensive hands-on experience developing data processing jobs using PySpark/SQL.
- Experience orchestrating data pipelines on Azure.
- Fluency in SQL, including experience with window functions.
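The window-function requirement can be illustrated with the standard-library sqlite3 module; the table and column names are made up, and the same SQL pattern carries over to Spark SQL or Azure Synapse:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('alice', '2024-01-01', 50.0),
        ('alice', '2024-02-01', 75.0),
        ('bob',   '2024-01-15', 20.0);
""")

# Latest order per customer via ROW_NUMBER(): a common dedup/ranking pattern.
latest = conn.execute("""
    SELECT customer, order_date, amount FROM (
        SELECT *, ROW_NUMBER() OVER (
            PARTITION BY customer ORDER BY order_date DESC
        ) AS rn
        FROM orders
    ) WHERE rn = 1
    ORDER BY customer
""").fetchall()
# → [('alice', '2024-02-01', 75.0), ('bob', '2024-01-15', 20.0)]
```

Note that window functions require SQLite 3.25+, which has shipped with Python since 3.8 on most platforms.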
iKas International Asia Pte Ltd
Senior Data Pipeline Specialist
Posted today
Job Description
We are seeking a skilled Data Engineer to design, develop, and maintain robust data pipelines and ETL processes.
Chief Data Architect - ETL & Data Pipeline Specialist
Posted today
Job Description
We are seeking a seasoned Senior Data Engineer to design and implement robust data pipelines and backend services that power AI-driven operations.
- Data Pipeline Development:
- Develop scalable ETL pipelines for batch and real-time data ingestion, transformation, and loading from diverse sources.
- Implement data validation, cleansing, and normalization for consistent AI model input.
- Create backend services and APIs to support data ingestion, metadata management, and configuration.
- Optimize ETL jobs for performance, fault tolerance, and low latency.
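The cleansing-and-normalization step above could start as small as this; the field names and rules are illustrative assumptions, not this employer's schema:

```python
from typing import Optional

def normalise(row: dict) -> Optional[dict]:
    """Cleanse one raw row for consistent model input; return None to drop it."""
    text = (row.get("text") or "").strip().lower()
    if not text:
        return None  # drop empty records rather than feed them to the model
    return {"text": text, "label": int(row.get("label", 0))}

rows = [{"text": "  Hello "}, {"text": "", "label": 1}, {"text": "Ok", "label": "1"}]
clean = [r for r in (normalise(x) for x in rows) if r is not None]
# → [{'text': 'hello', 'label': 0}, {'text': 'ok', 'label': 1}]
```

The same per-row function works unchanged in a batch job or a streaming consumer, which helps keep batch and real-time ingestion paths consistent.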
Required Skills & Qualifications:
- Programming & Scripting: Python, Go (Golang), Java, Ruby, JavaScript/TypeScript
- ETL & Data Engineering: Apache NiFi, Spark, Airflow, Flink, Kafka, Talend
Preferred Experience:
- AIOps & observability tools such as Splunk, Dynatrace, AppDynamics, New Relic, and Elastic Stack.
- ITSM systems (ServiceNow) and CMDB integrations.
What We Offer:
- Career growth opportunities
- Collaborative work environment
- Competitive salary and benefits package