224 Data Pipeline jobs in Singapore

Data Pipeline Architect

Singapore, Singapore beBeeDataPipeline

Posted today

Job Description

About the Role:

We are seeking a highly skilled Data Pipeline Architect to join our team. As a key member of our data engineering group, you will be responsible for designing and implementing efficient data pipelines that meet the needs of our organization.

Key Responsibilities:

  • Design and develop scalable data pipelines using Python and AWS services (S3, Lambda, EC2, CloudWatch).
  • Test and validate data pipelines for all new releases to ensure data integrity and completeness.
  • Debug and troubleshoot pipeline issues, implementing fixes and documenting root causes.
  • Collaborate with data scientists, engineers, and stakeholders to optimize pipeline performance and efficiency.
  • Write clean, maintainable code with unit tests and follow best practices for code quality.
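As a rough illustration of the last two bullets — clean, unit-tested Python for a pipeline step — the sketch below shows a small transform with a matching test. The record schema and validation rules are assumptions for the example, not taken from the posting:

```python
# Illustrative sketch of a unit-tested pipeline transform step.
# The field names ("user_id", "amount") and rules are invented for the example.

def clean_records(records):
    """Drop incomplete rows and normalize field types."""
    cleaned = []
    for rec in records:
        if rec.get("user_id") is None or rec.get("amount") is None:
            continue  # incomplete row: skip rather than guess a value
        cleaned.append({
            "user_id": str(rec["user_id"]).strip(),
            "amount": round(float(rec["amount"]), 2),
        })
    return cleaned

def test_clean_records():
    # In the spirit of "test and validate data pipelines for all new releases"
    raw = [
        {"user_id": " 42 ", "amount": "19.999"},
        {"user_id": None, "amount": "5.00"},  # dropped: missing id
    ]
    assert clean_records(raw) == [{"user_id": "42", "amount": 20.0}]
```

In practice a suite like this would run in CI against every release candidate before the pipeline is promoted.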

Requirements:

  • Proficient in Python programming language, including unit testing and clean code practices.
  • Experience in data pipeline development and troubleshooting, with a strong understanding of data science workflows.
  • Familiarity with AWS services, including S3, Lambda, EC2, and CloudWatch.
  • Strong debugging and problem-solving skills, with attention to detail and focus on data quality.

What We Offer:

We offer a competitive salary and benefits package, as well as opportunities for professional growth and development.

How to Apply:

To apply for this role, please submit your resume and a cover letter explaining your qualifications and experience.

This advertiser has chosen not to accept applicants from your region.

Data Pipeline Specialist

Singapore, Singapore beBeeData

Posted today

Job Description

Senior Data Architect

We are seeking a highly skilled data engineer with expertise in designing, building, and maintaining robust data pipelines. The ideal candidate will have a strong background in data architecture, ETL processes, and data governance.

Key Responsibilities:
  • Develop and optimize data models, data warehouses, and ETL processes to support analytics and machine learning initiatives.
  • Work closely with data scientists, analysts, and software engineers to understand data requirements and deliver tailored solutions.
  • Ensure data integrity, implement validation checks, and enforce data governance policies.
Requirements:
  • Bachelor's or master's degree in computer science, data science, information technology, or a related field, with 6+ years of experience.
  • Proficiency in SQL, Python, and ETL/ELT tools.
  • Experience in big data technologies (Hadoop, Spark) and cloud platforms (Azure, AWS, GCP).
  • Strong understanding of data modeling, data warehousing, and stream processing.

Data Pipeline Specialist

Singapore, Singapore beBeePipeline

Posted today

Job Description

Unlock Your Potential as a Data Pipeline Specialist

Are you passionate about building and optimizing data pipelines? Do you have a knack for problem-solving and ensuring data integrity?

We're seeking an experienced Data Engineer to join our team. As a Data Pipeline Specialist, you will be responsible for designing, developing, and deploying efficient data pipelines that meet the needs of our business.

**Job Responsibilities:**
  • Design, develop, and deploy high-quality data pipelines using industry-standard tools and technologies.
  • Collaborate with cross-functional teams to identify data requirements and ensure data integrity.
  • Troubleshoot and resolve issues related to data pipeline performance and reliability.
  • Stay up-to-date with the latest industry trends and best practices in data engineering.


**Requirements:**
  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 3+ years of experience in data engineering, including design, development, and deployment of data pipelines.
  • Proficient in programming languages such as Python, Java, or C++.
  • Strong understanding of data modeling, data warehousing, and ETL processes.


**Nice to Have:**
  • Experience with cloud-based data platforms such as AWS, Azure, or Google Cloud.
  • Knowledge of machine learning algorithms and their application in data engineering.


What We Offer:
  • A competitive salary and benefits package.
  • The opportunity to work on challenging projects and collaborate with a talented team.
  • Professional growth and development opportunities.



Data Pipeline Specialist

Singapore, Singapore beBeePipeline

Posted today

Job Description

Job Role Overview:

This data pipeline role involves testing and validating data pipelines for new releases, ensuring data integrity and completeness after deployments, debugging and troubleshooting pipeline issues, implementing fixes, documenting root causes, and optimizing pipeline performance and efficiency.

Additional responsibilities include writing clean, maintainable Python code with unit tests, collaborating with data scientists, engineers, and stakeholders, and supporting AWS-based pipeline infrastructure.

Key requirements for this position include proficiency in Python, experience in data pipeline development and troubleshooting, familiarity with AWS services, strong debugging and problem-solving skills, attention to detail with a focus on data quality, and a basic understanding of data science workflows, CI/CD, and version control.

Essential Skills:

  • Pipeline Development
  • Troubleshooting
  • Data Science
  • Python
  • AWS Services


Chief Data Pipeline Architect

Singapore, Singapore beBeeDataEngineer

Posted today

Job Description

Senior Data Engineer

The role of Senior Data Engineer involves leading the design, implementation, and operation of large-scale data pipelines. This includes architecting data flow, building and maintaining data processing systems, and ensuring high-quality data delivery.


Key Responsibilities:
  • Design and implement scalable data architectures using industry-standard tools and technologies.
  • Develop and maintain high-quality code for data processing pipelines, ensuring reliability and performance.
  • Collaborate with cross-functional teams to integrate data pipeline components and ensure seamless data flow.
  • Monitor and troubleshoot data pipeline issues, implementing solutions to improve data quality and reduce errors.

Requirements:
  • At least 3 years of hands-on experience in data engineering, designing and building ETL pipelines.
  • Experience in project execution and demonstrated technical expertise.
  • Proficiency in programming languages such as Python, and in SQL.
  • Shell scripting skills (Linux) and knowledge of Airflow or a similar pipeline orchestration tool.
  • Experience with AWS services such as S3, Glue, EMR, Lambda, and Step Functions; familiarity with AWS Batch, Athena, file formats such as Parquet, and NoSQL databases.
  • The ability to work effectively in a team environment and communicate complex technical ideas to non-technical stakeholders.

Senior Data Pipeline Developer

Singapore, Singapore beBeeDataEngineer

Posted today

Job Description

Job Summary

We are seeking a skilled Data Engineer to build and maintain robust data pipelines that drive value creation through data-driven insights.


Senior Data Pipeline Architect

Singapore, Singapore beBeeDataEngineer

Posted today

Job Description

About the Role

This is a highly specialized position that requires expertise in designing and implementing data pipelines, architecture, and infrastructure.

  • Contribute to the full data engineering lifecycle, including conceptualization, data modeling, implementation, and operational management of data systems.
  • Design and build robust, scalable data infrastructure, including real-time and batch ETL/ELT data pipelines, using orchestration tools like Apache Airflow.
  • Support the development and management of the organization's data lake and data warehouse by implementing data models and applying modern data transformation practices with tools like dbt.
  • Automate data processes by integrating different systems, APIs, and third-party services to ensure smooth data flow and synchronization.
Key Responsibilities
  • Collaborate with cross-functional teams to help translate business requirements into technical specifications and production-ready solutions.
  • Develop, deploy, and maintain internal automation solutions and tools by applying AI, web scraping, and custom logic to create effective data products.
Requirements

We're looking for a talented professional with a solid understanding of software engineering principles, system architecture, and object-oriented design, and experience building applications.

  • Bachelor's degree in computer science, software engineering, or related field preferred.
  • Minimum 2 years of similar experience.
  • Strong proficiency in Python, including experience with data manipulation libraries (e.g., Pandas, NumPy) and frameworks for web scraping (e.g., Scrapy) or AI/ML (e.g., TensorFlow, PyTorch).
  • Familiarity with workflow orchestration tools like Apache Airflow for managing data pipelines.
  • Experience with data modeling techniques (conceptual, logical, physical) and data warehouse schemas.
  • Familiarity with cloud platforms (AWS, GCP, Azure) and proficient in SQL for designing and optimizing relational databases (e.g., PostgreSQL, Oracle, SQL Server).

Senior Data Pipeline Specialist

Singapore, Singapore beBeeDataEngineer

Posted today

Job Description

Job Opportunity

A Data Engineer position is available in a dynamic organization. The ideal candidate will be responsible for designing and implementing scalable data pipelines that extract, transform, and load data from various sources into a centralized data storage system.

The successful candidate will also be tasked with integrating data from multiple sources and systems, developing data transformation routines to clean, normalize, and aggregate data, and contributing to common frameworks and best practices in code development.

Key Responsibilities:
  • Design and implement scalable data pipelines using ETL processes.
  • Integrate data from multiple sources and systems.
  • Develop data transformation routines to clean, normalize, and aggregate data.
  • Contribute to common frameworks and best practices in code development.
Requirements:
  • At least 5 years of proven experience as a Data Engineer with a strong track record of delivering scalable data pipelines.
  • Extensive hands-on experience developing data processing jobs using PySpark and SQL.
  • Experience orchestrating data pipelines on Azure.
  • Fluency in SQL, including experience with window functions.
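The window-function requirement can be illustrated with a small, self-contained sketch. It uses the stdlib `sqlite3` module (SQLite supports window functions from 3.25 onward) rather than Spark, and the table and data are invented for the example:

```python
# Minimal illustration of a SQL window function: a per-group running total.
# Uses stdlib sqlite3 (requires a SQLite build >= 3.25); the "sales" table
# and its columns are invented for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("north", 10), ("north", 30), ("south", 20)])

# SUM(...) OVER (PARTITION BY ... ORDER BY ...) accumulates within each region.
rows = conn.execute("""
    SELECT region, amount,
           SUM(amount) OVER (PARTITION BY region ORDER BY amount) AS running
    FROM sales
    ORDER BY region, amount
""").fetchall()
for row in rows:
    print(row)
# → ('north', 10.0, 10.0)
#   ('north', 30.0, 40.0)
#   ('south', 20.0, 20.0)
```

The same `PARTITION BY ... ORDER BY` pattern carries over directly to Spark SQL, where such queries typically run over much larger partitioned datasets.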

iKas International Asia Pte Ltd


Senior Data Pipeline Specialist

Singapore, Singapore beBeeData

Posted today

Job Description

Job Summary

We are seeking a skilled Data Engineer to design, develop, and maintain robust data pipelines and ETL processes.


Chief Data Architect - ETL & Data Pipeline Specialist

Singapore, Singapore beBeeDataEngineer

Posted today

Job Description

Senior Data Engineer

We are seeking a seasoned Senior Data Engineer to design and implement robust data pipelines and backend services that power AI-driven operations.

Data Pipeline Development:
  • Develop scalable ETL pipelines for batch and real-time data ingestion, transformation, and loading from diverse sources.
  • Implement data validation, cleansing, and normalization for consistent AI model input.
  • Create backend services and APIs to support data ingestion, metadata management, and configuration.
  • Optimize ETL jobs for performance, fault tolerance, and low latency.
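The validate, cleanse, and normalize steps described above can be sketched roughly as follows; the field name, validation rule, and min-max scaling choice are illustrative assumptions, not the employer's actual method:

```python
# Hedged sketch of a validate -> cleanse -> normalize sequence for
# preparing consistent model input. The "value" field and the min-max
# scaling choice are assumptions for the example.

def validate(row):
    """Reject rows whose 'value' cannot feed a downstream model."""
    return isinstance(row.get("value"), (int, float))

def cleanse(rows):
    """Keep only valid rows; never coerce bad data silently."""
    return [r for r in rows if validate(r)]

def normalize(rows):
    """Min-max scale 'value' into [0, 1] for consistent model input."""
    values = [r["value"] for r in rows]
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1  # guard against divide-by-zero on constant input
    return [{**r, "value": (r["value"] - lo) / span} for r in rows]

batch = [{"value": 5}, {"value": "bad"}, {"value": 15}]
print(normalize(cleanse(batch)))
# → [{'value': 0.0}, {'value': 1.0}]
```

In a production pipeline each stage would typically be a separate, monitored task so that failures surface at the step that caused them.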

Required Skills & Qualifications:

  • Programming & Scripting: Python, Go (Golang), Java, Ruby, JavaScript/TypeScript
  • ETL & Data Engineering: Apache NiFi, Spark, Airflow, Flink, Kafka, Talend

  • AIOps & Observability tools like Splunk, Dynatrace, AppDynamics, New Relic, Elastic Stack
  • ITSM systems (ServiceNow) and CMDB integrations

What We Offer:

  • Career growth opportunities
  • Collaborative work environment
  • Competitive salary and benefits package
 
