142 Data Pipelines jobs in Singapore
Data Architecture Specialist
Posted today
Job Description
We are seeking a skilled Data Architecture Specialist to join our team. As a key member of our data organization, you will be responsible for designing and implementing robust data architectures that support business growth and decision-making.
Data Architecture Specialist
Posted today
Job Description
Data architecture specialists design and implement large-scale data systems, ensuring efficient data processing and storage.
- Data Processing: Develop ETL/ELT pipelines that efficiently process large volumes of structured and unstructured data from diverse sources.
- Data Governance: Embed data quality, governance, and security standards throughout all data engineering processes.
- Hands-on Development: Create and optimize PySpark scripts for efficient data extraction, transformation, and loading from large datasets.
- Mentorship and Leadership: Provide technical mentorship to junior data engineers and analysts, lead code reviews, and drive the adoption of modern data engineering tools and methodologies.
- Collaboration: Work with cross-functional teams, including data scientists, analysts, and software engineers, to deliver integrated solutions.
- Pipeline Optimization: Monitor and optimize data pipeline performance, implementing solutions for scalability and cost-effectiveness.
Requirements
- Proficiency in PySpark, Python, and SQL
- Experience with ETL/ELT pipeline development
- Strong understanding of data governance and security principles
- Ability to work collaboratively in a team environment
Benefits:
- A dynamic work environment with opportunities for professional growth and development.
How to Apply
Please submit your resume and cover letter to apply for this exciting opportunity.
Data Architecture Specialist
Posted today
Job Description
We are seeking a seasoned data architecture specialist to join our team as a senior contributor responsible for designing, implementing, and optimizing robust data infrastructure and pipelines. This role offers the opportunity to work hands-on with cutting-edge AWS and Databricks technologies whilst providing technical guidance to team members and collaborating with stakeholders across the organization on complex data engineering challenges.
Key Responsibilities:
- Data Architecture & Engineering: Design and implement enterprise-scale data architectures, including data lakes, warehouses, and real-time streaming platforms. Develop and maintain ETL/ELT pipelines that efficiently process large volumes of structured and unstructured data from diverse sources. Ensure data quality, governance, and security standards are embedded throughout all data engineering processes.
- Technical Implementation: Hands-on development of Databricks notebooks using PySpark, Python, and SQL for ETL automation. Create and optimize PySpark scripts for efficient data extraction, transformation, and loading from large datasets. Implement custom data manipulation, validation, and error handling solutions to enhance ETL robustness (a brief illustrative sketch follows this list).
- Technical Guidance: Provide technical mentorship to junior data engineers and analysts. Lead code reviews, establish best practices, and drive adoption of modern data engineering tools and methodologies. Collaborate with cross-functional teams including data scientists, analysts, and software engineers to deliver integrated solutions.
- Performance & Optimization: Monitor and optimize data pipeline performance, implementing solutions for scalability and cost-effectiveness. Conduct testing, debugging, and troubleshooting of data transformation processes. Verify data integrity throughout pipeline stages and resolve complex technical issues.
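To make the "Technical Implementation" bullet concrete, here is a minimal PySpark sketch of an extract-transform-load step with embedded validation, in the spirit of what the posting describes. It is an illustrative sketch only, not the employer's code: the bucket paths, column names, and quality rules are hypothetical.

    # Minimal PySpark ETL sketch with basic validation (hypothetical paths and rules).
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

    # Extract: read raw JSON landed in cloud storage (hypothetical bucket).
    raw = spark.read.json("s3://raw-bucket/orders/")

    # Transform: normalize types and derive a partition column.
    clean = (
        raw.withColumn("order_ts", F.to_timestamp("order_ts"))
        .withColumn("amount", F.col("amount").cast("double"))
        .withColumn("order_date", F.to_date("order_ts"))
    )

    # Validate: quarantine rows that break basic quality rules instead of
    # failing the whole job -- one common way to make ETL robust to bad input.
    valid = clean.filter(F.col("order_id").isNotNull() & (F.col("amount") >= 0))
    rejected = clean.exceptAll(valid)  # everything that failed a rule

    # Load: curated data and rejects go to separate locations for review.
    valid.write.mode("append").partitionBy("order_date").parquet("s3://curated-bucket/orders/")
    rejected.write.mode("append").parquet("s3://quarantine-bucket/orders/")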
SAP Data Architecture
Posted today
Job Description
Role – SAP Data Architecture
Job Requisition Number: SA-002
Job Level: 5 – 7 Years of experience
Key Responsibilities
Develop a data analysis and reporting framework for the S/4HANA ecosystem and ensure it is aligned with our enterprise-level data lake and analytics architecture & security guidelines
Assess the platforms and tools required to implement data analysis & reporting solutions, advise the project team on the merits & demerits of different solution approaches, and make solution recommendations
Drive our SAP ecosystem analytics implementation and work with both internal and external teams to execute it to completion
Collaborate closely with both internal & external partners to keep everyone working on the implementation aligned
Assess new reporting/analytics technologies and recommend adoption where it will improve the overall use of data as an enabler for business purposes
This is an individual contributor & hands-on role
Requirements
• Degree in Information Technology or related fields
• At least 5 - 7 years of experience in designing and implementing Data Warehouse and SAP Analytics Cloud solutions, including integration with data from SAP Cloud solutions such as SuccessFactors, Ariba, Concur, etc., along with integration to non-SAP systems
• Have experience in customizing standard business contents in BW to suit the reporting requirements of customized process in S/4HANA
• Experience in SAP Datasphere & SAC Implementation
• Should have worked on at least one full-cycle embedded analytics and enterprise analytics implementation project, with end-to-end experience in requirements gathering, functional analysis, high-level design, build, testing, and deployment (implementation in BW/4HANA would be an added advantage)
• Good understanding of ETL processes and techniques such as SDI and Data Services for extraction of data from S/4HANA and non-SAP databases
• Have knowledge of working with native HANA modelling and BW modelling techniques and tools
• Knowledge of SAP Analytics Cloud (SAC) BI tool, its pre-built visualization contents, integration and connectivity capabilities
• Have knowledge / experience in delivering projects under the Agile framework
• Work independently as well as collaboratively as part of a highly skilled team
• Good problem-solving and communication skills
Please forward your resume in MS Word format to /
Data Architecture Specialist
Posted today
Job Description
Data Architecture Expertise
- We are looking for a skilled expert with experience in designing and developing large-scale data pipelines using Scala, Python, and PySpark.
- The successful candidate will lead and mentor a team of data engineers to build and maintain cutting-edge data architectures.
- Collaboration with cross-functional teams is crucial to identifying data requirements and implementing data-driven solutions that meet business needs.
- The ideal candidate will ensure data quality, integrity, and security across all data pipelines.
Key Requirements
- A minimum of 5 years of experience in big data engineering with expertise in Scala, Python, and PySpark.
- Strong experience with big data technologies such as Apache Spark and Hadoop.
- Experience with cloud-based data platforms such as AWS, GCP, or Azure.
- Strong understanding of data architecture, data governance, and data security principles.
- Excellent leadership and mentoring skills with experience leading high-performing teams.
- Strong communication and collaboration skills with the ability to work with cross-functional teams.
Benefits of this role include
- Opportunities to work on challenging projects and develop your technical skills.
- A collaborative and dynamic work environment.
- A competitive salary and benefits package.
If you are a motivated and experienced Big Data Engineering Lead looking for a new challenge, please apply.
Data Architecture Specialist
Posted today
Job Description
We are seeking a skilled professional to design and implement large-scale data architectures using cutting-edge technologies.
Key Responsibilities:
- Develop scalable data systems that process vast amounts of unstructured data from diverse sources.
- Analyze complex data sets to drive business value and optimize performance.
- Create real-time data pipelines for seamless data integration and analytics (a brief illustrative sketch follows this list).
- Collaborate with cross-functional teams to embed data insights into business applications.
- Troubleshoot data pipeline issues and ensure high-quality data processing.
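Since the requirements below call for Spark Streaming and PySpark, here is a minimal Structured Streaming sketch of the real-time pipeline bullet above. It is illustrative only: the Kafka broker, topic, event schema, and sink paths are hypothetical placeholders, not details from the posting.

    # Minimal Spark Structured Streaming sketch (hypothetical broker, topic, paths).
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

    spark = SparkSession.builder.appName("realtime-pipeline-sketch").getOrCreate()

    # Assumed shape of each incoming JSON event.
    schema = StructType([
        StructField("event_id", StringType()),
        StructField("value", DoubleType()),
        StructField("event_time", TimestampType()),
    ])

    # Read a stream of JSON events from Kafka (hypothetical broker and topic).
    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")
        .option("subscribe", "events")
        .load()
        .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
        .select("e.*")
    )

    # Simple windowed aggregation to feed downstream analytics.
    agg = (
        events.withWatermark("event_time", "10 minutes")
        .groupBy(F.window("event_time", "5 minutes"))
        .agg(F.avg("value").alias("avg_value"))
    )

    # Write results to a Parquet sink (hypothetical paths); the checkpoint
    # location lets the stream recover after failures.
    query = (
        agg.writeStream.outputMode("append")
        .format("parquet")
        .option("path", "/data/analytics/avg_values")
        .option("checkpointLocation", "/data/checkpoints/avg_values")
        .start()
    )
    query.awaitTermination()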
Requirements:
- At least 10 years of experience in data engineering, with a focus on building robust data systems.
- Expertise in Spark Streaming, PySpark, and the Scala programming language.
- Strong analytical and problem-solving skills, with attention to detail and scalability.
- Excellent communication and collaboration skills.
Benefits:
- The opportunity to work with innovative technologies and contribute to the development of cutting-edge data solutions.
- A dynamic and supportive work environment that fosters growth and collaboration.
Chief Data Architecture Specialist
Posted today
Job Description
We are seeking a highly skilled Chief Data Architecture Specialist to join our team. In this role, you will be responsible for designing and implementing data models, developing data warehouses and lakes, and ensuring seamless integration between various data systems.
Responsibilities
- Design and implement data models using tools like ER Studio.
- Develop and maintain data warehouses, data lakes, and operational data stores.
- Ensure seamless integration between various data systems and applications.
- Implement data security and compliance requirements.
- Design scalable solutions for data integration and consolidation.
To be successful in this role, you will need:
- At least 3 years of experience in data engineering or similar role.
- Strong proficiency in Python, VQL, SQL.
- Experience with AWS services (Glue, Athena, S3, RDS, SageMaker); a brief illustrative sketch follows these requirements.
- Knowledge of data virtualisation concepts and tools.
- Experience with BI tools.
- Understanding of data modelling and database design principles.
- Familiarity with data and master data management concepts.
- Experience working in Agile environments with iterative development practices.
Experience with AWS Bedrock, Azure AI, and LLMs would be advantageous.
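As a small, hedged illustration of the AWS experience listed above, the sketch below runs an ad-hoc Athena query over data catalogued from S3 using boto3. The database, table, region, and result bucket are hypothetical placeholders.

    # Minimal boto3/Athena sketch (hypothetical database, table, and S3 paths).
    import time
    import boto3

    athena = boto3.client("athena", region_name="ap-southeast-1")

    # Kick off an ad-hoc query against a Glue-catalogued table (hypothetical names).
    resp = athena.start_query_execution(
        QueryString="SELECT order_id, amount FROM orders WHERE order_date = DATE '2024-01-01'",
        QueryExecutionContext={"Database": "sales_lake"},
        ResultConfiguration={"OutputLocation": "s3://athena-results-bucket/adhoc/"},
    )
    query_id = resp["QueryExecutionId"]

    # Poll until the query finishes.
    while True:
        state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(1)

    if state == "SUCCEEDED":
        rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
        for row in rows[1:]:  # first row is the header
            print([col.get("VarCharValue") for col in row["Data"]])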
Project Manager, Data Architecture
Posted 1 day ago
Job Description
This position bridges business operations and IT, playing a crucial role in enabling data-driven decision-making and supporting the organization's overall data strategy.
What To Expect
(Project Leadership)
- Connect and collaborate with teams from other regions of the world to share best practices and global standards definitions.
- Interface with several corporate areas and prepare technical documents, presentations, and reports.
(Data Architecture)
- Provide a high-level, abstract representation of the organization's data requirements
- Design and implement comprehensive data management strategies aligned with business goals
- Champion domain-oriented data architecture (DODA), which structures data around business domains, enabling domain teams to own and serve their data
- Implement data governance policies and data quality control measures
- Develop and maintain enterprise data architecture solutions, including data warehouses and data lakes
- Ensure data security and compliance with relevant regulations
- Create data models and strategies to organize and store data entities efficiently
- Assess existing databases and data architectures for weaknesses
What You'll Bring
- Bachelor's Degree in Computer Science, Information Technology or relevant disciplines
- At least 10 years of experience in data architecture and related technologies
- Proficiency in programming languages such as Python, SQL, Java, and C++
- Deep understanding of database management systems and data modeling techniques
- Experience with Big Data technologies, cloud computing platforms, and data analytics tools
- Strong project management and time management abilities
- Excellent communication skills to collaborate with various stakeholders
- Experience developing data pipeline processes to move data from OLTP to OLAP systems (a brief illustrative sketch follows)
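To illustrate the last requirement, here is a minimal, hedged sketch of an incremental OLTP-to-OLAP batch load in Python. The connection strings, table names, and watermark column are hypothetical placeholders, not details from the posting.

    # Minimal OLTP -> OLAP incremental load sketch (hypothetical DSNs and tables).
    import pandas as pd
    from sqlalchemy import create_engine, text

    oltp = create_engine("postgresql://user:pass@oltp-host/sales")      # transactional source (hypothetical)
    olap = create_engine("postgresql://user:pass@olap-host/warehouse")  # analytical target (hypothetical)

    # High-watermark pattern: only pull rows newer than what the warehouse holds.
    with olap.connect() as conn:
        last_loaded = conn.execute(
            text("SELECT COALESCE(MAX(updated_at), TIMESTAMP '1970-01-01') FROM fact_orders")
        ).scalar()

    # Extract only the delta from the OLTP system to keep the load cheap.
    delta = pd.read_sql(
        text("SELECT * FROM orders WHERE updated_at > :wm"),
        oltp,
        params={"wm": last_loaded},
    )

    # Light reshaping for the analytical model, then append to the fact table.
    delta["order_date"] = pd.to_datetime(delta["updated_at"]).dt.date
    delta.to_sql("fact_orders", olap, if_exists="append", index=False)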
Junior Data Engineer/Architecture
Posted today
Job Description
Key Responsibilities
- Build and maintain data pipelines for ingesting futures market data from multiple sources (exchange feeds, vendors, internal systems).
- Design and improve data models and structures to store historical and intraday futures prices across multiple asset classes.
- Implement logic to roll futures contracts based on configurable rules.
- Develop tools to stitch contract-level data into continuous time series, handling edge cases (e.g. holidays, partial rolls, gaps); a brief illustrative sketch follows this list.
- Ensure data quality and integrity via validation checks, exception handling, and alerting.
- Work with Quant Researchers to deliver clean, backtest-ready datasets that match trading and strategy assumptions.
- Document technical processes and support other teams with data usage and integration.
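As a hedged illustration of the roll-and-stitch responsibilities above, the sketch below builds a ratio-adjusted continuous price series from contract-level closes using pandas. The input schema (date, contract, close, expiry) and the "roll N days before expiry" rule are hypothetical; a production version would also handle the holidays, partial rolls, and gaps the posting mentions.

    # Minimal continuous-futures stitching sketch (hypothetical schema and roll rule).
    import pandas as pd

    def build_continuous(contracts: pd.DataFrame, roll_days_before_expiry: int = 5) -> pd.DataFrame:
        """Stitch contract-level closes into a ratio-adjusted continuous series.

        Assumes one row per contract per trading day, with datetime columns
        `date` and `expiry` plus `contract` and `close` (hypothetical schema).
        """
        contracts = contracts.sort_values(["contract", "date"]).copy()
        # Each contract's daily return against its own previous close, so a roll
        # never mixes prices from two different contracts.
        contracts["ret"] = contracts.groupby("contract")["close"].pct_change()

        # Configurable roll rule: hold a contract until N days before expiry,
        # then switch to the next-nearest eligible expiry.
        eligible = contracts[
            contracts["expiry"] - contracts["date"] > pd.Timedelta(days=roll_days_before_expiry)
        ]
        # Nearest eligible expiry on each date is the front contract.
        front = eligible.sort_values(["date", "expiry"]).drop_duplicates("date", keep="first")

        # Chain the front contract's own returns into one continuous price path,
        # anchored at the first observed front-month close.
        front = front.sort_values("date").reset_index(drop=True)
        front["ret"] = front["ret"].fillna(0.0)  # a contract's first session has no prior close
        front["continuous"] = (1.0 + front["ret"]).cumprod() * front["close"].iloc[0]
        return front[["date", "contract", "close", "continuous"]]

The same frame feeds the data-quality bullet naturally, for example by asserting that every trading date maps to exactly one front contract before handing the series to researchers.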
Requirements
- 1–3 years of experience in a data engineering, quant research, or back-office data role. Fresh graduates with relevant experience are welcome.
- Solid programming skills in Python (e.g., pandas, numpy, datetime)
- Good understanding of futures markets and contract structures (e.g., expiries, rollovers, front-month logic).
- Familiarity with data versioning, time series databases, or data lakes.
- Experience working with APIs, flat files, or vendor feeds (e.g., Bloomberg, ICE, CME, Refinitiv).
- Strong attention to detail and ability to work with messy or inconsistent datasets.
Data Integration
Posted today
Job Description
Data Integration & Historian Specialist (Pharma, Contract)
12 Months Contract
Location: Tuas
Up to $10,000 depending on experience
Industry: Pharmaceutical
Our client aims to create a new manufacturing concept: a new generation of evolutive multi-product facilities that are modular, adaptable, and agile, leveraging new disruptive technologies to better address vaccine business challenges.
Responsibilities:
- Define and support equipment interfacing to enable contextualized data acquisition (APRM), including Alarm & Event and Audit Trail data.
- Develop and optimize SQL queries to extract and manage relevant data (a brief illustrative sketch follows this list).
- Collaborate on Plug & Produce (P&P) investigations and design efforts for seamless integration between MES and Historian systems.
- Review and approve technical specifications for system interfaces.
- Provide support for test case execution and perform test case activities as required, ensuring system readiness and compliance with defined requirements.
- Act as a liaison between Digital and other cross-functional teams (Automation, MES, and MSAT) as well as Global Teams to support and troubleshoot application-related incidents, ensuring timely resolution and effective communication.
- Diagnose and resolve issues related to data acquisition and interface connectivity between MES, PLC, DCS, and Historian systems.
- Monitor application performance and availability to ensure continuous operation of the Historian system.
- Conduct routine maintenance tasks including log reviews, job monitoring, and system health checks.
- Ensure robust backup and recovery procedures are in place and regularly tested.
- Participate in Historian upgrade activities, including impact assessments to evaluate risks, dependencies, and validation requirements.
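As a hedged illustration of the SQL-extraction responsibility above, this sketch pulls a window of historian samples over ODBC from Python. The DSN, table, columns, and tag name are hypothetical placeholders; the real schema depends on how the site's Aspen IP.21 repository is configured.

    # Minimal historian extraction sketch over ODBC (hypothetical DSN, table, tag).
    import pyodbc

    # Connect through a preconfigured ODBC data source for the historian.
    conn = pyodbc.connect("DSN=IP21_HISTORIAN", autocommit=True)
    cursor = conn.cursor()

    # Pull one day of history for a hypothetical tag; the table and column
    # names below stand in for the site-specific historian schema.
    cursor.execute(
        """
        SELECT tag_name, sample_time, sample_value
        FROM tag_history
        WHERE tag_name = ?
          AND sample_time BETWEEN ? AND ?
        ORDER BY sample_time
        """,
        ("REACTOR1_TEMP", "2024-01-01 00:00:00", "2024-01-02 00:00:00"),
    )

    for tag, ts, value in cursor.fetchall():
        print(tag, ts, value)

    conn.close()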
Requirements:
- Proficient in both Waterfall and Agile project methodologies
- Deep expertise in deploying and configuring the Aspen Historian Suite, including Aspen IP.21, Aspen Production Record Manager (APRM), Event21, CIM-IO, Batch Extractor, and Aspen SQLplus
- Skilled in configuring Kepware, including driver setup for protocol translation and data aggregation
- Experienced working in pharmaceutical manufacturing environments
- Proven experience integrating with a wide range of automation and control systems:
- DCS systems such as Emerson DeltaV
- SCADA platforms including WinCC and Wonderware
- PLCs, benchtop instruments, and other lab/manufacturing equipment
- Technical Proficiencies
- Database: MySQL, Microsoft SQL Server, Oracle
- Data and Industrial Protocols: ODBC, FTP, MQTT, OPC-UA, OPC-DA, Modbus
- Scripting & Automation: PowerShell scripting for automation and data handling
- Infrastructure Knowledge: In-depth understanding of networking principles and operating systems (Windows/Linux)
Lim Pey Chyi -
Recruitment Consultant (R )
Manpower Staffing Services (S) Pte Ltd
EA Licence: 02C3423