121 Integration Developer jobs in Singapore
Data Integration Specialist
Posted today
Job Description
At the heart of our organization, we require a skilled professional to drive data integration initiatives. The successful candidate will be responsible for ensuring seamless data exchange across systems, leveraging their expertise in Informatica PowerCenter and Oracle databases.
Key Responsibilities:
- Install, configure, and maintain Informatica PowerCenter software to facilitate efficient data integration.
- Translate technical requirements into actionable plans, including source-to-target mapping, to ensure accurate data transfer.
- Develop job schedules to synchronize upstream and downstream systems, guaranteeing data consistency and integrity.
- Verify data accuracy during testing phases, identifying and rectifying any discrepancies.
- Ensure timely delivery of documentation, design, build, testing, and deployment according to established work breakdown structures.
- Support production activities, including environment migrations, continuity testing, and maintenance.
Requirements:
- Proficiency in Informatica PowerCenter 10.x or higher, preferably in a Unix (Linux/AIX) environment with AS400/Oracle sources and targets.
- Experience with Oracle 19c or higher (SQL*Plus, PL/SQL) and Unix Shell Scripting (Linux/AIX).
- Familiarity with analyzing and producing technical mapping design specifications.
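For illustration only, here is a minimal Python sketch of the kind of source-to-target verification described above: a simple row-count reconciliation between an Oracle staging table and its warehouse target. The connection details, table names, and the python-oracledb driver are assumptions, not part of this role's stated stack.

    # Hypothetical sketch: reconcile row counts between a source and a target Oracle table.
    # Connection details, table names, and the python-oracledb driver are assumptions.
    import oracledb

    def count_rows(dsn: str, user: str, password: str, table: str) -> int:
        # The table name is a trusted constant in this sketch, not user input.
        with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
            with conn.cursor() as cur:
                cur.execute(f"SELECT COUNT(*) FROM {table}")
                return cur.fetchone()[0]

    if __name__ == "__main__":
        src = count_rows("src-host/ORCLPDB1", "etl_user", "***", "STG.CUSTOMER")
        tgt = count_rows("tgt-host/ORCLPDB1", "etl_user", "***", "DWH.CUSTOMER")
        print(f"source={src} target={tgt} -> {'OK' if src == tgt else 'MISMATCH'}")

In practice such a check would usually be extended to column-level checksums or sampled value comparisons before sign-off.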
Data Integration Architect (API)
Posted today
Job Description
About the Company:
I am currently working with a well-known telecommunications company in Singapore.
5 days in office; 2 rounds of interviews.
About the Job (please reach out for the full JD):
- Design and build secure, reusable APIs (REST, GraphQL, event-driven) for AI agent and application integration.
- Architect end-to-end system and data integrations across hybrid cloud and on-premise environments.
- Develop scalable, secure data pipelines and frameworks for AI/ML platforms.
- Enable real-time, context-rich data access for LLMs and agent-based workflows.
Skills and Requirements:
- Bachelor’s in Computer Science or related field.
- Minimum 3 years of experience in data architecture and in system and API integration engineering.
- Demonstrated experience in designing integration flows for large-scale, real-time systems across cloud and legacy environments.
- Proficient in data pipelines, data lakes, warehouses, and lakehouse architectures, as well as APIs (REST, SOAP, etc.).
- Skilled with orchestration and middleware tools (Kafka, Azure Data Factory, Databricks, Airflow).
- Knowledgeable in data security, governance, and compliance (encryption, RBAC, masking, PDPA).
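As a rough sketch of the kind of secure, reusable API described above, the snippet below exposes a single REST endpoint returning context data for an agent or LLM workflow. FastAPI, the endpoint path, the response model, and the API-key check are illustrative assumptions rather than the employer's actual design.

    # Illustrative sketch only; framework, routes, and fields are assumptions.
    from fastapi import FastAPI, Header, HTTPException
    from pydantic import BaseModel

    app = FastAPI(title="Customer Context API (sketch)")

    class CustomerContext(BaseModel):
        customer_id: str
        segment: str
        recent_events: list[str]

    @app.get("/v1/customers/{customer_id}/context", response_model=CustomerContext)
    def get_customer_context(customer_id: str, x_api_key: str = Header(...)) -> CustomerContext:
        # A real service would validate the key against a secrets store and enforce RBAC.
        if x_api_key != "demo-key":
            raise HTTPException(status_code=401, detail="invalid API key")
        # Placeholder payload; a real implementation would query the lakehouse or a cache.
        return CustomerContext(customer_id=customer_id, segment="postpaid",
                               recent_events=["plan_upgrade", "roaming_activated"])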
To apply online, please use the 'apply' function; alternatively, you may contact Stella at 96554170 (EA: 94C3609 / R1875382).
Data Integration Engineer, Data Science
Posted today
Job Description
The Job:
To design, build, and maintain secure, scalable, and high-performance data pipelines for our next-generation Air Traffic Management (ATM) platform.
Work with modern cloud and data technologies, enabling real-time data-driven decision-making for safer and more efficient airspace operations.
While not mandatory, familiarity with AI/Agentic AI concepts and aviation data is a plus.
The Role:
• Design and develop scalable batch and streaming pipelines using modern tools like AWS, Databricks, etc.
• Implement reliable data processing, transformation, and storage with a focus on performance, security, and governance.
• Collaborate with data scientists, engineers, and domain experts to ensure the platform supports advanced analytics and AI-readiness.
• Optimize pipelines for resilience, observability, and cost-effectiveness using monitoring and automation tools.
• Maintain clear documentation and contribute to continuous improvements in data architecture.
The Requirements:
• Strong hands-on experience with AWS cloud services, particularly for data engineering.
• Proficiency in Python or Scala, with practical knowledge of Spark and distributed data processing.
• Experience building and maintaining scalable data pipelines (batch and real-time).
• Solid knowledge of SQL, data modelling, and transformation techniques.
• Familiarity with data security, governance, and compliance best practices.
• Strong problem-solving, analytical, and communication skills.
• Experience with AWS Databricks, Delta Lake, and medallion architecture.
• Exposure to AI/Gen AI concepts or intelligent agent-based architectures.
• Familiarity with aviation data standards (e.g., ADS-B, ARINC 424, flight schedules) or willingness to learn.
• AWS Certifications in Data Analytics, Big Data, or Machine Learning.
• Experience with real-time data processing and high-volume systems.
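To make the pipeline requirements above concrete, here is a minimal PySpark Structured Streaming sketch that ingests position reports from a Kafka topic into a Delta table. The topic name, schema, and S3 paths are invented for illustration and do not describe the actual ATM platform.

    # Illustrative sketch only; requires the spark-sql-kafka and Delta Lake packages on the cluster.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import DoubleType, StringType, StructField, StructType, TimestampType

    spark = SparkSession.builder.appName("adsb-ingest-sketch").getOrCreate()

    schema = StructType([
        StructField("icao24", StringType()),
        StructField("lat", DoubleType()),
        StructField("lon", DoubleType()),
        StructField("ts", TimestampType()),
    ])

    raw = (spark.readStream.format("kafka")
           .option("kafka.bootstrap.servers", "broker:9092")
           .option("subscribe", "adsb-positions")
           .load())

    # Parse the JSON payload and land it append-only in a bronze Delta table.
    positions = raw.select(from_json(col("value").cast("string"), schema).alias("p")).select("p.*")

    (positions.writeStream
     .format("delta")
     .option("checkpointLocation", "s3://example-bucket/checkpoints/adsb")
     .outputMode("append")
     .start("s3://example-bucket/bronze/adsb_positions"))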
To Apply:
Please send your latest CV in Word format to
Kindly indicate your availability, as well as your current and expected remuneration package.
We regret that only shortlisted candidates will be notified.
Quinnox Solutions Pte. Ltd. (License Number: 06C3126)
Registered EA Personnel (Reg. No.:R11100)
Chief Data Integration Specialist
Posted today
Job Description
About the Role:
The data architect plays a vital role in designing and implementing data infrastructure to support analytics and data science initiatives. This position is responsible for developing and optimizing data pipelines, ensuring data quality and accessibility, and collaborating with data scientists and analysts to enable efficient decision-making.
Key Responsibilities:
- Data Pipeline Development: Design and implement efficient ETL processes to integrate data from various sources. Optimize existing pipelines for improved performance and scalability.
- Data Architecture Management: Develop and maintain the data architecture, ensuring it meets the needs of the team. Implement data modeling techniques to optimize data storage and retrieval.
- Data Quality Assurance: Implement data quality checks and monitoring systems to ensure the accuracy and reliability of data used in analytics and reporting. Develop and maintain data documentation and metadata (a minimal check sketch follows this list).
- Big Data Technologies: Utilize big data technologies to process and analyze large volumes of customer data efficiently. Implement solutions for real-time data processing when required.
- Infrastructure Optimization: Continuously assess and optimize the data infrastructure to improve performance, reduce costs, and enhance scalability. Implement automation solutions to streamline data processes.
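The sketch below illustrates the kind of lightweight data quality check mentioned in the responsibilities above, using pandas; the column names and rules are assumptions chosen purely for illustration.

    # Hypothetical data quality checks; column names and rules are assumptions.
    import pandas as pd

    def run_quality_checks(df: pd.DataFrame) -> dict:
        return {
            "no_null_ids": df["customer_id"].notna().all(),
            "unique_ids": not df["customer_id"].duplicated().any(),
            "non_negative_amounts": (df["amount"] >= 0).all(),
        }

    if __name__ == "__main__":
        sample = pd.DataFrame({"customer_id": ["a1", "a2", "a2"], "amount": [10.0, 5.5, -1.0]})
        failed = [name for name, passed in run_quality_checks(sample).items() if not passed]
        print("failed checks:", failed or "none")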
Education Level:
- Bachelor's degree in Computer Science, Information Systems, or a related field. Master's degree in a relevant field is preferred.
Required Experience and Knowledge:
- 3-5 years of experience in data engineering or a related field.
- Strong knowledge of data warehouse concepts, ETL processes, and data modeling techniques.
- Experience with cloud-based data platforms.
- Proficiency in SQL and experience with NoSQL databases.
- Experience with big data technologies such as Hadoop, Spark, or Kafka.
- Proficiency in Python or Scala for data processing and automation.
- Experience with ETL tools.
- Knowledge of data visualization tools to support data quality checks and pipeline monitoring.
- Familiarity with version control systems and CI/CD practices.
- Experience with container technologies and orchestration tools.
- Understanding of data security best practices and implementation.
- Strong problem-solving and analytical skills.
- Excellent communication abilities to collaborate with technical and non-technical team members.
- Proactive approach to identifying and resolving data-related issues.
- Ability to manage multiple projects and priorities effectively.
- Detail-oriented with a focus on data quality and system reliability.
- Adaptability to work with evolving technologies and changing business requirements.
- Strong teamwork skills and ability to work in a collaborative environment.
Senior Data Integration Specialist
Posted today
Job Description
Data Integration Engineer, Data Science: Reimagined
The Role:
We seek a seasoned Data Integration Engineer to spearhead the development of cutting-edge data pipelines for our next-generation Air Traffic Management (ATM) platform.
As a key team member, you will work closely with modern cloud and data technologies to empower real-time data-driven decision-making for safer and more efficient airspace operations.
Responsibilities:
- Design and develop robust batch and streaming pipelines using leading-edge tools like AWS, Databricks, etc.
- Implement reliable data processing, transformation, and storage with a focus on high-performance, security, and governance.
- Collaborate with data scientists, engineers, and domain experts to ensure the platform supports advanced analytics and AI-readiness.
- Optimize pipelines for resilience, observability, and cost-effectiveness using monitoring and automation tools.
- Maintain clear documentation and contribute to continuous improvements in data architecture.
Requirements:
- Strong hands-on experience with AWS cloud services, particularly for data engineering.
- Proficiency in Python or Scala, with practical knowledge of Spark and distributed data processing.
- Experience building and maintaining scalable data pipelines (batch and real-time).
- Solid knowledge of SQL, data modelling, and transformation techniques.
- Familiarity with data security, governance, and compliance best practices.
- Strong problem-solving, analytical, and communication skills.
- Experience with AWS Databricks, Delta Lake, and medallion architecture.
- Exposure to AI/Gen AI concepts or intelligent agent-based architectures.
- Familiarity with aviation data standards (e.g., ADS-B, ARINC 424, flight schedules) or willingness to learn.
- AWS Certifications in Data Analytics, Big Data, or Machine Learning.
- Experience with real-time data processing and high-volume systems.
Business Data Integration Specialist
Posted today
Job Description
Key Responsibilities:
- We install and maintain PowerCenter software, a robust data integration platform.
- We develop requirements at all levels, including source-to-target mapping specifications that ensure seamless data transfer.
- We design and implement job schedules to integrate systems in an efficient manner.
- We are accountable for the accuracy and integrity of our work products during testing phases.
- We guarantee timely delivery of all documentation, designs, builds, tests, and deployments according to project plans.
- We support production activities, such as environment migrations, continuity testing, and maintenance.
Requirements:
- We require proficiency in Informatica PowerCenter 10.x or higher, preferably in a Unix (Linux/AIX) environment with AS400/Oracle sources and targets.
- We need experience in Oracle 19c or higher (SQL*Plus, PL/SQL), as well as Unix Shell Scripting (Linux/AIX).
- We expect candidates to have experience in analyzing and producing technical mapping design specifications.
Chief Data Integration Specialist
Posted today
Job Description
Job Summary:
We are seeking an experienced Senior Data Engineer to lead the design, development, and deployment of scalable ETL processes across diverse enterprise environments.
The ideal candidate will have extensive hands-on experience with Talend, SSIS, Snowflake, and Data Vault methodologies, along with a proven track record in managing large-scale migration and data warehousing projects.
Key Responsibilities:
- Lead the design, development, and deployment of scalable ETL processes using Talend, SSIS, and Snowflake for multi-source data ingestion and transformation.
- Architect and implement enterprise data models (dimensional, logical, and physical) using ERwin and MySQL Workbench, incorporating Data Vault 2.0 practices.
- Collaborate with business analysts to gather requirements, define technical specifications, and create robust data mapping frameworks.
- Oversee migration initiatives from legacy SSIS-based systems to Talend Cloud/Teradata, enhancing orchestration flows, automation, and CI/CD pipelines.
- Optimize SQL scripts, perform query tuning, and manage multi-layer data architectures to improve performance and reliability.
- Maintain version control (TFS, Git), troubleshoot production issues, and ensure ongoing support for existing BI and ETL solutions.
Requirements:
- Bachelor's or Master's degree in IT, Computer Science, or related field.
- 10+ years of experience in ETL development, data integration, and data modeling.
- Proficiency in Talend (v7.3 & v8.0), SSIS, Snowflake, SQL Server, and MariaDB.
- Strong expertise in ERwin Data Modeler, Data Vault 2.0, and performance optimization.
- Experience with Azure DevOps, Git, TFS, and CI/CD pipelines.
- Excellent problem-solving, analytical, and communication skills.
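For illustration of the Data Vault 2.0 practices listed above, here is a hypothetical Python sketch that inserts new business keys into a hub table in Snowflake. The table and column names, the MD5 hash-key convention, and the connection parameters are assumptions, not details of this project.

    # Illustrative sketch: load new business keys into a Data Vault 2.0 hub in Snowflake.
    # Table names, columns, hashing choice, and connection details are assumptions.
    import hashlib
    import snowflake.connector

    HUB_LOAD_SQL = """
    MERGE INTO dv.hub_customer h
    USING (SELECT %(hash_key)s AS hub_customer_hk,
                  %(business_key)s AS customer_bk,
                  CURRENT_TIMESTAMP AS load_dts,
                  %(record_source)s AS record_source) s
    ON h.hub_customer_hk = s.hub_customer_hk
    WHEN NOT MATCHED THEN
      INSERT (hub_customer_hk, customer_bk, load_dts, record_source)
      VALUES (s.hub_customer_hk, s.customer_bk, s.load_dts, s.record_source)
    """

    def load_hub_row(conn, business_key: str, record_source: str) -> None:
        # Hash of the cleansed business key serves as the hub's surrogate key.
        hash_key = hashlib.md5(business_key.strip().upper().encode()).hexdigest()
        conn.cursor().execute(HUB_LOAD_SQL, {
            "hash_key": hash_key,
            "business_key": business_key,
            "record_source": record_source,
        })

    if __name__ == "__main__":
        conn = snowflake.connector.connect(user="etl_user", password="***", account="example-account",
                                           warehouse="ETL_WH", database="EDW", schema="DV")
        load_hub_row(conn, "CUST-0001", "CRM")
        conn.close()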
Senior Data Integration Specialist
Posted today
Job Description
Job Summary:
We are seeking a highly skilled Data Integration Specialist to join our team. The successful candidate will be responsible for the design, implementation and maintenance of data integration pipelines and architectures.
Key Responsibilities:
- Design and develop scalable data models and schemas using various data modelling best practices.
- Collaborate with cross-functional teams to deploy and deliver software products, actively participating in product enhancement.
- Take ownership of key software components, developing and maintaining them according to industry standards.
- Develop data transformation routines to clean, normalize and aggregate data, applying data processing techniques as needed.
- Implement event-driven processing pipelines using frameworks such as Solace PubSub+, Apache Kafka, and AMQP (a minimal consumer sketch follows this list).
- Ensure compliance with data governance policies and standards, ensuring data quality, integrity, security, and consistency.
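As referenced in the responsibilities above, here is a minimal Python sketch of an event-driven consumer using the confluent-kafka client; the broker address, topic name, and consumer group are placeholder assumptions, and Kafka stands in for any of the listed frameworks.

    # Illustrative event-driven consumer sketch; broker, topic, and group id are assumptions.
    import json
    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "broker:9092",
        "group.id": "integration-sketch",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["trade-events"])

    try:
        while True:
            msg = consumer.poll(1.0)
            if msg is None:
                continue
            if msg.error():
                print("consumer error:", msg.error())
                continue
            event = json.loads(msg.value())
            # Clean/normalise the event and hand it to downstream storage or processing here.
            print("received event:", event.get("event_type"))
    finally:
        consumer.close()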
Requirements:
- Bachelor's/Master's degree in Computer Science, Engineering or related field, or relevant experience.
- Proficiency in at least one object-oriented language, Java skills highly desirable.
- Deep understanding of data lifecycle management, including its design and implementation.
- Experience working in a dynamic environment managing senior stakeholders from different organizations.
Benefits:
- 5-day work week
About Us:
Maestro HR is an innovative human resources company dedicated to improving workforce performance.
Chief Data Integration Specialist
Posted today
Job Description
Data Warehouse Architect
About the Role:
The successful candidate will be responsible for designing and implementing end-to-end software solutions for big projects and change requests. They will also provide support to users, addressing performance issues and query responses.
Key Responsibilities:
- Design, develop, and implement software solutions from start to finish.
- Support users with query responses and address system performance issues.
Required Skills:
- Expertise in developing data pipelines using PySpark, Scala, and Java.
- Implementation of Hadoop-based Data marts using Spark-based frameworks.
- Good working experience in core technical areas including Python, Java, PySpark, and Scala.
- Experience with Cloudera CDH/CDP components.
- Knowledge in developing Spark-based ingestion frameworks.
- Building and operationalizing feature pipelines for AI/ML model execution and large-scale data warehouse/data mart support.
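As a small illustration of the Spark-based ingestion frameworks mentioned above, the sketch below reads a landed Parquet extract and writes it to a Hive-backed data mart table; the paths and table names are assumptions for illustration only.

    # Illustrative Spark ingestion step into a Hive-backed data mart; names and paths are assumptions.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import current_timestamp

    spark = (SparkSession.builder
             .appName("mart-ingest-sketch")
             .enableHiveSupport()
             .getOrCreate())

    orders = spark.read.parquet("hdfs:///landing/orders/2024-06-01/")

    # Deduplicate on the business key and stamp the load time before publishing to the mart.
    curated = (orders
               .dropDuplicates(["order_id"])
               .withColumn("ingest_ts", current_timestamp()))

    (curated.write
     .mode("overwrite")
     .saveAsTable("sales_mart.orders_daily"))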
Good to Have:
- Scripting knowledge in Bash, Python, and Perl for automation and repetitive tasks.
- Technical expertise in RHEL/Linux, Unix hardware, operating systems, and system services.