219 Data Integration jobs in Singapore

Data Integration Specialist

Singapore, Singapore beBeeInformatica

Posted today


Job Description

Job Title: Data Integration Specialist

At the heart of our organization, we require a skilled professional to drive data integration initiatives. The successful candidate will be responsible for ensuring seamless data exchange across systems, leveraging their expertise in Informatica PowerCenter and Oracle databases.

Key Responsibilities:
  1. Install, configure, and maintain Informatica PowerCenter software to facilitate efficient data integration.
  2. Translate technical requirements into actionable plans, including source-to-target mapping, to ensure accurate data transfer.
  3. Develop job schedules to synchronize upstream and downstream systems, guaranteeing data consistency and integrity.
  4. Verify data accuracy during testing phases, identifying and rectifying any discrepancies.
  5. Ensure timely delivery of documentation, design, build, testing, and deployment according to established work breakdown structures.
  6. Support production activities, including environment migrations, continuity testing, and maintenance.
Requirements:
  • Proficiency in Informatica PowerCenter 10.x or higher, preferably in a Unix (Linux/AIX) environment with AS400/Oracle sources and targets.
  • Experience with Oracle 19c or higher (SQL*Plus, PL/SQL) and Unix Shell Scripting (Linux/AIX).
  • Familiarity with analyzing and producing technical mapping design specifications.
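The data-verification duty above (checking accuracy during testing and rectifying discrepancies) can be sketched as a simple source-to-target reconciliation. This is an illustrative sketch only; the row shapes and the `id` key are invented, not taken from the posting:

```python
# Hedged sketch: reconcile a source extract against its loaded target,
# as one might during test-phase data verification.
def reconcile(source_rows, target_rows, key):
    """Compare two row sets by count and by per-key presence."""
    src_keys = {row[key] for row in source_rows}
    tgt_keys = {row[key] for row in target_rows}
    return {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "missing_in_target": sorted(src_keys - tgt_keys),
        "unexpected_in_target": sorted(tgt_keys - src_keys),
    }

source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}, {"id": 3, "amt": 30}]
target = [{"id": 1, "amt": 10}, {"id": 3, "amt": 30}]

report = reconcile(source, target, key="id")
# report["missing_in_target"] identifies rows that failed to transfer
```

In practice such checks would run as SQL against the AS400/Oracle source and target rather than in application code.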
This advertiser has chosen not to accept applicants from your region.

Data Integration Engineer, Data Science

Singapore, Singapore QUINNOX SOLUTIONS PTE. LTD.

Posted today


Job Description

Roles & Responsibilities

The Job:

To design, build, and maintain secure, scalable, and high-performance data pipelines for our next-generation Air Traffic Management (ATM) platform.

Work with modern cloud and data technologies, enabling real-time data-driven decision-making for safer and more efficient airspace operations.

While not mandatory, familiarity with AI/Agentic AI concepts and aviation data is a plus.

The Role:

• Design and develop scalable batch and streaming pipelines using modern tools such as AWS and Databricks.

• Implement reliable data processing, transformation, and storage with a focus on performance, security, and governance.

• Collaborate with data scientists, engineers, and domain experts to ensure the platform supports advanced analytics and AI-readiness.

• Optimize pipelines for resilience, observability, and cost-effectiveness using monitoring and automation tools.

• Maintain clear documentation and contribute to continuous improvements in data architecture.

The Requirements:

• Strong hands-on experience with AWS cloud services, particularly for data engineering.

• Proficiency in Python or Scala, with practical knowledge of Spark and distributed data processing.

• Experience building and maintaining scalable data pipelines (batch and real-time).

• Solid knowledge of SQL, data modelling, and transformation techniques.

• Familiarity with data security, governance, and compliance best practices.

• Strong problem-solving, analytical, and communication skills.

• Experience with AWS Databricks, Delta Lake, and medallion architecture.

• Exposure to AI/Gen AI concepts or intelligent agent-based architectures.

• Familiarity with aviation data standards (e.g., ADS-B, ARINC 424, flight schedules) or willingness to learn.

• AWS Certifications in Data Analytics, Big Data, or Machine Learning.

• Experience with real-time data processing and high-volume systems.
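The medallion architecture the requirements mention is a bronze/silver/gold layering of data refinement. The sketch below illustrates the idea in plain Python rather than Databricks/Delta Lake, and the flight records are invented examples, not part of the role:

```python
# Illustrative bronze -> silver -> gold layering (medallion architecture),
# using plain Python so the idea stands alone without Spark or Delta Lake.
bronze = [  # raw ingested records, kept exactly as received
    {"flight": "SQ321", "alt_ft": "35000"},
    {"flight": "sq321", "alt_ft": "35000"},  # duplicate, different casing
    {"flight": "TR108", "alt_ft": "31000"},
    {"flight": "TR999", "alt_ft": None},     # bad record
]

# Silver layer: cleaned, deduplicated, typed
seen = set()
silver = []
for rec in bronze:
    if rec["alt_ft"] is None:       # drop unusable records
        continue
    key = rec["flight"].upper()     # normalize the business key
    if key in seen:                 # deduplicate
        continue
    seen.add(key)
    silver.append({"flight": key, "alt_ft": int(rec["alt_ft"])})

# Gold layer: aggregated, analytics-ready
gold = {"flights": len(silver), "max_alt_ft": max(r["alt_ft"] for r in silver)}
```

Each layer is typically persisted as its own table so downstream analytics and AI workloads can read from the layer that matches their quality needs.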

To Apply:

Please send your latest CV in Word format to

Kindly indicate your availability, current and expected remuneration package.

We regret that only shortlisted candidates will be notified.

Quinnox Solutions Pte. Ltd. (License Number: 06C3126)

Registered EA Personnel (Reg. No.:R11100)


Data Integration Engineer, Data Science

$7000 Monthly QUINNOX SOLUTIONS PTE. LTD.

Posted 2 days ago


Senior Data & Integration Architect

Singapore, Singapore Singtel Group

Posted today


Job Description

An empowering career at Singtel begins with a Hello. Our purpose, to Empower Every Generation, connects people to the possibilities they need to excel. Every "hello" at Singtel opens doors to new initiatives, growth, and BIG possibilities that take your career to new heights. So, when you say hello to us, you are really empowered to say "Hello BIG Possibilities".

Be a Part of Something BIG!

The Senior Data & Integration Architect plays a critical role in designing and implementing the data and integration layer of Singtel’s next-generation AI platform. This role focuses on orchestrating secure, scalable, and interoperable data architectures and API integration frameworks that support advanced AI and ML workloads across the enterprise.

You will work closely with the AI architecture, security, platform, and business teams to ensure seamless data movement, governance alignment, and real-time system interoperability, serving as the bridge between distributed data systems, cloud services, and AI-enabled applications.

Make An Impact By

  • Design, build and implement enterprise-wide data and API integration frameworks to support AI/ML platforms across hybrid cloud and on-premise environments
  • Work with system owners and data domain leads to design and deliver scalable end-to-end data flows across operational, analytical, and AI systems
  • Define and develop secure, reusable API interfaces (REST, GraphQL, event-driven) and data interfaces (batch or streaming) that enable seamless interoperability between internal systems and AI services
  • Oversee and evaluate new data integration approaches and pipeline designs to ensure efficient, secure, and scalable data flow between data sources and AI platforms.
  • Collaborate with Security and Data Governance teams to ensure integration designs align with compliance, privacy, and policy requirements (e.g., PDPA, data classification)
  • Design and enable data access strategies for LLMs and agent-based workflows, ensuring context-rich, real-time connectivity to distributed enterprise systems
  • Implement and maintain integration middleware and tooling (e.g., Kafka, Azure ML/Foundry, Databricks) to support data orchestration, synchronization, and reliability
  • Contribute integration expertise to data or AI experimentation, PoCs, and platform upgrades, ensuring architectural consistency and production-readiness
  • Define and enforce data and integration design standards, focusing on scalability, resilience, observability, and system decoupling
  • Work closely with business units, IT, and Networks to align integration plans with enterprise priorities and ensure successful data exchange across functional boundaries
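The event-driven interfaces and system decoupling described above can be illustrated with a minimal in-memory publish/subscribe sketch. In practice this would sit on middleware such as Kafka, which the posting names but does not specify; the topic name and event shape here are invented:

```python
# Minimal in-memory pub/sub to illustrate event-driven decoupling:
# publishers and subscribers know only the topic, never each other.
from collections import defaultdict

class EventBus:
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a callback for a topic."""
        self._subs[topic].append(handler)

    def publish(self, topic, event):
        """Deliver an event to every subscriber of the topic."""
        for handler in self._subs[topic]:
            handler(event)

bus = EventBus()
received = []
bus.subscribe("customer.updated", received.append)           # consuming AI service
bus.publish("customer.updated", {"id": 42, "tier": "gold"})  # producing source system
```

The decoupling is the point: the source system can be replaced or scaled without the consuming AI service changing, which is what "system decoupling" in the design standards bullet refers to.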

Skills to Succeed

  • Bachelor’s in Computer Science, Engineering, Data, AI/ML, or related field.
  • At least 3 years of experience in data architecture, system and API integration engineering.
  • Demonstrated experience in designing integration flows for large-scale, real-time systems across cloud and legacy environments.
  • Experience in designing and implementing data integration frameworks across hybrid cloud and on-premise environments, including building scalable and secure data pipelines for AI/ML platforms.
  • Proficient in data integration design, with solid knowledge of data pipelines, data lakes, data warehouses, and data lakehouse architectures.
  • Good knowledge of modern data orchestration and middleware tools such as Apache Kafka, Azure Data Factory, Databricks, Airflow, and experience in managing data flow between operational, analytical, and AI environments.
  • Working knowledge of data security, data protection and data quality management including implementation of encryption, RBAC, masking, and alignment with regulatory frameworks such as PDPA and internal data classification policies.
  • Proven experience integrating data systems with AI/ML workflows, including model training, serving, monitoring, and enabling context-aware access for LLMs and agent-based automation.
  • Effective collaboration skills to work across data, platform, machine learning engineering and API integration teams, with a clear communication style to bridge business and technical stakeholders
  • Good internal (IT, Networks, business) and external (suppliers, government) stakeholder management skills
  • Strong technical writing and presentation skills, with the ability to communicate complex concepts clearly to both technical and non-technical stakeholders.
  • Proactive and fast learner with a strong drive to stay current on emerging technologies and industry trends.

Rewards that Go Beyond

  • Full suite of health and wellness benefits
  • Ongoing training and development programs
  • Internal mobility opportunities

Your Career Growth Starts Here. Apply Now!

We are committed to a safe and healthy environment for our employees & customers and will require all prospective employees to be fully vaccinated.



AI Data Integration Specialist

Singapore, Singapore beBeeMACHINELEARNING

Posted today


Job Description

Job Overview

We are seeking a highly skilled Machine Learning Integration Specialist to join our team in Singapore. The ideal candidate will have a unique blend of technical expertise and business acumen, enabling them to drive strategic solutions and deliver innovative outcomes.

The successful candidate will be responsible for integrating machine learning technologies into our clients' systems, driving business growth through data-driven insights and analytics.

This role requires strong analytical and problem-solving skills, with the ability to communicate complex ideas effectively to both technical and non-technical stakeholders.

We are looking for a self-motivated individual who is passionate about leveraging technology to drive business success.

Key Responsibilities
  • Design and implement machine learning integration strategies that meet client needs and drive business outcomes
  • Collaborate with cross-functional teams to develop and deploy machine learning solutions
  • Analyze data and provide insights to inform business decisions
  • Stay up-to-date with emerging trends and technologies in machine learning and AI
Requirements
  • 3+ years of experience in machine learning integration or a related field
  • Strong technical skills in machine learning, data analysis, and software development
  • Excellent communication and collaboration skills
  • Bachelor's degree in Computer Science, Engineering, or a related field
What We Offer

We offer a competitive salary and benefits package, as well as opportunities for professional growth and development. If you are passionate about machine learning and want to make a meaningful impact, we encourage you to apply.


Chief Data Integration Specialist

Singapore, Singapore beBeeData

Posted today


Job Description

Job Title: Data Architect

About the Role:

The data architect plays a vital role in designing and implementing data infrastructure to support analytics and data science initiatives. This position is responsible for developing and optimizing data pipelines, ensuring data quality and accessibility, and collaborating with data scientists and analysts to enable efficient decision-making.

Key Responsibilities:
  • Data Pipeline Development: Design and implement efficient ETL processes to integrate data from various sources. Optimize existing pipelines for improved performance and scalability.
  • Data Architecture Management: Develop and maintain the data architecture, ensuring it meets the needs of the team. Implement data modeling techniques to optimize data storage and retrieval.
  • Data Quality Assurance: Implement data quality checks and monitoring systems to ensure the accuracy and reliability of data used in analytics and reporting. Develop and maintain data documentation and metadata.
  • Big Data Technologies: Utilize big data technologies to process and analyze large volumes of customer data efficiently. Implement solutions for real-time data processing when required.
  • Infrastructure Optimization: Continuously assess and optimize the data infrastructure to improve performance, reduce costs, and enhance scalability. Implement automation solutions to streamline data processes.
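The data quality checks and monitoring mentioned above can be sketched as rule-based validation run before data is published. The rules and field names below are assumptions for illustration, not the employer's actual checks:

```python
# Hedged sketch of pipeline data-quality checks: each rule flags the
# offending row index and a reason, for use in monitoring or quarantine.
def check_quality(rows):
    issues = []
    for i, row in enumerate(rows):
        if row.get("customer_id") is None:
            issues.append((i, "missing customer_id"))
        if not (0 <= row.get("age", -1) <= 120):
            issues.append((i, "age out of range"))
    return issues

issues = check_quality([
    {"customer_id": "C1", "age": 34},
    {"customer_id": None, "age": 29},
    {"customer_id": "C3", "age": 180},
])
```

A pipeline would typically fail, quarantine, or alert based on the issue list rather than silently loading bad rows.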
Qualifications

Education Level:
  • Bachelor's degree in Computer Science, Information Systems, or a related field. Master's degree in a relevant field is preferred.

Required Experience and Knowledge:
  • 3-5 years of experience in data engineering or a related field.
  • Strong knowledge of data warehouse concepts, ETL processes, and data modeling techniques.
  • Experience with cloud-based data platforms.
  • Proficiency in SQL and experience with NoSQL databases.
  • Experience with big data technologies such as Hadoop, Spark, or Kafka.
Job-Specific Technical Skills:
  • Proficiency in Python or Scala for data processing and automation.
  • Experience with ETL tools.
  • Knowledge of data visualization tools to support data quality checks and pipeline monitoring.
  • Familiarity with version control systems and CI/CD practices.
  • Experience with container technologies and orchestration tools.
  • Understanding of data security best practices and implementation.
Behavioural Skills:
  • Strong problem-solving and analytical skills.
  • Excellent communication abilities to collaborate with technical and non-technical team members.
  • Proactive approach to identifying and resolving data-related issues.
  • Ability to manage multiple projects and priorities effectively.
  • Detail-oriented with a focus on data quality and system reliability.
  • Adaptability to work with evolving technologies and changing business requirements.
  • Strong teamwork skills and ability to work in a collaborative environment.

Business Data Integration Specialist

Singapore, Singapore beBeeDataIntegrator

Posted today


Job Description

Key Responsibilities:

  • We install and maintain PowerCenter software, a robust data integration platform.
  • We develop requirements at all levels, including source-to-target mapping specifications that ensure seamless data transfer.
  • We design and implement job schedules to integrate systems in an efficient manner.
  • We are accountable for the accuracy and integrity of our work products during testing phases.
  • We guarantee timely delivery of all documentation, designs, builds, tests, and deployments according to project plans.
  • We support production activities, such as environment migrations, continuity testing, and maintenance.

Requirements:

  • We require proficiency in Informatica PowerCenter 10.x or higher, preferably in a Unix (Linux/AIX) environment with AS400/Oracle sources and targets.
  • We need experience in Oracle 19c or higher (SQL*Plus, PL/SQL), as well as Unix Shell Scripting (Linux/AIX).
  • We expect candidates to have experience in analyzing and producing technical mapping design specifications.
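A source-to-target mapping specification of the kind the requirements describe pairs each source column with its target column and transformation. The toy sketch below shows the idea as data plus a tiny applier; real mappings would live in Informatica PowerCenter, and these column names are invented:

```python
# Illustrative source-to-target mapping spec: (source_col, target_col, transform).
# Column names are hypothetical, not from any actual system.
MAPPING = [
    ("CUSTNO",  "customer_id", str.strip),
    ("CUSTNAM", "full_name",   str.title),
]

def apply_mapping(source_row):
    """Produce a target row by applying each mapping rule to the source row."""
    return {tgt: fn(source_row[src]) for src, tgt, fn in MAPPING}

row = apply_mapping({"CUSTNO": " 0042 ", "CUSTNAM": "tan ah kow"})
```

Keeping the mapping as declarative data rather than code is what makes specifications like this reviewable by analysts before the build phase.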

Senior Data Integration Specialist

Singapore, Singapore beBeeDataIntegration

Posted today


Job Description

Job Summary:

We are seeking a highly skilled Data Integration Specialist to join our team. The successful candidate will be responsible for the design, implementation and maintenance of data integration pipelines and architectures.

Key Responsibilities:

  • Design and develop scalable data models and schemas using various data modelling best practices.
  • Collaborate with cross-functional teams to deploy and deliver software products, actively participating in product enhancement.
  • Take ownership of key software components, developing and maintaining them according to industry standards.
  • Develop data transformation routines to clean, normalize and aggregate data, applying data processing techniques as needed.
  • Implement event-driven processing pipelines using frameworks like Solace PubSub+, Apache Kafka and AMQP.
  • Ensure compliance with data governance policies and standards, maintaining data quality, integrity, security and consistency.
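The clean/normalize/aggregate transformation routine described in the responsibilities can be sketched as three explicit stages. The field names and rules below are assumptions chosen for illustration:

```python
# Hedged sketch of a data transformation routine: clean, normalize, aggregate.
# Field names ("ccy", "amount") are invented for the example.
def transform(rows):
    # Clean: drop rows with missing amounts
    cleaned = [r for r in rows if r.get("amount") is not None]
    # Normalize: uppercase currency codes, cast amounts to float
    normalized = [
        {"ccy": r["ccy"].upper(), "amount": float(r["amount"])} for r in cleaned
    ]
    # Aggregate: sum amounts per currency
    totals = {}
    for r in normalized:
        totals[r["ccy"]] = totals.get(r["ccy"], 0.0) + r["amount"]
    return totals

totals = transform([
    {"ccy": "sgd", "amount": "10.50"},
    {"ccy": "SGD", "amount": 4.5},
    {"ccy": "usd", "amount": None},
])
```

Separating the stages keeps each rule individually testable, which matters once the routine sits inside an event-driven pipeline.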

Requirements:

  • Bachelor's/Master's degree in Computer Science, Engineering or related field, or relevant experience.
  • Proficiency in at least one object-oriented language, Java skills highly desirable.
  • Deep understanding of designing and implementing data lifecycle management.
  • Experience working in a dynamic environment managing senior stakeholders from different organizations.

Benefits:

  • 5-day work week

About Us:

Maestro HR is an innovative human resources company dedicated to improving workforce performance.


Chief Data Integration Specialist

Singapore, Singapore beBeeSeniorDataEngineer

Posted today


Job Description

Job Summary:

We are seeking an experienced Senior Data Engineer to lead the design, development, and deployment of scalable ETL processes across diverse enterprise environments.

The ideal candidate will have extensive hands-on experience with Talend, SSIS, Snowflake, and Data Vault methodologies, along with a proven track record in managing large-scale migration and data warehousing projects.


Key Responsibilities:

  • Lead the design, development, and deployment of scalable ETL processes using Talend, SSIS, and Snowflake for multi-source data ingestion and transformation.
  • Architect and implement enterprise data models (dimensional, logical, and physical) using ERwin and MySQL Workbench, incorporating Data Vault 2.0 practices.
  • Collaborate with business analysts to gather requirements, define technical specifications, and create robust data mapping frameworks.
  • Oversee migration initiatives from legacy SSIS-based systems to Talend Cloud/Teradata, enhancing orchestration flows, automation, and CI/CD pipelines.
  • Optimize SQL scripts, perform query tuning, and manage multi-layer data architectures to improve performance and reliability.
  • Maintain version control (TFS, Git), troubleshoot production issues, and ensure ongoing support for existing BI and ETL solutions.

Requirements:

  • Bachelor's or Master's degree in IT, Computer Science, or related field.
  • 10+ years of experience in ETL development, data integration, and data modeling.
  • Proficiency in Talend (v7.3 & v8.0), SSIS, Snowflake, SQL Server, and MariaDB.
  • Strong expertise in ERwin Data Modeler, Data Vault 2.0, and performance optimization.
  • Experience with Azure DevOps, Git, TFS, and CI/CD pipelines.
  • Excellent problem-solving, analytical, and communication skills.
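Data Vault 2.0, named in the requirements, identifies hub records by a hash of the normalized business key rather than a sequence number. The sketch below shows one common way to derive such a key; the delimiter and casing rules vary by implementation and are assumptions here:

```python
# Hedged sketch of a Data Vault 2.0 style hub hash key: normalize the
# business key parts, join with a delimiter, hash deterministically.
import hashlib

def hub_hash_key(*business_key_parts, delimiter="||"):
    normalized = delimiter.join(p.strip().upper() for p in business_key_parts)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# Because keys are normalized before hashing, trivially different
# representations of the same business key hash identically.
k1 = hub_hash_key("cust-001 ")
k2 = hub_hash_key("CUST-001")
```

Deterministic keys let independently loaded sources (e.g., Talend and SSIS pipelines) converge on the same hub rows without coordinating a sequence generator.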