148 ETL Architect jobs in Singapore

Data Integration Specialist

Singapore, Singapore beBeeDataIntegration

Posted today

Job Description

Job Summary:

We are seeking a highly skilled Data Integration Specialist to join our team. This role will involve designing, developing and maintaining ETL processes for extracting, transforming and loading data from multiple sources into target systems.

Key Responsibilities:

  • Develop and maintain ETL pipelines using Informatica, Talend, SSIS or other ETL tools.
  • Work closely with business analysts and data architects to translate business requirements into technical solutions.
  • Write and optimize SQL queries for complex data transformations and validations.
  • Conduct unit testing, system testing and integration testing of ETL workflows to ensure data accuracy and completeness (a reconciliation sketch follows this list).
  • Create and maintain technical documentation including data flow diagrams, mapping documents and transformation rules.
  • Identify and resolve performance bottlenecks in ETL processes.
  • Collaborate with QA teams to prepare and execute ETL test cases, track defects and verify fixes.
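
The SQL-testing and reconciliation duties above lend themselves to a small illustration. Below is a minimal, hedged sketch of a source-vs-target reconciliation check in Python with sqlite3; the table and column names (src_orders, tgt_orders, amount) are hypothetical, not from the posting.

```python
# Minimal ETL reconciliation sketch: compare row counts and a column
# checksum between a source and a target table. Table/column names
# ("src_orders", "tgt_orders", "amount") are hypothetical.
import sqlite3

def reconcile(conn: sqlite3.Connection, source: str, target: str, measure: str) -> bool:
    """Return True when row counts and the SUM() of a numeric column match."""
    cur = conn.cursor()
    checks = []
    for table in (source, target):
        cur.execute(f"SELECT COUNT(*), COALESCE(SUM({measure}), 0) FROM {table}")
        checks.append(cur.fetchone())
    (src_rows, src_sum), (tgt_rows, tgt_sum) = checks
    if src_rows != tgt_rows:
        print(f"Row count mismatch: {src_rows} vs {tgt_rows}")
    if src_sum != tgt_sum:
        print(f"Checksum mismatch on {measure}: {src_sum} vs {tgt_sum}")
    return src_rows == tgt_rows and src_sum == tgt_sum

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE src_orders (id INTEGER, amount REAL);
        CREATE TABLE tgt_orders (id INTEGER, amount REAL);
        INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5);
        INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.5);
    """)
    assert reconcile(conn, "src_orders", "tgt_orders", "amount")
```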

Requirements:

  • Bachelor's degree in Computer Science, Information Technology or related field.
  • 5+ years of experience in ETL development using tools like Informatica, Talend, SSIS or Pentaho.
  • Strong SQL skills for data manipulation, validation and performance tuning.
  • Experience in ETL testing including data reconciliation and regression testing.
  • Good understanding of data warehousing concepts, dimensional modeling and ETL best practices.
  • Familiarity with defect tracking tools, e.g., JIRA, HP ALM.

Preferred:

  • Experience with cloud-based ETL platforms, e.g., AWS Glue, Azure Data Factory, Google Dataflow, Snowflake.
  • Exposure to automation testing tools for ETL validation, e.g., QuerySurge or Python-based scripts.
  • Knowledge of big data tools (Hadoop, Spark) is a plus.

Skills: Technical Documentation, Data Factory, Big Data, Pipelines, Unit Testing, Hadoop, Informatica, ETL, Data Integration, Information Technology, Test Cases, SQL, Performance Tuning, SSIS, Data Warehousing, Business Requirements

Data Integration Specialist

Singapore, Singapore beBeeKeyword

Posted today

Job Description

Job Overview:

We are seeking a skilled Informatica PowerCenter software professional to join our team.

As an ODI System Analyst, you will be responsible for installing, configuring, and maintaining the software, as well as translating requirements and developing job schedules to integrate upstream and downstream systems.

The ideal candidate will have experience with Oracle 19c or higher, Unix Shell Scripting, and technical mapping design specifications.

Key Responsibilities:

  • Install and configure Informatica PowerCenter software
  • Translate requirements from high to low level, including source to target mapping
  • Develop job schedules to integrate upstream and downstream systems
  • Ensure data accuracy and integrity during testing
  • Deliver documentation, design, build, testing, and deployment according to work breakdown structures
  • Support production activities, including migrations and continuity testing

Required Skills and Qualifications:

  • Proficiency in Informatica PowerCenter 10.x or higher, preferably in a Unix (Linux/AIX) environment with AS400/Oracle sources and targets
  • Experience with Oracle 19c or higher (SQL*Plus, PL/SQL)
  • Unix Shell Scripting skills
  • Technical mapping design specifications experience

To Apply:

Interested candidates are invited to submit their applications.

Data Integration Architect (API)

Singapore, Singapore Randstad Singapore

Posted today

Job Description

About the company

I am currently working with a well-known telecommunications company in Singapore. The role is in-office 5 days a week, with 2 rounds of interviews.

About the job (please reach out for the full JD)
  • Design and build secure, reusable APIs (REST, GraphQL, event-driven) for AI agent and application integration (a minimal API sketch follows this list).
  • Architect end-to-end system and data integrations across hybrid cloud and on-premise environments.
  • Develop scalable, secure data pipelines and frameworks for AI/ML platforms.
  • Enable real-time, context-rich data access for LLMs and agent-based workflows.
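
As an illustration of the reusable-API duty above, here is a minimal read-only endpoint sketch. It assumes FastAPI purely for illustration (the posting names no framework), and the /customers/{customer_id} resource and its fields are hypothetical.

```python
# Minimal sketch of a reusable read API for agent integration, using
# FastAPI (an assumption; the posting names no framework). The
# /customers/{customer_id} resource and its fields are hypothetical.
from fastapi import FastAPI, HTTPException

app = FastAPI(title="integration-api")

# Stand-in for a governed data store behind the API.
CUSTOMERS = {"c1": {"id": "c1", "segment": "retail", "region": "SG"}}

@app.get("/customers/{customer_id}")
def get_customer(customer_id: str) -> dict:
    """Serve a single customer record to downstream consumers."""
    record = CUSTOMERS.get(customer_id)
    if record is None:
        raise HTTPException(status_code=404, detail="customer not found")
    return record

# Run with: uvicorn app:app --reload  (uvicorn assumed installed)
```
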
Skills and requirements

  • Bachelor's degree in Computer Science or related field.
  • Minimum 3 years of experience in data architecture, system, and API integration engineering.
  • Experience designing integration flows for large-scale, real-time systems across cloud and legacy environments.
  • Proficient in data pipelines, lakes, warehouses, and lakehouse architectures; APIs (REST, SOAP, etc.).
  • Skilled with orchestration and middleware tools (Kafka, Azure Data Factory, Databricks, Airflow).
  • Knowledge of data security, governance, and compliance (encryption, RBAC, masking, PDPA).
To apply online, please use the 'apply' function. Alternatively, contact Stella at 96554170 (EA: 94C3609 / R1875382).
Desired skills and experience: API, RESTful, REST, SOAP, Data, Integration, Cloud.
Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Information Technology
Industries: Data Security Software Products

Senior Data & Integration Architect

Singapore, Singapore Singtel Group

Posted today

Job Description

An empowering career at Singtel begins with a Hello. Our purpose, to Empower Every Generation, connects people to the possibilities they need to excel. Every "hello" at Singtel opens doors to new initiatives, growth, and BIG possibilities that take your career to new heights. So, when you say hello to us, you are really empowered to say "Hello BIG Possibilities".
Be a Part of Something BIG!
The Senior Data & Integration Architect plays a critical role in designing and implementing the data and integration layer of Singtel’s next-generation AI platform. This role focuses on orchestrating secure, scalable, and interoperable data architectures and API integration frameworks that support advanced AI and ML workloads across the enterprise.
You will work closely with the AI architecture, security, platform, and business teams to ensure seamless data movement, governance alignment, and real-time system interoperability, serving as the bridge between distributed data systems, cloud services, and AI-enabled applications.
Make An Impact By

  • Design, build and implement enterprise-wide data and API integration frameworks to support AI/ML platforms across hybrid cloud and on-premise environments
  • Work with system owners and data domain leads to design and deliver scalable end-to-end data flows across operational, analytical, and AI systems
  • Define and develop secure, reusable API interfaces (REST, GraphQL, event-driven) and data interfaces (batch or streaming) that enable seamless interoperability between internal systems and AI services
  • Oversee and evaluate new data integration approaches and pipeline designs to ensure efficient, secure, and scalable data flow between data sources and AI platforms
  • Collaborate with Security and Data Governance teams to ensure integration designs align with compliance, privacy, and policy requirements (e.g., PDPA, data classification); a masking sketch follows this list
  • Design and enable data access strategies for LLMs and agent-based workflows, ensuring context-rich, real-time connectivity to distributed enterprise systems
  • Implement and maintain integration middleware and tooling (e.g., Kafka, Azure ML/Foundry, Databricks) to support data orchestration, synchronization, and reliability
  • Contribute integration expertise to data or AI experimentation, PoCs, and platform upgrades, ensuring architectural consistency and production-readiness
  • Define and enforce data and integration design standards, focusing on scalability, resilience, observability, and system decoupling
  • Work closely with business units, IT, and Networks to align integration plans with enterprise priorities and ensure successful data exchange across functional boundaries
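
The governance and LLM-access duties above are the kind a masking layer addresses. The sketch below is illustrative only and not Singtel's implementation: it tokenizes assumed-sensitive fields before a record is exposed to an LLM or agent workflow; the field names are hypothetical.

```python
# Illustrative sketch (not Singtel's implementation): masking PII fields
# before records are exposed to an LLM or agent workflow, in line with
# the masking/classification duties above. Field names are hypothetical.
import hashlib

SENSITIVE_FIELDS = {"nric", "email", "phone"}  # assumed classification

def mask_record(record: dict) -> dict:
    """Replace sensitive values with a stable, non-reversible token."""
    masked = {}
    for key, value in record.items():
        if key in SENSITIVE_FIELDS:
            digest = hashlib.sha256(str(value).encode()).hexdigest()[:12]
            masked[key] = f"tok_{digest}"
        else:
            masked[key] = value
    return masked

print(mask_record({"name": "A. Tan", "nric": "S1234567D", "plan": "5G"}))
```
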
Skills to Succeed
  • Bachelor's in Computer Science, Engineering, Data, AI/ML, or related field.
  • At least 3 years of experience in data architecture, system and API integration engineering.
  • Demonstrated experience in designing integration flows for large-scale, real-time systems across cloud and legacy environments.
  • Experience in designing and implementing data integration frameworks across hybrid cloud and on-premise environments, including building scalable and secure data pipelines for AI/ML platforms.
  • Proficient in data integration design, with solid knowledge of data pipelines, data lakes, data warehouses, and data lakehouse architectures.
  • Good knowledge of modern data orchestration and middleware tools such as Apache Kafka, Azure Data Factory, Databricks and Airflow, and experience in managing data flow between operational, analytical, and AI environments.
  • Working knowledge of data security, data protection and data quality management, including implementation of encryption, RBAC, masking, and alignment with regulatory frameworks such as PDPA and internal data classification policies.
  • Proven experience integrating data systems with AI/ML workflows, including model training, serving, monitoring, and enabling context-aware access for LLMs and agent-based automation.
  • Effective collaboration skills to work across data, platform, machine learning engineering and API integration teams, with a clear communication style to bridge business and technical stakeholders.
  • Good internal (IT, Networks, business) and external (suppliers, government) stakeholder management skills.
  • Strong technical writing and presentation skills, with the ability to communicate complex concepts clearly to both technical and non-technical stakeholders.
  • Proactive and fast learner with a strong drive to stay current on emerging technologies and industry trends.
Rewards that Go Beyond

  • Full suite of health and wellness benefits
  • Ongoing training and development programs
  • Internal mobility opportunities

Your Career Growth Starts Here. Apply Now!
We are committed to a safe and healthy environment for our employees & customers and will require all prospective employees to be fully vaccinated.

Data Integration Architect (API)

048616 Raffles Place, Singapore $13,000 Monthly RANDSTAD PTE. LIMITED

Posted 11 days ago

Job Description

about company

I am currently working with a well-known telecommunications company in Singapore.

5 days in office. 2 rounds of interviews.

about job (please reach out for the whole JD)

  • Design and build secure, reusable APIs (REST, GraphQL, event-driven) for AI agent and application integration.
  • Architect end-to-end system and data integrations across hybrid cloud and on-premise environments.
  • Develop scalable, secure data pipelines and frameworks for AI/ML platforms.
  • Enable real-time, context-rich data access for LLMs and agent-based workflows.

skills and requirements

  • Bachelor's in Computer Science or related field.
  • Minimum 3 years in data architecture, system and API integration engineering.
  • Demonstrated experience in designing integration flows for large-scale, real-time systems across cloud and legacy environments.
  • Proficient in data pipelines, lakes, warehouses, and lakehouse architectures; APIs (REST, SOAP, etc.).
  • Skilled with orchestration and middleware tools (Kafka, Azure Data Factory, Databricks, Airflow); a short event-publishing sketch follows this list.
  • Knowledgeable in data security, governance, and compliance (encryption, RBAC, masking, PDPA).
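
As a small illustration of the middleware skills above, the sketch below publishes an integration event with kafka-python (an assumed client library; the posting only names Kafka). The broker address, topic, and payload are hypothetical.

```python
# Hedged sketch of event-driven integration using kafka-python
# (an assumption; the posting only names Kafka as a tool).
# Topic name and payload shape are hypothetical.
import json
from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # assumed broker address
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish a context event that a downstream AI agent could consume.
event = {"entity": "customer", "id": "c1", "change": "profile_updated"}
producer.send("integration.events", event)
producer.flush()
```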

To apply online, please use the 'apply' function. Alternatively, you may contact Stella at 96554170 (EA: 94C3609 / R1875382).


Data Integration Engineer, Data Science

Singapore, Singapore QUINNOX SOLUTIONS PTE. LTD.

Posted today

Job Description

Roles & Responsibilities

The Job:

To design, build, and maintain secure, scalable, and high-performance data pipelines for our next-generation Air Traffic Management (ATM) platform.

Work with modern cloud and data technologies, enabling real-time data-driven decision-making for safer and more efficient airspace operations.

While not mandatory, familiarity with AI/Agentic AI concepts and aviation data is a plus.

The Role:

  • Design and develop scalable batch and streaming pipelines using modern tools like AWS, Databricks, etc. (a minimal streaming sketch follows this list).
  • Implement reliable data processing, transformation, and storage with a focus on performance, security, and governance.
  • Collaborate with data scientists, engineers, and domain experts to ensure the platform supports advanced analytics and AI-readiness.
  • Optimize pipelines for resilience, observability, and cost-effectiveness using monitoring and automation tools.
  • Maintain clear documentation and contribute to continuous improvements in data architecture.
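
As a sketch of the batch/streaming duty above: a minimal "bronze" streaming ingest in PySpark Structured Streaming with a Delta sink, in the medallion style named in the requirements. It assumes a Delta-enabled Spark session (e.g., Databricks); the schema and paths are hypothetical.

```python
# Minimal streaming "bronze" ingest in the medallion style the
# requirements mention, using PySpark Structured Streaming with a Delta
# sink. Assumes a Databricks/delta-enabled Spark session; paths and
# schema are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("atm-bronze-ingest").getOrCreate()

schema = StructType([
    StructField("flight_id", StringType()),
    StructField("ts", StringType()),
    StructField("altitude_ft", DoubleType()),
])

raw = (spark.readStream
       .format("json")
       .schema(schema)
       .load("/landing/positions/"))            # hypothetical landing zone

query = (raw.writeStream
         .format("delta")                       # requires delta-spark on the cluster
         .option("checkpointLocation", "/chk/positions/")
         .outputMode("append")
         .start("/bronze/positions/"))
```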

The Requirements:

  • Strong hands-on experience with AWS cloud services, particularly for data engineering.
  • Proficiency in Python or Scala, with practical knowledge of Spark and distributed data processing.
  • Experience building and maintaining scalable data pipelines (batch and real-time).
  • Solid knowledge of SQL, data modelling, and transformation techniques.
  • Familiarity with data security, governance, and compliance best practices.
  • Strong problem-solving, analytical, and communication skills.
  • Experience with AWS Databricks, Delta Lake, and medallion architecture.
  • Exposure to AI/Gen AI concepts or intelligent agent-based architectures.
  • Familiarity with aviation data standards (e.g., ADS-B, ARINC 424, flight schedules) or willingness to learn.
  • AWS Certifications in Data Analytics, Big Data, or Machine Learning.
  • Experience with real-time data processing and high-volume systems.

To Apply:

Please send your latest CV in Word format to

Kindly indicate your availability, current and expected remuneration package.

We regret that only shortlisted candidates will be notified.

Quinnox Solutions Pte. Ltd. (License Number: 06C3126)

Registered EA Personnel (Reg. No.:R11100)

Skills: Machine Learning, Security Governance, Aviation, Air Traffic Management, Scala, Big Data, Pipelines, Automation Tools, Data Engineering, SQL, Python, Data Architecture, Communication Skills, Cloud Services, Data Science, Data Analytics

Data Integration Engineer, Data Science

$7,000 Monthly QUINNOX SOLUTIONS PTE. LTD.

Posted 15 days ago

Job Description

The Job:

To design, build, and maintain secure, scalable, and high-performance data pipelines for our next-generation Air Traffic Management (ATM) platform.

Work with modern cloud and data technologies, enabling real-time data-driven decision-making for safer and more efficient airspace operations.

While not mandatory, familiarity with AI/Agentic AI concepts and aviation data is a plus.

The Role:

  • Design and develop scalable batch and streaming pipelines using modern tools like AWS, Databricks, etc.
  • Implement reliable data processing, transformation, and storage with a focus on performance, security, and governance (a batch aggregation sketch follows this list).
  • Collaborate with data scientists, engineers, and domain experts to ensure the platform supports advanced analytics and AI-readiness.
  • Optimize pipelines for resilience, observability, and cost-effectiveness using monitoring and automation tools.
  • Maintain clear documentation and contribute to continuous improvements in data architecture.
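
A companion illustration for the processing/transformation bullet above: a batch "silver" aggregation in PySpark. Illustrative only; the Delta paths, table layout, and columns are hypothetical.

```python
# Illustrative batch "silver" aggregation in PySpark (table names,
# columns, and paths are hypothetical, not from the posting).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("atm-silver-agg").getOrCreate()

bronze = spark.read.format("delta").load("/bronze/positions/")  # hypothetical path

# Clean, then aggregate positions to one row per flight per minute.
silver = (bronze
          .dropna(subset=["flight_id", "ts"])
          .withColumn("minute", F.date_trunc("minute", F.to_timestamp("ts")))
          .groupBy("flight_id", "minute")
          .agg(F.avg("altitude_ft").alias("avg_altitude_ft")))

silver.write.format("delta").mode("overwrite").save("/silver/positions_by_minute/")
```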

The Requirements:

  • Strong hands-on experience with AWS cloud services, particularly for data engineering.
  • Proficiency in Python or Scala, with practical knowledge of Spark and distributed data processing.
  • Experience building and maintaining scalable data pipelines (batch and real-time).
  • Solid knowledge of SQL, data modelling, and transformation techniques.
  • Familiarity with data security, governance, and compliance best practices.
  • Strong problem-solving, analytical, and communication skills.
  • Experience with AWS Databricks, Delta Lake, and medallion architecture.
  • Exposure to AI/Gen AI concepts or intelligent agent-based architectures.
  • Familiarity with aviation data standards (e.g., ADS-B, ARINC 424, flight schedules) or willingness to learn.
  • AWS Certifications in Data Analytics, Big Data, or Machine Learning.
  • Experience with real-time data processing and high-volume systems.


To Apply:

Please send your latest CV in Word format to

Kindly indicate your availability, current and expected remuneration package.

We regret that only shortlisted candidates will be notified.


Quinnox Solutions Pte. Ltd. (License Number: 06C3126)

Registered EA Personnel (Reg. No.:R11100)


Data Integration Specialist - Core Data

Singapore, Singapore Qube Research & Technologies Limited

Posted today

Job Description

Qube Research & Technologies (QRT) is a global quantitative and systematic investment manager, operating in all liquid asset classes across the world. We are a technology and data driven group implementing a scientific approach to investing. Combining data, research, technology, and trading expertise has shaped our collaborative mindset, which enables us to solve the most complex challenges. QRT’s culture of innovation continuously drives our ambition to deliver high quality returns for our investors.
Your responsibilities will include:

Your core objective is to manage the many extensive datasets used in the QRT research and trading platform.

  • Contributing to the design and implementation of the QRT data store and its API.
  • Monitoring fetching processes and data health (a simple freshness check follows this list).
  • Integrating alternative datasets.
  • Supporting users (Quants & Traders).
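
As a small illustration of the data-health duty above, here is a hedged freshness check in Python with sqlite3; the prices table and loaded_at column are hypothetical, not QRT's schema.

```python
# Hedged sketch of a dataset freshness check of the kind described above
# (monitoring fetch health); the table and column are hypothetical.
import sqlite3
from datetime import datetime, timedelta, timezone

def is_stale(conn: sqlite3.Connection, table: str, max_age_hours: int = 24) -> bool:
    """Flag a dataset whose newest row is older than the allowed age."""
    (latest,) = conn.execute(f"SELECT MAX(loaded_at) FROM {table}").fetchone()
    if latest is None:
        return True  # empty table counts as unhealthy
    age = datetime.now(timezone.utc) - datetime.fromisoformat(latest)
    return age > timedelta(hours=max_age_hours)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prices (symbol TEXT, loaded_at TEXT)")
conn.execute("INSERT INTO prices VALUES ('ABC', ?)",
             (datetime.now(timezone.utc).isoformat(),))
print(is_stale(conn, "prices"))  # False: data loaded just now
```
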
Requirements:

  • 2+ years' experience in a related field.
  • Strong understanding of and experience with Python.
  • Familiarity with SQL and relational databases.
  • Ability to communicate and understand user needs.
  • Intellectual curiosity to learn new technologies.
  • Capacity to work with autonomy within a global team.
Good-to-have:

  • Structured and unstructured data management expertise.
  • Knowledge of the financial data of equities/derivatives.
  • Experience with market data.
QRT is an equal opportunity employer. We welcome diversity as essential to our success. QRT empowers employees to work openly and respectfully to achieve collective success. In addition to supporting professional achievement, we offer initiatives and programs that enable employees to achieve a healthy work-life balance.

Senior Data Integration Specialist

Singapore, Singapore beBeeDataIntegration

Posted today

Job Description

Job Title: Data Integration Engineer, Data Science (Reimagined)

The Role:

We seek a seasoned Data Integration Engineer to spearhead the development of cutting-edge data pipelines for our next-generation Air Traffic Management (ATM) platform.

As a key team member, you will work closely with modern cloud and data technologies to empower real-time data-driven decision-making for safer and more efficient airspace operations.

Responsibilities:

  • Design and develop robust batch and streaming pipelines using leading-edge tools like AWS, Databricks, etc.
  • Implement reliable data processing, transformation, and storage with a focus on high-performance, security, and governance.
  • Collaborate with data scientists, engineers, and domain experts to ensure the platform supports advanced analytics and AI-readiness.
  • Optimize pipelines for resilience, observability, and cost-effectiveness using monitoring and automation tools (a retry-and-logging sketch follows this list).
  • Maintain clear documentation and contribute to continuous improvements in data architecture.
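
To illustrate the resilience/observability bullet above, here is a minimal retry-with-logging wrapper in plain Python. No specific tool is named in the posting, so this is an assumption-laden sketch; fetch_batch is a hypothetical pipeline step.

```python
# Illustrative sketch only: a retry-with-logging wrapper of the kind the
# resilience/observability duty implies (no specific tool is named in
# the posting; the fetch function is hypothetical).
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
log = logging.getLogger("pipeline")

def with_retries(fn, attempts: int = 3, backoff_s: float = 2.0):
    """Run a pipeline step, logging and retrying transient failures."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception as exc:
            log.warning("step failed (attempt %d/%d): %s", attempt, attempts, exc)
            if attempt == attempts:
                raise
            time.sleep(backoff_s * attempt)

def fetch_batch():
    return {"rows": 42}  # stand-in for a real extract step

log.info("batch result: %s", with_retries(fetch_batch))
```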

Requirements:

  • Strong hands-on experience with AWS cloud services, particularly for data engineering.
  • Proficiency in Python or Scala, with practical knowledge of Spark and distributed data processing.
  • Experience building and maintaining scalable data pipelines (batch and real-time).
  • Solid knowledge of SQL, data modelling, and transformation techniques.
  • Familiarity with data security, governance, and compliance best practices.
  • Strong problem-solving, analytical, and communication skills.
  • Experience with AWS Databricks, Delta Lake, and medallion architecture.
  • Exposure to AI/Gen AI concepts or intelligent agent-based architectures.
  • Familiarity with aviation data standards (e.g., ADS-B, ARINC 424, flight schedules) or willingness to learn.
  • AWS Certifications in Data Analytics, Big Data, or Machine Learning.
  • Experience with real-time data processing and high-volume systems.

Senior Data Integration Specialist

Singapore, Singapore beBeeDataIntegration

Posted today

Job Description

Job Summary:

We are seeking a highly skilled Data Integration Specialist to join our team. The successful candidate will be responsible for the design, implementation and maintenance of data integration pipelines and architectures.

Key Responsibilities:

  • Design and develop scalable data models and schemas using various data modelling best practices.
  • Collaborate with cross-functional teams to deploy and deliver software products, actively participating in product enhancement.
  • Take ownership of key software components, developing and maintaining them according to industry standards.
  • Develop data transformation routines to clean, normalize and aggregate data, applying data processing techniques as needed (a small normalization sketch follows this list).
  • Implement event-driven processing pipelines using frameworks like Solace PubSub+, Apache Kafka and AMQP.
  • Ensure compliance with data governance policies and standards, ensuring data quality, integrity, security and consistency.
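
As a sketch of the clean/normalize/aggregate routine described above, the snippet below uses pandas (an assumption; the posting names no library). Column names are hypothetical.

```python
# Minimal clean/normalize/aggregate sketch using pandas (an assumption;
# no library is named in the posting). Column names are hypothetical.
import pandas as pd

raw = pd.DataFrame({
    "account": [" A1 ", "a1", "B2", None],
    "amount":  [100.0, 50.0, None, 30.0],
})

cleaned = (raw
           .dropna(subset=["account"])                       # drop unusable rows
           .assign(account=lambda d: d["account"].str.strip().str.upper(),
                   amount=lambda d: d["amount"].fillna(0.0))  # normalize keys, fill gaps
           .groupby("account", as_index=False)["amount"].sum())

print(cleaned)  # one aggregated row per normalized account
```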

Requirements:

  • Bachelor's/Master's degree in Computer Science, Engineering or related field, or relevant experience.
  • Proficiency in at least one object-oriented language; Java skills are highly desirable.
  • Deep understanding of designing and implementing data lifecycle management.
  • Experience working in a dynamic environment managing senior stakeholders from different organizations.

Benefits:

  • 5-day work week

About Us:

Maestro HR is an innovative human resources company dedicated to improving workforce performance.
