185 Data Engineering Manager jobs in Singapore
Data Engineering Manager (Data Governance) - Data Cycling Centre
Posted 16 days ago
Job Description
Data Engineering Manager (Data Governance) - Data Cycling Centre
About Data Cycling Center
AI is the new electricity and data is the energy for AI. Our core belief is that unstructured data contains the untapped wisdom of humanity — and by building the best AI-ready data infrastructure, we enable more meaningful, responsible, and creative uses of AI.
DCC is a Data Science team that is responsible for building the mid-platform data layer that powers AI development across all business lines of TikTok, including e-commerce, advertising, livestreaming, and emerging technologies. Our initiatives include:
- Scaled human-in-the-loop (HITL) data processing (including capability, methodology, E2E process, platform), delivering the most cost-effective results at benchmark quality levels that outperform leading tech peers.
- Built a highly automated end-to-end data pipeline for AI-generated content (AIGC) that supports strong model performance with minimal resource input.
- Developed a comprehensive content understanding and insight generation system, setting new benchmarks for marketing intelligence in the online ads ecosystem.
DCC aims to be the number one data service provider in the AI era. Our mission is to build the most effective (lossless and affordable) content understanding capabilities to fully satisfy the AI needs of TikTok and to help the industry push the limit.
About the role
As a Lead Data Engineer, you will be responsible for building up both the team and the data pipeline. You will work on cutting-edge challenges in the big data and AI industry that require strong passion and a capacity for innovation. You will collaborate closely with cross-functional teams to understand business requirements and translate them into technical solutions. Additionally, you will provide technical guidance, mentorship, and support to junior team members.
Responsibilities
- Architect efficient, scalable and reliable data pipelines and infrastructure for ingesting, processing, and transforming large volumes of data
- Define the technical strategy and roadmap for data engineering projects in alignment with business objectives, actively evaluate and adopt industry best practices and state-of-the-art technical approaches, and update the strategy promptly as the industry evolves
- Build and lead a high-performing data engineering team with a clear strategy, providing business, technical, and personal development coaching
- Own and drive data engineering projects by leveraging both internal and cross-functional resources, setting meaningful and challenging targets, and achieving them with innovative approaches
- Foster a collaborative and inclusive team culture within and across teams, and collaborate closely with data scientists
Why Join Us
- Shape the Future of AI — Work at the core of TikTok’s AI development, where your data powers cutting-edge applications across e-commerce, ads, livestream, and more.
- Massive Scale, Real-World Impact — Process petabytes of unstructured data to influence real products used by over a billion users worldwide.
- Global Team, Local Leadership — Join a diverse, high-performing team spread across Singapore, New York, San Jose, Shanghai, and more.
- Build What Others Benchmark Against — Our innovations in HITL systems, AIGC pipelines, and content understanding set the industry standard.
- From Data to Strategy — We turn data into actionable insight that shapes product and business strategy across all TikTok verticals.
- Dynamic, Fast-Paced, Fun — Experience the energy of a high-impact team that moves fast and never stops learning.
Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or related field.
- At least 5 years of experience in data engineering with proven expertise in building scalable data pipelines and infrastructure.
- Experience with big data technologies such as Apache Spark, Hadoop, and Kafka; proficiency in Python, SQL, and Java.
- Strong understanding of database systems, data warehousing, and distributed computing concepts.
- Proven experience in team management with a track record of leading teams and delivering complex data engineering projects.
- Excellent communication, analytical, and collaboration skills; independent thinking and ability to prioritise core business needs.
- Ability to thrive in a fast-paced, dynamic environment and manage multiple priorities.
About TikTok
TikTok is the leading destination for short-form mobile video. Our mission is to inspire creativity and bring joy. Our global headquarters are in Los Angeles and Singapore, with offices worldwide.
Diversity & Inclusion
TikTok is committed to creating an inclusive space where employees are valued for their skills, experiences, and perspectives. We celebrate diverse voices and strive to create an environment that reflects the communities we reach.
Data Architecture Specialist
Posted today
Job Description
We are seeking a skilled Data Architecture Specialist to join our team. As a key member of our data engineering group, you will be responsible for designing and implementing scalable and robust data pipelines and ETL processes to ingest, parse, process, and store structured and unstructured data from various sources.
Key Responsibilities:
- Design and develop data architectures to support efficient querying and reporting using data platforms such as Snowflake, Redshift, MySQL, MongoDB, and S3.
- Implement change data capture (CDC) processes capable of capturing changes made to source data and propagating them downstream to other systems or applications.
- Collaborate with data scientists and business stakeholders to understand their data requirements and ensure data integrity, accuracy, and timeliness.
- Monitor and troubleshoot data pipeline performance to ensure optimal data flow and minimal downtime.
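The change data capture responsibility above can be illustrated with a minimal sketch. This is a hypothetical example, not part of the role description: it diffs two snapshots of a source table keyed by `id` and emits insert/update/delete events for downstream consumers. (Production CDC systems typically read a database's transaction log instead of diffing snapshots.)

```python
# Minimal change-data-capture sketch: diff two snapshots of a source table
# (keyed by "id") and emit change events for downstream systems.
# All names here are illustrative, not tied to any specific product.

def capture_changes(old_rows, new_rows):
    """Return a list of (op, row) change events: 'insert', 'update', 'delete'."""
    old = {r["id"]: r for r in old_rows}
    new = {r["id"]: r for r in new_rows}
    events = []
    for rid, row in new.items():
        if rid not in old:
            events.append(("insert", row))
        elif row != old[rid]:
            events.append(("update", row))
    for rid, row in old.items():
        if rid not in new:
            events.append(("delete", row))
    return events

before = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]
after = [{"id": 1, "name": "a2"}, {"id": 3, "name": "c"}]
events = capture_changes(before, after)
```

A downstream consumer would apply these events to keep a replica or warehouse table in sync.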
Requirements:
- Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
- 3-5 years of experience in data engineering or a related field, with a strong background in data architecture and data modeling.
- Proficient in SQL, Python, and/or other programming languages relevant to data engineering.
- Experience with data ingestion, processing, and storage tools and technologies, such as Snowflake, Redshift, MySQL, MongoDB, S3, Airflow, dbt, AWS Glue.
- Strong understanding of data warehousing concepts, ETL processes, and data modeling best practices.
What We Offer:
- A fast-paced and dynamic work environment that fosters collaboration and innovation.
- The opportunity to work on cutting-edge data engineering projects and technologies.
- A competitive compensation package that includes benefits and opportunities for professional growth.
- A collaborative and supportive team environment that encourages open communication and knowledge sharing.
- The opportunity to work on diverse and challenging data engineering projects that require creative problem-solving and technical expertise.
- A comprehensive training program that includes workshops, conferences, and online courses to help you stay up-to-date with the latest data engineering trends and technologies.
Data Architecture Specialist
Posted today
Job Description
Job Title: Data Architecture Specialist
We are seeking a seasoned Data Architecture Specialist to lead the design and implementation of scalable data pipelines and ETL/ELT frameworks for our Enterprise Data Warehouse (EDW) and data marts.
The ideal candidate will have a proven track record in designing and implementing robust data architectures, strong expertise in data warehouse design methodologies and data modeling, and experience with Azure Data Factory.
About the Job
- Analyze business requirements and translate them into scalable technical designs.
- Design, build, and maintain scalable and reliable ETL/ELT data pipelines and processes.
- Build and manage Data Infrastructure for smooth data ingestion and increased reliability.
- Write clean, maintainable, and well-documented code in collaboration with external vendors/offshore developers.
- Proactively identify areas for improvement to enhance the existing data platform.
Requirements
- Proven track record in designing and implementing robust data architectures and ETL/ELT frameworks.
- Strong expertise in data warehouse design methodologies and data modeling.
- Experience with Azure Data Factory is required for this role.
This is an exciting opportunity to join a leading organization and make a significant impact on our data strategy. If you are a highly skilled data professional with a passion for innovation and excellence, we encourage you to apply.
Data Architecture Specialist
Posted today
Job Description
Job Title: Data Architecture Specialist
We are seeking a talented Data Architecture Specialist to join our team. As a key member of the data team, you will be responsible for designing and developing data architectures that meet the company's business needs.
Key Responsibilities:
- Design and develop data pipelines and systems for data modeling, mining, and production
- Develop, construct, test, and maintain data architectures such as databases, data warehouses, and large-scale data processing systems
- Make raw data clean and highly available for use in descriptive and predictive modeling
- Recommend and implement ways to improve data quality, reliability, flexibility, and efficiency
- Ensure data assets and data catalogs are organized and stored in an efficient way
Requirements:
- Min 3 years in data architecture, data warehousing, data processing, data modeling, and ETL/ELT, with familiarity with real-time streaming solutions
- Experience in Kubernetes-based DevOps practices, including container orchestration, CI/CD pipelines, and microservices deployment
- Experience in database development (Oracle SQL/PL/SQL)
- Working experience in an AWS cloud environment, familiar with services such as EC2, S3, EMR, Redshift, Athena, Kinesis
- Programming knowledge in Python, R, and SQL for data cleaning, processing, and aggregation
- Mandarin speaking required
Benefits:
- Hybrid working arrangement
- Office at CBD
- Variable bonus
This is a fantastic opportunity to work on high-impact projects and develop your skills in data architecture and engineering. If you are passionate about data and technology, please apply today!
SAP Data Architecture
Posted today
Job Description
Role – SAP Data Architecture
Job Requisition Number: SA-002
Job Level: 5 – 7 Years of experience
Key Responsibilities
Develop a data analysis and reporting framework for the S/4HANA ecosystem and ensure it is aligned with our enterprise-level data lake and analytics architecture & security guidelines
Assess platforms and tools that are required for the implementation of data analysis & reporting solutions and be able to advise the project team on the merits & demerits of different solution approaches and make solution recommendations
Drive our SAP ecosystem analytics implementation and work with both internal and external teams for the execution to completion
Collaborate closely with both internal & external partners to achieve alignment between people working on the implementation
Assess new reporting/analytics technologies and recommend adoption where it will improve overall usage of data as enabler for business purposes
This is an individual contributor & hands-on role
Requirements
• Degree in Information Technology or related fields
• At least 5 - 7 years of experience in designing and implementing Data Warehouse and SAP Analytics Cloud solutions, including integration with data from SAP Cloud solutions such as SuccessFactors, Ariba, Concur, etc., along with integration to non-SAP systems
• Experience in customizing standard business content in BW to suit the reporting requirements of customized processes in S/4HANA
• Experience in SAP Datasphere & SAC implementation
• Should have worked on at least one full-cycle embedded analytics and enterprise analytics implementation project, with end-to-end experience in requirements gathering, functional analysis, high-level design, build, testing, and deployment (implementation in BW/4HANA would be an added advantage)
• Good understanding of ETL processes and techniques such as SDI and Data Services for extraction of data from S/4HANA and non-SAP databases
• Knowledge of native HANA modelling and BW modelling techniques and tools
• Knowledge of SAP Analytics Cloud (SAC) BI tool, its pre-built visualization contents, integration and connectivity capabilities
• Have knowledge / experience in delivering projects under the Agile framework
• Work independently as well as collaboratively as part of a highly skilled team
• Good problem-solving and communication skills
Please forward your resume in MS Word format to /
Project Manager, Data Architecture
Posted 5 days ago
Job Description
Join to apply for the Project Manager, Data Architecture role at Hyundai Motor Group Innovation Center Singapore (HMGICS)
This position bridges business operations and IT, playing a crucial role in enabling data-driven decision-making and supporting the organization's overall data strategy.
What To Expect
- Connect and collaborate with teams from other regions of the world to share best practices and global standards definitions.
- Interface with several corporate areas and preparation of technical documents, presentations, and reports.
- Providing a high-level, abstract representation of an organization's data requirements
- Design and implement comprehensive data management strategies aligned with business goals
- Apply domain-oriented data architecture (DODA), structuring data around business domains
- Implement data governance policies and data quality control measures
- Develop and maintain enterprise data architecture solutions, including data warehouses and data lakes
- Ensure data security and compliance with relevant regulations
- Create data models and strategies to organize and store data entities efficiently
- Assess existing databases and data architectures for weaknesses
What You'll Bring
- Bachelor's Degree in Computer Science, Information Technology or relevant disciplines
- At least 13 years of experience in data architecture and related technologies
- Proficiency in programming languages such as Python, SQL, Java, and C++
- Deep understanding of database management systems and data modeling techniques
- Experience with Big Data technologies, cloud computing platforms, and data analytics tools
- Strong project management and time management abilities
- Excellent communication skills to collaborate with various stakeholders
- Ability to develop data pipeline processes to move data from OLTP to OLAP systems
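As a sketch of the OLTP-to-OLAP movement mentioned in the last point (illustrative only; the table and field names are assumptions, not from the posting): transactional order rows are rolled up into a daily revenue aggregate of the kind an analytical store would serve.

```python
from collections import defaultdict

# Illustrative OLTP -> OLAP step: roll transactional order rows up into a
# daily revenue fact table, the kind of aggregate an analytical store serves.

def build_daily_fact(orders):
    """Aggregate (order_date, amount) rows into {order_date: total_amount}."""
    fact = defaultdict(float)
    for o in orders:
        fact[o["order_date"]] += o["amount"]
    return dict(fact)

oltp_rows = [
    {"order_date": "2024-06-01", "amount": 10.0},
    {"order_date": "2024-06-01", "amount": 5.5},
    {"order_date": "2024-06-02", "amount": 7.0},
]
fact_table = build_daily_fact(oltp_rows)
```

In practice this aggregation would run as a scheduled batch job (or a streaming materialization) writing into the OLAP store rather than an in-memory dict.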
- Seniority level Entry level
- Employment type Full-time
- Job function Project Management and Information Technology
- Industries Motor Vehicle Manufacturing
Advanced Data Architecture Professional
Posted today
Job Description
Senior Data Architect Position
We are seeking a highly skilled Senior Data Architect to lead our efforts in data engineering and infrastructure. This individual will work closely with cross-functional teams to design, build, and maintain our data systems.
This role combines deep technical expertise with business acumen, ensuring that our data solutions meet the evolving needs of our customers.
The ideal candidate will have experience in designing and implementing large-scale data architectures, as well as a strong understanding of cloud technologies such as AWS, Azure, and Google Cloud.
Key responsibilities include:
- Designing and implementing data ingestion pipelines to collect, clean, and harmonize data from various sources.
- Collaborating with partner IT teams on technology stack, infrastructure, and security alignment.
What We Are Looking For:
- A Bachelor's Degree in Computer Science, Software Engineering, or related field.
- Deep understanding of system design, data structure, and algorithms, data modeling, data access, and data storage.
- Demonstrated ability in using cloud technologies such as AWS, Azure, and Google Cloud.
- Experience with Databricks.
- Experience in designing, building, and maintaining batch and real-time data pipelines.
- Experience with orchestration frameworks such as Airflow, Azure Data Factory.
- Proficiency in working with Python, Shell Scripts, and SQL.
Preferred Requirements:
- Familiarity with building and using CI/CD pipelines.
- Familiarity with DevOps tools such as Docker, Git, Terraform.
- Experience with implementing technical processes to enforce data security, data quality, and data governance.
- Familiarity with government systems and policies relating to data governance, data management, data infrastructure, and data security.
- Experience in Climate and Weather domains is an advantage.
We are an equal opportunity employer committed to fostering an inclusive workplace that values diverse voices and perspectives.
We champion flexible work arrangements and trust you to manage your time to deliver your best.
Our employee benefits are based on a total rewards approach, offering a holistic and market-competitive suite of perks.
Chief Data Architecture Specialist
Posted today
Job Description
Senior Data Engineer
Our organization is seeking an experienced and skilled Data Engineer to work collaboratively with agencies, optimizing Continuous Integration/Continuous Deployment (CI/CD) and Site Reliability Engineering (SRE) practices for enhanced development delivery efficiency and system resiliency.
As a forward-deployed engineer, you will partner with agencies to solve complex challenges, gather insights, and drive continuous improvement in our product offerings. You will be responsible for:
- Translating data requirements from business users into technical specifications.
- Collaborating with partner agency's IT teams on technology stack, infrastructure, and security alignment.
Key Responsibilities:
- Architect and build ingestion pipelines to collect, clean, merge, and harmonize data from different source systems.
- Day-to-day monitoring of databases and ETL systems, e.g., database capacity planning and maintenance, monitoring, and performance tuning; diagnose issues and deploy measures to prevent recurrence; ensure maximum database uptime.
- Construct, test, and update useful and reusable data models based on data needs of end users.
- Design and build secure mechanisms for end users and systems to access data in data warehouse.
- Research, propose, and develop new technologies and processes to improve agency data infrastructure.
- Collaborate with data stewards to establish and enforce data governance policies, best practices, and procedures.
- Maintain data catalogue to document data assets, metadata, and lineage.
- Implement data quality checks and validation processes to ensure data accuracy and consistency.
- Implement and enforce data security best practices, including access control, encryption, and data masking, to safeguard sensitive data.
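The data quality and masking responsibilities above might look like the following minimal sketch; the validation rules and field names are hypothetical, not taken from the posting.

```python
import re

# Illustrative data-quality checks and masking of the kind listed above.
# Field names and rules are hypothetical examples.

def validate_row(row):
    """Return a list of data-quality violations for one record."""
    issues = []
    if not row.get("user_id"):
        issues.append("missing user_id")
    if not re.match(r"^[^@\s]+@[^@\s]+$", row.get("email", "")):
        issues.append("malformed email")
    return issues

def mask_email(email):
    """Mask the local part of an email, keeping the first character."""
    local, _, domain = email.partition("@")
    return (local[:1] + "***@" + domain) if domain else "***"

row = {"user_id": "u1", "email": "alice@example.com"}
issues = validate_row(row)
masked = mask_email(row["email"])
```

In a real pipeline such checks would run as a validation stage (e.g., before loading into the warehouse), with failing rows quarantined and masking applied before data reaches less-privileged consumers.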
Requirements:
Our ideal candidate should possess the following qualifications:
- A Bachelor's Degree, preferably in Computer Science, Software Engineering, Information Technology, or related disciplines.
- Deep understanding of system design, data structure, and algorithms, data modelling, data access, and data storage.
- Demonstrated ability in using cloud technologies such as AWS, Azure, and Google Cloud.
- Experience with Databricks.
- Experience in designing, building, and maintaining batch and real-time data pipelines.
- Experience with orchestration frameworks such as Airflow, Azure Data Factory.
- Proficiency in working with Python, Shell Scripts, and SQL.
Preferred Requirements:
While not mandatory, the following skills would be highly beneficial:
- Familiarity with building and using CI/CD pipelines.
- Familiarity with DevOps tools such as Docker, Git, Terraform.
- Experience with implementing technical processes to enforce data security, data quality, and data governance.
- Familiarity with government systems and policies relating to data governance, data management, data infrastructure, and data security.
- Experience in Climate and Weather domains will be an advantage.