812 Data Infrastructure jobs in Singapore
Data Infrastructure Specialist
Posted today
Job Description
We are seeking a skilled data engineer to collaborate with agencies on various projects, leveraging expertise in system design, data structure, and algorithms. This role involves working directly with clients to translate business requirements into technical specifications.
The ideal candidate will have experience in cloud technologies such as AWS, Azure, and Google Cloud, as well as proficiency in using Databricks. Additionally, they should be knowledgeable about designing, building, and maintaining batch and real-time data pipelines.
A bachelor's degree in computer science, software engineering, information technology, or a related field is required, along with a minimum of 7 years of relevant experience.
Responsibilities:
- Collaborate with clients to understand their data needs and translate them into technical specifications.
- Design and build data products that meet client requirements, including ingestion pipelines, data storage solutions, and data visualization tools.
- Maintain existing data systems, ensuring optimal performance and scalability.
- Develop new technologies and processes to improve agency data infrastructure.
Requirements:
- Bachelor's degree in computer science, software engineering, information technology, or a related field.
- Minimum 7 years of relevant experience.
- Deep understanding of system design, data structure, and algorithms.
- Experience in cloud technologies such as AWS, Azure, and Google Cloud.
- Proficiency in using Databricks.
- Knowledgeable about designing, building, and maintaining batch and real-time data pipelines.
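As a rough illustration of the batch side of that last requirement, here is a minimal extract-transform-load sketch in Python. All function and field names are illustrative, not taken from the posting.

```python
# Minimal batch pipeline sketch: extract, transform, load.
# Names (extract_orders, load_to_warehouse, record fields) are illustrative.

def extract_orders():
    """Extract raw records from a source system (stubbed here)."""
    return [
        {"order_id": 1, "amount": "19.90", "currency": "SGD"},
        {"order_id": 2, "amount": "5.00", "currency": "SGD"},
    ]

def transform(records):
    """Normalise types and add derived fields."""
    out = []
    for r in records:
        out.append({
            "order_id": r["order_id"],
            "amount_cents": int(round(float(r["amount"]) * 100)),
            "currency": r["currency"],
        })
    return out

def load_to_warehouse(records, sink):
    """Append transformed records to a destination (a list stands in for a table)."""
    sink.extend(records)
    return len(records)

warehouse = []
loaded = load_to_warehouse(transform(extract_orders()), warehouse)
```

A real pipeline would replace the stubs with source connectors and a warehouse client, but the extract/transform/load separation is the same.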
This role offers a competitive salary and excellent benefits package, including health insurance, retirement plan, and paid time off.
How to Apply
If you are a motivated and detail-oriented individual with a passion for data engineering, please submit your application, including your resume and a cover letter explaining why you are the ideal candidate for this role.
Data Infrastructure Specialist
Posted today
Job Description
Seeking an experienced Data Engineer to drive enterprise AI and data initiatives forward. This role focuses on developing scalable data infrastructure, migrating legacy systems, and enabling advanced analytics and machine learning use cases.
Key Responsibilities:
- Design and implement ETL/ELT pipelines to support AI/ML models and business intelligence
- Migrate legacy data platforms to a modern Snowflake-based architecture
- Ensure secure, high-performance infrastructure aligned with data governance policies
- Collaborate with cross-functional teams including data scientists and business stakeholders
- Support real-time data processing and integration of unstructured data sources
- Contribute to AI model deployment by preparing datasets and ensuring seamless system integration
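The ELT pattern implied by the responsibilities above (load raw data first, then transform inside the warehouse with SQL) can be sketched with Python's built-in sqlite3 standing in for Snowflake; the table and column names are illustrative.

```python
import sqlite3

# ELT sketch: load raw rows first, then run the transform in the warehouse.
# sqlite3 is a stand-in for Snowflake; schema names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user_id INTEGER, event TEXT, ts TEXT)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?, ?)",
    [(1, "login", "2024-01-01"), (1, "click", "2024-01-01"), (2, "login", "2024-01-02")],
)
# The "T" happens after the "L", as SQL running where the data already lives:
conn.execute("""
    CREATE TABLE daily_active_users AS
    SELECT ts AS day, COUNT(DISTINCT user_id) AS dau
    FROM raw_events
    GROUP BY ts
""")
rows = conn.execute("SELECT day, dau FROM daily_active_users ORDER BY day").fetchall()
```

Tools like dbt (listed in the technical environment) formalise exactly this kind of in-warehouse transform as versioned SQL models.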
Technical Environment:
- Platforms: Snowflake, AWS (Glue, S3, RDS, Fargate), Azure
- Tools: dbt, MLflow, LangChain, Docker
- Languages: SQL, Python, Java
- AI/ML: TensorFlow, PyTorch, LLM pipelines, vector databases
Requirements:
- More than 4 years of hands-on experience in data engineering
- Experienced in Snowflake (SnowPro certification preferred)
- Strong knowledge of cloud-based data architecture and data warehouse migration
- Experience with ETL/ELT frameworks and MLOps tools
- Familiarity with data security, governance, and performance optimization
- Ability to collaborate effectively with technical and non-technical stakeholders
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
Leading Data Infrastructure Innovator
Posted today
Job Description
**Data Engineer Role Overview:**
We are seeking an experienced Data Engineer to join our team. The ideal candidate will have a strong background in developing and maintaining large-scale data infrastructure, as well as experience with data modeling, architecture, and governance.
**Key Responsibilities:**
- Design and develop new data pipelines and architectures to support business growth and improve data quality.
- Collaborate with cross-functional teams to identify and prioritize data-related initiatives.
- Develop and maintain data models and metadata to ensure data accuracy and consistency.
- Implement data governance policies and procedures to ensure compliance with regulatory requirements.
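One lightweight way to pair a data model with metadata for the accuracy and consistency checks described above is a declarative schema. The dataclass sketch below is illustrative, not a prescribed design.

```python
from dataclasses import dataclass, field

# Illustrative column-level data model with metadata, used to check that
# incoming rows match the declared schema.

@dataclass
class Column:
    name: str
    dtype: type
    nullable: bool = False

@dataclass
class TableModel:
    name: str
    columns: list = field(default_factory=list)

    def validate_row(self, row: dict) -> list:
        """Return a list of violations for one row (empty means valid)."""
        errors = []
        for col in self.columns:
            value = row.get(col.name)
            if value is None:
                if not col.nullable:
                    errors.append(f"{col.name}: null not allowed")
            elif not isinstance(value, col.dtype):
                errors.append(f"{col.name}: expected {col.dtype.__name__}")
        return errors

customers = TableModel("customers", [Column("id", int), Column("email", str, nullable=True)])
```

The same metadata can then drive documentation and governance checks rather than being duplicated by hand.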
**Requirements:**
- 5+ years of experience in data engineering, with a focus on developing and maintaining large-scale data infrastructure.
- Strong skills in SQL and Python, with experience in Spark or distributed computing.
- Experience with data modeling and architecture concepts.
- Experience with data platform tools like Databricks, Snowflake, Cloudera, etc.
- Experience with cloud providers like AWS, Azure, or GCP.
**Nice to Have:**
- Knowledge of data governance principles and practices.
- Experience with data security and access control.
- Ability to work independently and collaboratively as part of a team.
**What We Offer:**
- Competitive salary and benefits package.
- Opportunities for professional growth and development.
- A dynamic and collaborative work environment.
Global Data Infrastructure Specialist
Posted today
Job Description
**Job Title:** Data Center Facility Engineer
Description: Data Center Facility Engineers are responsible for critical facility operations of overseas data centers. They manage changes on critical systems, respond to emergent failures, and track preventative maintenance.
Key Responsibilities:
- Manage capacity management of facility systems including power and cooling.
- Oversee availability risk of facility systems and drive resolution.
- Drive projects to improve capacity, efficiency, and reliability of current facility systems.
- Support IT managers with their projects and operations.
- Assist design and project teams on new site/data-hall construction and commissioning.
- Establish performance benchmarks, conduct analyses, and prepare reports on all aspects of critical facility infrastructure operations and maintenance.
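One standard benchmark for the efficiency analyses listed above is Power Usage Effectiveness (PUE): total facility power divided by IT equipment power, with values closer to 1.0 indicating less overhead. The readings in this small sketch are illustrative.

```python
# Power Usage Effectiveness (PUE) = total facility power / IT load.
# Sample readings below are illustrative, not from any real facility.

def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Compute PUE; lower (closer to 1.0) means less non-IT overhead."""
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

# A facility drawing 1500 kW overall to power a 1000 kW IT load:
benchmark = pue(1500.0, 1000.0)
```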
To be successful in this role, you will need:
- Bachelor's degree in electrical engineering or a related field.
- 3+ years of critical facility management or operation experience in large-scale data centers or 5+ years of large-scale production facility management experience in large plants.
- Experience with MEP equipment such as UPS, generators, chillers, pumps, cooling towers, etc.
- Strong communication skills in English and attention to detail to identify risks and resolve them.
- Ability to communicate in Chinese.
- Commitment to maintaining a safe and efficient working environment.
Chief Data Infrastructure Specialist
Posted today
Job Description
As a Chief Data Infrastructure Specialist, you will design and implement scalable data storage solutions to ensure the integrity and dependability of large datasets.
Key responsibilities include designing and implementing data pipelines for efficient data ingestion, processing, and transformation. This includes leveraging Azure Data Factory or comparable technologies to create complex data models for analytics purposes and assemble large, complex data sets using Databricks.
Other key tasks include ensuring data security and compliance, designing validation and cleansing procedures, and collaborating with cross-functional teams to drive business outcomes through data-driven insights.
- Design and implement data storage solutions
- Develop data pipelines for efficient data ingestion and transformation
- Create data models for analytics purposes
- Assemble large, complex data sets
- Implement data validation and cleansing procedures
- Ensure data security and compliance
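A minimal sketch of the validation and cleansing step listed above: reject rows that fail hard rules and normalise the rest. Field names and rules here are illustrative.

```python
# Illustrative validation-and-cleansing pass over raw rows.

def cleanse(rows):
    """Split rows into cleansed records and rejects."""
    clean, rejected = [], []
    for row in rows:
        email = (row.get("email") or "").strip().lower()
        if "@" not in email:
            rejected.append(row)                   # validation failure
            continue
        clean.append({**row, "email": email})      # cleansing: canonical form
    return clean, rejected

clean, rejected = cleanse([
    {"id": 1, "email": "  Alice@Example.COM "},
    {"id": 2, "email": "not-an-email"},
])
```

Keeping rejects in a separate stream, rather than silently dropping them, makes the procedure auditable.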
Required Skills:
- Azure
- Big Data
- Pipelines
- Scrum
- Hadoop
- ETL
- SQL
- Networking
- SQL Server
- Python
- Business Process
- Data Analytics
- Databases
- Linux
- Analysis Services
Real-Time Data Infrastructure Specialist
Posted today
Job Description
We are seeking an experienced Real-Time Data Infrastructure Specialist to join our team. The successful candidate will be responsible for designing and implementing data acquisition interfaces for contextualized data, ensuring seamless integration between systems.
- Drawing on a strong foundation in SQL query development, design and refine queries to extract, transform, and manage production and system data.
- The role requires collaboration with cross-functional teams to drive investigations and design activities related to Plug & Produce (P&P) for MES and Historian platform integration.
- A key aspect of the job is reviewing and validating technical documentation for system interfaces to ensure accuracy and compliance.
- In this position, you will support validation activities by executing and documenting test cases to guarantee system readiness and adherence to regulatory requirements.
- You will work closely with Digital, Automation, MES, MSAT, and Global teams to analyze and resolve application incidents, providing timely follow-up and resolution to ensure minimal disruption.
- The ability to troubleshoot and resolve connectivity and data flow issues across MES, PLC, DCS, and Historian systems is crucial for success in this role.
- Your responsibility will include monitoring performance and availability of Historian platforms to maintain stability and reliability.
- You will also perform regular maintenance tasks such as log reviews, job monitoring, and health checks to prevent potential issues.
- Lastly, the role requires ensuring robust backup and recovery processes are in place and validated to mitigate data loss risks.
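The backup-and-recovery validation in the last point can be as simple as comparing a checksum recorded at backup time against one recomputed from the restored copy. The archive bytes below are an in-memory stand-in for Historian data; names are illustrative.

```python
import hashlib

# Illustrative backup validation via checksums.

def checksum(data: bytes) -> str:
    """SHA-256 digest of the archive contents."""
    return hashlib.sha256(data).hexdigest()

archive = b"batch_id,temp_c\n42,73.5\n"
recorded = checksum(archive)                 # stored alongside the backup
restored = b"batch_id,temp_c\n42,73.5\n"     # what the recovery test produced
backup_ok = checksum(restored) == recorded
```

Running such a restore-and-verify cycle on a schedule, rather than only checking that backup jobs succeeded, is what actually mitigates data-loss risk.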
AI-Powered Data Infrastructure Specialist
Posted today
Job Description
We are seeking a Data Engineer to build and maintain data infrastructure that supports AI-powered detection of unauthorized privileged access.
Technical Product Manager - Data Infrastructure
Posted today
Job Description
Technical Product Manager - Data Infrastructure
Our team is responsible for building a stable, efficient, secure, and easy-to-use big data infrastructure and platform for the company. We provide data storage, computation, query, and analysis capabilities for business teams, data teams, data analysts, machine learning teams, BI teams, and others. This includes data collection and storage, offline computation over massive datasets, real-time stream computing, online analytical processing, low-latency interactive queries, and other data infrastructure support, as well as big data development, production scheduling, data quality monitoring, and data maps. We provide a base platform for compute scheduling, batch computing, real-time computing, structured data query, big data analysis, and KV query, and we help business teams build data reports, dashboards, real-time data processing, and data mining and analysis.
Job Description
Manage one of the Data Suite products, including but not limited to Data Map, Data Integration, Data Development, Data Management, Data Quality, Data Services, Data Metrics, Data Visualisation, etc.
Responsible for requirements analysis, product design, prototype development, as well as product operation.
Understand user needs and big data use cases, abstract complex requirements into solutions, and implement them in the product to empower users.
Develop and maintain user relationships and communication channels with users; synthesize product features from user requirements and feedback.
Explore and continuously learn best practices from the big data industry, and plan the product blueprints in the long term.
Requirements
Bachelor's Degree or higher
At least 3 years of experience in product management, preferably data-related products.
Familiarity with data management, data development and data application use cases and products.
Excellent product skills and problem-solving capabilities.
Clear logical thinking and a service-oriented mindset.
Seniority level
Mid-Senior level
Employment type
Full-time
Job function
Information Technology
Industries
Internet Marketplace Platforms and Technology, Information and Internet
Cloud Infrastructure and Data Engineer
Posted today
Job Description
Start Date: Immediate
Role Overview
We are seeking a highly skilled Cloud Infrastructure & Data Engineer to oversee delivery of projects under the Government Portfolio. The candidate will provide both infrastructure and data engineering expertise, ensuring smooth implementation of data-driven initiatives on AWS Cloud.
Key Responsibilities
Act as a delivery-side technical lead for cloud and data engineering initiatives within Government portfolio projects.
Design, implement, and manage cloud infrastructure solutions using AWS native services (EC2, S3, RDS, Lambda, IAM, Glue, Redshift, etc.).
Lead and support data ingestion, transformation, and management workflows on AWS.
Collaborate with project teams, vendors, and stakeholders to ensure delivery excellence.
Implement automation and Infrastructure-as-Code solutions using Terraform and AWS CloudFormation.
Contribute to the design and deployment of scalable data management platforms including Databricks and Snowflake (preferred).
Develop automation and integration scripts using Python and AWS Lambda.
Ensure compliance with Government security, governance, and delivery standards.
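An automation script in the shape AWS Lambda expects is a module-level handler taking an event and a context. The payload and tagging logic below are illustrative; a real integration would call AWS APIs (for example via boto3) rather than just return a value.

```python
# Illustrative Lambda-style automation handler. The event shape and
# classification rule are invented for this sketch.

def handler(event, context=None):
    """Tag an ingested object based on its key prefix."""
    key = event.get("object_key", "")
    tag = "raw" if key.startswith("landing/") else "unclassified"
    return {"object_key": key, "tag": tag}

result = handler({"object_key": "landing/2024/orders.csv"})
```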
Required Skills & Experience
Strong expertise in AWS Cloud infrastructure and data engineering.
Proven experience with AWS native services for data ingestion & management (e.g., Glue, Kinesis, Redshift, Athena, Lake Formation).
Good understanding of cloud governance and compliance (GCC) frameworks.
Proficiency in Python for scripting and automation.
Hands-on experience with Terraform (IaC) and automation pipelines.
Strong knowledge of data platforms (Databricks, Snowflake preferred).
Experience in overseeing project delivery in a cloud + data environment.
Excellent problem-solving and stakeholder communication skills.
Good to Have
Familiarity with DevOps practices and CI/CD pipelines.
Knowledge of containerization (Docker, Kubernetes) in AWS.
Previous experience in the Singapore public sector or Government related projects.
Senior DevOps Engineer - AI & Data Infrastructure
Posted today
Job Description
Our Values
Dream big
– Be visionary, strategic, and open to innovation
Build great things
– Work in service of our users, always improving and pushing higher
Operate like an owner
– Take responsibility with bold decision‐making and bias for action
Win like a sports team
– Be trusting and collaborative while empowering others
Learn and grow fast
– Never stop learning and iterate fast
Share our passion
– Share ideas and practice enthusiasm and joy
Be user obsessed
– Empathetic, inquisitive, practical
About the team
The AI team at Goodnotes is looking to transform education and productivity using the latest technologies. We are working on helping students and teachers learn and teach more efficiently, and aim to achieve this by developing tools for Goodnotes AI and more foundational technologies like subject-oriented handwriting recognition. You will join a distributed team across Europe and Asia focused specifically on building infrastructure for running our end-to-end AI technology. You will work alongside other engineers, including iOS engineers, machine learning engineers, QAs, and designers, to build and improve our new suite of AI features. The types of AI technologies running on our infrastructure include large language models, math solvers, and handwriting recognition systems. Beyond models, our backend engineering team is building a holistic data engineering pipeline for training and production data.
About the role
Do you believe in automating yourself out of a job? With our company's growth, that’s not going to happen anytime soon. Join a fast‐growing and talented Platform Team to build a modern, strong and stable platform to accelerate the rollout of many new features, for millions of users around the world!
Responsibilities
Design, build, and maintain the Goodnotes infrastructure, ensuring it adheres to Dickerson’s Hierarchy of Reliability.
Build CI/CD pipelines for AI and Data Infrastructure workloads.
Be the go‐to person for higher‐level escalation for applications.
Improve the system monitoring, health reporting, and logging.
Design and implement security, assist in maintaining information security practices and procedures.
The tech stack
Monitoring and Logging: We are currently using Datadog for monitoring, APM, logging, CI/CD optimization, and budget and cost management. Metrics are collected by our agents, derived from logs using metric filters, or emitted directly from Lambda functions and the application.
Infrastructure‐as‐Code: most of our infrastructure is written and defined in Terraform using Atlantis.
CI/CD: We are currently using GitHub Actions for our backend applications and CircleCI for our iOS applications.
Deployments: We have multiple EKS clusters set up either for Blue/Green rollouts or dedicated feature sets. We manage the workload configurations using ArgoCD and Helm.
Lakehouse: Mainly on AWS infrastructure, including Athena, S3 Table, DBT, Kafka, and SQS.
AI infrastructure: integrations with external AI providers and large-scale backend workloads, with Ray on EKS as a bonus.
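A metric filter of the kind described under Monitoring and Logging derives numbers from raw log lines. This regex-based sketch uses an invented log format and metric name, not Goodnotes' actual logging schema.

```python
import re

# Illustrative "metric filter": turn matching log lines into a numeric metric.
LATENCY = re.compile(r"request_ms=(\d+)")

def latency_summary(lines):
    """Extract request latencies from log lines; return count and max."""
    values = [int(m.group(1)) for line in lines if (m := LATENCY.search(line))]
    return {"count": len(values), "max_ms": max(values) if values else 0}

metrics = latency_summary([
    "INFO path=/sync request_ms=120",
    "INFO path=/sync request_ms=340",
    "WARN cache miss",
])
```

Monitoring platforms apply the same idea server-side, so applications can log plainly and still feed dashboards and alerts.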
The skills you will need to be successful in the above
Familiarity with lakehouse architecture or large‐scale AI systems.
Some background in back‐end development, including API usage and creation.
Deep understanding of Linux and Networking fundamentals.
Strong knowledge of network and container security.
Extensive experience working with AWS products.
Proficiency in container orchestration, with a particular emphasis on Kubernetes.
Familiarity with distributed databases.
Proven track record in building and maintaining CI/CD pipelines.
Proven experience in managing Relational and Non‐relational databases, including backup and restore operations.
Proficiency in automation/configuration management tools like Terraform.
Proficiency in scripting languages, particularly Python.
Even if you don’t meet all the criteria listed above, we would still love to hear from you! Goodnotes places a lot of value on learning and development and will support your growth if needed.
Bonus: If you are familiar with running workloads inside China.
The interview process
Introductory call with someone from our talent acquisition team. They want to hear more about your background, what you are looking for, and why you’d like to join Goodnotes.
Take‐home challenge to set up an end‐to‐end CI/CD pipeline for an API.
Live coding call with one of our engineers. This is where you get to see what it would be like working at Goodnotes as well as the chance to ask any engineering questions you may have.
Behavioural interview with your manager: This is the person who will be managing you day to day, working on your growth and development with you, and supporting you throughout your career at Goodnotes.
Panel interview asking behavioural questions on our company values.
What’s in it for you
Meaningful equity in a profitable tech startup.
Budget for things like noise‐cancelling headphones, setting up your home office, personal development, professional training, and health & wellness.
Sponsored visits to our Hong Kong or London office every 2 years.
Company‐wide annual offsite (we met in Portugal in 2023 and Bali in 2024, Turkiye and Korea in 2025).
Flexible working hours and location.
Medical insurance for you and your dependents.
Note: Employment is contingent upon successful completion of background checks, including verification of employment, education, and criminal records.
By submitting your application, you acknowledge that you have read and understood our Candidate Privacy Notice, which provides important information about the data we collect during the application process. You can find it here.
Goodnotes is committed to equality of opportunity for all staff and applications from individuals are encouraged regardless of age, disability, sex, gender reassignment, sexual orientation, pregnancy and maternity, race, religion or belief and marriage and civil partnerships. To help us ensure a fully diverse, equitable and inclusive working environment, we invite you to fill out this voluntary survey so we can track and further our Diversity, Equity and Inclusion efforts. The information shared here cannot and will not affect your job application in any way. It’s also 100% anonymous, and is not linked to your name, identity or application.