140 Cloud Data jobs in Singapore
Cloud Data Engineer
Posted today
Job Description
- Develop scalable and compliant data engineering solutions for BI and Data Warehouse projects.
- Collaborate with cross-functional teams to build robust data-driven products.
- Provide ongoing maintenance and technical support for deployed data engineering systems.
The ideal candidate will have experience in cloud technologies, database design, big data frameworks and tools.
Required Skills and Qualifications
- Proficient in general data cleaning and transformation (e.g. SQL, VQL, pandas, R).
- Proficient in building ETL pipelines using various tools and languages.
- Proficient in database design and management.
Good communication skills are required to work closely with stakeholders and team members.
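The data-cleaning and transformation skills this listing asks for (e.g. pandas) can be illustrated with a minimal sketch; the column names and cleaning rules below are hypothetical, not from the listing:

```python
import pandas as pd

# Hypothetical raw feed: duplicate rows, inconsistent casing, missing values.
raw = pd.DataFrame({
    "customer_id": [1, 1, 2, 3],
    "country": ["sg", "SG", "my", None],
    "amount": ["10.50", "10.50", "7", "3.25"],
})

clean = (
    raw.drop_duplicates()                # remove exact duplicate rows
       .dropna(subset=["country"])       # drop rows missing a key field
       .assign(
           country=lambda d: d["country"].str.upper(),   # normalise casing
           amount=lambda d: pd.to_numeric(d["amount"]),  # cast strings to numbers
       )
)
print(clean)
```

The same pattern (deduplicate, filter, normalise, cast) carries over to SQL or R; only the syntax changes.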
Cloud Data Engineer
Posted today
Job Description
Overview
Cloud Data Engineer (AI & Analytics domain) – Location: Singapore
Job Summary
As a Cloud Data Engineer, you will design, build, and maintain scalable, secure, and efficient cloud data architectures on platforms like AWS, Azure, or Google Cloud.
Required Skills And Experience
5+ years of Data consulting experience, or other relevant experience in AI & Analytics domain, with a proven track record of building and maintaining client relationships
Collaborate with customers and account partners to identify new Data and AI opportunities, and build proposals.
Organize and lead educational and ideation AI and Generative AI workshops for customers
Understand customer's needs and assess their data maturity
Evaluate and recommend appropriate data pipelines, frameworks, tools and platforms
Lead data feasibility studies and PoC projects to demonstrate the value and viability of Data and AI solutions
Develop end-to-end AI PoC projects using Python, Flask, FastAPI, and Streamlit
Experience with AWS / GCP / Azure cloud services to deploy PoCs and pipelines
Experience with big data frameworks on-prem and in the cloud
Collaborate with AI architects and engineers to develop Data and AI solutions tailored to the clients' requirements
Lead integration and deployment of data and AI products
Stay updated with the latest advancements in big data and GenAI along with best practices to develop thought leadership and PoV
Work with cross-functional teams and partners to develop, enhance, and package AI offerings and assets for customer-facing discussions
About Encora
Encora is a global company that offers Software and Digital Engineering solutions. Our practices include Cloud Services, Product Engineering & Application Modernization, Data & Analytics, Digital Experience & Design Services, DevSecOps, Cybersecurity, Quality Engineering, AI & LLM Engineering, among others.
At Encora, we hire professionals based solely on their skills and do not discriminate based on age, disability, religion, gender, sexual orientation, socioeconomic status, or nationality.
Seniority level
Mid-Senior level
Employment type
Contract
Job function
Information Technology
Industries
IT Services and IT Consulting
Cloud Data Engineer
Posted today
Job Description
Responsibilities
Design and architect data storage solutions, including databases, data lakes, and warehouses, using AWS services such as Amazon S3, Amazon RDS, Amazon Redshift, and Amazon DynamoDB, along with Databricks' Delta Lake. Integrate Informatica IDMC for metadata management and data cataloging.
Create, manage, and optimize data pipelines for ingesting, processing, and transforming data using AWS services like AWS Glue, AWS Data Pipeline, and AWS Lambda, Databricks for advanced data processing, and Informatica IDMC for data integration and quality.
Integrate data from various sources, both internal and external, into AWS and Databricks environments, ensuring data consistency and quality, while leveraging Informatica IDMC for data integration, transformation, and governance.
Develop ETL (Extract, Transform, Load) processes to cleanse, transform, and enrich data, making it suitable for analytical purposes using Databricks' Spark capabilities and Informatica IDMC for data transformation and quality.
Monitor and optimize data processing and query performance in both AWS and Databricks environments, making necessary adjustments to meet performance and scalability requirements. Utilize Informatica IDMC for optimizing data workflows.
Implement security best practices and data encryption methods to protect sensitive data in both AWS and Databricks, while ensuring compliance with data privacy regulations. Employ Informatica IDMC for data governance and compliance.
Implement automation for routine tasks, such as data ingestion, transformation, and monitoring, using AWS services like AWS Step Functions, AWS Lambda, Databricks Jobs, and Informatica IDMC for workflow automation.
Maintain clear and comprehensive documentation of data infrastructure, pipelines, and configurations in both AWS and Databricks environments, with metadata management facilitated by Informatica IDMC.
Collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to understand data requirements and deliver appropriate solutions across AWS, Databricks, and Informatica IDMC.
Identify and resolve data-related issues and provide support to ensure data availability and integrity across AWS, Databricks, and Informatica IDMC environments.
Optimize AWS, Databricks, and Informatica resource usage to control costs while meeting performance and scalability requirements. Stay up-to-date with AWS, Databricks, Informatica IDMC services, and data engineering best practices to recommend and implement new technologies and techniques.
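The ingestion and cleansing responsibilities above can be sketched in miniature as a Lambda-style handler that validates and transforms a batch of records before loading. The event shape and field names are hypothetical, and the actual S3/Redshift writes are stubbed out, so this is an illustration of the pattern rather than the employer's pipeline:

```python
import json

def transform_record(rec):
    """Hypothetical cleansing step: reject incomplete records, normalise fields."""
    if not rec.get("order_id") or rec.get("amount") is None:
        return None                      # reject incomplete records
    return {
        "order_id": str(rec["order_id"]),
        "amount": round(float(rec["amount"]), 2),
        "currency": rec.get("currency", "SGD").upper(),
    }

def handler(event, context=None):
    """Lambda-style entry point: transform a batch and report what would load."""
    records = event.get("records", [])
    cleaned = [r for r in (transform_record(x) for x in records) if r]
    # In a real pipeline this is where boto3 would write to S3 / Redshift.
    return {"input": len(records), "loaded": len(cleaned)}

result = handler({"records": [
    {"order_id": 101, "amount": "19.9", "currency": "sgd"},
    {"order_id": None, "amount": 5},     # rejected: missing key
]})
print(json.dumps(result))
```

In AWS Glue or Databricks the same transform would typically run over a DataFrame rather than a Python list, but the validate-transform-load shape is identical.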
Qualifications
Bachelor’s or Master’s degree in computer science, data engineering, or a related field.
Minimum 7 years of experience in data engineering, with expertise in AWS services, Databricks, and/or Informatica IDMC.
Proficiency in programming languages such as Python, Java, or Scala for building data pipelines.
Ability to evaluate potential technical solutions and make recommendations to resolve data issues, especially performance assessment of complex data transformations and long-running data processes.
Strong knowledge of SQL and NoSQL databases.
Familiarity with data modeling and schema design.
Excellent problem-solving and analytical skills.
Strong communication and collaboration skills.
AWS certifications (e.g., AWS Certified Data Analytics - Specialty), Databricks certifications, and Informatica certifications are a plus.
Good-to-have skills: Informatica Cloud, Databricks, and AWS.
Data Scientist (Cloud data)
Posted today
Job Description
About The Data Scientist Role
We are seeking a skilled Data Scientist to design and implement data-driven solutions using AWS technologies for cloud data. The role involves performing statistical analysis, developing predictive models, and creating interactive dashboards in AWS QuickSight to deliver actionable business insights. You will support data pipelines, CI/CD workflows, and infrastructure automation while ensuring data quality and governance. The ideal candidate combines strong technical expertise in Python, SQL, and AWS data tools with a deep understanding of analytics, visualization, and operational excellence in cloud environments.
Responsibilities
Data Analysis & Insights
Perform data analysis and statistical modelling using AWS Redshift data
Develop predictive models and machine learning algorithms
Generate actionable insights from large datasets
Conduct data quality assessments and validation
Dashboard & Visualization Development
Create and maintain interactive dashboards in AWS QuickSight
Design data visualizations to support business decision-making
Optimize dashboard performance and user experience
Ensure data accuracy in reporting and visualizations
Data Pipeline & Engineering Support
Monitor and troubleshoot AWS Glue jobs and data ingestion processes
Support CI/CD pipelines with data-focused monitoring and validation
Assist with GitLab pipeline configurations for data workflows
Support AWS Lambda functions related to data processing
Collaborate on Infrastructure as Code (IaC) for data infrastructure
Data Science Operations
Monitor data pipelines and flag data quality issues
Collaborate with technical teams on data requirements
Support data governance and best practices implementation
Assist in data model validation and testing
Documentation & Reporting
Document analytical methodologies and findings
Prepare regular reports on data insights and model performance
Conduct monthly progress meetings to present findings
Maintain project documentation on SHIP-HATS Confluence
Track analytical tasks through SHIP-HATS Jira
Requirements
Strong background in data science, statistics, and machine learning
Proficiency in data analysis tools (Python, R, SQL)
Experience with AWS data services (Redshift, QuickSight, S3, Glue, Lambda)
Data pipeline development and troubleshooting experience
Basic CI/CD pipeline knowledge (GitLab preferred)
Infrastructure as Code (IaC) familiarity for data environments
Data visualization and dashboard development skills
Strong analytical thinking and problem-solving abilities
Excellent documentation and presentation skills
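The statistical-modelling requirement above can be sketched without any cloud dependencies: a one-variable least-squares fit in plain Python, the closed-form version of what Redshift data would feed into a predictive model. The data points are made up for illustration:

```python
# Ordinary least squares for y = a + b*x, computed from closed-form sums.
xs = [1, 2, 3, 4, 5]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]   # roughly y = 2x

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)   # slope
a = mean_y - b * mean_x                  # intercept

def predict(x):
    return a + b * x

print(round(b, 2), round(predict(6), 2))
```

In practice the role would use the same idea at scale, e.g. scikit-learn or Redshift ML over warehouse tables, with the fitted model surfaced through QuickSight dashboards.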
Seniority Level
Entry level
Employment Type
Full-time
Job Function
Engineering and Information Technology
INFORMATICA Cloud Data Engineer
Posted today
Job Description
Building data ingestion pipelines to integrate databases and datasets
Monitoring and alerting across data pipelines in order to make sure that data ingests are reliable and correct
Interpreting Technical Design documents and implementing them in INFORMATICA IDMC Platform
Mentoring Junior Data Engineers and guiding them to accomplish the tasks
Requirements
5+ Years of IT/Database experience
3+ years of experience in a data engineering role with INFORMATICA as a platform (PowerCenter / BDM, Data Engineering Integration (DEI))
1+ Years of experience in INFORMATICA Cloud (IDMC/IICS) with Cloud Data Integration (CDI) as a tool
Associate/Practitioner Certification in Data Engineering from INFORMATICA is desirable
Hands-on experience with relational SQL databases (such as Oracle, SQL Server, and MySQL/PostgreSQL) or big data stores such as Hadoop and Hive is essential
Experience working with other ETL tools, such as DataStage, Talend, SSIS, or dbt
Proficiency in writing SQL queries and knowledge of analytical data warehouses such as Hive, Databricks, and Snowflake
Some working experience with CI/CD automation such as GitHub Actions
Experience developing APIs and integrating with 3rd-party APIs
Experience in INFORMATICA BDM/DEI will be a distinct advantage
INFORMATICA Cloud Data Engineer
Posted 4 days ago
Job Description
- Building data ingestion pipelines to integrate databases and datasets
- Monitoring and alerting across data pipelines in order to make sure that data ingests are reliable and correct
- Interpreting Technical Design documents and implementing them in INFORMATICA IDMC Platform
- Mentoring Junior Data Engineers and guiding them to accomplish the tasks
Requirements
- 5+ Years of IT/Database experience
- 3+ years of experience in a data engineering role with INFORMATICA as a platform (PowerCenter / BDM, Data Engineering Integration (DEI))
- 1+ Years of experience in INFORMATICA Cloud (IDMC/IICS) with Cloud Data Integration (CDI) as a tool
- Associate/Practitioner Certification in Data Engineering from INFORMATICA is desirable
- Hands-on experience with relational SQL databases (such as Oracle, SQL Server, and MySQL/PostgreSQL) or big data stores such as Hadoop and Hive is essential
- Experience working with other ETL tools, such as DataStage, Talend, SSIS, or dbt
- Proficiency in writing SQL queries and knowledge of analytical data warehouses such as Hive, Databricks, and Snowflake
- Some working experience with CI/CD automation such as GitHub Actions
- Experience developing APIs and integrating with 3rd-party APIs
- Experience in INFORMATICA BDM/DEI will be a distinct advantage
Cloud Data & Analytics Technology Lead
Posted 2 days ago
Job Description
Do meaningful work with us. Every day.
At Amplify Health, we’re looking for individuals with ambition, resilience and passion for healthcare, insurance, wellness and digital technology. As a fast-growing business with the ambition of making people and communities across Asia healthier, we have exciting career opportunities available to help us achieve our vision.
Lead development and production deployment of enterprise-level analytic data products, determining appropriate design strategies and methodologies.
Primary Job Duties & Responsibilities
- Lead development and production deployment of enterprise-level analytic data products (also potentially pilots and proof of concepts), determining appropriate design strategies and methodologies.
- Find creative solutions to challenging problems involving factors with potentially broad implications; reflecting on solutions, measuring impact, and using that information to ideate and optimize.
- Execute data strategies with an understanding of enterprise architecture, consumption patterns, platforms and application infrastructure.
- Develop business partnerships and influence priorities by identifying solutions that are aligned with current business objectives and closely follow industry trends.
- Communicate with partners, describing technology concepts in ways the business can understand, documenting initiatives in a concise and clear manner, and empathetically and actively listening to others’ thoughts and ideas.
- Partner with data management & information security colleagues to ensure the protection of highly sensitive datasets.
- Maintain relationships with partner teams to ensure needs are met and impacted areas plan work accordingly.
- Present analysis and recommendations to help influence strategic decisions.
- Lead and take action, inspire and motivate others, and be effective at influencing team members.
- Guide and coach team members, focusing on individual’s professional development as well as overall team health and technical proficiency.
What Will Our Ideal Candidate Have?
- Preferably 10+ years of work experience building, managing, and leading high-performing, diverse, collaborative, and geographically distributed data engineering teams.
- Demonstrated ability as a strategic technical partner, working collaboratively with data analytics, data science, product, engineering, and other cross-functional partners, to plan, prioritize, and achieve company goals.
- Experience starting, leading, and evolving technical forums, with effective soft skills, as well as representing data engineering and architectural considerations in cross-team settings.
- Experience developing and nurturing data engineering talent, including implementing training, upskilling, and mentorship plans.
- Expert-level engineering, architecture, and system design knowledge, with strong computer science fundamentals.
- Experience developing efficient and scalable production software in Python, Java, or other programming languages commonly used in data engineering
- Understanding of event-driven and/or streaming workflows with tools like Kafka and Spark.
- Aptitude with ETL concepts and tools, including experience ingesting, processing, and transforming a variety of data at scale.
- Proficiency with SQL and NoSQL databases, data warehousing concepts, and cloud-based analytics databases.
- Experience with some of the following tools & platforms (or similar): Azure (ADLS, ADF, AKS, Synapse, Purview), Databricks, Python, JavaScript, Kafka, dbt, Terraform, Snowflake, SQL, Jenkins, GitHub, Airflow, MLflow.
- Knowledge and experience with some of the following concepts: Real-time & Batch Data Processing, Workload Orchestration, Cloud, Data Lakes, Data Security, Networking, Serverless, Testing / Test Automation (Unit, Integration, Performance, etc.), Web Services, DevOps, Logging, Monitoring, and Alerting, Containerization, Encryption / Decryption, Data Masking, Cost & Performance Optimization.
- Excellent written and oral communication skills, with a demonstrated ability to communicate complex concepts to a wide range of audiences.
- Ability to thrive in a fast-paced, agile, and dynamic environment, while exuding a can-do attitude.
- Ability to handle multiple concurrent projects while working independently and in teams of all sizes with representatives from a diverse set of technical backgrounds.
- Bachelor’s or Master’s degree in Computer Science or equivalent; team leadership and management training is a plus.
You must provide all requested information, including Personal Data, to be considered for this career opportunity. Failure to provide such information may influence the processing and outcome of your application. You are responsible for ensuring that the information you submit is accurate and up-to-date.
Cloud Storage Specialist
Posted today
Job Description
We are seeking an experienced Cloud Storage Specialist to join our team. As a Cloud Storage Specialist, you will be responsible for designing and deploying cloud storage solutions that meet the needs of our clients.
Your primary focus will be on traditional on-premises storage platforms, cloud service offerings in hybrid cloud environments, and excellent knowledge of AWS storage services. You will also have expertise in merger and acquisition storage migration projects in AWS, Azure, Synology, and Snowball.
You will work closely with our team to design and deploy AWS services using a structured solution design and deployment approach. Additionally, you will be responsible for operational ITIL support processes and for managing platform support KPIs.
Required Skills and Qualifications
- 10-14 years of relevant IT technical experience in enterprise storage solutions, cloud providers, and merger and acquisition projects
- Hands-on experience leading storage merger and acquisition data-migration projects as technical SME, and handling escalations as SME
- Experience with infrastructure, network, or security issues troubleshooting
- Proven experience working and supporting Hybrid cloud environments
- Experience with developing self-services using ServiceNow SRM
- Knowledge of observability and comparison tools such as Beyond Compare, Prometheus, Grafana, Nobl9, or similar platforms
- Experience building solutions using AWS services such as Lambda and CloudWatch
- Expertise in Migration & Storage: NetApp, Nutanix, EMC, Synology, M365 migrations, Data Center to Cloud migration
- Experience in DevOps: Terraform, CloudFormation, Ansible, Jenkins, Git, CI/CD pipelines, Automation
We offer a competitive salary and benefits package, including health insurance, retirement plan, and paid time off.
Others
We are an equal opportunity employer and welcome applications from diverse candidates.
Cloud Specialist (Data)
Posted today
Job Description
Job Descriptions:
Develop and provide cloud computing services (platform-as-a-service or infrastructure-as-a-service) suggestions/proposals for SME, Enterprise and various industry verticals.
Design data solutions aligned to the design principles and security standards of the Google Cloud Platform.
Design and implement solutions around data warehouse implementation ranging from architecture, ETL processes, multidimensional modelling, data marts implementation.
Design and optimize data acquisition, data ingestion, and data processing using ETL (Extract, Transform, Load) tools, and develop dashboards or other visualizations using Google Cloud Platform or other solutions.
Evaluate the requirements and determine actionable tasks, estimates and provide efforts for business solution build and architecture.
Responsible for technical documentation on solution implementation.
Perform operational readiness tasks and ensure production acceptance criteria are met.
Respond to platform technical issues in a professional and timely manner.
Work closely with the sales team for cloud-related sales opportunities.
Support pre-sales activities, e.g. proposals, systems design, proof-of-concepts, demos, workshops, etc.
Requirements:
Bachelor’s degree in Computer Science, related technical field, or equivalent practical experience.
3 – 5 years of experience in Business Analytics, Data Engineering or Data Science.
Strong logical/analytical troubleshooting.
Experience in deploying and operating multi/hybrid-cloud platforms, such as Kubernetes, Anthos, etc., is a plus.
Cloud Specialist (Data)
Posted 16 days ago
Job Description
Job Descriptions:
- Develop and provide cloud computing services (platform-as-a-service or infrastructure-as-a-service) suggestions/proposals for SME, Enterprise and various industry verticals.
- Design data solutions aligned to the design principles and security standards of the Google Cloud Platform.
- Design and implement solutions around data warehouse implementation ranging from architecture, ETL processes, multidimensional modelling, data marts implementation.
- Design and optimize data acquisition, data ingestion, and data processing using ETL (Extract, Transform, Load) tools, and develop dashboards or other visualizations using Google Cloud Platform or other solutions.
- Evaluate the requirements and determine actionable tasks, estimates and provide efforts for business solution build and architecture.
- Responsible for technical documentation on solution implementation.
- Perform operational readiness tasks and ensure production acceptance criteria are met.
- Respond to platform technical issues in a professional and timely manner.
- Work closely with the sales team for cloud-related sales opportunities.
- Support pre-sales activities, e.g. proposals, systems design, proof-of-concepts, demos, workshops, etc.
Requirements:
- Bachelor’s degree in Computer Science, related technical field, or equivalent practical experience.
- 3 – 5 years of experience in Business Analytics, Data Engineering or Data Science.
- Strong logical/analytical troubleshooting.
- Experience in deploying and operating multi/hybrid-cloud platforms, such as Kubernetes, Anthos, etc., is a plus.