172 Data Architect jobs in Singapore
Big Data Engineer & Architect
Posted today
Job Description
Job Title: Data Engineer & Architect
About the Role: We are seeking a highly skilled and motivated Data Engineer & Architect to join our team. In this role, you will be responsible for designing, developing, and deploying data pipelines using Python and PySpark in cloud-based environments.
Key Responsibilities:- Design and implement data pipelines using Python and PySpark in AWS and Google Cloud Platform (GCP) environments.
- Code in Python and PySpark in cloud-based big-data environments, using AWS services such as EMR, Lambda, S3, RDS, EC2, ECS, and EKS.
- Build ETL frameworks for data ingestion and pipelines into the Snowflake Data Warehouse in high-frequency, high-volume scenarios (gigabytes to terabytes ingested per day or month).
- Optimize PySpark jobs for performance and efficiency, troubleshoot issues, and ensure data quality and availability.
- Implement processes for automating data delivery, monitoring data quality, and production deployment.
- The candidate should be an expert developer with hands-on experience in Git/GitLab-based repository management and an understanding of Government Commercial Cloud (GCC) requirements.
- Proficient in consulting communication: able to articulate business problems, propose solution approaches, and respond to changing business scenarios.
- Degree in Computer Science, Computer Engineering or any STEM equivalent.
- Familiar with working in Government Commercial Cloud (GCC) environment.
- An independent, self-motivated contributor who is passionate about software development.
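The pipeline, data-quality, and automation duties above can be sketched in miniature. The example below is a plain-Python illustration (in production this logic would run as a PySpark job on EMR or similar), and the `orders` sample, field names, and `validate_row` helper are all invented for the illustration.

```python
# Minimal ETL sketch: extract -> validate -> transform.
# Plain Python is used so the control flow is easy to follow;
# a real pipeline would express this as PySpark transformations.

def validate_row(row):
    """Basic data-quality gate: required fields present and amount non-negative."""
    return row.get("order_id") is not None and row.get("amount", -1) >= 0

def transform(rows):
    """Keep valid rows, normalise amounts to cents, and set bad rows aside for monitoring."""
    good, bad = [], []
    for row in rows:
        if validate_row(row):
            good.append({**row, "amount_cents": int(round(row["amount"] * 100))})
        else:
            bad.append(row)  # would be routed to a quarantine table in a real pipeline
    return good, bad

orders = [
    {"order_id": 1, "amount": 19.99},
    {"order_id": None, "amount": 5.00},   # fails validation: missing key
    {"order_id": 3, "amount": -2.00},     # fails validation: negative amount
]
good, bad = transform(orders)
print(len(good), len(bad))  # 1 valid row, 2 quarantined
```

The same validate-then-branch shape is what "monitoring data quality" usually means in practice: invalid records are diverted rather than silently dropped.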
A dynamic work environment with opportunities for growth and development.
An excellent salary package that reflects your skills and qualifications.
Ongoing training and support to help you achieve your career goals.
A collaborative and inclusive team culture that values diversity and creativity.
Flexible working arrangements to suit your needs.
Access to cutting-edge technology and tools to help you stay ahead of the curve.
Big Data Systems Architect
Job Description
We are seeking a highly skilled Big Data / Systems Engineer with experience in implementing and supporting large-scale MLS systems, including environment setup, capacity planning, and production support. The ideal candidate will have strong skills in Hadoop, NoSQL, Spark, and DevOps tools, with scripting and SQL expertise to optimize system performance and support cross-functional teams.
The successful applicant will participate in the implementation of the MLS system, collaborate with Architects, Security and the Development Lead on solution design, and liaise with application teams across groups/countries on enhancement builds. They will also set up servers, databases, middleware, and deployment pipelines, manage environments, and ensure recommended software improvements are applied.
Required Skills and Qualifications:
- Experience in Hadoop ecosystem and technologies
- Experience in NoSQL database or Big Data
- Experience in Git, Jenkins, JIRA
- Experience in SharePoint and Confluence
- Experience/knowledge in Unix shell scripting, Oracle, Java, Python, and Spark
- Strong SQL skills are required
Benefits:
The ideal candidate will receive a competitive compensation and benefits package, along with opportunities for career growth and professional development.
Others:
The role requires strong collaboration and communication skills, as well as the ability to work in a fast-paced environment. The successful applicant must have a strong passion for technology and innovation, with a desire to make a meaningful contribution to our organization.
Data Architect
Job Description
About the Role
We are seeking a Databricks Architect to play a pivotal role in shaping and delivering enterprise-level data architecture and scalable solutions that support business-critical operations. This position offers the opportunity to work on innovative data initiatives and contribute to the development of cloud-native architectures for a diverse range of industries.
Key Responsibilities
- Lead the design and implementation of enterprise data architecture and big data solutions.
- Develop end-to-end data models, from conceptual and logical designs through to physical implementation.
- Drive data integration initiatives while ensuring data quality and governance best practices.
- Architect cloud-native data solutions using Databricks, Azure services, and other modern data platforms.
- Collaborate with stakeholders to understand business needs and translate them into technical solutions.
- Provide technical guidance to project teams, ensuring alignment with business objectives.
Requirements
- Proven experience in enterprise data architecture and big data solution design.
- Hands-on expertise with Databricks (Unity Catalog, Delta Live Tables, Workflows), Azure Data Factory, Service Bus, Event Grid, Power BI, PySpark, SQL, Python, SQL Server, SSAS, and ER Studio.
- Strong knowledge of data integration, governance frameworks, and data quality management.
- Experience with cloud-native data architectures.
- Strong analytical and problem-solving abilities.
What's on Offer
- Opportunity to work in a collaborative and global environment.
- Exposure to cutting-edge data platforms and enterprise-scale projects.
- Competitive remuneration and benefits package
EA: 14S7084 | Registration No: R
Data Architect
Job Description
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.
YOUR ROLE
A professional in charge of establishing, creating, and maintaining data architecture, ensuring that data is efficiently stored, organized, and retrieved to support corporate operations and decision-making processes, with the following attributes:
- Lead and instruct the team and assist other employees in accomplishing the project.
- Ability to articulate/relate complex technology material to a variety of stakeholders.
- Assist in establishing corporate guidelines, standards, and best practices in the use and deployment of business intelligence solutions.
YOUR PROFILE
• Work with the core platform team on installing the in-house-packaged on-premises Kubernetes distribution.
• Configure Azure Kubernetes on cloud platform.
• Deploy Apache Spark, Apache Ranger and Apache Kafka on Kubernetes.
• Use Azure DevOps (ADO) or the recommended tool for deployment automation.
• Storage & Data Management:
Data storage is divided into Hot and Cold logical partitions in GDP. Integrate ADLS Gen2 with the Hot and Cold storage via the S3 protocol.
• Monitoring & Observability:
Integration with the bank's Central Observability Platform (COP).
Grafana-based monitoring dashboards for Spark jobs and K8s pods.
• Security & Compliance: Hybrid big data platform
Implementing data encryption at rest using TDE via CaaS (Cryptography as a Service).
Configuring on-wire encryption (TLS) for intra/inter-cluster communication.
Enforcing RBAC (Role-Based Access Control) via Apache Ranger for Datahub, Spark, and Kafka.
Working alongside the central security team for platform control assessments.
• Data file transfer:
During project execution, the client will provide tools and technology to transfer 5 PB of data from one logical partition to another within the existing Hadoop platform.
Migrating <1 PB of data from Brownfield Azure HDI (ADLS Gen2) to Greenfield GDP AKS (ADLS Gen2).
• Maintenance & Disaster Recovery:
Implementing a backup & disaster recovery strategy for ADLS and Isilon.
Tokenization of personal data using Protegrity.
Admin role: on-premises Kubernetes (in-house Kubernetes package) + Azure Kubernetes + Spark cluster + ADO CI/CD, covering Apache Ranger, Spark, Kafka, Datahub, Airflow, Trino, Iceberg, Azure DevOps (CI/CD), and Kubernetes.
WHAT YOU'LL LOVE ABOUT WORKING HERE
We promote Diversity & Inclusion as we believe diversity of thought fuels excellence and innovation.
At Capgemini, you are the architect of your career growth. We equip people to maximize their full potential by providing a wide array of career growth programs that empower them to get the future they want.
Capgemini fosters impactful experiences for its people that would aid in bringing out the best in them for them, for the company, and for their clients.
Disclaimer:
Capgemini is an Equal Opportunity Employer encouraging diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, national origin, gender identity/expression, age, religion, disability, sexual orientation, genetics, veteran status, marital status or any other characteristic protected by law.
This is a general description of the duties, responsibilities, and qualifications required for this position. Physical, mental, sensory, or environmental demands may be referenced in an attempt to communicate the manner in which this position traditionally is performed. Whenever necessary to provide individuals with disabilities an equal employment opportunity, Capgemini will consider reasonable accommodations that might involve varying job requirements and/or changing the way this job is performed, provided that such accommodations do not pose an undue hardship. Capgemini is committed to providing reasonable accommodations during our recruitment process. If you need assistance or accommodation, please reach out to your recruiting contact.
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.
Data Architect
Job Description
We are seeking a highly skilled and experienced Azure Data Architect to design and implement secure, scalable, and high-performing data platform solutions. This role will lead the end-to-end architecture of enterprise-grade data systems, integrating Azure and Microsoft Fabric services with Databricks to drive analytics, AI, and business intelligence initiatives across data, apps, and AI.
Key Responsibilities
- Architecture & Design
Design and implement modern data platform architectures leveraging:
Azure Data Lake, Azure Synapse Analytics, Azure Data Factory
Microsoft Fabric (OneLake, Dataflows Gen2, Pipelines, Direct Lake Mode, Semantic Models)
Azure Databricks for data engineering, machine learning, and real-time analytics
Provide architectural direction for enterprise data lakehouse and warehouse environments, including medallion (Bronze/Silver/Gold) designs within Fabric and Databricks.
Build end-to-end pipelines (ingestion, transformation, modeling, and publishing) using Fabric pipelines and Databricks notebooks.
Design and deploy Power BI solutions with Direct Lake, Lakehouse, or Warehouse connectivity.
- Governance & Security
Define data governance, security, and compliance frameworks in collaboration with IT security teams.
Collaborate with governance teams to integrate Microsoft Purview for data cataloging, classification, lineage, and sensitivity labeling.
Define and enforce RBAC, managed identity access, private endpoint strategy, and DLP policies across all data services.
Ensure compliance with regulatory frameworks such as GDPR, PDPA, or HIPAA.
- Performance & Cost Optimisation
Optimise performance and cost by selecting appropriate storage and compute SKUs, caching, partitioning, and workload management strategies.
Implement real-time ingestion and streaming solutions using Real-Time Hub and Event Streams in Fabric.
- Collaboration & Leadership
Work closely with Data Engineers, BI Developers, Data Scientists, and Business Analysts to translate business requirements into technical designs.
Mentor technical teams in implementing best practices for data architecture, DevOps, and CI/CD automation.
Participate in cloud transformation initiatives, migrations, and legacy modernization projects.
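The medallion (Bronze/Silver/Gold) layering named in the responsibilities above refines data in stages: raw ingestion, cleaning and deduplication, then business-level aggregates. A plain-Python sketch of the pattern follows; on Databricks each layer would be a Delta table, and the event fields here are invented for the example.

```python
# Medallion-architecture sketch: Bronze (raw) -> Silver (cleaned) -> Gold (aggregated).
# Plain lists and dicts stand in for Delta tables.

bronze = [  # raw ingested events, duplicates and bad records included
    {"user": "a", "clicks": 3},
    {"user": "a", "clicks": 3},    # duplicate event
    {"user": "b", "clicks": None}, # bad record (null measure)
]

# Silver: deduplicate and drop records that fail basic quality rules.
seen = set()
silver = []
for rec in bronze:
    key = (rec["user"], rec["clicks"])
    if rec["clicks"] is not None and key not in seen:
        seen.add(key)
        silver.append(rec)

# Gold: business-level aggregate (total clicks per user).
gold = {}
for rec in silver:
    gold[rec["user"]] = gold.get(rec["user"], 0) + rec["clicks"]

print(gold)  # {'a': 3}
```

The design point is that each layer is independently queryable and re-derivable from the layer below it, which is what makes the pattern attractive for governance and reprocessing.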
Required Skills and Experience
Core Technical Skills
- Azure Services: Synapse Analytics, Data Factory, Data Lake Gen2, Key Vault, Monitor, Purview, Event Hub, Functions, Storage Accounts, Private DNS
- Microsoft Fabric: OneLake, Pipelines, Notebooks, Semantic Models, Real-Time Hub, Data Activator
- Databricks: Delta Lake, Unity Catalog, MLflow, Delta Live Tables, Job Clusters, Notebooks
- Power BI: Advanced DAX, deployment pipelines, pbip projects, dataset modeling
- Security: Private Endpoints, RBAC, Managed Identity, network isolation, Purview integration
- Infrastructure-as-Code: Terraform, ARM Templates, Azure DevOps or GitHub Actions for CI/CD
Experience
- Minimum 5 years in data architecture or equivalent roles
- At least 3 years of experience in Azure data stack
- Hands-on experience with Microsoft Fabric (public or private preview) and production workloads in Databricks
- Proven experience implementing secure and governed enterprise data platforms
Preferred Certifications
- Microsoft Certified: Azure Solutions Architect Expert
- Microsoft Certified: Fabric Analytics Engineer Associate
- Databricks Certified Data Engineer (Associate/Professional)
- Microsoft Certified: Azure Data Engineer Associate
Soft Skills
- Strong communication and stakeholder management skills
- Analytical mindset with a problem-solving attitude
- Familiarity with Agile and DevOps methodologies
- Ability to work across cross-functional teams in multi-cloud or hybrid environments
Data Architect
Job Description
About Us
Singapore Management University is a place where high-level professionalism blends together with a healthy informality. The 'family-like' atmosphere among the SMU community fosters a culture where employees work, plan, organise and play together - building a strong collegiality and morale within the university.
Our commitment to attract and retain talent is ongoing. We offer attractive benefits and welfare, competitive compensation packages, and generous professional development opportunities - all to meet the work-life needs of our staff. No wonder, then, that SMU continues to be given numerous awards and recognition for its human resource excellence.
Job Description
- This position is for Office of Integrated Information Technology Services (IITS).
Data Architecture
Create and maintain data architecture framework, standards and principles including modelling, metadata, security, master and reference data.
Data Analytics Solution Leadership
Lead BI specialists and data scientists to build, support and continuously improve data analytics solutions in both day-to-day data management (BAU) activities and new project implementation.
Champion the good practices of enterprise-grade analytics solution design that promote standardization, scalability, integrability, reusability, resiliency, etc.
Data Analytics System Delivery Management
Work with data engineers, data scientists and other IT members in the implementation of analytics solutions.
- Responsible for requirements analysis, design, development, testing, implementation and maintenance of enterprise-grade analytics system.
Effectively manage project timelines, resources, and deliverables to ensure successful project outcomes, aligning with organizational objectives.
Data Analytics System Support
Lead operational support for enterprise analytics systems, ensure high adoption and Customer Satisfaction (CSAT).
Establish necessary forums with stakeholders to define and execute in accordance to analytics solution roadmap.
Controls & Standards
Define, develop and promote BI & Analytics framework, standards, guidelines, procedures and practices.
Team Management
Take ownership of and support the team while holding them accountable for their commitments, removing roadblocks to their work.
- Leverage organizational resources to improve capacity for project work.
Mentor and develop team members.
Any other duties as assigned.
Qualifications
- Degree or Diploma in Computer Science, Information Technology, or a related discipline.
- Minimum 3 years' experience in architect and lead roles, showcasing proficiency in end-to-end BI solution design and architecture.
- Must be technically competent with hands-on expertise in designing enterprise-grade BI solutions using Microsoft Power BI and/or Qlik Sense.
- Proficient in end-to-end analytics solutions and familiar with third normal form (3NF), dimensional modelling, ETL, visualization, statistical analysis and machine learning. Experience in cloud technologies, e.g. Azure, AWS, Google, is advantageous.
- Relevant certification in data management domain (e.g. DCAM, CDMP) is advantageous.
- Knowledge of AI and LLMs is advantageous.
- Capable of assuming roles as project manager and business analyst, leading discussions from project initiation to planning and delivery of data and analytics projects.
- Skilled at quickly assimilating ideas, techniques and information, translating complex requirements into clear and concise statements.
- Well-versed in secure software development life cycle models, code versioning, release management, ticket management, CI/CD, and disaster recovery planning. Experience in establishing these processes is advantageous.
- Experienced in managing third-party vendors, particularly offshore partners, with a focus on project delivery initiatives.
- Resourceful and able to work independently, demonstrating strong planning, organizational, and problem-solving skills to manage competing demands.
- Ability to manage and prioritise work within a small team, and function as a strong individual contributor as well as team leader. The team is small, and strong performers must demonstrate their capability as a subject matter expert.
- Excellent interpersonal, persuasive, and communication skills, with proficiency in both oral and written communication. Capable of collaborating effectively with individuals at various levels of the organization.
Other Information
Candidates who do not possess the stipulated qualifications but have relevant work experience may still apply. Remuneration and appointment terms shall be commensurate with qualifications and experience. SMU reserves the right to modify the appointment terms where necessary.
Data Architect
Job Description
The Opportunity
Thakral One is looking for a Data Architect to lead data architecture design and modernization initiatives in core banking. The role involves shaping scalable, secure, and high-performance data solutions across SQL/NoSQL systems, data lakes, and cloud platforms, while ensuring governance and compliance. Ideal candidates bring deep expertise in system integration, ETL frameworks, and banking data domains, with strong communication and stakeholder management skills.
The Role
- Lead the data architecture design for core banking modernization initiatives
- Provide consultancy and advisory services for system integration, PoC, and impact analysis
- Conduct an in-depth analysis of system performance and identify any potential bottlenecks that may hinder optimal performance
- Effectively convey solution designs to diverse teams, and provide hands-on documentation or training sessions as necessary
- Define and implement data lake integration strategies to centralize and streamline data access
- Collaborate with cross-functional teams (Engineering, DevOps, Product, and Business) to align data models with business needs
- Design scalable, secure, and high-performance data pipelines and ETL frameworks
- Ensure data governance, lineage, and quality across all systems
- Evaluate and recommend modern data technologies, tools, and platforms
- Support migration from legacy systems to cloud-native or hybrid data platforms
- Create and maintain architecture documentation, standards, and best practices
The Expertise
- Proven experience or certifications on the following skillsets - Messaging, container implementation, container orchestration, multi-tier application architecture, monitoring tools, and collaboration tools
- Experience in quoting solutions, differentiating CAPEX and OPEX concepts, and evaluating different options to find the most cost-effective solution
- A deep technical understanding is required to source infrastructure resources ranging from on-premises to cloud. Should be familiar with architecture best practices, while remaining flexible to adapt to the team's established practices and environment
- Good understanding of concurrent software systems and building them in a way that is scalable, maintainable, and robust
- Experience in various SQL and NoSQL databases (Oracle, MongoDB, Cassandra, PostgreSQL, MySQL)
- Proven experience in modernizing legacy banking systems and core banking platforms
- Strong expertise in data lake architecture (e.g., Azure Data Lake, AWS Lake Formation, Google BigLake)
- Hands-on experience with data modeling, ETL/ELT, and real-time data streaming (e.g., Kafka, Spark, Flink)
- Proficiency in SQL, Python, or Scala for data engineering tasks
- Familiarity with cloud platforms (Azure, AWS, GCP) and containerization (Docker, Kubernetes)
- Deep understanding of data governance, security, and compliance (e.g., MAS, GDPR, PCI-DSS)
- Experience working with banking data domains such as payments, loans, deposits, and customer 360
- Excellent communication and stakeholder management skills
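The real-time streaming stacks listed above (Kafka, Spark, Flink) all revolve around windowed aggregation. A minimal sketch of a tumbling window over payment events follows; the 60-second window size, the sample timestamps, and the event shape are assumptions made for the illustration.

```python
# Tumbling-window aggregation sketch: the core idea behind streaming
# engines such as Spark Structured Streaming and Flink, shown over a
# plain list of (timestamp, amount) events instead of a live stream.

WINDOW = 60  # window size in seconds (illustrative)

events = [  # (epoch_seconds, amount) payment events
    (0, 10.0),
    (30, 5.0),
    (65, 7.5),  # falls into the second window
]

windows = {}
for ts, amount in events:
    start = (ts // WINDOW) * WINDOW  # align each timestamp to its window start
    windows[start] = windows.get(start, 0.0) + amount

print(windows)  # {0: 15.0, 60: 7.5}
```

A streaming engine does the same bucketing incrementally and handles late-arriving events with watermarks, but the window assignment logic is exactly this alignment arithmetic.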
About us.
Thakral One is a consulting and technology services company headquartered in Singapore, with a pan-Asian presence. We focus primarily around technology-driven consulting, adoption of value-added bespoke solutions, enabling enhanced decision support through data analytics, and embracing possibilities in the cloud. We are heavily inclined towards building capabilities collaboratively with clients and believe strongly in improving grounded and practical outcomes. This approach is possible through our partnership with leading global technology providers and internal R&D teams. Our clients come from Financial Services, Banking, Telco, Government, Healthcare, and Consumer-oriented organisations.
Data Architect
Job Description
We are looking for a highly experienced and skilled Data Architect to join our team. The ideal candidate will have 12-15 years of experience architecting data engineering solutions, focusing on ELT and PySpark/Hadoop workloads. In addition to strong solution and delivery skills, the ideal candidate will also have a view on business growth and managing stakeholders.
Responsibilities:
- Design and implement high-performance, scalable, and secure data architectures.
- Work with business stakeholders to understand their data needs and translate them into technical requirements.
- Design and develop data pipelines and workflows using ELT principles and PySpark/Hadoop
- Optimize data pipelines and workflows for performance and efficiency
- Work with data scientists and engineers to ensure that data is accessible and usable for analytics and machine learning
- Implement data governance and security best practices
- Manage and mentor data engineers
- Contribute to the overall data engineering strategy and roadmap
Requirements
Qualifications:
- 12-15 years of experience in data engineering, with a focus on ELT and PySpark/Hadoop workloads
- Strong experience in designing and implementing high-performance, scalable, and secure data architectures.
- Experience with data governance and security best practices
- Experience in managing and mentoring data engineers
- Excellent communication and interpersonal skills
- Ability to work independently and as part of a team
- Strong problem-solving and analytical skills
Desired Skills:
- Experience with cloud computing platforms such as AWS, Azure, or GCP
- Experience with big data technologies such as Spark, Hadoop, Hive, and Kafka
- Experience with data warehousing and data lakes
- Experience with DevOps and MLOps practices
- Experience with data science, machine learning, and streaming data processing
- Experience with real-time analytics, data visualization and reporting tools
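The ELT emphasis in this role means raw data is loaded first and transformed inside the target platform, rather than transformed in flight as in classic ETL. A schematic sketch follows; the `warehouse` dict and all table and function names stand in for real Snowflake or Databricks objects and are invented for the example.

```python
# ELT sketch: raw data lands untransformed, then is transformed in the warehouse.
# A dict stands in for warehouse tables.

warehouse = {}

def load(table, rows):
    """L step: land raw rows as-is (no transformation before loading)."""
    warehouse[table] = list(rows)

def transform_in_warehouse(src, dst):
    """T step: run the transformation where the data already lives."""
    warehouse[dst] = [
        {"name": r["name"].strip().lower()} for r in warehouse[src]
    ]

load("raw_users", [{"name": "  Alice "}, {"name": "BOB"}])
transform_in_warehouse("raw_users", "clean_users")
print(warehouse["clean_users"])  # [{'name': 'alice'}, {'name': 'bob'}]
```

Keeping the untouched raw table alongside the transformed one is the practical payoff of ELT: transformations can be re-run or revised without re-ingesting the source.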
Data Architect
Job Description
Data Engineer, Growth
">DescriptionAbout our Data Engineering team:
Our mission is to inspire creativity and bring joy through data-driven insights. We build platform foundations, leveraging data and machine learning models to power global growth of products.
We are a diverse group of talented engineers working together to drive business impact through data engineering.
Responsibilities:
- Design and implement scalable data pipelines to provide a comprehensive data service;
- Extract information and signals from a broad range of data to accomplish analytical goals for
Data Architect
Job Description
We are seeking a highly skilled Senior Data Architecture Specialist to lead our data integration and analytics solutions. This role will be responsible for designing, developing, and maintaining strategic data platforms.