367 Data Architect jobs in Singapore
Senior Big Data Architect
Posted today
Job Description
We are seeking a highly motivated and skilled Big Data Engineer to join our dynamic team.
Responsibilities:
- Data Engineering : Design, develop, and maintain complex big data systems for efficient data processing.
- Data Management : Implement and manage large-scale data storage solutions using relational databases and NoSQL technologies.
- Apache Spark and Hive : Utilize Apache Spark for high-performance data processing and Hive for data warehousing.
- Collaboration and Design : Collaborate with senior engineers to design and optimize data pipelines.
- Quality Assurance : Ensure data quality, accuracy, and performance across systems.
Requirements:
- Education : Bachelor's degree in Computer Science or related field.
- Experience : 5 years of experience in big data engineering or related roles.
- Programming Skills : Strong programming skills in languages such as Java, Python, or Scala (mandatory).
- Apache Spark and Hive : Hands-on experience with Apache Spark and Hive (medium-level proficiency).
- Big Data Concepts : Familiarity with big data concepts, ETL processes, and data warehousing.
- Problem-Solving Skills : Good problem-solving and analytical skills.
- Learning Ability : Eagerness to learn new tools and technologies.
Our ideal candidate is someone who can work independently and collaboratively as part of a team. If you have relevant skills and experiences, we encourage you to apply.
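The ETL responsibilities this posting describes (build pipelines, enforce data quality) can be sketched in a few lines of plain Python. This is only an illustrative sketch: the field names, cleaning rules, and in-memory "warehouse" are assumptions for the example, not anything specified in the job ad.

```python
# Minimal extract-transform-load sketch (field names and rules are assumed).

def extract(rows):
    """Extract: yield raw records from a source (here, an in-memory list)."""
    yield from rows

def transform(record):
    """Transform: normalize types and enforce a simple quality gate."""
    if record.get("amount") is None:  # quality gate: drop incomplete rows
        return None
    return {
        "user_id": int(record["user_id"]),
        "amount": round(float(record["amount"]), 2),
        "country": record.get("country", "unknown").lower(),
    }

def load(records, sink):
    """Load: append cleaned records to a target store (here, a list)."""
    sink.extend(records)

raw = [
    {"user_id": "1", "amount": "19.992", "country": "SG"},
    {"user_id": "2", "amount": None},  # dropped by the quality gate
]
clean = [t for r in extract(raw) if (t := transform(r)) is not None]
warehouse = []
load(clean, warehouse)
print(warehouse)  # one cleaned, typed row survives
```

In a real Spark/Hive stack the same three stages map onto source readers, DataFrame transformations, and table writes, but the shape of the logic is the same.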
Data Architect - Big Data Platforms
Posted today
Job Description
Big Data Platform Specialist
We are seeking a highly skilled Big Data Platform Specialist to join our team. As a key member, you will design and implement big data platforms built on Hadoop, Spark, and Hive.
About the Role
- Design and implement big data platforms using Hadoop, Spark, and Hive.
- Develop and optimize data ingestion, transformation, and storage frameworks for high-volume, high-velocity datasets.
- Maintain and fine-tune Hadoop clusters (HDFS, YARN) including capacity planning, monitoring, and troubleshooting.
- Build and support Hive-based data warehouses for analytics, including schema design, partitioning, and performance optimization.
- Optimize Spark jobs for batch and streaming use cases (memory management, shuffle tuning, query optimization).
- Implement data security and governance policies to ensure compliance and controlled access.
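The Hive partitioning and performance work mentioned above rests on one idea: lay data out by a partition key so queries scan only matching partitions. A toy pruner makes the mechanism concrete; the `dt=YYYY-MM-DD` directory names follow the common Hive convention, and the paths themselves are hypothetical.

```python
# Toy illustration of Hive-style partition pruning (paths are hypothetical;
# dt=YYYY-MM-DD is the common Hive partition-directory convention).

partitions = [
    "warehouse/sales/dt=2024-01-01",
    "warehouse/sales/dt=2024-01-02",
    "warehouse/sales/dt=2024-02-01",
]

def prune(partitions, key, value_prefix):
    """Keep only partitions whose key matches the predicate prefix,
    so a query over one month never scans the other months' files."""
    wanted = f"{key}={value_prefix}"
    return [p for p in partitions if p.rsplit("/", 1)[-1].startswith(wanted)]

jan = prune(partitions, "dt", "2024-01")
print(jan)  # only the two January partitions remain
```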
Key Skills and Qualifications
- Experience in data engineering or platform engineering with strong focus on big data ecosystems.
- Deep expertise in Hadoop (HDFS, YARN) cluster management and optimization.
- Strong hands-on experience in Spark (batch + streaming) and Hive data modeling.
- Experience in performance tuning, query optimization, and cluster troubleshooting.
Big Data Solution Architect
Posted today
Job Description
You will be working with a renowned organisation in the public and financial sectors.
The ideal candidate will have a minimum of 4 years of experience managing data engineering jobs in big data environments, such as Cloudera Data Platform.
Responsibilities:
- Analyse data requirements and document specifications.
- Refine data collection/consumption by migrating to more efficient channels.
- Plan, design, and implement data engineering jobs and reporting solutions to meet analytical needs.
- Develop test plans and scripts for system testing and support user acceptance testing.
- Build reports and dashboards according to user requirements.
- Work with internal technical teams to ensure smooth deployment and adoption of new solutions.
- Ensure smooth operations and service levels of IT solutions.
- Support production issues.
Requirements:
- Bachelor's Degree in Computer Science, Computer Engineering, or a related field.
- Strong SQL, data modelling, and data analysis skills are essential.
- Hands-on experience in big data engineering jobs using Python, PySpark, Linux, and ETL tools like Informatica.
- Hands-on experience in a reporting or visualisation tool like SAP BO and Tableau is necessary.
- Hands-on experience in DevOps deployment and data virtualisation tools like Denodo will be an advantage.
- Track record in implementing systems using Hive, Impala, and Cloudera Data Platform will be preferred.
- Good understanding of analytics and data warehouse implementations.
- Ability to troubleshoot complex issues ranging from system resources to application stack traces.
- Track record in implementing systems with high availability, high performance, high security hosted at various data centres or hybrid cloud environments will be an added advantage.
- Passion for automation, standardisation, and best practices.
- Good understanding and completion of projects using Waterfall/Agile methodology.
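The SQL, reporting, and dashboard-building skills this posting asks for can be sketched with Python's built-in sqlite3 module. The table schema and figures below are assumptions invented for the example; a real implementation would run similar SQL against Hive/Impala or a warehouse.

```python
import sqlite3

# Sketch of a reporting query over a small operational table
# (schema and data are assumptions for illustration only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("APAC", 100.0), ("APAC", 50.0), ("EMEA", 75.0)],
)

# A typical dashboard feed: totals per region, largest first.
report = conn.execute(
    "SELECT region, SUM(amount) AS total FROM orders "
    "GROUP BY region ORDER BY total DESC"
).fetchall()
print(report)  # [('APAC', 150.0), ('EMEA', 75.0)]
conn.close()
```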
Data Architect
Posted 2 days ago
Job Description
We are looking for a highly experienced and skilled Data Architect to join our team. The ideal candidate will have 12-15 years of experience architecting data engineering solutions, with a focus on ELT and PySpark/Hadoop workloads. In addition to strong solution and delivery skills, the ideal candidate will have a view on business growth and be able to manage stakeholders.
Responsibilities:
- Design and implement high-performance, scalable, and secure data architectures.
- Work with business stakeholders to understand their data needs and translate them into technical requirements.
- Design and develop data pipelines and workflows using ELT principles and PySpark/Hadoop
- Optimize data pipelines and workflows for performance and efficiency
- Work with data scientists and engineers to ensure that data is accessible and usable for analytics and machine learning
- Implement data governance and security best practices
- Manage and mentor data engineers
- Contribute to the overall data engineering strategy and roadmap
Qualifications:
- 12-15 years of experience in data engineering, with a focus on ELT and PySpark/Hadoop workloads
- Strong experience in designing and implementing high-performance, scalable, and secure data architectures.
- Experience with data governance and security best practices
- Experience in managing and mentoring data engineers
- Excellent communication and interpersonal skills
- Ability to work independently and as part of a team
- Strong problem-solving and analytical skills
Desired Skills:
- Experience with cloud computing platforms such as AWS, Azure, or GCP
- Experience with big data technologies such as Spark, Hadoop, Hive, and Kafka
- Experience with data warehousing and data lakes
- Experience with DevOps and MLOps practices
- Experience with data science and machine learning, streaming data processing
- Experience with real-time analytics, data visualization and reporting tools
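The ELT focus above differs from classic ETL in where the transformation happens: raw data is loaded first, then transformed inside the target engine. A stdlib sqlite3 sketch shows the pattern; the table and column names are assumptions, and in practice the "target engine" would be a warehouse or Spark SQL rather than SQLite.

```python
import sqlite3

# ELT sketch: load raw data first, then transform *inside* the target
# engine, rather than transforming before loading (ETL).
# Table and column names are illustrative assumptions.
db = sqlite3.connect(":memory:")

# Load: a raw landing table, kept untouched.
db.execute("CREATE TABLE raw_events (user_id TEXT, amount TEXT)")
db.executemany("INSERT INTO raw_events VALUES (?, ?)",
               [("1", "10.5"), ("2", "bad"), ("1", "4.5")])

# Transform: done in SQL on the already-loaded data,
# filtering out rows whose amount is not numeric.
db.execute("""
    CREATE TABLE fact_spend AS
    SELECT CAST(user_id AS INTEGER) AS user_id,
           CAST(amount AS REAL)     AS amount
    FROM raw_events
    WHERE amount GLOB '[0-9]*'
""")
totals = db.execute(
    "SELECT user_id, SUM(amount) FROM fact_spend GROUP BY user_id"
).fetchall()
print(totals)  # spend aggregated per user, bad row excluded
db.close()
```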
Data Architect
Posted 4 days ago
Job Description
- Design and deliver a scalable data lake architecture to support advanced analytics and cross-business reporting
- Define data ingestion strategies from legacy and modern systems across multiple markets and business units
- Ensure interoperability with enterprise platforms via APIs and standard protocols
- Standardize and consolidate data across disparate systems
- Implement robust data transformation pipelines to ensure clean, accurate, and accessible data
- Collaborate with infrastructure and application teams to ensure seamless data flow
- Define and implement governance policies including data lineage, access control, and regulatory compliance
- Champion data ownership and stewardship in collaboration with business stakeholders
- Ensure data privacy and security protocols across regions
- Partner with business, product, and IT teams to align data architecture with operational goals
- Translate complex business needs into scalable technical solutions
- Support regional nuances while designing a globally consistent data framework
- Bachelor’s or Master’s degree in Computer Science, Information Systems, or related field
- 15+ years’ experience in data architecture, including leading enterprise-wide data platforms and lakes
- Expertise in modern cloud-based ecosystems (AWS, Azure, GCP), ETL frameworks, and data modeling
- Familiarity with data governance tools (e.g., Collibra, Alation) and security best practices
- Strong cross-functional and international collaboration experience, ideally in matrixed or PE-backed settings
- Exceptional stakeholder management and communication skills
Data Architect
Posted 6 days ago
Job Description
We are seeking a highly skilled and experienced Azure Data Architect to design and implement secure, scalable, and high-performing data platform solutions. This role will lead the end-to-end architecture of enterprise-grade data systems, integrating Azure and Microsoft Fabric services with Databricks to drive analytics, AI, and business intelligence initiatives across data, apps, and AI.
Key Responsibilities
- Architecture & Design: Design and implement modern data platform architectures leveraging Azure Data Lake, Azure Synapse Analytics, Azure Data Factory, Microsoft Fabric (OneLake, Dataflows Gen2, Pipelines, Direct Lake Mode, Semantic Models), and Azure Databricks for data engineering, machine learning, and real-time analytics. Provide architectural direction for enterprise data lakehouse and warehouse environments, including medallion (Bronze/Silver/Gold) designs within Fabric and Databricks. Build end-to-end pipelines (ingestion, transformation, modeling, and publishing) using Fabric pipelines and Databricks notebooks. Design and deploy Power BI solutions with Direct Lake, Lakehouse, or Warehouse connectivity.
- Governance & Security: Define data governance, security, and compliance frameworks in collaboration with IT security teams. Collaborate with governance teams to integrate Microsoft Purview for data cataloging, classification, lineage, and sensitivity labeling. Define and enforce RBAC, managed identity access, private endpoint strategy, and DLP policies across all data services. Ensure compliance with regulatory frameworks such as GDPR, PDPA, or HIPAA.
- Performance & Cost Optimisation: Optimise performance and cost by selecting appropriate storage and compute SKUs, caching, partitioning, and workload management strategies. Implement real-time ingestion and streaming solutions using Real-Time Hub and Event Streams in Fabric.
- Collaboration & Leadership: Work closely with Data Engineers, BI Developers, Data Scientists, and Business Analysts to translate business requirements into technical designs. Mentor technical teams in implementing best practices for data architecture, DevOps, and CI/CD automation. Participate in cloud transformation initiatives, migrations, and legacy modernization projects.
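The medallion (Bronze/Silver/Gold) design named in the responsibilities above layers data by refinement level. A pure-Python sketch of the idea, with made-up sensor records: in Fabric or Databricks these layers would be Delta tables, and the field names and rules here are assumptions for illustration.

```python
# Pure-Python sketch of medallion (Bronze/Silver/Gold) layering.
# In practice these would be Delta tables; fields and rules are assumed.

bronze = [  # Bronze: raw, as-ingested records, kept verbatim
    {"device": "A", "temp_c": "21.5"},
    {"device": "B", "temp_c": None},   # bad reading, kept in Bronze as-is
    {"device": "A", "temp_c": "22.5"},
]

# Silver: cleaned and typed; invalid rows filtered out.
silver = [
    {"device": r["device"], "temp_c": float(r["temp_c"])}
    for r in bronze if r["temp_c"] is not None
]

# Gold: business-level aggregate (average temperature per device).
gold = {}
for r in silver:
    gold.setdefault(r["device"], []).append(r["temp_c"])
gold = {d: sum(v) / len(v) for d, v in gold.items()}
print(gold)  # {'A': 22.0}
```

Keeping Bronze verbatim means any bug in the Silver/Gold logic can be fixed by replaying from the raw layer, which is the main operational argument for the design.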
Core Technical Skills
- Azure Services: Synapse Analytics, Data Factory, Data Lake Gen2, Key Vault, Monitor, Purview, Event Hub, Functions, Storage Accounts, Private DNS
- Microsoft Fabric: OneLake, Pipelines, Notebooks, Semantic Models, Real-Time Hub, Data Activator
- Databricks: Delta Lake, Unity Catalog, MLflow, Delta Live Tables, Job Clusters, Notebooks
- Power BI: Advanced DAX, deployment pipelines, pbip projects, dataset modeling
- Security: Private Endpoints, RBAC, Managed Identity, network isolation, Purview integration
- Infrastructure-as-Code: Terraform, ARM Templates, Azure DevOps or GitHub Actions for CI/CD
Experience
- Minimum 5 years in data architecture or equivalent roles
- At least 3 years of experience in Azure data stack
- Hands-on experience with Microsoft Fabric (public or private preview) and production workloads in Databricks
- Proven experience implementing secure and governed enterprise data platforms
Preferred Certifications
- Microsoft Certified: Azure Solutions Architect Expert
- Microsoft Certified: Fabric Analytics Engineer Associate
- Databricks Certified Data Engineer (Associate/Professional)
- Microsoft Certified: Azure Data Engineer Associate
Soft Skills
- Strong communication and stakeholder management skills
- Analytical mindset with a problem-solving attitude
- Familiarity with Agile and DevOps methodologies
- Ability to work across cross-functional teams in multi-cloud or hybrid environments
Data Architect
Posted 10 days ago
Job Description
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
Job Summary:
A professional who is in charge of establishing, creating, and maintaining data architecture. He/she will make certain that data is efficiently stored, organized, and retrieved to support various corporate operations and decision-making processes, and will have the following attributes:
- Self-starter with the ability to work independently or as part of a project team
- Leads and instructs the team and assists other employees in accomplishing the project.
- Ability to articulate/relate complex Technology material to a variety of stakeholders.
- Assist in efforts on establishing corporate guidelines, standards, and best practices in the use/deployment of business intelligence solutions.
Key Responsibilities:
- Understand the existing landscape, client capabilities and requirements and translate them into data solutions that meet the desired performance, scalability, and availability needs of the customer.
- Lead teams in designing and developing data solutions configurations as per the proposed architecture.
- Recommend approaches for solving development and integration problems within data warehouse and business intelligence solutions.
- Provide strategy direction and design implementation roadmaps.
- Be the go-to person for any architecture, design, and development questions.
- Demonstrate very strong expertise in data technologies.
- Identify and adopt new technologies & methodologies to increase performance, automation, and scalability.
- Support the business development activities; shape and design solution architectures for a proposal; attending client meetings with the sales team.
Required:
- Experience in Data Engagements, especially leading data strategy, roadmap and implementation
- Architecture: Enterprise Data Strategy & alignment to Business Strategy, Transformation Projects, various data and warehouse architecture models, Information flow, Integration/interfacing models/approach, Business Information Models, Logical Data Models, Canonical models, Data Life Cycle Management, Reporting (Analytical, Compliance, Regulatory, Models)
- Banking experience and domain knowledge are a must.
- Data Migration: Strategizing, planning, costing, execution, data retention/archival and disposal.
- Master Data Management: Strategy & Execution, Multi-domain. Tools: Informatica MDM, Stibo PIM, Talend Data fabric
- Data Governance: Data/Information governance capability Assessment (as-is, to-be), Framework design, models, roadmap & Execution
- Architecture Tools: PowerDesigner, ArchiMate, Toad, ERWin
- Design Methodologies: Normalized/De-normalized designs, ER-Models, Star schema models, relational models, Hierarchical models, Inmon and Kimball methodologies
Good to have:
- Programming & GUI: MS SQL, PL/SQL, NoSQL, Shell Scripting, VBA, T-SQL, DAX, MS SSRS, Oracle Forms (6i, 9i & 10g), Intellimatch, MS Access
- RDBMS: Oracle 9i/10g (SQL,PL/SQL), MS SQL Server 2008/ 2012/2014, Oracle 9i AS/10g
- Web/Cloud tools: Site Core, Amazon AWS, WordPress, Google Analytics/GTM reporting
- Management Tools: MS Project, MS Office (Access, Excel, Word, PPT), JIRA, rLean
- TOGAF for Architecture, DAMA – DMBoK.
Our company culture is based on 7 shared values. Honesty and trust allow collaboration; team spirit and modesty sustain it; and the resulting creative freedom and boldness lead to quality results, especially when infused with a sense of fun. In addition to an interesting remuneration package, we offer you a professional and international work environment where you will work on major projects. We provide you with intense professional development and stretch you as much as needed to put your skills into action, learn and progress.
Let's talk about what's in it for you!
Passionate people are Capgemini's Ace of Spades - join us to discover a career that will challenge, support and inspire you. Working at Capgemini you'll find the rewards are more than just financial. You will work alongside some very smart and inspiring people on exciting projects and you will also enjoy incredible benefits. We offer flexible work practices and 40 hours of self-development every year with a huge selection of learning opportunities to choose from.
As "Architects of Positive Futures", Capgemini actively supports the community in 3 ways:
Diversity and Inclusion - we believe diversity of thought fuels excellence and innovation, which is why we positively encourage applications from suitably qualified candidates regardless of their gender identity, ethnicity, sexual orientation, religion, ability, intersex status or age. To support our commitment to diversity and inclusion, we celebrate special events and days of significance that are important to our employees such as Diwali, Bastille Day, Pride, IDAHOBIT, IWD and International day of people with Disabilities. Our Employee Resource Groups and OutFront support the grassroots passion of employees to drive our diversity agenda and effect change.
Digital inclusion - at Capgemini we are using our skills to drive social impact initiatives focusing on helping society address the impact of the digital and automation revolution. We also provide employees with opportunities to give back to the community through charity projects and volunteer days.
Environmental Sustainability - Capgemini joined the CDP's (Carbon Disclosure Project) prestigious "A list" for its commitment to the Net-Zero economy. We are focusing on helping our clients transform towards more sustainable business models and committing to reduce our own carbon emissions (GHG) by 20% per employee by 2020.
Recognized by Ethisphere as one of the World's Most Ethical Companies for the last 8 years in a row, ethics and values are at the heart of Capgemini's corporate culture and business. Embedded in our DNA, our seven values - Honesty, Boldness, Trust, Team Spirit, Freedom, Fun and Modesty - have remained the same since the company's inception in 1967.
Come join us, bring your whole self to work, create new possibilities for you, your customers and your community and help us to be Architects of Positive Futures.
We are seeking the following role in Singapore: Senior Data Architect
Data Architect
Posted 12 days ago
Job Description
Onsite | West Singapore
Build the data engine that powers tomorrow’s factory.
We’re hiring a Data Architect to lead large-scale data infrastructure projects — from modeling and pipelines to compliance and analytics. If OLTP-to-OLAP flows and real-world impact excite you, this one’s for you.
- Design and implement scalable enterprise data architecture
- Build and maintain data lakes, warehouses, and end-to-end data pipelines
- Own OLTP to OLAP flows — from raw operational data to business insights
- Develop robust data models to support analytics, reporting, and operations
- Ensure data compliance (GDPR and others) and enforce governance practices
- Collaborate with cross-functional teams to align data strategy with business needs
- Identify gaps in existing systems and propose scalable solutions
- Document architecture, pipelines, and best practices clearly and consistently
- Degree in Computer Science, IT, or equivalent
- 13+ years in data architecture, data engineering, or enterprise systems
- Strong in data modeling, database design, and building scalable pipelines
- Hands-on with SQL, Python, Java, C++
- Experience with cloud data platforms, ETL tools, and large-scale systems
- Deep understanding of GDPR and enterprise-level data compliance
- Great communicator who can align business and technical teams
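The OLTP-to-OLAP flows this role owns boil down to rolling row-per-transaction operational data into pre-aggregated analytical tables. A stdlib sqlite3 sketch of that hand-off; the schemas and figures are assumptions made up for the example.

```python
import sqlite3

# Sketch of an OLTP-to-OLAP flow: row-oriented transactions rolled up
# into an analytics-friendly daily summary. Schemas are assumed.
db = sqlite3.connect(":memory:")

# OLTP side: one row per operational transaction.
db.execute("CREATE TABLE txn (day TEXT, sku TEXT, qty INTEGER)")
db.executemany("INSERT INTO txn VALUES (?, ?, ?)", [
    ("2024-06-01", "widget", 2),
    ("2024-06-01", "widget", 3),
    ("2024-06-02", "gadget", 1),
])

# OLAP side: a pre-aggregated summary table for reporting queries.
db.execute("""
    CREATE TABLE daily_sales AS
    SELECT day, sku, SUM(qty) AS units
    FROM txn GROUP BY day, sku
""")
cube = db.execute("SELECT * FROM daily_sales ORDER BY day").fetchall()
print(cube)  # [('2024-06-01', 'widget', 5), ('2024-06-02', 'gadget', 1)]
db.close()
```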
Data Architect
Posted 17 days ago
Job Description
About Aspire Lifestyles
Aspire Lifestyles is an integrated concierge, personal assistance and customer relationship engagement company. We develop and design white-label loyalty programs for leading brands as their marketing proposition, enabling new customer acquisition, retention and loyalty. This position is responsible for new business development (B2B) by targeting CXOs, CMOs and product managers of leading banks, hospitality, luxury auto and insurance companies, along with retention and growth of the current business through relationship and key account management.
Key Responsibilities
- Build database systems of high availability and quality according to the needs of the application, the business, and agency requirements.
- Strong understanding of data management including integration, compliance, privacy, analytics, master data management and metadata management.
- Manage databases in cloud Azure and AWS.
- Design and implement databases according to end users' information needs and views; create data models, primary and secondary keys, indexes and views.
- Write, code, and optimize stored procedures and SQL statements to extract data for processing.
- Work with other city agencies and stakeholders to provide data imports and transfer as well as connectivity to other databases.
- Prepare documentation and specifications such as data dictionaries, ERD diagrams, and database schema.
- Collaborate with different teams on application project developments to meet business requirements and timelines.
- Define users and enable data distribution to the right user, in appropriate format and in a timely manner.
- Conduct weekly database maintenance and optimize for high availability.
- Use high-speed transaction recovery techniques and handle database procedures such as upgrades, backups, recoveries, migrations, etc.
- Minimize database downtime and perform query optimization to provide fast query responses.
- Provide proactive and reactive data management support and training to users.
- Determine, enforce and document database policies, procedures and standards.
- Perform tests and evaluations regularly to ensure data security, privacy and integrity; monitor database performance and apply new patches and versions when required.
- Keep abreast of the latest database technology and participate in ongoing training.
- Lead analysis, architecture, design, and development of application, data warehouse, data lake and business intelligence solutions.
- Research, analyze, recommend, and select technical approaches for solving migration, development and integration problems.
- Conduct and support white-boarding sessions, workshops, design sessions, and project meetings as needed, playing a key role in client relations.
- Lead and contribute to data implementation projects and/or project workstreams.
- Work independently or as part of a team to design and develop enterprise data solutions.
- Manage and mentor team members.
- Demonstrate the ability to take on new challenges and work outside their comfort zone.
- Communicate effectively, with strong written and oral skills, including presentation and interpersonal abilities; learn new technologies and analytics techniques on the job.
- Strong understanding of Data & analytics APIs.
- Demonstrated experience with some of the following: Amazon Web Services, Google Cloud Platform, Microsoft Azure, Snowflake, Cosmos DB (NoSQL), Databricks, Airflow, and related cloud solutions and architectures.
- Understanding of distributed systems and architecture design trade-offs.
- Experience with data architectures like Delta Lake and/or Lambda for designing data ingestion frameworks for real time and batch processing.
- 5+ years in data modeling (including Data Vault) and data architecture for enterprise data modeling across multiple subject areas.
- Experience with conceptualizing and architecting data lakes on cloud-centric platforms; ability to articulate value and components of data lakes.
- Hands-on with SQL and relational database design and development; Python for data-intensive integration tasks.
- Experience migrating data to cloud platforms (lift-and-shift).
- Document database topology, architecture, processes and procedures.
- Experience designing digital data platforms leveraging clickstream data from Google Analytics.
- Experience working in Agile Scrum teams.
Bachelor's degree in Computer Engineering, Computer Science, or related discipline.
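The Data Vault modeling experience asked for above separates stable business keys (hubs) from descriptive attributes that change over time (satellites). A toy in-memory sketch of that split; the entity, keys, and attributes are assumptions invented for the example.

```python
# Toy sketch of Data Vault structures: a hub holds the stable business
# key, a satellite holds versioned descriptive attributes.
# The customer entity, keys, and attributes are assumed for illustration.
from datetime import date

hub_customer = {}   # business key -> surrogate hub key
sat_customer = []   # (hub key, load date, attributes)

def load_customer(business_key, attrs, load_date):
    """Insert into the hub once per business key; append every
    attribute version to the satellite, preserving history."""
    hk = hub_customer.setdefault(business_key, len(hub_customer) + 1)
    sat_customer.append((hk, load_date, attrs))
    return hk

load_customer("C-001", {"name": "Acme"}, date(2024, 1, 1))
load_customer("C-001", {"name": "Acme Pte Ltd"}, date(2024, 6, 1))  # same hub
load_customer("C-002", {"name": "Globex"}, date(2024, 1, 1))

print(hub_customer)       # two distinct business keys
print(len(sat_customer))  # three satellite versions, history intact
```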
We take care of our Employees
- Medical coverage for employees
- Highly engaged and empowered work culture
- Continuous learning & development
Data Architect
Posted 24 days ago
Job Description
Thakral One is looking for a Data Architect to lead data architecture design and modernization initiatives in core banking. The role involves shaping scalable, secure, and high-performance data solutions across SQL/NoSQL systems, data lakes, and cloud platforms, while ensuring governance and compliance. Ideal candidates bring deep expertise in system integration, ETL frameworks, and banking data domains, with strong communication and stakeholder management skills.
The Role
- Lead the data architecture design for core banking modernization initiatives
- Provide consultancy and advisory services for system integration, PoC, and impact analysis
- Conduct an in-depth analysis of system performance and identify any potential bottlenecks that may hinder optimal performance
- Effectively convey solution designs to diverse teams, and provide hands-on documentation or training sessions as necessary
- Define and implement data lake integration strategies to centralize and streamline data access
- Collaborate with cross-functional teams (Engineering, DevOps, Product, and Business) to align data models with business needs
- Design scalable, secure, and high-performance data pipelines and ETL frameworks
- Ensure data governance, lineage, and quality across all systems
- Evaluate and recommend modern data technologies, tools, and platforms
- Support migration from legacy systems to cloud-native or hybrid data platforms
- Create and maintain architecture documentation, standards, and best practices
- Proven experience or certifications in the following skill sets: messaging, container implementation, container orchestration, multi-tier application architecture, monitoring tools, and collaboration tools
- Experience quoting solutions, differentiating CAPEX and OPEX concepts, and evaluating different options to find the most cost-effective solution
- A deep technical understanding is required to source infrastructure resources ranging from on-premises to cloud. Should be familiar with architecture best practices, but also be flexible enough to adapt to the team's established practices and environment
- Good understanding of concurrent software systems and building them in a way that is scalable, maintainable, and robust
- Experience in various SQL and NoSQL databases (Oracle, MongoDB, Cassandra, PostgreSQL, MySQL)
- Proven experience in modernizing legacy banking systems and core banking platforms
- Strong expertise in data lake architecture (e.g., Azure Data Lake, AWS Lake Formation, Google BigLake)
- Hands-on experience with data modeling, ETL/ELT, and real-time data streaming (e.g., Kafka, Spark, Flink)
- Proficiency in SQL, Python, or Scala for data engineering tasks
- Familiarity with cloud platforms (Azure, AWS, GCP) and containerization (Docker, Kubernetes)
- Deep understanding of data governance, security, and compliance (e.g., MAS, GDPR, PCI-DSS)
- Experience working with banking data domains such as payments, loans, deposits, and customer 360
- Excellent communication and stakeholder management skills
Thakral One is a consulting and technology services company headquartered in Singapore, with a pan-Asian presence. We focus primarily around technology-driven consulting, adoption of value-added bespoke solutions, enabling enhanced decision support through data analytics, and embracing possibilities in the cloud. We are heavily inclined towards building capabilities collaboratively with clients and believe strongly in improving grounded and practical outcomes. This approach is possible through our partnership with leading global technology providers and internal R&D teams. Our clients come from Financial Services, Banking, Telco, Government, Healthcare, and Consumer-oriented organisations.