25 Data Architect jobs in Singapore
Data Architect
Posted 15 days ago
Job Description
Join to apply for the Data Architect role at Internal Security Department
What The Role Is
ISD confronts and addresses threats to Singapore’s internal security and stability. For over 75 years, ISD and its predecessor organisations have played a central role in countering threats such as those posed by foreign subversive elements, spies, racial and religious extremists, and terrorists. A fulfilling and rewarding career awaits those who want to join in ISD’s critical mission of keeping Singapore safe, secure and sovereign for all Singaporeans.
ISD utilises state-of-the-art technology and technical know-how to safeguard Singapore’s internal security, stability, and sovereignty. We are seeking highly motivated Data Architects to design, build, and administer a powerful data science and analytics ecosystem that enables the organisation to extract insights from voluminous data and make data-driven decisions.
We are hiring for Data Architect roles in the software engineering field. Those without relevant working experience are welcome to apply as long as they are able to demonstrate relevant knowledge and an ability to pick up the required skills quickly.
You will be supported by a team of highly technical and collaborative colleagues, and have opportunities for advanced training, like attending leading overseas conferences and courses. You will also have many opportunities to work on challenging, diverse and unique problem statements.
What You Will Be Working On
Our team’s flagship product is built on the following web stack technologies: Linux servers, nginx, Oracle database, Spark, Hive, Impala, HBase, Elasticsearch, Cloudera Machine Learning, Flask, FastAPI, Celery, RabbitMQ, Ansible, ArcGIS, and Svelte.
We are looking to expand our stack to include: DuckDB, graph databases, data virtualisation, and containerisation.
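The planned move into graph databases reflects the kind of relationship-centric queries (e.g. paths between linked entities) that are awkward in purely relational stores. Purely as an illustrative sketch, and not part of ISD's actual stack, a breadth-first path search over an in-memory adjacency list shows the core traversal a graph database optimises; all node names below are invented:

```python
from collections import deque

def shortest_path(graph, start, goal):
    """Breadth-first search for a shortest path between two nodes.

    `graph` maps each node to a list of neighbours -- a toy stand-in
    for the relationship traversals a graph database (e.g. one
    queried with Cypher) performs at scale.
    """
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbour in graph.get(node, []):
            if neighbour not in visited:
                visited.add(neighbour)
                queue.append(path + [neighbour])
    return None  # no path exists

# Hypothetical entity graph: records linked by shared attributes.
graph = {
    "acct_a": ["phone_1"],
    "phone_1": ["acct_a", "acct_b"],
    "acct_b": ["phone_1", "addr_9"],
    "addr_9": ["acct_b"],
}
print(shortest_path(graph, "acct_a", "addr_9"))
# ['acct_a', 'phone_1', 'acct_b', 'addr_9']
```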
Depending on your level of experience, you will be helping to:
- Design, develop, deploy, administer and/or maintain (portions of) this tech stack, or solutions based on it.
- Recommend, evaluate, adopt, integrate and/or build on new open-source and commercial tools.
- Engage end users, data analysts, data scientists, data engineers and project managers in the planning, development and delivery of solutions.
- Lead projects, mentor juniors, make technical decisions, and/or formulate strategies.
What We Are Looking For
- Someone passionate about software architecture (and engineering) applied to the domain of data science and analytics.
- Self-motivated, open-minded, resourceful and meticulous candidates. Able to work independently and in teams.
- Ability to keep the overarching objectives and constraints of the project in mind while pursuing the technical details.
- Competent in at least one programming/scripting language (e.g. Python, Scala, JavaScript, bash). Interest and ability to gain expertise in other programming languages, frameworks and technologies.
- Familiarity with one or more databases and their related query languages (e.g. SQL, Query DSL, Cypher).
- Knowledge and experience with any of the following software engineering domains would be an advantage:
- Web development
- System administration
- Big data technologies
- Container technologies
- Data analytics, machine learning and artificial intelligence
- Only Singaporeans need apply
- Entry level
- Full-time
- Information Technology
- Government Administration and Industrial Machinery Manufacturing
Data Architect
Posted today
Job Description
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.
YOUR ROLE
A professional responsible for establishing, creating, and maintaining data architecture. He/She will ensure that data is efficiently stored, organized, and retrieved to support various corporate operations and decision-making processes, with the following attributes:
- Lead and instruct the team and assist other employees in accomplishing the project.
- Ability to articulate and relate complex technology material to a variety of stakeholders.
- Assist in establishing corporate guidelines, standards, and best practices in the use and deployment of business intelligence solutions.
YOUR PROFILE
• Work with the core platform team on installing the in-house packaged on-premises Kubernetes.
• Configure Azure Kubernetes on cloud platform.
• Deploy Apache Spark, Apache Ranger and Apache Kafka on Kubernetes.
• Use Azure DevOps (ADO) or another recommended tool for deployment automation.
• Storage & Data Management:
Data storage is divided into Hot and Cold logical partitions in GDP. Integrate ADLS Gen2 with the Hot and Cold storage via the S3 protocol.
• Monitoring & Observability:
Integration with the bank's Central Observability Platform (COP).
Grafana-based monitoring dashboards for Spark jobs and K8s pods.
• Security & Compliance: Hybrid big data platform
Implementing data encryption at rest using TDE via CaaS (Cryptography as a Service).
Configuring on-wire encryption (TLS) for intra/inter-cluster communication.
Enforcing RBAC (Role-Based Access Control) via Apache Ranger for Datahub, Spark, and Kafka.
Working alongside the central security team for platform control assessments.
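The Ranger-based RBAC enforcement described above boils down to mapping roles to permitted actions on resources. The following is a conceptual sketch only; Apache Ranger's real policy model is far richer (resource hierarchies, deny rules, tag-based policies), and all role and resource names here are hypothetical:

```python
# Hypothetical role-to-permission mapping, in the spirit of what
# Apache Ranger policies express for Spark, Kafka and Datahub.
POLICIES = {
    "data_engineer": {("spark", "submit"), ("kafka", "read"), ("kafka", "write")},
    "analyst": {("kafka", "read")},
}

def is_allowed(role, resource, action):
    """Return True if `role` may perform `action` on `resource`."""
    return (resource, action) in POLICIES.get(role, set())

print(is_allowed("analyst", "kafka", "read"))    # True
print(is_allowed("analyst", "spark", "submit"))  # False
```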
• Data file transfer:
During project execution, the client will provide the tooling to transfer 5 PB of data from one logical partition to another within the existing Hadoop platform.
Migrating <1 PB of data from Brownfield Azure HDI (ADLS Gen2) to Greenfield GDP AKS (ADLS Gen2).
• Maintenance & Disaster Recovery:
Implementing a backup & disaster recovery strategy for ADLS and Isilon.
Tokenization of personal data using Protegrity.
Admin role: on-premises Kubernetes (in-house Kubernetes package), Azure Kubernetes, Spark clusters and ADO CI/CD, covering Apache Ranger, Spark, Kafka, Datahub, Airflow, Trino, Iceberg, Azure DevOps (CI/CD) and Kubernetes.
WHAT YOU'LL LOVE ABOUT WORKING HERE
We promote Diversity & Inclusion as we believe diversity of thought fuels excellence and innovation.
At Capgemini, you are the architect of your career growth. We equip people to maximize their full potential by providing a wide array of career growth programs that empower them to get the future they want.
Capgemini fosters impactful experiences that bring out the best in its people, for themselves, for the company, and for their clients.
Disclaimer:
Capgemini is an Equal Opportunity Employer encouraging diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, national origin, gender identity/expression, age, religion, disability, sexual orientation, genetics, veteran status, marital status or any other characteristic protected by law.
This is a general description of the duties, responsibilities and qualifications required for this position. Physical, mental, sensory or environmental demands may be referenced in an attempt to communicate the manner in which this position traditionally is performed. Whenever necessary to provide individuals with disabilities an equal employment opportunity, Capgemini will consider reasonable accommodations that might involve varying job requirements and/or changing the way this job is performed, provided that such accommodations do not pose an undue hardship. Capgemini is committed to providing reasonable accommodations during our recruitment process. If you need assistance or accommodation, please reach out to your recruiting contact.
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.
Data Architect
Posted today
Job Description
About Us
Singapore Management University is a place where high-level professionalism blends with a healthy informality. The 'family-like' atmosphere among the SMU community fosters a culture where employees work, plan, organise and play together - building a strong collegiality and morale within the university.
Our commitment to attract and retain talent is ongoing. We offer attractive benefits and welfare, competitive compensation packages, and generous professional development opportunities - all to meet the work-life needs of our staff. No wonder, then, that SMU continues to be given numerous awards and recognition for its human resource excellence.
Job Description
- This position is for Office of Integrated Information Technology Services (IITS).
Data Architecture
Create and maintain data architecture framework, standards and principles including modelling, metadata, security, master and reference data.
Data Analytics Solution Leadership
Lead BI Specialists, data scientists to build, support and continuously improve data analytics solutions in both day-to-day data management (BAU) activities and new project implementation.
Champion the good practices of enterprise-grade analytics solution design that promote standardization, scalability, integrability, reusability, resiliency, etc.
Data Analytics System Delivery Management
Work with data engineers, data scientists and other IT members in the implementation of analytics solutions.
- Responsible for requirements analysis, design, development, testing, implementation and maintenance of enterprise-grade analytics system.
Effectively manage project timelines, resources, and deliverables to ensure successful project outcomes, aligning with organizational objectives.
Data Analytics System Support
Lead operational support for enterprise analytics systems, ensure high adoption and Customer Satisfaction (CSAT).
Establish the necessary forums with stakeholders to define and execute in accordance with the analytics solution roadmap.
Controls & Standards
Define, develop and promote BI & Analytics framework, standards, guidelines, procedures and practices.
Team Management
Take ownership of and support the team while holding them accountable for their commitments, removing roadblocks to their work.
- Leverage organizational resources to improve capacity for project work.
Mentor and develop team members.
Any other duties as assigned.
Qualifications
- Degree or Diploma in Computer Science, Information Technology, or a related discipline.
- Minimum 3 years' experience in architecture and lead roles, showcasing proficiency in end-to-end BI solution design and architecture.
- Must be technically competent with hands-on expertise in designing enterprise-grade BI solutions using Microsoft Power BI and/or Qlik Sense.
- Proficient in end-to-end analytics solutions and familiar with third normal form, dimensional modelling, ETL, visualization, statistical analysis and machine learning. Experience in cloud technologies, e.g. Azure, AWS, Google Cloud, is advantageous.
- Relevant certification in data management domain (e.g. DCAM, CDMP) is advantageous.
- Knowledge of AI and LLMs is advantageous.
- Capable of assuming roles as project manager and business analyst, leading discussions from project initiation to planning and delivery of data and analytics projects.
- Skilled at quickly assimilating ideas, techniques and information, translating complex requirements into clear and concise statements.
- Well-versed in secure software development life cycle models, code versioning, release management, ticket management, CI/CD, and disaster recovery planning. Experience in establishing these processes is advantageous.
- Experienced in managing third-party vendors, particularly offshore partners, with a focus on project delivery initiatives.
- Resourceful and able to work independently, demonstrating strong planning, organizational, and problem-solving skills to manage competing demands.
- Ability to manage and prioritise work within a small team, and function as a strong individual contributor as well as team leader. The team is small, and strong performers must demonstrate their capability as a subject matter expert.
- Excellent interpersonal, persuasive, and communication skills, with proficiency in both oral and written communication. Capable of collaborating effectively with individuals at various levels of the organization.
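Several of the skills above (dimensional modelling, ETL, SQL) come together in a star schema: facts aggregated and sliced by dimensions. As a minimal self-contained sketch, with table and column names invented for illustration rather than taken from any SMU system:

```python
import sqlite3

# In-memory star schema: one fact table joined to one dimension table.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_course (course_id INTEGER PRIMARY KEY, school TEXT);
    CREATE TABLE fact_enrolment (course_id INTEGER, students INTEGER);
    INSERT INTO dim_course VALUES (1, 'Business'), (2, 'Computing');
    INSERT INTO fact_enrolment VALUES (1, 40), (1, 35), (2, 50);
""")

# A typical BI query: aggregate the fact table, sliced by a dimension.
rows = con.execute("""
    SELECT d.school, SUM(f.students)
    FROM fact_enrolment f JOIN dim_course d USING (course_id)
    GROUP BY d.school ORDER BY d.school
""").fetchall()
print(rows)  # [('Business', 75), ('Computing', 50)]
```

Tools such as Power BI or Qlik Sense generate essentially this shape of query against a dimensional model.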
Other Information
Candidates who do not possess the stipulated qualifications but have relevant work experience may still apply. Remuneration and appointment terms shall be commensurate with qualifications and experience. SMU reserves the right to modify the appointment terms where necessary.
Data Architect
Posted today
Job Description
About the Role
We are seeking a Databricks Architect to play a pivotal role in shaping and delivering enterprise-level data architecture and scalable solutions that support business-critical operations. This position offers the opportunity to work on innovative data initiatives and contribute to the development of cloud-native architectures for a diverse range of industries.
Key Responsibilities
- Lead the design and implementation of enterprise data architecture and big data solutions.
- Develop end-to-end data models, from conceptual and logical designs through to physical implementation.
- Drive data integration initiatives while ensuring data quality and governance best practices.
- Architect cloud-native data solutions using Databricks, Azure services, and other modern data platforms.
- Collaborate with stakeholders to understand business needs and translate them into technical solutions.
- Provide technical guidance to project teams, ensuring alignment with business objectives.
Requirements
- Proven experience in enterprise data architecture and big data solution design.
- Hands-on expertise with Databricks (Unity Catalog, Delta Live Tables, Workflows), Azure Data Factory, Service Bus, Event Grid, Power BI, PySpark, SQL, Python, SQL Server, SSAS, and ER Studio.
- Strong knowledge of data integration, governance frameworks, and data quality management.
- Experience with cloud-native data architectures.
- Strong analytical and problem-solving abilities.
What's on Offer
- Opportunity to work in a collaborative and global environment.
- Exposure to cutting-edge data platforms and enterprise-scale projects.
- Competitive remuneration and benefits package
EA: 14S7084 | Registration No: R
Data Architect
Posted today
Job Description
We are seeking a highly skilled and experienced Azure Data Architect to design and implement secure, scalable, and high-performing data platform solutions. This role will lead the end-to-end architecture of enterprise-grade data systems, integrating Azure and Microsoft Fabric services with Databricks to drive analytics, AI, and business intelligence initiatives across data, apps, and AI.
Key Responsibilities
- Architecture & Design
Design and implement modern data platform architectures leveraging:
Azure Data Lake, Azure Synapse Analytics, Azure Data Factory
Microsoft Fabric (OneLake, Dataflows Gen2, Pipelines, Direct Lake Mode, Semantic Models)
Azure Databricks for data engineering, machine learning, and real-time analytics
Provide architectural direction for enterprise data lakehouse and warehouse environments, including medallion (Bronze/Silver/Gold) designs within Fabric and Databricks.
Build end-to-end pipelines (ingestion, transformation, modeling, and publishing) using Fabric pipelines and Databricks notebooks.
Design and deploy Power BI solutions with Direct Lake, Lakehouse, or Warehouse connectivity.
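In production, the medallion layering above would be built with Fabric pipelines or Databricks notebooks over Delta tables; the plain-Python sketch below only illustrates the idea itself (raw Bronze records validated into Silver, then aggregated into Gold), with invented sample data:

```python
# Bronze: raw ingested records, warts and all.
bronze = [
    {"order_id": "1", "amount": "120.50", "country": "SG"},
    {"order_id": "2", "amount": "bad",    "country": "SG"},  # malformed
    {"order_id": "3", "amount": "80.00",  "country": "MY"},
]

def to_silver(rows):
    """Silver layer: validated and typed -- malformed rows are dropped."""
    silver = []
    for row in rows:
        try:
            silver.append({**row, "amount": float(row["amount"])})
        except ValueError:
            continue  # a real pipeline would route this to a quarantine table
    return silver

def to_gold(rows):
    """Gold layer: business-level aggregate, ready for Power BI."""
    totals = {}
    for row in rows:
        totals[row["country"]] = totals.get(row["country"], 0.0) + row["amount"]
    return totals

print(to_gold(to_silver(bronze)))  # {'SG': 120.5, 'MY': 80.0}
```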
- Governance & Security
Define data governance, security, and compliance frameworks in collaboration with IT security teams.
Collaborate with governance teams to integrate Microsoft Purview for data cataloging, classification, lineage, and sensitivity labeling.
Define and enforce RBAC, managed identity access, private endpoint strategy, and DLP policies across all data services.
Ensure compliance with regulatory frameworks such as GDPR, PDPA, or HIPAA.
- Performance & Cost Optimisation
Optimise performance and cost by selecting appropriate storage and compute SKUs, caching, partitioning, and workload management strategies.
Implement real-time ingestion and streaming solutions using Real-Time Hub and Event Streams in Fabric.
- Collaboration & Leadership
Work closely with Data Engineers, BI Developers, Data Scientists, and Business Analysts to translate business requirements into technical designs.
Mentor technical teams in implementing best practices for data architecture, DevOps, and CI/CD automation.
Participate in cloud transformation initiatives, migrations, and legacy modernization projects.
Required Skills and Experience
Core Technical Skills
- Azure Services: Synapse Analytics, Data Factory, Data Lake Gen2, Key Vault, Monitor, Purview, Event Hub, Functions, Storage Accounts, Private DNS
- Microsoft Fabric: OneLake, Pipelines, Notebooks, Semantic Models, Real-Time Hub, Data Activator
- Databricks: Delta Lake, Unity Catalog, MLflow, Delta Live Tables, Job Clusters, Notebooks
- Power BI: Advanced DAX, deployment pipelines, pbip projects, dataset modeling
- Security: Private Endpoints, RBAC, Managed Identity, network isolation, Purview integration
- Infrastructure-as-Code: Terraform, ARM Templates, Azure DevOps or GitHub Actions for CI/CD
Experience
- Minimum 5 years in data architecture or equivalent roles
- At least 3 years of experience in Azure data stack
- Hands-on experience with Microsoft Fabric (public or private preview) and production workloads in Databricks
- Proven experience implementing secure and governed enterprise data platforms
Preferred Certifications
- Microsoft Certified: Azure Solutions Architect Expert
- Microsoft Certified: Fabric Analytics Engineer Associate
- Databricks Certified Data Engineer (Associate/Professional)
- Microsoft Certified: Azure Data Engineer Associate
Soft Skills
- Strong communication and stakeholder management skills
- Analytical mindset with a problem-solving attitude
- Familiarity with Agile and DevOps methodologies
- Ability to work across cross-functional teams in multi-cloud or hybrid environments
Quantexa Data Architect
Posted today
Job Description
We are seeking a highly skilled and experienced Quantexa Data Architect to lead the design and implementation of advanced data solutions leveraging the Quantexa platform. The ideal candidate will possess deep expertise in data engineering, entity resolution, and graph analytics, with a proven track record of delivering scalable solutions across financial services, public sector, and RegTech domains.
Key Responsibilities:
Architecture & Implementation
- Design and implement end-to-end data pipelines and entity resolution solutions using Quantexa, Spark, and Scala.
- Architect real-time streaming solutions utilizing Kafka, Flink, and Kubernetes.
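Entity resolution on the Quantexa platform itself is configured rather than hand-coded; purely to illustrate the underlying idea, the stdlib sketch below greedily clusters records whose names are near-duplicates. The threshold and sample data are invented, and real systems add blocking, multiple attributes and tuned compound scoring:

```python
from difflib import SequenceMatcher

records = [
    {"id": 1, "name": "Jonathan Tan"},
    {"id": 2, "name": "Jonathon Tan"},   # likely the same entity
    {"id": 3, "name": "Mei Ling Wong"},
]

def similar(a, b, threshold=0.85):
    """Crude pairwise string-similarity match on lowercased names."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

def resolve(records):
    """Single-pass greedy clustering: attach each record to the first
    cluster whose representative name it matches, else start a new one."""
    clusters = []
    for rec in records:
        for cluster in clusters:
            if similar(rec["name"], cluster[0]["name"]):
                cluster.append(rec)
                break
        else:
            clusters.append([rec])
    return clusters

clusters = resolve(records)
print([[r["id"] for r in c] for c in clusters])  # [[1, 2], [3]]
```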
Team Leadership & Collaboration
- Lead cross-functional teams of data engineers, analysts, and developers to deliver contextual data insights and graph analytics.
- Collaborate with stakeholders to gather requirements, conduct workshops, and translate business needs into technical specifications.
Data Quality & Monitoring
- Ensure data quality and integrity through profiling, validation, and automated testing frameworks.
- Develop and maintain monitoring dashboards using Grafana, Prometheus, Qlik Sense, and Tableau.
Innovation & Optimization
- Drive innovation by exploring and implementing emerging technologies in Big Data, cloud computing, and network analytics.
- Optimize data architecture for performance, scalability, and reliability.
Job Requirements
- Bachelor's degree in Information Technology, Computer Science, or a related field.
- 8+ years of experience in data architecture, engineering, and analytics.
- Quantexa Certified Data Engineer and Business Analyst.
- Strong proficiency in Scala, Spark (including PySpark), Kafka, Flink, and AWS services.
- Experience in graph and network analytics, entity resolution, and fraud detection.
- Hands-on experience with visualization tools such as Qlik Sense, Tableau, Grafana.
- Familiarity with SQL (PostgreSQL, MS SQL Server, Oracle DB) and NoSQL databases (Cassandra, DynamoDB).
- Knowledge of Kubernetes/OpenShift, ElasticSearch, and Logstash.
- Excellent communication, leadership, and stakeholder management skills.
Preferred Skills:
- Experience in financial services, asset management, public sector, or RegTech.
- Background in Agile and Waterfall project delivery methodologies.
- Ability to design business and reporting architectures and conduct functional testing.
- Proficiency in Python, Shell scripting, and data modeling.
Certifications:
- Quantexa Data Engineer & Business Analyst
- Qlik Sense Data Architect
- AWS Cloud Practitioner
ServiceNow Data Architect
Posted today
Job Description
Web Technologies:
Knowledge of web services, SSO, SAML, and other integration technologies.
Database Concepts:
Familiarity with relational databases (SQL) and potentially NoSQL databases, as well as enterprise data warehouses.
Scripting:
Strong scripting skills in JavaScript, GlideScript, and client-side/server-side scripting.
Integrations:
Proficiency in designing and implementing integrations using REST, SOAP, IntegrationHub, and MID Servers.
Data Modeling:
Expertise in designing and implementing data models, understanding of data schemas, tables, and relationships within ServiceNow.
ServiceNow Platform Knowledge:
Deep understanding of the ServiceNow platform, including App Engine, Flow Designer, Service Graph CMDB, and its various modules (ITSM, ITOM, GRC).
Data Management & Optimization:
Analyze, map, and transform data between ServiceNow and external platforms, ensuring data accuracy, integrity, and consistency.
Optimize integrations for performance, scalability, and reliability.
SAP DATA Architect
Posted today
Job Description
SAP Data Architect, onsite in Singapore, with a minimum of 5 years of relevant experience.
Experience in SAC and Datasphere is a must.
Key Responsibilities
Play a key role in the data architecture, data cleansing and data migration track of our SAP S/4HANA implementation project:
Develop a data analysis and reporting architecture framework for the S/4HANA eco-system and ensure it is aligned with our enterprise-level data lake and analytics architecture & security guidelines
- Assess platforms and tools that are required for the implementation of data analysis & reporting solutions and be able to advise the project team on the merits & demerits of different solution approaches and make solution recommendations
- Review technical design of our to-be S/4HANA and SAP cloud solution analytics architecture and associated documentations
- Drive our S/4HANA analytics implementation and work with both internal and external teams for the execution to completion
- Collaborate closely with both internal & external partners to achieve alignment between people working on the implementation
Assist the project team in data cleansing and data migration activities
Assess new reporting/analytics technologies and recommend adoption where it will improve overall usage of data as enabler for business purposes
- Any relevant ad-hoc duties
- This is an individual contributor role
Requirements
- Degree in Information Technology or related fields
- At least 5-7 years of experience in designing and implementing Data Warehouse and SAP Analytics Cloud solutions, including integration with data from SAP Cloud solutions such as SuccessFactors, Ariba and Concur, along with integration to non-SAP systems
- Have experience in customizing standard business contents in BW to suit the reporting requirements of customized process in S/4HANA
- Have experience in migrating data and contents from older versions of ECC BW to BWoH or BW4HANA
- Experience in implementing data quality and master data governance tools
- Should have worked on at least one full-cycle embedded analytics and enterprise analytics implementation project, with end-to-end experience in requirements gathering, functional analysis, high-level design, build, testing and deployment (implementation in BW/4HANA would be an added advantage)
- Good understanding of ETL processes and techniques such as SDI and Data services for extraction of data from S/4HANA and non-SAP databases
- Have knowledge of working with native HANA modelling and BW modelling techniques and tools
- Knowledge of SAP Analytics Cloud (SAC) BI tool, its pre-built visualization contents, integration and connectivity capabilities
- Knowledge of SAP Data Warehouse Cloud (DWC), BW Bridge, AWS S3, AWS Redshift and the Tableau reporting tool
- Have knowledge / experience in delivering projects under the Agile framework
- Work independently as well as collaboratively as part of a highly skilled team
- Good problem-solving and communication skills
Job Types: Full-time, Permanent